Categories
erwin Expert Blog Data Intelligence

erwin Recognized as a March 2020 Gartner Peer Insights Customers’ Choice for Metadata Management Solutions

We’re excited about our recognition as a March 2020 Gartner Peer Insights Customers’ Choice for Metadata Management Solutions.  Our team here at erwin takes great pride in this distinction because customer feedback has always shaped our products and services.

The Gartner Peer Insights Customers’ Choice is a recognition of vendors in the metadata management solutions market by verified end-user professionals, taking into account both the number of reviews and the overall user ratings. To ensure fair evaluation, Gartner maintains rigorous criteria for recognizing vendors with a high customer satisfaction rate.

erwin’s metadata management offering, the erwin Data Intelligence Suite (erwin DI), comprises erwin Data Catalog (erwin DC) and erwin Data Literacy (erwin DL) with built-in automation for greater visibility, understanding and use of enterprise data.

The solutions work in tandem to automate the processes involved in harvesting, integrating, activating and governing enterprise data according to business requirements. This automation results in greater accuracy, faster analysis and better decision-making for data governance and digital transformation initiatives.

Metadata management is key to sustainable data governance and any other organizational effort that is data-driven. erwin DC automates enterprise metadata management, data mapping, data cataloging, code generation, data profiling and data lineage. erwin DL provides integrated business glossary management and self-service data discovery tools so both IT and business users can find data relevant to their roles and understand it within a business context.

Together as erwin DI, these solutions give organizations a complete and clear view of their metadata landscape, including semantic, business and technical elements.


Everyone at erwin is honored to be named as a March 2020 Customers’ Choice for Metadata Management Solutions. To learn more about this distinction, or to read the reviews written about our products by the IT professionals who use them, please visit Customers’ Choice.

And to all of our customers who submitted reviews, thank you! We appreciate you and look forward to building on the experience that led to this distinction!

Customer input will continue to guide our technology road map and the entire customer journey. In fact, it has influenced our entire corporate direction as we expanded our focus from data modeling to enterprise modeling and data governance/intelligence.

Data underpins every type of architecture – business, technology and data – so it only makes sense that both IT and the wider enterprise collaborate to ensure it’s accurate, in context and available to the right people for the right purposes.

If you have an erwin story to share, we encourage you to join the Gartner Peer Insights crowd and weigh in.

Request a complimentary copy of the Gartner Peer Insights ‘Voice of the Customer’: Metadata Management Solutions (March 2020) report.

Gartner Peer Insights Metadata Management Solutions Report

 

The GARTNER PEER INSIGHTS CUSTOMERS’ CHOICE badge is a trademark and service mark of Gartner, Inc., and/or its affiliates, and is used herein with permission. All rights reserved. Gartner Peer Insights Customers’ Choice constitute the subjective opinions of individual end-user reviews, ratings, and data applied against a documented methodology; they neither represent the views of, nor constitute an endorsement by, Gartner or its affiliates.

 

Categories
erwin Expert Blog

Data Governance for Smart Data Distancing

Hello from my home office! I hope you and your family are staying safe, practicing social distancing, and of course, washing your hands.

These are indeed strange days. During this coronavirus emergency, we are all being deluged by data from politicians, government agencies, news outlets, social media and websites, including valid facts but also opinions and rumors.

Happily for us data geeks, the general public is being told how important our efforts and those of data scientists are to analyzing, mapping and ultimately shutting down this pandemic.

Yay, data geeks!

Unfortunately, though, not all of the incoming information is equally valuable, ethically sourced, rigorously prepared or even accurate.

As we work to protect the health and safety of those around us, we need to understand the nuances of meaning for the received information as well as the motivations of information sources to make good decisions.

On a very personal level, separating the good information from the bad becomes a matter of life and death. On a business level, decisions based on bad external data may cause business failures.

In business, data is the food that feeds the body or enterprise. Better data makes the body stronger and provides a foundation for the use of analytics and data science tools to reduce errors in decision-making. Ultimately, it gives our businesses the strength to deliver better products and services to our customers.

How then, as a business, can we ensure that the data we consume is of good quality?

Distancing from Third-Party Data

Just as we are practicing social distancing in our personal lives, so too we must practice data distancing in our professional lives.

In regard to third-party data, we should ask ourselves: How was the data created? What formulas were used? Does the definition (description, classification, allowable range of values, etc.) of incoming, individual data elements match our internal definitions of those data elements?

If we reflect on the coronavirus example, we can ask: How do individual countries report their data? Do individual countries use the same testing protocols? Are infections universally defined the same way (based on widely administered tests or only hospital admissions)? Are asymptomatic infections reported? Are all countries using the same methods and formulas to collect and calculate infections, recoveries and deaths?

In our businesses, it is vital that we work to develop a deeper understanding of the sources, methods and quality of incoming third-party data. This deeper understanding will help us make better decisions about the risks and rewards of using that external data.

Data Governance Methods for Data Distancing

We’ve received lots of instructions lately about how to wash our hands to protect ourselves from coronavirus. Perhaps we thought we already knew how to wash our hands, but nonetheless, a refresher course has been worthwhile.

Similarly, perhaps we think we know how to protect our business data, but maybe a refresher would be useful here as well?

Here are a few steps you can take to protect your business:

  • Establish comprehensive third-party data sharing guidelines (for both inbound and outbound data). These guidelines should include communicating with third parties about how they make changes to collection and calculation methods.
  • Rationalize external data dictionaries against your internal data dictionaries to understand where differences occur and how to overcome them.
  • Ingest third-party data into a quarantined area where it can be profiled and measured for quality, completeness and correctness and, where necessary, cleansed.
  • Periodically review all data ingestion and data-sharing policies, processes and procedures to ensure they remain aligned to business needs and goals.
  • Establish data-sharing training programs so all data stakeholders understand associated security considerations, contextual meaning, and when and when not to share and/or ingest third-party data.
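To make the quarantine step concrete, here is a minimal sketch of profiling incoming third-party records before promoting them. The field names, rules and thresholds are hypothetical, not from any erwin product:

```python
# Hypothetical sketch of a quarantine/profiling step for third-party data.
# Rules mirror an internal data dictionary: required fields, allowed values, ranges.

QUARANTINE_RULES = {
    "country": {"required": True, "allowed": {"US", "UK", "DE", "FR"}},
    "infections": {"required": True, "min": 0},
    "recoveries": {"required": False, "min": 0},
}

def profile_record(record: dict) -> list:
    """Return a list of data-quality issues found in one record."""
    issues = []
    for fld, rule in QUARANTINE_RULES.items():
        value = record.get(fld)
        if value is None:
            if rule.get("required"):
                issues.append(f"{fld}: missing required value")
            continue
        if "allowed" in rule and value not in rule["allowed"]:
            issues.append(f"{fld}: {value!r} not in allowed set")
        if "min" in rule and isinstance(value, (int, float)) and value < rule["min"]:
            issues.append(f"{fld}: {value} below minimum {rule['min']}")
    return issues

def quarantine(records: list) -> tuple:
    """Split records into clean ones and flagged ones with their issues."""
    clean, flagged = [], []
    for rec in records:
        issues = profile_record(rec)
        if issues:
            flagged.append((rec, issues))
        else:
            clean.append(rec)
    return clean, flagged
```

Only records that pass every rule leave quarantine; everything else stays behind with an explicit list of reasons, which is the audit trail the guidelines above call for.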

erwin Data Intelligence for Data Governance and Distancing

With solutions like those in the erwin Data Intelligence Suite (erwin DI), organizations can auto-document their metadata; classify their data with respect to privacy, contractual and regulatory requirements; attach data-sharing and management policies; and implement an appropriate level of data security.

If you believe the management of your third-party data interfaces could benefit from a review or tune-up, feel free to reach out to me and my colleagues here at erwin.

We’d be happy to provide a demo of how to use erwin DI for data distancing.

erwin Data Intelligence

Categories
erwin Expert Blog Enterprise Architecture

Enterprise Architecture Tools – Getting Started

Many organizations start an enterprise architecture practice without a specialized enterprise architecture tool.

Instead, they rely on a blend of spreadsheets, Visio diagrams, PowerPoint files and the like.

Under normal circumstances, this approach is difficult. In times of rapid change or crisis, it isn’t viable.

Four Compelling Reasons for An Enterprise Architecture Tool

Enterprise architecture (EA) provides comprehensive documentation of systems, applications, people and processes.

Prior research we conducted reveals four key drivers in the decision to adopt a dedicated enterprise architecture tool:

1) Delay Increases Difficulty.

Relying on Visio diagrams and MS Office files, even with a framework like ArchiMate, is a recipe for anarchy. By getting into an enterprise architecture tool early, you minimize the hurdle of moving a lot of unstructured files and disconnected diagrams to a new repository.

Rather than procrastinate in adopting an enterprise architecture tool, choose a reliable, scalable one now to eliminate the administrative hassle of keeping up with disconnected data and diagrams.

2) Are We Too Dependent on Individuals and Keeping Their Files?

Some EA practices collapse when key people change roles or leave the organization. Who last updated our PPT for capability X? Where is the previous version of this Visio diagram?

Why does this application have three names, depending on where I look? Are we following the same model and framework, or is each team member re-inventing the wheel? Is there an easier way to collaborate?

If any of these questions sound familiar, an enterprise architecture tool is the answer. With it, your EA practice will be able to survive inevitable staffing changes, and you won’t be dependent on an individual who might become a bottleneck or a risk. You also can eliminate the scramble to keep files and task lists in sync.

Enterprise architecture tool

3) File-Based EA Is Not Mature, Sustainable or Scalable.

With a tool that can be updated and changed easily, you can effortlessly scale your EA activities by adding new fields, using new diagrams, etc.

For example, you could decide to slowly start using more and more of a standard enterprise architecture framework by activating different aspects of the tool over time – something incredibly difficult to do with mismatched files.

Stop running next to the bike. Get on it instead.

4) Do I Want to Be the EA Librarian or a Well-Regarded Expert?

EA experts are valuable, so their time shouldn’t be spent correcting data errors in spreadsheets, generating PowerPoint files, or manually syncing up your latest Visio file with yet another spreadsheet.

Enterprise architects should be free to focus on revealing hidden relationships, redundancies and impact analyses. In addition, they need to be able to spot opportunities, present roadmaps and advise management about ways to manage innovation.

With an actual enterprise architecture tool, all relevant artifacts and supporting data are accessible in a central repository. And you know what was updated and when. Generate reports on the fly in minutes, not hours or days. Combine information from Kanbans, pivot tables, diagrams and roadmaps, adding your comments and circulating to others for their input.

The Increasing Importance of Collaborative Enterprise Architecture

In addition to its traditional role of IT governance, EA has become increasingly relevant to the wider business. In fact, Gartner says EA is becoming a “form of internal management consulting” because it provides relevant, timely insights management needs to make decisions.

While basic visualization tools and spreadsheets can and have been used, they are limited.

Generic solutions require makeshift collaborative efforts, like sharing PDF files and notes via email. When working remotely, this approach causes significant bottlenecks.

Even before the Covid-19 crisis, this sort of collaboration was becoming more difficult as an increasing number of organizations became decentralized.

As a result, the collaboration required to methodically and continuously measure and maintain models, frameworks and concepts as they evolve was hindered.

That’s why enterprise architecture management is more strategic and impactful when powered by technology to centrally document and visualize EA artifacts for better decision-making, which is crucial right now.

erwin Evolve is purpose-built for strategic planning, what-if scenarios, and as-is/to-be modeling and its associated impacts.

Collaboration features are built into the tool, enabling IT and business stakeholders to create, edit and collaborate on diagrams through a user-friendly interface.

With erwin Evolve, organizations can encourage the wider business to easily participate in EA/BP modeling, planning, design and deployment for a more complete perspective.

It also provides a central repository of key processes, the systems that support them, and the business continuity plans for every working environment so employees have access to the knowledge they need to operate in a clear and defined way under normal circumstances or times of crisis.

You can try erwin Evolve for yourself and keep any content you produce should you decide to buy.

Covid-19 business resources

Categories
erwin Expert Blog

Data Intelligence and Its Role in Combating Covid-19

Data intelligence has a critical role to play in the supercomputing battle against Covid-19.

Last week, The White House announced the launch of the COVID-19 High Performance Computing Consortium, a public-private partnership to provide COVID-19 researchers worldwide with access to the world’s most powerful high performance computing resources that can significantly advance the pace of scientific discovery in the fight to stop the virus.

Rensselaer Polytechnic Institute (RPI) is one of the organizations that has joined the consortium to provide computing resources to help fight the pandemic.

Data Intelligence COVID-19

While leveraging supercomputing power is a tremendous asset in our fight to combat this global pandemic, in order to deliver life-saving insights, you really have to understand what data you have and where it came from. Answering these questions is at the heart of data intelligence.

Managing and Governing Data From Lots of Disparate Sources

Collecting and managing data from many disparate sources for the COVID-19 High Performance Computing Consortium is an effort of staggering scale.

To feed the supercomputers with epidemiological data, information will flow in from many different and heavily regulated data sources, including population health, demographics, outbreak hotspots and economic impacts.

This data will be collected from organizations such as the World Health Organization (WHO), the Centers for Disease Control (CDC), and state and local governments across the globe.

Privately, it will come from hospitals, labs, pharmaceutical companies, doctors and private health insurers. It also will come from HL7 hospital data, claims administration systems, care management systems, the Medicaid Management Information System, etc.

These numerous data types and data sources most definitely weren’t designed to work together. As a result, the data may be compromised, rendering faulty analyses and insights.

Marrying the epidemiological data to the population data will require a tremendous amount of data intelligence about the:

  • Source of the data;
  • Currency of the data;
  • Quality of the data; and
  • How it can be used from an interoperability standpoint.

To do this, the consortium will need the ability to automatically scan and catalog the data sources and apply strict data governance and quality practices.

Unraveling Data Complexities with Metadata Management

Collecting and understanding this vast amount of epidemiological data in the fight against Covid-19 will require data governance oversight and data intelligence to unravel the complexities of the underlying data sources. To be successful and generate quality results, this consortium will need to adhere to strict disciplines around managing the data that comes into the study.

Metadata management will be critical to the process for cataloging data via automated scans. Essentially, metadata management is the administration of data that describes other data, with an emphasis on associations and lineage. It involves establishing policies and processes to ensure information can be integrated, accessed, shared, linked, analyzed and maintained.
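As a toy illustration of the concept (the class and field names are hypothetical, not any erwin API), a metadata catalog holds entries that describe datasets and record the lineage links between them:

```python
from dataclasses import dataclass, field

# Toy metadata catalog: "data about data" with an emphasis on
# associations and lineage. Names are illustrative only.

@dataclass
class DatasetMetadata:
    name: str
    source: str          # originating system, e.g. "WHO", "CDC"
    description: str
    upstream: list = field(default_factory=list)  # datasets this one derives from

class Catalog:
    def __init__(self):
        self._entries = {}

    def register(self, meta: DatasetMetadata) -> None:
        self._entries[meta.name] = meta

    def lineage(self, name: str) -> list:
        """Walk upstream links to return the full ancestry of a dataset."""
        ancestors, stack = [], list(self._entries[name].upstream)
        while stack:
            parent = stack.pop()
            ancestors.append(parent)
            stack.extend(self._entries[parent].upstream)
        return ancestors
```

Even this toy version shows why lineage matters: given any derived dataset, you can trace back to every raw source that contributed to it, which is exactly the question a researcher must answer before trusting a result.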

While supercomputing can be used to process incredible amounts of data, a comprehensive data governance strategy plus technology will enable the consortium to determine master data sets, discover the impact of potential glossary changes, audit and score adherence to rules and data quality, discover risks, and appropriately apply security to data flows, as well as publish data to the right people.

Metadata management delivers the following capabilities, which are essential in building an automated, real-time, high-quality data pipeline:

  • Reference data management for capturing and harmonizing shared reference data domains
  • Data profiling for data assessment, metadata discovery and data validation
  • Data quality management for data validation and assurance
  • Data mapping management to capture the data flows, reconstruct data pipelines, and visualize data lineage
  • Data lineage to support impact analysis
  • Data pipeline automation to help develop and implement new data pipelines
  • Data cataloging to capture object metadata for identified data assets
  • Data discovery facilitated via a shared environment allowing data consumers to understand the use of data from a wide array of sources

Supercomputing will be very powerful in helping fight the COVID-19 virus. However, data scientists need access to quality data harvested from many disparate data sources that weren’t designed to work together to deliver critical insights and actionable intelligence.

Automated metadata harvesting, data cataloging, data mapping and data lineage combined with integrated business glossary management and self-service data discovery can give this important consortium data asset visibility and context, so it has the relevant information needed to help us stop this virus affecting all of us around the globe.

To learn more about metadata management capabilities, download this white paper, Metadata Management: The Hero in Unleashing Enterprise Data’s Value.

COVID-19 Resources

Categories
erwin Expert Blog

Talk Data to Me: Why Employee Data Literacy Matters  

Organizations are flooded with data, so they’re scrambling to find ways to derive meaningful insights from it – and then act on them to improve the bottom line.

In today’s data-driven business, enabling employees to access and understand the data that’s relevant to their roles allows them to use data and put those insights into action. To do this, employees need to “talk data,” aka data literacy.

However, Gartner predicts that this year 50 percent of organizations will lack sufficient AI and data literacy skills to achieve business value. This requires organizations to invest in ensuring their employees are data literate.

Data Literacy & the Rise of the Citizen Analyst

According to Gartner, “data literacy is the ability to read, write and communicate data in context, including an understanding of data sources and constructs, analytical methods and techniques applied — and the ability to describe the use case, application and resulting value.”

Today, your employees are essentially data consumers. There are three technological advances driving this data consumption and, in turn, the ability for employees to leverage this data to deliver business value: 1) exploding data production, 2) scalable big data computation, and 3) the accessibility of advanced analytics, machine learning (ML) and artificial intelligence (AI).

The confluence of these advances has created a fertile environment for data innovation and transformation. As a result, we’re seeing the rise of the “citizen analyst,” who brings business knowledge and subject-matter expertise to data-driven insights.

Examples of citizen analysts include the VP of finance looking for opportunities to optimize top- and bottom-line results for growth and profitability, or the product line manager who wants to understand the enterprise impact of pricing changes.

David Loshin explores this concept in an erwin-sponsored whitepaper, Data Intelligence: Empowering the Citizen Analyst with Democratized Data.

In the whitepaper, he states that the priority of the citizen analyst is straightforward: find the right data to develop reports and analyses that support a larger business case. However, some practical data management issues contribute to a growing need for enterprise data governance, including:

  • Increasing data volumes that challenge the traditional enterprise’s ability to store, manage and ultimately find data
  • Increased data variety, balancing structured, semi-structured and unstructured data, as well as data originating from a widening array of external sources
  • Reducing the IT bottleneck that creates barriers to data accessibility
  • Desire for self-service to free the data consumers from strict predefined data transformations and organizations
  • Hybrid on-premises/cloud environments that complicate data integration and preparation
  • Privacy and data protection laws from many countries that influence the ways data assets may be accessed and used

Data Democratization Requires Data Intelligence

According to Loshin, organizations need to empower their citizen analysts. A fundamental component of data literacy involves data democratization, sharing data assets with a broad set of data consumer communities in a governed way.

The objectives of governed data democratization include:
  • Raising data awareness
  • Improving data literacy
  • Supporting observance of data policies to support regulatory compliance
  • Simplifying data accessibility and use

Effective data democratization requires data intelligence. This is dependent on accumulating, documenting and publishing information about the data assets used across the entire enterprise data landscape.

Here are the steps to effective data intelligence:

  • Reconnaissance: Understanding the data environment and the corresponding business contexts and collecting as much information as possible
  • Surveillance: Monitoring the environment for changes to data sources
  • Logistics and Planning: Mapping the collected information production flows and mapping how data moves across the enterprise
  • Impact Assessment: Using what you have learned to assess how external changes impact the environment
  • Synthesis: Empowering data consumers by providing a holistic perspective associated with specific business terms
  • Sustainability: Embracing automation to always provide up-to-date and correct intelligence
  • Auditability: Providing oversight and being able to explain what you have learned and why
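The surveillance step above can be sketched in a few lines: compare two schema snapshots of a data source taken on successive scans and report what changed. The snapshot format and field names are assumptions for illustration:

```python
# Hypothetical sketch of the "surveillance" step: compare two schema
# snapshots of a data source (field name -> type string) and report
# added, removed and changed fields so downstream impact can be assessed.

def schema_diff(old: dict, new: dict) -> dict:
    """old/new map field name -> type, e.g. {"cases": "int"}."""
    return {
        "added": sorted(set(new) - set(old)),
        "removed": sorted(set(old) - set(new)),
        "changed": sorted(f for f in set(old) & set(new) if old[f] != new[f]),
    }
```

A non-empty diff is the trigger for the impact-assessment step: anything in "removed" or "changed" flags downstream datasets whose lineage passes through this source.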

Data Literacy: The Heart of Data-Driven Innovation

Data literacy is at the heart of successful data-driven innovation and accelerating the realization of actionable data-driven insights.

It can shorten data source discovery and analysis cycles, improve the accuracy of results, reduce reliance on expensive technical resources, and ensure the “right” data is used the first time, reducing deployed errors and the need for expensive rework.

Ultimately, a successful data literacy program will empower your employees to:

  • Better understand and identify the data they require
  • Be more self-sufficient in accessing and preparing the data they require
  • Better articulate the gaps that exist in the data landscape when it comes to fulfilling their data needs
  • Share their knowledge and experience with data with other consumers to contribute to the greater good
  • Collaborate more effectively with their partners in data (management and governance) for greater efficiency and higher quality outcomes

erwin offers a data intelligence software suite combining the capabilities of erwin Data Catalog with erwin Data Literacy to fuel an automated, real-time, high-quality data pipeline.

Then all enterprise stakeholders – data scientists, data stewards, ETL developers, enterprise architects, business analysts, compliance officers, citizen analysts, CDOs and CEOs – can access data relevant to their roles for insights they can put into action.

Click here to request a demo of erwin Data Intelligence.

erwin Data Intelligence

Categories
erwin Expert Blog

Business Process Modeling Use Case: Disaster Recovery

In these challenging times, many of our customers are focused on disaster recovery and business contingency planning.

Disaster recovery is not just an event but an entire process: identifying, preventing and recovering from a loss of technology involving high-availability, high-value assets in which services and data are in serious jeopardy.

Technical teams charged with maintaining and executing these processes require detailed tasks, and business process modeling is integral to their documentation.

erwin’s Evolve software is integral to modeling process flow requirements, but what about the technology side of the equation? What questions need answering regarding planning and executing disaster recovery measures?

  • Consumers and Dependencies: Who will be affected if an asset goes offline and for how long? How will consumer downtime adversely affect finances? What are the effects on systems if a dependent system crashes?
  • Interconnectivity: How are systems within the same ecosystem tied together, and what happens if one fails?
  • Hardware and Software: Which assets are at risk in the event of an outage? How does everything tie together if there is a break point?
  • Responsibility: Who are the technical and business owners of servers and enterprise applications? What are their roles in the case of a disastrous event?
  • Fail-Over: What exactly happens when a device fails? How long before the fail-over occurs, and which assets will activate in its place?

The erwin disaster recovery model answers these questions by capturing and displaying the relevant data. That data is then used to automatically render simple drawings that display either a current or target state for disaster recovery analysis.

Reports can be generated to gather more in-depth information. Other drawings can be rendered to show flow, plus how a break in the flow will affect other systems.

erwin Rapid Response Resource Center (ERRRC)

So what does an erwin disaster recovery model show?

The erwin model uses a layered ecosystem approach. We first define a company’s logical application ecosystems, which house tightly coupled technologies and software.

  • For example, a company may have an erwin ecosystem deployed, which consists of various layers: a presentation layer includes web-based products, an application layer holds the client software, a data layer hosts the databases, etc.
  • Each layer is home to a deployment node, which is home to servers, datastores and software. Each node typically will contain a software component and its hosting server.
  • There are both production nodes and disaster recovery nodes.

Our diagrams and data provide answers such as:

  • Which production servers fail over to which disaster recovery servers
  • What effects an outage will have on dependent systems
  • Downtime metrics, including lost revenue and resources required for restoration
  • Hosting information that provides a detailed view of exactly what software is installed on which servers
  • Technology ownership, including both business and technology owners

The attached diagram is a server-to-server view designed to verify that the correct production to disaster recovery relationships exist (example: “prod fails over to DR”).  It also is used to identify gaps in case there are no DR servers in deployment (example: we filter for “deployed” servers only).
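As a toy illustration of that gap check (the data structures are hypothetical, not the erwin model itself), you can represent servers as records and flag any deployed production server with no valid DR fail-over target:

```python
# Hypothetical sketch: verify that every deployed production server
# has a disaster recovery (DR) fail-over target, and report the gaps.

servers = [
    {"name": "app-prod-01", "role": "production", "status": "deployed", "fails_over_to": "app-dr-01"},
    {"name": "db-prod-01",  "role": "production", "status": "deployed", "fails_over_to": None},
    {"name": "app-dr-01",   "role": "dr",         "status": "deployed", "fails_over_to": None},
]

def dr_gaps(servers: list) -> list:
    """Return deployed production servers lacking a DR fail-over target."""
    dr_names = {s["name"] for s in servers if s["role"] == "dr"}
    return [
        s["name"]
        for s in servers
        if s["role"] == "production"
        and s["status"] == "deployed"          # filter for deployed servers only
        and s["fails_over_to"] not in dr_names  # target missing or not a DR server
    ]
```

Here the check flags the database server, which has no DR counterpart, which is exactly the kind of gap the server-to-server view is designed to surface.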

Other views can be generated to show business and technology owners, software, databases, etc.  They all are tied to the deployment nodes, which can be configured for various views. Detailed reports with server IP addresses, technical owners, software instances, and internal and external dependencies also can be generated.

You can try erwin Evolve for yourself and keep any content you produce should you decide to buy.

Our solution strategists and business process consultants also are available to help answer questions about your disaster recovery process modeling needs.

business process disaster recovery

Categories
erwin Expert Blog Enterprise Architecture

Types of Enterprise Architecture Frameworks: ArchiMate, TOGAF, DoDAF and more

There are a number of different types of enterprise architecture frameworks, tailored to meet specific business and/or industry needs.

What is an Enterprise Architecture Framework?

An enterprise architecture framework is a standardized methodology that organizations use to create, describe and change their enterprise architectures.

Enterprise architecture (EA) itself describes the blueprint and structure of an organization’s systems and assets. It’s needed to make informed changes that help bridge the gap between the enterprise architecture’s current and desired future state.

Just like any building or infrastructure project, EA has different stakeholders and plan views.

You wouldn’t build a house without understanding the building’s architecture, plumbing, electrical and ground plans all within the context of each other.

So enterprise architecture provides the plans for different views of the enterprise, and EA frameworks describe the standard views an organization can expect to see.

What Makes Up An Enterprise Architecture Framework?

The EA discipline views an organization as having complex and intertwined systems. Effective management of such complexity and scale requires tools and approaches that architects can use.

An enterprise architecture framework provides the tools and approaches to abstract this information to a level of detail that is manageable. It helps bring enterprise design tasks into focus and produces valuable architecture documentation.

The components of an enterprise architecture framework provide structured guidance for four main areas:

1. Architecture description – How to document the enterprise as a system from different viewpoints

Each view describes one domain of the architecture; it includes those meta-types and associations that address particular concerns of interest to particular stakeholders; it may take the form of a list, a table, a chart, a diagram or a higher level composite of such.

2. Architecture notation – How to visualize the enterprise in a standard manner

Each view can be represented by a standard depiction that is understandable and communicable to all stakeholders. One such notation is ArchiMate from The Open Group.

3. Design method – The processes that architects follow

Usually, an overarching enterprise architecture process, composed of phases, breaks into lower-level processes composed of finer grained activities.

A process is defined by its objectives, inputs, phases (steps or activities) and outputs. Approaches, techniques, tools, principles, rules and practices may support it. Agile architecture is one set of supporting techniques.

4. Team organization – The guidance on the team structure, governance, skills, experience and training needed

Kanban boards and agile architecture can help provide team structure, governance and best practices.
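The first of these components, architecture description, can be illustrated with a minimal sketch. The class and attribute names below are hypothetical, chosen only to show the core idea: each viewpoint frames the concerns of particular stakeholders, and each view conforms to a viewpoint and collects artifacts such as lists, tables and diagrams.

```python
from dataclasses import dataclass, field

# Hypothetical, minimal model of an architecture description:
# viewpoints frame stakeholder concerns; views conform to viewpoints.

@dataclass
class Viewpoint:
    name: str
    stakeholders: list
    concerns: list

@dataclass
class View:
    viewpoint: Viewpoint
    artifacts: list = field(default_factory=list)  # lists, tables, diagrams

security_vp = Viewpoint(
    name="Security",
    stakeholders=["CISO", "Auditors"],
    concerns=["access control", "data residency"],
)
security_view = View(security_vp, artifacts=["network zoning diagram"])

def views_for(concern, views):
    # A stakeholder asks: which views address this concern?
    return [v for v in views if concern in v.viewpoint.concerns]

print([v.viewpoint.name for v in views_for("data residency", [security_view])])
# → ['Security']
```

The point of the sketch is the separation of concerns: the same underlying enterprise model can be projected into different views for different audiences, which is exactly what an EA framework standardizes.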

Types of Enterprise Architecture Frameworks

There are a number of different types of enterprise architecture frameworks. Here are some of the most popular:

ArchiMate

An Open Group framework that is widely used and includes a notation for visualizing architecture. It may be used in conjunction with TOGAF.

TOGAF

The Open Group Architecture Framework that is widely used and includes an architectural development method and standards for describing various types of architecture.

DODAF

The Department of Defense Architecture Framework that is the standard for defense architectures in the United States.

MODAF

The Ministry of Defence Architecture Framework that is the standard for defence architectures in the United Kingdom.

NAF

The NATO Architecture Framework that is the standard adopted by NATO allies.

FEAF

A Federal Enterprise Architecture Framework issued by the U.S. CIO Council. FEA, the Federal Enterprise Architecture, provides guidance on categorizing and grouping IT investments as issued by the U.S. Office of Management and Budget.

Zachman Framework

A classification scheme for EA artifacts introduced in the 1980s by John Zachman, who is considered the father of EA.

TM Forum

The TeleManagement Forum framework that is the standard reference model for telecommunications companies.

What’s the Best Enterprise Architecture Framework?

Although this might be somewhat of a non-answer, it's the only one that rings true: the best enterprise architecture framework is the one that's most relevant to your organization and what you're trying to achieve.

Each different type of enterprise architecture framework has its particular benefits and focus. For example, there are types of enterprise architecture frameworks best suited for organizations concerned with defense.

Having a good understanding of the different types of EA framework can help an organization determine which framework to apply.

Ultimately, organizations will benefit most from an enterprise architecture management suite (EAMS) that supports multiple EA frameworks. This way, the most relevant enterprise architecture framework is always available.

How to Implement an Enterprise Architecture Framework

So you’ve established you need an enterprise architecture framework and assessed the different types of enterprise architecture frameworks, but how should you go about implementing and managing your chosen framework?

The answer? Using an enterprise architecture management suite (EAMS).

An EAMS facilitates the management of an organization's EA, adding uniformity and structure where many organizations had previously taken an ad hoc approach.

And enterprise architecture tools are becoming increasingly important.

Thanks to the rate of digital transformation and the increasing abundance of data organizations have to manage, organizations need more mature, formal approaches to enterprise architecture.

Organizations seeking to introduce an EAMS should evaluate which frameworks the technology supports.

With erwin Evolve, users can expect a wide range of support for different types of enterprise architecture frameworks among other benefits, such as:

  • Remote collaboration
  • High-performance, scalable and centralized repository
  • Ability to harmonize EA and business process use cases, with a robust, flexible and Web-based modeling and diagramming interface

erwin Evolve was included in Forrester’s “Now Tech: Enterprise Architecture Management Suites for Q1 2020” report.

To see for yourself why erwin excels in the large vendor category, start a free trial of erwin's Enterprise Architecture & Business Process Modeling Software.


Once you submit the trial request form, an erwin representative will be in touch to verify your request and help you start data modeling.


Using Enterprise Architecture, Data Modeling & Data Governance for Rapid Crisis Response

Because of the coronavirus pandemic, organizations across the globe are changing how they operate.

Teams need to urgently respond to everything from massive changes in workforce access and management to what-if planning for a variety of grim scenarios, in addition to building and documenting new applications and providing fast, accurate access to data for smart decision-making.

We want to help, so we built the erwin Rapid Response Resource Center. It provides free access to videos, webinars, courseware, simulations, frameworks and expert strategic advice leveraging the erwin EDGE platform for rapid response transformation during the COVID-19 crisis.

How can the erwin EDGE platform help? Here are a few examples specific to enterprise architecture and business process modeling, data modeling and data governance.

Enterprise Architecture & Business Process Modeling

In the face of rapid change, your organization needs to move fast to support the business in a way that provides comprehensive documentation of systems, applications, people and processes. Even though we face a new reality that requires flexibility, the business still has to run with order, documentation and traceability for compliance purposes.

erwin Evolve is purpose-built for these situations and can be used for strategic planning, what-if scenarios, as-is/to-be modeling and its associated impacts and more.

Agility and remote working require a supporting infrastructure and full documentation. No matter your role in the company, you need access to the processes you support and the details you need to get your job done.

erwin Evolve is an enterprise architecture tool that provides a central repository of key processes, the systems that support them, and the business continuity plans for every working environment. This gives all your employees the access and knowledge to operate in a clear and defined way.

Data Modeling

Companies everywhere are building innovative business applications to support their customers, partners and employees in this time of need. But even with the “need for speed” to market, new applications must be modeled and documented for compliance and transparency. Building in the cloud? No problem.

erwin Data Modeler can help you find, visualize, design, deploy and standardize high-quality enterprise data assets. And it’s intuitive, so you can get new modelers up and running quickly as you scale to address this new business reality.

Data Governance

In times of crisis, knowledge is power and nothing fuels decision-making better than your enterprise data. Your data scientists need access to quality data harvested from every data source in your organization to deliver insights and actionable intelligence.

erwin Data Catalog and erwin Data Literacy work in tandem as the erwin Data Intelligence Suite to support data governance and any other data-driven initiative.

Automated metadata harvesting, data cataloging, data mapping and data lineage combined with integrated business glossary management and self-service data discovery gives data scientists and all stakeholders data asset visibility and context so they have the relevant information they need to do their jobs effectively.

Rapid Crisis Response

We stand ready to help with the tools and intelligence you need to navigate these unusual circumstances.

Click here to request access to the erwin Rapid Response Resource Center (ERRRC).

Questions for our experts? You can email them here.

We’ll continue to add content to the ERRRC over the coming days and weeks. How can erwin best help you through these challenging times? Email me directly and let me know.



Automation Gives DevOps More Horsepower

Almost 70 percent of CEOs say they expect their companies to change their business models in the next three years, and 62 percent report they have management initiatives or transformation programs underway to make their businesses more digital, according to Gartner.

Wouldn’t it be advantageous for these organizations to accelerate these digital transformation efforts? They have that option with automation, shifting DevOps away from dependence on manual processes. Just like with cars, more horsepower in DevOps translates to greater speed.

Doing More with Less

We have clients looking to do more with existing resources, and others looking to reduce full-time employee count on their DevOps teams. With metadata-driven automation, many DevOps processes can be automated, adding more “horsepower” to increase their speed and accuracy. For example:

Auto-documentation of data mappings and lineage: By using data harvesting templates, organizations can eliminate time spent updating and maintaining data mappings, creating them directly from code written by the ETL staff. Such automation can save close to 100 percent of the time usually spent on this type of documentation.

  • Data lineage and impact analysis views for ‘data in motion’ also stay up to date with no additional effort.
  • Human errors are eliminated, leading to higher quality documentation and output.

Automatic updates/changes reflected throughout each release cycle: Updates can be picked up and the ETL job/package generated with 100-percent accuracy. An ETL developer is not required to ‘hand code’ mappings from a spreadsheet – greatly reducing the time spent on the ETL process, and perhaps the total number of resources required to manage that process month over month.

  • ETL skills are still necessary for validation and to compile and execute the automated jobs, but the overall quality of these jobs (machine-generated code) will be much higher, also eliminating churn and rework.

Auto-scanning of source and target data assets with synchronized mappings: This automation eliminates the need for a resource or several resources dealing with manual updates to the design mappings, creating additional time savings and cost reductions associated with data preparation.

  • A change in the source-column header may impact 1,500 design mappings. Managed manually, this process – opening the mapping document, making the change, saving the file with a new version, and placing it into a shared folder for development – could take an analyst several days. But synchronization instantly updates the mappings, correctly versioned, and can be picked up and packaged into an ETL job/package within the same hour. Whether using agile or classic waterfall development, these processes will see exponential improvement and time reduction. 
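The auto-documentation idea above can be sketched in a few lines. This is a toy illustration under stated assumptions, not erwin's actual implementation: it harvests column-level source-to-target mappings directly from an ETL statement, so the lineage documentation is generated from the code itself rather than maintained by hand in a spreadsheet. The table and column names are hypothetical.

```python
import re

# Toy illustration of metadata-driven auto-documentation (not erwin's
# actual implementation): harvest source-to-target mappings directly
# from ETL code instead of maintaining a mapping spreadsheet by hand.

ETL_SQL = """
INSERT INTO dw.customer_dim (cust_id, full_name)
SELECT id, name FROM staging.customers
"""

def harvest_mappings(sql):
    insert = re.search(r"INSERT INTO\s+(\S+)\s*\(([^)]+)\)", sql, re.I)
    select = re.search(r"SELECT\s+(.+?)\s+FROM\s+(\S+)", sql, re.I | re.S)
    target_table = insert.group(1)
    target_cols = [c.strip() for c in insert.group(2).split(",")]
    source_table = select.group(2)
    source_cols = [c.strip() for c in select.group(1).split(",")]
    # One lineage record per column pair: source -> target
    return [
        {"source": f"{source_table}.{s}", "target": f"{target_table}.{t}"}
        for s, t in zip(source_cols, target_cols)
    ]

for m in harvest_mappings(ETL_SQL):
    print(m["source"], "->", m["target"])
# staging.customers.id -> dw.customer_dim.cust_id
# staging.customers.name -> dw.customer_dim.full_name
```

Because the mappings are derived from the code, rerunning the harvest after each release keeps lineage and impact-analysis views current with no manual effort, which is the "100-percent accuracy" claim in practice: the documentation cannot drift from the code it was generated from.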

Data Intelligence: Speed and Quality Without Compromise

Our clients often understand that incredible DevOps improvements are possible, but they fear the “work” it will take to get there.

It really comes down to deciding to embrace change a la automation or continue down the same path. But isn’t the definition of insanity doing the same thing over and over, expecting but never realizing different results?

With traditional means, you may improve speed but sacrifice quality. On the flipside, you may improve quality but sacrifice speed.

However, erwin’s technology shifts this paradigm. You can have both speed and quality.

The erwin Data Intelligence Suite (erwin DI) combines the capabilities of erwin Data Catalog with erwin Data Literacy to fuel an automated, real-time, high-quality data pipeline.

Then all enterprise stakeholders – data scientists, data stewards, ETL developers, enterprise architects, business analysts, compliance officers, CDOs and CEOs – can access data relevant to their roles for insights they can put into action.

It creates the fastest path to value, with an automation framework and metadata connectors configured by our team to deliver the data harvesting and preparation features that make capturing enterprise data assets fast and accurate.

Click here to request a free demo of erwin DI.



Takeaways from Forrester’s Latest Report on Enterprise Architecture Management Suites

Forrester recently released its “Now Tech: Enterprise Architecture Management Suites for Q1 2020” to give organizations an enterprise architecture (EA) playbook.

It also highlights select enterprise architecture management suite (EAMS) vendors based on size and functionality, including erwin.

The report notes six primary EA competencies in which we excel in the large vendor category: modeling, strategy translation, risk management, financial management, insights and change management.

Given our EA expertise, we thought we’d provide our perspective on the report’s key takeaways and how we see technology trends, business innovation and compliance driving companies to use EA in different ways.

Improve Enterprise Architecture with EAMS

To an EA professional, it may seem obvious that tools provide “a holistic view of business demand impact.” Delivery of innovation at speed is critical, but what does that really mean?

Not only should EA be easy to adopt and roll out, but artifacts should also be easy for various stakeholders to visualize quickly and effectively, in the format they need to make decisions rapidly.

For “EA stakeholders to be more productive and effective,” not only is a central repository a necessity but collaboration and a persona-driven approach also are critical to the organization’s adoption of EA.

Just as an ERP system is a fundamental part of business operations, so is an enterprise architecture management suite. It’s a living, breathing tool that feeds into and off of the other physical repositories in the organization, such as ServiceNow for CMDB assets, RSA Archer for risk logs, and Oracle NetSuite and Salesforce for financials.

Being able to connect the enterprise architecture management suites to your business operating model will give you “real-time insights into strategy and operations.”

And you can further prove the value of EA with integrations to your data catalog and business glossary with real-time insights into the organization’s entire data landscape.


Select Enterprise Architecture Vendors Based on Size and Functionality

EA has re-emerged to help solve compliance challenges in banking and finance plus drive innovation with artificial intelligence (AI), machine learning (ML) and robotic automation in pharmaceuticals.

These are large organizations with significant challenges, which require an EA vendor to invest in research and development to innovate across their offerings so EA can become a fundamental part of an organization’s operating model.

We see the need for a “proprietary product platform” in the next generation of EA, so customers can create their own products and services to meet their particular business needs.

They’re looking for product management, dev/ops, security modeling, personas and portfolio management all to be part of an integrated EA platform. In addition, customers want to ensure platforms are secure with sound coding practices and testing.

Determine the Key Enterprise Architecture Capabilities Needed

With more than 20 years of EA experience, erwin has seen a lot of changes in the market, many in the last 24 months. Guess what? This evolution isn’t slowing down.

We’re working with some of the world’s largest companies (and some smaller ones too) as they try to manage change in their respective industries and organizations.

Yesterday’s use case may not serve tomorrow’s use case. An EA solution should be agile enough to meet both short-term and long-term needs.

Use EA Performance Measures to Validate Enterprise Architecture Management Suite Value

EA should provide a strong ROI and help an organization derive value and successful business outcomes.

Additionally, a persona-based approach that involves configuring the user interface and experience to suit stakeholder needs eases the need for training.

Formalized training is important for EA professionals and some stakeholders, and the user interface and experience should reduce the need for a dedicated formal training program for those deriving value out of EA.

Why erwin for Enterprise Architecture?

Whether documenting systems and technology, designing processes and value streams, or managing innovation and change, organizations need flexible but powerful enterprise architecture tools they can rely on for collecting the relevant information for decision-making.

Like constructing a building or even a city – you need a blueprint to understand what goes where, how everything fits together to support the structure, where you have room to grow, and if it will be feasible to knock down any walls if you need to.

Without a picture of what’s what and the interdependencies, your enterprise can’t make changes at speed and scale to serve its needs.

erwin Evolve is a full-featured, configurable set of enterprise architecture and business process modeling and analysis tools.

The combined solution enables organizations to map IT capabilities to the business functions they support and determine how people, processes, data, technologies and applications interact to ensure alignment in achieving enterprise objectives.

See for yourself why we were included in the latest Forrester EAMS report. We're pleased to offer you a free trial of erwin Evolve.