Categories
erwin Expert Blog Data Modeling

How to Do Data Modeling the Right Way

Data modeling supports collaboration among business stakeholders – with different job roles and skills – and coordinates their work with business objectives.

Data resides everywhere in a business, on premises and in private or public clouds. And it exists across these hybrid architectures in different formats: big, unstructured data and traditional, structured business data may physically sit in different places.

What’s desperately needed is a way to understand, in detail, the relationships and interconnections among so many entities and data sets.

Visualizing data from anywhere defined by its context and definition in a central model repository, as well as the rules for governing the use of those data elements, unifies enterprise data management. A single source of data truth helps companies begin to leverage data as a strategic asset.

What, then, should users look for in a data modeling product to support their governance/intelligence requirements in the data-driven enterprise?


Nine Steps to Data Modeling

  1. Provide metadata and schema visualization regardless of where data is stored

Data modeling solutions need to account for metadata and schema visualization to mitigate complexity and increase collaboration and literacy across a broad range of data stakeholders. They should automatically generate data models, providing a simple, graphical display to visualize a wide range of enterprise data sources based on a common repository of standard data assets through a single interface.

  2. Have a process and mechanism to capture, document and integrate business and semantic metadata for data sources

Data models are the best way to view metadata in support of data governance and intelligence, and they can depict the metadata content for a data catalog. A data modeling solution should make it possible for business and semantic metadata to be created to augment physical data for ingestion into a data catalog, which provides a mechanism for IT and business users to make use of the metadata or data structures underpinning source systems.

High-functioning data catalogs will provide a technical view of information flow as well as deeper insights into semantic lineage – that is, how a data asset’s metadata maps to its corresponding business usage.

Data stewards can associate business glossary terms, data element definitions, data models and other semantic details with different mappings, drawing upon visualizations that demonstrate where business terms are in use, how they are mapped to different data elements in different systems and the relationships among these different usage points.
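As a rough illustration, the term-to-element mappings a data steward maintains can be thought of as a small lookup structure. The systems, tables and columns below are hypothetical, not taken from any particular catalog:

```python
# Minimal sketch of a business glossary mapped to physical data elements.
# All names here (systems, tables, columns) are invented for illustration.
glossary = {
    "Customer": {
        "definition": "A party that purchases goods or services.",
        "mapped_to": ["crm.customer.name", "erp.client.client_name"],
    },
    "Order Total": {
        "definition": "The gross value of a sales order.",
        "mapped_to": ["erp.sales_order.total"],
    },
}

def where_used(term):
    """Lineage-style lookup: which physical columns carry this business term?"""
    return glossary[term]["mapped_to"]

def terms_for(column):
    """Reverse lookup: which business terms map to a given physical column?"""
    return [t for t, meta in glossary.items() if column in meta["mapped_to"]]
```

A real data catalog manages this bidirectional mapping at scale, but the underlying idea is the same: every business term knows its physical usage points, and every physical element can be traced back to its business meaning.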

  3. Create database designs from visual models

Time is saved and errors are reduced when visual data models can be used to translate the high-quality data sources that populate them into new relational and non-relational database designs and to standardize, deploy and maintain them.
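To make the idea concrete, here is a minimal sketch of forward engineering: a hypothetical logical entity translated into physical DDL. The entity, attributes and type mappings are illustrative only:

```python
# Hypothetical logical model for a "Customer" entity.
logical = {
    "entity": "Customer",
    "attributes": [
        ("customer_id", "integer", True),   # (name, logical type, is key)
        ("full_name", "string", False),
        ("email", "string", False),
    ],
}

# Illustrative logical-to-physical type mapping for one target DBMS.
TYPE_MAP = {"integer": "INTEGER", "string": "VARCHAR(255)"}

def to_physical_ddl(logical):
    """Forward-engineer the logical entity into a physical CREATE TABLE statement."""
    cols = []
    for name, ltype, is_key in logical["attributes"]:
        col = f"{name} {TYPE_MAP[ltype]}"
        if is_key:
            col += " PRIMARY KEY"
        cols.append(col)
    table = logical["entity"].lower()
    return f"CREATE TABLE {table} (\n  " + ",\n  ".join(cols) + "\n);"

ddl = to_physical_ddl(logical)
```

A modeling tool automates exactly this translation across hundreds of entities and multiple target platforms, which is where the time savings and error reduction come from.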

  4. Reverse engineer databases into data models

Ideally a solution will let users create a logical and physical data model by adroitly extracting information from an existing data source – ERP, CRM or other enterprise application — and choosing the objects to use in the model.

This can be employed to translate the technical formats of the major database platforms into detailed physical entity-relationship models rich in business and semantic metadata that visualize and diagram the complex database objects.

Database code reverse-engineering, integrated development environment connections and model exchange ensure efficiency, effectiveness and consistency in the design, standardization, documentation and deployment of data structures for comprehensive enterprise database management. It also helps if the offline reverse-engineering process is automated so that modelers can focus on other high-value tasks.
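As a simplified sketch of reverse engineering, the snippet below introspects a small SQLite database (standing in for an ERP or CRM source) and extracts a bare-bones physical model. The schema is hypothetical:

```python
import sqlite3

# Hypothetical source database standing in for an ERP/CRM schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE sales_order (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customer(id),
        total REAL
    );
""")

def reverse_engineer(conn):
    """Extract a minimal physical model: {table: [(column, type, notnull, pk)]}."""
    model = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        model[table] = [(c[1], c[2], bool(c[3]), bool(c[5])) for c in cols]
    return model

model = reverse_engineer(conn)
```

Commercial tools do far more – following foreign keys, capturing indexes, views and stored procedures, and laying out the diagram – but the core mechanism is the same: query the catalog, then rebuild the model from what it reports.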

  5. Harness model reusability and design standards

When data modelers can take advantage of intuitive graphical interfaces, they’ll have an easier time viewing data from anywhere in context – with its meaning and relationships – in support of artifact reuse for large-scale data integration, master data management, big data and business intelligence/analytics initiatives.

It’s typically the case that modelers will want to create models containing reusable objects such as modeling templates, entities, tables, domains, automation macros, naming and database standards, formatting options and so on.

The ability to modify the way data types are mapped for specific DBMS data types and to create reusable design standards across the business should be fostered through customizable functionality. Reuse serves to help lower the costs of development and maintenance and ensure data quality for governance requirements.

Additionally, templates should be available to help enable standardization and reuse while accelerating the development and maintenance of models. Standardization and reuse of models across data management environments will be possible when there is support for model exchange.

Consistency and reuse are more efficient when model development and assets are centralized. That makes it easier to publish models across various stakeholders and incorporate comments and changes from them as necessary.

  6. Enable user configuration and point-and-click report interfaces

A key part of data modeling is to create text-based reports for diagrams and metadata in a number of formats – HTML, PDF and CSV. By taking the approach of using point-and-click interfaces, a solution can make it easier to create detailed metadata reports of models and drill down into granular graphical views of reports that are inclusive of object types – tables, UDPs and more.

The process is made even simpler when users can take advantage of out-of-the-box reports that are pertinent to their needs as well as create them for individual models or across multiple models.

When generic ODBC interfaces are included, options grow for querying metadata, regardless of where it is sourced, from a variety of tools and interfaces.
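A minimal sketch of such a text-based report – assuming a tiny in-memory model rather than a real repository – might flatten model metadata into CSV:

```python
import csv
import io

# Hypothetical in-memory model metadata; a real tool would pull this
# from the central model repository.
model = {
    "customer": [("id", "INTEGER"), ("name", "TEXT")],
    "sales_order": [("id", "INTEGER"), ("customer_id", "INTEGER")],
}

def metadata_report_csv(model):
    """Flatten the model into a CSV metadata report: one row per column."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["table", "column", "data_type"])
    for table, columns in sorted(model.items()):
        for name, dtype in columns:
            writer.writerow([table, name, dtype])
    return buf.getvalue()

report = metadata_report_csv(model)
```

The same flattened structure is what makes generic ODBC querying of metadata practical: once model content is tabular, any reporting tool can consume it.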

  7. Support an all-inclusive environment of collaboration

When solutions focus on model management in a centralized repository, they enable modular, bidirectional collaboration among all data generators – human or machine – stewards and consumers across the enterprise.

Data siloes, of course, are the enemies of data governance. They make it difficult to have a clear understanding of where information resides and how data is commonly defined.

It’s far better to centralize and manage access to ordered assets – whether by particular internal staff roles or to business partners granted role-based and read-only access – to maintain security.

Such an approach supports coordinated version control, model change management and conflict resolution and seeds cross-model impact analysis across stakeholders. Modeler productivity and independence can be enhanced, too.

  8. Promote data literacy

Stakeholder collaboration, in fact, depends on and is optimized by data literacy, the key to creating an organization that is fluent in the language of data. Everyone in the enterprise – from data scientists to ETL developers to compliance officers to C-level executives – ought to be assured of having a dynamic view of high-quality data pipelines operating on common and standardized terms.

So, it is critical that solutions focus on making the pipeline data available and discoverable in such a way that it reflects different user roles. When consumers can view data relevant to their roles and understand its definition within the business context in which they operate, their ability to produce accurate, actionable insights and collaborate across the enterprise to enact them for the desired outcomes is enhanced.

Data literacy built on business glossaries – which enable the collaborative definition of enterprise data in business terms, with rules for built-in accountability and workflow – promotes adherence to governance requirements.

  9. Embed data governance constructs within data models

Data governance should be integrated throughout the data modeling process. It manifests in a solution’s ability to adroitly discover and document any data from anywhere for consistency, clarity and artifact reuse across large-scale data integration, master data management, metadata management and big data requirements.

Data catalogs and business glossaries with properly defined data definitions in a controlled central repository are the result of ingesting metadata from data models for business intelligence and analytics initiatives.

You Don’t Know What You’ve Got

Bottom line, without centralized data models and a metadata hub, there is no efficient means to comply with industry regulations and business standards regarding security and privacy; set permissions for access controls; and consolidate information in easy-to-understand reports for business analysts.

Participating in data modeling – classifying the data most important to the business in terms meaningful to the business, and breaking down complex data organization scenarios – supports critical business reporting, intelligence and analytics tasks. That support is clearly needed, as organizations today analyze and use less than 0.5 percent of the information they take in – a huge loss of potential value in the age of data-driven business.

Without illustrative data models, businesses may not even realize that they already have the data needed for a new report, and time is lost and costs increase as data is gathered and interfaces are rebuilt.

To learn more about data modeling and its role in the enterprise, join us for our upcoming webinar, Data Modeling Is Cool, Seriously.



What Is the Analytic Hierarchy Process?

The analytic hierarchy process (AHP) or pairwise comparison is a framework for decision making, rooted in mathematics and psychology. The widely used technique was created by Thomas L. Saaty, a Distinguished University Professor at the University of Pittsburgh, in the 1970s.

Saaty recognized that making decisions is complicated. Knowing how to make the “right” decision is even harder. The AHP is applied as an attempt to introduce structure into the organization and analysis of complex decisions.

The aim of the analytic hierarchy process is not to provide just one unique, correct decision.  Rather, the AHP is a framework for structuring a problem (based around decision-making), relating it to overall goals and providing solution alternatives by allowing a group to map a decision to a particular goal or objective.

When applied correctly, the end result should be a rational decision based on the organization’s goals, and any concept can be ranked and compared within the context of a suitable goal.

The Analytic Hierarchy Process in Practice

The application of AHP begins with users decomposing their decision problem into a hierarchy of sub-criteria, each of which can be analyzed separately.

The concepts within the hierarchy can relate to any aspect of the decision problem; tangible or intangible, carefully measured or roughly estimated, well or poorly understood—anything at all that applies to the decision at hand.

Once the AHP is built, the decision makers (users) systematically evaluate its various elements by comparing them to one another two at a time (pairwise comparison), with respect to their impact on a concept (criterion) above them in the hierarchy.

In making the comparisons, the decision makers typically use their judgments about the elements’ relative meaning and importance.

It is fundamental to the AHP technique that human judgments, and not just the underlying data, can be used in performing the evaluations.

Analytic Hierarchy Process/Pairwise Comparison

It is typical that a question is defined at the criteria level of the hierarchy to guide the decision maker in making the qualitative assessment between the two concepts.

For example, “Which idea helps adoption for driving mobile, cloud and social in our company?”

The AHP framework provides a numerical value for each set of concepts that are part of a pairwise comparison. This technique allows diverse and often incommensurable elements to be compared to one another in a rational and consistent way.

This capability distinguishes the AHP from other decision-making techniques.

Once all concepts have been compared, AHP provides an overall ranking for each concept within the context of the entire problem, for each of the decision alternatives. These numbers represent the alternatives’ relative ability to achieve the goal, so they allow a straightforward consideration of the various courses of action.
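The numerical side of AHP can be sketched in a few lines. The snippet below uses the geometric-mean approximation to derive priority weights from a pairwise comparison matrix on Saaty’s 1-9 scale; the three “ideas” and the judgments are invented for illustration:

```python
import math

def ahp_priorities(matrix):
    """Approximate AHP priority weights with the geometric-mean method:
    take the geometric mean of each row, then normalize to sum to 1."""
    n = len(matrix)
    geo = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo)
    return [g / total for g in geo]

# Pairwise comparisons for three hypothetical ideas A, B, C.
# matrix[i][j] answers: "how much more does idea i help the goal than idea j?"
comparisons = [
    [1,     3,     5],   # A vs A, B, C
    [1 / 3, 1,     3],   # B
    [1 / 5, 1 / 3, 1],   # C
]

weights = ahp_priorities(comparisons)  # one priority per idea, summing to 1
```

Saaty’s full method uses the principal eigenvector of the comparison matrix and a consistency check; the geometric mean is a common and close approximation that keeps the arithmetic transparent.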

Although it can be used by individuals working on straightforward decisions, the analytic hierarchy process (AHP) is definitely more beneficial where communities of people are working on complex problems.

It has unique advantages when important elements of the decision are difficult to quantify or compare, or where communication among team members is impeded by their different specializations, terminologies, or perspectives.

Some possible ways AHP can be applied to decision making:

Choice: The selection of one alternative from a given set of alternatives, usually where there are multiple decision criteria involved.

Ranking: Putting a set of alternatives in order from most to least desirable

Prioritization: Determining the relative merit of members of a set of alternatives, as opposed to selecting a single one or merely ranking them

Resource allocation: Apportioning resources among a set of alternatives

Benchmarking: Comparing the processes in one’s own organization with those of other best-of-breed organizations

Quality management: Dealing with the multidimensional aspects of quality and quality improvement

Conflict resolution: Settling disputes between parties with apparently incompatible goals or positions

The Analytic Hierarchy Process In Enterprise Architecture

AHP’s rationality, inclusion of the “human factor” and collaborative approach to decision making have made the analytic hierarchy process popular, and it is now used around the world in a wide variety of decision making situations.

One example is enterprise architecture. Enterprise architects are encouraged to adopt the technique as a best practice.

Much like Kanban boards, the analytic hierarchy process isn’t an idea that grew out of enterprise architecture, but is one that enterprise architects have adopted to meet their needs.

The three key applications of the analytic hierarchy process in enterprise architecture are:

  • Comparing and ranking ideas against each other to work out the most suitable for a goal
  • Ranking initiatives to see which has the most value
  • Comparing workspaces or solution alternatives for the best option

To learn more about enterprise architecture and the analytic hierarchy process, please see our free eBook on Enterprise Architecture and Innovation Management.



What Is An Enterprise Architecture Kanban Board?

Collaboration is vital to enterprise architecture, and one of the ways to facilitate collaboration is through an enterprise architecture Kanban board. It is an ideal way to manage and track work in progress.

Ultimately, the goal of an enterprise architecture initiative is to provide the organization with a complete view of the enterprise, its assets and functions.

A thorough approach requires input from the wider business, and an enterprise architecture Kanban board can help gather exactly that.

A Brief History of Kanban Boards

Kanban boards are based on the concept of Kanban, a tool to visualize, organize and complete work.

In short, they are the visual storyboard for a process or workflow. They represent the journey and the concepts within that journey. Concepts are represented as cards on the board and may be moved from one stage to another by dragging the card.

The first official use of Kanban can be traced back to Taiichi Ohno’s work at Toyota. He needed a way to quickly communicate to all workers how much work was being done, what state it was in, and how the work was progressing.

His goal was to make information and processes transparent to everybody and not just the management team – starting to see the relevance to enterprise architecture?

Kanban boards show the journey of concepts through your processes, improving visibility for the whole team.

Kanban Board Stages and Limits

A Kanban board is composed of stages. Stages are the placeholders for status of work and contain concepts. Depending upon the Kanban board that you are using, different types of concepts may be present on a stage.

Each stage may be defined with a limit. A limit provides a maximum count for the number of concepts that can exist on a stage at any one time. This prevents a stage from being overloaded with too much work.

The use of stages also allows the Kanban process to be streamlined. Limits at various stages can force concepts along a pipeline and ensure that current work is complete before more work is added to a stage. Usually Kanban boards have administrators who define the stages and limits on a board.
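The stage-and-limit mechanics described above can be sketched as a small data structure; the stage names and cards here are illustrative:

```python
class KanbanBoard:
    """Minimal sketch of a Kanban board: named stages, each with a WIP limit."""

    def __init__(self, stages):
        # stages: {stage_name: wip_limit}; None means unlimited.
        self.limits = dict(stages)
        self.cards = {name: [] for name in stages}

    def _check_limit(self, stage):
        limit = self.limits[stage]
        if limit is not None and len(self.cards[stage]) >= limit:
            raise ValueError(f"WIP limit reached for stage {stage!r}")

    def add(self, stage, card):
        self._check_limit(stage)
        self.cards[stage].append(card)

    def move(self, card, src, dst):
        # Check the destination limit before removing the card, so a
        # blocked move leaves the board unchanged.
        self._check_limit(dst)
        self.cards[src].remove(card)
        self.cards[dst].append(card)

board = KanbanBoard({"Parked": None, "To Do": 5, "Doing": 2, "Done": None})
board.add("To Do", "Model the order entity")
board.move("Model the order entity", "To Do", "Doing")
```

Note how the limit check enforces exactly the behavior described: a full "Doing" stage refuses new cards until current work moves on.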

Kanban Board Examples

Following are some examples of Kanban boards:

Innovation Management: Provides a journey through innovation management

Ideas Roadmap: Provides a roadmap by quarter of where ideas are likely to be released

Ideas Management: Provides a journey of the status of ideas on a simple task board

Requirements Management: Visualizes requirements over a development Kanban

Feature Management: Shows where a set of features are in the development process

Skills Management: Provides a view of what skills training is required to be planned

There is no dedicated “enterprise architecture Kanban board”; Kanban boards are completely customizable, so an “enterprise architecture Kanban board” simply describes a Kanban board used within the context of EA.

Agile Enterprise Architecture & Kanban Boards

Kanban derives from the just-in-time manufacturing methods that revolutionized manufacturing by focusing on what is needed to achieve a particular result and integrating the supply chain to maximize production.

In the agile enterprise architecture approach, the production line is our contextual architecture for a particular change program or project, and our supply chain is the myriad group of SMEs, partners, suppliers and the overall EA model.

It’s by connecting these parts that we can produce accurate, relevant, verified models to support the project teams that will implement the changes within the organization.

The agile EA approach places Kanban at the heart of managing the change context model, focusing clearly on which elements of the architecture are needed in a particular context and connecting directly to the wider stakeholders for collaboration.

When adopting an agile approach to EA, Kanbans provide a great way to move work forward with ease to achieve an end goal or objective. Kanban boards provide a “work-in-progress” view of EA concepts.

They provide an ideal way to track the visibility and status of our work in progress and provide a visual set of stages. Each stage contains a set of ‘cards’ that represent concepts.

Enterprise architecture kanban

For example, a card could be an idea, business capability or an application component. Each stage is identified by a name and a description. There are many different ways of defining a Kanban board. A typical board has the stages Parked, To Do, Doing and Done.

However, most companies tailor the Kanban to suit their own environment and projects.

Kanban boards can also have visual indicators on the concepts (cards) such as colors to indicate status of different attributes.

In the example above, tags are shown as different colors on concepts, where they are tagged or categorized for an organization. This helps identify the status of the cards on the Kanban and helps decide which cards we should focus on.

Agility and Collaboration

But as with enterprise architecture more broadly, the temptation for less mature enterprises to deploy makeshift Kanban boards is strong.

However, such approaches are unscalable, and will quickly have a detrimental effect on an organization’s scope to operate with agility and collaborate.

Organizations that recognize the need for greater agility and collaboration in their enterprise architecture initiatives must employ an enterprise architecture tool that facilitates such an approach, like erwin Evolve.

As well as the ability to create, manage and collaborate on Kanban boards, erwin Evolve provides these core capabilities to help EA programs succeed:

  • Flexible Configuration: On-premise or cloud-hosted with a customizable metamodel and adjustable user interface with user-defined views
  • Enterprise Models: Creation and visualization of complex models for strategy, processes, applications, technologies and data
  • Central Repository: Captures all relevant EA data, supporting thousands of users in central or decentralized environments with access from anywhere and automatic mass updates
  • Collaborative Web Platform: User friendly and business-centric to capture and edit data with surveys and other social features to promote communication between IT and business users
  • Reporting: Diagrams, dashboards, workflows and “what-if” analysis
  • Professional Services: Expertise to help maximize ROI, as well as provide custom development

You can even try erwin Evolve for yourself – for free – and keep any content you produce should you decide to buy.



Modern Data Modeling: The Foundation of Enterprise Data Management and Data Governance

The role of data modeling (DM) has expanded to support enterprise data management, including data governance and intelligence efforts. After all, you can’t manage or govern what you can’t see, much less use it to make smart decisions.

Metadata management is the key to managing and governing your data and drawing intelligence from it. Beyond harvesting and cataloging metadata, it also must be visualized to break down the complexity of how data is organized and what data relationships there are so that meaning is explicit to all stakeholders in the data value chain.

Data Governance and Automation

Data models provide this visualization capability, create additional metadata and standardize the data design across the enterprise.

While modeling has always been the best way to understand complex data sources and automate design standards, modern data modeling goes well beyond these domains to ensure and accelerate the overall success of data governance in any organization.

You can’t overestimate the importance of success as data governance keeps the business in line with privacy mandates such as the General Data Protection Regulation (GDPR). It drives innovation too. Companies that want to advance AI initiatives, for instance, won’t get very far without quality data and well-defined data models.

Why Is Data Modeling the Building Block of Enterprise Data Management?

DM mitigates complexity and increases collaboration and literacy across a broad range of data stakeholders.

  • DM uncovers the connections between disparate data elements.
  • The DM process enables the creation and integration of business and semantic metadata to augment and accelerate data governance and intelligence efforts.
  • DM captures and shares how the business describes and uses data.
  • DM delivers design task automation and enforcement to ensure data integrity.
  • DM builds higher quality data sources with the appropriate structural veracity.
  • DM delivers design task standardization to improve business alignment and simplify integration.
  • DM builds a more agile and governable data architecture.
  • The DM process manages the design and maintenance lifecycle for data sources.
  • DM governs the design and deployment of data across the enterprise.
  • DM documents, standardizes and aligns any type of data no matter where it lives.

Realizing the Data Governance Value from Data Modeling

Modeling becomes the point of true collaboration within an organization because it delivers a visual source of truth for everyone to follow – data management and business professionals – to conform to governance requirements.

Information is readily available within intuitive business glossaries, accessible to user roles according to parameters set by the business. The metadata repository behind these glossaries, populated by information stored in data models, serves up the key terms that are understandable and meaningful to every party in the enterprise.

The stage, then, is equally set for improved data intelligence, because stakeholders now can use, understand and trust relevant data to enhance decision-making across the enterprise.

The enterprise is coming to the point where both business and IT co-own data modeling processes and data models. Business analysts and other power users start to understand data complexities because they can grasp terms and contribute to making the data in their organization accurate and complete, and modeling grows in importance in the eyes of business users.

Bringing data to the business and making it easy to access and understand increases the value of data assets, providing a return on investment and a return on opportunity. But neither would be possible without data modeling providing the backbone for metadata management and proper data governance.

For more information, check out our whitepaper, Drive Business Value and Underpin Data Governance with an Enterprise Data Model.

You also can take erwin DM, the world’s No. 1 data modeling software, for a free spin.



Enterprise Architecture vs. Data Architecture

Although there is some crossover, there are stark differences between data architecture and enterprise architecture (EA). That’s because data architecture is actually an offshoot of enterprise architecture.


See also: The Difference Between Enterprise Architecture and Solutions Architecture 

The Difference Between Data Architecture and Enterprise Architecture

In simple terms, EA provides a holistic, enterprise-wide overview of an organization’s assets and processes, whereas data architecture gets into the nitty-gritty.

The difference between data architecture and enterprise architecture can be represented with the Zachman Framework. The Zachman Framework is an enterprise architecture framework that provides a formalized view of an enterprise across two dimensions.

Data architecture and Enterprise Architecture - The Zachman Framework

The first deals with interrogatives (who, when, why, what, and how – columns). The second deals with reification (the transformation of an abstract idea into concrete implementation – rows/levels).

We can abstract the interrogatives from the columns, into data, process, network, people, timing and motivation perspectives.

So, in terms of the Zachman Framework, the role of an enterprise architect spans the full schema.

A data architect’s scope, by contrast, is mostly limited to the “What” (data) column, and mainly from a system model/logical (level 3) perspective.

The Value of Data Architecture

We’re working in a fast-paced digital economy in which data is extremely valuable. Those that can mine it and extract value from it will be successful, from local organizations to international governments. Without it, progress will halt.

Good data leads to better understanding and ultimately better decision-making. Those organizations that can find ways to extract data and use it to their advantage will be successful.

However, we really need to understand what data we have, what it means, and where it is located. Without this understanding, data can proliferate and become more of a risk to the business than a benefit.

Data architecture is an important discipline for understanding data and includes data, technology and infrastructure design.

Data Architecture and Data Modeling

Data modeling is a key facet of data architecture and is the process of creating a formal model that represents the information used by the organization and its systems.

It helps you understand data assets visually and provides a formal practice to discover, analyze and communicate the assets within an organization.

There are various techniques and sets of terminology involved in data modeling. These include conceptual, logical, physical, hierarchical, knowledge graphs, ontologies, taxonomies, semantic models and many more.

Data modeling has gone through four basic growth periods:

Early data modeling, 1960s-early 2000s.

With the advent of the first pure commercial database systems, both General Electric and IBM came up with graph forms to represent and communicate the intent of their own databases. The evolution of programming languages had a strong influence on the modeling techniques and semantics.

Relational data modeling, 1970s.
Edgar F. Codd published ideas he’d developed in the late 1960s and offered an innovative way of representing a database using tables, columns and relations. The relations were accessible through a query language. Much higher productivity was achieved, and IBM released SQL (Structured Query Language).

Relational model adoption, 1980s. The relational model became very popular, supported by vendors such as IBM, Oracle and Microsoft. Most industries adopted the relational database systems and they became part of the fabric of every industry.

Growth of non-relational models, 2008-present. With increasing data volumes and digitization becoming the norm, organizations needed to store vast quantities of data regardless of format. The birth of NoSQL databases provided the ability to store data that is often non-relational, doesn’t require a rigid schema and is extremely portable. NoSQL databases are well-suited for handling big data.

Data modeling is therefore more necessary than ever before when dealing with non-relational, portable data because we need to know what data we have, where it is, and which systems use it.

The Imperative for Data Architecture and Enterprise Architecture

The location and usage of data are key facets of EA. Without the context of locations, people, applications and technology, data has no true meaning.

For example, an “order” could be viewed one way by the sales department and another way by the accounting department. We have to know if we are dealing with a sales order from an external customer or an order placed by our organization to the supply chain for raw goods and materials.

Enterprise architecture tools can be leveraged to manage such processes.

Organizations using enterprise architecture tools such as erwin Evolve can synergize EA with wider data governance and management efforts. That means a clear and full picture of the whole data lifecycle in context, so that the intersections between data and the organization’s assets are clear.

You can even try erwin Evolve for yourself and keep any content you produce should you decide to buy.



Data Equals Truth, and Truth Matters

In these times of great uncertainty and massive disruption, is your enterprise data helping you drive better business outcomes?

The COVID-19 pandemic has forced organizations to tactically adjust their business models, work practices and revenue projections for the short term. But the real challenges will be accelerating recovery and crisis-proofing your business to mitigate the impact of “the next big thing.”


Assure an Unshakable Data Supply Chain to Drive Better Business Outcomes in Turbulent Times

Consider these high-priority scenarios in which the demand for a sound data infrastructure to drive trusted insights is clear and compelling:

  • Organizations contributing to managing the pandemic (healthcare, government, pharma, etc.)
  • Organizations dealing with major business disruptions in the near and mid-term (hospitality, retail, transportation)
  • Organizations looking to the post-pandemic future for risk-averse business models, new opportunities and/or new approaches to changing markets (virtually every organization that needs to survive and then thrive)

A data-driven approach has never been more valuable to addressing the complex yet foundational questions enterprises must answer.

However, as we have seen with data surrounding the COVID situation itself, incorrect, incomplete or misunderstood data turn these “what-if” exercises into “WTF” solutions. Organizations that have their data management, data governance and data intelligence houses in order are much better positioned to respond to these challenges and thrive in whatever their new normal turns out to be.

Optimizing data management across the enterprise delivers both tactical and strategic benefits that can mitigate short-term impacts and enable the future-proofing required to ensure stability and success. Strong data management practices can have:

  • Financial impact (revenue, cash flow, cost structures, etc.)
  • Business capability impact (remote working, lost productivity, restricted access to business-critical infrastructure, supply chain)
  • Market impact (changing customers, market shifts, emerging opportunities)

Turning Data Into a Source of Truth & Regeneration

How can every CEO address the enterprise data dilemma by transforming data into a source of truth and regeneration for post-COVID operations?

  • Accelerate time to value across the data lifecycle (cut time and costs)
    • Decrease data discovery and preparation times
    • Lower the overhead on data related processes and maintenance
    • Reduce latency in the data supply chain
  • Ensure continuity in data capabilities (reduce losses)
    • Automate data management, data intelligence and data governance practices
    • Create always-available and always-transparent data pipelines
    • Reduce necessity for in-person collaboration
  • Ensure company-wide data compliance (reduce risks)
    • Deliver detailed and reliable impact analysis on demand
    • Establish agile and transparent business data governance (policy, classification, rules, usage)
    • Build visibility and traceability into data assets and supporting processes
  • Demand trusted insights based on data truths (drive innovation and assure veracity)
    • Ensure accurate business context and classification of data
    • Deliver detailed and accurate data lineage on demand
    • Provide visibility into data quality and proven “golden sources”
  • Foster data-driven collaboration (assure agility, visibility and integration of initiatives)
    • Enable navigable data intelligence visualizations and governed feedback loops
    • Govern self-service discovery with rigorous workflows and properly curated data assets
    • Provide visibility across the entire data life cycle (from creation through consumption)

Listen to erwin’s CEO, Adam Famularo, discuss how organizations can use data as a source of truth to navigate current circumstances and determine what’s next on Nasdaq’s Trade Talks.

Data equals truth. #TruthMatters

erwin Rapid Response Resource Center (ERRRC)

Categories
erwin Expert Blog Enterprise Architecture

What Is an Enterprise Architecture Roadmap?

Having an enterprise architecture roadmap is essential in modern business. Without it, understanding the current and desired future state can be difficult.

An enterprise architecture roadmap does not have to be in contrast with efforts to promote an agile enterprise architecture. The focus of innovation and agile EA is to increase the agility of the business for digital transformation.

So it’s essential that an organization understands where it will be at any given period of time, so it’s better prepared to deal with disruption.

To keep pace with the speed of innovation and time to market, organizations need the ability to change quickly – and enterprise architecture roadmaps are a critical tool for seeing how complex a change is and what its impact will be.

Roadmaps in Enterprise Architecture

The idea of a roadmap isn’t exclusive to EA, and enterprise architects are far from the first to adopt them.

That said, the nature of roadmaps significantly complements the way we articulate an organization’s EA. That’s because EA concepts provide a blueprint of the organization, and many aspects of these concepts can be described with a time dimension.

The time dimension can be used to either display a milestone date at which something is expected to happen, or a date range within which something will take place.

Roadmaps as “Views”

In EA, “views” refer to the different ways to represent an enterprise architecture, while keeping a consistent underlying model – similar to how one might represent the data from an Excel table using a pie chart, bar chart or line graph.

The representations can offer different perspectives and/or insight that different parties may find of interest.

This enables enterprise architects to represent the information related to the enterprise architecture, according to stakeholder needs.

So just like a diagram is one view of an architecture model, so is a roadmap – offering a time-based perspective.

A roadmap is usually defined as a view for a specific time period (e.g., one year or the next three months).

Roadmaps may be dynamic, reflecting the state of a concept in real time, or static, showing how a set of concepts looked at a given moment in time.

Many concepts can have multiple time attributes that represent different time properties.

In enterprise architecture, an application component may have a set of lifecycle times associated with it, such as ‘live’ or ‘sunset.’

Time attributes may simply be a single date such as a milestone or be a time period between two dates.

A roadmap view can consist of lanes, each showing a theme or category for a set of concepts. A roadmap may be divided up to show different types of concepts on a single view.

For example, it may be useful to show work package duration and the anticipated idea implementation dates so we can see if our plans are on track.

Time usually flows from left to right on a roadmap diagram.
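To make the ideas above concrete – milestone versus date-range time attributes, and lanes that group concepts – here is a minimal sketch in Python. The item names and lanes echo the example that follows; this is an illustration only, not how an EA tool such as erwin Evolve stores its model.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class RoadmapItem:
    name: str                   # concept name, e.g. an application component
    lane: str                   # theme/category lane, e.g. "live" or "sunset"
    start: date                 # a milestone date, or the start of a range
    end: Optional[date] = None  # None means a single milestone, not a range

def lanes(items):
    """Group roadmap items by lane, ordered left to right by start date."""
    grouped = {}
    for item in sorted(items, key=lambda i: i.start):
        grouped.setdefault(item.lane, []).append(item.name)
    return grouped

# Hypothetical roadmap contents
items = [
    RoadmapItem("CRM", "live", date(2021, 1, 15)),
    RoadmapItem("SafeLogistics", "live", date(2021, 3, 1)),
    RoadmapItem("IT Offshoring", "sunset", date(2021, 2, 1), date(2022, 6, 30)),
]
```

Rendering each lane’s items on a left-to-right time axis would then reproduce the kind of roadmap view discussed next.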

Example of an Enterprise Architecture Roadmap

The image below is an example showing different time properties for application components.

Enterprise Architecture Roadmap Views

As we can see, we have two lanes, live and sunset. These are themes that we may well be interested in.

We are showing on a single roadmap view both application components (CRM, SafeLogistics, SurveyTool) and a business capability (IT Offshoring).

We can show application components with the live date attribute in the live lane.

We can also view the business capability but with a sunset time period. The time period is between two dates.

In this example, we can see how a roadmap can be used to demonstrate date ranges.

The roadmap is an indication that it takes a much longer time to phase out a business capability. The time it takes to phase out a business capability is important to understand for a number of reasons.

For example, it might be important to know which resources and how many (if any) will be tied up during the process. What has to happen to the current enterprise architecture in order for said capability to be phased out efficiently?

“What-If” and Future Scenarios

Roadmaps provide a time-based view of a model. A time-based view of your concepts is essential for ‘what if’ analysis and planning future scenarios.

In different scenarios, the same set of concepts may have a different time visualization based on different time attributes.

Many organizations will have the concept of a lifecycle. It’s important for companies to adopt a set of lifecycle states that have the same meaning across their stakeholders. For example, sunset or end of life but not both.

Because roadmaps are always subject to change and extremely volatile, roadmap views should be generated automatically from the model. There is little reason to create roadmaps without a model; they become extremely difficult to maintain and view in different ways later on.

Organizations using erwin Evolve can take advantage of enterprise architecture roadmaps and views in a collaborative, user-friendly enterprise architecture tool.

You can even try erwin Evolve for yourself and keep any content you produce should you decide to buy.


Categories
erwin Expert Blog Data Governance

What is Data Lineage? Top 5 Benefits of Data Lineage

What is Data Lineage and Why is it Important?

Data lineage is the journey data takes from its creation through its transformations over time. It describes a certain dataset’s origin, movement, characteristics and quality.

Tracing the source of data is an arduous task.

Many large organizations, in their desire to modernize with technology, have acquired several different systems with various data entry points and transformation rules for data as it moves into and across the organization.


These systems range from enterprise service bus (ESB) products and data integration tools to extract, transform and load (ETL) tools, procedural code, application programming interfaces (APIs), file transfer protocol (FTP) processes, and even business intelligence (BI) reports that further aggregate and transform data.

With all these diverse data sources and integrated systems, it is difficult to understand the complicated data web they form, much less produce a simple visual flow. This is why data lineage must be tracked, and why its role is so vital to business operations: it provides the ability to understand where data originates, how it is transformed, and how it moves into, across and outside a given organization.

Data Lineage Use Case: From Tracing COVID-19’s Origins to Data-Driven Business

A lot of theories have emerged about the origin of the coronavirus. A recent University of California San Francisco (UCSF) study conducted a genetic analysis of COVID-19 to determine how the virus was introduced specifically to California’s Bay Area.

It detected at least eight different viral lineages in 29 patients in February and early March, suggesting no regional patient zero but rather multiple independent introductions of the pathogen. The professor who directed the study said, “it’s like sparks entering California from various sources, causing multiple wildfires.”

Much like understanding viral lineage is key to stopping this and other potential pandemics, understanding the origin of data is key to a successful data-driven business.

Top Five Data Lineage Benefits

From my perspective in working with customers of various sizes across multiple industries, I’d like to highlight five data lineage benefits:

1. Business Impact

Data is crucial to every organization’s survival. For that reason, businesses must think about the flow of data across multiple systems that fuel organizational decision-making.

For example, the marketing department uses demographics and customer behavior to forecast sales. The CEO also makes decisions based on performance and growth statistics. An understanding of the data’s origins and history helps answer questions about the data behind Key Performance Indicator (KPI) reports, including:

  • How are the report tables and columns defined in the metadata?
  • Who are the data owners?
  • What are the transformation rules?

Without data lineage, these questions are difficult to answer, so it makes sense for a business to have a clear understanding of where data comes from, who uses it and how it is transformed. Also, when there is a change to the environment, it is valuable to assess the impact on the enterprise application landscape.

In the event of a change in data expectations, data lineage provides a way to determine which downstream applications and processes are affected by the change and helps in planning for application updates.
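As a sketch of that impact analysis, lineage can be treated as a directed graph – each system feeds the systems downstream of it – and walked with a breadth-first search. The system names here are hypothetical, and a real lineage tool would derive this graph from harvested metadata rather than a hand-written dictionary.

```python
from collections import deque

# Hypothetical forward lineage: each key feeds data to the systems it maps to.
lineage = {
    "crm_db":           ["etl_job_1"],
    "erp_db":           ["warehouse.orders"],
    "etl_job_1":        ["warehouse.orders"],
    "warehouse.orders": ["sales_kpi_report", "finance_dashboard"],
}

def downstream(graph, node):
    """Everything fed, directly or indirectly, by `node` (breadth-first)."""
    seen, queue = set(), deque([node])
    while queue:
        for nxt in graph.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen
```

A change to `crm_db` would then surface the ETL job, the warehouse table and both reports as candidates for review before the change ships.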

2. Compliance & Auditability

Business terms and data policies should be implemented through standardized and documented business rules. Compliance with these business rules can be tracked through data lineage, incorporating auditability and validation controls across data transformations and pipelines to generate alerts when there are non-compliant data instances.
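A minimal sketch of that idea: documented business rules become executable checks, and records that fail a check as data moves through a pipeline raise alerts. The rule names and record fields below are hypothetical.

```python
# Hypothetical business rules expressed as executable checks over records.
rules = {
    "order_total_non_negative": lambda rec: rec.get("total", 0) >= 0,
    "customer_id_present":      lambda rec: bool(rec.get("customer_id")),
}

def validate(records, rules):
    """Return (record index, rule name) alerts for every non-compliant instance."""
    alerts = []
    for i, rec in enumerate(records):
        for name, check in rules.items():
            if not check(rec):
                alerts.append((i, name))
    return alerts

batch = [
    {"customer_id": "C-1", "total": 120.0},
    {"customer_id": "",    "total": -5.0},   # violates both rules
]
```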

Regulatory compliance places greater transparency demands on firms when it comes to tracing and auditing data. For example, capital markets trading firms must understand their data’s origins and history to support risk management, data governance and reporting for various regulations such as BCBS 239 and MiFID II.

Also, different organizational stakeholders (customers, employees and auditors) need to be able to understand and trust reported data. Data lineage offers proof that the data provided is reflected accurately.

3. Data Governance

An automated data lineage solution stitches together metadata for understanding and validating data usage, as well as mitigating the associated risks.

It can auto-document end-to-end upstream and downstream data lineage, revealing any changes that have been made, by whom and when.

This data ownership, accountability and traceability is foundational to a sound data governance program.

See: The Benefits of Data Governance

4. Collaboration

Analytics and reporting are data-dependent, making collaboration among different business groups and/or departments crucial.

The visualization of data lineage can help business users spot the inherent connections of data flows and thus provide greater transparency and auditability.

Seeing data pipelines and information flows further supports compliance efforts.

5. Data Quality

Data quality is affected by data’s movement, transformation, interpretation and selection through people, process and technology.

Root-cause analysis is the first step in repairing data quality. Once a data steward determines where a data flaw was introduced, the reason for the error can be determined.

With data lineage and mapping, the data steward can trace the information flow backward to examine the standardizations and transformations applied to confirm whether they were performed correctly.
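In graph terms, that backward trace is an upstream walk over inverted lineage edges. A small sketch, with hypothetical system names:

```python
# Hypothetical forward lineage: each key feeds data to the systems it maps to.
lineage = {
    "crm_db":  ["etl_job"],
    "erp_db":  ["etl_job"],
    "etl_job": ["sales_kpi_report"],
}

def upstream(graph, node):
    """Every source that feeds `node`, directly or indirectly."""
    # Invert the forward edges, then walk backward from the flawed dataset.
    reverse = {}
    for src, targets in graph.items():
        for tgt in targets:
            reverse.setdefault(tgt, []).append(src)
    seen, stack = set(), [node]
    while stack:
        for prev in reverse.get(stack.pop(), []):
            if prev not in seen:
                seen.add(prev)
                stack.append(prev)
    return seen
```

Starting from a flawed report, the steward gets the ETL job and both source databases as the places to inspect for the root cause.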

See Data Lineage in Action

Data lineage tools document the flow of data into and out of an organization’s systems. They capture end-to-end lineage and ensure proper impact analysis can be performed in the event of problems or changes to data assets as they move across pipelines.

The erwin Data Intelligence Suite (erwin DI) automatically generates end-to-end data lineage, down to the column level and between repositories. You can view data flows from source systems to the reporting layers, including intermediate transformation and business logic.

Join us for the next live demo of erwin Data Intelligence (DI) to see metadata-driven, automated data lineage in action.



Categories
erwin Expert Blog Enterprise Architecture

What Is Agile Enterprise Architecture?

Having an agile enterprise architecture (EA) is the difference between whether an organization flourishes or flounders in an increasingly changing business climate.

Over the years, EA has gotten a bad reputation for not providing business value. However, enterprise architecture frameworks and languages like TOGAF and ArchiMate aren’t responsible for this perception. In fact, these standards provide a mechanism for communication and delivery, but the way enterprise architects historically have used them has caused issues.

Today, organizations need to embrace enterprise architecture – and enterprise architecture tools – because of the value it does provide. How else can they respond to business and IT needs or manage change without first documenting what they have, want and need?

Because that’s exactly what EA addresses. It provides business and IT alignment by mapping applications, technologies and data to the value streams and business functions they support.

Essentially, it’s a holistic, top-down view of an organization and its assets that can be used to better inform strategic planning.

But what is an agile enterprise architecture, and what are its advantages?

The Need for Agile Enterprise Architecture

The old adage that anything of any complexity needs to be modeled before it can be changed definitely holds true.

The issue is that enterprise architects tend to model everything down to an excruciating level of detail, often getting lost in the weeds and rarely surfacing for air to see what the rest of the business is doing and realizing what it needs.

This often makes communicating an organization’s enterprise architecture more difficult, adding to the perception of enterprise architects working in an ivory tower.

Just-in-Time vs Just-Enough Enterprise Architecture

Just in time, just enough and agile development and delivery are phrases we’ve all heard. But how do they pertain to EA?

Just-in-time enterprise architecture

Agile is based on the concept of “just in time.” You can see this in many of the agile practices, especially in DevOps. User stories are created when they are needed and not before, and releases happen when there is appropriate value in releasing, not before and not after. Additionally, each iteration has a commitment that is met on time by the EA team.

Just-enough enterprise architecture

EA is missing the answer to the question of “what exactly is getting delivered?” This is where we introduce the phrase “just enough, just in time,” because stakeholders don’t simply want it in time; they also want just enough of it, regardless of what it is.

This is especially important when communicating with non-EA professionals. In the past, enterprise architects have focused on delivering all of the EA assets to stakeholders and demonstrating the technical wizardry required to build the actual architecture.

Agile Enterprise Architecture Best Practices and Techniques

The following techniques and methods can help you provide just-enough EA:

Campaigns

Create a marketing-style campaign to focus on EA initiatives, gathering and describing only what is required to satisfy the goal of the campaign.

Models

At the start of the project, it doesn’t make sense to build a fancy EA that is going to change anyway. Teams should strive to build just enough architecture to support the campaigns in the pipeline.

Collaboration

Agile teams certainly have high levels of collaboration – just enough to help them be successful.

In light of the global pandemic, such collaboration might be more difficult to achieve. But organizations can take advantage of collaborative enterprise architecture tools that support remote working.

Planning

In iteration planning, we don’t look at things outside the iteration. We do just enough planning to make sure we can accomplish our goal for the iteration. Work packages and tasks play a large role in both planning and collaboration.

Agile Enterprise Architecture to Keep Pace with Change

With enterprise architect among the top job roles of 2020, it’s clear organizations recognize the need for the discipline in keeping pace with change.

In modern business, what’s also clear is that maximizing the role’s potential requires an agile approach, or else organizations could fall into the same ivory-tower trappings burdening the discipline in the past.

Organizations can use erwin Evolve to tame complexity, manage change and increase operational efficiency. Its many benefits include:

  • Agility & Efficiency: Achieve faster time to actionable insights and value with integrated views across initiatives to understand and focus on business outcomes.
  • Lower Risks & Costs: Improve performance and profitability with harmonized, optimized and visible processes to enhance training and lower IT costs.
  • Creation & Visualization of Complex Models: Harmonize EA/BP modeling capabilities for greater visibility, control and intelligence in managing any use case.
  • Powerful Analysis: Quickly and easily explore model elements, links and dependencies, plus identify and understand the impact of changes.
  • Documentation & Knowledge Retention: Capture, document and publish information for key business functions to increase employee education and awareness and maintain institutional knowledge, including standard operating procedures.
  • Democratization & Decision-Making: Break down organizational silos and facilitate enterprise collaboration among those both in IT and business roles for more informed decisions that drive successful outcomes.

You can try erwin Evolve for yourself and keep any content you produce should you decide to buy.


Categories
Data Intelligence erwin Expert Blog

What is a Data Catalog?

The easiest way to understand a data catalog is to look at how libraries catalog books and manuals in a hierarchical structure, making it easy for anyone to find exactly what they need.

Similarly, a data catalog enables businesses to create a seamless way for employees to access and consume data and business assets in an organized manner.

By combining physical system catalogs, critical data elements and key performance measures with clearly defined product and sales goals, you can manage the effectiveness of your business and understand which systems are critical to business continuity and to measuring corporate performance.

As illustrated above, a data catalog is essential to business users because it synthesizes all the details about an organization’s data assets across multiple data sources. It organizes them into a simple, easy-to-digest format and then publishes them to data communities for knowledge-sharing and collaboration.

Another foundational purpose of a data catalog is to streamline, organize and process the thousands, if not millions, of an organization’s data assets to help consumers/users search for specific datasets and understand metadata, ownership, data lineage and usage.
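A minimal sketch of what a catalog entry and a keyword search over it might look like. The fields and dataset names are hypothetical illustrations, not erwin DI’s schema; they simply mirror what the text describes: description, ownership, source system and tags.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    name: str
    description: str
    owner: str            # accountable data owner
    source_system: str    # where the dataset physically lives
    tags: list = field(default_factory=list)

def search(catalog, term):
    """Case-insensitive keyword match over name, description and tags."""
    term = term.lower()
    return [e.name for e in catalog
            if term in e.name.lower()
            or term in e.description.lower()
            or any(term in t.lower() for t in e.tags)]

# Hypothetical catalog contents
catalog = [
    CatalogEntry("customer_orders", "Orders placed by external customers",
                 "sales-team", "CRM", ["orders", "sales"]),
    CatalogEntry("supplier_orders", "Raw-goods orders placed with vendors",
                 "procurement", "ERP", ["orders", "supply-chain"]),
]
```

Even this toy version shows the payoff: a consumer searching “orders” immediately sees both datasets, their owners and their source systems, instead of hunting through source databases.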

Look at Amazon and how it handles millions of different products, and yet we, as consumers, can find almost anything about everything very quickly.

Beyond its advanced search capabilities, Amazon also gives detailed information about each product: the seller’s information, shipping times, reviews and a list of companion products. The company measures sales down to the zip-code level across product categories.

Data Catalog Use Case Example: Crisis Proof Your Business

One of the biggest lessons we’re learning from the global COVID-19 pandemic is the importance of data, specifically using a data catalog to comply, collaborate and innovate to crisis-proof our businesses.

As COVID-19 continues to spread, organizations are evaluating and adjusting their operations in terms of both risk management and business continuity. Data is critical to these decisions, such as how to ramp up and support remote employees, re-engineer processes, change entire business models, and adjust supply chains.

Think about the pandemic itself and the numerous global entities involved in identifying it, tracking its trajectory, and providing guidance to governments, healthcare systems and the general public. One example is the European Union (EU) Open Data Portal, which is used to document, catalog and govern EU data related to the pandemic. This information has helped:

  • Provide daily updates
  • Give guidance to governments, health professionals and the public
  • Support the development and approval of treatments and vaccines
  • Help with crisis coordination, including repatriation and humanitarian aid
  • Put border controls in place
  • Assist with supply chain control and consular coordination

So one of the biggest lessons we’re learning from COVID-19 is the need for data collection, management and governance. What’s the best way to organize data and ensure it is supported by business policies and well-defined, governed systems, data elements and performance measures?

According to Gartner, “organizations that offer a curated catalog of internal and external data to diverse users will realize twice the business value from their data and analytics investments than those that do not.”


5 Advantages of Using a Data Catalog for Crisis Preparedness & Business Continuity

The World Bank has been able to provide an array of real-time data, statistical indicators, and other types of data relevant to the coronavirus pandemic through its authoritative data catalogs. The World Bank data catalogs contain datasets, policies, critical data elements and measures useful for analyzing and modeling the virus’s trajectory to help organizations measure its impact.

What can your organization learn from this example when it comes to crisis preparedness and business continuity? By developing and maintaining a data catalog as part of a larger data governance program supported by stakeholders across the organization, you can:

  1. Catalog and Share Information Assets

Catalog critical systems and data elements, plus enable the calculation and evaluation of key performance measures. It’s also important to understand data lineage and be able to analyze the impacts to critical systems and essential business processes if a change occurs.

  2. Clearly Document Data Policies and Rules

Managing a remote workforce creates new challenges and risks. Do employees have remote access to essential systems? Do they know what the company’s work-from-home policies are? Do employees understand how to handle sensitive data? Are they equipped to maintain data security and privacy? A data catalog with self-service access serves up the correct policies and procedures.

  3. Reduce Operational Costs While Accelerating Time to Value

Datasets need to be properly scanned, documented, tagged and annotated with their definitions, ownership, lineage and usage. Automating the cataloging of data assets saves initial development time and streamlines ongoing maintenance and governance. Automating the curation of data assets also accelerates the time to value for analytics/insights reporting and significantly reduces operational costs.

  4. Make Data Accessible & Usable

Open your organization’s data door, making it easier to access, search and understand information assets. A data catalog is the core of data analysis for decision-making, so automating its curation and access with the associated business context will enable stakeholders to spend more time analyzing it for meaningful insights they can put into action.

  5. Ensure Regulatory Compliance

Regulations like the California Consumer Privacy Act (CCPA) and the European Union’s General Data Protection Regulation (GDPR) require organizations to know where all their customer, prospect and employee data resides to ensure its security and privacy.

A fine for noncompliance is the last thing you need on top of everything else your organization is dealing with, so using a data catalog centralizes data management and the associated usage policies and guardrails.
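As a sketch, once datasets carry curated tags in the catalog, locating all personal data for a CCPA or GDPR request reduces to a simple filter. The tag name, datasets and store names below are hypothetical.

```python
# Hypothetical catalog entries with curated tags; "PII" marks personal data.
catalog = [
    {"name": "customer_profiles", "tags": ["PII", "customers"], "store": "crm_db"},
    {"name": "web_clickstream",   "tags": ["behavioral"],       "store": "datalake"},
    {"name": "employee_records",  "tags": ["PII", "hr"],        "store": "hr_db"},
]

def personal_data_locations(catalog):
    """Map every PII-tagged dataset to the system where it resides."""
    return {e["name"]: e["store"] for e in catalog if "PII" in e["tags"]}
```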

See a Data Catalog in Action

The erwin Data Intelligence Suite (erwin DI) provides data catalog and data literacy capabilities with built-in automation so you can accomplish all of the above and more.

Join us for the next live demo of erwin DI.
