
Data-Driven Enterprise Architecture

It’s time to consider data-driven enterprise architecture.

The traditional approach to enterprise architecture – the analysis, design, planning and implementation of IT capabilities for the successful execution of enterprise strategy – seems to be missing something … data.

I’m not saying that enterprise architects only worry about business structure and high-level processes without regard for business needs, information requirements, data processes, and technology changes necessary to execute strategy.

But I am saying that enterprise architects should look at data, technology and strategy as a whole to develop perspectives in line with all enterprise requirements.

That’s right. When it comes to technology and governance strategies, policies and standards, data should be at the center.

Strategic Building Blocks for Data-Driven EA

The typical notion is that enterprise architects and data (and metadata) architects sit in opposite corners. As a result, most frameworks fail to bridge the distance between them.

At Avydium, we believe there’s an important middle ground where different architecture disciplines coexist, including enterprise, solution, application, data, metadata and technical architectures. This is what we call the Mezzo.

Figure 1 – The Avydium Compass™ Mezzo view

So we created a set of methods, frameworks and reference architectures that address all these different disciplines, strata and domains. We treat them as a set of deeply connected components, objects, concepts and principles that guide a holistic approach to vision, strategy, solutioning and implementations for clients.

For us at Avydium, we see the layers of this large and complex architecture continuum as a set of building blocks that need to work together – each supporting the others.

Figure 2 – The Avydium Compass® view of enterprise architecture

For instance, you can’t develop a proper enterprise strategy without implementing a proper governance strategy, and you can’t have an application strategy without first building your data and metadata strategies. And they all need to support your infrastructure and technology strategies.

Where do these layers connect? With governance, which sets its fundamental components firmly on data, metadata and infrastructure. For any enterprise to make the leap from being a reactive organization to a true leader in its space, it must focus on data as the driver of that transformation.


Data-Driven Enterprise Architecture and Cloud Migration

Let’s look at the example of cloud migration, which most enterprises see as a way to shorten development cycles, scale at demand, and reduce operational expenses. But as cloud migrations become more prevalent, we’re seeing more application modernization efforts fail, which should concern all of us in enterprise architecture.

The most common cause of these failures is disregarding data and metadata – omitting these catalogs from the inventory efforts that are part of the application rationalization and portfolio consolidation that must occur before any application is migrated to the cloud.

Thus, key steps of application migration planning, such as data preparation, master data management and reference data management, end up being ignored – with disastrous and costly ramifications: applications fail to work together, data is integrated incorrectly and massively duplicated, and worse.

At Avydium, our data-driven enterprise architecture approach puts data and metadata at the center of cloud migration or any application modernization or digital transformation effort. That’s because we want to understand – and help clients understand – important nuances only visible at the data level, such as compliance and privacy/security risks (remember GDPR?). You want to be proactive in identifying potential issues with sensitive data so you can plan accordingly.

The one piece of advice we give most often to clients contemplating a move to the cloud – or any application modernization effort, for that matter – is to take a long, hard look at their applications and the associated data.

Start by understanding your business requirements and then determine your technology capabilities so you can balance the two. Then look at your data to ensure you understand what you have, where it is, how it is used and by whom. Only with answers to these questions can you plan and execute a successful move to the cloud.
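To make those questions concrete, here is a minimal sketch of such a pre-migration inventory check. The asset fields, names and blocker rules are illustrative assumptions, not part of any particular product or methodology:

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    """One inventoried data asset tied to an application slated for migration."""
    name: str
    location: str                      # e.g., "on-prem Oracle", "file share"
    owner: str = ""                    # accountable business owner
    consumers: list = field(default_factory=list)  # who uses the data, and how
    contains_pii: bool = False         # privacy/GDPR flag

def migration_blockers(asset: DataAsset) -> list:
    """Return the unanswered questions that should block a cloud move."""
    blockers = []
    if not asset.owner:
        blockers.append("no accountable owner")
    if not asset.consumers:
        blockers.append("usage unknown: who consumes this data, and how?")
    if asset.contains_pii:
        blockers.append("PII present: plan compliance controls before migrating")
    return blockers

# An asset with a known owner but unknown usage is not ready to move.
crm_extract = DataAsset(name="crm_extract", location="on-prem Oracle",
                        owner="Sales Ops", contains_pii=True)
print(migration_blockers(crm_extract))
```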


Agile Enterprise Architecture for DevOps Explained …

How do organizations innovate? Taking an idea from concept to delivery requires strategic planning and the ability to execute. In the case of software development, understanding agile enterprise architecture and its relevance to DevOps is also key.

DevOps, the fusion of software development and IT operations, stems from the agile development movement. In more practical terms, it integrates developers and operations teams to improve collaboration and productivity by automating infrastructure, workflows and continuously measuring application performance.

The goal is to balance the competing needs of getting new products into production while maintaining 99.9-percent application uptime (roughly 8.8 hours of allowable downtime per year) for customers in an agile manner.

To understand this increase in complexity, we need to look at how new features and functions are applied to software delivery. The world of mobile apps, middleware and cloud deployment has reduced release cycles to days and weeks, not months – with an emphasis on delivering incremental change.

Previously, a software release would occur every few months with a series of modules that were hopefully still relevant to the business goals.

The shorter, continuous-delivery lifecycle helps organizations:

  • Achieve shorter releases through incremental delivery, enabling faster innovation
  • Be more responsive to business needs through improved collaboration, better quality and more frequent releases
  • Manage the number of applications impacted by a business release by allowing local variants for a global business and continuous delivery within releases

The DevOps approach achieves this by providing an environment that:

  • Minimizes software delivery batch sizes to increase flexibility and enable continuous feedback as every team delivers features to production as they are completed
  • Replaces projects with release trains that minimize batch-waiting time to reduce lead times and waste
  • Shifts from central planning to decentralized execution with a pull philosophy, thus minimizing batch transaction cost to improve efficiency
  • Makes DevOps economically feasible through test virtualization, build automation and automated release management – prioritizing and sequencing batches to maximize business value: selecting the right batches, sequencing them in the right order, guiding the implementation, tracking execution and making planning adjustments

An Approach with an Enterprise Architecture View

So far, we have only looked at the delivery aspects. So how does this approach integrate with an enterprise architecture view?

To understand this, we need to look more closely at the strategic planning lifecycle. The figure below shows how the strategic planning lifecycle supports an ‘ideas-to-delivery’ framework.


Figure 1: The strategic planning lifecycle

You can see the high-level relationship between the strategy and goals of an organization and the projects that deliver the change to meet these goals. Enterprise architecture provides the model to govern the delivery of projects in line with these goals.

However, we must ensure that any model built includes ‘just-enough’ enterprise architecture to produce the right level of analysis for driving change. The agile enterprise architecture model, then, is one that enables enough analysis to plan which projects should be undertaken while ensuring full architectural governance for delivery. The last part is achieved by connecting to the tools used in the agile space.


Figure 2: Detailed view of the strategic planning lifecycle

The Agile Enterprise Architecture Lifecycle

An agile enterprise architecture has its own lifecycle with six stages.

Vision and strategy: Initially, the organization begins by revisiting its corporate vision and strategy. What things will differentiate the organization from its competitors in five years? What value propositions will it offer customers to create that differentiation? The organization can create a series of campaigns or challenges to solicit new ideas and requirements for its vision and strategy.

Value proposition: The ideas and requirements are rationalized into a value proposition that can be examined in more detail.

Resources: The company can look at what resources it needs to have on both the business side and the IT side to deliver the capabilities needed to realize the value propositions. For example, a superior customer experience might demand better internet interactions and new applications, processes, and infrastructure on which to run. Once the needs are understood, they are compared to what the organization already has. The transition planning determines how the gaps will be addressed.

Execution: With the strategy and transition plan in place, enterprise architecture execution begins. The transition plan provides input to project prioritization and planning since those projects aligned with the transition plan are typically prioritized over those that do not align. This determines which projects are funded and entered into or continue to the DevOps stage.

Guidelines: As the solutions are developed, enterprise architecture assets such as models, building blocks, rules, patterns, constraints and guidelines are used and followed. Where the standard assets aren’t suitable for a project, exceptions are requested from the governance board. These exceptions are tracked carefully. Where assets are frequently the subject of exception requests, they must be examined to see if they really are suitable for the organization.

Updates: Periodic updates to the organization’s vision and strategy require a reassessment of the to-be state of the enterprise architecture. This typically results in another look at how the organization will differentiate itself in five years, what value propositions it will offer, the capabilities and resources needed, and so on. If we’re not doing things the way we said we wanted them done, then we must ask if our target architectures are still correct. This helps keep the enterprise architecture current and useful.
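Returning to the Guidelines stage for a moment: as a rough illustration, a governance board could count exception requests per asset and flag the frequently excepted ones for review. The log format and threshold below are assumptions made for the sketch:

```python
from collections import Counter

# Hypothetical log of granted exceptions: (project, EA asset deviated from).
exception_log = [
    ("project-a", "standard-integration-pattern"),
    ("project-b", "standard-integration-pattern"),
    ("project-c", "standard-integration-pattern"),
    ("project-b", "reference-data-model"),
]

def assets_to_reexamine(log, threshold=3):
    """Flag assets whose exception count suggests they may not fit the organization."""
    counts = Counter(asset for _, asset in log)
    return [asset for asset, n in counts.items() if n >= threshold]

print(assets_to_reexamine(exception_log))  # ['standard-integration-pattern']
```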

Enterprise Architecture Tools for DevOps

DevOps can use a number of enterprise architecture solutions. For example, erwin’s enterprise architecture products use open standards to link to other products within the overall lifecycle. This approach integrates agile enterprise architecture with agile development, connecting project delivery with effective governance of the project lifecycle. Even if the software delivery process is agile, goals and associated business needs are linked and can be met.

To achieve this goal, a number of internal processes must be interoperable. This is a significant challenge, but one that can be met by building an internal center of excellence, starting small and building out a working environment.

The erwin EA product line takes a rigorous approach to enterprise architecture to ensure that current and future states are published for a wider audience to consume. The erwin EA repository can be used as an enterprise continuum (in TOGAF terms).

Available as a cloud-based platform or on-premise, erwin EA solutions provide a quick and cost-effective path for launching a collaborative enterprise architecture program. With built-in support for such industry frameworks as ArchiMate® and TOGAF®, erwin enables you to model the enterprise, capture the IT blueprint, generate roadmaps and provide meaningful insights to both technical and business stakeholders.
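One practical benefit of open standards is that exported models remain scriptable without any vendor-specific API. As a minimal sketch, the following reads element names and types from a model saved in the Open Group’s ArchiMate Model Exchange File Format; the namespaces and element layout follow the 3.0 exchange schema and should be verified against your tool’s actual export:

```python
import xml.etree.ElementTree as ET

# Namespaces assumed from the ArchiMate 3.0 Model Exchange File Format.
NS = {
    "a": "http://www.opengroup.org/xsd/archimate/3.0/",
    "xsi": "http://www.w3.org/2001/XMLSchema-instance",
}

def list_elements(path):
    """Yield (type, name) pairs for every element in an exchange-format model."""
    root = ET.parse(path).getroot()
    for el in root.findall(".//a:elements/a:element", NS):
        el_type = el.get(f"{{{NS['xsi']}}}type")  # e.g., "BusinessProcess"
        name_node = el.find("a:name", NS)
        yield el_type, name_node.text if name_node is not None else "(unnamed)"

# Usage (assuming a file exported from a modeling tool):
# for kind, name in list_elements("model.xml"):
#     print(kind, name)
```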

According to Gartner, enterprise architecture is becoming a “form of internal management consulting,” helping define and shape business and operating models, identify risks and opportunities, and then create technology roadmaps. Understanding how vision and strategy impacts enterprise architecture is important – with an overall goal of traceability from our ideas and initiatives all the way through delivery.



The Intersection of Innovation, Enterprise Architecture and Project Delivery

The only thing that’s constant for most organizations is change. Today there’s an unprecedented, rapid rate of change across all industry sectors, even those that have been historically slow to innovate like healthcare and financial services.

In the past, managing ideation to the delivery of innovation was either not done or was relegated within organizational silos, creating a disconnect across the business. This, in turn, resulted in change not being implemented properly or a focus on the wrong type of change.

For an organization to successfully embrace change, innovation, enterprise architecture and project delivery need to be intertwined and traceable.

Enterprise Architecture Helps Bring Ideas to Life

Peter Drucker famously declared “innovate or die.” But where do you start?

Many companies start with campaigns and ideation. They run challenges and solicit ideas both from inside and outside their walls. Ideas are then prioritized and evaluated. Sometimes prototypes are built and tested, but what happens next?

Organizations often turn to the blueprints or roadmaps generated by their enterprise architectures, IT architectures and/or business process architectures for answers. They evaluate how a new idea and its supporting technology, such as service-oriented architecture (SOA) or enterprise resource planning (ERP), fits into the broader architecture. They manage their technology portfolio by looking at their IT infrastructure needs.

A lot of organizations form program management boards to evaluate ideas, initiatives and their costs. In reality, these evaluations are based on lightweight business cases without broader context. They don’t have a comprehensive understanding of what systems, processes and resources they have, what they are being used for, how much they cost, and the effects of regulations.

Projects are delivered and viewed on an individual basis without regard for the bigger picture. Enterprise-, technology- and process-related decisions are made within the flux of change and without access to the real knowledge contained within the organization or in the marketplace. All too often, IT is ultimately in the hot seat of this type of decision-making.


The Five EA Questions IT Needs to Ask

While IT planning should be part of a broader enterprise architecture or market analysis, IT involvement in technology investments is often done close to the end of the strategic planning process and without proper access to enterprise or market data.

The following five questions illustrate the competing demands found within the typical IT environment:

  1. How can we manage the prioritization of business-, architectural- and project-driven initiatives?

Stakeholders place a large number of tactical and strategic requirements on IT. IT is required to offer different technology investment options but is often constrained by a competition for resources.

  2. How do we balance enterprise architecture’s role with IT portfolio management?

Enterprise architecture provides a high-level view of the risks and benefits of a project and its alignment to future goals. It can illustrate the project complexities and the impact of change. Future-state architectures and transition plans can be used to define investment portfolio content. At the same time, portfolio management provides a detailed perspective of development and implementation. Balancing these often-competing viewpoints can be tricky.

  3. How well are application lifecycles being managed?

Application management requires a product/service/asset view over time. Well-managed application lifecycles demand a process of continuous releases, especially when time to market is key. The higher-level view required by portfolio management provides a broader perspective of how all assets work together. Balancing application lifecycle demands against a broader portfolio framework can present an inherent conflict about priorities and a struggle for resources.

  4. How do we manage the numerous and often conflicting governance requirements across the delivery process?

As many organizations move to small-team agile development, coordinating the various application development projects becomes more difficult. Managing the development process using waterfall methods can shorten schedules but also can increase the chance of errors and a disconnect with broader portfolio and enterprise goals.

  5. How do we address different lifecycles and tribes in the organization?

Lifecycles such as innovation management, enterprise architecture, business process management and solution delivery are all necessary but are not harmonized across the enterprise. The connection among these lifecycles is important to the effective delivery of initiatives and understanding the impact of change.

The Business Value of Enterprise Architecture

Enterprise architects are crucial to delivering innovation. However, all too often, enterprise architecture has been executed by IT groups for IT groups and has involved the idea that everything in the current state has to be drawn and modeled before you can start to derive value. This approach has wasted effort, taken too long to show results, and provided insufficient added value to the organization.

Enterprise and data architects who relate what they are doing back to what the C-suite really wants find it easier to get budget and stay relevant. It’s important to remember that enterprise architecture is about smarter decision-making, enabling management to make decisions more quickly because they have access to the right information in the right format at the right time. Of course, focusing on the future state (the desired business outcome) first helps to reduce the scope of current-state analysis and speed up the delivery of value.



Enterprise Architect: A Role That Keeps Evolving

Enterprise architect is a common job title within IT organizations at large companies, but the term lacks any standard definition. Ask someone on the business side what their organization’s enterprise architects do, and you’ll likely get a response like, “They work with IT,” which is true, but also pretty vague.

What the enterprise architects at your organization do depends in large part on how the IT department is organized. At some organizations, enterprise architects work closely with the software applications in a role that some might refer to as a solution architect.

In other organizations, the role of enterprise architect might carry more traditional IT responsibilities around systems management. Other enterprise architects, especially at large organizations, might specialize in exploring how emerging technologies can be tested and later integrated into the business.

Technology research and advisory firm Gartner predicts that enterprise architects will increasingly move into an internal consultancy function within large organizations. While this use of the role is not currently widespread, it’s easy to see how it could make sense for some businesses.

If, for example, a business sets a goal to increase its website sales by 20 percent in one year’s time, meeting that goal will require that different IT and business functions work together.

The business side might tackle changes to the marketing plan and collect data about website visitors and shoppers, but ultimately they will need to collaborate with someone on the technology side to discuss how IT can help reach that goal. And that’s where an enterprise architect in the role of an internal consultant comes into play.

Each business is going to organize its enterprise architects in a way that best serves the organization and helps achieve its goals.

That’s one of the reasons the enterprise architect role has no standard definition. Most teams consist of members with broad IT experience, but each member will often have some role-specific knowledge. One team member might specialize in security, for example, and another in applications.

Like the tech industry in general, the only constant in enterprise architecture is change. Roles and titles will continue to evolve, and as the business and IT sides of the organization continue to come together in the face of digital transformation, how these teams are organized, where they report, and the types of projects they focus on are sure to change over time.

Enterprise integration architect is one role in enterprise architecture that’s on the rise. These architects specialize in integrating the various cloud and on-premise systems that are now common in the hybrid/multi-cloud infrastructures powering the modern enterprise.


For the Enterprise Architect, Business Experience Becomes a Valuable Commodity

Regardless of the specific title, enterprise architects need the ability to work with both their business and IT colleagues to help improve business outcomes. As enterprise architecture roles move closer to the business, those with business knowledge are becoming valuable assets. This is especially true for industry-specific business knowledge.

As industry and government compliance regulations, for example, become part of the business fabric in industries like financial services, healthcare and pharmaceuticals, many enterprise architects are developing specializations in these industries that demonstrate their understanding of the business and IT sides of these regulations.

This is important because compliance permeates every area of many of these organizations, from the enterprise architecture to the business processes, and today it’s all enabled by software. Compliance is another area where Gartner’s internal consultancy model for enterprise architects could benefit a number of organizations. The stakes are simply too high to do anything but guarantee all of your processes are compliant.

Enterprise architect is just one role in the modern organization that increasingly stands with one foot on the business side and the other in IT. As your organization navigates its digital transformation, it’s important to use tools that can do the same.

erwin, Inc.’s industry-leading tools for enterprise architecture and business process modeling use a common repository and role-based views, so business users, IT users and those who straddle the line have the visibility they need. When everyone uses the same tools and the same data, they can speak the same language, collaborate more effectively, and produce better business outcomes. That’s something the whole team can support, regardless of job title.



Enterprise Architecture and Business Process: Common Goals Require Common Tools

For decades now, the professional world has put a great deal of energy into discussing the gulf that exists between business and IT teams within organizations.

They speak different languages, it’s been said, and work toward different goals. Technology plans don’t seem to account for the reality of the business, and business plans don’t account for the capabilities of the technology.

Data governance is one area where business and IT never seemed to establish ownership. Early attempts at data governance treated the idea as a game of volleyball, passing ownership back and forth, with one team responsible for storing data and running applications, and one responsible for using the data for business outcomes.

Today, we see ample evidence this gap is closing at many organizations. Consider:

  • Many technology platforms and software applications now are designed for business users. Business intelligence is a prime example; it’s rare today to see IT pros have to run reports for business users thanks to self-service.
  • Many workers, especially those who came of age surrounded by technology, have a better understanding of both the business and the technology that runs their organizations. Education programs also have evolved to help students develop a background in both business and technology.
  • There’s more portability in roles, with technology minds moving to business leadership positions and vice versa.

“The business domain has always existed in enterprise architecture,” says Manuel Ponchaux, director of product management at erwin, Inc. “However, enterprise architecture has traditionally been an IT function with a prime focus on IT. We are now seeing a shift with a greater focus on business outcomes.”

You can see evidence of this blended focus in some of the titles, like “business architect,” being bestowed upon what was traditionally an IT function. These titles demonstrate an understanding that technology cannot exist in the modern organization for the sake of technology alone – technology needs to support the business and its customers. This concept is also a major focus of the digital transformation wave that’s washing over the business world, and thus we see it reflected in job titles that simply didn’t exist a decade ago.

Job titles aside, enterprise architecture (EA) and business process (BP) teams still have different goals, though at many organizations they now work more closely together than they did in the past. Today, both EA and BP teams recognize that their common goal is better business outcomes. Along the way to that goal, each team conducts a number of similar tasks.

Enterprise Architecture and Business Process: Better Together

One prominent example is modeling. Both enterprise architecture and business process teams do modeling, but they do it in different ways at different levels, and they often use different data and tools. This lack of coordination and communication makes it difficult to develop a true sense of a process from the IT and business sides of the equation. It can also lead to duplication of efforts, which is inefficient and likely to add further confusion when trying to understand outcomes.

Building better business outcomes is like following a plan at a construction site. If different teams are making their own decisions about the materials they’re going to use and following their own blueprints, you’re unlikely to see the building you expect to see at the end of the job.

And that’s essentially what is missing at many organizations: a common repository with role-based views, interfaces and dashboards so that enterprise architecture and business process teams can truly work together using the same blueprint. When enterprise architecture and business process teams use common tools that both aid collaboration and help them understand the elements most important to their roles, the result is greater accuracy, increased efficiency and improved outcomes.

erwin’s enterprise architecture and business process tools provide the common repository and role-based views that help these teams work collaboratively toward their common goals. Finally, enterprise architecture and business process can be on the same page.



What’s Business Process Modeling Got to Do with It? – Choosing A BPM Tool

With business process modeling (BPM) being a key component of data governance, choosing a BPM tool is part of a dilemma many businesses either have or will soon face.

Historically, BPM didn’t necessarily have to be tied to an organization’s data governance initiative.

However, data-driven business and the regulations that oversee it are becoming increasingly extensive, so the need to view data governance as a collective effort – in terms of personnel and the tools that make up the strategy – is becoming harder to ignore.

Data governance also relies on business process modeling and analysis to drive improvement, including identifying business practices susceptible to security, compliance or other risks and adding controls to mitigate exposures.

Choosing a BPM Tool: An Overview

As part of a data governance strategy, a BPM tool aids organizations in visualizing their business processes, system interactions and organizational hierarchies to ensure elements are aligned and core operations are optimized.

The right BPM tool also helps organizations increase productivity, reduce errors and mitigate risks to achieve strategic objectives.

With insights from the BPM tool, you can clarify roles and responsibilities – which in turn should influence an organization’s policies about data ownership and make data lineage easier to manage.

Organizations also can use a BPM tool to identify the staff who function as “unofficial data repositories.” This has both a primary and secondary function:

1. Organizations can document employee processes to ensure vital information isn’t lost should an employee choose to leave.

2. It is easier to identify areas where expertise may need to be bolstered.

Organizations that adopt a BPM tool also enjoy greater process efficiency. This comes from a combination of improving existing processes or designing new process flows, eliminating unnecessary or contradictory steps, and documenting results in a shareable format that is easy to understand – so the whole organization pulls in one direction.


Silo Buster

Understanding the typical use cases for business process modeling is the first step. As with any tech investment, it’s important to understand how the technology will work in the context of your organization/business.

For example, it’s counter-productive to invest in a solution that reduces informational silos only to introduce a new technological silo through a lack of integration.

Ideally, organizations want a BPM tool that works in conjunction with the wider data management platform and data governance initiative – not one that works against them.

That means it must support data imports and integrations from/with external sources, enable in-tool collaboration to reduce departmental silos, and – most crucial – tap into a central metadata repository to ensure consistency across the whole data management and governance initiative.

The lack of a central metadata repository is a far too common thorn in an organization’s side. Without it, they have to juggle multiple versions as changes to the underlying data aren’t automatically updated across the platform.

It also means organizations waste crucial time manually establishing and maintaining data quality, when an automation framework could achieve the same goal instantly, without human error and with greater consistency.

A central metadata repository ensures an organization can acknowledge and get behind a single source of truth. This has a wealth of favorable consequences, including greater cohesion across the organization, better data quality and trust, and faster decision-making with fewer false starts caused by plans based on misleading information.
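As a toy illustration of the single-source-of-truth idea, consider a repository in which each business term has exactly one versioned record that every consumer reads, so an update is visible everywhere at once instead of drifting across copies. All names here are hypothetical:

```python
class MetadataRepository:
    """A toy single-source-of-truth store for business-term metadata."""

    def __init__(self):
        self._terms = {}

    def define(self, term, definition, steward):
        # One canonical, versioned record per term -- no duplicate copies.
        version = self._terms.get(term, {}).get("version", 0) + 1
        self._terms[term] = {"definition": definition, "steward": steward,
                             "version": version}

    def lookup(self, term):
        return self._terms[term]

repo = MetadataRepository()
repo.define("customer", "A party with an executed contract", steward="Sales Ops")
repo.define("customer", "A party with an executed or pending contract",
            steward="Sales Ops")
print(repo.lookup("customer")["version"])  # 2 -- everyone sees the latest definition
```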

Three Key Questions to Ask When Choosing a BPM Tool

Organizations in the market for a BPM tool should also consider the following:

1. Configurability: Does the tool support the ability to model and analyze business processes with links to data, applications and other aspects of your organization? And how easy is this to achieve?

2. Role-based views: Can the tool develop integrated business models for a single source of truth but with different views for different stakeholders based on their needs – making regulatory compliance more manageable? Does it enable cross-functional and enterprise collaboration through discussion threads, surveys and other social features?

3. Business and IT infrastructure interoperability: How well does the tool integrate with other key components of data governance including enterprise architecture, data modeling, data cataloging and data literacy? Can it aid in providing data intelligence to connect all the pieces of the data management and governance lifecycles?

For more information and to find out how such a solution can integrate with your organization and current data management and data governance initiatives, click here.



Data Mapping Tools: What Are the Key Differentiators?

The need for data mapping tools in light of increasing volumes and varieties of data – as well as the velocity at which it must be processed – is growing.

It’s not difficult to see why either. Data mapping tools have always been a key asset for any organization looking to leverage data for insights.

Isolated units of data are essentially meaningless. By linking data and enabling its categorization in relation to other data units, data mapping provides the context vital for actionable information.

Now with the General Data Protection Regulation (GDPR) in effect, data mapping has become even more significant.

The scale of GDPR’s reach has set a new precedent and is the closest we’ve come to a global standard in terms of data regulations. The repercussions can be huge – just ask Google.

Data mapping tools are paramount in charting a path to compliance with this new, near-global standard and avoiding the hefty fines.

Because of GDPR, organizations that may not have fully leveraged data mapping for proactive data-driven initiatives (e.g., analysis) are now adopting data mapping tools with compliance in mind.

Arguably, GDPR’s implementation can be viewed as an opportunity – a catalyst for digital transformation.

Those organizations investing in data mapping tools with compliance as the main driver will definitely want to consider this opportunity and have it influence their decision as to which data mapping tool to adopt.

With that in mind, it’s important to understand the key differentiators in data mapping tools and the associated benefits.


Data Mapping Tools: Automated or Manual?

In terms of differentiators for data mapping tools, perhaps the most distinct is automated data mapping versus data mapping via manual processes.

Data mapping tools that allow for automation mean organizations can benefit from in-depth, quality-assured data mapping, without the significant allocations of resources typically associated with such projects.

Eighty percent of data scientists’ and other data professionals’ time is spent on manual data maintenance – everything from addressing errors and inconsistencies to trying to understand source data or track its lineage. This doesn’t even account for the time lost to missed errors that contribute to inherently flawed endeavors.

Automated data mapping tools render such issues and concerns void. In turn, data professionals’ time can be put to much better, proactive use, rather than being bogged down with reactive housekeeping tasks.


As well as introducing greater efficiency to the data governance process, automated data mapping tools enable mappings for the target repository or reporting structure to be auto-documented from XML.
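The XML schema varies by tool, but the principle can be sketched simply: parse a mapping specification once and generate the documentation from it, rather than maintaining documentation by hand. The spec format below is invented for illustration:

```python
import xml.etree.ElementTree as ET

# A hypothetical mapping specification -- real tools each define their own schema.
SPEC = """<mappings>
  <map source="src.orders.order_dt" target="dw.fact_orders.order_date"
       rule="CAST(order_dt AS DATE)"/>
  <map source="src.orders.amt" target="dw.fact_orders.amount_usd"
       rule="amt / 100.0"/>
</mappings>"""

def document(spec_xml):
    """Turn a mapping spec into human-readable documentation lines."""
    for m in ET.fromstring(spec_xml).iter("map"):
        yield f"{m.get('source')} -> {m.get('target')} via {m.get('rule')}"

for line in document(SPEC):
    print(line)
```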

Additionally, a tool that leverages and draws from a single metadata repository means that mappings are dynamically linked with underlying metadata to render automated lineage views, including full transformation logic in real time.

Therefore, changes (e.g., in the data catalog) will be reflected across data governance domains (business process, enterprise architecture and data modeling) as and when they’re made – no more juggling and maintaining multiple, out-of-date versions.

It also enables automatic impact analysis at the table and column level – even for business/transformation rules.
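Conceptually, column-level impact analysis is a walk over a lineage graph. Here is a minimal sketch, assuming the lineage edges have already been harvested into a dictionary (the column names are invented):

```python
from collections import defaultdict, deque

# Column-level lineage: each column maps to the columns derived from it.
lineage = defaultdict(list)
lineage["src.orders.amt"] = ["staging.orders.amount", "dw.fact_orders.amount_usd"]
lineage["dw.fact_orders.amount_usd"] = ["reports.revenue.total"]

def impact(column):
    """Breadth-first walk of everything downstream of a changed column."""
    seen, queue = set(), deque([column])
    while queue:
        for child in lineage[queue.popleft()]:
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return seen

# Changing src.orders.amt touches the staging table, the warehouse fact
# table and a downstream report (set order may vary when printed).
print(impact("src.orders.amt"))
```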

For organizations looking to free themselves from the burden of juggling multiple versions, siloed business processes and a disconnect between interdepartmental collaboration, this feature is a key benefit to consider.

Data Mapping Tools: Other Differentiators

In light of the aforementioned changes to data regulations, many organizations will need to consider the extent of a data mapping tool’s data lineage capabilities.

The ability to reverse-engineer and document the business logic from your reporting structures for true source-to-report lineage is key because it makes analysis (and the trust in said analysis) easier. And should a data breach occur, affected data/persons can be more quickly identified in accordance with GDPR.

Article 33 of GDPR requires organizations to notify the appropriate supervisory authority “without undue delay and, where feasible, not later than 72 hours” after discovering a breach.
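The deadline arithmetic itself is trivial, as the snippet below illustrates; the hard part is the lineage work that identifies what and who were affected:

```python
from datetime import datetime, timedelta, timezone

def notification_deadline(discovered_at: datetime) -> datetime:
    """Latest notification time under GDPR Article 33's 72-hour window."""
    return discovered_at + timedelta(hours=72)

breach_found = datetime(2019, 3, 1, 14, 30, tzinfo=timezone.utc)
print(notification_deadline(breach_found))  # 2019-03-04 14:30:00+00:00
```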

As stated above, a data governance platform that draws from a single metadata source is even more advantageous here.

Mappings can be synchronized with metadata so that source or target metadata changes can be automatically pushed into the mappings – so your mappings stay up to date with little or no effort.
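In code terms, the idea is that a metadata change is applied once and propagated to every mapping that references it. A toy sketch with invented names:

```python
# In-memory mappings keyed by (source_column, target_column).
mappings = {
    ("src.orders.amt", "dw.fact_orders.amount_usd"): {"rule": "amt / 100.0"},
}

def rename_source_column(old, new):
    """Propagate a source metadata change (a column rename) into every mapping."""
    for (src, tgt) in list(mappings):
        if src == old:
            mappings[(new, tgt)] = mappings.pop((src, tgt))

rename_source_column("src.orders.amt", "src.orders.amount_cents")
print(list(mappings))  # [('src.orders.amount_cents', 'dw.fact_orders.amount_usd')]
```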

The Data Mapping Tool For Data-Driven Businesses

Nobody likes manual documentation. It’s arduous, error-prone and a waste of resources. Quite frankly, it’s dated.

Any organization looking to invest in data mapping, data preparation and/or data cataloging needs to make automation a priority.

With automated data mapping, organizations can achieve “true data intelligence”: the ability to tell the story of how data enters the organization and changes throughout the entire lifecycle, up to the consumption/reporting layer. If you’re working harder than your tool, you have the wrong tool.

The manual tools of old do not have auto documentation capabilities, cannot produce outbound code for multiple ETL or script types, and are a liability in terms of accuracy and GDPR.

Automated data mapping is the only path to true GDPR compliance, and erwin Mapping Manager can get you there in a matter of weeks thanks to our robust reverse-engineering technology. 

Learn more about erwin’s automation framework for data governance here.



Data Governance Stock Check: Using Data Governance to Take Stock of Your Data Assets

For regulatory compliance (e.g., GDPR) and to ensure peak business performance, organizations often bring consultants on board to help take stock of their data assets. This sort of data governance “stock check” is important but can be arduous without the right approach and technology. That’s where data governance comes in …

While most companies hold the lion’s share of operational data within relational databases, it also can live in many other places and various other formats. Therefore, organizations need the ability to manage any data from anywhere, what we call our “any-squared” (Any²) approach to data governance.

Any² first requires an understanding of the ‘3Vs’ of data – volume, variety and velocity – especially in the context of the data lifecycle, as well as knowing how to leverage the key capabilities of data governance – data cataloging, data literacy, business process, enterprise architecture and data modeling – that enable data to be leveraged at different stages for optimum security, quality and value.

Following are two examples that illustrate the data governance stock check, including the Any² approach in action, based on real consulting engagements.


Data Governance “Stock Check” Case 1: The Data Broker

This client trades in information. Therefore, the organization needed to catalog the data it acquires from suppliers, ensure its quality, classify it, and then sell it to customers. The company wanted to assemble the data in a data warehouse and then provide controlled access to it.

The first step in helping this client involved taking stock of its existing data. We set up a portal so data assets could be registered via a form with basic questions, and then a central team received the registrations, reviewed and prioritized them. Entitlement attributes also were set up to identify and profile high-priority assets.

A number of best practices and technology solutions were used to establish the data required for managing the registration and classification of data feeds, as sketched in code after the list:

1. The underlying metadata is harvested, followed by an initial quality check. Then the metadata is classified against a semantic model held in a business glossary.

2. After this classification, a second data quality check is performed based on the best-practice rules associated with the semantic model.

3. Profiled assets are loaded into a historical data store within the warehouse, with data governance tools generating its structure and data movement operations for data loading.

4. We developed a change management program to make all staff aware of the information brokerage portal and the importance of using it. It uses a catalog of data assets, all classified against a semantic model with data quality metrics to easily understand where data assets are located within the data warehouse.

5. Adopting this portal, where data is registered and classified against an ontology, enables the client’s customers to shop for data by asset or by meaning (e.g., “what data do you have on X topic?”) and then drill down through the taxonomy or across an ontology. Next, they raise a request to purchase the desired data.
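Here is the promised sketch of that flow, compressing the harvest, classify and quality-check steps into stubs. The glossary, rules and status values are invented for illustration:

```python
def harvest(feed):
    """Step 1: harvest technical metadata from a registered feed (stubbed)."""
    return {"feed": feed, "columns": ["id", "email"], "classified_as": None}

def classify(md, glossary):
    """Step 1 (cont.): classify metadata against the glossary's semantic model."""
    md["classified_as"] = glossary.get("email", "unclassified")
    return md

def quality_check(md, rules):
    """Steps 1-2: run the best-practice rules tied to the classification."""
    return all(rule(md) for rule in rules)

def register_feed(feed, glossary, rules):
    """Steps 1-3 end to end: harvest, classify, check, then load (load stubbed)."""
    md = classify(harvest(feed), glossary)
    if not quality_check(md, rules):
        raise ValueError(f"{feed}: failed quality rules")
    return {"status": "loaded-to-historical-store", **md}

glossary = {"email": "PersonalContactDetail"}   # toy semantic model
rules = [lambda md: "id" in md["columns"]]      # toy best-practice rule
print(register_feed("supplier-feed-42", glossary, rules)["status"])
```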

This consulting engagement and technology implementation increased data accessibility and capitalization. Information is registered within a central portal through an approved workflow, and then customers shop for data either from a list of physical assets or by information content, with purchase requests also going through an approval workflow. This, among other safeguards, ensures data quality.


Data Governance “Stock Check” Case 2: Tracking Rogue Data

This client, a geographically dispersed organization, stored many of its key processes in Microsoft Excel™ spreadsheets. It was planning to move to Office 365™ and was concerned about regulatory compliance, including GDPR mandates.

Knowing that electronic documents are heavily used in key business processes and distributed across the organization, this company needed to replace risky manual processes with centralized, automated systems.

A key part of the consulting engagement was to understand what data assets were in circulation and how they were used by the organization. Then process chains could be prioritized for automation, with specifications outlined for the systems to replace them.

This organization also adopted a central portal that allowed employees to register data assets. The associated change management program raised awareness of data governance across the organization and the importance of data registration.

For each asset, information was captured and reviewed as part of a workflow. Prioritized assets were then chosen for profiling, enabling metadata to be reverse-engineered before being classified against the business glossary.

Additionally, assets that were part of a process chain were gathered and modeled with enterprise architecture (EA) and business process (BP) modeling tools for impact analysis.

High-level requirements for new systems then could be defined again in the EA/BP tools and prioritized on a project list. For the others, decisions could be made on whether they could safely be placed in the cloud and whether macros would be required.

In this case, the adoption of purpose-built data governance solutions helped build an understanding of the data assets in play, including information about their usage and content to aid in decision-making.

This client then had a good handle on the “what” and “where” in terms of sensitive data stored in their systems. They also better understood how this sensitive data was being used and by whom, helping reduce regulatory risks like those associated with GDPR.

In both scenarios, we cataloged data assets and mapped them to a business glossary, which acts as a classification scheme to help govern and locate data, making it both more accessible and more valuable. This governance framework reduces risk and protects the organization’s most valuable or sensitive data assets.

Focused on producing meaningful business outcomes, the erwin EDGE platform was pivotal in achieving these two clients’ data governance goals – including the infrastructure to undertake a data governance stock check. They used it to create an “enterprise data governance experience” not just for cataloging data and other foundational tasks, but also for a competitive “EDGE” in maximizing the value of their data while reducing data-related risks.

To learn more about the erwin EDGE data governance platform and how it aids in undertaking a data governance stock check, register for our free, 30-minute demonstration here.


Digital Transformation in Municipal Government: The Hidden Force Powering Smart Cities

Smart cities are changing the world.

When you think of real-time, data-driven experiences and modern applications to accomplish tasks faster and easier, your local town or city government probably doesn’t come to mind. But municipal government is starting to embrace digital transformation and therefore data governance.

Municipal government has never been an area in which to look for tech innovation. Perpetually strapped for resources and budget, often relying on legacy applications and infrastructure, and perfectly happy being available during regular business hours (save for emergency responders), most municipal governments lacked the ability and motivation to (as they say in the private sector) digitally transform. Then an odd thing happened – the rest of the world started transforming.

If you shop at a retailer that doesn’t deliver a modern, personalized experience, thousands more retailers are just a click away. But people rarely pick up and move to a new city because the new city offers a better website or mobile app. The motivation for municipal governments to transform simply isn’t there in the same way it is for the private sector.

But there are some things many city residents care about deeply: public safety, quality of life, how their tax dollars are spent, and the ability to do business with their local government when they want, not when it’s convenient for the municipality. And much like the private sector, better decisions around all of these concerns can be made when accurate, timely data is available to help inform them.

Digital transformation in municipal government is taking place in two main areas today: constituent services and the “smart cities” movement.

Digital Transformation in Municipal Government: Being “Smart” About It

The ability to serve constituents easily and efficiently is of increasing importance and a key objective of digital transformation in municipal government. It’s a direct result of the data-driven customer experiences that are increasingly the norm in the private sector.

Residents want the ability to pay their taxes online, report a pothole from their phone, and generally make it easier to interact with their local officials and services. This can be accomplished with dashboards and constituent portals.

The smart cities movement refers to the broad effort of municipal governments to incorporate sensors, data collection and analysis to improve responses to everything from rush-hour traffic to air quality to crime prevention. When the McKinsey Global Institute examined smart technologies that could be deployed by cities, it found that the public sector would be the natural owner of 70 percent of the applications it reviewed.

“Cities are getting in on the data game,” says Danny Sandwell, product marketing director at erwin, Inc. And with information serving as the lifeblood of many of these projects, the effectiveness of the services offered, the return on the investments in hardware and software, and the happiness of the users all depend on timely, accurate and effective data.

These initiatives present a pretty radical departure from the way cities have traditionally been managed.

A constituent portal, for example, requires that users can be identified, authenticated and then have access to information that resides in various departments, such as the tax collector to view and pay taxes, the building department to view a building permit, and the parking authority to manage public parking permits.

For many municipalities, this is uncharted territory.


Data Governance: The Force Powering Smart Cities

The efficiencies offered by smart city technologies only exist if the data leads to a proper allocation of resources.

If you can identify an increase in crime in a certain neighborhood, for example, you can increase police patrols in response. But if the data is inaccurate, those patrols are wasted while other neighborhoods experience a rise in crime.

Now that they’re in the data game, it’s time for municipal governments to understand data governance – the driving force behind any successful data-driven operation. When you have the ability to understand all of the information related to a piece of data, you have more confidence in how it is analyzed, used and protected.

Data governance doesn’t take place at a single application or in the data warehouse. It needs to be woven into the enterprise architecture and processes of the municipality to ensure data is accurate, timely and accessible to those who need it (and inaccessible to everyone else).

When this all comes together – good data, solid analytics and improved services for residents – the results can be quite striking. New efficiencies will make municipal governments better stewards of tax dollars. An improved quality of life can lift tax revenue by making the city more appealing to citizens and developers.

There’s a lot for cities to gain if they get in the data game. And truly smart cities will make sure they play the game right with effective data governance.



Data Preparation and Data Mapping: The Glue Between Data Management and Data Governance to Accelerate Insights and Reduce Risks

Organizations have spent a lot of time and money trying to harmonize data across diverse platforms, including cleansing, uploading metadata, converting code, defining business glossaries, tracking data transformations and so on. But the attempts to standardize data across the entire enterprise haven’t produced the desired results.

A company can’t effectively implement data governance – documenting and applying business rules and processes, analyzing the impact of changes and conducting audits – if it fails at data management.

The problem usually starts by relying on manual integration methods for data preparation and mapping. It’s only when companies take their first stab at manually cataloging and documenting operational systems, processes and the associated data, both at rest and in motion, that they realize how time-consuming the entire data prepping and mapping effort is, and why that work is sure to be compounded by human error and data quality issues.

To effectively promote business transformation, as well as fulfill regulatory and compliance mandates, there can’t be any mishaps.

Obviously, the manual road makes it very challenging to discover and synthesize data that resides in different formats in thousands of unharvested, undocumented databases, applications, ETL processes and procedural code.

Consider the problematic issue of manually mapping source system fields (typically source files or database tables) to target system fields (such as different tables in target data warehouses or data marts).

These source mappings generally are documented across a slew of unwieldy spreadsheets in their “pre-ETL” stage as the input for ETL development and testing. However, the ETL design process often suffers as it evolves because spreadsheet mapping data isn’t updated or is incorrectly updated due to human error. So questions linger about whether transformed data can be trusted.
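A sketch of why this matters: even a simple script can catch spreadsheet mappings that have drifted from the actual source schema, which is exactly the check manual processes tend to skip. The sheet layout and schema below are invented:

```python
import csv, io

# A pre-ETL mapping sheet as it might be exported to CSV (columns illustrative).
SHEET = """source_table,source_column,target_table,target_column
orders,order_dt,fact_orders,order_date
orders,amnt,fact_orders,amount_usd
"""

actual_schema = {"orders": {"order_dt", "amt"}}  # what the source really contains

def stale_rows(sheet_csv, schema):
    """Flag spreadsheet mappings that reference columns the source no longer has."""
    for row in csv.DictReader(io.StringIO(sheet_csv)):
        if row["source_column"] not in schema.get(row["source_table"], set()):
            yield row

for row in stale_rows(SHEET, actual_schema):
    print("stale mapping:", row["source_column"])  # stale mapping: amnt
```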

Data Quality Obstacles

The sad truth is that high-paid knowledge workers like data scientists spend up to 80 percent of their time finding and understanding source data and resolving errors or inconsistencies, rather than analyzing it for real value.

Statistics are similar when looking at major data integration projects, such as data warehousing and master data management with data stewards challenged to identify and document data lineage and sensitive data elements.

So how can businesses produce value from their data when errors are introduced through manual integration processes? How can enterprise stakeholders gain accurate and actionable insights when data can’t be easily and correctly translated into business-friendly terms?

How can organizations master seamless data discovery, movement, transformation and IT and business collaboration to reverse the ratio of preparation to value delivered?

What’s needed to overcome these obstacles is establishing an automated, real-time, high-quality and metadata-driven pipeline useful for everyone, from data scientists to enterprise architects to business analysts to C-level execs.

Doing so will require a hearty data management strategy and technology for automating the timely delivery of quality data that measures up to business demands.

From there, organizations need a sturdy data governance strategy and technology to automatically link and sync well-managed data with core capabilities for auditing, statutory reporting and compliance requirements, as well as to drive business insights.

Creating a High-Quality Data Pipeline

Working hand-in-hand, data management and data governance provide a real-time, accurate picture of the data landscape, including “data at rest” in databases, data lakes and data warehouses and “data in motion” as it is integrated with and used by key applications. And there’s control of that landscape to facilitate insight and collaboration and limit risk.

With a metadata-driven, automated, real-time, high-quality data pipeline, all stakeholders can access data they understand, trust and are authorized to use. At last, they can base strategic decisions on a full inventory of reliable information.

The integration of data management and governance also supports industry needs to fulfill regulatory and compliance mandates, ensuring that audits are not compromised by the inability to discover key data or by failing to tag sensitive data as part of integration processes.

Data-driven insights, agile innovation, business transformation and regulatory compliance are the fruits of data preparation/mapping and enterprise modeling (business process, enterprise architecture and data modeling) that revolves around a data governance hub.

erwin Mapping Manager (MM) combines data management and data governance processes in an automated flow through the integration lifecycle – from data mapping for harmonization and aggregation to generating the physical embodiment of data lineage, that is, the creation, movement and transformation of transactional and operational data.

Its hallmark is a consistent approach to data delivery (business glossaries connect physical metadata to specific business terms and definitions) and metadata management (via data mappings).
