
What’s Business Process Modeling Got to Do with It? – Choosing A BPM Tool

With business process modeling (BPM) being a key component of data governance, choosing a BPM tool is part of a dilemma many businesses either have or will soon face.

Historically, BPM didn’t necessarily have to be tied to an organization’s data governance initiative.

However, data-driven business and the regulations that oversee it are becoming increasingly extensive, so the need to view data governance as a collective effort – in terms of personnel and the tools that make up the strategy – is becoming harder to ignore.

Data governance also relies on business process modeling and analysis to drive improvement, including identifying business practices susceptible to security, compliance or other risks and adding controls to mitigate exposures.

Choosing a BPM Tool: An Overview

As part of a data governance strategy, a BPM tool aids organizations in visualizing their business processes, system interactions and organizational hierarchies to ensure elements are aligned and core operations are optimized.

The right BPM tool also helps organizations increase productivity, reduce errors and mitigate risks to achieve strategic objectives.

With insights from the BPM tool, you can clarify roles and responsibilities – which in turn should influence an organization’s policies about data ownership and make data lineage easier to manage.

Organizations also can use a BPM tool to identify the staff who function as “unofficial data repositories.” This has both a primary and secondary function:

1. Organizations can document employee processes to ensure vital information isn’t lost should an employee choose to leave.

2. It is easier to identify areas where expertise may need to be bolstered.

Organizations that adopt a BPM tool also enjoy greater process efficiency: improving existing processes or designing new process flows, eliminating unnecessary or contradictory steps, and documenting results in a shareable format that is easy to understand so the whole organization pulls in one direction.

Choosing a BPM Tool

Silo Buster

Understanding the typical use cases for business process modeling is the first step. As with any tech investment, it’s important to understand how the technology will work in the context of your organization/business.

For example, it’s counter-productive to invest in a solution that reduces informational silos only to introduce a new technological silo through a lack of integration.

Ideally, organizations want a BPM tool that works in conjunction with the wider data management platform and data governance initiative – not one that works against them.

That means it must support data imports and integrations from/with external sources, enable in-tool collaboration to reduce departmental silos and, most crucially, tap into a central metadata repository to ensure consistency across the whole data management and governance initiative.

The lack of a central metadata repository is a far too common thorn in an organization’s side. Without one, organizations have to juggle multiple versions of the same assets because changes to the underlying data aren’t automatically updated across the platform.

It also means organizations waste crucial time manually managing and maintaining data quality, when an automation framework could achieve the same goal instantaneously, without human error and with greater consistency.

A central metadata repository ensures an organization can acknowledge and get behind a single source of truth. This has a wealth of favorable consequences, including greater cohesion across the organization, better data quality and trust, and faster decision-making with fewer false starts due to plans based on misleading information.
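To illustrate the idea, here’s a minimal Python sketch of how a central repository changes the picture: tools reference shared metadata instead of copying it, so an update made once is seen everywhere. The class and field names are hypothetical, not any particular product’s API.

```python
# Hypothetical sketch: why a central metadata repository avoids version juggling.

class MetadataRepository:
    """Single source of truth: every tool references metadata by ID."""
    def __init__(self):
        self._definitions = {}  # term_id -> current definition

    def publish(self, term_id, definition):
        self._definitions[term_id] = definition

    def lookup(self, term_id):
        return self._definitions[term_id]

class ProcessModel:
    """A BPM artifact that references shared metadata rather than copying it."""
    def __init__(self, repo, term_ids):
        self.repo = repo
        self.term_ids = term_ids

    def describe(self):
        # Always current: no stale local copy to reconcile by hand.
        return {t: self.repo.lookup(t) for t in self.term_ids}

repo = MetadataRepository()
repo.publish("customer", "An individual or org with at least one active account")
model = ProcessModel(repo, ["customer"])
repo.publish("customer", "Any party that has transacted in the last 24 months")
print(model.describe())  # the model reflects the update automatically
```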

Three Key Questions to Ask When Choosing a BPM Tool

Organizations in the market for a BPM tool should also consider the following:

1. Configurability: Does the tool support the ability to model and analyze business processes with links to data, applications and other aspects of your organization? And how easy is this to achieve?

2. Role-based views: Can the tool develop integrated business models for a single source of truth but with different views for different stakeholders based on their needs – making regulatory compliance more manageable? Does it enable cross-functional and enterprise collaboration through discussion threads, surveys and other social features?

3. Business and IT infrastructure interoperability: How well does the tool integrate with other key components of data governance including enterprise architecture, data modeling, data cataloging and data literacy? Can it aid in providing data intelligence to connect all the pieces of the data management and governance lifecycles?

For more information and to find out how such a solution can integrate with your organization and current data management and data governance initiatives, click here.



Data Mapping Tools: What Are the Key Differentiators

The need for data mapping tools in light of increasing volumes and varieties of data – as well as the velocity at which it must be processed – is growing.

It’s not difficult to see why either. Data mapping tools have always been a key asset for any organization looking to leverage data for insights.

Isolated units of data are essentially meaningless. By linking data and enabling its categorization in relation to other data units, data mapping provides the context vital for actionable information.

Now with the General Data Protection Regulation (GDPR) in effect, data mapping has become even more significant.

The scale of GDPR’s reach has set a new precedent and is the closest we’ve come to a global standard in terms of data regulations. The repercussions can be huge – just ask Google.

Data mapping tools are paramount in charting a path to compliance with this new, near-global standard and avoiding hefty fines.

Because of GDPR, organizations that may not have fully leveraged data mapping for proactive data-driven initiatives (e.g., analysis) are now adopting data mapping tools with compliance in mind.

Arguably, GDPR’s implementation can be viewed as an opportunity – a catalyst for digital transformation.

Organizations investing in data mapping tools with compliance as the main driver will want to consider this opportunity and let it influence their decision about which data mapping tool to adopt.

With that in mind, it’s important to understand the key differentiators in data mapping tools and the associated benefits.


Data Mapping Tools: Automated or Manual?

In terms of differentiators for data mapping tools, perhaps the most distinct is automated data mapping versus data mapping via manual processes.

Data mapping tools that allow for automation mean organizations can benefit from in-depth, quality-assured data mapping, without the significant allocations of resources typically associated with such projects.

Up to 80 percent of data scientists’ and other data professionals’ time is spent on manual data maintenance – everything from addressing errors and inconsistencies to trying to understand source data or track its lineage. This doesn’t even account for the time lost to missed errors that contribute to inherently flawed endeavors.

Automated data mapping tools render such issues and concerns void. In turn, data professionals’ time can be put to much better, proactive use rather than being bogged down with reactive housekeeping tasks.


As well as introducing greater efficiency to the data governance process, automated data mapping tools can auto-document data from XML, building the mappings for the target repository or reporting structure.

Additionally, a tool that leverages and draws from a single metadata repository means that mappings are dynamically linked with underlying metadata to render automated lineage views, including full transformation logic in real time.

Therefore, changes (e.g., in the data catalog) will be reflected across data governance domains (business process, enterprise architecture and data modeling) as and when they’re made – no more juggling and maintaining multiple, out-of-date versions.

It also enables automatic impact analysis at the table and column level – even for business/transformation rules.

For organizations looking to free themselves from the burden of juggling multiple versions, siloed business processes and a disconnect between interdepartmental collaboration, this feature is a key benefit to consider.
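For a sense of the mechanics, here’s a simplified Python sketch of column-level impact analysis: when mappings live as metadata, finding everything affected by a change is a walk over a graph. The mappings and naming convention below are invented for illustration.

```python
# Illustrative sketch (not a vendor API): column-level impact analysis as a
# graph walk over source-to-target mappings held in a metadata repository.
from collections import defaultdict, deque

# Each mapping links a source column to a target column (transformations omitted).
mappings = [
    ("crm.customers.email", "staging.contacts.email"),
    ("staging.contacts.email", "dw.dim_customer.email_address"),
    ("dw.dim_customer.email_address", "reports.churn_model.email"),
]

downstream = defaultdict(list)
for src, tgt in mappings:
    downstream[src].append(tgt)

def impact_of(column):
    """Return every column affected, directly or transitively, by a change."""
    seen, queue = set(), deque([column])
    while queue:
        for tgt in downstream[queue.popleft()]:
            if tgt not in seen:
                seen.add(tgt)
                queue.append(tgt)
    return seen

print(impact_of("crm.customers.email"))
# -> the staging, warehouse and reporting columns that inherit the change
```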

Data Mapping Tools: Other Differentiators

In light of the aforementioned changes to data regulations, many organizations will need to consider the extent of a data mapping tool’s data lineage capabilities.

The ability to reverse-engineer and document the business logic from your reporting structures for true source-to-report lineage is key because it makes analysis (and the trust in said analysis) easier. And should a data breach occur, affected data/persons can be more quickly identified in accordance with GDPR.

Article 33 of GDPR requires organizations to notify the appropriate supervisory authority “without undue delay and, where feasible, not later than 72 hours” after discovering a breach.

As stated above, a data governance platform that draws from a single metadata source is even more advantageous here.

Mappings can be synchronized with metadata so that source or target metadata changes can be automatically pushed into the mappings – so your mappings stay up to date with little or no effort.
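In sketch form, that synchronization amounts to propagating a metadata change into every mapping that references it. The structures below are hypothetical, purely to show the principle.

```python
# Minimal sketch: push a harvested metadata change (a column rename) into
# every mapping that references the old name, so mappings stay current.

mappings = [
    {"source": "src.orders.order_ts", "target": "dw.fact_orders.order_date"},
    {"source": "src.orders.amount", "target": "dw.fact_orders.amount"},
]

def push_rename(old, new, mappings):
    """Propagate a source-column rename from the metadata layer into mappings."""
    for m in mappings:
        if m["source"] == old:
            m["source"] = new
    return mappings

push_rename("src.orders.order_ts", "src.orders.ordered_at", mappings)
print(mappings[0])  # source now reads src.orders.ordered_at
```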

The Data Mapping Tool For Data-Driven Businesses

Nobody likes manual documentation. It’s arduous, error-prone and a waste of resources. Quite frankly, it’s dated.

Any organization looking to invest in data mapping, data preparation and/or data cataloging needs to make automation a priority.

With automated data mapping, organizations can achieve “true data intelligence”: the ability to tell the story of how data enters the organization and changes throughout the entire lifecycle, up to the consumption/reporting layer. If you’re working harder than your tool, you have the wrong tool.

The manual tools of old do not have auto documentation capabilities, cannot produce outbound code for multiple ETL or script types, and are a liability in terms of accuracy and GDPR.

Automated data mapping is the only path to true GDPR compliance, and erwin Mapping Manager can get you there in a matter of weeks thanks to our robust reverse-engineering technology. 

Learn more about erwin’s automation framework for data governance here.



Data Governance Stock Check: Using Data Governance to Take Stock of Your Data Assets

For regulatory compliance (e.g., GDPR) and to ensure peak business performance, organizations often bring consultants on board to help take stock of their data assets. This sort of data governance “stock check” is important but can be arduous without the right approach and technology. That’s where data governance comes in …

While most companies hold the lion’s share of operational data within relational databases, it also can live in many other places and various other formats. Therefore, organizations need the ability to manage any data from anywhere, what we call our “any-squared” (Any2) approach to data governance.

Any2 first requires an understanding of the ‘3Vs’ of data – volume, variety and velocity – especially in the context of the data lifecycle. It also requires knowing how to leverage the key capabilities of data governance – data cataloging, data literacy, business process, enterprise architecture and data modeling – that enable data to be leveraged at different stages for optimum security, quality and value.

Following are two examples that illustrate the data governance stock check, including the Any2 approach in action, based on real consulting engagements.


Data Governance “Stock Check” Case 1: The Data Broker

This client trades in information. Therefore, the organization needed to catalog the data it acquires from suppliers, ensure its quality, classify it, and then sell it to customers. The company wanted to assemble the data in a data warehouse and then provide controlled access to it.

The first step in helping this client involved taking stock of its existing data. We set up a portal so data assets could be registered via a form with basic questions, and then a central team received the registrations, reviewed and prioritized them. Entitlement attributes also were set up to identify and profile high-priority assets.

A number of best practices and technology solutions were used to establish the data required for managing the registration and classification of data feeds (a simplified code sketch follows the list):

1. The underlying metadata is harvested followed by an initial quality check. Then the metadata is classified against a semantic model held in a business glossary.

2. After this classification, a second data quality check is performed based on the best-practice rules associated with the semantic model.

3. Profiled assets are loaded into a historical data store within the warehouse, with data governance tools generating its structure and data movement operations for data loading.

4. We developed a change management program to make all staff aware of the information brokerage portal and the importance of using it. The portal uses a catalog of data assets, all classified against a semantic model with data quality metrics, making it easy to understand where data assets are located within the data warehouse.

5. Adopting this portal, where data is registered and classified against an ontology, enables the client’s customers to shop for data by asset or by meaning (e.g., “what data do you have on X topic?”) and then drill down through the taxonomy or across an ontology. Next, they raise a request to purchase the desired data.
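To make the flow concrete, the sketch below compresses steps 1 through 3 into a few hypothetical Python functions. The rules and glossary entries are invented for illustration and are far simpler than a real engagement’s.

```python
# Simplified sketch of the registration and classification flow described above.
# Function names, rules and glossary entries are hypothetical.

def harvest_metadata(feed):
    """Step 1a: pull column names and types from a supplier feed."""
    return {"columns": feed["columns"], "source": feed["name"]}

def initial_quality_check(metadata):
    """Step 1b: basic sanity checks before classification."""
    return all(c.get("name") and c.get("type") for c in metadata["columns"])

def classify(metadata, glossary):
    """Step 1c: match columns to terms in the business glossary's semantic model."""
    return {c["name"]: glossary.get(c["name"].lower(), "UNCLASSIFIED")
            for c in metadata["columns"]}

def semantic_quality_check(classification):
    """Step 2: best-practice rule tied to the semantic model - nothing unclassified."""
    return "UNCLASSIFIED" not in classification.values()

glossary = {"email": "Contact.Email", "dob": "Person.DateOfBirth"}
feed = {"name": "supplier_42", "columns": [{"name": "email", "type": "str"},
                                           {"name": "dob", "type": "date"}]}

md = harvest_metadata(feed)
if initial_quality_check(md):
    classification = classify(md, glossary)
    if semantic_quality_check(classification):
        print("Step 3: load into the historical data store:", classification)
```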

This consulting engagement and technology implementation increased data accessibility and capitalization. Information is registered within a central portal through an approved workflow, and then customers shop for data either from a list of physical assets or by information content, with purchase requests also going through an approval workflow. This, among other safeguards, ensures data quality.


Data Governance “Stock Check” Case 2: Tracking Rogue Data

This client has a geographically dispersed organization that stored many of its key processes in Microsoft Excel spreadsheets. It was planning to move to Office 365 and was concerned about regulatory compliance, including GDPR mandates.

Knowing that electronic documents are heavily used in key business processes and distributed across the organization, this company needed to replace risky manual processes with centralized, automated systems.

A key part of the consulting engagement was to understand what data assets were in circulation and how they were used by the organization. Then process chains could be prioritized for automation, with specifications outlined for the systems that would replace them.

This organization also adopted a central portal that allowed employees to register data assets. The associated change management program raised awareness of data governance across the organization and the importance of data registration.

For each asset, information was captured and reviewed as part of a workflow. Prioritized assets were then chosen for profiling, enabling metadata to be reverse-engineered before being classified against the business glossary.

Additionally, assets that were part of a process chain were gathered and modeled with enterprise architecture (EA) and business process (BP) modeling tools for impact analysis.

High-level requirements for new systems then could be defined again in the EA/BP tools and prioritized on a project list. For the others, decisions could be made on whether they could safely be placed in the cloud and whether macros would be required.

In this case, the adoption of purpose-built data governance solutions helped build an understanding of the data assets in play, including information about their usage and content to aid in decision-making.

This client then had a good handle on the “what” and “where” in terms of sensitive data stored in their systems. They also better understood how this sensitive data was being used and by whom, helping reduce regulatory risks like those associated with GDPR.

In both scenarios, we cataloged data assets and mapped them to a business glossary, which acts as a classification scheme to help govern and locate data, making it both more accessible and more valuable. This governance framework reduces risk and protects an organization’s most valuable and sensitive data assets.

Focused on producing meaningful business outcomes, the erwin EDGE platform was pivotal in achieving these two clients’ data governance goals – including the infrastructure to undertake a data governance stock check. They used it to create an “enterprise data governance experience” not just for cataloging data and other foundational tasks, but also for a competitive “EDGE” in maximizing the value of their data while reducing data-related risks.

To learn more about the erwin EDGE data governance platform and how it aids in undertaking a data governance stock check, register for our free, 30-minute demonstration here.


Digital Transformation in Municipal Government: The Hidden Force Powering Smart Cities

Smart cities are changing the world.

When you think of real-time, data-driven experiences and modern applications to accomplish tasks faster and easier, your local town or city government probably doesn’t come to mind. But municipal government is starting to embrace digital transformation and therefore data governance.

Municipal government has never been an area in which to look for tech innovation. Perpetually strapped for resources and budget, often relying on legacy applications and infrastructure, and perfectly happy being available during regular business hours (save for emergency responders), most municipal governments lacked the ability and motivation to (as they say in the private sector) digitally transform. Then an odd thing happened – the rest of the world started transforming.

If you shop at a retailer that doesn’t deliver a modern, personalized experience, thousands more retailers are just a click away. But people rarely pick up and move to a new city because the new city offers a better website or mobile app. The motivation for municipal governments to transform simply isn’t there in the same way it is for the private sector.

But there are some things many city residents care about deeply: public safety, quality of life, how their tax dollars are spent, and the ability to do business with their local government when they want, not when it’s convenient for the municipality. And much like the private sector, better decisions around all of these concerns can be made when accurate, timely data is available to help inform them.

Digital transformation in municipal government is taking place in two main areas today: constituent services and the “smart cities” movement.

Digital Transformation in Municipal Government: Being “Smart” About It

The ability to serve constituents easily and efficiently is of increasing importance and a key objective of digital transformation in municipal government. It’s a direct result of the data-driven customer experiences that are increasingly the norm in the private sector.

Residents want the ability to pay their taxes online, report a pothole from their phone, and generally make it easier to interact with their local officials and services. This can be accomplished with dashboards and constituent portals.

The smart cities movement refers to the broad effort of municipal governments to incorporate sensors, data collection and analysis to improve responses to everything from rush-hour traffic to air quality to crime prevention. When the McKinsey Global Institute examined smart technologies that could be deployed by cities, it found that the public sector would be the natural owner of 70 percent of the applications it reviewed.

“Cities are getting in on the data game,” says Danny Sandwell, product marketing director at erwin, Inc. And with information serving as the lifeblood of many of these projects, the effectiveness of the services offered, the return on the investments in hardware and software, and the happiness of the users all depend on timely, accurate and effective data.

These initiatives present a pretty radical departure from the way cities have traditionally been managed.

A constituent portal, for example, requires that users can be identified, authenticated and then have access to information that resides in various departments, such as the tax collector to view and pay taxes, the building department to view a building permit, and the parking authority to manage public parking permits.

For many municipalities, this is uncharted territory.


Data Governance: The Force Powering Smart Cities

The efficiencies offered by smart city technologies only exist if the data leads to a proper allocation of resources.

If you can identify an increase in crime in a certain neighborhood, for example, you can increase police patrols in response. But if the data is inaccurate, those patrols are wasted while other neighborhoods experience a rise in crime.

Now that they’re in the data game, it’s time for municipal governments to understand data governance – the driving force behind any successful data-driven operation. When you have the ability to understand all of the information related to a piece of data, you have more confidence in how it is analyzed, used and protected.

Data governance doesn’t take place at a single application or in the data warehouse. It needs to be woven into the enterprise architecture and processes of the municipality to ensure data is accurate, timely and accessible to those who need it (and inaccessible to everyone else).

When this all comes together – good data, solid analytics and improved services for residents – the results can be quite striking. New efficiencies will make municipal governments better stewards of tax dollars. An improved quality of life can lift tax revenue by making the city more appealing to citizens and developers.

There’s a lot for cities to gain if they get in the data game. And truly smart cities will make sure they play the game right with effective data governance.



Digital Transformation Examples: How Data Is Transforming the Hospitality Industry

The rate at which organizations have adopted data-driven strategies means there are a wealth of digital transformation examples for organizations to draw from.

By now, you probably recognize this recurring pattern in the discussions about digital transformation:

  • An industry set in its ways slowly moves toward using information technology to create efficiencies, automate processes or help identify new customer or product opportunities.
  • All is going fine until a new kid on the block, born in the age of IT and the internet, quickly starts to create buzz and redefine what customers expect from the industry.
  • To keep pace, the industry stalwarts rush into catch-up mode but inevitably make mistakes. ROI doesn’t meet expectations, the customer experience isn’t quite right, and data gets exposed or mishandled.

There’s one industry we’re all familiar with that welcomes billions of global customers every year; that’s in the midst of a strong economic run; that’s dealing with high-profile disruptors; and that suffered a very public data breach to one of its storied brands in 2018, raising eyebrows around the world.

Welcome to the hospitality industry.

The hotel and hospitality industry was expected to see 5 to 6 percent growth in 2018, part of an impressive run of performance fueled by steady demand, improved midmarket offerings, and a new supply of travelers from developing regions.

All this despite challenges from upstarts like Airbnb, HomeAway and Couchsurfing, plus a data breach at Marriott/Starwood that exposed the data of 500 million customers.


Online start-ups such as Airbnb, HomeAway and Couchsurfing are some of the most clear-cut digital transformation examples in the hospitality industry.

Digital Transformation Examples: Hospitality – Data, Data Everywhere

As with other industries, digital transformation examples in the hospitality industry are abundant – and in turn, those businesses are awash in data with sources that include:

  • Data generated by reservations and payments
  • The data hotels collect to drive their loyalty programs
  • Data used to enhance the customer experience
  • Data shared as part of the billions of handoffs between hotel chains and the various booking sites and agencies that travelers use to plan trips

But all of this data, which now permeates the industry, is relatively new.

“IT wasn’t always a massive priority for [the hospitality industry],” says Danny Sandwell, director of product marketing for erwin, Inc. “So now there’s a lot of data, but these organizations often have a weak backend.”

The combination of data and analytics carries a great deal of potential for companies in the hospitality industry. Today’s demanding customers want experiences, not just a bed to sleep in; they want to do business with brands that understand their likes and dislikes and send offers relevant to their interests and desired destinations.

All of this is possible when a business collects and analyzes data on the scale that many hotel brands do. However, all of this can fail loudly if there is a problem with that data.

Getting a return on their investments in analytics and marketing technology requires hospitality companies to thoroughly understand the source of their data, the quality of the data, and the relevance of the data. This is where data governance comes into play.

When hospitality businesses are confident in their data, they can use it in a number of ways, including:

  • Customer Experience: Quality data can be used to power a best-in-class experience for hotels in a number of areas, including the Web experience, mobile experience, and the in-person guest experience. This is similar to the multi-channel strategy of retailers hoping to deliver memorable and helpful experiences based on what they know about customers, including the ability to make predictions and deliver cross-sell and up-sell opportunities. 
  • Mergers and Acquisitions: Hospitality industry disruptors have some industry players thinking about boosting their businesses via mergers and acquisitions. Good data can identify the best targets and help discover the regions or price points where M&A makes the most sense and will deliver the most value. Accurate data can also help pinpoint the true cost of M&A activity.
  • Security: Marriott’s data breach, which actually began as a breach at Starwood before Marriott acquired it, highlights the importance of data security in the hospitality industry. Strong data governance can help prevent breaches, as well as help control breaches so organizations more quickly identify the scope and action behind a breach, an important part of limiting damage.
  • Partnerships: The hospitality industry is increasingly connected, not just because of booking sites working with dozens of hotel brands but also because of tour operators turning a hotel stay into an experience and transportation companies arranging travel for guests. Providing a room is no longer enough.

Data governance is not an application or a tool. It is a strategy. When it is done correctly and it is deployed in a holistic manner, data governance becomes woven into an organization’s business processes and enterprise architecture.

It then improves the organization’s ability to understand where its data is, where it came from, its value, its quality, and how the data is accessed and used by people and applications.

It’s this level of data maturity that provides comfort to employees – from IT staff to the front desk and everyone in between – that the data they are working with is accurate and helping them better perform their jobs and improve the way they serve customers.

Over the next few weeks, we’ll be looking closely at digital transformation examples in other sectors, including retail and government. Subscribe to stay in the loop.



erwin Automation Framework: Achieving Faster Time-to-Value in Data Preparation, Deployment and Governance

Data governance is more important to the enterprise than ever before. It ensures everyone in the organization can discover and analyze high-quality data to quickly deliver business value.

It assists in successfully meeting increasingly strict compliance requirements, such as those in the General Data Protection Regulation (GDPR). And it provides a clear gauge on business performance.

A mature and sustainable data governance initiative must include data integration.

This often requires reconciling two groups of individuals within the organization: 1) those who care about governance and the meaningful use of data and 2) those who care about getting and transforming the data from source to target for actionable insights.

Both ends of the data value chain are covered when governance is coupled programmatically with IT’s integration practices.

The tools and processes for this should automatically generate “pre-ETL” source-to-target mapping to minimize human errors that can occur while manually compiling and interpreting a multitude of Excel-based data mappings that exist across the organization.
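As a rough illustration, the Python sketch below stores source-to-target mappings as structured records and generates reviewable SQL from them. The mapping format and the generated code are invented for the example, not a description of any specific tool’s output.

```python
# Hypothetical sketch: "pre-ETL" source-to-target mappings as structured
# records (instead of scattered spreadsheets) feeding a code generator.

mappings = [
    {"source": "src.orders.order_ts", "target": "dw.fact_orders.order_date",
     "rule": "CAST({col} AS DATE)"},
    {"source": "src.orders.amount_cents", "target": "dw.fact_orders.amount",
     "rule": "{col} / 100.0"},
]

def generate_select(mappings, source_table, target_table):
    """Emit a reviewable SQL statement from governed mapping records."""
    exprs = []
    for m in mappings:
        col = m["source"].split(".")[-1]   # source column name
        tgt = m["target"].split(".")[-1]   # target column name
        exprs.append(f"{m['rule'].format(col=col)} AS {tgt}")
    return (f"INSERT INTO {target_table}\nSELECT " + ",\n       ".join(exprs)
            + f"\nFROM {source_table};")

print(generate_select(mappings, "src.orders", "dw.fact_orders"))
```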

In addition to reducing errors and improving data quality, the efficiencies gained through automation, including minimizing rework, can help cut system development lifecycle costs in half.

In fact, being able to rely on automated and repeatable processes can result in up to 50 percent in design savings, up to 70 percent conversion savings, and up to 70 percent acceleration in total project delivery.

Data Governance and the System Development Lifecycle

Boosting data governance maturity starts with a central metadata repository (data dictionary) for version-controlling metadata imported from a broad array of file and database types to inform data mappings. It can be used to automatically generate governed design mappings and code in the design phase of the system development lifecycle.

The right toolset – one that supports a unifying and underlying metadata model – will be a design and code-generation platform that introduces efficiency, visibility and governance principles while reducing the opportunity for human error.

Automatically generating ETL/ELT jobs for leading ETL tools based on best design practices accommodates those principles; it functions according to approved corporate and industry standards.

Automatically importing mappings from developers’ Excel sheets, flat files, Access databases and ETL tools into a comprehensive mappings inventory, complete with automatically generated and meaningful documentation of the mappings, is a powerful way to support governance while providing real insight into data movement – for lineage and impact analysis – without interrupting system developers’ normal work methods.

GDPR compliance, for example, requires a business to discover source-to-target mappings with all accompanying transformations, such as which business rules in the repository are applied to them, to comply with audits.


When data movement has been tracked and version-controlled, it’s possible to conduct data archeology – that is, reverse-engineering code from existing XML within the ETL layer – to uncover what has happened in the past and incorporate it into a mapping manager for fast and accurate recovery.
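The Python sketch below gives the flavor of that reverse-engineering step: it parses mapping definitions out of ETL-layer XML so they can be loaded into a mapping inventory. The XML schema here is made up for illustration; real ETL tools each have their own formats.

```python
# "Data archeology" sketch: recover source-to-target mappings from XML in the
# ETL layer. The <job>/<map> schema is invented for this example.
import xml.etree.ElementTree as ET

etl_xml = """
<job name="load_dim_customer">
  <map source="staging.contacts.email" target="dw.dim_customer.email_address"/>
  <map source="staging.contacts.full_name" target="dw.dim_customer.name"/>
</job>
"""

root = ET.fromstring(etl_xml)
recovered = [(m.get("source"), m.get("target")) for m in root.iter("map")]
for src, tgt in recovered:
    print(f"{src} -> {tgt}")  # candidates to load into the mapping inventory
```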

This is one example of how to meet data governance demands with more agility and accuracy at high speed.

Faster Time-to-Value with the erwin Automation Framework

The erwin Automation Framework is a metadata-driven universal code generator that works hand in hand with erwin Mapping Manager (MM) for:

  • Pre-ETL enterprise data mapping
  • Governing metadata
  • Governing and versioning source-to-target mappings throughout the lifecycle
  • Data lineage, impact analysis and business rules repositories
  • Automated code generation

If you’d like to save time and money in preparing, deploying and governing your organization’s data, please join us for a demo of erwin MM.



Data Preparation and Data Mapping: The Glue Between Data Management and Data Governance to Accelerate Insights and Reduce Risks

Organizations have spent a lot of time and money trying to harmonize data across diverse platforms, including cleansing, uploading metadata, converting code, defining business glossaries, tracking data transformations and so on. But the attempts to standardize data across the entire enterprise haven’t produced the desired results.

A company can’t effectively implement data governance – documenting and applying business rules and processes, analyzing the impact of changes and conducting audits – if it fails at data management.

The problem usually starts by relying on manual integration methods for data preparation and mapping. It’s only when companies take their first stab at manually cataloging and documenting operational systems, processes and the associated data, both at rest and in motion, that they realize how time-consuming the entire data prepping and mapping effort is, and why that work is sure to be compounded by human error and data quality issues.

To effectively promote business transformation, as well as fulfill regulatory and compliance mandates, there can’t be any mishaps.

It’s obvious that the manual road makes it very challenging to discover and synthesize data that resides in different formats in thousands of unharvested, undocumented databases, applications, ETL processes and procedural code.

Consider the problematic issue of manually mapping source system fields (typically source files or database tables) to target system fields (such as different tables in target data warehouses or data marts).

These source mappings generally are documented across a slew of unwieldy spreadsheets in their “pre-ETL” stage as the input for ETL development and testing. However, the ETL design process often suffers as it evolves because spreadsheet mapping data isn’t updated or may be incorrectly updated thanks to human error. So questions linger about whether transformed data can be trusted.
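A tiny Python sketch illustrates the drift problem: a check like the one below, run against live schema metadata, flags mapping rows that have silently gone stale. The schema and spreadsheet rows are invented for the example.

```python
# Illustrative sketch of why spreadsheet mappings go stale: flag mapping rows
# that reference source columns no longer present in the live schema.

live_schema = {"src.orders": {"order_id", "order_ts", "amount"}}

spreadsheet_rows = [
    {"source_table": "src.orders", "source_column": "order_ts"},
    {"source_table": "src.orders", "source_column": "order_date"},  # renamed upstream
]

for row in spreadsheet_rows:
    cols = live_schema.get(row["source_table"], set())
    if row["source_column"] not in cols:
        print(f"STALE: {row['source_table']}.{row['source_column']} "
              "no longer exists; this mapping needs review")
```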

Data Quality Obstacles

The sad truth is that high-paid knowledge workers like data scientists spend up to 80 percent of their time finding and understanding source data and resolving errors or inconsistencies, rather than analyzing it for real value.

Statistics are similar for major data integration projects, such as data warehousing and master data management, with data stewards challenged to identify and document data lineage and sensitive data elements.

So how can businesses produce value from their data when errors are introduced through manual integration processes? How can enterprise stakeholders gain accurate and actionable insights when data can’t be easily and correctly translated into business-friendly terms?

How can organizations master seamless data discovery, movement, transformation, and IT and business collaboration to reverse the ratio of preparation to value delivered?

What’s needed to overcome these obstacles is establishing an automated, real-time, high-quality and metadata-driven pipeline useful for everyone, from data scientists to enterprise architects to business analysts to C-level execs.

Doing so will require a robust data management strategy and technology for automating the timely delivery of quality data that measures up to business demands.

From there, they need a sturdy data governance strategy and technology to automatically link and sync well-managed data with core capabilities for auditing, statutory reporting and compliance requirements as well as to drive business insights.

Creating a High-Quality Data Pipeline

Working hand-in-hand, data management and data governance provide a real-time, accurate picture of the data landscape, including “data at rest” in databases, data lakes and data warehouses and “data in motion” as it is integrated with and used by key applications. And there’s control of that landscape to facilitate insight and collaboration and limit risk.

With a metadata-driven, automated, real-time, high-quality data pipeline, all stakeholders can access data that they understand, trust and are authorized to use. At last they can base strategic decisions on a full inventory of reliable information.

The integration of data management and governance also supports industry needs to fulfill regulatory and compliance mandates, ensuring that audits are not compromised by the inability to discover key data or by failing to tag sensitive data as part of integration processes.

Data-driven insights, agile innovation, business transformation and regulatory compliance are the fruits of data preparation/mapping and enterprise modeling (business process, enterprise architecture and data modeling) that revolves around a data governance hub.

erwin Mapping Manager (MM) combines data management and data governance processes in an automated flow through the integration lifecycle, from data mapping for harmonization and aggregation to generating the physical embodiment of data lineage – that is, the creation, movement and transformation of transactional and operational data.

Its hallmark is a consistent approach to data delivery (business glossaries connect physical metadata to specific business terms and definitions) and metadata management (via data mappings).



The Unified Data Platform – Connecting Everything That Matters

Businesses stand to gain a lot from a unified data platform.

This decade has seen data-driven leaders dominate their respective markets and inspire other organizations across the board to use data to fuel their businesses, leveraging this strategic asset to create more value below the surface. It’s even been dubbed “the new oil,” but data is arguably more valuable than the analogy suggests.

Data governance (DG) is a key component of the data value chain because it connects people, processes and technology as they relate to the creation and use of data. It equips organizations to better deal with increasing data volumes, the variety of data sources, and the speed at which data is processed.

But for an organization to realize and maximize its true data-driven potential, a unified data platform is required. Only then can all data assets be discovered, understood, governed and socialized to produce the desired business outcomes while also reducing data-related risks.

Benefits of a Unified Data Platform

Data governance can’t succeed in a bubble; it has to be connected to the rest of the enterprise. Whether strategic, such as risk and compliance management, or operational, like a centralized help desk, your data governance framework should span and support the entire enterprise and its objectives, which it can’t do from a silo.

Let’s look at some of the benefits of a unified data platform with data governance as the key connection point.

Understand current and future state architecture with business-focused outcomes:

A unified data platform with a single metadata repository connects data governance to the roles, goals, strategies and KPIs of the enterprise. Through integrated enterprise architecture modeling, organizations can capture, analyze and incorporate the structure and priorities of the enterprise and related initiatives.

This capability allows you to plan, align, deploy and communicate a high-impact data governance framework and roadmap that sets manageable expectations and measures success with metrics important to the business.

Document capabilities and processes and understand critical paths:

A unified data platform connects data governance to what you do as a business and the details of how you do it. It enables organizations to document and integrate their business capabilities and operational processes with the critical data that serves them.

It also provides visibility and control by identifying the critical paths that will have the greatest impacts on the business.

Realize the value of your organization’s data:

A unified data platform connects data governance to specific business use cases. The value of data is realized by combining different elements to answer a business question or meet a specific requirement. Conceptual and logical schemas and models provide a much richer understanding of how data is related and combined to drive business value.


Harmonize data governance and data management to drive high-quality deliverables:

A unified data platform connects data governance to the orchestration and preparation of data to drive the business, governing data throughout the entire lifecycle – from creation to consumption.

Governing the data management processes that make data available is of equal importance. By harmonizing the data governance and data management lifecycles, organizations can drive high-quality deliverables that are governed from day one.

Promote a business glossary for unanimous understanding of data terminology:

A unified data platform connects data governance to the language of the business when discussing and describing data. Understanding the terminology and semantic meaning of data from a business perspective is imperative, but most business consumers of data don’t have technical backgrounds.

A business glossary promotes data fluency across the organization and vital collaboration between different stakeholders within the data value chain, ensuring all data-related initiatives are aligned and business-driven.

Instill a culture of personal responsibility for data governance:

A unified data platform is inherently connected to the policies, procedures and business rules that inform and govern the data lifecycle. The centralized management and visibility afforded by linking policies and business rules at every level of the data lifecycle will improve data quality, reduce expensive re-work, and improve the ideation and consumption of data by the business.

Business users will know how to use (and how not to use) data, while technical practitioners will have a clear view of the controls and mechanisms required when building the infrastructure that serves up that data.

Better understand the impact of change:

Data governance should be connected to the use of data across roles, organizations, processes, capabilities, dashboards and applications. Proactive impact analysis is key to efficient and effective data strategy. However, most solutions don’t tell the whole story when it comes to data’s business impact.

By adopting a unified data platform, organizations can extend impact analysis well beyond data stores and data lineage for true visibility into who, what, where and how the impact will be felt, breaking down organizational silos.

Getting the Competitive “EDGE”

The erwin EDGE delivers an “enterprise data governance experience” in which every component of the data value chain is connected.

Now with data mapping, it unifies data preparation, enterprise modeling and data governance to simplify the entire data management and governance lifecycle.

Both IT and the business have access to an accurate, high-quality and real-time data pipeline that fuels regulatory compliance, innovation and transformation initiatives with accurate and actionable insights.


Healthcare Data Governance: What’s the Prognosis?

Healthcare data governance has far more applications than just meeting compliance standards. Healthcare costs are always a topic of discussion, as is the state of health insurance and policies like the Affordable Care Act (ACA).

Costs and policy are among a number of significant trends called out in the executive summary of the Stanford Medicine 2017 Health Trend Report. But the summary also included a common thread that connects them all:

“Behind these trends is one fundamental force driving health care transformation: the power of data.”

Indeed, data is essential to healthcare in areas like:

  • Medical research – Collecting and reviewing increasingly large data sets has the potential to introduce new levels of speed and efficiency into what has been an often slow and laborious process.
  • Preventative care – Wearable devices help consumers track exercise, diet, weight and nutrition, as well as clinical applications like genetic sequencing.
  • The patient experience – Healthcare is not immune to issues of customer service and the need to provide timely, accurate responses to questions or complaints.
  • Disease and outbreak prevention – Data and analysis can help spot patterns, so clinicians get ahead of big problems before they become epidemics.


Data Vulnerabilities in Healthcare

Data is valuable to the healthcare industry. But it also carries risks because of the volume and velocity with which it is collected and stored. Foremost among these are regulatory compliance and security.

Because healthcare data is so sensitive, the ways in which it is secured and shared are watched closely by regulators. HIPAA (the Health Insurance Portability and Accountability Act) is probably the most recognized regulation governing data in healthcare, but it is not the only one.

In addition to privacy and security policies, other challenges that prevent the healthcare industry from maximizing the ways it puts data to work include:

  • High costs, which are further exacerbated by expected lower commercial health insurance payouts and higher payouts from low-margin services like Medicare, as well as rising labor costs. Data and analytics can potentially help hospitals better plan for these challenges, but thin margins might prevent the investments necessary in this area.
  • Electronic medical records, which the Stanford report cited as a cause of frustration that negatively impacts relationships between patients and healthcare providers.
  • Silos of data, which often are caused by mergers and acquisitions within the industry, but that are also emblematic of the number of platforms and applications used by providers, insurers and other players in the healthcare market.

Early 2018 saw a number of mergers and acquisitions in the healthcare industry, including hospital systems in New England, as well as in the Philadelphia area of the United States. The $69 billion merger of Aetna and CVS also was approved by shareholders in early 2018, making it one of the most significant deals of the past decade.

Each merger and acquisition requires careful and difficult decisions concerning the application portfolio and data of each organization. Redundancies need to be identified, as do gaps, so the patient experience and care continue without serious disruption.

Truly understanding healthcare data requires a holistic approach to data governance that is embedded in business processes and enterprise architecture. When implemented properly, data governance initiatives help healthcare organizations understand what data they have, where it is, where it came from, its value, its quality and how it’s used and accessed by people and applications.


Improving Healthcare Analytics and Patient Care with Healthcare Data Governance

Data governance plays a vital role in compliance because data is easier to protect when you know where it is stored, what it is, and how it needs to be governed. According to a 2017 survey by erwin, Inc. and UBM, 60 percent of organizations said compliance was driving their data governance initiatives.

With a solid understanding of their data and the ways it is collected and consumed throughout their organizations, healthcare players are better positioned to reap the benefits of analytics. As Deloitte pointed out in a perspectives piece about healthcare analytics, the shift to value-based care makes analytics within the industry more essential than ever.

With increasing pressure on margins, the combination of data governance and analytics is critical to creating value and finding efficiencies. Investments in analytics are only as valuable as the data they are fed, however.

Poor decisions based on poor data lead to bad outcomes. They also diminish trust in the analytics platform, which ruins ROI as the platform is used less and less.

Most important, healthcare data governance plays a critical role in helping improve patient outcomes and value. In healthcare, the ability to make timely, accurate decisions based on quality data can be a matter of life or death.

In areas like preventative care and the patient experience, good data can mean better advice to patients, more accurate programs for follow-up care, and the ability to meet their medical and lifestyle needs within a healthcare facility or beyond.

As healthcare organizations look to improve efficiencies, lower costs and provide quality, value-based care, healthcare data governance will be essential to better outcomes for patients, providers and the industry at large.

For more information, please download our latest whitepaper, The Regulatory Rationale for Integrating Data Management and Data Governance.

If you’re interested in healthcare data governance, or evaluating new data governance technologies for another industry, you can schedule a demo of erwin’s data mapping and data governance solutions.


Michael Pastore is the Director, Content Services at QuinStreet B2B Tech.


Financial Services Data Governance: Helping Value ‘the New Currency’

For organizations operating in financial services, data governance is becoming increasingly important. When financial services industry board members and executives gathered for EY’s Financial Services Leadership Summit in early 2018, data was a major topic of conversation.

Attendees referred to data as “the new oil” and “the new currency,” and with good reason. Financial services organizations, including banks, brokerages, insurance companies, asset management firms and more, collect and store massive amounts of data.

But data is only part of the bigger picture in financial services today. Many institutions are investing heavily in IT to help transform their businesses to serve customers and partners who are quickly adopting new technologies. For example, Gartner research expects the global banking industry will spend $519 billion on IT in 2018.

The combination of more data and technology and fewer in-person experiences puts a premium on trust and customer loyalty. Trust has long been at the heart of the financial services industry. It’s why bank buildings in a bygone era were often erected as imposing stone structures that signified strength at a time before deposit insurance, when poor management or even a bank robbery could have devastating effects on a local economy.

Trust is still vital to the health of financial institutions, except today’s worst-case scenario often involves faceless hackers pillaging sensitive data to use or re-sell on the dark web. That’s why governing all of the industry’s data, and managing the risks that come with collecting and storing such vast amounts of information, is increasingly a board-level issue.

The boards of modern financial services institutions understand three important aspects of data:

  1. Data has a tremendous amount of value to the institution in terms of helping identify the wants and needs of customers.
  2. Data is central to security and compliance, and there are potentially severe consequences for organizations that run afoul of either.
  3. Data is central to the transformation underway at many financial institutions as they work to meet the needs of the modern customer and improve their own efficiencies.


Data governance helps organizations in financial services understand their data. It’s essential to protecting that data and to helping comply with the many government and industry regulations in the industry. But financial services data governance – all data governance in fact – is about more than security and compliance; it’s about understanding the value and quality of data.

When done right and deployed in a holistic manner that’s woven into the business processes and enterprise architecture, data governance helps financial services organizations better understand where their data is, where it came from, its value, its quality, and how the data is accessed and used by people and applications.

Financial Services Data Governance: It’s Complicated

Financial services data governance is getting increasingly complicated for a number of reasons.

Mergers & Acquisitions

Deloitte’s 2018 Banking and Securities M&A Outlook described 2017 as “stuck in neutral,” but there is reason to believe the market will pick up steam in 2018 and beyond, especially when it comes to financial technology (fintech) firms. Bringing in new sets of data, new applications and new processes through mergers and acquisitions creates a great deal of complexity.

The integrations can be difficult, and there is an increased likelihood of data sprawl and data silos. Data governance not only helps organizations better understand the data, but it also helps make sense of the application portfolios of merging institutions to discover gaps and redundancies.

Regulatory Environment

There is a lengthy list of regulations and governing bodies that oversee the financial services industry, covering everything from cybersecurity to fraud protection to payment processing, all in an effort to minimize risk and protect customers.

The holistic view of data that results from a strong data governance initiative is becoming essential to regulatory compliance. According to a 2017 survey by erwin, Inc. and UBM, 60 percent of organizations said compliance drives their data governance initiatives.

More Partnerships and Networks

According to research by IBM, 45 percent of bankers say partnerships and alliances help improve their agility and competitiveness. Like consumers, today’s financial institutions are more connected than ever before, and it’s no longer couriers and cash that are being transferred in these partnerships; it’s data.

Understanding the value, quality and risk of the data shared in these alliances is essential – not only to be a good partner and derive a business benefit from the relationship, but also to evaluate whether or not an alliance or partnership makes good business sense.


More Sources of Data, More Touch Points

Financial services institutions are at the forefront of the multi-channel customer experience and have been for years. People do business with institutions by phone, in person, via the Web, and using mobile devices.

All of these touch points generate data, and it is essential that organizations can tie them all together to understand their customers. This information is not only important to customer service, but also to finding opportunities to grow relationships with customers by identifying where it makes sense to upsell and cross-sell products and services.

Grow the Business, Manage the Risk

In the end, financial services organizations need to understand the ways their data can help grow the business and manage risk. Data governance plays an important role in both.

Financial services data governance can better enable:

  • The personalized, self-service applications customers want
  • The machine learning solutions that automate decision-making and create more efficient business processes
  • Faster and more accurate identification of cross-sell and upsell opportunities
  • Better decision-making about the application portfolio, M&A targets, M&A success and more

If you’re interested in financial services data governance, or evaluating new data governance technologies for another industry, you can schedule a demo of erwin’s data mapping and data governance solutions.


And you also might want to download our latest e-book, Solving the Enterprise Data Dilemma.

Michael Pastore is the Director, Content Services at QuinStreet B2B Tech.