
Are Data Governance Bottlenecks Holding You Back?

Better decision-making has now topped compliance as the primary driver of data governance. However, organizations still encounter a number of bottlenecks that may hold them back from fully realizing the value of their data in producing timely and relevant business insights.

While the acknowledgment that data governance is about more than risk management and regulatory compliance may indicate that companies are more confident in their data, the data governance practice is nonetheless growing in complexity because of more:

  • Data to handle, much of it unstructured
  • Sources, like IoT
  • Points of integration
  • Regulations

Without an accurate, high-quality, real-time enterprise data pipeline, it will be difficult to uncover the necessary intelligence to make optimal business decisions.

So what’s holding organizations back from fully using their data to make better, smarter business decisions?

Data Governance Bottlenecks

erwin’s 2020 State of Data Governance and Automation report, based on a survey of business and technology professionals at organizations of various sizes and across numerous industries, examined the role of automation in data governance and intelligence efforts. It uncovered a number of obstacles that organizations have to overcome to improve their data operations.

The No. 1 bottleneck, according to 62 percent of respondents, was documenting complete data lineage. Understanding the quality of source data was the next most serious bottleneck (58 percent), followed by finding, identifying and harvesting data (55 percent) and curating assets with business context (52 percent).

The report revealed that all but two of the possible bottlenecks were marked by more than 50 percent of respondents. Clearly, there’s a massive need for a data governance framework to keep these obstacles from stymying enterprise innovation.

As we zeroed in on the bottlenecks of day-to-day operations, 25 percent of respondents said length of project/delivery time was the most significant challenge, followed by data quality/accuracy at 24 percent, time to value at 16 percent, and reliance on developers and other technical resources at 13 percent.


Overcoming Data Governance Bottlenecks

The 80/20 rule describes the unfortunate reality for many data stewards: they spend 80 percent of their time finding, cleaning and reorganizing huge amounts of data and only 20 percent on actual data analysis.

In fact, we found that close to 70 percent of our survey respondents spent an average of 10 or more hours per week on data-related activities, most of it searching for and preparing data.

What can you do to reverse the 80/20 rule and subsequently overcome data governance bottlenecks?

1. Don’t ignore the complexity of data lineage: Supporting data lineage with a manual approach is a risky endeavor, and businesses that attempt it will find it isn’t sustainable given data’s constant movement from one place to another via multiple routes – especially when lineage must be documented correctly down to the column level. Adopting automated end-to-end lineage makes it possible to view data movement from the source to reporting structures, providing a comprehensive and detailed view of data in motion.

2. Automate code generation: Alleviate the need for developers to hand code connections from data sources to target schema. Mapping data elements to their sources within a single repository to determine data lineage and harmonize data integration across platforms reduces the need for specialized, technical resources with knowledge of ETL and database procedural code. It also makes it easier for business analysts, data architects, ETL developers, testers and project managers to collaborate for faster decision-making.

3. Use an integrated impact analysis solution: By automating data due diligence for IT, you can deliver operational intelligence to the business. Business users benefit from automated impact analysis to better examine the value of individual data sets and prioritize them. Impact analysis is equally important to IT for automatically tracking changes and understanding how data from one system feeds other systems and reports. This is an aspect of data lineage, created from technical metadata, that ensures nothing “breaks” along the change train. A simple sketch of how mapping metadata can drive both code generation (item 2) and impact analysis appears after this list.

4. Put data quality first: Users must have confidence in the data they use for analytics. Automating the matching of business terms with data assets and documenting lineage down to the column level are critical to good decision-making. If this hasn’t been the practice to date, enterprises should take a few steps back and review data quality measures before jumping into automating data analytics.

5. Catalog data using a solution with a broad set of metadata connectors: All data sources will be leveraged, including big data, ETL platforms, BI reports, modeling tools, mainframe, and relational data as well as data from many other types of systems. Don’t settle for a data catalog from an emerging vendor that only supports a narrow swath of newer technologies, and don’t rely on a catalog from a legacy provider that may supply only connectors for standard, more mature data sources.

6. Stress data literacy: You want to ensure that data assets are used strategically. Automation expedites the benefits of data cataloging. Curating internal and external datasets for a range of content authors doubles the business benefits and ensures effective management and monetization of data assets in the long term – if linked to broader data governance, data quality and metadata management initiatives. There’s a clear connection to data literacy here because of its foundation in business glossaries and in socializing data so all stakeholders can view and understand it within the context of their roles.

7. Make automation the norm across all data governance processes: Too many companies still live in a world where data governance is a high-level mandate, not practically implemented. To fully realize the advantages of data governance and the power of data intelligence, data operations must be automated across the board. Without automated data management, the governance housekeeping load on the business will be so great that data quality will inevitably suffer. Being able to account for all enterprise data and resolve disparity in data sources and silos using manual approaches is wishful thinking.

8. Craft your data governance strategy before making any investments: Gather multiple stakeholders—both business and IT— with multiple viewpoints to discover where their needs mesh and where they diverge and what represents the greatest pain points to the business. Solve for these first, but build buy-in by creating a layered, comprehensive strategy that ultimately will address most issues. From there, it’s on to matching your needs to an automated data governance solution that squares with business and IT – both for immediate requirements and future plans.
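
To make items 2 and 3 more concrete, here is a minimal, hypothetical sketch of how a single mapping repository could drive both automated code generation and column-level impact analysis. The table names, transformation expressions and helper functions (generate_load_sql, impacted_by) are illustrative assumptions, not any particular product’s templates or APIs.

```python
# Hypothetical sketch: a single mapping repository drives both automated
# code generation (item 2) and column-level impact analysis (item 3).
# All table, column and function names are illustrative only.

MAPPINGS = [
    # (source column, transformation expression, target column)
    ("crm.customer.cust_id",   "CAST(crm.customer.cust_id AS BIGINT)", "dw.dim_customer.customer_key"),
    ("crm.customer.full_name", "TRIM(crm.customer.full_name)",         "dw.dim_customer.customer_name"),
    ("billing.invoice.amount", "billing.invoice.amount",               "dw.fact_sales.net_amount"),
]

def generate_load_sql(target_table: str) -> str:
    """Emit a simple INSERT ... SELECT for one target table (joins omitted for brevity)."""
    rules = [m for m in MAPPINGS if m[2].startswith(target_table + ".")]
    target_cols = [tgt.split(".")[-1] for _, _, tgt in rules]
    select_exprs = [f"{expr} AS {col}" for (_src, expr, _tgt), col in zip(rules, target_cols)]
    source_tables = sorted({".".join(src.split(".")[:2]) for src, _, _ in rules})
    return (
        f"INSERT INTO {target_table} ({', '.join(target_cols)})\n"
        f"SELECT {', '.join(select_exprs)}\n"
        f"FROM {', '.join(source_tables)};"
    )

def impacted_by(source_column: str) -> list:
    """Column-level impact analysis: which target columns depend on this source column?"""
    return [tgt for src, _, tgt in MAPPINGS if src == source_column]

print(generate_load_sql("dw.dim_customer"))
print(impacted_by("crm.customer.full_name"))  # ['dw.dim_customer.customer_name']
```

In practice, the same metadata that generates the load code is what answers the “what breaks if this column changes?” question, which is why automated code generation and impact analysis reinforce each other.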

Register now for the first of a new, six-part webinar series on the practice of data governance and how to proactively deal with its complexities. “The What & Why of Data Governance” webinar takes place on Tuesday, Feb. 23 at 3 pm GMT/10 am ET.


Managing Ideation and Innovation with Enterprise Architecture

Organizations largely recognize the need for enterprise architecture tools, yet some still struggle to communicate their value and prioritize such initiatives.

As data-driven business thrives, organizations will have to overcome these challenges because managing IT trends and emerging technologies makes enterprise architecture (EA) increasingly relevant.

“By 2021, 40 percent of organizations will use enterprise architects to help ideate new business innovations made possible by emerging technologies,” says Marcus Blosch, Vice President Analyst, Gartner.

With technology now vital to every aspect of the business, enterprise architecture tools and EA as a function help generate and evaluate ideas that move the business forward.

Every business has its own (often ad hoc) way of gathering ideas and evaluating them to see how they can be implemented and what it would take to deploy them.

But organizations can use enterprise architecture tools to bridge the gap between ideation and implementation, making more informed choices in the process.

By combining enterprise architecture tools with the EA team’s knowledge in a process for managing ideas and innovation, organizations can be more strategic in their planning.

Emerging technologies are one of the key areas in which such a process benefits an organization. The timely identification of emerging technologies can make or break a business. The more thought that goes into planning when and how to use emerging technologies, the better the implementation, which leads to better outcomes and greater ROI.

Gartner emphasizes the value of enterprise architecture tools

Enterprise Architecture Tools: The Fabric of Your Organization

At its 2019 Enterprise Architecture & Technology Innovation Summit, Gartner identified 10 emerging and strategic technology trends that will shape IT in the coming years.

They included trends that utilize intelligence, such as autonomous things and augmented analytics; digital trends like empowered edge and immersive experiences; mesh trends like Blockchain and smart spaces; as well as broad concepts like digital ethics and privacy and quantum computing.

As these trends develop into applications or become part of your organization’s fabric, you need to think about how they can help grow your business in the near and long term. How will your business investigate their use? How will you identify the people who understand how they can be used to drive your business?

Many organizations lack a structured approach for gathering and investigating employee ideas, especially those around emerging technologies. This creates two issues:

1. When employee ideas fall into a black hole where they don’t get feedback, the employees become less engaged.

2. The emerging technology and its implementation are disconnected, which leads to silos or wasted resources.

How Enterprise Architecture Tools Help Communicate the Value of Emerging Technologies

When your enterprise architecture is aligned with your business outcomes it provides a way to help your business ideate and investigate the viability of ideas on both the technical and business level. When aligned correctly, emerging technologies can be evaluated based on how they meet business needs and what the IT organization must do to support them.

But the only way you can accurately make those determinations is by having visibility into your IT services and the application portfolio. And that’s how enterprise architecture can help communicate the value of emerging technologies in your organization.

erwin EA provides a way to quickly and efficiently understand opportunities offered by new technologies, process improvements and portfolio rationalization and translate them into an actionable strategy for the entire organization.

Take erwin EA for a free spin thanks to our secure, cloud-based trial.

Enterprise Architecture Business Process Trial


Choosing the Right Data Modeling Tool

The need for an effective data modeling tool is more significant than ever.

For decades, data modeling has provided the optimal way to design and deploy new relational databases with high-quality data sources and support application development. But it provides even greater value for modern enterprises where critical data exists in both structured and unstructured formats and lives both on premise and in the cloud.

In today’s hyper-competitive, data-driven business landscape, organizations are awash with data and the applications, databases and schema required to manage it.

For example, an organization may have 300 applications, with 50 different databases and a different schema for each. Additional challenges, such as increasing regulatory pressures – from the General Data Protection Regulation (GDPR) to the Health Insurance Portability and Accountability Act (HIPAA) – and growing stores of unstructured data also underscore the increasing importance of a data modeling tool.

Data modeling, quite simply, describes the process of discovering, analyzing, representing and communicating data requirements in a precise form called the data model. There’s an expression: measure twice, cut once. Data modeling is the upfront “measuring tool” that helps organizations reduce time and avoid guesswork in a low-cost environment.

From a business-outcome perspective, a data modeling tool is used to help organizations:

  • Effectively manage and govern massive volumes of data
  • Consolidate and build applications with hybrid architectures, including traditional, Big Data, cloud and on premise
  • Support expanding regulatory requirements, such as GDPR and the California Consumer Privacy Act (CCPA)
  • Simplify collaboration across key roles and improve information alignment
  • Improve business processes for operational efficiency and compliance
  • Empower employees with self-service access for enterprise data capability, fluency and accountability

Data Modeling Tool

Evaluating a Data Modeling Tool – Key Features

Organizations seeking to invest in a new data modeling tool should consider these four key features.

  1. Ability to visualize business and technical database structures through an integrated, graphical model.

Due to the number of database platforms available, it’s important that an organization’s data modeling tool supports a sufficient (to your organization) array of platforms. The chosen data modeling tool should be able to read the technical formats of each of these platforms and translate them into highly graphical models rich in metadata. Schema can then be deployed from models in an automated fashion and iteratively updated so that new development can take place via model-driven design.

  2. Empowerment of end-user BI/analytics through data source discovery, analysis and integration.

A data modeling tool should give business users confidence in the information they use to make decisions. Such confidence comes from the ability to provide a common, contextual, easily accessible source of data element definitions to ensure they are able to draw upon the correct data; understand what it represents, including where it comes from; and know how it’s connected to other entities.

A data modeling tool can also be used to pull in data sources via self-service BI and analytics dashboards. The data modeling tool should also have the ability to integrate its models into whatever format is required for downstream consumption.

  3. The ability to store business definitions and data-centric business rules in the model along with technical database schemas, procedures and other information.

With business definitions and rules on board, technical implementations can be better aligned with the needs of the organization. Using an advanced design layer architecture, model “layers” can be created with one or more models focused on the business requirements that then can be linked to one or more database implementations. Design-layer metadata can also be connected from conceptual through logical to physical data models.

  4. The ability to rationalize platform inconsistencies and deliver a single source of truth for all enterprise business data.

Many organizations struggle to break down data silos and unify data into a single source of truth, due in large part to varying data sources and difficulty managing unstructured data. Being able to model any data from anywhere addresses this, with on-demand modeling for non-relational databases that offer speed, horizontal scalability and other real-time application advantages.

With NoSQL support, model structures from non-relational databases, such as Couchbase and MongoDB, can be created automatically. Existing Couchbase and MongoDB data sources can be easily discovered, understood and documented through modeling and visualization. Existing entity-relationship diagrams and SQL databases can be migrated to Couchbase and MongoDB, too, and relational schemas can be transformed into query-optimized NoSQL constructs. A simplified sketch of this kind of schema inference follows.
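
As an illustration of what automated model creation from non-relational sources involves, the following sketch infers a candidate entity from a handful of MongoDB-style sample documents. The sample documents and the infer_entity helper are hypothetical; a real data modeling tool works against the live database and handles nesting, arrays and type conflicts far more thoroughly.

```python
# Hypothetical sketch: infer a candidate entity (attributes and types) from
# sample MongoDB/Couchbase-style documents. A real modeling tool reads the
# live store and handles nesting, arrays and type conflicts far more thoroughly.

from collections import defaultdict

sample_documents = [
    {"_id": 1, "name": "Ann Lee",  "email": "ann@example.com", "orders": [{"sku": "A1", "qty": 2}]},
    {"_id": 2, "name": "Raj Shah", "email": "raj@example.com", "loyalty_tier": "gold"},
]

def infer_entity(docs):
    """Collect each field's observed value types across the sample documents."""
    attributes = defaultdict(set)
    for doc in docs:
        for field, value in doc.items():
            attributes[field].add(type(value).__name__)
    return dict(attributes)

for field, types in infer_entity(sample_documents).items():
    optional = "" if all(field in d for d in sample_documents) else " (optional)"
    print(f"{field}: {'/'.join(sorted(types))}{optional}")
```

Fields that appear in only some documents surface as optional attributes, and embedded arrays such as orders hint at child entities in the resulting entity-relationship view.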

Other considerations include the ability to:

  • Compare models and databases.
  • Increase enterprise collaboration.
  • Perform impact analysis.
  • Enable business and IT infrastructure interoperability.

When it comes to data modeling, no one knows it better than erwin. For more than 30 years, erwin Data Modeler has been the market leader. It is built on the vision and experience of data modelers worldwide and is the de facto standard in data model integration.

You can learn more about driving business value and underpinning governance with erwin DM in this free white paper.

Data Modeling Drives Business Value


Digital Transformation Examples: Three Industries Dominating Digital Transformation

Digital transformation examples can be found almost anywhere, in almost any industry. Its past successes – and future potential – are well documented, chronicled in the billion-dollar valuations of the frontrunners in the practice.

Amazon began as a disruptor to brick-and-mortar bookstores, eventually becoming one of the most obvious digital transformation examples as it went on to revolutionize online shopping.

Netflix’s origins were similar – annihilating its former rival Blockbuster and the entire DVD rental market to become a dominant streaming platform and media publisher.

Disruption is the common theme. Netflix decimated the DVD rental market while Amazon continues to play a role in “high-street” shopping’s decline.

As technology continues to disrupt markets, digital transformation is do or die.

According to IDC’s digital transformation predictions report for 2019, these types of initiatives are going to flood the enterprise during the next five years.

The following three examples highlight the extent to which digital transformation is reshaping the nature of business and government and how we – as a society – interact with the world.

Digital Transformation in Retail

The inherently competitive nature of retail has made the sector a leader in adopting data-driven strategy.

From loyalty cards to targeted online ads, retail has always had to adapt to stay relevant.

Four main areas in retail demonstrate digital transformation, with a healthy data governance initiative driving them all.

Digital transformation examples

With accurate, relevant and accessible data, organizations can address the following:

  • Customer experience: If your data shows a lot of abandoned carts from mobile app users, then that’s an area to investigate, and good data will identify it.
  • Competitive differentiation: Are personalized offers increasing sales and creating customer loyalty? This is an important data point for marketing strategy.
  • Supply chain: Can a problem with quality be related to items shipping from a certain warehouse? Data will zero in on the location of the problem.
  • Partnerships: Are your partnerships helping grow other parts of your business and creating new customers? Or are your existing customers using partners in place of visiting your store? Data can tell you.

This article further explores digital transformation and data governance in retail.

Digital Transformation in Hospitality

Hospitality is another industry awash in digital transformation examples. Brick-and-mortar travel agencies are ceding ground to mobile-first (and mobile-only) businesses.

Their offerings range from purchasing vacation packages to the ability to check in and order room service via mobile devices.

With augmented and virtual reality, it even may be possible to one day “test drive” holiday plans from the comfort of the sofa – say before swimming with sharks or going on safari.

The extent of digitization now possible in the hospitality industry means these businesses have to account for and manage an abundance of data types and sources to glean insights to fuel the best customer experiences.

Unsurprisingly, this is yet another area where a healthy data governance initiative can be the difference between industry-disrupting success and abject failure.

This piece further discusses how data is transforming the hospitality industry and the role of data governance in it.

Digital Transformation in Municipal Government

Historically, municipal government isn’t seen as an area at the forefront of adopting emerging technology.

But the emergence of “smart cities” is a prominent example of digital transformation.

Even the concept of a smart city is a response to existing digital transformation in the private sector, as governments have been compelled to update infrastructure to reflect the modern world.

Today, municipal governments around the world are using digital transformation to improve residents’ quality of life, from improving transportation and public safety to making it convenient to pay bills or request services online.

Of course, when going “smart,” municipal governments will need an understanding of data governance best practices.

This article analyzes how municipal governments can be “smart” about their transformation efforts.

Mitigating Digital Transformation Risks

Risks come with any investment. But in the context of digital transformation, taking risks is both a necessity and an inevitability.

Organizations also will need to consult their data to ensure they transform themselves the right way – and not just for transformation’s sake.

A recent PwC study found that successful digital transformation risk-takers “find the right fit for emerging technologies.”

Doing so points to the need for both effective data governance to find, understand and socialize the most relevant data assets and healthy enterprise architecture to learn what systems and applications create, store and use those data assets.

With application portfolio management and impact analysis, organizations can identify immediate opportunities for digital transformation and areas where more consideration and planning may be necessary before making changes.

As the data governance company, we provide data governance as well as enterprise architecture software, plus tools for business process and data modeling, data cataloging and data literacy. With this integrated software platform, organizations can ensure IT and business collaboration to drive risk management, innovation and transformation efforts.

If you’d like to learn more about digital transformation and other use cases for data governance technologies, stay up to date with the erwin Experts here.



A Guide to CCPA Compliance and How the California Consumer Privacy Act Compares to GDPR

California Consumer Privacy Act (CCPA) compliance shares many of the same requirements as the European Union’s General Data Protection Regulation (GDPR).

While the CCPA has been signed into law, organizations have until Jan. 1, 2020, to enact its mandates. Luckily, many organizations have already laid the regulatory groundwork for it because of their efforts to comply with GDPR.

However, there are some key differences that we’ll explore in the Q&A below.

Data governance, thankfully, provides a framework for compliance with either or both – in addition to other regulatory mandates your organization may be subject to.

CCPA Compliance Requirements vs. GDPR FAQ

Does CCPA apply to not-for-profit organizations? 

No, CCPA compliance only applies to for-profit organizations. GDPR compliance is required for any organization, public or private (including not-for-profit).

What for-profit businesses does CCPA apply to?

The mandate for CCPA compliance applies only if a for-profit organization meets at least one of the following criteria (a simple check is sketched after the list):

  • Has an annual gross revenue exceeding $25 million
  • Collects, sells or shares the personal data of 50,000 or more consumers, households or devices
  • Earns 50% or more of its annual revenue by selling consumers’ personal information
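
Read as a rule, the test is an “any one of the three” check. The sketch below simply restates the criteria above in code, using the thresholds from the text; it is an illustration, not legal advice.

```python
# Illustration only (not legal advice): CCPA applies to a for-profit business
# if ANY one of the three thresholds above is met.

def ccpa_applies(annual_gross_revenue_usd: float,
                 consumer_records_per_year: int,
                 share_of_revenue_from_selling_pi: float) -> bool:
    return (
        annual_gross_revenue_usd > 25_000_000
        or consumer_records_per_year >= 50_000
        or share_of_revenue_from_selling_pi >= 0.50
    )

print(ccpa_applies(10_000_000, 60_000, 0.10))  # True: the records threshold is met
```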

Does the CCPA apply outside of California?

As the name suggests, the legislation is designed to protect the personal data of consumers who reside in the state of California.

But like GDPR, CCPA compliance has impacts outside the area of origin. This means businesses located outside of California, but selling to (or collecting the data of) California residents must also comply.

Does the CCPA exclude anything that GDPR doesn’t? 

GDPR encompasses all categories of “personal data,” with no distinctions.

CCPA does make distinctions, particularly when other regulations may overlap. These include:

  • Medical information covered by the Confidentiality of Medical Information Act (CMIA) and the Health Insurance Portability and Accountability Act (HIPAA)
  • Personal information covered by the Gramm-Leach-Bliley Act (GLBA)
  • Personal information covered by the Driver’s Privacy Protection Act (DPPA)
  • Clinical trial data
  • Information sold to or by consumer reporting agencies
  • Publicly available personal information (federal, state and local government records)

What about access requests? 

Under the GDPR, organizations must make any personal data collected from an EU citizen available upon request.

CCPA compliance only requires data collected within the last 12 months to be shared upon request.

Does the CCPA include the right to opt out?

CCPA, like GDPR, gives consumers/citizens the right to opt out of the processing of their personal data.

However, CCPA compliance only requires an organization to observe an opt-out request when it comes to the sale of personal data. GDPR does not make any distinctions between “selling” personal data and any other kind of data processing.

To meet CCPA compliance opt-out standards, organizations must provide a “Do Not Sell My Personal Information” link on their home pages.

Does the CCPA require individuals to willingly opt in?

No. Whereas the GDPR requires informed consent before an organization sells an individual’s information, organizations under the scope of the CCPA can still assume consent. The only exception involves the personal information of children (under 16). Consumers between 13 and 16 can consent themselves, but if the consumer is a child under 13, a parent or guardian must authorize the sale of said child’s personal data.

What about fines for CCPA non-compliance? 

In theory, fines for CCPA non-compliance are potentially more far reaching than those of GDPR because there is no ceiling for CCPA penalties. Under GDPR, penalties have a ceiling of 4% of global annual revenue or €20 million, whichever is greater. GDPR recently resulted in a record fine for Google.

Organizations out of CCPA compliance can be fined up to $7,500 per violation; because each violation is penalized separately, there is no aggregate ceiling.
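
A quick, hypothetical calculation illustrates why the lack of an aggregate ceiling matters; the revenue figure and violation count below are invented purely for the arithmetic.

```python
# Hypothetical figures, purely to illustrate the arithmetic described above.

global_annual_revenue_eur = 500_000_000                               # assumed company revenue
gdpr_ceiling_eur = max(0.04 * global_annual_revenue_eur, 20_000_000)  # 4% or €20M, whichever is greater

ccpa_violations = 100_000                      # e.g. one violation per affected consumer record
ccpa_exposure_usd = ccpa_violations * 7_500    # per-violation maximum, no aggregate cap

print(f"GDPR ceiling:  €{gdpr_ceiling_eur:,.0f}")   # €20,000,000
print(f"CCPA exposure: ${ccpa_exposure_usd:,.0f}")  # $750,000,000
```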

CCPA compliance is a data governance issue

Data Governance for Regulatory Compliance

While CCPA has a narrower geography and focus than GDPR, compliance is still a serious effort for organizations under its scope. And as data-driven business continues to expand, so too will the pressure on lawmakers to regulate how organizations process data. Consider the Facebook hearings and the subsequent inquiries into Google and Twitter, for example.

Regulatory compliance remains a key driver for data governance. After all, to understand how to meet data regulations, an organization must first understand its data.

An effective data governance initiative should enable just that by giving an organization the tools to (a simplified example follows the list):

  • Discover data: Identify and interrogate metadata from various data management silos
  • Harvest data: Automate the collection of metadata from various data management silos and consolidate it into a single source
  • Structure data: Connect physical metadata to specific business terms and definitions and reusable design standards
  • Analyze data: Understand how data relates to the business and what attributes it has
  • Map data flows: Identify where to integrate data and track how it moves and transforms
  • Govern data: Develop a governance model to manage standards and policies and set best practices
  • Socialize data: Enable all stakeholders to see data in one place in their own context
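
As a minimal sketch of the “harvest” and “structure” steps, assume column metadata has already been extracted from each silo (in practice it would come from system catalogs or platform APIs); consolidating it into a single catalog might look like this:

```python
# Minimal sketch: consolidate column-level metadata harvested from separate
# silos into one catalog, then attach a shared business term ("structure").
# The silo contents are assumed to have been extracted already.

warehouse_metadata = [
    {"system": "dw",  "table": "dim_customer", "column": "customer_name", "type": "VARCHAR(200)"},
]
crm_metadata = [
    {"system": "crm", "table": "customer",     "column": "full_name",     "type": "NVARCHAR(200)"},
]

catalog = {}
for record in warehouse_metadata + crm_metadata:
    key = f"{record['system']}.{record['table']}.{record['column']}"
    catalog[key] = {"type": record["type"], "business_term": None, "steward": None}

for key in ("dw.dim_customer.customer_name", "crm.customer.full_name"):
    catalog[key]["business_term"] = "Customer Name"

print(len(catalog), "columns cataloged under shared business terms")
```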

A Regulatory EDGE

The erwin EDGE software platform creates an “enterprise data governance experience” to transform how all stakeholders discover, understand, govern and socialize data assets. It includes enterprise modeling, data cataloging and data literacy capabilities, giving organizations visibility and control over their disparate architectures and all the supporting data.

Both IT and business stakeholders have role-based, self-service access to the information they need to collaborate in making strategic decisions. And because many of the associated processes can be automated, you reduce errors and increase the speed and quality of your data pipeline. This data intelligence unlocks knowledge and value.

The erwin EDGE provides the most agile, efficient and cost-effective means of launching and sustaining a strategic and comprehensive data governance initiative, whether you wish to deploy on premise or in the cloud. But you don’t have to implement every component of the erwin EDGE all at once to see strategic value.

Because of the platform’s federated design, you can address your organization’s most urgent needs, such as regulatory compliance, first. Then you can proactively address other organization objectives, such as operational efficiency, revenue growth, increasing customer satisfaction and improving overall decision-making.

You can learn more about leveraging data governance to navigate the changing tide of data regulations here.

Are you compliant with data regulations?


Keeping Up with New Data Protection Regulations

Keeping up with new data protection regulations can be difficult, and the latest – the General Data Protection Regulation (GDPR) – isn’t the only new data protection regulation organizations should be aware of.

California recently passed a law that gives residents the right to control the data companies collect about them. Some suggest the California Consumer Privacy Act (CCPA), which takes effect January 1, 2020, sets a precedent other states will follow by empowering consumers to set limits on how companies can use their personal information.

In fact, organizations should expect increasing pressure on lawmakers to introduce new data protection regulations. A number of high-profile data breaches and scandals have increased public awareness of the issue.

Facebook was in the news again last week for another major problem around the transparency of its user data, and the tech giant is also reportedly facing 10 GDPR investigations in Ireland – along with Apple, LinkedIn and Twitter.

Some industries, such as healthcare and financial services, have been subject to stringent data regulations for years: GDPR now joins the Health Insurance Portability and Accountability Act (HIPAA), the Payment Card Industry Data Security Standard (PCI DSS) and the Basel Committee on Banking Supervision (BCBS).

Due to these pre-existing regulations, organizations operating within these sectors, as well as insurance, had some of the GDPR compliance bases covered in advance.

Other industries had their own levels of preparedness, based on the nature of their operations. For example, many retailers have robust, data-driven e-commerce operations that are international. Such businesses are bound to comply with varying local standards, especially when dealing with personally identifiable information (PII).

Smaller, more brick-and-mortar-focused retailers may have had to start from scratch.

But starting position aside, every data-driven organization should strive for a better standard of data management — and not just for compliance’s sake. After all, organizations are now realizing that data is one of their most valuable assets.

New Data Protection Regulations – Always Be Prepared

When it comes to new data protection regulations in the face of constant data-driven change, it’s a matter of when, not if.

As they say, the best defense is a good offense. Fortunately, whenever the time comes, the first port of call will always be data governance, so organizations can prepare.

Effective compliance with new data protection regulations requires a robust understanding of the “what, where and who” in terms of data and the stakeholders with access to it (i.e., employees).

The Regulatory Rationale for Integrating Data Management & Data Governance

This is also true for existing data regulations. Compliance is an on-going requirement, so efforts to become compliant should not be treated as static events.

Less than four months before GDPR came into effect, only 6 percent of enterprises claimed they were prepared for it. Many of these organizations will recall a number of stressful weeks – or even months – tidying up their databases and their data management processes and policies.

This time and money was spent reactively, at the expense of proactive efforts to grow the business.

The implementation and subsequent observation of a strong data governance initiative ensures organizations won’t be put on the spot going forward. Should an audit come up, current projects won’t suddenly be derailed in a reenactment of pre-GDPR panic.

New Data Regulations

Data Governance: The Foundation for Compliance

The first step to compliance with new – or old – data protection regulations is data governance.

A robust and effective data governance initiative ensures an organization understands where security should be focused.

By adopting a data governance platform that enables you to automatically tag sensitive data and track its lineage, you can ensure nothing falls through the cracks.

Your chosen data governance solution should enable you to automate the scanning, detection and tagging of sensitive data by (a minimal pattern-matching sketch follows the list):

  • Monitoring and controlling sensitive data – Gain better visibility and control across the enterprise to identify data security threats and reduce associated risks.
  • Enriching business data elements for sensitive data discovery – By leveraging a comprehensive mechanism to define business data elements for PII, PHI and PCI across database systems, cloud and Big Data stores, you can easily identify sensitive data based on a set of algorithms and data patterns.
  • Providing metadata and value-based analysis – Simplify the discovery and classification of sensitive data based on metadata and data value patterns and algorithms. Organizations can define business data elements and rules to identify and locate sensitive data, including PII, PHI and PCI.
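
As a minimal sketch of what pattern-based detection means in practice, the following example tags a column as sensitive when most of its sampled values match a known pattern. The regular expressions, threshold and sample values are illustrative assumptions, not any product’s actual detection rules.

```python
# Minimal sketch of pattern-based sensitive data detection. The regular
# expressions and column samples are illustrative; real solutions combine
# metadata (column names), value patterns and validation algorithms.

import re

PATTERNS = {
    "email": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "us_ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "credit_card": re.compile(r"^\d{13,16}$"),
}

def classify_column(sample_values, threshold=0.8):
    """Tag a column with any pattern matched by most of its sampled values."""
    tags = []
    for name, pattern in PATTERNS.items():
        hits = sum(bool(pattern.match(str(v))) for v in sample_values)
        if sample_values and hits / len(sample_values) >= threshold:
            tags.append(name)
    return tags

print(classify_column(["ann@example.com", "raj@example.com"]))  # ['email']
print(classify_column(["123-45-6789", "987-65-4321"]))          # ['us_ssn']
```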

With these precautionary steps, organizations are primed to respond if a data breach occurs. Having a well governed data ecosystem with data lineage capabilities means issues can be quickly identified.

Additionally, if any follow-up is necessary – such as with GDPR’s data breach reporting time requirements – it can be handled swiftly and in accordance with regulations.

It’s also important to understand that the benefits of data governance don’t stop with regulatory compliance.

A better understanding of what data you have, where it’s stored and the history of its use and access isn’t only beneficial in fending off non-compliance repercussions. In fact, such an understanding is arguably better put to use proactively.

Data governance improves data quality standards, enables better decision-making and ensures businesses can have more confidence in the data informing those decisions.

The same mechanisms that protect data by controlling its access also can be leveraged to make data more easily discoverable to approved parties – improving operational efficiency.

All in all, the cumulative result of data governance’s influence on data-driven businesses both drives revenue (through greater efficiency) and reduces costs (fewer errors, false starts, etc.).

To learn more about data governance and the regulatory rationale for its implementation, get our free guide here.

DG RediChek


Data Governance Frameworks: The Key to Successful Data Governance Implementation

A strong data governance framework is central to successful data governance implementation in any data-driven organization because it ensures that data is properly maintained, protected and maximized.

But despite this fact, enterprises often face push back when implementing a new data governance initiative or trying to mature an existing one.

Let’s assume you have some form of informal data governance operation with some strengths to build on and some weaknesses to correct. Some parts of the organization are engaged and behind the initiative, while others are skeptical about its relevance or benefits.

Some other common data governance implementation obstacles include:

  • Questions about where to begin and how to prioritize which data streams to govern first
  • Issues regarding data quality and ownership
  • Concerns about data lineage
  • Competing projects and resources (time, people and funding)

By using a data governance framework, organizations can formalize their data governance implementation and ensure ongoing adherence to it. This addresses common concerns, including data quality and data lineage, and provides a clear path to successful data governance implementation.

In this blog, we will cover three key steps to successful data governance implementation. We will also look into how we can expand the scope and depth of a data governance framework to ensure data governance standards remain high.

Data Governance Implementation in 3 Steps

When maturing or implementing data governance and/or a data governance framework, an accurate assessment of the ‘here and now’ is key. Then you can rethink the path forward, identifying any current policies or business processes that should be incorporated and being careful to avoid repeating the mistakes of prior iterations.

With this in mind, here are three steps we recommend for implementing data governance and a data governance framework.

Data Governance Framework

Step 1: Shift the culture toward data governance

Data governance isn’t something to set and forget; it’s a strategic approach that needs to evolve over time in response to new opportunities and challenges. Therefore, a successful data governance framework has to become part of the organization’s culture but such a shift requires listening – and remembering that it’s about people, empowerment and accountability.

In most cases, a new data governance framework requires people – those in IT and across the business, including risk management and information security – to change how they work. Any concerns they raise or recommendations they make should be considered. You can encourage feedback through surveys, workshops and open dialog.

Once input has been discussed and a plan agreed upon, it is critical to update roles and responsibilities, provide training and ensure ongoing communication. Many organizations now have internal certifications for different data governance roles, and those who earn them wear the badges with pride.

A top-down management approach will get a data governance initiative off the ground, but only bottom-up cultural adoption will carry it out.

Step 2: Refine the data governance framework

The right capabilities and tools are important for fueling an accurate, real-time data pipeline and governing it for maximum security, quality and value. For example:

Data cataloging: Organizations implementing a data governance framework will benefit from automated metadata harvesting, data mapping, code generation and data lineage with reference data management, lifecycle management and data quality. With these capabilities, you can efficiently integrate and activate enterprise data within a single, unified catalog in accordance with business requirements.

Data literacy: Being able to discover what data is available and understand what it means in common, standardized terms is important because data elements may mean different things to different parts of the organization. A business glossary answers this need, as does the ability for stakeholders to view data relevant to their roles and understand it within a business context through a role-based portal.

Such tools are further enhanced if they can be integrated across data and business architectures and when they promote self-service and collaboration, which also are important to the cultural shift.

 


Step 3: Prioritize then scale the data governance framework

Because data governance is on-going, it’s important to prioritize the initial areas of focus and scale from there. Organizations that start with 30 to 50 data items are generally more successful than those that attempt more than 1,000 in the early stages.

Find some representative (familiar) data items and create examples for data ownership, quality, lineage and definition so stakeholders can see real examples of the data governance framework in action (a minimal illustration follows the bullets). For example:

  • Data ownership model showing a data item, its definition, producers, consumers, stewards and quality rules (for profiling)
  • Workflow showing the creation, enrichment and approval of the above data item to demonstrate collaboration
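
As a minimal illustration, a single governed data item with its definition, ownership, quality rules and approval workflow might be recorded like this (all names and rules are hypothetical):

```python
# Illustrative only: one way to represent a governed data item, its ownership
# and its quality rules so stakeholders can see the framework in action.

customer_email = {
    "data_item": "Customer Email Address",
    "definition": "Primary email used to contact the customer",
    "producers": ["CRM application", "e-commerce checkout"],
    "consumers": ["Marketing analytics", "Support desk"],
    "steward": "Customer Data Steward",
    "quality_rules": [
        "must match a valid email pattern",
        "must be unique per active customer",
        "null rate below 2% (profiling threshold)",
    ],
    "approval_workflow": ["drafted by steward", "reviewed by data owner", "published to catalog"],
}

print(customer_email["steward"], "owns", customer_email["data_item"])
```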

Whether your organization is just adopting data governance or the goal is to refine an existing data governance framework, the erwin DG RediChek will provide helpful insights to guide you in the journey.


Digital Transformation in Municipal Government: The Hidden Force Powering Smart Cities

Smart cities are changing the world.

When you think of real-time, data-driven experiences and modern applications to accomplish tasks faster and easier, your local town or city government probably doesn’t come to mind. But municipal government is starting to embrace digital transformation and therefore data governance.

Municipal government has never been an area in which to look for tech innovation. Perpetually strapped for resources and budget, often relying on legacy applications and infrastructure, and perfectly happy being available during regular business hours (save for emergency responders), most municipal governments lacked the ability and motivation to (as they say in the private sector) digitally transform. Then an odd thing happened – the rest of the world started transforming.

If you shop at a retailer that doesn’t deliver a modern, personalized experience, thousands more retailers are just a click away. But people rarely pick up and move to a new city because the new city offers a better website or mobile app. The motivation for municipal governments to transform simply isn’t there in the same way it is for the private sector.

But there are some things many city residents care about deeply: public safety, quality of life, how their tax dollars are spent, and the ability to do business with their local government when they want, not when it’s convenient for the municipality. And much like the private sector, better decisions around all of these concerns can be made when accurate, timely data is available to help inform them.

Digital transformation in municipal government is taking place in two main areas today: constituent services and the “smart cities” movement.

Digital Transformation in Municipal Government: Being “Smart” About It

The ability to serve constituents easily and efficiently is of increasing importance and a key objective of digital transformation in municipal government. It’s a direct result of the data-driven customer experiences that are increasingly the norm in the private sector.

Residents want the ability to pay their taxes online, report a pothole from their phone, and generally make it easier to interact with their local officials and services. This can be accomplished with dashboards and constituent portals.

The smart cities movement refers to the broad effort of municipal governments to incorporate sensors, data collection and analysis to improve responses to everything from rush-hour traffic to air quality to crime prevention. When the McKinsey Global Institute examined smart technologies that could be deployed by cities, it found that the public sector would be the natural owner of 70 percent of the applications it reviewed.

“Cities are getting in on the data game,” says Danny Sandwell, product marketing director at erwin, Inc. And with information serving as the lifeblood of many of these projects, the effectiveness of the services offered, the return on the investments in hardware and software, and the happiness of the users all depend on timely, accurate and effective data.

These initiatives present a pretty radical departure from the way cities have traditionally been managed.

A constituent portal, for example, requires that users can be identified, authenticated and then have access to information that resides in various departments, such as the tax collector to view and pay taxes, the building department to view a building permit, and the parking authority to manage public parking permits.

For many municipalities, this is uncharted territory.

Smart Cities

Data Governance: The Force Powering Smart Cities

The efficiencies offered by smart city technologies only exist if the data leads to a proper allocation of resources.

If you can identify an increase in crime in a certain neighborhood, for example, you can increase police patrols in response. But if the data is inaccurate, those patrols are wasted while other neighborhoods experience a rise in crime.

Now that they’re in the data game, it’s time for municipal governments to understand data governance – the driving force behind any successful data-driven operation. When you have the ability to understand all of the information related to a piece of data, you have more confidence in how it is analyzed, used and protected.

Data governance doesn’t take place at a single application or in the data warehouse. It needs to be woven into the enterprise architecture and processes of the municipality to ensure data is accurate, timely and accessible to those who need it (and inaccessible to everyone else).

When this all comes together – good data, solid analytics and improved services for residents – the results can be quite striking. New efficiencies will make municipal governments better stewards of tax dollars. An improved quality of life can lift tax revenue by making the city more appealing to citizens and developers.

There’s a lot for cities to gain if they get in the data game. And truly smart cities will make sure they play the game right with effective data governance.

Benefits of Data Governance


Digital Transformation Examples: How Data Is Transforming the Hospitality Industry

The rate at which organizations have adopted data-driven strategies means there are a wealth of digital transformation examples for organizations to draw from.

By now, you probably recognize this recurring pattern in the discussions about digital transformation:

  • An industry set in its ways slowly moves toward using information technology to create efficiencies, automate processes or help identify new customer or product opportunities.
  • All is going fine until a new kid on the block, born in the age of IT and the internet, quickly starts to create buzz and redefine what customers expect from the industry.
  • To keep pace, the industry stalwarts rush into catch-up mode but inevitably make mistakes. ROI doesn’t meet expectations, the customer experience isn’t quite right, and data gets exposed or mishandled.

There’s one industry we’re all familiar with that welcomes billions of global customers every year, is in the midst of a strong economic run, is dealing with high-profile disruptors, and suffered a very public data breach at one of its storied brands in 2018 that raised eyebrows around the world.

Welcome to the hospitality industry.

The hotel and hospitality industry was expected to see 5 to 6 percent growth in 2018, part of an impressive run of performance fueled by steady demand, improved midmarket offerings, and a new supply of travelers from developing regions.

All this despite challenges from upstarts like Airbnb, HomeAway and Couchsurfing, plus a data breach at Marriott/Starwood that exposed the data of 500 million customers.

Digital Transformation Examples: Data & the Hospitality Industry

Online start-ups such as Airbnb, HomeAway and Couchsurfing are some of the most clear-cut digital transformation examples in the hospitality industry.

Digital Transformation Examples: Hospitality – Data, Data Everywhere

As with other industries, digital transformation examples in the hospitality industry are abundant – and in turn, those businesses are awash in data with sources that include:

  • Data generated by reservations and payments
  • The data hotels collect to drive their loyalty programs
  • Data used to enhance the customer experience
  • Data shared as part of the billions of handoffs between hotel chains and the various booking sites and agencies that travelers use to plan trips

But all of this data, which now permeates the industry, is relatively new.

“IT wasn’t always a massive priority for [the hospitality industry],” says Danny Sandwell, director of product marketing for erwin, Inc. “So now there’s a lot of data, but these organizations often have a weak backend.”

The combination of data and analytics carries a great deal of potential for companies in the hospitality industry. Today’s demanding customers want experiences, not just a bed to sleep in; they want to do business with brands that understand their likes and dislikes; and that send offers relevant to their interests and desired destinations.

All of this is possible when a business collects and analyzes data on the scale that many hotel brands do. However, all of this can fail loudly if there is a problem with that data.

Getting a return on their investments in analytics and marketing technology requires hospitality companies to thoroughly understand the source of their data, the quality of the data, and the relevance of the data. This is where data governance comes into play.

When hospitality businesses are confident in their data, they can use it a number of ways, including:

  • Customer Experience: Quality data can be used to power a best-in-class experience for hotels in a number of areas, including the Web experience, mobile experience, and the in-person guest experience. This is similar to the multi-channel strategy of retailers hoping to deliver memorable and helpful experiences based on what they know about customers, including the ability to make predictions and deliver cross-sell and up-sell opportunities. 
  • Mergers and Acquisitions: Hospitality industry disruptors have some industry players thinking about boosting their businesses via mergers and acquisitions. Good data can identify the best targets and help discover the regions or price points where M&A makes the most sense and will deliver the most value. Accurate data can also help pinpoint the true cost of M&A activity.
  • Security: Marriott’s data breach, which actually began as a breach at Starwood before Marriott acquired it, highlights the importance of data security in the hospitality industry. Strong data governance can help prevent breaches, as well as help control breaches so organizations more quickly identify the scope and action behind a breach, an important part of limiting damage.
  • Partnerships: The hospitality industry is increasingly connected, not just because of booking sites working with dozens of hotel brands but also because of tour operators turning a hotel stay into an experience and transportation companies arranging travel for guests. Providing a room is no longer enough.

Data governance is not an application or a tool. It is a strategy. When it is done correctly and it is deployed in a holistic manner, data governance becomes woven into an organization’s business processes and enterprise architecture.

It then improves the organization’s ability to understand where its data is, where it came from, its value, its quality, and how the data is accessed and used by people and applications.

It’s this level of data maturity that provides comfort to employees – from IT staff to the front desk and everyone in between – that the data they are working with is accurate and helping them better perform their jobs and improve the way they serve customers.

Over the next few weeks, we’ll be looking closely at digital transformation examples in other sectors, including retail and government. Subscribe to stay in the loop.

GDPR White Paper


Four Use Cases Proving the Benefits of Metadata-Driven Automation

Organizations cannot hope to make the most of a data-driven strategy without at least some degree of metadata-driven automation.

The volume and variety of data has snowballed, and so has its velocity. As such, traditional – and mostly manual – processes associated with data management and data governance have broken down. They are time-consuming and prone to human error, making compliance, innovation and transformation initiatives more complicated, which is less than ideal in the information age.

So it’s safe to say that organizations can’t reap the rewards of their data without automation.

Data scientists and other data professionals can spend up to 80 percent of their time bogged down trying to understand source data or addressing errors and inconsistencies.

That’s time needed and better used for data analysis.

By implementing metadata-driven automation, organizations across industry can unleash the talents of their highly skilled, well paid data pros to focus on finding the goods: actionable insights that will fuel the business.

Metadata-Driven Automation

Metadata-Driven Automation in the BFSI Industry

The banking, financial services and insurance industry typically deals with higher data velocity and tighter regulations than most. This bureaucracy is rife with data management bottlenecks.

These bottlenecks are only made worse when organizations attempt to get by with systems and tools that are not purpose-built.

For example, manually managing data mappings for the enterprise data warehouse via MS Excel spreadsheets had become cumbersome and unsustainable for one BFSI company.

After embracing metadata-driven automation and custom code automation templates, it saved hundreds of thousands of dollars in code generation and development costs and achieved more work in less time with fewer resources. ROI on the automation solutions was realized within the first year.

Metadata-Driven Automation in the Pharmaceutical Industry

Despite its shortcomings, the Excel spreadsheet method for managing data mappings is common within many industries.

But with the amount of data organizations need to process in today’s business climate, this manual approach makes change management and determining end-to-end lineage a significant and time-consuming challenge.

One global pharmaceutical giant headquartered in the United States experienced such issues until it adopted metadata-driven automation. Then the pharma company was able to scan in all source and target system metadata and maintain it within a single repository. Users now view end-to-end data lineage from the source layer to the reporting layer within seconds.

On the whole, the implementation resulted in extraordinary time savings and a total cost reduction of 60 percent.

Metadata-Driven Automation in the Insurance Industry

Insurance is another industry that has to cope with high data velocity and stringent data regulations. Plus many organizations in this sector find that they’ve outgrown their systems.

For example, an insurance company using a CDMA product to centralize data mappings is probably missing certain critical features, such as versioning, impact analysis and lineage, which adds to costs, time to market and errors.

By adopting metadata-driven automation, organizations can standardize the pre-ETL data mapping process and better manage data integration through the change and release process. As a result, both internal data mapping and cross functional teams now have easy and fast web-based access to data mappings and valuable information like impact analysis and lineage.

Here is the story of a business that adopted such an approach and achieved operational excellence, an 80 percent reduction in delivery time, and ROI within 12 months.

Metadata-Driven Automation for a Non-Profit

Another common issue cited by organizations using manual data mapping is ballooning complexity and subsequent confusion.

Any organization expanding its data-driven focus without sufficiently maturing data management initiative(s) will experience this at some point.

One of the world’s largest humanitarian organizations, with millions of members and volunteers operating all over the world, was confronted with this exact issue.

It recognized the need for a solution to standardize the pre-ETL data mapping process to make data integration more efficient and cost-effective.

With metadata-driven automation, the organization would be able to scan and store metadata and data dictionaries in a central repository, as well as manage the business definitions and data dictionary for legacy systems contributing data to the enterprise data warehouse.

By adopting such an approach, the organization realized time savings across all IT development and cross-functional testing teams. Additionally, they were able to more easily manage mappings, code sets, reference data and data validation rules.

Again, ROI was achieved within a year.

A Universal Solution for Metadata-Driven Automation

Metadata-driven automation is a capability any organization can benefit from – regardless of industry, as demonstrated by the various real-world use cases chronicled here.

The erwin Automation Framework is a key component of the erwin EDGE platform for comprehensive data management and data governance.

With it, data professionals realize these industry-agnostic benefits:

  • Centralized and standardized code management with all automation templates stored in a governed repository
  • Better quality code and minimized rework
  • Business-driven data movement and transformation specifications
  • Superior data movement job designs based on best practices
  • Greater agility and faster time-to-value in data preparation, deployment and governance
  • Cross-platform support of scripting languages and data movement technologies

Learn more about metadata-driven automation as it relates to data preparation and enterprise data mapping.

Join one of our weekly erwin Mapping Manager demos.

Automate Data Mapping