Categories
erwin Expert Blog Data Governance

How Data Governance Protects Sensitive Data

 

Data governance reduces the risks associated with sensitive data.

Organizations are managing more data than ever. In fact, the global datasphere is projected to reach 175 zettabytes by 2025, according to IDC.

With more companies increasingly migrating their data to the cloud to ensure availability and scalability, the risks associated with data management and protection also are growing.

How can companies protect their enterprise data assets while ensuring their availability to stewards and consumers, minimizing costs and meeting data privacy requirements?

Data Security Starts with Data Governance

Lack of a solid data governance foundation increases the risk of data-security incidents. An assessment of the data breaches that crop up like weeds each year supports the conclusion that companies, absent data governance, wind up building security architectures strictly from a technical perspective.

Given that every company holds important information about, and relationships with, people based on the private data they provide, every business should understand the related risks and protect against them under the banner of data governance – and so avoid the costs and reputation damage that data breaches inflict. That’s especially true as data-driven enterprise momentum grows, along with self-service analytics that give users greater access to information, often without IT’s knowledge.

Indeed, with nearly everyone in the enterprise involved either in maintaining or using the company’s data, it only makes sense that both business and IT begin to work together to discover, understand, govern and socialize those assets. This should come as part of a data governance plan that emphasizes making all stakeholders responsible not only for enhancing data for business benefit, but also for reducing the risks that unfettered access to and use of it can pose.

With data catalog and literacy capabilities, you provide the context to keep relevant data private and secure – the assets available, their locations, the relationships between them, associated systems and processes, authorized users and guidelines for usage.

Without data governance, organizations can’t connect the dots across governance, security and privacy – and act accordingly. So they can’t answer these fundamental questions:

  • What data do we have and where is it now?
  • Where did it come from and how has it changed?
  • Is it sensitive data or are there any risks associated with it?
  • Who is authorized to use it and how?
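As an illustration, a bare-bones catalog entry answering those four questions might look like the following Python sketch; the field names and roles are hypothetical, not erwin’s schema:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """Minimal record a data catalog keeps to answer the four questions."""
    name: str                                            # what data do we have?
    location: str                                        # where is it now?
    source: str                                          # where did it come from?
    change_history: list = field(default_factory=list)   # how has it changed?
    sensitive: bool = False                              # is it sensitive data?
    authorized_roles: set = field(default_factory=set)   # who may use it, and how?

def can_access(entry: CatalogEntry, role: str) -> bool:
    # Sensitive assets are visible only to explicitly authorized roles.
    return (not entry.sensitive) or role in entry.authorized_roles

customers = CatalogEntry(
    name="customers", location="warehouse.sales.customers",
    source="crm_export", sensitive=True,
    authorized_roles={"data_steward"},
)
print(can_access(customers, "data_steward"))  # True
print(can_access(customers, "analyst"))       # False
```

Even this toy record shows why governance must come before security: until an asset is named, located and classified, there is nothing to attach an access rule to.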

When an organization knows what data it has, it can define that data’s business purpose. And knowing the business purpose translates into actively governing personal data against potential privacy and security violations.

Do You Know Where Your Sensitive Data Is?

Data is a valuable asset used to operate, manage and grow a business. While some of it rests in databases, data lakes and data warehouses, a large percentage is federated and integrated across the enterprise, introducing management and governance issues that must be addressed.

Knowing where sensitive data is located and properly governing it with policy rules, impact analysis and lineage views is critical for risk management, data audits and regulatory compliance.

For example, understanding and protecting sensitive data is especially critical for complying with privacy regulations like the European Union’s General Data Protection Regulation (GDPR).

The demands GDPR places on organizations are all-encompassing. Protecting what traditionally has been considered personally identifiable information (PII) — people’s names, addresses, government identification numbers and so forth — that a business collects and hosts is just the beginning of GDPR mandates. Personal data now means anything collected or stored that can be linked to an individual (right down to IP addresses), and the term doesn’t apply only to individual pieces of information but also to how they may be combined in revealing relationships. And it isn’t just about protecting the data your business gathers, processes and stores but also any data it may leverage from third-party sources.

When key data isn’t discovered, harvested, cataloged, defined and standardized as part of integration processes, audits may be flawed, putting your organization at risk.

Sensitive data – at rest or in motion – that exists in various forms across multiple systems must be automatically tagged, its lineage automatically documented, and its flows depicted so that it is easily found, and its usage easily traced across workflows.

Fortunately, tools are available to help automate the scanning, detection and tagging of sensitive data by:

  • Monitoring and controlling sensitive data: Better visibility and control across the enterprise to identify data security threats and reduce associated risks
  • Enriching business data elements for sensitive data discovery: Comprehensive mechanism to define business data elements for PII, PHI and PCI across database systems, cloud and Big Data stores to easily identify sensitive data based on a set of algorithms and data patterns
  • Providing metadata and value-based analysis: Discovery and classification of sensitive data based on metadata and data value patterns and algorithms. Organizations can define business data elements and rules to identify and locate sensitive data including PII, PHI, PCI and other sensitive information.
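To make the value-based approach concrete, here is a minimal Python sketch of pattern-driven sensitive-data discovery; the regex patterns and the 50-percent threshold are illustrative assumptions, and commercial tools combine far richer algorithms (checksums, dictionaries, context scoring):

```python
import re

# Hypothetical patterns; real tools use many more detection algorithms.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "ipv4":  re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
}

def tag_sensitive_columns(rows, sample_size=100, threshold=0.5):
    """Return {column: [pii_tags]} for columns where a majority of
    sampled values match a PII pattern."""
    sample = rows[:sample_size]
    tags = {}
    for column in sample[0]:
        values = [str(r[column]) for r in sample]
        for tag, pattern in PII_PATTERNS.items():
            hits = sum(1 for v in values if pattern.search(v))
            if hits / len(values) >= threshold:
                tags.setdefault(column, []).append(tag)
    return tags

rows = [
    {"id": 1, "contact": "ada@example.com", "note": "ok"},
    {"id": 2, "contact": "bob@example.org", "note": "n/a"},
]
print(tag_sensitive_columns(rows))  # {'contact': ['email']}
```

The sampling threshold matters: tagging on a single match would flood the catalog with false positives, while requiring a majority of values to match keeps tags meaningful.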

Minimizing Risk Exposure with Data Intelligence

Organizations suffering data losses won’t benefit from the money spent on security technologies or the time invested in developing data privacy classifications if they can’t get a handle on how their data is handled.

They also may face heavy fines and other penalties – not to mention bad PR.

Don’t let that happen to your organization.

A well-formed security architecture that is driven by and aligned with data intelligence is your best defense. Being prepared means you can minimize your risk exposure.

With erwin Data Intelligence by Quest, you’ll have an unfettered view of where sensitive data resides with the ability to seamlessly apply privacy rules and create access privileges.

Additionally, Quest’s acquisition of erwin brings the ability to mask, encrypt, redact and audit sensitive data for an automated and comprehensive solution to sensitive-data issues.

From risk management and regulatory compliance to innovation and digital transformation, you need data intelligence. With erwin by Quest, you will know your data so you can fully realize its business benefits.

[blog-cta header=”erwin Data Intelligence” body=”Click here to request a demo of erwin Data Intelligence by Quest.” button=”Request Demo” button_link=”https://s38605.p1254.sites.pressdns.com/erwin-data-intelligence-free-demo/” image=”https://s38605.p1254.sites.pressdns.com/wp-content/uploads/2018/11/iStock-914789708.jpg” ]


The Value of Data Governance and How to Quantify It

erwin recently hosted the second in its six-part webinar series on the practice of data governance and how to proactively deal with its complexities. Led by Frank Pörschmann of iDIGMA GmbH, an IT industry veteran and data governance strategist, the second webinar focused on “The Value of Data Governance & How to Quantify It.”

As Mr. Pörschmann highlighted at the beginning of the series, data governance works best when it is strongly aligned with the drivers, motivations and goals of the business.

The business drivers and motivation should be the starting point for any data governance initiative. If there is no clear end goal in sight, it will be difficult to get stakeholders on board. And with many competing projects and activities vying for people’s time, it must be clear to people why choosing data governance activities will have a direct benefit to them.

“Usually we talk about benefits which are rather qualitative measures, but what we need for decision-making processes are values,” Pörschmann says. “We need quantifiable results or expected results that are fact-based. And the interesting thing with data governance, it seems to be easier for organizations and teams to state the expected benefits.”

The Data Governance Productivity Matrix

In terms of quantifying data governance, Pörschmann cites the productivity matrix as a relatively simple way to calculate real numbers. He says, “the basic assumption is if an organization equips their managers with the appropriate capabilities and instruments, then it’s management’s obligation to realize productivity potential over time.”

According to IDC, professionals who work with data spend 80 percent of their time on data discovery, preparation and protection, and only 20 percent on analysis leading to insights.

Data governance maturity includes the ability to rely on automated and repeatable processes, which ultimately helps to increase productivity.

For example, automatically importing mappings from developers’ Excel sheets, flat files, Access and ETL tools into a comprehensive mappings inventory, complete with automatically generated and meaningful documentation of the mappings, is a powerful way to support governance while providing real insight into data movement — for data lineage and impact analysis — without interrupting system developers’ normal work methods.
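A toy version of that mapping import can be sketched in Python; the three-column spec and the generated documentation string are hypothetical stand-ins for a real mappings inventory:

```python
import csv
import io

# Hypothetical three-column mapping spec a developer might keep in a sheet.
mapping_csv = """source,target,transformation
crm.cust_nm,warehouse.customer.name,TRIM + UPPER
crm.cust_dob,warehouse.customer.birth_date,CAST TO DATE
"""

def import_mappings(fh):
    """Load developer mapping rows into an inventory, auto-generating
    human-readable documentation for each mapping."""
    inventory = []
    for row in csv.DictReader(fh):
        row["doc"] = (f"{row['source']} feeds {row['target']} "
                      f"via {row['transformation']}")
        inventory.append(row)
    return inventory

for m in import_mappings(io.StringIO(mapping_csv)):
    print(m["doc"])
# crm.cust_nm feeds warehouse.customer.name via TRIM + UPPER
# crm.cust_dob feeds warehouse.customer.birth_date via CAST TO DATE
```

The point of the sketch is the workflow, not the format: developers keep working in their sheets, and the inventory plus generated documentation come along for free.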

When data movement has been tracked and version-controlled, it’s possible to conduct data archeology — that is, reverse-engineering code from existing XML within the ETL layer — to uncover what has happened in the past and incorporate it into a mapping manager for fast and accurate recovery.
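Such data archeology can be illustrated in a few lines of Python; the XML layout below is an invented example, since every ETL tool stores jobs differently:

```python
import xml.etree.ElementTree as ET

# Hypothetical ETL job XML; real formats vary by tool.
job_xml = """
<job name="load_customers">
  <map source="stage.cust_nm" target="dw.customer.name"/>
  <map source="stage.cust_id" target="dw.customer.id"/>
</job>
"""

def recover_mappings(xml_text):
    """Reverse-engineer source-to-target mappings from ETL job XML so
    they can be loaded into a mapping manager."""
    root = ET.fromstring(xml_text)
    return [(m.get("source"), m.get("target")) for m in root.iter("map")]

print(recover_mappings(job_xml))
# [('stage.cust_nm', 'dw.customer.name'), ('stage.cust_id', 'dw.customer.id')]
```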

With automation, data professionals can meet the above needs at a fraction of the cost of the traditional, manual way. To summarize, just some of the benefits of data automation are:

  • Centralized and standardized code management with all automation templates stored in a governed repository
  • Better quality code and minimized rework
  • Business-driven data movement and transformation specifications
  • Superior data movement job designs based on best practices
  • Greater agility and faster time to value in data preparation, deployment and governance
  • Cross-platform support of scripting languages and data movement technologies

For example, one global pharmaceutical giant reduced cost by 70 percent and generated 95 percent of production code with “zero touch.” With automation, the company improved the time to business value and significantly reduced the costly re-work associated with error-prone manual processes.

Risk Management and Regulatory Compliance

Risk management, specifically around regulatory compliance, is an important use case to demonstrate the true value of data governance.

According to Pörschmann, risk management asks two main questions.

  1. How likely is a specific event to happen?
  2. What is the impact or damage if this event happens? (e.g., cost of repair, cost of reputation, etc.)
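Those two questions combine into the familiar likelihood-times-impact risk matrix, sketched here in Python with illustrative 1-5 scales and thresholds (any real risk team will have its own calibration):

```python
def risk_score(likelihood: int, impact: int) -> str:
    """Classify a risk from likelihood and impact, each rated 1-5."""
    score = likelihood * impact
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# e.g. a GDPR audit failure: fairly unlikely (2) but very damaging (5)
print(risk_score(2, 5))  # medium
print(risk_score(4, 5))  # high
```

Framing data risks in exactly this matrix is what lets the data team speak the risk team’s language.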

“You have to understand the concept or thinking of risk officers or the risk teams,” he says. The risk teams are process-oriented, and they understand how to calculate and cover IT risks. But to successfully communicate data risks to the risk management team, you need to understand how risk teams think in terms of the risk matrix.

Take the European Union’s General Data Protection Regulation (GDPR) as an example of a data cost. Your team needs to ask, “what is the likelihood that we will fail on data-based activities related to GDPR?” And then ask, “what can we do from the data side to reduce the impact or the total damage?”

But it’s not easy to design and deploy compliance in an environment that’s not well understood and difficult to maneuver in. Data governance enables organizations to plan and document how they will discover and understand their data within context, track its physical existence and lineage, and maximize its security, quality and value.

With the right technology, organizations can automate and accelerate regulatory compliance in five steps:

  1. Catalog systems. Harvest, enrich/transform and catalog data from a wide array of sources to enable any stakeholder to see the interrelationships of data assets across the organization.
  2. Govern PII “at rest”. Classify, flag and socialize the use and governance of personally identifiable information regardless of where it is stored.
  3. Govern PII “in motion”. Scan, catalog and map personally identifiable information to understand how it moves inside and outside the organization and how it changes along the way.
  4. Manage policies and rules. Govern business terminology in addition to data policies and rules, depicting relationships to physical data catalogs and the applications that use them with lineage and impact analysis views.
  5. Strengthen data security. Identify regulatory risks and guide the fortification of network and encryption security standards and policies by understanding where all personally identifiable information is stored, processed and used.
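Step 3, governing PII “in motion,” boils down to tracing a flow graph; here is a minimal Python sketch with invented system names standing in for data flows harvested from ETL jobs:

```python
from collections import deque

# Hypothetical data-flow edges harvested from ETL jobs.
flows = {
    "crm.customers":       ["stage.customers"],
    "stage.customers":     ["warehouse.customers", "marketing.export"],
    "warehouse.customers": ["bi.dashboard"],
}

def downstream_of(system, flows):
    """Breadth-first trace of every system a PII source feeds, so lineage
    and impact-analysis views can show where personal data ends up."""
    seen, queue = set(), deque([system])
    while queue:
        for nxt in flows.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(seen)

print(downstream_of("crm.customers", flows))
# ['bi.dashboard', 'marketing.export', 'stage.customers', 'warehouse.customers']
```

Once every flow is cataloged, answering a regulator’s “where does this person’s data go?” becomes a graph traversal rather than an archaeology project.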

It’s also important to understand that the benefits of data governance don’t stop with regulatory compliance.

A better understanding of what data you have, where it’s stored and the history of its use and access isn’t only beneficial in fending off non-compliance repercussions. In fact, such an understanding is arguably better put to use proactively.

Data governance improves data quality standards, it enables better decision-making and ensures businesses can have more confidence in the data informing those decisions.

[blog-cta header=”erwin DG Webinar Series” body=”Register now for the March 30 webinar ‘Data Governance Maturity & Tracking Progress.'” button=”Register Now” button_link=”https://register.gotowebinar.com/register/8531817018173466635″ image=”https://s38605.p1254.sites.pressdns.com/wp-content/uploads/2018/11/iStock-914789708.jpg” ]


The What & Why of Data Governance

Modern data governance is a strategic, ongoing and collaborative practice that enables organizations to discover and track their data, understand what it means within a business context, and maximize its security, quality and value.

It is the foundation for regulatory compliance and de-risking operations for competitive differentiation and growth.

However, while digital transformation and other data-driven initiatives are desired outcomes, few organizations know what data they have or where it is, and they struggle to integrate known data in various formats and numerous systems – especially if they don’t have a way to automate those processes.

But when IT-driven data management and business-oriented data governance work together in terms of personnel, processes and technology, decisions can be made and their impacts determined based on a full inventory of reliable information.

Recently, erwin held the first in a six-part webinar series on the practice of data governance and how to proactively deal with its complexities. Led by Frank Pörschmann of iDIGMA GmbH, an IT industry veteran and data governance strategist, it examined “The What & Why of Data Governance.”

The What: Data Governance Defined

Data governance has no standard definition. However, Dataversity defines it as “the practices and processes which help to ensure the formal management of data assets within an organization.”

At erwin by Quest, we further break down this definition by viewing data governance as a strategic, continuous commitment to ensuring organizations are able to discover and track data, accurately place it within the appropriate business context(s), and maximize its security, quality and value.

Mr. Pörschmann asked webinar attendees to stop trying to explain what data governance is to executives and clients. Instead, he suggested putting data governance into real-world scenarios to answer these questions: “What is the problem you believe data governance is the answer to?” or “How would you recognize having effective data governance in place?”

In essence, Mr. Pörschmann laid out the “enterprise data dilemma,” which stems from three important but difficult questions for an enterprise to answer: What data do we have? Where is it? And how do we get value from it?

Asking how you would recognize effective data governance in place is quite helpful in executive discussions, according to Mr. Pörschmann. And when you talk about that question at a high level, he says, you get a very simple answer: “the only thing we want to have is the right data with the right quality to the right person at the right time at the right cost.”

The Why: Data Governance Drivers

Why should companies care about data governance?

erwin’s 2020 State of Data Governance and Automation report found that better decision-making is the primary driver for data governance (62 percent), with analytics secondary (51 percent), and regulatory compliance coming in third (48 percent).

In the webinar, Mr. Pörschmann called out that the drivers of data governance are the same as those for digital transformation initiatives. “This is not surprising at all,” he said. “Because data is one of the success elements of a digital agenda or digital transformation agenda. So without having data governance and data management in place, no full digital transformation will be possible.”

Drivers of data governance

Data Privacy Regulations

While compliance is not the No. 1 driver for data governance, it’s still a major factor – especially since the rollout of the European Union’s General Data Protection Regulation (GDPR) in 2018.

According to Mr. Pörschmann, many decision-makers believe that if they get GDPR right, they’ll be fine and can move on to other projects. But he cautions that “this [notion] is something which is not really likely to happen.”

For the EU, he warned, organizations need to prepare for the Digital Single Market, agreed on last year by the European Parliament and Commission. With it come clear definitions and rules on data access and exchange, especially across digital platforms, as well as regulations and instruments to enforce data ownership. He noted, “Companies will be forced to share some specific data which is relevant for public security, i.e., reduction of carbon dioxide. So companies will be forced to classify their data and to find mechanisms to share it with such platforms.”

GDPR is also proving to be the de facto model for data privacy across the United States. The new Virginia Consumer Data Protection Act, which was modeled on the California Consumer Privacy Act (CCPA), shares many of the same requirements as GDPR, as does the California Privacy Rights Act (CPRA).

Like CCPA, the Virginia bill would give consumers the right to access their data, correct inaccuracies, and request the deletion of information. Virginia residents also would be able to opt out of data collection.

Nevada, Vermont, Maine, New York, Washington, Oklahoma and Utah also are leading the way with some type of consumer privacy regulation. Several other bills are on the legislative docket in Alabama, Arizona, Florida, Connecticut and Kentucky, all of which follow a similar format to the CCPA.

Stop Wasting Time

In addition to drivers like digital transformation and compliance, it’s really important to look at the effect of poor data on enterprise efficiency/productivity.

Respondents to McKinsey’s 2019 Global Data Transformation Survey reported that an average of 30 percent of their total enterprise time was spent on non-value-added tasks because of poor data quality and availability.

Wasted time is also an unfortunate reality for many data stewards, who spend 80 percent of their time finding, cleaning and reorganizing huge amounts of data, and only 20 percent of their time on actual data analysis.

According to erwin’s 2020 report, about 70 percent of respondents – a combination of roles from data architects to executive managers – said they spent an average of 10 or more hours per week on data-related activities.

The Benefits of erwin Data Intelligence

erwin Data Intelligence by Quest supports enterprise data governance, digital transformation and any effort that relies on data for favorable outcomes.

The software suite combines data catalog and data literacy capabilities for greater awareness of and access to available data assets, guidance on their use, and guardrails to ensure data policies and best practices are followed.

erwin Data Intelligence automatically harvests, transforms and feeds metadata from a wide array of data sources, operational processes, business applications and data models into a central catalog. Then it is accessible and understandable via role-based, contextual views so stakeholders can make strategic decisions based on accurate insights.
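Role-based, contextual views can be thought of as filters over the central catalog; the following Python sketch uses invented roles and classifications, not erwin’s actual model:

```python
# Hypothetical catalog entries harvested from several sources.
catalog = [
    {"asset": "customers", "source": "crm", "classification": "PII"},
    {"asset": "orders",    "source": "erp", "classification": "internal"},
]

# Each role maps to a predicate deciding which entries it may browse.
ROLE_VIEWS = {
    "analyst":      lambda e: e["classification"] != "PII",
    "data_steward": lambda e: True,   # stewards see every asset
}

def contextual_view(role):
    """Return the role-based slice of the catalog a stakeholder may browse."""
    allow = ROLE_VIEWS[role]
    return [e for e in catalog if allow(e)]

print([e["asset"] for e in contextual_view("analyst")])       # ['orders']
print([e["asset"] for e in contextual_view("data_steward")])  # ['customers', 'orders']
```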

You can request a demo of erwin Data Intelligence here.

[blog-cta header=”Webinar: The Value of Data Governance & How to Quantify It” body=”Join us March 15 at 10 a.m. ET for the second webinar in this series, “The Value of Data Governance & How to Quantify It.” Mr. Pörschmann will discuss how justifying a data governance program requires building a solid business case in which you can prove its value.” button=”Register Now” button_link=”https://attendee.gotowebinar.com/register/5489626673791671307″ image=”https://s38605.p1254.sites.pressdns.com/wp-content/uploads/2018/11/iStock-914789708.jpg” ]


In Times of Rapid Change, Business Process Modeling Becomes a Critical Tool

With the help of business process modeling (BPM), organizations can visualize processes and all the associated information, identifying the areas ripe for innovation, improvement or reorganization.

In the blink of an eye, COVID-19 has disrupted all industries and quickly accelerated their plans for digital transformation. As part of those transformations, businesses are moving quickly from on-premises systems to the cloud and therefore need to make business process models available to everyone within the organization so they understand what data is tied to which applications and what processes are in place.

There’s a clear connection between business process modeling and digital transformation initiatives. With it, an organization can explore models to understand information assets within a business context, from internal operations to full customer experiences.

This practice identifies and drives digital transformation opportunities to increase revenue while limiting risks and avoiding regulatory and compliance gaffes.

Business Process Data Governance

Bringing IT and Business Together to Make More Informed Decisions

Developing a shared repository is key to aligning IT systems to accomplish business strategies, reducing the time it takes to make decisions and accelerating solution delivery.

It also serves to operationalize and govern mission-critical information by making it available to the wider enterprise at the right levels to identify synergies and ensure the appropriate collaboration.

One customer says his company realized early on that there’s a difference between business expertise and process expertise, and when you partner the two you really start to see the opportunities for success.

By bringing your business and IT together via BPM, you create a single point of truth within your organization — delivered to stakeholders within the context of their roles.

You then can understand where your data is, how to find it, how to monetize it, how to report on it and how to visualize it – all in an accessible format that supports cataloging, mapping and lineage, tying business and IT together to make more informed decisions.

BPM for Regulatory Compliance

Business process modeling is also critical for risk management and regulatory compliance. When thousands of employees need to know what compliance processes to follow, such as those associated with the European Union’s General Data Protection Regulation (GDPR), ensuring not only access to proper documentation but current, updated information is critical.

Industry and government regulations affect businesses that work in or do business with any number of industries or in specific geographies. Industry-specific regulations in areas like healthcare, pharmaceuticals and financial services have been in place for some time.

Now, broader mandates like GDPR and the California Consumer Privacy Act (CCPA) require businesses across industries to think about their compliance efforts. Business process modeling helps organizations prove what they are doing to meet compliance requirements and understand how changes to their processes impact compliance efforts (and vice versa).

This same customer says, “The biggest bang for the buck is having a single platform, a one-stop shop, for when you’re working with auditors. You go to one place that is your source of truth: Here are our processes; here’s how we have implemented these controls; here is the list of our controls and where they’re implemented in our business.”

He also notes that a single BPM platform “helps cut through a lot of questions and get right to the heart of the matter.” As a result, the company has had positive audit findings and results because they have a structure, a plan, and it’s easy to see the connection between how they’re ensuring their controls are adhered to and where those results are in their business processes.

Change Is Constant

Heraclitus, the Greek philosopher, said, “The only constant in life is change.” This applies to business as well. Today things are changing quickly, and in the current landscape, executives are not going to wait around for months as impact analyses are formulated. They want actionable intelligence – fast.

For business process architects, being able to manage change and address key issues is what keeps the job function highly relevant to stakeholders. The key point is that useful change comes from routinely looking at process models and spotting a sub-optimality. Business process modeling supports many beneficial use cases and transformation projects used to empower employees and therefore better serve customers.

Organizational success depends on agility and adaptability in responding to change across the enterprise, both planned and unplanned. To be agile and responsive to changes in markets and consumer demands, you need a visual representation of what your business does and how it does it.

Companies that maintain accurate business process models also are well-positioned to analyze and optimize end-to-end process threads—lead-to-cash, problem-to-resolution or hire-to-retire, for example—that contribute to strategic business objectives, such as improving customer journeys or maximizing employee retention.

They also can slice and dice their models in multiple other ways, such as by functional hierarchies to understand what business groups organize or participate in processes as a step in driving better collaboration or greater efficiencies.

erwin Evolve enables communication and collaboration across the enterprise with reliable tools that make it possible to quickly and accurately gather information, make decisions, and then ensure consistent standards, policies and processes are established and available for consumption internally and externally as required.

Try erwin Evolve for yourself in a no-cost, risk-free trial.


Enterprise Architecture and Business Process Modeling Tools Have Evolved

Enterprise architecture (EA) and business process (BP) modeling tools are evolving at a rapid pace. They are being employed more strategically across the wider organization to transform some of business’s most important value streams.

Recently, Glassdoor named enterprise architecture the top tech job in the UK, indicating its increasing importance to the enterprise in the tech and data-driven world.

Whether documenting systems and technology, designing processes and value streams, or managing innovation and change, organizations need flexible but powerful EA and BP tools they can rely on for collecting relevant information for decision-making.

It’s like constructing a building or even a city – you need a blueprint to understand what goes where, how everything fits together to support the structure, where you have room to grow, and if it will be feasible to knock down any walls if you need to.

 

Data-Driven Enterprise Architecture

 

Without a picture of what’s what and the interdependencies, your enterprise can’t make changes at speed and scale to serve its needs.

Recognizing this evolution, erwin has enhanced and repackaged its EA/BP platform as erwin Evolve.

The combined solution enables organizations to map IT capabilities to the business functions they support and determine how people, processes, data, technologies and applications interact to ensure alignment in achieving enterprise objectives.

These initiatives can include digital transformation, cloud migration, portfolio and infrastructure rationalization, regulatory compliance, mergers and acquisitions, and innovation management.

Regulatory Compliance Through Enterprise Architecture & Business Process Modeling Software

A North American banking group is using erwin Evolve to integrate information across the organization and provide better governance to boost business agility. Developing a shared repository was key to aligning IT systems to accomplish business strategies, reducing the time it takes to make decisions, and accelerating solution delivery.

It also operationalizes and governs mission-critical information by making it available to the wider enterprise at the right levels to identify synergies and ensure the appropriate collaboration.

EA and BP modeling are both critical for risk management and regulatory compliance, a major concern for financial services customers like the one above when it comes to ever-changing regulations on money laundering, fraud and more. erwin helps model, manage and transform mission-critical value streams across industries, as well as identify sensitive information.

Additionally, when thousands of employees need to know what compliance processes to follow, such as those associated with regulations like the General Data Protection Regulation (GDPR), ensuring not only access to proper documentation but current, updated information is critical.

The Advantages of Enterprise Architecture & Business Process Modeling from erwin

The platform’s adaptability leads global giants in critical infrastructure, financial services, healthcare, manufacturing and pharmaceuticals to deploy what is now erwin Evolve for both EA and BP use cases. Its unique advantages are:

  • Integrated, Web-Based Modeling & Diagramming: Harmonize EA/BP capabilities with a robust, flexible and web-based modeling and diagramming interface easy for all stakeholders to use.
  • High-Performance, Scalable & Centralized Repository: See an integrated set of views for EA and BP content in a central, enterprise-strength repository capable of supporting thousands of global users.
  • Configurable Platform with Role-Based Views: Configure the metamodel, frameworks and user interface for an integrated, single source of truth with different views for different stakeholders based on their roles and information needs.
  • Visualizations & Dashboards: View mission-critical data in the central repository in the form of user-friendly automated visualizations, dashboards and diagrams.
  • Third-Party Integrations: Synchronize data with such enterprise applications as CAST, Cloud Health, RSA Archer, ServiceNow and Zendesk.
  • Professional Services: Tap into the knowledge of our veteran EA and BP consultants for help with customizations and integrations, including support for ArchiMate.

erwin Evolve 2020’s specific enhancements include web-based diagramming for non-IT users, stronger document generation and analytics, TOGAF support, improved modeling and navigation through inferred relationships, new API extensions, and modular packaging so customers can choose the components that best meet their needs.

erwin Evolve is also part of the erwin EDGE with data modeling, data catalog and data literacy capabilities for overall data intelligence.


Categories
erwin Expert Blog

How Metadata Makes Data Meaningful

Metadata is an important part of data governance, and as a result, most nascent data governance programs are rife with project plans for assessing and documenting metadata. But in many scenarios, it seems that the underlying driver of metadata collection projects is that it’s just something you do for data governance.

So most early-stage data governance managers kick off a series of projects to profile data, make inferences about data element structure and format, and store the presumptive metadata in some metadata repository. But are these rampant and often uncontrolled projects to collect metadata properly motivated?

There is rarely a clear directive about how metadata is used. Therefore, prior to launching metadata collection tasks, it is important to specify how the knowledge embedded within the corporate metadata should be used.

Managing metadata should not be a sub-goal of data governance. Today, metadata is the heart of enterprise data management and governance/intelligence efforts, so it deserves a clear strategy of its own rather than being just something you do.


What Is Metadata?

Quite simply, metadata is data about data. It’s generated every time data is captured at a source, accessed by users, moved through an organization, integrated or augmented with other data from other sources, profiled, cleansed and analyzed. Metadata is valuable because it provides information about the attributes of data elements that can be used to guide strategic and operational decision-making. It answers these important questions:

  • What data do we have?
  • Where did it come from?
  • Where is it now?
  • How has it changed since it was originally created or captured?
  • Who is authorized to use it and how?
  • Is it sensitive or are there any risks associated with it?
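
As a rough sketch, a single catalog entry might record the answers to these questions in one structure; the field names below are illustrative, not any particular tool's schema:

```python
from dataclasses import dataclass, field

@dataclass
class MetadataRecord:
    """Illustrative metadata for one data element (hypothetical field names)."""
    name: str                  # What data do we have?
    source_system: str         # Where did it come from?
    current_location: str      # Where is it now?
    change_history: list = field(default_factory=list)    # How has it changed?
    authorized_roles: list = field(default_factory=list)  # Who may use it, and how?
    sensitivity: str = "public"                           # Is it sensitive or risky?

record = MetadataRecord(
    name="customer_email",
    source_system="crm_db",
    current_location="warehouse.dim_customer.email",
    change_history=["2019-04-01: lowercased on ingest"],
    authorized_roles=["marketing_analyst"],
    sensitivity="PII",
)
```

Kept together like this, the record can answer all six questions for a given element in one lookup.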

The Role of Metadata in Data Governance

Organizations don’t know what they don’t know, and this problem is only getting worse. As data continues to proliferate, so does the need for data and analytics initiatives to make sense of it all. Here are some benefits of metadata management for data governance use cases:

  • Better Data Quality: Data issues and inconsistencies within integrated data sources or targets are identified in real time to improve overall data quality by reducing time to insights and/or repair.
  • Quicker Project Delivery: Accelerate Big Data deployments, Data Vaults, data warehouse modernization, cloud migration, etc., by up to 70 percent.
  • Faster Speed to Insights: Reverse the current 80/20 rule that keeps high-paid knowledge workers too busy finding, understanding and resolving errors or inconsistencies to actually analyze source data.
  • Greater Productivity & Reduced Costs: Being able to rely on automated and repeatable metadata management processes results in greater productivity. Some erwin customers report productivity gains of 85+% for coding, 70+% for metadata discovery, up to 50% for data design, up to 70% for data conversion, and up to 80% for data mapping.
  • Regulatory Compliance: Regulations such as GDPR, HIPAA, BCBS and CCPA carry data privacy and security mandates, so sensitive data (such as PII) needs to be tagged, its lineage documented, and its flows depicted for traceability.
  • Digital Transformation: Knowing what data exists and its value potential promotes digital transformation by improving digital experiences, enhancing digital operations, driving digital innovation and building digital ecosystems.
  • Enterprise Collaboration: With the business driving alignment between data governance and strategic enterprise goals and IT handling the technical mechanics of data management, the door opens to finding, trusting and using data to effectively meet organizational objectives.

Giving Metadata Meaning

So how do you give metadata meaning? While this sounds like a deep philosophical question, the reality is the right tools can make all the difference.

erwin Data Intelligence (erwin DI) combines data management and data governance processes in an automated flow.

It’s unique in its ability to automatically harvest, transform and feed metadata from a wide array of data sources, operational processes, business applications and data models into a central data catalog and then make it accessible and understandable within the context of role-based views.

erwin DI sits on a common metamodel that is open, extensible and comes with a full set of APIs. A comprehensive list of erwin-owned standard data connectors are included for automated harvesting, refreshing and version-controlled metadata management. Optional erwin Smart Data Connectors reverse-engineer ETL code of all types and connect bi-directionally with reporting and other ecosystem tools. These connectors offer the fastest and most accurate path to data lineage, impact analysis and other detailed graphical relationships.

Additionally, erwin DI is part of the larger erwin EDGE platform that integrates data modeling, enterprise architecture, business process modeling, data cataloging and data literacy. We know our customers need an active metadata-driven approach to:

  • Understand their business, technology and data architectures and the relationships between them
  • Create and automate a curated enterprise data catalog, complete with physical assets, data models, data movement, data quality and on-demand lineage
  • Activate their metadata to drive agile and well-governed data preparation with integrated business glossaries and data dictionaries that provide business context for stakeholder data literacy

erwin was named a Leader in Gartner’s “2019 Magic Quadrant for Metadata Management Solutions.”

Click here to get a free copy of the report.

Click here to request a demo of erwin DI.


Categories
erwin Expert Blog

Metadata Management, Data Governance and Automation

Can the 80/20 Rule Be Reversed?

erwin released its State of Data Governance Report in February 2018, just a few months before the General Data Protection Regulation (GDPR) took effect.

This research showed that the majority of responding organizations weren’t actually prepared for GDPR, nor did they have the understanding, executive support and budget for data governance – although they recognized the importance of it.

Of course, data governance has evolved with astonishing speed, both in response to data privacy and security regulations and because organizations see the potential for using it to accomplish other organizational objectives.

But many of the world’s top brands still seem to be challenged in implementing and sustaining effective data governance programs (hello, Facebook).

We wonder why.

Too Much Time, Too Few Insights

According to IDC’s “Data Intelligence in Context” Technology Spotlight sponsored by erwin, “professionals who work with data spend 80 percent of their time looking for and preparing data and only 20 percent of their time on analytics.”

Specifically, 80 percent of data professionals’ time is spent on data discovery, preparation and protection, and only 20 percent on analysis leading to insights.

In most companies, an incredible amount of data flows from multiple sources in a variety of formats and is constantly being moved and federated across a changing system landscape.

Often these enterprises are heavily regulated, so they need a well-defined data integration model that will help avoid data discrepancies and remove barriers to enterprise business intelligence and other meaningful use.

IT teams need the ability to smoothly generate hundreds of mappings and ETL jobs. They need their data mappings to fall under governance and audit controls, with instant access to dynamic impact analysis and data lineage.

But most organizations, especially those competing in the digital economy, don’t have enough time or money for data management using manual processes. Outsourcing is also expensive, with inevitable delays because these vendors are dependent on manual processes too.

The Role of Data Automation

Data governance maturity includes the ability to rely on automated and repeatable processes.

For example, automatically importing mappings from developers’ Excel sheets, flat files, Access and ETL tools into a comprehensive mappings inventory, complete with automatically generated and meaningful documentation of the mappings, is a powerful way to support governance while providing real insight into data movement — for data lineage and impact analysis — without interrupting system developers’ normal work methods.
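
As a minimal illustration of that idea (not erwin's implementation), a mapping spec exported as a flat file could be loaded into an inventory with documentation generated automatically for each mapping; the column names here are assumptions:

```python
import csv
import io

def import_mappings(csv_text):
    """Load developer mapping specs (source, target, rule columns) from a
    flat file into a simple inventory keyed by target column."""
    inventory = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        inventory[row["target"]] = {
            "source": row["source"],
            "rule": row["rule"],
            # Auto-generate human-readable documentation for the mapping.
            "doc": f'{row["target"]} is derived from {row["source"]} via "{row["rule"]}"',
        }
    return inventory

spec = """source,target,rule
crm.cust_email,dw.dim_customer.email,lowercase
crm.cust_name,dw.dim_customer.full_name,trim
"""
inventory = import_mappings(spec)
```

Because the inventory is built from the developers' own spec files, they keep working in their normal tools while governance gains a documented, queryable view of data movement.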

GDPR compliance, for instance, requires a business to discover source-to-target mappings with all accompanying transformations, such as the business rules in the repository applied to the data, to comply with audits.

When data movement has been tracked and version-controlled, it's possible to conduct data archeology – that is, reverse-engineering code from existing XML within the ETL layer – to uncover what happened in the past and incorporate it into a mapping manager for fast and accurate recovery.
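
A toy sketch of that reverse-engineering step might look like the following; the XML schema is hypothetical, since every ETL tool uses its own format:

```python
import xml.etree.ElementTree as ET

# Hypothetical ETL-layer XML; a real tool's schema will differ.
etl_xml = """
<job name="load_dim_customer">
  <map source="crm.cust_email" target="dw.dim_customer.email" transform="lower"/>
  <map source="crm.cust_name" target="dw.dim_customer.full_name" transform="trim"/>
</job>
"""

def recover_mappings(xml_text):
    """Reverse-engineer source-to-target mappings from existing ETL XML
    so they can be loaded into a mapping manager."""
    root = ET.fromstring(xml_text)
    return [
        (m.get("source"), m.get("target"), m.get("transform"))
        for m in root.iter("map")
    ]

mappings = recover_mappings(etl_xml)
```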

With automation, data professionals can meet the above needs at a fraction of the cost of the traditional, manual way. To summarize, just some of the benefits of data automation are:

• Centralized and standardized code management with all automation templates stored in a governed repository
• Better quality code and minimized rework
• Business-driven data movement and transformation specifications
• Superior data movement job designs based on best practices
• Greater agility and faster time-to-value in data preparation, deployment and governance
• Cross-platform support of scripting languages and data movement technologies

One global pharmaceutical giant reduced costs by 70 percent and generated 95 percent of production code with “zero touch.” With automation, the company improved the time to business value and significantly reduced the costly re-work associated with error-prone manual processes.


Help Us Help You by Taking a Brief Survey

With 2020 just around the corner and another data regulation about to take effect, the California Consumer Privacy Act (CCPA), we’re working with Dataversity on another research project.

And this time, you guessed it – we’re focusing on data automation and how it could impact metadata management and data governance.

We would appreciate your input and will release the findings in January 2020.

Click here to take the brief survey

Categories
erwin Expert Blog

Data Governance Makes Data Security Less Scary

Happy Halloween!

Do you know where your data is? What data you have? Who has had access to it?

These can be frightening questions for an organization to answer.

Add to the mix the potential for a data breach followed by non-compliance, reputational damage and financial penalties and a real horror story could unfold.

In fact, we’ve seen some frightening ones play out already:

  1. Google’s record GDPR fine – France’s data privacy enforcement agency hit the tech giant with a $57 million penalty in early 2019 – more than 80 times the steepest fine the U.K.’s Information Commissioner’s Office had levied against both Facebook and Equifax for their data breaches.
  2. In July 2019, British Airways received the biggest GDPR fine to date ($229 million) because the data of more than 500,000 customers was compromised.
  3. Marriott International was fined $123 million, or 1.5 percent of its global annual revenue, because 330 million hotel guests were affected by a breach in 2018.

Now, as Cybersecurity Awareness Month comes to a close – and ghosts and goblins roam the streets – we thought it a good time to resurrect some guidance on how data governance can make data security less scary.

We don’t want you to be caught off guard when it comes to protecting sensitive data and staying compliant with data regulations.

Data Governance Makes Data Security Less Scary

Don’t Scream; You Can Protect Your Sensitive Data

It’s easier to protect sensitive data when you know what it is, where it’s stored and how it needs to be governed.

Data security incidents may be the result of not having a true data governance foundation that makes it possible to understand the context of data – what assets exist and where, the relationship between them and enterprise systems and processes, and how and by what authorized parties data is used.

That knowledge is critical to supporting efforts to keep relevant data secure and private.

Without data governance, organizations don’t have visibility of the full data landscape – linkages, processes, people and so on – to propel more context-sensitive security architectures that can better assure expectations around user and corporate data privacy. In sum, they lack the ability to connect the dots across governance, security and privacy – and to act accordingly.

Data governance addresses these fundamental questions:

  1. What private data do we store and how is it used?
  2. Who has access and permissions to the data?
  3. What data do we have and where is it?

Where Are the Skeletons?

Data is a critical asset used to operate, manage and grow a business. While data sometimes rests in databases, data lakes and data warehouses, a large percentage of it is federated and integrated across the enterprise, introducing governance, manageability and risk issues that must be managed.

Knowing where sensitive data is located and properly governing it with policy rules, impact analysis and lineage views is critical for risk management, data audits and regulatory compliance.

However, when key data isn’t discovered, harvested, cataloged, defined and standardized as part of integration processes, audits may be flawed and therefore your organization is at risk.

Sensitive data – at rest or in motion – that exists in various forms across multiple systems must be automatically tagged, its lineage automatically documented, and its flows depicted so that it is easily found and its usage across workflows easily traced.

Thankfully, tools are available to help automate the scanning, detection and tagging of sensitive data by:

  • Monitoring and controlling sensitive data: Better visibility and control across the enterprise to identify data security threats and reduce associated risks
  • Enriching business data elements for sensitive data discovery: Comprehensively defining business data elements for PII, PHI and PCI across database systems, cloud and Big Data stores to easily identify sensitive data based on a set of algorithms and data patterns
  • Providing metadata and value-based analysis: Discovery and classification of sensitive data based on metadata and data value patterns and algorithms. Organizations can define business data elements and rules to identify and locate sensitive data including PII, PHI, PCI and other sensitive information.
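
To illustrate the pattern-based detection such tools perform, here is a minimal sketch that tags a column as sensitive when most of its sampled values match a known value pattern; the patterns and the 80-percent threshold are illustrative only:

```python
import re

# Illustrative value patterns; production tools combine many more patterns
# with metadata analysis and scoring algorithms.
PATTERNS = {
    "email": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "credit_card": re.compile(r"^\d{4}(?:[ -]?\d{4}){3}$"),
}

def tag_column(sample_values):
    """Return a sensitivity tag if >= 80% of sampled values match a pattern."""
    for label, pattern in PATTERNS.items():
        hits = sum(1 for value in sample_values if pattern.match(value))
        if sample_values and hits / len(sample_values) >= 0.8:
            return label
    return None

tag = tag_column(["alice@example.com", "bob@example.org", "carol@example.net"])
```

Once a column is tagged, the tag can flow into the catalog and lineage views so the sensitive data's usage is traceable across workflows.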

No Hocus Pocus

Truly understanding an organization’s data, including its value and quality, requires a harmonized approach embedded in business processes and enterprise architecture.

Such an integrated enterprise data governance experience helps organizations understand what data they have, where it is, where it came from, its value, its quality and how it’s used and accessed by people and applications.

An ounce of prevention is worth a pound of cure – sparing you the painstaking process of identifying what happened and why, and of notifying customers that their data, and thus their trust in your organization, has been compromised.

A well-formed security architecture that is driven by and aligned with data intelligence is your best defense. However, if there is nefarious intent, a hacker will find a way. Being prepared means you can minimize your risk exposure and the damage to your reputation.

Multiple components must be considered to effectively support a data governance, security and privacy trinity. They are:

  1. Data models
  2. Enterprise architecture
  3. Business process models

Creating policies for data handling and accountability and driving culture change so people understand how to properly work with data are two important components of a data governance initiative, as is the technology for proactively managing data assets.

Without the ability to harvest metadata schemas and business terms; analyze data attributes and relationships; impose structure on definitions; and view all data in one place according to each user’s role within the enterprise, businesses will be hard pressed to stay in step with governance standards and best practices around security and privacy.

As a consequence, the private information held within organizations will continue to be at risk.

Organizations suffering data breaches will be deprived of the benefits they had hoped to realize from the money spent on security technologies and the time invested in developing data privacy classifications.

They also may face heavy fines and other financial, not to mention PR, penalties.


Categories
erwin Expert Blog

Very Meta … Unlocking Data’s Potential with Metadata Management Solutions

Untapped data, if mined, represents tremendous potential for your organization. While there has been a lot of talk about big data over the years, the real hero in unlocking the value of enterprise data is metadata, or the data about the data.

However, most organizations don’t use all the data they’re flooded with to reach deeper conclusions about how to drive revenue, achieve regulatory compliance or make other strategic decisions. They don’t know exactly what data they have or even where some of it is.

Quite honestly, knowing what data you have and where it lives is complicated. And to truly understand it, you need to be able to create and sustain an enterprise-wide view of and easy access to underlying metadata.

This isn’t an easy task. Organizations are dealing with numerous data types and data sources that were never designed to work together and data infrastructures that have been cobbled together over time with disparate technologies, poor documentation and with little thought for downstream integration.

As a result, the applications and initiatives that depend on a solid data infrastructure may be compromised, leading to faulty analysis and insights.

Metadata Is the Heart of Data Intelligence

A recent IDC Innovators: Data Intelligence Report says that getting answers to such questions as “where is my data, where has it been, and who has access to it” requires harnessing the power of metadata.

Metadata is generated every time data is captured at a source, accessed by users, moves through an organization, and then is profiled, cleansed, aggregated, augmented and used for analytics to guide operational or strategic decision-making.

In fact, data professionals spend 80 percent of their time looking for and preparing data and only 20 percent of their time on analysis, according to IDC.

To flip this 80/20 rule, they need an automated metadata management solution for:

• Discovering data – Identify and interrogate metadata from various data management silos.
• Harvesting data – Automate the collection of metadata from various data management silos and consolidate it into a single source.
• Structuring and deploying data sources – Connect physical metadata to specific data models, business terms, definitions and reusable design standards.
• Analyzing metadata – Understand how data relates to the business and what attributes it has.
• Mapping data flows – Identify where to integrate data and track how it moves and transforms.
• Governing data – Develop a governance model to manage standards, policies and best practices and associate them with physical assets.
• Socializing data – Empower stakeholders to see data in one place and in the context of their roles.
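
The discovery and harvesting steps above can be sketched as a simple consolidation loop; the connector output format shown is an assumption for illustration, not any product's API:

```python
def harvest(sources):
    """Discover and harvest metadata from several silos, consolidating it
    into a single catalog keyed by fully qualified element name."""
    catalog = {}
    for source in sources:
        for element in source["elements"]:
            catalog[f'{source["name"]}.{element["name"]}'] = {
                "type": element["type"],
                "source": source["name"],              # supports lineage queries
                "business_term": element.get("term"),  # links to the glossary
            }
    return catalog

catalog = harvest([
    {"name": "crm_db",
     "elements": [{"name": "cust_email", "type": "string", "term": "Customer Email"}]},
    {"name": "erp_db",
     "elements": [{"name": "inv_total", "type": "decimal"}]},
])
```

With every silo's elements in one keyed structure, the later steps (analysis, mapping, governance, socialization) can all work against the same single source.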

Addressing the Complexities of Metadata Management

The complexities of metadata management can be addressed with a strong data management strategy coupled with metadata management software to enable the data quality the business requires.

This encompasses data cataloging (integration of data sets from various sources), mapping, versioning, business rules and glossary maintenance, and metadata management (associations and lineage).

erwin has developed the only data intelligence platform that provides organizations with a complete and contextual depiction of the entire metadata landscape.

It is the only solution that can automatically harvest, transform and feed metadata from operational processes, business applications and data models into a central data catalog and then make it accessible and understandable within the context of role-based views.

erwin’s ability to integrate and continuously refresh metadata from an organization’s entire data ecosystem, including business processes, enterprise architecture and data architecture, forms the foundation for enterprise-wide data discovery, literacy, governance and strategic usage.

Organizations then can take a data-driven approach to business transformation, speed to insights, and risk management.

With erwin, organizations can:

1. Deliver a trusted metadata foundation through automated metadata harvesting and cataloging
2. Standardize data management processes through a metadata-driven approach
3. Centralize data-driven projects around centralized metadata for planning and visibility
4. Accelerate data preparation and delivery through metadata-driven automation
5. Master data management platforms through metadata abstraction
6. Accelerate data literacy through contextual metadata enrichment and integration
7. Leverage a metadata repository to derive lineage, impact analysis and enable audit/oversight ability
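
As a small illustration of the lineage and impact-analysis capability in the last point, downstream impact can be derived by walking a graph of source-to-target edges recovered from harvested mappings; the asset names are made up:

```python
# Upstream -> downstream edges recovered from harvested mappings (illustrative).
lineage = {
    "crm.cust_email": ["staging.email"],
    "staging.email": ["dw.dim_customer.email"],
    "dw.dim_customer.email": ["report.customer_contact"],
}

def impact_analysis(asset, edges):
    """Walk the lineage graph to find every downstream asset affected
    by a change to `asset`."""
    affected, stack = set(), [asset]
    while stack:
        for child in edges.get(stack.pop(), []):
            if child not in affected:
                affected.add(child)
                stack.append(child)
    return affected

impacted = impact_analysis("crm.cust_email", lineage)
```

Here a change to the CRM email field is traced through staging and the warehouse all the way to the report that consumes it.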

With erwin Data Intelligence as part of the erwin EDGE platform, you know what data you have, where it is, where it’s been and how it transformed along the way, plus you can understand sensitivities and risks.

With an automated, real-time, high-quality data pipeline, enterprise stakeholders can base strategic decisions on a full inventory of reliable information.

Many of our customers are hard at work addressing metadata management challenges, and that’s why erwin was Named a Leader in Gartner’s “2019 Magic Quadrant for Metadata Management Solutions.”


Categories
erwin Expert Blog

Top Use Cases for Enterprise Architecture: Architect Everything

Architect Everything: New use cases for enterprise architecture are increasing enterprise architects' stock in data-driven business

As enterprise architecture has evolved, so too have the use cases for enterprise architecture.

Analyst firm Ovum recently released a new report titled Ovum Market Radar: Enterprise Architecture. In it, they make the case that enterprise architecture (EA) is becoming AE – or “architect everything”.

The transition highlights enterprise architecture’s evolution from being solely an IT function to being more closely aligned with the business. As such, the function has changed from EA to AE.

At erwin, we’re definitely witnessing this EA evolution as more and more organizations undertake digital transformation initiatives, including rearchitecting their business models and value streams, as well as responding to increasing regulatory pressures.

This is because EA provides the right information to the right people at the right time for smarter decision-making.

Following are some of the top use cases for enterprise architecture that demonstrate how EA is moving beyond IT and into the business.

Enterprise Architecture Use Cases

Top 7 Use Cases for Enterprise Architecture

Compliance. Enterprise architecture is critical for regulatory compliance. It helps model, manage and transform mission-critical value streams across industries, as well as identify sensitive information. When thousands of employees need to know what compliance processes to follow, such as those associated with regulations (e.g., GDPR, HIPAA, SOX, CCPA, etc.) it ensures not only access to proper documentation but also current, updated information.

The Regulatory Rationale for Integrating Data Management & Data Governance

Data security/risk management. EA should be commonplace in data security planning. Any flaw in the way data is stored or monitored is a potential ‘in’ for a breach, and so businesses have to ensure security surrounding sensitive information is thorough and covers the whole business. Security should be proactive, not reactive, which is why EA should be a huge part of security planning.

Data governance. Today’s enterprise embraces data governance to drive data opportunities, including growing revenue, and limit data risks, including regulatory and compliance gaffes.

EA solutions that provide much-needed insight into the relationship between data assets and applications make it possible to appropriately direct data usage and flows, as well as focus greater attention, if warranted, on applications where data use delivers optimal business value.

Digital transformation. For an organization to successfully embrace change, innovation, EA and project delivery need to be intertwined and traceable. Enterprise architects are crucial to delivering innovation. Taking an idea from concept to delivery requires strategic planning and the ability to execute. An enterprise architecture roadmap can help focus such plans and many organizations are now utilizing them to prepare their enterprise architectures for 5G.

Mergers & acquisitions. Enterprise architecture is essential to successful mergers and acquisitions. It helps alignment by providing a business-outcome perspective for IT and guiding transformation. It also helps define strategy and models, improving interdepartmental cohesion and communication.

In an M&A scenario, businesses need to ensure their systems are fully documented and rationalized. This way they can comb through their inventories to make more informed decisions about which systems to cut or phase out to operate more efficiently.

Innovation management. EA is crucial to innovation and project delivery: it uses open standards to link to other products within the overall project lifecycle, integrates agile enterprise architecture with agile development, and connects project delivery with effective governance.

It takes a rigorous approach to ensure that current and future states are published for a wider audience for consumption and collaboration – from modeling to generating road maps with meaningful insights provided to both technical and business stakeholders during every step.

Knowledge retention. Unlocking knowledge and then putting systems in place to retain that knowledge is a key benefit of EA. Many organizations lack a structured approach for gathering and investigating employee ideas. Ideas can fall into a black hole where they don’t get feedback and employees become less engaged.

When your enterprise architecture is aligned with your business outcomes, it provides a way to help your business ideate and investigate the viability of ideas on both the technical and business level.

If the benefits of enterprise architecture would help your business, here’s how you can try erwin EA for free.

Enterprise Architecture Business Process Trial