
The Value of Data Governance and How to Quantify It

erwin recently hosted the second in its six-part webinar series on the practice of data governance and how to proactively deal with its complexities. Led by Frank Pörschmann of iDIGMA GmbH, an IT industry veteran and data governance strategist, this installment focused on “The Value of Data Governance & How to Quantify It.”

As Mr. Pörschmann highlighted at the beginning of the series, data governance works best when it is strongly aligned with the drivers, motivations and goals of the business.

The business drivers and motivation should be the starting point for any data governance initiative. If there is no clear end goal in sight, it will be difficult to get stakeholders on board. And with many competing projects and activities vying for people’s time, it must be clear how choosing data governance activities will directly benefit them.

“Usually we talk about benefits which are rather qualitative measures, but what we need for decision-making processes are values,” Pörschmann says. “We need quantifiable results or expected results that are fact-based. And the interesting thing with data governance, it seems to be easier for organizations and teams to state the expected benefits.”

The Data Governance Productivity Matrix

In terms of quantifying data governance, Pörschmann cites the productivity matrix as a relatively simple way to calculate real numbers. He says, “the basic assumption is if an organization equips their managers with the appropriate capabilities and instruments, then it’s management’s obligation to realize productivity potential over time.”

According to IDC, professionals who work with data spend 80 percent of their time looking for and preparing data and only 20 percent of their time on analytics. More specifically, that 80 percent goes to data discovery, preparation and protection, and only 20 percent to analysis leading to insights.
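
To see how this split translates into a quantifiable value of the kind Pörschmann calls for, here is a minimal sketch. All inputs (team size, hours, rates, the share of preparation time automation recovers) are hypothetical assumptions for illustration, not figures from the webinar.

```python
# Minimal sketch: turning the 80/20 time split into a monetary value.
# All inputs below are hypothetical assumptions.

ANALYSTS = 10            # data professionals on the team
HOURS_PER_YEAR = 1_600   # productive hours per person per year
HOURLY_RATE = 75.0       # fully loaded cost per hour (EUR)

PREP_SHARE = 0.80        # time spent finding/preparing data (per IDC)
PREP_RECOVERED = 0.25    # share of prep time automation wins back (assumption)

prep_hours = ANALYSTS * HOURS_PER_YEAR * PREP_SHARE
hours_recovered = prep_hours * PREP_RECOVERED
value_recovered = hours_recovered * HOURLY_RATE

print(f"Prep hours per year:       {prep_hours:,.0f}")
print(f"Hours freed by automation: {hours_recovered:,.0f}")
print(f"Estimated annual value:    EUR {value_recovered:,.0f}")
```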

Data governance maturity includes the ability to rely on automated and repeatable processes, which ultimately helps to increase productivity.

For example, automatically importing mappings from developers’ Excel sheets, flat files, Access and ETL tools into a comprehensive mappings inventory, complete with automatically generated and meaningful documentation of the mappings, is a powerful way to support governance while providing real insight into data movement — for data lineage and impact analysis — without interrupting system developers’ normal work methods.
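
As a rough illustration of what such an import might look like under the hood, the sketch below reads a hypothetical Excel mapping sheet into a simple inventory with pandas. The file name and column names are assumptions; a real mappings inventory would live in a governed repository, not a Python dictionary.

```python
# Illustrative sketch: importing a developer mapping sheet into an inventory.
# "mappings.xlsx" and its column names are hypothetical assumptions.
import pandas as pd

def load_mapping_inventory(path: str) -> dict:
    """Index an Excel mapping sheet by target column for fast lookup."""
    # expected columns: source_table, source_column, target_table,
    # target_column, transformation
    df = pd.read_excel(path)
    inventory = {}
    for row in df.itertuples(index=False):
        key = f"{row.target_table}.{row.target_column}"
        inventory[key] = {
            "source": f"{row.source_table}.{row.source_column}",
            "transformation": row.transformation,
        }
    return inventory

inventory = load_mapping_inventory("mappings.xlsx")
# Lineage becomes a lookup: which source feeds this (hypothetical) target?
print(inventory.get("dw_customer.email_address"))
```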

When data movement has been tracked and version-controlled, it’s possible to conduct data archeology — that is, reverse-engineering code from existing XML within the ETL layer — to uncover what has happened in the past and incorporate it into a mapping manager for fast and accurate recovery.
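
A minimal sketch of this kind of data archeology, using only Python’s standard library: parse source-to-target pairs out of ETL XML. The element and attribute names are invented for illustration; every ETL tool has its own export format.

```python
# Sketch: recovering mappings from ETL XML ("data archeology").
# The element and attribute names below are hypothetical assumptions.
import xml.etree.ElementTree as ET

SAMPLE = """
<etl-job name="load_customers">
  <mapping source="stg.customers.email" target="dw.customer.email_address"/>
  <mapping source="stg.customers.cust_id" target="dw.customer.customer_key"/>
</etl-job>
"""

root = ET.fromstring(SAMPLE)
for m in root.iter("mapping"):
    print(f"{m.get('source')} -> {m.get('target')}  (job: {root.get('name')})")
```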

With automation, data professionals can meet the above needs at a fraction of the cost of the traditional, manual way. To summarize, just some of the benefits of data automation are:

  • Centralized and standardized code management with all automation templates stored in a governed repository
  • Better quality code and minimized rework
  • Business-driven data movement and transformation specifications
  • Superior data movement job designs based on best practices
  • Greater agility and faster time to value in data preparation, deployment and governance
  • Cross-platform support of scripting languages and data movement technologies

For example, one global pharmaceutical giant reduced cost by 70 percent and generated 95 percent of production code with “zero touch.” With automation, the company improved the time to business value and significantly reduced the costly re-work associated with error-prone manual processes.

Risk Management and Regulatory Compliance

Risk management, specifically around regulatory compliance, is an important use case to demonstrate the true value of data governance.

According to Pörschmann, risk management asks two main questions, which combine into the likelihood-times-impact calculation sketched after the list.

  1. How likely is a specific event to happen?
  2. What is the impact or damage if this event happens? (e.g., cost of repair, reputational damage, etc.)
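
Expressed as a sketch, the two questions multiply into an expected loss, which is how a risk matrix ranks items. The probabilities and damage figures below are hypothetical assumptions, purely for illustration.

```python
# Sketch: the two risk questions as a likelihood-times-impact score.
# All probabilities and damage figures are hypothetical assumptions.

def expected_loss(likelihood: float, impact_eur: float) -> float:
    """Q1 (how likely?) times Q2 (what is the damage?)."""
    return likelihood * impact_eur

risks = {
    "GDPR breach of customer PII": (0.05, 2_000_000),
    "Stale reference data in reporting": (0.30, 150_000),
}

# Rank risks by expected loss, highest first.
for name, (p, impact) in sorted(risks.items(),
                                key=lambda kv: -expected_loss(*kv[1])):
    print(f"{name}: p={p:.0%}, impact=EUR {impact:,}, "
          f"expected loss=EUR {expected_loss(p, impact):,.0f}")
```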

“You have to understand the concept or thinking of risk officers or the risk teams,” he says. The risk teams are process-oriented, and they understand how to calculate and how to cover IT risks. But to communicate data risks successfully to the risk management team, you need to understand how they think in terms of the risk matrix.

Take the European Union’s General Data Protection Regulation (GDPR) as an example of a data-related risk. Your team needs to ask, “what is the likelihood that we will fail on data-based activities related to GDPR?” And then ask, “what can we do from the data side to reduce the impact or the total damage?”

But it’s not easy to design and deploy compliance in an environment that’s not well understood and difficult to maneuver in. Data governance enables organizations to plan and document how they will discover and understand their data within context, track its physical existence and lineage, and maximize its security, quality and value.

With the right technology, organizations can automate and accelerate regulatory compliance in five steps (a small sketch of step 2 follows the list):

  1. Catalog systems. Harvest, enrich/transform and catalog data from a wide array of sources to enable any stakeholder to see the interrelationships of data assets across the organization.
  2. Govern PII “at rest”. Classify, flag and socialize the use and governance of personally identifiable information regardless of where it is stored.
  3. Govern PII “in motion”. Scan, catalog and map personally identifiable information to understand how it moves inside and outside the organization and how it changes along the way.
  4. Manage policies and rules. Govern business terminology in addition to data policies and rules, depicting relationships to physical data catalogs and the applications that use them with lineage and impact analysis views.
  5. Strengthen data security. Identify regulatory risks and guide the fortification of network and encryption security standards and policies by understanding where all personally identifiable information is stored, processed and used.
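
As a hedged sketch of step 2, here is what flagging PII “at rest” can look like in miniature. The column names and regex heuristics are simplified assumptions; real governance tools combine name patterns with actual data profiling.

```python
# Sketch of step 2: flagging likely PII columns in a data catalog.
# The catalog entries and name heuristics are hypothetical assumptions.
import re

PII_PATTERNS = [r"email", r"ssn", r"birth", r"phone", r"address", r"name$"]

def classify_pii(columns: list[str]) -> dict[str, bool]:
    """Mark a column as PII if its name matches any known pattern."""
    return {col: any(re.search(p, col.lower()) for p in PII_PATTERNS)
            for col in columns}

catalog = ["customer_id", "email_address", "date_of_birth",
           "order_total", "last_name"]
for col, is_pii in classify_pii(catalog).items():
    print(f"{col:15} {'PII - flag and govern' if is_pii else 'non-PII'}")
```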

It’s also important to understand that the benefits of data governance don’t stop with regulatory compliance.

A better understanding of what data you have, where it’s stored and the history of its use and access isn’t only beneficial in fending off non-compliance repercussions. In fact, such an understanding is arguably better put to use proactively.

Data governance improves data quality standards, enables better decision-making and ensures businesses can have more confidence in the data informing those decisions.



Documenting and Managing Governance, Risk and Compliance with Business Process

Managing an organization’s governance, risk and compliance (GRC) via its enterprise and business architectures means managing them against business processes (BP).

Shockingly, a lot of organizations, even today, manage this through homemade tools: documents, checklists, Excel files, custom-made databases and so on. Why do they still operate in this manual and disparate way? It comes down to three reasons:

  1. Cost
  2. Governance, risk and compliance are treated as isolated bubbles.
  3. Data-related risks are not connected with the data architects/data scientists.

If we look at this past year, COVID-19 fundamentally changed everything overnight – and it was something that nobody could have anticipated. However, only organizations that had their risks mapped at the process level could see their operational risk profiles and also see what processes needed adjustments – quickly.

Furthermore, by linking compliance with process, those organizations were prepared to answer very specific compliance questions. For example, if a customer asked, “Since most of your employees are working from home now, how can you ensure that my data is not shared with their kids?” organizations with business process mapped could respond, “We have anticipated these kinds of risks and implemented the following controls, and this is how we protect you in different layers.”

Every company must understand its business processes, particularly those in industries in which quality, regulatory, health, safety or environmental standards are serious considerations. BP modeling and analysis shows process flows, system interactions and organizational hierarchies to identify areas for improvement, as well as practices susceptible to the greatest security, compliance or other risks, so controls and audits can be implemented to mitigate exposures.

Connecting the GRC, Data and Process Layers

The GRC layer comprises mandatory components like risks, controls and compliance elements. Traditionally, these are manually documented, monitored and managed.

For example, if tomorrow you decide you want ISO (International Organization for Standardization) 27001 compliance for your information security management system, you can go to the appropriate ISO site and download the entire standard, with all the assessments, descriptions, mandates, questions and documents that you will need to provide. All of these items would comprise the GRC layer.

However, many organizations maintain Excel files with risk and control information, and other Office files with compliance information, in isolation. Or some of these files are uploaded to various systems, but those systems don’t talk to each other or to any other enterprise systems for that matter. This is the data layer, which is factual, objective and, as opposed to the GRC layer, can be either fully or partly automated.

Now, let’s add the process layer to the equation. Why? Because that is where the GRC and data layers meet. How? Processes produce, process and consume data – information captured in the metadata layer. By following the process sequence, I can actually trace the data lineage as it flows across the entire business ecosystem, beyond the application layer.

Taking it further, from processes, I can look at how the data is being managed by my capabilities. In other words, if I do have a data breach, how do I mitigate it? What impact will it have on my organization? And what are the necessary controls to manage it? Looking at the processes from right to left, I can identify the affected systems, and I can identify the interfaces between systems.
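
To make that traversal concrete, here is a minimal sketch of the process-to-process flow graph described above: following edges forward traces lineage, and walking them backward (right to left) surfaces the affected systems and interfaces. All process names are hypothetical.

```python
# Sketch: lineage (forward) and impact analysis (backward) over a
# process-to-process data flow graph. All names are hypothetical.
from collections import defaultdict

# Each edge is an interface: producing process -> consuming process.
flows = [
    ("CRM intake", "Customer master"),
    ("Customer master", "Billing"),
    ("Customer master", "Marketing analytics"),
]

downstream, upstream = defaultdict(list), defaultdict(list)
for src, dst in flows:
    downstream[src].append(dst)
    upstream[dst].append(src)

def trace(start: str, graph) -> list[str]:
    """Depth-first walk; forward graph = lineage, reverse graph = impact."""
    seen, stack, order = set(), [start], []
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            order.append(node)
            stack.extend(graph.get(node, []))
    return order

print("Lineage from CRM intake:", trace("CRM intake", downstream))
print("Impact behind Billing:  ", trace("Billing", upstream))
```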

Mitigating Data Breaches

Most data breaches happen either at the database or interface level. Interfaces are how applications talk to each other.

Organizations are showing immense interest in expanding the development of risk profiles, not only for isolated layers but also in how those layers interact – how applications talk to each other, how processes use data, how data is stored, and how infrastructure is managed. Understanding these profiles allows for more targeted and even preemptive risk mitigation, enabling organizations to fortify their weak points with sufficient controls but also practical and effective processes.
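
As a hedged illustration of such a layered risk profile, the sketch below rolls hypothetical scores for the application, interface, data and infrastructure layers up into a per-process profile. A real scoring model would be far richer than taking the maximum, but the shape of the roll-up is the same.

```python
# Sketch: rolling per-layer risk scores up into a process risk profile.
# Layer scores (1 = low, 5 = critical) are hypothetical assumptions.

profiles = {
    "Customer onboarding": {"application": 2, "interface": 4,
                            "data": 3, "infrastructure": 1},
    "Payment processing":  {"application": 3, "interface": 5,
                            "data": 4, "infrastructure": 2},
}

for process, layers in profiles.items():
    weakest = max(layers, key=layers.get)
    print(f"{process}: overall risk {max(layers.values())} "
          f"(weakest layer: {weakest})")
```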

We’re moving from a world in which everything is performed manually and in isolation to one that is fully automated and integrated.

erwin shows how to document and manage governance, risk and compliance with erwin Evolve, its combined business process modeling and enterprise architecture solution.

The C-Level Demands GRC Real-Time Impact Analysis

Impact analysis is critical. Everything needs to be clearly documented, covering all important and relevant aspects. No service, capability or delivery process is considered complete unless the risks and controls that affect it, or are implemented through it, are mapped and that assessment is used to generate risk profiles for the process, service or capability. And the demand for this to happen automatically increases daily.

This is now one of the key mandates across many organizations. C-level executives now demand risk profile dashboards at the process, organizational and local levels.

For example, an executive travelling from one country to another, or from one continent to another, can make a query: “I’m traveling to X, so what is the country’s risk profile and how is it being managed? What do I need to be aware of or address while I’m there?” Or when new legislation is introduced affecting multiple countries, the impact of that legislation on those countries’ risk profiles can be quickly and accurately calculated and actions planned accordingly.

erwin Evolve

GRC is more critical than ever. Organizations, and specifically the C-suite, are demanding to see risk profiles for different slices of a particular process. But this is impossible without automation.

erwin Evolve is a full-featured, configurable enterprise architecture (EA) and BP modeling and analysis software suite that aids regulatory and industry compliance and maps business systems that support the enterprise. Its automated visualization, documentation and enterprise collaboration capabilities turn EA and BP artifacts into insights both IT and business users can access in a central location for making strategic decisions and managing GRC.

Please click here to start your free trial of erwin Evolve.


Using Enterprise Architecture to Improve Security

The personal data of more than 143 million people – nearly half the United States’ entire population – may have been compromised in the recent Equifax data breach. With every major data breach come post-mortems and lessons learned, but one area we haven’t seen discussed is how enterprise architecture might aid in the prevention of data breaches.

For Equifax, the reputational hit, loss of profits/market value and potential lawsuits are really bad news. For other organizations that have yet to suffer a breach, be warned. The clock is ticking for the General Data Protection Regulation (GDPR) to take effect in May 2018. GDPR changes everything, and it’s just around the corner.

Organizations of all sizes must take greater steps to protect consumer data or pay significant penalties. Negligent data governance and data management could cost up to 4 percent of an organization’s annual worldwide turnover or up to 20 million euros, whichever is greater.
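
The “whichever is greater” rule is simple arithmetic. Worked through for a hypothetical organization with 2 billion euros in annual worldwide turnover:

```python
# GDPR maximum fine: the greater of 4% of annual worldwide turnover
# or EUR 20 million. The turnover figure is a hypothetical example.
turnover = 2_000_000_000  # EUR

max_fine = max(0.04 * turnover, 20_000_000)
print(f"Maximum exposure: EUR {max_fine:,.0f}")  # EUR 80,000,000
```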

With this in mind, the Equifax data breach – and subsequent lessons – is a discussion potentially worth millions.


Proactive Data Protection and Cybersecurity

Given that data security has long been considered paramount, it’s surprising that enterprise architecture has been overlooked as an approach to improving data protection.

It’s a surprise because, when you consider enterprise architecture use cases and just how much of an organization EA permeates (really all of it), it should be commonplace in data security planning.

So, the Equifax breach provides a great opportunity to explore how enterprise architecture could be used for improving cybersecurity.

Security should be proactive, not reactive, which is why EA should be a huge part of security planning. And while we hope the Equifax incident isn’t the catalyst for an initial security assessment and improvements, it certainly should prompt a re-evaluation of data security policies, procedures and technologies.

By using well-built enterprise architecture for the foundation of data security, organizations can help mitigate risk. EA’s comprehensive view of the organization means security can be involved in the planning stages, reducing risks involved in new implementations. When it comes to security, EA should get a seat at the table.

Enterprise architecture also goes a long way in nullifying threats born of shadow IT, outdated applications and other IT faux pas. Well-documented, well-maintained EA gives an organization the best possible view of current tech assets.

This is especially relevant in Equifax’s case as the breach has been attributed to the company’s failure to update a web application although it had sufficient warning to do so.

By leveraging EA, organizations can shore up data security by ensuring updates and patches are implemented proactively.
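
A minimal sketch of how an EA-style application inventory can surface patching risk; the applications, dates and the 90-day policy threshold are hypothetical assumptions, and a real EA repository would feed this from its own metadata.

```python
# Sketch: flagging applications whose last patch exceeds a policy
# threshold, based on a (hypothetical) EA application inventory.
from datetime import date, timedelta

MAX_PATCH_AGE = timedelta(days=90)  # policy threshold (assumption)
today = date(2017, 9, 15)           # evaluation date for this example

inventory = [
    {"app": "Customer web portal", "last_patched": date(2017, 3, 6)},
    {"app": "Internal HR system",  "last_patched": date(2017, 8, 20)},
]

for entry in inventory:
    age = today - entry["last_patched"]
    if age > MAX_PATCH_AGE:
        print(f"OVERDUE: {entry['app']} last patched {age.days} days ago")
```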

Enterprise Architecture, Security and Risk Management

But what about existing security flaws? Implementing enterprise architecture in security planning now won’t solve them.

An organization can never eliminate security risks completely. The constantly evolving IT landscape would require businesses to spend an infinite amount of time, resources and money to achieve zero risk. Instead, businesses must opt to mitigate and manage risk to the best of their abilities.

Therefore, EA has a role in risk management too.

In fact, EA’s risk management applications are more widely appreciated than its role in security. But effective EA for risk management is fundamental to how EA improves security.

Enterprise architecture’s comprehensive accounting of business assets (both technological and human) means it’s best placed to align security and risk management with business goals and objectives. This can give an organization insight into where time and money can best be spent in improving security, as well as the resources available to do so.

This is because of the objective view enterprise architecture analysis provides for an organization.

To use a somewhat crude but applicable analogy, consider the risks of travel. A fear of flying is more common than a fear of driving. In a business sense, this could unwarrantedly encourage more spending on mitigating the risks of flying. However, an objective enterprise architecture analysis would reveal that, despite the fear, the risk of travelling by car is much greater.

Applying the same logic to security spending, enterprise architecture analysis would give an organization an indication of how to prioritize security improvements.
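
Worked through with numbers, the analogy looks like the sketch below. The probabilities and impact figures are purely illustrative assumptions, but the ranking logic is exactly what an objective EA analysis contributes to security spending.

```python
# Sketch: prioritizing security spend by expected risk, not by fear.
# All probabilities and impact figures are hypothetical assumptions.

risks = {
    "flying":  {"p_incident": 0.0000001, "impact": 10_000_000},
    "driving": {"p_incident": 0.0001,    "impact": 100_000},
}

ranked = sorted(risks.items(),
                key=lambda kv: kv[1]["p_incident"] * kv[1]["impact"],
                reverse=True)
for name, r in ranked:
    print(f"{name}: expected risk = {r['p_incident'] * r['impact']:.2f}")
# Despite the stronger fear of flying, driving ranks higher here.
```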