Categories
erwin Expert Blog Data Governance

Are Data Governance Bottlenecks Holding You Back?

Better decision-making has now topped compliance as the primary driver of data governance. However, organizations still encounter a number of bottlenecks that may hold them back from fully realizing the value of their data in producing timely and relevant business insights.

While acknowledging that data governance is about more than risk management and regulatory compliance may indicate that companies are more confident in their data, the data governance practice is nonetheless growing in complexity because of more:

  • Data to handle, much of it unstructured
  • Sources, like IoT
  • Points of integration
  • Regulations

Without an accurate, high-quality, real-time enterprise data pipeline, it will be difficult to uncover the necessary intelligence to make optimal business decisions.

So what’s holding organizations back from fully using their data to make better, smarter business decisions?

Data Governance Bottlenecks

erwin’s 2020 State of Data Governance and Automation report, based on a survey of business and technology professionals at organizations of various sizes and across numerous industries, examined the role of automation in data governance and intelligence efforts. It uncovered a number of obstacles that organizations have to overcome to improve their data operations.

The No.1 bottleneck, according to 62 percent of respondents, was documenting complete data lineage. Understanding the quality of source data is the next most serious bottleneck (58 percent); followed by finding, identifying, and harvesting data (55 percent); and curating assets with business context (52 percent).

The report revealed that all but two of the possible bottlenecks were marked by more than 50 percent of respondents. Clearly, there’s a massive need for a data governance framework to keep these obstacles from stymying enterprise innovation.

As we zeroed in on the bottlenecks of day-to-day operations, 25 percent of respondents said length of project/delivery time was the most significant challenge, followed by data quality/accuracy at 24 percent, time to value at 16 percent, and reliance on developer and other technical resources at 13 percent.


Overcoming Data Governance Bottlenecks

The 80/20 rule describes the unfortunate reality for many data stewards: they spend 80 percent of their time finding, cleaning and reorganizing huge amounts of data and only 20 percent on actual data analysis.

In fact, we found that close to 70 percent of our survey respondents spent an average of 10 or more hours per week on data-related activities, most of it searching for and preparing data.

What can you do to reverse the 80/20 rule and subsequently overcome data governance bottlenecks?

1. Don’t ignore the complexity of data lineage: Supporting data lineage with a manual approach is a risky endeavor, and businesses that attempt it will find it’s not sustainable given data’s constant movement from one place to another via multiple routes – especially when lineage must be documented correctly down to the column level. Adopting automated end-to-end lineage makes it possible to view data movement from the source to reporting structures, providing a comprehensive and detailed view of data in motion.

2. Automate code generation: Alleviate the need for developers to hand code connections from data sources to target schema. Mapping data elements to their sources within a single repository to determine data lineage and harmonize data integration across platforms reduces the need for specialized, technical resources with knowledge of ETL and database procedural code. It also makes it easier for business analysts, data architects, ETL developers, testers and project managers to collaborate for faster decision-making.
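As a rough illustration of this idea (not erwin’s actual implementation), the sketch below shows how a single source-to-target mapping repository can drive code generation instead of hand-written ETL connections. Every table, column and transform name is invented for the example:

```python
# Hypothetical sketch: generating ETL code from a source-to-target mapping
# held in a repository, so developers don't hand-code each connection.
# All table, column and transform names below are illustrative only.
mapping = {
    "target_table": "dw.customer_dim",
    "source_table": "crm.customers",
    "columns": [
        {"source": "cust_id",    "target": "customer_key",    "transform": None},
        {"source": "full_name",  "target": "customer_name",   "transform": "UPPER"},
        {"source": "created_at", "target": "first_seen_date", "transform": "CAST_DATE"},
    ],
}

def generate_insert_sql(m):
    """Render a simple INSERT ... SELECT statement from the mapping metadata."""
    select_parts = []
    for col in m["columns"]:
        expr = col["source"]
        if col["transform"] == "UPPER":
            expr = f"UPPER({expr})"
        elif col["transform"] == "CAST_DATE":
            expr = f"CAST({expr} AS DATE)"
        select_parts.append(f"{expr} AS {col['target']}")
    targets = ", ".join(c["target"] for c in m["columns"])
    return (f"INSERT INTO {m['target_table']} ({targets})\n"
            f"SELECT {', '.join(select_parts)}\n"
            f"FROM {m['source_table']};")

print(generate_insert_sql(mapping))
```

Because the mapping lives as metadata rather than code, the same repository entry that generates the SQL can also document lineage for the target columns, which is what lets non-developers reason about the pipeline.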

3. Use an integrated impact analysis solution: By automating data due diligence for IT you can deliver operational intelligence to the business. Business users benefit from automating impact analysis to better examine value and prioritize individual data sets. Impact analysis has equal importance to IT for automatically tracking changes and understanding how data from one system feeds other systems and reports. This is an aspect of data lineage, created from technical metadata, ensuring nothing “breaks” along the change train.
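To make the mechanics concrete, here is a minimal sketch of impact analysis over a column-level lineage graph. The systems and fields are hypothetical, and real tools build this graph automatically from harvested technical metadata rather than by hand:

```python
# Hypothetical sketch: impact analysis as a walk over a lineage graph.
# Each edge points from a source field to every field or report it feeds.
from collections import defaultdict, deque

lineage = defaultdict(set)

def feeds(src, dst):
    """Record that `src` feeds `dst` (one hop of lineage)."""
    lineage[src].add(dst)

# Illustrative flows from source system through warehouse to reporting:
feeds("crm.customers.email",    "staging.contacts.email")
feeds("staging.contacts.email", "dw.customer_dim.email")
feeds("dw.customer_dim.email",  "report.churn_dashboard")
feeds("erp.orders.total",       "dw.sales_fact.order_total")

def impacted_by(changed_field):
    """Breadth-first search: everything downstream of a changed field."""
    seen, queue = set(), deque([changed_field])
    while queue:
        for nxt in lineage[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Changing the CRM email column affects two downstream tables and a report:
print(impacted_by("crm.customers.email"))
```

Automating this traversal over technical metadata is what lets both IT and business users see, before a change ships, which reports would “break along the change train.”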

4. Put data quality first: Users must have confidence in the data they use for analytics. Automating and matching business terms with data assets and documenting lineage down to the column level are critical to good decision-making. If this approach hasn’t been the case to date, enterprises should take a few steps back to review data quality measures before jumping into automating data analytics.

5. Catalog data using a solution with a broad set of metadata connectors: Ensure all data sources can be leveraged, including big data, ETL platforms, BI reports, modeling tools, mainframe and relational data, as well as data from many other types of systems. Don’t settle for a data catalog from an emerging vendor that only supports a narrow swath of newer technologies, and don’t rely on a catalog from a legacy provider that may supply connectors only for standard, more mature data sources.

6. Stress data literacy: You want to ensure that data assets are used strategically. Automation expedites the benefits of data cataloging. Curating internal and external datasets for a range of content authors doubles business benefits and ensures effective management and monetization of data assets in the long term, if linked to broader data governance, data quality and metadata management initiatives. There’s a clear connection to data literacy here because of its foundation in business glossaries and socializing data so all stakeholders can view and understand it within the context of their roles.

7. Make automation the norm across all data governance processes: Too many companies still live in a world where data governance is a high-level mandate, not practically implemented. To fully realize the advantages of data governance and the power of data intelligence, data operations must be automated across the board. Without automated data management, the governance housekeeping load on the business will be so great that data quality will inevitably suffer. Being able to account for all enterprise data and resolve disparity in data sources and silos using manual approaches is wishful thinking.

8. Craft your data governance strategy before making any investments: Gather multiple stakeholders – both business and IT – with multiple viewpoints to discover where their needs mesh, where they diverge, and what represents the greatest pain points to the business. Solve for these first, but build buy-in by creating a layered, comprehensive strategy that ultimately will address most issues. From there, it’s on to matching your needs to an automated data governance solution that squares with business and IT – both for immediate requirements and future plans.

Register now for “The What & Why of Data Governance,” the first in a new, six-part webinar series on the practice of data governance and how to proactively deal with its complexities, on Tuesday, Feb. 23 at 3 p.m. GMT/10 a.m. ET.

Categories
erwin Expert Blog

Data Governance Makes Data Security Less Scary

Happy Halloween!

Do you know where your data is? What data you have? Who has had access to it?

These can be frightening questions for an organization to answer.

Add to the mix the potential for a data breach followed by non-compliance, reputational damage and financial penalties and a real horror story could unfold.

In fact, we’ve seen some frightening ones play out already:

  1. Google’s record GDPR fine – France’s data privacy enforcement agency hit the tech giant with a $57 million penalty in early 2019 – more than 80 times the steepest fine the U.K.’s Information Commissioner’s Office had levied against both Facebook and Equifax for their data breaches.
  2. In July 2019, British Airways received the biggest GDPR fine to date ($229 million) because the data of more than 500,000 customers was compromised.
  3. Marriott International was fined $123 million, or 1.5 percent of its global annual revenue, because 330 million hotel guests were affected by a breach in 2018.

Now, as Cybersecurity Awareness Month comes to a close – and ghosts and goblins roam the streets – we thought it a good time to resurrect some guidance on how data governance can make data security less scary.

We don’t want you to be caught off guard when it comes to protecting sensitive data and staying compliant with data regulations.


Don’t Scream; You Can Protect Your Sensitive Data

It’s easier to protect sensitive data when you know what it is, where it’s stored and how it needs to be governed.

Data security incidents may be the result of not having a true data governance foundation that makes it possible to understand the context of data – what assets exist and where, the relationship between them and enterprise systems and processes, and how and by what authorized parties data is used.

That knowledge is critical to supporting efforts to keep relevant data secure and private.

Without data governance, organizations don’t have visibility of the full data landscape – linkages, processes, people and so on – to propel more context-sensitive security architectures that can better assure expectations around user and corporate data privacy. In sum, they lack the ability to connect the dots across governance, security and privacy – and to act accordingly.

Data governance addresses these fundamental questions:

  1. What private data do we store and how is it used?
  2. Who has access and permissions to the data?
  3. What data do we have and where is it?

Where Are the Skeletons?

Data is a critical asset used to operate, manage and grow a business. While some of it rests in databases, data lakes and data warehouses, a large percentage is federated and integrated across the enterprise, introducing governance, manageability and risk issues that must be managed.

Knowing where sensitive data is located and properly governing it with policy rules, impact analysis and lineage views is critical for risk management, data audits and regulatory compliance.

However, when key data isn’t discovered, harvested, cataloged, defined and standardized as part of integration processes, audits may be flawed, putting your organization at risk.

Sensitive data – at rest or in motion – that exists in various forms across multiple systems must be automatically tagged, its lineage automatically documented, and its flows depicted so that it is easily found and its usage across workflows easily traced.

Thankfully, tools are available to help automate the scanning, detection and tagging of sensitive data by:

  • Monitoring and controlling sensitive data: Better visibility and control across the enterprise to identify data security threats and reduce associated risks
  • Enriching business data elements for sensitive data discovery: Comprehensively defining business data elements for PII, PHI and PCI across database systems, cloud and Big Data stores to easily identify sensitive data based on a set of algorithms and data patterns
  • Providing metadata and value-based analysis: Discovery and classification of sensitive data based on metadata and data value patterns and algorithms. Organizations can define business data elements and rules to identify and locate sensitive data including PII, PHI, PCI and other sensitive information.
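As a rough sketch of how value-based discovery can work (the patterns and threshold here are illustrative, not any vendor’s actual rules), a scanner can sample a column’s values and tag the column when enough values match a known sensitive-data pattern:

```python
# Hypothetical sketch of value-based sensitive-data discovery: sample the
# values in a column and tag it when most values match a PII pattern.
# The regexes and the 80% threshold are illustrative assumptions.
import re

PII_PATTERNS = {
    "EMAIL":       re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "US_SSN":      re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "CREDIT_CARD": re.compile(r"^(?:\d[ -]?){13,16}$"),
}

def tag_column(values, threshold=0.8):
    """Return the PII tags matched by at least `threshold` of the values."""
    tags = []
    for tag, pattern in PII_PATTERNS.items():
        hits = sum(1 for v in values if pattern.match(v))
        if values and hits / len(values) >= threshold:
            tags.append(tag)
    return tags

sample = ["alice@example.com", "bob@test.org", "carol@mail.net"]
print(tag_column(sample))  # ['EMAIL']
```

Production tools combine this kind of value scan with metadata signals (column names, business-glossary terms) so that, say, a column named `ssn` is flagged even when its sampled values are sparse.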

No Hocus Pocus

Truly understanding an organization’s data, including its value and quality, requires a harmonized approach embedded in business processes and enterprise architecture.

Such an integrated enterprise data governance experience helps organizations understand what data they have, where it is, where it came from, its value, its quality and how it’s used and accessed by people and applications.

An ounce of prevention is worth a pound of cure – the cure being the painstaking process of identifying what happened and why, and notifying customers that their data, and thus their trust in your organization, has been compromised.

A well-formed security architecture that is driven by and aligned with data intelligence is your best defense. However, if there is nefarious intent, a hacker will find a way, so being prepared means you can minimize your risk exposure and the damage to your reputation.

Multiple components must be considered to effectively support a data governance, security and privacy trinity. They are:

  1. Data models
  2. Enterprise architecture
  3. Business process models

Creating policies for data handling and accountability and driving culture change so people understand how to properly work with data are two important components of a data governance initiative, as is the technology for proactively managing data assets.

Without the ability to harvest metadata schemas and business terms; analyze data attributes and relationships; impose structure on definitions; and view all data in one place according to each user’s role within the enterprise, businesses will be hard pressed to stay in step with governance standards and best practices around security and privacy.

As a consequence, the private information held within organizations will continue to be at risk.

Organizations suffering data breaches will be deprived of the benefits they had hoped to realize from the money spent on security technologies and the time invested in developing data privacy classifications.

They also may face heavy fines and other financial, not to mention PR, penalties.


Categories
erwin Expert Blog

Using Strategic Data Governance to Manage GDPR/CCPA Complexity

In light of recent, high-profile data breaches, it’s past time we re-examine strategic data governance and its role in managing regulatory requirements.

News broke earlier this week of British Airways being fined 183 million pounds – or $228 million – by the U.K. for alleged violations of the European Union’s General Data Protection Regulation (GDPR). While not the first, it is the largest penalty levied since the GDPR went into effect in May 2018.

Given this, Oppenheimer & Co. cautions:

“European regulators could accelerate the crackdown on GDPR violators, which in turn could accelerate demand for GDPR readiness. Although the CCPA [California Consumer Privacy Act, the U.S. equivalent of GDPR] will not become effective until 2020, we believe that new developments in GDPR enforcement may influence the regulatory framework of the still fluid CCPA.”

With all the advance notice and significant chatter about GDPR/CCPA, why aren’t organizations more prepared to deal with data regulations?

In a word? Complexity.

The complexity of regulatory requirements in and of themselves is aggravated by the complexity of the business and data landscapes within most enterprises.

So it’s important to understand how to use strategic data governance to manage the complexity of regulatory compliance and other business objectives …

Designing and Operationalizing Regulatory Compliance Strategy

It’s not easy to design and deploy compliance in an environment that’s not well understood and difficult to maneuver in. First you need to analyze and design your compliance strategy and tactics; then you need to operationalize them.

Modern, strategic data governance, which involves both IT and the business, enables organizations to plan and document how they will discover and understand their data within context, track its physical existence and lineage, and maximize its security, quality and value. It also helps enterprises put these strategic capabilities into action by:

  • Understanding their business, technology and data architectures and their inter-relationships, aligning them with their goals and defining the people, processes and technologies required to achieve compliance.
  • Creating and automating a curated enterprise data catalog, complete with physical assets, data models, data movement, data quality and on-demand lineage.
  • Activating their metadata to drive agile data preparation and governance through integrated data glossaries and dictionaries that associate policies to enable stakeholder data literacy.


Five Steps to GDPR/CCPA Compliance

With the right technology, GDPR/CCPA compliance can be automated and accelerated in these five steps:

  1. Catalog systems

Harvest, enrich/transform and catalog data from a wide array of sources to enable any stakeholder to see the interrelationships of data assets across the organization.

  2. Govern PII “at rest”

Classify, flag and socialize the use and governance of personally identifiable information regardless of where it is stored.

  3. Govern PII “in motion”

Scan, catalog and map personally identifiable information to understand how it moves inside and outside the organization and how it changes along the way.

  4. Manage policies and rules

Govern business terminology in addition to data policies and rules, depicting relationships to physical data catalogs and the applications that use them with lineage and impact analysis views.

  5. Strengthen data security

Identify regulatory risks and guide the fortification of network and encryption security standards and policies by understanding where all personally identifiable information is stored, processed and used.

How erwin Can Help

erwin is the only software provider with a complete, metadata-driven approach to data governance through our integrated enterprise modeling and data intelligence suites. We help customers overcome their data governance challenges, with risk management and regulatory compliance being primary concerns.

However, the erwin EDGE also delivers an “enterprise data governance experience” in terms of agile innovation and business transformation – from creating new products and services to keeping customers happy to generating more revenue.

Whatever your organization’s key drivers are, a strategic data governance approach – through  business process, enterprise architecture and data modeling combined with data cataloging and data literacy – is key to success in our modern, digital world.

If you’d like to get a handle on handling your data, you can sign up for a free, one-on-one demo of erwin Data Intelligence.

For more information on GDPR/CCPA, we’ve also published a white paper on the Regulatory Rationale for Integrating Data Management and Data Governance.


Categories
erwin Expert Blog Data Governance

Data Governance Frameworks: The Key to Successful Data Governance Implementation

A strong data governance framework is central to successful data governance implementation in any data-driven organization because it ensures that data is properly maintained, protected and maximized.

But despite this fact, enterprises often face push back when implementing a new data governance initiative or trying to mature an existing one.

Let’s assume you have some form of informal data governance operation with some strengths to build on and some weaknesses to correct. Some parts of the organization are engaged and behind the initiative, while others are skeptical about its relevance or benefits.

Some other common data governance implementation obstacles include:

  • Questions about where to begin and how to prioritize which data streams to govern first
  • Issues regarding data quality and ownership
  • Concerns about data lineage
  • Competing projects and resources (time, people and funding)

By using a data governance framework, organizations can formalize their data governance implementation and ensure ongoing adherence to it. This addresses common concerns, including data quality and data lineage, and provides a clear path to successful data governance implementation.

In this blog, we will cover three key steps to successful data governance implementation. We will also look into how we can expand the scope and depth of a data governance framework to ensure data governance standards remain high.

Data Governance Implementation in 3 Steps

When maturing or implementing data governance and/or a data governance framework, an accurate assessment of the ‘here and now’ is key. Then you can rethink the path forward, identifying any current policies or business processes that should be incorporated, being careful to avoid making the same mistakes of prior iterations.

With this in mind, here are three steps we recommend for implementing data governance and a data governance framework.


Step 1: Shift the culture toward data governance

Data governance isn’t something to set and forget; it’s a strategic approach that needs to evolve over time in response to new opportunities and challenges. Therefore, a successful data governance framework has to become part of the organization’s culture, but such a shift requires listening – and remembering that it’s about people, empowerment and accountability.

In most cases, a new data governance framework requires people – those in IT and across the business, including risk management and information security – to change how they work. Any concerns they raise or recommendations they make should be considered. You can encourage feedback through surveys, workshops and open dialog.

Once input has been discussed and a plan agreed upon, it is critical to update roles and responsibilities, provide training and ensure ongoing communication. Many organizations now have internal certifications for different data governance roles, and those who earn them wear their badges with pride.

A top-down management approach will get a data governance initiative off the ground, but only bottom-up cultural adoption will carry it out.

Step 2: Refine the data governance framework

The right capabilities and tools are important for fueling an accurate, real-time data pipeline and governing it for maximum security, quality and value. For example:

Data cataloging: Organizations implementing a data governance framework will benefit from automated metadata harvesting, data mapping, code generation and data lineage with reference data management, lifecycle management and data quality. With these capabilities, you can efficiently integrate and activate enterprise data within a single, unified catalog in accordance with business requirements.

Data literacy: Being able to discover what data is available and understand what it means in common, standardized terms is important because data elements may mean different things to different parts of the organization. A business glossary answers this need, as does the ability for stakeholders to view data relevant to their roles and understand it within a business context through a role-based portal.

Such tools are further enhanced if they can be integrated across data and business architectures and when they promote self-service and collaboration, which also are important to the cultural shift.

 


 

 

Step 3: Prioritize then scale the data governance framework

Because data governance is on-going, it’s important to prioritize the initial areas of focus and scale from there. Organizations that start with 30 to 50 data items are generally more successful than those that attempt more than 1,000 in the early stages.

Find some representative (familiar) data items and create examples for data ownership, quality, lineage and definition so stakeholders can see real examples of the data governance framework in action. For example:

  • Data ownership model showing a data item, its definition, producers, consumers, stewards and quality rules (for profiling)
  • Workflow showing the creation, enrichment and approval of the above data item to demonstrate collaboration
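A minimal sketch of what such a data ownership record might contain could look like this (the field names and sample values are illustrative, not a prescribed schema):

```python
# Hypothetical sketch of a data ownership record for one governed data item,
# tying together its definition, producers, consumers, steward and the
# quality rules used for profiling. All names below are invented examples.
from dataclasses import dataclass, field

@dataclass
class DataItem:
    name: str
    definition: str
    producers: list       # systems or teams that create the data
    consumers: list       # systems or teams that use it
    steward: str          # person accountable for the item
    quality_rules: list = field(default_factory=list)  # rules for profiling

customer_email = DataItem(
    name="Customer Email",
    definition="Primary email address used to contact a customer.",
    producers=["CRM"],
    consumers=["Marketing", "Support"],
    steward="jane.doe",
    quality_rules=["not null", "valid email format", "unique per customer"],
)

print(customer_email.steward)
```

Even a handful of records in this shape gives stakeholders a concrete example of ownership, definition and quality rules working together before the framework is scaled out.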

Whether your organization is just adopting data governance or the goal is to refine an existing data governance framework, the erwin DG RediChek will provide helpful insights to guide you in the journey.

Categories
erwin Expert Blog

Top 7 Data Governance Blog Posts of 2018

The driving factors behind data governance adoption vary.

Whether implemented as a preventative measure (risk management and regulation) or a proactive endeavor (value creation and ROI), the benefits of a data governance initiative are becoming more apparent.

Historically most organizations have approached data governance in isolation and from the former category. But as data’s value to the enterprise has grown, so has the need for a holistic, collaborative means of discovering, understanding and governing data.

So with the impetus of the General Data Protection Regulation (GDPR) and the opportunities presented by data-driven transformation, many organizations are re-evaluating their data management and data governance practices.

With that in mind, we’ve compiled a list of the very best, best-practice blog posts from the erwin Experts in 2018.

Defining data governance: DG Drivers

Defining Data Governance

www.erwin.com/blog/defining-data-governance/

Data governance’s importance has become more widely understood. But for a long time, the discipline was marred by a poor reputation owing to consistent false starts, dogged implementations and underwhelming ROI.

The evolution from Data Governance 1.0 to Data Governance 2.0 has helped shake past perceptions, introducing a collaborative approach. But to ensure the collaborative take on data governance is implemented properly, an organization must settle on a common definition.

The Top 6 Benefits of Data Governance

www.erwin.com/blog/top-6-benefits-of-data-governance/

GDPR went into effect for businesses trading with the European Union, bringing hefty fines for noncompliance with its data collection, storage and usage standards.

But it’s important for organizations to understand that the benefits of data governance extend beyond just GDPR or compliance with any other internal or external regulations.

Data Governance Readiness: The Five Pillars

www.erwin.com/blog/data-governance-readiness/

GDPR had organizations scrambling to implement data governance initiatives by the effective date, but many still lag behind.

Enforcement and fines will increase in 2019, so an understanding of the five pillars of data governance readiness is essential: initiative sponsorship, organizational support, allocation of team resources, enterprise data management methodology and delivery capability.

Data Governance and GDPR: How the Most Comprehensive Data Regulation in the World Will Affect Your Business

www.erwin.com/blog/data-governance-and-gdpr/

Speaking of GDPR enforcement, this post breaks down how the regulation affects business.

From rules regarding active consent, data processing and the tricky “right to be forgotten” to required procedures for notifying afflicted parties of a data breach and documenting compliance, GDPR introduces a lot of complexity.

The Top Five Data Governance Use Cases and Drivers

www.erwin.com/blog/data-governance-use-cases/

An erwin-UBM study conducted in late 2017 sought to determine the biggest drivers for data governance.

In addition to compliance, top drivers turned out to be improving customer satisfaction, reputation management, analytics and Big Data.

Data Governance 2.0 for Financial Services

www.erwin.com/blog/data-governance-2-0-financial-services/

Organizations operating within the financial services industry were arguably the most prepared for GDPR, given its history. However, the huge Equifax data breach was a stark reminder that organizations still have work to do.

In addition to analyzing data governance for regulatory compliance in financial services, this article examines the value data governance can bring to these organizations – up to $30 billion could be on the table.

Understanding and Justifying Data Governance 2.0

www.erwin.com/blog/justifying-data-governance/

For some organizations, the biggest hurdle in implementing a new data governance initiative or strengthening an existing one is support from business leaders. Its value can be hard to demonstrate to those who don’t work directly with data and metadata on a daily basis.

This article examines this data governance roadblock and others, and offers advice on how to overcome them.

 


Categories
erwin Expert Blog Data Governance

For Pharmaceutical Companies Data Governance Shouldn’t Be a Hard Pill to Swallow

Using data governance in the pharmaceutical industry is a critical piece of the data management puzzle.

Pharmaceutical and life sciences companies face many of the same digital transformation pressures as other industries, such as financial services and healthcare, which we have explored previously.

In response, they are turning to technologies like advanced analytics platforms and cloud-based resources to help better inform their decision-making and create new efficiencies and better processes.

Among the conditions that set digital transformation in pharmaceuticals and life sciences apart from other sectors are the regulatory environment and the high incidence of mergers and acquisitions (M&A).

Data Governance, GDPR and Your Business

Protecting sensitive data in these industries is a matter of survival, both in terms of the potential penalties for failing to comply with any number of industry and government regulations and because of the near-priceless value of data around research and development (R&D).

The high costs and huge potential of R&D is one of the driving factors of M&A activity in the pharmaceutical and life sciences space. With roughly $156 billion in M&A deals in healthcare in the first quarter of 2018 alone – many involving drug companies – the market is the hottest it’s been in more than a decade. Much of the M&A activity is being driven by companies looking to buy competitors, acquire R&D, and offset losses from expiring drug patents.

 


 

With M&A activity comes the challenge of integrating two formerly separate companies into one. That means integrating technology platforms, business processes, and, of course, the data each organization brings to the deal.

Data Integrity for Risk Management and More

As in virtually every other industry, data is quickly becoming one of the most valuable assets within pharmaceutical and life sciences companies. In its 2018 Global Life Sciences Outlook, Deloitte speaks to the importance of “data integrity,” which it defines as data that is complete, consistent and accurate throughout the data lifecycle.

Data integrity helps manage risk in pharmaceutical and life sciences by making it easier to comply with a complex web of regulations that touch many different parts of these organizations, from finance to the supply chain and beyond. Linking these cross-functional teams to data they can trust eases the burden of compliance by supplying team members with what many industries now refer to as “a single version of truth” – which is to say, data with integrity.

Data integrity also helps deliver insights for important initiatives in the pharmaceutical and life sciences industries like value-based pricing and market access.

Developing data integrity and taking advantage of it to reduce risk and identify opportunities in pharmaceuticals and life sciences isn’t possible without a holistic approach to data governance that permeates every part of these companies, including business processes and enterprise architecture.


Data Governance in the Pharmaceutical Industry Maximizes Value

Data governance gives businesses the visibility they need to understand where their data is, where it came from, its value, its quality and how it can be used by people and software applications. This type of understanding of your data is, of course, essential to compliance. In fact, according to a 2017 survey by erwin, Inc. and UBM, 60 percent of organizations said compliance is driving their data governance initiatives.

Using data governance in the pharmaceutical industry helps organizations contemplating M&A, not only by helping them understand the data they are acquiring, but also by informing decisions around complex IT infrastructures and applications that need to be integrated. Decisions about application rationalization and business processes are easier to make when they are viewed through the lens of a pervasive data governance strategy.

Data governance in the pharmaceutical industry can be leveraged to hone data integrity and move toward what Deloitte refers to as end-to-end evidence management (E2E), which unifies the data in pharmaceuticals and life sciences from R&D to clinical trials and through commercialization.

Once implemented, Deloitte predicts E2E will help organizations maximize the value of their data by:

  • Providing a better understanding of emerging risks
  • Enabling collaboration with health systems, patient advocacy groups, and other constituents
  • Streamlining the development of new therapies
  • Driving down costs

If that list of benefits sounds familiar, it’s because it matches up nicely with the goals of digital transformation at many organizations – more efficient processes, better collaboration, improved visibility and better cost management. And it’s all built on a foundation of data and data governance.

To learn more, download our free whitepaper on the Regulatory Rationale for Integrating Data Management & Data Governance.


 

Categories
erwin Expert Blog

Financial Services Data Governance: Helping Value ‘the New Currency’

For organizations operating in financial services, data governance is becoming increasingly important. When financial services industry board members and executives gathered for EY’s Financial Services Leadership Summit in early 2018, data was a major topic of conversation.

Attendees referred to data as “the new oil” and “the new currency,” and with good reason. Financial services organizations, including banks, brokerages, insurance companies, asset management firms and more, collect and store massive amounts of data.

But data is only part of the bigger picture in financial services today. Many institutions are investing heavily in IT to help transform their businesses to serve customers and partners who are quickly adopting new technologies. For example, Gartner expects the global banking industry to spend $519 billion on IT in 2018.

The combination of more data and technology and fewer in-person experiences puts a premium on trust and customer loyalty. Trust has long been at the heart of the financial services industry. It’s why bank buildings in a bygone era were often erected as imposing stone structures that signified strength at a time before deposit insurance, when poor management or even a bank robbery could have devastating effects on a local economy.

Trust is still vital to the health of financial institutions, except today’s worst-case scenario often involves faceless hackers pillaging sensitive data to use or re-sell on the dark web. That’s why governing all of the industry’s data, and managing the risks that come with collecting and storing such vast amounts of information, is increasingly a board-level issue.

The boards of modern financial services institutions understand three important aspects of data:

  1. Data has a tremendous amount of value to the institution in terms of helping identify the wants and needs of customers.
  2. Data is central to security and compliance, and there are potentially severe consequences for organizations that run afoul of either.
  3. Data is central to the transformation underway at many financial institutions as they work to meet the needs of the modern customer and improve their own efficiencies.

Data Management and Data Governance: Solving the Enterprise Data Dilemma

Data governance helps organizations in financial services understand their data. It’s essential to protecting that data and to helping comply with the many government and industry regulations in the industry. But financial services data governance – all data governance in fact – is about more than security and compliance; it’s about understanding the value and quality of data.

When done right and deployed in a holistic manner that’s woven into the business processes and enterprise architecture, data governance helps financial services organizations better understand where their data is, where it came from, its value, its quality, and how the data is accessed and used by people and applications.

Financial Services Data Governance: It’s Complicated

Financial services data governance is getting increasingly complicated for a number of reasons.

Mergers & Acquisitions

Deloitte’s 2018 Banking and Securities M&A Outlook described 2017 as “stuck in neutral,” but there is reason to believe the market will pick up steam in 2018 and beyond, especially when it comes to financial technology (or fintech) firms. Bringing in new sets of data, new applications and new processes through mergers and acquisitions creates a great deal of complexity.

The integrations can be difficult, and there is an increased likelihood of data sprawl and data silos. Data governance not only helps organizations better understand the data, but it also helps make sense of the application portfolios of merging institutions to discover gaps and redundancies.

Regulatory Environment

There is a lengthy list of regulations and governing bodies that oversee the financial services industry, covering everything from cybersecurity to fraud protection to payment processing, all in an effort to minimize risk and protect customers.

The holistic view of data that results from a strong data governance initiative is becoming essential to regulatory compliance. According to a 2017 survey by erwin, Inc. and UBM, 60 percent of organizations said compliance drives their data governance initiatives.

More Partnerships and Networks

According to research by IBM, 45 percent of bankers say partnerships and alliances help improve their agility and competitiveness. Like consumers, today’s financial institutions are more connected than ever before, and it’s no longer couriers and cash that are being transferred in these partnerships; it’s data.

Understanding the value, quality and risk of the data shared in these alliances is essential – not only to be a good partner and derive a business benefit from the relationship, but also to evaluate whether or not an alliance or partnership makes good business sense.


More Sources of Data, More Touch Points

Financial services institutions are at the forefront of the multi-channel customer experience and have been for years. People do business with institutions by phone, in person, via the Web, and using mobile devices.

All of these touch points generate data, and it is essential that organizations can tie them all together to understand their customers. This information is not only important to customer service, but also to finding opportunities to grow relationships with customers by identifying where it makes sense to upsell and cross-sell products and services.

Grow the Business, Manage the Risk

In the end, financial services organizations need to understand the ways their data can help grow the business and manage risk. Data governance plays an important role in both.

Financial services data governance can better enable:

  • The personalized, self-service, applications customers want
  • The machine learning solutions that automate decision-making and create more efficient business processes
  • Faster and more accurate identification of cross-sell and upsell opportunities
  • Better decision-making about the application portfolio, M&A targets, M&A success and more

If you’re interested in financial services data governance, or evaluating new data governance technologies for another industry, you can schedule a demo of erwin’s data mapping and data governance solutions.


And you also might want to download our latest e-book, Solving the Enterprise Data Dilemma.

Michael Pastore is the Director, Content Services at QuinStreet B2B Tech.

Categories
erwin Expert Blog

Avoiding Operational Disasters with a Process-Based Approach to Risk Management

Risk avoidance and risk management are hot topics that seem to govern decision-making – and with good reason. Risk comes with potentially massive operational, financial, reputational and legal repercussions, so it makes absolute sense to do everything possible to model it, understand it, analyze it and ultimately mitigate it.

But not all risk is created equal. Nothing illustrates this point better than recent research showing how much global financial institutions lost to different types of operational risk during the last six years. As shown in the chart below, they lost $210 billion between 2011 and 2016, with more than $180 billion of that amount attributed to execution, delivery and process management combined with clients, products and business practices.

So, major banks lost more money because of bad process management than all other risks combined. I’d argue that clients, products and business practices, which comprise the largest risk category, essentially come down to process application and management as well.
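The arithmetic behind that claim is simple to check. Using the approximate figures cited above (rounded category totals, not exact values):

```python
# Approximate operational-risk losses at major banks, 2011-2016 (USD billions),
# per the research cited above.
total_losses = 210
process_related = 180  # execution/delivery/process mgmt plus clients/products/practices

all_other_risks = total_losses - process_related
print(all_other_risks)                     # 30
print(process_related > all_other_risks)   # True: process failures dwarf everything else
```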

Business Process-Based Approach to Risk Management

The Data Disconnect

Despite the actual statistics, we hear more about data/technology and compliance risk. While these are significant and justified concerns, financial institutions don’t seem to realize they are losing more money due to other types of risks.

Therefore, I want to remind them – and all of us – that managing operational risk is an ongoing initiative, which needs to include better risk analysis, documentation, process impact analysis and mitigation.

While dozens of methodologies and systems are available in today’s marketplace, they focus on, or attempt to address, only the smaller, individual components of operational risk. However, all the risk categories listed above require an effective, practical and – most important – easy-to-implement system that addresses all the underlying components in a collaborative effort, not in isolation.

According to ORX, the largest operational risk association in the financial services sector, managing, and thereby reducing, risk involves managing four different but interconnected layers: people, IT, organizational processes and regulations.

More and more organizations seem to believe that once IT embeds their applications with the necessary controls to meet regulatory requirements, all is right with the world. But experience has shown that isn’t true. Without adapting the processes that use the applications, training employees, and putting sufficient controls in place to ensure all regulatory elements are not only applied but applied correctly, technical controls alone will never be effective.

And many will argue that little can be done within an organization regarding regulations, but that’s not true either. While regulations are developed and passed by governments and other external regulatory bodies, what really matters is how organizations adopt those regulations and embed them into their culture and daily operations – which is where all the layers of risk management intersect.

Avoiding Heisenberg’s Uncertainty Principle in Risk Management

As part of his Nobel Prize-winning work, physicist and quantum mechanics pioneer Werner Heisenberg developed the eponymous uncertainty principle, which asserts that it is impossible to know both the position and momentum of a particle precisely at the same time. This idea applies to many aspects of everyday life, including organizational operations. It’s difficult to know both an organization’s current state and where it’s headed, and every organization struggles with the same risk management question in this vein: how do we manage risk while also being agile enough to support growth?

ORX is clear that effective risk management requires implementing controls throughout the entire process ecosystem by integrating risk management into the organization’s very fabric. This means clearly defining roles and responsibilities, embedding process improvements, and regularly controlling process performance. Of course, the common thread here is more streamlined and controlled processes.

Make no mistake – effort is still required, but all the above is much simpler today. Thanks to new methodologies and comprehensive business process modeling systems, you can identify which risks are applicable, where they are most likely to occur, and who is responsible for managing them to reduce their probability and impact. Therefore, operational risk can be viewed and then addressed quickly and effectively.

In fact, erwin has worked with an increasing number of financial institutions launching process improvement, automation and management initiatives specifically designed to restructure their processes to promote flexibility as a growth driver without sacrificing traceability and control.

We can help you do the same, regardless of your industry.

To find out how erwin can help empower your data-driven business initiatives, please click here.


Categories
erwin Expert Blog

Data Governance and Risk Management

Risk management is crucial for any data-driven business. Former FBI Director Robert Mueller famously said, “There are only two types of companies: those that have been hacked and those that will be.” This statement struck a chord when first spoken in 2012, and the strings are still ringing.

As data continues to be more deeply intertwined in our day-to-day lives, the associated risks are growing in number and severity. So, there’s increasing scrutiny on organizations’ data governance practices – and for good reason.

Governmental scrutiny, in particular, is gearing up. The General Data Protection Regulation (GDPR) introduces strict formality in the way data is governed across the European Union, including organizations outside the EU that wish to do business with its member nations.

But in certain sectors, public scrutiny is just as – if not more – important to consider. We’ve been talking since September about the data breach at Equifax, which has just been hit with a 50-state, class-action lawsuit.

And we just learned that Uber was hacked, resulting in the personal data of 57 million customers and Uber drivers being stolen. What’s more, the company concealed the breach for more than a year.

Whether we’re talking about financial or reputational damage, it’s absolutely clear that bad data governance is bad business.


Risk Management for IoT

Think about the Internet of Things (IoT) for a moment …

IoT devices are gaining a greater foothold in daily life – from the mundane, like smart refrigerators and thermostats, to the formidable, like medical devices. Whatever the degree of severity, personal data is personal data, and the steps taken to mitigate security risks must be evidenced to remain compliant.

Data governance is fundamental to risk mitigation and management. That’s because data governance is largely concerned with understanding two key things: where your data is kept and what it’s used for. Considering the scope of IoT data, this is no easy feat.

Estimates indicate that by 2020, 50 billion connected devices will be in circulation. Misunderstanding where and what this data is could leave the records of millions exposed.

On top of the already pressing need for effective data governance for risk management, we’re constantly approaching uncharted territories in data applications.

Lessons from Driverless Cars

The driverless car industry is one such example on the not-too-distant horizon.

Businesses from BMW to Google are scrambling to win the driverless car race, but fears that driverless cars could be hacked are well founded. Earlier this year, a Deloitte Insights report considered the likely risks of introducing autonomous vehicles onto public roads.

It reads, “The very innovations that aim to enhance the way we move from place to place entail first-order cybersecurity challenges.” It also indicates that organizations need to make radical changes in how they view cybersecurity to ensure connected vehicles are secure, vigilant and resilient:

  • Secure – Work on risk management by prioritizing sensitive assets to balance security and productivity.
  • Vigilant – Integrate threat data, IT data and business data to be equipped with context-rich alerts to prioritize incident handling and streamline incident investigation.
  • Resilient – Rapidly adapt and respond to internal or external changes to continue operations with limited business impacts.

The first thing organizations should take away is that this advice applies to the handling of all sensitive data; it’s by no means exclusive to autonomous vehicles. And second, security, vigilance and resilience all are enabled by data governance.

Data Governance Leads the Way

As discussed, data governance is about knowing where your data is and what it’s used for. This understanding indicates where security resources should be spent to help mitigate data breaches.

Data governance also makes threat data, IT data and business data more readily discoverable, understandable and applicable, meaning any decisions you make regarding security investments are well informed.

In terms of resilience and the ability to rapidly respond, businesses must be agile and collaborative, points of contention in traditional data governance. However, Data Governance 2.0 as defined by Forrester addresses agility in terms of “just enough controls for managing risk, which enables broader and more insightful use of data required by the evolving needs of an expanding business ecosystem.”

As GDPR looms ever near, an understanding of data governance best practices will be indispensable. To get the best of them, click here.


Categories
erwin Expert Blog

Using Enterprise Architecture to Improve Security

The personal data of more than 143 million people – nearly half the United States’ population – may have been compromised in the recent Equifax data breach. With every major data breach come post-mortems and lessons learned, but one area we haven’t seen discussed is how enterprise architecture might aid in the prevention of data breaches.

For Equifax, the reputational hit, loss of profits/market value, and potential lawsuits are really bad news. For other organizations that have yet to suffer a breach, be warned. The clock is ticking for the General Data Protection Regulation (GDPR) to take effect in May 2018. GDPR changes everything, and it’s just around the corner.

Organizations of all sizes must take greater steps to protect consumer data or pay significant penalties. Negligent data governance and data management could cost up to 4 percent of an organization’s annual global turnover or up to 20 million euros, whichever is greater.
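As a quick illustration of how that “whichever is greater” cap works (a sketch only – actual fines depend on the infringement, mitigating factors and regulator discretion):

```python
def gdpr_fine_cap_eur(annual_global_turnover_eur: float) -> float:
    """Upper bound of a GDPR fine for the most serious infringements:
    the greater of 4% of annual global turnover or EUR 20 million."""
    return max(0.04 * annual_global_turnover_eur, 20_000_000.0)

# An organization with EUR 1 billion in turnover faces a cap of EUR 40 million;
# a smaller firm with EUR 100 million in turnover still faces the EUR 20 million floor.
print(gdpr_fine_cap_eur(1_000_000_000))  # 40000000.0
print(gdpr_fine_cap_eur(100_000_000))    # 20000000.0
```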

With this in mind, the Equifax data breach – and subsequent lessons – is a discussion potentially worth millions.


Proactive Data Protection and Cybersecurity

Given that data security has long been considered paramount, it’s surprising that enterprise architecture is one approach to improving data protection that has been overlooked.

It’s a surprise because when you consider enterprise architecture use cases and just how much of an organization it permeates (which is really all of it), EA should be commonplace in data security planning.

So, the Equifax breach provides a great opportunity to explore how enterprise architecture could be used for improving cybersecurity.

Security should be proactive, not reactive, which is why EA should be a huge part of security planning. And while we hope the Equifax incident isn’t the catalyst for an initial security assessment and improvements, it certainly should prompt a re-evaluation of data security policies, procedures and technologies.

By using well-built enterprise architecture for the foundation of data security, organizations can help mitigate risk. EA’s comprehensive view of the organization means security can be involved in the planning stages, reducing risks involved in new implementations. When it comes to security, EA should get a seat at the table.

Enterprise architecture also goes a long way toward nullifying threats born of shadow IT, outdated applications, and other IT faux pas. Well-documented, well-maintained EA gives an organization the best possible view of current tech assets.

This is especially relevant in Equifax’s case, as the breach has been attributed to the company’s failure to update a web application despite having sufficient warning to do so.

By leveraging EA, organizations can shore up data security by ensuring updates and patches are implemented proactively.

Enterprise Architecture, Security and Risk Management

But what about existing security flaws? Implementing enterprise architecture in security planning now won’t solve them.

An organization can never eliminate security risks completely. The constantly evolving IT landscape would require businesses to spend an infinite amount of time, resources and money to achieve zero risk. Instead, businesses must opt to mitigate and manage risk to the best of their abilities.

Therefore, EA has a role in risk management too.

In fact, EA’s risk management applications are more widely appreciated than its role in security. But effective EA for risk management is a fundamental part of how EA for implementing security works.

Enterprise architecture’s comprehensive accounting of business assets (both technological and human) means it’s best placed to align security and risk management with business goals and objectives. This can give an organization insight into where time and money can best be spent in improving security, as well as the resources available to do so.

This is because of the objective view enterprise architecture analysis provides for an organization.

To use a crude but applicable analogy, consider the risks of travel. A fear of flying is more common than a fear of driving. In a business sense, this could unwarrantedly encourage more spending on mitigating the risks of flying. However, an objective enterprise architecture analysis would reveal that, despite the fear, the risk of travelling by car is much greater.

Applying the same logic to security spending, enterprise architecture analysis would give an organization an indication of how to prioritize security improvements.