The Unified Data Platform – Connecting Everything That Matters

Businesses stand to gain a lot from a unified data platform.

This decade has seen data-driven leaders dominate their respective markets and inspire other organizations across the board to use data to fuel their businesses, leveraging this strategic asset to create more value below the surface. It’s even been dubbed “the new oil,” but data is arguably more valuable than the analogy suggests.

Data governance (DG) is a key component of the data value chain because it connects people, processes and technology as they relate to the creation and use of data. It equips organizations to better deal with increasing data volumes, the variety of data sources, and the speed at which data is processed.

But for an organization to realize and maximize its true data-driven potential, a unified data platform is required. Only then can all data assets be discovered, understood, governed and socialized to produce the desired business outcomes while also reducing data-related risks.

Benefits of a Unified Data Platform

Data governance can’t succeed in a bubble; it has to be connected to the rest of the enterprise. Whether strategic, such as risk and compliance management, or operational, like a centralized help desk, your data governance framework should span and support the entire enterprise and its objectives, which it can’t do from a silo.

Let’s look at some of the benefits of a unified data platform with data governance as the key connection point.

Understand current and future state architecture with business-focused outcomes:

A unified data platform with a single metadata repository connects data governance to the roles, goals, strategies and KPIs of the enterprise. Through integrated enterprise architecture modeling, organizations can capture, analyze and incorporate the structure and priorities of the enterprise and related initiatives.

This capability allows you to plan, align, deploy and communicate a high-impact data governance framework and roadmap that sets manageable expectations and measures success with metrics important to the business.

Document capabilities and processes and understand critical paths:

A unified data platform connects data governance to what you do as a business and the details of how you do it. It enables organizations to document and integrate their business capabilities and operational processes with the critical data that serves them.

It also provides visibility and control by identifying the critical paths that will have the greatest impacts on the business.

Realize the value of your organization’s data:

A unified data platform connects data governance to specific business use cases. The value of data is realized by combining different elements to answer a business question or meet a specific requirement. Conceptual and logical schemas and models provide a much richer understanding of how data is related and combined to drive business value.

Harmonize data governance and data management to drive high-quality deliverables:

A unified data platform connects data governance to the orchestration and preparation of data to drive the business, governing data throughout the entire lifecycle – from creation to consumption.

Governing the data management processes that make data available is of equal importance. By harmonizing the data governance and data management lifecycles, organizations can drive high-quality deliverables that are governed from day one.

Promote a business glossary for unanimous understanding of data terminology:

A unified data platform connects data governance to the language of the business when discussing and describing data. Understanding the terminology and semantic meaning of data from a business perspective is imperative, but most business consumers of data don’t have technical backgrounds.

A business glossary promotes data fluency across the organization and vital collaboration between different stakeholders within the data value chain, ensuring all data-related initiatives are aligned and business-driven.

Instill a culture of personal responsibility for data governance:

A unified data platform is inherently connected to the policies, procedures and business rules that inform and govern the data lifecycle. The centralized management and visibility afforded by linking policies and business rules at every level of the data lifecycle will improve data quality, reduce expensive re-work, and improve the ideation and consumption of data by the business.

Business users will know how to use (and how not to use) data, while technical practitioners will have a clear view of the controls and mechanisms required when building the infrastructure that serves up that data.

Better understand the impact of change:

Data governance should be connected to the use of data across roles, organizations, processes, capabilities, dashboards and applications. Proactive impact analysis is key to efficient and effective data strategy. However, most solutions don’t tell the whole story when it comes to data’s business impact.

By adopting a unified data platform, organizations can extend impact analysis well beyond data stores and data lineage for true visibility into who, what, where and how the impact will be felt, breaking down organizational silos.

Getting the Competitive “EDGE”

The erwin EDGE delivers an “enterprise data governance experience” in which every component of the data value chain is connected.

Now with data mapping, it unifies data preparation, enterprise modeling and data governance to simplify the entire data management and governance lifecycle.

Both IT and the business have access to an accurate, high-quality and real-time data pipeline that fuels regulatory compliance, innovation and transformation initiatives with accurate and actionable insights.

Once You Understand Your Data, Everything Is Easier

As a data-driven organization in the modern, hyper-competitive business landscape, it's imperative that employees, business leaders and decision makers can understand your data.

In a previous article, I argued that business process management without data governance is a perilous experiment. The same can be said for enterprise architecture initiatives that traditionally stop at the process level with disregard for the data element.

Therefore, an organization that ignores the data layer of both its process and enterprise architectures isn't tapping into its full potential. It runs the risk of being siloed, confined to traditional and qualitative structures that make improvement and automation more difficult, time-consuming and ultimately ineffective. However, it doesn't have to be this way.

I’m going to make a bold statement, and then we can explore it together. Ready? Without data governance, a process management or enterprise architecture initiative will result in a limited enterprise architecture and any efforts that may stem from it (process improvement, consolidation, automation, etc.) also will be limited.

So how exactly does data governance fit within the larger world of enterprise architecture, and why is it so critical?

Understand Your Data – What Lies Beneath

A constant source of unpleasant surprise for most medium and large-scale enterprise architecture and process management initiatives is discovering just how tricky it is to establish interconnectivity between low-level processes and procedures in a way that is easy, sustainable and, above all, objective.

Traditionally, most projects focus on some type of top-down classification, using either organizational or capability-based groupings. These structures are usually qualitative or derived from process owners, subject matter experts (SMEs) or even department heads and business analysts. While usually accurate, they are primarily isolated, top-down views contained within organizational silos.

But more and more enterprise architecture initiatives are attempting to move beyond these types of groupings to create horizontal, interconnected processes. To do so, process owners must rely on handover points – inputs and outputs, documents and, you guessed it, data. The issue is that these handover points are still qualitative and unsustainable in the long term, which is where data and data governance come in.

By providing an automated, fully integrated view of data ownership, lineage and interconnectivity, data governance serves as the missing link between disparate and disconnected processes in a way that has always proved elusive and time-consuming. It is an objective link, driven by factual relationships, that brings everything together.

Data governance also provides an unparalleled level of process connectivity, so organizations can see how processes truly interact with each other, across any type of organizational silo, enabling realistic and objective impact analysis. How is this possible? By conducting data governance in conjunction with any enterprise architecture initiative, using both a clear methodology and specialized system.

Understand Your Data – Data Is Everywhere

Everyone knows that processes generate, use and own data. The problem is understanding what “process data” is and how to incorporate that information into standard business process management.

Most process owners, SMEs and business analysts view data as a collection of information, usually in terms of documents and data groups such as “customer information” or “request details,” that is generated and completed through a series of processes or process steps. Links between documents/data groups and processes are assumed to be communicated to other processes that use them and so on. But this picture is not accurate.

Most of the time, a document or data group is not processed in its entirety by any subsequent/recipient processes; some information is processed by one process while the remainder is reserved for another or is entirely useless. This means that only single, one-dimensional connections are made, ignoring derived or more complex connections.

Therefore, any attempted impact analysis would ignore these additional dimensions, an omission that accounts for many of the process improvement and re-engineering projects that either fail or significantly overrun their time and budget.

With data governance, data is identified and tracked with ownership, lineage and use established and associated with the appropriate process elements, showing the connections between processes at the most practical informational level.

In addition, and possibly most important, process ownership/responsibility becomes more objective and clear because it can be determined according to who owns/is responsible for the information the process generates and uses – as opposed to the more arbitrary/qualitative assignment that tends to be the norm. RACI and CRUD matrix analyses become faster, more efficient and infinitely more effective, and for the first time, they encompass elements of derived ownership (through data lineage).

Process automation projects also become more efficient, effective and shorter in duration because the right data is known from the beginning, process interconnectivity is mapped objectively, and responsibilities are clearly visible from the initial design phase.

With requirements accompanied by realistic process mapping information, development of workflows is easier, faster and less prone to errors, making process automation more attractive and feasible, even to smaller organizations for which even the scoping and requirements-gathering exercise has traditionally proved too complicated.

Understand Your Data – More Upside to Data Governance

Data governance enables an organization to manage and mitigate data risks, protecting itself from legal and reputational challenges to ensure continued operation. And once data is connected with business processes through effective, proactive data governance, additional benefits are realized:

  • Process risk management and mitigation becomes more factual and less qualitative, making the organization more effective;
  • Process compliance also becomes more factual, empowering not only compliance efforts but also compliance control and assessment with easier impact analyses; and
  • Organizational redesign can be based on what groupings actually do.

While the above benefits may appear vague and far-removed from either a pure enterprise architecture or data governance initiative, they are more tangible and achievable than ever before as organizations begin to rely more on object-oriented management systems.

Combining the strategic, informational-level detail and management provided by data governance with the more functional, organizational view of enterprise architecture proves that one plus one really does equal more than two.

To learn more about how data governance underpins organizations' data strategies, click here.

Why Data Governance is the Key to Better Decision-Making

The ability to quickly collect vast amounts of data, analyze it, and then use what you’ve learned to help foster better decision-making is the dream of many a business executive. But like any number of things that can be summarized in a single sentence, it’s much harder to execute on such a vision than it might first appear.

According to Forrester, 74 percent of firms say they want to be “data-driven,” but only 29 percent say they are good at connecting analytics to action. Consider this: Forrester found that business satisfaction with analytics dropped by 21 percent between 2014 and 2015 – a period of great promise and great investment in Big Data. In other words, the more data businesses were collecting and mining, the less happy they were with their analytics.

A number of factors are potentially at play here, including the analytics software, the culture of the business, and the skill sets of the people using the data. But your analytics applications and the conclusions you draw from your analysis are only as good as the data that is collected and analyzed. Collecting, safeguarding and mining large amounts of data isn’t an inexpensive exercise, and as the saying goes, “garbage in, garbage out.”

“It’s a big investment and if people don’t trust data, they won’t use things like business intelligence tools because they won’t have faith in what they tell them,” says Danny Sandwell, director of product marketing at erwin, Inc.

Using data to inform business decisions is hardly new, of course. The modern idea of market research dates back to the 1920s, and ever since businesses have collected, analyzed and drawn conclusions from information they draw from customers or prospective customers.

The difference today, as you might expect, is the amount of data and how it’s collected. Data is generated by machines large and small, by people, and by old-fashioned market research. It enters today’s businesses from all angles, at lightning speed, and can, in many cases, be available for instant analysis.

As the volume and velocity of data increase, overload becomes a potential problem. Unless the business has a strategic plan for data governance, decisions about where data is stored, who and what can access it, and how it can be used become increasingly difficult to make.

Not every business collects massive amounts of data like Facebook and Yahoo, but recent headlines demonstrate how those companies' inability to govern data is harming their reputations and bottom lines. For Facebook, it was the revelation that the data of 87 million users was improperly obtained to influence the 2016 U.S. presidential election. For Yahoo, the U.S. Securities and Exchange Commission (SEC) levied a $35 million fine for failure to disclose a data breach in a timely manner.

In both the Facebook and Yahoo cases, the misuse or failure to protect data was one problem. Their inability to quickly quantify the scope of the problem and disclose the details made a big issue even worse – and kept it in the headlines even longer.

The issues of data security, data privacy and data governance may not be top of mind for some business users, but these issues manifest themselves in a number of ways that affect what they do on a daily basis. Think of it this way: somewhere in all of the data your organization collects, a piece of information that can support or refute a decision you’re about to make is likely there. Can you find it? Can you trust it?

If the answer to these questions is “no,” then it won’t be easy for your organization to make data-driven decisions.

Powering Better Decision-Making with Data Governance

Nearly half (45 percent) of the respondents to a November 2017 survey by erwin and UBM said better decision-making was one of the factors driving their data governance initiatives.

Data governance helps businesses understand what data they have, how good it is, where it is, and how it’s used. A lot of people are talking about data governance today, and some are putting that talk into action. The erwin/UBM survey found that 52 percent of respondents say data is critically important to their organization and they have a formal data governance strategy in place. But almost as many respondents (46 percent) say they recognize the value of data to their organization but don’t have a formal governance strategy.

Many early attempts at instituting data governance failed to deliver results. They were narrowly focused, and their proponents often had difficulty articulating the value of data governance to the organization, making it difficult to secure budget. Some organizations even understood data governance as a type of data security, locking up data so tightly that the people who wanted to use it to foster better decision-making had trouble getting access.

Issues of ownership also stymied early data governance efforts, as IT and the business couldn’t agree on which side was responsible for a process that affects both on a regular basis. Today, organizations are better equipped to resolve issues of ownership, thanks in large part to a new corporate structure that recognizes how important data is to modern businesses. Roles like chief data officer (CDO), which increasingly sits on the business side, and the data protection officer (DPO), are more common than they were a few years ago.

A modern data governance strategy works a lot like data itself – it permeates the business and its infrastructure. It is part of the enterprise architecture and the business processes, and it helps organizations better understand the relationships between data assets using techniques like visualization. Perhaps most important, a modern approach to data governance is ongoing, because organizations and their data are constantly changing and transforming, so their approach to data governance can't sit still.

As you might expect, better visibility into your data goes a long way toward using that data to make more informed decisions. There is, however, another advantage to the visibility offered by a holistic data governance strategy: it helps you better understand what you don’t know.

By helping businesses understand the areas where they can improve their data collection, data governance helps organizations continually work to create better data, which manifests itself in real business advantages, like better decision-making and top-notch customer experiences, all of which will help grow the business.

Michael Pastore is the Director, Content Services at QuinStreet B2B Tech. This content originally appeared as a sponsored post on http://www.eweek.com/.

You can determine how effective your current data governance initiative is by taking erwin’s DG RediChek.

Take the DG RediChek

Five Pillars of Data Governance Readiness: Organizational Support

It’s important that business leaders foster organizational support for their data governance efforts.

The clock is counting down to the May 25 effective date for the General Data Protection Regulation (GDPR). With the deadline just a stone’s throw away, organizations need to ensure they are data governance-ready.

We’re continuing our blog series on the Five Pillars of Data Governance (DG). Today, we’ll explore the second pillar of data governance, organizational support, and why it’s essential to ensuring DG success.

In the modern, data-driven business world, data is an organization’s most valuable asset, and successful organizations treat it as such. In this respect, we can see data governance as a form of asset maintenance.

Take a production line in a manufacturing facility, for example. Organizations understand that equipment maintenance is an important, ongoing process. They require employees using the equipment to be properly trained and to keep it clean, safe and in proper working order, with no misuse.

They do this because they know that maintenance can prevent, or at the very least postpone repair that can be costly and lead to lost revenue during downtime.

Organizational Support: Production Lines of Information

Despite the intangible nature of data, the same ideas for maintaining physical assets can and should be applied. After all, a data-driven business is essentially a production line for information. Data is created and moved through the pipeline/organization, eventually driving revenue.

In that respect – as with machinery on a production line and those who use it – everybody that uses data should be involved in maintaining and governing it.

Poor data governance leads to similar problems as poor maintenance of a production line. If it's not well-kept, the fallout can permeate the whole business.

If a DG initiative is failing, data discovery becomes more difficult, slowing down data’s journey through the pipeline.

Inconsistencies in a business glossary lead to data units with poor or no context. This in turn leads to data units that the relevant users don’t know how to put together to create information worth using.

Additionally, and perhaps most damning, if an organization has poorly managed systems of permissions, the wrong people can access data. This could lead to unapproved changes, or in light of GDPR, serious fines – and ultimately diminished customer trust, falling stock prices and tarnished brands.

Facebook has provided a timely reminder of the importance of data governance and the potential scale of the fallout when it's underestimated. Facebook's lack of understanding as to how third-party vendors could use and were using its data landed the company in hot PR water (to put it lightly).

Reports indicate 50 million users were affected, and although this is nowhere near the biggest leak in history (or even in recent history, see: Equifax), it’s proof that the reputational damage of a data breach is extensive. And with GDPR fast approaching, that cost will only escalate.

At the very least, organizations need to demonstrate that they've taken the necessary steps to prevent such breaches. This requires understanding what data they currently have, where it is, and how it may be used by any third parties with access. This is where data governance comes in, but for it to work, many organizations need a culture change.

A Change in Culture

Fostering organizational support for data governance might require a change in organizational culture.

This is especially apparent in organizations that have only adopted the Data Governance 1.0 approach in which DG is siloed from the wider organization and viewed as an “IT-problem.” Such an approach denies data governance initiatives the business contexts needed to function in a data-driven organization.

Data governance is based primarily on three bodies of knowledge: the data dictionary, business glossary and data usage catalog. For these three bodies of knowledge to be complete, they need input from the wider business.

In fact, countless past cases of failed DG implementations can be attributed to organizations lacking organizational support for data governance.

For example, leaving IT to document and assemble a business glossary naturally leads to inconsistencies. In this case, IT departments are tasked with creating a business glossary for terms they often aren't aware of, don't understand the context of, or don't recognize the applications or implications of.

This approach preemptively dooms the initiative, ruling out the value-adding benefits of mature data governance initiatives from the outset.

erwin's 2018 State of Data Governance Report found that IT departments continue to foot the bill for data governance at 40% of organizations. Budget for data governance comes from the audit and compliance function at 20% of organizations, while the business covers the bill at just 8% of the companies surveyed.

To avoid the aforementioned pitfalls, business leaders need to instill a culture of data governance throughout the organization. This means viewing DG as a strategic initiative and investing in it with inherent organizational and financial support as an on-going practice.

Keep in mind that organizations tend to overvalue the things that can be measured and undervalue the things that cannot. Most organizations want to quantify the value of data governance. As part of a culture shift, organizations should develop a business case for an enterprise data governance initiative that includes calculations for ROI.

By limiting its investment to departmental budgets, data governance must contend with other departmental priorities. As a long-term initiative, it often will lose out to short-term gains.

Of course, this means business leaders need to be heavily invested and involved in data governance themselves – a pillar of data governance readiness in its own right.

Ideally, organizations should implement a collaborative data governance solution to facilitate the organization-wide effort needed to make DG work.

Collaborative in the sense of enabling inter-departmental collaboration so the whole organization's data assets can be accounted for, but also in the sense that it works with the other tools that make data governance effective and sustainable – e.g., enterprise architecture, data modeling and business process.

We call this all-encompassing approach to DG an ‘enterprise data governance experience’ or ‘EDGE.’ It’s the Data Governance 2.0 approach, made to reflect how data can be used within the modern enterprise for greater control, context, collaboration and value creation.

To determine your organization’s current state of data governance readiness, take the erwin DG RediChek.

To learn more about the erwin EDGE, reserve your seat for this webinar.

Take the DG RediChek

Data Governance: Your Engine for Driving Results

In my previous post, I described how organizational success depends on certain building blocks that work in alignment with common business objectives. These building blocks include business activities, data and analytics.

Governance is also one of the required building blocks because it provides cohesion in the standards to align people, processes, data and technology for successful and sustainable results. Although it has been somewhat of an abstract concept, data governance is foundational to helping organizations use data as a corporate asset.

Assets are acquired and used to help organizations execute their business models. Principles of asset management require that assets be cataloged, inventoried, protected and accessible to authorized people with the skills and experience to optimize them.

Assets typically generate more value if they have high levels of utilization. In the context of data, this means governed data assets will be more valuable if they strengthen existing operations and guide improvements, supported by analytics.

As organizations seek to unlock more value by implementing a wider analytics footprint across more business functions, data governance will guide their journeys.

A New Perspective on Data

Becoming a data-driven enterprise means making decisions based on empirical evidence, not a "gut feeling." This transformation requires a clear vision, strategy and disciplined execution. The desired business opportunity must be well thought out, understood and communicated to others – from the C-suite to the front lines.

Organizations that want to succeed in the digital age understand that their cultures and therefore their decision-making processes must become more proactive and collaborative. Of course, data is at the core of business performance and continuous improvement.

In this modern era of Big Data, non-traditional data sets generated externally are being blended with traditional data generated internally. As such, a key element of data-driven success involves changing the long-held perspective of data as a cost center, with few if any investments made to unlock its value to the organization.

Being data-driven, based on analytics, changes this mindset. Business leaders are indeed starting to realize that making data more accessible and useful throughout the organization contributes to the results they want to achieve – and must report to their boards.

If traditional asset management concepts are applied to data, then objectives for security, quality, cataloging, definition, confidence, authorization and accessibility can be defined and achieved. These areas then become the performance criteria of the new data asset class.

So transforming an organization’s leadership and the rest of its culture to perceive and treat data as an asset changes its classification from “cost” to “investment.” Valuable assets earn a financial return and fuel productivity. They also can be re-invested or re-purposed.

Data governance is key to this new perspective of data as an asset.

Data Governance Definition and Purpose

Data governance is important to the modern economy because it enables the transformation of data into valuable assets to improve top- and bottom-line performance. Well-governed data is accessible, useful and relevant across a range of business improvement use cases.

But in the early stages of implementing data governance, organizations tend to have trouble defining it and organizing it, including determining which tasks are involved.

At its core, data governance is a cross-functional program that develops, implements, monitors and enforces policies that improve the performance of select data assets.

Implementing data governance ensures that “asset-grade” data is available to support decision-making, based on advanced analytics. Using this rationale, potential objectives to meet the strategic intent of the organization can be defined to derive value.

Following is a list of possible objectives for a data governance program:

  • Improve data security
  • Increase data quality
  • Make data more accessible to more stakeholders
  • Increase data understanding
  • Raise the confidence of data consumers
  • Increase data literacy and determine the data-driven maturity level of the organization

Building a Data Governance Foundation

The scope and structure of a data governance program need to be determined up front, including responsibilities, accountabilities, decision rights and authority levels, as well as how the program fits into the existing corporate structure in terms of virtual or physical teams.

Structural options include top-down command-and-control and bottom-up collaborative networks. Executive accountability also should be outlined.

It's common for a data executive, such as the chief data officer, to be identified as accountable for overall data governance results. Data owners are business leaders who manage the processes that generate critical data. They're responsible for defining the policies that support the program's objectives.

Data stewards report to the data owners and are responsible for translating data policies into actions assigned to data specialists. The data specialists execute projects and other workflows to ensure that the governed data conforms to the intent of the policies.

Data stewards form the backbone of a data governance initiative. They influence how data is managed by assigning tasks to the specialists. Data stewards are responsible for cataloging, defining and describing the governed data assets.

These roles may be full-time or part-time, depending on the scope of the work.

Key processes carried out by the data governance team include:

  1. Defining and planning the program’s scope
  2. Data quality improvement
  3. Data security improvement
  4. Metadata creation and management
  5. Evaluating the suitability of new data sources
  6. Monitoring and enforcing compliance to data policies
  7. Researching new data sources
  8. Training to improve data literacy of staff at all levels
  9. Facilitating and finding new data-driven opportunities to improve the business
  10. Leading and managing cultural change

Data governance is based on a strategy that defines how data assets should look and perform, including levels of quality, security, integration, accessibility, etc. The design and implementation of a data governance program should start with a limited scope and then gradually ramp up to support the overall business strategy. So think big, but start small.

The next post in the series explores how data governance helps implement sustainable business processes that produce measurable results over time. Click here to continue reading.

Why Data Governance Leads to Data-Driven Success

Searching for new ways to generate value and improve execution, organizations of all shapes and sizes are racing to embrace data-driven approaches that are enabled by advances in analytics.

A perfect storm of events that started in the mid-2000s has morphed into a disruptive force in the economy at an accelerating pace. Data-driven analytics has gained mainstream business adoption. Advances in communications, geo-positioning systems, sensors and computing technologies have combined with the rise of social media and the incredible growth of available data sources.

Leadership teams in the boardroom have become acutely aware of the potential opportunities available for driving innovation and growth.

Although opportunities are significant, many challenges exist that make it difficult to successfully adopt data-driven approaches.

We’re going to explore the rationale for becoming data-driven, how to frame success, and some of the critical building blocks required, including data governance.

Framing Data-Driven Success

Organizational impact helps us frame the concept of data-driven success. Impact is related to an outcome. An impact describes a changed condition in measurable terms. A well-defined impact is a proxy for value.

Stating that you want to "move the needle" implies that the area of impact can be measured with a metric that represents that needle. By achieving impact in the right business area, incremental value is created.

When investments are considered for implementing new data-driven approaches, it’s essential to define the desired areas of impact. Evidence of impact requires knowledge of the condition before and after the data-driven approach has been implemented.

Areas of impact can be tangible or intangible. They might be difficult to measure, but measurement strategies can be developed for most of them. It's important to frame the desired area of impact against the feasibility of gathering useful measurements.

Examples of measurable impact:

  • Increase process efficiency by 5%
  • Reduce product defects by 15%
  • Increase profit margin by 10%
  • Reduce customer attrition by 15%
  • Increase customer loyalty by 20%

Impact measures relative changes in performance over time. The changes are directly related to incremental value creation. Impact can be defined and managed by organizations from all sectors of the economy. Areas of impact are linked to their mission, vision and definition of success.

Data-driven excellence describes the performance that exists when targeted areas of impact are successfully enabled by data-driven approaches.

Building Blocks of Data-Driven Approaches

Successfully becoming data-driven requires that desired impacts are related to and supported by four categories of building blocks.

Data-Driven Building Blocks

The first category describes the business activities that must be created or modified to drive the desired impact. These are called the “business building blocks.”

The second category describes the new information and insights required by the business building blocks based on analytic methods that enable smarter business activities. These are called the “analytics building blocks.”

The third category describes the relevant data to be acquired and delivered to the analytics methods that generate the new information and insights. These are called the “data building blocks.”

Success at an organizational level requires that all critical building blocks are aligned with shared objectives and approaches that ensure cohesion and policy compliance. This responsibility is provided by the fourth category called the “governance building blocks.”

The four categories form a layered model that describes their dependencies. Value-creating impact depends on business activities, which depend on analytics, which depends on data, which depends on governance.

The Governance Imperative

Data-driven approaches touch many areas of the organization. Key touch points are located where:

  • Data is acquired and managed
  • Insights are created and consumed
  • Decision-making is enabled
  • Resulting actions are carried out
  • Results are monitored using feedback

Governance at a broad level develops the policies and standards needed across all touch points to generate value. As a form of leadership, governance sets policies, defines objectives and assigns accountabilities across the business, analytics and data building blocks.

Business activity governance ensures that proactive management and employee teams respond to new sources of information and change their behaviors accordingly. Policies related to process standards, human skill development, compensation levels and incentives make up the scope of business activity governance.

Analytics governance ensures that all digital assets and activities that generate insights and information using analytics methods actually enable smarter business activities. Policies related to information relevance, security, visualization, data literacy, analytics model calibration and lifecycle management are key areas of focus.

Data governance is focused on the data building blocks. Effective data governance brings together diverse groups and departments to enable the data-driven capabilities needed to achieve success. Data governance defines accountabilities, policies and responsibilities needed to ensure that data sets are managed as true corporate assets.

This implies that governed data sets are identified, described, cataloged, secured and provisioned to support all appropriate analytics and information use cases required to enable the analytics methods. Data quality and integration are also within the scope of data governance.

Foundation for Success

Companies that are successful with data-driven approaches can rapidly identify and implement new ideas and analytics use cases. This helps them compete, innovate and generate new levels of value for their stakeholders on a sustainable basis.

Data governance provides the foundation for this success. Effective data governance ensures that data is managed as a true corporate asset. This means that it can be used and re-purposed on an on-going basis to support new and existing ideas generated by the organization as it matures and broadens its data-driven capabilities.

As organizations unlock more value by creating a wider analytics footprint, data governance provides the foundation necessary to support their journey.

The next post in this blog series dives deeper into data governance in terms of scope options, organization approaches, objectives, structures and processes. It provides perspectives on how a well-designed data governance program directly supports the desired data-driven approaches that ultimately drive key areas of business impact.

An Agile Data Governance Foundation for Building the Data-Driven Enterprise

The data-driven enterprise is the cornerstone of modern business, and good data governance is a key enabler.

In recent years, we’ve seen startups leverage data to catapult themselves ahead of legacy competitors. Companies such as Airbnb, Netflix and Uber have become household names. Although the service each offers differs vastly, all three identify as ‘technology’ organizations because data is integral to their operations.

As with any standard-setting revolution, businesses across the spectrum are now following these examples. But what these organizations need to understand is that simply deciding to be data-driven, or to “do Big Data,” isn’t enough.

As with any strategy or business model, it’s advisable to apply best practices to ensure the endeavor is worthwhile and that it operates as efficiently as possible. In fact, it’s especially important with data, as poorly governed data will lead to slower times to market and oversights in security. Additionally, poorly managed data fosters inaccurate analysis and poor decision-making, further hampering times to market due to inaccuracy in the planning stages, false starts and wasted cycles.

Essentially garbage in, garbage out – so it's important for businesses to get their foundations right. To build something, you need to know exactly what you're building and why in order to determine the best way to progress.

Data Governance 2.0 Is the Underlying Factor

Good data governance (DG) enables every relevant stakeholder – from executives to frontline employees – to discover, understand, govern and socialize data. Then the right people have access to the right data, so the right decisions are easier to make.

Traditionally, DG encompassed governance goals such as maintaining a business glossary of data terms, a data dictionary and catalog. It also enabled lineage mapping and policy authoring.

However, Data Governance 1.0 was siloed with IT left to handle it. Often there were gaps in context, the chain of accountability and the analysis itself.

Data Governance 2.0 remedies this by taking into account the fact that data now permeates all levels of a business. And it allows for greater collaboration.

It gives people interacting with data the required context to make good decisions, and documents the data’s journey, ensuring accountability and compliance with existing and upcoming data regulations.

But beyond the greater collaboration it fosters between people, it also allows for better collaboration between departments and integration with other technology.

By integrating data governance with data modeling (DM), enterprise architecture (EA) and business process (BP), organizations can break down inter-departmental and technical silos for greater visibility and control across domains.

By leveraging a common metadata repository and intuitive role-based and highly configurable user interfaces, organizations can guarantee everyone is singing off the same sheet of music.

Data Governance Enables Better Data Management

The collaborative nature of Data Governance 2.0 is a key enabler for strong data management. Without it, the differing data management initiatives can and often do pull in different directions.

These silos are usually born out of the use of disparate tools that don’t enable collaboration between the relevant roles responsible for the individual data management initiative. This stifles the potential of data analysis, something organizations can’t afford given today’s market conditions.

Businesses operating in highly competitive markets need every advantage: growth, innovation and differentiation. Organizations also need a complete data platform, as data's growing role in business and the rapid pace of technology advancement mean market landscapes are changing faster than ever before.

By integrating DM, EA and BP, organizations ensure all three initiatives are in sync. Then historically common issues born of siloed data management initiatives don’t arise.

A unified approach, with Data Governance 2.0 at its core, allows organizations to:

  • Enable data fluency and accountability across diverse stakeholders
  • Standardize and harmonize diverse data management platforms and technologies
  • Satisfy compliance and legislative requirements
  • Reduce risks associated with data-driven business transformation
  • Enable enterprise agility and efficiency in data usage.

Digital Trust: Enterprise Architecture and the Farm Analogy

With the General Data Protection Regulation (GDPR) taking effect soon, organizations can use it as a catalyst in developing digital trust.

Data breaches are increasing in scope and frequency, creating PR nightmares for the organizations affected. The more data breaches occur, the more the news coverage stays on consumers' minds.

The Equifax breach and subsequent stock price fall were well documented and should serve as a warning to businesses about how they manage their data. Large or small, organizations have lessons to learn when it comes to building and maintaining digital trust, especially with GDPR looming ever closer.

Previously, we discussed the importance of fostering a relationship of trust between business and consumer. Here, we focus more specifically on data keepers and the public.

Digital Trust and The Farm Analogy

Any approach to mitigating the risks associated with data management needs to consider the ‘three Vs’: variety, velocity and volume.

In describing best practices for handling data, let’s imagine data as an asset on a farm. The typical farm’s wide span makes constant surveillance impossible, similar in principle to data security.

With a farm, you can’t just put a fence around the perimeter and then leave it alone. The same is true of data because you need a security approach that makes dealing with volume and variety easier.

On a farm, that means separating crops and different types of animals. For data, segregation serves to stop those without permissions from accessing sensitive information.

And as with a farm and its seeds, livestock and other assets, data doesn't just come into the farm. You also must manage what goes out.

A farm has several gates allowing people, animals and equipment to pass through, pending approval. With data, gates need to make sure only the intended information filters out and that it is secure when doing so. Failure to correctly manage data transfer will leave your business in breach of GDPR and liable for a hefty fine.

Furthermore, when looking at the gates in which data enters and streams out of an organization, we must also consider the third ‘V’ – velocity, the amount of data an organization’s systems can process at any given time.

Of course, the velocity of data an organization can handle is most often tied to how efficiently a business operates. Effectively dealing with high velocities of data requires faster analysis and times to market.

However, it's arguably a matter of security too. Although not breaches per se, DDoS attacks are one vulnerability associated with data velocity.

DDoS attacks are designed to put the aforementioned data gates under pressure, ramping up the amount of data that passes through them at any one time. Organizations with the infrastructure to deal with such an attack, especially one capable of scaling to demand, will suffer less preventable downtime.

Enterprise Architecture and Harvesting the Farm

Making sure you can access, understand and use your data for strategic benefit – including fostering digital trust – comes down to effective data management and governance. And enterprise architecture is a great starting point because it provides a holistic view of an organization’s capabilities, applications and systems including how they all connect.

Enterprise architecture at the core of any data-driven business will serve to identify what parts of the farm need extra protections – those fences and gates mentioned earlier.

It also makes GDPR compliance and overall data governance easier, as the first step for both is knowing where all your data is.

For more data management best practices, click here. And you can subscribe to our blog posts here.

SQL, NoSQL or NewSQL: Evaluating Your Database Options

A common question in the modern data management space involves database technology: SQL, NoSQL or NewSQL?

But there isn’t a one-size-fits-all answer. What’s “right” must be evaluated on a case-by-case basis and is dependent on data maturity.

For example, a large bookstore chain with a big-data initiative would be stifled by a SQL database. The advantages that could be gained from analyzing social media data (for popular books, consumer buying habits) couldn’t be realized effectively through sequential analysis. There’s too much data involved in this approach, with too many threads to follow.

However, an independent bookstore isn’t necessarily bound to a big-data approach because it may not have a mature data strategy. It might not have ventured beyond digitizing customer records, and a SQL database is sufficient for that work.

Having said that, the “SQL, NoSQL or NewSQL” question is gaining prominence because businesses are becoming increasingly data-driven.

A Progress study found that 85% of enterprise decision-makers feel they have only two years to make significant inroads into digital transformation before they fall behind their competitors and suffer financially.

Considering this, what better time than now to evaluate your database technology? The "SQL, NoSQL or NewSQL" question is especially important if you intend to become more data-driven.

SQL, NoSQL or NewSQL: Advantages and Disadvantages

SQL

SQL databases are tried and tested, proven to work on disks using interfaces with which businesses are already familiar.

As the longest-standing type of database, plenty of SQL options are available. This competitive market means you’ll likely find what you’re looking for at affordable prices.

Additionally, businesses in the earlier stages of data maturity are more likely to have a SQL database at work already, meaning no new investments need to be made.

However, in the modern digital business context, SQL databases weren't made to support the three Vs of data. The volume is too high, the variety of sources is too vast, and the velocity (the speed at which the data must be processed) is too great to be analyzed in sequence.

Furthermore, the foundational, legacy IT world they were purpose-built to serve has evolved. Now, corporate IT departments must be agile, and their databases must be agile and scalable to match.

NoSQL

Despite its name, “NoSQL” doesn’t mean the complete absence of the SQL database approach. Rather, it works as more of a hybrid. The term is a contraction of “not only SQL.”

So, in addition to the advantage of continuity that staying with SQL offers, NoSQL enjoys many of the benefits of SQL databases.

The key difference is that NoSQL databases were developed with modern IT in mind. They are scalable, agile and purpose-built to deal with disparate, high-volume data.

Hence, data is typically more readily available, and it's easier to change it, store it or insert new data.

For example, MongoDB, one of the key players in the NoSQL world, uses JavaScript Object Notation (JSON). As the company explains, “A JSON database returns query results that can be easily parsed, with little or no transformation.” The open, human- and machine-readable standard facilitates data interchange and can store records, “just as tables and rows store records in a relational database.”

Generally, NoSQL databases are better equipped to deal with other non-relational data too. As well as JSON, NoSQL supports log messages, XML and unstructured documents. This support avoids the lethargic "schema-on-write" approach, opting for "schema-on-read" instead.
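To make the schema-on-write versus schema-on-read distinction more concrete, here is a minimal Python sketch. The bookstore customer record and its field names are purely illustrative, and no vendor-specific API is used: the relational table's structure must be declared before any rows arrive, while the document form can carry nested, optional attributes that are simply interpreted when the data is read.

```python
# Minimal sketch: the same bookstore customer as a relational row vs. a JSON document.
# All names and fields are hypothetical; the document shape mirrors what a JSON/document
# database could store, but no vendor-specific API is used here.
import json
import sqlite3

# Schema-on-write: the table must be defined up front, and every row must fit it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL, email TEXT)")
conn.execute("INSERT INTO customers (id, name, email) VALUES (?, ?, ?)",
             (1, "Ada Lovelace", "ada@example.com"))

# Schema-on-read: each record is a self-describing document, so irregular or nested
# attributes (social handles, recent views) can appear without altering any schema.
customer_doc = {
    "id": 1,
    "name": "Ada Lovelace",
    "email": "ada@example.com",
    "social": {"twitter": "@ada"},         # nested, optional attribute
    "recent_views": ["978-0132350884"],    # variable-length list
}
print(json.dumps(customer_doc, indent=2))  # easily parsed, with little or no transformation
```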

NewSQL

NewSQL refers to databases based on the relational (SQL) database and SQL query language. In an attempt to solve some of the problems of SQL, the likes of VoltDB and others take a best-of-both-worlds approach, marrying the familiarity of SQL with the scalability and agile enablement of NoSQL.

However, as with most seemingly win-win opportunities, NewSQL isn't without its caveats. These vary from vendor to vendor, but in essence, you have to sacrifice either familiarity or scalability.

If you’d like to speak with someone at erwin about SQL, NoSQL or NewSQL in more detail, click here.

For more industry advice, subscribe to the erwin Expert Blog.

Data Modeling in a Jargon-filled World – Internet of Things (IoT)

In the first post of this blog series, we focused on jargon related to the “volume” aspect of Big Data and its impact on data modeling and data-driven organizations. In this post, we’ll focus on “velocity,” the second of Big Data’s “three Vs.”

In particular, we’re going to explore the Internet of Things (IoT), the constellation of web-connected devices, vehicles, buildings and related sensors and software. It’s a great time for this discussion too, as IoT devices are proliferating at a dizzying pace in both number and variety.

Though IoT devices typically generate small “chunks” of data, they often do so at a rapid pace, hence the term “velocity.” Some of these devices generate data from multiple sensors for each time increment. For example, we recently worked with a utility that embedded sensors in each transformer in its electric network and then generated readings every 4 seconds for voltage, oil pressure and ambient temperature, among others.

While the transformer example is just one of many, we can quickly see two key issues that arise when IoT devices are generating data at high velocity. First, organizations need to be able to process this data at high speed. Second, organizations need a strategy to manage and integrate this never-ending data stream. Even small chunks of data will accumulate into large volumes if they arrive fast enough, which is why it's so important for businesses to have a strong data management platform.
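To put that accumulation in perspective, here is a quick back-of-the-envelope sketch in Python. The three measures every 4 seconds follow the transformer example above, while the transformer count is purely hypothetical.

```python
# Rough estimate of how quickly "small" 4-second readings pile up (illustrative numbers).
measures_per_device = 3          # voltage, oil pressure, ambient temperature
seconds_between_readings = 4
devices = 10_000                 # hypothetical transformer count for the network

per_device_per_day = measures_per_device * (24 * 60 * 60 // seconds_between_readings)
print(per_device_per_day)             # 64,800 readings per transformer per day
print(per_device_per_day * devices)   # 648,000,000 readings per day across the network
```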

It’s worth noting that the idea of managing readings from network-connected devices is not new. In industries like utilities, petroleum and manufacturing, organizations have used SCADA systems for years, both to receive data from instrumented devices to help control processes and to provide graphical representations and some limited reporting.

More recently, many utilities have introduced smart meters in their electricity, gas and/or water networks to make the collection of meter data easier and more efficient for a utility company, as well as to make the information more readily available to customers and other stakeholders.

For example, you may have seen an energy usage dashboard provided by your local electric utility, allowing customers to view graphs depicting their electricity consumption by month, day or hour, enabling each customer to make informed decisions about overall energy use.

Seems simple and useful, but have you stopped to think about the volume of data underlying this feature? Even if your utility only presents information on an hourly basis, if you consider that it’s helpful to see trends over time and you assume that a utility with 1.5 million customers decides to keep these individual hourly readings for 13 months for each customer, then we’re already talking about over 14 billion individual readings for this simple example (1.5 million customers x 13 months x over 30 days/month x 24 hours/day).
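The arithmetic in that parenthetical is easy to verify; the short sketch below uses the same rough 30-days-per-month assumption as the text.

```python
# Checking the energy-dashboard example: hourly readings retained for 13 months.
customers = 1_500_000
readings = customers * 13 * 30 * 24   # customers x months x ~30 days/month x 24 hours/day
print(f"{readings:,}")                # 14,040,000,000 -- over 14 billion readings
```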

Now consider the earlier example I mentioned of each transformer in an electrical grid with sensors generating multiple readings every 4 seconds. You can get a sense of the cumulative volume impact of even very small chunks of data arriving at high speed.

With experts estimating the IoT will consist of almost 50 billion devices by 2020, businesses across every industry must prepare to deal with IoT data.

But I have good news because IoT data is generally very simple and easy to model. Each connected device typically sends one or more data streams with each having a value for the type of reading and the time at which it occurred. Historically, large volumes of simple sensor data like this were best stored in time-series databases like the very popular PI System from OSIsoft.

While this continues to be true for many applications, alternative architectures, such as storing the raw sensor readings in a data lake, are also being successfully implemented, though organizations need to carefully consider the pros and cons of home-grown infrastructure versus time-tested industrial-grade solutions like the PI System.

Regardless of how raw IoT data is stored once captured, the real value of IoT for most organizations is only realized when IoT data is “contextualized,” meaning it is modeled in the context of the broader organization.

The value of modeled data eclipses that of “edge analytics” (where the value is inspected by a software program while inflight from the sensor, typically to see if it falls within an expected range, and either acted upon if required or allowed simply to pass through) or simple reporting like that in the energy usage dashboard example.

It is straightforward to represent a reading of a particular type from a particular sensor or device in a data model or process model. It starts to get interesting when we take it to the next step and incorporate entities into the data model to represent expected ranges – both for readings under various conditions and representations of how the devices relate to one another.
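As a rough illustration of that modeling step, the Python sketch below (entity names, relationships and threshold values are all hypothetical) represents individual readings, modeled expected ranges and device relationships, and flags a reading that falls outside its range.

```python
# Illustrative only: contextualizing raw IoT readings with modeled entities.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Reading:            # one value from one sensor at one point in time
    device_id: str
    measure: str          # e.g. "voltage", "oil_pressure", "ambient_temp"
    value: float
    at: datetime

@dataclass
class ExpectedRange:      # modeled entity: acceptable band for a measure
    measure: str
    low: float
    high: float

@dataclass
class Device:             # modeled entity: a device and how it relates to others
    device_id: str
    feeds: list[str]      # downstream device ids, useful for finding alternate paths

def out_of_range(reading: Reading, ranges: dict[str, ExpectedRange]) -> bool:
    """Flag a reading that falls outside its modeled expected range."""
    r = ranges.get(reading.measure)
    return r is not None and not (r.low <= reading.value <= r.high)

ranges = {"voltage": ExpectedRange("voltage", low=11_000.0, high=13_000.0)}
sample = Reading("transformer-042", "voltage", 13_450.0, datetime.now(timezone.utc))
print(out_of_range(sample, ranges))   # True -- a developing problem worth a closer look
```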

If the utility in the transformer example has modeled that IoT data well, it might be able to prevent a developing problem with a transformer and also possibly identify alternate electricity paths to isolate the problem before it has an impact on network stability and customer service.

Hopefully this overview of IoT in the utility industry helps you see how your organization can incorporate high-velocity IoT data to become more data-driven and therefore more successful in achieving larger corporate objectives.

Subscribe and join us next time for Data Modeling in a Jargon-filled World – NoSQL/NewSQL.
