Categories
Data Intelligence erwin Expert Blog

From Chaos to Control with Data Intelligence

As the amount of data grows exponentially, organizations turn to data intelligence to reach deeper conclusions about driving revenue, achieving regulatory compliance and accomplishing other strategic objectives.

It’s no secret that data has grown in volume, variety and velocity, with 2.5 quintillion bytes generated every day and 90 percent of the world’s data volume created just in the last two years. This data explosion has overwhelmed most organizations, making it nearly impossible for them to manage their data, much less put it to smart, strategic use. How do you identify the time-sensitive, relevant insights that could mean the difference between the life and death of your business?

Data Intelligence

Time sensitivity in data management and analytics is a massive issue. Data needs to fuel rapid decisions that make your organization more effective, customer-centric and competitive. That was true before COVID-19, and it’s even more important in the face of the radical disruption it has caused. The answer is radical transformation, made possible by an intelligent, data-driven approach to:

  • New business models
  • New products and services
  • Hyper-competition
  • Market expansion

One Customer’s Journey to Controlling Data Chaos

Ultra Mobile recently shared how it uses erwin Data Intelligence (erwin DI) as part of a modern, ongoing approach to data governance and therefore control versus chaos.

Managing risk while also growing and competing effectively requires the ability to deal with both planned and unplanned change.

With erwin DI as its data governance platform, Ultra Mobile has a one-stop shop to see all of its data – and data changes – in one place thanks to forward and reverse lineage. Now questions about the health of the business can be answered, including how best to retain customers, and the company can explore ways to grow its subscriber base.

Being able to integrate all data touchpoints, including erwin DM for data modeling, Denodo for data visualization, and Jira for ticketing, has been key. This metadata is ingested into the data catalog, definitions are added within a business glossary, and the searchable repository enables users to understand how data is used and stored.
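
To make the idea concrete, here is a minimal sketch of what a searchable catalog entry enriched with a business-glossary definition might look like. It illustrates the concept only, not erwin DI’s implementation, and the asset names, sources and definitions are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """One harvested data asset plus its business-glossary context."""
    name: str                      # technical asset name, e.g. a table or view
    source: str                    # system the metadata was harvested from
    glossary_definition: str = ""  # business meaning added by a data steward
    tags: list = field(default_factory=list)

# Hypothetical entries standing in for metadata harvested from modeling,
# virtualization and ticketing touchpoints.
catalog = [
    CatalogEntry("subscriber_activations", "warehouse", "New lines activated per day", ["retention"]),
    CatalogEntry("churn_events", "warehouse", "Subscribers who cancelled service", ["retention", "pii"]),
]

def search(term: str):
    """Searchable repository: match on name, definition or tags."""
    term = term.lower()
    return [e for e in catalog
            if term in e.name.lower()
            or term in e.glossary_definition.lower()
            or any(term in t for t in e.tags)]

print([e.name for e in search("retention")])
```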

erwin DI’s mind map has also proved helpful for seeing associations and entity relationships, especially for impact analysis when evaluating planned changes and their downstream effects.

Watch the full webinar.

Data Intelligence Just Got Smarter

erwin just released a new version of erwin DI. The enhancements include improvements to the user interface (UI), plus new artificial intelligence (AI) and self-service data discovery capabilities.

 

The new erwin DI makes it easier for organizations to tailor the solution to meet the unique needs of their data governance frameworks, identify and socialize the most valuable data assets, and expand metadata scanning and sensitive data tracking.

Using erwin DI, customers are powering comprehensive data governance initiatives, cloud migration and other massive digital transformation projects.

It facilitates both IT- and business-friendly discovery, navigation and understanding of data assets within context and in line with governance protocols.

And it provides organizations with even more flexibility to ensure the software fits their unique frameworks and workflows because one size does not fit all when it comes to data governance.

Backed by a flexible metamodel and deep metadata-driven automation, the updated erwin DI uniquely addresses both IT and business data governance needs to safeguard against risks and harness opportunities.

It combines and then raises the visibility of business and physical data assets in a framework that is flexible but always in sync and therefore sustainable. Then stakeholders from across the enterprise can discover, manage and collaborate on the most relevant and valuable data assets.

The latest erwin DI release builds on prior 2020 updates with:

  • New role-based and governance assignment capabilities, making it easier for an organization to tailor erwin DI to its data governance needs and framework
  • Enhanced UI, workflow and search to speed navigation, asset discovery, contextual understanding and data governance management
  • Expanded AI capabilities to enrich metadata scanning and speed the handling of sensitive data for automated GDPR and CCPA compliance programs
  • Greater visibility into business and data lineage through new vantage points, filters and drilldowns
  • Improved socialization and collaboration features to increase business user engagement and capitalize on organizational data quality knowledge
  • More administrative tools to efficiently onboard new users and roles, manage access rights, and address audit requests

Additionally, erwin DI was recently evaluated by Gartner for the 2020 “Metadata Management Solutions Magic Quadrant,” which named erwin as a “Leader” for the second consecutive year. Click here to download a copy of the Gartner Magic Quadrant Report.

You also can request a free demo of erwin DI here.

Gartner Magic Quadrant

Categories
erwin Expert Blog Data Intelligence

Surviving Radical Disruption with Data Intelligence

It’s certainly no secret that data has been growing in volume, variety and velocity, and most companies are overwhelmed by managing it, let alone harnessing it to put it to work.

We’re now generating 2.5 quintillion bytes of data every day, and 90% of the world’s data volume has been created in the past two years alone. With this absolute data explosion, it’s nearly impossible to filter out the time-sensitive data, the information that has immediate relevance and impact on your business.

And this time sensitivity is a massive issue, as taking a proactive and data-driven approach can literally mean life or death to your business or to your customers. And that’s where data analytics can play a huge role.

By leveraging the power of the cloud, harnessing data from the Internet of Things (IoT) and other events, and processing this data in near-real time, analytics helps to effectively process the relentless incoming data feed.

Without automation and the development of a governed data pipeline, you’ll never have enough data scientists in the front office to put the data to work. The benefits of fast time to insights are clear, regardless of the industry you’re in.

Think about these examples: a communications agency that needs to get out in front of a difficult message, a retailer driving sales based on real-time customer behavior, a logistics and delivery company needing to understand road conditions, stoppages and up-to-the-minute weather, or a hospital that needs to tailor patient care based on the latest public health findings.

Your data needs to fuel rapid decisions that make your organization more effective, customer-centric and competitive. This was true before the world changed.

COVID-19 Changed Everything

COVID changed everything. It’s a radical disruptor the likes of which we’ve never seen.

As a CEO, a husband and a father, I’ve made decisions during the past seven months that I never dreamed possible, and I’m sure this is true for you and your family – and business – as well.

Surviving and thriving in the face of radical disruption now requires radical transformation and new business models: reimagining the business, like moving fitness centers outdoors; developing new products and services, such as restaurants packaging fruits and vegetables to sell as food bundles; or expanding into new markets, like traditional grocers becoming online shopping hubs.

The companies that will come out of this historic period of global uncertainty and change ahead are those that have taken intelligent, data-driven approaches to their businesses.

What holds most companies back from faster time to insights and leveraging radical transformation? I think those answers can be found by asking these core questions:

  1. What data do I have?
  2. Where is the data?
  3. What people and systems are using that data and for what purposes?
  4. What governance processes should be applied to that data?
  5. How is this data relevant and accessible to the business?  

Data Intelligence Provides an EDGE

There’s a common denominator in what those companies are missing, and that is data intelligence.

IDC defines data intelligence as business, technical, relational, and operational metadata that provides transparency of data profiles, classification, quality, location, context, and lineage, providing people, processes, and technology with trustworthy, reliable data.

In a new IDC Solution Brief, “The Value of Robust Data Intelligence to Enable Data Governance with erwin,” its authors state:

Data is the lifeblood of the digital economy — it is what is driving new business models, better customer experiences, better decision-making, and artificially intelligent automation. The global pandemic in 2020 has accelerated digital transformation and amplified the value of data in what will become the next normal as the global economy struggles through recovery. In a world where market conditions, supply chains, work locations, and communication methods are constantly changing, data is a constant source that can be used to inform decisions from crisis to recovery. To use data effectively, it needs to be trusted, understood, and used appropriately, and herein lies many problems that organizations face in the digital economy.

The IDC authors also interviewed erwin customers who described the erwin Data Intelligence Suite, part of the erwin EDGE platform, as a fundamental component of their efforts to generate more value from data while minimizing data-related risk.

The erwin EDGE helps organizations unlock their potential by maximizing the security, quality and value of their data assets, and it operationalizes these steps by connecting enterprise architecture, business process and data modeling with data intelligence software.

The result is an automated, real-time, high-quality data pipeline from which accurate insights can be derived.

The erwin EDGE enables organizations to see how data flows through and impacts all their business, technology and data architectures. Then all stakeholders within a company, those in IT as well as the larger enterprise, can collaborate to make better decisions based upon data truth, not just gut instinct.

Parts of this blog are excerpted from my keynote on day No. 1 of erwin Insights 2020, our virtual conference on enterprise modeling and data governance/intelligence.

You can view the entire keynote and all other sessions of the conference by registering here.

erwin Insights 2020


Categories
Data Intelligence Data Governance erwin Expert Blog

Doing Cloud Migration and Data Governance Right the First Time

More and more companies are looking at cloud migration.

Migrating legacy data to public, private or hybrid clouds provides creative and sustainable ways for organizations to increase their speed to insights for digital transformation, modernize and scale their processing and storage capabilities, better manage and reduce costs, encourage remote collaboration, and enhance security, support and disaster recovery.

But let’s be honest – no one likes to move. So if you’re going to move your data from on-premises legacy data stores and warehouse systems to the cloud, you should do it right the first time. And as you make this transition, you need to understand what data you have, know where it is located, and govern it along the way.

cloud migration

Automated Cloud Migration

Historically, moving legacy data to the cloud hasn’t been easy or fast.

As organizations look to migrate their data from legacy on-prem systems to cloud platforms, they want to do so quickly and precisely while ensuring the quality and overall governance of that data.

The first step in this process is converting the physical table structures themselves. Then you must bulk load the legacy data. No less daunting, your next step is to re-point or even re-platform your data movement processes.
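
As a toy illustration of the first step only, the sketch below rewrites a few legacy column types into cloud-friendly equivalents. The type mappings and table definition are hypothetical stand-ins; bulk loading the data and re-pointing the movement processes would follow as separate, scripted steps.

```python
import re

# Hypothetical mapping from legacy RDBMS types to a cloud warehouse dialect.
TYPE_MAP = {
    r"\bVARCHAR2\((\d+)\)": r"VARCHAR(\1)",
    r"\bNUMBER\((\d+),(\d+)\)": r"NUMERIC(\1,\2)",
    r"\bCLOB\b": "VARCHAR",
    r"\bDATE\b": "TIMESTAMP",
}

def convert_ddl(legacy_ddl: str) -> str:
    """Step 1: convert the physical table structure to the target dialect."""
    converted = legacy_ddl
    for pattern, replacement in TYPE_MAP.items():
        converted = re.sub(pattern, replacement, converted, flags=re.IGNORECASE)
    return converted

legacy = """
CREATE TABLE customer (
    id        NUMBER(10,0),
    name      VARCHAR2(100),
    notes     CLOB,
    created   DATE
);
"""
print(convert_ddl(legacy))
# Steps 2 and 3 -- bulk loading the legacy data and re-pointing the data
# movement processes -- would be handled separately against the new target.
```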

Without automation, this is a time-consuming and expensive undertaking. And you can’t risk false starts or delayed ROI that reduce the confidence of the business and taint this transformational initiative.

By using automated and repeatable capabilities, you can quickly and safely migrate data to the cloud and govern it along the way.

But transforming and migrating enterprise data to the cloud is only half the story – once there, it needs to be governed for completeness and compliance. That means your cloud data assets must be available for use by the right people for the right purposes to maximize their security, quality and value.

Why You Need Cloud Data Governance

Companies everywhere are building innovative business applications to support their customers, partners and employees and are increasingly migrating from legacy to cloud environments. But even with the “need for speed” to market, new applications must be modeled and documented for compliance, transparency and stakeholder literacy.

The desire to modernize technology, over time, leads to acquiring many different systems with various data entry points and transformation rules for data as it moves into and across the organization.

These tools range from enterprise service bus (ESB) products and data integration tools to extract, transform and load (ETL) tools, procedural code, application programming interfaces (APIs), file transfer protocol (FTP) processes, and even business intelligence (BI) reports that further aggregate and transform data.

With all these diverse metadata sources, it is difficult to understand the complicated web they form, much less get a simple visual flow of data lineage and impact analysis.

Regulatory compliance is also a major driver of data governance (e.g., GDPR, CCPA, HIPAA, SOX, PCI DSS). While progress has been made, enterprises are still grappling with the challenges of deploying comprehensive and sustainable data governance, including reliance on mostly manual processes for data mapping, data cataloging and data lineage.

Introducing erwin Cloud Catalyst

erwin just announced the release of erwin Cloud Catalyst, a suite of automated cloud migration and data governance software and services. It helps organizations quickly and precisely migrate their data from legacy, on-premise databases to the cloud and then govern those data assets throughout their lifecycle.

Only erwin provides software and services that automate the complete cloud migration and data governance lifecycle – from reverse-engineering and transforming legacy systems and ETL/ELT code to bulk data movement, cataloging and auto-generated lineage. The metadata-driven suite automatically finds, models, ingests, catalogs and governs cloud data assets.

erwin Cloud Catalyst comprises erwin Data Modeler (erwin DM), erwin Data Intelligence (erwin DI) and erwin Smart Data Connectors, which work together to simplify and accelerate cloud migration by removing barriers, reducing risks and decreasing time to value for your investments in modern systems such as Snowflake, Microsoft Azure and Google Cloud.

We start with an assessment of your cloud migration strategy to determine what automation and optimization opportunities exist. Then we deliver an automation roadmap and design the appropriate smart data connectors to help your IT services team achieve your future-state cloud architecture, including accelerating data ingestion and ETL conversion.

Once your data reaches the cloud, you’ll have deep and detailed metadata management with full data governance, data lineage and impact analysis. With erwin Cloud Catalyst, you automate these data governance steps:

  • Harvest and catalog cloud data: erwin DM and erwin DI’s Metadata Manager natively scans RDBMS sources to catalog/document data assets.
  • Model cloud data structures: erwin DM converts, modifies and models the new cloud data structures.
  • Map data movement: erwin DI’s Mapping Manager defines data movement and transformation requirements via drag-and-drop functionality.
  • Generate source code: erwin DI’s automation framework generates data migration source code for any ETL/ELT SDK.
  • Test migrated data: erwin DI’s automation framework generates test cases and validation source code to test migrated data.
  • Govern cloud data: erwin DI gives cloud data assets business context and meaning through the Business Glossary Manager, as well as policies and rules for use.
  • Distribute cloud data: erwin DI’s Business User Portal provides self-service access to cloud data asset discovery and reporting tools.

Request an erwin Cloud Catalyst assessment.

And don’t forget to register for erwin Insights 2020 on October 13-14, with sessions on Snowflake, Microsoft and data lake initiatives powered by erwin Cloud Catalyst.

erwin Data Intelligence


Categories
Data Intelligence erwin Expert Blog

Top 6 Benefits of Automating End-to-End Data Lineage

Replace manual and recurring tasks for fast, reliable data lineage and overall data governance

Benefits of Data Lineage

It’s paramount that organizations understand the benefits of automating end-to-end data lineage. Critically, it makes it easier to get a clear view of how information is created and flows into, across and outside an enterprise.

The importance of end-to-end data lineage is widely understood and ignoring it is risky business. But it’s also important to understand why and how automation plays a critical role.

Benjamin Franklin said, “Lost time is never found again.” According to erwin’s “2020 State of Data Governance and Automation” report, close to 70 percent of data professional respondents say they spend an average of 10 or more hours per week on data-related activities, and most of that time is spent searching for and preparing data.

Data automation reduces the loss of time in collecting, processing and storing large chunks of data because it replaces manual processes (and human errors) with intelligent processes, software and artificial intelligence (AI).

Automating end-to-end data lineage helps organizations further focus their available resources on more important and strategic tasks, which ultimately provides greater value.

For example, automatically importing mappings from developers’ Excel sheets, flat files, Access and ETL tools into a comprehensive mappings inventory, complete with auto-generated and meaningful documentation of the mappings, is a powerful way to support overall data governance.
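
A stripped-down sketch of that idea, using a plain CSV in place of a spreadsheet and invented column names, might read source-to-target mappings and emit a documented inventory like this:

```python
import csv
import io

# Hypothetical developer mapping sheet: source column -> target column + rule.
mapping_sheet = """source_table,source_column,target_table,target_column,rule
crm.customer,cust_nm,dw.dim_customer,customer_name,trim and title-case
crm.customer,cust_dob,dw.dim_customer,birth_date,cast to DATE
"""

inventory = []
for row in csv.DictReader(io.StringIO(mapping_sheet)):
    # Each mapping becomes an inventory record with generated documentation.
    row["doc"] = (f"{row['source_table']}.{row['source_column']} feeds "
                  f"{row['target_table']}.{row['target_column']} ({row['rule']})")
    inventory.append(row)

for record in inventory:
    print(record["doc"])
```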

According to the erwin report, documenting complete data lineage is currently the data operation with the largest percentage spread between its current level of automation (25%) and being seen as the most valuable operation to automate (65%).

Doing Data Lineage Right

Eliminating manual tasks is not the only reason to adopt automated data lineage. Replacing recurring tasks that don’t rely on human intelligence for completion is where automation makes an even bigger difference. Here are six benefits of automating end-to-end data lineage:

  1. Reduced Errors and Operational Costs

Data quality is crucial to every organization. Automated data capture can significantly reduce errors when compared to manual entry. Company documents can be filled out, stored, retrieved, and used more accurately and this, in turn, can save organizations a significant amount of money.

The 1-10-100 rule, commonly used in business circles, states that preventing an error will cost an organization $1, correcting an error already made will cost $10, and allowing an error to stand will cost $100.

Ratios will vary depending on the magnitude of the mistake and the company involved, of course, but the point remains that adopting the most reliable means of preventing a mistake is the best approach to take in the long run.
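
As a hypothetical worked example of the rule, assume 500 data errors in a reporting period; the dollar figures below are simply the rule’s ratios, not benchmarks:

```python
errors = 500  # hypothetical number of data errors in a period

prevention = errors * 1    # caught before entry:  $1 each
correction = errors * 10   # fixed after the fact: $10 each
failure    = errors * 100  # left unaddressed:     $100 each

print(f"Prevent: ${prevention:,}  Correct: ${correction:,}  Ignore: ${failure:,}")
# Prevent: $500  Correct: $5,000  Ignore: $50,000
```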

  2. Faster Business Turnaround

Speed, or faster time to market, is a driving force behind most organizations’ efforts with data lineage automation. More work can be done when you are not waiting on someone to manually process data or forms.

For example, when everything can be scanned using RFID technology, it can be documented and confirmed instantaneously, cutting hours of work down to seconds.

This opens opportunities for employees to train for more profitable roles, allowing organizations to reinvest in their employees. With complex data architectures and systems within so many organizations, tracking data in motion and data at rest is daunting to say the least.

Harvesting the data through automation seamlessly removes ambiguity and speeds up processing and time to market.

  3. Compliance and Auditability

Regulatory compliance places greater transparency demands on firms when it comes to tracing and auditing data.

For example, capital markets trading firms must implement data lineage to support risk management, data governance and reporting for various regulations such as the Basel Committee on Banking Supervision’s standard number 239 (BCBS 239) and Markets in Financial Instruments Directive (MiFID II).

Business terms and data policies should be implemented through standardized and documented business rules. Compliance with these business rules can be tracked through data lineage, incorporating auditability and validation controls across data transformations and pipelines to generate alerts when there are non-compliant data instances.
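
A minimal sketch of that idea, with an invented business rule and records, might look like the following; in practice such controls run inside the governed data pipeline rather than over an in-memory list:

```python
# Hypothetical business rule: every trade record must carry an LEI identifier
# and a trade date, so downstream regulatory reports can be trusted.
RULES = {
    "lei_present": lambda rec: bool(rec.get("lei")),
    "trade_date_present": lambda rec: bool(rec.get("trade_date")),
}

records = [
    {"trade_id": "T1", "lei": "5493001KJTIIGC8Y1R12", "trade_date": "2020-10-01"},
    {"trade_id": "T2", "lei": "", "trade_date": "2020-10-01"},
]

alerts = []
for rec in records:
    for rule_name, check in RULES.items():
        if not check(rec):
            # Non-compliant data instance: raise an alert with full traceability.
            alerts.append(f"ALERT: {rec['trade_id']} failed rule '{rule_name}'")

print("\n".join(alerts) or "All records compliant")
```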

Also, different organizational stakeholders (customers, employees and auditors) need to understand and trust reported data. Automated data lineage ensures captured data is accurate and consistent across its trajectory.

  4. Consistency, Clarity and Greater Efficiency

Data lineage automation can help improve efficiency and ensure accuracy. The more streamlined your processes, the more efficient your business. The more efficient your business, the more money you save on daily operations.

For example, backing up your data effectively and routinely is important. Data is one of the most important assets for any business.

However, different types of data need to be treated differently. Some data needs to be backed up daily while some types of data demand weekly or monthly backups.

With automation in place, you just need to develop backup strategies for your data with a consistent scheduling process. The actual job of backing things up will be managed by the system processes you set up for consistency and clarity.
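
For instance, a bare-bones, config-driven version of that scheduling idea, with made-up data sets and frequencies, could decide which backups are due on a given day:

```python
from datetime import date

# Hypothetical backup strategy: each data set gets a consistent frequency.
BACKUP_PLAN = {
    "orders": "daily",
    "customer_master": "weekly",     # runs on Mondays
    "archived_invoices": "monthly",  # runs on the 1st of the month
}

def backups_due(today: date):
    """Return the data sets whose backup should run today."""
    due = []
    for dataset, freq in BACKUP_PLAN.items():
        if (freq == "daily"
                or (freq == "weekly" and today.weekday() == 0)
                or (freq == "monthly" and today.day == 1)):
            due.append(dataset)
    return due

print(backups_due(date.today()))
```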

  5. Improved Customer and Employee Satisfaction

Employee disengagement is a more severe problem than you might think. A recent study has shown that it costs U.S. businesses around $300 billion annually, nearly equal to the U.S. defense budget. Disengaged employees still give you their time, but not their best effort.

With data lineage automation, employers can automate such repetitive tasks and free up time for high-value work. According to a Smartsheet report, 69% of employees thought automation would reduce time wasted during their workday, and 59% thought they would have more than six spare hours per week if repetitive jobs were automated.

  6. Governance Enforcement

Data lineage automation is a great way to implement governance in any business. Any task that an automated process completes is always documented and has traceability.

For every task, you get clear logs that tell you what was done, who did it and when it was done. As stated before, automation plays a major role in reducing human errors and speeds up tasks that need to be performed repeatedly.

If you have not made the jump to digital yet, you are probably wading through high volumes of resources and manual processes daily. There is no denying the fact that automating business processes contributes immensely to an organization’s success. 

Automated Data Lineage in Action

Automated data lineage tools document the flow of data into and out of an organization’s systems. They capture end-to-end lineage and ensure proper impact analysis can be performed in the event of problems or changes to data assets as they move across pipelines.

erwin Data Intelligence (erwin DI) helps bind business terms to technical data assets with a complete data lineage of scanned metadata assets. Automating data capture frees up resources to focus on more strategic and useful tasks.

It automatically generates end-to-end data lineage, down to the column level and between repositories. You can view data flows from source systems to the reporting layers, including intermediate transformation and business logic.
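
To illustrate what column-level lineage enables (this is a conceptual sketch, not erwin DI’s internals), lineage can be stored as edges between columns and walked downstream to answer an impact-analysis question; the column names here are invented:

```python
from collections import defaultdict

# Hypothetical column-level lineage edges: source column -> derived columns.
lineage = defaultdict(list)
edges = [
    ("crm.customer.cust_nm", "staging.customer.name"),
    ("staging.customer.name", "dw.dim_customer.customer_name"),
    ("dw.dim_customer.customer_name", "report.churn_kpi.customer_name"),
]
for src, dst in edges:
    lineage[src].append(dst)

def downstream(column, graph):
    """Everything affected if this column changes (forward lineage)."""
    impacted, stack = set(), [column]
    while stack:
        for nxt in graph.get(stack.pop(), []):
            if nxt not in impacted:
                impacted.add(nxt)
                stack.append(nxt)
    return impacted

print(downstream("crm.customer.cust_nm", lineage))
```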

Request your own demo of erwin DI to see metadata-driven, automated data lineage in action.

erwin Data Intelligence

Categories
Data Intelligence erwin Expert Blog

Why You Need End-to-End Data Lineage

Not Documenting End-to-End Data Lineage Is Risky Business – Understanding your data’s origins is key to successful data governance.

Not everyone understands what end-to-end data lineage is or why it is important. In a previous blog, I explained that data lineage is basically the history of data, including a data set’s origin, characteristics, quality and movement over time.

This information is critical to regulatory compliance, change management and data governance, not to mention delivering an optimal customer experience. But given the volume, velocity and variety of data (the three Vs of data) we generate today, producing and keeping up with end-to-end data lineage is complex and time-consuming.

Yet given this era of digital transformation and fierce competition, understanding what data you have, where it came from, how it’s changed since creation or acquisition, and whether it poses any risks is paramount to optimizing its value. Furthermore, faulty decision-making based on inconsistent analytics and inaccurate reporting can cost millions.

Data Lineage

Data Lineage Tells an Important Origin Story

End-to-end data lineage explains how information flows into, across and outside an organization. And knowing how information was created, its origin and quality may have greater value than a given data set’s current state.

For example, data lineage provides a way to determine which downstream applications and processes are affected by a change in data expectations and helps in planning for application updates.

As I mentioned above, the three Vs of data and the integration of systems make it difficult to understand the resulting data web, much less capture a simple visual of that flow. Yet a consistent view of data and how it flows is paramount to the success of enterprise data governance and any data-driven initiative.

Whether you need to drill down for a granular view of a particular data set or create a high-level summary to describe a particular system and the data it relies on, end-to-end data lineage must be documented and tracked, with an emphasis on the dynamics of data processing and movement as opposed to data structures. Data lineage helps answer questions about the origin of data in key performance indicator (KPI) reports, including:

  • How are the report tables and columns defined in the metadata?
  • Who are the data owners?
  • What are the transformation rules?

Five Consequences of Ignoring Data Lineage

Why do so many organizations struggle with end-to-end data lineage?

The struggle is real for a number of reasons. At the top of the list, organizations are dealing with more data than ever before using systems that weren’t designed to communicate effectively with one another.

Next, their IT and business stakeholders have a difficult time collaborating. And a lot of organizations have relied mostly on manual processes – if data lineage documentation has been attempted at all.

The risks of ignoring end-to-end data lineage are just too great. Let’s look at some of those consequences:

  1. Derailed Projects

Effectively managing business operations is a key factor in success, especially for organizations in the midst of digital transformation. Failures in business processes attributed to errors can be a big problem.

For example, in a typical business scenario where an incorrect data set is discovered within a report, it can take a team days or sometimes weeks to find the source of the error – derailing the project and costing time and money.

  2. Policy Bloat and Unruly Rules

The business glossary environment must reflect the actual environment, i.e., it must be refreshed and synced, otherwise it becomes obsolete. You need real collaboration.

Data dictionaries, glossaries and policies can’t live in different formats and in different places. It is common for these to be expressed in different ways, depending on the database and underlying storage technology, but this causes policy bloat and rules that no organization, team or employee will understand, let alone realistically manage.

Effective data governance requires that business glossaries, data dictionaries and data privacy policies live in one central location, so they can be easily tracked, monitored and updated over time.

  3. Major Inefficiencies

Successful data migration and upgrades rely on seamless integration of tools and processes with coordinated efforts of people/resources. A passive approach frequently relies on creating new copies of data, usually with sensitive identifiers removed or obscured.

Not only does this passive approach create inefficiencies in determining what data to copy, how to copy it, and where to store the copy, it also creates new volumes of data that become harder to track over time. Put simply, a passive approach to data cannot scale; direct access to the same live data across the organization is required.

  4. Not Knowing Where Your Data Is

Metadata management and manual mapping are a challenge to most organizations. Data comes in all shapes, sizes and formats, and there is no way to know what type of data a project will need – or even where that data will sit.

Some data might be in the cloud, some on premise, and sometimes projects will require a hybrid approach. All data must be governed, regardless of where it is located.

  5. Privacy and Compliance Challenges

Privacy and compliance personnel know the rules that must be applied to data but may not necessarily know the technology. That’s why automated data governance requires that anyone, at any level of expertise, can understand which rules (e.g., privacy policies) are applied to enterprise data.

Organizations with established data governance must empower both those with technical skill sets and those with privacy and compliance knowledge, so all teams can play a meaningful role controlling how data is used.

For more information on data lineage, get the free white paper, Tech Brief: Data Lineage.

End-to-End Data Lineage

 

Categories
Data Intelligence Enterprise Architecture Data Governance erwin Expert Blog

Integrating Data Governance and Enterprise Architecture

Aligning these practices for regulatory compliance and other benefits

Why should you integrate data governance (DG) and enterprise architecture (EA)? It’s time to think about EA beyond IT.

Two of the biggest challenges in creating a successful enterprise architecture initiative are collecting accurate information on application ecosystems and maintaining that information as application ecosystems change.

Data governance provides time-sensitive, current-state architecture information with a high level of quality. It documents your data assets from end to end for business understanding and clear data lineage with traceability.

In the context of EA, data governance helps you understand what information you have, where it came from, whether it’s secure, who’s accountable for it, who has accessed it, and which systems and applications it’s located in and moves between.

You can collect complete application ecosystem information; objectively identify connections/interfaces between applications, using data; provide accurate compliance assessments; and quickly identify security risks and other issues.

Data governance also provides many of the same benefits as enterprise architecture or business process modeling projects: reducing risk, optimizing operations, and increasing the use of trusted data.

To better understand and align data governance and enterprise architecture, let’s look at data at rest and data in motion and why they both have to be documented.

  1. Documenting data at rest involves looking at where data is stored, such as in databases, data lakes, data warehouses and flat files. You must capture all of this information from the columns, fields and tables – and all the data overlaid on top of that. This means understanding not just the technical aspects of a data asset but also how the business uses that data asset.
  2. Documenting data in motion looks at how data flows between source and target systems, and not just the data flows themselves but also how those flows are structured in terms of metadata. We have to document how our systems interact, including the logical and physical data assets that flow into, out of and between them. (A minimal sketch of both follows this list.)
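
A minimal sketch of how those two kinds of documentation might be represented side by side, with invented systems and assets, could look like this:

```python
from dataclasses import dataclass

@dataclass
class DataAsset:            # data at rest
    system: str             # e.g. a database, lake or warehouse
    name: str               # table, file or column collection
    business_use: str       # how the business uses it, not just the schema

@dataclass
class DataFlow:             # data in motion
    source: DataAsset
    target: DataAsset
    transport: str          # e.g. ETL job, API, file transfer

billing_db = DataAsset("billing_db", "invoices", "monthly revenue reporting")
finance_dw = DataAsset("finance_dw", "fact_invoice", "board-level KPI dashboards")
flow = DataFlow(billing_db, finance_dw, "nightly ETL job")

print(f"{flow.source.system}.{flow.source.name} -> "
      f"{flow.target.system}.{flow.target.name} via {flow.transport}")
```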

data governance and enterprise architecture

Automating Data Governance and Enterprise Architecture

If you have a data governance program and tooling in place, you’re able to document a lot of information that enterprise architects and process modelers usually spend months, if not years, collecting and keeping up to date.

So within a data governance repository, you’re capturing systems, environments, databases and data — both logical and physical. You’re also collecting information about how those systems are interconnected.

With all this information about the data landscape and the systems that use and store it, you’re automatically collecting your organization’s application architecture. Therefore you can drastically reduce the time to value, because your enterprise architecture will always be up to date as long as you’re managing the associated data properly.

If your organization also has an enterprise architecture practice and tooling, you can automate the current-state architecture, which is arguably the most expensive and time-intensive aspect of enterprise architecture to have at your fingertips.

In erwin’s 2020 State of Data Governance and Automation report, close to 70 percent of respondents said they spend an average of 10 or more hours per week on data-related activities, and most of that time is spent searching for and preparing data.

At the same time, it’s also critical to be able to answer executives’ questions. You can’t do impact analysis if you don’t understand the current-state architecture, and answers won’t be delivered quickly enough if it isn’t documented.

Data Governance and Enterprise Architecture for Regulatory Compliance

First and foremost, we can start to document the application inventory automatically because we are scanning systems and understanding the architecture itself. When you pre-populate your interface inventory, application lineage and data flows, you see clear-cut dependencies.

That makes regulatory compliance a fantastic use case for both data governance and EA. You can factor this use case into process and application architecture diagrams, looking at where this type of data goes and what sort of systems it touches.

With that information, you can start to classify information for such regulations as the European Union’s General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA) or any type of compliance data for an up-to-date regulatory compliance repository. Then all this information flows into processing controls and will ultimately deliver real-time, true impact analysis and traceability.
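
To show the shape of that classification step, here is a deliberately simple sketch with hypothetical column names and keyword rules; real compliance programs would rely on far richer detection than keyword matching:

```python
# Hypothetical keyword-based classification of columns into regulatory scope.
REGULATION_TRIGGERS = {
    "GDPR": ["email", "name", "birth", "address"],
    "CCPA": ["email", "phone", "device_id"],
}

columns = ["customer_email", "order_total", "device_id", "birth_date"]

classification = {}
for col in columns:
    in_scope = [reg for reg, keywords in REGULATION_TRIGGERS.items()
                if any(k in col.lower() for k in keywords)]
    if in_scope:
        classification[col] = in_scope  # feeds the compliance repository

print(classification)
# {'customer_email': ['GDPR', 'CCPA'], 'device_id': ['CCPA'], 'birth_date': ['GDPR']}
```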

erwin for Data Governance and Enterprise Architecture

Using data governance and enterprise architecture in tandem will give you a data-driven architecture, reducing time to value and showing true results to your executives.

You can better manage risk because of real-time data coming into the EA space. You can react quicker, answering questions for stakeholders that will ultimately drive business transformation. And you can reinforce the value of your role as an enterprise architect.

erwin Evolve is a full-featured, configurable set of enterprise architecture and business process modeling and analysis tools. It integrates with erwin’s data governance software, the erwin Data Intelligence Suite.

With these unified capabilities, every enterprise stakeholder – enterprise architect, business analyst, developer, chief data officer, risk manager, and CEO – can discover, understand, govern and socialize data assets to realize greater value while mitigating data-related risks.

You can start a free trial of erwin Evolve here.

Enterprise architecture review

Categories
Data Intelligence Data Governance erwin Expert Blog

Data Governance Definition, Best Practices and Benefits

Any organization with a data-driven strategy should understand the definition of data governance. In fact, in light of increasingly stringent data regulations, any organization that uses or even stores data should understand the definition of data governance.

Organizations with a solid understanding of data governance (DG) are better equipped to keep pace with the speed of modern business.

In this post, the erwin Experts address:

  • The definition of data governance
  • Why data governance is important
  • What good data governance looks like
  • The key benefits of data governance
  • The best data governance solution

Data Governance Definition

The definition of data governance is broad because it describes a process rather than a predetermined method. So an understanding of the process and the best practices associated with it is key to a successful data governance strategy.

Data governance is best defined as the strategic, ongoing and collaborative processes involved in managing data’s access, availability, usability, quality and security in line with established internal policies and relevant data regulations.

It’s often said that when we work together, we can achieve things greater than the sum of our parts. Collective, societal efforts have seen mankind move metaphorical mountains and land on the literal moon.

Such feats were made possible through effective government – or governance.

The same applies to data. A single unit of data in isolation can’t do much, but the sum of an organization’s data can prove invaluable.

Put simply, DG is about maximizing the potential of an organization’s data and minimizing the risk. In today’s data-driven climate, this dynamic is more important than ever.

That’s because data’s value depends on the context in which it exists: too much unstructured or poor-quality data and meaning is lost in a fog; too little insight into data’s lineage, where it is stored, or who has access, and the organization becomes an easy target for cybercriminals and/or non-compliance penalties.

So DG is, quite simply, about how an organization uses its data. That includes how it creates or collects data, as well as how its data is stored and accessed. It ensures that the right data of the right quality, regardless of where it is stored or what format it is stored in, is available for use – but only by the right people and for the right purpose.

With well governed data, organizations can get more out of their data by making it easier to manage, interpret and use.

Why Is Data Governance Important?

Although governing data is not a new practice, using it as a strategic program is, and so are the expectations as to who is responsible for it.

Historically, governing data has been IT’s business because it primarily involved cataloging data to support search and discovery.

But now, governing data is everyone’s business. Both the data “keepers” in IT and the data users everywhere else within the organization have a role to play.

That makes sense, too. The sheer volume and importance of data the average organization now processes are too great to be effectively governed by a siloed IT department.

Think about it. If all the data you access as an employee of your organization had to be vetted by IT first, could you get anything done?

While the exponential increase in the volume and variety of data has provided unparalleled insights for some businesses, only those with the means to deal with the velocity of data have reaped the rewards.

By velocity, we mean the speed at which data can be processed and made useful. More on “The Three Vs of Data” here.

Data giants like Amazon, Netflix and Uber have reshaped whole industries, turning smart, proactive data governance into actionable and profitable insights.

And then, of course, there’s the regulatory side of things. The European Union’s General Data Protection Regulation (GDPR) mandates that organizations govern their data.

Poor data governance doesn’t just lead to breaches, although of course it does; organizations also need an effective data governance initiative to pass compliance audits.

Since non-compliance can be costly, good data governance not only helps organizations make money, it helps them save it too. And organizations are recognizing this fact.

In the lead-up to GDPR, studies found that the biggest driver for data governance initiatives was regulatory compliance. However, since GDPR’s implementation, better decision-making and analytics have become the top drivers for investing in data governance.

Other areas where well-governed data plays an important role include digital transformation, data standards and uniformity, self-service, and customer trust and satisfaction.

For the full list of drivers and deeper insight into the state of data governance, get the free 2020 State of DGA report here.

What Is Good Data Governance?

We’re constantly creating new data whether we’re aware of it or not. Every new sale, every new inquiry, every website interaction, every swipe on social media generates data.

This means the work of governing data is ongoing, and organizations without it can become overwhelmed quickly.

Therefore good data governance is proactive not reactive.

In addition, good data governance requires organizations to encourage a culture that stresses the importance of data with effective policies for its use.

An organization must know who should have access to what, both internally and externally, before any technical solutions can effectively compartmentalize the data.

So good data governance requires both technical solutions and policies to ensure organizations stay in control of their data.

But culture isn’t built on policies alone. An often-overlooked element of good data governance is arguably philosophical. Effectively communicating the benefits of well governed data to employees – like improving the discoverability of data – is just as important as any policy or technology.

And it shouldn’t be difficult. In fact, it should make data-oriented employees’ jobs easier, not harder.

What Are the Key Benefits of Data Governance?

Organizations with effectively governed data enjoy:

  • Better alignment with data regulations: Get a more holistic understanding of your data and any associated risks, plus improve data privacy and security through better data cataloging.
  • A greater ability to respond to compliance audits: Take the pain out of preparing reports and respond more quickly to audits with better documentation of data lineage.
  • Increased operational efficiency: Identify and eliminate redundancies and streamline operations.
  • Increased revenue: Uncover opportunities to both reduce expenses and discover/access new revenue streams.
  • More accurate analytics and improved decision-making: Be more confident in the quality of your data and the decisions you make based on it.
  • Improved employee data literacy: Consistent data standards help ensure employees are more data literate, and they reduce the risk of semantic misinterpretations of data.
  • Better customer satisfaction/trust and reputation management: Use data to provide a consistent, efficient and personalized customer experience, while avoiding the pitfalls and scandals of breaches and non-compliance.

For a more in-depth assessment of data governance benefits, check out The Top 6 Benefits of Data Governance.

The Best Data Governance Solution

Data has always been important to erwin; we’ve been a trusted data modeling brand for more than 30 years. But we’ve expanded our product portfolio to reflect customer needs and give them an edge, literally.

The erwin EDGE platform delivers an “enterprise data governance experience.” And at the heart of the erwin EDGE is the erwin Data Intelligence Suite (erwin DI).

erwin DI provides all the tools you need for the effective governance of your data. These include data catalog, data literacy and a host of built-in automation capabilities that take the pain out of data preparation.

With erwin DI, you can automatically harvest, transform and feed metadata from a wide array of data sources, operational processes, business applications and data models into a central data catalog and then make it accessible and understandable via role-based, contextual views.

With the broadest set of metadata connectors, erwin DI combines data management and DG processes to fuel an automated, real-time, high-quality data pipeline.

See for yourself why erwin DI is a DBTA 2020 Readers’ Choice Award winner for best data governance solution with your very own, very free demo of erwin DI.

data governance preparedness

Categories
erwin Expert Blog Data Intelligence

What Is Data Literacy?

Today, data literacy is more important than ever.

Data is now being used to support business decisions few executives thought they’d be making even six months ago.

With your employees connected and armed with data that paints a clear picture of the business, your organization is better prepared to turn its attention to whatever your strategic priority may be – whether that’s digital transformation, customer experience, or withstanding the current (or a future) crisis.

So, what is data literacy?

Data Literacy

Data Literacy Definition

Gartner defines data literacy as the ability to read, write and communicate data in context, including an understanding of data sources and constructs, analytical methods and techniques applied — and the ability to describe the use case, application and resulting value.

Organizations use data literacy tools to improve data literacy across the organization. A good data literacy tool will include functionality such as business glossary management and self-service data discovery. The end result is an organization that’s more data fluent and more efficient in how it stores, discovers and uses its data.

What Is Data Literacy For?

For years, we’ve been saying that “we’re all data people.” When all stakeholders in an organization can effectively “speak data” they can:

  • Better understand and identify the data they require
  • Be more self-sufficient in accessing and preparing the data
  • Better articulate the gaps that exist in the data landscape
  • Share their knowledge and experience with data with other consumers to contribute to the greater good
  • Collaborate more effectively with their partners in data (management and governance) for greater efficiency and higher quality outcomes

Why is Data Literacy Important?

Without good data, it’s difficult to make good decisions.

Data access, literacy and knowledge lead to sound decision-making, and that’s key to data governance and any other data-driven effort.

Data literacy enables collaboration and innovation. To determine if your organization is data literate you need to ask two questions:  

  1. Can your employees use data to effectively communicate with each other?
  2. Can you develop and circulate ideas that will help the business move forward?

data literacy and data intelligence

The Data Literacy and Data Intelligence Connection

Businesses that invest in data intelligence and data literacy are better positioned to weather any storm and chart a path forward because they have accurate, trusted data at their disposal.

erwin helps customers turn their data from a burden into a benefit by fueling an accurate, real-time, high-quality data pipeline they can mine for insights that lead to smart decisions for operational excellence.

erwin Data Intelligence (erwin DI) combines data catalog and data literacy capabilities for greater awareness of and access to available data assets, guidance on their use, and guardrails to ensure data policies and best practices are followed.

erwin Data Literacy (DL) is founded on enriched business glossaries and socializing data so all stakeholders can view and understand it within the context of their roles.

It allows both IT and business users to discover the data available to them and understand what it means in common, standardized terms, and automates common data curation processes, such as name matching, categorization and association, to optimize governance of the data pipeline including preparation processes.
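
One simple way to picture name matching, as a stand-in rather than erwin’s actual algorithm, is to normalize technical column names and compare them against glossary terms:

```python
import difflib

def normalize(name: str) -> str:
    """Strip separators and case so 'CUST_NAME' and 'Customer Name' can be compared."""
    return name.lower().replace("_", " ").replace("-", " ").strip()

# Hypothetical glossary terms and technical column names.
glossary_terms = ["customer name", "birth date", "invoice amount"]
technical_columns = ["CUST_NAME", "birth_dt", "invoice_amt"]

for col in technical_columns:
    match = difflib.get_close_matches(normalize(col), glossary_terms, n=1, cutoff=0.6)
    print(col, "->", match[0] if match else "no suggestion")
```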

erwin DL provides self-service, role-based, contextual data views. It also provides a business glossary for the collaborative definition of enterprise data in business terms.

It also includes built-in accountability and workflows to enable data consumers to define and discover data relevant to their roles, facilitate the understanding and use of data within a business context, and ensure the organization is data literate.

With erwin DL, your organization can build glossaries of terms in taxonomies with descriptions, synonyms, acronyms and their associations to data policies, rules and other critical governance artifacts. Other advantages are:

  • Data Visibility & Governance: Visualize and navigate any data from anywhere within a business-centric data asset framework that provides organizational alignment and robust, sustainable data governance.
  • Data Context & Enrichment: Put data in business context and enable stakeholders to share best practices and build communities by tagging/commenting on data assets, enriching the metadata.
  • Enterprise Collaboration & Empowerment: Break down IT and business silos to provide broad access to approved organizational information.
  • Greater Productivity: Reduce the time it takes to find data assets and therefore reliance on technical resources, plus streamline workflows for faster analysis and decision-making.
  • Accountability & Regulatory Peace of Mind: Create an integrated ecosystem of people, processes and technology to manage and protect data, mitigating a wide range of data-related risks and improving compliance.
  • Effective Change Management: Better manage change with the ability to identify data linkages, implications and impacts across the enterprise.
  • Data Literacy, Fluency & Knowledge: Enhance stakeholder discovery and understanding of and trust in data assets to underpin analysis leading to actionable insights.

Learn more about the importance of data literacy by requesting a free demo of erwin Data Intelligence.

erwin Data Intelligence

 

Categories
Data Intelligence erwin Expert Blog

Four Steps to Building a Data-Driven Culture

data-driven culture

Fostering organizational support for a data-driven culture might require a change in the organization’s culture. But how?

Recently, I co-hosted a webinar with our client E.ON, a global energy company that reinvented how it conducts business from branding to customer engagement – with data as the conduit.

There’s no doubt E.ON, based in Essen, Germany, has established one of the most comprehensive and successful data governance programs in modern business.

For E.ON, data governance is not just about data management but also about using information to increase efficiencies. The company needed to help its data scientists and engineers improve their knowledge of the data, find the best data for use at the best time, and put the data in the most appropriate business context.

As an example, E.ON was able to improve data quality, detect redundancies, and create a needs-based, data-use environment by applying a common set of business terms across the enterprise.

Avoiding Hurdles

Businesses have not been able to get as much mileage out of their data governance efforts as hoped, chiefly because of how they’ve been handled. And data governance initiatives sometimes fail because organizations tend to treat them as siloed IT programs rather than multi-stakeholder imperatives.

Even when business groups recognize the value of a data governance program and the potential benefits to be derived from it, the IT group traditionally has owned the effort and paid for it.

Despite enterprise-wide awareness of the importance of data governance, a troublingly large number of organizations continue to stumble because of a lack of executive support.

IT and the business will need to take responsibility for selling the benefits of data governance across the enterprise and ensure all stakeholders are properly educated about it.

IT may have to go it alone, at least initially, educating the business on the risks and rewards of data governance and the expectations and accountabilities in implementing it. The business needs to have a role in the justification.

Being a Change Agent

Becoming a data-driven enterprise means making decisions based on facts. It requires a clear vision, strategy and disciplined execution. It also must be well thought out, understood and communicated to others – from the C-suite on down.

For E.ON, the board supported and drove a lot of the thinking that data has to be at the center of everything to reimagine the company. But the data team still needed to convince the head of every one of the company’s hundreds of legal entities to support the digital transformation journey. As a result, the team went on a mission to spread the message.

“The biggest challenge was change management — convincing people to be part of the journey. It is very often underestimated,” said Romina Medici, E.ON’s Program Manager for Data Management and Governance. “Technology is logical, so you can always understand it. Culture is more complex and more diverse.”

She said that ultimately the “communication (across the organization) was bottom up and top down.”

Four Steps to Building a Data-Driven Culture

1. Accelerate Time to Value: Data governance isn’t a one-off project with a defined endpoint. It’s an on-going initiative that requires active engagement from executives and business leaders. The ability to make faster decisions based on data is one way to make the organization pay attention.

2. Ensure Company-Wide Compliance: Compliance isn’t just about government regulations. In today’s business environment, we’re all data people. Everyone in the organization needs to commit to data compliance to ensure high-quality data.

3. Demand Trusted Insights Based on Data Truths: To make smart decisions, you can’t have multiple sets of numbers. Everyone needs to be in lockstep, using and basing decisions on the same data.

4. Foster Data-Driven Collaboration: We call this “social data governance,” meaning you foster collaboration across the business, all the time. 

A data-driven approach has never been more valuable to addressing the complex yet foundational questions enterprises must answer. Organizations that have their data management, data governance and data intelligence houses in order are much better positioned to respond to challenges and thrive moving forward.

As demonstrated by E.ON, data-driven cultures start at the top – but need to proliferate up and down, even sideways.

Business transformation has to be based on accurate data assets within the right context, so organizations have a reliable source of truth on which to base their decisions.

erwin provides the data catalog, lineage, glossary and visualization capabilities needed to evaluate the business in its current state and then evolve it to serve new objectives.

Request a demo of the erwin Data Intelligence Suite.

Data Intelligence Solution: Data Catalog, Data Literacy and Automation Tools

Categories
Data Intelligence Data Modeling Business Process Enterprise Architecture erwin Expert Blog

Benefits of Enterprise Modeling and Data Intelligence Solutions

Users discuss how they are putting erwin’s data modeling, enterprise architecture, business process modeling, and data intelligence solutions to work

IT Central Station members using erwin solutions are realizing the benefits of enterprise modeling and data intelligence. This article highlights some specific use cases and the results these users are experiencing within their organizations.

Enterprise Architecture & Business Process Modeling with erwin Evolve

An enterprise architect uses erwin Evolve at an aerospace/defense firm with more than 10,000 employees. His team is “doing business process modeling and high-level strategic modeling with its capabilities.” Others in his company are using it for IT infrastructure, such as aligning requirements to system solutions.

For Matthieu G., a senior business process management architect at a pharma/biotech company with more than 5,000 employees, erwin Evolve was useful for enterprise architecture reference. As he put it, “We are describing our business process and we are trying to describe our data catalog. We are describing our complete applications assets, and we are interfacing to the CMDB of our providers.”

His team also is using the software to manage roadmaps in their main transformation programs. He added, “We have also linked it to our documentation repository, so we have a description of our data documents.” They have documented 200 business processes in this way. In particular, the tool helped them to design their qualification review, which is necessary in a pharmaceutical business.

erwin Evolve users are experiencing numerous benefits. According to the aerospace enterprise architect, “It’s helped us advance in our capabilities to perform model-based systems engineering, and also model-based enterprise architecture.”

This matters because, as he said, “By placing the data and the metadata into a model, which is what the tool does, you gain the abilities for linkages between different objects in the model, linkages that you cannot get on paper or with Visio or PowerPoint.” That is a huge differentiator for this user.

This user also noted, “I use the automatic diagramming features a lot. When one of erwin’s company reps showed that to me a couple of years ago, I was stunned. That saves hours of work in diagramming. That capability is something I have not seen in other suppliers’ tools.”

He further explained that this "really helps too when your data is up to date. The tool will then automatically generate the updated diagram based on the data, so you know it’s always the most current version. You can’t do that in things like Visio and PowerPoint. They’re static snapshots of a diagram at some point in time. This is live and dynamic."

erwin DM

Data Modeling with erwin Data Modeler

George H., a technology manager, uses erwin Data Modeler (erwin DM) at a pharma/biotech company with more than 10,000 employees for their enterprise data warehouse.

He elaborated by saying, “We have an enterprise model being maintained and we have about 11 business-capability models being maintained. Examples of business capabilities would be finance, human resources, supply-chain, sales and marketing, and procurement. We maintain business domain models in addition to the enterprise model.”

Roshan H., an EDW architect/data modeler who uses erwin DM at Royal Bank of Canada, works on diverse platforms, including Microsoft SQL Server, Oracle, DB2, Teradata and NoSQL. After gathering requirements and mapping data on Excel, they start building the conceptual model and then the logical model with erwin DM.

He said, “When we have these data models built in the erwin DM, we generate the PDF data model diagrams and take it to the team (DBA, BSAs, QA and others) to explain the model diagram. Once everything is reviewed, then we go on to discuss the physical data model.”

“We use erwin DM to do all of the levels of analysis that a data architect does,” said Sharon A., a senior manager, data governance at an insurance company with over 500 employees. She added, “erwin DM does conceptual, logical and physical database or data structure capture and design, and creates a library of such things.

We do conceptual data modeling, which is very high-level and doesn’t have columns and tables. It’s more concepts that the business described to us in words. We can then use the graphic interface to create boxes that contain descriptions of things and connect things together. It helps us to do a scope statement at the beginning of a project to corral what the area is that the data is going to be using.”

Data Governance with erwin Data Intelligence

IT Central Station members are seeing benefits from using erwin Data Intelligence (erwin DI) for data governance use cases. For Rick D., a data architect at NAMM, a small healthcare company, erwin DI “enabled us to centralize a tremendous amount of data into a common standard, and uniform reporting has decreased report requests.”

As a medical company, they receive data from 17 different health plans. Before adopting erwin DI, they didn’t have a centralized data dictionary of their data. The benefit of data governance, as he saw it, was that “everybody in our organization knows what we are talking about. Whether it is an institutional claim, a professional claim, Blue Cross or Blue Shield, health plan payer, group titles, names, etc.”

A solution architect at a pharma/biotech company with more than 10,000 employees used erwin DI for metadata management, versioning of metadata and metadata mappings and automation. In his experience, applying governance to metadata and creating mappings has helped different stakeholders gain a good understanding of the data they use to do their work.

Sharon A. had a comparable use case. She said, "You can map the business understanding in your glossary back to your physical so you can see it both ways. With erwin DI, I can have a full library of physical data there or logical data sets, publish it out through the portal, and then the business can do self-service. The DBAs can use it for all different types of value-add from their side of the house. They have the ability to see particular aspects, such as RPII, and there are some neat reports which show that. They are able to manage who can look at these different pieces of information."

For more real erwin user experiences, visit IT Central Station.