Categories
erwin Expert Blog

How Metadata Makes Data Meaningful

Metadata is an important part of data governance, and as a result, most nascent data governance programs are rife with project plans for assessing and documenting metadata. But in many scenarios, it seems that the underlying driver of metadata collection projects is that it’s just something you do for data governance.

So most early-stage data governance managers kick off a series of projects to profile data, make inferences about data element structure and format, and store the presumptive metadata in some metadata repository. But are these rampant and often uncontrolled projects to collect metadata properly motivated?

There is rarely a clear directive about how metadata is used. Therefore, prior to launching metadata collection tasks, it is important to specify how the knowledge embedded within the corporate metadata should be used.

Managing metadata should not be a sub-goal of data governance. Today, metadata is the heart of enterprise data management, governance and intelligence efforts and should have a clear strategy – rather than just being something you do.


What Is Metadata?

Quite simply, metadata is data about data. It’s generated every time data is captured at a source, accessed by users, moved through an organization, integrated or augmented with other data from other sources, profiled, cleansed and analyzed. Metadata is valuable because it provides information about the attributes of data elements that can be used to guide strategic and operational decision-making. It answers these important questions:

  • What data do we have?
  • Where did it come from?
  • Where is it now?
  • How has it changed since it was originally created or captured?
  • Who is authorized to use it and how?
  • Is it sensitive or are there any risks associated with it?
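Concretely, a metadata record for a single data element would capture the answers to these questions. Here is a minimal sketch in Python; the field names are illustrative, not erwin's schema:

```python
from dataclasses import dataclass, field

@dataclass
class MetadataRecord:
    """Descriptive, lineage and governance metadata for one data element."""
    name: str                     # what data do we have?
    source_system: str            # where did it come from?
    current_location: str         # where is it now?
    change_history: list = field(default_factory=list)    # how has it changed?
    authorized_roles: list = field(default_factory=list)  # who may use it, and how?
    sensitivity: str = "public"   # is it sensitive or risky?

record = MetadataRecord(
    name="customer_email",
    source_system="crm.contacts",
    current_location="warehouse.dim_customer",
    change_history=["2019-03-01: lower-cased on ingest"],
    authorized_roles=["marketing_analyst"],
    sensitivity="PII",
)
print(record.sensitivity)  # -> PII
```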

The Role of Metadata in Data Governance

Organizations don’t know what they don’t know, and this problem is only getting worse. As data continues to proliferate, so does the need for data and analytics initiatives to make sense of it all. Here are some benefits of metadata management for data governance use cases:

  • Better Data Quality: Data issues and inconsistencies within integrated data sources or targets are identified in real time, improving overall data quality by shortening the time to insight and/or repair.
  • Quicker Project Delivery: Accelerate Big Data deployments, Data Vaults, data warehouse modernization, cloud migration, etc., by up to 70 percent.
  • Faster Speed to Insights: Reverse the current 80/20 rule that keeps high-paid knowledge workers too busy finding, understanding and resolving errors or inconsistencies to actually analyze source data.
  • Greater Productivity & Reduced Costs: Being able to rely on automated and repeatable metadata management processes results in greater productivity. Some erwin customers report productivity gains of 85+% for coding, 70+% for metadata discovery, up to 50% for data design, up to 70% for data conversion, and up to 80% for data mapping.
  • Regulatory Compliance: Regulations such as GDPR, HIPAA, BCBS and CCPA have data privacy and security mandates, so sensitive data (e.g., PII) needs to be tagged, its lineage documented, and its flows depicted for traceability.
  • Digital Transformation: Knowing what data exists and its value potential promotes digital transformation by improving digital experiences, enhancing digital operations, driving digital innovation and building digital ecosystems.
  • Enterprise Collaboration: With the business driving alignment between data governance and strategic enterprise goals and IT handling the technical mechanics of data management, the door opens to finding, trusting and using data to effectively meet organizational objectives.
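The tagging step for sensitive data can be pictured as a simple pattern-based classifier. This sketch uses two hypothetical detection patterns; real tools combine curated rules, reference data and machine learning:

```python
import re

# Hypothetical detection patterns for common sensitive-data categories.
PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[a-z]{2,}", re.I),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def tag_column(sample_values):
    """Return the sensitivity tags whose pattern matches any sampled value."""
    tags = set()
    for value in sample_values:
        for tag, pattern in PATTERNS.items():
            if pattern.search(str(value)):
                tags.add(tag)
    return tags

tags = tag_column(["alice@example.com", "n/a", "123-45-6789"])
```

A column tagged this way can then carry its sensitivity label through lineage documentation downstream.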

Giving Metadata Meaning

So how do you give metadata meaning? While this sounds like a deep philosophical question, the reality is the right tools can make all the difference.

erwin Data Intelligence (erwin DI) combines data management and data governance processes in an automated flow.

It’s unique in its ability to automatically harvest, transform and feed metadata from a wide array of data sources, operational processes, business applications and data models into a central data catalog and then make it accessible and understandable within the context of role-based views.

erwin DI sits on a common metamodel that is open, extensible and comes with a full set of APIs. A comprehensive list of erwin-owned standard data connectors is included for automated harvesting, refreshing and version-controlled metadata management. Optional erwin Smart Data Connectors reverse-engineer ETL code of all types and connect bi-directionally with reporting and other ecosystem tools. These connectors offer the fastest and most accurate path to data lineage, impact analysis and other detailed graphical relationships.

Additionally, erwin DI is part of the larger erwin EDGE platform that integrates data modeling, enterprise architecture, business process modeling, data cataloging and data literacy. We know our customers need an active metadata-driven approach to:

  • Understand their business, technology and data architectures and the relationships between them
  • Create and automate a curated enterprise data catalog, complete with physical assets, data models, data movement, data quality and on-demand lineage
  • Activate their metadata to drive agile and well-governed data preparation with integrated business glossaries and data dictionaries that provide business context for stakeholder data literacy

erwin was named a Leader in Gartner’s “2019 Magic Quadrant for Metadata Management Solutions.”

Click here to get a free copy of the report.

Click here to request a demo of erwin DI.

Gartner Magic Quadrant Metadata Management



Metadata Management, Data Governance and Automation

Can the 80/20 Rule Be Reversed?

erwin released its State of Data Governance Report in February 2018, just a few months before the General Data Protection Regulation (GDPR) took effect.

This research showed that the majority of responding organizations weren’t actually prepared for GDPR, nor did they have the understanding, executive support and budget for data governance – although they recognized the importance of it.

Of course, data governance has evolved with astonishing speed, both in response to data privacy and security regulations and because organizations see the potential for using it to accomplish other organizational objectives.

But many of the world’s top brands still seem to be challenged in implementing and sustaining effective data governance programs (hello, Facebook).

We wonder why.

Too Much Time, Too Few Insights

According to IDC’s “Data Intelligence in Context” Technology Spotlight sponsored by erwin, “professionals who work with data spend 80 percent of their time looking for and preparing data and only 20 percent of their time on analytics.”

Specifically, 80 percent of data professionals’ time is spent on data discovery, preparation and protection, and only 20 percent on analysis leading to insights.

In most companies, an incredible amount of data flows from multiple sources in a variety of formats and is constantly being moved and federated across a changing system landscape.

Often these enterprises are heavily regulated, so they need a well-defined data integration model that will help avoid data discrepancies and remove barriers to enterprise business intelligence and other meaningful use.

IT teams need the ability to smoothly generate hundreds of mappings and ETL jobs. They need their data mappings to fall under governance and audit controls, with instant access to dynamic impact analysis and data lineage.
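Impact analysis over such mappings is essentially graph traversal: follow every source-to-target edge downstream from the element that changed. A minimal sketch, with invented element names:

```python
from collections import defaultdict

# Source-to-target mappings (illustrative names).
mappings = [
    ("crm.contacts.email", "dw.dim_customer.email_addr"),
    ("dw.dim_customer.email_addr", "rpt.customer_report.email"),
    ("crm.contacts.phone", "dw.dim_customer.phone_number"),
]

graph = defaultdict(list)
for src, tgt in mappings:
    graph[src].append(tgt)

def impact(element):
    """All downstream elements affected by a change to `element`."""
    affected, stack = set(), [element]
    while stack:
        for tgt in graph[stack.pop()]:
            if tgt not in affected:
                affected.add(tgt)
                stack.append(tgt)
    return affected

result = impact("crm.contacts.email")
```

Running `impact` on a source column surfaces every report and warehouse column it ultimately feeds, which is the "dynamic impact analysis" a governed mapping inventory makes possible.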

But most organizations, especially those competing in the digital economy, don’t have enough time or money for data management using manual processes. Outsourcing is also expensive, with inevitable delays because these vendors are dependent on manual processes too.

The Role of Data Automation

Data governance maturity includes the ability to rely on automated and repeatable processes.

For example, automatically importing mappings from developers’ Excel sheets, flat files, Access and ETL tools into a comprehensive mappings inventory, complete with automatically generated and meaningful documentation of the mappings, is a powerful way to support governance while providing real insight into data movement — for data lineage and impact analysis — without interrupting system developers’ normal work methods.
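The import step described above can be approximated with a few lines of standard-library Python. Here a CSV export stands in for the Excel sheet, and the column names are assumptions:

```python
import csv
import io

# A developer's mapping sheet, exported to CSV (column names are illustrative).
sheet = io.StringIO("""source_table,source_column,target_table,target_column,rule
crm.contacts,email,dw.dim_customer,email_addr,lowercase
crm.contacts,phone,dw.dim_customer,phone_number,strip non-digits
""")

inventory = []
for row in csv.DictReader(sheet):
    # Auto-generate human-readable documentation for each mapping.
    row["doc"] = (f"{row['source_table']}.{row['source_column']} -> "
                  f"{row['target_table']}.{row['target_column']} ({row['rule']})")
    inventory.append(row)
```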

GDPR compliance, for instance, requires a business to discover source-to-target mappings with all accompanying transformations, such as the business rules in the repository that are applied to them, in order to comply with audits.

When data movement has been tracked and version-controlled, it’s possible to conduct data archeology — that is, reverse-engineering code from existing XML within the ETL layer — to uncover what has happened in the past and incorporate it into a mapping manager for fast and accurate recovery.
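That reverse-engineering step can be sketched as parsing the ETL layer's XML to recover source-to-target mappings. The job format below is invented for illustration; real ETL tools each have their own XML schemas:

```python
import xml.etree.ElementTree as ET

# Hypothetical, version-controlled ETL job definition.
job_xml = """
<job name="load_dim_customer" version="3">
  <map source="crm.contacts.email" target="dw.dim_customer.email_addr"/>
  <map source="crm.contacts.phone" target="dw.dim_customer.phone_number"/>
</job>
"""

root = ET.fromstring(job_xml)
# Recover the lineage embedded in the job definition.
mappings = [(m.get("source"), m.get("target")) for m in root.iter("map")]
```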

With automation, data professionals can meet the above needs at a fraction of the cost of the traditional, manual way. To summarize, just some of the benefits of data automation are:

  • Centralized and standardized code management with all automation templates stored in a governed repository
  • Better quality code and minimized rework
  • Business-driven data movement and transformation specifications
  • Superior data movement job designs based on best practices
  • Greater agility and faster time-to-value in data preparation, deployment and governance
  • Cross-platform support of scripting languages and data movement technologies

One global pharmaceutical giant reduced costs by 70 percent and generated 95 percent of production code with “zero touch.” With automation, the company improved the time to business value and significantly reduced the costly re-work associated with error-prone manual processes.


Help Us Help You by Taking a Brief Survey

With 2020 just around the corner and another data regulation about to take effect, the California Consumer Privacy Act (CCPA), we’re working with Dataversity on another research project.

And this time, you guessed it – we’re focusing on data automation and how it could impact metadata management and data governance.

We would appreciate your input and will release the findings in January 2020.

Click here to take the brief survey


Why EA Needs to Be Part of Your Digital Transformation Strategy

Enterprise architecture (EA) isn’t dead, you’re just using it wrong. Part three of erwin’s digital transformation blog series.  

I’ll let you in on a little secret: the rumor of enterprise architecture’s demise has been greatly exaggerated. However, the truth for many of today’s fast-moving businesses is that enterprise architecture fails. But why?

Enterprise architecture is invaluable for internal business intelligence (but is rarely used for real intelligence), governance (but often has a very narrow focus), management insights (but doesn’t typically provide useful insights), and transformation and planning (ok, now we have something!).

In reality, most organizations do not leverage EA teams to their true potential. Instead they rely on consultants, trends, regulations and legislation to drive strategy.

Why does this happen?

Don’t Put Enterprise Architecture in a Corner

EA has remained in its traditional comfort zone of IT. EA is not only about IT, yet it lives within IT, focuses on IT and therefore loses its business dimension and support.

It remains isolated and is rarely, if ever, involved in:

  • Assessing, planning and running business transformation initiatives
  • Providing real, enterprise-wide insights
  • Producing actionable initiatives

Instead, it focuses on managing “stuff”:

  • Understanding existing “stuff” by gathering exhaustively detailed information
  • Running “stuff”-deployment projects
  • Managing cost “stuff”
  • “Moving to the cloud” (the solution to … everything)

Enterprise Architecture

What Prevents Enterprise Architecture from Being Successful?

There are three main reasons why EA has been pigeon-holed:

  1. Lack of trust in the available information
    • Information is mostly collected, entered and maintained manually
    • Automated data collection and connection is costly and error-prone
    • Identification of issues can be very difficult and time-consuming
  2. Lack of true asset governance and collaboration
    • Enterprise architecture becomes ring-fenced within a department
    • Few stakeholders willing to be actively involved in owning assets and be responsible for them
    • Collaboration on EA is seen as secondary and mostly focused on reports and status updates
  3. Lack of practical insights (insights, analyses and management views)
    • Thinking too narrowly about what EA can provide
    • The few analyses performed focus on immediate questions, rarely planning and strategy

Because of this, EA fails to deliver the relevant insights that management needs to make decisions – in a timely manner – and loses its credibility.

But the fact is EA should be, and was designed to be, about actionable insights leading to innovative architecture, not about only managing “stuff!”

Don’t Slow Your Roll. Elevate Your Role.

It’s clear that the role of EA in driving digital transformation needs to be elevated. It needs to be a strategic partner with the business.

According to a McKinsey report on the “Five Enterprise-Architecture Practices That Add Value to Digital Transformations,” EA teams need to:

“Translate architecture issues into terms that senior executives will understand. Enterprise architects can promote closer alignment between business and IT by helping to translate architecture issues for business leaders and managers who aren’t technology savvy. Engaging senior management in discussions about enterprise architecture requires management to dedicate time and actively work on technology topics. It also requires the EA team to explain technology matters in terms that business leaders can relate to.”

With that said, to further change the perception of EA within the organization, you need to serve what management needs. To do this, enterprise architects need to develop innovative business insights, not IT insights, and make them dynamic. Next, they need to gather information they can trust and then maintain it.

To provide these strategic insights, you don’t need to focus on everything — you need to focus on what management wants you to focus on. The rest is just IT being IT. And, finally, you need to collaborate – like your life depends on it.

Giving Digital Transformation an Enterprise Architecture EDGE

The job of enterprise architecture is to provide the tools and insights the C-suite and other business stakeholders need to deploy strategies for business transformation.

Let’s say the CEO has a brilliant idea and wants to test it. This is EA’s sweet spot and opportunity to shine. And this is where erwin lives by providing an easy, automated way to deliver collaboration, speed and responsiveness.

erwin is about providing the right information to the right people at the right time. We are focused on empowering the forward-thinking enterprise architect by providing:

  • Superb, near real-time understanding of information
  • Excellent, intuitive collaboration
  • Dynamic, interactive dashboards (vertical and horizontal)
  • Actual, realistic, business-oriented insights
  • Assessment, planning and implementation support

Data-Driven Business Transformation


Choosing the Right Data Modeling Tool

The need for an effective data modeling tool is more significant than ever.

For decades, data modeling has provided the optimal way to design and deploy new relational databases with high-quality data sources and support application development. But it provides even greater value for modern enterprises where critical data exists in both structured and unstructured formats and lives both on premise and in the cloud.

In today’s hyper-competitive, data-driven business landscape, organizations are awash with data and the applications, databases and schema required to manage it.

For example, an organization may have 300 applications, with 50 different databases and a different schema for each. Additional challenges, such as increasing regulatory pressures – from the General Data Protection Regulation (GDPR) to the Health Insurance Portability and Accountability Act (HIPAA) – and growing stores of unstructured data also underscore the increasing importance of a data modeling tool.

Data modeling, quite simply, describes the process of discovering, analyzing, representing and communicating data requirements in a precise form called the data model. There’s an expression: measure twice, cut once. Data modeling is the upfront “measuring tool” that helps organizations reduce time and avoid guesswork in a low-cost environment.

From a business-outcome perspective, a data modeling tool is used to help organizations:

  • Effectively manage and govern massive volumes of data
  • Consolidate and build applications with hybrid architectures, including traditional, Big Data, cloud and on premise
  • Support expanding regulatory requirements, such as GDPR and the California Consumer Privacy Act (CCPA)
  • Simplify collaboration across key roles and improve information alignment
  • Improve business processes for operational efficiency and compliance
  • Empower employees with self-service access for enterprise data capability, fluency and accountability

Data Modeling Tool

Evaluating a Data Modeling Tool – Key Features

Organizations seeking to invest in a new data modeling tool should consider these four key features.

  1. Ability to visualize business and technical database structures through an integrated, graphical model.

Given the number of database platforms available, it’s important that an organization’s data modeling tool supports a sufficient array of platforms for your organization. The chosen data modeling tool should be able to read the technical formats of each of these platforms and translate them into highly graphical models rich in metadata. Schema can be deployed from models in an automated fashion and iteratively updated so that new development can take place via model-driven design.

  2. Empowerment of end-user BI/analytics through data source discovery, analysis and integration.

A data modeling tool should give business users confidence in the information they use to make decisions. Such confidence comes from the ability to provide a common, contextual, easily accessible source of data element definitions to ensure they are able to draw upon the correct data; understand what it represents, including where it comes from; and know how it’s connected to other entities.

A data modeling tool can also be used to pull in data sources via self-service BI and analytics dashboards. The data modeling tool should also have the ability to integrate its models into whatever format is required for downstream consumption.

  3. The ability to store business definitions and data-centric business rules in the model along with technical database schemas, procedures and other information.

With business definitions and rules on board, technical implementations can be better aligned with the needs of the organization. Using an advanced design layer architecture, model “layers” can be created with one or more models focused on the business requirements that then can be linked to one or more database implementations. Design-layer metadata can also be connected from conceptual through logical to physical data models.

  4. The ability to rationalize platform inconsistencies and deliver a single source of truth for all enterprise business data.

Many organizations struggle to break down data silos and unify data into a single source of truth, due in large part to varying data sources and difficulty managing unstructured data. Being able to model any data from anywhere accounts for this with on-demand modeling for non-relational databases that offer speed, horizontal scalability and other real-time application advantages.

With NoSQL support, model structures from non-relational databases, such as Couchbase and MongoDB, can be created automatically. Existing Couchbase and MongoDB data sources can be easily discovered, understood and documented through modeling and visualization. Existing entity-relationship diagrams and SQL databases can be migrated to Couchbase and MongoDB too, and relational schema can be transformed into query-optimized NoSQL constructs.
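The discovery step for document databases boils down to sampling documents and inferring a union of their fields and types. A simplified sketch over plain dictionaries standing in for Couchbase or MongoDB documents (no database driver needed):

```python
def infer_schema(documents):
    """Union of field names and observed value types across sampled documents."""
    schema = {}
    for doc in documents:
        for key, value in doc.items():
            schema.setdefault(key, set()).add(type(value).__name__)
    return schema

# Sampled documents; fields may be present in some documents and absent in others.
samples = [
    {"_id": 1, "name": "Ada", "tags": ["vip"]},
    {"_id": 2, "name": "Bob", "age": 41},
]
schema = infer_schema(samples)
```

The inferred schema can then seed an entity-relationship model of an otherwise schemaless data source.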

Other considerations include the ability to:

  • Compare models and databases.
  • Increase enterprise collaboration.
  • Perform impact analysis.
  • Enable business and IT infrastructure interoperability.

When it comes to data modeling, no one knows it better. For more than 30 years, erwin Data Modeler has been the market leader. It is built on the vision and experience of data modelers worldwide and is the de facto standard in data model integration.

You can learn more about driving business value and underpinning governance with erwin DM in this free white paper.

Data Modeling Drives Business Value


Enterprise Architecture and Business Process: Common Goals Require Common Tools

For decades now, the professional world has put a great deal of energy into discussing the gulf that exists between business and IT teams within organizations.

They speak different languages, it’s been said, and work toward different goals. Technology plans don’t seem to account for the reality of the business, and business plans don’t account for the capabilities of the technology.

Data governance is one area where business and IT never seemed to establish ownership. Early attempts at data governance treated the idea as a game of volleyball, passing ownership back and forth, with one team responsible for storing data and running applications, and one responsible for using the data for business outcomes.

Today, we see ample evidence this gap is closing at many organizations. Consider:

  • Many technology platforms and software applications now are designed for business users. Business intelligence is a prime example; it’s rare today to see IT pros have to run reports for business users thanks to self-service.
  • Many workers, especially those that came of age surrounded by technology, have a better understanding of both the business and technology that runs their organizations. Education programs also have evolved to help students develop a background in both business and technology.
  • There’s more portability in roles, with technology minds moving to business leadership positions and vice versa.

“The business domain has always existed in enterprise architecture,” says Manuel Ponchaux, director of product management at erwin, Inc. “However, enterprise architecture has traditionally been an IT function with a prime focus on IT. We are now seeing a shift with a greater focus on business outcomes.”

You can see evidence of this blended focus in some of the titles, like “business architect,” being bestowed upon what was traditionally an IT function. These titles demonstrate an understanding that technology cannot exist in the modern organization for the sake of technology alone – technology needs to support the business and its customers. This concept is also a major focus of the digital transformation wave that’s washing over the business world, and thus we see it reflected in job titles that simply didn’t exist a decade ago.

Job titles aside, enterprise architecture (EA) and business process (BP) teams still have different goals, though at many organizations they now work more closely together than they did in the past. Today, both EA and BP teams recognize that their common goal is better business outcomes. Along the way to that goal, each team conducts a number of similar tasks.

Enterprise Architecture and Business Process: Better Together

One prominent example is modeling. Both enterprise architecture and business process teams do modeling, but they do it in different ways at different levels, and they often use different data and tools. This lack of coordination and communication makes it difficult to develop a true sense of a process from the IT and business sides of the equation. It can also lead to duplication of efforts, which is inefficient and likely to add further confusion when trying to understand outcomes.

Building better business outcomes is like following a plan at a construction site. If different teams are making their own decisions about the materials they’re going to use and following their own blueprints, you’re unlikely to see the building you expect to see at the end of the job.

And that’s essentially what is missing at many organizations: a common repository with role-based views, interfaces and dashboards so that enterprise architecture and business process teams can truly work together using the same blueprint. When enterprise architecture and business process teams use common tools that both aid collaboration and help them understand the elements most important to their roles, the result is greater accuracy, increased efficiency and improved outcomes.

erwin’s enterprise architecture and business process tools provide the common repository and role-based views that help these teams work collaboratively toward their common goals. Finally, enterprise architecture and business process can be on the same page.

Business Process Modeling Use Cases


Digital Transformation In Retail: The Retail Apocalypse

Much like the hospitality industry, digital transformation in retail has been a huge driver of change.

One important fact is getting lost among all of the talk of “the retail apocalypse” and myriad stories about increasingly empty shopping malls: there’s a lot of money to be made in retail. In fact, the retail market was expected to grow by more than 3 percent in 2018, unemployment is low, and wages are at least stable.

In short, there’s money to be spent. Now, where are shoppers spending it?

Coming into 2019, consumers are in control when it comes to retail. Choices are abundant. According to Deloitte’s 2018 Retail, Wholesale and Distribution Industry Trends Outlook, “consumers have been conditioned to expect fast, convenient and effortless consumption.”

This is arguably the result of the degree of digital transformation in retail that we’ve seen in recent years.

If you want to survive in retail today, you need to make it easy on your customers. That means meeting their needs across channels, fulfilling orders quickly and accurately, offering competitive prices, and not sacrificing quality in the process.

Even in a world where Amazon has changed the retail game, Walmart just announced that it had its best holiday season in years. According to a recent Fortune article, “Walmart’s e-commerce sales rose 43 percent during the quarter, belying another myth: e-commerce and store sales are in competition with each other.”

Retail has always been a very fickle industry, with the right product mix and the right appeal to the right customers being crucial to success. But digital transformation in retail has seen the map change. You’re no longer competing with the store across the street; you’re competing with the store across the globe.

Digital Transformation In Retail

Retailers are putting every aspect of their businesses under scrutiny to help them remain relevant. Four areas in particular are getting a great deal of attention:

Customer experience: In today’s need-it-fast, need-it-now, need-it-right world, customers expect the ability to make purchases where they are, not where you are. That means via the Web, mobile devices or in a store. And all of the information about those orders needs to be tied together, so that if there is a problem, it can be resolved quickly via any channel.

Competitive differentiation: Appealing to retail customers used to mean appealing to all of your customers as one group or like-minded block. But customers are individuals, and today they can be targeted with personalized messaging and products that are likely to appeal to them, not to everyone.

Supply chain: Having the right products in the right place at the right time is part of the supply chain strategy. But moving them efficiently and cost effectively from any number of suppliers to warehouses and stores can make or break margins.

Partnerships: Among the smaller players in the retail space, partnerships with industry giants like Amazon can help reach a global audience that simply isn’t otherwise available and also reduce complexity. Larger players also recognize that partnerships can be mutually beneficial in the retail space.

Enabling each of these strategies is data – and lots of it. Data is the key to recognizing customers, personalizing experiences, making helpful recommendations, ensuring items are in stock, tracking deliveries and more. At its core, this is what digital transformation in retail seeks to achieve.

Digital Transformation in Retail – What’s the Risk?

But if data is the great enabler in retail, it’s also a huge risk – risk that the data is wrong, that it is old, and that it ends up in the hands of some person or entity that isn’t supposed to have it.

Danny Sandwell, director of product marketing for erwin, Inc., says retailers need to achieve a level of what he calls “data intelligence.” A little like business intelligence, Sandwell uses the term to mean that when someone in retail uses data to make a decision or power an experience or send a recommendation, they have the ability to find out anything they need about that data, including its source, age, who can access it, which applications use it, and more.

Given all of the data that flows into the modern retailer, this level of data intelligence requires a holistic, mature and well-planned data governance strategy. Data governance doesn’t just sit in the data warehouse, it’s woven into business processes and enterprise architecture to provide data visibility for fast, accurate decision-making, help keep data secure, identify problems early, and alert users to things that are working.

How important is clean, accurate, timely data in retail? Apply it to the four areas discussed above:

Customer experience:  If your data shows a lot of abandoned carts from mobile app users, then that’s an area to investigate, and good data will identify it.

Competitive differentiation: Are personalized offers increasing sales and creating customer loyalty? This is an important data point for marketing strategy.

Supply chain: Can a problem with quality be related to items shipping from a certain warehouse? Data will zero in on the location of the problem.

Partnerships: Are your partnerships helping grow other parts of your business and creating new customers? Or are your existing customers using partners in place of visiting your store? Data can tell you.

Try drawing these conclusions without data. You can’t. And even worse, try drawing them with inaccurate data and see what happens when a partnership that was creating customers is ended or mobile app purchases plummet after an ill-advised change to the experience.

If you want to focus on margins in retail, don’t forget this one: there is no margin for error.

Over the next few weeks, we’ll be looking closely at digital transformation examples in other sectors, including hospitality and government. Subscribe to stay in the loop.



Five Benefits of an Automation Framework for Data Governance

Organizations are responsible for governing more data than ever before, making a strong automation framework a necessity. But what exactly is an automation framework and why does it matter?

In most companies, an incredible amount of data flows from multiple sources in a variety of formats and is constantly being moved and federated across a changing system landscape.

Often these enterprises are heavily regulated, so they need a well-defined data integration model that helps avoid data discrepancies and removes barriers to enterprise business intelligence and other meaningful use.

IT teams need the ability to smoothly generate hundreds of mappings and ETL jobs. They need their data mappings to fall under governance and audit controls, with instant access to dynamic impact analysis and lineage.

With an automation framework, data professionals can meet these needs at a fraction of the cost of the traditional manual way.

In data governance terms, an automation framework refers to a metadata-driven universal code generator that works hand in hand with enterprise data mapping for:

  • Pre-ETL enterprise data mapping
  • Governing metadata
  • Governing and versioning source-to-target mappings throughout the lifecycle
  • Data lineage, impact analysis and business rules repositories
  • Automated code generation
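To make the idea concrete, here is a minimal, hypothetical Python sketch of metadata-driven code generation: a source-to-target mapping drives the SQL that is emitted, so no integration code is written by hand. The mapping format, field names and template are invented for illustration, not a real product's.

```python
# Hypothetical sketch of metadata-driven code generation: the source-to-target
# mapping (the metadata) drives the SQL that moves the data, so the code is
# generated rather than hand-written.

def generate_insert_select(mapping):
    """Render an INSERT ... SELECT statement from a source-to-target mapping."""
    select_exprs = []
    target_cols = []
    for rule in mapping["rules"]:
        # Apply an optional transformation expression; default is a pass-through.
        expr = rule.get("transform", rule["source_column"])
        select_exprs.append(f"{expr} AS {rule['target_column']}")
        target_cols.append(rule["target_column"])
    return (
        f"INSERT INTO {mapping['target_table']} ({', '.join(target_cols)})\n"
        f"SELECT {', '.join(select_exprs)}\n"
        f"FROM {mapping['source_table']};"
    )

mapping = {
    "source_table": "stg_orders",
    "target_table": "dw_orders",
    "rules": [
        {"source_column": "order_id", "target_column": "order_key"},
        {"source_column": "amount", "target_column": "order_amount",
         "transform": "ROUND(amount, 2)"},
    ],
}

sql = generate_insert_select(mapping)
print(sql)
```

Because the mapping itself is versioned metadata, regenerating the code after a mapping change is a single step rather than a manual rewrite.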

Such automation enables organizations to bypass bottlenecks, including human error and the time required to complete these tasks manually.

In fact, being able to rely on automated and repeatable processes can result in up to 50 percent in design savings, up to 70 percent conversion savings and up to 70 percent acceleration in total project delivery.

So without further ado, here are the five key benefits of an automation framework for data governance.


Benefits of an Automation Framework for Data Governance

  1. Creates simplicity, reliability, consistency and customization for the integrated development environment.

Code automation templates (CATs) can be created – for virtually any process and any tech platform – using the SDK scripting language or the solution’s published libraries to completely automate common, manual data integration tasks.

CATs are designed and developed by senior automation experts to ensure they are compliant with industry or corporate standards as well as with an organization’s best practice and design standards.

The 100-percent metadata-driven approach is critical to creating reliable and consistent CATs.

Using standard or custom adapters and connectors for databases, ERP systems, cloud environments, files, data modeling tools, BI reports and Big Data, it is possible to scan, pull in and configure metadata sources and targets to document data catalogs, data mappings, ETL (XML code) and even SQL procedures of any type.
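As a small illustration of metadata harvesting through an adapter, the sketch below reads column-level metadata from an in-memory SQLite database via its `PRAGMA table_info` interface. Production tools ship adapters for many platforms; this toy example only shows the principle: read the schema, not the data.

```python
import sqlite3

# Minimal sketch of metadata harvesting through a database adapter, here
# against an in-memory SQLite database. The table is invented for the example.

def harvest_table_metadata(conn, table):
    """Return column-level metadata (name, type, nullability) for one table."""
    cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
    # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
    return [
        {"column": name, "type": ctype, "nullable": not notnull}
        for _cid, name, ctype, notnull, _default, _pk in cols
    ]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER NOT NULL, email TEXT)")
catalog = {"customers": harvest_table_metadata(conn, "customers")}
print(catalog)
```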

  2. Provides blueprints anyone in the organization can use.

With these blueprints, teams can stage DDL from source metadata for the target DBMS; profile and test SQL to automate testing of data integration projects; and generate source-to-target mappings and ETL jobs for leading ETL tools, among other capabilities.
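A simplified sketch of the DDL-staging step, assuming a hypothetical type map between harvested source metadata and the target DBMS (the mapping, names and template are illustrative, not a real product's):

```python
# Hypothetical sketch of staging DDL generation: column metadata harvested
# from a source system is rendered as a CREATE TABLE statement for the target
# DBMS. The source-to-target type map is invented for illustration.

TYPE_MAP = {"int": "INTEGER", "string": "VARCHAR(255)", "decimal": "NUMERIC(18,2)"}

def stage_ddl(table, columns):
    """Render CREATE TABLE DDL for a staging table from column metadata."""
    lines = [
        f"    {c['name']} {TYPE_MAP[c['type']]}{'' if c['nullable'] else ' NOT NULL'}"
        for c in columns
    ]
    return f"CREATE TABLE stg_{table} (\n" + ",\n".join(lines) + "\n);"

ddl = stage_ddl("orders", [
    {"name": "order_id", "type": "int", "nullable": False},
    {"name": "amount", "type": "decimal", "nullable": True},
])
print(ddl)
```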

It also can populate and maintain Big Data sets by generating Pig, Sqoop, MapReduce, Spark and Python scripts, among others.

  3. Incorporates data governance into the system development process.

An organization can achieve a more comprehensive and sustainable data governance initiative than it ever could with a homegrown solution.

An automation framework’s ability to automatically create, version, manage and document source-to-target mappings greatly matters both to data governance maturity and a shorter-time-to-value.

This eliminates duplication that occurs when project teams are siloed, as well as prevents the loss of knowledge capital due to employee attrition.

Another valuable capability is coordination between data governance and the SDLC, including automated metadata harvesting and cataloging from a wide array of sources for real-time metadata synchronization with core data governance capabilities and artifacts.

  4. Proves the value of data lineage and impact analysis for governance and risk assessment.

Automated reverse-engineering of ETL code into natural language enables a more intuitive lineage view for data governance.

With end-to-end lineage, it is possible to view data movement from source to stage, stage to EDW, and on to a federation of marts and reporting structures, providing a comprehensive and detailed view of data in motion.

The process includes leveraging existing mapping documentation and auto-documented mappings to quickly render graphical source-to-target lineage views including transformation logic that can be shared across the enterprise.

Similarly, impact analysis – which involves data mapping and lineage across tables, columns, systems, business rules, projects, mappings and ETL processes – provides insight into potential data risks and enables fast and thorough remediation when needed.
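The mechanics of impact analysis can be sketched as a reachability query over a lineage graph. The following is a minimal, hypothetical Python sketch (table and column names invented), not a vendor implementation:

```python
from collections import deque

# Minimal sketch of impact analysis: lineage is a directed graph from source
# columns to the columns derived from them, and the impact of a change is
# every node reachable downstream. All names are illustrative.

lineage = {
    "src.orders.amount": ["stg.orders.amount"],
    "stg.orders.amount": ["edw.fact_sales.revenue"],
    "edw.fact_sales.revenue": ["mart.sales_report.total_revenue"],
}

def impacted(node):
    """Return every downstream column affected by a change to `node` (BFS)."""
    seen, queue = set(), deque([node])
    while queue:
        for child in lineage.get(queue.popleft(), []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return seen

print(impacted("src.orders.amount"))
```

Running the query in the other direction (parents instead of children) gives lineage tracing: where a reported figure originally came from.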

Conducting impact analysis across the organization while meeting industry regulators' compliance requirements demands detailed data mapping and lineage.

The Regulatory Rationale for Integrating Data Management & Data Governance

  5. Supports a wide spectrum of business needs.

Intelligent automation delivers enhanced capability, increased efficiency and effective collaboration to every stakeholder in the data value chain: data stewards, architects, scientists and analysts, as well as business intelligence developers, IT professionals and business consumers.

It makes it easier for them to handle jobs such as data warehousing by leveraging source-to-target mapping and ETL code generation and job standardization.

It’s easier to map, move and test data for regular maintenance of existing structures, movement from legacy systems to new systems during a merger or acquisition, or a modernization effort.

erwin’s Approach to Automation for Data Governance: The erwin Automation Framework

Mature and sustainable data governance requires collaboration from both IT and the business, backed by a technology platform that accelerates the time to data intelligence.

Part of the erwin EDGE portfolio for an “enterprise data governance experience,” the erwin Automation Framework transforms enterprise data into accurate and actionable insights by connecting all the pieces of the data management and data governance lifecycle.

As with all erwin solutions, it embraces any data from anywhere (Any2), with automation for relational, unstructured, on-premises and cloud-based data assets, and data movement specifications harvested and coupled with CATs.

If your organization would like to realize all the benefits explained above – and gain an “edge” in how it approaches data governance, you can start by joining one of our weekly demos for erwin Mapping Manager.



Solving the Enterprise Data Dilemma

Due to the adoption of data-driven business, organizations across the board are facing their own enterprise data dilemmas.

This week erwin announced its acquisition of metadata management and data governance provider AnalytiX DS. The combined company touches every piece of the data management and governance lifecycle, enabling enterprises to fuel automated, high-quality data pipelines for faster speed to accurate, actionable insights.

Why Is This a Big Deal?

From digital transformation to AI, and everything in between, organizations are flooded with data. So, companies are investing heavily in initiatives to use all the data at their disposal, but they face some challenges. Chiefly, deriving meaningful insights from their data – and turning them into actions that improve the bottom line.

This enterprise data dilemma stems from three important but difficult questions to answer: What data do we have? Where is it? And how do we get value from it?

Large enterprises use thousands of unharvested, undocumented databases, applications, ETL processes and procedural code that make it difficult to gather business intelligence, conduct IT audits, and ensure regulatory compliance – not to mention accomplish other objectives around customer satisfaction, revenue growth and overall efficiency and decision-making.

The lack of visibility and control around “data at rest” combined with “data in motion”, as well as difficulties with legacy architectures, means these organizations spend more time finding the data they need rather than using it to produce meaningful business outcomes.

To remedy this, enterprises need smarter and faster data management and data governance capabilities, including the ability to efficiently catalog and document their systems, processes and the associated data without errors. In addition, business and IT must collaborate outside their traditional operational silos.

But this coveted state of data nirvana isn’t possible without the right approach and technology platform.

Enterprise Data: Making the Data Management-Data Governance Love Connection


Bringing together data management and data governance delivers greater efficiencies to technical users and better analytics to business users. It’s like two sides of the same coin:

  • Data management drives the design, deployment and operation of systems that deliver operational and analytical data assets.
  • Data governance delivers these data assets within a business context, tracks their physical existence and lineage, and maximizes their security, quality and value.

Although these disciplines approach data from different perspectives and are used to produce different outcomes, they have a lot in common. Both require a real-time, accurate picture of an organization’s data landscape, including data at rest in data warehouses and data lakes and data in motion as it is integrated with and used by key applications.

However, creating and maintaining this metadata landscape is challenging because this data in its various forms and from numerous sources was never designed to work in concert. Data infrastructures have been cobbled together over time with disparate technologies, poor documentation and little thought for downstream integration, so the applications and initiatives that depend on data infrastructure are often out-of-date and inaccurate, rendering faulty insights and analyses.

Organizations need to know what data they have, where it's located, where it came from and how it got there, and what it means in common business terms. They also must be able to transform it into useful information they can act on, all while controlling access to it.

To support the total enterprise data management and governance lifecycle, they need an automated, real-time, high-quality data pipeline. Then every stakeholder – data scientist, ETL developer, enterprise architect, business analyst, compliance officer, CDO and CEO – can fuel the desired outcomes with reliable information on which to base strategic decisions.

Enterprise Data: Creating Your “EDGE”

At the end of the day, all industries are in the data business and all employees are data people. The success of an organization is not measured by how much data it has, but by how well it’s used.

Data governance enables organizations to use their data to fuel compliance, innovation and transformation initiatives with greater agility, efficiency and cost-effectiveness.

Organizations need to understand their data from different perspectives: identify how it flows through and impacts the business, align this business view with a technical view of the data management infrastructure, and synchronize efforts across both disciplines for accuracy, agility and efficiency in building a data capability that impacts the business in a meaningful and sustainable fashion.

The persona-based erwin EDGE creates an “enterprise data governance experience” that facilitates collaboration between both IT and the business to discover, understand and unlock the value of data both at rest and in motion.

By bringing together enterprise architecture, business process, data mapping and data modeling, erwin’s approach to data governance enables organizations to get a handle on how they handle their data. With the broadest set of metadata connectors and automated code generation, data mapping and cataloging tools, the erwin EDGE Platform simplifies the total data management and data governance lifecycle.

This single, integrated solution makes it possible to gather business intelligence, conduct IT audits, ensure regulatory compliance and accomplish any other organizational objective by fueling an automated, high-quality and real-time data pipeline.

With the erwin EDGE, data management and data governance are unified and mutually supportive, with one hand aware and informed by the efforts of the other to:

  • Discover data: Identify and integrate metadata from various data management silos.
  • Harvest data: Automate the collection of metadata from various data management silos and consolidate it into a single source.
  • Structure data: Connect physical metadata to specific business terms and definitions and reusable design standards.
  • Analyze data: Understand how data relates to the business and what attributes it has.
  • Map data flows: Identify where to integrate data and track how it moves and transforms.
  • Govern data: Develop a governance model to manage standards and policies and set best practices.
  • Socialize data: Enable stakeholders to see data in one place and in the context of their roles.
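As a toy illustration of the "structure data" step in the list above, the sketch below ties physical columns to business glossary terms so the technical and business views stay linked. All table, column and glossary names are invented for the example.

```python
# Illustrative sketch of connecting physical metadata to business terms.
# A column registered without a term is flagged as lacking a business
# definition, which is a common data governance gap to surface.

glossary = {
    "Customer Email": "The email address a customer registered with.",
}

catalog = []

def register(table, column, business_term=None):
    """Catalog a physical column and, optionally, tie it to a glossary term."""
    if business_term is not None and business_term not in glossary:
        raise ValueError(f"Unknown business term: {business_term}")
    entry = {"table": table, "column": column, "term": business_term}
    catalog.append(entry)
    return entry

register("crm.customers", "email_addr", business_term="Customer Email")
register("crm.customers", "created_at")  # not yet tied to a term
ungoverned = [e for e in catalog if e["term"] is None]
print(f"{len(ungoverned)} column(s) lack a business definition")
```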

An integrated solution with data preparation, modeling and governance helps businesses reach data governance maturity: a role-based, collaborative data governance system that serves both IT and business users equally. Such maturity may not happen overnight, but it will ultimately deliver the accurate and actionable insights your organization needs to compete and win.

Your journey to data nirvana begins with a demo of the enhanced erwin Data Governance solution. Register now.



Data Discovery Fire Drill: Why Isn’t My Executive Business Intelligence Report Correct?

Executive business intelligence (BI) reporting can be incomplete, inconsistent and/or inaccurate, becoming a critical concern for the executive management team trying to make informed business decisions. When issues arise, it is up to the IT department to figure out what the problem is, where it occurred, and how to fix it. This is not a trivial task.

Take the following scenario in which a CEO receives two reports supposedly from the same set of data, but each report shows different results. Which report is correct?  If this is something your organization has experienced, then you know what happens next – the data discovery fire drill.

A flurry of activity takes place, suspending all other top priorities. A special team is quickly assembled to delve into each report. They review the data sources, ETL processes and data marts in an effort to trace the events that affected the data. Fire drills like this can consume days, if not weeks, of effort to locate the error.

In the above situation it turns out there was a new update to one ETL process that was implemented in only one report. When you multiply the number of data discovery fire drills by the number of data quality concerns for any executive business intelligence report, the costs continue to mount.

Data can arrive from multiple systems at the same time, often occurring rapidly and in parallel. In some cases, the ETL load itself may generate new data. Through all of this, IT still has to answer two fundamental questions: where did this data come from, and how did it get here?

Accurate Executive Business Intelligence Reporting Requires Data Governance

As the volume of data rapidly increases, BI data environments are becoming more complex. To manage this complexity, organizations invest in a multitude of elaborate and expensive tools. But despite this investment, IT is still overwhelmed trying to track the vast collection of data within their BI environment. Is more technology the answer?

Perhaps the better question we should look to answer is: how can we avoid these data discovery fires in the future?

We believe it’s possible to prevent data discovery fires, and that starts with proper data governance and a strong data lineage capability.


Why is data governance important?

  • Governed data promotes data sharing.
  • Data standards make data more reusable.
  • Greater context in data definitions assists in more accurate analytics.
  • A clear set of data policies and procedures support data security.

Why is data lineage important?

  • Data trust is built by establishing its origins.
  • The troubleshooting process is simplified by enabling data to be traced.
  • The risk of ETL data loss is reduced by exposing potential problems in the process.
  • Business rules, which otherwise would be buried in an ETL process, are visible.

Data Governance Enables Data-Driven Business

In the context of modern, data-driven business in which organizations are essentially production lines of information – data governance is responsible for the health and maintenance of said production line.

It’s the enabling factor of the enterprise data management suite that ensures data quality,  so organizations can have greater trust in their data. It ensures that any data created is properly stored, tagged and assigned the context needed to prevent corruption or loss as it moves through the production line – greatly enhancing data discovery.

Alongside improving data quality, aiding in regulatory compliance, and making practices like tracing data lineage easier, sound data governance also helps organizations be proactive with their data, using it to drive revenue. They can make better decisions faster and reduce the likelihood of costly mistakes and data breaches that would eat into their bottom lines.

For more information about how data governance supports executive business intelligence and the rest of the enterprise data management suite, click here.



A New Wave in Application Development

Application development is new again.

The ever-changing business landscape – fueled by digital transformation initiatives indiscriminate of industry – demands businesses deliver innovative customer- and partner-facing solutions, not just tactical apps to support internal functions.

Therefore, application developers are playing an increasingly important role in achieving business goals. The financial services sector is a notable example, with companies like JPMorgan Chase spending millions on emerging fintech like online and mobile tools for opening accounts and completing transactions, real-time stock portfolio values, and electronic trading and cash management services.

But businesses are finding that creating market-differentiating applications to improve the customer experience, and subsequently customer satisfaction, requires some significant adjustments. For example, using non-relational database technologies, building another level of development expertise, and driving optimal data performance will be on their agendas.

Of course, all of this must be done with a focus on data governance – backed by data modeling – as the guiding principle for accurate, real-time analytics and business intelligence (BI).

Evolving Application Development Requirements

The development organization must identify which systems, processes and even jobs must evolve to meet demand. The factors it will consider include agile development, skills transformation and faster querying.

Rapid delivery is the rule, with products released in usable increments across sprints as part of ongoing, iterative development. Developers can move from conceptual models that define high-level requirements straight to low-level physical data models incorporated directly into the application logic. This route supports dynamic change, driving speedy baselining, fast-track sprint development cycles and quick application scaling. Logical modeling then follows.


Agile application development usually goes hand in hand with NoSQL databases, which let developers take advantage of more pliable data models. This technology offers more dynamic and flexible schema design than relational databases; supports whatever data types and query options an application requires; and delivers the processing efficiency, scalability and performance that Big Data and new-age apps' real-time requirements demand. However, NoSQL skills aren't yet widespread, so tools built specifically for modeling unstructured data in NoSQL databases can help staff used to RDBMSs ramp up.

Finally, the shift to agile development and NoSQL technology as part of more complex data architectures is driving another shift. Storage-optimized models are moving to the sidelines because a new format is available to support real-time app development: one that understands what's being asked of the data and enables schemas to be structured to support application data access requirements, delivering speedy responses to complex queries.
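The query-optimized idea can be sketched in a few lines. This is an illustrative Python sketch, not any product's schema: the document embeds the line items a common query needs, so a single read answers the question that a storage-optimized design would answer with a join.

```python
# Hypothetical query-optimized document model: line items are embedded in
# the order document so the "order total" query is a single document read.
# All field names and values are invented for the example.

orders = [
    {
        "_id": "ord-1001",
        "customer": {"id": "c-42", "name": "Ada"},
        # Embedded where a storage-optimized (normalized) design would join.
        "lines": [
            {"sku": "A1", "qty": 2, "price": 9.99},
            {"sku": "B7", "qty": 1, "price": 24.50},
        ],
    },
]

def order_total(order):
    """Answer 'what is the order total?' from a single document, no joins."""
    return round(sum(line["qty"] * line["price"] for line in order["lines"]), 2)

print(order_total(orders[0]))
```

The trade-off is deliberate: data is duplicated and shaped around the application's access patterns instead of normalized for storage.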

The NoSQL Paradigm

erwin DM NoSQL takes into account all the requirements for the new application development era. In addition to its modeling tools, the solution includes patent-pending Query-Optimized Modeling™ that replaces storage-optimized modeling, giving users guidance to build schemas for optimal performance for NoSQL applications.

erwin DM NoSQL also embraces an “any-squared” approach to data management, so “any data” from “anywhere” can be visualized for greater understanding. And the solution now supports the Couchbase Data Platform in addition to MongoDB. Used in conjunction with erwin DG, businesses also can be assured that agility, speed and flexibility will not take precedence over the equally important need to stringently manage data.

With all this in place, enterprises will be positioned to deliver unique, real-time and responsive apps to enhance the customer experience and support new digital-transformation opportunities. At the same time, they’ll be able to preserve and extend the work they’ve already done in terms of maintaining well-governed data assets.

For more information about how to realize value from app development in the age of digital transformation with the help of data modeling and data governance, you can download our new e-book: Application Development Is New Again.