Business Process Modeling Use Cases and Definition

What is business process modeling (BPM)? A visual representation of what your business does and how it does it. Why is having this picture important?

According to Gartner, BPM links business strategy to IT systems development to ensure business value. It also combines process/workflow, functional, organizational and data/resource views with underlying metrics such as costs, cycle times and responsibilities to provide a foundation for analyzing value chains, activity-based costs, bottlenecks, critical paths and inefficiencies.

Every organization—particularly those operating in industries where quality, regulatory, health, safety or environmental issues are a concern—must have a complete understanding of its processes. Equally important, employees must fully comprehend and be accountable for appropriately carrying out the processes for which they are responsible.

BPM gives organizations an easily digestible visualization of their systems and the associated information, making it easier to be agile and responsive to changes in markets and consumer demands.

This is because the visualization process galvanizes an organization’s ability to identify areas of improvement, potential innovation and necessary reorganization.

But a theoretical understanding of business process modeling will only get you so far. The following use cases demonstrate the benefits of business process modeling in real life.

Business process modeling (BPM) is a practice that helps organizations understand how their strategy relates to their IT systems and system development.

Business Process Modeling Use Cases

Compliance:

Regulations like the E.U.’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) are requiring businesses across industries to think about their compliance efforts. Business process modeling helps organizations prove what they are doing to meet compliance requirements and understand how changes to their processes impact compliance efforts (and vice versa).

The visualization process can aid in an organization's ability to understand the security risks associated with a particular process. It also means that, should a breach occur, the organization's greater understanding of its processes and related systems allows it to respond with greater agility, mitigate the damage and quickly inform affected parties, as specifically required by GDPR.

In the case of an audit, BPM can be used to demonstrate that the organization is cognizant of compliance standards and is doing what is required.

This also extends to other industry-specific compliance mandates, such as those in the healthcare, pharmaceutical and financial services industries.

The Democratization of Information:

Increasing an organization's ability to retain knowledge is another cross-industry use case for business process modeling. This use case benefits organizations in two key areas:

1. Democratization of information.

By documenting processes, organizations can ensure that knowledge and information are de-siloed and that the organization as a whole can benefit from them. In this case, a key best practice to consider is the introduction of role/user-based access. This way an organization can ensure only the necessary parties can access such information, in keeping with compliance standards (a rough sketch of such an access check follows this list).

2. Knowledge retention.

By documenting processes and democratizing information, process-specific knowledge can be retained, even when key employees leave. This is particularly important in the case of an aging workforce, where an organization could suffer a “brain drain” as large numbers of employees retire during a short span of time.
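
To make the role-based access idea concrete, here is a minimal Python sketch of the kind of permission check a process documentation portal might apply. The roles, actions and permission map are hypothetical rather than a description of any particular tool.

```python
# Minimal sketch of role-based access to documented processes.
# Roles, actions and the permission map are hypothetical.

ROLE_PERMISSIONS = {
    "process_owner": {"read", "edit", "publish"},
    "compliance_auditor": {"read"},
    "general_staff": set(),  # no access to restricted process documentation
}

def can_access(role: str, action: str) -> bool:
    """Return True if the given role may perform the given action."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(can_access("compliance_auditor", "read"))  # True
print(can_access("general_staff", "edit"))       # False
```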

Digital Transformation:

Once in a while, a technological revolution turns the nature of business on its head. The most recent, and arguably most significant, is the rise of data-driven business.

In a relatively short amount of time, data-driven businesses launched and stormed their way to the forefront of their respective industries – think Amazon, Netflix and Uber.

The result? Data is now considered more valuable than oil and industries across the board are seeing digital transformation en masse.

There’s a clear connection between business process modeling and digital transformation initiatives. With it, an organization can explore models to understand information assets within a business context, from internal operations to full customer experiences.

This practice identifies and drives digital transformation opportunities to increase revenue while limiting risks and avoiding regulatory and compliance gaffes.

Organizations that leverage BPM in their digital transformation efforts can use their greater understanding of their current processes to make more informed decisions about future implementations.

And the use cases for business process modeling don’t stop there.

A better understanding of your organization's processes can also ease software deployments and make mergers and acquisitions (M&A) far easier to handle. Large organizations grow through M&A activity, and combining business processes, software applications and infrastructure when two organizations become one is very complex.

Business process modeling offers visibility into existing processes and helps design new processes that will deliver results in a post-merger environment.

The latest guide from the erwin Experts expands on these use cases and details how best to use business process modeling to tame your organization’s complexity and maximize its potential and profits.

What’s Business Process Modeling Got to Do with It? – Choosing A BPM Tool

With business process modeling (BPM) being a key component of data governance, choosing a BPM tool is a decision many businesses either face now or will face soon.

Historically, BPM didn’t necessarily have to be tied to an organization’s data governance initiative.

However, data-driven business and the regulations that oversee it are becoming increasingly extensive, so the need to view data governance as a collective effort – in terms of personnel and the tools that make up the strategy – is becoming harder to ignore.

Data governance also relies on business process modeling and analysis to drive improvement, including identifying business practices susceptible to security, compliance or other risks and adding controls to mitigate exposures.

Choosing a BPM Tool: An Overview

As part of a data governance strategy, a BPM tool aids organizations in visualizing their business processes, system interactions and organizational hierarchies to ensure elements are aligned and core operations are optimized.

The right BPM tool also helps organizations increase productivity, reduce errors and mitigate risks to achieve strategic objectives.

With insights from the BPM tool, you can clarify roles and responsibilities – which in turn should influence an organization's policies about data ownership and make data lineage easier to manage.

Organizations also can use a BPM tool to identify the staff who function as “unofficial data repositories.” This has both a primary and secondary function:

1. Organizations can document employee processes to ensure vital information isn’t lost should an employee choose to leave.

2. It is easier to identify areas where expertise may need to be bolstered.

Organizations that adopt a BPM tool also enjoy greater process efficiency. This is through a combination of improving existing processes or designing new process flows, eliminating unnecessary or contradictory steps, and documenting results in a shareable format that is easy to understand so the organization is pulling in one direction.

Silo Buster

Understanding the typical use cases for business process modeling is the first step. As with any tech investment, it’s important to understand how the technology will work in the context of your organization/business.

For example, it’s counter-productive to invest in a solution that reduces informational silos only to introduce a new technological silo through a lack of integration.

Ideally, organizations want a BPM tool that works in conjunction with the wider data management platform and data governance initiative – not one that works against them.

That means a solution that supports data imports and integrations from/with external sources, enables in-tool collaboration to reduce departmental silos and, most crucially, taps into a central metadata repository to ensure consistency across the entire data management and governance initiative.

The lack of a central metadata repository is a far too common thorn in an organization's side. Without one, organizations have to juggle multiple versions, as changes to the underlying data aren't automatically updated across the platform.

It also means organizations waste crucial time manually ensuring and maintaining data quality, when an automation framework could achieve the same goal instantaneously, without human error and with greater consistency.

A central metadata repository ensures an organization can acknowledge and get behind a single source of truth. This has a wealth of favorable consequences, including greater cohesion across the organization, better data quality and trust, and faster decision-making with fewer false starts due to plans based on misleading information.
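
As a rough sketch of why a central repository prevents version juggling (the class, key and table names below are invented for illustration), mappings can hold references into the repository rather than copies, so a single metadata update is immediately visible everywhere:

```python
# Minimal sketch: mappings hold keys into a central metadata repository
# instead of copies, so one update propagates everywhere.
# All class, key and table names are invented for illustration.

class MetadataRepository:
    def __init__(self):
        self._entries = {}  # key -> current definition

    def register(self, key: str, definition: str) -> None:
        self._entries[key] = definition  # re-registering updates in place

    def lookup(self, key: str) -> str:
        return self._entries[key]

class Mapping:
    """A source-to-target mapping that references the repository by key."""
    def __init__(self, repo: MetadataRepository, source_key: str, target_key: str):
        self.repo, self.source_key, self.target_key = repo, source_key, target_key

    def describe(self) -> str:
        return f"{self.repo.lookup(self.source_key)} -> {self.repo.lookup(self.target_key)}"

repo = MetadataRepository()
repo.register("crm.customer", "CRM customer table")
repo.register("dwh.dim_customer", "Warehouse customer dimension")
mapping = Mapping(repo, "crm.customer", "dwh.dim_customer")

repo.register("crm.customer", "CRM customer table (v2, new email column)")
print(mapping.describe())  # the one update is reflected in the mapping
```

The point of the design is that no mapping ever copies a definition, so there is exactly one place to update.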

Three Key Questions to Ask When Choosing a BPM Tool

Organizations in the market for a BPM tool should also consider the following:

1. Configurability: Does the tool support the ability to model and analyze business processes with links to data, applications and other aspects of your organization? And how easy is this to achieve?

2. Role-based views: Can the tool develop integrated business models for a single source of truth but with different views for different stakeholders based on their needs – making regulatory compliance more manageable? Does it enable cross-functional and enterprise collaboration through discussion threads, surveys and other social features?

3. Business and IT infrastructure interoperability: How well does the tool integrate with other key components of data governance including enterprise architecture, data modeling, data cataloging and data literacy? Can it aid in providing data intelligence to connect all the pieces of the data management and governance lifecycles?

For more information and to find out how such a solution can integrate with your organization and current data management and data governance initiatives, click here.

Data Mapping Tools: What Are the Key Differentiators

The need for data mapping tools in light of increasing volumes and varieties of data – as well as the velocity at which it must be processed – is growing.

It’s not difficult to see why either. Data mapping tools have always been a key asset for any organization looking to leverage data for insights.

Isolated units of data are essentially meaningless. By linking data and enabling its categorization in relation to other data units, data mapping provides the context vital for actionable information.

Now with the General Data Protection Regulation (GDPR) in effect, data mapping has become even more significant.

The scale of GDPR’s reach has set a new precedent and is the closest we’ve come to a global standard in terms of data regulations. The repercussions can be huge – just ask Google.

Data mapping tools are paramount in charting a path to compliance with this new, near-global standard and avoiding hefty fines.

Because of GDPR, organizations that may not have fully leveraged data mapping for proactive data-driven initiatives (e.g., analysis) are now adopting data mapping tools with compliance in mind.

Arguably, GDPR’s implementation can be viewed as an opportunity – a catalyst for digital transformation.

Those organizations investing in data mapping tools with compliance as the main driver will definitely want to consider this opportunity and have it influence their decision as to which data mapping tool to adopt.

With that in mind, it’s important to understand the key differentiators in data mapping tools and the associated benefits.

Data Mapping Tools: Automated or Manual?

In terms of differentiators for data mapping tools, perhaps the most distinct is automated data mapping versus data mapping via manual processes.

Data mapping tools that allow for automation mean organizations can benefit from in-depth, quality-assured data mapping, without the significant allocations of resources typically associated with such projects.

Eighty percent of data scientists' and other data professionals' time is spent on manual data maintenance. That's anything and everything from addressing errors and inconsistencies to trying to understand source data or track its lineage. This doesn't even account for the time lost to missed errors that contribute to inherently flawed endeavors.

Automated data mapping tools render such issues and concerns void. In turn, data professionals' time can be put to much better, proactive use, rather than being bogged down with reactive housekeeping tasks.

As well as introducing greater efficiency to the data governance process, automated data mapping tools enable data to be auto-documented from XML that builds mappings for the target repository or reporting structure.

Additionally, a tool that leverages and draws from a single metadata repository means that mappings are dynamically linked with underlying metadata to render automated lineage views, including full transformation logic in real time.

Therefore, changes (e.g., in the data catalog) will be reflected across data governance domains (business process, enterprise architecture and data modeling) as and when they’re made – no more juggling and maintaining multiple, out-of-date versions.

It also enables automatic impact analysis at the table and column level – even for business/transformation rules.

For organizations looking to free themselves from the burden of juggling multiple versions, siloed business processes and a disconnect between interdepartmental collaboration, this feature is a key benefit to consider.
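
A minimal Python sketch of that column-level impact analysis over source-to-target mappings (the mapping tuples are invented for illustration):

```python
# Minimal sketch of table/column-level impact analysis over
# source-to-target mappings. The mapping data is invented.

MAPPINGS = [
    # (source column, target column)
    ("crm.customer.email", "staging.customer.email"),
    ("staging.customer.email", "dwh.dim_customer.email_address"),
    ("dwh.dim_customer.email_address", "reports.churn.email_address"),
]

def downstream_impact(column: str) -> set:
    """Return every column fed, directly or indirectly, by the given column."""
    impacted, frontier = set(), {column}
    while frontier:
        frontier = {tgt for src, tgt in MAPPINGS if src in frontier} - impacted
        impacted |= frontier
    return impacted

# A change to the CRM email column impacts staging, warehouse and report layers:
print(downstream_impact("crm.customer.email"))
```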

Data Mapping Tools: Other Differentiators

In light of the aforementioned changes to data regulations, many organizations will need to consider the extent of a data mapping tool’s data lineage capabilities.

The ability to reverse-engineer and document the business logic from your reporting structures for true source-to-report lineage is key because it makes analysis (and the trust in said analysis) easier. And should a data breach occur, affected data/persons can be more quickly identified in accordance with GDPR.

Article 33 of GDPR requires organizations to notify the appropriate supervisory authority "without undue delay and, where feasible, not later than 72 hours" after discovering a breach.

As stated above, a data governance platform that draws from a single metadata source is even more advantageous here.

Mappings can be synchronized with metadata so that source or target metadata changes can be automatically pushed into the mappings – so your mappings stay up to date with little or no effort.

The Data Mapping Tool For Data-Driven Businesses

Nobody likes manual documentation. It’s arduous, error-prone and a waste of resources. Quite frankly, it’s dated.

Any organization looking to invest in data mapping, data preparation and/or data cataloging needs to make automation a priority.

With automated data mapping, organizations can achieve "true data intelligence": the ability to tell the story of how data enters the organization and changes throughout the entire lifecycle, up to the consumption/reporting layer. If you're working harder than your tool, you have the wrong tool.

The manual tools of old do not have auto documentation capabilities, cannot produce outbound code for multiple ETL or script types, and are a liability in terms of accuracy and GDPR.

Automated data mapping is the only path to true GDPR compliance, and erwin Mapping Manager can get you there in a matter of weeks thanks to our robust reverse-engineering technology. 

Learn more about erwin’s automation framework for data governance here.

Data Governance Stock Check: Using Data Governance to Take Stock of Your Data Assets

For regulatory compliance (e.g., GDPR) and to ensure peak business performance, organizations often bring consultants on board to help take stock of their data assets. This sort of data governance “stock check” is important but can be arduous without the right approach and technology. That’s where data governance comes in …

While most companies hold the lion’s share of operational data within relational databases, it also can live in many other places and various other formats. Therefore, organizations need the ability to manage any data from anywhere, what we call our “any-squared” (Any2) approach to data governance.

Any2 first requires an understanding of the '3Vs' of data – volume, variety and velocity – especially in the context of the data lifecycle, as well as knowing how to leverage the key capabilities of data governance – data cataloging, data literacy, business process, enterprise architecture and data modeling – that enable data to be leveraged at different stages for optimum security, quality and value.

Following are two examples that illustrate the data governance stock check, including the Any2 approach in action, based on real consulting engagements.

Data Governance “Stock Check” Case 1: The Data Broker

This client trades in information. Therefore, the organization needed to catalog the data it acquires from suppliers, ensure its quality, classify it, and then sell it to customers. The company wanted to assemble the data in a data warehouse and then provide controlled access to it.

The first step in helping this client involved taking stock of its existing data. We set up a portal so data assets could be registered via a form with basic questions, and then a central team received the registrations, reviewed and prioritized them. Entitlement attributes also were set up to identify and profile high-priority assets.

A number of best practices and technology solutions were used to establish the data required for managing the registration and classification of data feeds (a simplified sketch of this flow follows the list):

1. The underlying metadata is harvested followed by an initial quality check. Then the metadata is classified against a semantic model held in a business glossary.

2. After this classification, a second data quality check is performed based on the best-practice rules associated with the semantic model.

3. Profiled assets are loaded into a historical data store within the warehouse, with data governance tools generating its structure and data movement operations for data loading.

4. We developed a change management program to make all staff aware of the information brokerage portal and the importance of using it. It uses a catalog of data assets, all classified against a semantic model with data quality metrics to easily understand where data assets are located within the data warehouse.

5. Adopting this portal, where data is registered and classified against an ontology, enables the client’s customers to shop for data by asset or by meaning (e.g., “what data do you have on X topic?”) and then drill down through the taxonomy or across an ontology. Next, they raise a request to purchase the desired data.
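
Under the hood, steps 1 through 3 amount to a harvest, quality-check, classify and load pipeline. A simplified Python sketch, with placeholder checks and a toy semantic model rather than the client's actual implementation:

```python
# Simplified sketch of the registration pipeline described above:
# harvest metadata, check quality, classify against a semantic model,
# then load into the historical store. All rules and names are placeholders.

SEMANTIC_MODEL = {"email": "ContactPoint", "customer_id": "PartyIdentifier"}

def harvest_metadata(feed: dict) -> list:
    return feed["columns"]

def quality_check(columns: list) -> list:
    return [c for c in columns if c.strip()]  # e.g. drop unnamed columns

def classify(columns: list) -> dict:
    return {c: SEMANTIC_MODEL.get(c, "Unclassified") for c in columns}

historical_store = []

feed = {"columns": ["customer_id", "email", ""]}
profiled = classify(quality_check(harvest_metadata(feed)))
historical_store.append(profiled)  # load the profiled asset into the store

print(historical_store)
# [{'customer_id': 'PartyIdentifier', 'email': 'ContactPoint'}]
```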

This consulting engagement and technology implementation increased data accessibility and capitalization. Information is registered within a central portal through an approved workflow, and then customers shop for data either from a list of physical assets or by information content, with purchase requests also going through an approval workflow. This, among other safeguards, ensures data quality.

Data Governance “Stock Check” Case 2: Tracking Rogue Data

This client, a geographically dispersed organization, stored many of its key processes in Microsoft Excel™ spreadsheets. They were planning to move to Office 365™ and were concerned about regulatory compliance, including GDPR mandates.

Knowing that electronic documents are heavily used in key business processes and distributed across the organization, this company needed to replace risky manual processes with centralized, automated systems.

A key part of the consulting engagement was to understand what data assets were in circulation and how the organization used them. Process chains could then be prioritized for automation, with specifications outlined for the systems that would replace them.

This organization also adopted a central portal that allowed employees to register data assets. The associated change management program raised awareness of data governance across the organization and the importance of data registration.

For each asset, information was captured and reviewed as part of a workflow. Prioritized assets were then chosen for profiling, enabling metadata to be reverse-engineered before being classified against the business glossary.

Additionally, assets that were part of a process chain were gathered and modeled with enterprise architecture (EA) and business process (BP) modeling tools for impact analysis.

High-level requirements for new systems then could be defined again in the EA/BP tools and prioritized on a project list. For the others, decisions could be made on whether they could safely be placed in the cloud and whether macros would be required.

In this case, the adoption of purpose-built data governance solutions helped build an understanding of the data assets in play, including information about their usage and content to aid in decision-making.

This client then had a good handle on the "what" and "where" in terms of sensitive data stored in their systems. They also better understood how this sensitive data was being used and by whom, helping reduce regulatory risks like those associated with GDPR.

In both scenarios, we cataloged data assets and mapped them to a business glossary, which acts as a classification scheme to help govern and locate data, making it both more accessible and valuable. This governance framework reduces risk and protects an organization's most valuable or sensitive data assets.

Focused on producing meaningful business outcomes, the erwin EDGE platform was pivotal in achieving these two clients’ data governance goals – including the infrastructure to undertake a data governance stock check. They used it to create an “enterprise data governance experience” not just for cataloging data and other foundational tasks, but also for a competitive “EDGE” in maximizing the value of their data while reducing data-related risks.

To learn more about the erwin EDGE data governance platform and how it aids in undertaking a data governance stock check, register for our free, 30-minute demonstration here.

Four Use Cases Proving the Benefits of Metadata-Driven Automation

Organization’s cannot hope to make the most out of a data-driven strategy, without at least some degree of metadata-driven automation.

The volume and variety of data has snowballed, and so has its velocity. As such, traditional – and mostly manual – processes associated with data management and data governance have broken down. They are time-consuming and prone to human error, making compliance, innovation and transformation initiatives more complicated, which is less than ideal in the information age.

So it’s safe to say that organizations can’t reap the rewards of their data without automation.

Data scientists and other data professionals can spend up to 80 percent of their time bogged down trying to understand source data or addressing errors and inconsistencies.

That’s time needed and better used for data analysis.

By implementing metadata-driven automation, organizations across industries can unleash the talents of their highly skilled, well-paid data pros to focus on finding the goods: actionable insights that will fuel the business.

Metadata-Driven Automation in the BFSI Industry

The banking, financial services and insurance (BFSI) industry typically deals with higher data velocity and tighter regulations than most. This heavily regulated environment is rife with data management bottlenecks.

These bottlenecks are only made worse when organizations attempt to get by with systems and tools that are not purpose-built.

For example, manually managing data mappings for the enterprise data warehouse via MS Excel spreadsheets had become cumbersome and unsustainable for one BFSI company.

After embracing metadata-driven automation and custom code automation templates, it saved hundreds of thousands of dollars in code generation and development costs and achieved more work in less time with fewer resources. ROI on the automation solutions was realized within the first year.

Metadata-Driven Automation in the Pharmaceutical Industry

Despite its shortcomings, the Excel spreadsheet method for managing data mappings is common within many industries.

But with the amount of data organizations need to process in today’s business climate, this manual approach makes change management and determining end-to-end lineage a significant and time-consuming challenge.

One global pharmaceutical giant headquartered in the United States experienced such issues until it adopted metadata-driven automation. Then the pharma company was able to scan in all source and target system metadata and maintain it within a single repository. Users now view end-to-end data lineage from the source layer to the reporting layer within seconds.
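
To illustrate the mechanics (not the vendor's actual implementation), end-to-end lineage over a single repository reduces to walking mapping edges from the reporting layer back to the source layer. A minimal Python sketch with invented repository contents:

```python
# Minimal sketch: end-to-end lineage as a walk over mapping edges held in
# one repository. The edges below are invented for illustration.

EDGES = {
    # target column -> its immediate source columns
    "report.revenue": ["dwh.fact_sales.amount"],
    "dwh.fact_sales.amount": ["staging.orders.net_amount"],
    "staging.orders.net_amount": ["erp.orders.amount", "erp.orders.discount"],
}

def lineage(column: str, depth: int = 0) -> None:
    """Print the full upstream lineage of a column, sources indented below it."""
    print("  " * depth + column)
    for source in EDGES.get(column, []):
        lineage(source, depth + 1)

lineage("report.revenue")  # reporting layer at the top, source layer at the bottom
```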

On the whole, the implementation resulted in extraordinary time savings and a total cost reduction of 60 percent.

Metadata-Driven Automation in the Insurance Industry

Insurance is another industry that has to cope with high data velocity and stringent data regulations. Plus many organizations in this sector find that they’ve outgrown their systems.

For example, an insurance company using a CDMA product to centralize data mappings is probably missing certain critical features, such as versioning, impact analysis and lineage, which adds to costs, time to market and errors.

By adopting metadata-driven automation, organizations can standardize the pre-ETL data mapping process and better manage data integration through the change and release process. As a result, both internal data mapping and cross-functional teams now have easy and fast web-based access to data mappings and valuable information like impact analysis and lineage.

Here is the story of a business that adopted such an approach, achieving operational excellence, an 80 percent reduction in delivery time and ROI within 12 months.

Metadata-Driven Automation for a Non-Profit

Another common issue cited by organizations using manual data mapping is ballooning complexity and subsequent confusion.

Any organization expanding its data-driven focus without sufficiently maturing data management initiative(s) will experience this at some point.

One of the world’s largest humanitarian organizations, with millions of members and volunteers operating all over the world, was confronted with this exact issue.

It recognized the need for a solution to standardize the pre-ETL data mapping process to make data integration more efficient and cost-effective.

With metadata-driven automation, the organization would be able to scan and store metadata and data dictionaries in a central repository, as well as manage the business definitions and data dictionary for legacy systems contributing data to the enterprise data warehouse.

By adopting such an approach, the organization realized time savings across all IT development and cross-functional testing teams. Additionally, they were able to more easily manage mappings, code sets, reference data and data validation rules.

Again, ROI was achieved within a year.

A Universal Solution for Metadata-Driven Automation

Metadata-driven automation is a capability any organization can benefit from – regardless of industry, as demonstrated by the various real-world use cases chronicled here.

The erwin Automation Framework is a key component of the erwin EDGE platform for comprehensive data management and data governance.

With it, data professionals realize these industry-agnostic benefits:

  • Centralized and standardized code management with all automation templates stored in a governed repository
  • Better quality code and minimized rework
  • Business-driven data movement and transformation specifications
  • Superior data movement job designs based on best practices
  • Greater agility and faster time-to-value in data preparation, deployment and governance
  • Cross-platform support of scripting languages and data movement technologies

Learn more about metadata-driven automation as it relates to data preparation and enterprise data mapping.

Join one of our weekly erwin Mapping Manager demos.

Google’s Record GDPR Fine: Avoiding This Fate with Data Governance

The General Data Protection Regulation (GDPR) made its first real impact as Google’s record GDPR fine dominated news cycles.

Historically, fines had peaked at six figures with the U.K.’s Information Commissioner’s Office (ICO) fines of 500,000 pounds ($650,000 USD) against both Facebook and Equifax for their data protection breaches.

Experts predicted an uptick in GDPR enforcement in 2019, and Google’s recent record GDPR fine has brought that to fruition. France’s data privacy enforcement agency hit the tech giant with a $57 million penalty – more than 80 times the steepest ICO fine.

If it can happen to Google, no organization is safe. Many in fact still lag in the GDPR compliance department. Cisco’s 2019 Data Privacy Benchmark Study reveals that only 59 percent of organizations are meeting “all or most” of GDPR’s requirements.

So many more GDPR violations are likely to come to light. And even organizations that are currently compliant can’t afford to let their data governance standards slip.

Data Governance for GDPR

Google’s record GDPR fine makes the rationale for better data governance clear enough. However, the Cisco report offers even more insight into the value of achieving and maintaining compliance.

Organizations with GDPR-compliant security measures are not only less likely to suffer a breach (74 percent vs. 89 percent), but the breaches suffered are less costly too, with fewer records affected.

However, applying such GDPR-compliant provisions can’t be done on a whim; organizations must expand their data governance practices to include compliance.

A robust data governance initiative provides a comprehensive picture of an organization’s systems and the units of data contained or used within them. This understanding encompasses not only the original instance of a data unit but also its lineage and how it has been handled and processed across an organization’s ecosystem.

With this information, organizations can apply the relevant degrees of security where necessary, ensuring expansive and efficient protection from external (i.e., breaches) and internal (i.e., mismanaged permissions) data security threats.

Although data security cannot be wholly guaranteed, these measures can help identify and contain breaches to minimize the fallout.

Looking at Google’s Record GDPR Fine as An Opportunity

GDPR compliance also has tertiary benefits: greater agility and innovation, and better data discovery and management. So arguably, these "tertiary" benefits of data governance should take center stage.

Once exploited only by such innovators as Amazon and Netflix, data optimization and governance are now on everyone's radar.

So organization’s need another competitive differentiator.

An enterprise data governance experience (EDGE) provides just that.

This approach unifies data management and data governance, ensuring that the data landscape, policies, procedures and metrics stem from a central source of truth so data can be trusted at any point throughout its enterprise journey.

With an EDGE, the Any2 (any data from anywhere) data management philosophy applies – whether structured or unstructured, in the cloud or on premise. An organization’s data preparation (data mapping), enterprise modeling (business, enterprise and data) and data governance practices all draw from a single metadata repository.

In fact, metadata from a multitude of enterprise systems can be harvested and cataloged automatically. And with intelligent data discovery, sensitive data can be tagged and governed automatically as well – think GDPR as well as HIPAA, BCBS and CCPA.
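
As a rough illustration of how intelligent data discovery can tag sensitive data automatically (the two patterns below are simplistic stand-ins for a real classifier), sampled column values can be matched against known formats:

```python
import re

# Rough illustration of tagging sensitive columns by sampled values.
# The two patterns are simplistic stand-ins for a real classifier.

PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "us_ssn": re.compile(r"\d{3}-\d{2}-\d{4}"),
}

def tag_column(sample_values: list) -> set:
    """Return the sensitivity tags matched by any sampled value."""
    return {tag for tag, pattern in PATTERNS.items()
            if any(pattern.fullmatch(value) for value in sample_values)}

print(tag_column(["alice@example.com", "bob@example.com"]))  # {'email'}
print(tag_column(["123-45-6789"]))                           # {'us_ssn'}
```

A real discovery engine would use far richer classifiers, but the tagging output is what drives automatic governance of GDPR-, HIPAA- or CCPA-relevant columns.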

Organizations without an EDGE can still achieve regulatory compliance, but data silos and the associated bottlenecks are unavoidable without integration and automation – not to mention longer timeframes and higher costs.

To get an “edge” on your competition, consider the erwin EDGE platform for greater control over and value from your data assets.

Data preparation/mapping is a great starting point and a key component of the software portfolio. Join us for a weekly demo.

erwin Automation Framework: Achieving Faster Time-to-Value in Data Preparation, Deployment and Governance

Data governance is more important to the enterprise than ever before. It ensures everyone in the organization can discover and analyze high-quality data to quickly deliver business value.

It assists in successfully meeting increasingly strict compliance requirements, such as those in the General Data Protection Regulation (GDPR). And it provides a clear gauge on business performance.

A mature and sustainable data governance initiative must include data integration.

This often requires reconciling two groups of individuals within the organization: 1) those who care about governance and the meaningful use of data and 2) those who care about getting and transforming the data from source to target for actionable insights.

Both ends of the data value chain are covered when governance is coupled programmatically with IT’s integration practices.

The tools and processes for this should automatically generate “pre-ETL” source-to-target mapping to minimize human errors that can occur while manually compiling and interpreting a multitude of Excel-based data mappings that exist across the organization.

In addition to reducing errors and improving data quality, the efficiencies gained through automation, including minimizing rework, can help cut system development lifecycle costs in half.

In fact, being able to rely on automated and repeatable processes can result in up to 50 percent in design savings, up to 70 percent conversion savings, and up to 70 percent acceleration in total project delivery.

Data Governance and the System Development Lifecycle

Boosting data governance maturity starts with a central metadata repository (data dictionary) for version-controlling metadata imported from a broad array of file and database types to inform data mappings. It can be used to automatically generate governed design mappings and code in the design phase of the system development lifecycle.

The right toolset – one that supports a unifying and underlying metadata model – will be a design and code-generation platform that introduces efficiency, visibility and governance principles while reducing the opportunity for human error.

Automatically generating ETL/ELT jobs for leading ETL tools based on best design practices accommodates those principles; it functions according to approved corporate and industry standards.

Automatically importing mappings from developers' Excel sheets, flat files, Access and ETL tools into a comprehensive mappings inventory, complete with automatically generated and meaningful documentation of the mappings, is a powerful way to support governance while providing real insight into data movement – for lineage and impact analysis – without interrupting system developers' normal work methods.
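
A minimal Python sketch of that import step, assuming the developer sheets have been exported to CSV in a hypothetical source, target and rule layout; note how documentation is generated automatically as each row enters the inventory:

```python
import csv
import io

# Minimal sketch of importing developer mapping sheets into a mappings
# inventory with auto-generated documentation. Assumes the sheets were
# exported to CSV in a hypothetical source,target,rule layout.

SHEET = """source,target,rule
crm.customer.email,dwh.dim_customer.email_address,lowercase and trim
erp.orders.amount,dwh.fact_sales.amount,"cast to DECIMAL(18,2)"
"""

def import_mappings(csv_text: str) -> list:
    inventory = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        # generate human-readable documentation as the row is ingested
        row["doc"] = f"{row['source']} feeds {row['target']} ({row['rule']})"
        inventory.append(row)
    return inventory

for entry in import_mappings(SHEET):
    print(entry["doc"])
```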

GDPR compliance, for example, requires a business to discover source-to-target mappings with all accompanying transactions, such as which business rules in the repository are applied to them, to comply with audits.

When data movement has been tracked and version-controlled, it's possible to conduct data archeology – that is, reverse-engineering code from existing XML within the ETL layer – to uncover what has happened in the past and incorporate it into a mapping manager for fast and accurate recovery.
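
A minimal Python sketch of such data archeology, assuming a hypothetical ETL XML layout in which each <map> element carries source and target attributes:

```python
import xml.etree.ElementTree as ET

# Minimal sketch of "data archeology": recovering source-to-target
# mappings from XML in the ETL layer. The <map> layout is hypothetical.

ETL_XML = """<etl>
  <map source="staging.orders.net_amount" target="dwh.fact_sales.amount"/>
  <map source="staging.customer.email" target="dwh.dim_customer.email_address"/>
</etl>"""

def recover_mappings(xml_text: str) -> list:
    """Extract (source, target) pairs from an ETL job definition."""
    root = ET.fromstring(xml_text)
    return [(m.get("source"), m.get("target")) for m in root.iter("map")]

for source, target in recover_mappings(ETL_XML):
    print(f"{source} -> {target}")
```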

This is one example of how to meet data governance demands with more agility and accuracy at high speed.

Faster Time-to-Value with the erwin Automation Framework

The erwin Automation Framework is a metadata-driven universal code generator that works hand in hand with erwin Mapping Manager (MM) for:

  • Pre-ETL enterprise data mapping
  • Governing metadata
  • Governing and versioning source-to-target mappings throughout the lifecycle
  • Data lineage, impact analysis and business rules repositories
  • Automated code generation

If you’d like to save time and money in preparing, deploying and governing you organization’s data, please join us for a demo of erwin MM.

Who to Follow in 2019 for Big Data, Data Governance and GDPR Advice

Experts are predicting a surge in GDPR enforcement in 2019 as regulators begin to crack down on organizations still lagging behind compliance standards.

With this in mind, the erwin team has compiled a list of the most valuable data governance, GDPR and Big Data blogs and news sources for data management and data governance best-practice advice from around the web.

From regulatory compliance (GDPR, HIPAA, etc.) to driving revenue through proactive data governance initiatives and Big Data strategies, these accounts cover it all.

Top 7 Data Governance, GDPR and Big Data Blogs and News Sources from Around the Web

Honorable Mention: @BigDataBatman

The Twitter account data professionals deserve, but probably not the one you need right now.

This quirky Twitter bot trawls the web for big data tweets and news stories, replacing "big data" with "Batman". If data is the Bane of your existence, this account will serve up some light relief.

 

1. The erwin Expert Network

Twitter | LinkedIn | Facebook | Blog

For anything data management and data governance related, the erwin Experts should be your first port of call.

The team behind the most connected data management and data governance solutions on the market regularly shares best-practice advice in the form of guides, whitepapers, blogs and social media updates.

 

2. GDPR For Online Entrepreneurs (UK, US, CA, AU)

This community-driven Facebook group is a consistent source of insightful information for data-driven businesses.

In addition to sharing data and GDPR-focused articles from around the web, GDPR For Online Entrepreneurs encourages members to seek GDPR advice from its community’s members.

 

3. GDPR General Data Protection Regulation Technology

LinkedIn also has its own community-driven GDPR advice groups. The most active of these is "GDPR General Data Protection Regulation Technology".

The group aims to be an information hub for anybody responsible for company data, including company leaders, GDPR specialists and consultants, business analysts and process experts. 


4. DBTA

Twitter | LinkedIn | Facebook

Database Trends and Applications is a publication that should be on every data professional's radar. Alongside news and editorials covering big data, database management, data integration and more, DBTA is also a great source of advice for professionals looking to research buying options.

Their yearly "Trend-Setting Products in Data and Information Management" list and Product Spotlight featurettes can help data professionals put together proposals and give decision-makers peace of mind.

 

5. Dataversity

Twitter | LinkedIn

Dataversity is another excellent source for data management and data governance related best practices and think-pieces.

In addition to hosting and sponsoring a number of live events throughout the year, the platform is a regular provider of data leadership webinars and training with a library full of webinars available on-demand.

 

6. WIRED

Twitter | LinkedIn | Facebook

Wired is a physical and digital tech magazine that covers all the bases.

For data professionals that are after the latest news and editorials pertaining to data security and a little extra – from innovations in transport to the applications of Blockchain – Wired is a great publication to keep on your radar.

 

7. TDAN

Twitter | LinkedIn | Facebook

For those looking for something a little more focused, check out TDAN. A subsidiary of Dataversity, TDAN regularly publishes new editorial content covering data governance, data management, data modeling and Big Data.

Top 7 Data Governance Blog Posts of 2018

The driving factors behind data governance adoption vary.

Whether implemented as preventative measures (risk management and regulation) or proactive endeavors (value creation and ROI), the benefits of a data governance initiative are becoming more apparent.

Historically, most organizations have approached data governance in isolation, treating it as a preventative measure. But as data's value to the enterprise has grown, so has the need for a holistic, collaborative means of discovering, understanding and governing data.

So with the impetus of the General Data Protection Regulation (GDPR) and the opportunities presented by data-driven transformation, many organizations are re-evaluating their data management and data governance practices.

With that in mind, we’ve compiled a list of the very best, best-practice blog posts from the erwin Experts in 2018.

Defining Data Governance

www.erwin.com/blog/defining-data-governance/

Data governance’s importance has become more widely understood. But for a long time, the discipline was marred with a poor reputation owed to consistent false starts, dogged implementations and underwhelming ROI.

The evolution from Data Governance 1.0 to Data Governance 2.0 has helped shake past perceptions, introducing a collaborative approach. But to ensure the collaborative take on data governance is implemented properly, an organization must settle on a common definition.

The Top 6 Benefits of Data Governance

www.erwin.com/blog/top-6-benefits-of-data-governance/

GDPR went into effect for businesses trading with the European Union, introducing hefty fines for noncompliance with its data collection, storage and usage standards.

But it’s important for organizations to understand that the benefits of data governance extend beyond just GDPR or compliance with any other internal or external regulations.

Data Governance Readiness: The Five Pillars

www.erwin.com/blog/data-governance-readiness/

GDPR had organizations scrambling to implement data governance initiatives by the effective date, but many still lag behind.

Enforcement and fines will increase in 2019, so an understanding of the five pillars of data governance readiness is essential: initiative sponsorship, organizational support, allocation of team resources, enterprise data management methodology and delivery capability.

Data Governance and GDPR: How the Most Comprehensive Data Regulation in the World Will Affect Your Business

www.erwin.com/blog/data-governance-and-gdpr/

Speaking of GDPR enforcement, this post breaks down how the regulation affects business.

From rules regarding active consent, data processing and the tricky "right to be forgotten" to required procedures for notifying affected parties of a data breach and documenting compliance, GDPR introduces a lot of complexity.

The Top Five Data Governance Use Cases and Drivers

www.erwin.com/blog/data-governance-use-cases/

An erwin-UBM study conducted in late 2017 sought to determine the biggest drivers for data governance.

In addition to compliance, top drivers turned out to be improving customer satisfaction, reputation management, analytics and Big Data.

Data Governance 2.0 for Financial Services

www.erwin.com/blog/data-governance-2-0-financial-services/

Organizations operating within the financial services industry were arguably the most prepared for GDPR, given its history. However, the huge Equifax data breach was a stark reminder that organizations still have work to do.

As well as an analysis of data governance for regulatory compliance in financial services, this article examines the value data governance can bring to these organizations – up to $30 billion could be on the table.

Understanding and Justifying Data Governance 2.0

www.erwin.com/blog/justifying-data-governance/

For some organizations, the biggest hurdle in implementing a new data governance initiative or strengthening an existing one is support from business leaders. Its value can be hard to demonstrate to those who don’t work directly with data and metadata on a daily basis.

This article examines this data governance roadblock and others, with advice on how to overcome them.

 

Six Reasons Business Glossary Management Is Crucial to Data Governance

A business glossary is crucial to any data governance strategy, yet it is often overlooked.

Consider this – no one likes unpleasant surprises, especially in business. So when it comes to objectively understanding what’s happening from the top of the sales funnel to the bottom line of finance, everyone wants – and needs – to trust the data they have.

That’s why you can’t underestimate the importance of a business glossary. Sometimes the business folks say IT or marketing speaks a different language. Or in the case of mergers and acquisitions, different companies call the same thing something else.

A business glossary solves this complexity by creating a common business vocabulary. Regardless of the industry you’re in or the type of data initiative you’re undertaking, the ability for an organization to have a unified, common language is a key component of data governance, ensuring you can trust your data.

Are we speaking the same language?

How can two reports show different results for the same region? A quick analysis of invoices will likely reveal that some of the data fed into the report wasn’t based on a clear understanding of business terms.

In such embarrassing scenarios, a business glossary and its ongoing management have obvious significance. And with the complexity of today's business environment, organizations need the right solution to make sense out of their data and govern it properly.

Here are six reasons a business glossary is vital to data governance:

  1. Bridging the gap between Business & IT

A sound data governance initiative bridges the gap between the business and IT. By understanding the underlying metadata associated with business terms and the associated data lineage, a business glossary helps bridge this gap to deliver greater value to the organization.

  2. Integrated search

The biggest appeal of business glossary management is that it helps establish relationships between business terms to drive data governance across the entire organization. A good business glossary should provide an integrated search feature that can find context-specific results, such as business terms, definitions, technical metadata, KPIs and process areas.

  3. The ability to capture business terms and all associated artifacts

What good is a business term if it can’t be associated with other business terms and KPIs? Capturing relationships between business terms as well as between technical and business entities is essential in today’s regulatory and compliance-conscious environment. A business glossary defines the relationship between the business terms and their underlying metadata for faster analysis and enhanced decision-making.

  4. Integrated project management and workflow

When the business and cross-functional teams operate in silos, users start defining business terms according to their own preferences rather than following standard policies and best practices. To be effective, a business glossary should enable a collaborative workflow management and approval process so stakeholders have visibility with established data governance roles and responsibilities. With this ability, business glossary users can provide input during the entire data definition process prior to publication.

  5. The ability to publish business terms

Successful businesses not only capture business terms and their definitions, they also publish them so that the business at large can access them. Business glossary users, who are typically members of the data governance team, should be assigned roles for creating, editing, approving and publishing business glossary content. A workflow feature will show which users are assigned which roles, including those with publishing permissions.

After initial publication, business glossary content can be revised and republished on an ongoing basis, based on the needs of your enterprise.

  6. End-to-end traceability

Capturing business terms and establishing relationships are key to glossary management. However, it is far from a complete solution without traceability. A good business glossary can help generate enterprise-level traceability in the form of mind maps or tabular reports for the business community once relationships have been established (a rough sketch of such a structure follows this list).
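
Here is that rough sketch in Python: a business glossary as a small data structure holding term definitions, term-to-term relationships and links to underlying technical metadata, plus a tabular traceability report. All terms and links are invented.

```python
from dataclasses import dataclass, field

# Rough sketch of a business glossary with term relationships, links to
# technical metadata and a tabular traceability report. All data is invented.

@dataclass
class Term:
    name: str
    definition: str
    related: list = field(default_factory=list)   # other business terms
    assets: list = field(default_factory=list)    # underlying technical metadata

GLOSSARY = {
    "Customer": Term("Customer", "A party that buys goods or services",
                     related=["Order"], assets=["dwh.dim_customer"]),
    "Order": Term("Order", "A request to purchase goods or services",
                  assets=["erp.orders", "dwh.fact_sales"]),
}

def traceability_report(glossary: dict) -> None:
    """Print a simple term | related terms | technical assets table."""
    for term in glossary.values():
        related = ", ".join(term.related) or "-"
        assets = ", ".join(term.assets)
        print(f"{term.name:<10} | related: {related:<10} | assets: {assets}")

traceability_report(GLOSSARY)
```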

Business Glossary, the Heart of Data Governance

With a business glossary at the heart of your regulatory compliance and data governance initiatives, you can help break down organizational and technical silos for data visibility, context, control and collaboration across domains. It ensures that you can trust your data.

Plus, you can unify the people, processes and systems that manage and protect data through consistent exchange, understanding and processing to increase quality and trust.

By building a glossary of business terms in taxonomies with synonyms, acronyms and relationships, and publishing approved standards and prioritizing them, you can map data in all its forms to the central catalog of data elements.

That answers the vital question of “where is our data?” Then you can understand who and what is using your data to ensure adherence to usage standards and rules.
