Categories
erwin Expert Blog Business Process

In Times of Rapid Change, Business Process Modeling Becomes a Critical Tool

With the help of business process modeling (BPM), organizations can visualize processes and all their associated information, identifying the areas ripe for innovation, improvement or reorganization.

In the blink of an eye, COVID-19 has disrupted all industries and quickly accelerated their plans for digital transformation. As part of those transformations, businesses are moving quickly from on-premises systems to the cloud, so they need to make business process models available to everyone within the organization so people understand which data is tied to which applications and what processes are in place.

There’s a clear connection between business process modeling and digital transformation initiatives. By exploring process models, an organization can understand its information assets within a business context, from internal operations to full customer experiences.

This practice identifies and drives digital transformation opportunities to increase revenue while limiting risks and avoiding regulatory and compliance gaffes.


Bringing IT and Business Together to Make More Informed Decisions

Developing a shared repository is key to aligning IT systems to accomplish business strategies, reducing the time it takes to make decisions and accelerating solution delivery.

It also serves to operationalize and govern mission-critical information by making it available to the wider enterprise at the right levels to identify synergies and ensure the appropriate collaboration.

One customer says his company realized early on that there’s a difference between business expertise and process expertise, and when you partner the two you really start to see the opportunities for success.

By bringing your business and IT together via BPM, you create a single point of truth within your organization — delivered to stakeholders within the context of their roles.

You then can understand where your data is, how to find it, and how to monetize it, report on it and visualize it, all in an accessible format that supports cataloging, mappings and lineage, tying business and IT together to make more informed decisions.

BPM for Regulatory Compliance

Business process modeling is also critical for risk management and regulatory compliance. When thousands of employees need to know what compliance processes to follow, such as those associated with the European Union’s General Data Protection Regulation (GDPR), ensuring not only access to proper documentation but current, updated information is critical.

Industry and government regulations affect businesses that work in or do business with any number of industries or in specific geographies. Industry-specific regulations in areas like healthcare, pharmaceuticals and financial services have been in place for some time.

Now, broader mandates like GDPR and the California Consumer Privacy Act (CCPA) require businesses across industries to think about their compliance efforts. Business process modeling helps organizations prove what they are doing to meet compliance requirements and understand how changes to their processes impact compliance efforts (and vice versa).

This same customer says, “The biggest bang for the buck is having a single platform, a one-stop shop, for when you’re working with auditors. You go to one place that is your source of truth: Here are our processes; here’s how we’ve implemented these controls; here is the list of our controls and where they’re implemented in our business.”

He also notes that a single BPM platform “helps cut through a lot of questions and get right to the heart of the matter.” As a result, the company has had positive audit findings and results because they have a structure, a plan, and it’s easy to see the connection between how they’re ensuring their controls are adhered to and where those results are in their business processes.

Change Is Constant

Heraclitus, the Greek philosopher, said, “The only constant in life is change.” This applies to business as well. Today, things are changing quickly, and in the current landscape executives are not going to wait around for months while impact analyses are formulated. They want actionable intelligence – fast.

For business process architects, being able to manage change and address key issues is what keeps the job function highly relevant to stakeholders. The key point is that useful change comes from routinely reviewing process models and spotting sub-optimal processes. Business process modeling supports many beneficial use cases and transformation projects that empower employees and therefore better serve customers.

Organizational success depends on agility and adaptability in responding to change across the enterprise, both planned and unplanned. To be agile and responsive to changes in markets and consumer demands, you need a visual representation of what your business does and how it does it.

Companies that maintain accurate business process models also are well-positioned to analyze and optimize end-to-end process threads—lead-to-cash, problem-to-resolution or hire-to-retire, for example—that contribute to strategic business objectives, such as improving customer journeys or maximizing employee retention.

They also can slice and dice their models in multiple other ways, such as by functional hierarchies to understand what business groups organize or participate in processes as a step in driving better collaboration or greater efficiencies.

erwin Evolve enables communication and collaboration across the enterprise with reliable tools that make it possible to quickly and accurately gather information, make decisions, and then ensure consistent standards, policies and processes are established and available for consumption internally and externally as required.

Try erwin Evolve for yourself in a no-cost, risk-free trial.


How Metadata Makes Data Meaningful

Metadata is an important part of data governance, and as a result, most nascent data governance programs are rife with project plans for assessing and documenting metadata. But in many scenarios, it seems that the underlying driver of metadata collection projects is that it’s just something you do for data governance.

So most early-stage data governance managers kick off a series of projects to profile data, make inferences about data element structure and format, and store the presumptive metadata in some metadata repository. But are these rampant and often uncontrolled projects to collect metadata properly motivated?

There is rarely a clear directive about how metadata is used. Therefore, prior to launching metadata collection tasks, it is important to specify how the knowledge embedded within the corporate metadata should be used.

Managing metadata should not be a sub-goal of data governance. Today, metadata is the heart of enterprise data management and governance/intelligence efforts and should have a clear strategy – rather than just being something you do.


What Is Metadata?

Quite simply, metadata is data about data. It’s generated every time data is captured at a source, accessed by users, moved through an organization, integrated or augmented with other data from other sources, profiled, cleansed and analyzed. Metadata is valuable because it provides information about the attributes of data elements that can be used to guide strategic and operational decision-making. It answers these important questions:

  • What data do we have?
  • Where did it come from?
  • Where is it now?
  • How has it changed since it was originally created or captured?
  • Who is authorized to use it and how?
  • Is it sensitive or are there any risks associated with it?
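
As a purely illustrative sketch, the questions above map naturally onto fields of a metadata record. The structure below is hypothetical, not the schema of any particular catalog product:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MetadataRecord:
    """A hypothetical metadata record for one data element."""
    name: str                 # What data do we have?
    source_system: str        # Where did it come from?
    current_location: str     # Where is it now?
    change_history: List[str] = field(default_factory=list)   # How has it changed?
    authorized_roles: List[str] = field(default_factory=list) # Who may use it, and how?
    sensitive: bool = False   # Is it sensitive or risky?

# Example: tracking a single PII field from source to warehouse.
customer_email = MetadataRecord(
    name="customer_email",
    source_system="CRM",
    current_location="warehouse.dim_customer.email",
    change_history=["2019-03-01 ingested", "2019-06-12 masked for analytics"],
    authorized_roles=["marketing_analyst", "dpo"],
    sensitive=True,  # PII under GDPR/CCPA
)
print(customer_email.current_location)
```

Answering the six questions then becomes a matter of reading (or querying) records like this one rather than chasing down tribal knowledge.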

The Role of Metadata in Data Governance

Organizations don’t know what they don’t know, and this problem is only getting worse. As data continues to proliferate, so does the need for data and analytics initiatives to make sense of it all. Here are some benefits of metadata management for data governance use cases:

  • Better Data Quality: Data issues and inconsistencies within integrated data sources or targets are identified in real time, improving overall data quality by reducing the time to insight and/or repair.
  • Quicker Project Delivery: Accelerate Big Data deployments, Data Vaults, data warehouse modernization, cloud migration, etc., by up to 70 percent.
  • Faster Speed to Insights: Reverse the current 80/20 rule that keeps high-paid knowledge workers too busy finding, understanding and resolving errors or inconsistencies to actually analyze source data.
  • Greater Productivity & Reduced Costs: Being able to rely on automated and repeatable metadata management processes results in greater productivity. Some erwin customers report productivity gains of 85+% for coding, 70+% for metadata discovery, up to 50% for data design, up to 70% for data conversion, and up to 80% for data mapping.
  • Regulatory Compliance: Regulations such as GDPR, HIPAA, BCBS 239 and CCPA include data privacy and security mandates, so sensitive data (e.g., PII) needs to be tagged, its lineage documented, and its flows depicted for traceability.
  • Digital Transformation: Knowing what data exists and its value potential promotes digital transformation by improving digital experiences, enhancing digital operations, driving digital innovation and building digital ecosystems.
  • Enterprise Collaboration: With the business driving alignment between data governance and strategic enterprise goals and IT handling the technical mechanics of data management, the door opens to finding, trusting and using data to effectively meet organizational objectives.

Giving Metadata Meaning

So how do you give metadata meaning? While this sounds like a deep philosophical question, the reality is the right tools can make all the difference.

erwin Data Intelligence (erwin DI) combines data management and data governance processes in an automated flow.

It’s unique in its ability to automatically harvest, transform and feed metadata from a wide array of data sources, operational processes, business applications and data models into a central data catalog and then make it accessible and understandable within the context of role-based views.

erwin DI sits on a common metamodel that is open, extensible and comes with a full set of APIs. A comprehensive list of erwin-owned standard data connectors are included for automated harvesting, refreshing and version-controlled metadata management. Optional erwin Smart Data Connectors reverse-engineer ETL code of all types and connect bi-directionally with reporting and other ecosystem tools. These connectors offer the fastest and most accurate path to data lineage, impact analysis and other detailed graphical relationships.

Additionally, erwin DI is part of the larger erwin EDGE platform that integrates data modeling, enterprise architecture, business process modeling, data cataloging and data literacy. We know our customers need an active metadata-driven approach to:

  • Understand their business, technology and data architectures and the relationships between them
  • Create and automate a curated enterprise data catalog, complete with physical assets, data models, data movement, data quality and on-demand lineage
  • Activate their metadata to drive agile and well-governed data preparation with integrated business glossaries and data dictionaries that provide business context for stakeholder data literacy

erwin was named a Leader in Gartner’s “2019 Magic Quadrant for Metadata Management Solutions.”

Click here to get a free copy of the report.

Click here to request a demo of erwin DI.



Metadata Management, Data Governance and Automation

Can the 80/20 Rule Be Reversed?

erwin released its State of Data Governance Report in February 2018, just a few months before the General Data Protection Regulation (GDPR) took effect.

This research showed that the majority of responding organizations weren’t actually prepared for GDPR, nor did they have the understanding, executive support and budget for data governance – although they recognized the importance of it.

Of course, data governance has evolved with astonishing speed, both in response to data privacy and security regulations and because organizations see the potential for using it to accomplish other organizational objectives.

But many of the world’s top brands still seem to be challenged in implementing and sustaining effective data governance programs (hello, Facebook).

We wonder why.

Too Much Time, Too Few Insights

According to IDC’s “Data Intelligence in Context” Technology Spotlight sponsored by erwin, “professionals who work with data spend 80 percent of their time looking for and preparing data and only 20 percent of their time on analytics.”

Specifically, 80 percent of data professionals’ time is spent on data discovery, preparation and protection, and only 20 percent on analysis leading to insights.

In most companies, an incredible amount of data flows from multiple sources in a variety of formats and is constantly being moved and federated across a changing system landscape.

Often these enterprises are heavily regulated, so they need a well-defined data integration model that will help avoid data discrepancies and remove barriers to enterprise business intelligence and other meaningful use.

IT teams need the ability to smoothly generate hundreds of mappings and ETL jobs. They need their data mappings to fall under governance and audit controls, with instant access to dynamic impact analysis and data lineage.

But most organizations, especially those competing in the digital economy, don’t have enough time or money for data management using manual processes. Outsourcing is also expensive, with inevitable delays because these vendors are dependent on manual processes too.

The Role of Data Automation

Data governance maturity includes the ability to rely on automated and repeatable processes.

For example, automatically importing mappings from developers’ Excel sheets, flat files, Access and ETL tools into a comprehensive mappings inventory, complete with automatically generated and meaningful documentation of the mappings, is a powerful way to support governance while providing real insight into data movement — for data lineage and impact analysis — without interrupting system developers’ normal work methods.

GDPR compliance, for instance, requires a business to discover source-to-target mappings with all accompanying transformations, such as which business rules in the repository are applied, to comply with audits.

When data movement has been tracked and version-controlled, it’s possible to conduct data archeology — that is, reverse-engineering code from existing XML within the ETL layer — to uncover what has happened in the past and incorporate it into a mapping manager for fast and accurate recovery.
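
A minimal sketch of that data-archeology idea, assuming a simplified ETL XML layout. Real ETL tools each use their own schema, so the element and attribute names here are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment of ETL-tool XML; actual products (Informatica,
# SSIS, etc.) use different, richer schemas.
etl_xml = """
<mapping name="load_customer">
  <field source="src.CUST_EMAIL" target="dw.dim_customer.email" rule="lowercase"/>
  <field source="src.CUST_NAME" target="dw.dim_customer.full_name" rule="trim"/>
</mapping>
"""

def extract_mappings(xml_text):
    """Reverse-engineer (source, target, business rule) triples
    from an ETL mapping definition."""
    root = ET.fromstring(xml_text)
    return [
        (f.get("source"), f.get("target"), f.get("rule"))
        for f in root.iter("field")
    ]

for src, tgt, rule in extract_mappings(etl_xml):
    print(f"{src} -> {tgt} ({rule})")
```

Once extracted, triples like these can be loaded into a mapping inventory so lineage and impact analysis reflect what the ETL layer actually did, not what documentation claims it did.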

With automation, data professionals can meet the above needs at a fraction of the cost of the traditional, manual way. To summarize, just some of the benefits of data automation are:

• Centralized and standardized code management with all automation templates stored in a governed repository
• Better quality code and minimized rework
• Business-driven data movement and transformation specifications
• Superior data movement job designs based on best practices
• Greater agility and faster time-to-value in data preparation, deployment and governance
• Cross-platform support of scripting languages and data movement technologies

One global pharmaceutical giant reduced costs by 70 percent and generated 95 percent of production code with “zero touch.” With automation, the company improved the time to business value and significantly reduced the costly re-work associated with error-prone manual processes.


Help Us Help You by Taking a Brief Survey

With 2020 just around the corner and another data regulation about to take effect, the California Consumer Privacy Act (CCPA), we’re working with Dataversity on another research project.

And this time, you guessed it – we’re focusing on data automation and how it could impact metadata management and data governance.

We would appreciate your input and will release the findings in January 2020.

Click here to take the brief survey


Very Meta … Unlocking Data’s Potential with Metadata Management Solutions

Untapped data, if mined, represents tremendous potential for your organization. While there has been a lot of talk about big data over the years, the real hero in unlocking the value of enterprise data is metadata, or the data about the data.

However, most organizations don’t use all the data they’re flooded with to reach deeper conclusions about how to drive revenue, achieve regulatory compliance or make other strategic decisions. They don’t know exactly what data they have or even where some of it is.

Quite honestly, knowing what data you have and where it lives is complicated. And to truly understand it, you need to be able to create and sustain an enterprise-wide view of and easy access to underlying metadata.

This isn’t an easy task. Organizations are dealing with numerous data types and data sources that were never designed to work together, and data infrastructures that have been cobbled together over time with disparate technologies, poor documentation and little thought for downstream integration.

As a result, the applications and initiatives that depend on a solid data infrastructure may be compromised, leading to faulty analysis and insights.

Metadata Is the Heart of Data Intelligence

A recent IDC Innovators: Data Intelligence Report says that getting answers to such questions as “where is my data, where has it been, and who has access to it” requires harnessing the power of metadata.

Metadata is generated every time data is captured at a source, accessed by users, moves through an organization, and then is profiled, cleansed, aggregated, augmented and used for analytics to guide operational or strategic decision-making.

In fact, data professionals spend 80 percent of their time looking for and preparing data and only 20 percent of their time on analysis, according to IDC.

To flip this 80/20 rule, they need an automated metadata management solution for:

• Discovering data – Identify and interrogate metadata from various data management silos.
• Harvesting data – Automate the collection of metadata from various data management silos and consolidate it into a single source.
• Structuring and deploying data sources – Connect physical metadata to specific data models, business terms, definitions and reusable design standards.
• Analyzing metadata – Understand how data relates to the business and what attributes it has.
• Mapping data flows – Identify where to integrate data and track how it moves and transforms.
• Governing data – Develop a governance model to manage standards, policies and best practices and associate them with physical assets.
• Socializing data – Empower stakeholders to see data in one place and in the context of their roles.
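
To make the harvesting step concrete, here is a minimal, hypothetical sketch that pulls column-level metadata from a live database into a simple catalog structure. SQLite is used only to keep the example self-contained; a real harvester would connect to many source systems:

```python
import sqlite3

# Stand-in for a production source system.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT, signup_date TEXT)"
)

def harvest_table_metadata(conn, table):
    """Collect column metadata for one table into catalog entries."""
    # PRAGMA table_info rows: (cid, name, type, notnull, default, pk)
    cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
    return [
        {"table": table, "column": c[1], "type": c[2], "is_pk": bool(c[5])}
        for c in cols
    ]

catalog = harvest_table_metadata(conn, "customers")
for entry in catalog:
    print(entry)
```

The same pattern, repeated across silos and consolidated into one store, is what turns scattered technical metadata into a single searchable catalog.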

Addressing the Complexities of Metadata Management

The complexities of metadata management can be addressed with a strong data management strategy coupled with metadata management software to enable the data quality the business requires.

This encompasses data cataloging (integration of data sets from various sources), mapping, versioning, business rules and glossary maintenance, and metadata management (associations and lineage).

erwin has developed the only data intelligence platform that provides organizations with a complete and contextual depiction of the entire metadata landscape.

It is the only solution that can automatically harvest, transform and feed metadata from operational processes, business applications and data models into a central data catalog and then make it accessible and understandable within the context of role-based views.

erwin’s ability to integrate and continuously refresh metadata from an organization’s entire data ecosystem, including business processes, enterprise architecture and data architecture, forms the foundation for enterprise-wide data discovery, literacy, governance and strategic usage.

Organizations then can take a data-driven approach to business transformation, speed to insights, and risk management.
With erwin, organizations can:

1. Deliver a trusted metadata foundation through automated metadata harvesting and cataloging
2. Standardize data management processes through a metadata-driven approach
3. Centralize data-driven projects around centralized metadata for planning and visibility
4. Accelerate data preparation and delivery through metadata-driven automation
5. Master data management platforms through metadata abstraction
6. Accelerate data literacy through contextual metadata enrichment and integration
7. Leverage a metadata repository to derive lineage, impact analysis and enable audit/oversight ability

With erwin Data Intelligence as part of the erwin EDGE platform, you know what data you have, where it is, where it’s been and how it transformed along the way, plus you can understand sensitivities and risks.

With an automated, real-time, high-quality data pipeline, enterprise stakeholders can base strategic decisions on a full inventory of reliable information.

Many of our customers are hard at work addressing metadata management challenges, and that’s why erwin was Named a Leader in Gartner’s “2019 Magic Quadrant for Metadata Management Solutions.”



Top Use Cases for Enterprise Architecture: Architect Everything

Architect Everything: New use cases for enterprise architecture are increasing enterprise architects’ stock in data-driven business

As enterprise architecture has evolved, so too have the use cases for enterprise architecture.

Analyst firm Ovum recently released a new report titled Ovum Market Radar: Enterprise Architecture. In it, they make the case that enterprise architecture (EA) is becoming AE – or “architect everything”.

The transition highlights enterprise architecture’s evolution from being solely an IT function to being more closely aligned with the business. As such, the function has changed from EA to AE.

At erwin, we’re definitely witnessing this EA evolution as more and more organizations undertake digital transformation initiatives, including rearchitecting their business models and value streams, as well as responding to increasing regulatory pressures.

This is because EA provides the right information to the right people at the right time for smarter decision-making.

Following are some of the top use cases for enterprise architecture that demonstrate how EA is moving beyond IT and into the business.


Top 7 Use Cases for Enterprise Architecture

Compliance. Enterprise architecture is critical for regulatory compliance. It helps model, manage and transform mission-critical value streams across industries, as well as identify sensitive information. When thousands of employees need to know what compliance processes to follow, such as those associated with regulations (e.g., GDPR, HIPAA, SOX, CCPA), it ensures not only access to proper documentation but also current, updated information.


Data security/risk management. EA should be commonplace in data security planning. Any flaw in the way data is stored or monitored is a potential ‘in’ for a breach, and so businesses have to ensure security surrounding sensitive information is thorough and covers the whole business. Security should be proactive, not reactive, which is why EA should be a huge part of security planning.

Data governance. Today’s enterprise embraces data governance to drive data opportunities, including growing revenue, and limit data risks, including regulatory and compliance gaffes.

EA solutions that provide much-needed insight into the relationship between data assets and applications make it possible to appropriately direct data usage and flows, as well as focus greater attention, if warranted, on applications where data use delivers optimal business value.

Digital transformation. For an organization to successfully embrace change, innovation, EA and project delivery need to be intertwined and traceable. Enterprise architects are crucial to delivering innovation. Taking an idea from concept to delivery requires strategic planning and the ability to execute. An enterprise architecture roadmap can help focus such plans and many organizations are now utilizing them to prepare their enterprise architectures for 5G.

Mergers & acquisitions. Enterprise architecture is essential to successful mergers and acquisitions. It helps alignment by providing a business-outcome perspective for IT and guiding transformation. It also helps define strategy and models, improving interdepartmental cohesion and communication.

In an M&A scenario, businesses need to ensure their systems are fully documented and rationalized. This way they can comb through their inventories to make more informed decisions about which systems to cut or phase out to operate more efficiently.

Innovation management. EA is crucial to innovation and project delivery: using open standards to link to other products within the overall project lifecycle, integrating agile enterprise architecture with agile development, and connecting project delivery with effective governance.

It takes a rigorous approach to ensure that current and future states are published for a wider audience for consumption and collaboration – from modeling to generating road maps with meaningful insights provided to both technical and business stakeholders during every step.

Knowledge retention. Unlocking knowledge and then putting systems in place to retain that knowledge is a key benefit of EA. Many organizations lack a structured approach for gathering and investigating employee ideas. Ideas can fall into a black hole where they don’t get feedback and employees become less engaged.

When your enterprise architecture is aligned with your business outcomes, it provides a way to help your business ideate and investigate the viability of ideas on both the technical and business level.

If the benefits of enterprise architecture would help your business, here’s how you can try erwin EA for free.



Using Strategic Data Governance to Manage GDPR/CCPA Complexity

In light of recent, high-profile data breaches, it’s past time we re-examined strategic data governance and its role in managing regulatory requirements.

News broke earlier this week of British Airways being fined 183 million pounds – or $228 million – by the U.K. for alleged violations of the European Union’s General Data Protection Regulation (GDPR). While not the first, it is the largest penalty levied since the GDPR went into effect in May 2018.

Given this, Oppenheimer & Co. cautions:

“European regulators could accelerate the crackdown on GDPR violators, which in turn could accelerate demand for GDPR readiness. Although the CCPA [California Consumer Privacy Act, the U.S. equivalent of GDPR] will not become effective until 2020, we believe that new developments in GDPR enforcement may influence the regulatory framework of the still fluid CCPA.”

With all the advance notice and significant chatter about GDPR/CCPA, why aren’t organizations more prepared to deal with data regulations?

In a word? Complexity.

The complexity of regulatory requirements in and of themselves is aggravated by the complexity of the business and data landscapes within most enterprises.

So it’s important to understand how to use strategic data governance to manage the complexity of regulatory compliance and other business objectives …

Designing and Operationalizing Regulatory Compliance Strategy

It’s not easy to design and deploy compliance in an environment that’s not well understood and is difficult to maneuver in. First you need to analyze and design your compliance strategy and tactics, and then you need to operationalize them.

Modern, strategic data governance, which involves both IT and the business, enables organizations to plan and document how they will discover and understand their data within context, track its physical existence and lineage, and maximize its security, quality and value. It also helps enterprises put these strategic capabilities into action by:

  • Understanding their business, technology and data architectures and their inter-relationships, aligning them with their goals and defining the people, processes and technologies required to achieve compliance.
  • Creating and automating a curated enterprise data catalog, complete with physical assets, data models, data movement, data quality and on-demand lineage.
  • Activating their metadata to drive agile data preparation and governance through integrated data glossaries and dictionaries that associate policies to enable stakeholder data literacy.


Five Steps to GDPR/CCPA Compliance

With the right technology, GDPR/CCPA compliance can be automated and accelerated in these five steps:

  1. Catalog systems

Harvest, enrich/transform and catalog data from a wide array of sources to enable any stakeholder to see the interrelationships of data assets across the organization.

  2. Govern PII “at rest”

Classify, flag and socialize the use and governance of personally identifiable information regardless of where it is stored.

  3. Govern PII “in motion”

Scan, catalog and map personally identifiable information to understand how it moves inside and outside the organization and how it changes along the way.

  4. Manage policies and rules

Govern business terminology in addition to data policies and rules, depicting relationships to physical data catalogs and the applications that use them with lineage and impact analysis views.

  5. Strengthen data security

Identify regulatory risks and guide the fortification of network and encryption security standards and policies by understanding where all personally identifiable information is stored, processed and used.
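
As a toy illustration of step 2, governing PII “at rest” starts with classifying which columns contain personally identifiable information. The patterns below are deliberately simplistic examples, not production-grade classifiers:

```python
import re

# Minimal example patterns; real PII classifiers use much richer
# dictionaries, validation logic and machine learning.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify_column(values):
    """Return the set of PII categories detected in a column sample."""
    return {
        label
        for label, pattern in PII_PATTERNS.items()
        if any(pattern.search(str(v)) for v in values)
    }

print(classify_column(["alice@example.com", "bob@example.org"]))  # {'email'}
print(classify_column(["order #42", "order #43"]))                # set()
```

Columns flagged this way can then be tagged in the catalog so downstream lineage and security reviews (steps 3 through 5) know exactly where sensitive data lives.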

How erwin Can Help

erwin is the only software provider with a complete, metadata-driven approach to data governance through our integrated enterprise modeling and data intelligence suites. We help customers overcome their data governance challenges, with risk management and regulatory compliance being primary concerns.

However, the erwin EDGE also delivers an “enterprise data governance experience” in terms of agile innovation and business transformation – from creating new products and services to keeping customers happy to generating more revenue.

Whatever your organization’s key drivers are, a strategic data governance approach – through business process, enterprise architecture and data modeling combined with data cataloging and data literacy – is key to success in our modern, digital world.

If you’d like to get a handle on handling your data, you can sign up for a free, one-on-one demo of erwin Data Intelligence.

For more information on GDPR/CCPA, we’ve also published a white paper on the Regulatory Rationale for Integrating Data Management and Data Governance.



Choosing the Right Data Modeling Tool

The need for an effective data modeling tool is more significant than ever.

For decades, data modeling has provided the optimal way to design and deploy new relational databases with high-quality data sources and support application development. But it provides even greater value for modern enterprises where critical data exists in both structured and unstructured formats and lives both on premise and in the cloud.

In today’s hyper-competitive, data-driven business landscape, organizations are awash with data and the applications, databases and schema required to manage it.

For example, an organization may have 300 applications, with 50 different databases and a different schema for each. Additional challenges, such as increasing regulatory pressures – from the General Data Protection Regulation (GDPR) to the Health Insurance Portability and Accountability Act (HIPAA) – and growing stores of unstructured data also underscore the increasing importance of a data modeling tool.

Data modeling, quite simply, describes the process of discovering, analyzing, representing and communicating data requirements in a precise form called the data model. There’s an expression: measure twice, cut once. Data modeling is the upfront “measuring tool” that helps organizations reduce rework and avoid guesswork while design changes are still inexpensive to make.
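To make “a precise form called the data model” concrete, here is a minimal, hypothetical sketch (not erwin-specific) of a logical model fragment: two entities and the relationship between them, captured as reviewable definitions before any database is built. The entity and attribute names are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import date

# A hypothetical fragment of a logical data model: two entities and the
# one-to-many relationship between them, expressed precisely enough to
# review with the business before any physical schema is generated.

@dataclass
class Customer:
    customer_id: int   # primary key
    name: str
    email: str         # personally identifiable - flag for governance

@dataclass
class Order:
    order_id: int      # primary key
    customer_id: int   # foreign key -> Customer.customer_id
    order_date: date
    total: float
```

A modeling tool would render this same information graphically and generate the physical schema from it; the point of the sketch is simply that the requirements are stated once, precisely, up front.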

From a business-outcome perspective, a data modeling tool is used to help organizations:

  • Effectively manage and govern massive volumes of data
  • Consolidate and build applications with hybrid architectures, including traditional, Big Data, cloud and on premise
  • Support expanding regulatory requirements, such as GDPR and the California Consumer Privacy Act (CCPA)
  • Simplify collaboration across key roles and improve information alignment
  • Improve business processes for operational efficiency and compliance
  • Empower employees with self-service access for enterprise data capability, fluency and accountability

Data Modeling Tool

Evaluating a Data Modeling Tool – Key Features

Organizations seeking to invest in a new data modeling tool should consider these four key features.

  1. Ability to visualize business and technical database structures through an integrated, graphical model.

Due to the number of database platforms available, it’s important that an organization’s data modeling tool supports a sufficiently broad array of platforms for its needs. The chosen data modeling tool should be able to read the technical formats of each of these platforms and translate them into highly graphical models rich in metadata. Schema can be deployed from models in an automated fashion and iteratively updated so that new development can take place via model-driven design.

  2. Empower end-user BI/analytics with data source discovery, analysis and integration.

A data modeling tool should give business users confidence in the information they use to make decisions. Such confidence comes from the ability to provide a common, contextual, easily accessible source of data element definitions to ensure they are able to draw upon the correct data; understand what it represents, including where it comes from; and know how it’s connected to other entities.

A data modeling tool can also be used to pull in data sources via self-service BI and analytics dashboards. The data modeling tool should also have the ability to integrate its models into whatever format is required for downstream consumption.

  3. The ability to store business definitions and data-centric business rules in the model along with technical database schemas, procedures and other information.

With business definitions and rules on board, technical implementations can be better aligned with the needs of the organization. Using an advanced design layer architecture, model “layers” can be created with one or more models focused on the business requirements that then can be linked to one or more database implementations. Design-layer metadata can also be connected from conceptual through logical to physical data models.

  4. Rationalize platform inconsistencies and deliver a single source of truth for all enterprise business data.

Many organizations struggle to break down data silos and unify data into a single source of truth, due in large part to varying data sources and the difficulty of managing unstructured data. Being able to model any data from anywhere addresses this with on-demand modeling for non-relational databases that offer speed, horizontal scalability and other real-time application advantages.

With NoSQL support, model structures from non-relational databases, such as Couchbase and MongoDB, can be created automatically. Existing Couchbase and MongoDB data sources can be easily discovered, understood and documented through modeling and visualization. Existing entity-relationship diagrams and SQL databases can be migrated to Couchbase and MongoDB too, with relational schema transformed into query-optimized NoSQL constructs.
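As a minimal sketch of what “query-optimized NoSQL constructs” means in practice (the table and field names are hypothetical, and this is not erwin’s actual transformation logic), a relational one-to-many join can be denormalized into a single embedded document of the kind MongoDB or Couchbase stores:

```python
# Denormalize relational rows (one customer, many orders) into a single
# document. Embedding the child rows removes the need for a join at read
# time, which is the typical query optimization for document stores.

def rows_to_document(customer_row, order_rows):
    return {
        "_id": customer_row["customer_id"],
        "name": customer_row["name"],
        "orders": [
            {"order_id": o["order_id"], "total": o["total"]}
            for o in order_rows
            if o["customer_id"] == customer_row["customer_id"]
        ],
    }

customer = {"customer_id": 1, "name": "Acme Corp"}
orders = [
    {"order_id": 10, "customer_id": 1, "total": 250.0},
    {"order_id": 11, "customer_id": 1, "total": 99.5},
]
doc = rows_to_document(customer, orders)
```

Whether to embed or reference child records is itself a modeling decision, which is exactly why tool support for visualizing and iterating on these structures matters.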

Other considerations include the ability to:

  • Compare models and databases.
  • Increase enterprise collaboration.
  • Perform impact analysis.
  • Enable business and IT infrastructure interoperability.

When it comes to data modeling, no one knows it better. For more than 30 years, erwin Data Modeler has been the market leader. It is built on the vision and experience of data modelers worldwide and is the de facto standard in data model integration.

You can learn more about driving business value and underpinning governance with erwin DM in this free white paper.

Data Modeling Drives Business Value

Categories
erwin Expert Blog

The Importance of EA/BP for Mergers and Acquisitions

Over the past few weeks several huge mergers and acquisitions (M&A) have been announced, including Raytheon and United Technologies, the Salesforce acquisition of Tableau and the Merck acquisition of Tilos Therapeutics.

According to collated research and a Harvard Business Review report, the M&A failure rate sits between 70 and 90 percent. Additionally, McKinsey estimates that around 70 percent of mergers do not achieve their expected “revenue synergies.”

Combining two organizations into one is complicated. And following a merger or acquisition, businesses typically find themselves with duplicate applications and business capabilities that are costly and redundant, making alignment difficult.

Enterprise architecture is essential to successful mergers and acquisitions. It helps alignment by providing a business-outcome perspective for IT and guiding transformation. It also helps define strategy and models, improving interdepartmental cohesion and communication. Roadmaps can be used to provide a common focus throughout the new company, and if existing roadmaps are in place, they can be modified to fit the new landscape.

Additionally, an organization must understand both sets of processes being brought to the table. Without business process modeling, this is near impossible.

In an M&A scenario, businesses need to ensure their systems are fully documented and rationalized. This way, they can comb through their inventories to make more informed decisions about which systems to cut or phase out to operate more efficiently and then deliver the roadmap to enable those changes.

Mergers and Acquisitions

Getting Rid of Duplications

Mergers and acquisitions are daunting. Depending on the size of the businesses, hundreds of systems and processes need to be accounted for, which can be difficult, and even impossible to do in advance.

Enterprise architecture aids in rooting out process and operational duplications, making the new entity more cost efficient. Needless to say, the behind-the-scenes complexities are many and can include discovering that the merging enterprises use the same solution but under different names in different parts of the organizations, for example.

Determinations also may need to be made about whether particular functions that are expected to become business-critical have a solid, scalable base to build upon. If an existing application won’t be able to handle the increased data load and processing, then previously planned investments in it can be avoided.

Gaining business-wide visibility of data and enterprise architecture all within a central repository enables relevant parties across merging companies to work from a single source of information. This provides insights to help determine whether, for example, two equally adept applications of the same nature can continue to be used as the companies merge, because they share common underlying data infrastructures that make it possible for them to interoperate across a single source of synched information.

Or, in another scenario, it may be obvious that it is better to keep only one of the applications because it alone serves as the system of record for what the organization has determined are valuable conceptual data entities in its data model.

At the same time, it can reveal the location of data that might otherwise have been unwittingly discharged with the elimination of an application, enabling it to be moved to a lower-cost storage tier for potential future use.

Knowledge Retention – Avoiding Brain Drain

When employees come and go, as they tend to during mergers and acquisitions, they take critical institutional knowledge with them.

Unlocking knowledge and then putting systems in place to retain that knowledge is one key benefit of business process modeling. Knowledge retention and training has become a pivotal area in which businesses will either succeed or fail.

Different organizations tend to speak different languages. For instance, one company might refer to a customer as “customer,” while another might refer to them as a “client.” Business process modeling is a great way to get everybody in the organization using the same language, referring to things in the same way.

Drawing out this knowledge then allows a centralized and uniform process to be adopted across the company. In any department within any company, individuals and teams develop processes for doing things. Business process modeling extracts all these pieces of information from individuals and teams so they can be turned into centrally adopted processes.

 

[FREE EBOOK] Application Portfolio Management For Mergers & Acquisitions 

 

Ensuring Compliance

Industry and government regulations affect businesses that work in or do business with any number of industries or in specific geographies. Industry-specific regulations in areas like healthcare, pharmaceuticals and financial services have been in place for some time.

Now, broader mandates like the European Union’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) require businesses across industries to think about their compliance efforts. Business process modeling helps organizations prove what they are doing to meet compliance requirements and understand how changes to their processes impact compliance efforts (and vice versa).

In highly regulated industries like financial services and pharmaceuticals, where mergers and acquisitions activity is frequent, identifying and standardizing business processes helps withstand the scrutiny of regulatory compliance.

Business process modeling makes it easier to document processes, align documentation within document control and learning management systems, and give R&D employees easy access and intuitive navigation so they can find the information they need.

Introducing Business Architecture

Organizations often use the terms “business process architecture” and “enterprise architecture” interchangeably because both are strategic functions with many interdependencies.

However, business process architecture defines the elements of a business and how they interact with the aim of aligning people, processes, data, technologies and applications. Enterprise architecture defines the structure and operation of an organization with the purpose of determining how it can achieve its current and future objectives most effectively, translating those goals into a blueprint of IT capabilities.

Although both disciplines seek to achieve the organization’s desired outcomes, they have largely operated in silos.

To learn more about how erwin provides modeling and analysis software to support both business process and enterprise architecture practices and enable their broader collaboration, click here.

Cloud-based enterprise architecture and business process

Categories
erwin Expert Blog

Keeping Up with New Data Protection Regulations

Keeping up with new data protection regulations can be difficult, and the latest – the General Data Protection Regulation (GDPR) – isn’t the only new data protection regulation organizations should be aware of.

California recently passed a law that gives residents the right to control the data companies collect about them. Some suggest the California Consumer Privacy Act (CCPA), which takes effect January 1, 2020, sets a precedent other states will follow by empowering consumers to set limits on how companies can use their personal information.

In fact, organizations should expect increasing pressure on lawmakers to introduce new data protection regulations. A number of high-profile data breaches and scandals have increased public awareness of the issue.

Facebook was in the news again last week for another major problem around the transparency of its user data, and the tech-giant also is reportedly facing 10 GDPR investigations in Ireland – along with Apple, LinkedIn and Twitter.

Some industries, such as healthcare and financial services, have been subject to stringent data regulations for years: GDPR now joins the Health Insurance Portability and Accountability Act (HIPAA), the Payment Card Industry Data Security Standard (PCI DSS) and the Basel Committee on Banking Supervision (BCBS).

Due to these pre-existing regulations, organizations operating within these sectors, as well as insurance, had some of the GDPR compliance bases covered in advance.

Other industries had their own levels of preparedness, based on the nature of their operations. For example, many retailers have robust, data-driven e-commerce operations that are international. Such businesses are bound to comply with varying local standards, especially when dealing with personally identifiable information (PII).

Smaller, more brick-and-mortar-focused retailers may have had to start from scratch.

But starting position aside, every data-driven organization should strive for a better standard of data management, and not just for compliance’s sake. After all, organizations are now realizing that data is one of their most valuable assets.

New Data Protection Regulations – Always Be Prepared

When it comes to new data protection regulations in the face of constant data-driven change, it’s a matter of when, not if.

As they say, the best defense is a good offense. Fortunately, whenever the time comes, the first port of call will always be data governance, so organizations can prepare.

Effective compliance with new data protection regulations requires a robust understanding of the “what, where and who” in terms of data and the stakeholders with access to it (i.e., employees).

The Regulatory Rationale for Integrating Data Management & Data Governance

This is also true for existing data regulations. Compliance is an on-going requirement, so efforts to become compliant should not be treated as static events.

Less than four months before GDPR came into effect, only 6 percent of enterprises claimed they were prepared for it. Many of these organizations will recall a number of stressful weeks – or even months – tidying up their databases and their data management processes and policies.

This time and money was spent reactively, at the expense of proactive efforts to grow the business.

The implementation and subsequent observation of a strong data governance initiative ensures organizations won’t be put on the spot going forward. Should an audit come up, current projects won’t suddenly be derailed by a reenactment of the pre-GDPR panic.

New Data Regulations

Data Governance: The Foundation for Compliance

The first step to compliance with new – or old – data protection regulations is data governance.

A robust and effective data governance initiative ensures an organization understands where security should be focused.

By adopting a data governance platform that enables you to automatically tag sensitive data and track its lineage, you can ensure nothing falls through the cracks.

Your chosen data governance solution should enable you to automate the scanning, detection and tagging of sensitive data by:

  • Monitoring and controlling sensitive data – Gain better visibility and control across the enterprise to identify data security threats and reduce associated risks.
  • Enriching business data elements for sensitive data discovery – By leveraging a comprehensive mechanism to define business data elements for PII, PHI and PCI across database systems, cloud and Big Data stores, you can easily identify sensitive data based on a set of algorithms and data patterns.
  • Providing metadata and value-based analysis – Simplify the discovery and classification of sensitive data based on metadata and data value patterns and algorithms. Organizations can define business data elements and rules to identify and locate sensitive data, including PII, PHI and PCI.
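The bullets above describe combining metadata analysis with value patterns. As a minimal, illustrative sketch of that idea (the regexes, tags and threshold are simplified assumptions, not erwin’s actual detection algorithms), a column can be tagged by checking both its name and a sample of its values:

```python
import re

# Simplified value patterns for two common sensitive-data elements.
# Real platforms use richer pattern libraries and scoring logic.
PATTERNS = {
    "ssn":   re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "email": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
}

def tag_column(column_name, sample_values):
    """Return the set of sensitivity tags suggested for a column."""
    tags = set()
    # Metadata-based hint: the column name itself.
    if "ssn" in column_name.lower():
        tags.add("ssn")
    # Value-based detection: do most sampled values match a pattern?
    for tag, pattern in PATTERNS.items():
        matches = sum(1 for v in sample_values if pattern.match(v))
        if sample_values and matches / len(sample_values) >= 0.8:
            tags.add(tag)
    return tags
```

Once a column is tagged, the tag can drive downstream governance actions such as masking, access restrictions or lineage tracking.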

With these precautionary steps, organizations are primed to respond if a data breach occurs. Having a well governed data ecosystem with data lineage capabilities means issues can be quickly identified.

Additionally, if any follow-up is necessary – such as with GDPR’s data breach reporting time requirements – it can be handled swiftly and in accordance with regulations.

It’s also important to understand that the benefits of data governance don’t stop with regulatory compliance.

A better understanding of what data you have, where it’s stored and the history of its use and access isn’t only beneficial in fending off non-compliance repercussions. In fact, such an understanding is arguably better put to use proactively.

Data governance improves data quality standards, enables better decision-making and ensures businesses can have more confidence in the data informing those decisions.

The same mechanisms that protect data by controlling its access also can be leveraged to make data more easily discoverable to approved parties – improving operational efficiency.

All in all, data governance’s cumulative influence on data-driven businesses both drives revenue (through greater efficiency) and reduces costs (fewer errors, false starts, etc.).

To learn more about data governance and the regulatory rationale for its implementation, get our free guide here.

DG RediChek

Categories
erwin Expert Blog

Google’s Record GDPR Fine: Avoiding This Fate with Data Governance

The General Data Protection Regulation (GDPR) made its first real impact as Google’s record GDPR fine dominated news cycles.

Historically, fines had peaked at six figures with the U.K.’s Information Commissioner’s Office (ICO) fines of 500,000 pounds ($650,000 USD) against both Facebook and Equifax for their data protection breaches.

Experts predicted an uptick in GDPR enforcement in 2019, and Google’s recent record GDPR fine has brought that to fruition. France’s data privacy enforcement agency hit the tech giant with a $57 million penalty – more than 80 times the steepest ICO fine.

If it can happen to Google, no organization is safe. Many in fact still lag in the GDPR compliance department. Cisco’s 2019 Data Privacy Benchmark Study reveals that only 59 percent of organizations are meeting “all or most” of GDPR’s requirements.

So many more GDPR violations are likely to come to light. And even organizations that are currently compliant can’t afford to let their data governance standards slip.

Data Governance for GDPR

Google’s record GDPR fine makes the rationale for better data governance clear enough. However, the Cisco report offers even more insight into the value of achieving and maintaining compliance.

Organizations with GDPR-compliant security measures are not only less likely to suffer a breach (74 percent vs. 89 percent), but the breaches suffered are less costly too, with fewer records affected.

However, applying such GDPR-compliant provisions can’t be done on a whim; organizations must expand their data governance practices to include compliance.

GDPR White Paper

A robust data governance initiative provides a comprehensive picture of an organization’s systems and the units of data contained or used within them. This understanding encompasses not only the original instance of a data unit but also its lineage and how it has been handled and processed across an organization’s ecosystem.
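The lineage idea described above can be sketched as a directed graph of data assets, where impact analysis is a traversal of everything downstream of a given asset. This is a toy illustration with hypothetical asset names, not a description of any particular product’s internals:

```python
from collections import defaultdict

# Toy lineage graph: each edge points from a data asset to an asset
# derived from it. Asset names here are hypothetical.
lineage = defaultdict(list)

def record_flow(source, target):
    lineage[source].append(target)

def downstream(asset):
    """Impact analysis: every asset affected by a change to `asset`."""
    seen, stack = set(), [asset]
    while stack:
        for child in lineage[stack.pop()]:
            if child not in seen:
                seen.add(child)
                stack.append(child)
    return seen

record_flow("crm.customers", "warehouse.dim_customer")
record_flow("warehouse.dim_customer", "reports.churn_dashboard")
```

With such a graph in place, knowing that `crm.customers` holds personal data immediately tells you which downstream systems inherit the same protection requirements.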

With this information, organizations can apply the relevant degrees of security where necessary, ensuring expansive and efficient protection from external (i.e., breaches) and internal (i.e., mismanaged permissions) data security threats.

Although data security cannot be wholly guaranteed, these measures can help identify and contain breaches to minimize the fallout.

Looking at Google’s Record GDPR Fine as An Opportunity

The benefits of GDPR compliance extend beyond avoiding fines to greater agility and innovation and better data discovery and management. So arguably, these “tertiary” benefits of data governance should take center stage.

While once exploited by such innovators as Amazon and Netflix, data optimization and governance is now on everyone’s radar.

So organizations need another competitive differentiator.

An enterprise data governance experience (EDGE) provides just that.

THE REGULATORY RATIONALE FOR INTEGRATING DATA MANAGEMENT & DATA GOVERNANCE

This approach unifies data management and data governance, ensuring that the data landscape, policies, procedures and metrics stem from a central source of truth so data can be trusted at any point throughout its enterprise journey.

With an EDGE, the Any2 (any data from anywhere) data management philosophy applies – whether structured or unstructured, in the cloud or on premise. An organization’s data preparation (data mapping), enterprise modeling (business, enterprise and data) and data governance practices all draw from a single metadata repository.

In fact, metadata from a multitude of enterprise systems can be harvested and cataloged automatically. And with intelligent data discovery, sensitive data can be tagged and governed automatically as well – think GDPR as well as HIPAA, BCBS and CCPA.

Organizations without an EDGE can still achieve regulatory compliance, but data silos and the associated bottlenecks are unavoidable without integration and automation – not to mention longer timeframes and higher costs.

To get an “edge” on your competition, consider the erwin EDGE platform for greater control over and value from your data assets.

Data preparation/mapping is a great starting point and a key component of the software portfolio. Join us for a weekly demo.

Automate Data Mapping