Categories
erwin Expert Blog

Documenting and Managing Governance, Risk and Compliance with Business Process

Managing an organization’s governance, risk and compliance (GRC) via its enterprise and business architectures means managing them against business processes (BP).

Shockingly, a lot of organizations, even today, manage this through homemade tools: documents, checklists, Excel files, custom-made databases and so on. Organizations tend to still operate in this manual and disparate way for three main reasons:

  1. Cost
  2. Governance, risk and compliance are treated as isolated bubbles.
  3. Data-related risks are not connected with the data architects/data scientists.

If we look at this past year, COVID-19 fundamentally changed everything overnight – and it was something that nobody could have anticipated. However, only organizations that had their risks mapped at the process level could see their operational risk profiles and also see what processes needed adjustments – quickly.

Furthermore, by linking compliance with process, those organizations were prepared to answer very specific compliance questions. For example, a customer might ask, “Since most of your employees are working from home now, how can you ensure that my data is not shared with their kids?” Organizations with business process mapping could respond, “We have anticipated these kinds of risks and implemented the following controls, and this is how we protect you at different layers.”

Every company must understand its business processes, particularly those in industries in which quality, regulatory, health, safety or environmental standards are serious considerations. BP modeling and analysis shows process flows, system interactions and organizational hierarchies to identify areas for improvement as well as practices susceptible to the greatest security, compliance or other risks so controls and audits can be implemented to mitigate exposures.

Connecting the GRC, Data and Process Layers

The GRC layer comprises mandatory components like risks, controls and compliance elements. Traditionally, these are manually documented, monitored and managed.

For example, if tomorrow you decide you want ISO (International Organization for Standardization) 27001 compliance for your information security management system, you can go to the appropriate ISO site and download the entire standard, with all the assessments, descriptions, mandates, questions and documents you will need to provide. All of these items would comprise the GRC layer.

However, many organizations maintain Excel files with risk and control information and other Office files with compliance information, all in isolation. Or some of these files are uploaded to various systems, but they don’t talk to each other or to any other enterprise systems for that matter. This is the data layer, which is factual, objective and, as opposed to the GRC layer, can be either fully or partly automated.

Now, let’s add the process layer to the equation. Why? Because that is where the GRC and data layers meet. How? Processes produce, process and consume data – information captured in the metadata layer. By following the process sequence, I can actually trace the data lineage as it flows across the entire business ecosystem, beyond the application layer.
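To make the idea concrete, here is a minimal sketch of tracing lineage by walking a process sequence backwards. The process list and field names are invented for illustration; this is not an erwin data structure.

```python
# Hypothetical process metadata: each step lists the data sets it
# consumes and produces. Names are illustrative only.
PROCESS_STEPS = [
    {"name": "Capture Order",  "consumes": [],                         "produces": ["order"]},
    {"name": "Verify Payment", "consumes": ["order"],                  "produces": ["payment_record"]},
    {"name": "Ship Goods",     "consumes": ["order", "payment_record"], "produces": ["shipment"]},
]

def trace_lineage(data_set, steps):
    """Walk the process sequence backwards to find every step
    that feeds (directly or indirectly) into `data_set`."""
    produced_by = {}
    for step in steps:
        for out in step["produces"]:
            produced_by[out] = step

    lineage, seen = [], set()
    frontier = [data_set]
    while frontier:
        current = frontier.pop()
        step = produced_by.get(current)
        if step is None or step["name"] in seen:
            continue
        seen.add(step["name"])
        lineage.append(step["name"])
        # The data this step consumes carries the lineage further back.
        frontier.extend(step["consumes"])
    return lineage

print(trace_lineage("shipment", PROCESS_STEPS))
# → ['Ship Goods', 'Verify Payment', 'Capture Order']
```

The same backward walk is what lets a process repository answer “where did this data come from?” beyond the application layer.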

Taking it further, from processes, I can look at how the data is being managed by my capabilities. In other words, if I do have a data breach, how do I mitigate it? What impact will it have on my organization? And what are the necessary controls to manage it? Looking at them from right to left, I can identify the affected systems, and I can identify the interfaces between systems.

Mitigating Data Breaches

Most data breaches happen either at the database or interface level. Interfaces are how applications talk to each other.

Organizations are showing immense interest in expanding the development of risk profiles, not only for isolated layers but also for how those layers interact – how applications talk to each other, how processes use data, how data is stored, and how infrastructure is managed. Understanding these profiles allows for more targeted and even preemptive risk mitigation, enabling organizations to fortify their weak points with not only sufficient controls but also practical and effective processes.

We’re moving from a world in which everything is performed manually and in isolation to one that is fully automated and integrated.

erwin shows how to document and manage governance, risk and compliance using business process modeling and the enterprise architecture solution erwin Evolve.

The C-Level Demands GRC Real-Time Impact Analysis

Impact analysis is critical. Everything needs to be clearly documented, covering all important and relevant aspects. No service, capability or delivery process is considered complete unless the risks and controls that affect it, or are implemented through it, are mapped and that assessment is used to generate risk profiles for the process, service or capability. And the demand for this to happen automatically increases daily.

This is now one of the key mandates across many organizations. C-level executives now demand risk profile dashboards at the process, organizational and local levels.

For example, an executive traveling from one country to another, or from one continent to another, can make a query: “I’m traveling to X, so what is the country’s risk profile, and how is it being managed? What do I need to be aware of or address while I’m there?” Or when new legislation is introduced affecting multiple countries, its impact on those countries’ risk profiles can be quickly and accurately calculated and actions planned accordingly.

erwin Evolve

GRC is more critical than ever. Organizations, and specifically the C-suite, are demanding to see risk profiles for different slices of a particular process. But this is impossible without automation.

erwin Evolve is a full-featured, configurable enterprise architecture (EA) and BP modeling and analysis software suite that aids regulatory and industry compliance and maps business systems that support the enterprise. Its automated visualization, documentation and enterprise collaboration capabilities turn EA and BP artifacts into insights both IT and business users can access in a central location for making strategic decisions and managing GRC.

Please click here to start your free trial of erwin Evolve.


In Times of Rapid Change, Business Process Modeling Becomes a Critical Tool

With the help of business process modeling (BPM), organizations can visualize processes and all the associated information, identifying the areas ripe for innovation, improvement or reorganization.

In the blink of an eye, COVID-19 has disrupted all industries and quickly accelerated their plans for digital transformation. As part of their transformations, businesses are moving quickly from on premises to the cloud and therefore need to create business process models available to everyone within the organization so they understand what data is tied to what applications and what processes are in place.

There’s a clear connection between business process modeling and digital transformation initiatives. With it, an organization can explore models to understand information assets within a business context, from internal operations to full customer experiences.

This practice identifies and drives digital transformation opportunities to increase revenue while limiting risks and avoiding regulatory and compliance gaffes.


Bringing IT and Business Together to Make More Informed Decisions

Developing a shared repository is key to aligning IT systems to accomplish business strategies, reducing the time it takes to make decisions and accelerating solution delivery.

It also serves to operationalize and govern mission-critical information by making it available to the wider enterprise at the right levels to identify synergies and ensure the appropriate collaboration.

One customer says his company realized early on that there’s a difference between business expertise and process expertise, and when you partner the two you really start to see the opportunities for success.

By bringing your business and IT together via BPM, you create a single point of truth within your organization — delivered to stakeholders within the context of their roles.

You then can understand where your data is, how you can find it, how you can monetize it, how you can report on it, and how you can visualize it. You can do all of this in an accessible format, cataloging data, mapping it, tracing its lineage, and tying business and IT together to make more informed decisions.

BPM for Regulatory Compliance

Business process modeling is also critical for risk management and regulatory compliance. When thousands of employees need to know what compliance processes to follow, such as those associated with the European Union’s General Data Protection Regulation (GDPR), ensuring not only access to proper documentation but current, updated information is critical.

Industry and government regulations affect businesses that work in or do business with any number of industries or in specific geographies. Industry-specific regulations in areas like healthcare, pharmaceuticals and financial services have been in place for some time.

Now, broader mandates like GDPR and the California Consumer Privacy Act (CCPA) require businesses across industries to think about their compliance efforts. Business process modeling helps organizations prove what they are doing to meet compliance requirements and understand how changes to their processes impact compliance efforts (and vice versa).

This same customer says, “The biggest bang for the buck is having a single platform, a one-stop shop, for when you’re working with auditors. You go to one place that is your source of truth: Here are our processes; here’s how we have implemented these controls; here is the list of our controls and where they’re implemented in our business.”

He also notes that a single BPM platform “helps cut through a lot of questions and get right to the heart of the matter.” As a result, the company has had positive audit findings and results because they have a structure, a plan, and it’s easy to see the connection between how they’re ensuring their controls are adhered to and where those results are in their business processes.

Change Is Constant

Heraclitus, the Greek philosopher, said, “The only constant in life is change.” This applies to business as well. Today things are changing quite quickly. And in the current landscape, executives are not going to wait around for months as impact analyses are formulated. They want actionable intelligence – fast.

For business process architects, being able to manage change and address key issues is what keeps the job function highly relevant to stakeholders. The key point is that useful change comes from routinely looking at process models and spotting a sub-optimality. Business process modeling supports many beneficial use cases and transformation projects used to empower employees and therefore better serve customers.

Organizational success depends on agility and adaptability in responding to change across the enterprise, both planned and unplanned. To be agile and responsive to changes in markets and consumer demands, you need a visual representation of what your business does and how it does it.

Companies that maintain accurate business process models also are well-positioned to analyze and optimize end-to-end process threads—lead-to-cash, problem-to-resolution or hire-to-retire, for example—that contribute to strategic business objectives, such as improving customer journeys or maximizing employee retention.

They also can slice and dice their models in multiple other ways, such as by functional hierarchies to understand what business groups organize or participate in processes as a step in driving better collaboration or greater efficiencies.

erwin Evolve enables communication and collaboration across the enterprise with reliable tools that make it possible to quickly and accurately gather information, make decisions, and then ensure consistent standards, policies and processes are established and available for consumption internally and externally as required.

Try erwin Evolve for yourself in a no-cost, risk-free trial.


Using Enterprise Architecture for Integration After Mergers and Acquisitions

Because of its holistic view of an organization, enterprise architecture and mergers & acquisitions (M&A) go hand-in-hand.

M&A activity, despite or in light of COVID-19, is on an upswing. The Financial Times reported Google, Amazon, Apple, Facebook and Microsoft have made 19 deals so far this year, according to Refinitiv, the London-based global provider of financial market data. This represents the fastest pace of acquisitions and strategic investments since 2015.

Let’s face it, company mergers, even once approved, can be daunting affairs. Depending on the size of the businesses involved, hundreds of systems and processes need to be accounted for, which can be difficult and often impossible to do in advance.

Following these transactions, businesses typically find themselves with a plethora of duplicate applications and business capabilities that eat into overhead and complicate inter-departmental alignment.

These drawbacks mean businesses have to ensure their systems are fully documented and rationalized. This way the organization can comb through its inventory and make more informed decisions on which systems can and should be cut or phased out, so it can operate closer to peak efficiency and deliver the roadmap to enable the necessary change.


Enterprise Architecture Needs a Seat at the Table

IT professionals have the inside track about the connection that already exists across applications and data – and they’ll be the ones tasked with carrying out whatever technical requirements are in order post-acquisition.

But despite this, they’re rarely part of M&A tech strategy discussions and the synergy between enterprise architecture and mergers & acquisitions is overlooked. That should change.

With IT leaders involved from the start, they can work with the CFO and COO teams on assessing systems and providing advice on costs that might not otherwise be fully accounted for, such as systems and data integration.

Additionally, by leveraging mergers and acquisitions tools in the beginning, IT can provide a collaborative platform for business and technical stakeholders to get a complete view of their data and quickly visualize and assess what’s in place across companies, as well as what integrations, overlaps or other complexities exist.

This is why enterprise architecture for mergers and acquisitions is essential.

EA helps organizational alignment, providing a business-outcome perspective for IT and guiding transformation. It also helps a business define strategy and models, improving interdepartmental cohesion and communication.

Enterprise Architecture roadmaps can also be leveraged to provide a common focus throughout the company, and if existing roadmaps are in place, they can be modified to fit the new landscape.

EA aids in rooting out duplications in processes and operations, making the business more cost efficient on-the-whole.

Two Approaches to Enterprise Architecture

The Makeshift Approach

The first approach is more common in businesses with either no or a low-maturity enterprise architecture initiative. Smaller businesses often start out with this approach, as their limited operations and systems aren’t enough to justify real EA investment. Instead, businesses opt to repurpose tools they already have, such as the Microsoft Office Suite.

This comes with advantages that mainly play out on a short-term basis, with the disadvantages only becoming apparent as the EA develops. For a start, the learning curve is typically smaller, as many people are already familiar with the software, and the cost per license is relatively low when compared with built-for-purpose EA tools.

These short-term advantages will be eclipsed over time as the organization’s EA grows. The ad hoc Office tools approach to EA requires juggling a number of applications and formats, which can stifle effectiveness.

Not only do the operations and systems become too numerous to manage this way, but the disparity between formats also prevents deep analysis. It creates more work for the enterprise architect, too, as the disparate parts of the Office tools must be maintained separately when changes are made to make sure everything is up to date.

This method also increases the likelihood that data is overlooked as key information is siloed, and it isn’t always clear which data set is behind any given door, disrupting efficiency and time to market.

It isn’t just data that’s siloed, though. The Office tools approach can isolate the EA department itself from the wider business, as the aforementioned mismatched formats make collaboration more difficult.

The Dedicated Approach

As an organization’s enterprise architecture grows, investing in dedicated EA tools becomes a necessity, making the transition just a matter of timing.

With a dedicated enterprise architecture tool, EA management is much easier. The data is all stored in one place, allowing for faster, deeper and more comprehensive analysis and comparison.

See also: Getting Started with Enterprise Architecture Tools

Collaboration also benefits from this approach, as having everything housed under one roof makes it far easier to share with stakeholders, decision-makers, C-level executives and other relevant parties.

Benefits of Enterprise Architecture for Mergers & Acquisitions

While organizational mergers can be fraught with many challenges, they don’t have to be so hard.

Enterprise architecture is essential to successful M&A. EA helps document and manage this complexity, turning all this data into meaningful insights.

It helps alignment by providing a business-outcome perspective for IT and guiding transformation. It also helps define strategy and models, improving interdepartmental cohesion and communication.

Roadmaps can be used to provide a common focus throughout the new company, and if existing roadmaps are in place, they can be modified to fit the new landscape.

erwin Evolve is a full-featured, configurable set of enterprise architecture and business process modeling and analysis tools. Use erwin Evolve to effectively tame complexity, manage change, and increase operational efficiency.



Business Process Modeling Use Case: Disaster Recovery

In these challenging times, many of our customers are focused on disaster recovery and business contingency planning.

Disaster recovery is not just an event but an entire process: identifying, preventing and recovering from a loss of technology affecting a high-availability, high-value asset whose services and data are in serious jeopardy.

Technical teams charged with maintaining and executing these processes require detailed tasks, and business process modeling is integral to their documentation.

erwin’s Evolve software is integral to modeling process flow requirements, but what about the technology side of the equation? What questions need answering regarding planning and executing disaster recovery measures?

  • Consumers and Dependencies: Who will be affected if an asset goes offline and for how long? How will consumer downtime adversely affect finances? What are the effects on systems if a dependent system crashes?
  • Interconnectivity: How are systems within the same ecosystem tied together, and what happens if one fails?
  • Hardware and Software: Which assets are at risk in the event of an outage? How does everything tie together if there is a break point?
  • Responsibility: Who are the technical and business owners of servers and enterprise applications? What are their roles in the case of a disastrous event?
  • Fail-Over: What exactly happens when a device fails? How long before the fail-over occurs, and which assets will activate in its place?

The erwin disaster recovery model answers these questions by capturing and displaying the relevant data. That data is then used to automatically render simple drawings that display either a current or target state for disaster recovery analysis.

Reports can be generated to gather more in-depth information. Other drawings can be rendered to show flow, plus how a break in the flow will affect other systems.


So what does an erwin disaster recovery model show?

The erwin model uses a layered ecosystem approach. We first define a company’s logical application ecosystems, which house tightly-coupled technologies and software.

  • For example, a company may have an erwin ecosystem deployed, which consists of various layers: a presentation layer includes web-based products, an application layer holds the client software, a data layer hosts the databases, etc.
  • Each layer is home to a deployment node, which is home to servers, datastores and software. Each node typically will contain a software component and its hosting server.
  • There are both production nodes and disaster recovery nodes.

Our diagrams and data provide answers such as:

  • Which production servers fail over to which disaster recovery servers
  • What effects an outage will have on dependent systems
  • Downtime metrics, including lost revenue and resources required for restoration
  • Hosting information that provides a detailed view of exactly what software is installed on which servers
  • Technology ownership, including both business and technology owners

The attached diagram is a server-to-server view designed to verify that the correct production-to-disaster-recovery relationships exist (example: “prod fails over to DR”). It also is used to identify gaps where no DR servers are in deployment (example: we filter for “deployed” servers only).

Other views can be generated to show business and technology owners, software, databases, etc.  They all are tied to the deployment nodes, which can be configured for various views. Detailed reports with server IP addresses, technical owners, software instances, and internal and external dependencies also can be generated.
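As an illustration of the gap check described above, the following sketch flags deployed production servers that have no deployed DR fail-over target. The records and field names are hypothetical, not erwin Evolve’s actual schema.

```python
# Hypothetical deployment-node records; field names are illustrative.
SERVERS = [
    {"name": "app-prod-01", "role": "production", "status": "deployed", "fails_over_to": "app-dr-01"},
    {"name": "db-prod-01",  "role": "production", "status": "deployed", "fails_over_to": None},
    {"name": "app-dr-01",   "role": "dr",         "status": "deployed", "fails_over_to": None},
    {"name": "web-prod-02", "role": "production", "status": "retired",  "fails_over_to": None},
]

def dr_gaps(servers):
    """Return deployed production servers with no deployed DR
    fail-over target -- the gaps a server-to-server view surfaces."""
    dr_names = {
        s["name"]
        for s in servers
        if s["role"] == "dr" and s["status"] == "deployed"
    }
    return [
        s["name"]
        for s in servers
        if s["role"] == "production"
        and s["status"] == "deployed"       # filter for deployed servers only
        and s["fails_over_to"] not in dr_names
    ]

print(dr_gaps(SERVERS))  # db-prod-01 has no DR server in deployment
```

Run against a full repository instead of a toy list, the same query answers “which production servers fail over to which DR servers” and exposes every server left unprotected.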

You can try erwin Evolve for yourself and keep any content you produce should you decide to buy.

Our solution strategists and business process consultants also are available to help answer questions about your disaster recovery process modeling needs.



Enterprise Architecture and Business Process Modeling Tools Have Evolved

Enterprise architecture (EA) and business process (BP) modeling tools are evolving at a rapid pace. They are being employed more strategically across the wider organization to transform some of business’s most important value streams.

Recently, Glassdoor named enterprise architecture the top tech job in the UK, indicating its increasing importance to the enterprise in the tech and data-driven world.

Whether documenting systems and technology, designing processes and value streams, or managing innovation and change, organizations need flexible but powerful EA and BP tools they can rely on for collecting relevant information for decision-making.

It’s like constructing a building or even a city – you need a blueprint to understand what goes where, how everything fits together to support the structure, where you have room to grow, and if it will be feasible to knock down any walls if you need to.

 


 

Without a picture of what’s what and the interdependencies, your enterprise can’t make changes at speed and scale to serve its needs.

Recognizing this evolution, erwin has enhanced and repackaged its EA/BP platform as erwin Evolve.

The combined solution enables organizations to map IT capabilities to the business functions they support and determine how people, processes, data, technologies and applications interact to ensure alignment in achieving enterprise objectives.

These initiatives can include digital transformation, cloud migration, portfolio and infrastructure rationalization, regulatory compliance, mergers and acquisitions, and innovation management.

Regulatory Compliance Through Enterprise Architecture & Business Process Modeling Software

A North American banking group is using erwin Evolve to integrate information across the organization and provide better governance to boost business agility. Developing a shared repository was key to aligning IT systems to accomplish business strategies, reducing the time it takes to make decisions, and accelerating solution delivery.

It also operationalizes and governs mission-critical information by making it available to the wider enterprise at the right levels to identify synergies and ensure the appropriate collaboration.

EA and BP modeling are both critical for risk management and regulatory compliance, a major concern for financial services customers like the one above when it comes to ever-changing regulations on money laundering, fraud and more. erwin helps model, manage and transform mission-critical value streams across industries, as well as identify sensitive information.

Additionally, when thousands of employees need to know what compliance processes to follow, such as those associated with regulations like the General Data Protection Regulation (GDPR), ensuring not only access to proper documentation but current, updated information is critical.

The Advantages of Enterprise Architecture & Business Process Modeling from erwin

The power to adapt the EA/BP platform leads global giants in critical infrastructure, financial services, healthcare, manufacturing and pharmaceuticals to deploy what is now erwin Evolve for both EA and BP use cases. Its unique advantages are:

  • Integrated, Web-Based Modeling & Diagramming: Harmonize EA/BP capabilities with a robust, flexible and web-based modeling and diagramming interface easy for all stakeholders to use.
  • High-Performance, Scalable & Centralized Repository: See an integrated set of views for EA and BP content in a central, enterprise-strength repository capable of supporting thousands of global users.
  • Configurable Platform with Role-Based Views: Configure the metamodel, frameworks and user interface for an integrated, single source of truth with different views for different stakeholders based on their roles and information needs.
  • Visualizations & Dashboards: View mission-critical data in the central repository in the form of user-friendly automated visualizations, dashboards and diagrams.
  • Third-Party Integrations: Synchronize data with such enterprise applications as CAST, Cloud Health, RSA Archer, ServiceNow and Zendesk.
  • Professional Services: Tap into the knowledge of our veteran EA and BP consultants for help with customizations and integrations, including support for ArchiMate.

erwin Evolve 2020’s specific enhancements include web-based diagramming for non-IT users, stronger document generation and analytics, TOGAF support, improved modeling and navigation through inferred relationships, new API extensions, and modular packaging so customers can choose the components that best meet their needs.

erwin Evolve is also part of the erwin EDGE with data modeling, data catalog and data literacy capabilities for overall data intelligence.



Data Governance Makes Data Security Less Scary

Happy Halloween!

Do you know where your data is? What data you have? Who has had access to it?

These can be frightening questions for an organization to answer.

Add to the mix the potential for a data breach followed by non-compliance, reputational damage and financial penalties and a real horror story could unfold.

In fact, we’ve seen some frightening ones play out already:

  1. Google’s record GDPR fine – France’s data privacy enforcement agency hit the tech giant with a $57 million penalty in early 2019 – more than 80 times the steepest fine the U.K.’s Information Commissioner’s Office had levied against both Facebook and Equifax for their data breaches.
  2. In July 2019, British Airways received the biggest GDPR fine to date ($229 million) because the data of more than 500,000 customers was compromised.
  3. Marriott International was fined $123 million, or 1.5 percent of its global annual revenue, because 330 million hotel guests were affected by a breach in 2018.

Now, as Cybersecurity Awareness Month comes to a close – and ghosts and goblins roam the streets – we thought it a good time to resurrect some guidance on how data governance can make data security less scary.

We don’t want you to be caught off guard when it comes to protecting sensitive data and staying compliant with data regulations.


Don’t Scream; You Can Protect Your Sensitive Data

It’s easier to protect sensitive data when you know what it is, where it’s stored and how it needs to be governed.

Data security incidents may be the result of not having a true data governance foundation that makes it possible to understand the context of data – what assets exist and where, the relationship between them and enterprise systems and processes, and how and by what authorized parties data is used.

That knowledge is critical to supporting efforts to keep relevant data secure and private.

Without data governance, organizations don’t have visibility of the full data landscape – linkages, processes, people and so on – to propel more context-sensitive security architectures that can better assure expectations around user and corporate data privacy. In sum, they lack the ability to connect the dots across governance, security and privacy – and to act accordingly.

Data governance addresses these fundamental questions:

  1. What private data do we store and how is it used?
  2. Who has access and permissions to the data?
  3. What data do we have and where is it?

Where Are the Skeletons?

Data is a critical asset used to operate, manage and grow a business. While sometimes at rest in databases, data lakes and data warehouses, a large percentage is federated and integrated across the enterprise, introducing governance, manageability and risk issues that must be managed.

Knowing where sensitive data is located and properly governing it with policy rules, impact analysis and lineage views is critical for risk management, data audits and regulatory compliance.

However, when key data isn’t discovered, harvested, cataloged, defined and standardized as part of integration processes, audits may be flawed and therefore your organization is at risk.

Sensitive data – at rest or in motion – that exists in various forms across multiple systems must be automatically tagged, its lineage automatically documented, and its flows depicted so that it is easily found and its usage across workflows easily traced.

Thankfully, tools are available to help automate the scanning, detection and tagging of sensitive data by:

  • Monitoring and controlling sensitive data: Better visibility and control across the enterprise to identify data security threats and reduce associated risks
  • Enriching business data elements for sensitive data discovery: Comprehensively defining business data elements for PII, PHI and PCI across database systems, cloud and Big Data stores to easily identify sensitive data based on a set of algorithms and data patterns
  • Providing metadata and value-based analysis: Discovery and classification of sensitive data based on metadata and data value patterns and algorithms. Organizations can define business data elements and rules to identify and locate sensitive data including PII, PHI, PCI and other sensitive information.
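The pattern-based discovery these tools perform can be sketched in a few lines of Python. The patterns, field names and sample data below are illustrative assumptions, not any vendor’s actual detection rules:

```python
import re

# Illustrative detection patterns for common sensitive-data types (PII/PCI).
# Real tools combine many more patterns with metadata and value-distribution checks.
SENSITIVE_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify_value(value: str) -> list[str]:
    """Return the sensitive-data tags that match a single value."""
    return [tag for tag, pattern in SENSITIVE_PATTERNS.items() if pattern.search(value)]

def tag_columns(rows: list[dict]) -> dict[str, set[str]]:
    """Scan tabular data and tag each column with the sensitive types found in it."""
    tags: dict[str, set[str]] = {}
    for row in rows:
        for column, value in row.items():
            for tag in classify_value(str(value)):
                tags.setdefault(column, set()).add(tag)
    return tags

sample = [{"name": "Ana", "contact": "ana@example.com", "ssn": "123-45-6789"}]
print(tag_columns(sample))  # {'contact': {'email'}, 'ssn': {'us_ssn'}}
```

Once columns are tagged this way, the tags can feed downstream controls such as masking rules or access policies.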

No Hocus Pocus

Truly understanding an organization’s data, including its value and quality, requires a harmonized approach embedded in business processes and enterprise architecture.

Such an integrated enterprise data governance experience helps organizations understand what data they have, where it is, where it came from, its value, its quality and how it’s used and accessed by people and applications.

An ounce of prevention is worth a pound of cure. It spares you the painstaking process of identifying what happened and why, and of notifying customers that their data – and thus their trust in your organization – has been compromised.

A well-formed security architecture that is driven by and aligned with data intelligence is your best defense. However, given nefarious intent, a hacker will find a way, so being prepared means you can minimize your risk exposure and the damage to your reputation.

Multiple components must be considered to effectively support a data governance, security and privacy trinity. They are:

  1. Data models
  2. Enterprise architecture
  3. Business process models

Creating policies for data handling and accountability and driving culture change so people understand how to properly work with data are two important components of a data governance initiative, as is the technology for proactively managing data assets.

Without the ability to harvest metadata schemas and business terms; analyze data attributes and relationships; impose structure on definitions; and view all data in one place according to each user’s role within the enterprise, businesses will be hard pressed to stay in step with governance standards and best practices around security and privacy.

As a consequence, the private information held within organizations will continue to be at risk.

Organizations suffering data breaches will be deprived of the benefits they had hoped to realize from the money spent on security technologies and the time invested in developing data privacy classifications.

They also may face heavy fines and other financial, not to mention PR, penalties.


Very Meta … Unlocking Data’s Potential with Metadata Management Solutions

Untapped data, if mined, represents tremendous potential for your organization. While there has been a lot of talk about big data over the years, the real hero in unlocking the value of enterprise data is metadata, or the data about the data.

However, most organizations don’t use all the data they’re flooded with to reach deeper conclusions about how to drive revenue, achieve regulatory compliance or make other strategic decisions. They don’t know exactly what data they have or even where some of it is.

Quite honestly, knowing what data you have and where it lives is complicated. And to truly understand it, you need to be able to create and sustain an enterprise-wide view of and easy access to underlying metadata.

This isn’t an easy task. Organizations are dealing with numerous data types and data sources that were never designed to work together and data infrastructures that have been cobbled together over time with disparate technologies, poor documentation and with little thought for downstream integration.

As a result, the applications and initiatives that depend on a solid data infrastructure may be compromised, leading to faulty analysis and insights.

Metadata Is the Heart of Data Intelligence

A recent IDC Innovators: Data Intelligence Report says that getting answers to such questions as “where is my data, where has it been, and who has access to it” requires harnessing the power of metadata.

Metadata is generated every time data is captured at a source, accessed by users, moves through an organization, and then is profiled, cleansed, aggregated, augmented and used for analytics to guide operational or strategic decision-making.

In fact, data professionals spend 80 percent of their time looking for and preparing data and only 20 percent of their time on analysis, according to IDC.

To flip this 80/20 rule, they need an automated metadata management solution for:

• Discovering data – Identify and interrogate metadata from various data management silos.
• Harvesting data – Automate the collection of metadata from various data management silos and consolidate it into a single source.
• Structuring and deploying data sources – Connect physical metadata to specific data models, business terms, definitions and reusable design standards.
• Analyzing metadata – Understand how data relates to the business and what attributes it has.
• Mapping data flows – Identify where to integrate data and track how it moves and transforms.
• Governing data – Develop a governance model to manage standards, policies and best practices and associate them with physical assets.
• Socializing data – Empower stakeholders to see data in one place and in the context of their roles.
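The harvest-and-structure steps above can be sketched as a tiny, hypothetical catalog: metadata records from multiple silos are consolidated into one store, then linked to business terms. The classes and names here are illustrative, not a real metadata manager’s API:

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    name: str                # physical asset, e.g. "crm.customers.email"
    source: str              # the silo the metadata was harvested from
    business_term: str = ""  # glossary term linked during "structuring"
    attributes: dict = field(default_factory=dict)

class Catalog:
    """A single consolidated store for metadata harvested from many silos."""

    def __init__(self):
        self.assets: dict[str, Asset] = {}

    def harvest(self, silo: str, records: list[dict]) -> None:
        """Collect metadata records from one silo into the shared catalog."""
        for rec in records:
            self.assets[rec["name"]] = Asset(
                rec["name"], silo, attributes=rec.get("attributes", {})
            )

    def link_term(self, asset_name: str, term: str) -> None:
        """Connect physical metadata to a business term (the 'structuring' step)."""
        self.assets[asset_name].business_term = term

catalog = Catalog()
catalog.harvest("crm_db", [{"name": "crm.customers.email",
                            "attributes": {"type": "varchar"}}])
catalog.link_term("crm.customers.email", "Customer Email Address")
print(catalog.assets["crm.customers.email"].business_term)  # Customer Email Address
```

Real solutions automate the scanning behind `harvest` against live databases, file systems and data models; the point is that every silo lands in one queryable structure.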

Addressing the Complexities of Metadata Management

The complexities of metadata management can be addressed with a strong data management strategy coupled with metadata management software to enable the data quality the business requires.

This encompasses data cataloging (integration of data sets from various sources), mapping, versioning, business rules and glossary maintenance, and metadata management (associations and lineage).

erwin has developed the only data intelligence platform that provides organizations with a complete and contextual depiction of the entire metadata landscape.

It is the only solution that can automatically harvest, transform and feed metadata from operational processes, business applications and data models into a central data catalog, and then make it accessible and understandable within the context of role-based views.

erwin’s ability to integrate and continuously refresh metadata from an organization’s entire data ecosystem, including business processes, enterprise architecture and data architecture, forms the foundation for enterprise-wide data discovery, literacy, governance and strategic usage.

Organizations then can take a data-driven approach to business transformation, speed to insights, and risk management.
With erwin, organizations can:

1. Deliver a trusted metadata foundation through automated metadata harvesting and cataloging
2. Standardize data management processes through a metadata-driven approach
3. Centralize data-driven projects around centralized metadata for planning and visibility
4. Accelerate data preparation and delivery through metadata-driven automation
5. Master data management platforms through metadata abstraction
6. Accelerate data literacy through contextual metadata enrichment and integration
7. Leverage a metadata repository to derive lineage, impact analysis and enable audit/oversight ability

With erwin Data Intelligence as part of the erwin EDGE platform, you know what data you have, where it is, where it’s been and how it transformed along the way, plus you can understand sensitivities and risks.

With an automated, real-time, high-quality data pipeline, enterprise stakeholders can base strategic decisions on a full inventory of reliable information.

Many of our customers are hard at work addressing metadata management challenges, and that’s why erwin was named a Leader in Gartner’s “2019 Magic Quadrant for Metadata Management Solutions.”


Business Process Can Make or Break Data Governance

Data governance isn’t a one-off project with a defined endpoint. It’s an on-going initiative that requires active engagement from executives and business leaders.

Data governance, today, comes back to the ability to understand critical enterprise data within a business context, track its physical existence and lineage, and maximize its value while ensuring quality and security.


Historically, little attention has been paid to what can make or break any data governance initiative, turning it from a launchpad for competitive advantage into a recipe for disaster. Data governance success hinges on business process modeling and enterprise architecture.

To put it even more bluntly, successful data governance* must start with business process modeling and analysis.

*See: Three Steps to Successful & Sustainable Data Governance Implementation


Passing the Data Governance Ball

For years, data governance was the volleyball passed back and forth over the net between IT and the business, with neither side truly owning it. However, once an organization understands that IT and the business are both responsible for data, it needs to develop a comprehensive, holistic strategy for data governance that is capable of four things:

  1. Reaching every stakeholder in the process
  2. Providing a platform for understanding and governing trusted data assets
  3. Delivering the greatest benefit from data wherever it lives, while minimizing risk
  4. Helping users understand the impact of changes made to a specific data element across the enterprise

To accomplish this, a modern data governance strategy needs to be interdisciplinary to break down traditional silos. Enterprise architecture is important because it aligns IT and the business, mapping a company’s applications and the associated technologies and data to the business functions and value streams they enable.


The business process and analysis component is vital because it defines how the business operates and ensures employees understand and are accountable for carrying out the processes for which they are responsible. Enterprises can clearly define, map and analyze workflows and build models to drive process improvement, as well as identify business practices susceptible to the greatest security, compliance or other risks and where controls are most needed to mitigate exposures.

Slow Down, Ask Questions

In a rush to implement a data governance methodology and system, organizations can forget that a system must serve a process – and be governed/controlled by one.

To choose the correct system and implement it effectively and efficiently, you must know – in every detail – all the processes it will impact. You need to ask these important questions:

  1. How will it impact them?
  2. Who needs to be involved?
  3. When do they need to be involved?

These questions are the same ones we ask in data governance. They involve impact analysis, ownership and accountability, control and traceability – all of which are enabled by effectively documented and managed business processes.

Data sets are not important in and of themselves. They become important in terms of how they are used, who uses them and for what purpose – and all of this is described in the processes that generate, manipulate and use them. So unless we know what those processes are, how can any data governance implementation be complete or successful?

Processes need to be open and shared in a concise, consistent way so all parts of the organization can investigate, ask questions, and then add their feedback and information layers. In other words, processes need to be alive and central to the organization because only then will the use of data and data governance be truly effective.

A Failure to Communicate

Consider this scenario: We’ve perfectly captured our data lineage, so we know what our data sets mean, how they’re connected, and who’s responsible for them – not a simple task but a massive win for any organization. Now a breach occurs. Will any of the above information tell us why it happened? Or where? No! It will tell us what else is affected and who can manage the data layer(s), but unless we find and address the process failure that led to the breach, it is guaranteed to happen again.

By knowing where data is used – the processes that use and manage it – we can quickly, even instantly, identify where a failure occurs. Starting with data lineage (meaning our forensic analysis starts from our data governance system), we can identify the source and destination processes and the associated impacts throughout the organization.

We can know which processes need to change and how. We can anticipate the pending disruptions to our operations and, more to the point, the costs involved in mitigating and/or addressing them.
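This lineage-driven forensic analysis amounts to a walk over a flow graph: starting from the failure point, follow the edges to every downstream process and asset. A minimal sketch, with hypothetical process and asset names:

```python
from collections import deque

# Hypothetical lineage graph: each edge points from a data asset or process
# to everything that consumes its output.
lineage = {
    "crm_db": ["billing_process", "marketing_export"],
    "billing_process": ["invoice_store"],
    "marketing_export": ["campaign_tool"],
    "invoice_store": [],
    "campaign_tool": [],
}

def downstream_impact(graph: dict[str, list[str]], failed: str) -> set[str]:
    """Breadth-first walk from the failure point to every affected node."""
    affected, queue = set(), deque([failed])
    while queue:
        node = queue.popleft()
        for consumer in graph.get(node, []):
            if consumer not in affected:
                affected.add(consumer)
                queue.append(consumer)
    return affected

print(sorted(downstream_impact(lineage, "crm_db")))
# ['billing_process', 'campaign_tool', 'invoice_store', 'marketing_export']
```

Running the same walk over the reversed graph answers the complementary question: which upstream processes could have caused the failure.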

But knowing all the above requires that our processes – our essential and operational business architecture – be accurately captured and modeled. Instituting data governance without processes is like building a castle on sand.

Rethinking Business Process Modeling and Analysis

Modern organizations need a business process modeling and analysis tool with easy access to all the operational layers across the organization – from high-level business architecture all the way down to data.

Such a system should be flexible, adjustable, easy-to-use and capable of supporting multiple layers simultaneously, allowing users to start in their comfort zones and mature as they work toward their organization’s goals.

The erwin EDGE is one of the most comprehensive software platforms for managing an organization’s data governance and business process initiatives, as well as its whole data architecture. It allows natural, organic growth throughout the organization, and its assimilation of data governance and business process management under one platform provides a unique, integrated and collaborative data governance experience.

Start your free, cloud-based trial of erwin Business Process and see how some of the world’s largest enterprises have benefited from its centralized repository and integrated, role-based views.

We’d also be happy to show you our data governance software, which includes data cataloging and data literacy capabilities.


Internal Business Process Modeling: The Secret Behind Exponential Organizations

Strong internal business process modeling and management helps data-driven organizations compete and lead

An internal business process is a documented account of how things should be done to maximize efficiency and achieve a particular goal.

In the book “Exponential Organizations,” authors Salim Ismail, Michael S. Malone and Yuri van Geest examine how every company is or will evolve into an information-based entity in which costs fall to nearly zero, abundance replaces scarcity and only “exponential organizations” survive.

It’s not news that exponential organizations like Uber, Airbnb and Netflix have flipped the script on disrupting traditional industries like taxis, hotels and video rentals/TV viewing.

But now, even traditional industries like healthcare and financial services, which were historically slow to innovate, are transforming at breakneck speed.

Let’s face it: in today’s hyper-competitive markets, the traditional approach of relying on legacy strengths or inertia for survival simply won’t work.

The days of enterprises focusing almost exclusively on rigid structures; centralized management and accountability; concentrated knowledge; service mainly to external customers; and reactive, short-term strategy alignment driven mainly by massive-scale projects are over.

The information within your organization’s internal business processes is where the data your company collects, creates, stores and analyzes actually transforms into something that makes your company go, hopefully for the long haul.


The Value of Internal Business Process Modeling

Organizations are built on a series of internal business processes. The complexity of modern data-driven organizations requires processes to work in tandem to create and sustain value.

The degree to which any individual internal business process drives value can vary, but even the most seemingly mundane processes are part of a collective whole that is greater than the sum of its parts.

Therefore, it’s critical for organizations to map their internal business processes to understand how a given action relates to the organization’s overall strategy and goals.

Such knowledge is at the core of exponential organizations. They understand how any given internal business process relates to value creation, making it far easier not only to assess what’s currently working but also to identify areas for improvement and opportunities for competitive differentiation.

Exponential organizations also are better positioned to respond and adapt to disruptive forces, such as 5G. This is because understanding what and how you do things now makes it easier to implement change in an agile manner.


How do you join the ranks of exponential organizations? And where do you begin your journey to becoming an information-based entity?

Attitude Adjustment

More and more organizations are realizing they need to adjust their traditional thinking and subsequent actions, even if just a bit, to gain strategic advantage, reduce costs and retain market dominance. For example:

  1. Structures are becoming more adaptable, allowing for greater flexibility and cost management. How is this possible and why now? Organizations are grasping that effective, well-managed and documented internal business processes should form their operational backbones.
  2. Business units and the departments within them are becoming accountable not only for their own budgets but also for how well they achieve their goals. This is possible because their responsibilities and processes can be clearly defined, documented and then monitored to ensure their work is executed in a repeatable, predictable and measurable way.
  3. Knowledge is now both centralized and distributed thanks to modern knowledge management systems. Central repositories and collaborative portals give everyone within the organization equal access to the data they need to do their jobs more effectively and efficiently.
  4. And thanks to all the above, organizations can expand their focus from external customers to internal ones as well. By clearly identifying individual processes (and their cross-business handover points) and customer touchpoints, organizations can interact with any customer at the right point with the most appropriate resources.

Benefits of Internal Business Process Modeling and Management

One of the main benefits of a truly process-based organizational engine is the ability to better handle outside pressures, such as new regulations. Once processes (and their encompassing business architecture) become central to the organization, a wide array of things becomes simpler, faster and cheaper.

Another benefit is that application design – the holy grail or black hole of budgetary spending and project management, depending on your point of view – is streamlined, with requirements clearly gathered and managed in direct correspondence to the processes they serve, and with the data they manage clearly documented and communicated to developers.

Testing occurs against real-life scenarios by the responsible parties as documented by the process owners – a drastic departure from the more traditional approaches in which the responsibility fell to designated, usually technical application owners.

Finally – and most important – data governance is no longer the isolated domain of data architects but central to the everyday processes that make an organization tick. As processes have stakeholders who use information – data – the roles of technical owners and data stewards become integral to ensuring processes operate efficiently, effectively and – above all – without interruptions. On the other side of this coin, data owners and data stewards no longer operate in their own worlds, distant from the processes their data supports.

Carpe Process

All modern organizations should seize business process as a central component of their operations, with data governance a second and cost management a third driver of the enterprise machine. But as we all know, it takes more than stable connecting rods to make an engine work – it needs cogs and wheels, belts and multiple power sources, all working together.

In the traditional organization, people are the internal mechanics. These days, powerful and flexible workflow engines provide much-needed automation for greater visibility plus more power, stability and quality – all the things a machine needs to operate as required/designed.

Advanced process management systems are becoming essential, not optional. And while not as sexy or attention-grabbing as other technologies, they provide the power to drive an organization toward its goals quickly, cost-effectively and efficiently.

To learn how erwin can empower a modern, process-based organization, please click here.


Constructing a Digital Transformation Strategy: Putting the Data in Digital Transformation

Having a clearly defined digital transformation strategy is an essential best practice for successful digital transformation. But what makes a digital transformation strategy viable?

Part Two of the Digital Transformation Journey …

In our last blog on driving digital transformation, we explored how enterprise architecture (EA) and business process (BP) modeling are pivotal factors in a viable digital transformation strategy.

EA and BP modeling squeeze risk out of the digital transformation process by helping organizations really understand their businesses as they are today. They give organizations the ability to identify existing challenges and opportunities, and they provide a low-cost, low-risk environment to model new options and collaborate with key stakeholders to figure out what needs to change, what shouldn’t change, and which changes are most important.

Once you’ve determined what part(s) of your business you’ll be innovating, the next step in a digital transformation strategy is using data to get there.


Constructing a Digital Transformation Strategy: Data Enablement

Many organizations prioritize data collection as part of their digital transformation strategy. However, few organizations truly understand their data or know how to consistently maximize its value.

If your business is like most, you collect and analyze some data from a subset of sources to make product improvements, enhance customer service, reduce expenses and inform other, mostly tactical decisions.

The real question is: are you reaping all the value you can from all your data? Probably not.

Most organizations don’t use all the data they’re flooded with to reach deeper conclusions or make other strategic decisions. They don’t know exactly what data they have or even where some of it is, and they struggle to integrate known data in various formats and from numerous systems—especially if they don’t have a way to automate those processes.

How does your business become more adept at wringing all the value it can from its data?

The reality is there’s not enough time, money or people for true data management using manual processes. Therefore, an automation framework for data management has to be part of the foundation of a digital transformation strategy.

Your organization won’t be able to take complete advantage of analytics tools to become data-driven unless you establish a foundation for agile and complete data management.

You need automated data mapping and cataloging through the integration lifecycle process, inclusive of data at rest and data in motion.

An automated, metadata-driven framework for cataloging data assets and their flows across the business provides an efficient, agile and dynamic way to generate data lineage from operational source systems (databases, data models, file-based systems, unstructured files and more) across the information management architecture; construct business glossaries; assess what data aligns with specific business rules and policies; and inform how that data is transformed, integrated and federated throughout business processes—complete with full documentation.
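As a small illustration, the core artifact such a framework maintains is a set of source-to-target mappings from which lineage and documentation can be generated automatically. The mapping entries and column names below are hypothetical:

```python
# Each mapping records where a field comes from, where it lands, and how it is
# transformed, so lineage and glossary links can be derived rather than hand-drawn.
mappings = [
    {"source": "crm.customers.email",
     "target": "dw.dim_customer.email",
     "rule": "lowercase"},
    {"source": "crm.customers.created",
     "target": "dw.dim_customer.created_date",
     "rule": "cast to date"},
]

def lineage_for(target: str) -> list[str]:
    """Trace a warehouse column back to its operational source columns."""
    return [m["source"] for m in mappings if m["target"] == target]

print(lineage_for("dw.dim_customer.email"))  # ['crm.customers.email']
```

In practice the mappings themselves are harvested from ETL jobs and data models rather than maintained by hand, which is what makes the approach scale.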

Without this framework and the ability to automate many of its processes, business transformation will be stymied. Companies, especially large ones with thousands of systems, files and processes, will be particularly challenged by taking a manual approach. Outsourcing these data management efforts to professional services firms only delays schedules and increases costs.

With automation, data quality is systemically assured. The data pipeline is seamlessly governed and operationalized to the benefit of all stakeholders.

Constructing a Digital Transformation Strategy: Smarter Data

Ultimately, data is the foundation of the new digital business model. Companies that have the ability to harness, secure and leverage information effectively may be better equipped than others to promote digital transformation and gain a competitive advantage.

While data collection and storage continue at a dramatic clip, organizations typically analyze and use less than 0.5 percent of the information they take in – that’s a huge loss of potential. Companies have to know what data they have and understand what it means in common, standardized terms so they can act on it to the benefit of the organization.

Unfortunately, organizations spend a lot more time searching for data rather than actually putting it to work. In fact, data professionals spend 80 percent of their time looking for and preparing data and only 20 percent of their time on analysis, according to IDC.

The solution is data intelligence. It improves IT and business data literacy and knowledge, supporting enterprise data governance and business enablement.

It helps solve the lack of visibility and control over “data at rest” in databases, data lakes and data warehouses and “data in motion” as it is integrated with and used by key applications.

Organizations need a real-time, accurate picture of the metadata landscape to:

  • Discover data – Identify and interrogate metadata from various data management silos.
  • Harvest data – Automate metadata collection from various data management silos and consolidate it into a single source.
  • Structure and deploy data sources – Connect physical metadata to specific data models, business terms, definitions and reusable design standards.
  • Analyze metadata – Understand how data relates to the business and what attributes it has.
  • Map data flows – Identify where to integrate data and track how it moves and transforms.
  • Govern data – Develop a governance model to manage standards, policies and best practices and associate them with physical assets.
  • Socialize data – Empower stakeholders to see data in one place and in the context of their roles.

The Right Tools

When it comes to digital transformation (like most things), organizations want to do it right. Do it faster. Do it cheaper. And do it without the risk of breaking everything. To accomplish all of this, you need the right tools.

The erwin Data Intelligence (DI) Suite is the heart of the erwin EDGE platform for creating an “enterprise data governance experience.” erwin DI combines data cataloging and data literacy capabilities to provide greater awareness of and access to available data assets, guidance on how to use them, and guardrails to ensure data policies and best practices are followed.

erwin Data Catalog automates enterprise metadata management, data mapping, reference data management, code generation, data lineage and impact analysis. It efficiently integrates and activates data in a single, unified catalog in accordance with business requirements. With it, you can:

  • Schedule ongoing scans of metadata from the widest array of data sources.
  • Keep metadata current with full versioning and change management.
  • Easily map data elements from source to target, including data in motion, and harmonize data integration across platforms.

erwin Data Literacy provides self-service, role-based, contextual data views. It also provides a business glossary for the collaborative definition of enterprise data in business terms, complete with built-in accountability and workflows. With it, you can:

  • Enable data consumers to define and discover data relevant to their roles.
  • Facilitate the understanding and use of data within a business context.
  • Ensure the organization is fluent in the language of data.

With data governance and intelligence, enterprises can discover, understand, govern and socialize mission-critical information. And because many of the associated processes can be automated, you reduce errors and reliance on technical resources while increasing the speed and quality of your data pipeline to accomplish whatever your strategic objectives are, including digital transformation.

Check out our latest whitepaper, Data Intelligence: Empowering the Citizen Analyst with Democratized Data.
