
What’s Business Process Modeling Got to Do with It? – Choosing A BPM Tool

With business process modeling (BPM) being a key component of data governance, choosing a BPM tool is part of a dilemma many businesses either have or will soon face.

Historically, BPM didn’t necessarily have to be tied to an organization’s data governance initiative.

However, data-driven business and the regulations that oversee it are becoming increasingly extensive, so the need to view data governance as a collective effort – in terms of personnel and the tools that make up the strategy – is becoming harder to ignore.

Data governance also relies on business process modeling and analysis to drive improvement, including identifying business practices susceptible to security, compliance or other risks and adding controls to mitigate exposures.

Choosing a BPM Tool: An Overview

As part of a data governance strategy, a BPM tool aids organizations in visualizing their business processes, system interactions and organizational hierarchies to ensure elements are aligned and core operations are optimized.

The right BPM tool also helps organizations increase productivity, reduce errors and mitigate risks to achieve strategic objectives.

With insights from the BPM tool, organizations can clarify roles and responsibilities – which in turn should inform policies about data ownership and make data lineage easier to manage.

Organizations can also use a BPM tool to identify the staff who function as “unofficial data repositories.” This serves two purposes:

1. Organizations can document employee processes to ensure vital information isn’t lost should an employee choose to leave.

2. It is easier to identify areas where expertise may need to be bolstered.

Organizations that adopt a BPM tool also enjoy greater process efficiency. They achieve this by improving existing processes or designing new process flows, eliminating unnecessary or contradictory steps, and documenting the results in a shareable, easy-to-understand format so the whole organization pulls in one direction.

Choosing a BPM Tool

Silo Buster

Understanding the typical use cases for business process modeling is the first step. As with any tech investment, it’s important to understand how the technology will work in the context of your organization/business.

For example, it’s counter-productive to invest in a solution that reduces informational silos only to introduce a new technological silo through a lack of integration.

Ideally, organizations want a BPM tool that works in conjunction with the wider data management platform and data governance initiative – not one that works against them.

That means a solution that supports data imports and integrations from external sources, enables in-tool collaboration to reduce departmental silos, and – most crucially – taps into a central metadata repository to ensure consistency across the entire data management and governance initiative.

The lack of a central metadata repository is an all-too-common thorn in organizations’ sides. Without one, they have to juggle multiple versions of the same assets because changes to the underlying data aren’t automatically propagated across the platform.

It also means organizations waste crucial time manually enforcing and maintaining data quality, when an automation framework could achieve the same goal almost instantaneously, without human error and with greater consistency.

A central metadata repository ensures an organization can acknowledge and get behind a single source of truth. This has a wealth of favorable consequences, including greater cohesion across the organization, better data quality and trust, and faster decision-making with fewer false starts caused by plans based on misleading information.
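To make the single-source-of-truth idea concrete, here is a minimal sketch – hypothetical Python, not a depiction of erwin’s or any other product’s implementation – of why shared metadata beats local copies: every process model resolves its business terms from one repository, so an updated definition is immediately visible everywhere.

```python
# Illustrative sketch only: process models reference shared term definitions
# by key instead of copying them, so an update to the repository is
# immediately visible to every model that uses it.
from dataclasses import dataclass, field


@dataclass
class MetadataRepository:
    """Single source of truth for business terms and data definitions."""
    terms: dict = field(default_factory=dict)

    def define(self, key: str, definition: str) -> None:
        self.terms[key] = definition          # one update, visible everywhere

    def lookup(self, key: str) -> str:
        return self.terms[key]


@dataclass
class ProcessModel:
    """A business process model that references shared metadata by key."""
    name: str
    repository: MetadataRepository
    term_keys: list = field(default_factory=list)

    def describe(self) -> dict:
        # No local copies: every model resolves terms from the same repository.
        return {key: self.repository.lookup(key) for key in self.term_keys}


repo = MetadataRepository()
repo.define("customer", "An organization or person that purchases services")

billing = ProcessModel("Billing", repo, ["customer"])
onboarding = ProcessModel("Onboarding", repo, ["customer"])

# Revise the definition once; both models see the change with no re-syncing.
repo.define("customer", "An active account holder with a signed contract")
assert billing.describe() == onboarding.describe()
```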

Three Key Questions to Ask When Choosing a BPM Tool

Organizations in the market for a BPM tool should also consider the following:

1. Configurability: Does the tool support the ability to model and analyze business processes with links to data, applications and other aspects of your organization? And how easy is this to achieve?

2. Role-based views: Can the tool develop integrated business models for a single source of truth but with different views for different stakeholders based on their needs – making regulatory compliance more manageable? Does it enable cross-functional and enterprise collaboration through discussion threads, surveys and other social features?

3. Business and IT infrastructure interoperability: How well does the tool integrate with other key components of data governance including enterprise architecture, data modeling, data cataloging and data literacy? Can it aid in providing data intelligence to connect all the pieces of the data management and governance lifecycles?

For more information and to find out how such a solution can integrate with your organization and current data management and data governance initiatives, click here.

[Image: BPM Tool – erwin BP powered by Casewise]


Continuous Business Improvement Depends on Data Governance

In my last post, I explained why organizations need to consider data as an asset rather than a cost center. When we deem something to be valuable, we then need to determine how and when we’ll use it as well as secure it. We do this by establishing standards, policies and processes to define how this asset will be utilized and protected.

Let’s look at the example of an office building. Furniture and equipment are inventoried and tracked. Employees are trained on safety and security, with some developing expertise in the use of specialized equipment. Office managers know which conference rooms and desks are available for use and their locations.

Keeping this office building clean, secure, comfortable and well organized adds to the productivity of its occupants.

Without such office governance, this office building could become unsafe, insecure, unproductive and underutilized. Do you see the parallel between this office asset example and your data? Transforming data into an asset also relies on effective data governance.

Starting a Continuous Improvement Journey


Successful data-driven companies embrace continuous improvement as a structured approach to business improvement projects. Steps include problem identification, data collection, root-cause analysis, planning process changes, implementing the changes, and monitoring the results. This cycle is known as the Plan-Do-Check-Act (PDCA) cycle of continuous improvement.
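As a rough illustration only – with hypothetical function names, not a prescribed implementation – the four PDCA steps can be thought of as a simple loop in which the outcome of Check decides whether Act standardizes the change or feeds the lessons back into the next Plan:

```python
# Hypothetical, minimal sketch of one PDCA iteration; each step is a
# placeholder that a real improvement team would back with actual measurements.
def plan(problem: str) -> dict:
    """Plan: collect data and run root-cause analysis to propose a change."""
    return {"problem": problem, "proposed_change": "reduce approval handoffs"}

def do(change: dict) -> dict:
    """Do: implement the proposed change, ideally on a small scale first."""
    return {**change, "implemented": True}

def check(baseline_hours: float, measured_hours: float) -> bool:
    """Check: compare measured results against the pre-change baseline."""
    return measured_hours < baseline_hours

def act(successful: bool) -> str:
    """Act: standardize the change if it worked; otherwise re-enter Plan."""
    return "standardize" if successful else "replan"

change = plan("invoice approval takes too long")
do(change)
print(act(check(baseline_hours=10.0, measured_hours=7.5)))  # -> standardize
```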

Organizations committed to a continuous improvement culture based on the PDCA cycle depend heavily on data at every step. Business problems can be defined in terms of waste, delays and re-work, and they need to be quantified with actual measurements to help analysis teams detect and prioritize the next set of improvement activities.

After improvement activities have been completed, it’s important to monitor the results through feedback. Monitoring provides evidence of success, and it also helps improvement teams learn which processes to focus on next.

Data collected about improvement processes will show symptoms of inefficiencies and waste. The analysis team then carries out root-cause analysis to determine the “levers” that can be adjusted to reduce them. Assumptions and hypotheses are tested and validated to find the real forces at play so the appropriate management and operational levers can be adjusted accordingly.
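For instance, the quantification described above might look something like the following sketch, in which the process steps, targets and cycle-time measurements are invented for illustration; the steps with the largest excess delay become the first candidates for root-cause analysis.

```python
# Hypothetical example: cycle-time measurements (in hours) per process step,
# used to rank where delays accumulate before running root-cause analysis.
from statistics import mean

measurements = {
    "order entry":      [1.0, 1.2, 0.9, 1.1],
    "credit check":     [6.5, 8.0, 7.2, 9.1],
    "fulfillment":      [3.0, 2.8, 3.5, 3.1],
    "invoice approval": [12.0, 15.5, 11.0, 14.2],
}

target_hours = {"order entry": 1.0, "credit check": 4.0,
                "fulfillment": 3.0, "invoice approval": 6.0}

# Excess delay = how far the average cycle time exceeds its target.
delays = {step: mean(times) - target_hours[step]
          for step, times in measurements.items()}

# Steps at the top of this list are the first candidates for root-cause analysis.
for step, delay in sorted(delays.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{step:18s} excess delay: {delay:+.1f} h")
```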

Scaling and Sustaining the Improvement Cycle

Companies that implement a PDCA cycle of continuous improvement realize there will be challenges in scaling and sustaining the program across multiple business areas over time.

Data collection can be tedious, especially if the associated data management activities require significant manual effort. It is common for the data available from operating databases to have many problems related to quality, security, confidence, accessibility and overall understanding. These are all roadblocks that will delay improvement activities.

If data isn’t readily available, accessible, trusted or understandable, the analysis and improvement teams can’t do their jobs effectively. This slows momentum or causes companies to abandon the improvement approach altogether. To sustain the improvement cycle, the data that drives it must be in “asset class” form.

Scaling the PDCA cycle involves multiple teams working in different business areas to broaden the reach of the improvement activities. Processes for finance, human resources, operations, sales, supply chain, customer service and IT may all be under analysis and evaluation.

The path to operational excellence is based on the ability to scale and sustain continuous improvement.

How Data Governance Supports the Improvement Cycle

Consider a utility company that operates a physical network delivering energy to customers. The executive team wants to reduce the time it takes for newly constructed assets to go online and reap the financial benefits of commissioning them for service more quickly.

The business improvement team starts gathering performance data from previous construction projects to determine potential areas of improvement.

They soon realize a new work management system was implemented, and the conversion of historical construction data was deemed “non-critical” to keep the project on schedule and within budget.

The implementation team didn’t view the historical construction data as valuable from an operational perspective, so they archived it rather than convert it to the new system. This decision was made within the context of a “local” project, without considering the larger analytics needs of the company.

Unfortunately, data governance was not understood or in place at this utility. If it had been, the historical construction data would have been cleansed and converted as part of the new work management system’s deployment. The company failed to recognize this data as an asset with downstream analytics applications.

In this example, the decision not to convert historical data was based on managing cost at the project level. A data investment was not considered. But well-governed data is a true asset. Quality, accessibility, timeliness and understandability are fundamental to the productivity and sustainability of continuous improvement processes.

If your company is implementing any form of program to improve results – such as specialized management systems, balanced scorecards, lean management, Six Sigma or total quality management – data governance sits at the core of long-term, sustainable success.

Improvement programs require motivation, energy and commitment at all levels of the organization. To maintain momentum, governed data assets are the key enabler, making it easier and faster to detect and diagnose problems, improve processes and validate results. There’s a direct link between the quality of improvement programs and the data assets that power them.
