Categories
erwin Expert Blog

The Intersection of Innovation, Enterprise Architecture and Project Delivery

The only thing that’s constant for most organizations is change. Today there’s an unprecedented, rapid rate of change across all industry sectors, even those that have been historically slow to innovate like healthcare and financial services.

In the past, managing ideation to the delivery of innovation was either not done or was relegated within organizational silos, creating a disconnect across the business. This, in turn, resulted in change not being implemented properly or a focus on the wrong type of change.

For an organization to successfully embrace change, innovation, enterprise architecture and project delivery need to be intertwined and traceable.

Enterprise Architecture Helps Bring Ideas to Life

Peter Drucker famously declared “innovate or die.” But where do you start?

Many companies start with campaigns and ideation. They run challenges and solicit ideas both from inside and outside their walls. Ideas are then prioritized and evaluated. Sometimes prototypes are built and tested, but what happens next?

Organizations often turn to the blueprints or roadmaps generated by their enterprise architectures, IT architectures and/or business process architectures for answers. They evaluate how a new idea and its supporting technology, such as service-oriented architecture (SOA) or enterprise resource planning (ERP), fits into the broader architecture. They manage their technology portfolio by looking at their IT infrastructure needs.

A lot of organizations form program management boards to evaluate ideas, initiatives and their costs. In reality, these evaluations are based on lightweight business cases without broader context. They don’t have a comprehensive understanding of what systems, processes and resources they have, what they are being used for, how much they cost, and the effects of regulations.

Projects are delivered and viewed on an individual basis without regard for the bigger picture. Enterprise-, technology- and process-related decisions are made within the flux of change and without access to the real knowledge contained within the organization or in the marketplace. All too often, IT is ultimately in the hot seat of this type of decision-making.

The Five EA Questions IT Needs to Ask

While IT planning should be part of a broader enterprise architecture or market analysis, IT typically becomes involved in technology investments only near the end of the strategic planning process and without proper access to enterprise or market data.

The following five questions illustrate the competing demands found within the typical IT environment:

  1. How can we manage the prioritization of business-, architectural- and project-driven initiatives?

Stakeholders place a large number of tactical and strategic requirements on IT. IT is required to offer different technology investment options but is often constrained by a competition for resources.

  2. How do we balance enterprise architecture’s role with IT portfolio management?

Enterprise architecture provides a high-level view of a project’s risks and benefits and its alignment to future goals. It can illustrate the project complexities and the impact of change. Future-state architectures and transition plans can be used to define investment portfolio content. At the same time, portfolio management provides a detailed perspective of development and implementation. Balancing these often-competing viewpoints can be tricky.

  3. How well are application lifecycles being managed?

Application management requires a product/service/asset view over time. Well-managed application lifecycles demand a process of continuous releases, especially when time to market is key. The higher-level view required by portfolio management provides a broader perspective of how all assets work together. Balancing application lifecycle demands against a broader portfolio framework can present an inherent conflict about priorities and a struggle for resources.

  4. How do we manage the numerous and often conflicting governance requirements across the delivery process?

As many organizations move to small-team agile development, coordinating the various application development projects becomes more difficult. Managing the development process using agile methods can shorten schedules but also can increase the chance of errors and a disconnect with broader portfolio and enterprise goals.

  5. How do we address different lifecycles and tribes in the organization?

Lifecycles such as innovation management, enterprise architecture, business process management and solution delivery are all necessary but are not harmonized across the enterprise. The connection among these lifecycles is important to the effective delivery of initiatives and understanding the impact of change.

The Business Value of Enterprise Architecture

Enterprise architects are crucial to delivering innovation. However, all too often, enterprise architecture has been executed by IT groups for IT groups and has involved the idea that everything in the current state has to be drawn and modeled before you can start to derive value. This approach has wasted effort, taken too long to show results, and provided insufficient added value to the organization.

Enterprise and data architects who relate what they are doing back to what the C-suite really wants find it easier to get budget and stay relevant. It’s important to remember that enterprise architecture is about smarter decision-making, enabling management to make decisions more quickly because they have access to the right information in the right format at the right time. Of course, focusing on the future state (desired business outcome) first helps to reduce the scope of current-state analysis and speed up the delivery of value.

Data Management and Data Governance: Solving the Enterprise Data Dilemma


Managing Emerging Technology Disruption with Enterprise Architecture

Emerging technology has always played an important role in business transformation. In the race to collect and analyze data, provide superior customer experiences, and manage resources, new technologies always interest IT and business leaders.

KPMG’s The Changing Landscape of Disruptive Technologies found that today’s businesses are showing the most interest in emerging technology like the Internet of Things (IoT), artificial intelligence (AI) and robotics. Other emerging technologies that are making headlines include natural language processing (NLP) and blockchain.

In many cases, emerging technologies such as these are not fully embedded into business environments. Before they enter production, organizations need to test and pilot their projects to help answer some important questions:

  • How do these technologies disrupt?
  • How do they provide value?

Enterprise Architecture’s Role in Managing Emerging Technology

Pilot projects that take a small number of incremental steps, with small funding increases along the way, help provide answers to these questions. If the pilot proves successful, it’s then up to the enterprise architecture team to explore what it takes to integrate these technologies into the IT environment.

This is the point where new technologies go from “emerging technologies” to becoming another solution in the stack the organization relies on to create the business outcomes it’s seeking.

One of the easiest, quickest ways to try to pilot and put new technologies into production is to use cloud-based services. All of the major public cloud platform providers have AI and machine learning capabilities.

Integrating new technologies based in the cloud will change the way the enterprise architecture team models the IT environment, but that’s actually a good thing.

Modeling can help organizations understand the complex integrations that bring cloud services into the organization, and help them better understand the service level agreements (SLAs), security requirements and contracts with cloud partners.

When done right, enterprise architecture modeling also will help the organization better understand the value of emerging technology and even cloud migrations that increasingly accompany them. Once again, modeling helps answer important questions, such as:

  • Does the model demonstrate the benefits that the business expects from the cloud?
  • Do the benefits remain even if some legacy apps and infrastructure need to remain on premise?
  • What type of savings do you see if you can’t consolidate enough to close an entire data center?
  • How does the risk change?

Many of the emerging technologies garnering attention today are on their way to becoming a standard part of the technology stack. But just as the web came before mobility, and mobility came before AI, other technologies will soon follow in their footsteps.

To most efficiently evaluate these technologies and decide if they are right for the business, organizations need to provide visibility to both their enterprise architecture and business process teams so everyone understands how their environment and outcomes will change.

When the enterprise architecture and business process teams use a common platform and model the same data, their results will be more accurate and their collaboration seamless. This will cut significant time off the process of piloting, deploying and seeing results.

Outcomes like more profitable products and better customer experiences are the ultimate business goals. Getting there first is important, but only if everything runs smoothly on the customer side. The disruption of new technologies should take place behind the scenes, after all.

And that’s where investing in pilot programs and enterprise architecture modeling demonstrate value as you put emerging technology to work.

Emerging technology - Data-driven business transformation


Enterprise Architect: A Role That Keeps Evolving

Enterprise architect is a common job title within IT organizations at large companies, but the term lacks any standard definition. Ask someone on the business side what their organization’s enterprise architects do, and you’ll likely get a response like, “They work with IT,” which is true, but also pretty vague.

What the enterprise architects at your organization do depends in large part on how the IT department is organized. At some organizations, enterprise architects work closely with the software applications in a role that some might refer to as a solution architect.

In other organizations, the role of enterprise architect might carry more traditional IT responsibilities around systems management. Other enterprise architects, especially at large organizations, might specialize in exploring how emerging technologies can be tested and later integrated into the business.

Technology research and advisory firm Gartner predicts that enterprise architects will increasingly move into an internal consultancy function within large organizations. While this use of the role is not currently widespread, it’s easy to see how it could make sense for some businesses.

If, for example, a business sets a goal to increase its website sales by 20 percent in one year’s time, meeting that goal will require that different IT and business functions work together.

The business side might tackle changes to the marketing plan and collect data about website visitors and shoppers, but ultimately they will need to collaborate with someone on the technology side to discuss how IT can help reach that goal. And that’s where an enterprise architect in the role of an internal consultant comes into play.

Each business is going to organize its enterprise architects in a way that best serves the organization and helps achieve its goals.

That’s one of the reasons the enterprise architect role has no standard definition. Most teams consist of members with broad IT experience, but each member will often have some role-specific knowledge. One team member might specialize in security, for example, and another in applications.

Like the tech industry in general, the only constant in enterprise architecture is change. Roles and titles will continue to evolve, and as the business and IT sides of the organization continue to come together in the face of digital transformation, how these teams are organized, where they report, and the types of projects they focus on are sure to change over time.

Enterprise integration architect is one role in enterprise architecture that’s on the rise. These architects specialize in integrating the various cloud and on-premise systems that are now common in the hybrid/multi-cloud infrastructures powering the modern enterprise.

For the Enterprise Architect, Business Experience Becomes a Valuable Commodity

Regardless of the specific title, enterprise architects need the ability to work with both their business and IT colleagues to help improve business outcomes. As enterprise architecture roles move closer to the business, those with business knowledge are becoming valuable assets. This is especially true for industry-specific business knowledge.

As industry and government compliance regulations, for example, become part of the business fabric in industries like financial services, healthcare and pharmaceuticals, many enterprise architects are developing specializations in these industries that demonstrate their understanding of the business and IT sides of these regulations.

This is important because compliance permeates every area of many of these organizations, from the enterprise architecture to the business processes, and today it’s all enabled by software. Compliance is another area where Gartner’s internal consultancy model for enterprise architects could benefit a number of organizations. The stakes are simply too high to do anything but guarantee all of your processes are compliant.

Enterprise architect is just one role in the modern organization that increasingly stands with one foot on the business side and the other in IT. As your organization navigates its digital transformation, it’s important to use tools that can do the same.

erwin, Inc.’s industry-leading tools for enterprise architecture and business process modeling use a common repository and role-based views, so business users, IT users and those who straddle the line have the visibility they need. When everyone uses the same tools and the same data, they can speak the same language, collaborate more effectively, and produce better business outcomes. That’s something the whole team can support, regardless of job title.

Business Process Modeling Use Cases


Enterprise Architecture and Business Process: Common Goals Require Common Tools

For decades now, the professional world has put a great deal of energy into discussing the gulf that exists between business and IT teams within organizations.

They speak different languages, it’s been said, and work toward different goals. Technology plans don’t seem to account for the reality of the business, and business plans don’t account for the capabilities of the technology.

Data governance is one area where business and IT never seemed to establish ownership. Early attempts at data governance treated the idea as a game of volleyball, passing ownership back and forth, with one team responsible for storing data and running applications, and one responsible for using the data for business outcomes.

Today, we see ample evidence this gap is closing at many organizations. Consider:

  • Many technology platforms and software applications are now designed for business users. Business intelligence is a prime example; thanks to self-service, it’s rare today for IT pros to have to run reports for business users.
  • Many workers, especially those that came of age surrounded by technology, have a better understanding of both the business and technology that runs their organizations. Education programs also have evolved to help students develop a background in both business and technology.
  • There’s more portability in roles, with technology minds moving to business leadership positions and vice versa.

“The business domain has always existed in enterprise architecture,” says Manuel Ponchaux, director of product management at erwin, Inc. “However, enterprise architecture has traditionally been an IT function with a prime focus on IT. We are now seeing a shift with a greater focus on business outcomes.”

You can see evidence of this blended focus in some of the titles, like “business architect,” being bestowed upon what was traditionally an IT function. These titles demonstrate an understanding that technology cannot exist in the modern organization for the sake of technology alone – technology needs to support the business and its customers. This concept is also a major focus of the digital transformation wave that’s washing over the business world, and thus we see it reflected in job titles that simply didn’t exist a decade ago.

Job titles aside, enterprise architecture (EA) and business process (BP) teams still have different goals, though at many organizations they now work more closely together than they did in the past. Today, both EA and BP teams recognize that their common goal is better business outcomes. Along the way to that goal, each team conducts a number of similar tasks.

Enterprise Architecture and Business Process: Better Together

One prominent example is modeling. Both enterprise architecture and business process teams do modeling, but they do it in different ways at different levels, and they often use different data and tools. This lack of coordination and communication makes it difficult to develop a true sense of a process from the IT and business sides of the equation. It can also lead to duplication of efforts, which is inefficient and likely to add further confusion when trying to understand outcomes.

Building better business outcomes is like following a plan at a construction site. If different teams are making their own decisions about the materials they’re going to use and following their own blueprints, you’re unlikely to see the building you expect to see at the end of the job.

And that’s essentially what is missing at many organizations: a common repository with role-based views, interfaces and dashboards so that enterprise architecture and business process teams can truly work together using the same blueprint. When enterprise architecture and business process can use common tools that both aid collaboration and help them understand the elements most important to their roles, the result is greater accuracy, increased efficiency and improved outcomes.

erwin’s enterprise architecture and business process tools provide the common repository and role-based views that help these teams work collaboratively toward their common goals. Finally, enterprise architecture and business process can be on the same page.



A Guide to CCPA Compliance and How the California Consumer Privacy Act Compares to GDPR

California Consumer Privacy Act (CCPA) compliance shares many of the same requirements as the European Union’s General Data Protection Regulation (GDPR).

While the CCPA has been signed into law, organizations have until Jan. 1, 2020, to enact its mandates. Luckily, many organizations have already laid the regulatory groundwork for it because of their efforts to comply with GDPR.

However, there are some key differences that we’ll explore in the Q&A below.

Data governance, thankfully, provides a framework for compliance with either or both – in addition to other regulatory mandates your organization may be subject to.

CCPA Compliance Requirements vs. GDPR FAQ

Does CCPA apply to not-for-profit organizations? 

No, CCPA compliance only applies to for-profit organizations. GDPR compliance is required for any organization, public or private (including not-for-profit).

What for-profit businesses does CCPA apply to?

The mandate for CCPA compliance only applies if a for-profit organization:

  • Has an annual gross revenue exceeding $25 million
  • Collects, sells or shares the personal data of 50,000 or more consumers, households or devices
  • Earns 50% or more of its annual revenue by selling consumers’ personal information
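The applicability test above is effectively a disjunction of three thresholds. As a rough illustration only (the function name and parameters are hypothetical, and the thresholds are as summarized in this post, not legal advice), it could be sketched as:

```python
# Hypothetical sketch of the CCPA applicability thresholds described
# above. Values reflect this post's summary; consult the statute itself
# before relying on any of this.

def ccpa_applies(for_profit: bool,
                 annual_gross_revenue: float,
                 consumer_records_handled: int,
                 pct_revenue_from_selling_pi: float) -> bool:
    """Return True if any CCPA applicability threshold is met."""
    if not for_profit:
        return False  # CCPA covers only for-profit organizations
    return (
        annual_gross_revenue > 25_000_000          # revenue threshold
        or consumer_records_handled >= 50_000      # data-volume threshold
        or pct_revenue_from_selling_pi >= 50.0     # revenue-mix threshold
    )

# A small retailer below every threshold:
print(ccpa_applies(True, 5_000_000, 12_000, 10.0))   # False
# A data broker earning most revenue from selling personal info:
print(ccpa_applies(True, 2_000_000, 30_000, 60.0))   # True
```

Note that meeting any single condition is enough to bring a for-profit business into scope.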

Does the CCPA apply outside of California?

As the name suggests, the legislation is designed to protect the personal data of consumers who reside in the state of California.

But like GDPR, CCPA compliance has impacts outside the area of origin. This means businesses located outside of California, but selling to (or collecting the data of) California residents must also comply.

Does the CCPA exclude anything that GDPR doesn’t? 

GDPR encompasses all categories of “personal data,” with no distinctions.

CCPA does make distinctions, particularly when other regulations may overlap. These include:

  • Medical information covered by the Confidentiality of Medical Information Act (CMIA) and the Health Insurance Portability and Accountability Act (HIPAA)
  • Personal information covered by the Gramm-Leach-Bliley Act (GLBA)
  • Personal information covered by the Driver’s Privacy Protection Act (DPPA)
  • Clinical trial data
  • Information sold to or by consumer reporting agencies
  • Publicly available personal information (federal, state and local government records)

What about access requests? 

Under the GDPR, organizations must make any personal data collected from an EU citizen available upon request.

CCPA compliance only requires data collected within the last 12 months to be shared upon request.

Does the CCPA include the right to opt out?

CCPA, like GDPR, gives consumers the right to opt out of the processing of their personal data.

However, CCPA compliance only requires an organization to observe an opt-out request when it comes to the sale of personal data. GDPR does not make any distinctions between “selling” personal data and any other kind of data processing.

To meet CCPA compliance opt-out standards, organizations must provide a “Do Not Sell My Personal Information” link on their home pages.

Does the CCPA require individuals to willingly opt in?

No. Whereas the GDPR requires informed consent before an organization sells an individual’s information, organizations under the scope of the CCPA can still assume consent. The only exception involves the personal information of children under 16. Minors between 13 and 16 can consent themselves, but if the consumer is a child under 13, a parent or guardian must authorize the sale of the child’s personal data.
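The age-based consent rules above split into three bands. A minimal sketch, with an entirely hypothetical function and return labels, might look like this:

```python
# Illustrative sketch of the CCPA consent bands described above;
# not a real API, and not legal advice.

def consent_required_from(age: int) -> str:
    """Who must opt in before a consumer's personal data may be sold."""
    if age < 13:
        return "parent_or_guardian"  # a parent must authorize the sale
    if age < 16:
        return "minor_opt_in"        # the minor can consent themselves
    return "assumed"                 # default opt-out model applies

print(consent_required_from(10))  # parent_or_guardian
print(consent_required_from(14))  # minor_opt_in
print(consent_required_from(30))  # assumed
```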

What about fines for CCPA non-compliance? 

In theory, fines for CCPA non-compliance can reach further than those of GDPR because there is no ceiling on total CCPA penalties. Under GDPR, penalties are capped at 4% of global annual revenue or €20 million, whichever is greater. GDPR recently resulted in a record fine for Google.

CCPA fines, by contrast, are capped at $7,500 per violation, but with no cap on the number of violations, there is no upper ceiling on the total.
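The two penalty structures can be contrasted in a few lines. This is a simplified sketch of the ceilings as summarized above (function names are hypothetical; real penalties depend on enforcement discretion):

```python
# Simplified comparison of penalty ceilings as described above.
# GDPR: the greater of 4% of global annual revenue or EUR 20 million.
# CCPA: up to $7,500 per violation, with no overall cap.

def gdpr_max_fine(global_annual_revenue_eur: float) -> float:
    """GDPR ceiling: max(4% of revenue, EUR 20M)."""
    return max(0.04 * global_annual_revenue_eur, 20_000_000)

def ccpa_max_fine(violations: int) -> float:
    """CCPA: grows linearly with violation count, no ceiling."""
    return 7_500 * violations

print(gdpr_max_fine(100_000_000))    # 20000000 (the EUR 20M floor applies)
print(gdpr_max_fine(5_000_000_000))  # 200000000.0 (4% of revenue)
print(ccpa_max_fine(1_000_000))      # 7500000000
```

The key structural difference is that the GDPR ceiling is a function of revenue, while the CCPA total is a function of how many violations are counted.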

CCPA compliance is a data governance issue

Data Governance for Regulatory Compliance

While CCPA has a narrower geography and focus than GDPR, compliance is still a serious effort for organizations under its scope. And as data-driven business continues to expand, so too will the pressure on lawmakers to regulate how organizations process data. Consider the Facebook hearings, and now the inquiries into Google and Twitter, for example.

Regulatory compliance remains a key driver for data governance. After all, to understand how to meet data regulations, an organization must first understand its data.

An effective data governance initiative should enable just that, by giving an organization the tools to:

  • Discover data: Identify and interrogate metadata from various data management silos
  • Harvest data: Automate the collection of metadata from various data management silos and consolidate it into a single source
  • Structure data: Connect physical metadata to specific business terms and definitions and reusable design standards
  • Analyze data: Understand how data relates to the business and what attributes it has
  • Map data flows: Identify where to integrate data and track how it moves and transforms
  • Govern data: Develop a governance model to manage standards and policies and set best practices
  • Socialize data: Enable all stakeholders to see data in one place in their own context

A Regulatory EDGE

The erwin EDGE software platform creates an “enterprise data governance experience” to transform how all stakeholders discover, understand, govern and socialize data assets. It includes enterprise modeling, data cataloging and data literacy capabilities, giving organizations visibility and control over their disparate architectures and all the supporting data.

Both IT and business stakeholders have role-based, self-service access to the information they need to collaborate in making strategic decisions. And because many of the associated processes can be automated, you reduce errors and increase the speed and quality of your data pipeline. This data intelligence unlocks knowledge and value.

The erwin EDGE provides the most agile, efficient and cost-effective means of launching and sustaining a strategic and comprehensive data governance initiative, whether you wish to deploy on premise or in the cloud. But you don’t have to implement every component of the erwin EDGE all at once to see strategic value.

Because of the platform’s federated design, you can address your organization’s most urgent needs, such as regulatory compliance, first. Then you can proactively address other organization objectives, such as operational efficiency, revenue growth, increasing customer satisfaction and improving overall decision-making.

You can learn more about leveraging data governance to navigate the changing tide of data regulations here.

Are you compliant with data regulations?


Keeping Up with New Data Protection Regulations

Keeping up with new data protection regulations can be difficult, and the latest – the General Data Protection Regulation (GDPR) – isn’t the only new data protection regulation organizations should be aware of.

California recently passed a law that gives residents the right to control the data companies collect about them. Some suggest the California Consumer Privacy Act (CCPA), which takes effect January 1, 2020, sets a precedent other states will follow by empowering consumers to set limits on how companies can use their personal information.

In fact, organizations should expect increasing pressure on lawmakers to introduce new data protection regulations. A number of high-profile data breaches and scandals have increased public awareness of the issue.

Facebook was in the news again last week for another major problem around the transparency of its user data, and the tech giant also is reportedly facing 10 GDPR investigations in Ireland – along with Apple, LinkedIn and Twitter.

Some industries, such as healthcare and financial services, have been subject to stringent data regulations for years: GDPR now joins the Health Insurance Portability and Accountability Act (HIPAA), the Payment Card Industry Data Security Standard (PCI DSS) and standards from the Basel Committee on Banking Supervision (BCBS).

Due to these pre-existing regulations, organizations operating within these sectors, as well as insurance, had some of the GDPR compliance bases covered in advance.

Other industries had their own levels of preparedness, based on the nature of their operations. For example, many retailers have robust, data-driven e-commerce operations that are international. Such businesses are bound to comply with varying local standards, especially when dealing with personally identifiable information (PII).

Smaller, more brick-and-mortar-focused retailers may have had to start from scratch.

But starting position aside, every data-driven organization should strive for a better standard of data management — and not just for compliance’s sake. After all, organizations are now realizing that data is one of their most valuable assets.

New Data Protection Regulations – Always Be Prepared

When it comes to new data protection regulations in the face of constant data-driven change, it’s a matter of when, not if.

As they say, the best defense is a good offense. Fortunately, whenever the time comes, the first port of call will always be data governance, so organizations can prepare.

Effective compliance with new data protection regulations requires a robust understanding of the “what, where and who” in terms of data and the stakeholders with access to it (i.e., employees).

The Regulatory Rationale for Integrating Data Management & Data Governance

This is also true for existing data regulations. Compliance is an on-going requirement, so efforts to become compliant should not be treated as static events.

Less than four months before GDPR came into effect, only 6 percent of enterprises claimed they were prepared for it. Many of these organizations will recall a number of stressful weeks – or even months – tidying up their databases and their data management processes and policies.

This time and money was spent reactively, at the expense of proactive efforts to grow the business.

The implementation and subsequent observation of a strong data governance initiative ensures organizations won’t be put on the spot going forward. Should an audit arise, current projects won’t suddenly be derailed in a reenactment of pre-GDPR panic.

New Data Regulations

Data Governance: The Foundation for Compliance

The first step to compliance with new – or old – data protection regulations is data governance.

A robust and effective data governance initiative ensures an organization understands where security should be focused.

By adopting a data governance platform that enables you to automatically tag sensitive data and track its lineage, you can ensure nothing falls through the cracks.

Your chosen data governance solution should enable you to automate the scanning, detection and tagging of sensitive data by:

  • Monitoring and controlling sensitive data – Gain better visibility and control across the enterprise to identify data security threats and reduce associated risks.
  • Enriching business data elements for sensitive data discovery – By leveraging a comprehensive mechanism to define business data elements for PII, PHI and PCI across database systems, cloud and Big Data stores, you can easily identify sensitive data based on a set of algorithms and data patterns.
  • Providing metadata and value-based analysis – Simplify the discovery and classification of sensitive data based on metadata and data value patterns and algorithms. Organizations can define business data elements and rules to identify and locate sensitive data, including PII, PHI and PCI.
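The metadata- and value-based analysis described above boils down to matching data values against known patterns for sensitive elements. The following is an illustrative sketch only (not erwin's implementation; the patterns and names are hypothetical and far simpler than production classifiers):

```python
# Illustrative pattern-based tagging of sensitive data, in the spirit
# of the value-based analysis described above. Not a real product API.
import re

# Hypothetical detection patterns for a few common PII elements
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def tag_sensitive(values):
    """Return the set of PII tags whose pattern matches any value."""
    tags = set()
    for value in values:
        for tag, pattern in PII_PATTERNS.items():
            if pattern.search(str(value)):
                tags.add(tag)
    return tags

# Scanning a sample column of values:
print(tag_sensitive(["alice@example.com", "123-45-6789"]))
```

In practice such scanning would run against harvested metadata and sampled values across databases, cloud and Big Data stores, with the resulting tags feeding lineage and access controls.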

With these precautionary steps, organizations are primed to respond if a data breach occurs. Having a well governed data ecosystem with data lineage capabilities means issues can be quickly identified.

Additionally, if any follow-up is necessary – such as with GDPR’s data breach reporting time requirements – it can be handled swiftly and in accordance with regulations.

It’s also important to understand that the benefits of data governance don’t stop with regulatory compliance.

A better understanding of what data you have, where it’s stored and the history of its use and access isn’t only beneficial in fending off non-compliance repercussions. In fact, such an understanding is arguably better put to use proactively.

Data governance improves data quality standards, it enables better decision-making and ensures businesses can have more confidence in the data informing those decisions.

The same mechanisms that protect data by controlling its access also can be leveraged to make data more easily discoverable to approved parties – improving operational efficiency.
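The dual role of access control described above can be sketched in a few lines. The `CATALOG` contents, role names and helper functions here are hypothetical, but they show how a single permission check can both protect assets and scope search results for approved parties:

```python
# Hypothetical catalog where the same ACL that restricts access also
# scopes search results, so approved users discover data faster.
CATALOG = {
    "customer_pii": {"tags": {"PII"}, "allowed_roles": {"steward", "dpo"}},
    "sales_summary": {"tags": {"reporting"}, "allowed_roles": {"analyst", "steward"}},
}

def can_access(role, asset):
    """The protection rule: only listed roles may touch the asset."""
    return role in CATALOG[asset]["allowed_roles"]

def discover(role, tag):
    """Discovery is filtered by the same rule used for protection."""
    return [name for name, meta in CATALOG.items()
            if tag in meta["tags"] and can_access(role, name)]
```

An analyst searching for `"reporting"` assets sees `sales_summary` but never `customer_pii` – one mechanism, two benefits.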

All in all, the cumulative result of data governance’s influence on data-driven businesses both drives revenue (through greater efficiency) and reduces costs (fewer errors, false starts, etc.).

To learn more about data governance and the regulatory rationale for its implementation, get our free guide here.



Data Governance Frameworks: The Key to Successful Data Governance Implementation

A strong data governance framework is central to successful data governance implementation in any data-driven organization because it ensures that data is properly maintained, protected and maximized.

But despite this fact, enterprises often face push back when implementing a new data governance initiative or trying to mature an existing one.

Let’s assume you have some form of informal data governance operation with some strengths to build on and some weaknesses to correct. Some parts of the organization are engaged and behind the initiative, while others are skeptical about its relevance or benefits.

Some other common data governance implementation obstacles include:

  • Questions about where to begin and how to prioritize which data streams to govern first
  • Issues regarding data quality and ownership
  • Concerns about data lineage
  • Competing projects and limited resources (time, people and funding)

By using a data governance framework, organizations can formalize their data governance implementation and ensure ongoing adherence to it. This addresses common concerns, including data quality and data lineage, and provides a clear path to successful data governance implementation.

In this blog, we will cover three key steps to successful data governance implementation. We will also look into how we can expand the scope and depth of a data governance framework to ensure data governance standards remain high.

Data Governance Implementation in 3 Steps

When maturing or implementing data governance and/or a data governance framework, an accurate assessment of the ‘here and now’ is key. Then you can rethink the path forward, identifying any current policies or business processes that should be incorporated, being careful to avoid making the same mistakes of prior iterations.

With this in mind, here are three steps we recommend for implementing data governance and a data governance framework.


Step 1: Shift the culture toward data governance

Data governance isn’t something to set and forget; it’s a strategic approach that needs to evolve over time in response to new opportunities and challenges. Therefore, a successful data governance framework has to become part of the organization’s culture but such a shift requires listening – and remembering that it’s about people, empowerment and accountability.

In most cases, a new data governance framework requires people – those in IT and across the business, including risk management and information security – to change how they work. Any concerns they raise or recommendations they make should be considered. You can encourage feedback through surveys, workshops and open dialog.

Once input has been discussed and a plan agreed upon, it is critical to update roles and responsibilities, provide training and ensure ongoing communication. Many organizations now have internal certifications for different data governance roles, and people wear these badges with pride.

A top-down management approach will get a data governance initiative off the ground, but only bottom-up cultural adoption will sustain it.

Step 2: Refine the data governance framework

The right capabilities and tools are important for fueling an accurate, real-time data pipeline and governing it for maximum security, quality and value. For example:

Data cataloging: Organizations implementing a data governance framework will benefit from automated metadata harvesting, data mapping, code generation and data lineage with reference data management, lifecycle management and data quality. With these capabilities, you can efficiently integrate and activate enterprise data within a single, unified catalog in accordance with business requirements.

Data literacy: Being able to discover what data is available and understand what it means in common, standardized terms is important because data elements may mean different things to different parts of the organization. A business glossary answers this need, as does the ability for stakeholders to view data relevant to their roles and understand it within a business context through a role-based portal.

Such tools are further enhanced if they can be integrated across data and business architectures and when they promote self-service and collaboration, which also are important to the cultural shift.
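A business glossary with role-based views, as described above, can be sketched simply. The glossary entries, role names and `glossary_view` helper below are hypothetical examples, not a real product API; the point is that one shared vocabulary can be filtered so each stakeholder sees only the terms relevant to their role:

```python
# Hypothetical glossary entries; a role-based view filters the shared
# vocabulary down to what each stakeholder needs.
GLOSSARY = [
    {"term": "Customer",
     "definition": "A party with at least one active contract.",
     "roles": {"sales", "finance", "steward"}},
    {"term": "Churn Rate",
     "definition": "Share of customers lost per quarter.",
     "roles": {"finance", "steward"}},
]

def glossary_view(role):
    """Return only the terms relevant to the given role."""
    return {e["term"]: e["definition"] for e in GLOSSARY if role in e["roles"]}
```

A salesperson’s view contains “Customer” but not “Churn Rate,” while a steward sees everything – the same glossary, different lenses.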

 


Step 3: Prioritize then scale the data governance framework

Because data governance is ongoing, it’s important to prioritize the initial areas of focus and scale from there. Organizations that start with 30 to 50 data items are generally more successful than those that attempt more than 1,000 in the early stages.

Find some representative (familiar) data items and create examples for data ownership, quality, lineage and definition so stakeholders can see real examples of the data governance framework in action. For example:

  • Data ownership model showing a data item, its definition, producers, consumers, stewards and quality rules (for profiling)
  • Workflow showing the creation, enrichment and approval of the above data item to demonstrate collaboration
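The two examples above can be sketched together in code. The field names and workflow states below are hypothetical illustrations of an ownership model, not a prescribed schema; the `advance` method mirrors the creation-enrichment-approval workflow used to demonstrate collaboration:

```python
from dataclasses import dataclass, field

# Hypothetical ownership record mirroring the bullet list above.
@dataclass
class DataItem:
    name: str
    definition: str
    producers: list       # systems or teams that create the data
    consumers: list       # systems or teams that use the data
    steward: str          # accountable owner
    quality_rules: list = field(default_factory=list)  # rules for profiling
    status: str = "draft"  # workflow: draft -> enriched -> approved

    def advance(self):
        """Move the item one step through the approval workflow."""
        order = ["draft", "enriched", "approved"]
        i = order.index(self.status)
        if i < len(order) - 1:
            self.status = order[i + 1]
```

Walking a familiar item such as a customer identifier through this structure gives stakeholders a concrete, end-to-end view of the framework in action.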

Whether your organization is just adopting data governance or the goal is to refine an existing data governance framework, the erwin DG RediChek will provide helpful insights to guide you in the journey.


Business Process Modeling Use Cases and Definition

What is business process modeling (BPM)? A visual representation of what your business does and how it does it. Why is having this picture important?

According to Gartner, BPM links business strategy to IT systems development to ensure business value. It also combines process/workflow, functional, organizational and data/resource views with underlying metrics such as costs, cycle times and responsibilities to provide a foundation for analyzing value chains, activity-based costs, bottlenecks, critical paths and inefficiencies.

Every organization—particularly those operating in industries where quality, regulatory, health, safety or environmental issues are a concern—must have a complete understanding of its processes. Equally important, employees must fully comprehend and be accountable for appropriately carrying out the processes for which they are responsible.

BPM allows organizations to benefit from an easily digestible visualization of their systems and the associated information. It makes it easier to be agile and responsive to changes in markets and consumer demands.

This is because the visualization process galvanizes an organization’s ability to identify areas of improvement, potential innovation and necessary reorganization.

But a theoretical understanding of business process modeling will only get you so far. The following use cases demonstrate the benefits of business process modeling in real life.

Business process modeling (BPM) is a practice that helps organizations understand how their strategy relates to their IT systems and system development.

Business Process Modeling Use Cases

Compliance:

Regulations like the E.U.’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) are requiring businesses across industries to think about their compliance efforts. Business process modeling helps organizations prove what they are doing to meet compliance requirements and understand how changes to their processes impact compliance efforts (and vice versa).

The visualization process can aid in an organization’s ability to understand the security risks associated with a particular process. It also means that should a breach occur, the organization’s greater understanding of its processes and related systems means they can respond with greater agility, mitigate the damage and quickly inform affected parties as required specifically by GDPR.

In the case of an audit, BPM can be used to demonstrate that the organization is cognizant of compliance standards and is doing what is required.

This also extends to other industry-specific compliance mandates, such as those in the healthcare, pharmaceutical and financial services industries.


The Democratization of Information:

Increasing an organization’s ability to retain knowledge is another cross-industry use case for business process modeling. This use case benefits organizations in two key areas:

1. Democratization of information.

By documenting processes, organizations can ensure that knowledge and information is de-siloed and that the organization as a whole can benefit from it. In this case, a key best practice to consider is the introduction of role/user-based access. This way an organization can ensure only the necessary parties can access such information and ensure they are in keeping with compliance standards.

2. Knowledge retention.

By documenting processes and democratizing information, process-specific knowledge can be retained, even when key employees leave. This is particularly important in the case of an aging workforce, where an organization could suffer a “brain drain” as large numbers of employees retire during a short span of time.

Digital Transformation:

Once in a while, a technological revolution turns the nature of business on its head. The most recent, and arguably the most significant, is the rise of data-driven business.

In a relatively short amount of time, data-driven leaders launched and stormed their way to the forefront of their respective industries – think Amazon, Netflix and Uber.

The result? Data is now considered more valuable than oil and industries across the board are seeing digital transformation en masse.

There’s a clear connection between business process modeling and digital transformation initiatives. With it, an organization can explore models to understand information assets within a business context, from internal operations to full customer experiences.

This practice identifies and drives digital transformation opportunities to increase revenue while limiting risks and avoiding regulatory and compliance gaffes.

Organizations that leverage BPM in their digital transformation efforts can use their greater understanding of their current processes to make more informed decisions about future implementations.

And the use cases for business process modeling don’t stop there.

A better understanding of your organization’s processes can also ease software deployments and make mergers and acquisitions (M&A) far easier to handle. Large organizations grow through M&A activity, and the combining of business processes, software applications and infrastructure when two organizations become one is very complex.

Business process modeling offers visibility into existing processes and helps design new processes that will deliver results in a post-merger environment.

The latest guide from the erwin Experts expands on these use cases and details how best to use business process modeling to tame your organization’s complexity and maximize its potential and profits.



What’s Business Process Modeling Got to Do with It? – Choosing A BPM Tool

With business process modeling (BPM) being a key component of data governance, choosing a BPM tool is a decision many businesses either have faced or soon will.

Historically, BPM didn’t necessarily have to be tied to an organization’s data governance initiative.

However, data-driven business and the regulations that oversee it are becoming increasingly extensive, so the need to view data governance as a collective effort – in terms of personnel and the tools that make up the strategy – is becoming harder to ignore.

Data governance also relies on business process modeling and analysis to drive improvement, including identifying business practices susceptible to security, compliance or other risks and adding controls to mitigate exposures.

Choosing a BPM Tool: An Overview

As part of a data governance strategy, a BPM tool aids organizations in visualizing their business processes, system interactions and organizational hierarchies to ensure elements are aligned and core operations are optimized.

The right BPM tool also helps organizations increase productivity, reduce errors and mitigate risks to achieve strategic objectives.

With insights from the BPM tool, you can clarify roles and responsibilities – which in turn should influence an organization’s policies about data ownership and make data lineage easier to manage.

Organizations also can use a BPM tool to identify the staff who function as “unofficial data repositories.” This has both a primary and secondary function:

1. Organizations can document employee processes to ensure vital information isn’t lost should an employee choose to leave.

2. It is easier to identify areas where expertise may need to be bolstered.

Organizations that adopt a BPM tool also enjoy greater process efficiency. This is through a combination of improving existing processes or designing new process flows, eliminating unnecessary or contradictory steps, and documenting results in a shareable format that is easy to understand so the organization is pulling in one direction.


Silo Buster

Understanding the typical use cases for business process modeling is the first step. As with any tech investment, it’s important to understand how the technology will work in the context of your organization/business.

For example, it’s counter-productive to invest in a solution that reduces informational silos only to introduce a new technological silo through a lack of integration.

Ideally, organizations want a BPM tool that works in conjunction with the wider data management platform and data governance initiative – not one that works against them.

That means a solution that supports data imports and integrations from/with external sources, enables in-tool collaboration to reduce departmental silos, and, most crucially, taps into a central metadata repository to ensure consistency across the data management and governance initiative as a whole.

The lack of a central metadata repository is a far too common thorn in an organization’s side. Without it, they have to juggle multiple versions as changes to the underlying data aren’t automatically updated across the platform.

It also means organizations waste crucial time manually managing and maintaining data quality, when an automation framework could achieve the same goal instantaneously, without human error and with greater consistency.

A central metadata repository ensures an organization can acknowledge and get behind a single source of truth. This has a wealth of favorable consequences, including greater cohesion across the organization, better data quality and trust, and faster decision-making with fewer false starts due to plans based on misleading information.

Three Key Questions to Ask When Choosing a BPM Tool

Organizations in the market for a BPM tool should also consider the following:

1. Configurability: Does the tool support the ability to model and analyze business processes with links to data, applications and other aspects of your organization? And how easy is this to achieve?

2. Role-based views: Can the tool develop integrated business models for a single source of truth but with different views for different stakeholders based on their needs – making regulatory compliance more manageable? Does it enable cross-functional and enterprise collaboration through discussion threads, surveys and other social features?

3. Business and IT infrastructure interoperability: How well does the tool integrate with other key components of data governance including enterprise architecture, data modeling, data cataloging and data literacy? Can it aid in providing data intelligence to connect all the pieces of the data management and governance lifecycles?

For more information and to find out how such a solution can integrate with your organization and current data management and data governance initiatives, click here.



Data Mapping Tools: What Are the Key Differentiators?

The need for data mapping tools in light of increasing volumes and varieties of data – as well as the velocity at which it must be processed – is growing.

It’s not difficult to see why either. Data mapping tools have always been a key asset for any organization looking to leverage data for insights.

Isolated units of data are essentially meaningless. By linking data and enabling its categorization in relation to other data units, data mapping provides the context vital for actionable information.

Now with the General Data Protection Regulation (GDPR) in effect, data mapping has become even more significant.

The scale of GDPR’s reach has set a new precedent and is the closest we’ve come to a global standard in terms of data regulations. The repercussions can be huge – just ask Google.

Data mapping tools are paramount in charting a path to compliance with this new, near-global standard and avoiding hefty fines.

Because of GDPR, organizations that may not have fully leveraged data mapping for proactive data-driven initiatives (e.g., analysis) are now adopting data mapping tools with compliance in mind.

Arguably, GDPR’s implementation can be viewed as an opportunity – a catalyst for digital transformation.

Those organizations investing in data mapping tools with compliance as the main driver will definitely want to consider this opportunity and have it influence their decision as to which data mapping tool to adopt.

With that in mind, it’s important to understand the key differentiators in data mapping tools and the associated benefits.


Data Mapping Tools: Automated or Manual?

In terms of differentiators for data mapping tools, perhaps the most distinct is automated data mapping versus data mapping via manual processes.

Data mapping tools that allow for automation mean organizations can benefit from in-depth, quality-assured data mapping, without the significant allocations of resources typically associated with such projects.

Eighty percent of data scientists’ and other data professionals’ time is spent on manual data maintenance. That’s everything from addressing errors and inconsistencies to trying to understand source data or track its lineage. This doesn’t even account for the time lost due to missed errors that contribute to inherently flawed endeavors.

Automated data mapping tools render such issues moot. In turn, data professionals’ time can be put to much better, proactive use, rather than being bogged down with reactive housekeeping tasks.


As well as introducing greater efficiency to the data governance process, automated data mapping tools enable data to be auto-documented from XML that builds mappings for the target repository or reporting structure.

Additionally, a tool that leverages and draws from a single metadata repository means that mappings are dynamically linked with underlying metadata to render automated lineage views, including full transformation logic in real time.

Therefore, changes (e.g., in the data catalog) will be reflected across data governance domains (business process, enterprise architecture and data modeling) as and when they’re made – no more juggling and maintaining multiple, out-of-date versions.

It also enables automatic impact analysis at the table and column level – even for business/transformation rules.

For organizations looking to free themselves from the burden of juggling multiple versions, siloed business processes and a disconnect between interdepartmental collaboration, this feature is a key benefit to consider.
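The automatic impact analysis mentioned above can be illustrated with a minimal sketch. The `LINEAGE` graph and column names below are hypothetical; the idea is that when mappings are dynamically linked to metadata, a change to any column can be traced transitively to every downstream table and column it affects:

```python
# Hypothetical lineage graph: each target column lists its source columns.
LINEAGE = {
    "report.revenue": ["dwh.sales.amount", "dwh.sales.currency"],
    "dwh.sales.amount": ["staging.orders.total"],
}

def impact_of(column):
    """Return every downstream column affected if `column` changes."""
    affected = set()
    for target, sources in LINEAGE.items():
        if column in sources:
            affected.add(target)
            affected |= impact_of(target)  # propagate transitively
    return affected
```

Changing `staging.orders.total` flags both the warehouse column and the report built on it, which is exactly the table- and column-level visibility that manual version juggling cannot provide.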

Data Mapping Tools: Other Differentiators

In light of the aforementioned changes to data regulations, many organizations will need to consider the extent of a data mapping tool’s data lineage capabilities.

The ability to reverse-engineer and document the business logic from your reporting structures for true source-to-report lineage is key because it makes analysis (and the trust in said analysis) easier. And should a data breach occur, affected data/persons can be more quickly identified in accordance with GDPR.

Article 33 of GDPR requires organizations to notify the appropriate supervisory authority “without undue delay and, where feasible, not later than 72 hours” after discovering a breach.

As stated above, a data governance platform that draws from a single metadata source is even more advantageous here.

Mappings can be synchronized with metadata so that source or target metadata changes can be automatically pushed into the mappings – so your mappings stay up to date with little or no effort.
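The metadata-to-mapping synchronization described above can be sketched as follows. The `MAPPINGS` records and `push_metadata_change` helper are hypothetical simplifications, not a real product API; they show the principle that a metadata change is pushed into every mapping drawing from the changed table, keeping mappings current with little or no manual effort:

```python
# Hypothetical metadata sync: when a source table's columns change, the
# change is pushed into every mapping that draws from that table.
MAPPINGS = [
    {"source": "crm.contacts", "columns": ["id", "email"], "target": "dwh.dim_contact"},
    {"source": "erp.orders", "columns": ["order_id"], "target": "dwh.fact_order"},
]

def push_metadata_change(table, new_columns):
    """Update affected mappings and return their targets for review."""
    updated = []
    for m in MAPPINGS:
        if m["source"] == table:
            m["columns"] = list(new_columns)
            updated.append(m["target"])
    return updated
```

Adding a `phone` column to `crm.contacts` automatically refreshes the mapping into `dwh.dim_contact` – no hunting through out-of-date mapping documents.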

The Data Mapping Tool For Data-Driven Businesses

Nobody likes manual documentation. It’s arduous, error-prone and a waste of resources. Quite frankly, it’s dated.

Any organization looking to invest in data mapping, data preparation and/or data cataloging needs to make automation a priority.

With automated data mapping, organizations can achieve “true data intelligence” – the ability to tell the story of how data enters the organization and changes throughout the entire lifecycle up to the consumption/reporting layer. If you’re working harder than your tool, you have the wrong tool.

The manual tools of old do not have auto documentation capabilities, cannot produce outbound code for multiple ETL or script types, and are a liability in terms of accuracy and GDPR.

Automated data mapping is the only path to true GDPR compliance, and erwin Mapping Manager can get you there in a matter of weeks thanks to our robust reverse-engineering technology. 

Learn more about erwin’s automation framework for data governance here.
