Categories
erwin Expert Blog

Top Use Cases for Enterprise Architecture: Architect Everything

Architect Everything: New use cases for enterprise architecture are increasing enterprise architects’ stock in data-driven business

As enterprise architecture has evolved, so too have the use cases for enterprise architecture.

Analyst firm Ovum recently released a new report titled Ovum Market Radar: Enterprise Architecture. In it, they make the case that enterprise architecture (EA) is becoming AE – or “architect everything”.

The transition highlights enterprise architecture’s evolution from a solely IT function to one more closely aligned with the business, hence the shift from EA to AE.

At erwin, we’re definitely witnessing this EA evolution as more and more organizations undertake digital transformation initiatives, including rearchitecting their business models and value streams, as well as responding to increasing regulatory pressures.

This is because EA provides the right information to the right people at the right time for smarter decision-making.

Following are some of the top use cases for enterprise architecture that demonstrate how EA is moving beyond IT and into the business.

Top 7 Use Cases for Enterprise Architecture

Compliance. Enterprise architecture is critical for regulatory compliance. It helps model, manage and transform mission-critical value streams across industries, as well as identify sensitive information. When thousands of employees need to know what compliance processes to follow, such as those associated with regulations like GDPR, HIPAA, SOX and CCPA, it ensures not only access to proper documentation but also current, updated information.

The Regulatory Rationale for Integrating Data Management & Data Governance

Data security/risk management. EA should be commonplace in data security planning. Any flaw in the way data is stored or monitored is a potential ‘in’ for a breach, and so businesses have to ensure security surrounding sensitive information is thorough and covers the whole business. Security should be proactive, not reactive, which is why EA should be a huge part of security planning.

Data governance. Today’s enterprise embraces data governance to drive data opportunities, including growing revenue, and limit data risks, including regulatory and compliance gaffes.

EA solutions that provide much-needed insight into the relationship between data assets and applications make it possible to appropriately direct data usage and flows, as well as focus greater attention, if warranted, on applications where data use delivers optimal business value.

Digital transformation. For an organization to successfully embrace change, innovation, EA and project delivery need to be intertwined and traceable. Enterprise architects are crucial to delivering innovation. Taking an idea from concept to delivery requires strategic planning and the ability to execute. An enterprise architecture roadmap can help focus such plans and many organizations are now utilizing them to prepare their enterprise architectures for 5G.

Mergers & acquisitions. Enterprise architecture is essential to successful mergers and acquisitions. It helps alignment by providing a business-outcome perspective for IT and guiding transformation. It also helps define strategy and models, improving interdepartmental cohesion and communication.

In an M&A scenario, businesses need to ensure their systems are fully documented and rationalized. This way they can comb through their inventories to make more informed decisions about which systems to cut or phase out to operate more efficiently.

Innovation management. EA is crucial to innovation and project delivery: it uses open standards to link to other products within the overall project lifecycle, integrates agile enterprise architecture with agile development, and connects project delivery with effective governance.

It takes a rigorous approach to ensure that current and future states are published for a wider audience for consumption and collaboration – from modeling to generating road maps with meaningful insights provided to both technical and business stakeholders during every step.

Knowledge retention. Unlocking knowledge and then putting systems in place to retain that knowledge is a key benefit of EA. Many organizations lack a structured approach for gathering and investigating employee ideas. Ideas can fall into a black hole where they don’t get feedback and employees become less engaged.

When your enterprise architecture is aligned with your business outcomes, it provides a way to help your business ideate and investigate the viability of ideas on both the technical and business level.

If the benefits of enterprise architecture would help your business, here’s how you can try erwin EA for free.

Enterprise Architecture Business Process Trial


Business Process Can Make or Break Data Governance

Data governance isn’t a one-off project with a defined endpoint. It’s an ongoing initiative that requires active engagement from executives and business leaders.

Data governance, today, comes back to the ability to understand critical enterprise data within a business context, track its physical existence and lineage, and maximize its value while ensuring quality and security.

Free Data Modeling Best Practice Guide

Historically, little attention has focused on what can make or break any data governance initiative — turning it from a launchpad for competitive advantage to a recipe for disaster. Data governance success hinges on business process modeling and enterprise architecture.

To put it even more bluntly, successful data governance* must start with business process modeling and analysis.

*See: Three Steps to Successful & Sustainable Data Governance Implementation

Business Process Data Governance

Passing the Data Governance Ball

For years, data governance was the volleyball passed back and forth over the net between IT and the business, with neither side truly owning it. However, once an organization understands that IT and the business are both responsible for data, it needs to develop a comprehensive, holistic strategy for data governance that is capable of four things:

  1. Reaching every stakeholder in the process
  2. Providing a platform for understanding and governing trusted data assets
  3. Delivering the greatest benefit from data wherever it lives, while minimizing risk
  4. Helping users understand the impact of changes made to a specific data element across the enterprise.

To accomplish this, a modern data governance strategy needs to be interdisciplinary to break down traditional silos. Enterprise architecture is important because it aligns IT and the business, mapping a company’s applications and the associated technologies and data to the business functions and value streams they enable.

Ovum Market Radar: Enterprise Architecture

The business process and analysis component is vital because it defines how the business operates and ensures employees understand and are accountable for carrying out the processes for which they are responsible. Enterprises can clearly define, map and analyze workflows and build models to drive process improvement, as well as identify business practices susceptible to the greatest security, compliance or other risks and where controls are most needed to mitigate exposures.

Slow Down, Ask Questions

In a rush to implement a data governance methodology and system, organizations can forget that a system must serve a process – and be governed/controlled by one.

To choose the correct system and implement it effectively and efficiently, you must know – in every detail – all the processes it will impact. You need to ask these important questions:

  1. How will it impact them?
  2. Who needs to be involved?
  3. When do they need to be involved?

These questions are the same ones we ask in data governance. They involve impact analysis, ownership and accountability, control and traceability – all of which effectively documented and managed business processes enable.

Data sets are not important in and of themselves. They become important in terms of how they are used, who uses them and for what purpose – and all of this is described in the processes that generate, manipulate and use them. So unless we know what those processes are, how can any data governance implementation be complete or successful?

Processes need to be open and shared in a concise, consistent way so all parts of the organization can investigate, ask questions, and then add their feedback and information layers. In other words, processes need to be alive and central to the organization because only then will the use of data and data governance be truly effective.

A Failure to Communicate

Consider this scenario: We’ve perfectly captured our data lineage, so we know what our data sets mean, how they’re connected, and who’s responsible for them – not a simple task but a massive win for any organization. Now a breach occurs. Will any of the above information tell us why it happened? Or where? No! It will tell us what else is affected and who can manage the data layer(s), but unless we find and address the process failure that led to the breach, it is guaranteed to happen again.

By knowing where data is used – the processes that use and manage it – we can quickly, even instantly, identify where a failure occurs. Starting with data lineage (meaning our forensic analysis starts from our data governance system), we can identify the source and destination processes and the associated impacts throughout the organization.

We can know which processes need to change and how. We can anticipate the pending disruptions to our operations and, more to the point, the costs involved in mitigating and/or addressing them.

But knowing all the above requires that our processes – our essential and operational business architecture – be accurately captured and modelled. Instituting data governance without processes is like building a castle on sand.

Rethinking Business Process Modeling and Analysis

Modern organizations need a business process modeling and analysis tool with easy access to all the operational layers across the organization – from high-level business architecture all the way down to data.

Such a system should be flexible, adjustable, easy-to-use and capable of supporting multiple layers simultaneously, allowing users to start in their comfort zones and mature as they work toward their organization’s goals.

The erwin EDGE is one of the most comprehensive software platforms for managing an organization’s data governance and business process initiatives, as well as the whole data architecture. It allows natural, organic growth throughout the organization, and by bringing data governance and business process management together under one platform, it provides a uniquely integrated, collaborative data governance experience.

Start your free, cloud-based trial of erwin Business Process and see how some of the world’s largest enterprises have benefited from its centralized repository and integrated, role-based views.

We’d also be happy to show you our data governance software, which includes data cataloging and data literacy capabilities.



Internal Business Process Modeling: The Secret Behind Exponential Organizations

Strong internal business process modeling and management helps data-driven organizations compete and lead

In short, an internal business process is a documented account of how things should be done to maximize efficiency and achieve a particular goal.

In the book “Exponential Organizations,” authors Salim Ismail, Michael S. Malone and Yuri van Geest examine how every company is or will evolve into an information-based entity in which costs fall to nearly zero, abundance replaces scarcity and only “exponential organizations” survive.

It’s not news that exponential organizations like Uber, Airbnb and Netflix have flipped the script on disrupting traditional industries like taxis, hotels and video rentals/TV viewing.

But now, even traditional industries like healthcare and financial services, which were historically slow to innovate, are transforming at breakneck speed.

Let’s face it: in today’s hyper-competitive markets, the traditional approach of relying on legacy strengths or inertia for survival simply won’t work.

The days of enterprises focusing almost exclusively on rigid structures, centralized management and accountability; concentrated knowledge; service mainly to external customers; and reactive, short-term strategy alignment driven mainly by massive-scale projects are antiquated.

The information within your organization’s internal business processes is where the data your company collects, creates, stores and analyzes actually transforms into something that makes your company go, hopefully for the long haul.

Internal Business Process Modeling - Exponential Organizations

The Value of Internal Business Process Modeling

Organizations are built on a series of internal business processes. The complexity of modern data-driven organizations requires processes to work in tandem to create and sustain value.

The degree to which any individual internal business process drives value can vary, but even the most seemingly mundane processes are part of a collective whole that is greater than the sum of its parts.

Therefore, it’s critical for organizations to map their internal business processes to understand how a given action relates to the organization’s overall strategy and goals.

Such knowledge is at the core of exponential organizations. They understand how any given internal business process relates to value creation, making it far easier to assess what’s currently working, identify areas for improvement and spot potential for competitive differentiation.

Exponential organizations also are better positioned to respond and adapt to disruptive forces, such as 5G. This is because understanding what and how you do things now makes it easier to implement change in an agile manner.

5G Roadmap: Preparing Your Enterprise Architecture

How do you join the ranks of exponential organizations? And where do you begin your journey to becoming an information-based entity?

Attitude Adjustment

More and more organizations are realizing they need to adjust their traditional thinking and subsequent actions, even if just a bit, to gain strategic advantage, reduce costs and retain market dominance. For example:

  1. Structures are becoming more adaptable, allowing for greater flexibility and cost management. How is this possible and why now? Organizations are grasping that effective, well-managed and documented internal business processes should form their operational backbones.
  2. Business units and the departments within them are becoming accountable not only for their own budgets but also for how well they achieve their goals. This is possible because their responsibilities and processes can be clearly defined, documented and then monitored to ensure their work is executed in a repeatable, predictable and measurable way.
  3. Knowledge is now both centralized and distributed thanks to modern knowledge management systems. Central repositories and collaborative portals give everyone within the organization equal access to the data they need to do their jobs more effectively and efficiently.
  4. And thanks to all the above, organizations can expand their focus from external customers to internal ones as well. By clearly identifying individual processes (and their cross-business handover points) and customer touchpoints, organizations can interact with any customer at the right point with the most appropriate resources.

Benefits of Internal Business Process Modeling and Management

One of the main benefits of a process-based organizational engine is the ability to better handle outside pressures, such as new regulations. Once processes (and their encompassing business architecture) become central to the organization, a wide array of things become simpler, faster and cheaper.

Another benefit is application design – the holy grail or black hole of budgetary spending and project management, depending on your point of view – is streamlined, with requirements clearly gathered and managed in perfect correspondence to the processes they serve and with the data they manage clearly documented and communicated to the developers.

Testing occurs against real-life scenarios by the responsible parties as documented by the process owners – a drastic departure from the more traditional approaches in which the responsibility fell to designated, usually technical application owners.

Finally – and most important – data governance is no longer the isolated domain of data architects but central to the everyday processes that make an organization tick. As processes have stakeholders who use information – data – the roles of technical owners and data stewards become integral to ensuring processes operate efficiently, effectively and – above all – without interruptions. On the other side of this coin, data owners and data stewards no longer operate in their own worlds, distant from the processes their data supports.

Carpe Process

All modern organizations should seize business process as a central component of their operations, with data governance a second and cost management a third driver for the enterprise machine. But as we all know, it takes more than stable connecting rods to make an engine work – it needs cogs and wheels, belts and multiple power sources, all working together.

In the traditional organization, people are the internal mechanics. These days, powerful and flexible workflow engines provide much-needed automation for greater visibility plus more power, stability and quality – all the things a machine needs to operate as required/designed.

Advanced process management systems are becoming essential, not optional. And while not as sexy or attention-grabbing as other technologies, they provide the power to drive an organization toward its goals quickly, cost-effectively and efficiently.

To learn how erwin can empower a modern, process-based organization, please click here.



Top 5 Data Catalog Benefits

A data catalog benefits organizations in a myriad of ways. With the right data catalog tool, organizations can automate enterprise metadata management – including data cataloging, data mapping, data quality and code generation for faster time to value and greater accuracy for data movement and/or deployment projects.

Data cataloging helps curate internal and external datasets for a range of content authors. Gartner says this doubles business benefits and ensures effective management and monetization of data assets in the long term when linked to broader data governance, data quality and metadata management initiatives.

Beyond this, the importance of data cataloging is growing. In the regulated data world (GDPR, HIPAA, etc.), organizations need a good understanding of their data lineage – and the data catalog benefits to data lineage are substantial.

Data lineage is a core operational business component of data governance technology architecture, encompassing the processes and technology to provide full-spectrum visibility into the ways data flows across an enterprise.

There are a number of different approaches to data lineage. Here, I outline the common approach, and the approach incorporating data cataloging – including the top 5 data catalog benefits for understanding your organization’s data lineage.

Data Catalog Benefits

Data Lineage – The Common Approach

The most common approach for assembling a collection of data lineage mappings traces data flows in a reverse manner. The process begins with the target, or data endpoint, and then traverses the processes, applications and ETL tasks in reverse from that target.

For example, to determine the mappings for the data pipelines populating a data warehouse, a data lineage tool might begin with the data warehouse and examine the ETL tasks that immediately precede the loading of the data into the target warehouse.

The data sources that feed the ETL process are added to a “task list,” and the process is repeated for each of those sources. At each stage, the discovered pieces of lineage are documented. At the end of the sequence, the process will have reverse-mapped the pipelines for populating that warehouse.
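The reverse-mapping sequence above can be sketched as a breadth-first walk over an upstream map. The store names, task names and `upstream` structure below are invented for illustration, not taken from any particular lineage tool:

```python
from collections import deque

# Hypothetical upstream map: for each data store, the (source, ETL task)
# pairs that feed it.
upstream = {
    "warehouse": [("staging", "etl_load")],
    "staging": [("orders_db", "extract_orders"), ("crm_db", "extract_crm")],
}

def trace_lineage(target):
    """Reverse-map the pipelines feeding `target`, breadth-first."""
    mappings = []                # discovered (source, task, destination) edges
    task_list = deque([target])  # the "task list" of stores still to inspect
    seen = {target}
    while task_list:
        dest = task_list.popleft()
        for source, task in upstream.get(dest, []):
            mappings.append((source, task, dest))
            if source not in seen:       # repeat the process for each source
                seen.add(source)
                task_list.append(source)
    return mappings

print(trace_lineage("warehouse"))
```

Running this reverse-maps the two-hop pipeline into the warehouse, documenting each discovered piece of lineage along the way.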

While this approach does produce a collection of data lineage maps for selected target systems, there are some drawbacks.

  • First, this approach focuses only on assembling the data pipelines populating the selected target system but does not necessarily provide a comprehensive view of all the information flows and how they interact.
  • Second, this process produces the information that can be used for a static view of the data pipelines, but the process needs to be executed on a regular basis to account for changes to the environment or data sources.
  • Third, and probably most important, this process produces a technical view of the information flow, but it does not necessarily provide any deeper insights into the semantic lineage, or how the data assets map to the corresponding business usage models.

A Data Catalog Offers an Alternate Data Lineage Approach

An alternate approach to data lineage combines data discovery and the use of a data catalog that captures data asset metadata with a data mapping framework that documents connections between the data assets.

This data catalog approach also takes advantage of automation, but in a different way: using platform-specific data connectors, the tool scans the environments where each data asset is stored and imports the asset’s metadata into the data catalog.

When data asset structures are similar, the tool can compare data element domains and value sets, and automatically create the data mapping.
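That comparison of element names and domains can be sketched very simply; the asset structures and the matching rule below are hypothetical, and real catalog tools compare value sets and data profiles far more rigorously:

```python
# Illustrative metadata for two similar data assets (element -> domain/type).
customers_src = {"cust_id": "int", "email": "str", "region": "str"}
customers_tgt = {"cust_id": "int", "email": "str", "segment": "str"}

def auto_map(source, target):
    """Propose source-to-target mappings where element names and domains match."""
    return {
        col: col
        for col, domain in source.items()
        if target.get(col) == domain  # same name, compatible domain
    }

print(auto_map(customers_src, customers_tgt))  # {'cust_id': 'cust_id', 'email': 'email'}
```

Elements without a confident match (here, `region` vs. `segment`) are left for a data steward to map manually, as the article notes below.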

In turn, the data catalog approach performs data discovery using the same data connectors to parse the code involved in data movement, such as major ETL environments and procedural code – basically any executable task that moves data.

The information collected through this process is reverse engineered to create mappings from source data sets to target data sets based on what was discovered.

For example, you can map the databases used for transaction processing, determine that subsets of the transaction processing database are extracted and moved to a staging area, and then parse the ETL code to infer the mappings.

These direct mappings also are documented in the data catalog. In cases where the mappings are not obvious, a tool can help a data steward manually map data assets into the catalog.

The result is a data catalog that incorporates the structural and semantic metadata associated with each data asset as well as the direct mappings for how that data set is populated.

Learn more about data cataloging.

Value of Data Intelligence IDC Report

And this is a powerful representative paradigm – instead of capturing a static view of specific data pipelines, it allows a data consumer to request a dynamically-assembled lineage from the documented mappings.

By interrogating the catalog, the current view of any specific data lineage can be rendered on the fly that shows all points of the data lineage: the origination points, the processing stages, the sequences of transformations, and the final destination.

Materializing the “current active lineage” dynamically reduces the risk of having an older version of the lineage that is no longer relevant or correct. When new information is added to the data catalog (such as a newly added data source or a modification to the ETL code), dynamically generated views of the lineage will be kept up to date automatically.
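One way to picture this on-the-fly materialization is a recursive walk over the documented mappings, from origination points to final destination. The catalog entries below are illustrative assumptions, not erwin's actual model:

```python
# A minimal catalog of documented mappings: (source, transformation, target).
catalog = [
    ("orders_db", "extract", "staging"),
    ("staging", "cleanse", "conformed"),
    ("conformed", "aggregate", "warehouse"),
]

def materialize_lineage(asset):
    """Assemble the current lineage for `asset` on demand from the catalog."""
    lineage = []
    for source, transform, target in catalog:
        if target == asset:
            lineage.extend(materialize_lineage(source))  # origination points first
            lineage.append((source, transform, target))  # then this stage
    return lineage

print(materialize_lineage("warehouse"))
```

Because the view is rendered from the catalog each time it is requested, appending a new mapping to `catalog` updates every subsequently rendered lineage with no separate maintenance step.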

Top 5 Data Catalog Benefits for Understanding Data Lineage

A data catalog benefits data lineage in the following five distinct ways:

1. Accessibility

The data catalog approach allows the data consumer to query the tool to materialize specific data lineage mappings on demand.

2. Currency

The data lineage is rendered from the most current data in the data catalog.

3. Breadth

As the number of data assets documented in the data catalog increases, the scope of the materializable lineage expands accordingly. With all corporate data assets cataloged, any (or all!) data lineage mappings can be produced on demand.

4. Maintainability and Sustainability

Since the data lineage mappings are not managed as distinct artifacts, there are no additional requirements for maintenance. As long as the data catalog is kept up to date, the data lineage mappings can be materialized.

5. Semantic Visibility

In addition to visualizing the physical movement of data across the enterprise, the data catalog approach allows the data steward to associate business glossary terms, data element definitions, data models, and other semantic details with the different mappings. Additional visualization methods can demonstrate where business terms are used, how they are mapped to different data elements in different systems, and the relationships among these different usage points.

One can impose additional data governance controls with project management oversight, which allows you to designate data lineage mappings in terms of the project life cycle (such as development, test or production).

Aside from these data catalog benefits, this approach allows you to reduce the amount of manual effort for accumulating the information for data lineage and continually reviewing the data landscape to maintain consistency, thus providing a greater return on investment for your data intelligence budget.



The Top 8 Benefits of Data Lineage

It’s important to recognize the benefits of data lineage.

As corporate data governance programs have matured, the inventory of agreed-to data policies has grown rapidly. These include guidelines for data quality assurance, regulatory compliance and data democratization, among other information utilization initiatives.

Organizations that are challenged by translating their defined data policies into implemented processes and procedures are starting to identify tools and technologies that can supplement the ways organizational data policies can be implemented and practiced.

One such technique, data lineage, is gaining prominence as a core operational business component of the data governance technology architecture. Data lineage encompasses processes and technology to provide full-spectrum visibility into the ways that data flows across the enterprise.

To data-driven businesses, the benefits of data lineage are significant. Data lineage tools are used to survey, document and enable data stewards to query and visualize the end-to-end flow of information units from their origination points through the series of transformation and processing stages to their final destination.

Benefits of Data Lineage

The Benefits of Data Lineage

Data stewards are attracted to data lineage because the benefits of data lineage help in a number of different governance practices, including:

1. Operational intelligence

At its core, data lineage captures the mappings of the rapidly growing number of data pipelines in the organization. Visualizing the information flow landscape provides insight into the “demographics” of data consumption and use, answering questions such as “what data sources feed the greatest number of downstream sources” or “which data analysts use data that is ingested from a specific data source.” Collecting this intelligence about the data landscape better positions the data stewards for enforcing governance policies.
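A question like “what data sources feed the greatest number of downstream sources” reduces to a fan-out count over the lineage mappings. The edges here are invented for illustration:

```python
from collections import Counter

# Hypothetical lineage edges: (source, downstream consumer).
edges = [
    ("crm_db", "marketing_mart"),
    ("crm_db", "sales_mart"),
    ("erp_db", "finance_mart"),
    ("crm_db", "churn_model"),
]

# Count how many downstream targets each source feeds.
fan_out = Counter(source for source, _ in edges)
print(fan_out.most_common(1))  # [('crm_db', 3)]
```

The same edge list, indexed the other way, answers the companion question of which analysts or applications consume a given source.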

2. Business terminology consistency

One of the most confounding data governance challenges is understanding the semantics of business terminology within data management contexts. Because application development was traditionally isolated within each business function, the same (or similar) terms are used in different data models, even though the designers did not take the time to align definitions and meanings. Data lineage allows the data stewards to find common business terms, review their definitions, and determine where there are inconsistencies in the ways the terms are used.

3. Data incident root cause analysis

It has long been asserted that when a data consumer finds a data error, the error most likely was introduced into the environment at an earlier stage of processing. Yet without a “roadmap” that indicates the processing stages through which the data were processed, it is difficult to speculate where the error was actually introduced. Using data lineage, though, a data steward can insert validation probes within the information flow to validate data values and determine the stage in the data pipeline where an error originated.
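The validation-probe idea can be sketched as a rule check after each stage of the pipeline. The stages, the rule and the deliberately flawed load step below are all invented for illustration:

```python
def extract(rows):
    return rows

def standardize(rows):
    return [{**row, "amount": round(row["amount"], 2)} for row in rows]

def load(rows):
    # Deliberate flaw injected here so the probe has something to find.
    return [{**row, "amount": -row["amount"]} for row in rows]

def run_with_probes(rows, stages, rule):
    """Run each stage, probing values after it; return stages where the rule fails."""
    failing = []
    for name, stage in stages:
        rows = stage(rows)
        if not all(rule(row) for row in rows):  # validation probe
            failing.append(name)
    return failing

stages = [("extract", extract), ("standardize", standardize), ("load", load)]
rule = lambda row: row["amount"] >= 0  # data policy: amounts are non-negative
print(run_with_probes([{"amount": 100.0}], stages, rule))  # ['load']
```

The probe isolates the stage where the error originated, which is exactly the “roadmap” the paragraph above describes.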

4. Data quality remediation assessment

Root cause analysis is just the first part of the data quality process. Once the data steward has determined where the data flaw was introduced, the next step is to determine why the error occurred. Again, using a data lineage mapping, the steward can trace backward through the information flow to examine the standardizations and transformations applied to the data, validate that transformations were correctly performed, or identify one (or more) performed incorrectly, resulting in the data flaw.

5. Impact analysis

The enterprise is always subject to changes; externally-imposed requirements (such as regulatory compliance) evolve, internal business directives may affect user expectations, and ingested data source models may change unexpectedly. When there is a change to the environment, it is valuable to assess the impacts to the enterprise application landscape. In the event of a change in data expectations, data lineage provides a way to determine which downstream applications and processes are affected by the change and helps in planning for application updates.
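Given lineage mappings, the set of affected downstream applications is the transitive closure from the changed asset. The asset names below are hypothetical:

```python
# Illustrative downstream map: which assets each asset feeds.
downstream = {
    "customer_api": ["staging"],
    "staging": ["warehouse", "ml_features"],
    "warehouse": ["finance_report"],
}

def impacted(asset, seen=None):
    """Return every downstream asset affected by a change to `asset`."""
    seen = set() if seen is None else seen
    for child in downstream.get(asset, []):
        if child not in seen:
            seen.add(child)
            impacted(child, seen)  # follow the change transitively
    return seen

print(sorted(impacted("customer_api")))
```

A change to `customer_api` surfaces every dependent process, which is the input needed for planning application updates.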

6. Performance assessment

Not only does lineage provide a collection of mappings of data pipelines, it allows for the identification of potential performance bottlenecks. Data pipeline stages with many incoming paths are candidate bottlenecks. Using a set of data lineage mappings, the performance analyst can profile execution times across different pipelines and redistribute processing to eliminate bottlenecks.

7. Policy compliance

Data policies can be implemented through the specification of business rules. Compliance with these business rules can be facilitated using data lineage by embedding business rule validation controls across the data pipelines. These controls can generate alerts when there are noncompliant data instances.
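A business-rule validation control of this kind might look like the following sketch; the rules and records are invented, and a production control would route alerts to a monitoring system rather than return them:

```python
# Data policies expressed as named business rules (hypothetical examples).
rules = {
    "email_present": lambda row: bool(row.get("email")),
    "amount_non_negative": lambda row: row.get("amount", 0) >= 0,
}

def validate(rows):
    """Check each record against every rule; emit an alert per violation."""
    alerts = []
    for i, row in enumerate(rows):
        for name, rule in rules.items():
            if not rule(row):
                alerts.append(f"row {i}: violates {name}")
    return alerts

print(validate([
    {"email": "a@b.com", "amount": 10},  # compliant
    {"email": "", "amount": -5},         # violates both rules
]))
```

Embedding a control like this at each pipeline stage turns the written policy into something the lineage mappings can locate and audit.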

8. Auditability of data pipelines

In many cases, regulatory compliance is a combination of enforcing a set of defined data policies along with a capability for demonstrating that the overall process is compliant. Data lineage provides visibility into the data pipelines and information flows that can be audited, thereby supporting the compliance process.

Evaluating Enterprise Data Lineage Tools

While data lineage benefits are obvious, large organizations with complex data pipelines and data flows do face challenges in embracing the technology to document the enterprise data pipelines. These include:

  • Surveying the enterprise – Gathering information about the sources, flows and configurations of data pipelines.
  • Maintenance – Configuring a means to maintain an up-to-date view of the data pipelines.
  • Deliverability – Providing a way to give data consumers visibility to the lineage maps.
  • Sustainability – Ensuring sustainability of the processes for producing data lineage mappings.

Producing a collection of up-to-date data lineage mappings that are easily reviewed by different data consumers depends on addressing these challenges. When considering data lineage tools, keep these issues in mind when evaluating how well the tools can meet your data governance needs.

erwin Data Intelligence (erwin DI) helps organizations automate their data lineage initiatives. Learn more about data lineage with erwin DI.

Value of Data Intelligence IDC Report

Categories
erwin Expert Blog

Why EA Needs to Be Part of Your Digital Transformation Strategy

Enterprise architecture (EA) isn’t dead; you’re just using it wrong. Part three of erwin’s digital transformation blog series.

I’ll let you in on a little secret: the rumor of enterprise architecture’s demise has been greatly exaggerated. However, the truth for many of today’s fast-moving businesses is that enterprise architecture fails. But why?

Enterprise architecture is invaluable for internal business intelligence (but is rarely used for real intelligence), governance (but often has a very narrow focus), management insights (but doesn’t typically provide useful insights), and transformation and planning (ok, now we have something!).

In reality, most organizations do not leverage EA teams to their true potential. Instead they rely on consultants, trends, regulations and legislation to drive strategy.

Why does this happen?

Don’t Put Enterprise Architecture in a Corner

EA has remained in its traditional comfort zone of IT. EA is not only about IT, yet it lives within IT and focuses on IT, and therefore loses its business dimension and support.

It remains isolated and is rarely, if ever, involved in:

  • Assessing, planning and running business transformation initiatives
  • Providing real, enterprise-wide insights
  • Producing actionable initiatives

Instead, it focuses on managing “stuff”:

  • Understanding existing “stuff” by gathering exhaustively detailed information
  • Running “stuff”-deployment projects
  • Managing cost “stuff”
  • “Moving to the cloud” (the solution to … everything)


What Prevents Enterprise Architecture from Being Successful?

There are three main reasons why EA has been pigeon-holed:

  1. Lack of trust in the available information
    • Information is mostly collected, entered and maintained manually
    • Automated data collection and connection is costly and error-prone
    • Identification of issues can be very difficult and time-consuming
  2. Lack of true asset governance and collaboration
    • Enterprise architecture becomes ring-fenced within a department
    • Few stakeholders willing to be actively involved in owning assets and be responsible for them
    • Collaboration on EA is seen as secondary and mostly focused on reports and status updates
  3. Lack of practical insights (insights, analyses and management views)
    • Too small and narrow thinking of what EA can provide
    • The few analyses performed focus on immediate questions, rarely planning and strategy

Because of this, EA fails to deliver the relevant insights that management needs to make decisions – in a timely manner – and loses its credibility.

But the fact is EA should be, and was designed to be, about actionable insights leading to innovative architecture, not about only managing “stuff!”

Don’t Slow Your Roll. Elevate Your Role.

It’s clear that the role of EA in driving digital transformation needs to be elevated. It needs to be a strategic partner with the business.

According to a McKinsey report on the “Five Enterprise-Architecture Practices That Add Value to Digital Transformations,” EA teams need to:

“Translate architecture issues into terms that senior executives will understand. Enterprise architects can promote closer alignment between business and IT by helping to translate architecture issues for business leaders and managers who aren’t technology savvy. Engaging senior management in discussions about enterprise architecture requires management to dedicate time and actively work on technology topics. It also requires the EA team to explain technology matters in terms that business leaders can relate to.”

With that said, to further change the perception of EA within the organization, you need to serve what management needs. To do this, enterprise architects need to develop innovative business insights, not IT insights, and make them dynamic. Next, enterprise architects need to gather information they can trust and then maintain it.

To provide these strategic insights, you don’t need to focus on everything — you need to focus on what management wants you to focus on. The rest is just IT being IT. And, finally, you need to collaborate – like your life depends on it.

Giving Digital Transformation an Enterprise Architecture EDGE

The job of enterprise architecture is to provide the tools and insights the C-suite, and other business stakeholders, need to deploy strategies for business transformation.

Let’s say the CEO has a brilliant idea and wants to test it. This is EA’s sweet spot and opportunity to shine. And this is where erwin lives by providing an easy, automated way to deliver collaboration, speed and responsiveness.

erwin is about providing the right information to the right people at the right time. We are focused on empowering the forward-thinking enterprise architect by providing:

  • Superb, near real-time understanding of information
  • Excellent, intuitive collaboration
  • Dynamic, interactive dashboards (vertical and horizontal)
  • Actual, realistic, business-oriented insights
  • Assessment, planning and implementation support

Data-Driven Business Transformation


Constructing a Digital Transformation Strategy: Putting the Data in Digital Transformation

Having a clearly defined digital transformation strategy is an essential best practice for successful digital transformation. But what makes a digital transformation strategy viable?

Part Two of the Digital Transformation Journey …

In our last blog on driving digital transformation, we explored how business architecture and process (BP) modeling are pivotal factors in a viable digital transformation strategy.

EA and BP modeling squeeze risk out of the digital transformation process by helping organizations really understand their businesses as they are today. They give organizations the ability to identify existing challenges and opportunities, and provide a low-cost, low-risk environment to model new options and collaborate with key stakeholders to figure out what needs to change, what shouldn’t change, and what the most important changes are.

Once you’ve determined what part(s) of your business you’ll be innovating — the next step in a digital transformation strategy is using data to get there.


Constructing a Digital Transformation Strategy: Data Enablement

Many organizations prioritize data collection as part of their digital transformation strategy. However, few organizations truly understand their data or know how to consistently maximize its value.

If your business is like most, you collect and analyze some data from a subset of sources to make product improvements, enhance customer service, reduce expenses and inform other, mostly tactical decisions.

The real question is: are you reaping all the value you can from all your data? Probably not.

Most organizations don’t use all the data they’re flooded with to reach deeper conclusions or make other strategic decisions. They don’t know exactly what data they have or even where some of it is, and they struggle to integrate known data in various formats and from numerous systems—especially if they don’t have a way to automate those processes.

How does your business become more adept at wringing all the value it can from its data?

The reality is there’s not enough time, people and money for true data management using manual processes. Therefore, an automation framework for data management has to be part of the foundations of a digital transformation strategy.

Your organization won’t be able to take complete advantage of analytics tools to become data-driven unless you establish a foundation for agile and complete data management.

You need automated data mapping and cataloging through the integration lifecycle process, inclusive of data at rest and data in motion.

An automated, metadata-driven framework for cataloging data assets and their flows across the business provides an efficient, agile and dynamic way to generate data lineage from operational source systems (databases, data models, file-based systems, unstructured files and more) across the information management architecture; construct business glossaries; assess what data aligns with specific business rules and policies; and inform how that data is transformed, integrated and federated throughout business processes—complete with full documentation.

Without this framework and the ability to automate many of its processes, business transformation will be stymied. Companies, especially large ones with thousands of systems, files and processes, will be particularly challenged by taking a manual approach. Outsourcing these data management efforts to professional services firms only delays schedules and increases costs.

With automation, data quality is systemically assured. The data pipeline is seamlessly governed and operationalized to the benefit of all stakeholders.

Constructing a Digital Transformation Strategy: Smarter Data

Ultimately, data is the foundation of the new digital business model. Companies that have the ability to harness, secure and leverage information effectively may be better equipped than others to promote digital transformation and gain a competitive advantage.

While data collection and storage continues to happen at a dramatic clip, organizations typically analyze and use less than 0.5 percent of the information they take in – that’s a huge loss of potential. Companies have to know what data they have and understand what it means in common, standardized terms so they can act on it to the benefit of the organization.

Unfortunately, organizations spend a lot more time searching for data rather than actually putting it to work. In fact, data professionals spend 80 percent of their time looking for and preparing data and only 20 percent of their time on analysis, according to IDC.

The solution is data intelligence. It improves IT and business data literacy and knowledge, supporting enterprise data governance and business enablement.

It helps solve the lack of visibility and control over “data at rest” in databases, data lakes and data warehouses and “data in motion” as it is integrated with and used by key applications.

Organizations need a real-time, accurate picture of the metadata landscape to:

  • Discover data – Identify and interrogate metadata from various data management silos.
  • Harvest data – Automate metadata collection from various data management silos and consolidate it into a single source.
  • Structure and deploy data sources – Connect physical metadata to specific data models, business terms, definitions and reusable design standards.
  • Analyze metadata – Understand how data relates to the business and what attributes it has.
  • Map data flows – Identify where to integrate data and track how it moves and transforms.
  • Govern data – Develop a governance model to manage standards, policies and best practices and associate them with physical assets.
  • Socialize data – Empower stakeholders to see data in one place and in the context of their roles.
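The discover-and-harvest steps above can be sketched as a consolidation of per-silo metadata extracts into one keyed catalog. This is a toy illustration, not erwin's implementation; the silo, table and column names are invented, and a real harvester would query each system's catalog tables or APIs rather than static dictionaries:

```python
# Hypothetical per-silo metadata extracts (illustrative names).
SILO_EXTRACTS = {
    "crm_db": [{"table": "customers", "column": "email", "type": "varchar"}],
    "warehouse": [{"table": "dim_customer", "column": "email_addr", "type": "string"}],
}

def harvest(extracts):
    """Consolidate metadata from every silo into a single catalog,
    keyed by fully qualified asset name."""
    catalog = {}
    for silo, columns in extracts.items():
        for col in columns:
            key = f"{silo}.{col['table']}.{col['column']}"
            catalog[key] = {"type": col["type"], "source": silo}
    return catalog

catalog = harvest(SILO_EXTRACTS)
print(sorted(catalog))
```

Once everything lives under one namespace, the later steps (linking to business terms, mapping flows, attaching policies) become annotations on these catalog entries.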

The Right Tools

When it comes to digital transformation (like most things), organizations want to do it right. Do it faster. Do it cheaper. And do it without the risk of breaking everything. To accomplish all of this, you need the right tools.

The erwin Data Intelligence (DI) Suite is the heart of the erwin EDGE platform for creating an “enterprise data governance experience.” erwin DI combines data cataloging and data literacy capabilities to provide greater awareness of and access to available data assets, guidance on how to use them, and guardrails to ensure data policies and best practices are followed.

erwin Data Catalog automates enterprise metadata management, data mapping, reference data management, code generation, data lineage and impact analysis. It efficiently integrates and activates data in a single, unified catalog in accordance with business requirements. With it, you can:

  • Schedule ongoing scans of metadata from the widest array of data sources.
  • Keep metadata current with full versioning and change management.
  • Easily map data elements from source to target, including data in motion, and harmonize data integration across platforms.

erwin Data Literacy provides self-service, role-based, contextual data views. It also provides a business glossary for the collaborative definition of enterprise data in business terms, complete with built-in accountability and workflows. With it, you can:

  • Enable data consumers to define and discover data relevant to their roles.
  • Facilitate the understanding and use of data within a business context.
  • Ensure the organization is fluent in the language of data.

With data governance and intelligence, enterprises can discover, understand, govern and socialize mission-critical information. And because many of the associated processes can be automated, you reduce errors and reliance on technical resources while increasing the speed and quality of your data pipeline to accomplish whatever your strategic objectives are, including digital transformation.

Check out our latest whitepaper, Data Intelligence: Empowering the Citizen Analyst with Democratized Data.

Data Intelligence: Empowering the Citizen Analyst with Democratized Data


Using Strategic Data Governance to Manage GDPR/CCPA Complexity

In light of recent, high-profile data breaches, it’s past time we re-examined strategic data governance and its role in managing regulatory requirements.

News broke earlier this week of British Airways being fined 183 million pounds – or $228 million – by the U.K. for alleged violations of the European Union’s General Data Protection Regulation (GDPR). While not the first, it is the largest penalty levied since the GDPR went into effect in May 2018.

Given this, Oppenheimer & Co. cautions:

“European regulators could accelerate the crackdown on GDPR violators, which in turn could accelerate demand for GDPR readiness. Although the CCPA [California Consumer Privacy Act, the U.S. equivalent of GDPR] will not become effective until 2020, we believe that new developments in GDPR enforcement may influence the regulatory framework of the still fluid CCPA.”

With all the advance notice and significant chatter about GDPR/CCPA, why aren’t organizations more prepared to deal with data regulations?

In a word? Complexity.

The complexity of regulatory requirements in and of themselves is aggravated by the complexity of the business and data landscapes within most enterprises.

So it’s important to understand how to use strategic data governance to manage the complexity of regulatory compliance and other business objectives …

Designing and Operationalizing Regulatory Compliance Strategy

It’s not easy to design and deploy compliance in an environment that’s not well understood and difficult to maneuver in. First you need to analyze and design your compliance strategy and tactics, and then you need to operationalize them.

Modern, strategic data governance, which involves both IT and the business, enables organizations to plan and document how they will discover and understand their data within context, track its physical existence and lineage, and maximize its security, quality and value. It also helps enterprises put these strategic capabilities into action by:

  • Understanding their business, technology and data architectures and their inter-relationships, aligning them with their goals and defining the people, processes and technologies required to achieve compliance.
  • Creating and automating a curated enterprise data catalog, complete with physical assets, data models, data movement, data quality and on-demand lineage.
  • Activating their metadata to drive agile data preparation and governance through integrated data glossaries and dictionaries that associate policies to enable stakeholder data literacy.


Five Steps to GDPR/CCPA Compliance

With the right technology, GDPR/CCPA compliance can be automated and accelerated in these five steps:

  1. Catalog systems

Harvest, enrich/transform and catalog data from a wide array of sources to enable any stakeholder to see the interrelationships of data assets across the organization.

  2. Govern PII “at rest”

Classify, flag and socialize the use and governance of personally identifiable information regardless of where it is stored.

  3. Govern PII “in motion”

Scan, catalog and map personally identifiable information to understand how it moves inside and outside the organization and how it changes along the way.

  4. Manage policies and rules

Govern business terminology in addition to data policies and rules, depicting relationships to physical data catalogs and the applications that use them with lineage and impact analysis views.

  5. Strengthen data security

Identify regulatory risks and guide the fortification of network and encryption security standards and policies by understanding where all personally identifiable information is stored, processed and used.
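The classification step that underpins steps 2 and 3 can be sketched as pattern-based scanning of column samples. This is deliberately simplistic — production PII scanners use far richer detection than two regexes — and the patterns and column samples here are invented for illustration:

```python
import re

# Simple pattern-based PII classifiers (illustrative only).
PII_PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify_column(sample_values):
    """Flag a column with PII tags when any sampled value matches
    a known pattern; an empty result means no PII detected."""
    tags = set()
    for value in sample_values:
        for tag, pattern in PII_PATTERNS.items():
            if pattern.search(str(value)):
                tags.add(tag)
    return tags

print(classify_column(["alice@example.com", "bob@example.com"]))
print(classify_column(["123-45-6789"]))
```

The resulting tags are what gets cataloged, socialized to stakeholders, and traced "in motion" via lineage so policies follow the data wherever it flows.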

How erwin Can Help

erwin is the only software provider with a complete, metadata-driven approach to data governance through our integrated enterprise modeling and data intelligence suites. We help customers overcome their data governance challenges, with risk management and regulatory compliance being primary concerns.

However, the erwin EDGE also delivers an “enterprise data governance experience” in terms of agile innovation and business transformation – from creating new products and services to keeping customers happy to generating more revenue.

Whatever your organization’s key drivers are, a strategic data governance approach – through  business process, enterprise architecture and data modeling combined with data cataloging and data literacy – is key to success in our modern, digital world.

If you’d like to get a handle on handling your data, you can sign up for a free, one-on-one demo of erwin Data Intelligence.

For more information on GDPR/CCPA, we’ve also published a white paper on the Regulatory Rationale for Integrating Data Management and Data Governance.

GDPR White Paper


Business Architecture and Process Modeling for Digital Transformation

At a fundamental level, digital transformation is about further synthesizing an organization’s operations and technology, so involving business architecture and process modeling is a best practice organizations cannot ignore.

This post outlines how business architecture and process modeling come together to facilitate efficient and successful digital transformation efforts.

Business Process Modeling: The First Step to Giving Customers What They Expect

Salesforce recently released the State of the Connected Customer report, with 75 percent of customers saying they expect companies to use new technologies to create better experiences. So the business and digital transformation playbook has to be updated.

These efforts must be carried out with continuous improvement in mind. Today’s constantly evolving business environment totally reinforces the old adage that change is the only constant.

Even historically reluctant-to-change banks now realize they need to innovate, adopting digital transformation to acquire and retain customers. Innovate or die is another adage that holds truer than ever before.

Fidelity International is an example of a successful digital transformation adopter and innovator. The company realized that different generations want different information and have distinct communication preferences.

For instance, millennials are adept at using digital channels, and they are the fastest-growing customer base for financial services companies. Fidelity knew it needed to understand customer needs, adapt its processes around key customer touch points, and build centers of excellence to support them.


Business Architecture and Process Modeling

Planning and working toward a flexible, responsive and adaptable future is no longer enough – the modern organization must be able to visualize not only the end state (the infamous and so-elusive “to-be”) but also perform detailed and comprehensive impact analysis on each scenario, often in real time. This analysis also needs to span multiple departments, extending beyond business and process architecture to IT, compliance and even HR and legal.

The ability of process owners to provide this information to management is central to ensuring the success of any transformation initiative. And new requirements and initiatives need to be managed in new ways. Digital and business transformation is about being able to do three things at the same time, all working toward the same goals:

  • Collect, document and analyze requirements
  • Establish all information layers impacted by the requirements
  • Develop and test the impact of multiple alternative scenarios

Comprehensive business process modeling underpins all of the above, providing the central information axis around which initiatives are scoped, evaluated, planned, implemented and ultimately managed.

Because of its central role, business process modeling must expand to modeling information from other layers within the organization, including:

  • System and application usage information
  • Supporting and reference documentation
  • Compliance, project and initiative information
  • Data usage

All these information layers must be captured and modeled at the appropriate levels, then connected to form a comprehensive information ecosystem that enables parts of the organization running transformation and other initiatives to instantly access and leverage it for decision-making, simulation and scenario evaluation, and planning, management and maintenance.

No Longer a Necessary Evil

Traditionally, digital and business transformation initiatives relied almost exclusively on human knowledge and experience regarding processes, procedures, how things worked, and how they fit together to provide a comprehensive and accurate framework. Today, technology can aggregate and manage all this information – and more – in a structured, organized and easily accessible way.

Business architecture extends beyond simple modeling; it also incorporates automation to reduce manual effort, remove the potential for error, and guarantee effective data governance – with visibility from strategy all the way down to data entry and the ability to trace and manage data lineage. It requires automation to cross-reference massive amounts of information, never before integrated, to support effective decision-making.

The above are not options that are “nice to have,” but rather necessary gateways to taking business process management into the future. And the only way to leverage them is through systemic, organized and comprehensive business architecture modeling and analysis.

Therefore, business architecture and process modeling are no longer a necessary evil. They are critical success factors to any digital or business transformation journey.

A Competitive Weapon

Experts confirm the need to rethink and revise business processes to incorporate more digital automation. Forrester notes in its report, The Growing Importance of Process to Digital Transformation, that the changes in how business is conducted are driving the push “to reframe organizational operational processes around digital transformation efforts.” In a dramatic illustration of the need to move in this direction, the research firm writes that “business leaders are looking to use process as a competitive weapon.”

If a company hasn’t done a good job of documenting its processes, it can’t realize a future in which digital transformation is part of everyday operations. It’s never too late to start, though. In a fast-moving and pressure cooker business environment, companies need to implement business process models that make it possible to visually and analytically represent the steps that will add value to the company – either around internal operations or external ones, such as product or service delivery.

erwin BP, part of the erwin EDGE Platform, enables effective business architecture and process modeling. With it, any transformation initiative becomes a simple, streamlined exercise to support distributed information capture and management, object-oriented modeling, simulation and collaboration.

To find out about how erwin can help in empowering your transformation initiatives, please click here.

data-driven business transformation


Choosing the Right Data Modeling Tool

The need for an effective data modeling tool is more significant than ever.

For decades, data modeling has provided the optimal way to design and deploy new relational databases with high-quality data sources and support application development. But it provides even greater value for modern enterprises where critical data exists in both structured and unstructured formats and lives both on premise and in the cloud.

In today’s hyper-competitive, data-driven business landscape, organizations are awash with data and the applications, databases and schema required to manage it.

For example, an organization may have 300 applications, with 50 different databases and a different schema for each. Additional challenges, such as increasing regulatory pressures – from the General Data Protection Regulation (GDPR) to the Health Insurance Portability and Accountability Act (HIPAA) – and growing stores of unstructured data also underscore the increasing importance of a data modeling tool.

Data modeling, quite simply, describes the process of discovering, analyzing, representing and communicating data requirements in a precise form called the data model. There’s an expression: measure twice, cut once. Data modeling is the upfront “measuring tool” that helps organizations reduce time and avoid guesswork in a low-cost environment.

From a business-outcome perspective, a data modeling tool is used to help organizations:

  • Effectively manage and govern massive volumes of data
  • Consolidate and build applications with hybrid architectures, including traditional, Big Data, cloud and on premise
  • Support expanding regulatory requirements, such as GDPR and the California Consumer Privacy Act (CCPA)
  • Simplify collaboration across key roles and improve information alignment
  • Improve business processes for operational efficiency and compliance
  • Empower employees with self-service access for enterprise data capability, fluency and accountability


Evaluating a Data Modeling Tool – Key Features

Organizations seeking to invest in a new data modeling tool should consider these four key features.

  1. Ability to visualize business and technical database structures through an integrated, graphical model.

Due to the number of database platforms available, it’s important that an organization’s data modeling tool supports an array of platforms sufficient for the organization’s needs. The chosen data modeling tool should be able to read the technical formats of each of these platforms and translate them into highly graphical models rich in metadata. Schema can then be deployed from models in an automated fashion and iteratively updated so that new development can take place via model-driven design.

  2. Empowering of end-user BI/analytics by data source discovery, analysis and integration.

A data modeling tool should give business users confidence in the information they use to make decisions. Such confidence comes from the ability to provide a common, contextual, easily accessible source of data element definitions to ensure they are able to draw upon the correct data; understand what it represents, including where it comes from; and know how it’s connected to other entities.

A data modeling tool can also be used to pull in data sources via self-service BI and analytics dashboards. The data modeling tool should also have the ability to integrate its models into whatever format is required for downstream consumption.

  3. The ability to store business definitions and data-centric business rules in the model along with technical database schemas, procedures and other information.

With business definitions and rules on board, technical implementations can be better aligned with the needs of the organization. Using an advanced design layer architecture, model “layers” can be created with one or more models focused on the business requirements that then can be linked to one or more database implementations. Design-layer metadata can also be connected from conceptual through logical to physical data models.

  4. Rationalize platform inconsistencies and deliver a single source of truth for all enterprise business data.

Many organizations struggle to break down data silos and unify data into a single source of truth, due in large part to varying data sources and the difficulty of managing unstructured data. Being able to model any data from anywhere accounts for this, with on-demand modeling for non-relational databases that offers speed, horizontal scalability and other real-time application advantages.

With NoSQL support, model structures from non-relational databases, such as Couchbase and MongoDB, can be created automatically. Existing Couchbase and MongoDB data sources can be easily discovered, understood and documented through modeling and visualization. Existing entity-relationship diagrams and SQL databases can be migrated to Couchbase and MongoDB too, and relational schemas can be transformed into query-optimized NoSQL constructs.
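As a rough sketch of that relational-to-document transformation — not the tool's actual algorithm, and with invented table and field names — a one-to-many relationship modeled as two tables can be denormalized into embedded documents of the kind MongoDB or Couchbase query efficiently:

```python
# Illustrative relational rows from an ER-modeled schema.
customers = [{"id": 1, "name": "Acme Corp"}]
orders = [
    {"id": 10, "customer_id": 1, "total": 99.5},
    {"id": 11, "customer_id": 1, "total": 12.0},
]

def to_documents(customers, orders):
    """Embed each customer's orders inside the customer record to
    form a query-optimized document (denormalizing the 1:N join)."""
    by_customer = {}
    for order in orders:
        by_customer.setdefault(order["customer_id"], []).append(
            {"order_id": order["id"], "total": order["total"]}
        )
    return [
        {"_id": c["id"], "name": c["name"], "orders": by_customer.get(c["id"], [])}
        for c in customers
    ]

print(to_documents(customers, orders))
```

The design trade-off is the classic one: embedding trades the flexibility of joins for single-read access to a customer and all of its orders.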

Other considerations include the ability to:

  • Compare models and databases.
  • Increase enterprise collaboration.
  • Perform impact analysis.
  • Enable business and IT infrastructure interoperability.

When it comes to data modeling, no one knows it better. For more than 30 years, erwin Data Modeler has been the market leader. It is built on the vision and experience of data modelers worldwide and is the de facto standard in data model integration.

You can learn more about driving business value and underpinning governance with erwin DM in this free white paper.

Data Modeling Drives Business Value