
Top 3 Benefits of Enterprise Architecture


Enterprise architecture (EA) benefits modern organizations in many ways. It provides a holistic, top-down view of structure and systems, making it invaluable in managing the complexities of data-driven business.

Once considered solely a function of IT, enterprise architecture has historically operated from an ivory tower. It was often siloed from the business at large, stifling the potential benefits of the holistic view it could have provided.

Now, the growing importance of EA is reflected in its evolving position in the business. Instead of being considered just a function of IT, EA now plays a leading role in bridging the gap between IT and the business.

The practice has evolved in approach, too. In the past, enterprise architecture played a foundational support role – largely focused on “keeping the lights on.”

Today its scope is more progressive and business outcome-focused, identifying opportunities for growth and change.

As a matter of fact, Gartner has said that EA is becoming a “form of internal management consulting” because it helps define and shape business and operating models, identify risks and opportunities, and create technology roadmaps to suit.

Analyst firm Ovum also recognizes EA’s evolution, referring to today’s EA as AE, or “architect everything,” further demonstrating its newfound scope.


Top Three Enterprise Architecture Benefits

Of course, enterprise architecture can’t sit at the strategy table without results. Following are what we believe to be the top three benefits of enterprise architecture:

1. Managing complexity

Modern organizations are a complicated mesh of different systems and applications of varying degrees of importance and prominence.

The top-down, holistic view of an organization provided by enterprise architecture means organizations can more efficiently and confidently assess such assets. For example, impact analysis might identify areas where an organization can streamline its tech stack and cut costs.

It might uncover redundancies where multiple applications address the same process.

Alternatively, impact analysis might find that a seemingly less prominent application is actually integral to operations – a crucial insight when leadership is considering phasing it out.
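
To make both checks concrete, here is a minimal sketch assuming a hypothetical map of business processes to the applications that support them (the names and structure are purely illustrative, not an erwin API):

```python
from collections import Counter

# Hypothetical map of business processes to the applications
# supporting them (illustrative only)
process_apps = {
    "invoicing": ["SAP", "LegacyBilling"],      # two apps, one process
    "order_entry": ["SAP"],
    "shipping_labels": ["LabelPrinterSvc"],
    "returns": ["LabelPrinterSvc"],
}

# Redundancy check: multiple applications addressing the same process
redundant = {p: apps for p, apps in process_apps.items() if len(apps) > 1}
print("Rationalization candidates:", redundant)

# Integrality check: how many processes break if an application is retired?
usage = Counter(app for apps in process_apps.values() for app in apps)
print("Processes per application:", usage)
```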

In short, enterprise architecture helps business and IT leaders capture, understand and articulate opportunities, challenges and risks – including security.

2. Supporting the creation of actionable, signature-ready EA deliverables

As well as assessing an organization’s current capabilities, the holistic, top-down view provided by enterprise architecture also helps identify gaps.

A better understanding of its enterprise architecture means an organization can make more informed investment decisions – knowing not only what it should invest in, but also when.

More pressing concerns can be identified first, and roadmaps can be created to reflect the organization’s priorities.

This approach helps an organization meet its current operational demands and opportunities while navigating and mitigating disruptions – all in accordance with its longer-term strategic vision.

3. Increasing agility and speeding time to value

In the era of rapidly evolving technology and rampant – often disruptive – digital transformation, the need for enterprise architecture tools is abundantly clear. Organizations with a healthy understanding of their enterprise architecture are better equipped to evaluate and implement new technology in a timely and efficient manner. 

EA tools accelerate analysis and decision support for alternative investment, rationalization and optimization opportunities and plans, and for assessing risk, change and their impact on the organization.

Maturing Enterprise Architecture

To reap such benefits of this new approach to EA, many organizations will have to work to mature their practices.

To be effective, business outcome-focused enterprise architecture needs to be consistent. It needs to be communicable and discernible. It needs to be up to date and accurate.

For many organizations, these standards have been impossible to meet because their enterprise architectures are burdened by systems that were not fit for purpose.

Basic visualization tools, spreadsheets and even word processors have typically stood in for dedicated EA solutions. These non-purpose-built systems lack support for the industry standards needed to accurately capture and align business and IT elements and the ways they link together.

Additionally, collaboration was often marred by outdated and even disparate file versions and types, because businesses lacked the systems necessary to continuously and methodically maintain models, frameworks and concepts as they evolve.

Therefore, a key milestone in maturing a modern enterprise architecture initiative is developing a single source of truth, consistent across the enterprise. This requires the implementation of a dedicated, centralized and collaborative enterprise architecture tool, be that on-premise or via the cloud.

Of course, such a tool should cover enterprise architecture’s legacy capabilities and expectations. Those include support for industry standard frameworks and notation, the ability to perform impact analysis and the streamlining of systems and applications.

But to mature the practice, organizations should implement an EA tool with a shared, centralized metadata repository and role-based access.

It should have the ability to share an integrated set of views and information on strategy, business capabilities, applications, information assets, technologies, etc., to help provide stakeholders with a thorough understanding of the enterprise.

Once this milestone has been met, organizations can really begin to enjoy the benefits of enterprise architecture, in the modern, data-driven business context.

If the benefits of enterprise architecture would help your business, and you’d like to be the next erwin EA success story, try erwin’s enterprise architecture and business process modeling software for free.


Benefits of Data Vault Automation

The benefits of Data Vault automation range from the more abstract – like improving data integrity – to the tangible – such as clearly identifiable savings in cost and time.

So Seriously … You Should Automate Your Data Vault

By Danny Sandwell

Data Vault is a methodology for architecting and managing data warehouses in complex data environments where new data types and structures are constantly introduced.

Without Data Vault, data warehouses are difficult and time-consuming to change, causing latency issues and slowing time to value. In addition, the queries required to maintain historical integrity are complex to design and slow to run, causing performance issues and potentially incorrect results, because the ability to understand relationships between historical snapshots of data is lacking.

In his blog, Dan Linstedt, the creator of Data Vault methodology, explains that Data Vaults “are extremely scalable, flexible architectures” enabling the business to grow and change without “the agony and pain of high costs, long implementation and test cycles, and long lists of impacts across the enterprise warehouse.”

With a Data Vault, new functional areas typically are added quickly and easily, with changes to existing architecture taking less than half the traditional time with much less impact on the downstream systems, he notes.

Astonishingly, nearly 20 years since the methodology’s creation, most Data Vault design, development and deployment phases are still handled manually. But why?

Traditional manual efforts to define the Data Vault population and create ETL code from scratch can take weeks or even months. The entire process is time-consuming, slowing down the data pipeline, and often riddled with human error.

On the flip side, automating the development and deployment of design changes and the resulting data movement code means companies can accelerate development and deployment in a timely and cost-effective manner.
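
As a rough illustration of what metadata-driven automation means in practice, the toy sketch below generates hub and satellite DDL from a single hypothetical metadata entry, so a design change becomes a metadata edit rather than hand-written SQL. This is a simplified sketch of the general approach, not the output or API of any particular automation tool:

```python
# Toy metadata describing one source entity (names are hypothetical)
sources = {
    "customer": {
        "business_key": "customer_no",
        "attributes": ["name", "segment", "country"],
    },
}

def generate_ddl(entity, meta):
    """Generate hub and satellite DDL from one metadata entry."""
    hub = (
        f"CREATE TABLE hub_{entity} (\n"
        f"  {entity}_hk CHAR(32) PRIMARY KEY,  -- hash of the business key\n"
        f"  {meta['business_key']} VARCHAR(100),\n"
        f"  load_dts TIMESTAMP,\n"
        f"  record_source VARCHAR(50)\n"
        f");"
    )
    attr_cols = ",\n  ".join(f"{a} VARCHAR(100)" for a in meta["attributes"])
    sat = (
        f"CREATE TABLE sat_{entity} (\n"
        f"  {entity}_hk CHAR(32),\n"
        f"  load_dts TIMESTAMP,\n"
        f"  {attr_cols},\n"
        f"  PRIMARY KEY ({entity}_hk, load_dts)\n"
        f");"
    )
    return hub, sat

for entity, meta in sources.items():
    for statement in generate_ddl(entity, meta):
        print(statement)
```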


Benefits of Data Vault Automation – A Case Study …

Global Pharma Company Saves Considerable Time and Money with Data Vault Automation

Let’s take a look at a large global pharmaceutical company that switched to Data Vault automation with staggering results.

Like many pharmaceutical companies, it manages a massive data warehouse combining clinical trial, supply chain and other mission-critical data. The company had chosen a Data Vault schema for its flexibility in handling change but found creating the hub-and-satellite structure incredibly laborious.

They needed to accelerate development, as well as aggregate data from different systems for internal customers to access and share. Additionally, the company needed lineage and traceability for regulatory compliance efforts.

With this ability, they can identify data sources, transformations and usage to safeguard protected health information (PHI) for clinical trials.

After an initial proof of concept, they deployed erwin Data Vault Automation and generated more than 200 tables, jobs and processes with 10 to 12 scripts. The highly schematic structure of the models enabled large portions of the modeling process to be automated, dramatically accelerating Data Vault projects and optimizing data warehouse management.

erwin Data Vault Automation helped this pharma customer automate the complete lifecycle – accelerating development while increasing consistency, simplicity and flexibility – to save considerable time and money.

For this customer, the benefits of Data Vault automation were as follows:

  • Saving an estimated 70% of the costs of manual development
  • Generating 95% of the production code with “zero touch,” improving time to business value and significantly reducing the costly rework associated with error-prone manual processes
  • Increasing data integrity, even for new requirements and use cases and regardless of changes to the warehouse structure, because legacy source data doesn’t degrade
  • Creating a sustainable approach to Data Vault deployment, ensuring the agile, adaptable and timely delivery of actionable insights to the business in a well-governed facility for regulatory compliance, including full transparency and ease of auditability

Homegrown Tools Never Provide True Data Vault Automation

Many organizations use some form of homegrown tool or standalone application. However, such tools don’t integrate with other tools and components of the architecture, they’re expensive, and quite frankly, they make it difficult to derive any meaningful results.

erwin Data Vault Automation centralizes the specification and deployment of Data Vault architectures for better control and visibility of the software development lifecycle. erwin Data Catalog makes it easy to discover, organize, curate and govern data being sourced for and managed in the warehouse.

With this solution, users select data sets to be included in the warehouse and fully automate the loading of Data Vault structures and ETL operations.

erwin Data Vault Smart Connectors eliminate the need for a business analyst and ETL developers to repeat mundane tasks, so they can focus on choosing and using the desired data instead. This saves considerable development time and effort plus delivers a high level of standardization and reuse.

After the Data Vault processes have been automated, the warehouse is well documented with traceability from the marts back to the operational data to speed the investigation of issues and analyze the impact of changes.

Bottom line: if your Data Vault integration is not automated, you’re already behind.

If you’d like to get started with erwin Data Vault Automation or request a quote, you can email consulting@erwin.com.


Business Process Can Make or Break Data Governance

Data governance isn’t a one-off project with a defined endpoint. It’s an on-going initiative that requires active engagement from executives and business leaders.

Data governance today comes down to the ability to understand critical enterprise data within a business context, track its physical existence and lineage, and maximize its value while ensuring quality and security.


Historically, little attention has focused on what can literally make or break any data governance initiative – turning it into either a launchpad for competitive advantage or a recipe for disaster. Data governance success hinges on business process modeling and enterprise architecture.

To put it even more bluntly, successful data governance* must start with business process modeling and analysis.

*See: Three Steps to Successful & Sustainable Data Governance Implementation


Passing the Data Governance Ball

For years, data governance was the volleyball passed back and forth over the net between IT and the business, with neither side truly owning it. However, once an organization understands that IT and the business are both responsible for data, it needs to develop a comprehensive, holistic strategy for data governance that is capable of four things:

  1. Reaching every stakeholder in the process
  2. Providing a platform for understanding and governing trusted data assets
  3. Delivering the greatest benefit from data wherever it lives, while minimizing risk
  4. Helping users understand the impact of changes made to a specific data element across the enterprise

To accomplish this, a modern data governance strategy needs to be interdisciplinary to break down traditional silos. Enterprise architecture is important because it aligns IT and the business, mapping a company’s applications and the associated technologies and data to the business functions and value streams they enable.


The business process and analysis component is vital because it defines how the business operates and ensures employees understand and are accountable for carrying out the processes for which they are responsible. Enterprises can clearly define, map and analyze workflows and build models to drive process improvement, as well as identify business practices susceptible to the greatest security, compliance or other risks and where controls are most needed to mitigate exposures.

Slow Down, Ask Questions

In a rush to implement a data governance methodology and system, organizations can forget that a system must serve a process – and be governed/controlled by one.

To choose the correct system and implement it effectively and efficiently, you must know – in every detail – all the processes it will impact. You need to ask these important questions:

  1. How will it impact them?
  2. Who needs to be involved?
  3. When do they need to be involved?

These questions are the same ones we ask in data governance. They involve impact analysis, ownership and accountability, control and traceability – all of which effectively documented and managed business processes enable.

Data sets are not important in and of themselves. Data sets become important in terms of how they are used, who uses them and what their use is – and all this information is described in the processes that generate, manipulate and use them. So unless we know what those processes are, how can any data governance implementation be complete or successful?

Processes need to be open and shared in a concise, consistent way so all parts of the organization can investigate, ask questions, and then add their feedback and information layers. In other words, processes need to be alive and central to the organization because only then will the use of data and data governance be truly effective.

A Failure to Communicate

Consider this scenario: We’ve perfectly captured our data lineage, so we know what our data sets mean, how they’re connected, and who’s responsible for them – not a simple task but a massive win for any organization. Now a breach occurs. Will any of the above information tell us why it happened? Or where? No! It will tell us what else is affected and who can manage the data layer(s), but unless we find and address the process failure that led to the breach, it is guaranteed to happen again.

By knowing where data is used – the processes that use and manage it – we can quickly, even instantly, identify where a failure occurs. Starting with data lineage (meaning our forensic analysis starts from our data governance system), we can identify the source and destination processes and the associated impacts throughout the organization.

We can know which processes need to change and how. We can anticipate the pending disruptions to our operations and, more to the point, the costs involved in mitigating and/or addressing them.

But knowing all the above requires that our processes – our essential and operational business architecture – be accurately captured and modeled. Instituting data governance without processes is like building a castle on sand.

Rethinking Business Process Modeling and Analysis

Modern organizations need a business process modeling and analysis tool with easy access to all the operational layers across the organization – from high-level business architecture all the way down to data.

Such a system should be flexible, adjustable, easy-to-use and capable of supporting multiple layers simultaneously, allowing users to start in their comfort zones and mature as they work toward their organization’s goals.

The erwin EDGE is one of the most comprehensive software platforms for managing an organization’s data governance and business process initiatives, as well as its entire data architecture. It allows natural, organic growth throughout the organization, and its assimilation of data governance and business process management under the same platform provides a unique data governance experience thanks to an integrated, collaborative approach.

Start your free, cloud-based trial of erwin Business Process and see how some of the world’s largest enterprises have benefited from its centralized repository and integrated, role-based views.

We’d also be happy to show you our data governance software, which includes data cataloging and data literacy capabilities.


Internal Business Process Modeling: The Secret Behind Exponential Organizations

Strong internal business process modeling and management helps data-driven organizations compete and lead

In short, an internal business process is a documented account of how things should be done to maximize efficiency and achieve a particular goal.

In the book “Exponential Organizations,” authors Salim Ismail, Michael S. Malone and Yuri van Geest examine how every company is or will evolve into an information-based entity in which costs fall to nearly zero, abundance replaces scarcity and only “exponential organizations” survive.

It’s not news that exponential organizations like Uber, Airbnb and Netflix have flipped the script on disrupting traditional industries like taxis, hotels and video rentals/TV viewing.

But now, even traditional industries like healthcare and financial services, which were historically slow to innovate, are transforming at breakneck speed.

Let’s face it: in today’s hyper-competitive markets, the traditional approach of relying on legacy strengths or inertia for survival simply won’t work.

The days of enterprises focusing almost exclusively on rigid structures, centralized management and accountability; concentrated knowledge; service mainly to external customers; and reactive, short-term strategy alignment driven mainly by massive-scale projects are antiquated.

The information within your organization’s internal business processes is where the data your company collects, creates, stores and analyzes actually transforms into something that makes your company go, hopefully for the long haul.


The Value of Internal Business Process Modeling

Organizations are built on a series of internal business processes. The complexity of modern data-driven organizations requires processes to work in tandem to create and sustain value.

The degree to which any individual internal business process drives value can vary, but even the most seemingly mundane processes are part of a collective sum greater than its parts.

Therefore, it’s critical for organizations to map their internal business processes to understand how a given action relates to the organization’s overall strategy and goals.

Such knowledge is at the core of exponential organizations. They understand how any given internal business process relates to value creation, making it far easier to assess what’s currently working, identify areas for improvement and spot potential for competitive differentiation.

Exponential organizations also are better positioned to respond and adapt to disruptive forces, such as 5G. This is because understanding what and how you do things now makes it easier to implement change in an agile manner.


How do you join the ranks of exponential organizations? And where do you begin your journey to becoming an information-based entity?

Attitude Adjustment

More and more organizations are realizing they need to adjust their traditional thinking and subsequent actions, even if just a bit, to gain strategic advantage, reduce costs and retain market dominance. For example:

  1. Structures are becoming more adaptable, allowing for greater flexibility and cost management. How is this possible and why now? Organizations are grasping that effective, well-managed and documented internal business processes should form their operational backbones.
  2. Business units and the departments within them are becoming accountable not only for their own budgets but also for how well they achieve their goals. This is possible because their responsibilities and processes can be clearly defined, documented and then monitored to ensure their work is executed in a repeatable, predictable and measurable way.
  3. Knowledge is now both centralized and distributed thanks to modern knowledge management systems. Central repositories and collaborative portals give everyone within the organization equal access to the data they need to do their jobs more effectively and efficiently.
  4. And thanks to all the above, organizations can expand their focus from external customers to internal ones as well. By clearly identifying individual processes (and their cross-business handover points) and customer touchpoints, organizations can interact with any customer at the right point with the most appropriate resources.

Benefits of Internal Business Process Modeling and Management

One of the main benefits of a process-based organizational engine is the ability to better handle outside pressures, such as new regulations. Once processes (and their encompassing business architecture) become central to the organization, a wide array of things become simpler, faster and cheaper.

Another benefit is that application design – the holy grail or black hole of budgetary spending and project management, depending on your point of view – is streamlined, with requirements clearly gathered and managed in perfect correspondence to the processes they serve, and with the data they manage clearly documented and communicated to developers.

Testing occurs against real-life scenarios by the responsible parties as documented by the process owners – a drastic departure from the more traditional approaches in which the responsibility fell to designated, usually technical application owners.

Finally – and most important – data governance is no longer the isolated domain of data architects but central to the everyday processes that make an organization tick. As processes have stakeholders who use information – data – the roles of technical owners and data stewards become integral to ensuring processes operate efficiently, effectively and – above all – without interruptions. On the other side of this coin, data owners and data stewards no longer operate in their own worlds, distant from the processes their data supports.

Carpe Process

All modern organizations should seize on business process as a central component of their operations – with data governance alongside it, and cost management as a third driver for the enterprise machine. But as we all know, it takes more than stable connecting rods to make an engine work – it needs cogs and wheels, belts and multiple power sources, all working together.

In the traditional organization, people are the internal mechanics. These days, powerful and flexible workflow engines provide much-needed automation for greater visibility plus more power, stability and quality – all the things a machine needs to operate as required/designed.

Advanced process management systems are becoming essential, not optional. And while not as sexy or attention-grabbing as other technologies, they provide the power to drive an organization toward its goals quickly, cost-effectively and efficiently.

To learn how erwin can empower a modern, process-based organization, please click here.


5G Roadmap: Preparing Your Enterprise Architecture

Why planning your 5G roadmap requires significant input from enterprise architects

5G is coming, bringing with it the promise to transform any industry. And while the focus has been on the benefits to consumers, the effects on the enterprise are far-reaching.

Few examples of emerging technology have more potential to disrupt and downright revolutionize certain markets and processes than 5G.

For enterprise architects, it’s important to understand how a potentially disruptive emerging technology like 5G might be incorporated into an organization, in advance.

A 5G roadmap could be the difference between such disruptions being an obstruction or an opportunity.

As with any emerging technology, organizations need to test and pilot their projects to answer some important questions before going into production:

  • How do these technologies disrupt?
  • How do they provide value?

While the transition from 3G to 4G wasn’t all that eventful – or all that long ago – 5G is expected to buck the trend.

But how exactly?

5G: What to expect

5G promises dramatically faster download and upload speeds and reduced latency.

For context, average 4G speeds peak at around 45 Mbps (megabits per second); the industry goal for 5G is 1 Gbps (gigabit per second, or 1,000 Mbps).

Chipmaker Qualcomm believes real-world applications of 5G could be 10 to 20 times faster than that.

For consumers, this will mean dramatically faster downloads and uploads. Currently, downloading a two-hour movie takes around six minutes on 4G. A 5G connection would achieve the same in just 3.6 seconds.
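
Those figures are easy to sanity-check. Assuming sustained throughput and ignoring protocol overhead, six minutes at 45 Mbps implies a file of roughly 16,200 megabits (about 2 GB), and the quoted 3.6-second figure implies an effective rate of around 4.5 Gbps:

```python
# Back-of-the-envelope check (assumes sustained throughput, no overhead)
movie_megabits = 6 * 60 * 45          # 6 min at 45 Mbps = 16,200 Mb (~2 GB)
print(movie_megabits / 1_000)         # at the 1 Gbps goal: 16.2 seconds
print(movie_megabits / 4_500)         # ~4.5 Gbps yields the quoted 3.6 s
```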

Organizations will, of course, enjoy the same benefits but will be burdened by the need to manage new levels of data, starting with telecommunications companies (telcos).

5G – A disruptive force vs. a catalyst for disruption

Usually, when we think of emerging disruptive technologies, the technology (or process, product, etc.) itself is the primary cause of the disruption.

With 5G, that’s still somewhat true. At least for telcos …

For example, 5G-driven disruption is forcing telecommunications companies to upgrade their infrastructure to cope with new volumes and velocities of data.

On a base level, these higher data volumes and velocities will be attributable to the fact that by making something happen faster, more of it can happen in a shorter amount of time.

But the increase in data speeds will be a catalyst for turning products and services that are not currently feasible into completely viable offerings in the near future.

Of course, enterprise architecture is already integral to organizations with Internet of Things (IoT) devices in their portfolios.


But companies involved in the internet-connected product market, as well as telcos, will need a 5G roadmap to ensure their enterprise architectures can cope with the additional data burden.

In addition to faster connection speeds, 5G will grant telcos more control over networks.

One such example of this control is the potential for network slicing, whereby multiple virtual networks can be generated within one physical 5G network, in turn allowing greater control of the service provided.

For example, self-driving cars would benefit from a network slice that offered exceptionally fast, low-latency connections to better accommodate their real-time data processing and transmitting needs.

Such a setup would go to waste on less-interactive, internet-connected devices. A smart fridge, for example, could make do with far slower connection speeds.
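
As a purely hypothetical sketch of the slicing idea, the snippet below models two slices with different service guarantees and routes a device to the least demanding slice that still meets its latency needs (the figures are illustrative, not drawn from any 5G specification):

```python
from dataclasses import dataclass

# Hypothetical slice profiles -- illustrative figures only
@dataclass
class NetworkSlice:
    name: str
    max_latency_ms: float
    min_bandwidth_mbps: float

slices = [
    NetworkSlice("vehicle_telemetry", max_latency_ms=5, min_bandwidth_mbps=100),
    NetworkSlice("smart_appliances", max_latency_ms=500, min_bandwidth_mbps=1),
]

def assign_slice(required_latency_ms):
    """Pick the least resource-hungry slice that still meets the need."""
    candidates = [s for s in slices if s.max_latency_ms <= required_latency_ms]
    return min(candidates, key=lambda s: s.min_bandwidth_mbps, default=None)

print(assign_slice(10).name)     # self-driving car -> vehicle_telemetry
print(assign_slice(1000).name)   # smart fridge -> smart_appliances
```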

This would mean telecommunications companies would start to look more like public-cloud providers and offer scalable services to their user bases.

However, realizing this potential would require more agile-oriented infrastructures than telcos typically have – which will of course require further input from enterprise architects to encourage an efficient implementation.

Another red pin to account for on the 5G roadmap.

So the answer to “Is 5G a disruptive force in and of itself, or is it a catalyst for disruption?” is actually … well, both. Telcos are directly impacted by 5G disruption, while IoT product/service providers and digital business as a whole are disrupted by what 5G ultimately enables.

What does this mean for enterprise architects?

As addressed above, many of the business benefits of 5G are directly tied to increasing the amount of data that can be transferred at one time.

This presents a number of challenges for enterprise architects going forward.

As well as the increased volume of data itself, enterprise architects will need to prepare for faster times to market.

Radically improved data transfer speeds will encourage more agile product rollouts and updates, especially in connected devices that will feed back data insights about their performance.

The reduced latency will likely lead to a new influx of remote-working and collaboration-enabling tools, as well as products and services currently unaccounted for. Organizations with more agile enterprise architectures will be better placed to implement these smoothly when the time comes.

To better understand how your organization can prepare for 5G by adopting an agile enterprise architecture approach, click here.


Software Deployment Strategy: How to Get It Right the First Time

Big or Small, Enterprise Architecture Is a Key Part of a Successful Software Deployment Strategy

A good software deployment strategy could be the difference between multiple and costly false starts and a smooth implementation. Considering the rate at which emerging technologies are introduced, it’s becoming more important than ever for organizations to have a software deployment strategy in place.

But what does it involve?

Not all software deployments and investments are equal. Large-scale, big-money investments like ERP require a lot of resources and planning. Small-scale investments, like website technology, on the other hand, can be purchased, expensed and deployed with few people knowing. And of course, there are thousands of software decisions made that fall somewhere in between.

Software purchase decisions and deployments represent an opportunity to leverage the experience and knowledge of your enterprise architecture (EA) team so you can make smarter, better investments. The key here is the EA team’s complete view of your IT landscape, which can help eliminate redundant purchases, identify issues of integration and more.


Small Projects Can Create Big Headaches

Here’s an example of how a small-scale software investment can wreak havoc on an organization.

There is an intense focus today on customer experience (CX). Ensuring that your website visitors have access to the information they want, and they can find it quickly and easily, is just part of your overall CX. This makes your customer-facing technologies – the ones that power your website or mobile app – critical investments, even though they may not carry the price tag of an ERP system.

Even the smallest investments need to be vetted to make sure they work with existing infrastructure and processes. One small piece of website tech that ends up degrading your online CX can cost your organization millions in a very short amount of time. There are simply too many choices just a click away today if something isn’t working properly. Differentiating technologies are also more likely to be customized than an application like ERP, which can often use a number of out-of-the-box processes.

These are areas where a software deployment strategy involving your EA team can help guide the software purchase and deployment process. But even in a world where software deployments increasingly mean logging into a cloud-based SaaS application, a software deployment strategy is still beneficial.

Don’t Be Resigned to Failure

Many SaaS vendors like to talk about how easy it is to get up and running with their products, especially when the infrastructure elements are in the cloud. But the reality is that the network that connects to the SaaS application, the security, the integrations with existing (often on-premise) applications, the SLAs and licensing, can all benefit from a review by the EA team.

Failed software deployments are, in fact, a significant problem for many organizations. Such failures can often be attributed to a lack of planning and foresight.

Considering the costs associated with some software – including its purchase, implementation and consultancy fees/training required to get started – a good software deployment strategy could save millions … literally.

A Gartner study found that nearly half (46 percent) of respondents said their most expensive, time-intensive software deployments were not delivering. When Gartner broke the software purchases in question into deal sizes of over and under $1 million, the firm got similar results.

When your EA team has the visibility to see across your IT landscape and understand the business processes built on your technology, it can help provide a better idea of the real costs behind your software deployments, and you can better estimate your time to value. When it comes to software investments, you don’t have to be resigned to failure.

erwin EA gives organizations a full-featured, versatile platform for enterprise architecture in its broadest sense to ensure the success of projects – regardless of their size or scope.

Start your free trial of erwin EA now.


The Design Thinking Process: Five Stages to Solving Business Problems

The design thinking process is a method of encouraging and improving creative problem-solving.

The design thinking process is by no means new.

John Edward Arnold, a professor of mechanical engineering and business administration, was one of the first to discuss the concept, as early as the 1950s.

But the wave of digital and data-driven business has created new opportunities for the design thinking process to be applied.

For example, your business is likely collecting, storing and analyzing more information than ever before.

And while the intense focus on analytics in recent years has been good for many businesses, it’s important to remember the human element of making decisions and solving problems.

So with that in mind, the design thinking process can be used to bridge the gap between the data and the people.

But what is the design thinking process, exactly? And how does it work?

Design Thinking Definition: The Five Stages of the Design Thinking Process

There are lots of ways to harness ideas and solve problems. Design thinking is one means to foster and refine creative problem-solving.

While it doesn’t suggest ignoring your data, design thinking is, at its core, human-centered. It encourages organizations to focus on the people they’re creating for in hopes of producing better products, services and internal processes.


There are five stages in the design thinking process:

1. Empathize – The first stage of the design thinking process is gaining a better understanding of what problems need solving. It puts the end user you are trying to help first and encourages you to work backwards. By consulting and subsequently empathizing with the end user, you ensure your eventual solution is goal-oriented, increasing the likelihood of its effectiveness.

2. Define the problem – Once you have a better understanding of potential issues, it’s time to get specific. At this point, it’s good practice to translate the problem into a “problem statement” – a concise description of the issue that identifies the current state you wish to address and the desired future state you intend to reach.

3. Ideate solutions – This is the time to get creative. Once you have a solid understanding of the problem, you can brainstorm ideas to bridge the gap between the current state and the desired future state.

4. Prototype – At stage four, it’s time to implement the ideas from stage three in the real world. Typically, the prototype will be a scaled-down example of the solution – or ideally, possible solutions. It goes without saying, but things are rarely perfect in their first iteration, as you’ll likely discover in the next stage.

5. Test – At this point, it’s time to test whether the proposed solution works. In the case of multiple potential solutions, this stage can identify which is most effective and/or efficient. It’s also an opportunity to assess what – if any – new problems the solution might cause.

With this in mind, it’s important to remember that progression through the five stages of the design thinking process isn’t necessarily linear.

Unsuccessful tests could lead your team back to the ideation stage. In some cases, you may want to circle back to stage one and test your new solution with end users. Then you’ll be able to better empathize with them and understand how your solution might work in practice.

It’s also important to understand that the design thinking process is not, strictly speaking, the same as innovation. It’s an approach to problem-solving that may ultimately involve innovation or emerging technologies, but innovation is not inherently required.

Design thinking is an iterative process, and the best solutions that come out of it in many organizations will become part of their enterprise architectures.

Incorporating Design Thinking into Your Organization with Enterprise Architecture

The best way to put design thinking into use in your organization is by creating a strategic planning approach that takes ideas from assessment to analysis to delivery.

By employing an iterative approach with a thorough assessment and a feedback loop, everyone in your organization will feel more empowered and engaged.

The reality of business today is that nearly every business problem is going to have a technological solution.

It will fall to the IT organization to take the ideas that come out of your design thinking and figure out how to deliver them as solutions at scale and speed.

This is where enterprise architecture comes into play.

Evaluating, planning and deploying a business solution will require visibility. How will these solutions impact users? Can they be supported by the existing IT infrastructure? How do they fit into the business ecosystem?

When it comes to these important questions, the best place to get answers is from your enterprise architecture team. Be sure to make them a central part of your design thinking process.

In addition to enterprise architecture software, erwin also provides enterprise architecture consulting. You can learn more about those services here.

You also can try all the current features of erwin EA for free via our secure, cloud-based trial environment.


Managing Ideation and Innovation with Enterprise Architecture

Organizations largely recognize the need for enterprise architecture tools, yet some still struggle to communicate their value and prioritize such initiatives.

As data-driven business thrives, organizations will have to overcome these challenges because managing IT trends and emerging technologies makes enterprise architecture (EA) increasingly relevant.

“By 2021, 40 percent of organizations will use enterprise architects to help ideate new business innovations made possible by emerging technologies,” says Marcus Blosch, Vice President Analyst, Gartner.

With technology now vital to every aspect of the business, enterprise architecture tools and EA as a function help generate and evaluate ideas that move the business forward.

Every business has its own (often ad hoc) way of gathering ideas and evaluating them to see how they can be implemented and what it would take to deploy them.

But organizations can use enterprise architecture tools to bridge the gap between ideation and implementation, making more informed choices in the process.

By combining enterprise architecture tools with the EA team’s knowledge in a process for managing ideas and innovation, organizations can be more strategic in their planning.

Emerging technologies are one of the key areas in which such a process benefits an organization. The timely identification of emerging technologies can make or break a business. The more thought that goes into planning when and how to use emerging technologies, the better the implementation – which leads to better outcomes and greater ROI.


Enterprise Architecture Tools: The Fabric of Your Organization

At the 2019 Gartner Enterprise Architecture & Technology Innovation Summit, Gartner identified 10 emerging and strategic technology trends that will shape IT in the coming years.

They included trends that utilize intelligence, such as autonomous things and augmented analytics; digital trends like empowered edge and immersive experiences; mesh trends like blockchain and smart spaces; as well as broad concepts like digital ethics and privacy, and quantum computing.

As these trends develop into applications or become part of your organization’s fabric, you need to think about how they can help grow your business in the near and long term. How will your business investigate their use? How will you identify the people who understand how they can be used to drive your business?

Many organizations lack a structured approach for gathering and investigating employee ideas, especially those around emerging technologies. This creates two issues:

1. When employee ideas fall into a black hole where they don’t get feedback, the employees become less engaged.

2. The emerging technology and its implementation are disconnected, which leads to silos or wasted resources.

How Enterprise Architecture Tools Help Communicate the Value of Emerging Technologies

When your enterprise architecture is aligned with your business outcomes, it provides a way to help your business ideate and investigate the viability of ideas on both the technical and business levels. When aligned correctly, emerging technologies can be evaluated based on how they meet business needs and what the IT organization must do to support them.

But the only way you can accurately make those determinations is by having visibility into your IT services and the application portfolio. And that’s how enterprise architecture can help communicate the value of emerging technologies in your organization.

erwin EA provides a way to quickly and efficiently understand opportunities offered by new technologies, process improvements and portfolio rationalization and translate them into an actionable strategy for the entire organization.

Take erwin EA for a free spin thanks to our secure, cloud-based trial.


Top 5 Data Catalog Benefits

A data catalog benefits organizations in a myriad of ways. With the right data catalog tool, organizations can automate enterprise metadata management – including data cataloging, data mapping, data quality and code generation for faster time to value and greater accuracy for data movement and/or deployment projects.

Data cataloging helps curate internal and external datasets for a range of content authors. Gartner says this doubles business benefits and ensures effective management and monetization of data assets in the long term if linked to broader data governance, data quality and metadata management initiatives.

With this in mind, the importance of data cataloging is only growing. In the regulated data world (GDPR, HIPAA, etc.), organizations need a good understanding of their data lineage – and the data catalog benefits to data lineage are substantial.

Data lineage is a core operational business component of data governance technology architecture, encompassing the processes and technology to provide full-spectrum visibility into the ways data flows across an enterprise.

There are a number of different approaches to data lineage. Here, I outline the common approach, and the approach incorporating data cataloging – including the top 5 data catalog benefits for understanding your organization’s data lineage.


Data Lineage – The Common Approach

The most common approach for assembling a collection of data lineage mappings traces data flows in reverse. The process begins with the target, or data endpoint, and then traverses the processes, applications and ETL tasks in reverse from the target.

For example, to determine the mappings for the data pipelines populating a data warehouse, a data lineage tool might begin with the data warehouse and examine the ETL tasks that immediately precede the loading of the data into the target warehouse.

The data sources that feed the ETL process are added to a “task list,” and the process is repeated for each of those sources. At each stage, the discovered pieces of lineage are documented. At the end of the sequence, the process will have reverse-mapped the pipelines for populating that warehouse.
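
A minimal sketch of this reverse traversal, assuming data flows are available as a simple dictionary mapping each system to the sources that feed it (an illustrative structure, not any lineage tool’s API):

```python
from collections import deque

def reverse_lineage(target, feeds_into):
    """Walk a graph of data flows backward from a target system."""
    mappings = []                  # discovered (source -> destination) edges
    task_list = deque([target])
    seen = {target}
    while task_list:
        destination = task_list.popleft()
        for source in feeds_into.get(destination, []):
            mappings.append((source, destination))
            if source not in seen:   # repeat the process for each source
                seen.add(source)
                task_list.append(source)
    return mappings

# Example: a warehouse loaded from a staging area fed by two OLTP systems
flows = {
    "warehouse": ["staging"],
    "staging": ["orders_db", "crm_db"],
}
print(reverse_lineage("warehouse", flows))
# [('staging', 'warehouse'), ('orders_db', 'staging'), ('crm_db', 'staging')]
```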

While this approach does produce a collection of data lineage maps for selected target systems, there are some drawbacks.

  • First, this approach focuses only on assembling the data pipelines populating the selected target system but does not necessarily provide a comprehensive view of all the information flows and how they interact.
  • Second, this process produces the information that can be used for a static view of the data pipelines, but the process needs to be executed on a regular basis to account for changes to the environment or data sources.
  • Third, and probably most important, this process produces a technical view of the information flow, but it does not necessarily provide any deeper insights into the semantic lineage, or how the data assets map to the corresponding business usage models.

A Data Catalog Offers an Alternate Data Lineage Approach

An alternate approach to data lineage combines data discovery and the use of a data catalog that captures data asset metadata with a data mapping framework that documents connections between the data assets.

This data catalog approach also takes advantage of automation, but in a different way: using platform-specific data connectors, the tool scans the environments where data assets are stored and imports each asset’s metadata into the data catalog.

When data asset structures are similar, the tool can compare data element domains and value sets, and automatically create the data mapping.
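
A highly simplified illustration of that comparison: match columns across two assets when the names agree and the sampled value sets overlap sufficiently. Real catalog connectors use much richer profiling, so treat the threshold and data structures here as placeholders:

```python
# Match columns across two assets by name and value-set overlap
def auto_map(source_cols, target_cols, threshold=0.5):
    mappings = []
    for name, src_values in source_cols.items():
        tgt_values = target_cols.get(name)
        if tgt_values is None:
            continue
        overlap = len(src_values & tgt_values) / max(len(src_values), 1)
        if overlap >= threshold:
            mappings.append((name, overlap))
    return mappings

# Hypothetical sampled value sets from two assets
crm = {"country": {"US", "DE", "FR"}, "status": {"active", "closed"}}
dwh = {"country": {"US", "DE", "JP"}, "status": {"A", "C"}}
print(auto_map(crm, dwh))   # country matches (~0.67); status does not
```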

In turn, the data catalog approach performs data discovery using the same data connectors to parse the code involved in data movement, such as major ETL environments and procedural code – basically any executable task that moves data.

The information collected through this process is reverse engineered to create mappings from source data sets to target data sets based on what was discovered.

For example, you can map the databases used for transaction processing, determine that subsets of the transaction processing database are extracted and moved to a staging area, and then parse the ETL code to infer the mappings.

These direct mappings also are documented in the data catalog. In cases where the mappings are not obvious, a tool can help a data steward manually map data assets into the catalog.

The result is a data catalog that incorporates the structural and semantic metadata associated with each data asset as well as the direct mappings for how that data set is populated.


And this is a powerful representative paradigm – instead of capturing a static view of specific data pipelines, it allows a data consumer to request a dynamically-assembled lineage from the documented mappings.

By interrogating the catalog, the current view of any specific data lineage can be rendered on the fly, showing all points of the lineage: the origination points, the processing stages, the sequences of transformations and the final destination.

Materializing the “current active lineage” dynamically reduces the risk of having an older version of the lineage that is no longer relevant or correct. When new information is added to the data catalog (such as a newly added data source or a modification to the ETL code), dynamically generated views of the lineage will be kept up to date automatically.
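
A toy sketch of the difference, assuming the catalog is just a list of documented (source, destination) mappings: lineage is assembled on request, so a newly registered mapping shows up in the next rendered view without any separate lineage artifact to maintain:

```python
# Mappings live in one catalog; lineage views are assembled on request
catalog = [("orders_db", "staging"), ("staging", "warehouse")]

def upstream(asset):
    """Render the current upstream lineage of an asset on the fly."""
    direct = [src for src, dst in catalog if dst == asset]
    return {d: upstream(d) for d in direct}

print(upstream("warehouse"))   # {'staging': {'orders_db': {}}}

# A newly added source appears in the next rendered view automatically:
catalog.append(("crm_db", "staging"))
print(upstream("warehouse"))   # {'staging': {'orders_db': {}, 'crm_db': {}}}
```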

Top 5 Data Catalog Benefits for Understanding Data Lineage

A data catalog benefits data lineage in the following five distinct ways:

1. Accessibility

The data catalog approach allows the data consumer to query the tool to materialize specific data lineage mappings on demand.

2. Currency

The data lineage is rendered from the most current data in the data catalog.

3. Breadth

As the number of data assets documented in the data catalog increases, the scope of the materializable lineage expands accordingly. With all corporate data assets cataloged, any (or all!) data lineage mappings can be produced on demand.

4. Maintainability and Sustainability

Since the data lineage mappings are not managed as distinct artifacts, there are no additional requirements for maintenance. As long as the data catalog is kept up to date, the data lineage mappings can be materialized.

5. Semantic Visibility

In addition to visualizing the physical movement of data across the enterprise, the data catalog approach allows the data steward to associate business glossary terms, data element definitions, data models, and other semantic details with the different mappings. Additional visualization methods can demonstrate where business terms are used, how they are mapped to different data elements in different systems, and the relationships among these different usage points.

Additional data governance controls can be imposed with project management oversight, allowing you to designate data lineage mappings in terms of the project life cycle (such as development, test or production).

Aside from these data catalog benefits, this approach reduces the amount of manual effort needed to accumulate data lineage information and continually review the data landscape to maintain consistency, thus providing a greater return on investment for your data intelligence budget.

Learn more about data cataloging.


The Top 8 Benefits of Data Lineage

It’s important we recognize the benefits of data lineage.

As corporate data governance programs have matured, the inventory of agreed-to data policies has grown rapidly. These include guidelines for data quality assurance, regulatory compliance and data democratization, among other information utilization initiatives.

Organizations that are challenged by translating their defined data policies into implemented processes and procedures are starting to identify tools and technologies that can supplement the ways organizational data policies can be implemented and practiced.

One such technique, data lineage, is gaining prominence as a core operational business component of the data governance technology architecture. Data lineage encompasses processes and technology to provide full-spectrum visibility into the ways data flows across the enterprise.

To data-driven businesses, the benefits of data lineage are significant. Data lineage tools are used to survey, document and enable data stewards to query and visualize the end-to-end flow of information units from their origination points through the series of transformation and processing stages to their final destination.


The Benefits of Data Lineage

Data stewards are attracted to data lineage because it helps in a number of different governance practices, including:

1. Operational intelligence

At its core, data lineage captures the mappings of the rapidly growing number of data pipelines in the organization. Visualizing the information flow landscape provides insight into the “demographics” of data consumption and use, answering questions such as “what data sources feed the greatest number of downstream sources” or “which data analysts use data that is ingested from a specific data source.” Collecting this intelligence about the data landscape better positions the data stewards for enforcing governance policies.
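
For illustration, the first “demographics” question above reduces to simple counting over a set of lineage mappings; the data and structure below are hypothetical:

```python
from collections import Counter

# Hypothetical lineage mappings: (source, destination) edges
mappings = [("orders_db", "staging"), ("crm_db", "staging"),
            ("staging", "warehouse"), ("orders_db", "fraud_model")]

# Which data sources feed the greatest number of downstream targets?
feeds = Counter(src for src, _ in mappings)
print(feeds.most_common(1))    # [('orders_db', 2)]
```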

2. Business terminology consistency

One of the most confounding data governance challenges is understanding the semantics of business terminology within data management contexts. Because application development was traditionally isolated within each business function, the same (or similar) terms are used in different data models, even though the designers did not take the time to align definitions and meanings. Data lineage allows the data stewards to find common business terms, review their definitions, and determine where there are inconsistencies in the ways the terms are used.

3. Data incident root cause analysis

It has long been asserted that when a data consumer finds a data error, the error most likely was introduced into the environment at an earlier stage of processing. Yet without a “roadmap” that indicates the processing stages through which the data were processed, it is difficult to speculate where the error was actually introduced. Using data lineage, though, a data steward can insert validation probes within the information flow to validate data values and determine the stage in the data pipeline where an error originated.

4. Data quality remediation assessment

Root cause analysis is just the first part of the data quality process. Once the data steward has determined where the data flaw was introduced, the next step is to determine why the error occurred. Again, using a data lineage mapping, the steward can trace backward through the information flow to examine the standardizations and transformations applied to the data, validate that transformations were correctly performed, or identify one (or more) performed incorrectly, resulting in the data flaw.

5. Impact analysis

The enterprise is always subject to changes; externally-imposed requirements (such as regulatory compliance) evolve, internal business directives may affect user expectations, and ingested data source models may change unexpectedly. When there is a change to the environment, it is valuable to assess the impacts to the enterprise application landscape. In the event of a change in data expectations, data lineage provides a way to determine which downstream applications and processes are affected by the change and helps in planning for application updates.

6. Performance assessment

Not only does lineage provide a collection of mappings of data pipelines, it allows for the identification of potential performance bottlenecks. Data pipeline stages with many incoming paths are candidate bottlenecks. Using a set of data lineage mappings, the performance analyst can profile execution times across different pipelines and redistribute processing to eliminate bottlenecks.

7. Policy compliance

Data policies can be implemented through the specification of business rules. Compliance with these business rules can be facilitated using data lineage by embedding business rule validation controls across the data pipelines. These controls can generate alerts when there are noncompliant data instances.
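
A minimal, hypothetical sketch of such a control: a business rule checked at a pipeline stage, with an alert generated for any noncompliant records (the rule and data are illustrative only):

```python
# A business rule expressed as a predicate over a record (hypothetical)
def rule_ssn_masked(record):
    return record.get("ssn", "").startswith("***")

def validate(stage, records, rule):
    """Embedded validation control: alert on noncompliant records."""
    bad = [r for r in records if not rule(r)]
    if bad:
        print(f"ALERT [{stage}]: {len(bad)} noncompliant record(s)")
    return bad

validate("staging-load",
         [{"ssn": "***-**-1234"}, {"ssn": "123-45-6789"}],
         rule_ssn_masked)
```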

8. Auditability of data pipelines

In many cases, regulatory compliance is a combination of enforcing a set of defined data policies along with a capability for demonstrating that the overall process is compliant. Data lineage provides visibility into the data pipelines and information flows that can be audited, thereby supporting the compliance process.

Evaluating Enterprise Data Lineage Tools

While data lineage benefits are obvious, large organizations with complex data pipelines and data flows do face challenges in embracing the technology to document the enterprise data pipelines. These include:

  • Surveying the enterprise – Gathering information about the sources, flows and configurations of data pipelines.
  • Maintenance – Configuring a means to maintain an up-to-date view of the data pipelines.
  • Deliverability – Providing a way to give data consumers visibility to the lineage maps.
  • Sustainability – Ensuring sustainability of the processes for producing data lineage mappings.

Producing a collection of up-to-date data lineage mappings that are easily reviewed by different data consumers depends on addressing these challenges. When considering data lineage tools, keep these issues in mind when evaluating how well the tools can meet your data governance needs.

erwin Data Intelligence (erwin DI) helps organizations automate their data lineage initiatives. Learn more about data lineage with erwin DI.
