Benefits of Process: Why Modern Organizations Need Process-Based Engines

In the current data-driven business climate, the benefits of process and process-based strategy are more valuable to organizations than ever.

Industry regulations and competition traditionally have driven organizational change, but such “transformation” has rarely been comprehensive or truly transformative. Rather, organizational transformation has come in waves, forcing companies and their IT ecosystems to ride them as best they can – sometimes their fortunes have risen, and sometimes they have waned.

The advent of Brexit and GDPR has again forced today’s organizations to confront external stimuli’s impact on their operations. The difference is that modern, process-based enterprises can better anticipate these sorts of mandates, incorporate them into their strategic plans, and even leapfrog ahead of their requirements by initiating true internal transformation initiatives – ones based on effectively managed and well-documented business processes.

Shifting Attitudes

Traditional organizations focus almost exclusively on rigid structures, centralized management and accountability; concentrated knowledge; service mainly to external customers; and reactive, short-term strategy alignment driven mainly by massive-scale projects. This traditional approach results in large, unwieldy and primarily reactive organizations that rely either on legacy strengths or inertia for survival.

But as technology evolves and proliferates, more and more organizations are realizing they need to adjust their traditional thinking and subsequent actions, even if just slightly, to gain strategic advantage, reduce costs and retain market dominance. For example:

  • Structures are becoming more adaptable, allowing for greater flexibility and cost management. How is this possible and why now? Organizations are grasping that effective, well-managed and documented business processes should form their operational backbones.
  • Business units and the departments within them are becoming accountable not only for their own budgets but also for how well they achieve their goals. This is possible because their responsibilities and processes can be clearly defined, documented and then monitored to ensure their work is executed in a repeatable, predictable and measurable way.
  • Knowledge is now both centralized and distributed thanks to modern knowledge management systems. Central repositories and collaborative portals give everyone within the organization equal access to the data they need to do their jobs more effectively and efficiently.
  • And thanks to all the above, organizations can expand their focus from external customers to internal ones as well. By clearly identifying individual processes (and their cross-business handover points) and customer touchpoints, organizations can interact with any customer at the right point with the most appropriate resources.

If business drivers are connected to processes with appropriate accountability, they become measurable in dimensions never before possible. Such elements as customer-journey quality and cost, process-delivery efficiency and even bottom-up cost aggregation can be captured. Strategic decision-making then becomes far more practical and forward-looking.

With this interconnected process- and information-based ecosystem, management can perform accurate and far-reaching impact analyses, test alternate scenarios, and evaluate their costs and implementation possibilities (and difficulties) to make decisions with full knowledge of their implications. Organizational departments can provide real-time feedback on designs and projects, turning theoretical designs into practical plans with buy-in at the right levels.

Benefits of Process

As stated above, one of the key benefits of process and a process-based organizational engine is that organizations can better handle outside pressures, such as new regulations. Once processes (and their encompassing business architecture) become central to the organization, a wide array of things become simpler, faster and cheaper.

The benefits of process don’t stop there either. Application design – the holy grail or black hole of budgetary spending and project management, depending on your point of view – is streamlined, with requirements clearly gathered and managed in perfect correspondence to the processes they serve and with the data they manage clearly documented and communicated to the developers. Testing occurs against real-life scenarios by the responsible parties as documented by the process owners – a drastic departure from the more traditional approaches in which the responsibility fell to designated, usually technical application owners.

Finally – and most important – data governance is no longer the isolated domain of data architects but central to the everyday processes that make an organization tick. As processes have stakeholders who use information – data – the roles of technical owners and data stewards become integral to ensuring processes operate efficiently, effectively and – above all – without interruptions. On the other side of this coin, data owners and data stewards no longer operate in their own worlds, distant from the processes their data supports.

Seizing a Process-Based Future

Process is a key axis along which the modern organization must operate. Data governance is another, with cost management becoming a third driver for the enterprise machine. But as we all know, it takes more than stable connecting rods to make an engine work – it needs cogs and wheels, belts and multiple power sources, all working together.

In the traditional organization, people are the internal mechanics. But one can’t escape visions of Charlie Chaplin’s Modern Times worker hopelessly entangled in the machine on which he was working. That’s why, these days, powerful and flexible workflow engines provide much-needed automation for greater visibility plus more power, stability and quality – all the things a machine needs to operate as required and designed.

Advanced process management systems are becoming essential, not optional. And while not as sexy or attention-grabbing as other technologies, they provide the power to drive an organization toward its goals quickly, cost-effectively and efficiently.

To learn how erwin can empower a modern, process-based organization, please click here.

Data Governance Tackles the Top Three Reasons for Bad Data

In modern, data-driven business, it’s essential that organizations understand the reasons for bad data and how best to address them. Data has revolutionized how organizations operate, from customer relationships to strategic decision-making and everything in between. And with more emphasis on automation and artificial intelligence, the need for data/digital trust also has risen. Even minor errors in an organization’s data can cause massive headaches because the inaccuracies don’t involve just one corrupt data unit.

Inaccurate or “bad” data also affects relationships to other units of data, making the business context difficult or impossible to determine. For example, are data units tagged according to their sensitivity [i.e., personally identifiable information subject to the General Data Protection Regulation (GDPR)], and is data ownership and lineage discernable (i.e., who has access, where did it originate)?

Relying on inaccurate data will hamper decisions, decrease productivity, and yield suboptimal results. Given these risks, organizations must increase their data’s integrity. But how?

Integrated Data Governance

Modern, data-driven organizations are essentially data production lines. And like physical production lines, their associated systems and processes must run smoothly to produce the desired results. Sound data governance provides the framework to address data quality at its source, ensuring any data recorded and stored is done so correctly, securely and in line with organizational requirements. But it needs to integrate all the data disciplines.

By integrating data governance with enterprise architecture, businesses can define application capabilities and interdependencies in the context of enterprise strategy, prioritizing technology investments so they align with business goals and produce the desired outcomes. A business process and analysis component enables an organization to clearly define, map and analyze workflows and build models to drive process improvement, as well as identify business practices susceptible to the greatest security, compliance or other risks and where controls are most needed to mitigate exposures.

And data modeling remains the best way to design and deploy new relational databases with high-quality data sources and support application development. Being able to cost-effectively and efficiently discover, visualize and analyze “any data” from “anywhere” underpins large-scale data integration, master data management, Big Data and business intelligence/analytics with the ability to synthesize, standardize and store data sources from a single design, as well as reuse artifacts across projects.

Let’s look at some of the main reasons for bad data and how data governance helps confront these issues …

Reasons for Bad Data

Reasons for Bad Data: Data Entry

The concept of “garbage in, garbage out” explains the most common cause of inaccurate data: mistakes made at data entry. While this concept is easy to understand, totally eliminating errors isn’t feasible, so organizations need standards and systems to limit the extent of their damage.

With the right data governance approach, organizations can ensure the right people aren’t left out of the cataloging process, so the right context is applied, and that critical fields aren’t left blank, so data is recorded with as much context as possible. Combined with the business process integration discussed above, a single metadata repository ensures sensitive data doesn’t fall through the cracks.
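
As a simple illustration of the point about critical fields, here is a minimal sketch of validation at the point of entry. The field names and rules are hypothetical, not drawn from any particular product – they just show how blank or missing context can be caught before a record is ever stored.

```python
# Hypothetical data-entry check: reject records missing critical context.
REQUIRED_FIELDS = {"customer_id", "email", "country", "sensitivity"}  # illustrative field names

def validate_record(record: dict) -> list:
    """Return a list of problems; an empty list means the record may be stored."""
    return [f"missing or blank field: {field}"
            for field in sorted(REQUIRED_FIELDS)
            if not str(record.get(field, "")).strip()]

record = {"customer_id": "C-1001", "email": "jane@example.com", "country": "", "sensitivity": "PII"}
issues = validate_record(record)
if issues:
    print("Rejected at entry:", issues)  # e.g. ['missing or blank field: country']
```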

Reasons for Bad Data: Data Migration

Data migration is another key reason for bad data. Modern organizations often juggle a plethora of data systems that process data from an abundance of disparate sources, creating a melting pot for potential issues as data moves through the pipeline, from tool to tool and system to system.

The solution is to introduce a predetermined standard of accuracy through a centralized metadata repository with data governance at the helm. In essence, metadata is data that describes other data, ensuring that no matter where data is in relation to the pipeline, it still has the necessary context to be deciphered, analyzed and then used strategically.
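
To make “data about data” concrete, here is a minimal, hypothetical catalog entry – the field names are illustrative, not a product schema – showing the kind of context (ownership, lineage, sensitivity) that should travel with a data element wherever it goes.

```python
# A minimal, illustrative metadata record for one data element.
customer_email_metadata = {
    "element": "customer.email",
    "description": "Primary contact email captured at registration",
    "owner": "CRM team",            # accountable data owner
    "steward": "jane.doe",          # day-to-day data steward
    "sensitivity": "PII / GDPR",    # drives handling and access rules
    "source_system": "web_signup",  # lineage: where the value originated
    "downstream_uses": ["marketing_dw", "support_portal"],
}

# Any tool or person handling the data can check its context before acting on it.
if "PII" in customer_email_metadata["sensitivity"]:
    print("Apply GDPR handling rules before moving or analyzing this element.")
```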

The potential fallout of using inaccurate data has become even more severe with the GDPR’s implementation. A simple case of tagging and subsequently storing personally identifiable information incorrectly could lead to a serious breach in compliance and significant fines.

Such fines must be considered along with the costs resulting from any PR fallout.

Reasons for Bad Data: Data Integration

The proliferation of data sources, types, and stores increases the challenge of combining data into meaningful, valuable information. While companies are investing heavily in initiatives to increase the amount of data at their disposal, most information workers are spending more time finding the data they need than putting it to work, according to Database Trends and Applications (DBTA). erwin is co-sponsoring a DBTA webinar on this topic on July 17. To register, click here.

The need for faster and smarter data integration capabilities is growing. At the same time, to deliver business value, people need information they can trust to act on, so balancing governance is absolutely critical, especially with new regulations.

Organizations often invest heavily in individual software development tools for managing projects, requirements, designs, development, testing, deployment, releases, etc. Tools lacking interoperability often result in cumbersome manual processes and heavy time investments to synchronize data or processes between these disparate tools.

Data integration combines data from disparate sources into a unified view, making it more actionable and valuable to those accessing it.
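
As a toy illustration of that unified view, the sketch below – with hypothetical source names and fields – merges the same customer’s records from two systems into one consolidated view, letting the most recently updated source win when values conflict.

```python
# Hypothetical example: combine two source records for the same customer into one view.
crm_record = {"customer_id": "C-1001", "name": "Jane Doe",
              "email": "jane@old.example.com", "updated": "2018-01-15"}
billing_record = {"customer_id": "C-1001", "email": "jane@new.example.com",
                  "plan": "premium", "updated": "2018-06-01"}

def unify(records):
    """Merge records, letting the most recently updated source win on conflicts."""
    unified = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        unified.update({k: v for k, v in rec.items() if k != "updated"})
    return unified

print(unify([crm_record, billing_record]))
# {'customer_id': 'C-1001', 'name': 'Jane Doe', 'email': 'jane@new.example.com', 'plan': 'premium'}
```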

Getting the Data Governance “EDGE”

The benefits of integrated data governance discussed above won’t be realized if it is isolated within IT with no input from other stakeholders, the day-to-day data users – from sales and customer service to the C-suite. Every data citizen has DG roles and responsibilities to ensure data units have context, meaning they are labeled, cataloged and secured correctly so they can be analyzed and used properly. In other words, the data can be trusted.

Once an organization understands that IT and the business are both responsible for data, it can develop comprehensive, holistic data governance capable of:

  • Reaching every stakeholder in the process
  • Providing a platform for understanding and governing trusted data assets
  • Delivering the greatest benefit from data wherever it lives, while minimizing risk
  • Helping users understand the impact of changes made to a specific data element across the enterprise.

To reduce the risks of bad data, tackle its root causes and realize larger organizational objectives, organizations must make data governance everyone’s business.

To learn more about the collaborative approach to data governance and how it helps compliance in addition to adding value and reducing costs, get the free e-book here.

The Connection Between Business Process Modeling and Standard Operating Procedures

We began a new blog series last week on business process (BP) modeling and its role within the enterprise. This week’s focus is on the connection between business process modeling and standard operating procedures – specifically, using BP tools to help organizations streamline how they manage their standard operating procedures (SOPs).

Standard Operating Procedures: A New Approach to Organizing SOP Information

Manually maintaining the standard operating procedures that inform business processes can be a monster of a task. In most industries, SOPs typically are documented in multiple Word or Excel files.

In a process-centric world, heavy lifting is involved when an organization requires a change to an end-to-end process: Each SOP affected by the change may be associated with dozens or even hundreds of steps that exist between the start and conclusion of the process – and the alteration must be made to all of them wherever they occur.

You can imagine the significant man hours that go into wading through a sea of documents to discover and amend relevant SOPs and communicate these business process-related changes across the organization. And you can guess at the toll on productivity and efficiency that the business experiences as a result.

Companies that are eager to embrace business process optimization are keen to have a better approach to organizing SOP information to improve transparency and insight for speedier and more effective change management.

There’s another benefit to be realized from taking a new approach to SOP knowledge management, as well. With better organization comes an increased ability to convey information about current and changed standard operating procedures; companies can offer on-the-fly access to standard practices to teams across the enterprise.

That consistent and easily obtained business process information can help employees innovate, sharing ideas about additional improvements and innovations that could be made to standard operating procedures. It could also save them the time they might otherwise spend on “reinventing the wheel” for SOPs that already exist but that they don’t know about.

Balfour Beatty Construction, the fourth largest general builder in the U.S., saw big results when it standardized and transformed its process documentation, giving workers access to corporate SOPs from any location on almost any device.

For a construction company, keeping field workers out of danger is a major concern, and providing these employees with immediate information about how to accomplish a multi-step business process – such as clearing a site – can promote their safety. Among the benefits Balfour Beatty saw were a 5% gain in productivity and a reduction in training time for new employees, who were now able to tap directly into SOP data.

Using Business Process Modeling to Transform SOP Management

How does a company transform manual SOP documentation to more effectively support change management as part of business process optimization? It’s key to adopt business process (BP) modeling and management software to create and store SOP documentation in a single repository, tying each SOP to the processes it interacts with for faster discovery and easier maintenance.

Organizations that move to this methodology, for example, will have the advantage of only needing to change an affected SOP in that one repository; the change automatically will propagate to all related processes and procedures.

In effect, the right BP tool automatically generates new SOPs with the necessary updated information.

Such a tool is also suitable for use in conjunction with controlled document repositories that are typically required in heavily regulated industries, such as pharmaceuticals, financial services and healthcare, as part of satisfying compliance mandates. All SOP documentation already is stored in the same repository, rather than scattered across files.

A business process diagramming and modeling solution also comes in handy in these cases by providing a web-based front-end that exposes high-level processes and how they map to related SOPs. This helps users better navigate them to institute and maintain changes and to access job-related procedure information.

To find out about how erwin can streamline SOP document management to positively impact costs, workloads and user benefits, please click here.

In our next blog, we’ll look at how business process modeling strengthens digital transformation initiatives.

Business Process Modeling and Its Role Within the Enterprise

To achieve its objectives, an organization must have a complete understanding of its processes. Therefore, business process design and analysis are key to defining how a business operates and ensuring employees understand and are accountable for carrying out their responsibilities.

Understanding system interactions, business processes and organizational hierarchies creates alignment, with everyone pulling in the same direction, and supports informed decision-making for optimal results and continuous improvement.

Those organizations operating in industries in which quality, health, safety and environmental issues are constant concerns must be even more in tune with their complexities. After all, revenue and risk are inextricably linked.

What Is Business Process Modeling and Why Does It Matter?

A business process is “an activity or set of activities that will accomplish a specific organizational goal,” as defined by TechTarget. Business process modeling “links business strategy to IT systems development to ensure business value,” according to Gartner.

The research firm goes on to explain that when you combine “process/workflow, functional, organizational and data/resource views with underlying metrics, such as costs, cycle times and responsibilities, you establish a foundation for analyzing value chains, activity-based costs, bottlenecks, critical paths and inefficiencies.”

To clearly document, define, map and analyze workflows and build models to drive process improvement and therefore business transformation, you’ll need to invest in a business process (BP) modeling solution.

Only then will you be able to determine where cross-departmental and intra-system process chains break down, as well as identify business practices susceptible to the greatest security, compliance, standards or other risks and where controls and audits are most needed to mitigate exposures.

Companies that maintain accurate BP models also are well-positioned to analyze and optimize end-to-end process threads that help accomplish such strategic business objectives as improving customer journeys and maximizing employee retention. You also can slice and dice models in multiple other ways, including to improve collaboration and efficiency.

Useful change only comes from evaluating process models, spotting inefficiencies and taking corrective action. Business process modeling is also critical to data governance, helping organizations understand their data assets in the context of where their data is and how it’s used in various processes. Then you can drive data opportunities, like increasing revenue, and limit data risks, such as avoiding regulatory and compliance gaffes.

How to Do Business Process Modeling

Business process modeling software creates the documentation and graphical roadmap of how a business works today, detailing the tasks, responsible parties and data elements involved in processes and the interactions that occur across systems, procedures and organizational hierarchies. That knowledge, in turn, prepares the organization for tomorrow’s changes.

Effective BP technology will assist your business in documenting, managing and communicating your business processes in a structured manner that drives value and reduces risks.

It should enable you to:

  • Develop and capture multiple artifacts in a repository to support business-centric objectives
  • Support process improvement methodologies that boost critical capabilities
  • Identify gaps in process documentation to retain internal mastery over core activities
  • Reduce maintenance costs and increase employee access to critical knowledge
  • Incorporate any data from any location into business process models

In addition, a business process modeling solution should work in conjunction with the other data management domains (i.e., enterprise architecture, data modeling and data governance) to provide data clarity across all organizational roles and goals.

Business Process Modeling and Enterprise Data Management

Data isn’t just for “the data people.” To survive and thrive in the digital age, among the likes of Amazon, Airbnb, Netflix and Uber that have transformed their respective industries, organizations must extend the use, understanding and trust of their data every day across every business function – from the C-level to the front line.

A common source of data leveraged by business process personnel, enterprise architects, data stewards and others encourages a greater understanding of how different line-of-business operations work together as a single unit. Links to data terms and categories contained within a centralized business glossary let enterprises eliminate ambiguity in process and policy procedure documents.

Integrated business models based on a single source of truth also offer different views for different stakeholders based on their needs, while tight interconnection with enterprise architecture joins Process, Organization, Location, Data, Applications, and Technology (POLDAT) assets to explanatory models that support informed plans for change.

Seamless integration of business process models with enterprise architecture, data modeling and data governance reveals the interdependence between the workforce, the processes they perform, the actively governed assets they interact with and their importance to the business.

Then everyone is invested in and accountable for data, the fuel for the modern enterprise.

To learn more about business process modeling and its role within data-driven business transformation, click here.

The Role of an Effective Data Governance Initiative in Customer Purchase Decisions

A data governance initiative will maximize the security, quality and value of data, all of which build customer trust.

Without data, modern business would cease to function. Data helps guide decisions about products and services, makes it easier to identify customers, and serves as the foundation for everything businesses do today. The problem for many organizations is that data enters from any number of angles and gets stored in different places by different people and different applications.

Getting the most out of your data requires that you know what you have, where you have it, and that you understand its quality and value to the organization. This is where data governance comes into play. You can’t optimize your data if it’s scattered across different silos and lurking in various applications.

For about 150 years, manufacturers relied on their machinery and its ability to run reliably, properly and safely, to keep customers happy and revenue flowing. A data governance initiative has a similar role today, except its aim is to maximize the security, quality and value of data instead of machinery.

Customers are increasingly concerned about the safety and privacy of their data. According to a survey by Research+Data Insights, 85 percent of respondents worry about technology compromising their personal privacy. In a survey of 2,000 U.S. adults in 2016, researchers from Vanson Bourne found that 76 percent of respondents said they would move away from companies with a high record of data breaches.

For years, buying decisions were driven mainly by cost and quality, says Danny Sandwell, director of product marketing at erwin, Inc. But today’s businesses must consider their reputations in terms of both cost/quality and how well they protect their customers’ data when trying to win business.

Once a company’s reputation is tarnished because of a breach or misuse of data, customers will question their relationships with it.

Unfortunately for consumers, examples of companies failing to properly govern their data aren’t difficult to find. Look no further than Under Armour, which announced this spring that 150 million accounts at its MyFitnessPal diet and exercise tracking app were breached, and Facebook, where the data of millions of users was harvested by third parties hoping to influence the 2016 presidential election in the United States.

Customers Hate Breaches, But They Love Data

While consumers are quick to report concerns about data privacy, customers also yearn for (and increasingly expect) efficient, personalized and relevant experiences when they interact with businesses. These experiences are, of course, built on data.

In this area, customers and businesses are on the same page. Businesses want to collect data that helps them build the omnichannel, 360-degree customer views that make their customers happy.

These experiences allow businesses to connect with their customers and demonstrate how well they understand them and know their preferences, likes and dislikes – essentially taking the personalized service of the neighborhood market to the internet.

The only way to manage that effectively at scale is to properly govern your data.

Delivering personalized service is also valuable to businesses because it helps turn customers into brand ambassadors, and it’s much easier to build on existing customer relationships than to find new customers.

Here’s the upshot: If your organization is doing data governance right, it’s helping create happy, loyal customers, while at the same time avoiding the bad press and financial penalties associated with poor data practices.

Putting A Data Governance Initiative Into Action

The good news is that 76 percent of respondents to a November 2017 survey we conducted with UBM said understanding and governing the data assets in the organization was either important or very important to the executives in their organization. Nearly half (49 percent) of respondents said that customer trust/satisfaction was driving their data governance initiatives.

What stops organizations from creating an effective data governance initiative? At some businesses, it’s a cultural issue. Both the business and IT sides of the organization play important roles in data, with the IT side storing and protecting it, and the business side consuming data and analyzing it.

For years, however, data governance was the volleyball passed back and forth over the net between IT and the business, with neither side truly owning it. Our study found signs this is changing. More than half (57 percent) of the respondents said both IT and the business/corporate teams were responsible for data in their organization.

Once an organization understands that IT and the business are both responsible for data, it still needs to develop a comprehensive, holistic strategy for data governance that is capable of:

  • Reaching every stakeholder in the process
  • Providing a platform for understanding and governing trusted data assets
  • Delivering the greatest benefit from data wherever it lives, while minimizing risk
  • Helping users understand the impact of changes made to a specific data element across the enterprise.

To accomplish this, a modern data governance initiative needs to be interdisciplinary. It should include not only data governance, which is ongoing because organizations are constantly changing and transforming, but other disciplines as well.

Enterprise architecture is important because it aligns IT and the business, mapping a company’s applications and the associated technologies and data to the business functions they enable.

By integrating data governance with enterprise architecture, businesses can define application capabilities and interdependencies in the context of enterprise strategy, prioritizing technology investments so they align with business goals and produce the desired outcomes.

A business process and analysis component is also vital to modern data governance. It defines how the business operates and ensures employees understand and are accountable for carrying out the processes for which they are responsible.

Enterprises can clearly define, map and analyze workflows and build models to drive process improvement, as well as identify business practices susceptible to the greatest security, compliance or other risks and where controls are most needed to mitigate exposures.

Finally, data modeling remains the best way to design and deploy new relational databases with high-quality data sources and support application development.

Being able to cost-effectively and efficiently discover, visualize and analyze “any data” from “anywhere” underpins large-scale data integration, master data management, Big Data and business intelligence/analytics with the ability to synthesize, standardize and store data sources from a single design, as well as reuse artifacts across projects.

Michael Pastore is the Director, Content Services at QuinStreet B2B Tech. This content originally appeared as a sponsored post on http://www.eweek.com/.

Read the previous post on how compliance concerns and the EU’s GDPR are driving businesses to implement data governance.

Determine how effective your current data governance initiative is by taking our DG RediChek.

An Agile Data Governance Foundation for Building the Data-Driven Enterprise

The data-driven enterprise is the cornerstone of modern business, and good data governance is a key enabler.

In recent years, we’ve seen startups leverage data to catapult themselves ahead of legacy competitors. Companies such as Airbnb, Netflix and Uber have become household names. Although the service each offers differs vastly, all three identify as ‘technology’ organizations because data is integral to their operations.

As with any standard-setting revolution, businesses across the spectrum are now following these examples. But what these organizations need to understand is that simply deciding to be data-driven, or to “do Big Data,” isn’t enough.

As with any strategy or business model, it’s advisable to apply best practices to ensure the endeavor is worthwhile and that it operates as efficiently as possible. In fact, it’s especially important with data, as poorly governed data will lead to slower time to market and oversights in security. Additionally, poorly managed data fosters inaccurate analysis and poor decision-making, further hampering time to market due to inaccuracy in the planning stages, false starts and wasted cycles.

Essentially garbage in, garbage out – so it’s important for businesses to get their foundations right. To build something, you need to know exactly what you’re building and why, so you can work out the best way to progress.

Data Governance 2.0 Is the Underlying Factor

Good data governance (DG) enables every relevant stakeholder – from executives to frontline employees – to discover, understand, govern and socialize data. Then the right people have access to the right data, so the right decisions are easier to make.

Traditionally, DG encompassed governance goals such as maintaining a business glossary of data terms, a data dictionary and catalog. It also enabled lineage mapping and policy authoring.

However, Data Governance 1.0 was siloed with IT left to handle it. Often there were gaps in context, the chain of accountability and the analysis itself.

Data Governance 2.0 remedies this by taking into account the fact that data now permeates all levels of a business. And it allows for greater collaboration.

It gives people interacting with data the required context to make good decisions, and documents the data’s journey, ensuring accountability and compliance with existing and upcoming data regulations.

But beyond the greater collaboration it fosters between people, it also allows for better collaboration between departments and integration with other technology.

By integrating data governance with data modeling (DM), enterprise architecture (EA) and business process (BP), organizations can break down inter-departmental and technical silos for greater visibility and control across domains.

By leveraging a common metadata repository and intuitive role-based and highly configurable user interfaces, organizations can guarantee everyone is singing from the same sheet of music.

Data Governance Enables Better Data Management

The collaborative nature of Data Governance 2.0 is a key enabler for strong data management. Without it, the differing data management initiatives can and often do pull in different directions.

These silos are usually born out of the use of disparate tools that don’t enable collaboration between the relevant roles responsible for the individual data management initiative. This stifles the potential of data analysis, something organizations can’t afford given today’s market conditions.

Businesses operating in highly competitive markets need every advantage: growth, innovation and differentiation. Organizations also need a complete data platform, as data’s growing role in business and the rapid pace of technological advancement mean market landscapes are changing faster than ever before.

By integrating DM, EA and BP, organizations ensure all three initiatives are in sync. Then historically common issues born of siloed data management initiatives don’t arise.

A unified approach, with Data Governance 2.0 at its core, allows organizations to:

  • Enable data fluency and accountability across diverse stakeholders
  • Standardize and harmonize diverse data management platforms and technologies
  • Satisfy compliance and legislative requirements
  • Reduce risks associated with data-driven business transformation
  • Enable enterprise agility and efficiency in data usage.

Data Modeling in a Jargon-filled World – Internet of Things (IoT)

In the first post of this blog series, we focused on jargon related to the “volume” aspect of Big Data and its impact on data modeling and data-driven organizations. In this post, we’ll focus on “velocity,” the second of Big Data’s “three Vs.”

In particular, we’re going to explore the Internet of Things (IoT), the constellation of web-connected devices, vehicles, buildings and related sensors and software. It’s a great time for this discussion too, as IoT devices are proliferating at a dizzying pace in both number and variety.

Though IoT devices typically generate small “chunks” of data, they often do so at a rapid pace, hence the term “velocity.” Some of these devices generate data from multiple sensors for each time increment. For example, we recently worked with a utility that embedded sensors in each transformer in its electric network and then generated readings every 4 seconds for voltage, oil pressure and ambient temperature, among others.

While the transformer example is just one of many, we can quickly see two key issues that arise when IoT devices are generating data at high velocity. First, organizations need to be able to process this data at high speed.  Second, organizations need a strategy to manage and integrate this never-ending data stream. Even small chunks of data will accumulate into large volumes if they arrive fast enough, which is why it’s so important for businesses to have a strong data management platform.

It’s worth noting that the idea of managing readings from network-connected devices is not new. In industries like utilities, petroleum and manufacturing, organizations have used SCADA (supervisory control and data acquisition) systems for years, both to receive data from instrumented devices to help control processes and to provide graphical representations and some limited reporting.

More recently, many utilities have introduced smart meters in their electricity, gas and/or water networks to make the collection of meter data easier and more efficient for a utility company, as well as to make the information more readily available to customers and other stakeholders.

For example, you may have seen an energy usage dashboard provided by your local electric utility, allowing customers to view graphs depicting their electricity consumption by month, day or hour, enabling each customer to make informed decisions about overall energy use.

Seems simple and useful, but have you stopped to think about the volume of data underlying this feature? Even if your utility only presents information on an hourly basis, if you consider that it’s helpful to see trends over time and you assume that a utility with 1.5 million customers decides to keep these individual hourly readings for 13 months for each customer, then we’re already talking about over 14 billion individual readings for this simple example (1.5 million customers x 13 months x over 30 days/month x 24 hours/day).
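
The back-of-the-envelope math behind that figure is easy to reproduce; the customer count, retention period and reading frequency are simply the assumptions stated above.

```python
# Rough volume estimate for hourly smart-meter readings (assumptions from the example above).
customers = 1_500_000
months_retained = 13
days_per_month = 30     # "over 30 days/month" in the example; 30 keeps the estimate conservative
readings_per_day = 24   # one reading per hour

total_readings = customers * months_retained * days_per_month * readings_per_day
print(f"{total_readings:,}")  # 14,040,000,000 – over 14 billion readings
```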

Now consider the earlier example I mentioned of each transformer in an electrical grid with sensors generating multiple readings every 4 seconds. You can get a sense of the cumulative volume impact of even very small chunks of data arriving at high speed.

With experts estimating the IoT will consist of almost 50 billion devices by 2020, businesses across every industry must prepare to deal with IoT data.

But I have good news because IoT data is generally very simple and easy to model. Each connected device typically sends one or more data streams, each carrying a value for the type of reading and the time at which it occurred. Historically, large volumes of simple sensor data like this were best stored in time-series databases like the very popular PI System from OSIsoft.
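
In practice, each reading can be represented with just a handful of fields. The sketch below is a generic illustration – not the PI System’s actual schema or any vendor’s data model – with a device identifier, a reading type, a value and a timestamp.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorReading:
    """One data point in an IoT stream: which device measured what, when, and the value."""
    device_id: str      # e.g. a transformer's asset tag
    reading_type: str   # e.g. "voltage", "oil_pressure", "ambient_temp"
    value: float
    recorded_at: datetime

reading = SensorReading("XFMR-0042", "voltage", 11982.5,
                        datetime(2018, 7, 2, 14, 30, 4, tzinfo=timezone.utc))
```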

While this continues to be true for many applications, alternative architectures, such as storing the raw sensor readings in a data lake, are also being successfully implemented. However, organizations need to carefully consider the pros and cons of home-grown infrastructure versus time-tested, industrial-grade solutions like the PI System.

Regardless of how raw IoT data is stored once captured, the real value of IoT for most organizations is only realized when IoT data is “contextualized,” meaning it is modeled in the context of the broader organization.

The value of modeled data eclipses that of “edge analytics” (where the value is inspected by a software program while inflight from the sensor, typically to see if it falls within an expected range, and either acted upon if required or allowed simply to pass through) or simple reporting like that in the energy usage dashboard example.

It is straightforward to represent a reading of a particular type from a particular sensor or device in a data model or process model. It starts to get interesting when we take it to the next step and incorporate entities into the data model to represent expected ranges – both for readings under various conditions and for how the devices relate to one another.
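
A minimal sketch of that next step might look like the following – the entities for expected operating ranges and device relationships are made up for illustration, and a real utility model would of course be far richer.

```python
# Illustrative (not vendor-specific) entities for putting raw readings in context.
expected_ranges = {
    # (device_type, reading_type) -> (low, high) under normal operating conditions
    ("transformer", "voltage"): (11500.0, 12500.0),
    ("transformer", "oil_pressure"): (5.0, 9.0),
}

# How devices relate to one another: which feeder each transformer serves.
feeds = {"XFMR-0042": "FEEDER-7", "XFMR-0043": "FEEDER-7"}

def out_of_range(device_type, reading_type, value):
    low, high = expected_ranges[(device_type, reading_type)]
    return not (low <= value <= high)

if out_of_range("transformer", "voltage", 11050.0):
    # With the relationship model we also know which feeder (and which alternates) is affected.
    print("Investigate XFMR-0042; it serves", feeds["XFMR-0042"])
```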

If the utility in the transformer example has modeled that IoT data well, it might be able to prevent a developing problem with a transformer and also possibly identify alternate electricity paths to isolate the problem before it has an impact on network stability and customer service.

Hopefully this overview of IoT in the utility industry helps you see how your organization can incorporate high-velocity IoT data to become more data-driven and therefore more successful in achieving larger corporate objectives.

Subscribe and join us next time for Data Modeling in a Jargon-filled World – NoSQL/NewSQL.

Data Modeling in a Jargon-filled World – Big Data & MPP

By now, you’ve likely heard a lot about Big Data. You may have even heard about “the three Vs” of Big Data. As originally defined by Gartner, Big Data is “high-volume, high-velocity, and/or high-variety information assets that require new forms of processing to enable enhanced decision-making, insight discovery and process optimization.”

Five Steps to Digital Transformation

Digital transformation is ramping up in all industries. Facing regular market disruptions and landscape-changing technological breakthroughs, modern businesses must be both malleable and willing to change.

To stay competitive, you must be agile.

Digital Transformation is Inevitable

Increasing numbers of organizations are undergoing a digital transformation. The tried-and-tested yet rigid methods of doing business are being replaced by newer, data-oriented approaches that require thorough but fast analysis.

Some businesses – like Amazon, Netflix and Uber – are leading this evolution. They all provide very different services, but at their core, they are technology focused.

And they’re reaping rewards for it too. Amazon is one of the most valuable businesses in the world and may become one of the first companies to reach a $1-trillion valuation.

It’s not too late to adopt digital transformation, but it is too late to keep fighting against it. The tide of change has quickened, and stubborn businesses could be washed away.

But what’s the best way to get started?

Step One: Determine Your End Goal

Any form of change must start with the end in mind, as it’s impossible to make a transformation without understanding why and how.

Before you make a change, big or small, you need to ask yourself: Why are we doing this? What are the positives and negatives? And if there are negatives, what can we do to mitigate them?

To ensure a successful digital transformation, it’s important to plot your journey from the beginning through to your end goal, understanding how one change or a whole series of changes will alter your business.

Business process modeling tools can help map your digital transformation journey.

Step Two: Get Some Strategic Support

For businesses of any size, transformational change can disrupt day-to-day operations. In most organizations, the expertise to manage a sizeable transformation program doesn’t exist, and from the outset, it can appear quite daunting.

If your goal is to increase profits, it can seem contradictory to pay for support to drive your business forward. However, a slow or incorrect transformational process can be costly in many ways. Therefore, investing in support can be one of the best decisions you make.

Effective strategic planning, rooted in enterprise architecture, can help identify gaps and potential oversights in your strategy. It can indicate where investment is needed and ensure transformative endeavors aren’t undermined by false starts and U-turns.

Many businesses would benefit further by employing strategic consultants. As experts in their fields, strategic consultants know the right questions to ask to uncover the information you need to influence change.

Their experience can support your efforts by identifying and cataloging underlying components, providing input to the project plan and building the right systems to capture important data needed to meet the business’s transformation goals.

Step Three: Understand What You Have

Once you know where you want to go, it’s important to understand what you currently do. That might seem clear, but even the smallest organizations are underpinned by thousands of business processes.

Before you decide to change something, you need to understand everything about what you currently do, or else a change could have an unanticipated and negative impact.

Enterprise architecture will also benefit a business here, uncovering strategic improvement opportunities – valuable changes you might not have seen.

As third parties, consultants can provide an impartial view, rather than letting historic or legacy decisions cloud future judgment.

Businesses will also benefit from data modeling. This is due to the exponential increase in the volume of data businesses have to manage – as well as the variety of disparate sources.

Data modeling will ensure data is accessible, understood and better prepared for analysis and the decision-making process.

Step Four: Collect Knowledge from Within

Your employees are a wealth of knowledge and ideas, so it’s important to involve them in the enterprise architecture process.

Consultants can facilitate a series of staff workshops to enable employee insights to be shared and then developed into real, actionable changes.

Step Five: Get Buy-in Across the Business

Once you’ve engaged with your staff to collect the knowledge they hold, make sure you don’t cut them off there. Business change is only successful if everyone understands what is happening and why, and receives continuous updates.

Ensure that you take your employees through the change process, making them part of the digital transformation journey.

Evidence suggests that 70 percent of all organizational change efforts fail, with a primary reason being that executives don’t get enough buy-in for new initiatives and ideas.

By involving relevant stakeholders in the strategic planning process, you can mitigate this risk. Strategic planning tools that enable collaboration can achieve this. Thanks to technological advancements in the cloud, collaboration can even be effectively facilitated online.

Take your employees through your digital transformation journey, and you’ll find them celebrating with you when you arrive at your goal.

If you think now is the right time for your business to change, get in touch with us today.

Enterprise Architecture vs. Data Architecture vs. Business Process Architecture

Despite their similar names, enterprise architecture, data architecture and business process architecture are very different disciplines. Even so, organizations that combine the three enjoy much greater success in data management.

Understanding both the differences between the three and how they work together starts with understanding each discipline individually:

What is Enterprise Architecture?

Enterprise architecture defines the structure and operation of an organization. Its desired outcome is to determine current and future objectives and translate those goals into a blueprint of IT capabilities.

A useful analogy for understanding enterprise architecture is city planning. A city planner devises the blueprint for how a city will come together and how people will interact with it. They need to be cognizant of regulations (zoning laws) and understand the current state of the city and its infrastructure.

A good city planner means fewer false starts, less waste and faster, more efficient execution of the project.

In this respect, a good enterprise architect is a lot like a good city planner.

What is Data Architecture?

The Data Management Body of Knowledge (DMBOK) defines data architecture as “specifications used to describe existing state, define data requirements, guide data integration, and control data assets as put forth in a data strategy.”

So data architecture involves models, policy rules or standards that govern what data is collected and how it is stored, arranged, integrated and used within an organization and its various systems. The desired outcome is enabling stakeholders to see business-critical information regardless of its source and relate to it from their unique perspectives.

There is some crossover between enterprise and data architecture. This is because data architecture is inherently an offshoot of enterprise architecture. Where enterprise architects take a holistic, enterprise-wide view in their duties, data architects’ tasks are much more refined and focused. If an enterprise architect is the city planner, then a data architect is an infrastructure specialist – think plumbers, electricians etc.

For a more in-depth look at enterprise architecture vs. data architecture, see: The Difference Between Data Architecture and Enterprise Architecture

What is Business Process Architecture?

Business process architecture describes an organization’s business model, strategy, goals and performance metrics.

It provides organizations with a method of representing the elements of their business and how they interact with the aim of aligning people, processes, data, technologies and applications to meet organizational objectives. With it, organizations can paint a real-world picture of how they function, including opportunities to create, improve, harmonize or eliminate processes to improve overall performance and profitability.

Enterprise, Data and Business Process Architecture in Action

A successful data-driven business combines enterprise architecture, data architecture and business process architecture. Integrating these disciplines from the ground up ensures a solid digital foundation on which to build. A strong foundation is necessary because of the amount of data businesses already have to manage. In the last two years, more data has been created than in all of humanity’s history.

And it’s still soaring. Analysts predict that by 2020, we’ll create about 1.7 megabytes of new information every second for every human being on the planet.

While it’s a lot to manage, the potential gains of becoming a data-driven enterprise are too high to ignore. Fortune 1000 companies could potentially net an additional $65 million in income with access to just 10 percent more of their data.

To effectively employ enterprise architecture, data architecture and business process architecture, it’s important to know the differences in how they operate and their desired business outcomes.

Combining Enterprise, Data and Business Process Architecture for Better Data Management

Historically, these three disciplines have been siloed, without an inherent means of sharing information. Therefore, collaboration between the tools and relevant stakeholders has been difficult.

To truly power a data-driven business, removing these silos is paramount, so as not to limit the potential analysis your organization can carry out. Businesses that understand and adopt this approach will benefit from much better data management when it comes to the ‘3 Vs.’

They’ll be better able to cope with the massive volumes of data a data-driven business will introduce; be better equipped to handle the increased velocity of data, processing data accurately and quickly to keep time to market low; and be able to effectively manage data from a growing variety of different sources.

In essence, enabling collaboration between enterprise architecture, data architecture and business process architecture helps an organization manage “any data, anywhere” – or Any2. This all-encompassing view provides the potential for deeper data analysis.

However, attempting to manage all your data without all the necessary tools is like trying to read a book without all the chapters. And trying to manage data with a host of uncollaborative, disparate tools is like trying to read a story with chapters from different books. Clearly neither approach is ideal.

Unifying the disciplines as the foundation for data management provides organizations with the whole ‘data story.’

The importance of getting the whole data story should be very clear considering the aforementioned statistic – Fortune 1000 companies could potentially net an additional $65 million in income with access to just 10 percent more of their data.

Download our eBook, Solving the Enterprise Data Dilemma, to learn more about how data management tools – particularly enterprise architecture, data architecture and business process architecture – work in tandem.