Categories
erwin Expert Blog

Automated Data Management: Stop Drowning in Your Data 

Given the wealth of data that data-driven organizations must handle, many are increasingly adopting automated data management.

There are 2.5 quintillion bytes of data being created every day, and that figure is increasing in tandem with the production of and demand for Internet of Things (IoT) devices. However, Forrester reports that between 60 and 73 percent of all data within an enterprise goes unused.

Collecting all that data is pointless if it’s not going to be used to deliver accurate and actionable insights.

But the reality is that there isn't enough time, money or people for effective data management using manual processes. Organizations won't be able to take advantage of analytics tools to become data-driven unless they establish a foundation for agile and complete data management. And organizations that don't employ automated data management risk being left behind.

In addition to taking the burden off already stretched internal teams, automated data management’s most obvious benefit is that it’s a key enabler of data-driven business. Without it, a truly data-driven approach to business is either ineffective, or impossible, depending on the scale of data you’re working with.

This is because either too much data is left unaccounted for – and too much potential revenue left on the table – for the strategy to be considered effective, or there's so much disparity among data sources and storage silos that data quality suffers to an insurmountable degree, rendering any analysis fundamentally flawed.

But simply enabling the strategy isn’t the most compelling use case, or organizations across the board would have implemented it already.

The Case for Automated Data Management

Business leaders and decision-makers want a business case for automated data management.

So here it is …

Without automation, business transformation will be stymied. Companies, especially large ones with thousands of systems, files and processes, will be particularly challenged by taking a manual approach. And outsourcing these data management efforts to professional services firms only delays schedules and increases cost.

By automating data cataloging and data mapping inclusive of data at rest and data in motion through the integration lifecycle process, organizations will benefit from:

  • A metadata-driven automated framework for cataloging data assets and their flows across the business
  • An efficient, agile and dynamic way to generate data lineage from operational systems (databases, data models, file-based systems, unstructured files and more) across the information management architecture
  • Easy access to what data aligns with specific business rules and policies
  • The ability to inform how data is transformed, integrated and federated throughout business processes – complete with full documentation
  • Faster project delivery and lower costs because data is managed internally, without the need to outsource data management efforts
  • Assurance of data quality, so analysis is reliable and new initiatives aren’t beleaguered with false starts
  • A seamlessly governed data pipeline, operationalized to the benefit of all stakeholders
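The first two bullets center on metadata-driven cataloging and lineage. As a rough sketch of that idea – the asset and edge structures below are illustrative assumptions, not erwin's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    """A cataloged data asset described by its metadata."""
    name: str
    source_system: str          # e.g. a database, file store or stream
    tags: list = field(default_factory=list)

@dataclass
class LineageEdge:
    """Records that `target` is derived from `source` via `transformation`."""
    source: DataAsset
    target: DataAsset
    transformation: str

def upstream_of(asset, edges):
    """Walk lineage edges backwards to find every upstream source."""
    direct = [e.source for e in edges if e.target is asset]
    result = list(direct)
    for parent in direct:
        result.extend(upstream_of(parent, edges))
    return result

orders_raw = DataAsset("orders_raw", "crm_db")
orders_clean = DataAsset("orders_clean", "warehouse")
revenue = DataAsset("revenue_report", "bi_tool")

edges = [
    LineageEdge(orders_raw, orders_clean, "deduplicate + validate"),
    LineageEdge(orders_clean, revenue, "aggregate by month"),
]

# Trace the revenue report back through its full lineage.
print([a.name for a in upstream_of(revenue, edges)])  # ['orders_clean', 'orders_raw']
```

Automating this kind of traversal across thousands of assets is what makes lineage generation "efficient, agile and dynamic" rather than a manual documentation exercise.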

erwin Data Intelligence


Healthy Co-Dependency: Data Management and Data Governance

Data management and data governance are now more important than ever. The hyper-competitive nature of data-driven business means organizations need to get more out of their data than ever before – and fast.

A few data-driven exemplars have led the way, turning data into actionable insights that influence everything from corporate structure to new products and pricing. “Few” being the operative word.

It’s true, data-driven business is big business. Huge actually. But it’s dominated by a handful of organizations that realized early on what a powerful and disruptive force data can be.

The benefits of such data-driven strategies speak for themselves: Netflix has replaced Blockbuster, and Uber continues to shake up the taxi business. Organizations across industries are following suit, fighting to become the next big, disruptive players.

But in many cases, these attempts have failed or are on the verge of doing so.

Now with the General Data Protection Regulation (GDPR) in effect, data that is unaccounted for is a potential data disaster waiting to happen.

So organizations need to understand that getting more out of their data isn’t necessarily about collecting more data. It’s about unlocking the value of the data they already have.

Data Management and Data Governance Co-Dependency

The Enterprise Data Dilemma

However, most organizations don’t know exactly what data they have or even where some of it is. And some of the data they can account for is going to waste because they don’t have the means to process it. This is especially true of unstructured data types, which organizations are collecting more frequently.

Considering that 73 percent of company data goes unused, it’s safe to assume your organization is dealing with some if not all of these issues.

Big picture, this means your enterprise is missing out on thousands, perhaps millions, of dollars in revenue.

The smaller picture? You’re struggling to establish a single source of data truth, which contributes to a host of problems:

  • Inaccurate analysis and discrepancies in departmental reporting
  • Inability to manage the amount and variety of data your organization collects
  • Duplications and redundancies in processes
  • Issues determining data ownership, lineage and access
  • Difficulty achieving and sustaining compliance

To avoid such circumstances and get more value out of data, organizations need to harmonize their approach to data management and data governance, using a platform of established tools that work in tandem while also enabling collaboration across the enterprise.

Data management drives the design, deployment and operation of systems that deliver operational data assets for analytics purposes.

Data governance delivers these data assets within a business context, tracking their physical existence and lineage, and maximizing their security, quality and value.

Although these two disciplines approach data from different perspectives (IT-driven and business-oriented), they depend on each other. And this co-dependency helps an organization make the most of its data.

The P-M-G Hub

Together, data management and data governance form a critical hub for data preparation, data modeling and data governance. How?

It starts with a real-time, accurate picture of the data landscape, including “data at rest” in databases, data warehouses and data lakes and “data in motion” as it is integrated with and used by key applications. That landscape also must be controlled to facilitate collaboration and limit risk.

But knowing what data you have and where it lives is complicated, so you need to create and sustain an enterprise-wide view of, and easy access to, underlying metadata. That's a tall order with numerous data types and data sources that were never designed to work together and data infrastructures that have been cobbled together over time with disparate technologies, poor documentation and little thought for downstream integration. As a result, the applications and initiatives that depend on a solid data infrastructure may be compromised, and data analysis may be based on faulty insights.

However, these issues can be addressed with a strong data management strategy and technology to enable the data quality required by the business, encompassing data cataloging (integration of data sets from various sources), mapping, versioning, maintenance of business rules and glossaries, and metadata management (associations and lineage).

Being able to pinpoint what data exists and where must be accompanied by an agreed-upon business understanding of what it all means in common terms that are adopted across the enterprise. Having that consistency is the only way to assure that insights generated by analyses are useful and actionable, regardless of business department or user exploring a question. Additionally, policies, processes and tools that define and control access to data by roles and across workflows are critical for security purposes.

These issues can be addressed with a comprehensive data governance strategy and technology to determine master data sets, discover the impact of potential glossary changes across the enterprise, audit and score adherence to rules, discover risks, and appropriately and cost-effectively apply security to data flows, as well as publish data to people/roles in ways that are meaningful to them.
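One slice of such a governance strategy – applying access controls to data by role – can be sketched as follows. The roles and classifications are invented examples, not a prescribed policy:

```python
# Map each role to the data classifications its policy permits.
# These role names and classifications are illustrative assumptions.
ACCESS_POLICY = {
    "compliance_officer": {"pii", "financial", "public"},
    "business_analyst": {"financial", "public"},
    "contractor": {"public"},
}

def can_access(role, data_classification):
    """True if the role's policy grants access to this data classification."""
    return data_classification in ACCESS_POLICY.get(role, set())

# Publishing data "to people/roles in ways that are meaningful to them"
# starts with checks like these at every access point.
print(can_access("compliance_officer", "pii"))  # True
print(can_access("contractor", "financial"))    # False
```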

Data Management and Data Governance: Play Together, Stay Together

When data management and data governance work in concert, empowered by the right technology, they inform, guide and optimize each other. The result for an organization that takes such a harmonized approach is an automated, real-time, high-quality data pipeline.

Then all stakeholders – data scientists, data stewards, ETL developers, enterprise architects, business analysts, compliance officers, CDOs and CEOs – can access the data they're authorized to use and base strategic decisions on what is now a full inventory of reliable information.

The erwin EDGE creates an “enterprise data governance experience” through integrated data mapping, business process modeling, enterprise architecture modeling, data modeling and data governance. No other software platform on the market touches every aspect of the data management and data governance lifecycle to automate and accelerate the speed to actionable business insights.


Data Governance Tackles the Top Three Reasons for Bad Data

In modern, data-driven business, it's essential that organizations understand the reasons for bad data and how best to address them. Data has revolutionized how organizations operate, from customer relationships to strategic decision-making and everything in between. And with more emphasis on automation and artificial intelligence, the need for data/digital trust also has risen. Even minor errors in an organization's data can cause massive headaches because the inaccuracies don't involve just one corrupt data unit.

Inaccurate or “bad” data also affects relationships to other units of data, making the business context difficult or impossible to determine. For example, are data units tagged according to their sensitivity [i.e., personally identifiable information subject to the General Data Protection Regulation (GDPR)], and is data ownership and lineage discernable (i.e., who has access, where did it originate)?

Relying on inaccurate data will hamper decisions, decrease productivity, and yield suboptimal results. Given these risks, organizations must increase their data’s integrity. But how?

Integrated Data Governance

Modern, data-driven organizations are essentially data production lines. And like physical production lines, their associated systems and processes must run smoothly to produce the desired results. Sound data governance provides the framework to address data quality at its source, ensuring any data recorded and stored is done so correctly, securely and in line with organizational requirements. But it needs to integrate all the data disciplines.

By integrating data governance with enterprise architecture, businesses can define application capabilities and interdependencies within the context of enterprise strategy, prioritizing technology investments that align with business goals and produce the desired outcomes. A business process and analysis component enables an organization to clearly define, map and analyze workflows and build models to drive process improvement, as well as identify business practices susceptible to the greatest security, compliance or other risks and where controls are most needed to mitigate exposures.

And data modeling remains the best way to design and deploy new relational databases with high-quality data sources and support application development. Being able to cost-effectively and efficiently discover, visualize and analyze “any data” from “anywhere” underpins large-scale data integration, master data management, Big Data and business intelligence/analytics with the ability to synthesize, standardize and store data sources from a single design, as well as reuse artifacts across projects.

Let’s look at some of the main reasons for bad data and how data governance helps confront these issues …

Reasons for Bad Data

Reasons for Bad Data: Data Entry

The concept of "garbage in, garbage out" explains the most common cause of inaccurate data: mistakes made at data entry. While the concept is easy to understand, totally eliminating errors isn't feasible, so organizations need standards and systems to limit the extent of the damage.

With the right data governance approach, organizations can ensure the right people aren’t left out of the cataloging process, so the right context is applied. Plus you can ensure critical fields are not left blank, so data is recorded with as much context as possible.

With the business process integration discussed above, you’ll also have a single metadata repository.

All of this ensures sensitive data doesn’t fall through the cracks.
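A rough sketch of the data-entry safeguard described above – ensuring critical fields aren't left blank. The field names and rules here are invented, not from any particular product:

```python
# Fields that must be populated before a record is accepted.
# These names are illustrative assumptions for the sketch.
REQUIRED_FIELDS = ("customer_id", "email", "consent_given")

def validate_record(record):
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    for field_name in REQUIRED_FIELDS:
        value = record.get(field_name)
        if value is None or (isinstance(value, str) and not value.strip()):
            errors.append(f"missing required field: {field_name}")
    return errors

# A blank email and an absent consent flag are both caught at entry,
# before the bad record can propagate downstream.
print(validate_record({"customer_id": "C-102", "email": " "}))
```

Checks like this at the point of capture are far cheaper than reconciling bad records after they have spread through reports and integrations.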

Reasons for Bad Data: Data Migration

Data migration is another key reason for bad data. Modern organizations often juggle a plethora of data systems that process data from an abundance of disparate sources, creating a melting pot for potential issues as data moves through the pipeline, from tool to tool and system to system.

The solution is to introduce a predetermined standard of accuracy through a centralized metadata repository with data governance at the helm. In essence, metadata is data about data: it ensures that no matter where data sits in the pipeline, it retains the context needed to be deciphered, analyzed and then used strategically.
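One way to picture this: wrap each payload with descriptive metadata so its context survives as data moves between systems. The keys and tag values below are illustrative assumptions, not a specific product's schema:

```python
def wrap_with_metadata(payload, source_system, sensitivity):
    """Bundle a record with the metadata a downstream system needs to
    decipher it: where it came from and how sensitive it is."""
    return {
        "payload": payload,
        "metadata": {
            "source_system": source_system,
            "sensitivity": sensitivity,  # e.g. "pii" flags GDPR-relevant data
        },
    }

record = wrap_with_metadata({"name": "Ada"}, "crm_db", "pii")

# A receiving system can check the context before storing or processing,
# instead of guessing how the data should be handled.
print(record["metadata"]["sensitivity"])  # pii
```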

The potential fallout of using inaccurate data has become even more severe with the GDPR’s implementation. A simple case of tagging and subsequently storing personally identifiable information incorrectly could lead to a serious breach in compliance and significant fines.

Such fines must be considered along with the costs resulting from any PR fallout.

Reasons for Bad Data: Data Integration

The proliferation of data sources, types, and stores increases the challenge of combining data into meaningful, valuable information. While companies are investing heavily in initiatives to increase the amount of data at their disposal, most information workers are spending more time finding the data they need than putting it to work, according to Database Trends and Applications (DBTA). erwin is co-sponsoring a DBTA webinar on this topic on July 17. To register, click here.

The need for faster and smarter data integration capabilities is growing. At the same time, to deliver business value, people need information they can trust to act on, so balancing governance is absolutely critical, especially with new regulations.

Organizations often invest heavily in individual software development tools for managing projects, requirements, designs, development, testing, deployment, releases, etc. Tools lacking interoperability often result in cumbersome manual processes and heavy time investments to synchronize data or processes between these disparate tools.

Data integration combines data from disparate sources into a unified view, making it more actionable and valuable to those accessing it.
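That unified view can be sketched in miniature. The sources, keys and fields below are invented examples:

```python
# Two disparate sources holding fragments of the same customer,
# keyed on a shared identifier. All names here are illustrative.
crm = {"C-102": {"name": "Ada Lovelace"}}
billing = {"C-102": {"plan": "enterprise", "overdue": False}}

def unified_view(customer_id, *sources):
    """Merge every source's record for one customer into a single dict."""
    merged = {"customer_id": customer_id}
    for source in sources:
        merged.update(source.get(customer_id, {}))
    return merged

# One consolidated record instead of two partial, siloed ones.
print(unified_view("C-102", crm, billing))
```

Real integration tooling adds schema mapping, conflict resolution and lineage on top, but the goal is the same: one trusted record assembled from many partial ones.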

Getting the Data Governance “EDGE”

The benefits of integrated data governance discussed above won’t be realized if it is isolated within IT with no input from other stakeholders, the day-to-day data users – from sales and customer service to the C-suite. Every data citizen has DG roles and responsibilities to ensure data units have context, meaning they are labeled, cataloged and secured correctly so they can be analyzed and used properly. In other words, the data can be trusted.

Once an organization understands that IT and the business are both responsible for data, it can develop comprehensive, holistic data governance capable of:

  • Reaching every stakeholder in the process
  • Providing a platform for understanding and governing trusted data assets
  • Delivering the greatest benefit from data wherever it lives, while minimizing risk
  • Helping users understand the impact of changes made to a specific data element across the enterprise

To reduce the risks of and tackle the reasons for bad data and realize larger organizational objectives, organizations must make data governance everyone’s business.

To learn more about the collaborative approach to data governance and how it helps compliance in addition to adding value and reducing costs, get the free e-book here.

Data governance is everyone's business


The Connection Between Business Process Modeling and Standard Operating Procedures

We began a new blog series last week on business process (BP) modeling and its role within the enterprise. This week’s focus is on the connection between business process modeling and standard operating procedures. Specifically, using BP tools to help organizations streamline how they manage their standard operating procedures (SOPs).

Standard Operating Procedures: A New Approach to Organizing SOP Information

Manually maintaining the standard operating procedures that inform business processes can be a monster of a task. In most industries, SOPs typically are documented in multiple Word or Excel files.

In a process-centric world, heavy lifting is involved when an organization requires a change to an end-to-end process: Each SOP affected by the change may be associated with dozens or even hundreds of steps that exist between the start and conclusion of the process – and the alteration must be made to all of them wherever they occur.

You can imagine the significant man hours that go into wading through a sea of documents to discover and amend relevant SOPs and communicate these business process-related changes across the organization. And you can guess at the toll on productivity and efficiency that the business experiences as a result.

Companies that are eager to embrace business process optimization are keen to have a better approach to organizing SOP information to improve transparency and insight for speedier and more effective change management.

There’s another benefit to be realized from taking a new approach to SOP knowledge management, as well. With better organization comes an increased ability to convey information about current and changed standard operating procedures; companies can offer on-the-fly access to standard practices to teams across the enterprise.

That consistent and easily obtained business process information can help employees innovate, sharing ideas about additional improvements and innovations that could be made to standard operating procedures. It could also save them the time they might otherwise spend on “reinventing the wheel” for SOPs that already exist but that they don’t know about.

Balfour Beatty Construction, the fourth largest general builder in the U.S., saw big results when it standardized and transformed its process documentation, giving workers access to corporate SOPs from any location on almost any device.

For a construction company, keeping field workers out of danger is a major concern, and providing these employees with immediate information about how to accomplish a multi-step business process – such as clearing a site – can promote their safety. Among the benefits Balfour Beatty saw were a 5% gain in productivity and a reduction in training time for new employees, who were now able to tap directly into SOP data.

Business Process Modeling & Standard Operating Procedures

Using Business Process Modeling to Transform SOP Management

How does a company transform manual SOP documentation to more effectively support change management as part of business process optimization? It’s key to adopt business process (BP) modeling and management software to create and store SOP documentation in a single repository, tying them to the processes they interact with for faster discovery and easier maintenance.

Organizations that move to this methodology, for example, will have the advantage of only needing to change an affected SOP in that one repository; the change automatically will propagate to all related processes and procedures.

In effect, the right BP tool automatically generates new SOPs with the necessary updated information.
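The single-repository idea can be sketched as follows. The SOP identifiers, texts and process names here are invented:

```python
# One repository of SOPs, referenced by id from every process that uses them.
# All names and texts are illustrative assumptions.
sops = {"SOP-7": "Clear the site: check perimeter, then power down equipment."}

processes = {
    "site_handover": ["SOP-7"],
    "emergency_shutdown": ["SOP-7"],
}

def sop_text_for(process_name):
    """Resolve a process's SOP references against the central repository."""
    return [sops[sop_id] for sop_id in processes[process_name]]

# One edit in the repository...
sops["SOP-7"] = "Clear the site: check perimeter, power down, log the closure."

# ...and every process that references the SOP sees the update immediately,
# with no separate edits to scattered documents.
print(sop_text_for("site_handover") == sop_text_for("emergency_shutdown"))  # True
```

The contrast with the manual approach is the point: references replace copies, so a change is made once instead of hundreds of times.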

Such a tool is also suitable for use in conjunction with controlled document repositories that are typically required in heavily regulated industries, such as pharmaceuticals, financial services and healthcare, as part of satisfying compliance mandates. All SOP documentation already is stored in the same repository, rather than scattered across files.

But a business process diagramming and modeling solution comes in handy in these cases by providing a web-based front end that exposes high-level processes and how they map to related SOPs. This helps users better navigate them to institute and maintain changes and to access job-related procedure information.

To find out about how erwin can streamline SOP document management to positively impact costs, workloads and user benefits, please click here.

In our next blog, we’ll look at how business process modeling strengthens digital transformation initiatives.

Data-Driven Business Transformation whitepaper


Business Process Modeling and Its Role Within the Enterprise

To achieve its objectives, an organization must have a complete understanding of its processes. Therefore, business process design and analysis are key to defining how a business operates and to ensuring employees understand and are accountable for carrying out their responsibilities.

Understanding system interactions, business processes and organizational hierarchies creates alignment, with everyone pulling in the same direction, and supports informed decision-making for optimal results and continuous improvement.

Those organizations operating in industries in which quality, health, safety and environmental issues are constant concerns must be even more in tune with their complexities. After all, revenue and risk are inextricably linked.

What Is Business Process Modeling and Why Does It Matter?

A business process is “an activity or set of activities that will accomplish a specific organizational goal,” as defined by TechTarget. Business process modeling “links business strategy to IT systems development to ensure business value,” according to Gartner.

The research firm goes on to explain that by combining "process/workflow, functional, organizational and data/resource views with underlying metrics, such as costs, cycle times and responsibilities," you establish a foundation for analyzing value chains, activity-based costs, bottlenecks, critical paths and inefficiencies.

To clearly document, define, map and analyze workflows and build models to drive process improvement and therefore business transformation, you’ll need to invest in a business process (BP) modeling solution.

Only then will you be able to determine where cross-departmental and intra-system process chains break down, as well as identify business practices susceptible to the greatest security, compliance, standards or other risks and where controls and audits are most needed to mitigate exposures.

Companies that maintain accurate BP models also are well-positioned to analyze and optimize end-to-end process threads that help accomplish such strategic business objectives as improving customer journeys and maximizing employee retention. You also can slice and dice models in multiple other ways, including to improve collaboration and efficiency.

Useful change only comes from evaluating process models, spotting sub-optimalities, and taking corrective actions. Business process modeling is also critical to data governance, helping organizations understand their data assets in the context of where their data is and how it’s used in various processes. Then you can drive data opportunities, like increasing revenue, and limit data risks, such as avoiding regulatory and compliance gaffes.

How to Do Business Process Modeling

Business process modeling software creates the documentation and graphical roadmap of how a business works today, detailing the tasks, responsible parties and data elements involved in processes and the interactions that occur across systems, procedures and organizational hierarchies. That knowledge, in turn, prepares the organization for tomorrow’s changes.

Effective BP technology will assist your business in documenting, managing and communicating your business processes in a structured manner that drives value and reduces risks.

It should enable you to:

  • Develop and capture multiple artifacts in a repository to support business-centric objectives
  • Support process improvement methodologies that boost critical capabilities
  • Identify gaps in process documentation to retain internal mastery over core activities
  • Reduce maintenance costs and increase employee access to critical knowledge
  • Incorporate any data from any location into business process models

In addition, a business process modeling solution should work in conjunction with the other data management domains (i.e., enterprise architecture, data modeling and data governance) to provide data clarity across all organizational roles and goals.

Data Governance, Data Modeling, Enterprise Architecture, Business Process - erwin EDGE

Business Process Modeling and Enterprise Data Management

Data isn’t just for “the data people.” To survive and thrive in the digital age, among the likes of Amazon, Airbnb, Netflix and Uber that have transformed their respective industries, organizations must extend the use, understanding and trust of their data every day across every business function – from the C-level to the front line.

A common source of data leveraged by business process personnel, enterprise architects, data stewards and others encourages a greater understanding of how different line-of-business operations work together as a single unit. Links to data terms and categories contained within a centralized business glossary let enterprises eliminate ambiguity in process and policy procedure documents.
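The glossary linkage can be sketched in miniature. The column names and business terms below are invented examples:

```python
# A central business glossary mapping technical column names from different
# systems to one agreed business term. All entries are illustrative.
glossary = {
    "cust_id": "Customer Identifier",
    "CUSTNO": "Customer Identifier",
    "arr": "Annual Recurring Revenue",
}

def business_term(technical_name):
    """Resolve a technical name to its agreed business term, or flag it
    for steward review if no glossary entry exists yet."""
    return glossary.get(technical_name, "UNDEFINED - needs steward review")

# Two systems' different column names resolve to the same business term,
# eliminating ambiguity in process and policy documents.
print(business_term("cust_id") == business_term("CUSTNO"))  # True
```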

Integrated business models based on a sole source of truth also offer different views for different stakeholders based on their needs, while tight interconnection with enterprise architecture joins Process, Organization, Location, Data, Applications, and Technology (POLDAT) assets to explanatory models that support informed plans for change.

Seamless integration of business process models with enterprise architecture, data modeling and data governance reveals the interdependence between the workforce, the processes they perform, the actively governed assets they interact with and their importance to the business.

Then everyone is invested in and accountable for data, the fuel for the modern enterprise.

To learn more about business process modeling and its role within data-driven business transformation, click here.

Data-Driven Business Transformation whitepaper


Defining Data Governance: What Is Data Governance?

Data governance (DG) is one of the fastest growing disciplines, yet when it comes to defining data governance many organizations struggle.

Dataversity says DG is “the practices and processes which help to ensure the formal management of data assets within an organization.” These practices and processes can vary, depending on an organization’s needs. Therefore, when defining data governance for your organization, it’s important to consider the factors driving its adoption.

The General Data Protection Regulation (GDPR) has contributed significantly to data governance’s escalating prominence. In fact, erwin’s 2018 State of Data Governance Report found that 60% of organizations consider regulatory compliance to be their biggest driver of data governance.

Defining data governance: DG Drivers

Other significant drivers include improving customer trust/satisfaction and encouraging better decision-making, but they trail behind regulatory compliance at 49% and 45% respectively. Reputation management (30%), analytics (27%) and Big Data (21%) also are factors.

But data governance’s adoption is of little benefit without understanding how DG should be applied within these contexts. This is arguably one of the issues that’s held data governance back in the past.

With no set definition, and the historical practice of isolating data governance within IT, organizations often have had different ideas of what data governance is, even between departments. With this inter-departmental disconnect, it’s not hard to imagine why data governance has historically left a lot to be desired.

However, with the mandate for DG within GDPR, organizations must work on defining data governance organization-wide to manage its successful implementation, or face GDPR’s penalties.

Defining Data Governance: Desired Outcomes

A great place to start when defining an organization-wide DG initiative is to consider the desired business outcomes. This approach ensures that all parties involved have a common goal.

Past examples of Data Governance 1.0 were mainly concerned with cataloging data to support search and discovery. The nature of this approach, coupled with the fact that DG initiatives were typically siloed within IT departments without input from the wider business, meant the practice often struggled to add value.

Without input from the wider business, the data cataloging process suffered from a lack of context. By neglecting to include the organization’s primary data citizens – those who manage and/or leverage data on a day-to-day basis for analysis and insight – organizational data was often plagued by duplications, inconsistencies and poor quality.

The nature of modern data-driven business means that such data citizens are spread throughout the organization. Furthermore, many key data citizens – think of value-adding approaches to data use, such as data-driven marketing – aren’t actively involved with IT departments.

Because of this, Data Governance 1.0 initiatives fizzled out with discouraging frequency.

This is, of course, problematic for organizations that identify regulatory compliance as a driver of data governance. Considering the nature of data-driven business – with new data being constantly captured, stored and leveraged – meeting compliance standards can’t be viewed as a one-time fix, so data governance can’t be de-prioritized and left to fizzle out.

Even businesses that manage to sustain the level of input data governance needs indefinitely will find the Data Governance 1.0 approach wanting. In terms of regulatory compliance, the lack of context associated with Data Governance 1.0, and the inaccuracies it leads to, mean that potentially serious data governance issues could go undiscovered and result in repercussions for non-compliance.

We recommend organizations look beyond just data cataloging and compliance as desired outcomes when implementing DG. In the data-driven business landscape, data governance finds its true potential as a value-added initiative.

Organizations that identify data governance as a value-added initiative should still consider Data Governance 1.0’s shortcomings, and any organization that hasn’t identified adding value as a desired business outcome should ask itself why.

Many of the biggest market disruptors of the 21st century have been digitally savvy start-ups with robust data strategies – think Airbnb, Amazon and Netflix. Without high data governance standards, such companies would lack the trust in their data needed to confidently execute such digital-first strategies.

Therefore, in the data-driven business era, organizations should consider a Data Governance 2.0 strategy, with DG becoming an organization-wide, strategic initiative that de-silos the practice from the confines of IT.

This collaborative take on data governance intrinsically involves data’s biggest beneficiaries and users in the governance process, meaning functions like data cataloging benefit from greater context, accuracy and consistency.

It also means that organizations can have greater trust in their data and be more assured of meeting the standards set for regulatory compliance. It means that organizations can better respond to customer needs through more accurate methods of profiling and analysis, improving rates of satisfaction. And it means that organizations are less likely to suffer data breaches and their associated damages.

Defining Data Governance: The Enterprise Data Governance Experience (EDGE)

The EDGE is the erwin approach to Data Governance 2.0, empowering an organization to:

  • Manage any data, anywhere (Any2)
  • Instill a culture of collaboration and organizational empowerment
  • Introduce an integrated ecosystem for data management that draws from one central repository and ensures data (including real-time changes) is consistent throughout the organization
  • Have visibility across domains by breaking down silos between business and IT and introducing a common data vocabulary
  • Have regulatory peace of mind through mitigation of a wide range of risks, from GDPR to cybersecurity. 

To learn more about implementing data governance, click here.

Take the DG RediChek

Five Pillars of Data Governance Readiness: Delivery Capability

The five pillars of data governance readiness should be the starting point for implementing or revamping any DG initiative.

In a recent CSO Magazine article, “Why data governance should be corporate policy,” the author states: “Data is like water, and water is a fundamental resource for life, so data is an essential resource for the business. Data governance ensures this resource is protected and managed correctly enabling us to meet our customer’s expectations.”

Over the past few weeks, we’ve been exploring the five pillars of data governance (DG) readiness, and this week we turn our attention to the fifth and final pillar, delivery capability.

Together, the five pillars of data governance readiness work as a step-by-step guide to a successful DG implementation and ongoing initiative.

As a refresher, the first four pillars are:

  1. The starting point is garnering initiative sponsorship from executives, before fostering support from the wider organization.

  2. Organizations should then appoint a dedicated team to oversee and manage the initiative. Although DG is an organization-wide strategic initiative, it needs experience and leadership to guide it.

  3. Once the above pillars are accounted for, the next step is to understand how data governance fits with the wider data management suite so that all components of a data strategy work together for maximum benefits.

  4. Finally, organizations should adopt an enterprise data management methodology as a plan of action to assemble the necessary tools.

Once you’ve completed these steps, how do you go about picking the right solution for enterprise-wide data governance?

Five Pillars of Data Governance: Delivery Capability – What’s the Right Solution?

Many organizations don’t think about enterprise data governance technologies when they begin a data governance initiative. They believe a general-purpose tool suite, like those from Microsoft, can support their DG initiative. That’s simply not the case.

Selecting the proper data governance solution should be part of developing the data governance initiative’s technical requirements. However, the first thing to understand is that the “right” solution is subjective.

Data stewards work with metadata rather than data 80 percent of the time. As a result, successful and sustainable data governance initiatives are supported by a full-scale, enterprise-grade metadata management tool.

Additionally, many organizations haven’t implemented data quality products when they begin a DG initiative. Product selections, including those for data quality management, should be based on the organization’s business goals, its current state of data quality and enterprise data management, and best practices as promoted by the data quality management team.

If your organization doesn’t have an existing data quality management product, a data governance initiative can support the need for data quality and the eventual evaluation and selection of the proper data quality management product.

Enterprise data modeling is also important. A component of enterprise data architecture, it’s an enabling force in the performance of data management and successful data governance. Having the capability to manage data architecture and data modeling with the optimal products can have a positive effect on DG by providing the initiative architectural support for the policies, practices, standards and processes that data governance creates.

Finally, and perhaps most important, the lack of a formal data governance team/unit has been cited as a leading cause of DG failure. Having the capability to manage all data governance and data stewardship activities has a positive effect.

Shopping for Data Governance Technology

DG is part of a larger data puzzle. Although it’s a key enabler of data-driven business, it’s only effective in the context of the data management suite in which it belongs.

Therefore when shopping for a data governance solution, organizations should look for DG tools that unify critical data governance domains, leverage role-appropriate interfaces to bring together stakeholders and processes to support a culture committed to acknowledging data as the mission-critical asset that it is, and orchestrate the key mechanisms required to discover, fully understand, actively govern and effectively socialize and align data to the business.

Data Governance Readiness: Delivery Capability

Here’s an initial checklist of questions to ask in your evaluation of a DG solution. Does it support:

  • Relational, unstructured, on-premise and cloud data?
  • Business-friendly environment to build business glossaries with taxonomies of data standards?
  • Unified capabilities to integrate business glossaries, data dictionaries and reference data, data quality metrics, business rules and data usage policies?
  • Regulating data and managing data collaboration through assigned roles, business rules and responsibilities, and defined governance processes and workflows?
  • Viewing data dashboards, KPIs and more via configurable role-based interfaces?
  • Providing key integrations with enterprise architecture, business process modeling/management and data modeling?
  • A SaaS model for rapid deployment and low TCO?

To assess your data governance readiness, especially with the General Data Protection Regulation about to take effect, click here.

You also can try erwin DG for free. Click here to start your free trial.

Take the DG RediChek

Categories
erwin Expert Blog

A New Wave in Application Development

Application development is new again.

The ever-changing business landscape – fueled by digital transformation initiatives indiscriminate of industry – demands businesses deliver innovative customer- and partner-facing solutions, not just tactical apps to support internal functions.

Therefore, application developers are playing an increasingly important role in achieving business goals. The financial services sector is a notable example, with companies like JPMorgan Chase spending millions on emerging fintech like online and mobile tools for opening accounts and completing transactions, real-time stock portfolio values, and electronic trading and cash management services.

But businesses are finding that creating market-differentiating applications to improve the customer experience, and subsequently customer satisfaction, requires some significant adjustments. For example, using non-relational database technologies, building another level of development expertise, and driving optimal data performance will be on their agendas.

Of course, all of this must be done with a focus on data governance – backed by data modeling – as the guiding principle for accurate, real-time analytics and business intelligence (BI).

Evolving Application Development Requirements

The development organization must identify which systems, processes and even jobs must evolve to meet demand. The factors it will consider include agile development, skills transformation and faster querying.

Rapid delivery is the rule, with products released in usable increments in sprints as part of ongoing, iterative development. Developers can move from conceptual models for defining high-level requirements to creating low-level physical data models to be incorporated directly into the application logic. This route facilitates dynamic change support to drive speedy baselining, fast-track sprint development cycles and quick application scaling. Logical modeling then follows.

Agile application development usually goes hand in hand with using NoSQL databases, so developers can take advantage of more pliable data models. This technology has more dynamic and flexible schema design than relational databases, and it supports whatever data types and query options an application requires, with the processing efficiency, scalability and performance that Big Data and new-age apps’ real-time requirements demand. However, NoSQL skills aren’t widespread, so specific tools for modeling unstructured data in NoSQL databases can help staff used to RDBMS technology ramp up.
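To make that schema flexibility concrete, here is a minimal sketch in Python, using plain dictionaries and a list to stand in for a document store such as MongoDB. The collection name, fields and query helper are illustrative, not any particular product’s API:

```python
# Two "documents" in the same collection with different shapes --
# a document database accepts both without a schema migration.
products = [
    {"_id": 1, "name": "Laptop", "specs": {"cpu": "i7", "ram_gb": 16}},
    {"_id": 2, "name": "E-book", "file_format": "EPUB", "drm": False},
]

# A relational table would force both rows into one fixed set of columns;
# here each document carries only the fields it needs.
def find(collection, **criteria):
    """Return documents whose fields match all criteria (missing fields fail)."""
    return [
        doc for doc in collection
        if all(doc.get(k) == v for k, v in criteria.items())
    ]

print(find(products, name="E-book"))  # matches the second document only
```

The trade-off is the one the paragraph above implies: the application gains agility, but without governance and modeling, nothing stops inconsistent document shapes from accumulating.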

Finally, the shift to agile development and NoSQL technology as part of more complex data architectures is driving another change. Storage-optimized models are taking a back seat because a new format is available to support real-time app development – one that understands what’s being asked of the data and enables schemas to be structured to support application data access requirements for speedy responses to complex queries.

The NoSQL Paradigm

erwin DM NoSQL takes into account all the requirements of this new application development era. In addition to its modeling tools, the solution includes patent-pending Query-Optimized Modeling™, which replaces storage-optimized modeling and guides users in building schemas for optimal NoSQL application performance.

erwin DM NoSQL also embraces an “any-squared” approach to data management, so “any data” from “anywhere” can be visualized for greater understanding. And the solution now supports the Couchbase Data Platform in addition to MongoDB. Used in conjunction with erwin DG, businesses also can be assured that agility, speed and flexibility will not take precedence over the equally important need to stringently manage data.

With all this in place, enterprises will be positioned to deliver unique, real-time and responsive apps to enhance the customer experience and support new digital-transformation opportunities. At the same time, they’ll be able to preserve and extend the work they’ve already done in terms of maintaining well-governed data assets.

For more information about how to realize value from app development in the age of digital transformation with the help of data modeling and data governance, you can download our new e-book: Application Development Is New Again.

Pillars of Data Governance Readiness: Enterprise Data Management Methodology

Facebook’s data woes continue to dominate the headlines and further highlight the importance of having an enterprise-wide view of data assets. The high-profile case differs from other prominent data scandals in that it wasn’t a “breach,” per se. But questions of negligence persist, and in all cases, data governance is an issue.

This week, the Wall Street Journal ran a story titled “Companies Should Beware Public’s Rising Anxiety Over Data.” It discusses an IBM poll of 10,000 consumers in which 78% of U.S. respondents say a company’s ability to keep their data private is extremely important, yet only 20% completely trust organizations they interact with to maintain data privacy. In fact, 60% indicate they’re more concerned about cybersecurity than a potential war.

The piece concludes with a clear lesson for CIOs: “they must make data governance and compliance with regulations such as the EU’s General Data Protection Regulation [GDPR] an even greater priority, keeping track of data and making sure that the corporation has the ability to monitor its use, and should the need arise, delete it.”

With a more thorough data governance initiative and a better understanding of data assets, their lineage and useful shelf-life, and the privileges behind their access, Facebook likely could have gotten ahead of the problem and quelled it before it became an issue. Sometimes erasure is the best approach if the reward from keeping data onboard is outweighed by the risk.
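That shelf-life risk/reward judgment can be made concrete with a simple retention check over catalog metadata. The following is a hypothetical sketch – the asset names, fields and retention periods are illustrative, not taken from any real catalog or erwin product:

```python
from datetime import date, timedelta

# Hypothetical asset catalog entries: each records when the data was
# captured and the retention period its governance policy allows.
assets = [
    {"name": "campaign_leads_2015", "captured": date(2015, 3, 1), "retention_days": 730},
    {"name": "active_customers", "captured": date(2018, 1, 10), "retention_days": 3650},
]

def past_shelf_life(asset, today):
    """True if the asset has outlived its governed retention period."""
    return today > asset["captured"] + timedelta(days=asset["retention_days"])

# Assets flagged here are candidates for erasure: the risk of keeping
# them now outweighs the reward.
today = date(2018, 4, 1)
flagged = [a["name"] for a in assets if past_shelf_life(a, today)]
print(flagged)  # -> ['campaign_leads_2015']
```

A real implementation would pull the capture dates and retention policies from the governance catalog rather than hard-coding them, but the decision logic is the same.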

But perhaps Facebook is lucky the issue arose when it did. Once the GDPR goes into effect, this type of data snare would make the company non-compliant, as the regulation requires direct consent from the data owner (as well as notification within 72 hours if there is an actual breach).

Five Pillars of DG: Enterprise Data Management Methodology

Considering GDPR, as well as the gargantuan PR fallout and governmental inquiries Facebook faced, companies can’t afford such data governance mistakes.

During the past few weeks, we’ve been exploring each of the five pillars of data governance readiness in detail and how they come together to provide a full view of an organization’s data assets. In this blog, we’ll look at enterprise data management methodology as the fourth key pillar.

Enterprise Data Management in Four Steps

Enterprise data management methodology addresses the need for data governance within the wider data management suite, with all components and solutions working together for maximum benefits.

A successful data governance initiative should both improve a business’ understanding of data lineage/history and install a working system of permissions to prevent access by the wrong people. On the flip side, successful data governance makes data more discoverable, with better context so the right people can make better use of it.

This is the nature of Data Governance 2.0 – helping organizations better understand their data assets and making them easier to manage and capitalize on – and it succeeds where Data Governance 1.0 stumbled.

Enterprise Data Management: So where do you start?

  1. Metadata management provides the organization with the contextual information concerning its data assets. Without it, data governance essentially runs blind.

The value of metadata management is the ability to govern common and reference data used across the organization with cross-departmental standards and definitions, allowing data sharing and reuse, reducing data redundancy and storage, avoiding data errors due to incorrect choices or duplications, and supporting data quality and analytics capabilities.

  2. Your organization also needs to understand enterprise data architecture and enterprise data modeling. Without them, enterprise data governance will be hard to support.

Enterprise data architecture supports data governance through concepts such as data movement, data transformation and data integration – since data governance develops policies and standards for these activities.

Data modeling, a vital component of data architecture, is also critical to data governance. By providing insights into the use cases satisfied by the data, organizations can do a better job of proactively analyzing the required shelf-life and better measure the risk/reward of keeping that data around.

Data stewards serve as subject-matter experts (SMEs) in the development and refinement of data models and assist in the creation of the data standards those models represent. These artifacts allow your organization to achieve its business goals through enterprise data architecture.

  3. Let’s face it, most organizations implement data governance because they want high-quality data. Enterprise data governance is foundational to the success of data quality management.

Data governance supports data quality efforts through the development of standard policies, practices, data standards, common definitions, etc. Data stewards implement these data standards and policies, supporting the data quality professionals.

These standards, policies, and practices lead to effective and sustainable data governance.

  4. Finally, without business intelligence (BI) and analytics, data governance will not add any value. The value of data governance to BI and analytics is the ability to govern data from its sources to its destinations in warehouses and marts, define standards for data across those stages, and promote common algorithms and calculations where appropriate. These benefits allow the organization to achieve its business goals with BI and analytics.
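The link between governance standards and data quality described in these steps can be pictured as a set of executable validation rules that stewards apply to records. This is a minimal, hypothetical sketch – the standards, field names and reference values are illustrative only:

```python
# Governance-defined standards, expressed as validation rules.
# The "country" rule checks against governed reference data.
standards = {
    "customer_id": lambda v: isinstance(v, int) and v > 0,
    "country": lambda v: v in {"US", "GB", "DE"},
    "email": lambda v: isinstance(v, str) and "@" in v,
}

def violations(record):
    """Return the fields of a record that break a governance standard."""
    return [
        field for field, rule in standards.items()
        if field not in record or not rule(record[field])
    ]

good = {"customer_id": 42, "country": "DE", "email": "a@example.com"}
bad = {"customer_id": -1, "country": "Germany", "email": "a@example.com"}

print(violations(good))  # -> []
print(violations(bad))   # -> ['customer_id', 'country']
```

The point of the sketch is the direction of flow: governance defines the standards once, centrally, and data quality tooling then enforces them wherever the data lives.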

Gaining an EDGE on the Competition

Old-school data governance is one-sided, mainly concerned with cataloging data to support search and discovery. The lack of short-term value here often caused executive support to dwindle, so the task of DG was siloed within IT.

These issues are circumvented by using the collaborative Data Governance 2.0 approach, spreading the responsibility of DG among those who use the data. This means that data assets are recorded with more context and are of greater use to an organization.

It also means executive-level employees are more aware of how data governance works because they’re involved in it, and they see the extra revenue potential in optimizing data analysis streams and the resulting improvements in time to market.

We refer to this enterprise-wide, collaborative, 2.0 take on data governance as the enterprise data governance experience (EDGE). But organizational collaboration aside, the real EDGE is arguably the collaboration it facilitates between solutions. The EDGE platform recognizes the fundamental reliance data governance has on the enterprise data management methodology suite and unifies them.

By existing on one platform, and sharing one repository, organizations can guarantee their data is uniform across the organization, regardless of department.

Additionally, it drastically improves workflows by allowing for real-time updates across the platform. For example, a change to a term in the data dictionary (data governance) will be automatically reflected in all connected data models (data modeling).
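One way to picture that real-time propagation is a simple publish/subscribe pattern, in which data models subscribe to glossary changes. This is an illustrative sketch of the idea only – the class names are hypothetical and this is not erwin’s actual implementation:

```python
class Glossary:
    """A business glossary that notifies subscribers when a term changes."""
    def __init__(self):
        self.terms = {}
        self.subscribers = []

    def subscribe(self, model):
        self.subscribers.append(model)

    def update_term(self, name, definition):
        self.terms[name] = definition
        for model in self.subscribers:  # push the change to every model
            model.on_term_changed(name, definition)

class DataModel:
    """A data model that mirrors glossary definitions for its entities."""
    def __init__(self, entities):
        self.definitions = {e: None for e in entities}

    def on_term_changed(self, name, definition):
        if name in self.definitions:  # ignore terms this model doesn't use
            self.definitions[name] = definition

glossary = Glossary()
sales_model = DataModel(entities=["Customer", "Order"])
glossary.subscribe(sales_model)

# One edit in the glossary reaches every connected model immediately.
glossary.update_term("Customer", "A party with at least one completed order")
print(sales_model.definitions["Customer"])
```

The design choice that matters here is the single source of truth: because every model subscribes to one repository, no model can drift out of sync with the governed definition.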

Further, the EDGE integrates enterprise architecture to define application capabilities and interdependencies within the context of their connection to enterprise strategy, enabling technology investments to be prioritized in line with business goals.

Business process also is included so enterprises can clearly define, map and analyze workflows and build models to drive process improvement, as well as identify business practices susceptible to the greatest security, compliance or other risks and where controls are most needed to mitigate exposures.

Essentially, it’s the approach data governance needs to become a value-adding strategic initiative instead of an isolated effort that peters out.

To learn more about enterprise data management and getting an EDGE on GDPR and the competition, click here.

To assess your data governance readiness ahead of the GDPR, click here.

Take the DG RediChek

Five Pillars of Data Governance Readiness: Team Resources

The Facebook scandal has highlighted the need for organizations to understand and apply the five pillars of data governance readiness.

All eyes were on Mark Zuckerberg this week as he testified before the U.S. Senate and House of Representatives on Facebook’s recent data drama.

A statement from Facebook indicates that the data snare was created due to permission settings leveraged by the Facebook-linked third-party app ‘thisisyourdigitallife.’

Although the method used by Cambridge Analytica to amass personal data from 87 million Facebook users didn’t constitute a “data breach,” it’s still a major data governance (DG) issue that is now creating more than a headache for the company.

The #DeleteFacebook movement is gaining momentum, not to mention the company’s stock dip.

With Facebook’s DG woes a mainstay in global news cycles, and the General Data Protection Regulation’s (GDPR) implementation just around the corner, organizations need to get DG-ready.

During the past few weeks, the erwin Expert Blog has been exploring the five pillars of data governance readiness. So far, we’ve covered initiative sponsorship and organizational support. Today, we talk team resources.

Facebook and the Data Governance Awakening

Most organizations lack the enterprise-level experience required to advance a data governance initiative.

Though this function may go by another name (e.g., data management, information management or enterprise data management), a successful organization recognizes the need for managing data as an enterprise asset.

Data governance, as a foundational component of enterprise data management, would reside within such a group.

You would think an organization like Facebook would have this covered. However, it doesn’t appear that they did.

Facebook is in hot water because the platform allowed ‘thisisyourdigitallife’ to capture personal data from the Facebook friends of those who used the app, increasing the scope of the data snare by an order of magnitude.

For context, it took only 53 Australian ‘thisisyourdigitallife’ users to capture 310,000 Australian citizens’ data.

Facebook’s permission settings essentially enabled ‘thisisyourdigitallife’ users to consent on behalf of their friends. Had GDPR been in effect, Facebook would have been non-compliant.

Even so, the extent of the PR fallout demonstrates that regulatory compliance shouldn’t be the only driver for implementing data governance.

Understanding who has access to data and what that data can be used for is a key use case for data governance. This considered, it’s not difficult to imagine how a more robust DG program could have covered Facebook’s back.

Data governance is concerned with units of data – what are they used for, what are the associated risks, and what value do they have to the business? In addition, DG asks who is responsible for the data – who has access? And what is the data lineage?

It acts as the filter that makes data more discoverable to those who need it, while shutting out those without the required permissions.
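That “filter” can be sketched as a role-based check over a data catalog: each asset records which roles may discover it, and everyone else is shut out. The catalog entries and role names below are hypothetical:

```python
# Hypothetical catalog: each asset lists the roles permitted to see it.
catalog = [
    {"name": "customer_pii", "allowed_roles": {"data_steward", "compliance"}},
    {"name": "sales_by_region", "allowed_roles": {"analyst", "data_steward"}},
    {"name": "public_metrics", "allowed_roles": {"analyst", "marketing", "compliance"}},
]

def discoverable(role):
    """Assets a given role is permitted to find; everything else is hidden."""
    return [a["name"] for a in catalog if role in a["allowed_roles"]]

print(discoverable("analyst"))    # -> ['sales_by_region', 'public_metrics']
print(discoverable("marketing"))  # -> ['public_metrics']
```

Had third-party apps been treated as a role in such a model, a check like this would have made the scope of ‘thisisyourdigitallife’ permissions explicit rather than implicit.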

The Five Pillars of Data Governance: #3 Team Resources

Data governance can’t be executed as a short-term fix. It must be an ongoing, strategic initiative that the entire organization supports and takes part in. Ideally, a fixed and formal data management group should oversee it.

As such, we consider team resources one of the key pillars of data governance readiness.

Data governance requires leadership with experience to ensure the initiative is a value-adding success, not the stifled, siloed programs associated with data governance of old (Data Governance 1.0).

Without experienced leadership, different arms of the organization will likely pull in different directions, undermining the uniformity of data that DG aims to introduce. If such experience doesn’t exist within the organization, then outside consultants should be tapped for their expertise.

As the main technical enabler of the practice, IT should be a key DG participant and may even house the aforementioned data management group. The key word here is “participant,” as the inclination to leave data governance to IT and IT alone has been a common reason for Data Governance 1.0’s struggles.

With good leadership, organizations can implement Data Governance 2.0: the collaborative, outcome-driven approach more suited to the data-driven business landscape. DG 2.0 avoids the pitfalls of its predecessor by expanding the practice beyond IT and traditional data stewards to make it an enterprise-wide responsibility.

By approaching data governance in this manner, organizations ensure those with a stake in data quality (e.g., anyone who uses data) are involved in its discovery, understanding, governance and socialization.

This leads to data with greater context, accuracy and trust. It also hastens decision-making and times to market, resulting in fewer bottlenecks in data analysis.

We refer to this collaborative approach to data governance as the enterprise data governance experience (EDGE).

Back to Facebook: with a more robust data governance program, the company could have discovered the data snare exploited by Cambridge Analytica and circumvented the entire scandal and all its consequences.

But for data governance to be successful, organizations must consider team resources as well as enterprise data management methodology and delivery capability (we’ll cover the latter two in the coming weeks).

To determine your organization’s current state of data governance readiness, take the erwin DG RediChek.

To learn more about how to leverage data governance for GDPR compliance and an EDGE on the competition, click here.

Take the DG RediChek