Categories
erwin Expert Blog

Automated Data Management: Stop Drowning in Your Data 

Given the wealth of data that data-driven organizations must handle, many are increasingly adopting automated data management.

Some 2.5 quintillion bytes of data are created every day, and that figure is increasing in tandem with the production of and demand for Internet of Things (IoT) devices. However, Forrester reports that between 60 and 73 percent of all data within an enterprise goes unused.

Collecting all that data is pointless if it’s not going to be used to deliver accurate and actionable insights.

But the reality is there isn't enough time, staff or budget for effective data management using manual processes. Organizations won't be able to take advantage of analytics tools to become data-driven unless they establish a foundation for agile and complete data management. And organizations that don't employ automated data management risk being left behind.

In addition to taking the burden off already stretched internal teams, automated data management’s most obvious benefit is that it’s a key enabler of data-driven business. Without it, a truly data-driven approach to business is either ineffective, or impossible, depending on the scale of data you’re working with.

This is because there's either too much data left unaccounted for and too much potential revenue left on the table for the strategy to be considered effective. Or there's so much disparity among data sources and silos where data is stored that data quality suffers to an insurmountable degree, rendering any analysis fundamentally flawed.

But simply enabling the strategy isn’t the most compelling use case, or organizations across the board would have implemented it already.

The Case for Automated Data Management

Business leaders and decision-makers want a business case for automated data management.

So here it is …

Without automation, business transformation will be stymied. Companies, especially large ones with thousands of systems, files and processes, will be particularly challenged by taking a manual approach. And outsourcing these data management efforts to professional services firms only delays schedules and increases cost.

By automating data cataloging and data mapping inclusive of data at rest and data in motion through the integration lifecycle process, organizations will benefit from:

  • A metadata-driven automated framework for cataloging data assets and their flows across the business
  • An efficient, agile and dynamic way to generate data lineage from operational systems (databases, data models, file-based systems, unstructured files and more) across the information management architecture
  • Easy access to what data aligns with specific business rules and policies
  • The ability to inform how data is transformed, integrated and federated throughout business processes – complete with full documentation
  • Faster project delivery and lower costs because data is managed internally, without the need to outsource data management efforts
  • Assurance of data quality, so analysis is reliable and new initiatives aren’t beleaguered with false starts
  • A seamlessly governed data pipeline, operationalized to the benefit of all stakeholders
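
The metadata-driven framework described in the first bullet can be pictured as a small catalog that registers data assets and the flows between them, from which lineage falls out as a graph walk. The sketch below is purely illustrative — the class and method names are invented for this example and do not describe erwin's actual API:

```python
from collections import defaultdict

class MetadataCatalog:
    """Toy metadata catalog: registers data assets and the flows between them.

    Assumes the flow graph is acyclic (a reasonable property of ETL pipelines).
    """

    def __init__(self):
        self.assets = {}                # asset name -> metadata dict
        self.flows = defaultdict(list)  # target asset -> [(source, transformation)]

    def register(self, name, **metadata):
        self.assets[name] = metadata

    def map_flow(self, source, target, transformation=""):
        self.flows[target].append((source, transformation))

    def lineage(self, asset):
        """Walk upstream flows to produce the full lineage of an asset."""
        out = []
        for source, transform in self.flows.get(asset, []):
            out.append((source, transform))
            out.extend(self.lineage(source))
        return out

catalog = MetadataCatalog()
catalog.register("crm.customers", owner="sales", kind="database table")
catalog.register("staging.customers", owner="it", kind="data lake file")
catalog.register("reports.churn", owner="analytics", kind="dashboard")
catalog.map_flow("crm.customers", "staging.customers", "nightly ETL extract")
catalog.map_flow("staging.customers", "reports.churn", "aggregate by month")

print(catalog.lineage("reports.churn"))
```

Because every flow is recorded as metadata at mapping time rather than reverse-engineered later, lineage for any asset is generated on demand — the "efficient, agile and dynamic" lineage the list above refers to.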



Healthy Co-Dependency: Data Management and Data Governance

Data management and data governance are now more important than ever. The hyper-competitive nature of data-driven business means organizations need to get more out of their data – and fast.

A few data-driven exemplars have led the way, turning data into actionable insights that influence everything from corporate structure to new products and pricing. “Few” being the operative word.

It’s true, data-driven business is big business. Huge actually. But it’s dominated by a handful of organizations that realized early on what a powerful and disruptive force data can be.

The benefits of such data-driven strategies speak for themselves: Netflix has replaced Blockbuster, and Uber continues to shake up the taxi business. Organizations across industries are following suit, fighting to become the next big, disruptive players.

But in many cases, these attempts have failed or are on the verge of doing so.

Now with the General Data Protection Regulation (GDPR) in effect, unaccounted-for data is a disaster waiting to happen.

So organizations need to understand that getting more out of their data isn’t necessarily about collecting more data. It’s about unlocking the value of the data they already have.


The Enterprise Data Dilemma

However, most organizations don’t know exactly what data they have or even where some of it is. And some of the data they can account for is going to waste because they don’t have the means to process it. This is especially true of unstructured data types, which organizations are collecting more frequently.

Considering that 73 percent of company data goes unused, it's safe to assume your organization is dealing with some, if not all, of these issues.

Big picture, this means your enterprise is missing out on thousands, perhaps millions, of dollars in revenue.

The smaller picture? You’re struggling to establish a single source of data truth, which contributes to a host of problems:

  • Inaccurate analysis and discrepancies in departmental reporting
  • Inability to manage the amount and variety of data your organization collects
  • Duplications and redundancies in processes
  • Issues determining data ownership, lineage and access
  • Difficulty achieving and sustaining compliance

To avoid such circumstances and get more value out of data, organizations need to harmonize their approach to data management and data governance, using a platform of established tools that work in tandem while also enabling collaboration across the enterprise.

Data management drives the design, deployment and operation of systems that deliver operational data assets for analytics purposes.

Data governance delivers these data assets within a business context, tracking their physical existence and lineage, and maximizing their security, quality and value.

Although these two disciplines approach data from different perspectives (IT-driven and business-oriented), they depend on each other. And this co-dependency helps an organization make the most of its data.

The P-M-G Hub

Together, data management and data governance form a critical hub for data preparation, modeling and governance. How?

It starts with a real-time, accurate picture of the data landscape, including “data at rest” in databases, data warehouses and data lakes and “data in motion” as it is integrated with and used by key applications. That landscape also must be controlled to facilitate collaboration and limit risk.

But knowing what data you have and where it lives is complicated, so you need to create and sustain an enterprise-wide view of and easy access to underlying metadata. That’s a tall order with numerous data types and data sources that were never designed to work together and data infrastructures that have been cobbled together over time with disparate technologies, poor documentation and little thought for downstream integration. So the applications and initiatives that depend on a solid data infrastructure may be compromised, and data analysis based on faulty insights.

However, these issues can be addressed with a strong data management strategy and technology to enable the data quality required by the business, which encompasses data cataloging (integration of data sets from various sources), mapping, versioning, business rules and glossaries maintenance and metadata management (associations and lineage).

Being able to pinpoint what data exists and where must be accompanied by an agreed-upon business understanding of what it all means in common terms that are adopted across the enterprise. Having that consistency is the only way to assure that insights generated by analyses are useful and actionable, regardless of business department or user exploring a question. Additionally, policies, processes and tools that define and control access to data by roles and across workflows are critical for security purposes.

These issues can be addressed with a comprehensive data governance strategy and technology to determine master data sets, discover the impact of potential glossary changes across the enterprise, audit and score adherence to rules, discover risks, and appropriately and cost-effectively apply security to data flows, as well as publish data to people/roles in ways that are meaningful to them.
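
The role-based access controls described above can be illustrated with a tiny policy check. This is a sketch only — the policy table and role names are invented for illustration and are not any product's actual configuration:

```python
# Hypothetical policy table: which roles may read data of each classification.
POLICIES = {
    "public": {"business_analyst", "data_scientist", "compliance_officer"},
    "internal": {"data_scientist", "compliance_officer"},
    "restricted": {"compliance_officer"},
}

def can_access(role: str, classification: str) -> bool:
    """Return True if the given role is permitted to read this classification.

    Unknown classifications default to denied - fail closed, not open.
    """
    return role in POLICIES.get(classification, set())

assert can_access("compliance_officer", "restricted")
assert not can_access("business_analyst", "internal")
```

The design point is that the policy lives in one governed table rather than being scattered through application code, so auditing "who can see what" is a lookup, not an investigation.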

Data Management and Data Governance: Play Together, Stay Together

When data management and data governance work in concert, empowered by the right technology, they inform, guide and optimize each other. The result for an organization that takes such a harmonized approach is an automated, real-time, high-quality data pipeline.

Then all stakeholders – data scientists, data stewards, ETL developers, enterprise architects, business analysts, compliance officers, CDOs and CEOs – can access the data they're authorized to use and base strategic decisions on what is now a full inventory of reliable information.

The erwin EDGE creates an “enterprise data governance experience” through integrated data mapping, business process modeling, enterprise architecture modeling, data modeling and data governance. No other software platform on the market touches every aspect of the data management and data governance lifecycle to automate and accelerate the speed to actionable business insights.


Solving the Enterprise Data Dilemma

Due to the adoption of data-driven business, organizations across the board are facing their own enterprise data dilemmas.

This week erwin announced its acquisition of metadata management and data governance provider AnalytiX DS. The combined company touches every piece of the data management and governance lifecycle, enabling enterprises to fuel automated, high-quality data pipelines for faster speed to accurate, actionable insights.

Why Is This a Big Deal?

From digital transformation to AI, and everything in between, organizations are flooded with data. So companies are investing heavily in initiatives to use all the data at their disposal, but they face a central challenge: deriving meaningful insights from their data – and turning them into actions that improve the bottom line.

This enterprise data dilemma stems from three important but difficult questions to answer: What data do we have? Where is it? And how do we get value from it?

Large enterprises use thousands of unharvested, undocumented databases, applications, ETL processes and procedural code that make it difficult to gather business intelligence, conduct IT audits, and ensure regulatory compliance – not to mention accomplish other objectives around customer satisfaction, revenue growth and overall efficiency and decision-making.

The lack of visibility and control around "data at rest" and "data in motion," combined with the difficulties of legacy architectures, means these organizations spend more time finding the data they need than using it to produce meaningful business outcomes.

To remedy this, enterprises need smarter and faster data management and data governance capabilities, including the ability to efficiently catalog and document their systems, processes and the associated data without errors. In addition, business and IT must collaborate outside their traditional operational silos.

But this coveted state of data nirvana isn’t possible without the right approach and technology platform.

Enterprise Data: Making the Data Management-Data Governance Love Connection


Bringing together data management and data governance delivers greater efficiencies to technical users and better analytics to business users. It’s like two sides of the same coin:

  • Data management drives the design, deployment and operation of systems that deliver operational and analytical data assets.
  • Data governance delivers these data assets within a business context, tracks their physical existence and lineage, and maximizes their security, quality and value.

Although these disciplines approach data from different perspectives and are used to produce different outcomes, they have a lot in common. Both require a real-time, accurate picture of an organization’s data landscape, including data at rest in data warehouses and data lakes and data in motion as it is integrated with and used by key applications.

However, creating and maintaining this metadata landscape is challenging because this data, in its various forms and from numerous sources, was never designed to work in concert. Data infrastructures have been cobbled together over time with disparate technologies, poor documentation and little thought for downstream integration, so the applications and initiatives that depend on that infrastructure often work from out-of-date, inaccurate data, producing faulty insights and analyses.

Organizations need to know what data they have and where it's located, where it came from and how it got there, and what it means in common business terms – and they must be able to transform it into useful information they can act on, all while controlling its access.

To support the total enterprise data management and governance lifecycle, they need an automated, real-time, high-quality data pipeline. Then every stakeholder – data scientist, ETL developer, enterprise architect, business analyst, compliance officer, CDO and CEO – can fuel the desired outcomes with reliable information on which to base strategic decisions.

Enterprise Data: Creating Your “EDGE”

At the end of the day, all industries are in the data business and all employees are data people. The success of an organization is not measured by how much data it has, but by how well it’s used.

Data governance enables organizations to use their data to fuel compliance, innovation and transformation initiatives with greater agility, efficiency and cost-effectiveness.

Organizations need to understand their data from different perspectives, identify how it flows through and impacts the business, align this business view with a technical view of the data management infrastructure, and synchronize efforts across both disciplines for accuracy, agility and efficiency in building a data capability that impacts the business in a meaningful and sustainable fashion.

The persona-based erwin EDGE creates an “enterprise data governance experience” that facilitates collaboration between both IT and the business to discover, understand and unlock the value of data both at rest and in motion.

By bringing together enterprise architecture, business process, data mapping and data modeling, erwin’s approach to data governance enables organizations to get a handle on how they handle their data. With the broadest set of metadata connectors and automated code generation, data mapping and cataloging tools, the erwin EDGE Platform simplifies the total data management and data governance lifecycle.

This single, integrated solution makes it possible to gather business intelligence, conduct IT audits, ensure regulatory compliance and accomplish any other organizational objective by fueling an automated, high-quality and real-time data pipeline.

With the erwin EDGE, data management and data governance are unified and mutually supportive, with one hand aware and informed by the efforts of the other to:

  • Discover data: Identify and integrate metadata from various data management silos.
  • Harvest data: Automate the collection of metadata from various data management silos and consolidate it into a single source.
  • Structure data: Connect physical metadata to specific business terms and definitions and reusable design standards.
  • Analyze data: Understand how data relates to the business and what attributes it has.
  • Map data flows: Identify where to integrate data and track how it moves and transforms.
  • Govern data: Develop a governance model to manage standards and policies and set best practices.
  • Socialize data: Enable stakeholders to see data in one place and in the context of their roles.
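
The harvest and structure steps above can be sketched as a simple pipeline that collects metadata from separate silos, consolidates it into a single source and attaches agreed business terms. All names below are illustrative, not erwin's actual interfaces:

```python
def harvest(silos):
    """Harvest: collect metadata records from each silo into one consolidated list,
    tagging each record with the silo it came from."""
    consolidated = []
    for silo_name, records in silos.items():
        for record in records:
            consolidated.append({**record, "source_silo": silo_name})
    return consolidated

def structure(consolidated, glossary):
    """Structure: connect physical metadata to business terms where the glossary
    defines them; flag anything undefined for the data stewards to resolve."""
    for record in consolidated:
        record["business_term"] = glossary.get(record["column"], "UNDEFINED")
    return consolidated

silos = {
    "crm": [{"table": "customers", "column": "cust_id"}],
    "billing": [{"table": "invoices", "column": "cust_id"}],
}
glossary = {"cust_id": "Customer Identifier"}

catalog = structure(harvest(silos), glossary)
print(len(catalog))  # 2 records, each tagged with its silo and business term
```

Note how the same physical column (`cust_id`) in two silos resolves to one shared business term — the agreed-upon common vocabulary the list's "Structure data" step describes.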

An integrated solution with data preparation, modeling and governance helps businesses reach data governance maturity – that is, a role-based, collaborative data governance system that serves both IT and business users equally. Such maturity may not happen overnight, but it will ultimately deliver the accurate and actionable insights your organization needs to compete and win.

Your journey to data nirvana begins with a demo of the enhanced erwin Data Governance solution. Register now.



Why Data Governance and Business Process Management Must Be Linked

Data governance and business process management must be linked.

Following the boom in data-driven business, data governance (DG) has taken the modern enterprise by storm, garnering the attention of both the business and technical realms with an explosion of methodologies, targeted systems and training courses. That's because a major gap needs to be addressed.

But despite all the admonitions and cautionary tales, little attention has focused on what can literally make or break any data governance initiative, turning it from a springboard for competitive advantage to a recipe for waste, anger and ultimately failure. The two key pivot points on which success hinges are business process management (BPM) and enterprise architecture. This article focuses on the critical connections between data governance and business process management.

Based on a True Story: Data Governance Without Process Is Not Data Governance

The following is based on a true story about a global pharmaceutical company implementing a cloud-based, enterprise-wide CRM system with a third-party provider.

Given the system’s nature, the data it would process, and the scope of the deployment, data security and governance was front and center. There were countless meetings – some with more than 50 participants – with protocols sent, reviewed, adjusted and so on. In fact, more than half a dozen outside security companies and advisors (and yes, data governance experts) came in to help design the perfect data protection system around which the CRM system would be implemented.

The framework was truly mind-boggling: hundreds of security measures, dozens of different file management protocols, data security software at every step of the way. To an external observer, it appeared to be an ironclad net of absolute safety and effective governance.

But as the CRM implementation progressed, holes began to appear. They were small at first but quickly grew to the size of trucks, effectively rendering months of preparatory work pointless.

Detailed data transfer protocols were subverted daily by consultants and company employees who thought speed was more important than safety. Software locks and systems were overridden with passwords freely communicated through emails and even written on Post-It Notes. And a two-factor authentication principle was reduced to one person entering half a password, with a piece of paper taped over half the computer screen, while another person entered the other half of the password before a third person read the entire password and pressed enter.

While these examples of security holes might seem funny – in a sad way – when you read them here, they represent a $500,000 failure that potentially could lead to a multi-billion-dollar security breach.

Why? Because there were no simple, effective and clearly defined processes to govern the immense investment in security protocols and software to ensure employees would follow them and management could audit and control them. Furthermore, the organization failed to realize how complex this implementation was and that process changes would be paramount.

Both such failures could have been avoided if the organization had a simple system of managing, adjusting and monitoring its processes. More to the point, the implementation of the entire security and governance framework would have cost less and been completed in half the time. Furthermore, if a failure or breach were discovered, it would be easy to trace and correct.


Data Governance Starts with BPM

In a rush to implement a data governance methodology and system, you can forget that a system must serve a process – and be governed/controlled by one.

To choose the correct system and implement it effectively and efficiently, you must know – in every detail – all the processes it will impact, how it will impact them, who needs to be involved and when. Do these questions sound familiar? They should because they are the same ones we ask in data governance. They involve impact analysis, ownership and accountability, control and traceability – all of which are enabled by effectively documented and managed business processes.

Data sets are not important in and of themselves. Data sets become important in terms of how they are used, who uses them and for what purpose – and all this information is described in the processes that generate, manipulate and use them. So, unless we know what those processes are, how can any data governance implementation be complete or successful?

Consider this scenario: We’ve perfectly captured our data lineage, so we know what our data sets mean, how they’re connected, and who’s responsible for them – not a simple task but a massive win for any organization.  Now a breach occurs. Will any of the above information tell us why it happened? Or where? No! It will tell us what else is affected and who can manage the data layer(s), but unless we find and address the process failure that led to the breach, it is guaranteed to happen again.

By knowing where data is used – the processes that use and manage it – we can quickly, even instantly, identify where a failure occurs. Starting with data lineage (meaning our forensic analysis starts from our data governance system), we can identify the source and destination processes and the associated impacts throughout the organization. We can know which processes need to change and how. We can anticipate the pending disruptions to our operations and, more to the point, the costs involved in mitigating and/or addressing them.
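
The forensic traversal described above — starting from a failing data set and walking out to every affected process — amounts to a breadth-first search over the lineage graph. A hedged sketch, where the graph contents are invented for illustration:

```python
from collections import deque

# Hypothetical lineage graph: each data set or process maps to its downstream consumers.
DOWNSTREAM = {
    "orders_db": ["etl_orders"],
    "etl_orders": ["sales_mart"],
    "sales_mart": ["forecast_process", "billing_process"],
}

def impacted(node):
    """Breadth-first walk to find everything downstream of a failing node."""
    seen, queue = set(), deque([node])
    while queue:
        current = queue.popleft()
        for child in DOWNSTREAM.get(current, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return seen

print(sorted(impacted("orders_db")))
# The ETL job, the mart and both downstream business processes are affected.
```

With the process layer captured alongside the data layer, the same walk answers both governance questions at once: which data sets are tainted, and which business processes must change.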

But knowing all the above requires that our processes – our essential and operational business architecture – be accurately captured and modeled. Instituting data governance without processes is like building a castle on sand.

Rethinking Business Process Management

Modern organizations need a simple and easy-to-use BPM system with easy access to all the operational layers across the organization – from high-level business architecture all the way down to data. Sure, most organizations already have various solutions here and there, some with claims of being able to provide a comprehensive picture. But chances are they don’t, so you probably need to rethink your approach.

Modern BPM ecosystems are flexible, adjustable, easy-to-use and can support multiple layers simultaneously, allowing users to start in their comfort zones and mature as they work toward the organization’s goals.

Processes need to be open and shared in a concise, consistent way so all parts of the organization can investigate, ask questions, and then add their feedback and information layers. In other words, processes need to be alive and central to the organization because only then will the use of data and data governance be truly effective.

Are you willing to think outside the traditional boxes or silos that your organization’s processes and data live in?

The erwin EDGE is one of the most comprehensive software platforms for managing an organization's data governance and business process initiatives, as well as its overall data architecture. It allows natural, organic growth throughout the organization, and bringing data governance and business process management together under the same platform provides a unique data governance experience thanks to its integrated, collaborative approach.

To learn more about erwin EDGE, and how data governance underpins and ensures data quality throughout the wider data management suite, download our resource: Data Governance Is Everyone's Business.



The Connection Between Business Process Modeling and Standard Operating Procedures

We began a new blog series last week on business process (BP) modeling and its role within the enterprise. This week’s focus is on the connection between business process modeling and standard operating procedures. Specifically, using BP tools to help organizations streamline how they manage their standard operating procedures (SOPs).

Standard Operating Procedures: A New Approach to Organizing SOP Information

Manually maintaining the standard operating procedures that inform business processes can be a monster of a task. In most industries, SOPs typically are documented in multiple Word or Excel files.

In a process-centric world, heavy lifting is involved when an organization requires a change to an end-to-end process: Each SOP affected by the change may be associated with dozens or even hundreds of steps that exist between the start and conclusion of the process – and the alteration must be made to all of them wherever they occur.

You can imagine the significant man-hours that go into wading through a sea of documents to discover and amend relevant SOPs and communicate these business process-related changes across the organization. And you can guess at the toll on productivity and efficiency that the business experiences as a result.

Companies that are eager to embrace business process optimization are keen to have a better approach to organizing SOP information to improve transparency and insight for speedier and more effective change management.

There’s another benefit to be realized from taking a new approach to SOP knowledge management, as well. With better organization comes an increased ability to convey information about current and changed standard operating procedures; companies can offer on-the-fly access to standard practices to teams across the enterprise.

That consistent and easily obtained business process information can help employees innovate, sharing ideas about additional improvements and innovations that could be made to standard operating procedures. It could also save them the time they might otherwise spend on “reinventing the wheel” for SOPs that already exist but that they don’t know about.

Balfour Beatty Construction, the fourth largest general builder in the U.S., saw big results when it standardized and transformed its process documentation, giving workers access to corporate SOPs from any location on almost any device.

For a construction company, keeping field workers out of danger is a major concern, and providing these employees with immediate information about how to accomplish a multi-step business process – such as clearing a site – promotes their safety. Among the benefits Balfour Beatty saw were a 5% gain in productivity and reduced training time for new employees, who were now able to tap directly into SOP data.


Using Business Process Modeling to Transform SOP Management

How does a company transform manual SOP documentation to more effectively support change management as part of business process optimization? It’s key to adopt business process (BP) modeling and management software to create and store SOP documentation in a single repository, tying them to the processes they interact with for faster discovery and easier maintenance.

Organizations that move to this methodology, for example, will have the advantage of only needing to change an affected SOP in that one repository; the change will automatically propagate to all related processes and procedures.

In effect, the right BP tool automatically generates new SOPs with the necessary updated information.
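
Storing each SOP once and having processes reference it by ID — rather than copying its text into every process document — is what makes a single edit propagate everywhere. A minimal sketch of that design, with all class and field names invented for illustration:

```python
class SOPRepository:
    """Single source of truth: processes reference SOPs by ID instead of copying text."""

    def __init__(self):
        self.sops = {}   # sop_id -> current procedure text
        self.links = {}  # process name -> list of sop_ids it uses

    def add_sop(self, sop_id, text):
        self.sops[sop_id] = text

    def link(self, process, sop_ids):
        self.links[process] = sop_ids

    def update_sop(self, sop_id, new_text):
        # One change here; every linked process sees it on the next render.
        self.sops[sop_id] = new_text

    def render(self, process):
        """Assemble a process's current documentation from the latest SOP texts."""
        return [self.sops[sop_id] for sop_id in self.links[process]]

repo = SOPRepository()
repo.add_sop("SOP-7", "Clear the site before excavation.")
repo.link("site_preparation", ["SOP-7"])
repo.link("demolition", ["SOP-7"])
repo.update_sop("SOP-7", "Clear and inspect the site before excavation.")
print(repo.render("demolition"))  # reflects the update with no per-process edits
```

The contrast with the Word/Excel approach described earlier is the direction of the lookup: instead of hunting for every document that embeds a copy of the SOP, every process resolves the SOP at read time, so there is nothing to hunt for.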

Such a tool is also suitable for use in conjunction with controlled document repositories that are typically required in heavily regulated industries, such as pharmaceuticals, financial services and healthcare, as part of satisfying compliance mandates. All SOP documentation is already stored in the same repository, rather than scattered across files.

But a business process diagramming and modeling solution comes in handy in these cases by providing a web-based front-end that exposes high-level processes and how they map to related SOPs. This helps users better navigate them to institute and maintain changes and to access job-related procedure information.

To find out about how erwin can streamline SOP document management to positively impact costs, workloads and user benefits, please click here.

In our next blog, we’ll look at how business process modeling strengthens digital transformation initiatives.



Pillars of Data Governance Readiness: Enterprise Data Management Methodology

Facebook’s data woes continue to dominate the headlines and further highlight the importance of having an enterprise-wide view of data assets. The high-profile case is somewhat different than other prominent data scandals as it wasn’t a “breach,” per se. But questions of negligence persist, and in all cases, data governance is an issue.

This week, the Wall Street Journal ran a story titled “Companies Should Beware Public’s Rising Anxiety Over Data.” It discusses an IBM poll of 10,000 consumers in which 78% of U.S. respondents say a company’s ability to keep their data private is extremely important, yet only 20% completely trust organizations they interact with to maintain data privacy. In fact, 60% indicate they’re more concerned about cybersecurity than a potential war.

The piece concludes with a clear lesson for CIOs: “they must make data governance and compliance with regulations such as the EU’s General Data Protection Regulation [GDPR] an even greater priority, keeping track of data and making sure that the corporation has the ability to monitor its use, and should the need arise, delete it.”

With a more thorough data governance initiative and a better understanding of data assets, their lineage and useful shelf-life, and the privileges behind their access, Facebook likely could have gotten ahead of the problem and quelled it before it became an issue.  Sometimes erasure is the best approach if the reward from keeping data onboard is outweighed by the risk.

But perhaps Facebook is lucky the issue arose when it did. Once the GDPR goes into effect, this type of data snare would make the company non-compliant, as the regulation requires direct consent from the data owner (as well as notification within 72 hours if there is an actual breach).

Five Pillars of DG: Enterprise Data Management Methodology

Considering GDPR, as well as the gargantuan PR fallout and governmental inquiries Facebook faced, companies can’t afford such data governance mistakes.

During the past few weeks, we’ve been exploring each of the five pillars of data governance readiness in detail and how they come together to provide a full view of an organization’s data assets. In this blog, we’ll look at enterprise data management methodology as the fourth key pillar.

Enterprise Data Management in Four Steps

Enterprise data management methodology addresses the need for data governance within the wider data management suite, with all components and solutions working together for maximum benefits.

A successful data governance initiative should both improve a business’ understanding of data lineage/history and install a working system of permissions to prevent access by the wrong people. On the flip side, successful data governance makes data more discoverable, with better context so the right people can make better use of it.

This is the nature of Data Governance 2.0 – helping organizations better understand their data assets and making them easier to manage and capitalize on – and it succeeds where Data Governance 1.0 stumbled.

Enterprise Data Management: So where do you start?

  1. Metadata management provides the organization with the contextual information concerning its data assets. Without it, data governance essentially runs blind.

The value of metadata management lies in the ability to govern common and reference data across the organization with cross-departmental standards and definitions. This allows data to be shared and reused, reduces redundancy and storage, avoids data errors due to incorrect choices or duplications, and supports data quality and analytics capabilities.
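To make the idea concrete, here is a hypothetical sketch of a metadata catalog entry: one shared definition per asset, with a named steward and source system. The field names are assumptions for illustration, not the schema of any particular metadata tool.

```python
from dataclasses import dataclass, field

@dataclass
class MetadataEntry:
    """One governed data asset: a shared, cross-departmental definition."""
    name: str
    definition: str
    steward: str        # accountable data steward
    source_system: str  # where the asset physically lives
    tags: list = field(default_factory=list)

class MetadataCatalog:
    def __init__(self):
        self._entries = {}

    def register(self, entry: MetadataEntry) -> bool:
        # Reject duplicate names so every term has exactly one definition,
        # which is what reduces redundancy and conflicting usage.
        if entry.name in self._entries:
            return False
        self._entries[entry.name] = entry
        return True

    def lookup(self, name: str) -> MetadataEntry:
        return self._entries[name]

catalog = MetadataCatalog()
catalog.register(MetadataEntry("customer_id", "Unique customer key",
                               steward="Jane Doe", source_system="CRM"))
print(catalog.lookup("customer_id").definition)  # Unique customer key
```

The duplicate check is the governance point: a term is defined once, and every department looks it up rather than re-inventing it.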

  2. Your organization also needs to understand enterprise data architecture and enterprise data modeling. Without them, enterprise data governance will be hard to support.

Enterprise data architecture supports data governance through concepts such as data movement, data transformation and data integration – since data governance develops policies and standards for these activities.

Data modeling, a vital component of data architecture, is also critical to data governance. By providing insights into the use cases satisfied by the data, organizations can do a better job of proactively analyzing the required shelf-life and better measure the risk/reward of keeping that data around.
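That risk/reward calculation on data shelf-life can be sketched as a simple expected-value comparison. The figures and threshold below are made up purely for illustration; a real assessment would draw on actual breach-cost and usage data.

```python
def retention_score(annual_value: float,
                    breach_probability: float,
                    breach_cost: float) -> float:
    """Expected net benefit of keeping a data set for another year.

    A negative score suggests erasure: the expected risk outweighs
    the reward of keeping the data onboard.
    """
    return annual_value - breach_probability * breach_cost

# Stale data with little analytic value but real exposure:
print(retention_score(annual_value=10_000,
                      breach_probability=0.05,
                      breach_cost=1_000_000))  # -40000.0
```

The point is not the arithmetic but the discipline: without data models and lineage, you can't even estimate the inputs.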

Data stewards serve as SMEs in the development and refinement of data models and assist in the creation of data standards that are represented by data models. These artifacts allow your organization to achieve its business goals using enterprise data architecture.

  3. Let’s face it, most organizations implement data governance because they want high-quality data. Enterprise data governance is foundational for the success of data quality management.

Data governance supports data quality efforts through the development of standard policies, practices, data standards, common definitions, etc. Data stewards implement these data standards and policies, supporting the data quality professionals.

These standards, policies, and practices lead to effective and sustainable data governance.

  4. Finally, without business intelligence (BI) and analytics, data governance will not add any value. The value of data governance to BI and analytics is the ability to govern data from its sources to destinations in warehouses and marts, define standards for data across those stages, and promote common algorithms and calculations where appropriate. These benefits allow the organization to achieve its business goals with BI and analytics.

Gaining an EDGE on the Competition

Old-school data governance is one-sided, mainly concerned with cataloging data to support search and discovery. The lack of short-term value here often caused executive support to dwindle, so the task of DG was siloed within IT.

These issues are circumvented by using the collaborative Data Governance 2.0 approach, spreading the responsibility of DG among those who use the data. This means that data assets are recorded with more context and are of greater use to an organization.

It also means executives are more aware of how data governance works because they’re involved in it, and they see the revenue potential in optimized data analysis and the resulting improvements in time to market.

We refer to this enterprise-wide, collaborative, 2.0 take on data governance as the enterprise data governance experience (EDGE). But organizational collaboration aside, the real EDGE is arguably the collaboration it facilitates between solutions. The EDGE platform recognizes the fundamental reliance data governance has on the enterprise data management methodology suite and unifies them.

With all solutions existing on one platform and sharing one repository, organizations can ensure their data is uniform across departments.

Additionally, it drastically improves workflows by allowing for real-time updates across the platform. For example, a change to a term in the data dictionary (data governance) will be automatically reflected in all connected data models (data modeling).
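One way to picture that shared-repository behavior is the sketch below: models reference glossary terms instead of copying them, so a governance update is visible everywhere immediately. This is an illustrative pattern, not the erwin platform's actual implementation.

```python
class Glossary:
    """Single shared repository of business terms."""
    def __init__(self):
        self._terms = {}

    def define(self, term: str, definition: str):
        self._terms[term] = definition

    def definition(self, term: str) -> str:
        return self._terms[term]

class DataModel:
    """A data model that references glossary terms instead of copying them."""
    def __init__(self, name: str, glossary: Glossary, terms: list):
        self.name = name
        self._glossary = glossary
        self._terms = terms

    def describe(self) -> dict:
        # Definitions are read live from the shared repository, so a
        # change in the glossary is reflected here with no sync step.
        return {t: self._glossary.definition(t) for t in self._terms}

glossary = Glossary()
glossary.define("churn", "Customer lost within 12 months")
model = DataModel("customer_mart", glossary, ["churn"])
glossary.define("churn", "Customer lost within 6 months")  # governance update
print(model.describe()["churn"])  # Customer lost within 6 months
```

Contrast this with copies of the definition scattered across tools, where every update spawns a reconciliation project.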

Further, the EDGE integrates enterprise architecture to define application capabilities and interdependencies within the context of their connection to enterprise strategy, enabling technology investments to be prioritized in line with business goals.

Business process modeling also is included, so enterprises can clearly define, map and analyze workflows and build models to drive process improvement, as well as identify business practices susceptible to the greatest security, compliance or other risks and where controls are most needed to mitigate exposures.

Essentially, it’s the approach data governance needs to become a value-adding strategic initiative instead of an isolated effort that peters out.

To learn more about enterprise data management and getting an EDGE on GDPR and the competition, click here.

To assess your data governance readiness ahead of the GDPR, click here.


Five Pillars of Data Governance Readiness: Initiative Sponsorship

“Facebook at the center of global reckoning on data governance.” This headline from a March 19 article in The Wall Street Journal sums up where we are. With only two months until the General Data Protection Regulation (GDPR) goes into effect, we’re going to see more headlines about improper data governance (DG) – leading to major fines and tarnished brands.

Since the news of the Facebook data scandal broke, the company’s stock has dropped and Nordea, the largest bank in the Nordic region, put a stop to Facebook investments for three months because “we see that the risks related to governance around data protection may have been severely compromised,” it said in a statement.

Last week, we began discussing the five pillars of data governance readiness to ensure the data management foundation is in place for mitigating risks, as well as accomplishing other organizational goals. There can be no doubt that data governance is central to an organization’s customer relationships, reputation and financial results.

So today, we’re going to explore the first pillar of DG readiness: initiative sponsorship. Without initiative sponsorship, organizations will struggle to obtain the funding, resources, support and alignment necessary for successful implementation and subsequent performance.

A Common Roadblock

Data governance isn’t a one-off project with a defined endpoint. It’s an ongoing initiative that requires active engagement from executives and business leaders. But unfortunately, the 2018 State of Data Governance Report finds lack of executive support to be the most common roadblock to implementing DG.

This is historical baggage. Traditional DG has been an isolated program housed within IT, and thus, constrained within that department’s budget and resources. More significantly, managing DG solely within IT prevented those in the organization with the most knowledge of and investment in the data from participating in the process.

This silo created problems ranging from a lack of context in data cataloging to poor data quality and a sub-par understanding of the data’s associated risks. Data Governance 2.0 addresses these issues by opening data governance to the whole organization.

Its collaborative approach ensures that those with the most significant stake in an organization’s data are intrinsically involved in discovering, understanding, governing and socializing it to produce the desired outcomes. In this era of data-driven business, C-level executives and department leaders are key stakeholders.

But they must be able to trust the data and then collaborate based on their role-specific insights to make informed decisions about strategy, identify new opportunities, address redundancies and improve processes.

So, it all comes back to modern data governance: the ability to understand critical enterprise data within a business context, track its physical existence and lineage, and maximize its value while ensuring quality and security.

Initiative Sponsorship: Encouraging Executive Involvement

This week’s headlines about Facebook have certainly gotten Mark Zuckerberg’s attention, as there are calls for the CEO to appear before the U.S. Congress and British Parliament to answer for his company’s data handling – or mishandling as it is alleged.

Public embarrassment, Federal Trade Commission and GDPR fines, erosion of customer trust/loyalty, revenue loss and company devaluation are real risks when it comes to poor data management and governance practices. Facebook may have just elevated your case for implementing DG 2.0 and involving your executives.

Business heads and their teams, after all, are the ones who have the knowledge about the data – what it is, what it means, who and what processes use it and why, and what rules and policies should apply to it. Without their perspective and participation in data governance, the enterprise’s ability to intelligently lock down risks and enable growth will be seriously compromised.

Appropriately implemented – with business data stakeholders driving alignment between DG and strategic enterprise goals and IT handling the technical mechanics of data management – the door opens to trusting data and using it effectively.

Also, a chief data officer (CDO) can serve as the bridge between IT and the business to remove silos in the drive toward DG and subsequent whole-of-business outcomes. He or she would be the ultimate sponsor, leading the charge for the necessary funding, resources, and support for a successful, ongoing initiative.

Initiative Sponsorship with an ‘EDGE’

Once key business leaders understand and buy into the vital role they play in a Data Governance 2.0 strategy, the work begins: building the infrastructure that enables the workforce and processes to actively govern data assets and align them to the business.

Finding data, mapping it, making sure it’s under control and promoting it to the appropriate personnel requires a technology- and business-enabling platform that covers the entire data governance lifecycle across all data producer and consumer roles.

The erwin EDGE delivers an ‘enterprise data governance experience’ that unifies critical DG domains and uses role-appropriate interfaces to bring stakeholders and processes together, supporting a culture committed to acknowledging data as the mission-critical asset it is. It orchestrates the key mechanisms required to discover, fully understand, actively govern, and effectively socialize and align data to the business.

To assess your organization’s current data governance readiness, take the erwin DG RediChek.

To learn more about the erwin EDGE, reserve your seat for this webinar.


Digital Trust: Enterprise Architecture and the Farm Analogy

With the General Data Protection Regulation (GDPR) taking effect soon, organizations can use it as a catalyst in developing digital trust.

Data breaches are increasing in scope and frequency, creating PR nightmares for the organizations affected. The more data breaches, the more news coverage that stays on consumers’ minds.

The Equifax breach and subsequent stock price fall were well documented and should serve as a warning to businesses about how they manage their data. Large or small, organizations have lessons to learn when it comes to building and maintaining digital trust, especially with GDPR looming ever closer.

Previously, we discussed the importance of fostering a relationship of trust between business and consumer. Here, we focus more specifically on data keepers and the public.

Digital Trust and The Farm Analogy

Any approach to mitigating the risks associated with data management needs to consider the ‘three Vs’: variety, velocity and volume.

In describing best practices for handling data, let’s imagine data as an asset on a farm. The typical farm’s wide span makes constant surveillance impossible, similar in principle to data security.

With a farm, you can’t just put a fence around the perimeter and then leave it alone. The same is true of data: you need a security approach that makes dealing with volume and variety easier.

On a farm, that means separating crops and different types of animals. For data, segregation serves to stop those without permissions from accessing sensitive information.
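Segregation by permissions can be as simple as mapping roles to the data zones they may enter, like fenced-off fields. The roles and zone names below are hypothetical, chosen only to illustrate the check.

```python
# Hypothetical role-to-zone permissions: each data zone is only
# reachable by roles explicitly granted access to it.
PERMISSIONS = {
    "analyst": {"sales_mart", "marketing_mart"},
    "hr_admin": {"hr_records"},
}

def can_access(role: str, zone: str) -> bool:
    """Deny by default: unknown roles and unlisted zones are refused."""
    return zone in PERMISSIONS.get(role, set())

print(can_access("analyst", "sales_mart"))  # True
print(can_access("analyst", "hr_records"))  # False
```

The deny-by-default stance is the farm fence: nothing wanders into a field it wasn't explicitly let into.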

And as with a farm and its seeds, livestock and other assets, data doesn’t just come in to the farm. You also must manage what goes out.

A farm has several gates allowing people, animals and equipment to pass through, pending approval. With data, gates need to make sure only the intended information filters out and that it is secure when doing so. Failure to correctly manage data transfer will leave your business in breach of GDPR and liable for a hefty fine.
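An outbound gate can be sketched as a whitelist: only fields explicitly approved for export pass through, so sensitive attributes never leak by accident. The field names are illustrative assumptions.

```python
# Hypothetical outbound "gate": only whitelisted fields leave the farm.
EXPORTABLE_FIELDS = {"order_id", "product", "quantity"}

def export_record(record: dict) -> dict:
    """Strip every field not explicitly approved for export."""
    return {k: v for k, v in record.items() if k in EXPORTABLE_FIELDS}

raw = {"order_id": 17, "product": "seed", "quantity": 3,
       "customer_email": "a@example.com"}
print(export_record(raw))  # {'order_id': 17, 'product': 'seed', 'quantity': 3}
```

As with the permission check above, the safe default matters: a field added to a system later is blocked at the gate until someone approves it.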

Furthermore, when looking at the gates through which data enters and streams out of an organization, we must also consider the third ‘V’ – velocity, the amount of data an organization’s systems can process at any given time.

Of course, the velocity of data an organization can handle is most often tied to how efficiently it operates. Effectively dealing with high velocities of data requires faster analysis and shorter times to market.

However, it’s arguably a matter of security too. Although not breaches in themselves, DDoS attacks are one vulnerability associated with data velocity.

DDoS attacks are designed to put the aforementioned data gates under pressure, ramping up the amount of data that passes through them at any one time. Organizations with the infrastructure to deal with such an attack – especially infrastructure capable of scaling to demand – will suffer less preventable downtime.
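One classic way a gate protects itself under pressure is rate limiting. Below is a minimal token-bucket sketch: requests spend tokens, tokens refill over time, and a flood is simply refused once the bucket runs dry. It's a generic illustration, not a description of any specific anti-DDoS product.

```python
class TokenBucket:
    """Cap how much traffic a gate accepts per unit time."""
    def __init__(self, capacity: int, refill_per_second: float):
        self.capacity = capacity
        self.tokens = float(capacity)
        self.refill_per_second = refill_per_second
        self.last = 0.0

    def allow(self, now: float) -> bool:
        # Refill proportionally to elapsed time, up to capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_second)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(capacity=2, refill_per_second=1.0)
# A burst of three simultaneous requests: the third is turned away.
print([bucket.allow(t) for t in (0.0, 0.0, 0.0)])  # [True, True, False]
```

Scaling to demand means raising capacity elastically; rate limiting means surviving until you do.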

Enterprise Architecture and Harvesting the Farm

Making sure you can access, understand and use your data for strategic benefit – including fostering digital trust – comes down to effective data management and governance. And enterprise architecture is a great starting point because it provides a holistic view of an organization’s capabilities, applications and systems including how they all connect.

Enterprise architecture at the core of any data-driven business will serve to identify what parts of the farm need extra protections – those fences and gates mentioned earlier.

It also makes GDPR compliance and overall data governance easier, as the first step for both is knowing where all your data is.

For more data management best practices, click here. And you can subscribe to our blog posts here.

Enterprise Architecture for GDPR Compliance

With the May 2018 deadline for the General Data Protection Regulation (GDPR) fast approaching, enterprise architecture (EA) should be high on the priority list for organizations that handle the personal data of citizens in any European Union state.

GDPR compliance requires an overview of why and how personal data is collected, stored, processed and accessed. It also extends to third-party access and determining – within reason – what internal or external threats exist.

Because of EA’s holistic view of an organization and its systems, enterprise architects are primed to take the lead.

Enterprise architecture for GDPR: Data privacy by design

The fragmented nature of data regulation and the discrepancies in standards from country to country made GDPR inevitable. Those same discrepancies make it very likely that come May 2018, your organization will be non-compliant if changes aren’t made now.

So, organizations have two issues to tackle: 1) the finding problem and 2) the filing problem.

First, organizations must understand where all the private, personal and sensitive data resides within all their systems. This also includes all the systems within their respective value chains. Hence, the finding problem.

Second, organizations must address the filing problem, which pertains to how they process data. As well as being a prerequisite for GDPR compliance, tackling the filing problem is essentially a fix to ensure the original finding problem is never as much of a headache again.
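The finding problem usually starts with pattern-based discovery: scanning text for things that look like personal data. The sketch below uses two deliberately simple regular expressions; real discovery tools use far richer rules, context checks and validation.

```python
import re

# Hypothetical patterns for locating personal data in text blobs.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def find_pii(text: str) -> dict:
    """Map each PII category to the matches found in a text blob."""
    hits = {}
    for label, pattern in PII_PATTERNS.items():
        found = pattern.findall(text)
        if found:
            hits[label] = found
    return hits

sample = "Contact jane.doe@example.com, SSN 123-45-6789, ref 42."
print(find_pii(sample))
```

Running discovery like this across systems yields the inventory that the filing problem – how data is processed and routed – then builds on.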

Starting with business requirements (A) and working through to product application (B), organizations have to create an environment whereby data goes from A to B via integral checkpoints to maintain data privacy.

This ensures that through every instance of the application development lifecycle – analysis, design, development, implementation and evaluation – the organization has taken all the necessary steps to ensure GDPR standards are met.

Enterprise architecture provides the framework of data privacy by design. By understanding how your organization’s systems fit together, you’ll see where data is as it moves along the application development lifecycle.

Enterprise architecture for GDPR: The benefits of collaboration

Of course, one of the requirements of GDPR is that compliance and all the steps to it can be demonstrated. Dedicated EA tools have the capacity to model the relevant information.

A dedicated and collaborative enterprise architecture tool takes things to the next level by simplifying the export and sharing of completed models.

But there’s more. Truly collaborative EA tools allow relevant stakeholders (department heads, line managers) directly involved in handling the data of interest to be involved in the modeling process itself. This leads to more accurate reporting, more reliable data, and faster turnaround, all of which have a positive effect on business efficiency and the bottom line.

Approaching GDPR compliance with enterprise architecture does more than complete a chore or tick a box. It becomes an opportunity for continuous business improvement.

In other words, organizations can use enterprise architecture for GDPR as a catalyst for deeper, proactive digital transformation.

erwin partner Sandhill Consultants has produced a three-part webinar series on Navigating the GDPR Waters.

The first webinar covers the identification and classification of personally identifiable information and sensitive information and technologies, such as enterprise architecture, that can assist in identifying and classifying this sort of data.

Click here to access this webinar.

Using Enterprise Architecture to Improve Security

The personal data of more than 143 million people – nearly half the United States’ entire population – may have been compromised in the recent Equifax data breach. With every major data breach comes post-mortems and lessons learned, but one area we haven’t seen discussed is how enterprise architecture might aid in the prevention of data breaches.

For Equifax, the reputational hit, the loss of profits and market value, and the potential lawsuits are really bad news. For other organizations that have yet to suffer a breach, be warned: the clock is ticking for the General Data Protection Regulation (GDPR) to take effect in May 2018. GDPR changes everything, and it’s just around the corner.

Organizations of all sizes must take greater steps to protect consumer data or pay significant penalties. Negligent data governance and data management could cost up to 4 percent of an organization’s annual worldwide turnover or up to 20 million euros, whichever is greater.
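The "whichever is greater" clause is worth spelling out, because for any sizable company the 4 percent figure dominates. A one-line sketch of the upper bound, with illustrative turnover figures:

```python
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Upper bound of a top-tier GDPR fine: the greater of 4% of
    annual worldwide turnover or 20 million euros."""
    return max(0.04 * annual_turnover_eur, 20_000_000)

print(max_gdpr_fine(3_000_000_000))  # 4% of 3bn dominates: 120,000,000
print(max_gdpr_fine(100_000_000))    # the 20m floor applies
```

So the floor protects regulators against small firms shrugging off fines, while the percentage scales the exposure for large ones.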

With this in mind, the Equifax data breach – and subsequent lessons – is a discussion potentially worth millions.

Proactive Data Protection and Cybersecurity

Given that data security has long been considered paramount, it’s surprising that enterprise architecture has been overlooked as an approach to improving data protection.

It’s a surprise because when you consider enterprise architecture use cases and just how much of an organization EA permeates (really all of it), it should be commonplace in data security planning.

So, the Equifax breach provides a great opportunity to explore how enterprise architecture could be used for improving cybersecurity.

Security should be proactive, not reactive, which is why EA should be a huge part of security planning. And while we hope the Equifax incident isn’t the catalyst for an initial security assessment and improvements, it certainly should prompt a re-evaluation of data security policies, procedures and technologies.

By using well-built enterprise architecture for the foundation of data security, organizations can help mitigate risk. EA’s comprehensive view of the organization means security can be involved in the planning stages, reducing risks involved in new implementations. When it comes to security, EA should get a seat at the table.

Enterprise architecture also goes a long way toward nullifying threats born of shadow IT, outdated applications and other IT faux pas. Well-documented, well-maintained EA gives an organization the best possible view of its current tech assets.

This is especially relevant in Equifax’s case, as the breach has been attributed to the company’s failure to update a web application despite sufficient warning to do so.

By leveraging EA, organizations can shore up data security by ensuring updates and patches are implemented proactively.
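A well-maintained EA asset inventory makes that proactive patching check almost mechanical: compare installed versions against the latest known-good ones and flag the stragglers. The inventory entries below are hypothetical examples, not a real asset register.

```python
# Hypothetical EA asset inventory: flag applications running behind
# the latest known-good version so patching is proactive, not reactive.
INVENTORY = [
    {"app": "web-portal", "installed": (2, 3, 1), "latest": (2, 5, 0)},
    {"app": "billing",    "installed": (4, 0, 0), "latest": (4, 0, 0)},
]

def needs_patch(inventory: list) -> list:
    """Return the names of applications that are out of date."""
    # Version tuples compare element-wise, so (2, 3, 1) < (2, 5, 0).
    return [a["app"] for a in inventory if a["installed"] < a["latest"]]

print(needs_patch(INVENTORY))  # ['web-portal']
```

The hard part, of course, is keeping the inventory accurate, which is precisely what well-documented EA provides.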

Enterprise Architecture, Security and Risk Management

But what about existing security flaws? Implementing enterprise architecture in security planning now won’t solve them.

An organization can never eliminate security risks completely. The constantly evolving IT landscape would require businesses to spend an infinite amount of time, resources and money to achieve zero risk. Instead, businesses must opt to mitigate and manage risk to the best of their abilities.

Therefore, EA has a role in risk management too.

In fact, EA’s risk management applications are more widely appreciated than its role in security. But effective EA for risk management is fundamental to how EA supports security.

Enterprise architecture’s comprehensive accounting of business assets (both technological and human) means it’s best placed to align security and risk management with business goals and objectives. This can give an organization insight into where time and money can best be spent in improving security, as well as the resources available to do so.

This is because of the objective view enterprise architecture analysis provides for an organization.

To use a crude but applicable analogy, consider the risks of travel. A fear of flying is more common than a fear of driving. In a business sense, this could unwarrantedly encourage more spending on mitigating the risks of flying. However, an objective enterprise architecture analysis would reveal that, despite the fear, the risk of traveling by car is much greater.
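The objective analysis in that analogy boils down to expected loss: likelihood times consequence. The probabilities and impact figures below are made up for illustration only; the point is that the comparison, not intuition, drives the spending decision.

```python
def expected_loss(probability_per_trip: float, impact: float) -> float:
    """Objective risk measure: likelihood times consequence."""
    return probability_per_trip * impact

# Illustrative (made-up) numbers: a car accident per trip is far more
# likely, even if each flying incident is more catastrophic.
flying = expected_loss(probability_per_trip=1e-7, impact=1_000_000)
driving = expected_loss(probability_per_trip=1e-4, impact=50_000)
print(flying, driving)
assert driving > flying  # the objective analysis contradicts the fear
```

Swap trips for systems and impacts for breach costs, and the same two-factor calculation ranks where security spending belongs.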

Applying the same logic to security spending, enterprise architecture analysis would give an organization an indication of how to prioritize security improvements.