Categories
erwin Expert Blog

What’s the State of Data Governance and Empowerment in 2021?

erwin by Quest just released the “2021 State of Data Governance and Empowerment” report. Building on prior research, we worked with Enterprise Strategy Group (ESG) to understand how organizations are defining, adopting and prioritizing data governance, as well as examine the current drivers and challenges of governing data through its lifecycle and integration points.

It’s safe to say that the world looks a lot different than it did just 15 months ago. Today, data needs to fuel rapid decisions that make an organization more effective, customer-centric and competitive. That was true before COVID-19, and it’s even more important in the face of the radical disruption it’s caused.

As a matter of fact, according to the report, 84% of organizations believe their data represents the best opportunity for gaining a competitive advantage during the next 12 to 24 months.

This past year also saw a major shift as the silos between data governance, data operations and data protection diminished, with enterprises seeking to understand their data and the systems they use and secure to empower smarter decision-making.

Highlighting this shift, 82% of organizations have mostly, if not completely, aligned their data governance and data protection strategies, with 55% of survey respondents citing “data protection” as the term they most closely associate with data governance. Additionally, 85% monitor their databases and other data systems as part of their data governance programs. Furthermore, nearly three-quarters reported a need to dramatically improve data infrastructure.

Data Governance Strategy

What else did we learn?

Data Governance Definition Varies

There is still no consensus on how to define data governance. When asked for a definition, the two most popular responses were:

  • Building a set of policies that governs data.
  • Ensuring data usage follows defined rules.

While neither of these answers is wrong, they continue to illustrate that there’s no standard definition.

At erwin, we define data governance as helping organizations establish a sound yet flexible framework for awareness of and access to available data assets, guidance on their use, and guardrails to ensure data policies and best practices are followed.

A New Level of Maturity

When asked “how mature is your data governance program/what stage are you in,” 42% of organizations said they’ve fully implemented data governance.

That’s in sharp contrast to our last study that showed 38% of data governance programs were a work in progress and 31% were just getting started.

So it appears that enterprise data governance programs have indeed reached a new level of maturity, but the majority (58%) say their data governance programs are evolving. However, if we’ve learned anything, isn’t it that data governance is an ever-evolving, ever-changing tenet of modern business?

Key Bottlenecks and Challenges

We explored the bottlenecks and issues causing delays across the entire data value chain.

Thematically, “data quality” is at the heart of desired data governance outcomes, challenges and bottlenecks, pointing to its overall importance.

Finding, identifying and harvesting data assets, the performance of systems where data is stored, documenting end-to-end data lineage, and visibility into mechanisms to protect data round out the top five bottlenecks to achieving optimal data value.

These are largely consistent with what we learned in our last study, with data system performance and protection making their debut this year as highly relevant problems to address.

With regard to challenges, respondents cite data quality and accuracy, skills shortages/gaps, cost, cultural change and operationalizing data governance – making it a working reality as opposed to a concept – as the top issues to address in maximizing data governance ROI.

Interestingly, 5% said they have no challenges – wouldn’t we like them to share their rose-colored data governance glasses?

Other Key Findings

The report has a lot to unpack, but here is a snapshot of some other key findings:

Time is a major factor.

  • Data stewards still spend too much time on data-related activities, including analyzing (23%), protecting (23%) and searching for data (20%).
  • It takes most business users (e.g., developers, analysts, data scientists) one to two business days to receive the data they request from IT.

More automation opportunities exist.

  • While 42% of organizations have some mix of manual and automated processes, 93% say there’s room to incorporate more automation into their data operations.
  • In line with ensuring trust in data, data quality (27%), data integration (17%), and data preparation (14%) are the three data operations automated the most.

Self-service done right is a game-changer.

  • 93% of organizations have already or plan to leverage self-service in provisioning data, showing that self-service is more important than ever.
  • Seven out of 10 respondents report their organizations’ self-service data provisioning enablement has had a significant business impact.

The Path to Data Empowerment

The report validates Quest’s newly launched Data Empowerment strategy and platform that bridges the gaps between data infrastructure, security and governance initiatives to mitigate risk and unleash more value from data.

Data governance provides visibility, automation, governance and collaboration for data democratization.

As part of a Data Empowerment platform, data governance puts real-time, relevant, role-based data in context in the hands of users to optimize the enterprise data capability.

So with these solutions working in concert, you can ensure the availability of secure, high-quality data to empower everyone in your organization to be more successful – that’s a win-win for employees and the organization in accomplishing its mission.

Click here to download a free copy of the “2021 State of Data Governance and Empowerment” report.

And join us for the Quest Data Empowerment Summit this week, May 18-20. It includes three unique tracks — Data Operations, Data Protection and Data Governance — with sessions about the latest trends, best practices and technologies to align these critical areas and close the gaps between your front and back office.


Integrating SQL and NoSQL into Data Modeling for Greater Business Value: The Latest Release of erwin Data Modeler


Due to the prevalence of internal and external market disruptors, many organizations are aligning their digital transformation and cloud migration efforts with other strategic requirements (e.g., compliance with the General Data Protection Regulation).

Accelerating the retrieval and analysis of data — so much of it unstructured — is vital to becoming a data-driven business that can effectively respond in real time to customers, partners, suppliers and other parties, and profit from these efforts. But even though speed is critical, businesses must take the time to model and document new applications for compliance and transparency.

For decades, data modeling has been the optimal way to design and deploy new relational databases with high-quality data sources and support application development. It facilitates communication between the business and system developers so stakeholders can understand the structure and meaning of enterprise data within a given context. Today, it provides even greater value because critical data exists in both structured and unstructured formats and lives both on premises and in the cloud.

Comparing SQL and NoSQL

While it may not be the most exciting matchup, there’s much to be said when comparing SQL and NoSQL databases. SQL databases use schemas and pre-defined tables, while NoSQL databases take the opposite approach: instead of schemas and tables, they store data in ways that depend on which kind of NoSQL database is being used.
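The schema contrast can be sketched in a few lines of Python. This is a hypothetical illustration: sqlite3 stands in for a relational engine, and a plain JSON document stands in for a document store like MongoDB or Couchbase; all table and field names are made up.

```python
# SQL vs. document-style NoSQL: the same "customer" record two ways.
import json
import sqlite3

# SQL: columns must be declared up front; every row has the same shape.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
conn.execute("INSERT INTO customer VALUES (1, 'Ada', 'ada@example.com')")

# Document store: each record is self-describing JSON; one document can
# carry nested fields (here, an embedded order list) that another omits.
doc = {
    "id": 1,
    "name": "Ada",
    "email": "ada@example.com",
    "orders": [{"sku": "A-100", "qty": 2}],  # nested data, no JOIN needed
}
serialized = json.dumps(doc)

row = conn.execute("SELECT name FROM customer WHERE id = 1").fetchone()
print(row[0])                                      # -> Ada
print(json.loads(serialized)["orders"][0]["qty"])  # -> 2
```

The practical difference: changing the relational record’s shape means an `ALTER TABLE` migration, while the document can simply gain or drop fields per record.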

While the SQL and NoSQL worlds can complement each other in today’s data ecosystem, most enterprises need to focus on building expertise and processes for the latter format.

After all, they’ve already had decades of practice designing and managing SQL databases that emphasize storage efficiency and referential integrity rather than fast data access, which is so important to building cloud applications that deliver real-time value to staff, customers and other parties. Query-optimized modeling is the new watchword when it comes to supporting today’s fast-delivery, iterative and real-time applications.

DBMS products based on rigid schema requirements impede our ability to fully realize business opportunities that can expand the depth and breadth of relevant data streams for conversion into actionable information. New, business-transforming use cases often involve variable data feeds, real-time or near-time processing and analytics requirements, and the scale to process large volumes of data.

NoSQL databases, such as Couchbase and MongoDB, are purpose-built to handle the variety, velocity and volume of these new data use cases. Schema-less or dynamic schema capabilities, combined with increased processing speed and built-in scalability, make NoSQL the ideal platform.

Making the Move to NoSQL

Now the hard part. Once we’ve agreed to make the move to NoSQL, the next step is to identify the architectural and technological implications facing the folks tasked with building and maintaining these new mission-critical data sources and the applications they feed.

As the data modeling industry leader, erwin has identified a critical success factor for the majority of organizations adopting NoSQL platforms like Couchbase, Cassandra and MongoDB. Successfully leveraging this technology requires a significant paradigm shift in how we design NoSQL data structures and deploy the databases that manage them.

But as with most technology requirements, we need to shield the business from the complexity and risk associated with this new approach. The business cares little for the technical distinctions of the underlying data management “black box.”

Business data is business data, with the main concerns being its veracity and value. Accountability, transparency, quality and reusability are required, regardless. Data needs to be trusted, so decisions can be made with confidence, based on facts. We need to embrace this paradigm shift, while ensuring it fits seamlessly into our existing data management practices as well as interactions with our partners within the business. Therefore, the challenge of adopting NoSQL in an organization is two-fold: 1) mastering and managing this new technology and 2) integrating it into an expansive and complex infrastructure.

The Newest Release of erwin Data Modeler

There’s a reason erwin Data Modeler is the No. 1 data modeling solution in the world.

And the newest release delivers all-in-one SQL and NoSQL data modeling, guided denormalization and model-driven engineering support for Couchbase, Cassandra, MongoDB, JSON and AVRO. NoSQL users get all of the great capabilities inherent in erwin Data Modeler. It also provides Data Vault modeling, enhanced productivity, and simplified administration of the data modeling repository.

Now you can rely on one solution for all your enterprise data modeling needs, working across DBMS platforms, using modern modeling techniques for faster data value, and centrally governing all data definition, data modeling and database design initiatives.

erwin data models reduce complexity, making it easier to design, deploy and understand data sources to meet business needs. erwin Data Modeler also automates and standardizes model design tasks, including complex queries, to improve business alignment, ensure data integrity and simplify integration.

In addition to the above, the newest release of erwin Data Modeler by Quest also provides:

  • Updated support and certifications for the latest versions of Oracle, MS SQL Server, MS Azure SQL and MS Azure SQL Synapse
  • JDBC-connectivity options for Oracle, MS SQL Server, MS Azure SQL, Snowflake, Couchbase, Cassandra and MongoDB
  • Enhanced administration capabilities to simplify and accelerate data model access, collaboration, governance and reuse
  • New automation, connectivity, UI and workflow optimization to enhance data modeler productivity by reducing onerous manual tasks

erwin Data Modeler is a proven technology for improving the quality and agility of an organization’s overall data capability – and that includes data governance and data intelligence.

Click here for your free trial of erwin Data Modeler.


Post-Pandemic Enterprise Architecture Priorities


Before the COVID-19 pandemic, many enterprise architects were focused on standardization. Identifying and putting into practice standard approaches to deploying systems, from the IT infrastructure and network protocols to the integration with other components, decreases the time to market for businesses and increases efficiency. In a world where agility and innovation are highly valued, speed is a critical factor for success.

COVID-19 forced many businesses to radically change their business models – or re-evaluate their business processes – shifting the focus of enterprise architects. The top priority became mobility through a cloud-first strategy. By evaluating and deploying the right combination of cloud-based platforms and security tools, enterprise architects played a key role in keeping businesses up and running in a remote-work world.

As the world moves forward, enterprise architecture (EA) is moving with it. The enterprise architect needs to develop an understanding of the organization’s business processes and business architecture. With this understanding, enterprise architects can play a key role in both customer and employee experiences, which are central to growing a business today.

Responding to a Crisis

According to Deloitte’s Enterprise Architecture’s Role in Recovering from a Crisis report, organizations typically respond to a crisis in three phases: respond, recover and thrive.

EA provides a way to drive change through every phase of recovery by aligning technology assets with business needs. Enterprise architects have been a critical component in helping businesses navigate the pandemic to reimagine the business, ensure business continuity, and identify the tools to survive and ultimately thrive in a post-COVID world.

We saw in the first phases of the pandemic how organizations had to navigate business continuity to survive. For example, a COVID EA response plan could have been used to ask: Are employees working from home? What roles do they have? What work do they do? And when are they available?

New Priorities

According to a survey by McKinsey and Co., the pandemic acted as an accelerant for digital transformation efforts, speeding up the adoption of digital technologies by several years.

As the world moves forward, so must enterprise architecture. Instead of focusing on standardization, the enterprise architect must play a key role in both customer and employee experiences, aspects that are central to growing a business.

Three priorities have emerged for enterprise architects as we move into this next phase:

Priority 1: Business Process and Business Architecture

Enterprise architects are accustomed to thinking about technology architecture and processes. With IT now being seen as an enabler of the business, enterprise architects need to think in terms of the customer journey and how people interact with the business across the value chain.

Priority 2: The Application Portfolio

Oversight of the application portfolio is not a new responsibility for many enterprise architects. Understanding the applications you have, the applications in use, and the applications that are ripe for retirement is an important part of running an efficient IT operation.

Priority 3: Risk Management – Security and Compliance

Businesses are paying close attention to risk from internal and external sources. With more connections between systems and companies, more third-party partnerships and more advanced attacks from cybercriminals and nation-states alike, security is top of mind from the boardroom on down.

The New Normal

As we move into recovery mode, organizations are assessing the processes, systems and technologies that will help them assimilate to the new normal and thrive post-pandemic. However, the role and priorities of enterprise architecture likely will continue to evolve to include responsibility for products, deployments and customers, as businesses continue to transform.

Whether documenting systems and technology, designing processes and critical value streams, or managing innovation and change, you need the right tools to turn your enterprise architecture artifacts into insights for better decisions.

erwin Evolve by Quest is a full-featured, configurable enterprise architecture and business process (BP) modeling and analysis software suite that tames complexity, manages change and increases operational efficiency. Its automated visualization, documentation and enterprise collaboration capabilities turn EA and BP artifacts into insights both IT and business users can access in a central location for making strategic decisions.

To learn more about the new priorities for enterprise architects post-pandemic, read our latest white paper: Enterprise Architecture: Setting Transformation-Focused Priorities.

 

[blog-cta header=”erwin Evolve” body=”Click here to request a demo of erwin Evolve.” button=”Request Demo” button_link=”https://s38605.p1254.sites.pressdns.com/erwin-evolve-free-trial/” image=”http://erwin.com/wp-content/uploads/2020/02/evolve-pic.jpg” ]


Data Governance Maturity and Tracking Progress

Data governance is best defined as the strategic, ongoing and collaborative processes involved in managing data’s access, availability, usability, quality and security in line with established internal policies and relevant data regulations.

erwin recently hosted the third in its six-part webinar series on the practice of data governance and how to proactively deal with its complexities. Led by Frank Pörschmann of iDIGMA GmbH, an IT industry veteran and data governance strategist, this latest webinar focused on “Data Governance Maturity & Tracking Progress.”

The webinar looked at how to gauge the maturity and progress of data governance programs and why it is important for both IT and the business to be able to measure success.

Data Governance Is Business Transformation

Data governance is about how an organization uses its data. That includes how it creates or collects data, as well as how its data is stored and accessed. It ensures that the right data of the right quality, regardless of where it is stored or what format it is stored in, is available for use – but only by the right people and for the right purpose.

Quite simply, data governance is business transformation, as Mr. Pörschmann highlights in the webinar, meaning it is a complex system that changes from one stable state into another stable state.

The basic principles of transformation are:

  • Complexity
  • Predictability
  • Synchronicity

However, the practice of data governance is a relatively new discipline that is still evolving. And while its effects will be felt throughout the entire organization, the weight of its impact will be felt differently across the business.

“You have to deal with this ambiguity and it’s volatile,” said Mr. Pörschmann. “Some business units benefit more from data governance than others, and some business units have to invest more energy and resources into the change than others.”

Maturity Levels

Data governance maturity includes the ability to rely on automated and repeatable processes, which ultimately helps to increase productivity. While it has gained traction over the past few years, many organizations are still formalizing it as a practice.

Implementing a data governance initiative can be tricky, so it is important to have clear goals for what you want it to achieve.

According to Mr. Pörschmann, there are six levels of maturity, with one being the lowest.

  1. Aware: Partial awareness of data governance but not yet started
  2. Initiated: Some ad-hoc data governance initiatives
  3. Acknowledged: An official acknowledgement of data governance from executive management with budget allocated
  4. Managed: Dedicated resources, managed and adjusted with KPIs
  5. Monitored: Dedicated resources and performance monitoring
  6. Enhanced: Data managed equally
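The six levels above can be captured as a simple lookup. This is only an illustrative sketch: the level names come from the webinar, but the data structure and helper function are hypothetical, not part of any erwin tool.

```python
# The six data governance maturity levels as a lookup table.
MATURITY_LEVELS = {
    1: "Aware",
    2: "Initiated",
    3: "Acknowledged",
    4: "Managed",
    5: "Monitored",
    6: "Enhanced",
}

def next_target(current_level: int) -> str:
    """Name the next maturity level to aim for (or the top level if already there)."""
    return MATURITY_LEVELS[min(current_level + 1, 6)]

print(next_target(3))  # -> Managed
print(next_target(6))  # -> Enhanced
```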

For a fully mature or “enhanced” data governance program, IT and the business need to take responsibility for selling the benefits of data governance across the enterprise and ensure all stakeholders are properly educated about it. However, IT may have to go it alone, at least initially, educating the business on the risks and rewards, as well as the expectations and accountabilities in implementing it.

To move data governance to the next level, organizations need to discover, understand, govern and socialize data assets. Appropriately implemented — with business stakeholders driving alignment between data governance and strategic enterprise goals and IT handling the technical mechanics of data management — the door opens to trusting data, planning for change, and putting it to work for peak organizational performance.

The Medici Maturity Approach

In a rush to implement a data governance methodology and system, you can forget that a system must serve a process – and be governed/controlled by one.

To choose the correct system and implement it effectively and efficiently, you must know – in every detail – all the processes it will impact, how it will impact them, who needs to be involved and when.

Business, data governance and data leaders want a methodology that is lean, scalable and lightweight. This model has been dubbed the Medici maturity model – named after Romina Medici, head of data management and governance for global energy provider E.ON.

Ms. Medici found that the approaches on the market did not cover transformation challenges, and only a few addressed the operational data management disciplines. Her research also found that it doesn’t make sense to look at your functional disciplines unless you already have a minimum maturity (aware plus initiated levels).

People, process, technology and governance structure sit on one axis, with functional data management disciplines on the other.

The Medici Maturity Approach is a data governance methodology.

Mr. Pörschmann then shared the “Data Maturity Canvas” that incorporates core dimensions, maturity levels and execution disciplines. The first time you run this, you can define the target situation, all the actions needed, and the next best actions.

This methodology gives you a view of the four areas (people, process, technology and governance) so you can link your findings across them. It is an easy method that you can run for different purposes, including:

  • Initial assessment
  • Designing a data governance program
  • Monitoring a whole program
  • Beginning strategy processes
  • Benchmarking

Data governance can be many things to many people. Before starting, decide what your primary objectives are: to enable better decision-making or to help you meet compliance objectives. Or are you looking to reduce data management costs and improve data quality through formal, repeatable processes? Whatever your motivation, you need to identify it first and foremost to get a grip on data governance.

Click here to read our success story on how E.ON used erwin Data Intelligence for its digital transformation and innovation efforts.

Register for the fifth webinar in this series, “Transparency in Data Governance – Data Catalogs & Business Glossaries,” which takes place on April 27. This webinar will discuss how to answer critical questions through data catalogs and business glossaries, powered by effective metadata management. You’ll also see a demo of the erwin Data Intelligence Suite, which includes data catalog, business glossary and metadata-driven automation capabilities.

[blog-cta header=”erwin Data Intelligence” body=”Click here to request a demo of erwin Data Intelligence by Quest.” button=”Request Demo” button_link=”https://s38605.p1254.sites.pressdns.com/erwin-data-intelligence-free-demo/” image=”https://s38605.p1254.sites.pressdns.com/wp-content/uploads/2018/11/iStock-914789708.jpg” ]


How Data Governance Protects Sensitive Data

 

Data governance reduces the risk of sensitive data.

Organizations are managing more data than ever. In fact, the global datasphere is projected to reach 175 zettabytes by 2025, according to IDC.

With more companies increasingly migrating their data to the cloud to ensure availability and scalability, the risks associated with data management and protection also are growing.

How can companies protect their enterprise data assets while ensuring their availability to stewards and consumers, minimizing costs and meeting data privacy requirements?

Data Security Starts with Data Governance

Lack of a solid data governance foundation increases the risk of data-security incidents. An assessment of the data breaches that crop up like weeds each year supports the conclusion that companies, absent data governance, wind up building security architectures strictly from a technical perspective.

Given that every company possesses important information about and relationships with people based on the private data they provide, every business should understand the related risks and protect against them under the banner of data governance – and avoid the costs and reputation damage that data breaches can inflict. That’s especially true as the data-driven enterprise momentum grows along with self-service analytics that enable users to have greater access to information, often using it without IT’s knowledge.

Indeed, with nearly everyone in the enterprise involved either in maintaining or using the company’s data, it only makes sense that both business and IT begin to work together to discover, understand, govern and socialize those assets. This should come as part of a data governance plan that emphasizes making all stakeholders responsible not only for enhancing data for business benefit, but also for reducing the risks that unfettered access to and use of it can pose.

With data catalog and literacy capabilities, you provide the context to keep relevant data private and secure – the assets available, their locations, the relationships between them, associated systems and processes, authorized users and guidelines for usage.

Without data governance, organizations lack the ability to connect the dots across data governance, security and privacy – and to act accordingly. So they can’t answer these fundamental questions:

  • What data do we have and where is it now?
  • Where did it come from and how has it changed?
  • Is it sensitive data or are there any risks associated with it?
  • Who is authorized to use it and how?

When an organization knows what data it has, it can define that data’s business purpose. And knowing the business purpose translates into actively governing personal data against potential privacy and security violations.

Do You Know Where Your Sensitive Data Is?

Data is a valuable asset used to operate, manage and grow a business. While some of it sits at rest in databases, data lakes and data warehouses, a large percentage is federated and integrated across the enterprise, raising management and governance issues that must be addressed.

Knowing where sensitive data is located and properly governing it with policy rules, impact analysis and lineage views is critical for risk management, data audits and regulatory compliance.

For example, understanding and protecting sensitive data is especially critical for complying with privacy regulations like the European Union’s General Data Protection Regulation (GDPR).

The demands GDPR places on organizations are all-encompassing. Protecting what traditionally has been considered personally identifiable information (PII) — people’s names, addresses, government identification numbers and so forth — that a business collects and hosts is just the beginning of GDPR mandates. Personal data now means anything collected or stored that can be linked to an individual (right down to IP addresses), and the term doesn’t only apply to individual pieces of information but also to how they may be combined in revealing relationships. And it isn’t just about protecting the data your business gathers, processes and stores but also any data it may leverage from third-party sources.

When key data isn’t discovered, harvested, cataloged, defined and standardized as part of integration processes, audits may be flawed, putting your organization at risk.

Sensitive data – at rest or in motion – that exists in various forms across multiple systems must be automatically tagged, its lineage automatically documented, and its flows depicted so that it is easily found, and its usage easily traced across workflows.

Fortunately, tools are available to help automate the scanning, detection and tagging of sensitive data by:

  • Monitoring and controlling sensitive data: Better visibility and control across the enterprise to identify data security threats and reduce associated risks
  • Enriching business data elements for sensitive data discovery: Comprehensive mechanism to define business data element for PII, PHI and PCI across database systems, cloud and Big Data stores to easily identify sensitive data based on a set of algorithms and data patterns
  • Providing metadata and value-based analysis: Discovery and classification of sensitive data based on metadata and data value patterns and algorithms. Organizations can define business data elements and rules to identify and locate sensitive data including PII, PHI, PCI and other sensitive information.
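The value-based analysis the bullets describe can be sketched as a pattern scan: test values against a set of regular expressions and tag anything that matches. This is a deliberately simplified illustration (real tools combine value patterns with metadata analysis, and production regexes for PII are far stricter); the pattern names and function are hypothetical.

```python
# Minimal pattern-based sensitive-data discovery: tag values matching PII regexes.
import re

PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def tag_sensitive(values):
    """Return {value: [matched PII tags]} for any value matching a pattern."""
    tags = {}
    for value in values:
        hits = [name for name, pattern in PII_PATTERNS.items() if pattern.search(value)]
        if hits:
            tags[value] = hits
    return tags

sample = ["jane.doe@example.com", "order #1234", "123-45-6789"]
print(tag_sensitive(sample))
# -> {'jane.doe@example.com': ['email'], '123-45-6789': ['ssn']}
```

In a real pipeline the tags would feed lineage documentation and access policies, so flagged columns are masked or restricted downstream.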

Minimizing Risk Exposure with Data Intelligence

Organizations suffering data losses won’t benefit from the money spent on security technologies or the time invested in developing data privacy classifications if they can’t get a handle on how their data is managed.

They also may face heavy fines and other penalties – not to mention bad PR.

Don’t let that happen to your organization.

A well-formed security architecture that is driven by and aligned with data intelligence is your best defense. Being prepared means you can minimize your risk exposure.

With erwin Data Intelligence by Quest, you’ll have an unfettered view of where sensitive data resides with the ability to seamlessly apply privacy rules and create access privileges.

Additionally, with Quest’s acquisition of erwin comes the ability to mask, encrypt, redact and audit sensitive data for an automated and comprehensive solution to resolve sensitive-data issues.

From risk management and regulatory compliance to innovation and digital transformation, you need data intelligence. With erwin by Quest, you will know your data so you can fully realize its business benefits.

[blog-cta header=”erwin Data Intelligence” body=”Click here to request a demo of erwin Data Intelligence by Quest.” button=”Request Demo” button_link=”https://s38605.p1254.sites.pressdns.com/erwin-data-intelligence-free-demo/” image=”https://s38605.p1254.sites.pressdns.com/wp-content/uploads/2018/11/iStock-914789708.jpg” ]


The Value of Data Governance and How to Quantify It

erwin recently hosted the second in its six-part webinar series on the practice of data governance and how to proactively deal with its complexities. Led by Frank Pörschmann of iDIGMA GmbH, an IT industry veteran and data governance strategist, the second webinar focused on “The Value of Data Governance & How to Quantify It.”

As Mr. Pörschmann highlighted at the beginning of the series, data governance works best when it is strongly aligned with the drivers, motivations and goals of the business.

The business drivers and motivation should be the starting point for any data governance initiative. If there is no clear end goal in sight, it will be difficult to get stakeholders on board. And with many competing projects and activities vying for people’s time, it must be clear how data governance activities will directly benefit them.

“Usually we talk about benefits which are rather qualitative measures, but what we need for decision-making processes are values,” Pörschmann says. “We need quantifiable results or expected results that are fact-based. And the interesting thing with data governance, it seems to be easier for organizations and teams to state the expected benefits.”

The Data Governance Productivity Matrix

In terms of quantifying data governance, Pörschmann cites the productivity matrix as a relatively simple way to calculate real numbers. He says, “the basic assumption is if an organization equips their managers with the appropriate capabilities and instruments, then it’s management’s obligation to realize productivity potential over time.”

According to IDC, professionals who work with data spend 80 percent of their time on data discovery, preparation and protection, and only 20 percent on analysis leading to insights.
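
The productivity-matrix argument becomes concrete with back-of-the-envelope math. The sketch below estimates the hours a team gets back for analysis each week; the 40-hour week and the 50 percent automation uplift are illustrative assumptions, not figures from the report.

```python
# Back-of-the-envelope productivity math behind the 80/20 split.
def hours_freed(team_size, prep_share=0.8, automation_uplift=0.5, week_hours=40):
    """Hours per week returned to analysis if automation removes
    `automation_uplift` of the data-preparation workload.
    All defaults are illustrative assumptions."""
    prep_hours = team_size * week_hours * prep_share
    return prep_hours * automation_uplift
```

For a ten-person data team under these assumptions, that is 160 hours a week, the equivalent of four full-time analysts.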

Data governance maturity includes the ability to rely on automated and repeatable processes, which ultimately helps to increase productivity.

For example, automatically importing mappings from developers’ Excel sheets, flat files, Access and ETL tools into a comprehensive mappings inventory, complete with automatically generated and meaningful documentation of the mappings, is a powerful way to support governance while providing real insight into data movement — for data lineage and impact analysis — without interrupting system developers’ normal work methods.

When data movement has been tracked and version-controlled, it’s possible to conduct data archeology — that is, reverse-engineering code from existing XML within the ETL layer — to uncover what has happened in the past and incorporate it into a mapping manager for fast and accurate recovery.
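
The lineage and impact-analysis idea can be sketched in a few lines: treat each imported mapping row as an edge in a graph and walk it downstream. The mapping rows and column names below are hypothetical, standing in for what would be harvested from developers’ spreadsheets or ETL tools.

```python
from collections import defaultdict

# Hypothetical mapping rows as they might be imported from a developer's
# spreadsheet: (source column, target column, transformation note).
mappings = [
    ("crm.customer.email", "stg.customer.email", "passthrough"),
    ("stg.customer.email", "dw.dim_customer.email", "lowercase"),
    ("erp.orders.total",   "dw.fact_orders.total", "sum by day"),
]

downstream = defaultdict(set)   # source -> targets (impact analysis)
upstream = defaultdict(set)     # target -> sources (lineage)
for src, tgt, _ in mappings:
    downstream[src].add(tgt)
    upstream[tgt].add(src)

def impact(column):
    """All columns affected if `column` changes (downstream impact analysis)."""
    seen, stack = set(), [column]
    while stack:
        for nxt in downstream[stack.pop()]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen
```

The same traversal run over `upstream` answers the lineage question in the other direction: where did this column’s data really come from?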

With automation, data professionals can meet the above needs at a fraction of the cost of the traditional, manual way. To summarize, just some of the benefits of data automation are:

  • Centralized and standardized code management with all automation templates stored in a governed repository
  • Better quality code and minimized rework
  • Business-driven data movement and transformation specifications
  • Superior data movement job designs based on best practices
  • Greater agility and faster time to value in data preparation, deployment and governance
  • Cross-platform support of scripting languages and data movement technologies

For example, one global pharmaceutical giant reduced cost by 70 percent and generated 95 percent of production code with “zero touch.” With automation, the company improved the time to business value and significantly reduced the costly re-work associated with error-prone manual processes.

Risk Management and Regulatory Compliance

Risk management, specifically around regulatory compliance, is an important use case to demonstrate the true value of data governance.

According to Pörschmann, risk management asks two main questions.

  1. How likely is a specific event to happen?
  2. What is the impact or damage if this event happens? (e.g., cost of repair, cost of reputation, etc.)

“You have to understand the concept or thinking of risk officers or the risk teams,” he says. Risk teams are process-oriented, and they understand how to calculate and cover IT risks. But to communicate data risks successfully, you need to understand how those teams think in terms of the risk matrix.
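
The risk matrix itself reduces to simple arithmetic: a likelihood score multiplied by an impact score, banded into thresholds. The 1–5 scales and cut-offs below are illustrative; real risk teams define their own.

```python
# A minimal risk-matrix sketch: score = likelihood x impact,
# on illustrative 1-5 scales with illustrative banding thresholds.
def risk_score(likelihood, impact):
    assert 1 <= likelihood <= 5 and 1 <= impact <= 5
    return likelihood * impact

def risk_band(score):
    if score >= 15:
        return "high"    # e.g., a likely GDPR failure with heavy fines
    if score >= 8:
        return "medium"
    return "low"
```

Framing data risks this way lets a data governance team plot, say, a GDPR compliance gap on the same matrix the risk office already uses for IT risks.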

Take the European Union’s General Data Protection Regulation (GDPR) as an example of a data cost. Your team needs to ask, “what is the likelihood that we will fail on data-based activities related to GDPR?” And then ask, “what can we do from the data side to reduce the impact or the total damage?”

But it’s not easy to design and deploy compliance in an environment that’s poorly understood and difficult to maneuver in. Data governance enables organizations to plan and document how they will discover and understand their data within context, track its physical existence and lineage, and maximize its security, quality and value.

With the right technology, organizations can automate and accelerate regulatory compliance in five steps:

  1. Catalog systems. Harvest, enrich/transform and catalog data from a wide array of sources to enable any stakeholder to see the interrelationships of data assets across the organization.
  2. Govern PII “at rest”. Classify, flag and socialize the use and governance of personally identifiable information regardless of where it is stored.
  3. Govern PII “in motion”. Scan, catalog and map personally identifiable information to understand how it moves inside and outside the organization and how it changes along the way.
  4. Manage policies and rules. Govern business terminology in addition to data policies and rules, depicting relationships to physical data catalogs and the applications that use them with lineage and impact analysis views.
  5. Strengthen data security. Identify regulatory risks and guide the fortification of network and encryption security standards and policies by understanding where all personally identifiable information is stored, processed and used.
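
Steps 2 and 4 above can be illustrated with a toy catalog check: every asset tagged as PII at rest should have a governing policy attached, and anything that doesn’t is a compliance gap. The field names and sample assets are invented for the sketch, not an actual erwin schema.

```python
# Toy catalog: each entry records an asset, its sensitivity tags,
# and the governance policies attached to it (all names invented).
catalog = [
    {"asset": "dw.dim_customer.email", "tags": {"PII"}, "policies": {"mask-on-read"}},
    {"asset": "dw.fact_orders.total",  "tags": set(),   "policies": set()},
    {"asset": "stg.patients.dob",      "tags": {"PII"}, "policies": set()},
]

def ungoverned_pii(catalog):
    """Return assets that carry a PII tag but no policy -- a compliance gap."""
    return [row["asset"] for row in catalog
            if "PII" in row["tags"] and not row["policies"]]
```

A report like this, regenerated automatically as the catalog is re-harvested, is what turns the five steps from a one-off project into a repeatable control.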

It’s also important to understand that the benefits of data governance don’t stop with regulatory compliance.

A better understanding of what data you have, where it’s stored and the history of its use and access isn’t only beneficial in fending off non-compliance repercussions. In fact, such an understanding is arguably better put to use proactively.

Data governance improves data quality standards, enables better decision-making and ensures businesses can have more confidence in the data informing those decisions.

[blog-cta header=”erwin DG Webinar Series” body=”Register now for the March 30 webinar ‘Data Governance Maturity & Tracking Progress.'” button=”Register Now” button_link=”https://register.gotowebinar.com/register/8531817018173466635″ image=”https://s38605.p1254.sites.pressdns.com/wp-content/uploads/2018/11/iStock-914789708.jpg” ]


Top Data Management Trends for Chief Data Officers (CDOs)

Chief Data Officer (CDO) 2021 Study

The role of chief data officer (CDO) is becoming essential at forward-thinking organizations — especially those in financial services — according to “The Evolving Role of the CDO at Financial Organizations: 2021 Chief Data Officer (CDO) Study” just released by FIMA and sponsored by erwin.

The e-guide takes a deep dive into the evolving role of CDOs at financial organizations, tapping into the minds of 100+ global financial leaders and C-suite executives to look at the latest trends and provide a roadmap for developing an offensive data management strategy.

Data Governance Is Not Just About Compliance

Interestingly, the report found that 45% of respondents say compliance is now handled so well that it is no longer the top driver for data governance, while 38% say they have fully realized a “governance 2.0” model in which the majority of their compliance concerns are fully automated.

Chief data officers and other data professionals have taken significant steps toward a data governance model that doesn’t just safeguard data but also drives business improvements.

erwin also found this to be the case as revealed in our 2020 “State of Data Governance and Automation” report.

However, while compliance is no longer the top driver of data governance, it still requires a significant investment. According to the CDO report, 88% of organizations devote 40% or more of their data practice’s operating budget to compliance activities.

COVID’s Impact on Data Management

FIMA also looked at 2020 and the pandemic’s impact on data management.

Some financial organizations that were approaching a significant level of data management maturity had to put their initiatives on hold to address more immediate issues. But it led some sectors to innovate, moving processes that were once manual to the digital realm.

The research team asked respondents to describe how their data practices were impacted by the need to adapt to changes in the work environment created by COVID-19. “Overall, most respondents said they avoided any catastrophic impact on their data operations. Most of these respondents note the fact that they had been updating their tools and programs ahead of time to prepare for such risks, and those investments inevitably paid off.”

The respondents who did note that the pandemic caused a disruption repeatedly said that they nonetheless managed to “keep everything in check.” As one CIO at an investment bank puts it, “Data practices became more precise and everyone got more conscious as the pandemic reached its first peak. Key programs have been kept in check and have been restarted securely.”

What Keeps CDOs Up at Night

Financial services organizations are usually at the forefront of data management and governance because they operate in such a heavily regulated environment. So it’s worth knowing what’s on those data executives’ minds, even if your organization is in another sector.

For example, the FIMA study indicates that:

  • 70% of CDOs say risk data aggregation is a primary regulatory concern within the IT departments.
  • Compliance is secondary to overall business improvement, but 88% of organizations devote 40%+ of their data practice’s operating budget to it.
  • Lack of downstream visibility into data consumption (69%) and unclear data provenance and tagging information (65%) are significant challenges.
  • They struggle to apply metadata consistently.
  • Manual processes persist across their data operations.

The e-guide discusses how data executives must not only secure data and meet rigorous data requirements but also find ways to create new business value with it.

All CDOs and other data professionals likely must deal with the challenges mentioned above – plus improve customer outcomes and boost profitability.

Both mitigating risk and unleashing potential are possible with the right tools, including data catalog, data literacy and metadata-driven automation capabilities for data governance and any other data-centric use case.

Harmonizing Data Management and Data Governance Processes

With erwin Data Intelligence by Quest, your organization can harness and activate your data in a single, unified catalog and then make it available to your data communities in context and in line with business requirements.

The solution harmonizes data management and governance processes to fuel an automated, real-time, high-quality data pipeline enterprise stakeholders can tap into for the information they need to achieve results. Such data intelligence leads to faster, smarter decisions to improve overall organizational performance.

Data is governed properly throughout its lifecycle, meeting both offensive and defensive data management needs. erwin Data Intelligence provides total data visibility, end-to-end data lineage and provenance.

To download the full “The Evolving Role of the CDO at Financial Organizations: 2021 Chief Data Officer (CDO) Study,” please visit: https://go.erwin.com/the-evolving-role-of-the-cdo-at-financial-organizations-report.

[blog-cta header=”Free trial of erwin Data Intelligence” body=”Improve enterprise data access, literacy and knowledge to support data governance, digital transformation and other critical initiatives.” button=”Start Free Demo” button_link=”https://s38605.p1254.sites.pressdns.com/erwin-data-intelligence-free-demo/” image=”https://s38605.p1254.sites.pressdns.com/wp-content/uploads/2018/11/iStock-914789708.jpg” ]


Honoring Women in Tech: Challenging IT Industry Perceptions

International Women’s Day is a global celebration of the social, economic, cultural and political achievements of women. Celebrated this year on Monday, March 8, the theme is #ChooseToChallenge. That motivated us to honor women in technology, including one of our inspiring customer advocates who is herself choosing to challenge limitations in how we view data and IT.

“I try to treat people the way I’d like to be treated and try to walk in their shoes.”

Meet Libby McNairy … 

Championing learning for women in tech

Libby has been a Data Governance Specialist for about 20 years and has always worked in some capacity in healthcare and healthcare information technology. She began her career in the healthcare industry managing large physician practices where she quickly realized that data could increase productivity as well as patient outcomes.

Libby is passionate about working with the right tools to capture the insights that are embedded in data so organizations run more effectively, while also making sure companies are compliant and as healthy as they can be. Having worked with both hardware and software, Libby has seen IT from numerous vantage points.

We had the opportunity to meet with Libby and ask her a few questions about her #ChoosetoChallenge commitments and passion for women in tech.

Q: Can you tell us a little bit about yourself and how long you’ve been at the company you’re at now?

I recently started working with pharmaceutical company Alkermes where I am assisting with the startup of a data governance program for the commercial organization.  I am extremely proud to be working for them, not only for their commitment to data governance but for their commitment to mental health and addiction.

My career has been spent in healthcare and healthcare information technology in some way, and over the years my love for IT just grew, along with my understanding that protecting the data we gather is a must.

Q: Did you always know that working in technology was what you wanted to do?

I have always been fascinated by data, and in healthcare there is so much of it. But the question was how do you get to it, use it and create something out of it that helps the business make decisions? With healthcare, as with most other industries, the issue is that you have multiple systems and multiple types of data. How do you merge those to tell a story? How do you know if a term means the same in one system as it does in another? And once you are done with all the manipulation, where did it really come from? As a person who loves process and procedure, I loved tying it all together and seeing the importance of governing that data so that when we use it, it can be trusted.

Q: What are some of the challenges that you relish in your work?

I have had to teach myself about the different aspects of IT. When those job descriptions say, “other duties as assigned”, well I think that has applied to me.  I am not an engineer or developer but worked with a group of engineers for a couple of years, so I had to learn their language and perspective.  Do not be afraid to take on a new challenge, ask questions, dig and learn and then pay that forward.

I have evolved and grown as I have learned new skills and believe me, I am so grateful for those opportunities.  I think you must get out of your comfort zone and understand about what comes before what you do and what comes after.

Q: Do you have a guiding quote you live by or a saying that’s meaningful?

Well, I guess I try to treat people the way I’d like to be treated and try to walk in their shoes. Until you understand what they do and what they’re trying to accomplish, you can’t expect things out of people. In governance we have a saying: “It’s a process, not a project.” I think life is like that. We are always in the process, and that can change daily.

Q: What are some things that you have chosen to challenge?

In my life, as I mentioned previously, mental health and addiction issues are important after losing family.  We must be more compassionate about those.

My challenge in my career is that we as women help other women to move up the ladder in IT and to get young women interested in technology as a career.  I think we need to work with our local universities, jr. colleges, technical schools and help mentor.

Q: What advice would you give to women entering the tech field?

As I mentioned above, I have been lucky because I have always had some great mentors.   I encourage women to find a mentor, and for those in the field to look for someone to be a good mentor to.  Also, do not limit yourself to what you were doing in your first IT job. Take advantage of learning all you can…. “Never stop learning”.  If you do not know much about Data Governance – it is a rewarding career.

Another woman in tech #ChoosingToChallenge the narrow definition of what it means to be in IT. Thank you, Libby!


The What & Why of Data Governance

Modern data governance is a strategic, ongoing and collaborative practice that enables organizations to discover and track their data, understand what it means within a business context, and maximize its security, quality and value.

It is the foundation for regulatory compliance and de-risking operations for competitive differentiation and growth.

However, while digital transformation and other data-driven initiatives are desired outcomes, few organizations know what data they have or where it is, and they struggle to integrate known data in various formats and numerous systems – especially if they don’t have a way to automate those processes.

But when IT-driven data management and business-oriented data governance work together in terms of personnel, processes and technology, decisions can be made and their impacts determined based on a full inventory of reliable information.

Recently, erwin held the first in a six-part webinar series on the practice of data governance and how to proactively deal with its complexities. Led by Frank Pörschmann of iDIGMA GmbH, an IT industry veteran and data governance strategist, it examined “The What & Why of Data Governance.”

The What: Data Governance Defined

Data governance has no standard definition. However, Dataversity defines it as “the practices and processes which help to ensure the formal management of data assets within an organization.”

At erwin by Quest, we further break down this definition by viewing data governance as a strategic, continuous commitment to ensuring organizations are able to discover and track data, accurately place it within the appropriate business context(s), and maximize its security, quality and value.

Mr. Pörschmann asked webinar attendees to stop trying to explain what data governance is to executives and clients. Instead, he suggested putting data governance into real-world scenarios by asking: “What is the problem you believe data governance is the answer to?” Or “How would you recognize having effective data governance in place?”

In essence, Mr. Pörschmann laid out the “enterprise data dilemma,” which stems from three important but difficult questions for an enterprise to answer: What data do we have? Where is it? And how do we get value from it?

Asking how you recognize having effective data governance in place is quite helpful in executive discussions, according to Mr. Pörschmann. When you talk about that question at a high level, he says, you get a very simple answer: “the only thing we want to have is the right data with the right quality to the right person at the right time at the right cost.”

The Why: Data Governance Drivers

Why should companies care about data governance?

erwin’s 2020 State of Data Governance and Automation report found that better decision-making is the primary driver for data governance (62 percent), with analytics secondary (51 percent), and regulatory compliance coming in third (48 percent).

In the webinar, Mr. Pörschmann called out that the drivers of data governance are the same as those for digital transformation initiatives. “This is not surprising at all,” he said. “Because data is one of the success elements of a digital agenda or digital transformation agenda. So without having data governance and data management in place, no full digital transformation will be possible.”

Drivers of data governance

Data Privacy Regulations

While compliance is not the No. 1 driver for data governance, it’s still a major factor – especially since the rollout of the European Union’s General Data Protection Regulation (GDPR) in 2018.

According to Mr. Pörschmann, many decision-makers believe that if they get GDPR right, they’ll be fine and can move on to other projects. But he cautions that “this [notion] is something which is not really likely to happen.”

For the EU, he warned, organizations need to prepare for the Digital Single Market, agreed on last year by the European Parliament and Commission. With it come clear rules on data access and exchange, especially across digital platforms, as well as regulations and instruments to enforce data ownership. He noted, “Companies will be forced to share some specific data which is relevant for public security, i.e., reduction of carbon dioxide. So companies will be forced to classify their data and to find mechanisms to share it with such platforms.”

GDPR is also proving to be the de facto model for data privacy across the United States. The new Virginia Consumer Data Protection Act, which was modeled on the California Consumer Privacy Act (CCPA), and the California Privacy Rights Act (CPRA) both share many of the same requirements as GDPR.

Like CCPA, the Virginia bill would give consumers the right to access their data, correct inaccuracies, and request the deletion of information. Virginia residents also would be able to opt out of data collection.

Nevada, Vermont, Maine, New York, Washington, Oklahoma and Utah also are leading the way with some type of consumer privacy regulation. Several other bills are on the legislative docket in Alabama, Arizona, Florida, Connecticut and Kentucky, all of which follow a similar format to the CCPA.

Stop Wasting Time

In addition to drivers like digital transformation and compliance, it’s really important to look at the effect of poor data on enterprise efficiency/productivity.

Respondents to McKinsey’s 2019 Global Data Transformation Survey reported that an average of 30 percent of their total enterprise time was spent on non-value-added tasks because of poor data quality and availability.

Wasted time is also an unfortunate reality for many data stewards, who spend 80 percent of their time finding, cleaning and reorganizing huge amounts of data, and only 20 percent of their time on actual data analysis.

According to erwin’s 2020 report, about 70 percent of respondents – a combination of roles from data architects to executive managers – said they spent an average of 10 or more hours per week on data-related activities.

The Benefits of erwin Data Intelligence

erwin Data Intelligence by Quest supports enterprise data governance, digital transformation and any effort that relies on data for favorable outcomes.

The software suite combines data catalog and data literacy capabilities for greater awareness of and access to available data assets, guidance on their use, and guardrails to ensure data policies and best practices are followed.

erwin Data Intelligence automatically harvests, transforms and feeds metadata from a wide array of data sources, operational processes, business applications and data models into a central catalog. Then it is accessible and understandable via role-based, contextual views so stakeholders can make strategic decisions based on accurate insights.

You can request a demo of erwin Data Intelligence here.

[blog-cta header=”Webinar: The Value of Data Governance & How to Quantify It” body=”Join us March 15 at 10 a.m. ET for the second webinar in this series, “The Value of Data Governance & How to Quantify It.” Mr. Pörschmann will discuss how justifying a data governance program requires building a solid business case in which you can prove its value.” button=”Register Now” button_link=”https://attendee.gotowebinar.com/register/5489626673791671307″ image=”https://s38605.p1254.sites.pressdns.com/wp-content/uploads/2018/11/iStock-914789708.jpg” ]


6 Steps to Building a Great Enterprise Architecture Practice

Enterprise architecture provides business and IT alignment by mapping applications, technologies and data to the value streams and business functions they support. It defines business capabilities and interdependencies as they relate to enterprise strategy, bridging the gap between ideation and implementation.

An effective enterprise architecture framework provides a blueprint for business and operating models, identifies risks and opportunities, and enables the creation of technology roadmaps. Simply put, it enables IT and business transformation by helping technology and business innovation leaders focus on achieving successful, value-driven outcomes.

As an enterprise moves and shifts, enterprise architecture is central to managing change and addressing key issues facing organizations. Today, enterprises are trying to grow and innovate – while cutting costs and managing compliance – in the midst of a global pandemic.

How Enterprise Architecture Guides QAD

Scott Lawson, Director of IT Architecture for QAD, which provides ERP and other adaptive, cloud-based enterprise software and services for global manufacturing companies, recently shared how he and his company use enterprise architecture for “X-ray vision into the enterprise.”

“We use the architecture of the moment, the stuff that we have in our website to understand what the enterprise is today. It is what it is today, and then we move and use that information to figure out what it’s going to be tomorrow. But we don’t have this compare and contrast because it’s a reference,” he said.

QAD uses the Zachman Framework, considered an “ontology” or “schema” for organizing enterprise architecture artifacts such as documents, specifications and models. The framework has helped QAD build a strong practice.

Based on QAD’s success, Lawson explains the six steps that any organization can take to solidify its enterprise architecture:

1. Define your goals. (WHY) While Zachman poses this as the final question, QAD opted to address it first. The reason for the “why” was not only to have visibility into the enterprise, but to change it, to make it better and more efficient. The goal of enterprise architecture for QAD was to add visibility. The team cataloged all of its systems, which departments used them and how they communicated with one another, then built a large physical map with all of that information.

2. Define the objects you will collect. (WHAT) Lawson says, “the zero step there is to determine what things you’re going to make a list of. You can’t make a list of everything.”

3. Define your team and the methods to build the pieces. (HOW) There are fundamental questions to ask: How are you going to create it? Are you going to do it manually? Are you going to buy a tool that will collect all the information? Are you going to hire consultants? What are the methods you’re going to use, and how are you going to build those pieces together? Lawson advises that enterprise architecture needs to be a consistent practice. His team does some architecture every day.

4. Define your team and stakeholders. (WHO) Who is going to be the recipient of your architecture, and who is going to be the creator of your architecture? When building a great practice, involve other departments, suggests Lawson. While his department is IT, they reach out to a lot of other departments around the company and ask them about their processes and document those processes for them.

5. Define the tools, artifacts and deliverables. (WHERE) According to Lawson, you have to define where this information will live, what tools you will use, and what artifacts and deliverables you will produce. An artifact is different from a deliverable: an artifact is a single unit of information (e.g., one artifact might be a list of servers), while deliverables, such as diagrams and reports, are what you send out. Either way, it’s a good idea to define them upfront.

6. Define the time scale of your models: as is, to be, both or one-off. (WHEN) What time scale do you want? QAD does an “as-is” architecture (i.e., what is happening today) and keeps it up to date by collecting information from multiple systems in an automated fashion.
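
The “as-is” inventory QAD describes can be pictured as a simple map from applications to the business functions they support, queryable in either direction. The application and function names below are invented for the sketch.

```python
# A toy "as-is" architecture inventory: applications mapped to the
# business functions they support (all names are invented examples).
architecture = {
    "ERP":       {"Finance", "Manufacturing"},
    "CRM":       {"Sales", "Support"},
    "Data Lake": {"Finance", "Sales", "Analytics"},
}

def functions_impacted(app):
    """Which business functions feel it if this application changes?"""
    return sorted(architecture.get(app, set()))

def apps_supporting(function):
    """Which applications must be considered when this function changes?"""
    return sorted(app for app, fns in architecture.items() if function in fns)
```

Even a structure this small shows why keeping the map current matters: every change question in steps 1 through 6 reduces to a lookup against it.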

Using erwin Evolve

QAD is an erwin Evolve customer. erwin Evolve is a full-featured, configurable set of enterprise architecture and business process modeling and analysis tools. With it, you can map IT capabilities to the business functions they support and determine how people, processes, data, technologies and applications interact to ensure alignment in achieving enterprise objectives.

With erwin Evolve you can:

  • Harmonize enterprise architecture/business process modeling capabilities for greater visibility, control and intelligence in managing any use case.
  • Quickly and easily explore model elements, links and dependencies.
  • Identify and understand the impact of changes. Increase employee education and awareness, helping maintain institutional knowledge.
  • Democratize content to facilitate broader enterprise collaboration for better decision-making.
  • Achieve faster time to actionable insights and value with integrated views across initiatives.
  • Record end-to-end processes and assign responsibilities and owners to them.
  • Improve performance and profitability with harmonized, optimized and visible processes.

To replay QAD’s session from the erwin Insights global conference on enterprise modeling and data governance and intelligence, which covers the six steps above and more about their use of enterprise architecture and erwin Evolve, click here.

[blog-cta header=”Free no-risk trial of erwin Evolve” body=”If you’d like to start turning your enterprise architecture and business process artifacts into insights for better decisions, you can start a no-risk trial of erwin Evolve” button=”Start Free Trial” button_link=”https://s38605.p1254.sites.pressdns.com/erwin-evolve-free-trial/” image=”https://s38605.p1254.sites.pressdns.com/wp-content/uploads/2020/02/evolve-pic.jpg” ]