Categories
erwin Expert Blog

What’s the State of Data Governance and Empowerment in 2021?

erwin by Quest just released the “2021 State of Data Governance and Empowerment” report. Building on prior research, we worked with Enterprise Strategy Group (ESG) to understand how organizations are defining, adopting and prioritizing data governance, as well as examine the current drivers and challenges of governing data through its lifecycle and integration points.

It’s safe to say that the world looks a lot different than it did just 15 months ago. Today, data needs to fuel rapid decisions that make an organization more effective, customer-centric and competitive. That was true before COVID-19, and it’s even more important in the face of the radical disruption it’s caused.

As a matter of fact, according to the report, 84% of organizations believe their data represents the best opportunity for gaining a competitive advantage during the next 12 to 24 months.

This past year also saw a major shift as the silos between data governance, data operations and data protection diminished, with enterprises seeking to understand their data and the systems they use and secure to empower smarter decision-making.

Highlighting this shift, 82% of organizations have mostly, if not completely, aligned their data governance and data protection strategies, with 55% of survey respondents citing “data protection” as the term they most closely associate with data governance. Additionally, 85% monitor their databases and other data systems as part of their data governance programs. Furthermore, nearly three-quarters reported a need to dramatically improve data infrastructure.


What else did we learn?

Data Governance Definition Varies

There is still no consensus on how to define data governance. When asked for a definition, the two most popular responses were:

  • Building a set of policies that governs data.
  • Ensuring data usage follows defined rules.

While neither of these answers is wrong, they continue to illustrate that there’s no standard definition.

At erwin, we define data governance as helping organizations establish a sound yet flexible framework for awareness of and access to available data assets, guidance on their use, and guardrails to ensure data policies and best practices are followed.

A New Level of Maturity

When asked “how mature is your data governance program/what stage are you in,” 42% of organizations said they’ve fully implemented data governance.

That’s in sharp contrast to our last study that showed 38% of data governance programs were a work in progress and 31% were just getting started.

So it appears that enterprise data governance programs have indeed reached a new level of maturity, but the majority (58%) say their data governance programs are evolving. However, if we’ve learned anything, isn’t it that data governance is an ever-evolving, ever-changing tenet of modern business?

Key Bottlenecks and Challenges

We explored the bottlenecks and issues causing delays across the entire data value chain.

Thematically, “data quality” is at the heart of desired data governance outcomes, challenges and bottlenecks, pointing to its overall importance.

Finding, identifying and harvesting data assets, the performance of systems where data is stored, documenting end-to-end data lineage, and visibility into mechanisms to protect data round out the top five bottlenecks to achieving optimal data value.

These are largely consistent with what we learned in our last study, with data system performance and protection making their debut this year as highly relevant problems to address.

With regard to challenges, respondents cite data quality and accuracy, skills shortages/gaps, cost, cultural change and operationalizing data governance – making it a working reality as opposed to a concept – as the top issues to address in maximizing data governance ROI.

Interestingly, 5% said they have no challenges – wouldn’t we like them to share their rose-colored data governance glasses?

Other Key Findings

The report has a lot to unpack, but here is a snapshot of some other key findings:

Time is a major factor.

  • Data stewards still spend too much time on data-related activities, including analyzing (23%), protecting (23%) and searching for data (20%).
  • It takes most business users (e.g., developers, analysts, data scientists) one to two business days to receive the data they request from IT.

More automation opportunities exist.

  • While 42% of organizations have some mix of manual and automated processes, 93% say there’s room to incorporate more automation into their data operations.
  • In line with ensuring trust in data, data quality (27%), data integration (17%), and data preparation (14%) are the three data operations automated the most.

Self-service done right is a game-changer.

  • 93% of organizations have already or plan to leverage self-service in provisioning data, showing that self-service is more important than ever.
  • Seven out of 10 respondents report their organizations’ self-service data provisioning enablement has had a significant business impact.

The Path to Data Empowerment

The report validates Quest’s newly launched Data Empowerment strategy and platform that bridges the gaps between data infrastructure, security and governance initiatives to mitigate risk and unleash more value from data.

Data governance provides visibility, automation, governance and collaboration for data democratization.

As part of a Data Empowerment platform, data governance puts real-time, relevant, role-based data in context in the hands of users to optimize the enterprise data capability.

So with these solutions working in concert, you can ensure the availability of secure, high-quality data to empower everyone in your organization to be more successful – that’s a win-win for employees and the organization in accomplishing its mission.

Click here to download a free copy of the “2021 State of Data Governance and Empowerment” report.

And join us for the Quest Data Empowerment Summit this week, May 18-20. It includes three unique tracks – Data Operations, Data Protection and Data Governance – with sessions about the latest trends, best practices and technologies to align these critical areas and close the gaps between your front and back office.


Data Governance Maturity and Tracking Progress

Data governance is best defined as the strategic, ongoing and collaborative processes involved in managing data’s access, availability, usability, quality and security in line with established internal policies and relevant data regulations.

erwin recently hosted the third in its six-part webinar series on the practice of data governance and how to proactively deal with its complexities. Led by Frank Pörschmann of iDIGMA GmbH, an IT industry veteran and data governance strategist, this latest webinar focused on “Data Governance Maturity & Tracking Progress.”

The webinar looked at how to gauge the maturity and progress of data governance programs and why it is important for both IT and the business to be able to measure success.

Data Governance Is Business Transformation

Data governance is about how an organization uses its data. That includes how it creates or collects data, as well as how its data is stored and accessed. It ensures that the right data of the right quality, regardless of where it is stored or what format it is stored in, is available for use – but only by the right people and for the right purpose.

Quite simply, data governance is business transformation, as Mr. Pörschmann highlights in the webinar: a complex system that changes from one stable state into another stable state.

The basic principles of transformation are:

  • Complexity
  • Predictability
  • Synchronicity

However, the practice of data governance is a relatively new discipline that is still evolving. And while its effects will be felt throughout the entire organization, the weight of its impact will be felt differently across the business.

“You have to deal with this ambiguity and it’s volatile,” said Mr. Pörschmann. “Some business units benefit more from data governance than others, and some business units have to invest more energy and resources into the change than others.”

Maturity Levels

Data governance maturity includes the ability to rely on automated and repeatable processes, which ultimately helps to increase productivity. While it has gained traction over the past few years, many organizations are still formalizing it as a practice.

Implementing a data governance initiative can be tricky, so it is important to have clear goals for what you want it to achieve.

According to Mr. Pörschmann, there are six levels of maturity, with one being the lowest.

  1. Aware: Partial awareness of data governance but not yet started
  2. Initiated: Some ad-hoc data governance initiatives
  3. Acknowledged: An official acknowledgement of data governance from executive management with budget allocated
  4. Managed: Dedicated resources, managed and adjusted with KPIs
  5. Monitored: Dedicated resources and performance monitoring
  6. Enhanced: Data managed equally
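For illustration only, the six levels above could be encoded as an ordered scale so a program can track where it stands and what the next step is. This is a hypothetical sketch; the level names follow the webinar’s list, and `next_target` is an invented helper, not part of any erwin product:

```python
from enum import IntEnum

class MaturityLevel(IntEnum):
    """Six data governance maturity levels, 1 = lowest."""
    AWARE = 1          # partial awareness, not yet started
    INITIATED = 2      # some ad-hoc initiatives
    ACKNOWLEDGED = 3   # executive acknowledgement, budget allocated
    MANAGED = 4        # dedicated resources, adjusted with KPIs
    MONITORED = 5      # dedicated resources plus performance monitoring
    ENHANCED = 6       # fully mature program

def next_target(current: MaturityLevel) -> MaturityLevel:
    """Return the next level to aim for; an 'enhanced' program stays put."""
    return MaturityLevel(min(current + 1, MaturityLevel.ENHANCED))
```

Because the levels form an ordered scale, comparisons like `current >= MaturityLevel.MANAGED` work directly when deciding, say, whether KPI tracking should already be in place.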

For a fully mature or “enhanced” data governance program, IT and the business need to take responsibility for selling the benefits of data governance across the enterprise and ensure all stakeholders are properly educated about it. However, IT may have to go it alone, at least initially, educating the business on the risks and rewards, as well as the expectations and accountabilities in implementing it.

To move data governance to the next level, organizations need to discover, understand, govern and socialize data assets. Appropriately implemented — with business stakeholders driving alignment between data governance and strategic enterprise goals and IT handling the technical mechanics of data management — the door opens to trusting data, planning for change, and putting it to work for peak organizational performance.

The Medici Maturity Approach

In a rush to implement a data governance methodology and system, you can forget that a system must serve a process – and be governed/controlled by one.

To choose the correct system and implement it effectively and efficiently, you must know – in every detail – all the processes it will impact, how it will impact them, who needs to be involved and when.

Business, data governance and data leaders want a methodology that is lean, scalable and lightweight. This model has been dubbed the Medici maturity model – named after Romina Medici, head of data management and governance for global energy provider E.ON.

Ms. Medici found that the approaches on the market did not cover transformation challenges, and only a few addressed the operational data management disciplines. Her research also found that it doesn’t make sense to look at your functional disciplines unless you already have a minimum maturity (aware plus initiated levels).

People, process, technology and governance structure are on one side of the axis, with functional data management disciplines on the other.


Mr. Pörschmann then shared the “Data Maturity Canvas” that incorporates core dimensions, maturity levels and execution disciplines. The first time you run this, you can define the target situation, all the actions needed, and the next best actions.

This methodology gives you a view of the four areas (people, process, technology and governance) so you can link your findings across them. It is an easy method that you can run for different purposes, including:

  • Initial assessment
  • Designing a data governance program
  • Monitoring a whole program
  • Beginning strategy processes
  • Benchmarking

Data governance can be many things to many people. Before starting, decide what your primary objectives are: to enable better decision-making or to help you meet compliance objectives. Or are you looking to reduce data management costs and improve data quality through formal, repeatable processes? Whatever your motivation, you need to identify it first and foremost to get a grip on data governance.

Click here to read our success story on how E.ON used erwin Data Intelligence for its digital transformation and innovation efforts.

Register for the fifth webinar in this series, “Transparency in Data Governance – Data Catalogs & Business Glossaries,” which takes place on April 27. This webinar will discuss how to answer critical questions through data catalogs and business glossaries, powered by effective metadata management. You’ll also see a demo of the erwin Data Intelligence Suite, which includes a data catalog, business glossary and metadata-driven automation.

[blog-cta header=”erwin Data Intelligence” body=”Click here to request a demo of erwin Data Intelligence by Quest.” button=”Request Demo” button_link=”https://s38605.p1254.sites.pressdns.com/erwin-data-intelligence-free-demo/” image=”https://s38605.p1254.sites.pressdns.com/wp-content/uploads/2018/11/iStock-914789708.jpg” ]


How Data Governance Protects Sensitive Data

 

Data governance reduces the risks associated with sensitive data.

Organizations are managing more data than ever. In fact, the global datasphere is projected to reach 175 zettabytes by 2025, according to IDC.

With more companies increasingly migrating their data to the cloud to ensure availability and scalability, the risks associated with data management and protection also are growing.

How can companies protect their enterprise data assets and ensure their availability to stewards and consumers, while minimizing costs and meeting data privacy requirements?

Data Security Starts with Data Governance

Lack of a solid data governance foundation increases the risk of data-security incidents. An assessment of the data breaches that crop up like weeds each year supports the conclusion that companies, absent data governance, wind up building security architectures strictly from a technical perspective.

Given that every company holds important information about, and relationships with, people based on the private data they provide, every business should understand the related risks and protect against them under the banner of data governance, avoiding the costs and reputational damage that data breaches can inflict. That’s especially true as the data-driven enterprise momentum grows along with self-service analytics that enable users to have greater access to information, often using it without IT’s knowledge.

Indeed, with nearly everyone in the enterprise involved either in maintaining or using the company’s data, it only makes sense that both business and IT begin to work together to discover, understand, govern and socialize those assets. This should come as part of a data governance plan that emphasizes making all stakeholders responsible not only for enhancing data for business benefit, but also for reducing the risks that unfettered access to and use of it can pose.

With data catalog and literacy capabilities, you provide the context to keep relevant data private and secure – the assets available, their locations, the relationships between them, associated systems and processes, authorized users and guidelines for usage.

Without data governance, organizations lack the ability to connect the dots across data governance, security and privacy – and to act accordingly. So they can’t answer these fundamental questions:

  • What data do we have and where is it now?
  • Where did it come from and how has it changed?
  • Is it sensitive data or are there any risks associated with it?
  • Who is authorized to use it and how?

When an organization knows what data it has, it can define that data’s business purpose. And knowing the business purpose translates into actively governing personal data against potential privacy and security violations.

Do You Know Where Your Sensitive Data Is?

Data is a valuable asset used to operate, manage and grow a business. While some of it rests in databases, data lakes and data warehouses, a large percentage is federated and integrated across the enterprise, creating management and governance issues that must be addressed.

Knowing where sensitive data is located and properly governing it with policy rules, impact analysis and lineage views is critical for risk management, data audits and regulatory compliance.

For example, understanding and protecting sensitive data is especially critical for complying with privacy regulations like the European Union’s General Data Protection Regulation (GDPR).

The demands GDPR places on organizations are all-encompassing. Protecting what traditionally has been considered personally identifiable information (PII) — people’s names, addresses, government identification numbers and so forth — that a business collects and hosts is just the beginning of GDPR mandates. Personal data now means anything collected or stored that can be linked to an individual (right down to IP addresses), and the term doesn’t only apply to individual pieces of information but also to how they may be combined in revealing relationships. And it isn’t just about protecting the data your business gathers, processes and stores but also any data it may leverage from third-party sources.

When key data isn’t discovered, harvested, cataloged, defined and standardized as part of integration processes, audits may be flawed, putting your organization at risk.

Sensitive data – at rest or in motion – that exists in various forms across multiple systems must be automatically tagged, its lineage automatically documented, and its flows depicted so that it is easily found, and its usage easily traced across workflows.

Fortunately, tools are available to help automate the scanning, detection and tagging of sensitive data by:

  • Monitoring and controlling sensitive data: Better visibility and control across the enterprise to identify data security threats and reduce associated risks
  • Enriching business data elements for sensitive data discovery: Comprehensive mechanism to define business data element for PII, PHI and PCI across database systems, cloud and Big Data stores to easily identify sensitive data based on a set of algorithms and data patterns
  • Providing metadata and value-based analysis: Discovery and classification of sensitive data based on metadata and data value patterns and algorithms. Organizations can define business data elements and rules to identify and locate sensitive data including PII, PHI, PCI and other sensitive information.
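As a rough illustration of the metadata- and value-based analysis described above, a scanner might combine column-name hints with value patterns to tag candidate sensitive data. The names and patterns below are hypothetical and far simpler than what a production tool ships with:

```python
import re

# Illustrative patterns only; real discovery tools combine many more
# algorithms and data patterns for PII, PHI and PCI.
VALUE_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "US_SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "CREDIT_CARD": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}
# Metadata-based hints: substrings in column names that suggest PII.
NAME_HINTS = {"ssn", "email", "dob", "phone", "address"}

def classify_column(name: str, sample_values: list[str]) -> set[str]:
    """Tag a column as potentially sensitive using its name (metadata)
    and a sample of its values (value-based analysis)."""
    tags = set()
    if any(hint in name.lower() for hint in NAME_HINTS):
        tags.add("PII_BY_NAME")
    for value in sample_values:
        for tag, pattern in VALUE_PATTERNS.items():
            if pattern.search(value):
                tags.add(tag)
    return tags
```

A column tagged this way could then feed the lineage documentation and policy rules discussed earlier, so flagged data is traceable across workflows.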

Minimizing Risk Exposure with Data Intelligence

Organizations suffering data losses won’t benefit from the money spent on security technologies or the time invested in developing data privacy classifications if they can’t get a handle on how they manage their data.

They also may face heavy fines and other penalties – not to mention bad PR.

Don’t let that happen to your organization.

A well-formed security architecture that is driven by and aligned with data intelligence is your best defense. Being prepared means you can minimize your risk exposure.

With erwin Data Intelligence by Quest, you’ll have an unfettered view of where sensitive data resides with the ability to seamlessly apply privacy rules and create access privileges.

Additionally, with Quest’s acquisition of erwin comes the ability to mask, encrypt, redact and audit sensitive data for an automated and comprehensive solution to resolve sensitive-data issues.

From risk management and regulatory compliance to innovation and digital transformation, you need data intelligence. With erwin by Quest, you will know your data so you can fully realize its business benefits.

[blog-cta header=”erwin Data Intelligence” body=”Click here to request a demo of erwin Data Intelligence by Quest.” button=”Request Demo” button_link=”https://s38605.p1254.sites.pressdns.com/erwin-data-intelligence-free-demo/” image=”https://s38605.p1254.sites.pressdns.com/wp-content/uploads/2018/11/iStock-914789708.jpg” ]


The Value of Data Governance and How to Quantify It

erwin recently hosted the second in its six-part webinar series on the practice of data governance and how to proactively deal with its complexities. Led by Frank Pörschmann of iDIGMA GmbH, an IT industry veteran and data governance strategist, the second webinar focused on “The Value of Data Governance & How to Quantify It.”

As Mr. Pörschmann highlighted at the beginning of the series, data governance works best when it is strongly aligned with the drivers, motivations and goals of the business.

The business drivers and motivation should be the starting point for any data governance initiative. If there is no clear end goal in sight, it will be difficult to get stakeholders on board. And with many competing projects and activities vying for people’s time, it must be clear to people why choosing data governance activities will have a direct benefit to them.

“Usually we talk about benefits which are rather qualitative measures, but what we need for decision-making processes are values,” Pörschmann says. “We need quantifiable results or expected results that are fact-based. And the interesting thing with data governance, it seems to be easier for organizations and teams to state the expected benefits.”

The Data Governance Productivity Matrix

In terms of quantifying data governance, Pörschmann cites the productivity matrix as a relatively simple way to calculate real numbers. He says, “the basic assumption is if an organization equips their managers with the appropriate capabilities and instruments, then it’s management’s obligation to realize productivity potential over time.”

According to IDC, professionals who work with data spend 80 percent of their time looking for and preparing data and only 20 percent of their time on analytics.

Specifically, 80 percent of data professionals’ time is spent on data discovery, preparation and protection, and only 20 percent on analysis leading to insights.
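The productivity-matrix arithmetic behind this 80/20 split is simple to sketch. The 50% reduction below is purely an illustrative assumption about what automation might achieve, not a benchmark from the report:

```python
def hours_reclaimed(team_hours: float, prep_share: float = 0.80,
                    automation_reduction: float = 0.50) -> float:
    """Estimate weekly hours freed for analysis if automation cuts the
    discovery/preparation/protection share by a given fraction.
    prep_share reflects the IDC 80/20 figure; automation_reduction
    is a hypothetical assumption for illustration."""
    return team_hours * prep_share * automation_reduction

# Example: a five-person team working 40 h/week spends about 160 h
# on data discovery, preparation and protection (80% of 200 h);
# halving that with automation frees roughly 80 h/week for analysis.
freed = hours_reclaimed(5 * 40)
```

Plugging in your own team size and an honest estimate of the reduction automation can deliver turns a qualitative benefit into the quantifiable value Pörschmann says decision-makers need.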

Data governance maturity includes the ability to rely on automated and repeatable processes, which ultimately helps to increase productivity.

For example, automatically importing mappings from developers’ Excel sheets, flat files, Access and ETL tools into a comprehensive mappings inventory, complete with automatically generated and meaningful documentation of the mappings, is a powerful way to support governance while providing real insight into data movement — for data lineage and impact analysis — without interrupting system developers’ normal work methods.
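A minimal sketch of that mapping-import idea follows. The CSV layout and the generated one-line documentation string are hypothetical stand-ins for what a real mapping manager would harvest from developers’ spreadsheets:

```python
import csv
import io

def load_mappings(csv_text: str) -> list[dict]:
    """Parse a source-to-target mapping spec (e.g. exported from a
    developer's Excel sheet) into an inventory, generating a short
    documentation string for each mapping as it is imported."""
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        rule = row.get("rule") or "direct copy"
        row["doc"] = (f"{row['source_table']}.{row['source_column']} -> "
                      f"{row['target_table']}.{row['target_column']} ({rule})")
        rows.append(row)
    return rows

# Hypothetical two-mapping spec, as it might arrive from a flat file.
spec = """source_table,source_column,target_table,target_column,rule
crm.cust,email,dw.customer,email_addr,lowercase
crm.cust,name,dw.customer,full_name,
"""
inventory = load_mappings(spec)
```

An inventory built this way can then be queried for lineage (“where does `dw.customer.email_addr` come from?”) and impact analysis without interrupting developers’ normal work methods.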

When data movement has been tracked and version-controlled, it’s possible to conduct data archeology — that is, reverse-engineering code from existing XML within the ETL layer — to uncover what has happened in the past and incorporate it into a mapping manager for fast and accurate recovery.

With automation, data professionals can meet the above needs at a fraction of the cost of the traditional, manual way. To summarize, just some of the benefits of data automation are:

  • Centralized and standardized code management with all automation templates stored in a governed repository
  • Better quality code and minimized rework
  • Business-driven data movement and transformation specifications
  • Superior data movement job designs based on best practices
  • Greater agility and faster time to value in data preparation, deployment and governance
  • Cross-platform support of scripting languages and data movement technologies

For example, one global pharmaceutical giant reduced cost by 70 percent and generated 95 percent of production code with “zero touch.” With automation, the company improved the time to business value and significantly reduced the costly re-work associated with error-prone manual processes.

Risk Management and Regulatory Compliance

Risk management, specifically around regulatory compliance, is an important use case to demonstrate the true value of data governance.

According to Pörschmann, risk management asks two main questions.

  1. How likely is a specific event to happen?
  2. What is the impact or damage if this event happens? (e.g., cost of repair, cost of reputation, etc.)
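These two questions combine into the expected-loss calculation risk teams use in a risk matrix. The figures below are hypothetical, inserted only to make the arithmetic concrete:

```python
def risk_score(likelihood: float, impact_cost: float) -> float:
    """Expected loss: probability of the event occurring multiplied by
    the damage if it does (cost of repair, cost of reputation, etc.)."""
    if not 0.0 <= likelihood <= 1.0:
        raise ValueError("likelihood must be a probability in [0, 1]")
    return likelihood * impact_cost

# Hypothetical example: a 5% chance of failing on GDPR-relevant data
# activities with an estimated 2M damage yields an expected loss of
# roughly 100k - the number to weigh against governance investment.
exposure = risk_score(0.05, 2_000_000)
```

Framing data governance spending against a number like this speaks the risk team’s process-oriented language rather than relying on qualitative benefits.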

“You have to understand the concept or thinking of risk officers or the risk teams,” he says. The risk teams are process-oriented, and they understand how to calculate and how to cover IT risks. But to be successful in communicating data risks with the risk management team, you need to understand how your risk teams are thinking in terms of the risk matrix.

Take the European Union’s General Data Protection Regulation (GDPR) as an example of a data-related risk. Your team needs to ask, “what is the likelihood that we will fail on data-based activities related to GDPR?” And then ask, “what can we do from the data side to reduce the impact or the total damage?”

But it’s not easy to design and deploy compliance in an environment that’s not well understood and difficult in which to maneuver. Data governance enables organizations to plan and document how they will discover and understand their data within context, track its physical existence and lineage, and maximize its security, quality and value.

With the right technology, organizations can automate and accelerate regulatory compliance in five steps:

  1. Catalog systems. Harvest, enrich/transform and catalog data from a wide array of sources to enable any stakeholder to see the interrelationships of data assets across the organization.
  2. Govern PII “at rest”. Classify, flag and socialize the use and governance of personally identifiable information regardless of where it is stored.
  3. Govern PII “in motion”. Scan, catalog and map personally identifiable information to understand how it moves inside and outside the organization and how it changes along the way.
  4. Manage policies and rules. Govern business terminology in addition to data policies and rules, depicting relationships to physical data catalogs and the applications that use them with lineage and impact analysis views.
  5. Strengthen data security. Identify regulatory risks and guide the fortification of network and encryption security standards and policies by understanding where all personally identifiable information is stored, processed and used.

It’s also important to understand that the benefits of data governance don’t stop with regulatory compliance.

A better understanding of what data you have, where it’s stored and the history of its use and access isn’t only beneficial in fending off non-compliance repercussions. In fact, such an understanding is arguably better put to use proactively.

Data governance improves data quality standards, it enables better decision-making and ensures businesses can have more confidence in the data informing those decisions.

[blog-cta header=”erwin DG Webinar Series” body=”Register now for the March 30 webinar ‘Data Governance Maturity & Tracking Progress.'” button=”Register Now” button_link=”https://register.gotowebinar.com/register/8531817018173466635″ image=”https://s38605.p1254.sites.pressdns.com/wp-content/uploads/2018/11/iStock-914789708.jpg” ]


Top Data Management Trends for Chief Data Officers (CDOs)

Chief Data Officer (CDO) 2021 Study

The role of chief data officer (CDO) is becoming essential at forward-thinking organizations — especially those in financial services — according to “The Evolving Role of the CDO at Financial Organizations: 2021 Chief Data Officer (CDO) Study” just released by FIMA and sponsored by erwin.

The e-guide takes a deep dive into the evolving role of CDOs at financial organizations, tapping into the minds of 100+ global financial leaders and C-suite executives to look at the latest trends and provide a roadmap for developing an offensive data management strategy.

Data Governance Is Not Just About Compliance

Interestingly, the report found that 45% of respondents say compliance is now handled so well that it is no longer the top driver for data governance, while 38% say they have fully realized a “governance 2.0” model in which the majority of their compliance concerns are fully automated.

Chief data officers and other data professionals have taken significant steps toward a data governance model that doesn’t just safeguard data but also drives business improvements.

erwin also found this to be the case as revealed in our 2020 “State of Data Governance and Automation” report.

However, while compliance is no longer the top driver of data governance, it still requires a significant investment. According to the CDO report, 88% of organizations devote 40% or more of their data practice’s operating budget to compliance activities.

COVID’s Impact on Data Management

FIMA also looked at 2020 and the pandemic’s impact on data management.

Some financial organizations that were approaching a significant level of data management maturity had to put their initiatives on hold to address more immediate issues. But it led some sectors to innovate, moving processes that were once manual to the digital realm.

The research team asked respondents to describe how their data practices were impacted by the need to adapt to changes in the work environment created by COVID-19. “Overall, most respondents said they avoided any catastrophic impact on their data operations. Most of these respondents note the fact that they had been updating their tools and programs ahead of time to prepare for such risks, and those investments inevitably paid off.”

The respondents who did note that the pandemic caused a disruption repeatedly said that they nonetheless managed to “keep everything in check.” As one CIO at an investment bank puts it, “Data practices became more precise and everyone got more conscious as the pandemic reached its first peak. Key programs have been kept in check and have been restarted securely.”

What Keeps CDOs Up at Night

Financial services organizations are usually at the forefront of data management and governance because they operate in such a heavily regulated environment. So it’s worth knowing what’s on those data executives’ minds, even if your organization is in another sector.

For example, the FIMA study indicates that:

  • 70% of CDOs say risk data aggregation is a primary regulatory concern within their IT departments.
  • Compliance is secondary to overall business improvement, but 88% of organizations devote 40%+ of their data practice’s operating budget to it.
  • Lack of downstream visibility into data consumption (69%) and unclear data provenance and tagging information (65%) are significant challenges.
  • They struggle to apply metadata.
  • Manual processes remain.

The e-guide discusses how data executives must not only secure data and meet rigorous data requirements but also find ways to create new business value with it.

All CDOs and other data professionals likely must deal with the challenges mentioned above – plus improve customer outcomes and boost profitability.

Both mitigating risk and unleashing potential is possible with the right tools, including data catalog, data literacy and metadata-driven automation capabilities for data governance and any other data-centric use case.

Harmonizing Data Management and Data Governance Processes

With erwin Data Intelligence by Quest, your organization can harness and activate your data in a single, unified catalog and then make it available to your data communities in context and in line with business requirements.

The solution harmonizes data management and governance processes to fuel an automated, real-time, high-quality data pipeline enterprise stakeholders can tap into for the information they need to achieve results. Such data intelligence leads to faster, smarter decisions to improve overall organizational performance.

Data is governed properly throughout its lifecycle, meeting both offensive and defensive data management needs. erwin Data Intelligence provides total data visibility, end-to-end data lineage and provenance.

To download the full “The Evolving Role of the CDO at Financial Organizations: 2021 Chief Data Officer (CDO) Study,” please visit: https://go.erwin.com/the-evolving-role-of-the-cdo-at-financial-organizations-report.

[blog-cta header=”Free trial of erwin Data Intelligence” body=”Improve enterprise data access, literacy and knowledge to support data governance, digital transformation and other critical initiatives.” button=”Start Free Demo” button_link=”https://s38605.p1254.sites.pressdns.com/erwin-data-intelligence-free-demo/” image=”https://s38605.p1254.sites.pressdns.com/wp-content/uploads/2018/11/iStock-914789708.jpg” ]


Top 7 Enterprise Architecture Certifications

Enterprise architecture certifications, and the professionals who obtain them, give organizations more confidence in their enterprise architecture initiatives. This post outlines the top enterprise architecture certifications for enterprise architects and what they cost.

Enterprise architecture (EA) helps align business and IT efforts by documenting and mapping data, applications and assets to the functions they support.

While a number of different approaches exist, EA must be performed in line with recognized frameworks in order to be sustainable. EA certifications serve as proof that an enterprise architect is familiar with these frameworks – benefiting both the organization and the EA professional.

Think about it. If your organization’s EA was modeled and mapped according to an enterprise architect’s own standards – with no basis in recognized frameworks – your EA projects become tied to that employee.

With this approach, complications become inevitable.

A few of the potential issues with tying your EA standards to a person rather than using enterprise architecture frameworks are:

  • The enterprise architect would have to teach the relevant stakeholders how to understand their models to effectively communicate their plans.
  • Should the enterprise architect leave the organization, the enterprise architecture practice leaves with them, so future projects would have to be built from the ground up.

Top 7 Enterprise Architect Certifications and What They Cost

Each of the following certifications for enterprise architects has its own advantages – from cost to versatility to suitability for specialist use cases.

The top 7 enterprise architecture certifications, in order of cost, are:

[Chart: Enterprise Architecture Certification costs]

The Best EA Certifications

While the enterprise architecture certification that’s best for you is subjective, you should let the context influence your decision.

As a new enterprise architect, prioritizing widely recognized certifications for the most common frameworks makes the most sense. The Open Group TOGAF 9 Certification is a good place to start.

Vendor-agnostic and globally recognized, a TOGAF certification affords versatility that is certainly worth pursuing.

However, some organizations will prioritize recruiting enterprise architects with more specialty or platform-specific certifications.

A common example here is Amazon’s AWS certification: AWS Certified Solution Architect.

Amazon states that AWS Certified Solution Architects can “effectively demonstrate knowledge of how to architect and deploy secure and robust applications on AWS technologies,” among other skills the certification validates.

Earning this certification requires hands-on experience with AWS services. Amazon recommends “experience using compute, networking, storage, and database AWS services” as well as “an understanding of the basic architectural principles of building on the AWS Cloud.”

And considering current trends, certifications for cloud platforms are becoming increasingly relevant.

The Google Professional Cloud Architect certification and the Professional Cloud Solutions Architect Certification are two cloud-oriented certifications.

Benefits of Enterprise Architecture Certifications for Employers

Employing certified enterprise architects benefits organizations in the following ways:

  • Organizations can ensure candidates they interview and eventually hire are up to speed with the frameworks currently observed within the organization.
  • They can achieve faster time to market because new enterprise architects are brought up to speed more quickly.
  • They can expand and upscale their enterprise architecture initiatives with greater ease.
  • They have greater confidence their efforts are sustainable beyond the tenure of any one enterprise architect.

Benefits of Enterprise Architecture Certifications for Enterprise Architects

For enterprise architects, EA certifications offer these benefits:

  • Collaborative enterprise architecture becomes easier to implement and manage, since everyone speaks the same language
  • Certified enterprise architects have demonstrated their understanding of particular EA, solution architecture, cloud architecture or technical architecture frameworks and validated their skills
  • Certified enterprise architects enjoy improved employability because organizations will look for EAs with specific certifications for continuity in existing projects
  • Certified enterprise architects can command a premium; the average salary for a TOGAF certified enterprise architect is $137,000


Doing Cloud Migration and Data Governance Right the First Time

More and more companies are looking at cloud migration.

Migrating legacy data to public, private or hybrid clouds provides creative and sustainable ways for organizations to increase their speed to insights for digital transformation, modernize and scale their processing and storage capabilities, better manage and reduce costs, encourage remote collaboration, and enhance security, support and disaster recovery.

But let’s be honest – no one likes to move. So if you’re going to move your data from on-premises legacy data stores and warehouse systems to the cloud, you should do it right the first time. And as you make this transition, you need to understand what data you have, know where it is located, and govern it along the way.


Automated Cloud Migration

Historically, moving legacy data to the cloud hasn’t been easy or fast.

As organizations look to migrate their data from legacy on-prem systems to cloud platforms, they want to do so quickly and precisely while ensuring the quality and overall governance of that data.

The first step in this process is converting the physical table structures themselves. Then you must bulk load the legacy data. No less daunting, your next step is to re-point or even re-platform your data movement processes.

Without automation, this is a time-consuming and expensive undertaking. And you can’t risk false starts or delayed ROI that reduce the business’s confidence and taint this transformational initiative.

By using automated and repeatable capabilities, you can quickly and safely migrate data to the cloud and govern it along the way.
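As a rough illustration of what such automation does under the hood, here is a minimal Python sketch of the first two steps: converting legacy table structures to a cloud dialect and generating bulk-load statements. The type map, table names and COPY syntax are illustrative assumptions, not erwin's actual conversion logic, and real platforms vary in both type systems and load commands.

```python
# Hypothetical mapping from legacy RDBMS types to assumed cloud-warehouse types.
TYPE_MAP = {
    "NUMBER": "NUMERIC",
    "VARCHAR2": "VARCHAR",
    "DATE": "TIMESTAMP",
}

def convert_column(name, legacy_type):
    """Translate one column definition to the target dialect."""
    return f"{name} {TYPE_MAP.get(legacy_type, legacy_type)}"

def convert_table(table, columns):
    """Emit CREATE TABLE DDL for the cloud target (step 1)."""
    cols = ",\n  ".join(convert_column(n, t) for n, t in columns)
    return f"CREATE TABLE {table} (\n  {cols}\n);"

def bulk_load_stmt(table, stage_path):
    """Emit a COPY-style bulk-load statement (step 2); syntax varies by platform."""
    return f"COPY INTO {table} FROM '{stage_path}';"

print(convert_table("customers", [("id", "NUMBER"), ("name", "VARCHAR2")]))
print(bulk_load_stmt("customers", "s3://stage/customers/"))
```

Repeatable scripts like this are what make the migration safe to re-run after a false start, rather than a one-off manual effort.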

But transforming and migrating enterprise data to the cloud is only half the story – once there, it needs to be governed for completeness and compliance. That means your cloud data assets must be available for use by the right people for the right purposes to maximize their security, quality and value.

Why You Need Cloud Data Governance

Companies everywhere are building innovative business applications to support their customers, partners and employees and are increasingly migrating from legacy to cloud environments. But even with the “need for speed” to market, new applications must be modeled and documented for compliance, transparency and stakeholder literacy.

The desire to modernize technology, over time, leads to acquiring many different systems with various data entry points and transformation rules for data as it moves into and across the organization.

These tools range from enterprise service bus (ESB) products and data integration tools to extract, transform and load (ETL) tools, procedural code, application programming interfaces (APIs), file transfer protocol (FTP) processes, and even business intelligence (BI) reports that further aggregate and transform data.

With all these diverse metadata sources, it is difficult to understand the complicated web they form, much less get a simple visual flow of data lineage and impact analysis.
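To make the lineage idea concrete, here is a minimal sketch of stitching "source feeds target" metadata from several such tools into one graph and computing downstream impact. The asset names and edges are invented for illustration; a real catalog harvests them from the ETL jobs, APIs and BI reports themselves.

```python
from collections import defaultdict

# Each edge records "this source feeds that target", harvested from some tool.
edges = [
    ("crm.orders", "etl.stg_orders"),        # from an ETL job definition
    ("etl.stg_orders", "dw.fact_orders"),    # from a warehouse load script
    ("dw.fact_orders", "bi.revenue_report"), # from a BI report's query
]

downstream = defaultdict(list)
for src, dst in edges:
    downstream[src].append(dst)

def impact(asset):
    """Return every asset affected by a change to `asset` (downstream closure)."""
    seen, stack = set(), [asset]
    while stack:
        for nxt in downstream[stack.pop()]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return sorted(seen)

print(impact("crm.orders"))  # every table and report fed by the CRM source
```

Impact analysis is just this traversal run over thousands of harvested edges, which is why automated metadata collection matters so much.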

Regulatory compliance is also a major driver of data governance (e.g., GDPR, CCPA, HIPAA, SOX, PCI DSS). While progress has been made, enterprises are still grappling with the challenges of deploying comprehensive and sustainable data governance, including reliance on mostly manual processes for data mapping, data cataloging and data lineage.

Introducing erwin Cloud Catalyst

erwin just announced the release of erwin Cloud Catalyst, a suite of automated cloud migration and data governance software and services. It helps organizations quickly and precisely migrate their data from legacy, on-premise databases to the cloud and then govern those data assets throughout their lifecycle.

Only erwin provides software and services that automate the complete cloud migration and data governance lifecycle – from reverse-engineering and transforming legacy systems and ETL/ELT code, to moving bulk data, to cataloging and auto-generating lineage. The metadata-driven suite automatically finds, models, ingests, catalogs and governs cloud data assets.

erwin Cloud Catalyst comprises erwin Data Modeler (erwin DM), erwin Data Intelligence (erwin DI) and erwin Smart Data Connectors, working together to simplify and accelerate cloud migration by removing barriers, reducing risks and decreasing time to value for your investments in modern systems such as Snowflake, Microsoft Azure and Google Cloud.

We start with an assessment of your cloud migration strategy to determine what automation and optimization opportunities exist. Then we deliver an automation roadmap and design the appropriate smart data connectors to help your IT services team achieve your future-state cloud architecture, including accelerating data ingestion and ETL conversion.

Once your data reaches the cloud, you’ll have deep and detailed metadata management with full data governance, data lineage and impact analysis. With erwin Cloud Catalyst, you automate these data governance steps:

  • Harvest and catalog cloud data: erwin DM and erwin DI’s Metadata Manager natively scans RDBMS sources to catalog/document data assets.
  • Model cloud data structures: erwin DM converts, modifies and models the new cloud data structures.
  • Map data movement: erwin DI’s Mapping Manager defines data movement and transformation requirements via drag-and-drop functionality.
  • Generate source code: erwin DI’s automation framework generates data migration source code for any ETL/ELT SDK.
  • Test migrated data: erwin DI’s automation framework generates test cases and validation source code to test migrated data.
  • Govern cloud data: erwin DI gives cloud data assets business context and meaning through the Business Glossary Manager, as well as policies and rules for use.
  • Distribute cloud data: erwin DI’s Business User Portal provides self-service access to cloud data asset discovery and reporting tools.
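The harvesting and governing steps above can be sketched as a tiny catalog that merges harvested technical metadata with business context (glossary terms, owners, policies). All asset names, terms and policies below are hypothetical, and erwin DI's actual data structures are of course far richer than this.

```python
harvested = {  # what a metadata scanner might pull from an RDBMS source
    "sales_db.public.customers": {"columns": ["id", "email", "region"]},
}

glossary = {  # business context layered on top by data stewards
    "sales_db.public.customers": {
        "term": "Customer",
        "owner": "sales-data-steward",
        "policy": "PII - restricted access",
    },
}

def catalog_entry(asset):
    """Merge technical and business metadata into one governed record."""
    return {**harvested.get(asset, {}), **glossary.get(asset, {})}

entry = catalog_entry("sales_db.public.customers")
print(entry["term"], "-", entry["policy"])
```

The point of the merge is that a business user browsing the catalog sees the glossary term and usage policy alongside the physical columns, rather than raw schema metadata alone.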

Request an erwin Cloud Catalyst assessment.

And don’t forget to register for erwin Insights 2020 on October 13-14, with sessions on Snowflake, Microsoft and data lake initiatives powered by erwin Cloud Catalyst.



The Top Six Benefits of Data Modeling – What Is Data Modeling?

Understanding the benefits of data modeling is more important than ever.

Data modeling is the process of creating a data model to communicate data requirements, documenting data structures and entity types.

It serves as a visual guide in designing and deploying databases with high-quality data sources as part of application development.

Data modeling has been used for decades to help organizations define and categorize their data, establishing standards and rules so it can be consumed and then used by information systems. Today, data modeling is a cost-effective and efficient way to manage and govern massive volumes of data, aligning data assets with the business functions they serve.
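As a simple illustration of the definition above, here is how a small logical model – two entities and one relationship – becomes a physical schema. The entities are generic examples, not tied to any particular erwin model, and SQLite stands in for whatever database the model targets.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (           -- entity: Customer
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);
CREATE TABLE orders (             -- entity: Order, related to Customer
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    placed_at   TEXT NOT NULL
);
""")

# The deployed schema can be read back and compared against the model.
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)
```

In practice a data modeling tool generates DDL like this from the diagram, so the documented model and the deployed database stay in sync.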

You can automatically generate data models and database designs to increase efficiency and reduce errors, making your data modelers – and other stakeholders – much more productive.


Data Governance Definition, Best Practices and Benefits

Any organization with a data-driven strategy should understand the definition of data governance. In fact, in light of increasingly stringent data regulations, any organization that uses or even stores data should understand the definition of data governance.

Organizations with a solid understanding of data governance (DG) are better equipped to keep pace with the speed of modern business.


Data Governance Definition

Data governance’s definition is broad, as it describes a process rather than a predetermined method. So an understanding of the process and the best practices associated with it is key to a successful data governance strategy.

Data governance is best defined as the strategic, ongoing and collaborative processes involved in managing data’s access, availability, usability, quality and security in line with established internal policies and relevant data regulations.

It’s often said that when we work together, we can achieve things greater than the sum of our parts. Collective, societal efforts have seen mankind move metaphorical mountains and land on the literal moon.

Such feats were made possible through effective government – or governance.

The same applies to data. A single unit of data in isolation can’t do much, but the sum of an organization’s data can prove invaluable.

Put simply, DG is about maximizing the potential of an organization’s data and minimizing the risk. In today’s data-driven climate, this dynamic is more important than ever.

That’s because data’s value depends on the context in which it exists: too much unstructured or poor-quality data and meaning is lost in a fog; too little insight into data’s lineage, where it is stored, or who has access, and the organization becomes an easy target for cybercriminals and/or non-compliance penalties.

So DG is, quite simply, about how an organization uses its data. That includes how it creates or collects data, as well as how its data is stored and accessed. It ensures that the right data of the right quality, regardless of where it is stored or what format it is stored in, is available for use – but only by the right people and for the right purpose.

With well governed data, organizations can get more out of their data by making it easier to manage, interpret and use.

Why Is Data Governance Important?

Although governing data is not a new practice, treating it as a strategic program is – and so are the expectations about who is responsible for it.

Historically, governing data has been IT’s business because it primarily involved cataloging data to support search and discovery.

But now, governing data is everyone’s business. Both the data “keepers” in IT and the data users everywhere else within the organization have a role to play.

That makes sense, too. The sheer volume and importance of data the average organization now processes are too great to be effectively governed by a siloed IT department.

Think about it. If all the data you access as an employee of your organization had to be vetted by IT first, could you get anything done?

While the exponential increase in the volume and variety of data has provided unparalleled insights for some businesses, only those with the means to deal with the velocity of data have reaped the rewards.

By velocity, we mean the speed at which data can be processed and made useful. More on “The Three Vs of Data” here.

Data giants like Amazon, Netflix and Uber have reshaped whole industries, turning smart, proactive data governance into actionable and profitable insights.

And then, of course, there’s the regulatory side of things. The European Union’s General Data Protection Regulation (GDPR) mandates that organizations govern their data.

Poor data governance doesn’t just increase the risk of breaches – although it certainly does that – it also makes compliance audits harder to pass, because passing them requires an effective data governance initiative.

Since non-compliance can be costly, good data governance not only helps organizations make money, it helps them save it too. And organizations are recognizing this fact.

In the lead-up to GDPR, studies found that the biggest driver for data governance initiatives was regulatory compliance. However, since GDPR’s implementation, better decision-making and analytics have become the top drivers for investing in data governance.

Other areas where well-governed data plays an important role include digital transformation, data standards and uniformity, self-service, and customer trust and satisfaction.

For the full list of drivers and deeper insight into the state of data governance, get the free 2020 State of DGA report here.

What Is Good Data Governance?

We’re constantly creating new data whether we’re aware of it or not. Every new sale, every new inquiry, every website interaction, every swipe on social media generates data.

This means the work of governing data is ongoing, and organizations without it can become overwhelmed quickly.

Therefore, good data governance is proactive, not reactive.

In addition, good data governance requires organizations to encourage a culture that stresses the importance of data with effective policies for its use.

An organization must know who should have access to what, both internally and externally, before any technical solutions can effectively compartmentalize the data.

So good data governance requires both technical solutions and policies to ensure organizations stay in control of their data.

But culture isn’t built on policies alone. An often-overlooked element of good data governance is arguably philosophical. Effectively communicating the benefits of well governed data to employees – like improving the discoverability of data – is just as important as any policy or technology.

And it shouldn’t be difficult. In fact, it should make data-oriented employees’ jobs easier, not harder.

What Are the Key Benefits of Data Governance?

Organizations with effectively governed data enjoy:

  • Better alignment with data regulations: Get a more holistic understanding of your data and any associated risks, plus improve data privacy and security through better data cataloging.
  • A greater ability to respond to compliance audits: Take the pain out of preparing reports and respond more quickly to audits with better documentation of data lineage.
  • Increased operational efficiency: Identify and eliminate redundancies and streamline operations.
  • Increased revenue: Uncover opportunities to both reduce expenses and discover/access new revenue streams.
  • More accurate analytics and improved decision-making: Be more confident in the quality of your data and the decisions you make based on it.
  • Improved employee data literacy: Consistent data standards help ensure employees are more data literate, and they reduce the risk of semantic misinterpretations of data.
  • Better customer satisfaction/trust and reputation management: Use data to provide a consistent, efficient and personalized customer experience, while avoiding the pitfalls and scandals of breaches and non-compliance.

For a more in-depth assessment of data governance benefits, check out The Top 6 Benefits of Data Governance.

The Best Data Governance Solution

Data has always been important to erwin; we’ve been a trusted data modeling brand for more than 30 years. But we’ve expanded our product portfolio to reflect customer needs and give them an edge, literally.

The erwin EDGE platform delivers an “enterprise data governance experience.” And at the heart of the erwin EDGE is the erwin Data Intelligence Suite (erwin DI).

erwin DI provides all the tools you need for the effective governance of your data. These include data catalog, data literacy and a host of built-in automation capabilities that take the pain out of data preparation.

With erwin DI, you can automatically harvest, transform and feed metadata from a wide array of data sources, operational processes, business applications and data models into a central data catalog and then make it accessible and understandable via role-based, contextual views.

With the broadest set of metadata connectors, erwin DI combines data management and DG processes to fuel an automated, real-time, high-quality data pipeline.

See for yourself why erwin DI is a DBTA 2020 Readers’ Choice Award winner for best data governance solution with your very own, very free demo of erwin DI.



What Is Enterprise Architecture? – Definition, Methodology & Best Practices

Enterprise architecture (EA) is a strategic planning initiative that helps align business and IT. It provides a visual blueprint, demonstrating the connection between applications, technologies and data to the business functions they support.