
7 Benefits of Metadata Management

Metadata management is key to wringing all the value possible from data assets.

However, most organizations don’t use all the data at their disposal to reach deeper conclusions about how to drive revenue, achieve regulatory compliance or accomplish other strategic objectives.

What Is Metadata?

Analyst firm Gartner defines metadata as “information that describes various facets of an information asset to improve its usability throughout its life cycle. It is metadata that turns information into an asset.”

Quite simply, metadata is data about data. It’s generated every time data is captured at a source, accessed by users, moved through an organization, integrated or augmented with other data from other sources, profiled, cleansed and analyzed.

It’s valuable because it provides information about the attributes of data elements that can be used to guide strategic and operational decision-making. Metadata management is the administration of data that describes other data, with an emphasis on associations and lineage. It involves establishing policies and processes to ensure information can be integrated, accessed, shared, linked, analyzed and maintained across an organization.
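To make this concrete, here is a hypothetical sketch of the kind of metadata record a catalog might hold for a single data asset. Every field name is illustrative rather than drawn from any standard or product:

```python
# A hypothetical metadata record for one data asset.
# Field names are illustrative, not from any standard or product.
customer_table_metadata = {
    "asset_name": "crm.customers",
    "description": "Master record of active customers",
    "owner": "sales-ops@example.com",
    "source_system": "CRM",
    "created": "2018-04-02",
    "last_profiled": "2021-01-15",
    "sensitivity": "PII",  # drives governance policies and access rules
    "upstream_sources": ["erp.accounts", "web.signups"],  # lineage
    "columns": {
        "customer_id": {"type": "INTEGER", "nullable": False},
        "email": {"type": "VARCHAR(255)", "tags": ["PII"]},
    },
}
```

A record like this is what lets downstream tools answer where data came from, who owns it and whether it is sensitive.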

Metadata Answers Key Questions

A strong data management strategy and supporting technology enable the data quality the business requires, including data cataloging (integration of data sets from various sources), mapping, versioning, maintenance of business rules and glossaries, and metadata management (associations and lineage).

Metadata answers a lot of important questions:

  • What data do we have?
  • Where did it come from?
  • Where is it now?
  • How has it changed since it was originally created or captured?
  • Who is authorized to use it and how?
  • Is it sensitive or are there any risks associated with it?

Metadata also helps your organization to:

  • Discover data. Identify and interrogate metadata from various data management silos.
  • Harvest data. Automate the collection of metadata from various data management silos and consolidate it into a single source (a minimal sketch of this step follows this list).
  • Structure and deploy data sources. Connect physical metadata to specific data models, business terms, definitions and reusable design standards.
  • Analyze metadata. Understand how data relates to the business and what attributes it has.
  • Map data flows. Identify where to integrate data and track how it moves and transforms.
  • Govern data. Develop a governance model to manage standards, policies and best practices and associate them with physical assets.
  • Socialize data. Empower stakeholders to see data in one place and in the context of their roles.
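As referenced above, here is a minimal sketch of the harvest-and-consolidate step, assuming each silo can report its assets through some export. The connector functions are stand-ins, not a real API:

```python
# Stand-in connectors; a real harvester would call each silo's API or system catalog.
def harvest_warehouse():
    return [{"asset": "dw.orders", "type": "table", "source": "warehouse"}]

def harvest_crm():
    return [{"asset": "crm.customers", "type": "table", "source": "crm"}]

def build_catalog(connectors):
    """Consolidate metadata from every silo into one catalog keyed by asset name."""
    catalog = {}
    for connector in connectors:
        for entry in connector():
            catalog[entry["asset"]] = entry
    return catalog

print(build_catalog([harvest_warehouse, harvest_crm]))
# {'dw.orders': {...}, 'crm.customers': {...}}
```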


The Benefits of Metadata Management

1. Better data quality. With automation, data quality is systemically assured, with the data pipeline seamlessly governed and operationalized to the benefit of all stakeholders. Data issues and inconsistencies within integrated data sources or targets are identified in real time, improving overall data quality by reducing the time to insight and repair. It’s easier to map, move and test data for regular maintenance of existing structures, for migration from legacy systems to new systems during a merger or acquisition, or for a modernization effort.

2. Quicker project delivery. Automated enterprise metadata management provides greater accuracy and up to 70 percent acceleration in project delivery for data movement and/or deployment projects. It harvests metadata from various data sources, maps any data element from source to target and harmonizes data integration across platforms. With this accurate picture of your metadata landscape, you can accelerate Big Data deployments, Data Vaults, data warehouse modernization, cloud migration, etc.

3. Faster speed to insights. High-paid knowledge workers like data scientists spend up to 80 percent of their time finding and understanding source data and resolving errors or inconsistencies, rather than analyzing it for real value. With stronger data operations and analytics, plus access and connectivity to underlying metadata and its lineage, that equation can be reversed, leading to insights more quickly. Technical resources are free to concentrate on the highest-value projects, while business analysts, data architects, ETL developers, testers and project managers can collaborate more easily for faster decision-making.

4. Greater productivity & reduced costs. Being able to rely on automated and repeatable metadata management processes results in greater productivity. For example, one erwin DI customer has experienced a steep improvement in productivity: more than 85 percent from automating manually intensive and complex coding efforts, and more than 70 percent from seamless access to and visibility of all metadata, including end-to-end lineage. Significant data design and conversion savings, up to 50 percent and 70 percent respectively, also are possible, with data mapping costs going down as much as 80 percent.

5. Regulatory compliance. Regulations such as the General Data Protection Regulation (GDPR), the Health Insurance Portability and Accountability Act (HIPAA), Basel Committee on Banking Supervision (BCBS) requirements and the California Consumer Privacy Act (CCPA) particularly affect sectors such as finance, retail, healthcare and pharmaceutical/life sciences. When key data isn’t discovered, harvested, cataloged, defined and standardized as part of integration processes, audits may be flawed. With automated metadata management, sensitive data is tagged, its lineage documented and its flows depicted, so that it is easily found and its use across workflows easily traced.
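As a simplified illustration of automated tagging – not erwin’s actual implementation – a scanner might match sampled values against known patterns for sensitive data:

```python
import re

# Hypothetical patterns for flagging potentially sensitive columns.
SENSITIVE_PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[a-z]{2,}", re.I),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def tag_sensitive_columns(rows, columns):
    """Return, per column, the sensitivity labels whose patterns matched a sampled value."""
    tagged = {}
    for row in rows:
        for col, value in zip(columns, row):
            for label, pattern in SENSITIVE_PATTERNS.items():
                if pattern.search(str(value)):
                    tagged.setdefault(col, set()).add(label)
    return tagged

sample = [("ann@example.com", "123-45-6789"), ("bob@example.com", "987-65-4321")]
print(tag_sensitive_columns(sample, ["contact", "tax_id"]))
# {'contact': {'email'}, 'tax_id': {'ssn'}}
```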

6. Digital transformation. Knowing what data exists and its value potential promotes digital transformation by 1) improving digital experiences because you understand how the organization interacts with and supports customers, 2) enhancing digital operations because data preparation and analysis projects happen faster, 3) driving digital innovation because data can be used to deliver new products and services, and 4) building digital ecosystems because organizations need to establish platforms and partnerships to scale and grow.

7. An enterprise data governance experience. Stakeholders include both IT and business users working in collaborative relationships, making data governance everyone’s business. Modern, strategic data governance must be an ongoing initiative, and it requires everyone from executives on down to rethink their data duties and assume new levels of cooperation and accountability. With business data stakeholders driving alignment between data governance and strategic enterprise goals and IT handling the technical mechanics of data management, the door opens to finding, trusting and using data to effectively meet any organizational objective.

An Automated Solution

When approached manually, metadata management is expensive, time-consuming, error-prone and can’t keep pace with a dynamic enterprise data management infrastructure.

And while integrating and automating data management and data governance is still a new concept for many organizations, its advantages are clear.

erwin’s metadata management offering, the erwin Data Intelligence Suite (erwin DI), includes data catalog, data literacy and automation capabilities for greater awareness of and access to data assets, guidance on their use, and guardrails to ensure data policies and best practices are followed. Its automated, metadata-driven framework gives organizations visibility and control over their disparate data streams – from harvesting to aggregation and integration, including transformation with complete upstream and downstream lineage and all the associated documentation.

erwin has been named a leader in the Gartner 2020 “Magic Quadrant for Metadata Management Solutions” for two consecutive years. Click here to download the full Gartner 2020 “Magic Quadrant for Metadata Management Solutions” report.


Documenting and Managing Governance, Risk and Compliance with Business Process

Managing an organization’s governance, risk and compliance (GRC) via its enterprise and business architectures means managing them against business processes (BP).

Shockingly, a lot of organizations, even today, manage this through homemade tools or documents: checklists, Excel files, custom-made databases and so on. Organizations tend to still operate in this manual and disparate way for three main reasons:

  1. Cost
  2. Governance, risk and compliance are treated as isolated bubbles.
  3. Data-related risks are not connected with the data architects/data scientists.

If we look at this past year, COVID-19 fundamentally changed everything overnight – and it was something that nobody could have anticipated. However, only organizations that had their risks mapped at the process level could see their operational risk profiles and also see what processes needed adjustments – quickly.

Furthermore, by linking compliance with process, those organizations were prepared to answer very specific compliance questions. For example, a customer might ask, “Since most of your employees are working from home now, how can you ensure that my data is not shared with their kids?” Organizations with business process modeling could respond, “We have anticipated these kinds of risks and implemented the following controls, and this is how we protect you in different layers.”

Every company must understand its business processes, particularly those in industries in which quality, regulatory, health, safety or environmental standards are serious considerations. BP modeling and analysis shows process flows, system interactions and organizational hierarchies to identify areas for improvement as well as practices susceptible to the greatest security, compliance or other risks, so controls and audits can be implemented to mitigate exposures.

Connecting the GRC, Data and Process Layers

The GRC layer comprises mandatory components like risks, controls and compliance elements. Traditionally, these are manually documented, monitored and managed.

For example, if tomorrow you decide you want ISO (International Organization for Standardization) 27001 compliance for your information security management system, you can go to the appropriate ISO site, download the entire standard with all the assessments with all the descriptions, mandates, questions and documents that you will need to provide. All of these items would comprise the GRC layer.

However, many organizations maintain Excel files with risk and control information, and other Office files with compliance information, in isolation. Or some of these files are uploaded to various systems, but they don’t talk to each other – or any other enterprise systems for that matter. This is the data layer, which is factual, objective and, as opposed to the GRC layer, can be either fully or partly automated.

Now, let’s add the process layer to the equation. Why? Because that is where the GRC and data layers meet. How? Processes produce, process and consume data – information captured in the metadata layer. By following the process sequence, I can actually trace the data lineage as it flows across the entire business ecosystem, beyond the application layer.

Taking it further, from processes, I can look at how the data is being managed by my capabilities. In other words, if I do have a data breach, how do I mitigate it? What impact will it have on my organization? And what are the necessary controls to manage it? Looking at them from right to left, I can identify the affected systems, and I can identify the interfaces between systems.
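A minimal sketch of that right-to-left impact analysis, assuming the interfaces between systems are recorded as a directed graph (the system names are invented):

```python
from collections import deque

# Hypothetical interface map: each system feeds the systems listed after it.
interfaces = {
    "CRM": ["BillingDB"],
    "BillingDB": ["ReportingDW", "PaymentsAPI"],
    "ReportingDW": ["ExecDashboard"],
}

def affected_systems(breached):
    """Breadth-first walk of downstream interfaces from a breached system."""
    seen, queue = set(), deque([breached])
    while queue:
        system = queue.popleft()
        for downstream in interfaces.get(system, []):
            if downstream not in seen:
                seen.add(downstream)
                queue.append(downstream)
    return seen

print(affected_systems("CRM"))
# {'BillingDB', 'ReportingDW', 'PaymentsAPI', 'ExecDashboard'}
```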

Mitigating Data Breaches

Most data breaches happen either at the database or interface level. Interfaces are how applications talk to each other.

Organizations are showing immense interest in expanding the development of risk profiles, not only for isolated layers but also in how those layers interact – how applications talk to each other, how processes use data, how data is stored, and how infrastructure is managed. Understanding these profiles allows for more targeted and even preemptive risk mitigation, enabling organizations to fortify their weak points with sufficient controls but also practical and effective processes.

We’re moving from a world in which everything is performed manually and in isolation to one that is fully automated and integrated.

erwin shows how to document and manage governance, risk and compliance using erwin Evolve, its business process modeling and enterprise architecture solution.

The C-Level Demands GRC Real-Time Impact Analysis

Impact analysis is critical. Everything needs to be clearly documented, covering all important and relevant aspects. No service, capability or delivery process is considered complete unless the risks and controls that affect it, or are implemented through it, are mapped and that assessment is used to generate risk profiles for the process, service or capability. And the demand for this to happen automatically increases daily.

This is now one of the key mandates across many organizations. C-level executives now demand risk profile dashboards at the process, organizational and local levels.

For example, an executive travelling from one country to another, or from one continent to another, can make a query: “I’m traveling to X, so what is the country’s risk profile and how is it being managed? What do I need to be aware of or address while I’m there?” Or when new legislation is introduced affecting multiple countries, the impact of that legislation on those countries’ risk profiles can be quickly and accurately calculated and actions planned accordingly.

erwin Evolve

GRC is more critical than ever. Organizations, and specifically the C-suite, are demanding to see risk profiles sliced and diced across particular processes. But this is impossible without automation.

erwin Evolve is a full-featured, configurable enterprise architecture (EA) and BP modeling and analysis software suite that aids regulatory and industry compliance and maps business systems that support the enterprise. Its automated visualization, documentation and enterprise collaboration capabilities turn EA and BP artifacts into insights both IT and business users can access in a central location for making strategic decisions and managing GRC.

Please click here to start your free trial of erwin Evolve.


Are Data Governance Bottlenecks Holding You Back?

Better decision-making has now topped compliance as the primary driver of data governance. However, organizations still encounter a number of bottlenecks that may hold them back from fully realizing the value of their data in producing timely and relevant business insights.

While acknowledging that data governance is about more than risk management and regulatory compliance may indicate that companies are more confident in their data, the data governance practice is nonetheless growing in complexity because of more:

  • Data to handle, much of it unstructured
  • Sources, like IoT
  • Points of integration
  • Regulations

Without an accurate, high-quality, real-time enterprise data pipeline, it will be difficult to uncover the necessary intelligence to make optimal business decisions.

So what’s holding organizations back from fully using their data to make better, smarter business decisions?

Data Governance Bottlenecks

erwin’s 2020 State of Data Governance and Automation report, based on a survey of business and technology professionals at organizations of various sizes and across numerous industries, examined the role of automation in data governance and intelligence efforts. It uncovered a number of obstacles that organizations have to overcome to improve their data operations.

The No.1 bottleneck, according to 62 percent of respondents, was documenting complete data lineage. Understanding the quality of source data is the next most serious bottleneck (58 percent); followed by finding, identifying, and harvesting data (55 percent); and curating assets with business context (52 percent).

The report revealed that all but two of the possible bottlenecks were marked by more than 50 percent of respondents. Clearly, there’s a massive need for a data governance framework to keep these obstacles from stymying enterprise innovation.

As we zeroed in on the bottlenecks of day-to-day operations, 25 percent of respondents said length of project/delivery time was the most significant challenge, followed by data quality/accuracy at 24 percent, time to value at 16 percent, and reliance on developer and other technical resources at 13 percent.


Overcoming Data Governance Bottlenecks

The 80/20 rule describes the unfortunate reality for many data stewards: they spend 80 percent of their time finding, cleaning and reorganizing huge amounts of data and only 20 percent on actual data analysis.

In fact, we found that close to 70 percent of our survey respondents spent an average of 10 or more hours per week on data-related activities, most of it searching for and preparing data.

What can you do to reverse the 80/20 rule and subsequently overcome data governance bottlenecks?

1. Don’t ignore the complexity of data lineage: It’s a risky endeavor to support data lineage using a manual approach, and businesses that attempt it that way will find it’s not sustainable given data’s constant movement from one place to another via multiple routes – especially if lineage must be documented correctly down to the column level. Adopting automated end-to-end lineage makes it possible to view data movement from the source to reporting structures, providing a comprehensive and detailed view of data in motion.

2. Automate code generation: Alleviate the need for developers to hand code connections from data sources to target schema. Mapping data elements to their sources within a single repository to determine data lineage and harmonize data integration across platforms reduces the need for specialized, technical resources with knowledge of ETL and database procedural code (a toy sketch of mapping-driven code generation follows this list). It also makes it easier for business analysts, data architects, ETL developers, testers and project managers to collaborate for faster decision-making.

3. Use an integrated impact analysis solution: By automating data due diligence for IT you can deliver operational intelligence to the business. Business users benefit from automating impact analysis to better examine value and prioritize individual data sets. Impact analysis has equal importance to IT for automatically tracking changes and understanding how data from one system feeds other systems and reports. This is an aspect of data lineage, created from technical metadata, ensuring nothing “breaks” along the change train.

4. Put data quality first: Users must have confidence in the data they use for analytics. Automating and matching business terms with data assets and documenting lineage down to the column level are critical to good decision-making. If this approach hasn’t been the case to date, enterprises should take a few steps back to review data quality measures before jumping into automating data analytics.

5. Catalog data using a solution with a broad set of metadata connectors: All data sources will be leveraged, including big data, ETL platforms, BI reports, modeling tools, mainframe, and relational data as well as data from many other types of systems. Don’t settle for a data catalog from an emerging vendor that only supports a narrow swath of newer technologies, and don’t rely on a catalog from a legacy provider that may supply only connectors for standard, more mature data sources.

6. Stress data literacy: You want to ensure that data assets are used strategically. Automation expedites the benefits of data cataloging. Curating internal and external datasets for a range of content authors doubles business benefits and ensures effective management and monetization of data assets in the long term if linked to broader data governance, data quality and metadata management initiatives. There’s a clear connection to data literacy here because of its foundation in business glossaries and socializing data so all stakeholders can view and understand it within the context of their roles.

7. Make automation the norm across all data governance processes: Too many companies still live in a world where data governance is a high-level mandate, not practically implemented. To fully realize the advantages of data governance and the power of data intelligence, data operations must be automated across the board. Without automated data management, the governance housekeeping load on the business will be so great that data quality will inevitably suffer. Being able to account for all enterprise data and resolve disparity in data sources and silos using manual approaches is wishful thinking.

8. Craft your data governance strategy before making any investments: Gather multiple stakeholders—both business and IT— with multiple viewpoints to discover where their needs mesh and where they diverge and what represents the greatest pain points to the business. Solve for these first, but build buy-in by creating a layered, comprehensive strategy that ultimately will address most issues. From there, it’s on to matching your needs to an automated data governance solution that squares with business and IT – both for immediate requirements and future plans.
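To illustrate point 2 above, here is a toy sketch of mapping-driven code generation: a declarative source-to-target mapping, as a metadata repository might store it, is turned into boilerplate SQL instead of being hand-coded. The mapping and SQL dialect are invented for illustration:

```python
# A hypothetical source-to-target mapping, as a metadata repository might store it.
mapping = {
    "target": "dw.dim_customer",
    "source": "crm.customers",
    "columns": {
        "customer_key": "customer_id",
        "full_name": "UPPER(first_name || ' ' || last_name)",
        "signup_date": "created_at",
    },
}

def generate_insert_select(m):
    """Emit a simple INSERT ... SELECT statement from a mapping definition."""
    targets = ", ".join(m["columns"])
    sources = ", ".join(m["columns"].values())
    return (f"INSERT INTO {m['target']} ({targets})\n"
            f"SELECT {sources}\nFROM {m['source']};")

print(generate_insert_select(mapping))
```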

Register now for the first of a new, six-part webinar series on the practice of data governance and how to proactively deal with its complexities: “The What & Why of Data Governance” on Tuesday, Feb. 23 at 3 p.m. GMT/10 a.m. ET.


Top 7 Enterprise Architecture Certifications

Enterprise architecture certifications and the professionals who obtain them give organizations more confidence in their enterprise architecture initiatives. This post outlines the top certifications for enterprise architects and what they offer.

Enterprise architecture (EA) helps align business and IT efforts by documenting and mapping data, applications and assets to the functions they support.

While a number of different approaches exist, EA must be performed in line with recognized frameworks in order to be sustainable. EA certifications serve as proof that an enterprise architect is familiar with these frameworks – benefiting both the organization and the EA professional.

Think about it. If your organization’s EA was modeled and mapped according to an enterprise architect’s own standards – with no basis in recognized frameworks – your EA projects become tied to that employee.

With this approach, complications become inevitable.

A few of the potential issues with tying your EA standards to a person rather than using enterprise architecture frameworks are:

  • The enterprise architect would have to teach the relevant stakeholders how to understand their models to effectively communicate their plans.
  • Should the enterprise architect leave the organization, the enterprise architecture practice leaves with them, so future projects would have to be built from the ground up.

Top 7 Enterprise Architect Certifications and What They Cost

Each of the following certifications for enterprise architects has its own advantages – from cost to versatility to suitability in specialist use cases.

The top seven enterprise architecture certifications, in order of cost, are listed below:


The Best EA Certifications

While the enterprise architecture certification that’s best for you is subjective, you should let the context influence your decision.

As a new enterprise architect, prioritizing widely recognized certifications for the most common frameworks makes the most sense. The Open Group TOGAF 9 Certification is a good place to start.

Vendor-agnostic and globally recognized, a TOGAF certification affords versatility that is certainly worth pursuing.

However, some organizations will prioritize recruiting enterprise architects with more specialty or platform-specific certifications.

A common example here is Amazon’s AWS certification: AWS Certified Solutions Architect.

Amazon states that AWS Certified Solutions Architects can “effectively demonstrate knowledge of how to architect and deploy secure and robust applications on AWS technologies,” among other skills the certification validates.

Earning this certification requires hands-on experience with AWS services. Amazon recommends “experience using compute, networking, storage, and database AWS services” as well as “an understanding of the basic architectural principles of building on the AWS Cloud.”

And considering current trends, certifications for cloud platforms are becoming increasingly relevant.

The Google Professional Cloud Architect certification and the Professional Cloud Solutions Architect Certification are two cloud-oriented certifications.

Benefits of Enterprise Architecture Certifications for Employers

Employing certified enterprise architects benefits organizations in the following ways:

  • Organizations can ensure candidates they interview and eventually hire are up to speed with the frameworks currently observed within the organization
  • They can achieve faster time to market because new enterprise architects are brought up to speed more quickly
  • They can expand and scale their enterprise architecture initiatives with greater ease
  • They have greater confidence their efforts are sustainable beyond the tenure of any one enterprise architect

Benefits of Enterprise Architecture Certifications for Enterprise Architects

For enterprise architects, EA certifications offer these benefits:

  • Collaborative enterprise architecture becomes easier to implement and manage, since everyone speaks the same language
  • Certified enterprise architects have demonstrated their understanding of particular EA, solution architecture, cloud architecture or technical architecture frameworks and validated their skills
  • Certified enterprise architects enjoy improved employability because organizations will look for EAs with specific certifications for continuity in existing projects
  • Certified enterprise architects can command a premium; the average salary for a TOGAF certified enterprise architect is $137,000



Cloud Migration and the Importance of Data Governance

Tackling data-related challenges to keep cloud migration projects on track and optimized

By Wendy Petty 

The cloud has many operational and competitive advantages, so cloud-first and other cloud transformation initiatives continue to be among the top data projects organizations are pursuing.

For many of those yet to adopt and adapt, it is a case of “when” not “if” the enterprise will undergo a form of digital transformation requiring data migration to the cloud.

Due to today’s prevalence of internal and external market disruptors, many organizations are aligning their digital transformation and cloud migration efforts with other strategic requirements (e.g., compliance with the General Data Protection Regulation).

And now organizations must navigate a post-COVID world, which is forcing them to fast-track their cloud migrations to become more agile, lean and focused on business outcomes that will enable the business to survive and then thrive in new market dynamics.

However, cloud migration is not just a lift and shift, a one-off or a silver bullet. Usually when organizations go from an on-premises environment to a cloud environment, they are actually converting between two different technologies. And as you migrate to the cloud, you need to keep in mind some data-related challenges.


Dollars and Cents

For 47 percent of enterprise companies, cost optimization is the main reason they migrate to the cloud. However, cloud migrations can be expensive, with costs piling up the longer a migration takes to complete.

Not only are cloud migrations generally expensive, but many companies don’t budget for them appropriately. In 2020, companies went over their public cloud spend budget by an average of 23 percent. Most likely, this comes down to a lack of planning, leading to long, drawn-out migrations and ill-informed product decisions. Additionally, completely manual migrations generally take longer and cost more than those that employ automation.

In terms of budget and cost, automated tools that scan repositories in your environment help by adding structure and business context (where it is, who can access it, etc.) in the transformation of legacy structures. New structures will enable new capabilities for your data and business processes.

Automated tools can help you lower risks and costs and reduce the time it takes to realize value. Automated software handles data cataloging and locates, models and governs cloud data assets.

Tools that help IT organizations plan and execute their cloud migrations aren’t difficult to find. Many large cloud providers offer tools to help ease the migration to their platform. But a technology-agnostic approach to such tools adds value to cloud migration projects.

Proprietary tools from cloud vendors funnel clients into a single preferred environment. Agnostic tools, on the other hand, help organizations understand which cloud environment is best for them. Their goal is to identify the cloud platform and strategy that will deliver the most value after taking budget and feature requirements into account.

Institutional Knowledge

Institutional knowledge is another obstacle many companies face when exploring cloud migrations. People leave the organization and take with them an understanding of how and why things are done. Because of this, you may not know what data you have or how you should be using it.

The challenge comes when it’s time to migrate; you need to understand what you have, how it’s used, what its value is, and what should be migrated. Otherwise, you may spend time and money migrating data, only to discover that no one has touched it in several years and it wasn’t necessary for you to retain it.

In addition, if you’re planning to use a multi-cloud approach, you need to ensure that the clouds you work with are compatible. Only 24 percent of IT organizations have a high degree of interoperability between their cloud environments. This means that more than three-quarters suffer from inefficient cloud setups and can’t readily combine or analyze data from multiple cloud environments.

Data Governance

Migrating enterprise data to the cloud is only half the story – once there, it has to be governed. That means your cloud data assets must be available for use by the right people for the right purposes to maximize their security, quality and value.

Around 60 percent of enterprises worry about regulatory issues, governance and compliance with cloud services. The difficulty comes with creating good governance around data while avoiding risk and getting more out of that data. More than three-quarters (79 percent) of businesses are looking for better integrated security and governance for the data they put in the cloud.

Cloud migration provides a unique opportunity not simply to move things as they are to the cloud but also to make strategic changes. Companies are using the move to the cloud to make data governance a priority and show their customers they are good data stewards.

Unfortunately, 72 percent of companies state that deciding which workloads they should migrate to the cloud is one of their top four hurdles to cloud implementation. However, cloud migration is not an endpoint; it’s just the next step in making your business flexible and agile for the long term.

Determining which data sets need to be migrated can help you prepare for growth in the long run. The degree of governance each set of data needs will help determine what you should migrate and what you should keep in place.

Automated Cloud Migration and Data Governance

The preceding list of cloud migration challenges might seem daunting, especially for an organization that collects and manages a great deal of data. When enterprises face the prospect of manual, cumbersome work related to their business processes, IT infrastructure, and more, they often turn to automation.

You can apply the same idea to your cloud migration strategy because automated software tools can aid in the planning and heavy lifting of cloud migrations. As such, they should be considered when it comes to choosing platforms, forecasting costs, and understanding the value of the data being considered for migration.

erwin Cloud Catalyst is a suite of automated cloud migration and data governance software and services to simplify and accelerate the move to cloud platforms and govern those data assets throughout their lifecycle. Automation is a critical differentiator for erwin’s cloud migration and data governance tools.

Key Benefits of erwin Cloud Catalyst:

  • Cost Mitigation: Automated tools scan repositories in your environment and add structure and business context (where it is, who can access it, etc.) in the transformation of legacy structures.
  • Reduced Risk and Faster Time to Value: Automated tools can help you reduce risks, costs and the time it takes to realize value.
  • Tech-Agnostic: Technology-agnostic approach adds value to cloud migration projects.
  • Any Cloud to Any Cloud: Automatically gathering the abstracted essence of the data will make it easier to point that information at another cloud platform or technology if, or likely when, you migrate again.
  • Institutional Knowledge Retention: Collect and retain institutional knowledge around data and enable transparency.
  • Continuous Data Governance: Automation helps IT organizations address data governance during cloud migrations and then for the rest of the cloud data lifecycle and minimizes human intervention.

Every customer’s environment and data is unique. That’s why the first step is working with you to assess your cloud migration strategy. Then we deliver an automation roadmap and design the appropriate smart data connectors to help your IT services team achieve your future-state architecture, including accelerating data ingestion and ETL conversion.

To get started, request your cloud-readiness assessment.

And here’s a video with some more information about our approach to cloud migration and data governance.



Data Intelligence in the Next Normal: Why, Who and When?

While many believe that the dawn of a new year represents a clean slate or a blank canvas, we simply don’t leave the past behind by merely flipping over a page in the calendar.

As we enter 2021, we will also be building off the events of 2020 – both positive and negative – including the acceleration of digital transformation as the next normal begins to be defined.


As the pandemic took hold, IDC surveyed technology users and decision makers around the globe, reaching out every two weeks until September, when the survey frequency shifted to monthly. These surveys helped IDC develop a model that describes the five stages of enterprise recovery, aligning business focus with the economic situation:

  • When the COVID-19 crisis hit, organizations focused on business continuity.
  • As the economy slowed, they focused on cost optimization.
  • In the recession period, their focus turned to business resiliency.
  • As the economy returns to growth, organizations are making targeted investments.
  • When we enter into the next normal, the future enterprise will emerge.

The IDC surveys explored how the crisis impacted budgets across different areas of IT, from hardware and networking, to software and professional services. When the pandemic first hit, there was some negative impact on big data and analytics spending.

However, the economic situation changed as time went on. Digital transformation was accelerated, and budgets for spending on big data and analytics increased. This spending has continued during the return to growth, with more organizations moving toward becoming the future enterprise.

I have long stated that data is the lifeblood of digital transformation, and if the pandemic really has accelerated digital transformation, then the trends reported in IDC’s worldwide surveys make sense.

But data without intelligence is just data, and this is WHY data intelligence is required.

Data intelligence is a key input to data enablement in the digital enterprise, both by improving data literacy among data-native workers and by assuring the right data is being used at the right time, and for the right reason(s).

WHO needs to be involved in implementing and using data intelligence in the digital enterprise?

There is an ever-growing number of roles that work with data daily to complete tasks, make decisions, and affect business outcomes. These roles range from technical to business, from operations to strategy, and from the back office to the front office.

IDC has defined people in these roles as a generation: “Generation Data,” or “Gen-D” for short. Gen-D workers are data-natives — data is what they work in and work with to complete their tasks, tactical and/or strategic.

You may be part of Gen-D if “data” is in your job title, you are expected to make data-driven decisions, and you are able to use data to communicate with others. Gen-D workers also contribute to the overall data knowledge in the organization by participating in data intelligence and data literacy efforts and promoting good data culture.

WHEN do you need to gather intelligence about your data?

Now is the time.

The next or new normal has already begun and the more you know about your data, the better your digital business outcomes will be. It has been said that while it can take a long time to gain a customer’s trust, it only takes one bad experience to lose it.

Personally, I have had several instances of poor digital experiences such as items sent to the wrong address or orders (including mobile food orders) being fulfilled incorrectly.

Each represents a data problem: incorrect data, incorrect data interpretation, or a complete disconnect between the virtual and physical world. In these cases, better data intelligence could have helped in assuring the correct address, enabling correct order fulfillment, and assisting with interpretation through better data definition and description.

Even if you don’t have a formal data intelligence program in place, there is a good possibility your organization has intelligence about its data, because it is difficult for data to exist without some form of associated metadata.

Technical metadata is what makes up database schema and table definitions. Logical and physical data models may exist in data modeling or general-purpose diagraming software.

There is also a high likelihood that data models, data dictionaries, and data catalogs exist in the ubiquitous spreadsheet, or in centralized document repositories. However, just having metadata isn’t the same as managing and leveraging it as intelligence. Data in modern business environments is very dynamic, constantly moving, drifting, and shifting – requiring automated collection, management, and analytics to extract and leverage intelligence about it.
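As a small, self-contained example of the technical metadata that exists alongside any database, here is how column definitions can be read back out of SQLite; other DBMSs expose the same information through catalogs such as information_schema:

```python
import sqlite3

# An in-memory database stands in for any operational data source.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")

# PRAGMA table_info returns one row of technical metadata per column:
# (position, name, type, not-null flag, default value, primary-key flag).
for cid, name, col_type, notnull, default, pk in con.execute("PRAGMA table_info(orders)"):
    print(f"{name}: {col_type} (primary key: {bool(pk)})")
# id: INTEGER (primary key: True)
# customer: TEXT (primary key: False)
# total: REAL (primary key: False)
```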

In many English-speaking countries, “Auld Lang Syne,” a Scots-language poem written by Robbie Burns and set to a common folk song tune, is often sung as the clock strikes midnight on the first day of the new year.

The phrase “auld lang syne” has several interpretations, but it can loosely be translated as “for the sake of old times.” As we move into 2021, we need to forget the negatives of 2020, and build on the positives to help define the next normal.


erwin’s Predictions for 2021: Data Relevance Shines at the End of the Tunnel

I think it’s fair to say that we’re all glad to usher in a New Year.

Unexpected, unprecedented and disruptive, 2020 required radical transformation – within global organizations and our day-to-day lives.

Our predictions for 2021 are rooted in what we’ve learned from the past year and the relevance of data in getting us to where we are and where we need to go. With that said, here are seven predictions for the New Year:


1. “Death of the Office” Alters Compliance:

Because of the pandemic, many organizations have decided remote work is here to stay. However, challenges persist if your organization doesn’t take proper precautions in supporting a remote workforce — from human resources to productivity and IT security – especially when regulations such as the European Union’s General Data Protection Regulation (GDPR) are involved.

In highly regulated environments, such as financial services, healthcare and pharma, attestations, audit trails and compliance reporting are required regardless of circumstances and will be difficult with a manual, laborious approach. As a result, look for more automated compliance solutions to hit the market.

2. Data’s Move to the Cloud Hits Warp Speed:

As a result of COVID-19, enterprises are expediting their digital transformation and cloud migration efforts at record rates, with no end in sight. Historically, moving legacy data to the cloud hasn’t been easy or fast.

As businesses migrate from legacy systems to the cloud, data governance and data intelligence will become increasingly relevant to the C-suite and tools to automate and expedite the process will take center stage.

3. AI-Fueled Data Governance:

Artificial intelligence (AI) has been narrowly tied to the internet of things (IoT) with smart features like Alexa, Nest and self-driving cars. However, that definition is too narrow in terms of AI’s relation to data governance.

AI will drive the automation that keeps an organization’s data governance framework accurate and up to date, ensuring data environments stay controlled and enabling compliance and the tagging of sensitive data.

4. Rise of Data Governance Standards:

Despite the rise of data governance, industry-wide standards have been lacking, surprisingly. This year, that will change as semantic metadata comes into its own driven by Microsoft’s Common Data Model (CDM). The CDM provides a best-practices approach to defining data to accelerate data literacy, automation, integration and governance across the enterprise. We anticipate more standards to be developed and released in CDM’s wake.

5. EA Becomes Integral to Innovation:

Enterprise architecture (EA) is having a resurgence as COVID-19 has forced organizations around the globe to re-examine or reimagine themselves. However, even in “normal times,” business leaders need to understand how to grow, bring new products to market through organic growth or acquisition, identify new trends and opportunities, determine if new opportunities provide a return on investment, etc.

Organizations that can identify these opportunities and respond to them have a distinct edge over their competitors. EA will become even more critical to innovation in the year ahead.

6. Data Becomes a Matter of Life or Death:

To say that data will be the difference between life and death in 2021 is not hyperbole. The initial rounds of COVID-19 vaccines are being distributed, and at the heart of these monumental medical breakthroughs is data.

From re-engineering research and development to managing trials and regulatory approvals to setting up manufacturing and distribution, data has made and will continue to make a life-and-death difference.

Figuratively, data will mean life or death for businesses around the world. In restoring some sort of normal activities, organizations across all industries will rely on data to make it happen. That data will need to be accurate and trusted so it can lead to actionable insights – and it will need to be governed throughout its lifecycle.

7. Workplace Humanity:

In 2020, the boundaries separating work from the rest of our lives shrank considerably. Whether it was via Zoom, Microsoft Teams, Google Hangout, etc., the pandemic forced us to bring our true selves, our spouses, our kids and our pets to work last year.

As a result, we’ve developed a new level of humanity in the workplace, which may be the ultimate silver lining from these dark times. From employees to customers, we’ve had to communicate more, listen more, and help each other more. My hope is that this trend of being human in the workplace extends beyond 2021.

COVID changed everything. However, I believe we will start to see the light at the end of this dark tunnel in 2021. Innovation will accelerate at a pace we haven’t seen in years, perhaps not since the dot-com boom. But to succeed in the new normal and avoid false starts and bad decisions, enterprises must be able to find, navigate, understand and use their most valuable data assets.

Metadata and its management and governance are key to organizations being able to fully harness the benefits of their data. As such, erwin is pleased to sponsor the first webinar of Dataversity’s 2021 Real-World Data Governance Webinar Series: “Data and Metadata Will Not Govern Themselves.” Click here to learn more and register.


erwin, Microsoft and the Power of the Common Data Model

What is Microsoft’s Common Data Model (CDM), and why is it so powerful?

Imagine if every person in your organization spoke a different language, and you had no simple way to translate what they were saying. It would make your work frustrating, complicated and slow.

The same is true for data, with a number of vendors creating data models by vertical industry (financial services, healthcare, etc.) and making them commercially available to improve how organizations understand and work with their data assets. The CDM takes this concept to the next level.

Microsoft has delivered a critical building block for the data-driven enterprise by capturing proven business data constructs and semantic descriptors for data across a wide range of business domains in the CDM and providing the contents in an open-source format for consumption and integration. The CDM provides a best-practices approach to defining data to accelerate data literacy, automation, integration and governance across the enterprise.

Why Is the CDM Such a Big Deal?

The value of the CDM shows up in multiple ways. One is enabling data to be unified. Another is to reduce the time and effort in manual mapping – ultimately saving the organization money.

By having a single definition of something, complex ETL doesn’t have to be performed repeatedly. Once something is defined, everyone can map to the standard definition of what the data means.
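A toy sketch of that idea – not the CDM’s actual format, just an illustration of mapping two source schemas onto one shared definition so each source is translated once:

```python
# A shared, CDM-style definition of "Customer" (illustrative, not the real CDM schema).
COMMON_CUSTOMER = ["customer_id", "display_name", "email"]

# Each source system maps its own field names onto the common attributes once.
SOURCE_MAPPINGS = {
    "crm": {"customer_id": "cust_no", "display_name": "name", "email": "email_addr"},
    "shop": {"customer_id": "buyer_id", "display_name": "full_name", "email": "contact"},
}

def to_common(source, record):
    """Translate a source-specific record into the common shape."""
    field_map = SOURCE_MAPPINGS[source]
    return {attr: record[field_map[attr]] for attr in COMMON_CUSTOMER}

print(to_common("crm", {"cust_no": 7, "name": "Ann", "email_addr": "ann@example.com"}))
print(to_common("shop", {"buyer_id": 7, "full_name": "Ann", "contact": "ann@example.com"}))
# Both print the same shape: {'customer_id': 7, 'display_name': 'Ann', 'email': 'ann@example.com'}
```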

Beyond saving time, effort and money, CDM can help transform your business in even more ways, including:

  • Innovation: With data having a common meaning, the business can unlock new scenarios, like modern and advanced analytics, experiential analytics, AI, machine learning, etc.
  • Insights: Given the meaning of the data is the same, regardless of the domain it came from, an organization can use its data to power business insights.
  • Compliance: It improves data governance to comply with such regulations as the General Data Protection Regulation (GDPR).
  • Cloud migration and other data platform modernization efforts: A common, platform-neutral definition of data makes it easier to map and move data assets to new platforms and technologies.

Once the organization understands what something is, and it is commonly understood across the enterprise, anyone can build semantically aware reporting and analytical requirements plus deliver a uniform view because there is a common understanding of data.


erwin Expands Collaboration with Microsoft

The combination of Microsoft’s CDM with erwin’s industry-leading data modeling, governance and automation solutions can optimize an organization’s data capability and accelerate the impact and business value of enterprise data.

erwin recently announced its expanded collaboration with Microsoft. By working together, the companies will help organizations get a handle on disparate data, put it in one place, and then determine how to do something meaningful with it.

The erwin solutions that use Microsoft’s CDM are:

erwin Data Modeler: erwin DM automatically transforms the CDM into a graphical model, complete with business-data constructs and semantic metadata, to feed your existing data-source models and new database designs – regardless of the technology upon which these structures are deployed.

erwin DM’s reusable model templates, design layer and model compare/synchronization capabilities, combined with our design lifecycle and modeler collaboration services, enable organizations to capture and use CDM contents and best practices to optimize enterprise data definition, design and deployment.

erwin DM also enables the reuse of the CDM in the design and maintenance of enterprise data sources. It automatically consumes, integrates and maintains CDM metadata in a standardized, reusable design and supports logical and physical modeling and integration with all major DBMS technologies.

The erwin Data Intelligence Suite: erwin DI automatically scans, captures and activates metadata from the CDM into a central business glossary. Here, it is intelligently integrated and connected to the metadata from the data sources that feed enterprise applications.

Your comprehensive metadata landscape, including CDM metadata, is governed with the appropriate terminology, policies, rules and other business classifications you decide to build into your framework.

The resulting data intelligence is then discoverable via a self-service business user portal that provides role-based, contextual views. All this metadata-driven automation is possible thanks to erwin DI’s ability to consume and associate CDM metadata to create a data intelligence framework.

erwin and Microsoft recently co-presented a session on the power of the CDM that included a demonstration of how to create a data lake for disparate data sources, migrate all that data to it, and then provide business users with contextual views of the underlying metadata, based on a CDM-enriched business glossary.

The demonstration also covered the automatic generation of scripts for ETL tools, as well as the auto-generation of data lineage diagrams and impact analysis, so data governance is built in and continuous.

You can watch the full erwin/Microsoft session here.



Learn. Transform. Advance.

Learn. Transform. Advance.

That was the theme of the global conference we produced in October, but I’d venture to say it’s the mantra global organizations need to adopt as we continue to deal with the most disruptive event of our lifetime: COVID-19.

Learn

When I look back, I should not be surprised that 2020 has been incredibly busy for us here at erwin. I’ve made decisions during the past nine months I never thought I’d have to, and I’m sure this is the reality for you and your organization as well.

As CEO, of course I had big plans for us, but then the world and everything in it changed. However, data is central to our mission, and data is the key to sound decision-making. Thankfully, we were in the right business at the right time.

After we closed our offices and made sure our employees were safe and equipped to work from home, we emerged with more focus on that mission and more time to spend with customers. All the time I used to spend traveling, I now spend talking – or listening – to my employees or our customers from my dining room table.

Transform

To survive and eventually thrive in the face of radical disruption requires transformation that’s just as radical:

  • Developing new business models, like breweries and distillers manufacturing hand sanitizer
  • Reimagining current business models, like moving fitness classes outdoors
  • Creating new products and services, such as restaurants providing text check-in curb service
  • Market expansion, with traditional grocers becoming online shopping hubs

The companies that emerge strongest from this historic period of global uncertainty and change will be those that take an intelligent, data-driven approach to business.

The best example I can point to, which underscores the importance and relevance of data, is pharmaceutical companies that have been working at unprecedented speed to develop not only effective treatments for the coronavirus but also vaccines. Earlier this week, several historic announcements were made about the availability of the first shots and the plans to ship and administer them.

At the heart of these medical breakthroughs is data. From re-engineering research and development to managing trials and regulatory approvals to setting up manufacturing and distribution, data has made and will continue to make a life-and-death difference. Although I can’t share the details, I’m proud that erwin had a hand in this massive global health initiative.

While your organization may not be on the front line in the fight against the virus, you’ve no doubt been impacted by it. Have you had to revisit and reset priorities? Change your business processes? Confront new compliance challenges as a result of the changes you had to institute?

And when all the crazy ends, and it will end, are you prepared to capitalize on new opportunities?

Advance

Adversity spurs innovation, and I believe innovation will drive forward at a rapid pace once the global ecosystem finds its new normal. We might well be on our way. Will you be ready?

Enterprises must be able to find, navigate, understand and use their most valuable data assets. We knew that to be true before the pandemic, but now it’s become even more obvious.

Despite social distancing, I have to say our customer intimacy has never been greater. I believe that’s because our employees are operating with a heightened sense of purpose and urgency. We focused on doing what’s right, including making training for our solutions and other resources available online at no charge and releasing a free app to help organizations communicate and manage their remote workforces more effectively. We also made professional and consulting services more accessible.

The point is that if you stay close to your employees and your customers, good things will happen. Overcommunication is key. Those relationships will develop and deepen, even via Zoom or Microsoft Teams. Perhaps that’s because we’ve all realized how much we have in common. I might have a “C” in my title, but my wife is still annoyed that I’ve hijacked the dining room, and my daughters make a lot of noise being home instead of at school. Titles have taken a backseat to humanity as we all try to stay healthy, sane and accomplish what’s on our to-do lists each day.

Finding that balance is important. You must make time for your family and yourself because we do seem to be working longer and harder. Exercise. Listen to less news. Be thankful.

And with that, I want to express my appreciation to our customers, partners, employees and their families. I wish each of you health and time to enjoy the holiday season in a meaningful but perhaps smaller way. Here’s to a brighter 2021 together.



From Chaos to Control with Data Intelligence

As the amount of data grows exponentially, organizations turn to data intelligence to reach deeper conclusions about driving revenue, achieving regulatory compliance and accomplishing other strategic objectives.

It’s no secret that data has grown in volume, variety and velocity, with 2.5 quintillion bytes generated every day and 90 percent of the world’s data volume created just in the last two years. This data explosion has overwhelmed most organizations, making it nearly impossible for them to manage it, much less put it to smart, strategic use. How do you identify the time-sensitive, relevant insights that could mean the difference between the life and death of your business?


Time sensitivity in data management and analytics is a massive issue. Data needs to fuel rapid decisions that make your organization more effective, customer-centric and competitive. That was true before COVID-19, and it’s even more important in the face of the radical disruption it’s caused. The answer is radical transformation, made possible by an intelligent, data-driven approach to:

  • New business models
  • New products and services
  • Hyper-competition
  • Market expansion

One Customer’s Journey to Controlling Data Chaos

Ultra Mobile recently shared how it uses erwin Data Intelligence (erwin DI) as part of a modern, ongoing approach to data governance – control versus chaos.

Managing risk while also growing and competing effectively requires the ability to deal with both planned and unplanned change.

With erwin DI as its data governance platform, Ultra Mobile has a one-stop shop to see all of its data – and data changes – in one place thanks to forward and reverse lineage. Now, questions about the health of the business can be answered, including how best to retain customers, and it can explore ways to grow the subscriber base.

Being able to integrate all data touchpoints, including erwin DM for data modeling, Denodo for data virtualization, and Jira for ticketing, has been key. This metadata is ingested into the data catalog, definitions are added within a business glossary, and the searchable repository enables users to understand how data is used and stored.

erwin DI’s mind map also has proved helpful with being able to see associations and entity relationships, especially in terms of impact analysis for evaluating planned changes and their downstream effects.

Watch the full webinar.

Data Intelligence Just Got Smarter

erwin just released a new version of erwin DI. The enhancements include improvements to the user interface (UI), plus new artificial intelligence (AI) and self-service data discovery capabilities.

 

The new erwin DI makes it easier for organizations to tailor the solution to meet the unique needs of their data governance frameworks, identify and socialize the most valuable data assets, and expand metadata scanning and sensitive data tracking.

Using erwin DI, customers are powering comprehensive data governance initiatives, cloud migration and other massive digital transformation projects.

It facilitates both IT- and business-friendly discovery, navigation and understanding of data assets within context and in line with governance protocols.

And it provides organizations with even more flexibility to ensure the software fits their unique frameworks and workflows because one size does not fit all when it comes to data governance.

Backed by a flexible metamodel and deep metadata-driven automation, the updated erwin DI uniquely addresses both IT and business data governance needs to safeguard against risks and harness opportunities.

It combines and then raises the visibility of business and physical data assets in a framework that is flexible but always in sync and therefore sustainable. Then stakeholders from across the enterprise can discover, manage and collaborate on the most relevant and valuable data assets.

The latest erwin DI release builds on prior 2020 updates with:

  • New role-based and governance assignment capabilities, making it easier for an organization to tailor erwin DI to its data governance needs and framework
  • Enhanced UI, workflow and search to speed navigation, asset discovery, contextual understanding and data governance management
  • Expanded AI capabilities to enrich metadata scanning and speed the handling of sensitive data for automated GDPR and CCPA compliance programs
  • Greater visibility into business and data lineage through new vantage points, filters and drilldowns
  • Improved socialization and collaboration features to increase business user engagement and capitalize on organizational data quality knowledge
  • More administrative tools to efficiently onboard new users and roles, manage access rights, and address audit requests

Additionally, erwin DI was recently evaluated by Gartner for the 2020 “Magic Quadrant for Metadata Management Solutions,” which named erwin a “Leader” for the second consecutive year. Click here to download a copy of the Gartner Magic Quadrant report.

You also can request a free demo of erwin DI here.
