Categories
erwin Expert Blog

Choosing the Right Data Modeling Tool

The need for an effective data modeling tool is more significant than ever.

For decades, data modeling has provided the optimal way to design and deploy new relational databases with high-quality data sources and support application development. But it provides even greater value for modern enterprises, where critical data exists in both structured and unstructured formats and lives both on premises and in the cloud.

In today’s hyper-competitive, data-driven business landscape, organizations are awash with data and the applications, databases and schema required to manage it.

For example, an organization may have 300 applications, with 50 different databases and a different schema for each. Additional challenges, such as increasing regulatory pressures – from the General Data Protection Regulation (GDPR) to the Health Insurance Portability and Accountability Act (HIPAA) – and growing stores of unstructured data also underscore the increasing importance of a data modeling tool.

Data modeling, quite simply, describes the process of discovering, analyzing, representing and communicating data requirements in a precise form called the data model. There’s an expression: measure twice, cut once. Data modeling is the upfront “measuring tool” that helps organizations reduce time and avoid guesswork in a low-cost environment.

From a business-outcome perspective, a data modeling tool is used to help organizations:

  • Effectively manage and govern massive volumes of data
  • Consolidate and build applications with hybrid architectures, including traditional, Big Data, cloud and on premises
  • Support expanding regulatory requirements, such as GDPR and the California Consumer Privacy Act (CCPA)
  • Simplify collaboration across key roles and improve information alignment
  • Improve business processes for operational efficiency and compliance
  • Empower employees with self-service access for enterprise data capability, fluency and accountability


Evaluating a Data Modeling Tool – Key Features

Organizations seeking to invest in a new data modeling tool should consider these four key features.

  1. The ability to visualize business and technical database structures through an integrated, graphical model.

Given the number of database platforms available, it’s important that an organization’s data modeling tool supports a sufficiently broad array of platforms for that organization’s needs. The chosen data modeling tool should be able to read the technical formats of each of these platforms and translate them into highly graphical models rich in metadata. Schemas can then be deployed from models in an automated fashion and iteratively updated, so that new development takes place via model-driven design.
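To make model-driven design concrete, here is a minimal sketch of how a model’s metadata might be translated into deployable schema. The `model` dictionary, entity, and column names are all hypothetical stand-ins for the metadata a modeling tool would hold; no specific product’s format is assumed.

```python
# Hypothetical sketch: a graphical model's metadata (here a plain dict)
# is translated into deployable DDL. Names and types are illustrative.
model = {
    "Customer": {
        "customer_id": "INTEGER PRIMARY KEY",
        "name": "VARCHAR(100) NOT NULL",
        "email": "VARCHAR(255)",
    }
}

def generate_ddl(model: dict) -> str:
    """Emit CREATE TABLE statements from the model metadata."""
    statements = []
    for entity, attrs in model.items():
        cols = ",\n  ".join(f"{col} {typ}" for col, typ in attrs.items())
        statements.append(f"CREATE TABLE {entity} (\n  {cols}\n);")
    return "\n\n".join(statements)

print(generate_ddl(model))
```

Regenerating the DDL from an updated model, rather than hand-editing the database, is the essence of the iterative, model-driven workflow described above.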

  2. Empowering end-user BI/analytics through data source discovery, analysis and integration.

A data modeling tool should give business users confidence in the information they use to make decisions. Such confidence comes from the ability to provide a common, contextual, easily accessible source of data element definitions to ensure they are able to draw upon the correct data; understand what it represents, including where it comes from; and know how it’s connected to other entities.

A data modeling tool can also be used to pull in data sources via self-service BI and analytics dashboards. The data modeling tool should also have the ability to integrate its models into whatever format is required for downstream consumption.

  3. The ability to store business definitions and data-centric business rules in the model along with technical database schemas, procedures and other information.

With business definitions and rules on board, technical implementations can be better aligned with the needs of the organization. Using an advanced design layer architecture, model “layers” can be created with one or more models focused on the business requirements that then can be linked to one or more database implementations. Design-layer metadata can also be connected from conceptual through logical to physical data models.
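A small sketch can illustrate the design-layer idea: one logical model linked to more than one physical implementation, with metadata traceable across layers. The platform names, tables, and columns below are invented for illustration, not drawn from any actual tool.

```python
# Hypothetical design-layer linkage: one logical entity mapped to
# physical implementations on two (illustrative) target platforms.
logical = {"entity": "Customer", "attributes": ["CustomerID", "Name"]}

physical = {
    "oracle": {"table": "CUSTOMER", "columns": ["CUSTOMER_ID", "NAME"]},
    "sqlserver": {"table": "dbo.Customer", "columns": ["CustomerId", "Name"]},
}

def trace_attribute(attr_index: int) -> dict:
    """Trace one logical attribute to its column on each physical platform."""
    return {platform: impl["columns"][attr_index]
            for platform, impl in physical.items()}

# Follow the logical CustomerID attribute down to each implementation.
print(trace_attribute(0))
```

Keeping this linkage in the model is what lets a business-level change be traced through the logical layer to every affected physical schema.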

  4. The ability to rationalize platform inconsistencies and deliver a single source of truth for all enterprise business data.

Many organizations struggle to break down data silos and unify data into a single source of truth, due in large part to varying data sources and the difficulty of managing unstructured data. The ability to model any data from anywhere addresses this, including on-demand modeling of non-relational databases that offer speed, horizontal scalability and other real-time application advantages.

With NoSQL support, model structures for non-relational databases, such as Couchbase and MongoDB, can be created automatically. Existing Couchbase and MongoDB data sources can be easily discovered, understood and documented through modeling and visualization. Existing entity-relationship diagrams and SQL databases can also be migrated to Couchbase and MongoDB, with relational schemas transformed into query-optimized NoSQL constructs.
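The relational-to-document transformation mentioned above can be sketched in a few lines: a parent table and a child table, joined by a foreign key, are denormalized into the nested documents a store like MongoDB or Couchbase holds. The tables, fields, and sample rows here are purely illustrative.

```python
# Hypothetical sketch: denormalizing two related tables into nested
# documents (a query-optimized NoSQL construct). Names are illustrative.
customers = [{"customer_id": 1, "name": "Acme"}]
orders = [
    {"order_id": 10, "customer_id": 1, "total": 250.0},
    {"order_id": 11, "customer_id": 1, "total": 99.5},
]

def to_documents(customers, orders):
    """Embed each customer's orders, replacing the SQL join with nesting."""
    docs = []
    for c in customers:
        docs.append({
            "_id": c["customer_id"],
            "name": c["name"],
            "orders": [o for o in orders
                       if o["customer_id"] == c["customer_id"]],
        })
    return docs

print(to_documents(customers, orders))
```

Embedding trades the flexibility of the join for read performance: a single document fetch returns a customer and all of its orders.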

Other considerations include the ability to:

  • Compare models and databases.
  • Increase enterprise collaboration.
  • Perform impact analysis.
  • Enable business and IT infrastructure interoperability.

When it comes to data modeling, no one knows it better. For more than 30 years, erwin Data Modeler has been the market leader. It is built on the vision and experience of data modelers worldwide and is the de facto standard in data model integration.

You can learn more about driving business value and underpinning governance with erwin DM in this free white paper.

Data Modeling Drives Business Value


The Role of An Effective Data Governance Initiative in Customer Purchase Decisions

A data governance initiative will maximize the security, quality and value of data, all of which build customer trust.

Without data, modern business would cease to function. Data helps guide decisions about products and services, makes it easier to identify customers, and serves as the foundation for everything businesses do today. The problem for many organizations is that data enters from any number of angles and gets stored in different places by different people and different applications.

Getting the most out of your data requires that you know what you have, where you have it, and that you understand its quality and value to the organization. This is where data governance comes into play. You can’t optimize your data if it’s scattered across different silos and lurking in various applications.

For about 150 years, manufacturers relied on their machinery running reliably, properly and safely to keep customers happy and revenue flowing. A data governance initiative has a similar role today, except its aim is to maximize the security, quality and value of data instead of machinery.

Customers are increasingly concerned about the safety and privacy of their data. According to a survey by Research+Data Insights, 85 percent of respondents worry about technology compromising their personal privacy. In a survey of 2,000 U.S. adults in 2016, researchers from Vanson Bourne found that 76 percent of respondents said they would move away from companies with a high record of data breaches.

For years, buying decisions were driven mainly by cost and quality, says Danny Sandwell, director of product marketing at erwin, Inc. But today’s businesses must consider their reputations in terms of both cost/quality and how well they protect their customers’ data when trying to win business.

Once the reputation is tarnished because of a breach or misuse of data, customers will question those relationships.

Unfortunately for consumers, examples of companies failing to properly govern their data aren’t difficult to find. Look no further than Under Armour, which announced this spring that 150 million accounts at its MyFitnessPal diet and exercise tracking app were breached, and Facebook, where the data of millions of users was harvested by third parties hoping to influence the 2016 presidential election in the United States.

Customers Hate Breaches, But They Love Data

While consumers are quick to report concerns about data privacy, customers also yearn for (and increasingly expect) efficient, personalized and relevant experiences when they interact with businesses. These experiences are, of course, built on data.

In this area, customers and businesses are on the same page. Businesses want to collect data that helps them build the omnichannel, 360-degree customer views that make their customers happy.

These experiences allow businesses to connect with their customers and demonstrate how well they understand them and know their preferences, likes and dislikes – essentially taking the personalized service of the neighborhood market to the internet.

The only way to manage that effectively at scale is to properly govern your data.

Delivering personalized service is also valuable to businesses because it helps turn customers into brand ambassadors, and it’s much easier to build on existing customer relationships than to find new customers.

Here’s the upshot: If your organization is doing data governance right, it’s helping create happy, loyal customers, while at the same time avoiding the bad press and financial penalties associated with poor data practices.

Putting A Data Governance Initiative Into Action

The good news is that 76 percent of respondents to a November 2017 survey we conducted with UBM said understanding and governing the data assets in the organization was either important or very important to the executives in their organization. Nearly half (49 percent) of respondents said that customer trust/satisfaction was driving their data governance initiatives.

Importance of a data governance initiative

What stops organizations from creating an effective data governance initiative? At some businesses, it’s a cultural issue. Both the business and IT sides of the organization play important roles in data, with the IT side storing and protecting it, and the business side consuming data and analyzing it.

For years, however, data governance was the volleyball passed back and forth over the net between IT and the business, with neither side truly owning it. Our study found signs this is changing: more than half (57 percent) of respondents said both IT and the business/corporate teams were responsible for data in their organization.

Who's responsible for a data governance initiative

Once an organization understands that IT and the business are both responsible for data, it still needs to develop a comprehensive, holistic strategy for data governance that is capable of:

  • Reaching every stakeholder in the process
  • Providing a platform for understanding and governing trusted data assets
  • Delivering the greatest benefit from data wherever it lives, while minimizing risk
  • Helping users understand the impact of changes made to a specific data element across the enterprise

To accomplish this, a modern data governance initiative needs to be interdisciplinary. It should include not only data governance, which is ongoing because organizations are constantly changing and transforming, but other disciplines as well.

Enterprise architecture is important because it aligns IT and the business, mapping a company’s applications and the associated technologies and data to the business functions they enable.

By integrating data governance with enterprise architecture, businesses can define application capabilities and interdependencies in the context of enterprise strategy, and prioritize technology investments so they align with business goals and produce the desired outcomes.

A business process and analysis component is also vital to modern data governance. It defines how the business operates and ensures employees understand and are accountable for carrying out the processes for which they are responsible.

Enterprises can clearly define, map and analyze workflows and build models to drive process improvement, as well as identify business practices susceptible to the greatest security, compliance or other risks and where controls are most needed to mitigate exposures.

Finally, data modeling remains the best way to design and deploy new relational databases with high-quality data sources and support application development.

Being able to cost-effectively and efficiently discover, visualize and analyze “any data” from “anywhere” underpins large-scale data integration, master data management, Big Data and business intelligence/analytics with the ability to synthesize, standardize and store data sources from a single design, as well as reuse artifacts across projects.

Michael Pastore is the Director, Content Services at QuinStreet B2B Tech. This content originally appeared as a sponsored post on http://www.eweek.com/.

Read the previous post on how compliance concerns and the EU’s GDPR are driving businesses to implement data governance.

Determine how effective your current data governance initiative is by taking our DG RediChek.

Take the DG RediChek