
Integrating SQL and NoSQL into Data Modeling for Greater Business Value: The Latest Release of erwin Data Modeler


Due to the prevalence of internal and external market disruptors, many organizations are aligning their digital transformation and cloud migration efforts with other strategic requirements (e.g., compliance with the General Data Protection Regulation).

Accelerating the retrieval and analysis of data – so much of it unstructured – is vital to becoming a data-driven business that can effectively respond in real time to customers, partners, suppliers and other parties, and profit from these efforts. But even though speed is critical, businesses must take the time to model and document new applications for compliance and transparency.

For decades, data modeling has been the optimal way to design and deploy new relational databases with high-quality data sources and support application development. It facilitates communication between the business and system developers so stakeholders can understand the structure and meaning of enterprise data within a given context. Today, it provides even greater value because critical data exists in both structured and unstructured formats and lives both on premises and in the cloud.

Comparing SQL and NoSQL

While it may not be the most exciting matchup, there’s much to be said when comparing SQL vs. NoSQL databases. SQL databases use schemas and pre-defined tables, while NoSQL databases take the opposite approach. Instead of schemas and tables, NoSQL databases store data in ways that depend on the kind of NoSQL database being used.
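
To make the contrast concrete, here’s a minimal sketch in Python – with illustrative table and field names, not drawn from any particular product – of the two storage styles side by side: a relational table whose schema is fixed up front versus schema-flexible documents.

```python
import json
import sqlite3

# SQL: the schema is declared up front; every row must fit the table definition.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        email       TEXT
    )
""")
con.execute("INSERT INTO customer VALUES (1, 'Acme Corp', 'info@acme.example')")

# NoSQL (document style): each record carries its own structure, so two
# "customer" documents can legitimately differ from one another.
documents = [
    {"customer_id": 1, "name": "Acme Corp", "email": "info@acme.example"},
    {"customer_id": 2, "name": "Globex", "contacts": [{"type": "phone", "value": "555-0100"}]},
]
for doc in documents:
    print(json.dumps(doc))
```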

While the SQL and NoSQL worlds can complement each other in today’s data ecosystem, most enterprises need to focus on building expertise and processes for the latter format.

After all, they’ve already had decades of practice designing and managing SQL databases that emphasize storage efficiency and referential integrity rather than fast data access, which is so important to building cloud applications that deliver real-time value to staff, customers and other parties. Query-optimized modeling is the new watchword when it comes to supporting today’s fast-delivery, iterative and real-time applications.
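
As a rough illustration of what query-optimized modeling means in practice, consider the same order modeled two ways (the entities and fields below are invented for the example): the normalized shape favors storage efficiency and integrity, while the denormalized document is pre-joined around the application’s dominant read.

```python
# Normalized, storage-efficient shape: line items live in their own table
# and are joined back to the order at read time.
order_row = {"order_id": 1001, "customer_id": 7}
line_item_rows = [
    {"order_id": 1001, "sku": "A-1", "qty": 2},
    {"order_id": 1001, "sku": "B-9", "qty": 1},
]
joined = [li for li in line_item_rows if li["order_id"] == order_row["order_id"]]

# Query-optimized, denormalized shape: the document is shaped around the
# dominant access pattern, so "fetch an order with its lines" is one read.
order_document = {
    "order_id": 1001,
    "customer_id": 7,
    "lines": [{"sku": "A-1", "qty": 2}, {"sku": "B-9", "qty": 1}],
}

print("join produced", len(joined), "rows; the document read needs no join at all")
```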

DBMS products based on rigid schema requirements impede our ability to fully realize business opportunities that can expand the depth and breadth of relevant data streams for conversion into actionable information. New, business-transforming use cases often involve variable data feeds, real-time or near-time processing and analytics requirements, and the scale to process large volumes of data.

NoSQL databases, such as Couchbase and MongoDB, are purpose-built to handle the variety, velocity and volume of these new data use cases. Schema-less or dynamic schema capabilities, combined with increased processing speed and built-in scalability, make NoSQL the ideal platform.

Making the Move to NoSQL

Now the hard part. Once we’ve agreed to make the move to NoSQL, the next step is to identify the architectural and technological implications facing the folks tasked with building and maintaining these new mission-critical data sources and the applications they feed.

As the data modeling industry leader, erwin has identified a critical success factor for the majority of organizations adopting NoSQL platforms like Couchbase, Cassandra and MongoDB: successfully leveraging them requires a significant paradigm shift in how we design NoSQL data structures and deploy the databases that manage them.

But as with most technology requirements, we need to shield the business from the complexity and risk associated with this new approach. The business cares little for the technical distinctions of the underlying data management “black box.”

Business data is business data, with the main concerns being its veracity and value. Accountability, transparency, quality and reusability are required, regardless. Data needs to be trusted, so decisions can be made with confidence, based on facts. We need to embrace this paradigm shift, while ensuring it fits seamlessly into our existing data management practices as well as interactions with our partners within the business. Therefore, the challenge of adopting NoSQL in an organization is two-fold: 1) mastering and managing this new technology and 2) integrating it into an expansive and complex infrastructure.

The Newest Release of erwin Data Modeler

There’s a reason erwin Data Modeler is the No. 1 data modeling solution in the world.

And the newest release delivers all-in-one SQL and NoSQL data modeling, guided denormalization and model-driven engineering support for Couchbase, Cassandra, MongoDB, JSON and AVRO. NoSQL users get all of the great capabilities inherent in erwin Data Modeler. It also provides Data Vault modeling, enhanced productivity, and simplified administration of the data modeling repository.

Now you can rely on one solution for all your enterprise data modeling needs, working across DBMS platforms, using modern modeling techniques for faster data value, and centrally governing all data definition, data modeling and database design initiatives.

erwin data models reduce complexity, making it easier to design, deploy and understand data sources to meet business needs. erwin Data Modeler also automates and standardizes model design tasks, including complex queries, to improve business alignment, ensure data integrity and simplify integration.

In addition to the above, the newest release of erwin Data Modeler by Quest also provides:

  • Updated support and certifications for the latest versions of Oracle, MS SQL Server, MS Azure SQL and MS Azure SQL Synapse
  • JDBC-connectivity options for Oracle, MS SQL Server, MS Azure SQL, Snowflake, Couchbase, Cassandra and MongoDB
  • Enhanced administration capabilities to simplify and accelerate data model access, collaboration, governance and reuse
  • New automation, connectivity, UI and workflow optimization to enhance data modeler productivity by reducing onerous manual tasks

erwin Data Modeler is a proven technology for improving the quality and agility of an organization’s overall data capability – and that includes data governance and data intelligence.

Click here for your free trial of erwin Data Modeler.


erwin, Microsoft and the Power of the Common Data Model

What is Microsoft’s Common Data Model (CDM), and why is it so powerful?

Imagine if every person in your organization spoke a different language and you had no simple way to translate what they were saying. It would make your work frustrating, complicated and slow.

The same is true for data, with a number of vendors creating data models by vertical industry (financial services, healthcare, etc.) and making them commercially available to improve how organizations understand and work with their data assets. The CDM takes this concept to the next level.

Microsoft has delivered a critical building block for the data-driven enterprise by capturing proven business data constructs and semantic descriptors for data across a wide range of business domains in the CDM and providing the contents in an open-source format for consumption and integration. The CDM provides a best-practices approach to defining data to accelerate data literacy, automation, integration and governance across the enterprise.

Why Is the CDM Such a Big Deal?

The value of the CDM shows up in multiple ways. One is enabling data to be unified. Another is reducing the time and effort spent on manual mapping – ultimately saving the organization money.

With a single definition of something, complex ETL doesn’t have to be performed repeatedly. Once something is defined, everyone can map to the standard definition of what the data means.
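
Here’s a toy sketch of that idea, with made-up source systems and field names: once each source is mapped to the standard definition a single time, records from different systems become directly comparable.

```python
# Two source systems describe the same concept with different field names.
crm_record = {"cust_nm": "Acme Corp", "cust_email": "info@acme.example"}
billing_record = {"customer_name": "Acme Corp", "contact_email": "info@acme.example"}

# One mapping per source to the common definition, written once and reused.
TO_COMMON = {
    "crm":     {"cust_nm": "name", "cust_email": "email"},
    "billing": {"customer_name": "name", "contact_email": "email"},
}

def to_common(source: str, record: dict) -> dict:
    """Rename a source record's fields to the shared attribute names."""
    return {TO_COMMON[source][field]: value for field, value in record.items()}

# Both records now agree, so downstream ETL logic is written only once.
assert to_common("crm", crm_record) == to_common("billing", billing_record)
```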

Beyond saving time, effort and money, CDM can help transform your business in even more ways, including:

  • Innovation: With data having a common meaning, the business can unlock new scenarios, like modern and advanced analytics, experiential analytics, AI and more.
  • Insights: Given the meaning of the data is the same, regardless of the domain it came from, an organization can use its data to power business insights.
  • Compliance: It improves data governance to comply with such regulations as the General Data Protection Regulation (GDPR).
  • Cloud migration and other data platform modernization efforts: With data commonly defined up front, it’s easier to map, migrate and validate data as it moves to modern platforms.

Once the organization understands what something is, and it is commonly understood across the enterprise, anyone can build semantically aware reporting and analytics and deliver a uniform view, because there is a common understanding of the data.


erwin Expands Collaboration with Microsoft

The combination of Microsoft’s CDM with erwin’s industry-leading data modeling, governance and automation solutions can optimize an organization’s data capability and accelerate the impact and business value of enterprise data.

erwin recently announced its expanded collaboration with Microsoft. By working together, the companies will help organizations get a handle on disparate data, put it in one place, and then determine how to do something meaningful with it.

The erwin solutions that use Microsoft’s CDM are:

erwin Data Modeler: erwin DM automatically transforms the CDM into a graphical model, complete with business-data constructs and semantic metadata, to feed your existing data-source models and new database designs – regardless of the technology upon which these structures are deployed.

erwin DM’s reusable model templates, design layer and model compare/synchronization capabilities, combined with our design lifecycle and modeler collaboration services, enable organizations to capture and use CDM contents and best practices to optimize enterprise data definition, design and deployment.

erwin DM also enables the reuse of the CDM in the design and maintenance of enterprise data sources. It automatically consumes, integrates and maintains CDM metadata in a standardized, reusable design and supports logical and physical modeling and integration with all major DBMS technologies.
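
Because the CDM’s contents are published as open-source files, they are straightforward to consume programmatically. The snippet below is only a simplified, illustrative fragment – the real CDM entity definitions use Microsoft’s own schema and carry much richer semantic metadata – but it shows the flavor of reading entity metadata for reuse:

```python
import json

# A deliberately simplified stand-in for a CDM-style entity definition;
# the actual open-source CDM files carry far more semantic metadata.
entity_json = """
{
  "entityName": "Account",
  "attributes": [
    {"name": "accountId", "dataType": "guid",   "description": "Unique identifier for the account"},
    {"name": "name",      "dataType": "string", "description": "Company or organization name"}
  ]
}
"""

entity = json.loads(entity_json)
for attr in entity["attributes"]:
    print(f'{entity["entityName"]}.{attr["name"]} ({attr["dataType"]}): {attr["description"]}')
```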

The erwin Data Intelligence Suite: erwin DI automatically scans, captures and activates metadata from the CDM into a central business glossary. Here, it is intelligently integrated and connected to the metadata from the data sources that feed enterprise applications.

Your comprehensive metadata landscape, including CDM metadata, is governed with the appropriate terminology, policies, rules and other business classifications you decide to build into your framework.

The resulting data intelligence is then discoverable via a self-service business user portal that provides role-based, contextual views. All this metadata-driven automation is possible thanks to erwin DI’s ability to consume and associate CDM metadata to create a data intelligence framework.

erwin and Microsoft recently co-presented a session on the power of the CDM that included a demonstration of how to create a data lake for disparate data sources, migrate all that data to it, and then provide business users with contextual views of the underlying metadata, based on a CDM-enriched business glossary.

The demonstration also covered the automatic generation of scripts for ETL tools, as well as the auto-generation of data lineage diagrams and impact analysis, so data governance is built in and continuous.

You can watch the full erwin/Microsoft session here.


The Top Six Benefits of Data Modeling – What Is Data Modeling?

Understanding the benefits of data modeling is more important than ever.

Data modeling is the process of creating a data model to communicate data requirements, documenting data structures and entity types.

It serves as a visual guide in designing and deploying databases with high-quality data sources as part of application development.

Data modeling has been used for decades to help organizations define and categorize their data, establishing standards and rules so it can be consumed and then used by information systems. Today, data modeling is a cost-effective and efficient way to manage and govern massive volumes of data, aligning data assets with the business functions they serve.

You can automatically generate data models and database designs to increase efficiency and reduce errors, making the lives of your data modelers – and other stakeholders – much easier.


Benefits of Enterprise Modeling and Data Intelligence Solutions

Users discuss how they are putting erwin’s data modeling, enterprise architecture, business process modeling, and data intelligence solutions to work

IT Central Station members using erwin solutions are realizing the benefits of enterprise modeling and data intelligence. This article highlights some specific use cases and the results they’re experiencing within their organizations.

Enterprise Architecture & Business Process Modeling with erwin Evolve

An enterprise architect uses erwin Evolve at an aerospace/defense firm with more than 10,000 employees. His team is “doing business process modeling and high-level strategic modeling with its capabilities.” Others in his company are using it for IT infrastructure, such as aligning requirements to system solutions.

For Matthieu G., a senior business process management architect at a pharma/biotech company with more than 5,000 employees, erwin Evolve was useful for enterprise architecture reference. As he put it, “We are describing our business process and we are trying to describe our data catalog. We are describing our complete applications assets, and we are interfacing to the CMDB of our providers.”

His team also is using the software to manage roadmaps in their main transformation programs. He added, “We have also linked it to our documentation repository, so we have a description of our data documents.” They have documented 200 business processes in this way. In particular, the tool helped them to design their qualification review, which is necessary in a pharmaceutical business.

erwin Evolve users are experiencing numerous benefits. According to the aerospace enterprise architect, “It’s helped us advance in our capabilities to perform model-based systems engineering, and also model-based enterprise architecture.”

This matters because, as he said, “By placing the data and the metadata into a model, which is what the tool does, you gain the abilities for linkages between different objects in the model, linkages that you cannot get on paper or with Visio or PowerPoint.” That is a huge differentiator for this user.

This user also noted, “I use the automatic diagramming features a lot. When one of erwin’s company reps showed that to me a couple of years ago, I was stunned. That saves hours of work in diagramming. That capability is something I have not seen in other suppliers’ tools.”

He further explained that it “really helps too when your data is up to date. The tool will then automatically generate the updated diagram based on the data, so you know it’s always the most current version. You can’t do that in things like Visio and PowerPoint. They’re static snapshots of a diagram at some point in time. This is live and dynamic.”


Data Modeling with erwin Data Modeler

George H., a technology manager, uses erwin Data Modeler (erwin DM) at a pharma/biotech company with more than 10,000 employees for their enterprise data warehouse.

He elaborated by saying, “We have an enterprise model being maintained and we have about 11 business-capability models being maintained. Examples of business capabilities would be finance, human resources, supply-chain, sales and marketing, and procurement. We maintain business domain models in addition to the enterprise model.”

Roshan H., an EDW architect/data modeler who uses erwin DM at Royal Bank of Canada, works on diverse platforms, including Microsoft SQL Server, Oracle, DB2, Teradata and NoSQL. After gathering requirements and mapping data in Excel, they start building the conceptual model and then the logical model with erwin DM.

He said, “When we have these data models built in the erwin DM, we generate the PDF data model diagrams and take it to the team (DBA, BSAs, QA and others) to explain the model diagram. Once everything is reviewed, then we go on to discuss the physical data model.”

“We use erwin DM to do all of the levels of analysis that a data architect does,” said Sharon A., a senior manager, data governance at an insurance company with over 500 employees. She added, “erwin DM does conceptual, logical and physical database or data structure capture and design, and creates a library of such things.

“We do conceptual data modeling, which is very high-level and doesn’t have columns and tables. It’s more concepts that the business described to us in words. We can then use the graphic interface to create boxes that contain descriptions of things and connect things together. It helps us to do a scope statement at the beginning of a project to corral what the area is that the data is going to be using.”

Data Governance with erwin Data Intelligence

IT Central Station members are seeing benefits from using erwin Data Intelligence (erwin DI) for data governance use cases. For Rick D., a data architect at NAMM, a small healthcare company, erwin DI “enabled us to centralize a tremendous amount of data into a common standard, and uniform reporting has decreased report requests.”

As a medical company, they receive data from 17 different health plans. Before adopting erwin DI, they didn’t have a centralized data dictionary of their data. The benefit of data governance, as he saw it, was that “everybody in our organization knows what we are talking about. Whether it is an institutional claim, a professional claim, Blue Cross or Blue Shield, health plan payer, group titles, names, etc.”

A solution architect at a pharma/biotech company with more than 10,000 employees used erwin DI for metadata management, versioning of metadata and metadata mappings and automation. In his experience, applying governance to metadata and creating mappings has helped different stakeholders gain a good understanding of the data they use to do their work.

Sharon A. had a comparable use case. She said, “You can map the business understanding in your glossary back to your physical so you can see it both ways. With erwin DI, I can have a full library of physical data there or logical data sets, publish it out through the portal, and then the business can do self-service. The DBAs can use it for all different types of value-add from their side of the house. They have the ability to see particular aspects, such as PII, and there are some neat reports which show that. They are able to manage who can look at these different pieces of information.”

For more real erwin user experiences, visit IT Central Station.


How to Do Data Modeling the Right Way

Data modeling supports collaboration among business stakeholders – with different job roles and skills – and coordinates that work with business objectives.

Data resides everywhere in a business, on premises and in private or public clouds. And it exists across these hybrid architectures in different formats: big, unstructured data and traditional, structured business data may physically sit in different places.

What’s desperately needed is a way to understand, in detail, the relationships and interconnections among the many entities in these data sets.

Visualizing data from anywhere – defined by its context and meaning in a central model repository, along with the rules for governing the use of those data elements – unifies enterprise data management. A single source of data truth helps companies begin to leverage data as a strategic asset.

What, then, should users look for in a data modeling product to support their governance/intelligence requirements in the data-driven enterprise?


Nine Steps to Data Modeling

  1. Provide metadata and schema visualization regardless of where data is stored

Data modeling solutions need to account for metadata and schema visualization to mitigate complexity and increase collaboration and literacy across a broad range of data stakeholders. They should automatically generate data models, providing a simple, graphical display to visualize a wide range of enterprise data sources based on a common repository of standard data assets through a single interface.

  2. Have a process and mechanism to capture, document and integrate business and semantic metadata for data sources

As the best way to view metadata to support data governance and intelligence, data models can depict the metadata content for a data catalog. A data modeling solution should make it possible for business and semantic metadata to be created to augment physical data for ingestion into a data catalog, which provides a mechanism for IT and business users to make use of the metadata or data structures underpinning source systems.

High-functioning data catalogs will provide a technical view of information flow as well as deeper insights into semantic lineage – that is, how the data asset metadata maps to corresponding business usage tables.

Data stewards can associate business glossary terms, data element definitions, data models and other semantic details with different mappings, drawing upon visualizations that demonstrate where business terms are in use, how they are mapped to different data elements in different systems and the relationships among these different usage points.
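
As a small, hypothetical sketch of that association (the term, definition and column names below are invented), a glossary entry can simply carry pointers to every physical element that implements it:

```python
# A business glossary term tied to the physical columns that implement it.
glossary = {
    "Customer Name": {
        "definition": "The legal name of a customer of record.",
        "mapped_columns": ["crm.customer.name", "billing.account.customer_name"],
    }
}

term = glossary["Customer Name"]
print('"Customer Name" is implemented in:', ", ".join(term["mapped_columns"]))
```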

  3. Create database designs from visual models

Time is saved and errors are reduced when visual data models can be used to translate the high-quality data sources that populate them into new relational and non-relational database designs, and to drive their standardization, deployment and maintenance.
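
A minimal sketch of that forward-engineering step, assuming a toy model format and SQL type names chosen purely for illustration:

```python
# A logical model: entity name -> list of (attribute, physical type, is_primary_key).
model = {
    "Customer": [("customer_id", "INTEGER", True), ("name", "VARCHAR(100)", False)],
    "Address":  [("address_id", "INTEGER", True), ("customer_id", "INTEGER", False)],
}

def to_ddl(entity: str, attributes: list) -> str:
    """Render one entity of the model as a CREATE TABLE statement."""
    cols = ",\n  ".join(
        f"{name} {dtype}{' PRIMARY KEY' if pk else ''}"
        for name, dtype, pk in attributes
    )
    return f"CREATE TABLE {entity.lower()} (\n  {cols}\n);"

for entity, attributes in model.items():
    print(to_ddl(entity, attributes))
```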

  4. Reverse engineer databases into data models

Ideally, a solution will let users create a logical and physical data model by adroitly extracting information from an existing data source – ERP, CRM or other enterprise application – and choosing the objects to use in the model.

This can be employed to translate the technical formats of the major database platforms into detailed physical entity-relationship models, rich in business and semantic metadata, that visualize and diagram complex database objects.

Database code reverse-engineering, integrated development environment connections and model exchange will ensure efficiency, effectiveness and consistency in the design, standardization, documentation and deployment of data structures for comprehensive enterprise database management. Also helpful is if the offline reverse-engineering process is automated so that modelers can focus on other high-value tasks.
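
To illustrate the reverse direction, here’s a minimal sketch that reads catalog metadata from a live database back into a model structure. SQLite is used only because it ships with Python; a real modeling tool would speak to Oracle, SQL Server and the rest:

```python
import sqlite3

# Stand-in for an existing database we want to reverse engineer.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER REFERENCES customer);
""")

# Walk the catalog and rebuild a model: table -> (column, type, is_primary_key).
model = {}
for (table,) in con.execute("SELECT name FROM sqlite_master WHERE type = 'table'").fetchall():
    # PRAGMA table_info returns (cid, name, type, notnull, default, pk) per column.
    columns = con.execute(f"PRAGMA table_info({table})").fetchall()
    model[table] = [(col[1], col[2], bool(col[5])) for col in columns]

print(model)
```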

  5. Harness model reusability and design standards

When data modelers can take advantage of intuitive graphical interfaces, they’ll have an easier time viewing data from anywhere, in the context of its meaning and relationships, and supporting artifact reuse for large-scale data integration, master data management, big data and business intelligence/analytics initiatives.

It’s typically the case that modelers will want to create models containing reusable objects such as modeling templates, entities, tables, domains, automation macros, naming and database standards, formatting options, and so on.

The ability to modify the way data types are mapped for specific DBMS data types and to create reusable design standards across the business should be fostered through customizable functionality. Reuse serves to help lower the costs of development and maintenance and ensure data quality for governance requirements.

Additionally, templates should be available to help enable standardization and reuse while accelerating the development and maintenance of models. Standardization and reuse of models across data management environments will be possible when there is support for model exchange.

Consistency and reuse are more efficient when model development and assets are centralized. That makes it easier to publish models across various stakeholders and incorporate comments and changes from them as necessary.

  6. Enable user configuration and point-and-click report interfaces

A key part of data modeling is to create text-based reports for diagrams and metadata in a number of formats – HTML, PDF and CSV. By taking the approach of using point-and-click interfaces, a solution can make it easier to create detailed metadata reports of models and drill down into granular graphical views of reports that are inclusive of object types – tables, UDPs and more.

The process is even simpler when users can take advantage of out-of-the-box reports that are pertinent to their needs, as well as create reports for individual models or across multiple models.

When generic ODBC interfaces are included, options grow for querying metadata, regardless of where it is sourced, from a variety of tools and interfaces.
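
A tiny sketch of the reporting idea, reusing the toy model shape from the earlier examples and flattening it to CSV, the simplest of the formats mentioned above:

```python
import csv
import io

# The same toy model shape used in the earlier sketches.
model = {
    "customer": [("customer_id", "INTEGER", True), ("name", "TEXT", False)],
}

# Flatten the model metadata into a CSV report anyone can open in a spreadsheet.
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["table", "column", "type", "primary_key"])
for table, columns in model.items():
    for name, dtype, pk in columns:
        writer.writerow([table, name, dtype, pk])

print(buffer.getvalue())
```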

  7. Support an all-inclusive environment of collaboration

When solutions focus on model management in a centralized repository, they empower modular, bidirectional collaboration services across all data generators – human or machine – as well as stewards and consumers across the enterprise.

Data silos, of course, are the enemies of data governance. They make it difficult to have a clear understanding of where information resides and how data is commonly defined.

It’s far better to centralize and manage access to ordered assets – whether by particular internal staff roles or to business partners granted role-based and read-only access – to maintain security.

Such an approach supports coordinated version control, model change management and conflict resolution and seeds cross-model impact analysis across stakeholders. Modeler productivity and independence can be enhanced, too.

  8. Promote data literacy

Stakeholder collaboration, in fact, depends on and is optimized by data literacy, the key to creating an organization that is fluent in the language of data. Everyone in the enterprise – from data scientists to ETL developers to compliance officers to C-level executives – ought to be assured of having a dynamic view of high-quality data pipelines operating on common and standardized terms.

So, it is critical that solutions focus on making the pipeline data available and discoverable in such a way that it reflects different user roles. When consumers can view data relevant to their roles and understand its definition within the business context in which they operate, their ability to produce accurate, actionable insights and collaborate across the enterprise to enact them for the desired outcomes is enhanced.

Data literacy built on business glossaries – which enable the collaborative definition of enterprise data in business terms and rules, with built-in accountability and workflow – promotes adherence to governance requirements.

  9. Embed data governance constructs within data models

Data governance should be integrated throughout the data modeling process. It manifests in a solution’s ability to adroitly discover and document any data from anywhere for consistency, clarity and artifact reuse across large-scale data integration, master data management, metadata management and big data requirements.

Data catalogs and business glossaries with properly defined data definitions in a controlled central repository are the result of ingesting metadata from data models for business intelligence and analytics initiatives.

You Don’t Know What You’ve Got

Bottom line, without centralized data models and a metadata hub, there is no efficient means to comply with industry regulations and business standards regarding security and privacy; set permissions for access controls; and consolidate information in easy-to-understand reports for business analysts.

Participating in data modeling to classify the data most important to the business – in terms meaningful to the business – and to break down complex data organization scenarios supports critical business reporting, intelligence and analytics tasks. That’s a clear need, as organizations today analyze and use less than 0.5 percent of the information they take in – a huge loss of potential value in the age of data-driven business.

Without illustrative data models, businesses may not even realize they already have the data needed for a new report, so time is lost and costs increase as data is gathered and interfaces are rebuilt.

To learn more about data modeling and its role in the enterprise, join us for our upcoming webinar, Data Modeling Is Cool, Seriously.



Modern Data Modeling: The Foundation of Enterprise Data Management and Data Governance

The role of data modeling (DM) has expanded to support enterprise data management, including data governance and intelligence efforts. After all, you can’t manage or govern what you can’t see, much less use it to make smart decisions.

Metadata management is the key to managing and governing your data and drawing intelligence from it. Beyond harvesting and cataloging metadata, that metadata also must be visualized to break down the complexity of how data is organized and related, so that its meaning is explicit to all stakeholders in the data value chain.


Data models provide this visualization capability, create additional metadata and standardize the data design across the enterprise.

While modeling has always been the best way to understand complex data sources and automate design standards, modern data modeling goes well beyond these domains to ensure and accelerate the overall success of data governance in any organization.

You can’t overestimate the importance of getting this right, as data governance keeps the business in line with privacy mandates such as the General Data Protection Regulation (GDPR). It drives innovation, too. Companies that want to advance AI initiatives, for instance, won’t get very far without quality data and well-defined data models.

Why Is Data Modeling the Building Block of Enterprise Data Management?

DM mitigates complexity and increases collaboration and literacy across a broad range of data stakeholders.

  • DM uncovers the connections between disparate data elements.

The DM process enables the creation and integration of business and semantic metadata to augment and accelerate data governance and intelligence efforts.

  • DM captures and shares how the business describes and uses data.

DM delivers design task automation and enforcement to ensure data integrity.

  • DM builds higher quality data sources with the appropriate structural veracity.

DM delivers design task standardization to improve business alignment and simplify integration.

  • DM builds a more agile and governable data architecture.

The DM process manages the design and maintenance lifecycle for data sources.

  • DM governs the design and deployment of data across the enterprise.

DM documents, standardizes and aligns any type of data no matter where it lives. 

Realizing the Data Governance Value from Data Modeling

Modeling becomes the point of true collaboration within an organization because it delivers a visual source of truth for everyone – data management and business professionals alike – to follow in conforming to governance requirements.

Information is readily available within intuitive business glossaries, accessible to user roles according to parameters set by the business. The metadata repository behind these glossaries, populated by information stored in data models, serves up the key terms that are understandable and meaningful to every party in the enterprise.

The stage, then, is equally set for improved data intelligence, because stakeholders now can use, understand and trust relevant data to enhance decision-making across the enterprise.

The enterprise is coming to the point where both business and IT co-own data modeling processes and data models. Business analysts and other power users start to understand data complexities because they can grasp terms and contribute to making the data in their organization accurate and complete, and modeling grows in importance in the eyes of business users.

Bringing data to the business and making it easy to access and understand increases the value of data assets, providing a return on investment and a return on opportunity. But neither would be possible without data modeling providing the backbone for metadata management and proper data governance.

For more information, check out our whitepaper, Drive Business Value and Underpin Data Governance with an Enterprise Data Model.

You also can take erwin DM, the world’s No. 1 data modeling software, for a free spin.



Data Modeling Best Practices for Data-Driven Organizations

As data-driven business becomes increasingly prominent, an understanding of data modeling and data modeling best practices is crucial. This post outlines just that, along with other key questions related to data modeling, such as “SQL vs. NoSQL.”

What is Data Modeling?

Data modeling is a process that enables organizations to discover, design, visualize, standardize and deploy high-quality data assets through an intuitive, graphical interface.

Data models provide visualization, create additional metadata and standardize data design across the enterprise.

As the value of data and the way it is used by organizations has changed over the years, so too has data modeling.

In the modern context, data modeling is a function of data governance.

While data modeling has always been the best way to understand complex data sources and automate design standards, modern data modeling goes well beyond these domains to accelerate and ensure the overall success of data governance in any organization.


As well as keeping the business in compliance with data regulations, data governance – and data modeling – also drive innovation.

Companies that want to advance artificial intelligence (AI) initiatives, for instance, won’t get very far without quality data and well-defined data models.

With the right approach, data modeling promotes greater cohesion and success in organizations’ data strategies.

But what is the right data modeling approach?


Data Modeling Best Practices

The right approach to data modeling is one in which organizations can make the right data available at the right time to the right people. Otherwise, data-driven initiatives can stall.

Thanks to organizations like Amazon, Netflix and Uber, businesses have changed how they leverage their data and are transforming their business models to innovate – or risk becoming obsolete.

According to a 2018 survey by Tech Pro Research, 70 percent of respondents said their companies either have a digital transformation strategy in place or are working on one. And 60 percent of companies that have undertaken digital transformation have created new business models.

But data-driven business success doesn’t happen by accident. Organizations that adopt that strategy without the necessary processes, platforms and solutions quickly realize that data creates a lot of noise but not necessarily the right insights.

This phenomenon is perhaps best articulated through the lens of the “three Vs” of data: volume, variety and velocity.


Any2 Data Modeling and Navigating Data Chaos

The three Vs describe the volume (amount), variety (type) and velocity (speed at which it must be processed) of data.

Data’s value grows with context, and that context is itself largely found in other data. That means there’s an incentive to generate and store ever-higher volumes of data.

Typically, an increase in the volume of data leads to more data sources and types. And higher volumes and varieties of data become increasingly difficult to manage in a way that provides insight.

Without due diligence, the above factors can lead to a chaotic environment for data-driven organizations.

Therefore, the data modeling best practice is one that allows users to view any data from anywhere – a data governance and management best practice we dub “any-squared” (Any2).

Organizations that adopt the Any2 approach can expect greater consistency, clarity and artifact reuse across large-scale data integration, master data management, metadata management, Big Data and business intelligence/analytics initiatives.

SQL or NoSQL? The Advantages of NoSQL Data Modeling

For the most part, traditional databases use Structured Query Language (SQL) for maintaining and manipulating data. This structured approach and its proficiency in handling complex queries have led to its widespread use.

But despite the advantages of such structure, its inherently sequential nature (“this,” then “this”) means it can be hard to operate holistically and deal with large amounts of data at once.

Additionally, as alluded to earlier, the nature of modern, data-driven business and the three Vs means organizations are dealing with increasing amounts of unstructured data.

As such, in a modern business context, the three Vs have become somewhat of an Achilles’ heel for SQL databases.

The sheer rate at which businesses collect and store data – as well as the various types of data stored – means organizations have to adapt and adopt databases that can be maintained with greater agility.

That’s where NoSQL comes in.

Benefits of NoSQL

Despite what many might assume, adopting a NoSQL database doesn’t mean abandoning SQL databases altogether. In fact, NoSQL is actually a contraction of “not only SQL.”

The NoSQL approach builds on the traditional SQL approach, bringing old (but still relevant) ideas in line with modern needs.

NoSQL databases are scalable, promote greater agility, and handle changes to data and the storing of new data more easily.

They’re better at dealing with non-relational data, too. NoSQL supports JavaScript Object Notation (JSON), log messages, XML and unstructured documents.
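
One small illustration of that agility (field names invented for the example): when a new attribute appears, new records simply carry it and readers treat it as optional – no ALTER TABLE, no migration window.

```python
# Two generations of the same event type living side by side in one store.
v1 = {"event": "login", "user": "alice"}
v2 = {"event": "login", "user": "bob", "mfa_method": "totp"}  # new field, no migration

for record in (v1, v2):
    # Readers tolerate the older shape by treating the new field as optional.
    print(record["user"], "->", record.get("mfa_method", "no MFA recorded"))
```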

Data Modeling Is Different for Every Organization

It perhaps goes without saying, but different organizations have different needs.

For some, the legacy approach to databases meets the needs of their current data strategy and maturity level.

For others, the greater flexibility offered by NoSQL databases makes NoSQL databases, and by extension NoSQL data modeling, a necessity.

Some organizations may require an approach to data modeling that promotes collaboration.

Bringing data to the business and making it easy to access and understand increases the value of data assets, providing a return-on-investment and a return-on-opportunity. But neither would be possible without data modeling providing the backbone for metadata management and proper data governance.

Whatever the data modeling need, erwin can help you address it.

erwin DM is available in several versions, including erwin DM NoSQL, with additional options to improve the quality and agility of data capabilities.

And we just announced a new version of erwin DM with a modern and customizable modeling environment, support for Amazon Redshift, updated support for the latest DB2 releases, time-saving modeling task automation, and more.

New to erwin DM? You can try the new erwin Data Modeler for yourself for free!
