
Integrating SQL and NoSQL into Data Modeling for Greater Business Value: The Latest Release of erwin Data Modeler


Due to the prevalence of internal and external market disruptors, many organizations are aligning their digital transformation and cloud migration efforts with other strategic requirements (e.g., compliance with the General Data Protection Regulation).

Accelerating the retrieval and analysis of data, so much of it unstructured, is vital to becoming a data-driven business that can effectively respond in real time to customers, partners, suppliers and other parties, and profit from these efforts. But even though speed is critical, businesses must take the time to model and document new applications for compliance and transparency.

For decades, data modeling has been the optimal way to design and deploy new relational databases with high-quality data sources and support application development. It facilitates communication between the business and system developers so stakeholders can understand the structure and meaning of enterprise data within a given context. Today, it provides even greater value because critical data exists in both structured and unstructured formats and lives both on premises and in the cloud.

Comparing SQL and NoSQL

While it may not be the most exciting matchup, there’s much to be said when comparing SQL and NoSQL databases. SQL databases rely on schemas and pre-defined tables; NoSQL databases take the opposite approach. Instead of fixed schemas and tables, they store data in structures that vary with the type of NoSQL database in use, such as documents, key-value pairs, wide columns or graphs.
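
To make the contrast concrete, here is a minimal sketch in Python (using the built-in sqlite3 module for the relational side and plain dictionaries as stand-ins for documents; the table and field names are invented for illustration):

```python
import sqlite3

# SQL: the schema is fixed up front; every row must fit the table definition.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
conn.execute("INSERT INTO customer (name, email) VALUES (?, ?)",
             ("Ada", "ada@example.com"))
# Capturing a new attribute later means altering the schema first:
conn.execute("ALTER TABLE customer ADD COLUMN loyalty_tier TEXT")

# Document-style NoSQL: each record carries its own structure, so two
# "customer" documents in the same collection can have different fields.
customers = [
    {"name": "Ada", "email": "ada@example.com"},
    {"name": "Grace", "email": "grace@example.com",
     "loyalty_tier": "gold", "preferences": {"newsletter": True}},
]
```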

While the SQL and NoSQL worlds can complement each other in today’s data ecosystem, most enterprises need to focus on building expertise and processes for the latter format.

After all, they’ve already had decades of practice designing and managing SQL databases that emphasize storage efficiency and referential integrity rather than fast data access, which is so important to building cloud applications that deliver real-time value to staff, customers and other parties. Query-optimized modeling is the new watchword when it comes to supporting today’s fast-delivery, iterative and real-time applications.
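
To illustrate the shift, here is a hedged sketch (the schema and names are invented): a normalized relational design assembles an order for display with a join at query time, while a query-optimized document is shaped around that exact access pattern so the application reads a single record.

```python
# Normalized (storage-efficient, referentially intact): line items live in
# their own table and are joined back to the order at query time, e.g.:
#   SELECT o.id, o.customer_id, i.sku, i.qty
#     FROM orders o JOIN order_items i ON i.order_id = o.id
#    WHERE o.id = 42;

# Query-optimized (denormalized): the document is shaped around the access
# pattern "fetch an order with everything needed to display it" -- one read,
# no join, at the cost of some duplicated data.
order_doc = {
    "_id": 42,
    "customer": {"id": 7, "name": "Ada"},  # copied from the customer record
    "items": [
        {"sku": "A-100", "qty": 2, "price": 9.99},
        {"sku": "B-200", "qty": 1, "price": 24.50},
    ],
    "total": 44.48,  # precomputed: 2 * 9.99 + 24.50
}
```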

DBMS products based on rigid schema requirements impede our ability to fully realize business opportunities that can expand the depth and breadth of relevant data streams for conversion into actionable information. New, business-transforming use cases often involve variable data feeds, real-time or near-real-time processing and analytics requirements, and the scale to process large volumes of data.

NoSQL databases, such as Couchbase and MongoDB, are purpose-built to handle the variety, velocity and volume of these new data use cases. Schema-less or dynamic schema capabilities, combined with increased processing speed and built-in scalability, make NoSQL the ideal platform.
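
As a concrete taste of that dynamic-schema capability, here is a minimal sketch using MongoDB’s official Python driver, pymongo (it assumes a MongoDB instance on the default local port; the database, collection and field names are invented):

```python
from pymongo import MongoClient  # pip install pymongo

# Assumes a MongoDB instance on the default local port.
client = MongoClient("mongodb://localhost:27017")
customers = client["shop"]["customers"]

# No CREATE TABLE and no ALTER TABLE: documents of different shapes can be
# written to the same collection as the data feed evolves.
customers.insert_many([
    {"name": "Ada", "email": "ada@example.com"},
    {"name": "Grace", "signup_channel": "mobile",
     "devices": [{"os": "iOS", "last_seen": "2021-03-01"}]},
])
```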

Making the Move to NoSQL

Now the hard part. Once we’ve agreed to make the move to NoSQL, the next step is to identify the architectural and technological implications facing the folks tasked with building and maintaining these new mission-critical data sources and the applications they feed.

As the data modeling industry leader, erwin has identified a critical success factor for the majority of organizations adopting a NoSQL platform such as Couchbase, Cassandra or MongoDB: successfully leveraging such a platform requires a significant paradigm shift in how we design NoSQL data structures and deploy the databases that manage them.

But as with most technology requirements, we need to shield the business from the complexity and risk associated with this new approach. The business cares little for the technical distinctions of the underlying data management “black box.”

Business data is business data, with the main concerns being its veracity and value. Accountability, transparency, quality and reusability are required regardless. Data needs to be trusted so decisions can be made with confidence, based on facts. We need to embrace this paradigm shift while ensuring it fits seamlessly into our existing data management practices, as well as interactions with our partners within the business. Therefore, the challenge of adopting NoSQL in an organization is twofold: 1) mastering and managing this new technology and 2) integrating it into an expansive and complex infrastructure.

The Newest Release of erwin Data Modeler

There’s a reason erwin Data Modeler is the No. 1 data modeling solution in the world.

And the newest release delivers all-in-one SQL and NoSQL data modeling, guided denormalization and model-driven engineering support for Couchbase, Cassandra, MongoDB, JSON and AVRO. NoSQL users get all of the great capabilities inherent in erwin Data Modeler, plus Data Vault modeling, enhanced productivity and simplified administration of the data modeling repository.

Now you can rely on one solution for all your enterprise data modeling needs, working across DBMS platforms, using modern modeling techniques for faster data value, and centrally governing all data definition, data modeling and database design initiatives.

erwin data models reduce complexity, making it easier to design, deploy and understand data sources to meet business needs. erwin Data Modeler also automates and standardizes model design tasks, including complex queries, to improve business alignment, ensure data integrity and simplify integration.

In addition to the above, the newest release of erwin Data Modeler by Quest also provides:

  • Updated support and certifications for the latest versions of Oracle, MS SQL Server, MS Azure SQL and MS Azure SQL Synapse
  • JDBC-connectivity options for Oracle, MS SQL Server, MS Azure SQL, Snowflake, Couchbase, Cassandra and MongoDB
  • Enhanced administration capabilities to simplify and accelerate data model access, collaboration, governance and reuse
  • New automation, connectivity, UI and workflow optimization to enhance data modeler productivity by reducing onerous manual tasks

erwin Data Modeler is a proven technology for improving the quality and agility of an organization’s overall data capability – and that includes data governance and data intelligence.

Click here for your free trial of erwin Data Modeler.


Cloud Migration and the Importance of Data Governance

Tackling data-related challenges to keep cloud migration projects on track and optimized

By Wendy Petty 

The cloud has many operational and competitive advantages, so cloud-first and other cloud transformation initiatives continue to be among the top data projects organizations are pursuing.

For many of those yet to adopt and adapt, it is a case of “when” not “if” the enterprise will undergo a form of digital transformation requiring data migration to the cloud.

Due to today’s prevalence of internal and external market disruptors, many organizations are aligning their digital transformation and cloud migration efforts with other strategic requirements (e.g., compliance with the General Data Protection Regulation).

And now organizations must navigate a post-COVID world, which is forcing them to fast-track their cloud migrations to become more agile, lean and focused on business outcomes that will enable the business to survive and then thrive amid new market dynamics.

However, cloud migration is not just a lift and shift, a one-off or a silver bullet. Usually when organizations move from an on-premises environment to a cloud environment, they are actually converting between two different technologies. And as you migrate to the cloud, you need to keep some data-related challenges in mind.


Dollars and Cents

For 47 percent of enterprise companies, cost optimization is the main reason they migrate to the cloud. However, cloud migrations can be expensive, with costs piling up the longer a migration takes to complete.

Not only are cloud migrations generally expensive, but many companies don’t budget for them appropriately. In 2020, companies went over their public cloud spend budget by an average of 23 percent. Most likely, this comes down to a lack of planning, leading to long, drawn-out migrations and ill-informed product decisions. Additionally, completely manual migrations generally take longer and cost more than those that employ automation.

In terms of budget and cost, automated tools help by scanning the repositories in your environment and adding structure and business context (where the data resides, who can access it, etc.) to the transformation of legacy structures. The new structures then enable new capabilities for your data and business processes.

Automated tools can help you lower risks and costs and reduce the time it takes to realize value. Automated software handles data cataloging and locates, models and governs cloud data assets.

Tools that help IT organizations plan and execute their cloud migrations aren’t difficult to find. Many large cloud providers offer tools to help ease the migration to their platform. But a technology-agnostic approach to such tools adds value to cloud migration projects.

Proprietary tools from cloud vendors funnel clients into a single preferred environment. Agnostic tools, on the other hand, help organizations understand which cloud environment is best for them. Their goal is to identify the cloud platform and strategy that will deliver the most value after taking budget and feature requirements into account.

Institutional Knowledge

Institutional knowledge is another obstacle many companies face when exploring cloud migrations. People leave the organization and take with them an understanding of how and why things are done. Because of this, you may not know what data you have or how you should be using it.

The challenge comes when it’s time to migrate; you need to understand what you have, how it’s used, what its value is, and what should be migrated. Otherwise, you may spend time and money migrating data, only to discover that no one has touched it in several years and it wasn’t necessary for you to retain it.

In addition, if you’re planning to use a multi-cloud approach, you need to ensure that the clouds you work with are compatible. Only 24 percent of IT organizations have a high degree of interoperability between their cloud environments. This means that more than three-quarters suffer from inefficient cloud setups and can’t readily combine or analyze data from multiple cloud environments.

Data Governance

Migrating enterprise data to the cloud is only half the story – once there, it has to be governed. That means your cloud data assets must be available for use by the right people for the right purposes to maximize their security, quality and value.

Around 60 percent of enterprises worry about regulatory issues, governance and compliance with cloud services. The difficulty comes with creating good governance around data while avoiding risk and getting more out of that data. More than three-quarters (79 percent) of businesses are looking for better integrated security and governance for the data they put in the cloud.

Cloud migration provides a unique opportunity not simply to move things as they are to the cloud but also to make strategic changes. Companies are using the move to the cloud to make data governance a priority and show their customers they are good data stewards.

Unfortunately, 72 percent of companies state that deciding which workloads they should migrate to the cloud is one of their top four hurdles to cloud implementation. However, cloud migration is not an endpoint; it’s just the next step in making your business flexible and agile for the long term.

Determining which data sets need to be migrated can help you prepare for growth in the long run. The degree of governance each set of data needs will help determine what you should migrate and what you should keep in place.

Automated Cloud Migration and Data Governance

The preceding list of cloud migration challenges might seem daunting, especially for an organization that collects and manages a great deal of data. When enterprises face the prospect of manual, cumbersome work related to their business processes, IT infrastructure, and more, they often turn to automation.

You can apply the same idea to your cloud migration strategy because automated software tools can aid in the planning and heavy lifting of cloud migrations. As such, they should be considered when it comes to choosing platforms, forecasting costs, and understanding the value of the data being considered for migration.

erwin Cloud Catalyst is a suite of automated cloud migration and data governance software and services to simplify and accelerate the move to cloud platforms and govern those data assets throughout their lifecycle. Automation is a critical differentiator for erwin’s cloud migration and data governance tools.

Key Benefits of erwin Cloud Catalyst:

  • Cost Mitigation: Automated tools scan repositories in your environment and add structure and business context (where it is, who can access it, etc.) in the transformation of legacy structures.
  • Reduced Risk and Faster Time to Value: Automated tools can help you reduce risks, costs and the time it takes to realize value.
  • Tech-Agnostic: Technology-agnostic approach adds value to cloud migration projects.
  • Any Cloud to Any Cloud: Automatically gathering the abstracted essence of the data will make it easier to point that information at another cloud platform or technology if, or likely when, you migrate again.
  • Institutional Knowledge Retention: Collect and retain institutional knowledge around data and enable transparency.
  • Continuous Data Governance: Automation helps IT organizations address data governance during cloud migrations and then for the rest of the cloud data lifecycle and minimizes human intervention.

Every customer’s environment and data are unique. That’s why the first step is working with you to assess your cloud migration strategy. Then we deliver an automation roadmap and design the appropriate smart data connectors to help your IT services team achieve your future-state architecture, including accelerating data ingestion and ETL conversion.

To get started, request your cloud-readiness assessment.

And here’s a video with some more information about our approach to cloud migration and data governance.



erwin and Snowflake Partnership: Helping Our Customers Manage and Govern the Entire Data Lifecycle

In my role as chief sales officer, I am fortunate to spend my time with the industry’s most passionate and committed customers. In these highly competitive enterprises and in the new post-COVID era of business, erwin’s customers are laser-focused on helping their businesses enhance their operations through the application of data truths: insights that help them improve everything — from how they serve their customers to increasing their competitive edge to delivering new products and services to meet the demand of new digital paradigms.

That’s why I’m so excited to announce our new partnership with Snowflake. Our customers are in search of creative and sustainable ways to increase their speed to insights for digital transformation, infrastructure modernization and cloud migration, and many of them are looking to implement the Snowflake Cloud Data Platform.

It’s designed with a patented new architecture to be the centerpiece for data pipelines, data warehousing, data lakes, data application development, and for building data exchanges to easily and securely share governed data.

With our new partnership, we can now help our customers manage and govern the entire Snowflake data lifecycle, speed transformation of legacy systems to Snowflake and automatically ingest, catalog and govern the data in these scalable, high-performance cloud data stores.

The native erwin DM integration lets customers automate the creation of Snowflake-specific data models; forward-engineer or generate code for Snowflake database schema; reverse-engineer existing Snowflake schema into erwin models; and compare, analyze and synchronize Snowflake models with the databases they represent.
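
The forward-engineering itself happens inside erwin DM, but as a rough sketch of what applying such generated DDL to Snowflake looks like from code, here is an example using Snowflake’s official Python connector (snowflake-connector-python); the table definition and connection details are placeholders, not actual erwin output:

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder DDL standing in for a script forward-engineered from a model.
ddl = """
CREATE TABLE IF NOT EXISTS customer (
    customer_id NUMBER AUTOINCREMENT PRIMARY KEY,
    name        VARCHAR(200) NOT NULL,
    email       VARCHAR(254),
    created_at  TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
)
"""

conn = snowflake.connector.connect(
    account="my_account",  # placeholder account identifier
    user="my_user",
    password="...",
    warehouse="MY_WH",
    database="MY_DB",
    schema="PUBLIC",
)
try:
    conn.cursor().execute(ddl)
finally:
    conn.close()
```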

The erwin Data Connector for Snowflake automatically scans and ingests metadata from Snowflake platforms into erwin DI, enabling data mapping to and from Snowflake databases to generate data movement code, lineage and impact analysis. And because erwin DM and erwin DI are integrated, there’s a complete picture of physical, semantic and business metadata in every Snowflake instance, and the creation and association of terms within the business glossary can be accelerated.
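
The erwin connector performs this scan itself; purely to illustrate the kind of physical metadata such a scan collects, here is a sketch that queries Snowflake’s standard INFORMATION_SCHEMA views through the same Python connector (connection details are placeholders, as above):

```python
import snowflake.connector  # pip install snowflake-connector-python

# Connection details are placeholders, as in the previous sketch.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="MY_WH", database="MY_DB", schema="PUBLIC",
)
cur = conn.cursor()
# INFORMATION_SCHEMA exposes the physical metadata a catalog scan collects.
cur.execute("""
    SELECT table_name, column_name, data_type, is_nullable
      FROM information_schema.columns
     WHERE table_schema = 'PUBLIC'
     ORDER BY table_name, ordinal_position
""")
for table, column, dtype, nullable in cur.fetchall():
    print(f"{table}.{column}: {dtype} (nullable={nullable})")
conn.close()
```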

Sounds like a match made in heaven? Well, we think so. Let me know your thoughts on the new erwin/Snowflake partnership. Drop me a line.
