erwin Expert Blog

Why the NoSQL Database is a Necessary Step

The NoSQL database is gaining huge traction, and for good reason.

Traditionally, most organizations have leveraged relational databases to manage their data. Relational databases ensure referential integrity, constraints, normalization and structured access for data across disparate tools, which is why they’re so widely used.

But as with any technology, evolving trends and requirements eventually push the limits of capability and suitability for emerging business use cases.

New data sources, characterized by increased volume, variety and velocity, have exposed limitations in the strict relational approach to managing data. These characteristics require a more flexible approach to the storage and provisioning of data assets, one that can support these new forms of data with the agility and scalability they demand.

Technology – specifically data – has changed the way organizations operate. Lower development costs are allowing start-ups and smaller businesses to grow far quicker. In turn, this leads to less stable markets and more frequent disruptions.

As more and more organizations look to cut their own slice of the data pie, businesses are more focused on in-house development than ever.

This is where relational data modeling becomes somewhat of a stumbling block.

Rise of the NoSQL Database

More and more, application developers are turning to the NoSQL database.

The NoSQL database is a more flexible approach that enables increased agility in development teams. Data models can be evolved on the fly to account for changing application requirements.

This enables businesses to adopt an agile approach to releasing new iterations of code. NoSQL databases are scalable and object-oriented, and can also handle large volumes of structured, semi-structured and unstructured data.
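To make that flexibility concrete, here’s a minimal sketch, with plain Python dictionaries standing in for MongoDB-style documents (the collection and field names are purely illustrative):

```python
# Documents in one NoSQL collection need not share a fixed schema.
# A relational table would force every row into identical columns;
# here, each document carries only the fields it needs.
customers = [
    {"_id": 1, "name": "Acme Corp", "industry": "manufacturing"},
    # A later application release adds a nested field -- no ALTER TABLE needed.
    {"_id": 2, "name": "Globex", "industry": "retail",
     "social": {"twitter": "@globex"}},
]

def industries(docs):
    """The read side simply tolerates the varying document shapes."""
    return {doc.get("industry", "unknown") for doc in docs}

print(industries(customers))
```

The read side tolerating varying shapes is what lets the data model evolve alongside the application, rather than gating every release on a schema migration.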

Due to the growing deployment of NoSQL, and the fact that our customers need to manage NoSQL databases with the same tools as their relational databases, erwin is excited to announce the availability of a beta program for our new erwin DM for NoSQL product.

With our new erwin DM NoSQL option, we’re the only provider to help you model, govern and manage your unstructured cloud data just like any other traditional database in your business.

  • Building new cloud-based apps running on MongoDB?
  • Migrating from a relational database to MongoDB or the reverse?
  • Want to ensure that all your data is governed by a logical enterprise model, no matter where it’s located?

Then erwin DM NoSQL is the right solution for you. Click here to apply for our erwin DM NoSQL/MongoDB beta program now.

And look for more info here on the power and potential of NoSQL databases in the coming weeks.



GDPR guide: The role of the Data Protection Officer

Over the past few weeks we’ve been exploring aspects related to the new EU data protection law (GDPR), which will come into effect in 2018.


Managing Any Data, Anywhere with Any2

The amount of data in the world is staggering. And as more and more organizations adopt digitally oriented business strategies, the total keeps climbing. Modern organizations need to be equipped to manage Any2 – any data, anywhere.

Analysts predict that the total amount of data in the world will reach 44 zettabytes by 2020 – at one trillion gigabytes per zettabyte, that’s 44 trillion gigabytes. That’s an incredible feat in and of itself. But considering that the total had only reached 4.4 zettabytes in 2013, the rate at which data is collected and stored becomes even more astonishing.
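As a quick back-of-the-envelope check of those figures (assuming decimal units, where one zettabyte is 10^21 bytes, i.e. one trillion gigabytes):

```python
ZB_IN_GB = 10**12  # one zettabyte is a trillion gigabytes (decimal units)

total_2020_gb = 44 * ZB_IN_GB    # predicted total for 2020: 44 ZB
total_2013_gb = 4.4 * ZB_IN_GB   # reported total for 2013: 4.4 ZB

growth_factor = total_2020_gb / total_2013_gb  # tenfold growth in seven years
annual_rate = growth_factor ** (1 / 7) - 1     # implied compound annual growth

print(f"{total_2020_gb:.0e} GB total, ~{annual_rate:.0%} growth per year")
```

A tenfold increase over seven years works out to roughly 39% compound growth per year.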

However, it is equally incredible that less than 0.5% of that data is currently analyzed or utilized effectively by businesses.

What does this mean for business?

Perhaps the most obvious answer is opportunity. You likely wouldn’t be reading this blog if you weren’t at least passively aware of the potential insight that can be derived from a series of ones and zeros.

Start-ups such as Uber, Netflix and Airbnb are perhaps some of the best examples of data’s potential being realized. It’s even more apparent when you consider these three organizations refer to themselves as technology companies, as opposed to the fields their services fall under.

But with data’s potential open for any business to invest in, action, and benefit from, competition is fiercer than ever. Which brings us to what else this new wave of data means for business: effective data management.

This new data isn’t being created, or even stored, under one manageable umbrella. It’s disparate, it’s noisy, and in its raw form it’s often useless. So to uncover data’s aforementioned potential, businesses must take the necessary steps to “clean it up”.

That’s what the Any2 concept is all about: allowing businesses to manage, govern and analyze any data, anywhere.


Any2 – Any Data

The first part of the Any2 equation pertains to Any Data.

Managing data requires facing the challenges that come with the ‘three Vs of data’: volume, variety and velocity, with volume referring to the amount of data, variety to its different sources, and velocity to the speed at which it must be processed.

We can stretch these three Vs to five when we include veracity (confidence in the accuracy of the data), and value.

Generally, any data concerns the variety ‘V’, referring to the numerous and disparate sources data can be derived from. But as we need to incorporate all the varying forms of data to analyze it accurately, we can say any data concerns the volume and velocity too – especially where Big Data is concerned.

Big Data initiatives increase the volume of data businesses have to manage exponentially, and to achieve desired time to market, it must be processed quickly (albeit thoroughly), too.

Additionally, data can be represented as either structured or unstructured.

Traditionally, most data fell under the structured label: business data, relational data, and operational data, for example. And although these types of data were still disparate, being inherently structured within their own verticals made them far easier to manage, define, and analyze.

Unstructured data, however, is the polar opposite. It’s inherently messy and it’s hard to define, making both reporting and analysis potentially problematic. This is an issue many businesses face when transitioning to a more data-centric approach to operations.

Big data sources such as clickstream data, IoT data, machine data and social media data all fall under this banner. All of these sources need to be rationalized and correlated so they can be analyzed more effectively, and in the same vein as the aforementioned structured data.
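As a rough sketch of what rationalizing such sources can look like in practice, consider flattening raw clickstream events (the field names here are hypothetical) into one agreed shape that downstream analysis can rely on:

```python
# Raw clickstream events often arrive with inconsistent, nested shapes.
raw_events = [
    {"user": {"id": 7}, "page": "/home", "ts": "2017-05-01T10:00:00Z"},
    {"user_id": 8, "page": "/pricing"},  # older client: flat shape, no timestamp
]

def rationalize(event):
    """Map each known variant onto one agreed, analyzable structure."""
    return {
        "user_id": event.get("user", {}).get("id") or event.get("user_id"),
        "page": event["page"],
        "ts": event.get("ts"),  # missing values made explicit, not dropped
    }

rows = [rationalize(e) for e in raw_events]
print(rows)
```

The point isn’t the code itself but the discipline: every variant of the source data is mapped onto a single, documented structure before analysis begins.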

Any2 – Anywhere

The anywhere half of the equation is arguably also predominantly focused on the variety ‘V’ – but from a different angle. Anywhere is more concerned with the differing and disparate ways and places in which data can be securely stored, rather than the variety in the data itself.

Although an understanding of where your data is has always been a necessity, it’s now become more relevant than ever. Prior to the adoption of cloud storage and services, data would have to have been managed locally, within the “firewall”.

Businesses would still have to know where the data was saved, and how it could be accessed.

However, the advantages of storing data outside the business have become more apparent and more widely accepted. This has seen many businesses take the leap and invest, in varying capacities, in cloud-based storage and software-as-a-service (SaaS).

Take SAP, for example. It provides one solution and one collated database, instead of a business paying installation and upkeep fees for multiple software packages and databases.

And we still need to consider the uptick in the number of businesses that buy customer data.

All of this data still has to be integrated, documented and understood in order for it to be useful, as poor management of data can lead to poor results – or, garbage in, garbage out for short.

Therefore, the key focus of the anywhere part of the equation is granting businesses the ability to manage external data at the same level as internal.

Effectively managing data anywhere requires data modeling, business process and enterprise architecture.

Data modeling is needed to establish what you have, whether internal or external, and to identify what that data is.

Business process modeling is required to understand how the data should be used and how it best drives the business.

Enterprise Architecture is useful as it allows a business to determine how best to leverage the data to drive value. It’s also needed to ensure the business has a solid enough architecture to allow for this value to come to fruition, and in analyzing/predicting the impact of change, so that value isn’t adversely affected.

So how do we manage Any Data, Anywhere?

The best way to effectively manage Any Data, Anywhere, so that we can ensure investing in data management and analysis adds value, is to consider the ‘3Vs’ in relation to the data timeline. You should also consider the various initiatives (Data Modeling, Enterprise Architecture and Business Process) that can be actioned at each stage to ensure the data is properly processed and understood.


The Any2 approach helps you:

  • Effectively manage and govern massive volumes of data
  • Consolidate and build applications with hybrid data architectures – traditional + Big Data, cloud and on-premise
  • Support expanding regulatory and legislative requirements, such as GDPR
  • Simplify collaboration and improve alignment with accurate financial and operational information
  • Improve business processes for operational efficiency and compliance standards
  • Empower your people with self-service data access: The right information at the right time to improve corporate decision-making



For more Data Modeling, Enterprise Architecture, and Business Process advice, follow us on Twitter and LinkedIn to stay updated with new posts!



Data management tools essential for building the data foundation platform

Instead of utilizing purpose-built data management tools, businesses in the early stages of a data strategy often leverage pre-existing, makeshift software.

However, the rate at which modern businesses create and store data means these methods can be quickly outgrown.

In our last post, we looked at why any business with current or future plans for a data-driven strategy needs to ensure a strong data foundation is in place.

Without this, the insight provided by data can often be incomplete and misleading. This negates many of the benefits data strategies are typically implemented to find, and can cause problems down the line, such as slowing time to market, increasing the potential for missteps and false starts, and above all else, adding to costs.

Leveraging a combination of data management tools, including data modeling, enterprise architecture and business process, can ensure the data foundation is strong and analysis going forward is as accurate as possible.

For a breakdown of each discipline, how they fit together, and why they’re better together, read on below:

This post is part two of a two-part series. For part one, see here.


Data Modeling

Effective Data Modeling helps an organization catalogue and standardize data, making the data more consistent and easier to digest and comprehend. It can provide direction for a systems strategy and aid in data analysis when developing new databases.

The value in the former is that it can indicate what kind of data should influence business processes, while the latter helps an organization find exactly what data they have available to them and categorize it.

In the modern world, data is a valuable resource, and so active data modeling can reveal new threads of useful information. It gives businesses a way to query their databases for more refined and targeted analysis. Without an effective data model, insightful data can quite easily be overlooked.

Data modeling also helps organizations break down data silos. Typically, much of the data an organization possesses is kept on disparate systems, and thus making meaningful connections between them can be difficult. Data modeling serves to ease the integration of these systems, adding a new layer of depth to analysis.

Additionally, data modeling makes collaborating easier. As a rigorous and visual form of documentation, it can break down complexity and provide an organization with a defined framework, making communicating and sharing information about the business and its operations more straightforward.

Enterprise Architecture

Enterprise Architecture (EA) is a form of strategic planning used to map a business’s current capabilities and determine the best course of action to achieve the organization’s ideal future-state vision.

It typically straddles two key responsibilities: ‘foundational’ enterprise architecture and ‘vanguard’ enterprise architecture. Foundational EA tends to be more focused on the short term and is essentially implemented to govern ‘legacy IT’ tasks, the tasks we colloquially refer to as ‘keeping the lights on’.

It benefits a business by ensuring things like duplicated processes, redundant processes, and unaccounted-for systems and shelfware don’t cost the business time and money.

Vanguard enterprise architects tend to work with the long term vision in mind, and are expected to innovate to find the business new ways of reaching their future state objectives that could be more efficient than the current strategy.

Its value to a business becomes more readily apparent when enterprise architects operate in terms of business outcomes. These include better alignment of IT and the wider business; better strategic planning, by adding transparency to the strategy so the whole business can align behind and work towards the future objective; and a healthier approach to risk, as the value (reward) in relation to the risk can be more accurately established.

Business Process

Business process solutions help leadership, operations and IT understand the complexities of their organizations in order to make better, more informed and intelligent decisions.

There are a number of factors that can push an organization that has been getting by without a business process solution to implement one. These include strategic initiatives, like business transformation, mergers and acquisitions and business expansion; compliance and audits, such as new or changing industry regulations, government legislation and internal policies; and process improvement, such as enhancing financial performance, lowering operating costs and polishing the customer experience.

We can also look at the need for business process solutions from the perspective of challenges it can help overcome. Challenges including the complexities of a large organization and international workforces; confusion born of undefined and undocumented processes as well as outdated and redundant ones; competitor driven market disruption; and managing change.

Business process solutions aim to tackle these issues by allowing an organization to:

  • Establish processes where they don’t exist
  • Document processes that exist but aren’t consistently followed
  • Examine/analyze/improve/eliminate processes that don’t work
  • Optimize processes that take too long, cost too much or don’t make sense
  • Harmonize redundant processes across the organization
  • Construct processes for new products, markets and organizations
  • Disrupt processes with new technology and data assets

The Complete, Agile Foundation for the Data-Driven Enterprise

As with data, these three data management tools also benefit from a more fluid relationship, and for a long time industry professionals have hoped for a more comprehensive approach: DM, EA and BP tools that work in tandem with, and inherently complement, one another.

It’s a request that makes sense too, as although all three data management tools are essential in their own right, they all influence one another.

We can look at acquiring, storing and analyzing data, then creating a strategy from that analysis, as separate acts, or chapters. And when we bring the whole process together under one suite, we effectively have the whole ‘Data Story’ available to us in a format we can analyze and inspect as a whole.



Data Modeling: What the Experts Think (Part II)

Donna Burbank’s recent Enterprise Management 360 podcast was a hive of useful information. The Global Data Strategy CEO sat down with data modeling experts to discuss the benefits of data modeling, and why the practice is now more relevant than ever.

You can listen to the podcast on the Enterprise Management 360 website here – and below, you’ll find part 2 of 2. If you missed Part 1, find it here.

Guest Speakers:

Danny Sandwell, Product Manager at erwin
Simon Carter, CEO at Sandhill Consultants
Dr. Jean-Marie Fiechter, Business Owner Customer Reporting at Swisscom
Hosted by: Donna Burbank, CEO at Global Data Strategy

Data Modeling

Do you think that a Data Model is both a cost saver and revenue driver? Is data driving business profitability?

Simon Carter

Yes, I think it does. Data models obviously facilitate efficiency improvements, and they do that by identifying and eliminating duplication and promoting standardization. Efficiency improvements are going to bring you some cost reduction, and the reduction in operational risk through improved data quality can also deliver a competitive edge.

Opportunities for automation and rapid deployment of new technologies via a good understanding of your underlying data can make an organization agile, and reduce time-to-market for new products and services. So overall, absolutely, I can see it driving efficiency and reducing costs and delivering serious financial benefit.


Dr. Jean-Marie Fiechter

For us, it’s mostly something that increases efficiency and in the end helps us reduce costs, because it makes maintenance of the data warehouse much easier. If you have a proper data model, you don’t have the chaos that happens if you just get data in and never model it correctly.

To start with, it’s sometimes easier to just get data in and get the first reports out, but down the road, when you try to maintain that chaos, it is much more costly. We like to do things right from the beginning; we model it, we integrate it, we avoid data redundancy that way, and it makes it much cheaper in the end.

It takes a little bit longer at the beginning to do it properly, but in the long run or medium run you’re much more efficient and much faster. Because sometimes you already have the data that is needed for a new report, and if you don’t have a data model, you don’t realize that you already have that data. You recreate a new interface and get the data a second, third, fourth or tenth time, and that takes a lot longer and is more costly.

So yes, it’s certainly more efficient, reduces costs, and because you have the data and can visualize what you already have, it certainly gives some more opportunity to get new business or new ideas, new analytics. That helps the business get ahead of the competition.

Danny Sandwell

What I’m seeing is a lot of surveys of organizations, whether it’s the CIO, CEO or the CDO, talking about their approach to data management and their approach to their business from a data-driven perspective. There’s an unbelievable correlation between people who take a business-focused approach to data management, so business alignment first, technology and infrastructure second, and the growth of those companies overall.

Businesses that are a little behind the curve in terms of that alignment tend to be lower-growth companies. And the way that people are looking at data – because they’re looking at it as a proper asset, they’re not just looking at return-on-investment, they’re looking at return-on-opportunity – increases significantly when you bring that data to the business and make it easy for them to access.

So, the data model is the place where business aligns with data, and a business-driven data strategy requires modeling it properly, and then pushing that information out. One of the benefits of data modeling is that it doesn’t just support the initiative today; if you do it right and set it up right, it supports the initiative today, tomorrow and the day after that, at a much lower cost every time you iterate against that data model to bring value to a specific initiative.

By opening it up and allowing people to understand the data model in their own time and in their own terms, it increases trust in data. And when you have trust in data, you have strategic data usage. And when you have strategic data usage, all the statistics are showing that it leads to not just efficiencies and lower costs, but to new opportunities and growth in businesses.

Who in an organization typically uses Data Models?


Dr. Jean-Marie Fiechter

In our organization, the majority of users are still technical users: the people who work in Business Intelligence, building analytics, building reports, or building ETL jobs and the like. But increasingly, we also have power users outside of Business Intelligence who are sufficiently technical to see the use of the model.

They use the model as documentation to see what kind of data they need for a report, cockpit, BI, and how to link that data together to get something that is efficient and meaningful. So, it’s still very technical at Swisscom but it’s getting a little bit broader.

Danny Sandwell

I think for a large segment of the business world, it is still the technical or IT person doing the building. The viewing, understanding and collaboration is more on the business side. But I think there’s also a difference in terms of the maturity of the organization and the lifecycle of the organization.

Organizations that have a large legacy and have been transitioning from a traditional brick-and-mortar business to more of a digital business have some challenges with legacy infrastructure, and the legacy infrastructure requires IT being involved, a lot more hands on, because it’s just that big and complex and there are a lot of constraints.

You have a lot of organizations that are starting up now that have no legacy to deal with and have access to the cloud and all these self-service, off-premise type capabilities, and their infrastructure is much newer. And what I’m seeing in organizations like that, is beyond just viewing data models, they’re actually starting to build the data models.

So, you’re starting to see power users or analysts on the business side, and folks like that, building a conceptual data model and then using that model to go to whatever IT service they have, whether in the cloud or on-premise, to show what their requirements are and have those things built underneath.

So, we’re still very much in flux in terms of where an organization is, what its history is and how fast it has transformed in terms of becoming a digital business. But I’m seeing the trend where you have more and more business people involved in the actual building at the appropriate level, and then using that as the hand-off and contract between them and all the different service providers they might be taking advantage of.

Whether it’s traditional ETL-type architectures, or whether it’s these new analytics use cases supported by data virtualization, at the end of the day the business person is able to articulate their requirements and needs, and then push that down, where it used to be more of a bottom-up approach.

Simon Carter

I very much follow the line that Danny was taking there, which is that most of the doing is still done by the technical team, most of the building of the models is done by the technical team. While a lot of looking at models is done by the business users, they’re also verifying things and contributing significantly to the data model.

I’ll go back to my common taxonomy. You know, business models are being used by business analysts to validate data requirements with subject matter experts, and can be the basis of data glossaries used throughout an organization.

Application models are used by solution architects who are designing and validating solutions to store and retrieve the business data, and communicate that design to developers. Implementation models are used by database designers and administrators to create and maintain the structures needed to implement design.

Increasingly though, the business-level metadata is being used to enable those business users to drive down into the actual data, and verify its lineage and quality. And that’s due to the ability to map a business term right through the various models to the data it describes.

With a lot of data-driven transformation being focused around new technologies like Cloud, Big Data and Internet of Things, is Data Modeling still relevant?

Simon Carter

I think data modeling is still incredibly relevant in this age of data technologies. Big Data is referring to data storage and retrieval technology, so the Business and Application models are unaffected. All we need is for the Implementation models to be able to properly represent any new technology requirement.

Danny Sandwell

Generally, there’s new technology that comes out and everybody thinks it’s the be-all and end-all that will basically re-engineer the world and leave everything else behind. But the reality is, you end up using Big Data for the appropriate applications that traditional data doesn’t handle; you’re not going to rip and replace all the infrastructure you have underneath supporting your traditional business data.

So, we end up with a hybrid data architecture. And with that hybrid data architecture, it becomes even more important to have a data model, because there are some significant differences in terms of where data may physically sit in the organization.

You know, people get this new technology and they think they don’t need any models and they know how to work with the technology. And that works when things are very encapsulated, when things are together and they’re not looking for integration. But the reality is, whatever is in Big Data probably needs to be integrated with the rest of the data.

So, what we’re seeing is at the outset, people are using the data model to document what is in those Big Data instances, because it’s a bit of a black box to the business, and the business is where the data drives value – so the business requires it. And as we see more of an impetus for proper data governance, both to manage the assets that are strategic in our organization, but also to respond to the legislative and regulatory compliance requirements that are now becoming a reality for most businesses, there is a need there.

First off, it’s a documentation tool so that people can see data no matter where it sits, in the same format, and relate to that data from a business perspective. It’s building it into the architecture so you can see that it’s governed and managed with the same rigor as the rest of your data, so you can establish trust in data across your organization.


Dr. Jean-Marie Fiechter

I think what we see at our place, when I look at the Big Data cluster that we have, is a mix. If you take a one-time shot at the data and then do some analytics, then you’re probably using something like schema on read and you don’t really have a data model.

But as soon as you use a Big Data cluster to do analytics in a repetitive way, where you have the same questions popping up every day, every week or every month, then you will certainly have some parts of your Big Data cluster that are schema on write, and then you have a data model. Because that’s the only way you can ensure that your analytics, data mining and whatnot always encounter the same structure of the data.

You have some parts of the Big Data cluster that are not modeled because they are very transient. And you have some parts that are used as a source for your data warehouse or other analytics systems. Those are definitely modeled, otherwise you waste too much time every time something changes.
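The schema-on-read versus schema-on-write trade-off described above can be sketched in a few lines of Python (an illustration only; the schema and field names are invented, and real clusters would use purpose-built tooling for this):

```python
SCHEMA = {"device_id": int, "temperature": float}

def write_with_schema(store, record):
    """Schema on write: validate structure before storing,
    so every later reader can rely on the same shape."""
    for field, expected_type in SCHEMA.items():
        if not isinstance(record.get(field), expected_type):
            raise ValueError(f"bad or missing field: {field}")
    store.append(record)

def read_with_schema(raw_line):
    """Schema on read: the store holds raw text as-is; structure
    is imposed only at analysis time."""
    device_id, temperature = raw_line.split(",")
    return {"device_id": int(device_id), "temperature": float(temperature)}

store = []
write_with_schema(store, {"device_id": 1, "temperature": 21.5})
print(store, read_with_schema("2,19.0"))
```

Schema on write pays the validation cost once, up front; schema on read defers it to every query, which is fine for one-off analysis but wasteful for questions that repeat daily.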

For more Data Modeling insight, follow us on Twitter here, and join our erwin Group on LinkedIn.



The Rise of NoSQL and NoSQL Data Modeling

With NoSQL data modeling gaining traction, data governance isn’t the only data shakeup organizations are currently facing.


Data Modeling: What the Experts Think

In a recently hosted podcast for Enterprise Management 360˚, Donna Burbank spoke to several experts in data modeling and asked about their views on some of today’s key enterprise data questions.


Why Data vs. Process is dead, and why we should look at the two working together

Whether a collection of data could be useful to a business is all just a matter of perspective. We can view raw data like a tangled set of wires: before the wires can be useful again, they need to be separated.

We’ve talked before about how Data Modeling, and Enterprise Architecture can make data easier to manage and decipher, but arguably, there’s still a piece of the equation missing.

To make the most out of Big Data, the data must also be rationalized in the context of the business’ processes, where the data is used, by whom, and how. This is what process modeling aims to achieve. Without process modeling, businesses will find it difficult to quantify, and/or prioritize the data from a business perspective – making a truly business outcome-focused approach harder to realize.

So What is Process Modeling?

“Process modeling is the documentation of an organization’s processes designed to enhance company performance,” said Martin Owen, erwin’s VP of Product Management.

It does this by enabling a business to understand what they do, and how they do it.

As is commonplace for disciplines of this nature, there are multiple industry standards that provide the basis of the approach to how this documentation is handled.

The most common of these is the Business Process Model and Notation (BPMN) standard. With BPMN, businesses can analyze their processes from different perspectives, such as a human capital perspective, shining a light on the roles and competencies required for a process to perform.

Where does Data Modeling tie in with Process Modeling?

Historically, industry analysts have viewed Data and Process Modeling as two competing approaches. However, it’s time that notion was cast aside, as the benefits of the two working in tandem are too great to just ignore.

The secret behind making the most out of data, is being able to see the full picture, as well as drill down – or rather, zoom in – on what’s important in the given context.

From a process perspective, you will be able to see what data is used in the process and architecture models. And from a data perspective, users can see the context of the data and the impact of all the places it is used in processes across the enterprise. This provides a more well-rounded view of the organization and the data. Data modelers will benefit from this, enabling them to create and manage better data models, as well as implement more context specific data deployments.

It could be that the former approach to Data and Process Modeling was born out of the cost to invest in both (for some businesses) being too high, aligning the two approaches being too difficult, or a cocktail of both.

The latter is perhaps the more common culprit, though. This is evident when we consider the many companies already modeling both their data and processes. The problem with the current approach is that the two model types are siloed, severing the valuable connections between them and making alignment difficult to achieve. Although all the data is there, those severed connections are just as useful as the data itself, so losing them means a business isn't seeing the full picture.

However, there are now examples of both Data and Process Modeling being united under one banner.

“By bringing both data and process together, we are delivering more value to different stakeholders in the organization by providing more visibility of each domain,” suggested Martin. “Data isn’t locked into the database administrator or architect, it’s now expressed to the business by connections to process models.”

The added visibility provided by a connected data and process modeling approach is essential to a Big Data strategy. And there are further indications this approach will soon be (or already is) more crucial than ever. The Internet of Things (IoT), for example, continues to gain momentum, and with it will come more data, at greater speeds, from more disparate sources. Businesses will need this sort of approach to govern how that data is moved and united, and to identify and tackle any security issues that arise.

Big Data Benefits, with Enterprise Architecture and Data Modeling

If Gartner’s word is anything to go by, Big Data adoption is seeing an uptick. The analyst firm cites “increasing inquiries” into Big Data analytics tools, as more businesses look for new opportunities to capture increasing amounts of data, or to eke more value out of the large amounts they already own.

Supporting this, a US-based study into budgetary plans indicated that 60% of CIOs believe Big Data will be a ‘top driver’ of IT spending.

Generally speaking, a collective shift in the industry is rarely a coincidence. Trends are usually propped up by a series of concrete benefits, and in the case of Big Data, this is no different.

Companies with a well-executed Big Data strategy can make more well-rounded and informed decisions. One of the key uses of Big Data is gaining a better understanding of the market, prospects and customers.

Data is sometimes referred to as the “oil of the 21st century”, and customer data specifically is arguably the key factor in that. Online and digital business models, and notably social media, have opened up a two-way dialogue between people and the rest of the world, providing businesses with an unprecedented level of meaningful data insight.

As a result, businesses now know more about their customers than ever, and this information can be used to earn new ones.

In gaining a better understanding of the market, Big Data can be used to gauge potential market interest. As well as indicating whether a new service or product is worth providing, this information can also help businesses forecast supply in relation to demand with greater accuracy.

As well as illuminating external factors, Big Data can also provide new insights into internal operations and process efficiencies. The data can highlight capabilities and processes that are ripe for improvement, and be used to guide the best course of action toward optimization.

Why You Need Enterprise Architecture and Data Modeling

When businesses get it right, Big Data can open a lot of new doors and allow a business to reach new heights. But simply collecting the data isn’t enough. To return to the earlier analogy: much like oil, Big Data isn’t of much use in its raw form. It needs to be refined and concentrated into something decipherable – something greater than the sum of its parts.

Both data modeling (DM) and enterprise architecture (EA) are essential to making the most of this refinement process. Data modeling helps you analyze the data by providing a contextualized perspective of the information across various platforms. Enterprise architecture helps you translate and apply data to strategic business and IT objectives. It also indicates which data insights are a priority in your current-state organization and which data will be critical to supporting your future state.

This is great news for businesses that have already established a functioning EA and/or DM initiative, but those behind in architecture and modeling will have to find room in the budget for new tools.

In the past, this would have been a daunting exercise. Encouraging stakeholder investment in EA, especially, has been notoriously difficult. High local installation costs and long-term contractual commitments are enough to make any business think twice, especially one trying to stay agile – and this goes doubly for a specialist discipline such as EA, where business leaders and stakeholders might not be fully aware of the potential gains.

However, the introduction of Software-as-a-Service-based tools has offered these apprehensive businesses a new lifeline: local installation costs and long-term commitments are avoided in favor of flexibility.

What’s more, integrating enterprise architecture tools with data modeling tools brings significant benefits in aligning processes and systems.
