
Automation Gives DevOps More Horsepower

Almost 70 percent of CEOs say they expect their companies to change their business models in the next three years, and 62 percent report they have management initiatives or transformation programs underway to make their businesses more digital, according to Gartner.

Wouldn’t it be advantageous for these organizations to accelerate these digital transformation efforts? They have that option with automation, shifting DevOps away from dependence on manual processes. Just like with cars, more horsepower in DevOps translates to greater speed.


Doing More with Less

We have clients looking to do more with existing resources, and others looking to reduce full-time employee count on their DevOps teams. With metadata-driven automation, many DevOps processes can run without manual effort, adding more “horsepower” to increase their speed and accuracy. For example:

Auto-documentation of data mappings and lineage: By using data harvesting templates, organizations can eliminate time spent updating and maintaining data mappings, creating them directly from code written by the ETL staff. Such automation can save close to 100 percent of the time usually spent on this type of documentation.

  • Data lineage and impact analysis views for ‘data in motion’ also stay up to date with no additional effort.
  • Human errors are eliminated, leading to higher quality documentation and output.
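To make the harvesting idea concrete, here’s a deliberately tiny Python sketch that pulls source-to-target mappings out of an INSERT … SELECT statement. It’s illustrative only – a real harvesting template handles many ETL dialects and far messier code – and the table and column names are made up.

```python
import re

# A hypothetical ETL snippet of the kind a harvesting template might scan.
ETL_SQL = """
INSERT INTO dw.customer_dim (customer_id, full_name, region_code)
SELECT c.id, c.name, r.code
FROM staging.customers c JOIN staging.regions r ON c.region_id = r.id;
"""

def harvest_mappings(sql: str):
    """Extract (source expression, target column) pairs from an INSERT ... SELECT."""
    insert = re.search(r"INSERT INTO\s+(\S+)\s*\(([^)]*)\)", sql, re.I)
    select = re.search(r"SELECT\s+(.*?)\s+FROM", sql, re.I | re.S)
    target_table = insert.group(1)
    targets = [c.strip() for c in insert.group(2).split(",")]
    sources = [c.strip() for c in select.group(1).split(",")]
    return [(src, f"{target_table}.{tgt}") for src, tgt in zip(sources, targets)]

# Each pair is one documented lineage row: source expression -> target column.
for source, target in harvest_mappings(ETL_SQL):
    print(f"{source:10} -> {target}")
```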

Automatic updates/changes reflected throughout each release cycle: Updates can be picked up and the ETL job/package generated with 100-percent accuracy. An ETL developer is not required to ‘hand code’ mappings from a spreadsheet – greatly reducing the time spent on the ETL process, and perhaps the total number of resources required to manage that process month over month.

  • ETL skills are still necessary for validation and to compile and execute the automated jobs, but the overall quality of these jobs (machine-generated code) will be much higher, also eliminating churn and rework.
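The generation step runs in the opposite direction: mapping metadata goes in, executable ETL code comes out, so nobody re-keys a spreadsheet. Here’s a minimal Python sketch of that idea; the mapping rows, table names and join clause are hypothetical, and real generators target specific ETL platforms rather than raw SQL strings.

```python
# Hypothetical mapping rows as an analyst might maintain them: (source, target).
MAPPINGS = [
    ("c.id",   "dw.customer_dim.customer_id"),
    ("c.name", "dw.customer_dim.full_name"),
    ("r.code", "dw.customer_dim.region_code"),
]

def generate_etl(mappings, from_clause):
    """Render an INSERT ... SELECT from mapping metadata instead of hand-coding it."""
    table = mappings[0][1].rsplit(".", 1)[0]                  # e.g. dw.customer_dim
    targets = [tgt.rsplit(".", 1)[1] for _, tgt in mappings]  # bare column names
    sources = [src for src, _ in mappings]
    return (f"INSERT INTO {table} ({', '.join(targets)})\n"
            f"SELECT {', '.join(sources)}\n"
            f"FROM {from_clause};")

print(generate_etl(MAPPINGS,
                   "staging.customers c JOIN staging.regions r ON c.region_id = r.id"))
```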

Auto-scanning of source and target data assets with synchronized mappings: This automation eliminates the need for a resource or several resources dealing with manual updates to the design mappings, creating additional time savings and cost reductions associated with data preparation.

  • A change in the source-column header may impact 1,500 design mappings. Managed manually, this process – opening the mapping document, making the change, saving the file with a new version, and placing it into a shared folder for development – could take an analyst several days. But synchronization instantly updates the mappings, correctly versioned, and can be picked up and packaged into an ETL job/package within the same hour. Whether using agile or classic waterfall development, these processes will see exponential improvement and time reduction. 
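Here’s a minimal Python sketch of that synchronization step: one source-column rename is applied across every affected mapping, and each touched mapping gets a new version number automatically. The mapping records and column names are invented for the example.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Mapping:
    source: str    # fully qualified source column
    target: str    # fully qualified target column
    version: int   # bumped automatically on every change

def synchronize(mappings, old_column, new_column):
    """Propagate a source-column rename to every affected mapping, versioning each."""
    updated = []
    for m in mappings:
        if old_column in m.source:
            updated.append(replace(m,
                                   source=m.source.replace(old_column, new_column),
                                   version=m.version + 1))
        else:
            updated.append(m)   # untouched mappings keep their current version
    return updated

mappings = [Mapping("staging.customers.cust_nm", "dw.customer_dim.full_name", 3),
            Mapping("staging.customers.id", "dw.customer_dim.customer_id", 3)]
for m in synchronize(mappings, "cust_nm", "customer_name"):
    print(m)
```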

Data Intelligence: Speed and Quality Without Compromise

Our clients often understand that incredible DevOps improvements are possible, but they fear the “work” it will take to get there.

It really comes down to a choice: embrace change via automation or continue down the same path. But isn’t the definition of insanity doing the same thing over and over while expecting different results?

With traditional means, you may improve speed but sacrifice quality. On the flip side, you may improve quality but sacrifice speed.

However, erwin’s technology shifts this paradigm. You can have both speed and quality.

The erwin Data Intelligence Suite (erwin DI) combines the capabilities of erwin Data Catalog with erwin Data Literacy to fuel an automated, real-time, high-quality data pipeline.

Then all enterprise stakeholders – data scientists, data stewards, ETL developers, enterprise architects, business analysts, compliance officers, CDOs and CEOs – can access data relevant to their roles for insights they can put into action.

It creates the fastest path to value, with an automation framework and metadata connectors configured by our team to deliver the data harvesting and preparation features that make capturing enterprise data assets fast and accurate.

Click here to request a free demo of erwin DI.



Agile Enterprise Architecture for DevOps Explained

How do organizations innovate? Taking an idea from concept to delivery requires strategic planning and the ability to execute. In the case of software development, understanding agile enterprise architecture and its relevance to DevOps is also key.

DevOps, the fusion of software development and IT operations, stems from the agile development movement. In more practical terms, it integrates developers and operations teams to improve collaboration and productivity by automating infrastructure, workflows and continuously measuring application performance.

The goal is to balance the competing needs of getting new products into production while maintaining 99.9-percent application uptime for customers in an agile manner. 

To understand this increase in complexity, we need to look at how new features and functions are applied to software delivery. The world of mobile apps, middleware and cloud deployment has reduced release cycles to days and weeks, not months, with an emphasis on delivering incremental change.

Previously, a software release would occur every few months with a series of modules that were hopefully still relevant to the business goals.

The shorter, continuous-delivery lifecycle helps organizations:

  • Achieve shorter releases through incremental delivery, enabling faster innovation
  • Be more responsive to business needs through improved collaboration, better quality and more frequent releases
  • Manage the number of applications impacted by a business release by allowing local variants for a global business and continuous delivery within releases

The DevOps approach achieves this by providing an environment that:

  • Minimizes software delivery batch sizes to increase flexibility and enable continuous feedback, with every team delivering features to production as they are completed
  • Replaces projects with release trains that minimize batch-waiting time to reduce lead times and waste
  • Shifts from central planning to decentralized execution with a pull philosophy, thus minimizing batch transaction cost to improve efficiency
  • Makes DevOps economically feasible through test virtualization, build automation and automated release management, prioritizing and sequencing batches to maximize business value: selecting the right batches, sequencing them in the right order, guiding implementation, tracking execution and adjusting plans along the way (see the sketch after this list)
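One common way to “sequence batches to maximize business value” is weighted shortest job first (WSJF), which ranks work by cost of delay relative to size. Here’s a minimal Python sketch of the idea; the batches and scores are hypothetical, not drawn from any particular DevOps toolchain.

```python
# Weighted shortest job first (WSJF): rank batches by cost of delay / job size,
# so small, high-value work ships first. All values below are hypothetical.
batches = [
    {"name": "checkout fix",    "cost_of_delay": 8, "size": 2},
    {"name": "loyalty feature", "cost_of_delay": 9, "size": 6},
    {"name": "logging upgrade", "cost_of_delay": 3, "size": 1},
]

# Highest WSJF score goes first in the release train.
for b in sorted(batches, key=lambda b: b["cost_of_delay"] / b["size"], reverse=True):
    print(f"{b['name']:16} WSJF = {b['cost_of_delay'] / b['size']:.2f}")
```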

An Approach with an Enterprise Architecture View

So far, we have only looked at the delivery aspects. How, then, does this approach integrate with an enterprise architecture view?

To understand this, we need to look more closely at the strategic planning lifecycle. The figure below shows how the strategic planning lifecycle supports an ‘ideas-to-delivery’ framework.


Figure 1: The strategic planning lifecycle

You can see the high-level relationship between the strategy and goals of an organization and the projects that deliver the change to meet these goals. Enterprise architecture provides the model to govern the delivery of projects in line with these goals.

However, we must ensure that any model built includes ‘just-enough’ enterprise architecture to produce the right level of analysis for driving change. An agile enterprise architecture model, then, is one that enables enough analysis to plan which projects should be undertaken while ensuring full architectural governance for delivery. The last part is achieved by connecting to the tools used in the agile space.


Figure 2: Detailed view of the strategic planning lifecycle

The Agile Enterprise Architecture Lifecycle

An agile enterprise architecture has its own lifecycle with six stages.

Vision and strategy: Initially, the organization revisits its corporate vision and strategy. What things will differentiate the organization from its competitors in five years? What value propositions will it offer customers to create that differentiation? The organization can create a series of campaigns or challenges to solicit new ideas and requirements for its vision and strategy.

Value proposition: The ideas and requirements are rationalized into a value proposition that can be examined in more detail.

Resources: The company can look at what resources it needs on both the business side and the IT side to deliver the capabilities needed to realize the value propositions. For example, a superior customer experience might demand better internet interactions and new applications, processes and infrastructure on which to run. Once the needs are understood, they are compared to what the organization already has. Transition planning then determines how the gaps will be addressed.

Execution: With the strategy and transition plan in place, enterprise architecture execution begins. The transition plan provides input to project prioritization and planning, since projects aligned with the transition plan are typically prioritized over those that aren’t. This determines which projects are funded and enter the DevOps stage or continue through it.

Guidelines: As the solutions are developed, enterprise architecture assets such as models, building blocks, rules, patterns, constraints and guidelines are used and followed. Where the standard assets aren’t suitable for a project, exceptions are requested from the governance board. These exceptions are tracked carefully. Where assets are frequently the subject of exception requests, they must be examined to see if they really are suitable for the organization.

Updates: Periodic updates to the organization’s vision and strategy require a reassessment of the to-be state of the enterprise architecture. This typically results in another look at how the organization will differentiate itself in five years, what value propositions it will offer, the capabilities and resources needed, and so on. If we’re not doing things the way we said we wanted them done, then we must ask if our target architectures are still correct. This helps keep the enterprise architecture current and useful.
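The Guidelines stage above implies a simple but useful feedback loop: track exception requests per standard asset and flag assets that draw repeated exceptions for review. Here’s a minimal Python sketch of that idea; the asset names and the review threshold are made up for illustration.

```python
from collections import Counter

# Hypothetical log of governance exception requests, keyed by the standard
# asset each project asked to deviate from.
exception_requests = ["auth-pattern", "auth-pattern", "logging-rule",
                      "auth-pattern", "data-model-x", "auth-pattern"]

REVIEW_THRESHOLD = 3  # assumption: 3+ exceptions trigger a suitability review

for asset, count in Counter(exception_requests).most_common():
    if count >= REVIEW_THRESHOLD:
        print(f"review {asset}: {count} exception requests suggest a poor fit")
```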

Enterprise Architecture Tools for DevOps

DevOps can use a number of enterprise architecture solutions. For example, erwin’s enterprise architecture products use open standards to link to other products within the overall lifecycle. This approach integrates agile enterprise architecture with agile development, connecting project delivery with effective governance of the project lifecycle. Even if the software delivery process is agile, goals and associated business needs are linked and can be met.

To achieve this goal, a number of internal processes must be interoperable. This is a significant challenge, but one that can be met by building an internal center of excellence, starting small and growing a working environment.

The erwin EA product line takes a rigorous approach to enterprise architecture to ensure that current and future states are published for a wider audience to consume. The erwin EA repository can be used as an enterprise continuum (in TOGAF terms).

Available as a cloud-based platform or on premises, erwin EA solutions provide a quick and cost-effective path for launching a collaborative enterprise architecture program. With built-in support for such industry frameworks as ArchiMate® and TOGAF®, erwin enables you to model the enterprise, capture the IT blueprint, generate roadmaps and provide meaningful insights to both technical and business stakeholders.

According to Gartner, enterprise architecture is becoming a “form of internal management consulting,” helping define and shape business and operating models, identify risks and opportunities, and then create technology roadmaps. Understanding how vision and strategy impacts enterprise architecture is important – with an overall goal of traceability from our ideas and initiatives all the way through delivery.



Top 10 Data Governance Predictions for 2019

This past year witnessed a data governance awakening – or as the Wall Street Journal called it, a “global data governance reckoning.” There was tremendous data drama and resulting trauma – from Facebook to Equifax and from Yahoo to Marriott. The list goes on and on. And then, the European Union’s General Data Protection Regulation (GDPR) took effect, with many organizations scrambling to become compliant.

So what’s on the horizon for data governance in the year ahead? We’re making the following data governance predictions for 2019:



1. GDPR-esque regulation for the United States:

GDPR has set the bar and will become the de facto standard across geographies. Look at California as an example, with the California Consumer Privacy Act (CCPA) going into effect in 2020. Even big technology companies like Apple, Google, Amazon and Twitter are encouraging more regulation, in part because they realize that companies that don’t put data privacy at the forefront will feel the wrath of both the government and the consumer.

2. GDPR fines are coming and they will be massive:

Perhaps one of the safest data governance predictions for 2019 is the coming clampdown on GDPR enforcement. The regulations weren’t brought in for show, so it’s likely the fine-free streak for GDPR will end … and soon. The headlines will resemble those about data breaches or U.S. hospitals with Health Insurance Portability and Accountability Act (HIPAA) violations. Lots of companies will have an “oh crap” moment and realize they have a lot more to do to get their compliance house in order.

3. Data policies as a consumer buying criterion:

The threat of “data trauma” will continue to drive visibility for enterprise data in the C-suite. How they respond will be the key to their long-term success in transforming data into a true enterprise asset. We will start to see a clear delineation between organizations that maintain a reactive and defensive stance (pain avoidance) versus those that leverage this negative driver as an impetus to increase overall data visibility and fluency across the enterprise with a focus on opportunity enablement. The latter will drive the emergence of true data-driven entities versus those that continue to try to plug the holes in the boat.

4. CDOs will rise, with a better-defined role within the organization:

We will see the chief data officer (CDO) role elevated from being a lieutenant of the CIO to taking a proper seat at the table beside the CIO, CMO and CFO. This will give them the juice needed to create a sustainable vision and roadmap for data. So far, there’s been a profound lack of consensus on the nature of the role and responsibilities, mandate and background that qualifies a CDO. As data becomes increasingly vital to an organization’s success from a compliance and business perspective, the role of the CDO will become more defined.

5. Data operations (DataOps) gains traction and moves toward full optimization:

Much like how DevOps has taken hold over the past decade, 2019 will see a similar push for DataOps. Data is no longer just an IT issue. As organizations become data-driven and awash in an overwhelming amount of data from multiple sources (AI, IoT, ML, etc.), they will need to get a better handle on data quality and focus on data management processes and practices. DataOps will enable organizations to better democratize their data and ensure that all business stakeholders work together to deliver quality, data-driven insights.


6. Business process will move from back office to center stage:

Business process management will make its way out of the back office and emerge as a key component of digital transformation. The ability for an organization to model, build and test automated business processes is a game-changer. Enterprises can clearly define, map and analyze workflows and build models to drive process improvement, as well as identify business practices susceptible to the greatest security, compliance or other risks and where controls are most needed to mitigate exposures.

7. Turning bad AI/ML data good:

Artificial intelligence (AI) and machine learning (ML) are consumers of data. The risk of training AI and ML applications with bad data will initially drive the need for data governance to properly govern the training data sets. Once the models are trained, the data they produce should be well defined, consistent and of high quality. That data needs to be continuously governed for assurance purposes.
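As a simple illustration, governing a training data set can start with automated quality gates that reject incomplete or implausible records before they reach the model. This is a minimal Python sketch under assumed rules; the field names, ranges and sample rows are invented for the example.

```python
import math

REQUIRED_FIELDS = ("customer_id", "age", "segment")

def passes_governance(row: dict) -> bool:
    """Basic quality gates: completeness, type sanity and a plausibility range."""
    if any(row.get(f) in (None, "") for f in REQUIRED_FIELDS):
        return False                          # incomplete record
    age = row["age"]
    if not isinstance(age, (int, float)) or math.isnan(float(age)):
        return False                          # wrong type or NaN
    return 0 < age < 120                      # implausible ages are rejected

rows = [
    {"customer_id": 1, "age": 34,   "segment": "retail"},
    {"customer_id": 2, "age": -5,   "segment": "retail"},   # fails range check
    {"customer_id": 3, "age": None, "segment": "b2b"},      # fails completeness
]

training_set = [r for r in rows if passes_governance(r)]
print(f"kept {len(training_set)} of {len(rows)} rows for training")
```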

8. Keeping data from going over the edge:

Edge computing will continue to take hold. And while speed of data is driving its adoption, organizations will also need to view, manage and secure this data and bring it into an automated pipeline. The internet of things (IoT) is all about new data sources (device data) that often have opaque data structures. This data is often integrated and aggregated with other enterprise data sources and needs to be governed like any other data. The challenge is documenting all the different device management information bases (MIBs) and mapping them into the data lake or integration hub.
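That mapping step can be pictured as a normalization layer that renames device-specific fields to governed, canonical names before the records land in the data lake. Here’s a minimal Python sketch; the device types, field mappings and payloads are all hypothetical.

```python
# Per-device-type field mappings from raw payload keys to governed column names.
# These mappings are invented for illustration.
DEVICE_SCHEMAS = {
    "thermostat-v1": {"t": "temperature_c", "ts": "recorded_at", "dev": "device_id"},
    "meter-v2":      {"val": "reading_kwh", "time": "recorded_at", "id": "device_id"},
}

def normalize(device_type: str, payload: dict) -> dict:
    """Rename device-specific fields to governed names and keep lineage metadata."""
    mapping = DEVICE_SCHEMAS[device_type]
    record = {canonical: payload[raw]
              for raw, canonical in mapping.items() if raw in payload}
    record["source_schema"] = device_type   # lineage: remember the raw schema
    return record

print(normalize("thermostat-v1", {"t": 21.5, "ts": "2019-01-07T10:00Z", "dev": "th-42"}))
```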

9. Organizations that don’t have good data harvesting are doomed to fail:

Research shows that data scientists and analysts spend 80 percent of their time preparing data for use and only 20 percent of their time actually analyzing it for business value. Without automated data harvesting and ingestion from all enterprise sources (not just those that are convenient to access), data moving through the pipeline won’t be the highest quality and the “freshest” it can be. The result will be faulty intelligence driving potentially disastrous decisions for the business.

10. Data governance evolves to data intelligence:

Regulations like GDPR are driving most large enterprises to address their data challenges. But data governance is more than compliance. “Best-in-breed” enterprises are looking at how their data can be used as a competitive advantage. These organizations are evolving their data governance practices to data intelligence – connecting all of the pieces of their data management and data governance lifecycles to create actionable insights. Data intelligence can help improve customer experiences and enable innovation of products and services.

The erwin Expert Blog will continue to follow data governance trends and provide best practice advice in the New Year so you can see how our data governance predictions pan out for yourself. To stay up to date, click here to subscribe.
