Almost 70 percent of CEOs say they expect their companies to change their business models in the next three years, and 62 percent report they have management initiatives or transformation programs underway to make their businesses more digital, according to Gartner.
Wouldn’t it be advantageous for these organizations to accelerate their digital transformation efforts? They have that option with automation, which shifts DevOps away from dependence on manual processes. Just as with cars, more horsepower in DevOps translates to greater speed.
Doing More with Less
We have clients looking to do more with existing resources, and others looking to reduce full-time employee count on their DevOps teams. With metadata-driven automation, many DevOps processes can be automated, adding more “horsepower” to increase their speed and accuracy. For example:
Auto-documentation of data mappings and lineage: By using data harvesting templates, organizations can eliminate time spent updating and maintaining data mappings, creating them directly from code written by the ETL staff. Such automation can save close to 100 percent of the time usually spent on this type of documentation.
- Data lineage and impact analysis views for ‘data in motion’ also stay up to date with no additional effort.
- Human errors are eliminated, leading to higher quality documentation and output.
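erwin’s harvesting templates are proprietary, but the underlying idea – deriving source-to-target mappings directly from the ETL code rather than maintaining them by hand – can be illustrated with a toy sketch. Here, a simple regex-based harvester pairs the target columns of an INSERT … SELECT statement with the select-list columns positionally (a real harvester would parse full job metadata; the table and column names are made up for the example):

```python
import re

def harvest_mappings(sql: str) -> list[tuple[str, str]]:
    """Derive source->target column mappings from an INSERT ... SELECT.

    Toy illustration only: pairs target columns with select-list
    expressions by position.
    """
    insert = re.search(r"INSERT\s+INTO\s+(\w+)\s*\(([^)]*)\)", sql, re.I)
    select = re.search(r"SELECT\s+(.*?)\s+FROM\s+(\w+)", sql, re.I | re.S)
    target_table = insert.group(1)
    target_cols = [c.strip() for c in insert.group(2).split(",")]
    source_table = select.group(2)
    source_cols = [c.strip() for c in select.group(1).split(",")]
    # Emit one lineage record per column pair
    return [(f"{source_table}.{src}", f"{target_table}.{tgt}")
            for src, tgt in zip(source_cols, target_cols)]

sql = """INSERT INTO dim_customer (customer_id, full_name)
         SELECT cust_id, cust_name FROM stg_customer"""
print(harvest_mappings(sql))
# [('stg_customer.cust_id', 'dim_customer.customer_id'),
#  ('stg_customer.cust_name', 'dim_customer.full_name')]
```

Because the mapping document is generated from the code itself, it can never drift out of date – which is why lineage views for “data in motion” stay current with no extra effort.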
Automatic updates/changes reflected throughout each release cycle: Updates can be picked up and the ETL job/package generated with 100-percent accuracy. An ETL developer is not required to ‘hand code’ mappings from a spreadsheet – greatly reducing the time spent on the ETL process, and perhaps the total number of resources required to manage that process month over month.
- ETL skills are still necessary for validation and to compile and execute the automated jobs, but the overall quality of these jobs (machine-generated code) will be much higher, also eliminating churn and rework.
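Metadata-driven generation works in the opposite direction too: instead of an ETL developer hand-coding from a spreadsheet, the job is rendered from a declarative mapping record. The mapping schema below is hypothetical, purely to show the shape of the idea:

```python
def generate_etl_sql(mapping: dict) -> str:
    """Render an INSERT ... SELECT from a declarative mapping record.

    Hypothetical schema:
      {"source_table": ..., "target_table": ..., "columns": [(src, tgt), ...]}
    """
    cols = mapping["columns"]
    target_list = ", ".join(tgt for _, tgt in cols)
    select_list = ", ".join(src for src, _ in cols)
    return (f"INSERT INTO {mapping['target_table']} ({target_list})\n"
            f"SELECT {select_list}\n"
            f"FROM {mapping['source_table']};")

spec = {"source_table": "stg_orders", "target_table": "fact_orders",
        "columns": [("order_id", "order_key"), ("amt", "order_amount")]}
print(generate_etl_sql(spec))
```

Because every generated job comes from the same template, the output is uniform and reproducible – the source of the quality gains and the elimination of churn and rework noted above.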
Auto-scanning of source and target data assets with synchronized mappings: This automation eliminates the need for a resource or several resources dealing with manual updates to the design mappings, creating additional time savings and cost reductions associated with data preparation.
- A change in a source column header may impact 1,500 design mappings. Managed manually, this process – opening the mapping document, making the change, saving the file with a new version, and placing it into a shared folder for development – could take an analyst several days. With synchronization, the mappings are updated instantly, correctly versioned, and can be picked up and packaged into an ETL job/package within the same hour. Whether using agile or classic waterfall development, these processes see dramatic improvements in speed and reductions in cycle time.
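The mechanics of that synchronization step can be sketched as a single pass over the mapping store: every mapping referencing the renamed source column is updated and its version bumped, untouched mappings are left alone. This is a minimal illustration with an assumed mapping structure, not erwin’s implementation:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Mapping:
    source_column: str
    target_column: str
    version: int = 1  # incremented on every change, giving automatic versioning

def synchronize(mappings, old_name, new_name):
    """Apply a source-column rename to every affected mapping, bumping its version."""
    return [replace(m, source_column=new_name, version=m.version + 1)
            if m.source_column == old_name else m
            for m in mappings]

mappings = [Mapping("cust_nm", "customer_name"),
            Mapping("cust_id", "customer_key")]
updated = synchronize(mappings, "cust_nm", "customer_full_name")
```

Run against 1,500 mappings instead of two, the same pass completes in milliseconds – the several-days-to-same-hour compression described above.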
Data Intelligence: Speed and Quality Without Compromise
Our clients often understand that incredible DevOps improvements are possible, but they fear the “work” it will take to get there.
It really comes down to a choice: embrace change via automation or continue down the same path. But isn’t the definition of insanity doing the same thing over and over while expecting different results?
With traditional means, you may improve speed but sacrifice quality. On the flip side, you may improve quality but sacrifice speed.
However, erwin’s technology shifts this paradigm. You can have both speed and quality.
All enterprise stakeholders – data scientists, data stewards, ETL developers, enterprise architects, business analysts, compliance officers, CDOs and CEOs – can then access data relevant to their roles for insights they can put into action.
erwin DI creates the fastest path to value, with an automation framework and metadata connectors configured by our team to deliver the data harvesting and preparation features that make capturing enterprise data assets fast and accurate.
Click here to request a free demo of erwin DI.