What’s wrong with using Jenkins to push database changes?
…or XebiaLabs XL Deploy or IBM Urbancode or Automic?
If you are involved in any part of your company’s process to build and deliver software, you’d agree that there’s an incessant push to go faster while simultaneously maintaining or increasing quality. That’s a fine outcome to desire, but things get tough when your team has to deliver faster and at higher quality without any additional resources.
Across industries, software teams have turned to “DevOps” and “Agile” tools and processes with the intention of achieving faster application releases and higher quality code without necessarily increasing staffing. The name of the game, as you probably know, is automation and process re-engineering.
As the DevOps phenomenon continues its viral growth, organizations are aggressively investing in and rolling out continuous integration (CI) tools such as Jenkins, Bamboo, Team Foundation Server, and Travis CI, and continuous delivery (CD, or release automation) tools like IBM UrbanCode Deploy, Spinnaker, and XebiaLabs XL Deploy. While these tools are effective in bringing continuous integration and continuous delivery to application code, they simply do not appropriately address the deployment of database changes.
Database Releases Without Database Release Automation
Database changes – typically in the form of SQL scripts – can certainly be committed to source code control. These scripts can be checked out of source code control and packaged into an artifact such as a ZIP file with a CI tool like Jenkins. The resulting artifact could be pushed to an artifact repository, from which a release automation tool like XebiaLabs XL Deploy could deploy it to environments along the release pipeline, all the way to production. While it may very well seem like database continuous integration and database continuous delivery can be accomplished with tools like Jenkins, Automic, or Electric Cloud ElectricFlow, organizations that stop there are setting themselves up for failure.
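To make the packaging step concrete, here is a minimal Python sketch of what such a CI job does: gather the SQL scripts from a checked-out repository and bundle them into a ZIP artifact. The function name and layout are illustrative, not part of any of the tools named above.

```python
import zipfile
from pathlib import Path

def package_sql_scripts(script_dir: str, artifact_path: str) -> list[str]:
    """Bundle all SQL scripts under script_dir into one ZIP artifact,
    mirroring what a CI job (e.g., Jenkins) would push to an artifact
    repository. Returns the archived script names in deterministic order."""
    scripts = sorted(Path(script_dir).rglob("*.sql"))
    with zipfile.ZipFile(artifact_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for script in scripts:
            # Store paths relative to the repo root so the artifact is portable
            zf.write(script, script.relative_to(script_dir))
    return [str(s.relative_to(script_dir)) for s in scripts]
```

The sorted order matters: migration scripts are typically numbered so they replay in sequence.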
It’s obvious but worth repeating: databases have state. Consequently, database changes must be carefully managed, because a bad change can corrupt that state. While an older version of an application can simply be replaced by overwriting it with an updated version, the same isn’t true for the database. You cannot ‘blow away’ an old version of the database and overwrite it with an updated version as you can with most application executables, and a bad database release can result in irrecoverable data loss or a substantial application outage.
Fundamentally, relying solely on build, configuration, and release automation tools for managing database deployments puts data at risk. Given the consequences of a bad change, database changes are often handled in a separate, manual process. Organizations are left living with the lower application release velocity and lower code quality that are inherent in a manual database change process. In the end, if an organization relies solely on build and release automation tools, application and database changes will never flow through the release pipeline at the same pace – to the detriment of the customer experience. This is precisely where Datical fits in. Datical brings DevOps to the Database and allows organizations to apply database release automation so they can safely bring the database into continuous integration and continuous delivery.
Database Releases with Datical
Datical is a database release automation solution that provides the necessary capabilities to bring safe automation to database releases, from development to production. Whether it is developers needing to rework database changes or database professionals needing to perform an audit, Datical augments build and release automation tools with a number of essential capabilities that are required when looking to bring database changes into the same single, unified release pipeline that application code flows through.
Database Code Development
Just as with any code, it’s not uncommon for a developer to rework a database change. Unfortunately, because databases retain state, reworking a database change currently requires an order of magnitude more effort than reworking any other type of code. Typically, a DBA must perform a manual, out-of-band effort to revert a database environment before a developer can rework a change. Without manually undoing the change that needs rework, teams end up rolling forward with a fix in lower-level environments such as DEV while applying a completely different change to higher-level environments that were never exposed to the original change. This breaks the fundamental DevOps principle of “build once, deploy often,” because the deployment to higher environments isn’t consistent with the deployment to lower environments.
Datical enables database continuous integration and dramatically simplifies the process of reworking database changes. With Datical, developers can treat database code just like application code and check an updated version of the database change into source code control. Datical intelligently sanitizes lower level environments where an older version of the change was made and applies the updated version, and properly applies only the updated change to higher level environments that were never exposed to the old version of the change. Build and release tools are not able to intelligently manage this type of change. Effectively, Datical eliminates the separate, manual effort, and allows for a consistent artifact that can be deployed through the pipeline.
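The core idea – track what each environment has already seen, and plan a different set of steps accordingly – can be sketched in a few lines of Python. This is a toy model of changelog-based deployment planning, not Datical’s actual mechanism; the function names and checksum scheme are assumptions for illustration.

```python
import hashlib

def checksum(sql: str) -> str:
    """Fingerprint a changeset so a reworked version is detectable."""
    return hashlib.sha256(sql.encode()).hexdigest()[:12]

def plan_deploy(changelog: list[tuple[str, str]],
                applied: dict[str, str]) -> list[tuple[str, str]]:
    """Compute the deployment plan for one environment.

    changelog: ordered (change_id, sql) pairs from source control.
    applied:   change_id -> checksum of the version already in this environment.
    Returns an ordered list of (action, change_id) steps.
    """
    plan = []
    for change_id, sql in changelog:
        current = checksum(sql)
        if change_id not in applied:
            # Environment never saw this change: apply the new version only.
            plan.append(("apply", change_id))
        elif applied[change_id] != current:
            # Old version is present: back it out, then apply the rework.
            plan.append(("revert", change_id))
            plan.append(("apply", change_id))
        # Matching checksum: nothing to do.
    return plan
```

A DEV environment holding the old version of a change gets a revert-then-apply plan, while a QA environment that never saw it gets a plain apply – the same artifact drives both, which is what keeps the pipeline consistent.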
Database Code Validation
Once a developer checks in code, it’s customary for the code to go through a series of automated tests during the build process. As a professional, you are no stranger to Jenkins linting your source code, pushing it through Gradle (or a similar build tool), running your application through JUnit and Cucumber, and, should everything go smoothly, pushing a nice application artifact to an artifact repository such as Nexus or JFrog Artifactory. Sadly, the same isn’t true today for database code. The validation of database code is typically a completely manual process – which should be a jaw-dropping showstopper.
This is where database continuous integration and continuous delivery capabilities from Datical saves the day. With Datical’s Dynamic Rules Engine, organizations can codify standards and best practices, and automate the validation of database code. Datical employs an object-based rules engine as opposed to a simple regular-expression engine so that functional rules such as limiting the total number of indices per table can be easily achieved. Tools such as XebiaLabs XL Deploy or Jenkins or Ansible or whatever other CI, CD, or configuration automation solution you have cannot perform this level of validation. Datical’s Dynamic Rules Engine eliminates the tedious manual effort otherwise required by database professionals and enables developers to get near-instant feedback on database changes submitted to source code control, just as they currently get with application code.
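The difference between an object-based rule and a regular-expression check is easiest to see in code. A regex can only scan script text, but a rule that operates on a parsed schema model can count things. Here is a hypothetical sketch of the index-limit rule mentioned above, using an assumed dictionary-shaped schema model rather than Datical’s actual API:

```python
def check_max_indexes(tables: dict[str, dict], max_indexes: int = 3) -> list[str]:
    """Evaluate a 'maximum indexes per table' rule against a parsed schema
    model. tables maps table name -> {"indexes": [...]}. Returns a list of
    human-readable violations; an empty list means the rule passed."""
    violations = []
    for name, table in tables.items():
        count = len(table.get("indexes", []))
        if count > max_indexes:
            violations.append(
                f"{name}: {count} indexes exceeds limit of {max_indexes}")
    return violations
```

Because the rule works on schema objects rather than raw SQL text, it catches the violation no matter how the index-creation statements are spread across scripts.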
Database deployments are delicate matters – as previously noted, a bad database deployment can be costly to recover from and possibly fatal to the organization. To eliminate errors or issues in database deployments, Datical’s Change Management Simulator builds an in-memory model of the target database, applies the proposed changes to the model, and verifies that the final state of the model meets expectations. Once again, your standard CI and CD tools do not have the ability to simulate the impact of database changes before they are deployed – they can only blindly apply SQL scripts to environments – which, if done all the way to your production server, is likely to guarantee a SEV1 outage in no time. Datical provides the necessary safeguards in the database release process so organizations can comfortably bring automation to database deployments without introducing new risk. In fact, Datical’s automated simulation usually reduces risk and helps organizations achieve much higher success rates on first-time deployments.
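The simulate-before-deploy pattern itself is simple to illustrate. The following minimal Python sketch (an assumption for illustration, not the Change Management Simulator) applies proposed changes to an in-memory copy of a schema model and raises on a conflict, so errors surface before any real database is touched:

```python
import copy

def simulate(model: dict, changes: list[dict]) -> dict:
    """Apply proposed changes to a deep copy of an in-memory schema model and
    return the resulting state. Raises ValueError on any conflict, so a bad
    change is caught before it reaches a live database."""
    state = copy.deepcopy(model)  # never mutate the real model
    for change in changes:
        if change["op"] == "create_table":
            if change["table"] in state:
                raise ValueError(f"table {change['table']} already exists")
            state[change["table"]] = {"columns": list(change["columns"])}
        elif change["op"] == "add_column":
            table = state.get(change["table"])
            if table is None:
                raise ValueError(f"table {change['table']} does not exist")
            if change["column"] in table["columns"]:
                raise ValueError(f"column {change['column']} already exists")
            table["columns"].append(change["column"])
        else:
            raise ValueError(f"unknown operation {change['op']}")
    return state
```

If `simulate` returns a final state that matches expectations, the same change set can be released with far more confidence; if it raises, the pipeline stops long before production.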
Reporting and Auditing
Whether operating in a heavily regulated industry or not, auditing and reporting on database changes is a best practice in maintaining hygiene. Datical’s Deployment Management Console includes a real-time dashboard that enables stakeholders to quickly track and report on the status of every database deployment across the enterprise. Powerful filtering tools built into the interface make it possible to get immediate insight into any specific database change that has been made to a Datical managed database. Datical provides a pipeline view and tracks database deployment velocity at each step in each pipeline so that bottlenecks can be quickly identified and addressed as part of a continuous improvement process. The Deployment Management Console makes historical reports readily accessible to simplify audit and compliance tasks. This much-needed level of granularity in reporting is simply not available in CI and CD tools.
As organizations strive to streamline their software releases, they have turned to build and release automation tools. While these tools have transformed the delivery of application code, they lack the capabilities to do the same for database code. Given that the end user experience depends on both the application code and the database code, organizations are increasingly finding that the database release process is a bottleneck – in spite of the automation that has been brought to the application release process. By adopting Datical, organizations can bring DevOps to the Database and realize more value from their investments in application build and release automation tools while accelerating the delivery of the entire software stack.
To learn more about database continuous integration and continuous delivery please check out this Refcard from DZone: DevOps for Database.