The Complete Guide to Database Deployment 

If you build and deliver software for your company, you know there’s a constant push to go faster while maintaining or increasing quality. That’s a reasonable goal, but it gets tough when your team has to deliver faster and at higher quality without any additional resources. Add the challenges that database deployments present, and your team can quickly become a bottleneck.

Across industries, software teams have turned to DevOps and Agile practices to achieve faster application releases and higher-quality code. The name of the game is automation and process re-engineering.

As the DevOps phenomenon continues to grow, organizations are aggressively investing in and rolling out continuous integration (CI) and continuous delivery (CD) tools.

They might also be relying on CD (or release automation) tools like IBM UrbanCode Deploy, Spinnaker, and XebiaLabs XL Deploy. While these tools can bring CI and CD to application code, they do not adequately address the deployment of database changes.

Database Releases Without Database Release Automation

Before you even think about deployment, you need a reliable way to manage the flow of database changes. Database changes, often in the form of SQL scripts, can be committed to source code control. A CI tool like Jenkins can check these scripts out of source code control and package them into an artifact such as a ZIP file. The resulting artifact can then be pushed to an artifact repository, from which a release automation tool like XebiaLabs XL Deploy can deploy it to environments along the release pipeline, all the way to production.
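To make that packaging step concrete, here is a minimal Python sketch, assuming a checked-out directory of SQL scripts and a CI-supplied build number (both hypothetical names); a real pipeline would typically run something like this inside a Jenkins job.

```python
# Minimal sketch of the packaging step described above: collect SQL change
# scripts from a checked-out repository and bundle them into a versioned ZIP
# artifact that a release automation tool can pick up. Paths and the version
# string are illustrative, not tied to any specific product.
import zipfile
from pathlib import Path

def package_sql_changes(repo_dir: str, build_number: str, out_dir: str = "dist") -> Path:
    """Bundle all .sql scripts under repo_dir into a versioned ZIP artifact."""
    scripts = sorted(Path(repo_dir).rglob("*.sql"))
    if not scripts:
        raise RuntimeError(f"No SQL scripts found under {repo_dir}")

    artifact = Path(out_dir) / f"db-changes-{build_number}.zip"
    artifact.parent.mkdir(parents=True, exist_ok=True)

    with zipfile.ZipFile(artifact, "w", zipfile.ZIP_DEFLATED) as zf:
        for script in scripts:
            # Preserve the repository-relative path inside the archive.
            zf.write(script, script.relative_to(repo_dir))
    return artifact

# Example: a CI job might call this after checkout, e.g.
# package_sql_changes("db/changes", build_number="142")
```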

While it may seem that database CI and CD can be accomplished with tools like Jenkins alone, organizations that try are setting themselves up for failure.

It’s obvious (but worth repeating) that databases have state. Database changes must be managed carefully because you do not want to corrupt that state. While an older version of an application can be replaced by overwriting it with an updated version, the same isn’t true of a database. Worse, a bad database release can result in irrecoverable data loss for the organization or a major outage for an application.
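One consequence of that statefulness is that database deployments roll forward: each environment applies only the change scripts it has not yet seen, typically recorded in some kind of changelog table. The sketch below illustrates the idea with a hypothetical change_log table and SQLite standing in for the target database; it is not any particular vendor’s schema.

```python
# Minimal sketch of roll-forward change tracking, assuming a hypothetical
# change_log table that records which change scripts have already been applied.
# It illustrates why a database is never simply "overwritten" like an app binary.
import sqlite3  # stand-in for any relational database

def apply_pending_changes(conn: sqlite3.Connection, changes: dict[str, str]) -> list[str]:
    """Apply only the change scripts this database has not yet seen."""
    conn.execute("CREATE TABLE IF NOT EXISTS change_log (change_id TEXT PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT change_id FROM change_log")}

    newly_applied = []
    for change_id, sql in changes.items():
        if change_id in applied:
            continue  # already part of this environment's state; never re-run
        conn.executescript(sql)
        conn.execute("INSERT INTO change_log (change_id) VALUES (?)", (change_id,))
        newly_applied.append(change_id)
    conn.commit()
    return newly_applied
```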

Relying solely on build, configuration, and release automation tools for database deployments puts data at risk. Given the consequences of a bad change, database changes are often handled in a separate, manual process. Organizations are left living with the lower application release velocity and lower code quality that are inherent in a manual database change process.

In the end, if your team relies solely on build and release automation tools, application and database changes will never flow through the release pipeline at the same pace, to the detriment of the customer experience.

Working with Datical can help you safely automate database changes from development to production. Whether developers need to rework database changes or database professionals need to perform an audit, Datical can help. Our solutions augment build and release automation tools with a number of essential capabilities. Together, these tools and automation unify database changes into the same pipeline that application code flows through.

Database Code Development

It’s not uncommon for a developer to rework a database change prior to deployment. But because databases retain state, reworking a database change currently requires more effort than reworking any other type of code. A DBA must make a manual, out-of-band effort to revert a database environment so the developer can rework the change. If the original change isn’t manually undone, a roll-forward fix is applied in lower-level environments such as DEV, while a completely different change is applied to higher-level environments that were never exposed to the original change. This breaks the fundamental DevOps concept of “build once, deploy often,” because the deployment to higher environments isn’t consistent with the deployment to lower environments.

Datical enables database CI and simplifies the process of reworking database changes. With Datical, developers can treat database code just like application code and check an updated version of the change into source code control. Datical intelligently sanitizes lower-level environments where the older version of the change was applied, then applies the updated version. It also applies only the updated change to higher-level environments that were never exposed to the old version. Regular build and release tools are not able to manage this type of change intelligently. In effect, Datical eliminates the separate, manual effort and allows a consistent artifact to be deployed through the pipeline.
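As an illustration only (not Datical’s actual mechanism), reworked changes can be thought of in terms of checksums: an environment that already ran the old version of a change needs it reverted and replaced, while an environment that never saw it simply receives the new version.

```python
# Illustrative sketch (not Datical's implementation) of spotting a reworked
# change by comparing checksums against what each environment has already applied.
import hashlib

def change_checksum(sql: str) -> str:
    return hashlib.sha256(sql.encode("utf-8")).hexdigest()

def plan_for_environment(change_id: str, new_sql: str, applied: dict[str, str]) -> str:
    """Decide how to handle a (possibly reworked) change for one environment.

    `applied` maps change ids already run in that environment to their checksums.
    """
    new_sum = change_checksum(new_sql)
    old_sum = applied.get(change_id)
    if old_sum is None:
        return "apply new version"                          # environment never saw the change
    if old_sum == new_sum:
        return "skip, already applied"                      # nothing to do
    return "revert old version, then apply new version"     # the change was reworked

# Example: DEV ran the original script, PROD never did.
# plan_for_environment("042_add_index", new_sql, applied=dev_history)   -> revert, then apply
# plan_for_environment("042_add_index", new_sql, applied=prod_history)  -> apply new version
```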

Database Code Validation

Once a developer checks in code, it’s customary for that code to go through a series of automated tests during the build process. Database code is not so lucky: its validation is typically a completely manual process.

This is where database CI and CD capabilities from Datical save the day. With Datical’s Dynamic Rules Engine, you can codify standards and best practices while automating the validation of database code. Datical employs an object-based rules engine rather than a simple regular-expression engine, so functional rules, such as limiting the total number of indexes per table, can be expressed easily.
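To give a rough feel for what an object-based rule does (this is a generic sketch, not Datical’s rule syntax), consider a check that flags any table whose proposed changes would push it past a maximum index count:

```python
# Generic sketch of an object-based validation rule (not Datical's syntax):
# the rule inspects a structured model of the schema rather than
# pattern-matching raw SQL text.
from dataclasses import dataclass, field

@dataclass
class TableModel:
    name: str
    indexes: list[str] = field(default_factory=list)

def rule_max_indexes(tables: list[TableModel], limit: int = 5) -> list[str]:
    """Return a violation message for every table that exceeds the index limit."""
    return [
        f"Table {t.name} has {len(t.indexes)} indexes (limit is {limit})"
        for t in tables
        if len(t.indexes) > limit
    ]

# Example: evaluate the rule against the proposed post-change schema model.
violations = rule_max_indexes([
    TableModel("orders", indexes=["ix_customer", "ix_date", "ix_status",
                                  "ix_region", "ix_total", "ix_sku"]),
])
if violations:
    raise SystemExit("\n".join(violations))  # fail the build with clear feedback
```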

Tools such as XebiaLabs XL Deploy, Jenkins, Ansible, or whatever other CI, CD, or configuration automation solution you have cannot perform this level of validation. Datical’s Dynamic Rules Engine eliminates the tedious manual effort otherwise required of database professionals, and it lets developers get near-instant feedback on database changes submitted to source code control, just as they currently get with application code.

Deploying the Database

A bad database deployment can be costly to recover from and possibly fatal to the organization. To catch errors and issues before they reach a live environment, Datical’s Change Management Simulator can:

  • Build an in-memory model of the target database
  • Apply proposed changes to the model
  • Verify that the final state of the model meets expectations

Standard CI and CD tools can’t simulate the impact of database changes before they are deployed. Instead, they blindly apply SQL scripts to environments, risking a SEV1 outage and a corrupted database state if a bad script makes it all the way to production. Datical provides safeguards in the database release process so teams can automate database deployments with far less risk. In fact, Datical’s automated simulation reduces risk and helps teams achieve much higher success rates on first-time deployments.
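To make the simulate-then-verify idea concrete, here is a toy sketch (purely illustrative, not how Datical implements it) in which proposed changes are applied to an in-memory SQLite copy of the schema and expectations are checked before anything touches a real environment.

```python
# Toy sketch of "simulate, then verify" (illustrative only): apply proposed
# changes to an in-memory copy of the schema and check expectations before
# any real environment is touched.
import sqlite3

def simulate_and_verify(current_schema: str, proposed_changes: list[str],
                        expected_tables: set[str]) -> None:
    sim = sqlite3.connect(":memory:")     # in-memory model of the target database
    sim.executescript(current_schema)     # bring the model to its current state
    for change in proposed_changes:
        sim.executescript(change)         # apply each proposed change to the model

    tables = {row[0] for row in sim.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")}
    missing = expected_tables - tables
    if missing:
        raise AssertionError(f"Simulation failed: missing tables {missing}")

# Example run against a trivial schema and change set.
simulate_and_verify(
    current_schema="CREATE TABLE customers (id INTEGER PRIMARY KEY);",
    proposed_changes=["CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER);"],
    expected_tables={"customers", "orders"},
)
```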

Reporting and Auditing

After a database deployment, you need to stay accountable and monitor for problems that may arise. Whether or not you operate in a heavily regulated industry, auditing and reporting on database changes is a best practice. Datical’s Deployment Management Console includes a real-time dashboard that lets stakeholders track and report on the status of every database deployment across the enterprise. Powerful filtering tools built into the interface provide immediate insight into any specific change made to a Datical-managed database.

Datical provides a pipeline view and tracks database deployment velocity at each step in each pipeline, which helps identify and fix bottlenecks quickly. The Deployment Management Console also generates historical reports that help simplify audit and compliance tasks.
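As a simple illustration of what tracking velocity per pipeline step can look like (a generic metric sketch using made-up timestamps, not the console’s actual calculation), one could measure how long changes take to move between stages and flag the slowest hop as the bottleneck:

```python
# Generic sketch (not the console's actual calculation): measure how long each
# change takes to move between pipeline stages to spot the slowest hop.
from datetime import datetime
from statistics import mean

# Hypothetical deployment timestamps per change, keyed by pipeline stage.
history = {
    "change-101": {"DEV": datetime(2024, 5, 1), "QA": datetime(2024, 5, 3), "PROD": datetime(2024, 5, 10)},
    "change-102": {"DEV": datetime(2024, 5, 2), "QA": datetime(2024, 5, 6), "PROD": datetime(2024, 5, 14)},
}

stages = ["DEV", "QA", "PROD"]
for earlier, later in zip(stages, stages[1:]):
    lags = [(h[later] - h[earlier]).days for h in history.values() if earlier in h and later in h]
    print(f"{earlier} -> {later}: average {mean(lags):.1f} days")  # the largest average is the bottleneck
```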

Need Help Automating Your Database Deployment?

By adopting Datical, organizations can bring DevOps to the database, realize more value from their investments in application build and release automation tools, and accelerate delivery of the entire software stack.

Check out the DZone DevOps for Database Ref Card to learn more about database continuous integration and delivery.

Seeing is believing.