In reading case studies, surveys, and research articles on DevOps, I am reminded of a cartoon I saw taped to my high school physics teacher’s file cabinet. Two scientists stand in front of a complex proof scrawled on a chalkboard. Separating steps 1 and 3 of the proof is the statement, “Then a Miracle Occurs.” One of the scientists points to the statement and says, “I think you should be more explicit here in step 2.”
This is an apt metaphor for the DevOps zeitgeist’s avoidance of the database. ISVs and analysts spend a lot of time talking about how to streamline, accelerate, and automate application development and testing (Step 1) and how to monitor and automate the deployment of those applications (Step 3). But when it comes to managing the database change necessary to support the application and its release processes, it seems that everyone is waiting for that miracle to occur.
This oversight is unfortunate. DBAs have a lot to offer when it comes to correlating the development of technology with the management of the environment in which it’s hosted. In a sense, DBAs have been DevOps all along.
On the ‘Dev’ side, DBAs carefully evaluate each change request to ensure that it is well thought out, is compliant with organizational best practices, and won’t have unintended consequences for database performance or the validity of dependent objects. They have developed and tested all of the SQL that has materially changed the database, crafting it into what it is today.
On the ‘Ops’ side, DBAs have designed and provisioned the data platform. They are in charge of monitoring their databases and keeping them available and high-performing. They manage access to and the overall security of the platform. They perform release activities in support of the application and troubleshoot any errors that occur during that process or during day-to-day operation.
So why isn’t the larger DevOps community including the data platform in the conversation? Why aren’t DBAs a crucial part of every DevOps team? Why are we relying on a miracle like the scientist in the cartoon?
Every decision a DBA makes, every process or standard they put in place, is influenced by the dual nature of their role. They are used to thinking about the impact of any task, requirement, or problem from both sides. It is deeply ingrained in them. They are deliberate and meticulous in their approach to change management because they understand that what they do has impact beyond what’s directly in front of them. They rely heavily on their expertise and their deep understanding of their organization’s data strategy to ensure they are making safe, future-proof decisions to the best of their ability.
This also explains why we haven’t seen the same proliferation of automation and orchestration solutions for the database that we have seen for other domains in the SDLC. True database automation has to be more than “run this script after that one” or “make this object look like that object over there.” To have a meaningful impact on your release throughput and staff productivity your database automation has to make use of your team’s intelligence. Any solution you implement has to harness the collective experience and knowledge of your DBAs and apply that knowledge to the evaluation of changes as they’re introduced in development. It has to be…miraculous.
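To make that idea concrete, consider what “applying your DBAs’ knowledge to changes as they’re introduced” might look like at its simplest: a library of organization-specific rules that every proposed SQL change is screened against before deployment. The sketch below is a hypothetical illustration of that pattern, not Datical DB’s actual implementation; the rule names and patterns are invented examples of the kinds of standards a DBA team might encode.

```python
import re

# Hypothetical organizational standards, encoded as named patterns that
# flag risky constructs in a proposed SQL change script.
RULES = {
    # Dropping a table should go through a separate, supervised process.
    "no-drop-table": re.compile(r"\bDROP\s+TABLE\b", re.IGNORECASE),
    # A DELETE with no WHERE clause wipes the whole table.
    "no-unfiltered-delete": re.compile(r"\bDELETE\s+FROM\s+\w+\s*;", re.IGNORECASE),
    # SELECT * in a view breaks silently when columns change.
    "no-select-star-view": re.compile(
        r"\bCREATE\s+VIEW\b.*\bSELECT\s+\*", re.IGNORECASE | re.DOTALL
    ),
}

def analyze(sql: str) -> list[str]:
    """Return the names of every rule the change script violates."""
    return [name for name, pattern in RULES.items() if pattern.search(sql)]

# A change submitted in development is checked before it ever reaches a release.
violations = analyze("DELETE FROM orders;")
print(violations)
```

The point of the pattern is that the rules, once written down, run automatically against every change in every environment, so a reviewer’s hard-won judgment is applied consistently instead of being rediscovered one script at a time. A real pre-deployment analysis would go well beyond regexes (parsing the SQL, checking dependent objects, modeling the target schema), but the shape is the same: capture expert knowledge once, evaluate every change against it.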
This is the whole reason we started Datical. When it comes to completing your DevOps vision, we wanted Datical to provide a real answer to Step 2 from that cartoon that has stuck with me all these years. Our flagship product, Datical DB, was designed and implemented to address the unique challenges of automating database changes in support of application releases. It automates and accelerates the propagation of database changes without sacrificing safety. We provide a highly customizable automation framework that is fueled by your experts. It allows you to define standards specific to your organization that are incorporated into our pre-deployment analysis. This greatly reduces the time spent manually reviewing SQL and checking the logic of your scripts for errors that would cause application failures and downtime.
About the Author: Pete Pickerill
Pete Pickerill is Vice President of Products and Co-founder of Datical. Pete is a software industry veteran who has built his career in Austin’s technology sector. Prior to co-founding Datical, he was employee number one at Phurnace Software and helped lead the company to a high profile acquisition by BMC Software, Inc. His ability to understand product demands from a customer’s perspective and translate those demands into actionable product and development plans has led to expanded duties at every company for which he’s worked. In his time at Symantec, Pete worked directly with large financial services companies and online retailers to implement and improve online fraud prevention solutions that were used daily by millions of people worldwide. In addition to managing a development team and product schedule at BMC, Pete worked directly with large financial services institutions to provide the features they needed to achieve greater efficiency and cost savings in their application deployment processes.