DevOps has quickly matured from a fairy tale to a bestseller at most large enterprises. However, we are still at the beginning of the DevOps story, with many exciting chapters still to be written. This article looks at several emerging technologies shaping DevOps today and a handful of developing trends that will shape the ecosystem in the years to come.
Today’s IT organizations are increasingly moving away from monolithic applications deployed through grand-slam releases with security slapped on after the fact. We are seeing many enterprises transitioning to more atomic applications released in continuous deployments with application security baked in from the beginning. Fueling today’s emerging DevOps is a bevy of new products focused on architectures including microservices, service mesh and function-as-a-service (FaaS); advancements in pipeline automation; and more adaptable security paradigms such as interactive application security testing (IAST) and runtime application self-protection (RASP). While these solutions herald a great step forward, they do demand that legacy applications be refactored into smaller-granularity services, layered onto newer architectures and migrated to processes that allow greater automation.
OK, that’s where we sit today: Great progress creating great opportunities and great challenges. But let’s head a little further down the road to understand where today’s advancements are driving DevOps tomorrow.
There are five trends consistently appearing on the whiteboards of venture firms and forward-thinking startups that will shape DevOps in the upcoming years. At a macro level, DevOps must become wider and smarter. As we dig a layer deeper, we see five distinct patterns in which emerging technologies will make the release pipeline more insightful and proactive while allowing it to handle a wider spectrum of pipeline artifacts beyond just applications:
- AI for DevOps: Artificial intelligence (AI) is making inroads into all facets of IT and DevOps is no exception. AI techniques soon will make the DevOps pipeline smarter, able to predict the impact and risk of deployments, spot procedural bottlenecks and identify automation shortcuts. Advancements in robotic process automation (RPA) lend themselves directly to optimizing the various handoff and automation points of the release pipeline. AI-based predictive analytics also will allow for higher fidelity operational capacity planning and pre-deployment fault prediction.
- DevOps for AI: Today we are seeing a handful of new tools, commonly called MLOps, emerging on the market that allow developers and modelers to deploy new AI models rapidly and repeatably. However, these products have yet to collaborate closely with the Ops side of the enterprise. They are generally a development aid, not an operational aid—or said another way, they are all “ML” and very little “Ops.” This space will mature rapidly, however, allowing AI models to be released through the DevOps ecosystem with their operational aspects included. Conversely, tomorrow’s DevOps pipeline will also allow AI models that have been learning from real-world data to be extracted from production and moved back into the development environment.
- DBOps: Today, new databases, schemas, indexes, triggers and stored procedures typically are deployed manually by DBAs outside the DevOps pipeline, with their operational impact gauged by best-guess capacity planning. This glaring gap soon will be closed by products that allow databases and their associated artifacts to be released through the deployment pipeline along with the applications with which they are coupled. This will not just encompass traditional relational databases, of course, but also support non-structured databases and object stores. The prime objective is that applications and their supporting data repositories will be developed, tested and deployed via the same automated pipeline, without the need for specialized database handholding.
- DataOps: When new apps are deployed, the associated analytics and visualizations are typically brought online by data scientists after the fact through manual deployments. Looking forward, expect data analytics to be introduced into the DevOps pipeline soon after the requirements for the release are defined. Much like DBOps above, the goal is to eliminate specialized manual promotion. But beyond just synchronous deployment, DataOps will allow data governance, data loss prevention (DLP), data lifecycle management and copy data management features to be integrated during each new deployment.
- Blockchain for DevOps: Today’s and tomorrow’s DevOps evolutions will increase the velocity, variety and volume of pipeline traffic dramatically. While this sounds promising, it undoubtedly will be a nightmare to audit, verify compliance and ultimately control and manage. One technology being eyed to untangle this spaghetti is a distributed consensus mechanism such as blockchain. Stop rolling your eyes—we’re not talking about Bitcoin or ICOs. We are describing the need for a permissioned, secure, distributed ledger to record, track, authenticate and audit the torrent of pipeline transactions. Blockchain seems to fit the bill nicely and also has the added benefit of enabling digital contracts at each pipeline stage that can be verified at runtime to trigger artifact promotion and also audited long after the fact to ensure that all rules were followed.
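To make the last idea concrete, here is a toy sketch of the core property such a pipeline ledger relies on: each recorded event embeds the hash of the previous entry, so altering any historical record invalidates everything after it. This is only an illustration of hash chaining in plain Python—the stage names and event fields are hypothetical, and a real permissioned ledger would add distributed nodes, consensus and access control on top.

```python
import hashlib
import json

GENESIS_HASH = "0" * 64  # placeholder hash for the first entry


def record_event(ledger, event):
    """Append a pipeline event (a plain dict) to a hash-chained ledger.

    The entry's hash covers both the event payload and the previous
    entry's hash, chaining every record to the one before it.
    """
    prev_hash = ledger[-1]["hash"] if ledger else GENESIS_HASH
    payload = json.dumps({"event": event, "prev_hash": prev_hash},
                         sort_keys=True).encode()
    ledger.append({
        "event": event,
        "prev_hash": prev_hash,
        "hash": hashlib.sha256(payload).hexdigest(),
    })


def verify(ledger):
    """Re-derive every hash in order; return False if any entry was altered."""
    prev_hash = GENESIS_HASH
    for entry in ledger:
        payload = json.dumps({"event": entry["event"], "prev_hash": prev_hash},
                             sort_keys=True).encode()
        if (entry["prev_hash"] != prev_hash
                or entry["hash"] != hashlib.sha256(payload).hexdigest()):
            return False
        prev_hash = entry["hash"]
    return True


# Record two hypothetical pipeline stages, then tamper with history.
ledger = []
record_event(ledger, {"stage": "build", "artifact": "app:1.4.2", "status": "passed"})
record_event(ledger, {"stage": "security-scan", "artifact": "app:1.4.2", "status": "passed"})
assert verify(ledger)

ledger[0]["event"]["status"] = "failed"  # rewrite history...
assert not verify(ledger)                # ...and the chain no longer verifies
```

An auditor holding only the latest hash can detect tampering anywhere upstream, which is exactly the property that makes a shared ledger attractive for compliance across a high-volume release pipeline.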
DevOps has already become the standard deployment paradigm in many enterprise IT shops and its adoption will continue to become more pervasive. But this growth requires DevOps products and processes to become smarter, more secure and more controllable as the release pipeline incorporates a wider set of deployment artifacts beyond just applications. This is a daunting challenge to be sure, but there are exciting new products that will emerge to help write the next chapters in the DevOps story.