How is DevOps changing the way we compute? One big DevOps innovation is helping developers and admins consume exactly the resources they need to deploy an app: nothing more and nothing less.
Traditionally, using (and paying for) only what you need has been tough in all parts of life. If I want to park my car for just 10 minutes on a Baltimore street, I have to pay $1 for a half-hour’s worth of time, because that’s the minimum parking fee. In similar fashion, I pay about $50 a month for the heating bill in the unoccupied house I just bought, even though the heat is turned off, because that’s the minimum monthly charge.
The problem of having to pay for more than you actually need was a big issue in the pre-DevOps digital world, too. In the past, if you wanted to deploy an app, you had two options. The first was to run it on a bare-metal server. If you did that, to help mitigate the risk of the server becoming overloaded, you’d dedicate more resources to the app than would be necessary during normal usage. That meant paying for capacity that you were not using most of the time.
The other option was to use a virtual machine (VM). Thanks to tricks such as live migration and “hot-plugging,” commercial virtualization platforms provided some elasticity to VMs, helping to ensure that the VMs consumed only the resources they needed at any given time. But virtualization still means you are wasting resources on hardware emulation and running guest operating systems that needlessly duplicate most of the functionality of the host. Even the leanest VM environment requires you to dedicate more resources to deploying an app than the app actually requires.
DevOps and the Consumption Revolution
DevOps technologies are changing all of this. They are revolutionizing the way we consume digital resources and introducing efficiencies that were not possible in the age before DevOps.
The most obvious example of such innovation is containers. With a containerized app, you get the portability and elasticity that traditionally have been available only from virtual machines. But with containers, you don’t have to waste resources on virtualization. Your apps consume only what they need at any given moment.
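As a rough illustration of that principle, the sketch below uses the Docker SDK for Python (the `docker` package) to start a container with an explicit memory and CPU cap, so the app can never claim more of the host than it is allotted. The image name and limit values are hypothetical placeholders, not anything prescribed by the article.

```python
# Sketch: run a containerized app with explicit resource caps using the
# Docker SDK for Python (pip install docker). Image name and limits are
# illustrative placeholders.
import docker

client = docker.from_env()  # connects to the local Docker daemon

container = client.containers.run(
    "myorg/myapp:latest",    # hypothetical application image
    detach=True,
    mem_limit="256m",        # hard memory ceiling for the container
    nano_cpus=500_000_000,   # ~0.5 CPU (nano_cpus is in units of 1e-9 CPUs)
    name="myapp",
)

print(container.status)
```

The point of the caps is that the app is boxed in by cgroup limits rather than by the size of a guest operating system, so the rest of the host’s capacity stays free for other workloads.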
Technologies such as Lambda functions on AWS are bringing the same efficiency to the public cloud. With Lambda, you don’t have to pay for virtual servers, memory or disk space on an ongoing basis just to ensure that they’re available when you need them. Instead, you can use Lambda functions to get instant access to a large amount of compute capacity in the cloud, and stop paying for it the millisecond you no longer need it.
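To make that concrete, here is a minimal sketch of a Lambda function in Python. AWS invokes the handler on demand and bills only for the time it spends executing; the event fields and response shape below are hypothetical examples, not part of any particular application.

```python
# Minimal AWS Lambda handler (Python runtime). You pay only for the time
# this function spends executing, not for idle server capacity.
# The event fields and response shape are hypothetical examples.
import json

def lambda_handler(event, context):
    # Pull a value out of the incoming event (e.g., passed by API Gateway).
    name = event.get("name", "world")

    # Do the work and return; billing stops when the function returns.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Deploying something like this is a matter of pointing the function’s handler setting at `lambda_handler`; no servers are provisioned up front, and nothing runs between invocations.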
Also worth noting is all of the automation that DevOps brings to software delivery by promoting continuous delivery practices. In the age of waterfall development, each part of the organization had to wait for other teams to push code or test results down the pipeline before it could perform its own tasks. With continuous delivery, everyone works continuously and in parallel. The result is far less wasted time and idle talent.
In a multitude of ways, then, DevOps is optimizing digital consumption. In the future, we’ll wonder why we ever accepted “minimum” fees or had to pay for more than what we needed to get the job done.