Developing applications and deploying servers faster are worthy goals. Being agile and streamlined can give an organization a clear competitive advantage. But faster does not automatically mean better, and speed for its own sake invites avoidable mistakes. Faster is only better if you can achieve it without sacrificing quality.
At the same time that companies are racing to outpace their rivals, they’re also contending with increasingly complex IT infrastructure. IT environments are becoming highly heterogeneous, merging physical, virtual, and cloud platforms and combining legacy systems with cutting-edge technologies. Managing such a diverse infrastructure only adds to the problems that can arise as organizations try to move faster.
That is where DevOps plays an important role. DevOps tools and technologies establish a process chain between development and deployment, enabling organizations and IT admins to reliably track the components in use and understand any changes or updates that occur. To achieve speed without sacrificing quality, it’s important to keep the environment as consistent as possible and to know exactly what has changed.
“For hypervisor-based virtual images, this depends on the strict process around handling the images, but for container-based virtual images depending on a cascading snapshot mount namespace, where there are layers of images built one on top of the other, strict handling becomes unnecessary because everyone can see transparently (via the image cascade) what changes at each stage of the process,” explained James Bottomley, CTO of server virtualization for Parallels. “This gives everyone in the DevOps chain the ability to verify each step and see the changes for themselves, leading to much higher assurance that the resulting deployment is correctly tested and released.”
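Bottomley’s layered-image cascade is easier to see in miniature. The following Python sketch is purely illustrative (real container engines implement layers with copy-on-write filesystems, not dictionaries); it models a cascade in which each stage records only its own changes, so anyone in the DevOps chain can audit exactly what changed at each step:

```python
# Illustrative sketch only -- not Parallels or Docker code. Each layer
# records just its own changes; the effective filesystem view and the
# change history are both recovered by walking the cascade.

class Layer:
    def __init__(self, name, changes, parent=None):
        self.name = name        # e.g. "base-os", "runtime", "app"
        self.changes = changes  # path -> content changed in this layer
        self.parent = parent    # the layer this one was built on top of

    def resolve(self):
        """Merge the cascade bottom-up into one effective view."""
        view = self.parent.resolve() if self.parent else {}
        view.update(self.changes)
        return view

    def history(self):
        """Expose what changed at every stage of the cascade."""
        chain = self.parent.history() if self.parent else []
        return chain + [(self.name, sorted(self.changes))]

base = Layer("base-os", {"/etc/os-release": "v1"})
runtime = Layer("runtime", {"/usr/bin/python": "3.9"}, parent=base)
app = Layer("app", {"/srv/app.py": "v42"}, parent=runtime)

for stage, changed in app.history():  # each step is independently auditable
    print(stage, "changed:", changed)
print(app.resolve())                  # the view actually deployed
```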
Ken Cheney, VP of business development for Chef, describes another way DevOps boosts speed without compromising quality. “Automation turns infrastructure into code. For example, Chef recipes automate all facets of management and orchestration—from provisioning bare-metal and cloud servers to automated configuration and application deployment. You can replace manual, slow, error-prone procedures with code that is versionable, testable and repeatable.”
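Chef recipes themselves are written in a Ruby DSL, so the following is a hedged illustration only: a small Python sketch of the pattern Cheney describes, where a declarative resource converges the system toward a desired state and is safe to run repeatedly.

```python
# Illustrative sketch only -- not Chef's actual DSL. It models the
# declarative, idempotent pattern: describe the desired state in code,
# and let the tool converge the machine toward it.
# Assumes a Debian/Ubuntu host and sufficient privileges to install.

import subprocess

class Package:
    def __init__(self, name):
        self.name = name

    def is_installed(self):
        # dpkg -s exits 0 when the package is installed.
        return subprocess.run(
            ["dpkg", "-s", self.name], capture_output=True
        ).returncode == 0

    def converge(self):
        """Install only if needed -- running twice changes nothing."""
        if not self.is_installed():
            subprocess.run(["apt-get", "install", "-y", self.name], check=True)
            return f"{self.name}: installed"
        return f"{self.name}: already up to date"

# The "recipe": a versionable, repeatable description of desired state.
desired = [Package("nginx"), Package("git")]
for resource in desired:
    print(resource.converge())
```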
“When infrastructure is code you can test it just as you would applications—using automated tests that find problems early, before you release to production,” added Cheney.
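Continuing the illustrative Python sketch above (in real Chef workflows, tools such as ChefSpec and Test Kitchen fill this role), automated tests can assert the desired state before anything reaches production:

```python
# Hypothetical pytest-style tests for the sketch above; assumes the
# Package class and `desired` list from the previous example are in scope.

from unittest.mock import patch

def test_desired_state_declares_required_packages():
    # Policy check: these packages must always be part of the recipe.
    assert {"nginx", "git"} <= {p.name for p in desired}

def test_converge_skips_already_installed_packages():
    pkg = Package("nginx")
    with patch.object(Package, "is_installed", return_value=True):
        # When the state already matches, converge must be a no-op.
        assert pkg.converge() == "nginx: already up to date"
```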
In its 2014 State of DevOps Report, Puppet Labs identified an interesting correlation between speed and quality. Essentially, Puppet Labs found that the two reinforce each other in a self-sustaining cycle.
“There’s a virtuous circle at work here: As stability improves, IT performance improves. This improved performance helps to create a better-functioning business that can pay attention to the communications and processes that enhance and improve stability.”
The converse is also true. Poor quality leads to problems that force an organization to divert resources to troubleshooting and resolution, which impedes the speed at which it can function.
DevOps tools and technologies enable organizations to streamline development and work more efficiently without sacrificing quality. Businesses that try to work faster without also working smarter, by adopting DevOps, put themselves at a disadvantage.