We’re in a strange phase of the pendulum swing right now, where speed of delivery is king and all else seems to take a back seat. Many organizations even place quality behind speed with the claim that “we’ll get it right later,” but here I’m talking about the more universal rush to deploy and the infrastructures we’re building to do it.
In short, look around your organization. Do you have apps more than 10 years old? Are you still building the same types of apps today? Now project your mind 10 years ahead and consider what maintaining what you build now will be like for someone then.
Future-Proofing Delivery
You cannot stop the advance of technology, but you can make the environment adaptable. There are some easy tricks to provide for the future when it comes to your DevOps toolset.

Virtually Perfect
First, build your continuous integration/continuous deployment/application release automation (CI/CD/ARA) architecture on virtual machines (VMs) or cloud instances. If your standard is containers, then use them, but run them on top of a VM. The reason is simple: given the vagaries of business environments and IT priorities, a fully installed VM gives you a buffer; you have everything needed to run the environment in a contained package. Containers alone won’t be good enough, because the underlying OS affects how containers behave. As an extreme example, the switch to Red Hat Enterprise Linux (RHEL) 7.0 was massive and broke things. A DevOps environment based solely on containers would break with it, while a VM running RHEL 5 or 6.x would keep clipping along for as long as your virtualization environment kept running.
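The dependence on the underlying OS can be made mechanical instead of discovered the hard way. Here is a minimal sketch, assuming a Linux host that exposes an /etc/os-release file: a bootstrap check that the host matches the distribution and major version your environment was validated against, so an unplanned host upgrade fails loudly instead of silently changing container behavior. The function names and the expected values are hypothetical placeholders.

```python
def parse_os_release(text):
    """Parse /etc/os-release-style KEY=value lines into a dict."""
    info = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or "=" not in line:
            continue
        key, _, value = line.partition("=")
        info[key] = value.strip('"')
    return info

def host_matches(info, expected_id, expected_major):
    """True if the host is the distro/major version the environment was validated on."""
    version = info.get("VERSION_ID", "")
    return info.get("ID") == expected_id and version.split(".")[0] == expected_major

# Example with a sample RHEL 7 os-release payload (placeholder values).
sample = 'NAME="Red Hat Enterprise Linux"\nID="rhel"\nVERSION_ID="7.9"\n'
info = parse_os_release(sample)
print(host_matches(info, "rhel", "7"))  # the version the toolchain was tested on
print(host_matches(info, "rhel", "8"))  # an unplanned major upgrade: refuse to proceed
```

A real bootstrap script would read the file from the host and abort on a mismatch; the point is that the OS version becomes an explicit, checked part of the environment rather than an invisible assumption.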
Get Versions In-House
While the executables for any CI/CD/ARA toolset and its development languages will be in-house, there is far more to a modern development environment than just those parts. The libraries and modules your code relies on must also be available. Increasingly, we pull them at build time, but that becomes a problem when one is suddenly changed in an incompatible way or goes missing altogether. There is now a market of hosted, verified module vendors that guarantee the availability of a given version of each module they support, and that is one option; a better one is to pull each dependency once and put it into your own enterprise repository. Is it another thing to maintain? Yes. But it might save your project from an emergency patch, and frankly, that kind of insurance is worth it.
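Mirroring a dependency only helps if builds actually ask for that exact version. As one illustration, here is a minimal sketch, assuming a Python shop with a requirements.txt: a check that every dependency line is pinned to an exact version, so nothing floats to a newer release than the one sitting in your internal repository. The helper is hypothetical, not part of any real toolset.

```python
import re

def unpinned_requirements(lines):
    """Return requirement lines that are not pinned to an exact version.

    A pinned line uses '==' (e.g. 'requests==2.31.0'); ranges, bare
    names, or missing versions can all drift between builds.
    Comments and blank lines are ignored.
    """
    bad = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        # Require exactly 'name==version' with a concrete version string.
        if not re.match(r"^[A-Za-z0-9_.\-\[\]]+==[A-Za-z0-9_.\-]+$", line):
            bad.append(line)
    return bad

# Example: flag anything that would float to a new version on the next build.
reqs = [
    "requests==2.31.0",  # pinned: will resolve from the internal mirror
    "flask>=2.0",        # range: will drift
    "numpy",             # unpinned: will drift
]
print(unpinned_requirements(reqs))  # ['flask>=2.0', 'numpy']
```

Wired into CI as a failing check, this turns “someone forgot to pin” from a surprise outage into a one-line fix at review time; the same idea applies to npm lockfiles, Maven version ranges, and so on.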
Choose Wisely
Like any relatively new market, there are a lot of CI/CD/ARA toolsets out there. Do evaluations; don’t just grab the one that a team member likes or is familiar with. Look at the longevity and stability of the project/vendor as much as the tool, because we’re talking about hedging for the future. In the early days (and even occasionally today), open source aficionados would say “We have the source, we’ll do fine,” which is technically true, but most organizations aren’t in the business of maintaining their dev/deploy tools and don’t want to be, so a little research goes a long way.
Plan as Though You Chose Poorly
Know what you’ll do if, in spite of evaluations, the project/product you chose goes away. While it’s not CI/CD, I was on a team that selected Mambo right before the big team split that created Joomla and doomed Mambo. Our choice was intelligently made based upon available information, but things still came around to burn us. And it could happen to your choice, too, whether you choose open source or commercial. So have an idea of what you would do in that scenario and document it. If we adopt enough IT tools, it’s guaranteed that some of them will fail over time, so this is not a totally wasted exercise. And should you ever need it, knowing what the plan is before things are falling apart helps keep level heads.
Quality Over Speed
Remember that time to delivery is genuinely important and for decades was not given the credence it deserves. But as I mentioned above, recognize that right now the pendulum has swung so far the other way that many shops have a “just get it out, we’ll make it better later” mentality. Don’t join in where your DevOps environment is concerned. The architecture you are constructing should be built for the long run, unless you enjoy rebuilding it regularly and the delays to production code that such rebuilding brings.
Security, Security, Security
DevOps and security have had a rough relationship, but let’s talk about your DevOps environment rather than security’s impact on rapid delivery. Once you’ve consolidated your entire build into one easy-to-use tool, the risk from a successful outsider or a risky insider grows commensurately. Use tools that support role-based access control (RBAC), and make sure application programming interfaces (APIs) are properly locked down. Give security the time it needs to build a list of recommendations for DevOps toolsets that will protect you in almost any eventuality. The combined ability to tamper with both the build process and the deployment process through one automated toolset is powerful and dangerous, so make sure the toolset is monitored and secured.
Consider a Modicum of Standardization
In my entire career, “We are an X shop” has never seemed to work. I’ve been at places that tried to use a single database vendor or a single development language, and inevitably, other forces override those directives. But there is a benefit to keeping the number of languages or database variants down, and that benefit is felt most in the CI/CD space. Keeping track of which database goes with which project, keeping each one up to date, or determining whether a new release will introduce incompatibilities all falls on the DevOps environment. Keeping the amount of accumulated future change down is in the organization’s best interest.
Make Cool Stuff
Every time I think I’ve maxed out on awe, something new comes along. We are on the cusp of fully automated build/test/deploy, with ARA tools calling both CI/CD tools and application provisioning tools. Keep pushing the edge; give vendors your feedback or projects your mods, and make it better. Because there’s a fundamental change underway here that we want to see through to the end. And I love playing with the toys.