The rise of DevOps is more an evolution than a revolution, and developer jobs aren’t going anywhere
Development died a long time ago and was replaced by DevOps. Or, at least that was the way it was supposed to happen.
In reality, developers are doing just fine. There are currently almost 150,000 developer jobs listed on one major job site, compared with only about 16,000 positions for DevOps. It seems that the revolution in which developers would be replaced by DevOps engineers never actually happened.
In fact, as we’ve previously argued, many people find that having DevOps in their job title actually harms them when it comes to finding the right job. In this article, we’ll look at why.
The Death of the Developer
The “death” of the developer has been prophesied for quite some time now. Ever since the creation of the concept of DevOps, in fact. The most vocal and prominent instance of this argument was offered by programmer Jeff Knupp back in 2014, who used a blog post to argue that the popularity of DevOps meant that programmers were increasingly unable to focus on what they do best: writing code.
Instead, argued Knupp, the tyranny of DevOps was forcing coders to become generalists rather than specialists. They were being forced, he said, to include on their CVs an ever-growing number of “skill sets,” some of which were pretty tangential to their core role within an organization. Instead of traditional developers, it was feared, an army of “DevOps engineers” would rise who could just about do a wide range of tasks, but do none of them very well.
In the interests of balance, it’s worth pointing out that Knupp, along with many who have repeated his argument, has a vested interest in “defending” traditional coders against the rise of DevOps. It might be true that a thorough knowledge of GDPR compliance processes is not technically necessary for a developer, but some of the skills the DevOps movement highlighted, not least backend security, are best addressed at the development stage. Developers, in other words, shouldn’t be given a free pass when it comes to thinking about the way in which their creations are actually used.
Adapt and Survive
Nonetheless, the popularity of the concept of DevOps has forced many developers to adapt their skill sets, or at least the way they present them. Concepts such as CI/CD, segmentation of internal communication, automation and “continuous everything” have become central to the way many developers work and are highly sought after. We have DevOps to thank for that.
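To make the CI/CD idea concrete, here is a minimal sketch of what a pipeline boils down to: an ordered series of automated gates, where a failure at any stage stops everything downstream. The stage names and checks below are hypothetical placeholders standing in for real tools (a linter, a test runner, a build step), not the API of any particular CI platform.

```python
def run_pipeline(stages):
    """Run each (name, check) stage in order; stop at the first failure.

    Returns a list of (name, passed) results for the stages that ran.
    """
    results = []
    for name, check in stages:
        passed = check()
        results.append((name, passed))
        if not passed:
            break  # fail fast: later stages never run
    return results

# Hypothetical stages standing in for real CI tools.
stages = [
    ("lint", lambda: True),
    ("test", lambda: False),   # a failing test gate
    ("build", lambda: True),   # never reached
]

results = run_pipeline(stages)
```

The fail-fast ordering is the whole point: the cheap checks run first, and a broken test means no build and no deploy, which is exactly the discipline DevOps pushed developers to internalize.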
In addition, the rise of DevOps has made room for an ever-expanding set of novel approaches. Five years ago, it seemed that a new approach to IT management was being developed once a month. First came DevOps, then DevSecOps, then QAOps. This explosion of new approaches does seem to be calming down, but they remain influential on at least two levels.
The first is that, although none of these approaches (and arguably even DevOps, as we’ll see below) have developed into distinct disciplines, they have subtly redefined the way in which developers are understood by the average organization. The second is that, in this context, many developers have begun to stress that they have cross-cutting expertise. In other words, they have had to adapt the way in which they present themselves to survive.
The Death of DevOps
Because of this, we should see the rise of DevOps more as an evolution than a revolution.
At the broadest level, DevOps has had a hugely beneficial effect on the process of software development. There is an expectation that developers will have at least a passing acquaintance with the systems and processes that IT teams use. This was not always the case in the past. Developers today have to know how things such as cloud IAM frameworks or APM platforms work, and why they are important for deploying and managing the applications that developers write.
In this context, it seems strange that some coders ever worried that their job would be replaced. While it’s true that developers are now expected to have much broader skills than they did 10 years ago, organizations still need people who can actually code, and will for decades yet (until we are all replaced with AI, that is).
It seems that some of us were too eager to see a dramatic revolution, whereas the reality has been more subtle and much slower. And indeed, in many cases the “transition” to DevOps has been almost invisible: new graduates are now trained in the basic principles of DevOps, without this ever having become a discrete college course.
Many of these new graduates, in fact, would be hard-pressed to define what DevOps actually is. But that’s not because it’s gone away; quite the opposite. It is now such a fundamental part of the way that most of us do our jobs that it is losing its identity as a separate idea. This means, eventually, that it will not be “development” that dies, but “DevOps” as a distinct addition to it.
In short, DevOps has not killed the developer, and there is no evidence that it ever will. Some of us, it seems, were being a little dramatic. On the other hand, DevOps has had huge effects on the way that developers work, because it points to the skills that the best developers have. For that reason, there remain tens of thousands of jobs for developers in the conventional sense.
Ultimately, developers work alongside DevOps engineers; they don’t compete with them for the same roles. So instead of seeing ourselves in competition, let’s try a little collaboration.