Given the myriad benefits they provide, it seems odd that big data and DevOps have so far been implemented independently of each other
If you have been working from home during the recent lockdowns, you probably don’t feel like you are making many new connections, but when we come to look back on this time in a few years, it might prove to have been one of the most effective networking sessions of the past decade.
Nowhere is this more true than in the twin worlds of big data and DevOps. It seems that being forced to work from home, far from locking engineers in both disciplines deeper within their silos, has brought them out of their shells. In recent months, we’ve seen an outpouring of speculation on how DevOps and big data can work together, and even a number of impassioned manifestos on why it is time to bring DevOps ideas to big data.
In this article, we’ll look at why this crossover is suddenly so popular, how these two disciplines can engage in a symbiotic relationship with each other, and how they are already doing so.
The Pandemic and the Silo
For consumers, the experience of lockdown has accelerated internet usage trends. While some of the changes that have recently swept the online world were easy to guess, the speed at which they have taken hold has been astounding. It’s therefore no exaggeration to say, as the New York Times put it recently, that the virus has changed the way we use the internet.
These consumer trends are driving changes throughout the software development industry, forcing firms to adapt to rapid release, testing and update cycles.
One of the biggest changes in the way that developers collect data over the past decade has been the source of that data. This year, largely due to the COVID-19 pandemic, we passed a notable watershed: Mobile devices now account for more than half of all internet traffic, producing a flow of data that creates challenges of its own. This increased data load, arriving from consumer mobile devices rather than from corporate data acquisition systems, is forcing teams to work more closely together.
In the words of Bill Detwiler, editor in chief of TechRepublic, in a recent CBS interview: “The COVID-19 pandemic has accelerated the blending of data analytics and DevOps, meaning developers, data scientists, and product managers will need to work more closely together than ever before.” For most firms, the way in which these consumer trends are fed into the development life cycle is via analytics teams. This has meant that, during the pandemic, analytics teams were generating actionable insights at an increased rate, and this has been the driving force behind the newly intimate relationship between big data and DevOps staff.
Waterfall vs. DevOps
To understand the way in which big data and DevOps teams can best work together, it’s instructive to visualize the way in which developers, operational staff and data analytics agents work.
A decade ago, most software developers were using some form of waterfall development methodology. As the name suggests, this approach was essentially linear—the specifications for software were set in advance, and development then flowed through sequential phases, with requirements, design, implementation and testing each completed before the next stage began, and each handled by a separate team.
The DevOps framework changed that by encouraging communication and collaboration between operations and development staff. The idea, in short, was to ensure that software underwent continual testing—not just in terms of usability and security, but also against the requirements of the operations staff who were charged with maintaining it.
Now, we might be facing a shift of similar scale. Many firms have adopted big data frameworks, but these systems generally have been seen as a way of harvesting and collating market insights. This was always a waste of big data's potential, because as long as it was regarded as part of marketing, its utility in other parts of an organization was difficult to discern.
Not any longer, it seems. The changes caused by the pandemic—both in consumer behavior and in the way that organizations communicate internally—have given developers an appreciation of the value of the data that analytics teams hold.
Pioneers Show the Way Forward
It’s also worth noting that big data and DevOps teams working in tandem is not a utopian vision; rather, this way of working was already being explored within leading companies even before the pandemic.
In retrospect, in fact, it seems strange that these two teams ever worked independently. If you're a transportation company, the ability to track your loads on the road and the condition of the cargo they're carrying has become mission-critical. If you're in the armed forces and using drones on the battlefield to conduct and report reconnaissance in real-time flyovers, the ability to adapt software parameters on the fly can be the difference between success and failure.
The military is, in fact, the place to look for examples of how big data and DevOps will shortly become one process. Big data in the military has long been more advanced than that in the civilian sector and is beginning to inform everything from IT predictive analytics to real-time usage analytics.
The Future of Big Data and DevOps
There remain, of course, significant challenges to overcome before the promise of DevOps working with big data teams can be realized. Perhaps the thorniest is that computing performance is crucial for big data in a way it hasn't been for development: Development teams may need to build out the infrastructure and skills required to make use of the insights that big data teams generate.
Still, communication can only be a good thing, and if the pandemic has forced some of us to emerge from our bunkers and start to work together, then at least one good thing will have come from it.