Thanks to the rise of artificial intelligence (AI), the nature of DevOps is about to change as developers spend less time writing code in favor of training models based on various types of algorithms to accomplish a task.
Dan Scholnick, general partner with Trinity Ventures, an early investor in companies such as New Relic and Docker Inc., says developers in 2018 will absolutely be required to master machine and deep learning algorithms to stay relevant. Once a model based on those algorithms is incorporated into an application, the challenge for the DevOps team will be implementing the processes needed to ensure that model remains tuned and optimized for the business process it's supposed to drive.
Now that the price of cloud services based on graphics processing units (GPUs) has dropped and open source tools for building AI models such as TensorFlow are readily available, Scholnick says the cost of infusing AI models into applications has fallen considerably. Before the end of the year, most users of business-to-business (B2B) applications will expect those applications to exhibit many of the same natural language and speech recognition capabilities that are already becoming commonplace in consumer applications, he says.
Serverless computing frameworks also will make it easier to scale those applications up and down as needed using event-driven architectures. Most instances of serverless computing, such as AWS Lambda on Amazon Web Services, are employed on a public cloud. But as serverless computing becomes more common, Scholnick says Trinity Ventures expects competition for workloads among cloud service providers to become fierce as IT organizations continue to shy away from proprietary technologies.
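To make the event-driven model concrete, here is a minimal sketch of a serverless function following the AWS Lambda Python handler convention (an event dict in, a JSON-serializable response out). The `score_text` function is a hypothetical stand-in for an AI model call; the event shape mimics an API Gateway request and is an assumption for illustration.

```python
import json

def score_text(text: str) -> float:
    # Hypothetical placeholder for an AI model inference call;
    # here, a trivial length-based score capped at 1.0.
    return min(len(text) / 100.0, 1.0)

def handler(event, context):
    # The platform invokes this once per event; the function itself
    # holds no server state, which is what lets it scale up and down.
    body = json.loads(event.get("body", "{}"))
    score = score_text(body.get("text", ""))
    return {
        "statusCode": 200,
        "body": json.dumps({"score": score}),
    }

# Locally, the handler can be exercised by passing a dict that mimics an event:
result = handler({"body": json.dumps({"text": "hello serverless"})}, None)
```

Because each invocation is independent, the cloud provider can run as many copies in parallel as incoming events require, which is the scaling behavior described above.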
Scholnick says the rise of serverless computing doesn't necessarily mean batch-oriented applications will disappear. But it does mean that a much higher percentage of the applications being built will be either real-time or near-real-time.
The challenge most organizations will face building these applications is not necessarily developing them, as the tools needed to infuse AI into applications are themselves becoming more automated. The challenge will be in collecting enough data to inform the AI model. Most of the algorithms being employed to build AI applications are decades old; the only thing that has really changed is the cost of acquiring all the data required to drive those algorithms—it has dropped substantially, thanks primarily to the rise of cloud computing. Because of those lower costs, DevOps teams can expect that not only will new applications be infused with AI capabilities, but also there will be a significant effort to modernize or replace legacy applications.
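The point about decades-old algorithms can be made concrete with a small sketch: least-squares regression fit by plain gradient descent, a technique that long predates the current AI wave. The algorithm itself is a few lines; the synthetic `data` list stands in for the collected business data that actually drives it.

```python
def fit_line(points, lr=0.01, steps=2000):
    """Fit y = w*x + b to (x, y) pairs by plain gradient descent
    on the mean squared error -- a decades-old method."""
    w, b = 0.0, 0.0
    n = len(points)
    for _ in range(steps):
        # Gradients of the mean squared error with respect to w and b.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in points) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in points) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Synthetic data drawn from y = 2x + 1; given enough data,
# the old algorithm recovers the underlying trend.
data = [(x, 2 * x + 1) for x in range(10)]
w, b = fit_line(data)
```

The same pattern scales up: modern deep learning swaps the linear model for a neural network and the list comprehension for a data pipeline, but the bottleneck remains acquiring the data, not the optimization loop.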
At this juncture it's not at all clear that DevOps teams are prepared to incorporate AI technologies into application development processes. But pressure to accomplish that goal is about to increase: A new market research report published by Tractica forecasts that annual worldwide AI software revenue will increase from $3.2 billion in 2016 to $89.8 billion by 2025. Most of that growth will be driven by 266 AI use cases distributed across 29 industries, according to Tractica. Whatever the outcome, however, the DevOps landscape in 2018 is about to be transformed utterly once again.