Google this week added a generative artificial intelligence (AI) tool, dubbed Gemini Cloud Assist, to automate a wider range of tasks across the software development lifecycle, including troubleshooting applications running on the Google Cloud Platform (GCP) and surfacing recommendations to improve performance and enhance security.
Announced at the Google Cloud Next ’24 conference, Gemini Cloud Assist is integrated with an existing publish-and-subscribe capability that enables DevOps teams to launch generative AI prompts to automate tasks that previously would have required them to write a script. Gemini Cloud Assist can also generate architecture configurations based on the code being deployed.
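Google has not detailed the underlying interface, but conceptually the pattern resembles publishing a natural-language prompt as a message that downstream automation consumes. The following Python sketch uses the google-cloud-pubsub client to illustrate the idea; the project, topic name and message attributes are hypothetical, not a documented Gemini Cloud Assist API.

```python
# Hypothetical sketch: publishing a natural-language prompt to a Pub/Sub
# topic consumed by a downstream automation service. The topic name and
# message attributes are illustrative, not a documented Gemini Cloud Assist API.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# "cloud-assist-prompts" is an assumed topic name for illustration only.
topic_path = publisher.topic_path("my-gcp-project", "cloud-assist-prompts")

prompt = "Scale the checkout service to handle 2x traffic and flag any security risks."

# Pub/Sub payloads are bytes; attributes carry routing metadata.
future = publisher.publish(
    topic_path,
    prompt.encode("utf-8"),
    requester="devops-team",   # hypothetical attribute
    environment="staging",     # hypothetical attribute
)
print(f"Published prompt, message ID: {future.result()}")
```

The appeal of this pattern is that the prompt replaces the script: the team states the desired outcome in a message, and the platform decides how to carry it out.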
At the same time, Google is making available via a private preview an update to its Gemini Code Assist tool that generates code using a large language model (LLM) with a one-million-token context window.
Brad Calder, vice president and general manager for GCP and technical infrastructure, told conference attendees that Gemini Code Assist will now generate code using version 1.5 of the Gemini LLM, which has been trained on code vetted by Google to produce more reliable outputs.
The challenge going forward will be determining where generative AI tools provided by a cloud service provider such as Google leave off and where those provided by Google partners such as GitLab begin.
GitLab has been working closely with Google to develop its Duo AI capabilities using core services that Google makes available via its Vertex AI platform for managing machine learning operations (MLOps).
David DeSanto, chief product officer for GitLab, said that as AI continues to be applied to DevSecOps workflows, the routing capabilities developed by GitLab will make it simpler to orchestrate prompts that can be applied to multiple LLMs. That approach provides a layer of abstraction from the underlying LLMs, making it simpler to swap models out as new capabilities are added, he noted. Most DevOps teams are not going to need to access an LLM directly because the natural language prompts they create will be executed by the underlying platform, DeSanto added.
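GitLab has not published the internals of that routing layer, but the abstraction DeSanto describes typically amounts to a router that maps each prompt to whichever backend model is currently configured, so models can be swapped without touching workflow code. A minimal Python sketch of that pattern follows; the class, task and function names are illustrative, not GitLab's actual API.

```python
# Illustrative sketch of an LLM routing layer: callers submit prompts to the
# router rather than to a specific model, so backends can be swapped or added
# without changing workflow code. All names here are hypothetical.
from typing import Callable, Dict

class LLMRouter:
    def __init__(self) -> None:
        # Map task types to backend completion functions.
        self._backends: Dict[str, Callable[[str], str]] = {}

    def register(self, task: str, backend: Callable[[str], str]) -> None:
        """Attach (or replace) the model used for a given task type."""
        self._backends[task] = backend

    def complete(self, task: str, prompt: str) -> str:
        """Route a prompt to whichever model is configured for this task."""
        if task not in self._backends:
            raise KeyError(f"No model registered for task: {task}")
        return self._backends[task](prompt)

# Stand-in backends; in practice these would call real model APIs.
def code_model(prompt: str) -> str:
    return f"[code-model] {prompt}"

def chat_model(prompt: str) -> str:
    return f"[chat-model] {prompt}"

router = LLMRouter()
router.register("code_suggestion", code_model)
router.register("vulnerability_summary", chat_model)

# Workflow code stays the same if a newer model is later registered
# for "code_suggestion" -- only the registration changes.
print(router.complete("code_suggestion", "Write a unit test for parse_config()."))
```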
It’s not clear to what degree AI is capable of automating DevSecOps workflows, but as the reasoning engines embedded in LLMs become more advanced, the ability of these platforms to automate manual tasks is only going to increase. As those manual tasks are eliminated, developer productivity should increase substantially because far less time will be needed to, for example, spin up a remote development environment.
Rather than tracking outdated metrics such as the amount of code generated, however, IT leaders should focus on the rate at which test failures decline as the quality of the code being written steadily improves, said DeSanto. Too many developers today spend the bulk of their time trying to get code that runs on their laptop to execute in production environments, he noted.
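As a back-of-the-envelope illustration of the metric DeSanto describes, a team could track the test failure rate per CI run and its week-over-week trend rather than the volume of code produced. The following Python sketch uses invented sample data.

```python
# Toy illustration of tracking a declining test-failure rate across CI runs,
# rather than counting generated lines of code. The sample data is invented.
runs = [
    {"week": 1, "tests": 1200, "failures": 96},
    {"week": 2, "tests": 1250, "failures": 75},
    {"week": 3, "tests": 1310, "failures": 52},
    {"week": 4, "tests": 1400, "failures": 42},
]

rates = [r["failures"] / r["tests"] for r in runs]
for r, rate in zip(runs, rates):
    print(f"week {r['week']}: failure rate {rate:.1%}")

# Week-over-week change in failure rate; a steady negative trend suggests
# code quality is improving as AI-assisted changes land.
deltas = [b - a for a, b in zip(rates, rates[1:])]
print("avg weekly change:", f"{sum(deltas) / len(deltas):+.2%}")
```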
In addition, it will become much easier to onboard new members to a software engineering team as, for example, summaries of how existing code works become readily available, said DeSanto.
Ultimately, the number of applications any DevOps team can build, deploy and manage is about to increase substantially, without requiring IT organizations to hire a small army of additional software engineers to build, deploy, manage and secure them.