GitLab extended its alliance with Google as part of an effort to bring more generative artificial intelligence (AI) capabilities to DevOps workflows.
The GitLab suite of software-as-a-service (SaaS) applications already resides on the Google Cloud Platform, which gives GitLab the foundation of data required to train its AI models. Over the course of the last two months, GitLab has added numerous capabilities that rely on multiple types of AI technology.
For example, there is now an experimental Explain This Vulnerability capability that provides a natural-language summary of a security issue in a way both developers and cybersecurity teams can easily comprehend.
Taylor McCaslin, a product group manager for data science and AI/machine learning at GitLab, said that going forward, most of the AI focus will be on generative AI capabilities. Those capabilities will be enabled by Google using a large language model (LLM) that GitLab developed for DevOps workflows. That approach enables GitLab to surface more accurate recommendations based on validated data than the general-purpose LLM used to create the ChatGPT service can provide.
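In practice, the pattern looks roughly like the following sketch, which asks a Vertex AI text model to explain a scanner finding in plain language. This is purely illustrative and not GitLab's actual implementation; the model name, prompt and helper function are assumptions.

```python
# Illustrative sketch only -- not GitLab's implementation.
# Assumes the google-cloud-aiplatform package and access to a Vertex AI text model.
import vertexai
from vertexai.language_models import TextGenerationModel


def explain_vulnerability(finding: str, project: str, location: str = "us-central1") -> str:
    """Ask a Vertex AI text model for a plain-language summary of a scanner finding."""
    vertexai.init(project=project, location=location)
    model = TextGenerationModel.from_pretrained("text-bison")  # assumed model name
    prompt = (
        "Explain the following security scanner finding for a developer audience, "
        "including why it matters and how it is typically remediated:\n\n"
        f"{finding}"
    )
    response = model.predict(prompt, temperature=0.2, max_output_tokens=512)
    return response.text


# Example usage with a hypothetical SAST finding:
# print(explain_vulnerability(
#     "CWE-89: SQL injection in app/models/user.rb line 42",
#     project="my-gcp-project"))
```

The value of tuning the model on DevOps-specific, validated data is that the same prompt returns explanations grounded in the platform's own security context rather than generic advice.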
In addition, GitLab can continuously update the AI models it runs on the Google Vertex AI cloud service using data from its SaaS application environment, which is continuously monitored and updated, McCaslin noted.
It’s not clear what impact AI may have on DevOps workflows, but GitLab is forecasting a 10x improvement. That will be accomplished by, for example, surfacing code that can be used to remediate a vulnerability. Today, many vulnerabilities go unaddressed simply because developers don’t have enough time to write a patch.
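A hedged sketch of what that remediation step could look like, again assuming the Vertex AI SDK rather than GitLab's own internals: a flagged snippet and its advisory are handed to a code model, which returns a candidate patch for a developer to review. The model name and function are hypothetical.

```python
# Illustrative sketch only -- the remediation flow GitLab ships may differ.
# Assumes google-cloud-aiplatform and access to a Vertex AI code model.
import vertexai
from vertexai.language_models import CodeGenerationModel


def suggest_fix(vulnerable_snippet: str, advisory: str, project: str) -> str:
    """Ask a Vertex AI code model to propose a patched version of a flagged snippet."""
    vertexai.init(project=project, location="us-central1")
    model = CodeGenerationModel.from_pretrained("code-bison")  # assumed model name
    prefix = (
        f"# Security finding: {advisory}\n"
        "# Rewrite the code below so the finding no longer applies, preserving behavior.\n"
        f"{vulnerable_snippet}\n"
    )
    response = model.predict(prefix=prefix, temperature=0.1, max_output_tokens=1024)
    return response.text
```

The point of the 10x forecast is not that such a suggestion ships automatically, but that developers start from a candidate patch instead of a blank editor.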
A recent GitLab survey, however, suggested developers are already embracing AI to improve productivity, with 62% of developers using AI and machine learning algorithms to check code. More than a third (36%) also rely on them to review code.
At this juncture, the one thing that is certain is that AI and associated technologies are going to make developers more productive. It’s far less apparent what impact the increased volume of code moving simultaneously through DevOps pipelines will have on the software engineers who manage those processes. The expectation is that similar AI advances will enable more code to flow through those pipelines without, hopefully, exacerbating existing bottlenecks.
In the meantime, it’s clear the AI genie is out of the bottle. There will soon be more LLMs for all kinds of tasks. DevOps teams should start planning today on the assumption that many of the manual tasks that conspire to make software engineering tedious are going to fade away. As such, the roles within a DevOps team are going to change and evolve. The assumption those teams should make is that these changes will be for the better. After all, the reason organizations embraced DevOps in the first place was to ruthlessly automate IT processes—AI is simply the latest iteration of that commitment.