Tabnine this week made available in beta a tool that enables developers to use natural language to interact with the artificial intelligence (AI) models embedded within its test automation platform.
Brandon Jung, vice president of ecosystems for Tabnine, said Tabnine Chat extends generative AI capabilities the company developed to make it simpler for developers to invoke its test automation platform from within their integrated development environment (IDE).
In addition to making it possible to search a codebase more efficiently, that capability can also be used to explain how a piece of code functions, said Jung.
Developers can also generate new code via a natural language interface using Tabnine Chat in the same way other generative AI tools do, he added.
In addition to developing its own large language models (LLMs), Tabnine previously announced it will be integrating LLMs provided by Google as part of a strategy to mix and match LLMs regardless of where they are hosted.
Unlike general-purpose LLMs, the ones used by Tabnine were trained on a narrower base of testing code that the company curated from tests made freely available on the internet. No customer data was used to train any of those models, so developers do not need to worry that their code may inadvertently be accessed by anyone else, noted Jung.
In general, as AI makes test creation simpler, tests will be run more frequently, said Jung. Over time, those AI capabilities and more frequent tests should significantly improve the overall quality of applications running in production environments, he added.
It’s now only a matter of time before generative AI is applied across the entire software development life cycle (SDLC). Testing, however, may be one area where it will have the earliest impact.
In the meantime, the debate over the impact generative AI will have on DevOps continues. Some proponents contend most of the software engineering process is about to become automated, while others suggest there will always be a need for a human to be in the loop of any process automated using AI models. Much of the current level of toil associated with DevOps today will be reduced, but the need for software engineers to manage the overall process will remain, they contend.
The one thing that is certain is that the pace at which applications are built and deployed is about to be greatly accelerated. Prior to the rise of generative AI, testing processes were barely keeping pace, so as more code is created with the help of AI, the need for similar advances in testing will become acute.
There is as much irrational exuberance about AI as there is fear and loathing, but DevOps teams that committed early on to ruthless automation will naturally be at the forefront of adoption. The immediate challenge is assessing which functions will be automated and the impact those changes will have on the way DevOps teams are currently structured. Ultimately, the goal is to eliminate as many as possible of the bottlenecks that conspire to slow application development and deployment while simultaneously improving the quality of the application experience.