
Tabnine Extends Generative AI Testing Platform by Embracing RAG

Tabnine today revealed that its namesake generative artificial intelligence (AI) platform for creating test code can now surface more accurate and personalized recommendations based on specific code and engineering patterns.

At the same time, Tabnine also announced that Tabnine Chat, a tool for interacting with the large language models (LLMs) at the core of the Tabnine platform using a natural language interface, is now generally available.

Data that resides in a codebase or an integrated development environment (IDE) can now be used to extend the LLMs on which the Tabnine platform is based using retrieval augmented generation (RAG) techniques. Previously, the Tabnine platform could only generate test code based on the data that Tabnine had exposed to its LLMs.
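In practice, RAG pairs a retrieval step over local artifacts with the generation request, so the model sees project-specific context it was never trained on. The sketch below is a conceptual illustration of that pattern, not Tabnine's implementation; the data structures, the embed_fn and llm_complete placeholders and the prompt format are all assumptions made for the example.

```python
# Conceptual RAG sketch for test generation: retrieve relevant code from the
# local codebase, then augment the generation prompt with that context.
# Hypothetical example only; not Tabnine's actual pipeline.

from dataclasses import dataclass
from typing import List


@dataclass
class CodeChunk:
    path: str
    text: str
    embedding: List[float]


def cosine_similarity(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def retrieve(query_embedding: List[float], index: List[CodeChunk], k: int = 5) -> List[CodeChunk]:
    """Return the k chunks from the local codebase most similar to the query."""
    ranked = sorted(index, key=lambda c: cosine_similarity(query_embedding, c.embedding), reverse=True)
    return ranked[:k]


def build_test_prompt(function_source: str, context_chunks: List[CodeChunk]) -> str:
    """Augment the generation prompt with retrieved, project-specific context."""
    context = "\n\n".join(f"# {c.path}\n{c.text}" for c in context_chunks)
    return (
        "Using the project conventions shown in the context, write unit tests "
        "for the following function.\n\n"
        f"### Context\n{context}\n\n### Function\n{function_source}\n"
    )


# Usage (embed_fn and llm_complete stand in for whatever embedding model and
# LLM endpoint a team actually uses):
# chunks = [CodeChunk(p, t, embed_fn(t)) for p, t in walk_codebase("./src")]
# prompt = build_test_prompt(fn_source, retrieve(embed_fn(fn_source), chunks))
# test_code = llm_complete(prompt)
```

A production system would typically swap the brute-force similarity search for a vector index and add chunking, caching and access controls, but the shape of the pipeline, retrieve, augment, then generate, stays the same.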

Tabnine president Peter Guagenti said RAG makes it possible to provide additional contextual awareness for the generated test code, along with documentation and explanations of that code. The overall goal is to leverage generative AI to make it easier to shift more responsibility for application testing further left toward developers, he noted.

That approach enables developers to create and run more routine tests earlier in the software development life cycle, which should provide application testing teams with more time to run more complex tests before applications are deployed, added Guagenti. Rather than being a job killer, generative AI will eliminate rote tasks that most members of the DevOps team don’t really enjoy doing, he noted.

Tabnine uses a mix of LLMs it developed with Google to apply generative AI to application testing. Unlike general-purpose LLMs, the ones used by Tabnine were trained using a narrower base of testing code that the company curated using tests made freely available on the internet.

It’s now only a matter of time before generative AI is applied across the entire software development life cycle (SDLC). It’s not clear just how automated testing will become in the age of AI, but the overall quality of applications should steadily improve as more tests are run. The goal is to make it possible to conduct more tests without slowing down the overall pace of application development.

There continues to be as much irrational exuberance about AI as there is fear and loathing. Still, DevOps teams that committed early on to ruthless automation will naturally be at the forefront of adoption. The immediate challenge is assessing which functions will be automated and the impact those changes will have on the way DevOps teams are currently structured. Ultimately, the goal is to eliminate as many of the bottlenecks that conspire to slow application development and deployment as possible, while simultaneously improving the quality of the application experience.

It may be a while before generative AI is pervasively applied across DevOps workflows, but one of the first places it will undoubtedly manifest is in test automation. The challenge and the opportunity now is determining how best to apply it in a way that augments DevOps teams that arguably have never had enough time to properly test code in the first place.


Mike Vizard

Mike Vizard is a seasoned IT journalist with over 25 years of experience. He also contributed to IT Business Edge, Channel Insider, Baseline and a variety of other IT titles. Previously, Vizard was the editorial director for Ziff-Davis Enterprise as well as Editor-in-Chief for CRN and InfoWorld.
