NVIDIA, in partnership with JFrog and Dynatrace, is working toward integrating DevOps tools with a platform for building artificial intelligence (AI) agents.
At the COMPUTEX conference in Taiwan this week, NVIDIA added a validated design for building AI agents to the NVIDIA Enterprise AI Factory, an application development platform based on the NVIDIA Inference Microservices (NIM) framework that leverages containers to provide access to reusable services.
Additionally, NVIDIA added NVIDIA AI Blueprints to make it simpler for developers to build avatars that serve as smart AI teammates.
At the same time, JFrog revealed that its platform for managing software artifacts will be included in the NVIDIA Enterprise AI Factory platform.
JFrog CTO Yoav Landman said the Software Supply Chain Platform developed by JFrog will now serve as a single source of record for tracking the provenance of the software artifacts and AI models used to build AI applications. The JFrog Software Supply Chain Platform provides the ability to track the pulling, uploading and hosting of AI models and datasets, AI containers, Docker containers, and dependencies across NVIDIA’s AI Factory platform. All told, the JFrog artifact repository now supports more than 40 different types of software packages, including NVIDIA NIM.
Additionally, DevSecOps teams building these applications will be able to use JFrog tools to scan for vulnerabilities and other security-related issues, he added.
JFrog also revealed that the core JFrog Platform can now run natively on NVIDIA’s Grace Blackwell graphics processing units (GPUs) to improve overall performance.
Dynatrace, meanwhile, announced it is integrating its observability platform with NVIDIA Enterprise AI Factory to enable DevOps teams to troubleshoot the NVIDIA application development platform.
It’s not clear how widely the NVIDIA Enterprise AI Factory has been adopted, but as more AI applications are built, the need to integrate these types of platforms into DevOps workflows is becoming more pressing. On one level, AI models are just another type of artifact being used to build an application. However, the way existing artifacts are managed using version control systems doesn’t apply as easily to AI models. Each new version of an AI model, for instance, isn’t backward compatible with the previous instance of the model, noted Landman.
Regardless of how AI application development becomes integrated with DevOps tooling and platforms, the pace at which these applications and their associated AI agents are built is set to increase dramatically.
The challenge, and the opportunity, lies in integrating a set of application development platforms that, in many instances, are adopted by data science teams rather than traditional application developers. In effect, data scientists are emerging as a new type of end user that DevOps teams will need to support.
Undoubtedly there will be culture clashes in the months ahead as organizations look to extend their existing DevOps workflows to govern the development of AI applications. In the meantime, however, DevOps teams would be well-advised to investigate how their organizations may be building AI applications today in a way that probably isn’t going to scale as easily as it should.