Pulumi Adds Generative AI Copilot to Manage Cloud Infrastructure

Pulumi today launched a public beta of Pulumi Copilot, a tool for its cloud infrastructure management platform that uses generative artificial intelligence (AI) to automate a range of DevOps tasks.

Based on a large language model (LLM) developed by OpenAI, Pulumi Copilot is trained on more than 10,000 prompts derived from the Pulumi data model and the REST application programming interfaces (APIs) exposed by cloud service providers.

Pulumi CEO Joe Duffy said that, as a result, the tool has a semantic understanding of more than 160 cloud computing environments, enabling it to provide relevant, contextual responses to natural language queries based on usage patterns. In effect, Pulumi Copilot functions much like an additional engineer on a DevOps team.

For example, Pulumi Copilot can find resources in a cloud environment and then use them to generate additional code or documentation, troubleshoot issues, or diagnose compliance and security errors. In the near future, these AI assistants will be able to automatically perform actions on behalf of a DevOps team rather than merely responding to prompts, Duffy noted.

Pulumi Copilot is initially available for free during the public beta, but in time the company will adopt a pricing model similar to the one used by GitHub, said Duffy. Organizations that are Pulumi Enterprise customers, for example, will have full access.

As generative AI assistants become more widely available, the amount of time currently required to onboard an engineer to a DevOps team should decline substantially. Today it can take several months for a new member of a DevOps team to become effective.

Orchestrating AI Assistants

The next major challenge will be orchestrating all the AI assistants that might become part of a DevOps workflow. As each provider of a DevOps tool or platform makes its own AI assistants available, a need to orchestrate the assignment of tasks to them will emerge. There might, for example, eventually be a need for a primary AI agent that manages the workflows assigned to AI assistants that have been trained to perform a narrow range of tasks.

Regardless of approach, much of the drudgery that often conspires to make managing DevOps workflows a tedious process may soon be eliminated. The challenge and the opportunity now will be reforming DevOps teams that will soon be made up of human engineers and their various AI assistants.

In the meantime, DevOps engineers should start creating an inventory of their least favorite tasks with an eye toward eventually assigning them to an AI assistant to perform.

It’s not likely AI assistants will replace the need for DevOps engineers any time soon. Thanks to advances in AI, it’s generally expected that more software will be created and deployed in the next few years than was deployed in the past two decades. There will always be a need for software engineers to supervise application environments, where the number of dependencies that exist between applications will require some all-too-human intuition to troubleshoot. The only difference is that instead of requiring a small army of software engineers, a smaller team will be able to manage applications at a level of scale that not long ago would have seemed unimaginable.

Mike Vizard is a seasoned IT journalist with over 25 years of experience. He also contributed to IT Business Edge, Channel Insider, Baseline and a variety of other IT titles. Previously, Vizard was the editorial director for Ziff-Davis Enterprise as well as Editor-in-Chief for CRN and InfoWorld.
