Harness today extended the artificial intelligence (AI) capabilities of its DevOps platform, making it possible to use plain language to create pipelines that adhere to corporate standards.
In addition, Harness AI will analyze pipeline logs, correlate errors, pinpoint root causes and, within a few minutes, recommend fixes that, if approved, will be applied automatically.
It can also be used by DevOps teams to turn natural language into policies that are deployed as code written in Rego, the language used to define policies for the Open Policy Agent (OPA) software, within a DevSecOps workflow.
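A policy of the kind described might resemble the following Rego sketch, which denies any pipeline stage that deploys to production without an approval step. The package name, input structure and step types here are illustrative assumptions, not Harness's actual pipeline schema.

```rego
# Illustrative OPA policy: flag production deployments that lack an approval step.
# The input shape (input.pipeline.stages, step.type) is assumed for this example.
package pipeline.governance

import rego.v1

deny contains msg if {
    some stage in input.pipeline.stages
    stage.environment == "production"
    not has_approval(stage)
    msg := sprintf("stage %q deploys to production without an approval step", [stage.name])
}

has_approval(stage) if {
    some step in stage.steps
    step.type == "Approval"
}
```

Evaluated by OPA against a pipeline definition supplied as input, a policy like this would surface a violation message for any non-compliant stage, which is the general pattern for enforcing guardrails as code in a DevSecOps workflow.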
Harness AI makes use of a mix of AI capabilities, including AI agents, to automate tasks using large language models (LLMs) and a knowledge graph the company previously developed. It dynamically selects either the Anthropic Claude 3.7 Sonnet or the OpenAI GPT-4o LLM to complete a task, based on the use case and continuous internal benchmarking embedded into the platform.
Rohan Gupta, product lead for Harness AI, said these latest enhancements to the platform leverage support for the Model Context Protocol (MCP), developed by Anthropic, to expose AI tools built by Harness that access pipelines, templates, services, environments, connectors and secrets to accurately generate pipelines. That data provides the context required to generate pipelines without, for example, having to manually edit YAML files, he added.
Harness has also signaled its intention to support the Agent-to-Agent (A2A) protocol now being advanced under the auspices of the Linux Foundation, noted Gupta.
It’s not clear at what pace DevOps teams are embracing AI, but a recent Futurum Group survey finds 41% of respondents expect generative AI tools and platforms to be used to generate, review and test code. The challenge many DevOps teams are starting to encounter is that the pace at which application developers can create code is beginning to overwhelm their ability to create the pipelines used to update software builds and deploy applications in production environments.
At this juncture, the one certain thing is that with the rise of AI, organizations are now locked in a productivity arms race, said Gupta. As the pace at which software can be developed and deployed continues to accelerate, organizations will need to update their existing DevOps workflows if they hope to keep pace with rivals, he added.
Inevitably, organizations in the months and years ahead will be building and deploying software at rates that not long ago would have been considered unattainable. AI technologies are not going to eliminate the need for DevOps engineers so much as reduce the current level of toil in a way that makes it simpler to build and deploy applications at scale. In fact, AI should ultimately make the benefits of DevOps accessible to a larger number of organizations, including those that previously could not find and retain the DevOps engineering expertise required.