Microsoft Infuses AI into DevOps Workflows

Microsoft this week added a bevy of tools to its portfolio that infuse generative artificial intelligence (AI) into DevOps workflows.

Unveiled at the Microsoft Build 2024 conference, those additions include GitHub Copilot for Azure, which enables DevOps teams to build, troubleshoot and deploy applications on the Microsoft Azure cloud using a natural language interface.

GitHub Copilot for Azure is one of multiple extensions that Microsoft and third-party partners are providing to, for example, customize tools from Docker, Inc. and Sentry. Microsoft is also embedding Copilot into Microsoft Teams and Microsoft 365 development tools.

At the same time, Microsoft is previewing a Copilot for Azure that DevOps teams can use to orchestrate application deployments on Azure.

In addition, Microsoft Visual Studio 17.10 embeds GitHub Copilot directly into the integrated development environment (IDE) to provide diagnostic and code review capabilities. Microsoft also previewed updates to the Azure Developer Command Line Interface (CLI) and the AI Toolkit for Microsoft Visual Studio Code that enable DevOps teams to integrate Copilot sample repositories into DevOps workflows, which can be extended to address large language model operations (LLMOps).

Microsoft CEO Satya Nadella told conference attendees that Microsoft is now applying the AI capabilities it provides to write code to manage IT infrastructure and IT operations. Microsoft is redefining software development as part of an effort to one day enable anyone to go from idea to code in an instant, he added.

Paul Nashawaty, practice lead for application development and modernization at the Futurum Group, noted that Microsoft is, in effect, streamlining development processes in a way that promises to make developers 50% more efficient as they build AI applications.

As part of that effort, Microsoft is also making available reference architectures and implementation guidance for building and deploying applications infused with AI models on Azure.

Microsoft is also previewing a Microsoft Azure Compute Fleet service that simplifies the provisioning of compute resources across different types of virtual machines, including spot instances, running in multiple availability zones depending on cost, capacity and performance requirements. IT teams will be able to deploy and manage up to 10,000 virtual machines with a single application programming interface (API) call, and the resulting resources can dynamically scale as needed.

There is also now an additional Azure ND MI300X v5 virtual machine instance based on graphics processing units (GPUs) from AMD, and Microsoft is previewing a Cobalt 100 virtual machine based on its custom Arm silicon. Microsoft has also made available a migration tool for shifting instances of Linux operating systems to Azure.

Microsoft is also adding support for a sidecar capability through which DevOps teams can embed logging, monitoring and caching capabilities without having to change application code. DevOps teams can now also use Git repositories to manage the development of applications created using the Microsoft Power Platform.

In addition, Microsoft is adding support for Pulumi as another option for provisioning cloud infrastructure as code.
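To illustrate the general model (this sketch is not taken from Microsoft's announcement), a minimal Pulumi program written in TypeScript with the `@pulumi/azure-native` provider declares Azure resources as ordinary code; the resource name and region below are hypothetical examples:

```typescript
import * as resources from "@pulumi/azure-native/resources";

// Declare an Azure resource group as code. Pulumi's engine
// provisions (or updates) it when `pulumi up` is run.
// "demo-rg" and the region are illustrative choices only.
const resourceGroup = new resources.ResourceGroup("demo-rg", {
    location: "eastus",
});

// Export the provisioned name so other stacks or tooling can reference it.
export const resourceGroupName = resourceGroup.name;
```

Running `pulumi preview` shows the planned changes and `pulumi up` applies them; the appeal over hand-written templates is that desired infrastructure state lives in a general-purpose language with loops, types and package management.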

Finally, Microsoft made Azure API Center generally available to discover, manage and govern APIs. It is now also possible to import Azure OpenAI endpoints as APIs into the Azure API Management service, which additionally supports OData and gRPC APIs.

Organizations Increasingly Operationalize AI

It’s not clear to what degree DevOps teams are infusing AI into workflows. However, as the amount of code being written using generative AI tools increases, it’s only a matter of time before they will need to revisit existing pipelines. Legacy continuous integration/continuous delivery (CI/CD) platforms were not designed for the AI era.

In addition, it’s only a matter of time before machine learning operations (MLOps) workflows are merged with DevOps workflows to accelerate the building and deployment of AI applications.

The only thing that remains to be seen is whether those changes will be made this year or next as organizations increasingly operationalize AI.

Mike Vizard is a seasoned IT journalist with over 25 years of experience. He has also contributed to IT Business Edge, Channel Insider, Baseline and a variety of other IT titles. Previously, Vizard was the editorial director for Ziff-Davis Enterprise as well as editor-in-chief for CRN and InfoWorld.
