Large language models can draft code or move artifacts, but without situational awareness they still trip over the basics. Cloudsmith CEO Glenn Weinstein tells Mike Vizard why a new piece of plumbing—the Model Context Protocol (MCP) server—is quickly becoming table stakes. Think of an MCP server as a receptionist for AI agents: it answers questions like “Which Docker images are in my repo?” and supplies environment-specific details the model would otherwise guess—or miss entirely.
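To make the receptionist metaphor concrete, here is a minimal sketch of such a server using the open-source MCP Python SDK. The `list_docker_images` tool and its hardcoded catalog are hypothetical stand-ins; a real server would answer by calling the artifact manager's API.

```python
# A hypothetical MCP server exposing one repository-aware tool.
# Requires the MCP Python SDK: pip install mcp
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("artifact-repo")

# Stand-in data; a real server would query the artifact manager's API.
_IMAGES = {
    "team-a/docker": ["api-server:1.4.2", "worker:0.9.0"],
    "team-b/docker": ["frontend:2.1.0"],
}

@mcp.tool()
def list_docker_images(repo: str) -> list[str]:
    """Return the Docker image tags stored in the named repository."""
    return _IMAGES.get(repo, [])

if __name__ == "__main__":
    # Serves the tool over stdio so an AI agent can discover and call it.
    mcp.run()
```

An agent connected to this server no longer has to guess what lives in the repository; it can simply ask.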
Context alone isn’t enough. Developers increasingly chain agents together to run multi-step jobs—pull a package, scan it, publish it—without a human in the loop. That hand-off requires an agent-to-agent (A2A) protocol, so one agent can call another securely without endless re-authentication. Google’s recent move to donate its A2A protocol to the Linux Foundation hints at how fast the ecosystem is converging on open standards.
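A2A rides on JSON-RPC over HTTP. The sketch below shows the rough shape of one agent handing a task to a peer with a single bearer token; the endpoint URL, token, and payload fields are illustrative placeholders rather than the normative schema, so consult the published spec for exact method names and message structure.

```python
# Illustrative A2A-style hand-off: one agent asks another to scan a package.
# Endpoint, token, and payload fields are placeholders, not the normative schema.
import json
import urllib.request

A2A_ENDPOINT = "https://scanner-agent.example.com/a2a"  # hypothetical peer agent
TOKEN = "example-token"  # obtained once, instead of re-authenticating per step

request_body = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "message/send",  # simplified; A2A defines JSON-RPC methods like this
    "params": {
        "message": {
            "role": "user",
            "parts": [{"kind": "text", "text": "Scan package acme-utils 1.4.2"}],
        }
    },
}

req = urllib.request.Request(
    A2A_ENDPOINT,
    data=json.dumps(request_body).encode(),
    headers={"Content-Type": "application/json", "Authorization": f"Bearer {TOKEN}"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))  # task status or result returned by the peer agent
```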
More context and more agents lead to more builds—sometimes hundreds per day. Weinstein warns that existing CI/CD pipelines will bottleneck if artifact storage can’t keep up. Teams already accustomed to nightly releases will feel pain when AI turns “once a day” into “once an hour” unless repositories serve packages globally and caches stay warm.
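A back-of-envelope calculation, with every figure invented for illustration, shows how quickly that pull traffic compounds once agents drive the build cadence:

```python
# Rough load estimate for artifact pulls; all numbers here are illustrative.
builds_per_day = 200   # AI agents triggering builds hourly across teams
deps_per_build = 150   # packages resolved per build
artifact_mb = 5        # average package size in MB

pulls_per_day = builds_per_day * deps_per_build
uncached_gb = pulls_per_day * artifact_mb / 1024

print(f"{pulls_per_day:,} package pulls/day")          # 30,000 pulls/day
print(f"~{uncached_gb:.0f} GB/day with a cold cache")  # ~146 GB/day
```

Numbers like these are exactly why geo-distributed repositories and warm caches stop being a nice-to-have.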
There’s also a supply-chain angle. Hallucinating agents may suggest outdated or even nonexistent packages. An artifact manager that doubles as a control plane—tracking provenance, scanning for vulnerabilities and rejecting spoofed names—becomes a final checkpoint before code reaches production.
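One way such a checkpoint could work, sketched in Python: before an agent-proposed dependency is installed, verify the name against an internal allowlist (catching spoofed look-alikes) and confirm it resolves on the canonical index (catching hallucinated names). The allowlist and the public PyPI JSON endpoint used here are illustrative choices, not any vendor's actual policy engine.

```python
# Illustrative pre-install gate for agent-suggested Python dependencies.
# The allowlist and index URL are examples, not any vendor's actual control plane.
import urllib.error
import urllib.request

APPROVED = {"requests", "numpy", "pandas"}  # hypothetical internal allowlist

def package_exists(name: str) -> bool:
    """Return True if the name resolves on the public index (PyPI JSON API)."""
    try:
        urllib.request.urlopen(f"https://pypi.org/pypi/{name}/json")
        return True
    except urllib.error.HTTPError:
        return False  # a 404 suggests a hallucinated package name

def gate(name: str) -> bool:
    if name not in APPROVED:
        print(f"REJECT {name}: not on the allowlist (possible spoofed name)")
        return False
    if not package_exists(name):
        print(f"REJECT {name}: does not exist upstream (possible hallucination)")
        return False
    print(f"ALLOW {name}")
    return True

gate("requests")  # known-good package, allowed
gate("reqeusts")  # typosquat-style name, rejected by the allowlist
```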
Weinstein’s takeaway is blunt: experiment with AI copilots today, but raise expectations for every tool in your stack. If a platform can’t expose its data through an MCP endpoint and play nicely with agents, it will feel ancient in a year. Start mapping where context lives, audit your APIs and assume that the next generation of developers will treat AI companions as a given. Your pipelines should be ready before they arrive.