At PuppetConf 2017, Puppet made it clear its ambitions now extend well beyond its namesake IT automation framework. On the heels of acquiring Distelli in September, the company announced it is adding several products to its portfolio based on technologies developed by Distelli.
The new offerings include Puppet Pipelines for Apps, to automate application builds and deployments on each commit; Puppet Pipelines for Containers, to automate building Docker images from a source repository and then deploying them to a Kubernetes cluster; and Puppet Container Registry, to host Docker images and provide a unified view of them regardless of whether the repositories are local or remote.
At the same time, the Blueshift platform developed to automate deployments of container platforms such as Kubernetes, Docker Swarm and Mesos is also being enhanced. The Kubernetes module now installs the container cluster in a configuration that is both more secure and highly available. The Docker module adds support for Puppet Code Manager, Docker Swarm mode and Docker secrets. The Kream tool makes it easier to deploy Kubernetes together with Helm, a package manager for Kubernetes, and Helm can now also be used to manage applications deployed on Kubernetes.
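To give a sense of what managing containers through Puppet looks like, here is a minimal sketch using the puppetlabs/docker module the article refers to; the image name, container name and port mapping are illustrative assumptions, not details from the announcement:

```puppet
# Install and configure the Docker engine on this node.
class { 'docker': }

# Pull an image (hypothetical choice of nginx for illustration).
docker::image { 'nginx':
  ensure => present,
}

# Run a container from that image, publishing port 80.
docker::run { 'web':
  image => 'nginx',
  ports => ['80:80'],
}
```

Declaring containers this way lets the same Puppet agent that manages the host also converge its container workloads on every run.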
In addition, the company announced Puppet Discovery, a tool that makes it simpler to identify infrastructure distributed across a hybrid computing environment, and Puppet Tasks, which employs new open-source agentless software dubbed Puppet Bolt to automate routine IT tasks. Bolt is capable of taking existing scripts written in any language and executing them on any platform that Puppet supports.
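Because Bolt pushes ordinary scripts to remote nodes rather than requiring an agent, any existing script can serve as a task. A minimal sketch, assuming hypothetical node names (the `bolt` invocation shown in the comment is illustrative):

```shell
#!/usr/bin/env bash
# check_disk.sh -- a plain shell script Bolt could push to remote nodes
# unmodified, e.g. (node names are hypothetical):
#   bolt script run check_disk.sh --nodes web1.example.com,web2.example.com
# Report how full the root volume is on whichever node runs the script.
df -h / | tail -n 1 | awk '{print "root volume usage: " $5}'
```

The same script runs unchanged whether invoked locally, over SSH by hand, or fanned out across nodes by Bolt.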
The company is also adding a Task Manager to Puppet Enterprise that makes it easier to manage ad hoc tasks and execute them across hundreds of Puppet nodes. The latest release of Puppet Enterprise also includes tools for inspecting the packages deployed on those nodes and for simplifying their configuration.
Puppet has also expanded its alliance with Splunk to tighten integration between the two IT operations platforms, and formed new alliances with Barracuda Networks, CloudPassage, Conjur/CyberArk, Cumulus Networks, Electric Cloud, F5, Intelliment Security, Onyx Point, Praecipio Consulting and Sensu. In addition, the Puppet module for the Microsoft Azure cloud has been updated to provide access to more disk storage, expanded networking support and support for Azure tags.
Tim Zonca, vice president of marketing and business development, says that with these products Puppet should no longer be considered a one-product company; it is now committed to developing a full range of DevOps management capabilities. That's significant, notes Zonca, because today there is too much tension between IT operations teams and developers, caused by silos of automation across various DevOps processes. As part of that effort, the company is also committing to synchronizing the release cycles of the various elements of its portfolio.
As DevOps continues to mature, the line between how infrastructure, applications and security are managed continues to blur. Many IT organizations prefer to acquire a set of DevOps tools from one vendor because it simplifies purchasing and support. Over time, those vendors combine their tools into an integrated suite that aims to unify the management of infrastructure, applications and security. Most vendors today are a long way from unifying those functions, but a wave of consolidation across the DevOps sector suggests the race is on to provide a unified DevOps stack capable of being employed across multiple platforms.
The percentage of IT organizations that make extensive use of IT automation frameworks such as Puppet remains relatively small, and much of the resistance can now be attributed to simple inertia. But as the size and scope of IT environments exceed what an internal IT team can manage manually, it's only a matter of time before reliance on IT automation increases.
Puppet and other vendors also have infused their platforms with machine learning algorithms to expand the scope of what can be automated within an IT environment. In general, Zonca says, reliance on those capabilities tends to increase over time, once a DevOps team gains confidence in the algorithms' ability to make the right decisions. Until then, Zonca says, most IT professionals employ those algorithms to present a series of suggested options based on the most recent events that have occurred. Given the nearly infinite number of possibilities that can be inferred from those events, most DevOps teams are going to need all the help from machine learning algorithms they can get.