This is the fourth part of a multi-part series covering automation, provisioning, and configuration management. In this article, best practices for Puppet will be covered.
Puppet, one of the most popular configuration management tools, can also become complicated at times. In this post I’m going to briefly go through a few practices that have emerged as recommended, high-level best practices, so that beginners can get a head start.
1). Use Modules when possible:
Puppet modules are something everyone should use. If you are managing an application, create a module for it so that you can keep its manifests, plugins (if any), source files, and templates together.
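As a sketch, a minimal hypothetical `ntp` module could look like this (the module name, file paths, and template are illustrative):

```puppet
# modules/ntp/manifests/init.pp -- the classic package/file/service pattern
class ntp {
  package { 'ntp':
    ensure => installed,
  }

  file { '/etc/ntp.conf':
    ensure  => file,
    content => template('ntp/ntp.conf.erb'),  # templates/ntp.conf.erb in the module
    require => Package['ntp'],
  }

  service { 'ntp':
    ensure    => running,
    enable    => true,
    subscribe => File['/etc/ntp.conf'],  # restart when the config changes
  }
}
```

With this layout, everything related to NTP lives in one place, and a node only needs `include ntp`.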
2). Keep your Puppet content in Version Control:
There is NO reason not to use a version control system while developing Puppet manifests and modules. You can pick your favorite system — popular choices being Git, Mercurial, or Bazaar, which are particularly useful because of the ease with which they manage multiple branches of code.
Version control is well-supported by the Puppet ecosystem. It is possible to use a mature Software Development Lifecycle to manage the development and maintenance of your Puppet manifests, tightly integrated with a branch-based workflow that truly realizes the ideals of “infrastructure-as-code.”
Using version control opens up a lot of additional possibilities with Puppet, such as better tracking of changes, testing your Puppet manifests in an isolated environment, and promoting your configuration from environment to environment. Version control even provides a free backup of your configuration code.
By using a collaboration tool like GitHub or Bitbucket, you and your team can easily review each change before it is applied. This results in much better, more sustainable Puppet code. Placing all your Puppet files under version control is one of the best Puppet practices: whenever you are ready to “deploy” changes to your Puppet master, you simply sync the working copy on the server with the code in the version control repository.
3). Make use of Environments:
Environments are isolated groups of Puppet agent nodes. A Puppet master server can serve each environment from completely different main manifests and module paths. Environments let you apply your configuration changes to less critical servers first and, once the changes have been tested and are ready, promote them to production. They also free you to use different versions of the same modules for different populations of nodes, which is useful for testing changes to your Puppet code before rolling them out to production machines.
A common setup uses two Puppet environments, staging and production: the staging environment is applied to all pre-production boxes, starting with the initial provisioning of a server, while production servers use the production environment. Each environment is tied to a specific branch in the Git repository, e.g. the “master” branch is production and the “staging” branch is staging.
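As a sketch, assuming directory environments (available since Puppet 3.6), the master and agents could be wired up like this (paths and environment names are illustrative):

```ini
; /etc/puppet/puppet.conf on the Puppet master
[main]
environmentpath = $confdir/environments

; Each environment is then a directory with its own manifests and modules:
;   environments/production/manifests/site.pp
;   environments/production/modules/
;   environments/staging/manifests/site.pp
;   environments/staging/modules/

; On a pre-production agent, pin the node to the staging environment:
[agent]
environment = staging
```

An agent can also select an environment per run with `puppet agent --test --environment staging`.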
4). Use dry-runs:
Even with the best precautions, sometimes your Puppet manifest doesn’t do exactly what you expected, and things can get messy when the Puppet agent actually applies your configuration updates to your servers. For example, if a run updates a config file and restarts a production service, it could cause unplanned downtime. Puppet can also overwrite manual configuration changes that were made directly on a server.
To reduce the risk of problems, you can use Puppet’s agent in “dry run” mode (also called noop mode, for no operation) using the following options:
puppet agent […] --verbose --noop --test
With this practice you can see the diff for every file that would be modified and validate that the changes match your expectations: in noop mode, the Puppet agent only reports what it would do, without actually doing it.
5). Managing Puppet modules with librarian-puppet:
Handling module dependencies can become a source of worry, especially when several people are working on Puppet code and each one wants to test it on their own machine. Librarian-puppet is a bundler for your Puppet infrastructure that brings sanity to the process by automatically managing your module dependencies.
Librarian-puppet manages your modules/ directory based on your “Puppetfile”. The tool will install, update or remove modules automatically when you run it, always matching what’s specified in the Puppetfile. Your Puppetfile becomes the authoritative source for what modules you require and at what version, tag or branch.
Once you are using librarian-puppet, you should not modify the contents of your modules directory directly. Instead, update the individual module’s repository, tag a new release, and bump the version in your Puppetfile. Librarian-puppet will resolve and install the modules’ dependencies and flag compatibility issues.
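A minimal Puppetfile might look like the following (the module names, versions, and repository URL are illustrative):

```ruby
# Puppetfile -- the authoritative list of modules and versions
forge "https://forgeapi.puppetlabs.com"

# Modules from the Puppet Forge, pinned to specific versions
mod 'puppetlabs/stdlib', '4.6.0'
mod 'puppetlabs/apache', '1.4.0'

# A module from a Git repository, pinned to a tag via :ref
mod 'nginx',
  :git => 'https://github.com/example/puppet-nginx.git',
  :ref => 'v0.2.0'
```

Running `librarian-puppet install` then populates the modules/ directory to match this file exactly.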
Librarian-puppet simplifies the deployment of your Puppet infrastructure by automatically pulling in modules from the Forge and from Git repositories with a single command, saving you from installing and managing your modules manually. Deployment then comes down to two simple steps:
- Sync your main sources with your code repository (ex: git pull)
- Run librarian-puppet to synchronize your installed Puppet modules
Note: don’t combine a version specifier with a Git dependency; pin Git sources with a :ref instead.
6). Keep sensitive data safe:
Today, data security is a key concern for any organization, and some data always needs to be kept safe. For example, your Puppet code may need to include passwords, private keys, SSL certificates, and so on.
Don’t put such sensitive data in version control unless you’re absolutely aware of the risks you’re taking by doing so.
If you’re already a bit familiar with Hiera, a Puppet tool, you know why it’s a good idea to separate your data from your Puppet manifests. In case you aren’t, Hiera lets you write reusable manifests and modules: once you store your organization-specific data in Hiera, Puppet classes can request the data they need from your Hiera data store.
Hiera allows you to store data about your servers and infrastructure in YAML or JSON files. In practice, you’ll find that most data in Hiera files is not confidential in nature — so should we refrain from putting Hiera files under version control just because a few elements are sensitive? Certainly not.
The trick is to use Hiera’s ability to combine multiple data sources, or backends.
What you can do is split your Hiera data files into two types: YAML files for your “main” Hiera data, and JSON files for your “secured” data. The JSON files are not put under version control and are stored securely in a single location, the Puppet master. This way, very few people can actually see the contents of the sensitive files.
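A sketch of such a split, using Hiera’s classic configuration format (the datadir paths and hierarchy levels are illustrative):

```yaml
# /etc/puppet/hiera.yaml -- combine a versioned YAML backend
# with a JSON backend that lives only on the Puppet master
---
:backends:
  - yaml
  - json
:yaml:
  :datadir: /etc/puppet/hieradata          # under version control
:json:
  :datadir: /etc/puppet/secure/hieradata   # master only, NOT in Git
:hierarchy:
  - "%{::clientcert}"
  - common
```

Hiera consults both backends at each level of the hierarchy, so lookups work the same whether a key lives in the public YAML files or the restricted JSON files.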
7). Create abstractions for your high-level classes:
Wrapping your uses of third-party modules in wrapper classes makes your Puppet code much more maintainable over time. For example, assume you want to set up a reverse proxy server using an existing Nginx module. Instead of assigning the ‘nginx’ class directly to your nodes and setting everything up there, create a new class called, say, ‘proxy_server’, taking the attributes you care about for your proxy server as class parameters. Assigning the ‘proxy_server’ class to your node not only states your intent more clearly, it also creates a nice little abstraction over what you consider a “proxy server”. Later on, if you decide to move away from Nginx (highly improbable) or to another Nginx module (more probable!), you’ll probably only need to change the contents of your “proxy_server” class instead of a bunch of tangled node definitions.
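A sketch of such a wrapper class (the class name, parameters, and the underlying Nginx module’s interface are all illustrative assumptions):

```puppet
# A hypothetical wrapper class that hides the choice of Nginx module
class proxy_server (
  $backend_url = 'http://app01.example.com:8080',
  $listen_port = 80,
) {
  # Delegate the actual work to the third-party module.
  # Swapping Nginx out later only means changing this class.
  class { 'nginx': }

  # ...pass $backend_url and $listen_port to whatever vhost/proxy
  # resources the chosen Nginx module provides...
}

# In a node definition, the intent is now explicit:
node 'proxy01.example.com' {
  class { 'proxy_server':
    backend_url => 'http://app01.example.com:8080',
  }
}
```

The node definition now says *what* the machine is (a proxy server), not *how* it is implemented.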
In the End:
Puppet isn’t the only configuration management (CM) tool available, but it is among the most mature, with a large community of active module developers. Implementing these practices will help you get the most out of Puppet.