When it comes to deploying apps in the cloud, the conventional wisdom is that DevOps is not required if you’re a small or midsized company. The cloud is designed to let virtually anyone set up the infrastructure they need, quickly and easily, without deep experience in the intricacies of managing data center operations or hardware. But by the time the company realizes it needs DevOps, it’s often too late: The cloud infrastructure has become complex, costs have exceeded expectations and precious time is being wasted.
To ensure you’re able to optimize cloud, or hybrid cloud, deployments—and avoid many of the problems that other companies have experienced—there are a number of best practices you must employ within your company.
Best Practice No. 1: Realize You Need DevOps from the Start
When you manage your own hardware, you know exactly what the deployment costs are and what resources are available, and have a clear understanding of how they are interrelated. The abstraction of the cloud makes the initial setup a snap. But typically there is no formal structure involved: Individual developers can initiate new instances whenever they need them, without considering the bigger picture of the resources already in use, how they’re used or what is being planned by the organization.
Spinning up unnecessary cloud resources not only costs a company more in the long run, it also leads to a cloud sprawl that requires time-consuming, manual processes to maintain. Think about the challenges your company would face if it had thousands of machines created by different processes but you didn’t know who created them, what they were being used for or even if they were still needed.
By employing DevOps from the start, you will be able to balance the needs of developers with the availability of resources in the cloud to ensure expenses do not run out of control and you understand exactly how the cloud infrastructure is being used now and into the future.
Best Practice No. 2: Have a Process and Purpose
As you create your cloud infrastructure, you need to have a clearly defined purpose, as well as a process in place for deployment and ongoing management. You will need to keep track of what resources are being used and their designated purposes. Some cloud vendors, such as AWS, offer tagging capabilities that help you keep track of what services each machine was created for and whether they are being used for production, backup, load balancing or testing.
Regardless of whether you use tagging capabilities or maintain a separate database of cloud resources, knowing what you have purchased and how it’s being used will provide needed visibility, ease of management and greater control of costs.
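Whichever approach you choose, the payoff comes from being able to answer questions like "which machines are production?" on demand. The sketch below (all names and tags are hypothetical, not tied to any particular provider's API) shows a minimal tag-based resource registry of the kind you might keep in a separate database:

```python
# Hypothetical sketch of a tag-based cloud resource registry.
# Each record notes which service a machine belongs to and its purpose
# (production, backup, load balancing or testing), as described above.

resources = [
    {"id": "i-0a1", "service": "web", "purpose": "production"},
    {"id": "i-0b2", "service": "web", "purpose": "testing"},
    {"id": "i-0c3", "service": "reports", "purpose": "backup"},
]

def by_purpose(resources, purpose):
    """Return the IDs of all resources tagged with the given purpose."""
    return [r["id"] for r in resources if r["purpose"] == purpose]

print(by_purpose(resources, "production"))  # → ['i-0a1']
```

With provider-native tagging (such as AWS tags), the same lookup becomes an API query instead of a local filter, but the bookkeeping discipline is identical.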
Best Practice No. 3: Leverage Automation
Automation is important—both in the cloud and on premises—to streamline routine practices, make sure they’re applied consistently and eliminate potential errors that are common with manual processes.
When deploying on multiple clouds or in a hybrid environment, it is important to understand that tags used for one cloud provider won’t work on another. Automated deployment is typically broken into three steps: inventory creation, system preparation and automatic deployment. Many cloud systems allow you to create a dynamic inventory directly from the servers, eliminating the need to create or maintain it manually. However, deployment over multiple clouds requires an abstraction of this inventory; it is common to create a static inventory in a unified format from the various dynamic inventories.
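The abstraction step can be sketched as a pair of normalizers that translate each provider's dynamic inventory into one unified static format. The field names and inventory shapes below are hypothetical, chosen only to illustrate that each provider exposes the same facts under different keys:

```python
# Hypothetical sketch: per-provider dynamic inventories use different
# shapes and key names, so each gets a normalizer that emits records in
# one unified static format.

aws_inventory = {"i-123": {"Name": "web-1", "Env": "prod"}}          # AWS-style tags
gcp_inventory = {"vm-9": {"labels": {"name": "web-2", "env": "prod"}}}  # GCP-style labels

def normalize_aws(inv):
    """Flatten AWS-style tag records into the unified format."""
    return [{"host": hid, "name": t["Name"], "env": t["Env"], "provider": "aws"}
            for hid, t in inv.items()]

def normalize_gcp(inv):
    """Flatten GCP-style label records into the unified format."""
    return [{"host": hid, "name": v["labels"]["name"], "env": v["labels"]["env"],
             "provider": "gcp"} for hid, v in inv.items()]

# The static, unified inventory: one list, one schema, any provider.
unified = normalize_aws(aws_inventory) + normalize_gcp(gcp_inventory)
```

In practice a tool such as Ansible would consume this unified inventory; the point is that the deployment logic only ever sees one schema.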
Best Practice No. 4: Design the Network with Security in Mind
Poorly designed networks are almost impossible to secure. Problems often arise when companies allow their networks to grow organically rather than planning them out in advance. A number of companies have employed the best practice of segregating the services network and the management network. The services network includes all the components required to provide services to your customers and users, while the management network comprises what you need internally to manage the network. Separating these networks provides greater assurance that unauthorized users cannot gain access to your systems via the Internet.
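One practical benefit of the separation is that it can be audited mechanically: any rule that exposes the management network to the open Internet is, by definition, a misconfiguration. A minimal sketch, with hypothetical rule records:

```python
import ipaddress

# Hypothetical sketch: given firewall-style rules tagged by network,
# flag any management-network rule whose source is the whole Internet.
# Services may legitimately face the Internet; management must not.

rules = [
    {"network": "services",   "source": "0.0.0.0/0",  "port": 443},  # public web: OK
    {"network": "management", "source": "10.0.0.0/8", "port": 22},   # internal SSH: OK
    {"network": "management", "source": "0.0.0.0/0",  "port": 22},   # misconfigured
]

def exposed_management_rules(rules):
    """Return management-network rules open to 0.0.0.0/0."""
    everyone = ipaddress.ip_network("0.0.0.0/0")
    return [r for r in rules
            if r["network"] == "management"
            and ipaddress.ip_network(r["source"]) == everyone]
```

Running a check like this in CI keeps the segregation from eroding as the network grows.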
Another part of security is key management. Depending on the cloud provider, keys can be widely available, or instances may ship with a root password that is open to the world. In fact, many hackers routinely scan GitHub for keys developers accidentally check in. With so many keys to manage, it can be a struggle to ensure they’re not exposed. DevOps plays an important role in keeping the keys locked away and providing them only to a select group of individuals who can use them solely in production.
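The same scanning that attackers perform can be done defensively before code is pushed. This sketch checks a single, widely known pattern (the AWS access key ID format, `AKIA` followed by 16 uppercase alphanumerics); real scanners such as git-secrets apply much broader rule sets:

```python
import re

# Hypothetical sketch: scan text about to be committed for strings
# matching the well-known AWS access key ID pattern. A real pre-commit
# scanner would check many providers' key formats, not just this one.

AWS_KEY_ID = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def find_suspect_keys(text):
    """Return all substrings that look like AWS access key IDs."""
    return AWS_KEY_ID.findall(text)
```

Wiring a check like this into a pre-commit hook stops the most common accidental exposure before it ever reaches GitHub.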
Best Practice No. 5: Be Strategic About Monitoring
When you’re deploying hardware on-premises, it’s a common practice to define in advance what to monitor. But monitoring in the cloud is quite different, with basic defaults that produce vast quantities of data—much of which may not be relevant to your organization.
Taking a cue from how you handle monitoring on-prem, you’ll want to design cloud monitoring to fit your company’s unique needs and filter out the information that isn’t pertinent to your network or operations.
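At its simplest, that filtering is an allowlist: the team agrees on which metrics matter, and everything else is dropped before it is stored or alerted on. The metric names below are hypothetical placeholders:

```python
# Hypothetical sketch: keep only the monitoring metrics your team has
# agreed are pertinent, rather than storing every default the cloud emits.

RELEVANT = {"cpu_utilization", "disk_free", "http_5xx_rate"}

def filter_metrics(samples):
    """Drop samples whose metric is outside the agreed-upon relevant set."""
    return [s for s in samples if s["metric"] in RELEVANT]

samples = [
    {"metric": "cpu_utilization", "value": 71.0},
    {"metric": "network_packets_in", "value": 1200},  # default noise, dropped
]
```

Most cloud monitoring services let you express the same idea natively through metric filters or subscription settings; the win is deciding the relevant set deliberately rather than accepting the defaults.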
Best Practices Go a Long Way
Keeping in mind these best practices and employing a DevOps model from the start will enable your company to get the most benefit from its cloud deployment regardless of its size, and avoid the costly and time-consuming challenges that arise without proper advance planning and a strong focus on network design and processes.
About the Author/Roger Fulton
Roger Fulton is Iron.io’s director of Infrastructure and Operations. His background includes broad experience in software architecture and operational planning. He manages international teams to deliver highly available products for end users and high-performing in-house infrastructure. Prior to joining Iron.io, Roger served as head of application and infrastructure engineering at Kaybus. He has also held technical leadership roles at JRV Consulting Group, Cybrata, Invisible IT and Clarus Systems. Roger holds a master’s degree from the Queen’s University of Belfast, having graduated with highest honors.