Opsani has added support for cloud computing environments to its namesake software-as-a-service (SaaS) platform, which leverages machine learning algorithms and other forms of artificial intelligence (AI) to optimize IT environments.
Company CEO Ross Schibler said Opsani AI not only proactively tunes resources such as CPU and memory but can also optimize middleware configuration variables such as Java virtual machine (JVM) type and pool sizes; kernel parameters such as page sizes and jumbo packet sizes; and application parameters such as thread pools, cache timeouts and write delays.
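To make those categories concrete, the sketch below illustrates the kind of parameter space and performance-per-dollar objective such an optimizer might search. The parameter names, values and scoring function are hypothetical examples, not Opsani's actual configuration or API.

```python
# Hypothetical illustration (not Opsani's API): the sort of parameter space
# an optimizer might search, spanning resources, middleware and app settings.
parameter_space = {
    "cpu_cores":        [1, 2, 4, 8],            # instance resources
    "memory_gb":        [2, 4, 8, 16],
    "jvm_heap_mb":      [512, 1024, 2048],       # middleware (JVM) settings
    "jvm_gc":           ["G1GC", "ParallelGC"],
    "thread_pool_size": [8, 16, 32, 64],         # application settings
    "cache_timeout_s":  [30, 60, 300],
}

def score(throughput_rps: float, hourly_cost_usd: float) -> float:
    """A simple performance-per-dollar objective an optimizer could maximize."""
    return throughput_rps / hourly_cost_usd
```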
Given the size and dynamic nature of most cloud computing environments, Schibler said it’s not possible for IT administrators to manually optimize those environments to match the specific needs of each class of application workload being deployed. The Opsani platform makes it possible to leverage AI to optimize settings for the lowest cost or best performance every second, he said.
Opsani claims existing customers are seeing more than 200% increases in performance per dollar while saving up to 80% on their cloud spend.
In addition, Opsani has made plugins available for a variety of DevOps tools, including GitHub, Terraform, Jenkins, Spinnaker, Wavefront, DataDog, SignalFX, Prometheus, Splunk and New Relic, and supports all the major cloud computing platforms.
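As a rough illustration of how a monitoring integration might feed measurements into such an optimization loop, the sketch below queries a Prometheus server for a p95 latency figure for the current configuration. The server address and metric name are assumptions for illustration, not details of Opsani's plugins.

```python
# Minimal sketch: pull one performance measurement from Prometheus so an
# optimizer can score a candidate configuration. Server URL and metric name
# are placeholders, not Opsani specifics.
import requests

PROM_URL = "http://localhost:9090/api/v1/query"
QUERY = 'histogram_quantile(0.95, rate(http_request_duration_seconds_bucket[5m]))'

def measure_p95_latency() -> float:
    resp = requests.get(PROM_URL, params={"query": QUERY}, timeout=10)
    resp.raise_for_status()
    result = resp.json()["data"]["result"]
    # Prometheus returns each sample as [timestamp, value], with value as a string.
    return float(result[0]["value"][1]) if result else float("nan")
```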
Schibler said Opsani can be applied to existing monolithic applications as well as emerging microservices-based applications built on containers running on Kubernetes clusters. The goal is to make it possible for IT teams to rein in cloud costs at a time when most cloud resources are being employed inefficiently, he said.
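For a sense of what tuning a containerized workload looks like in practice, the following sketch patches the CPU and memory requests of a Kubernetes Deployment using the official Python client. The deployment, namespace and container names are placeholders, and this is an illustration of the general technique rather than a description of Opsani's own mechanism.

```python
# Illustrative only (not Opsani's implementation): apply a tuned CPU/memory
# setting to a containerized workload by patching a Kubernetes Deployment.
# Deployment, namespace and container names below are placeholders.
from kubernetes import client, config

def apply_resources(cpu: str = "500m", memory: str = "1Gi") -> None:
    config.load_kube_config()  # use config.load_incluster_config() inside a pod
    patch = {
        "spec": {
            "template": {
                "spec": {
                    "containers": [{
                        "name": "web",
                        "resources": {
                            "requests": {"cpu": cpu, "memory": memory},
                            "limits":   {"cpu": cpu, "memory": memory},
                        },
                    }]
                }
            }
        }
    }
    client.AppsV1Api().patch_namespaced_deployment(
        name="web", namespace="default", body=patch
    )
```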
Individual developers are deploying applications on virtual machines in the cloud with little to no appreciation for total cost, noted Schibler. Before most IT organizations realize it, the cost of running applications on a public cloud has far exceeded budget allocations. IT organizations not only need tools to rein in those costs; Schibler noted Opsani also makes it possible to model them before an application is deployed or upgraded.
The adoption of public clouds and DevOps often go hand in hand. The challenge organizations increasingly encounter is that as the number of applications running in a public cloud computing environment grows, they wind up overprovisioning resources. Over time, the need to optimize the consumption of cloud resources becomes a more pressing financial concern. However, most IT organizations don’t have the resources or expertise needed to manually tune each cloud workload instance. At the same time, the typical IT organization doesn’t have the ability to build its own AI engine. Opsani is making the case for a SaaS platform that provides that capability without requiring an IT organization to build its own AI models from scratch.
Naturally, many IT professionals are skeptical of AI because there may not be enough transparency into the AI models being employed. However, because Opsani is a SaaS platform, the risk of experimenting with AI to optimize cloud workloads is relatively slight. Of course, there’s no absolute requirement to employ AI at all. However, as financial pressure on IT organizations continues to mount, there doesn’t seem to be much of an alternative either.
— Mike Vizard