Cloud costs. This simple reality has given rise to all manner of toolsets aimed at modern IT stacks to shore up resources, reduce waste, capture identified efficiencies and monitor system health across the broadest possible view of the estate.
Across cloud computing landscapes now being reinvented for generative AI, machine learning operations (MLOps) practices aim to bring software engineering in line with operations engineering and, perhaps above all, cost engineering.
Before MLOps, There Was FinOps
But before we tread too deep into MLOps territory, perhaps we should remind ourselves that FinOps (operations aligned to consider the financial impact of all systems architecture and software engineering decisions) is really the foundational practice that should oversee all matters in this field.
CloudBolt is one organization that’s been very vocal in its desire to help businesses optimize, automate and control hybrid cloud and multi-cloud environments by providing cloud financial management, automation, orchestration and governance solutions for global enterprises. The firm publishes several industry reports on various aspects of multi-cloud complexity.
Institutionalized Platform Engineering
CloudBolt’s Kyle Campos, chief product and technology officer (CPTO), suggests that as businesses continue to sharpen their focus on efficiency, scrutinizing every expenditure and tool, most forward-thinking CIOs will ‘institutionalize’ platform engineering and establish corresponding teams by December 2024. These teams will not only manage technology assets but also prioritize them to accelerate business value. This strategic shift aligns with the broader industry trend of optimizing technology stacks for agility and efficiency, giving the business a multi-dimensional view of where its technology spend delivers value.
“FinOps silos that drive friction and [the presence of] unrealized optimization promises will reach a breakthrough as the conversation and solutions shift from ‘motivation’ to ‘facilitation’ in the coming year,” said Campos. “With platform engineering fully adopted as a technology approach by the majority of forward-thinking IT organizations by the end of next year, FinOps practices will become a native part of the journey to ‘golden paths,’ on par with security and observability, as the trifecta of defaults in the delivery process.”
The Ultimate Performance Metric
After decades of profligate spending on hardware, software and services, Campos thinks 2024 will be the year that the cost to run an application becomes the ultimate performance metric. Why does he make this assertion? Because the traditional metrics focused on commodities (CPU, memory, disk, network) will become less important, he said.
As digitally transformed organizations continue their advance to the cloud, cost of goods sold will become intrinsically linked with operational and service costs measured at a finer-grained level, looking at specific tools, services and algorithms for savings.
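To make that shift concrete, a finer-grained cost view typically means dividing the spend attributed to a service by a business-relevant unit of work, such as requests served. The Python sketch below illustrates the arithmetic with entirely hypothetical billing and usage figures; a real implementation would draw on a provider’s cost export and the organization’s own telemetry rather than these made-up numbers.

```python
from collections import defaultdict

# Hypothetical daily billing records (simplified cost-export rows).
billing_rows = [
    {"service": "checkout-api", "resource": "compute", "cost_usd": 412.50},
    {"service": "checkout-api", "resource": "database", "cost_usd": 188.20},
    {"service": "recommendation", "resource": "gpu", "cost_usd": 960.00},
    {"service": "recommendation", "resource": "storage", "cost_usd": 41.75},
]

# Hypothetical usage telemetry for the same day: requests served per service.
usage = {
    "checkout-api": 1_250_000,
    "recommendation": 310_000,
}


def unit_costs(rows, usage, per=1_000):
    """Return cost per `per` requests for each service with known usage."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["service"]] += row["cost_usd"]
    return {
        service: round(total / usage[service] * per, 4)
        for service, total in totals.items()
        if usage.get(service)
    }


if __name__ == "__main__":
    for service, cost in unit_costs(billing_rows, usage).items():
        print(f"{service}: ${cost} per 1,000 requests")
```

The point of the exercise is that a number like “$3.23 per 1,000 recommendations served” is comparable week over week and service by service in a way that raw CPU or memory utilization never is.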
“As organizations grapple with intricate financial challenges in 2024, consumers will increasingly demand flexibility, compelling cloud providers to offer more consumer-centric pricing and services,” said Campos. “The FinOps FOCUS initiative instituted this year is just the beginning. The collective voice of the user community will continue to be a game-changer in leveling the playing field between cloud providers and consumers.”
AI/ML Disrupts Linear FinOps
Campos concluded by saying that, currently, there is a direct correlation in the FinOps solution ecosystem between user complexity and friction on one side and capability depth and cost granularity on the other.
“Increases in the latter make the former that much more complex,” he argued. “But advances in AI/ML will continue to disrupt the status quo, lowering the barrier and time to value for users. What used to take hours of custom configuration, trial and error, will be a low-friction conversation. Additionally, AI/ML will facilitate unit cost solution inversion such that unit cost is provided to the user, not from the user.”