Heroku, a subsidiary of Salesforce, is piloting support for inference capabilities within its platform-as-a-service (PaaS) environment and is deepening the platform's integration with artificial intelligence (AI) agents developed by its parent company.
Betty Junod, chief marketing officer for Heroku, said the Managed Inference and Agents (MIA) and AppLink capabilities being added to the platform will make it simpler for application development teams to build and deploy AI applications without having to build entire environments themselves.
Additionally, Heroku is now piloting the ability to publish events via a centralized hub that multiple application services can subscribe to.
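For readers unfamiliar with the pattern, the idea is classic publish/subscribe: services register interest in a topic on a central hub, and any event published to that topic fans out to all of them. The sketch below is a generic, in-process illustration of that pattern only; the `EventHub` class and its methods are hypothetical and do not represent Heroku's actual API.

```python
# Minimal publish/subscribe sketch: one central hub, many subscribing services.
# Illustrative only -- not Heroku's API.
from collections import defaultdict
from typing import Callable

class EventHub:
    """Central hub: publishers emit events by topic; subscribers register callbacks."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        # Each service registers a handler for the topics it cares about.
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Fan the event out to every handler subscribed to this topic.
        for handler in self._subscribers[topic]:
            handler(event)

# Two independent "services" subscribe to the same topic.
received = []
hub = EventHub()
hub.subscribe("order.created", lambda e: received.append(("billing", e["id"])))
hub.subscribe("order.created", lambda e: received.append(("shipping", e["id"])))
hub.publish("order.created", {"id": 42})  # both subscribers receive the event
```

In a hosted version of this pattern, the hub would sit outside any one application, so teams can add new subscribing services without touching the publishers.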
Finally, Heroku has added support for a VS Code extension, the .NET framework and Jupyter notebooks, along with the ability to invoke multiple cloud services provided by Amazon Web Services (AWS), including Amazon Elastic Kubernetes Service (EKS), Amazon Elastic Container Registry (ECR), AWS Global Accelerator and AWS Graviton.
It’s not clear how many organizations are adopting PaaS environments to build and deploy applications, but as more platform engineering teams are formed, many of them are now concluding it is simpler to employ Heroku as a cloud service rather than building the equivalent of a PaaS environment themselves, said Junod.
Now that Heroku has made its PaaS environment generally available on an open source stack of software, dubbed Fir, the company is making a push for developing modern applications based on container technologies on its platform. Most AI applications, for example, are being deployed on Kubernetes clusters, noted Junod.
The challenge DevOps teams face is that building and maintaining their own stack of open source software to build and deploy those applications is both difficult and costly. A cloud service provides that same capability in a way that enables application developers to self-service their own needs within a set of defined guardrails, noted Junod.
Historically, the Heroku PaaS environment has been widely used to build 12-factor monolithic applications, with more than 65 million applications having been built on the platform. With the addition of Kubernetes support, building cloud-native applications on the platform is now more feasible as well. That approach provides a level of abstraction that reduces how much individual developers need to know about the underlying Kubernetes infrastructure.
That’s especially true in an era when many AI applications are being built and deployed on Kubernetes clusters, added Junod. A Futurum Research survey finds 61% of respondents report they are using Kubernetes clusters to run some (41%) or most (19%) of their production workloads. The top workloads deployed on Kubernetes are AI/ML/generative AI and data-intensive workloads such as analytics, tied at 56% each, closely followed by databases (54%), modernized legacy applications (48%) and microservices-based applications (45%).
Regardless of the type of application being built, organizations will be building, deploying and updating a mix of monolithic and microservices-based applications for many years to come. The challenge, and the opportunity, now is finding a way to unify the underlying infrastructure in a manner that ultimately reduces the total cost of application development.