Akamai today launched an API Acceleration service specifically designed to accelerate application programming interface (API) calls using special purpose hardware, reserved capacity and prioritized routing.
At the same time, the company is making available EdgeKV, a globally distributed key-value store that enables stateful applications to be deployed on its EdgeWorker nodes.
Finally, Akamai is introducing free tiers of service that together provide access to 60 million serverless EdgeWorker events per month, capped at 30 million events per tier. The two tiers consist of a basic compute service for applications with lower CPU and memory requirements and a dynamic compute service for applications that demand additional resources.
Josh Johnson, an enterprise architect for Akamai, said the company is expanding its offerings at a time when many organizations are starting to deploy microservices-based applications across an extended enterprise using serverless EdgeWorker nodes deployed at the network edge. EdgeKV provides a low-latency data store that enables those edge computing applications to process data closer to the point where it is created and consumed.
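The pattern Johnson describes can be illustrated with a minimal sketch. The class and function names below are hypothetical stand-ins, not Akamai's actual EdgeKV API: an edge node first consults a local low-latency key-value store and falls back to the origin only on a miss, caching the result for subsequent requests.

```python
# Hypothetical in-memory stand-in for an edge key-value store such as EdgeKV.
# Names here are illustrative assumptions, not Akamai's actual API.
class EdgeKVStore:
    def __init__(self):
        self._data = {}

    def get(self, key, default=None):
        return self._data.get(key, default)

    def put(self, key, value):
        self._data[key] = value

def fetch_from_origin(key):
    # Placeholder for a round trip to a back-end service in the cloud.
    return f"origin-value-for-{key}"

def handle_request(store, key):
    # Serve from the edge store when possible; otherwise fall back to the
    # origin and cache the result for later requests hitting this node.
    value = store.get(key)
    if value is None:
        value = fetch_from_origin(key)
        store.put(key, value)
    return value

store = EdgeKVStore()
handle_request(store, "user:42")  # first request falls back to origin
handle_request(store, "user:42")  # second request is served from the edge
```

The point of the design is the second call: once the value is cached at the edge, the application never pays the round-trip latency to the origin again for that key.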
Akamai reported that, in 2020, its platforms processed more than 300 trillion API requests, a 53% year-over-year increase. Much of that increase is being driven by distributed applications deployed directly on the Akamai CDN rather than in a local data center, which lets IT organizations take advantage of caching across a global network of data centers. In effect, the CDN becomes the target platform when an application developed using DevOps best practices is deployed. The rise of edge computing has increased the number of use cases where DevOps teams rely on CDNs to deploy applications.
Naturally, competition among providers of CDNs has increased considerably as the number of applications deployed at the edge continues to increase. There may even come a day when there are more applications deployed at the network edge than in the cloud, as more organizations deploy applications that need to process and analyze data in near-real-time as part of a larger digital business transformation initiative.
Akamai, of course, as the provider of the largest global CDN, is betting it will be able to leverage the size of its network to convince developers to deploy applications on its platforms rather than on those of smaller rivals.
Regardless of the approach to edge computing, the applications being deployed will be more latency-sensitive than ever. Each IT organization will need to decide to what degree it wants to count on infrastructure to optimize the API calls those applications make before they time out or need to be redirected to another microservice. One way or another, the laws of physics still apply. No edge computing application is an island. It depends on a complex web of infrastructure ranging from routers and switches to servers running back-end services in the cloud. The challenge is finding a way to optimize all those interactions so that developers never have to be concerned about them.
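The timeout-and-redirect decision described above can be sketched in a few lines. The service names and the latency budget here are illustrative assumptions, not part of any Akamai product: a call that misses its deadline is abandoned and the request is redirected to a fallback microservice.

```python
import concurrent.futures
import time

def call_with_timeout(fn, timeout_s, fallback_fn):
    """Invoke fn; if it misses its latency budget, redirect to fallback_fn."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(fn)
        try:
            return future.result(timeout=timeout_s)
        except concurrent.futures.TimeoutError:
            # Deadline missed: stop waiting and redirect the request.
            return fallback_fn()

def slow_microservice():
    time.sleep(0.3)        # simulates a call exceeding its latency budget
    return "primary response"

def fallback_microservice():
    return "fallback response"

print(call_with_timeout(slow_microservice, 0.1, fallback_microservice))
```

In a real deployment this decision would typically live in the platform (a CDN edge runtime, service mesh, or API gateway) rather than in application code, which is exactly the trade-off each IT organization has to weigh.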