Datadog published a report that shows nearly half of organizations using the company’s IT monitoring platform have embraced the AWS Lambda serverless computing framework.
Stephen Pinkerton, a product manager for Datadog, said that number shows serverless computing frameworks are being employed by mainstream IT organizations far more widely than might initially be expected, given how young these frameworks still are.
The report finds the median Lambda function invoked by Datadog customers runs for about 800 milliseconds. Nearly one-fifth of functions execute within 100 milliseconds, while about one-third execute within 400 milliseconds. One-quarter of Lambda functions have an average execution time of more than three seconds, while 12% require 10 seconds or more. The duration of Lambda functions is notable because serverless latency impacts not just application performance but also costs. Lambda pricing is based on “GB-seconds” of compute time, which is the memory allocated to your function multiplied by the duration of its invocations.
Not surprisingly, the report notes 47% of functions are configured to run with the minimum memory setting of 128MB. By contrast, only 14% of functions have a memory allocation greater than 512MB, even though AWS will allow up to 3,008MB per function.
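The pricing model described above reduces to simple arithmetic, which also explains the pull toward the 128MB minimum. A minimal sketch follows; the per-GB-second rate is an illustrative assumption, not an official AWS price, and the invocation count is hypothetical.

```python
# Sketch of Lambda's GB-second billing model: memory allocated (GB)
# times invocation duration (seconds) times a per-GB-second rate.
PRICE_PER_GB_SECOND = 0.0000166667  # assumed rate in USD, for illustration only

def invocation_cost(memory_mb: float, duration_ms: float, invocations: int) -> float:
    """Return the compute cost for a batch of invocations, in USD."""
    gb_seconds = (memory_mb / 1024) * (duration_ms / 1000) * invocations
    return gb_seconds * PRICE_PER_GB_SECOND

# The report's median function: ~800 ms at the 128 MB minimum memory setting.
median_cost = invocation_cost(128, 800, 1_000_000)

# The same workload at a 512 MB allocation costs exactly four times as much,
# which is why the 47% of functions left at the minimum are saving money.
larger_cost = invocation_cost(512, 800, 1_000_000)
```

Because billed duration scales linearly with allocated memory, the heavy skew toward 128MB functions the report observes is a direct cost optimization.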
As part of an effort to further limit costs, most organizations avoid having one function synchronously call another and wait for a response, which would incur billable invocation time on both ends. Instead, serverless functions make asynchronous calls via a message queue. And because functions are stateless, they most often read from or write to a separate persistent data store.
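The synchronous-versus-asynchronous distinction above maps to a single parameter in Lambda's Invoke API. The sketch below builds the parameter dictionary a caller would pass to boto3's Lambda client; the function name and payload are hypothetical, and the dictionary is constructed locally so the example stays self-contained.

```python
import json

def invoke_params(function_name: str, payload: dict, synchronous: bool) -> dict:
    """Build the arguments for a Lambda Invoke call.

    A real caller would pass the result to boto3.client("lambda").invoke(**params).
    """
    return {
        "FunctionName": function_name,
        # "RequestResponse" blocks the caller (accruing billable time on both
        # functions) until the callee returns; "Event" queues the request
        # asynchronously and returns immediately.
        "InvocationType": "RequestResponse" if synchronous else "Event",
        "Payload": json.dumps(payload).encode("utf-8"),
    }

# The cost-conscious pattern the report describes: fire-and-forget.
async_call = invoke_params("process-order", {"order_id": 42}, synchronous=False)
```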
Amazon DynamoDB, a document database based on a key-value store architecture, is the most widely used persistent form of storage accessed, followed by an instance of the SQL databases provided by AWS as a service, and then the Amazon S3 cloud storage service.
The Amazon Simple Queue Service (SQS) is the top choice for a message queue for Lambda requests, followed by Amazon Kinesis and Amazon Simple Notification Service (SNS).
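In the SQS-driven pattern that tops the list, Lambda delivers batches of queue messages to the function as an event with a `Records` array, each record carrying the message in its `body` field. A minimal handler sketch, with a fabricated sample event for local illustration:

```python
import json

def handler(event, context):
    """Process a batch of SQS messages delivered by Lambda.

    Each record's "body" holds the raw message string; here we assume
    the producer sent JSON.
    """
    results = []
    for record in event.get("Records", []):
        message = json.loads(record["body"])
        results.append(message)
    # Returning a summary; real handlers would write results to a
    # persistent store such as DynamoDB, per the pattern above.
    return {"processed": len(results)}

# Fabricated event mimicking the SQS -> Lambda delivery shape.
sample_event = {"Records": [{"body": json.dumps({"order_id": 1})}]}
result = handler(sample_event, None)
```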
The report also notes each Lambda function has a configurable timeout setting, ranging from 1 second to 15 minutes. Two-thirds of configured timeouts are 60 seconds or less. By default, Lambda customers are also limited to 1,000 concurrent executions of all functions in any given region. Only 4.2% of all functions have a configured concurrency limit. A total of 88.6% of companies running Lambda make use of concurrency limits for at least one function in their environment.
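The interplay between the regional concurrency limit and function duration also caps throughput: with N concurrent executions and an average duration of D seconds, a region can sustain at most N / D invocations per second (a direct application of Little's law). A back-of-envelope sketch using the report's figures:

```python
def max_invocations_per_second(concurrency_limit: int, avg_duration_ms: float) -> float:
    """Upper bound on sustained invocation rate given a concurrency cap."""
    return concurrency_limit / (avg_duration_ms / 1000)

# Default regional limit of 1,000 concurrent executions, combined with
# the report's ~800 ms median duration, bounds sustained throughput.
ceiling = max_invocations_per_second(1000, 800)
```

This is why per-function concurrency limits matter even though few functions set them: a single chatty function can consume the shared regional pool and throttle everything else.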
Finally, the report notes there is a high correlation between organizations that have adopted containers and those employing serverless computing frameworks. Nearly 80% of organizations in AWS that are running containers have adopted Lambda. However, Pinkerton says that correlation at this point has more to do with the willingness of organizations to employ leading-edge technologies than it does any effort to weave together containers and serverless computing frameworks.
Pinkerton also surmises the primary reason organizations are employing Lambda is to accelerate application performance. However, it’s also worth noting that serverless computing frameworks tend to reduce the size of an application by offloading infrequently invoked code to external functions outside the core application.
Datadog plans to evaluate usage of serverless computing frameworks on other cloud platforms once they achieve enough critical mass. In the meantime, it’s clear serverless computing frameworks are rapidly becoming an extension of any DevOps pipeline.