Scalyr today announced that its log analytics cloud service can now ingest more than 200 TB of data per customer per day in real time.
Company CEO Christine Heckart said the goal is to make as much log data as possible available at a price point that remains below $5 per GB at scale.
Heckart said Scalyr expects to continue expanding the amount of data it makes available to customers because its namesake platform is built on a columnar data store rather than an index engine. The goal of a project dubbed Sonic Boom is to eventually make a petabyte of data available to customers of the cloud service, she said.
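Scalyr's actual engine is proprietary, but the trade-off Heckart alludes to, scanning columns sequentially rather than building and maintaining an index, can be sketched in a few lines. The class and field names below are illustrative only, not Scalyr's API:

```python
from collections import defaultdict


class ColumnarLogStore:
    """Toy columnar store: each log field lives in its own list, so a
    query over one field scans only that column and no index is kept."""

    def __init__(self):
        self.columns = defaultdict(list)
        self.count = 0

    def append(self, event: dict) -> None:
        # For simplicity, every event is assumed to carry the same fields.
        for field, value in event.items():
            self.columns[field].append(value)
        self.count += 1

    def scan(self, field, predicate):
        # Sequential scan of a single column; returns matching row numbers.
        return [i for i, v in enumerate(self.columns[field]) if predicate(v)]


store = ColumnarLogStore()
store.append({"status": 200, "latency_ms": 12})
store.append({"status": 500, "latency_ms": 340})
store.append({"status": 200, "latency_ms": 15})

# Find rows with latency over 100 ms without touching the "status" column.
slow_rows = store.scan("latency_ms", lambda ms: ms > 100)
```

Because writes are cheap appends and reads are raw scans, the storage cost per GB stays close to the cost of the underlying object store, which is the economic argument the company is making.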
One of the paradoxes of log analytics is that while IT organizations are encouraged to use this data to proactively identify issues before they become major problems, the cost of storing that data often leads IT organizations to limit the amount of log data they analyze. As a result, they often miss outlier events that could have major implications for the IT environment.
Because the service is compatible with the S3 storage interface defined by Amazon Web Services (AWS), the columnar store engine employed by Scalyr makes it easier to pass the economics of cloud storage on to the end customer, she said. IT costs in the wake of the COVID-19 pandemic are, of course, going to be a much bigger issue going forward. Almost every aspect of IT, including log analytics, is likely to be subject to a cost review in the weeks and months ahead as organizations adjust to an economy that has changed for the worse overnight.
Heckart said the issue many organizations will face when it comes to logs is that data is the lifeblood of any digital process. Organizations may not need to store that data forever, but analyzing log data to identify anomalies indicative of, for example, a cybersecurity breach is now almost a daily routine. As such, how much log data to make available, where, and at what cost becomes a subject of debate.
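The kind of daily anomaly sweep described above is often just a statistical threshold over event counts. A minimal stand-in, with made-up failed-login numbers purely for illustration, might look like this:

```python
import statistics


def flag_anomalies(daily_counts, threshold_sigmas=2.0):
    """Flag days whose event count exceeds the mean plus N standard
    deviations of the series. A production detector would use a trailing
    baseline window that excludes the day being tested, so a single large
    spike does not inflate its own cutoff."""
    mean = statistics.mean(daily_counts)
    stdev = statistics.pstdev(daily_counts)
    cutoff = mean + threshold_sigmas * stdev
    return [i for i, count in enumerate(daily_counts) if count > cutoff]


# Hypothetical failed-login counts per day; day 5 spikes far above baseline.
logins = [40, 42, 38, 41, 39, 400, 40]
suspect_days = flag_anomalies(logins)
```

The catch, and the cost debate the article describes, is that a sweep like this only finds the spike if the raw log data for that day was retained and queryable in the first place.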
Of course, it may be less costly to store log data locally. However, someone on the IT staff will need to manage those logs. Scalyr and other providers of SaaS-based platforms for log analytics contend it is more cost-effective to rely on a cloud service to manage log data, which is also more accessible via shared analytics tools.
Regardless of the amount of IT resources made available, the one thing that is certain is that neither the amount of data nor the rate at which it needs to be analyzed will slow down anytime soon. In fact, many organizations are likely to accelerate their transition to digital processes that are ultimately both more flexible and cost-efficient than existing processes. It will be up to DevOps leaders to figure out how best to keep pace with that rate of change with whatever resources are at hand.