Software systems continue to produce more and more data, and making use of it has proven benefits; many analysts have, over the years, gone so far as to call data the new oil. As a result, the majority of organizations are investing effort in refining their data: a recent study found that 84% of organizations have either already deployed data-driven projects or have them on their roadmaps.
Corporations like Facebook and Google are the poster children for business models that harvest and monetize end-user data. However, this is only one aspect; valuable data is being produced by internal systems as well, and, if leveraged correctly, it can improve software observability and increase process automation for DevOps teams and developers. For example, time-stamped application logs are necessary for informing the SLOs that maintain reliability standards. There is an ongoing parallel investment in AI/ML deployed with cloud-native tools to act upon production data and drive further business growth. As such, 63% of IT decision-makers say new revenue opportunities have emerged due to data and analytics.
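To make the SLO example concrete, here is a minimal sketch of how time-stamped request logs might feed an availability SLO. The log format, the success criterion (status below 500) and the 99.9% target are hypothetical choices for illustration, not anything prescribed by the report:

```python
# Hypothetical time-stamped request log entries: (ISO timestamp, HTTP status).
log_entries = [
    ("2022-06-01T10:00:00", 200),
    ("2022-06-01T10:00:05", 200),
    ("2022-06-01T10:00:09", 500),
    ("2022-06-01T10:00:14", 200),
]

SLO_TARGET = 0.999  # e.g., 99.9% of requests succeed

def availability(entries):
    """Fraction of logged requests that succeeded (status < 500)."""
    successes = sum(1 for _, status in entries if status < 500)
    return successes / len(entries)

observed = availability(log_entries)
# Error budget consumed: fraction of the allowed failure rate used up.
budget_used = (1 - observed) / (1 - SLO_TARGET)
print(f"availability={observed:.3f}, error budget used={budget_used:.0%}")
```

In practice this calculation would run over a telemetry pipeline rather than an in-memory list, but the error-budget arithmetic is the same.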
The 2022 Data & Analytics Study, conducted by Foundry (formerly IDG Communications), explored how data-driven initiatives continue to be an important investment area for executive leadership. Below, I’ll review the key results from the survey to consider how organizations can continue to make intelligent decisions now and into the near future.
The State of Data-Driven Investments
We undoubtedly live in a data-driven world, and investment in related projects continues to rise. More than half (55%) of IT decision-makers plan to increase their data-focused investments — the report found the average spend to be $12.3 million in the coming year. This figure rises to an average of $23 million for financial services, which makes sense given the nature of modern banking.
Naturally, the use of analytics platforms is increasing in tandem with the amount of data generated. In terms of specific analytics tools, 50% of respondents said they used business intelligence platforms, while 47% used relational databases and a further 19% planned to invest in them within the next one to two years.
We’re also noticing an increasing use of cloud-based solutions: on average, 27% of an organization’s data analytics workloads now run in the cloud. The report also found an uptick in cloud-based enterprise-scale data warehouse technologies, an area that shows signs of further growth in the coming year.
So, what are the most common driving factors behind these types of projects? Well, automating internal business processes is the top goal for data-driven projects—half of IT leaders described this as a primary objective. This is closely followed by other ambitions such as improving customer insights (46%), aiding customer support (43%) and automating IT operations (43%).
In terms of type, transactional data tends to be the most useful—54% of companies are using transactional data in their data-driven projects. This includes consumer purchasing behaviors such as sales, returns and credits. The next-most-common type is machine-generated data, which includes information from logs, sensors, telemetry, networks, security systems and/or IoT devices. The third-most-common type is customer profile information.
Data also has a powerful impact in a business context as a means of refining existing digital products. Other respondents added that a data-driven approach to DevOps increases visibility and drives continuous improvement.
Data Quality: Highest Ranked Challenge
Undoubtedly, particular challenges remain. The greatest hurdle is retaining data quality—41% of organizations reported dealing with this issue. Quality may be poor because data is unstructured or incompatible with other sources. Other widespread challenges include governance issues and integrating data from multiple sources.
For those undertaking data-driven projects, 44% said they lacked appropriate skill sets, such as analytics training, data management, security, business intelligence and integration expertise. Companies also often faced funding and talent constraints when beginning data-driven projects; for example, 26% of small-to-medium-sized businesses lacked the necessary funding, according to the research.
Another challenging area (that I’ve covered previously) is optimizing how data is collected and stored. As cloud storage fees rise, organizations will likely want to refine retention and resolution to avoid creating unnecessarily large, expensive data lakes.
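As a sketch of what refining retention and resolution can mean in practice, the snippet below downsamples hypothetical per-second metric samples into one-minute averages before long-term storage; the window size, metric shape and values are illustrative assumptions:

```python
from statistics import mean

# Hypothetical per-second metric samples: (seconds since start, value).
samples = [(t, float(t % 7)) for t in range(120)]

def downsample(points, window_seconds):
    """Average samples into fixed time windows, shrinking stored
    volume by roughly the window size."""
    buckets = {}
    for t, value in points:
        buckets.setdefault(t // window_seconds, []).append(value)
    return [(w * window_seconds, mean(vals)) for w, vals in sorted(buckets.items())]

# Keep only one averaged point per minute for long-term retention.
coarse = downsample(samples, 60)
print(f"{len(samples)} raw points -> {len(coarse)} retained points")
```

Real pipelines typically keep the raw data briefly for debugging and retain only the coarse rollups long-term, which is exactly the retention-versus-resolution trade-off described above.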
Rising Deployment of AI/ML
AI/ML is already well established as a way to advance the use of data. More than half (54%) of companies used predictive analytics or planned to incorporate it into their systems in the next 12 months. Just under one-third (31%) also either used anomaly detection or planned to adopt it. These capabilities, along with others such as natural language processing, can be used to aid applications including process automation, decision support, customer analysis and virtual agents.
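To illustrate one simple flavor of the anomaly detection mentioned above, the sketch below flags a new measurement that strays too far from its recent history using a z-score test; the three-standard-deviation threshold and the latency figures are hypothetical:

```python
from statistics import mean, stdev

def is_anomaly(history, new_value, threshold=3.0):
    """Flag new_value if it lies more than `threshold` standard
    deviations from the mean of the historical window."""
    mu = mean(history)
    sigma = stdev(history)
    return abs(new_value - mu) > threshold * sigma

# Hypothetical request latencies (ms) from a steady service.
history = [100, 102, 98, 101, 99, 103, 97, 100]
print(is_anomaly(history, 480))  # a sudden spike
print(is_anomaly(history, 101))  # within normal variation
```

Production systems use far more sophisticated models, but the core idea of comparing new observations against a learned baseline is the same.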
The industry is at an exciting moment for leveraging data in DevOps. Data is becoming ever more accessible, giving DevOps teams a full-stack picture of the application life cycle. To direct future engineering investment, data-driven decisions will rely on signals such as application performance and usage habits. Thus, it’s an interesting time to consider how you might leverage data for greater process automation and software development fluidity.
The 2022 Data & Analytics Report surveyed 872 IT decision-makers (ITDMs) from around the globe working in various industries. The respondents’ average company size was about 12,000 employees. To view the report for more insights, you can pick it up here.