
Predictions 2020: Five Real-Time Data Predictions

According to Gartner, 85% of big data initiatives end in failure. In 2020, organizations are running out of budget and operational runway; they need to start executing and get the big data recipe right.

It is not just about big data; it is about using data differently. Rather than backward-looking business intelligence, leading companies are driving differentiation and profit by using data to make fact-based decisions in real time. This requires applying real-time information more broadly: to improve user experience, to prioritize production and manufacturing schedules, and to spot problems before they happen.

Rather than reports for a few people, the new data-driven processes run continuously, across every customer engagement, transaction and decision point. Scale means sustaining throughput across millions of transactions and potentially millions of customers. Database systems have to be engineered differently, built to handle real-time demands and high throughput across huge numbers of endpoints.

As we enter the next decade, here are five predictions centered on real-time data.

Machine Learning and AI Models Finally Get the Data They Deserve

Most AI and machine learning initiatives are niche, nascent and not operating anywhere near scale. According to industry research from O’Reilly, hardware and data are the primary bottlenecks: getting enough data processed fast enough to harness the full promise of AI. That is changing with the arrival of new wire-speed hardware (from companies such as Intel) and data platforms designed to leverage those hardware innovations for real-time work at massive scale. Making AI and machine learning actually work comes down to how much data you can process in real time, and that is finally becoming practical.

Privacy Prompts a New Era of Data Innovation

Businesses hoard data, yet much of it goes unused because of infrastructure costs and technology bottlenecks. Now, as new privacy laws such as GDPR and CCPA show their teeth, companies need to make all that data worth the risk. That pressure will drive new real-time business processes that are not only safer but that squeeze value out of data in new and exciting ways.

Companies face a delicate balancing act: they must comply with new regulations while still using personal data in real time to serve customers what they want and expect. Companies that can satisfy the regulators and appeal to customers will win.

If an Economic Downturn Occurs, Data Will Drive the Next Round of Economic Gains

An economic downturn is inevitable. When it happens, companies will be asked to do more with less. As a result, they will need to completely rethink their solution and delivery strategies.

In that environment, with fewer markets opening or providing a sufficient return on investment, the next wave of gains will come from optimizing existing segments and operations to the fullest extent. These data-driven initiatives require that companies understand customers better and be able to target them instantly in critical moments, harnessing massive amounts of data to suggest and expand purchases.

5G Will Accelerate Customer Expectations

As bandwidth increases and high-bandwidth communication becomes available across more devices (in cars, planes, homes, ships and buildings) and in more situations, we will see data being used differently. What may have been processed intermittently in batches, with high degrees of latency, will now be processed in real time, driving higher efficiency and removing risk from supply chains, customer interactions, and equipment maintenance and performance. It will no longer be OK to learn how you did and correct course next quarter; things must be corrected in near real time, as the sketch below illustrates. New standards will be set by the highest-performing companies, and entirely new businesses will emerge that leverage this new level of availability of real-time data.
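To make the batch-versus-real-time contrast concrete, here is a minimal Python sketch. The readings, threshold and function names are hypothetical illustrations, not any particular product's API: a nightly batch job surfaces problems only after the reporting window closes, while a streaming consumer reacts to each event as it arrives.

# Hypothetical equipment-temperature readings; values and threshold are illustrative.
READINGS = [72.1, 72.4, 95.7, 72.2, 96.3, 71.9]
ALERT_THRESHOLD = 90.0

def batch_report(readings):
    # Batch style: analyze after the fact, so problems surface hours later.
    anomalies = [r for r in readings if r > ALERT_THRESHOLD]
    print(f"Nightly report: {len(anomalies)} anomalies in {len(readings)} readings")

def stream_monitor(readings):
    # Real-time style: act on each event the moment it arrives.
    for r in readings:
        if r > ALERT_THRESHOLD:
            print(f"ALERT: reading {r} exceeds {ALERT_THRESHOLD}, act now")

batch_report(READINGS)    # insight arrives after the batch window closes
stream_monitor(READINGS)  # insight arrives per event, in near real time

The difference is when the insight arrives: after the reporting window in the batch case, within the event stream itself in the real-time case.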

Decline of Globalization Will Force Companies to Know Their Customers Better

As global markets shrink, it becomes imperative for companies to up their game and drive for more share of the available market. Companies that have been treading water, growing only at the rate of global market expansion, are going to start to shrink. Differentiation through better customer experience and better management of supply chains will make the difference between continued growth and stagnation or shrinkage.

Companies that have invested in instrumenting their supply chains, and that deeply understand what comes from where and what the alternatives are, have already begun to optimize around the new web of tariffs. Those that are less agile are seeing reduced profits or reduced sales, depending on the elasticity of their pricing options.

The more real-time data a company has, the clearer the picture of what options are available as the global situation changes. How fast it can process that data can mean the difference between missing financial targets and outperforming peers.

Want to learn more about what to expect in 2020? Join us Jan. 23 for our Predict 2020 Virtual Summit, featuring discussions from some of the industry’s best and brightest offering up their visions for the future. Sign up today for this free daylong virtual event.

Lenley Hensarling

Lenley Hensarling is the chief strategy officer of Aerospike. Lenley has more than 30 years of experience in engineering management, product management and operational management at both startups and large successful software companies. Lenley previously held executive positions at Novell, Enterworks, JD Edwards, EnterpriseDB and Oracle.
