Now that Tableau Software has agreed to be acquired by Salesforce and Looker is set to become a part of Google, the rate at which analytics will be embedded within almost every application is about to accelerate.
Tableau today is relied on by more than 86,000 organizations to surface business intelligence via data visualizations created by professional analysts and line of business executives. That user base accounts for why Salesforce is willing to spend $15.7 billion to acquire Tableau.
Competition within the analytics software space, however, has always been fierce. The rise of the cloud has resulted in several analytics startups challenging incumbents for a piece of one of the fastest growing sectors of the enterprise software market. One of those startups is Looker, which created a cloud-based analytics platform that Google is now set to acquire for $2.6 billion.
Google has already signaled its intent to make Looker a part of its expanding Google Cloud Platform (GCP). To that end, Google Cloud CEO Thomas Kurian last week noted that one of the primary reasons for the Looker acquisition is to add a software-as-a-service (SaaS) offering that will enable developers to embed analytics within their applications by invoking a set of application programming interfaces (APIs) exposed via a cloud service. Today that analytics service runs on GCP alongside the Google BigQuery data warehouse. Google, however, has also signaled its intent to make Looker available on multiple clouds alongside the rest of its software portfolio.
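To make the pattern concrete, here is a minimal sketch of what "embedding analytics by invoking APIs exposed via a cloud service" can look like from the application side. The endpoint path, field names and the `build_embed_request` helper are illustrative assumptions, not Looker's or any vendor's actual API:

```python
# Hypothetical sketch: an application fetches an embedded analytics result
# by POSTing a query to a cloud analytics service's REST API. The endpoint,
# payload fields and helper name are assumptions for illustration only.
import json
import urllib.request


def build_embed_request(base_url, model, metric, dimensions, api_token):
    """Assemble the HTTP request an app might send to run an embedded query."""
    payload = {
        "model": model,            # a semantic model defined by analysts
        "metric": metric,          # the measure to aggregate
        "dimensions": dimensions,  # how to slice the measure
        "format": "json",
    }
    return urllib.request.Request(
        url=f"{base_url}/api/queries/run",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_embed_request(
    "https://analytics.example.com", "orders", "total_revenue",
    ["region", "quarter"], "demo-token")
print(req.full_url)  # https://analytics.example.com/api/queries/run
```

The point of the pattern is that the application never touches the warehouse directly; it delegates the query to the service and renders whatever JSON comes back inside its own user interface.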
Tableau, which will continue to operate as an independent arm of Salesforce, is playing catch-up in the cloud. Tableau is deployed widely on both on-premises and public cloud computing platforms, including as a managed service through which the company runs a dedicated instance of its software on behalf of customers. The merger with Salesforce will create an opportunity for Tableau to add a full SaaS application instance to its portfolio that leverages the same infrastructure Salesforce employs to deliver its customer relationship management (CRM), marketing and customer service applications. Salesforce CEO Marc Benioff this week said that, going forward, Salesforce fully expects analytics generated by Tableau to drive a raft of digital business transformation initiatives.
Most of the analytics that will be embedded within applications will manifest in three forms. The first is embedded reporting that updates continuously as new data is entered into an application. The second is embedded predictive analytics modules that invoke big data repositories and machine learning algorithms to surface recommendations on how to optimize a business process or customer experience. The third, and most ambitious, is an embedded prescriptive analytics capability that automatically optimizes business processes and customer experiences in real time based on events occurring, for example, within an internet of things (IoT) application.
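The first form above can be sketched in a few lines: a metric that is recomputed on the application's write path, so the embedded view is always current rather than waiting on a batch refresh. The `EmbeddedMetric` class and its field names are illustrative assumptions, not any vendor's implementation:

```python
# Minimal sketch of continuously updated embedded analytics: each new record
# immediately updates a running aggregate that a dashboard widget reads.
# The class and field names are hypothetical, for illustration only.
class EmbeddedMetric:
    """Maintains a running aggregate so the embedded view is always current."""

    def __init__(self, field):
        self.field = field
        self.count = 0
        self.total = 0.0

    def ingest(self, record):
        # Called from the application's write path as data is entered.
        self.total += record[self.field]
        self.count += 1

    @property
    def average(self):
        # Read by the embedded widget; no batch refresh required.
        return self.total / self.count if self.count else 0.0


revenue = EmbeddedMetric("amount")
for order in ({"amount": 120.0}, {"amount": 80.0}, {"amount": 100.0}):
    revenue.ingest(order)
print(revenue.average)  # 100.0
```

The predictive and prescriptive forms layer on top of this same flow: instead of merely aggregating, the ingest path would also feed a model that scores or acts on each event.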
Every provider of analytics capabilities, from traditional Tableau rivals such as Qlik Software and Microsoft to TIBCO Software, IBM, MicroStrategy, SAP, Oracle and a host of others, is pursuing all these emerging use cases for programmatic analytics. Most recently, Information Builders revealed it has embraced Docker containers and the Kafka messaging platform as a means of achieving that analytics goal.
The arrival of programmable analytics platforms will, of course, make implementing DevOps best practices all the more challenging. It's hard to think of a business process that will not benefit to some degree from embedded analytics. The challenge DevOps teams will face is embedding, and then continuously updating, all the analytics modules within the applications they deploy. The good news is that as organizations move to embed analytics everywhere, the need to embrace DevOps to manage it all grows in tandem.