Oracle today unveiled Oracle Cloud Infrastructure (OCI) AI services, a collection of services that makes it easier for developers to use application programming interfaces (APIs) to invoke a wide range of capabilities based on algorithms trained by Oracle on their behalf.
Elad Ziklik, vice president of product management for AI services and data science at Oracle, said the primary goal is to offload the data engineering tasks associated with machine learning operations (MLOps) from organizations that simply want their developers to include AI capabilities within their applications without necessarily having to hire and retain a dedicated data science team.
However, organizations also have the option of uploading their own data into OCI AI services to retrain the models provided by Oracle, he noted.
Ultimately, the goal is to make OCI AI services a simple extension of DevOps workflows by making it simpler to employ AI models; these models become just another software artifact that can be stored and updated in a Git repository alongside all the other artifacts managed by a DevOps team, said Ziklik. In effect, Oracle wants to make AI “boring” in the sense that these capabilities can now be readily invoked by any development team, he added.
The OCI AI services include OCI Language, a text analysis capability that enables developers to add sentiment analysis, key-phrase extraction, text classification, named entity recognition and other similar capabilities to their applications, and OCI Speech, which converts file-based audio data containing human speech into text transcriptions.
Oracle has also included OCI Vision for image recognition and document analysis tasks, OCI Anomaly Detection to create models that flag irregularities in processes, OCI Forecasting for tracking business metrics over time and OCI Data Labeling for creating datasets to train AI models.
These services are designed to address the most critical AI requirements of the average enterprise rather than attempting to outperform on a set of AI benchmarks that don’t reflect the requirements of business development teams, said Ziklik.
In addition, Ziklik noted Oracle has also made it easier to address AI compliance requirements that will eventually require organizations to document precisely how their AI models work.
At some point soon, just about every application will be invoking AI capabilities to one degree or another. That shift creates a major workflow challenge for DevOps teams. AI models are typically created by data science teams that successfully deploy a model into a production environment only a handful of times a year. Those AI models, however, need to be integrated with applications that, in many cases, are being continuously updated via DevOps workflows.
Oracle said that goal becomes easier to achieve using models it creates versus allowing data science teams to build out a set of custom MLOps workflows on platforms that then need to be maintained by IT professionals. Instead of having to set up, for example, a feature store to track and manage AI models, the models created using OCI AI can be stored in an existing Git repository.
Naturally, each organization will need to determine the degree to which they will converge MLOps and DevOps best practices. Oracle, however, is betting that the simplest way to achieve that goal is via AI models surfaced via a cloud service that developers can readily invoke whenever required.