If you’ve attended any technology conference or event in the last 12 months, you’ve probably lost count of references to generative AI. Every tech vendor appears to have something to say on the topic. In June 2023, we announced generative AI features in Oracle Fusion Cloud HCM, and last month we announced 50 additional use cases across ERP, SCM, and CX applications, all powered by the Oracle Cloud Infrastructure (OCI) Generative AI Service.
On the surface, many of these updates appear similar to what other vendors are talking about, but there are fundamental differences between how vendors are building, delivering, and pricing these capabilities.
To help organizations better understand the AI strategy for Oracle Fusion Cloud Applications, I wanted to provide a quick overview of how we’re embedding AI and integrating enterprise LLMs (for generative AI) into our applications.
Built on breakthrough AI infrastructure
Oracle Fusion Cloud Applications Suite’s embedded AI strategy starts at the infrastructure level with Oracle Cloud Infrastructure (OCI), which has been built with a unique architecture to run AI workloads. Its GPU cluster technology scales to more than 16,000 NVIDIA H100 GPUs per cluster and, with the very low latency and high bandwidth of its RDMA network, is designed to deliver high performance at lower cost.
Oracle is the only technology vendor that offers both a full suite of cloud applications and a next-generation cloud infrastructure specifically designed to run them. This unique combination powers classic AI and generative AI features that are embedded within Oracle Fusion Cloud Applications and creates a continuous feedback loop that drives what we believe is the fastest innovation cycle in the industry. As a result, our customers can quickly and easily adopt the latest AI capabilities to increase productivity, automate end-to-end business processes, improve decision making, and reduce the cost of doing business.
The combination of OCI, Oracle Fusion Applications, and the thousands of customers that use our applications daily enable us to continuously improve our AI capabilities and deliver best-in-class AI solutions.
Generative AI powered by enterprise-grade LLMs
Using OCI’s Generative AI Service, which leverages Cohere’s enterprise-grade large language models (LLMs), we offer generative AI capabilities that let customers take full advantage of both foundational and specialized LLMs for greater control over results. The aim is to drastically reduce the time it takes users to complete tasks, improve the customer and employee experience, enhance the accuracy of data insights, and ultimately increase business value.
Embedded into business workflows… at no extra cost
Like our approach with classic AI, we believe generative AI should be seamlessly embedded into existing workflows supported by Fusion Applications and focused on solving business problems to enhance productivity. For this reason, we are working with customers to prioritize the most impactful generative AI use cases that we can deliver and are including these in the hundreds of regular updates we provide each quarter.
We see other vendors presenting AI as a standalone capability that needs its own brand name, or introducing AI simply for its own sake. In contrast, we have sought to embed AI directly into the user experience to increase end-user productivity and improve customer and employee experience. This is aimed at practical use cases, from helping create job and inventory item descriptions, to summarizing management and financial reporting footnotes, to explaining data variances and providing recommended actions based on those variances. This is how embedded AI delivers unprecedented efficiencies and opportunities in the workplace.
Like our classic AI features, generative AI will be available for customers to use out of the box, so they don’t have to invest in data science or IT. Customers will also be able to take advantage of the new generative AI features at no extra cost, as they are included as part of their existing Oracle Fusion Applications subscription.
A human always in the loop
We believe that with both generative and traditional AI, a human should always be in the loop to review, edit, and approve content and recommendations.
Humans make mistakes, and AI is not immune from errors. Yet a combination of AI and human insight can provide better results than either can alone. For example, generative AI can help summarize employee performance for submission during the regular review cycle based on feedback gathered across the year from the employee, peers, or managers, removing recency bias while including goal progress and achievements. That is a huge benefit, but by itself, AI might miss certain nuances of the employee’s work that only a manager or colleague can identify. Pairing AI with human insight is currently the best approach.
Respecting data privacy and security
We respect customers' enterprise data, privacy, and security. With Oracle's Generative AI Service, no customer data is shared with LLM providers, other customers, or other third parties. In addition, an individual customer is the only entity allowed to use custom models trained on its data. Oracle understands that security and privacy are critical for enterprise scenarios.
Just the beginning
As generative AI continues to evolve, so will our applications. The ideas and feedback we get from our customers drive our innovation, enabling the consistent enhancement of our cloud solutions.
We’re bullish on the potential of AI for business and are excited to see new business models and processes emerge over the coming years. Thank you to all our customers who are collaborating with us as we build the future of business.
To learn more about Oracle’s AI strategy, visit: oracle.com/artificial-intelligence.