Dreamforce 2023 - why getting data out of silos and democratizing integration are essential enablers for AI
Summary: As Dreamforce opens, expect to hear more about Data Cloud and MuleSoft integration tools, since both are crucial enablers for effective use of AI.
While AI is the overarching theme at Dreamforce this week, there are two interlinked topics that are crucial to its effectiveness — data and integration. Enterprises increasingly recognize the importance of bringing their various data sources together for analysis and rapid action. The race to leverage AI makes that effort all the more important. Integration, of course, is fundamental to this effort, and also helps enable automation of end-to-end processes as organizations look to move faster and deliver better outcomes to customers.
No wonder, then, that MuleSoft, the integration business acquired by Salesforce five years ago, has become a huge contributor to its parent's continuing growth. In the company's recent Q2 earnings call, Brian Millham, President and Chief Operating Officer at Salesforce, called out its impact:
In Q2, nearly half of our greater than $1 million deals included MuleSoft. And as customers bring together data from all sources to fuel efficiency, growth and insights, MuleSoft has become mission-critical for them and was included in half of our top 10 deals.
I recently spoke to Param Kahlon, EVP and General Manager for Automation and Integration at Salesforce, to find out more about the investments customers are making in data and integration. It's all about breaking out of former silos of operation to join up data and processes, as he explains:
We are in the integration business of making sure that we can break the silos across systems, to make sure that from the purpose of executing a business process — take order-to-cash, take procure-to-pay, take service request-to-resolution — a lot of times the data needs to travel through multiple systems across multiple people. We've created our technology to make sure that we can connect the dots across these things.
The interest in AI has increased the urgency to act, he goes on:
As we look at now, creating these data repositories that can be used by AI algorithms to reason over that, bringing that data in real time across those different silos is a huge tailwind for our business, because our customers are now finding new uses for what we've created in solving the integration challenges for our customers.
Getting data out of silos
Consolidating data has been a particular challenge for enterprises, given the history of monolithic application stacks that each manage their own data stores. This has been an issue even within Salesforce's own product set, particularly around marketing, where, like many other vendors, Salesforce has been developing a Customer Data Platform (CDP) called Data Cloud. At last year's Dreamforce, it gained a real-time data layer called Genie, and this week we'll likely hear more about Data Cloud's evolution. Kahlon explains its importance:
The real strength in my opinion that Data Cloud has is, it understands and stores the metadata of Salesforce ... What Data Cloud provides is, because it is natively embedded into the metadata which is what constructs the Salesforce application, that can be natively available within Salesforce. That is the real power that Data Cloud provides to Salesforce customers. It's the ability to aggregate that data, reason over that data, but also have the capabilities to have that natively available within Salesforce, so we can do things, for example, take actions, when the data changes in Data Cloud.
So, for example, when a customer gets in contact with a bank, the system can bring together information about their ATM use, their website interactions and a recent support case they've raised, together with other aspects of their profile stored in the system. The agent can then have all the relevant information in front of them to inform the conversation they have with the customer and help them reach the best outcome.
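To make that pattern concrete, here is a minimal Python sketch of the idea. The source systems, function names and fields are invented stand-ins rather than anything in Data Cloud's actual API; the point is simply that each fetch crosses a silo boundary and the results are joined into one profile keyed on the customer ID:

```python
"""Minimal sketch of a unified customer profile assembled from silos.
All systems, functions and fields here are hypothetical illustrations."""
from dataclasses import dataclass, field
from typing import Dict, List


# Stand-ins for three separate silos: core banking, web analytics, CRM.
def fetch_atm_activity(customer_id: str) -> List[Dict]:
    return [{"ts": "2023-09-11T09:30:00Z", "type": "withdrawal", "amount": 200}]


def fetch_web_sessions(customer_id: str) -> List[Dict]:
    return [{"ts": "2023-09-11T20:15:00Z", "pages": ["loans", "mortgage-rates"]}]


def fetch_open_cases(customer_id: str) -> List[Dict]:
    return [{"case_id": "00123", "subject": "Disputed ATM fee", "status": "open"}]


@dataclass
class CustomerProfile:
    """One consolidated view an agent could see during a call."""
    customer_id: str
    atm_activity: List[Dict] = field(default_factory=list)
    web_sessions: List[Dict] = field(default_factory=list)
    open_cases: List[Dict] = field(default_factory=list)


def build_unified_profile(customer_id: str) -> CustomerProfile:
    # Each call crosses a silo boundary; the profile joins them on customer_id.
    return CustomerProfile(
        customer_id=customer_id,
        atm_activity=fetch_atm_activity(customer_id),
        web_sessions=fetch_web_sessions(customer_id),
        open_cases=fetch_open_cases(customer_id),
    )


if __name__ == "__main__":
    print(build_unified_profile("cust-42"))
```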
The advantage that Salesforce has, according to Kahlon, is that this metadata layer is already tuned to the applications customers use, and therefore embedding it in Data Cloud enables faster, more relevant responses. He explains:
Salesforce has been successful, I think, in the CRM business primarily around this metadata-based architecture that Salesforce created 20 years ago, which has been very differentiated, and creating very configurable applications that business users are able to change.
Now we're applying the same construct, the same approach on the data and the Data Cloud piece so that you can not only create meaningful AI algorithms on that, but you can natively consume that directly within Salesforce to either visualize or take actions based on it.
As Salesforce brings generative AI tooling to bear on that data, having the metadata layer in place makes it possible to ground the underlying Large Language Model (LLM) more accurately in the customer's own data. He explains:
Businesses really need the generated content that's coming back contextualised, grounded on their data. Grounded on who they are trying to communicate with ... We're working on technology that is essentially going to create contextual prompts that are going to be passed to the LLM models to be able to bring back results.
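As a rough illustration of what such a contextual prompt might look like, here is a hedged Python sketch. Nothing here is Salesforce's implementation; the record fields are invented, and the resulting string would simply be passed to whichever LLM is in use:

```python
"""Rough sketch of prompt grounding: inject the customer's own data into
the prompt so the model answers from facts, not guesses. The record
fields below are hypothetical placeholders."""


def build_grounded_prompt(task: str, customer_record: dict) -> str:
    # Turn the customer's record into explicit context for the model.
    context_lines = [f"- {key}: {value}" for key, value in customer_record.items()]
    return (
        "You are drafting a reply on behalf of a service agent.\n"
        "Use ONLY the facts below; if something is not listed, do not invent it.\n"
        "Customer context:\n" + "\n".join(context_lines) + "\n\n"
        f"Task: {task}"
    )


record = {
    "name": "A. Customer",
    "recent_case": "Disputed ATM fee (open)",
    "last_interaction": "Browsed mortgage-rates page yesterday",
}
prompt = build_grounded_prompt("Draft a short, empathetic status update email.", record)
print(prompt)  # this string would then be passed to the LLM of choice
```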
Having reliable, trustworthy responses generated by these models is going to be crucial in a business context. He goes on:
The most important thing for business customers is not the fact that you're getting relevant information back. It's that, is there hallucination in that large language model? Is there toxicity in that large language model that returned that thing? Are we actually sending an email to [the customer] that is not in accordance with the values that we stand for as a company? And how do we detect that and how do we prevent and stop ourselves from doing something that we wouldn't stand for as a company?
That piece of trust layer is the most important thing that, when we speak to our customers, they really care about. So the capabilities that we're building in our large language model, our Einstein gateway, is the ability to detect the hallucination, detect the toxicity, insert a human in the loop when you see something like that so you're able to give a very trusted response back to the person you're emailing them back for.
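The shape of such a trust layer can be sketched in a few lines. The checks below are deliberately naive keyword stubs standing in for the far more sophisticated detection models a real gateway would use; the point is the flow, where any failed check routes the draft to a human instead of the customer:

```python
"""Illustrative trust-layer flow: screen a generated draft before it
reaches the customer, falling back to a human when a check fails.
The detectors are toy stubs, not Salesforce's actual models."""
from dataclasses import dataclass


@dataclass
class Verdict:
    approved: bool
    reason: str = ""


BLOCKLIST = {"guarantee", "idiot"}  # toy stand-in for a toxicity model


def looks_toxic(text: str) -> bool:
    return any(word in text.lower() for word in BLOCKLIST)


def looks_ungrounded(text: str, known_facts: list[str]) -> bool:
    # Toy hallucination check: flag drafts citing a case we know nothing about.
    return "case" in text.lower() and not any(fact in text for fact in known_facts)


def screen_draft(draft: str, known_facts: list[str]) -> Verdict:
    # Checks run in order; the first failure escalates to a human reviewer.
    if looks_toxic(draft):
        return Verdict(False, "toxicity: route to human review")
    if looks_ungrounded(draft, known_facts):
        return Verdict(False, "possible hallucination: route to human review")
    return Verdict(True)


print(screen_draft("Your case 00123 is being reviewed.", known_facts=["00123"]))
print(screen_draft("We guarantee a refund on case 99999.", known_facts=["00123"]))
```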
Data Cloud can also connect to other data sources, such as Snowflake, and interpret that data in real time without having to ingest it and create a duplicate store. He comments:
I think duplicating data is definitely not the approach that any customer wants to take. Customers have been burned by having replicas of data sitting around and then trying to figure out which one's the latest, which is the actual source of truth, because data ages rapidly and you really need to find out where the data actually sits.
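As a simple illustration of querying data in place rather than replicating it, the sketch below uses Snowflake's Python connector to read a value live from the system of record. The account, credentials, schema and table are all invented, and this is not how Data Cloud itself federates queries:

```python
"""Sketch of the 'query in place' pattern: read from the system of record
at request time, so there is no second copy to drift out of date.
Connection details and the table are invented for illustration."""
import snowflake.connector  # pip install snowflake-connector-python


def latest_balance(customer_id: str) -> float:
    # Query the warehouse live; nothing is replicated locally.
    conn = snowflake.connector.connect(
        account="example_account", user="example_user", password="...",
        warehouse="ANALYTICS_WH", database="BANK", schema="CORE",
    )
    try:
        cur = conn.cursor()
        cur.execute(
            "SELECT balance FROM accounts WHERE customer_id = %s", (customer_id,)
        )
        row = cur.fetchone()
        return row[0] if row else 0.0
    finally:
        conn.close()
```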
Making APIs discoverable
On the integration side, the rise of APIs has made it easier to connect across applications and data sources. But discipline and organization are crucial to ensure that APIs don't proliferate in ways that cause confusion and redundancy. He explains:
What developers do is they create APIs. And a lot of times those APIs are only things that *they* can understand. We see this problem around API sprawl. You've got too many APIs that are trying to go after the same sets of data, the same sets of problems, and then there is no consistent way in which those APIs can be reused by other folks, other developers, that are trying to create experiences for those customers ...
One pattern that we're seeing is basically standardizing the use of APIs, publishing those APIs in a way that developers within the company, but also beyond the company that want to access your data, can come to one centralized portal and be able to use that in a central authenticated way. We're seeing a lot of standardization around that.
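A central catalog of this kind can be approximated in a few lines. The schema below is invented purely for illustration; a real portal such as MuleSoft's would also validate specifications and enforce authentication policies:

```python
"""Minimal sketch of a central API catalog: every API is registered with
standardized metadata so other teams can find and reuse it instead of
building a duplicate. The schema is invented for illustration."""
from dataclasses import dataclass


@dataclass(frozen=True)
class ApiEntry:
    name: str
    owner_team: str
    base_url: str
    data_domain: str   # what the API is about, e.g. "orders", "employees"
    spec_url: str      # link to the published API specification


CATALOG: list[ApiEntry] = []


def register(entry: ApiEntry) -> None:
    # A real portal would also validate the spec and apply access policies.
    CATALOG.append(entry)


def find(domain: str) -> list[ApiEntry]:
    """What a developer does instead of writing yet another duplicate API."""
    return [e for e in CATALOG if e.data_domain == domain]


register(ApiEntry("orders-api", "erp-team", "https://api.example.com/orders",
                  "orders", "https://api.example.com/orders/openapi.json"))
print(find("orders"))
```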
The other trend that we are seeing is that business wants to go after using the data without having to rely on IT. So we're seeing this approach around a low-code platform for getting business to access that data without having to think about APIs and think about those things. So we're seeing this approach around connectors. So what is a connector? A connector is an abstraction on top of, let's say, an API or a set of APIs, that creates business information ... This low-code, iPaaS solution is getting extremely relevant for our customers as well.
So instead of having to call eight to ten APIs to, for example, get an employee record from Workday, react to a new order in SAP or respond to an updated opportunity in Salesforce, a connector takes care of all of that, typically handling authentication and perhaps offering triggers for follow-up actions.
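To show what that abstraction buys, here is an illustrative Python sketch of a connector facade. The endpoints, credentials and field names are invented, and this is not MuleSoft's actual connector SDK; the idea is that the token exchange and raw HTTP details disappear behind one business-level call:

```python
"""Illustrative connector facade: one business-level call that hides the
underlying API sequence (auth, endpoints, headers). All endpoints and
fields are invented for illustration."""
import requests


class WorkdayConnector:
    """Business-friendly wrapper over what would otherwise be raw API calls."""

    def __init__(self, base_url: str, client_id: str, client_secret: str):
        self.base_url = base_url
        # A step the business user never sees: exchange credentials for a token.
        resp = requests.post(
            f"{base_url}/oauth/token",
            data={"grant_type": "client_credentials",
                  "client_id": client_id, "client_secret": client_secret},
        )
        resp.raise_for_status()
        self.token = resp.json()["access_token"]

    def get_employee(self, employee_id: str) -> dict:
        # One call for the user; auth headers and endpoint paths stay hidden.
        resp = requests.get(
            f"{self.base_url}/v1/workers/{employee_id}",
            headers={"Authorization": f"Bearer {self.token}"},
        )
        resp.raise_for_status()
        return resp.json()
```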
The demand for more automation can only be met with the use of low-code tools, he argues. Instead of integration experts getting bogged down by growing demands on their time, organizations need to create Centers of Excellence that focus on putting connectors and automations in place for business users to harness. He adds:
Companies have ... a lot of legacy applications and legacy products that they can't replace fast enough, but they don't want to get slowed down in their transformation journey. I think that is where we're seeing a big shift in the market, going away from, 'We are going to code everything' to 'We need low-code tools so that we can get more and more business users involved' and a Center for Excellence to be able to create simple automations and capabilities that can drive that transformation.
In the future, AI may also play a part in easing the workload on integration specialists. MuleSoft recently introduced an internal API marketplace called API Experience Hub, where APIs are published and made discoverable for users. This is a first step towards making it easier to find and reuse APIs, but in the future, AI might supplement this. He goes on:
Some of the fascinating work that's happening around AI is, can AI monitor the execution logs and be able to find which APIs are being used? Today we see that, not as much in developers trying to find that, but companies trying to manage the security layer of APIs. 'I want to find out what APIs are being executed by looking at the execution logs, the monitoring logs on the network and be able to start to manage and govern those APIs' and providing universal API management to be able to access the policies to do that. That's fascinating.
I think at some point in time, AI would be available to developers in a way that, 'Hey, I want to access order data in SAP, what can I use?' and it can go look at all the APIs that are available, published in different assets within the company, but also look at execution logs and find out, well, this is the way people are accessing order data in SAP.
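A toy version of that log-mining idea might look like the following. The log format and endpoints are invented; real systems would work from gateway or network monitoring logs at much larger scale:

```python
"""Toy sketch of discovering APIs from execution logs: collapse concrete
request paths into endpoint patterns, count usage, then answer questions
like 'how do people access order data?'. The log format is invented."""
from collections import Counter
import re

LOG_LINES = [
    "2023-09-12T10:01:00Z GET /sap/orders/1001 200",
    "2023-09-12T10:02:10Z GET /sap/orders/1002 200",
    "2023-09-12T10:03:30Z GET /legacy/order-lookup?id=1003 200",
]


def endpoint_pattern(path: str) -> str:
    # Collapse concrete IDs into a pattern, e.g. /sap/orders/{id}.
    path = path.split("?")[0]
    return re.sub(r"/\d+", "/{id}", path)


usage = Counter(endpoint_pattern(line.split()[2]) for line in LOG_LINES)

# "I want to access order data, what can I use?" -> most-used matching endpoint.
order_apis = {ep: n for ep, n in usage.items() if "order" in ep}
print(max(order_apis, key=order_apis.get))  # -> /sap/orders/{id}
```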
My take
I've been hearing about the use of AI to help discover APIs for many years now — are we now close to seeing it become a reality? We may not find out at this Dreamforce, but in the meantime I'm curious to see how much progress has been made on Data Cloud since last year. Consolidating data access and analysis is becoming priority number one in the new age of AI. Salesforce also faces a challenge from an up-and-coming generation of ultra-composable vendors, so I'm equally curious to see what's next in its integration strategy.
For all the highlights from Dreamforce 2023, check out our dedicated events hub here.