SAP TechEd 2023 news analysis - what did we learn about SAP's AI strategy, pricing, and developer engagement?

Jon Reed, November 3, 2023
Summary:
If you wanted reams of AI-related news for SAP TechEd 2023, you got your wish. But did we advance our understanding of SAP's approach to AI innovation - from pricing to improving LLM output with SAP data? The answer is yes; here's what I've learned so far.

(Juergen Mueller at SAP TechEd Bangalore 2023)

Say what you want about SAP - the company is good at keeping the drama high, sometimes of its own making.

SAP is also expert at issuing massive volumes of news around events - enough to justify a "news guide," complete with table of contents. Timed with SAP TechEd Bangalore and SAP Virtual TechEd (ongoing as of this writing), SAP has done it again - with the SAP TechEd News Guide.

Want to argue about a show's top news story? Count me in. I often pick an underrated story that is not prominently featured - I'll share a couple of those at the end of this article. But if I had to pick the trendiest news story from this batch, it would be the SAP Build Code announcement. SAP seems to be on a similar wavelength - beyond the news guide, a standalone press release is out: SAP Turns Every Developer Into a Generative AI Developer at SAP TechEd in 2023.

SAP TechEd's trendiest news story: SAP Build Code

SAP Build Code is basically the "pro-code" companion to SAP Build's low-code environment, but with a big generative AI twist. As SAP explains:

Developers creating new applications or extensions for SAP solutions can now leverage SAP’s generative AI assistant Joule to generate code, create data models and test data for applications. Developing unit test scripts and testing applications for various scenarios is also easier with SAP Build Code’s new generative AI capabilities.

In terms of SAP Build Code and other TechEd news, I'll have a chance to dig deeper with Executive Board Member and CTO Juergen Mueller when we sit down at ASUG Tech Connect in New Orleans next week. For now, I'll shift into something I wasn't expecting: during a three-hour virtual media/analyst SAP TechEd event on Monday, SAP clarified a number of open questions around SAP's AI pricing and strategy. The first hour of the event also marked my first session with SAP's new Global Head of AI (and SVP) Walter Sun.

SAP's AI strategy sharpens - on BTP, Joule, and the foundational model

A quick quote from my August piece on SAP's overall AI strategy sets the stage. As Thomas Saueressig, Member of the Executive Board of SAP SE and leader of SAP Product Engineering, noted, SAP believes it can improve the enterprise relevance of Large Language Models by using LLMs in conjunction with SAP's own "foundational model," trained on anonymized opt-in data from "thousands" of SAP customers and infused with real-time customer data via a vector database:

We have an AI foundational layer in BTP; we automatically use the prompts that get sent to our AI [systems]. Alongside, we leverage our foundational layer to make it more reliable and relevant for the customer. But we will also pass the prompt via an extension of HANA, which we extend to a vector database. From the same prompt, you get the real-time data of the customer itself, from this specific customer, to bring it into that context as well. [Author's note: some have assumed that 'foundational model' means SAP is building its own LLM, but in fact a foundational model is not limited to LLMs and often means something different. SAP has no plans to build an LLM; as Thomas Saueressig explained, SAP believes its 'Business AI' advantage is found by applying AI in a business applications context, and supporting its customers to bring SAP data to bear on AI - not by building and maintaining LLMs].
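For readers who like to see the moving parts, here's how I'd sketch that flow in rough, runnable pseudocode - every function name below is invented for illustration, and none of this is SAP's actual API:

```python
# Conceptual sketch only - every name below is invented for illustration, not an SAP API.

def foundational_model_signals(prompt: str) -> str:
    # Stand-in for whatever business-level signals a foundational model might add.
    return "payment-behaviour prediction: low risk"

def vector_db_lookup(prompt: str, customer_id: str) -> str:
    # Stand-in for a similarity search against the customer's own real-time records.
    return f"open invoices for {customer_id}: 2"

def call_llm(enriched_prompt: str) -> str:
    # Stand-in for a call to an external Large Language Model.
    return f"[LLM response based on {len(enriched_prompt)} chars of context]"

def handle_prompt(prompt: str, customer_id: str) -> str:
    grounding = foundational_model_signals(prompt)             # foundation layer on BTP
    customer_context = vector_db_lookup(prompt, customer_id)   # real-time customer data
    enriched = (f"{prompt}\n\nBusiness signals: {grounding}\n"
                f"Customer context: {customer_context}")
    return call_llm(enriched)

print(handle_prompt("Should we extend credit to this customer?", "ACME"))
```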

But how does SAP Joule, SAP's AI "co-pilot," fit into this picture? Does a co-pilot impact SAP's AI pricing plans? Can you really charge a "premium" for any digital assistant embedded in existing products, and still delight customers dealing with tight operating margins? During the first hour of our media/analyst session, Philipp Herzig, SVP, Head of Cross Product Engineering & Experience at SAP, explained how these AI "layers" fit together:

The bottom [layer] is really the AI foundational Business Technology Platform, bringing the best of Large Language Models out there, together with SAP applications and the real-time data. What we also do on this foundation layer is we are building our own foundational model, because we have consent from almost 25,000 customers who allow us to use their business data for product development purposes.

Then there is the second layer, where business applications are "infused" with generative AI:

We want to build a dedicated foundational model that can make predictions and simulations about the business world, in finance, in HR. We can combine this then with the power of language models. In the second layer, we use this AI foundation on the Business Technology Platform to infuse generative AI. Whether this is SuccessFactors writing job descriptions, writing performance reviews, writing comprehensive OKRs and goals. Whether this is for document processing in the supply chain, or whether this is writing for an accounts receivable clerk - dunning letters and interactions with the customer. These are the scenarios in the middle layer.

Joule, then, is the top layer:

And then on top we have introduced Joule. Joule brings all the capabilities with AI together in a totally new experience, because now you can actually work in any system with SAP and ask questions - where's the latest invoice; summarize the last meeting, or whether this is about summarizing a document - you can do all of this through Joule.

Amidst the TechEd news barrage, SAP also announced the Generative AI hub, part of SAP's overall AI data platform. That news is beyond the scope of this piece, though I'll get to the vector database news shortly.

SAP shares more AI pricing details

On the pricing side, I was skeptical about how SAP could limit Joule with a premium pricing plan. But as Herzig explained, SAP has a different plan: a pool of free credits for Joule, then a transition to premium AI credits ("AI units"), which companies can purchase.

All the applications with Joule embedded have a certain, very generous free tier layer, so people can get started right away. But of course, it comes with a certain limit, so to speak. This limit will also be based on messages that you're exchanging with Joule. Basically, the tiering depends on company size, and so on. So there will be a certain formula for how that limit is computed.

And then once you hit that limit, you would go into overage. Similar to all the other pricing that we apply, it will also count against this AI unit concept we talked about earlier... This is why the AI unit comes in handy, because now it actually helps us to work with one metric. Whether it's Joule or whether it's the embedded applications, it's all the same, and depending on the usage, the customer will then see it respectively in their dashboards.
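To make the free-tier-plus-overage concept concrete, here is a purely hypothetical sketch of how such metering might be computed. The allowances, company-size bands and conversion rate below are invented for illustration; SAP has not published an actual formula:

```python
# Hypothetical illustration of a free-tier + AI-unit overage model.
# All thresholds and rates are invented; SAP has not published its formula.

# Invented free message allowances per company-size band
FREE_MESSAGES_BY_BAND = {"small": 1_000, "medium": 5_000, "large": 20_000}

# Invented conversion: how many Joule messages one AI unit covers in overage
MESSAGES_PER_AI_UNIT = 100


def ai_units_consumed(messages_used: int, company_band: str) -> float:
    """Return the AI units charged after the free tier is exhausted."""
    free_allowance = FREE_MESSAGES_BY_BAND[company_band]
    overage = max(0, messages_used - free_allowance)
    # Overage counts against the single AI-unit metric - the same metric
    # Herzig describes for embedded AI scenarios.
    return overage / MESSAGES_PER_AI_UNIT


# Example: a mid-sized company exchanging 7,200 Joule messages in a period
print(ai_units_consumed(7_200, "medium"))  # -> 22.0 AI units of overage
```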

SAP's vector database + HANA Cloud news - and why this impacts enterprise AI

This wave of SAP TechEd announcements also surfaced insight on SAP's vector database, which incorporates customer-specific data into the customer's Large Language Model output (e.g. via SAP's support of Retrieval-Augmented Generation; RAG is an important technique for improving the accuracy and relevance of LLM responses by grounding prompts in retrieved data). As SAP said in its main SAP TechEd press release:

Vector data stores manage unstructured data – such as text, images or audio – to provide long-term memory and better context to AI models. This makes it easy to find and retrieve similar objects quickly. For example, users can search for suppliers based on the language in their contracts to examine payment history and trace individual orders. These powerful new vector database features enhance interactions between large language models and an organization’s mission-critical data.
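For those who want a feel for what that kind of similarity search looks like in practice, here is a minimal, self-contained sketch. It swaps in a toy bag-of-words embedding for a real embedding model, and it is not SAP's HANA Cloud API - it only shows the vector-search pattern the press release describes:

```python
import numpy as np

# Toy stand-in for a real embedding model (one an LLM provider or an AI
# platform would supply) - NOT a production technique.
VOCAB = ["payment", "terms", "net", "30", "60", "penalty", "delivery", "late"]

def toy_embed(text: str) -> np.ndarray:
    words = text.lower().split()
    return np.array([words.count(w) for w in VOCAB], dtype=float)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

# "Vector store": contract snippets embedded up front
contracts = {
    "Supplier A": "payment terms net 30 with late penalty",
    "Supplier B": "delivery within 14 days payment terms net 60",
}
store = {name: toy_embed(text) for name, text in contracts.items()}

# Query by contract language, then rank suppliers by similarity
query_vec = toy_embed("which suppliers have net 30 payment terms")
ranked = sorted(store, key=lambda n: cosine(query_vec, store[n]), reverse=True)
print(ranked)  # Supplier A should rank first
```

In the setup SAP describes, the embedding and similarity scoring would happen inside HANA Cloud rather than in application code - which is the point Herzig makes next.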

Herzig added:

We announced that HANA Cloud will come with an all-new vector database capability - so that unstructured text documents like invoices, credit memos, and other documents can stay within HANA Cloud, because we can directly take a PDF document, for example, and compute it in a vectorized format that we can use actually to compute our generative AI use cases.

During the second hour of the event, Irfan Khan, President & Chief Product Officer, SAP HANA Database & Analytics, provided more context. He spoke to the problem of a Large Language Model's training data being a static point in time - or perhaps even obsolete - and therefore in need of the real-time view from a customer's own data sets. That customer data is anything but static, which is where the vector database comes in:

The idea is that the vector capability will allow us to provide that higher level of efficiency, where just as you run through the query processing, and the dynamics with natural language query or natural language processing, we'll be able to take advantage of the vector store to be able to index directly against the same data...

This is really giving us a solid capability now, natively inside of HANA, under the operation of - let's call it the RAG framework - where you could use HANA now, as a first class citizen within your gen AI use cases, and be able to further supplement all of the existing data in the LLM that you built, or the public models that you want to access directly - by utilizing more SAP specifics - SAP context, SAP business data.
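Khan's point boils down to this: the LLM's training data is frozen, so the retrieval step is what injects current business data into each request. A minimal sketch of that RAG pattern - assuming a generic vector store and a stubbed LLM client, not SAP's actual interfaces - looks roughly like this:

```python
from typing import Callable

def answer_with_rag(
    question: str,
    retrieve: Callable[[str, int], list[str]],   # vector-store similarity search
    llm_complete: Callable[[str], str],          # any LLM completion endpoint
    top_k: int = 3,
) -> str:
    """Augment an LLM prompt with freshly retrieved business documents.

    The vector store is assumed to be kept up to date as business data
    changes (e.g. new invoices embedded as they arrive), so the context
    reflects the current state rather than the LLM's static training data.
    """
    context_chunks = retrieve(question, top_k)
    context_block = "\n---\n".join(context_chunks)
    prompt = (
        "Answer using only the business context below.\n\n"
        f"Context:\n{context_block}\n\n"
        f"Question: {question}"
    )
    return llm_complete(prompt)


# Stubbed usage - in the setup Khan describes, `retrieve` would hit the
# HANA Cloud vector store and `llm_complete` an external language model.
docs = ["Invoice 4711: EUR 12,300, due 2023-11-30, customer ACME"]
print(answer_with_rag(
    "What is the latest invoice?",
    retrieve=lambda q, k: docs[:k],
    llm_complete=lambda p: f"[LLM would answer based on a prompt of {len(p)} chars]",
))
```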

My take - a step forward, but more questions remain

I wasn't expecting SAP's TechEd news barrage to bring more clarity to SAP's overall AI strategy and pricing. It may have indicated a shift in tone from the "premium AI pricing" overreach I have been critical of. Of course, vendors will need to charge for generative AI in some way; Wall Street expects no less, and running these systems will not be cheap. But I felt SAP was getting ahead of itself, talking up pricing and profit before nailing down the AI proof points and ROI - and getting the tech into customers' hands.

That isn't solved yet; more details are needed. But at these recent sessions, the tone was different. There was also less emphasis on the notion that a customer would need RISE to access SAP AI innovations. There are many reasons why a rigid "must have RISE for AI" policy strikes me as a bad idea; I've detailed them before and won't revisit that here.

I still have a hard time imagining SAP telling a customer ready to purchase its AI solutions, "you must be a RISE customer first." If SAP is becoming less rigid on this point, it will be welcomed. Herzig did provide some reasons why customers need to be in either the public or private cloud to access some of this AI functionality. But technical prerequisites for accessing new functionality are not uncommon in our industry. If SAP clearly explains those limitations and why they exist, that is a very different matter than requiring RISE for AI when it's not technically necessary (I will dig further into these public/private cloud reasons next week).

During the fall event season, I've been surprised by how many ERP/HR/CRM vendors seem to think they have a captive audience for AI services. Hardly. There are a host of enterprise-savvy AI providers eager to pounce on enterprise AI business. It's now up to SAP to prove its developer tools are better for building AI apps than other options. While the announcements SAP made this week are the right ones, the adoption question can only get answered over time.

I liked that SAP emphasized both developers and partners building AI solutions, using the platform BTP provides. Ultimately, for SAP's "Business AI" strategy to succeed, it needs a thriving apps community beyond SAP's internal apps development. I wanted to press further into how SAP could impose quality control, certification and "responsible AI" upon those external apps, just as it does on itself - that's a question I hope to ask Mueller about next week.

Speaking of which, I've heard plenty from SAP about the impact of its AI ethics policy on development. But how does this work in practice? Is there actually an example of SAP blocking an app, or pausing development because of ethics concerns? I put the question to Sun and Herzig, who told us the answer was yes. Herzig used the example of an interview helper for HR managers.

The original idea? Combine info from the job description's candidate expectations and also from the CV. SAP put the brakes on that, because even when you remove personally identifiable information from a CV, there are ways of connecting the dots. Herzig:

We turned down that idea of the CV because we said, even if we remove all the PII-related data from that CV, there is still, in the resume, 'Where did you study? What was your prior work experience?' There's still information that might be contained in the CV that might be indirectly relatable to your person. So we didn't figure out yet a way we can actually do it - this is why we then dropped this specific requirement. [Author's note: Herzig said SAP is still looking into this scenario, but with the job description/candidate expectations, not the specific CV profile. You can see more on SAP's Ethics Board and approach via its AI Ethics home page.]

Watching SAP's AI messaging evolve this fall, I can't help but wonder if SAP's interactive virtual events for media and/or analysts had an impact (I haven't had a chance to check out the main virtual TechEd experience yet). While most vendors scaled back their virtual influencer events this year, SAP scaled them up - typically using very interactive formats. This brought SAP leaders a heaping helping of pointed feedback. Candid dialogue with user groups like ASUG and DSAG brought these same issues into focus.

While I wasn't a fan of doing just one on-the-ground TechEd this year, the virtual sessions SAP conducted are getting much closer to creative event design than I usually see. In particular, vendors have been slow to allow a thriving chat while speakers are talking, perhaps believing this detracts from the main presenter. But letting a chat stream run in parallel is advantageous. SAP usually has subject matter experts answering chat questions while the presenter is talking; I'd estimate SAP is answering 3 - 5x as many questions as vendors typically address on these calls.

Of course, there are more burning questions: when will these AI products ship, on a release-by-release basis? How will AI pricing work for different applications, company sizes, and so on? This is not unique to SAP; most enterprise vendors issued a slew of AI announcements this fall, leaving customers and media to scramble for the specifics on beta and general availability. In most cases, we are still chasing down the details (for SAP Joule, we know a few things: SuccessFactors will be the first to GA on November 17. S/4HANA public cloud is currently slated for Joule in the February 2024 S/4HANA release. A roadmap for Joule and the rest of SAP's LOB cloud solutions is forthcoming).

On the plus side, SAP's virtual event format allowed me to get several of my top questions answered, including my follow-up to Mueller and Saueressig on SAP's foundation model GA progress. Saueressig told us a status report was presented to SAP's Supervisory Board last week, and the foundational model is already being used for internal benchmarking: "I hope that early next year we can have an early/alpha release for the foundational model itself, and then let's see - I don't want to provide our GA date for it now. But I think we're on a great progression." Saueressig went on to say that SAP has come a long way towards getting "really structured tabular data" into the foundation model - a big step when you consider how important this type of data is to most enterprise gen AI scenarios.

Next week, I hope to ask Mueller about any progress SAP has made on S/4HANA migration tools. These tools exist, but last time I heard from ASUG members on this, they wanted even more robust options. Customers obviously want to take as much friction out of S/4HANA moves as they can. Amidst the arguably over-emphasized AI storylines, the skills, business case and technical needs of S/4HANA migrations remain a core customer concern - and a crucial issue for SAP's long term outlook.

I'll have a chance to ask customers where they stand on this next week. On the ground at ASUG Tech Connect, I have some surprise meetings in store. I am also doing a customer session on decoding SAP speak, alongside analyst Joshua Greenbaum, who crunched some of ASUG's data on SAP modernization for a recent piece on ASUG's channel here. Let's pick this story up in New Orleans.

Updated 1:00pm ET, November 4, with a few small tweaks for reading clarity, but no opinions were changed.
