Bringing enterprise clout to generative AI - CEO Tom Siebel on what makes a difference

Stuart Lauchlan, June 6, 2023
Every business and government leader is centered on AI, argues Siebel, and they need enterprise capabilities.

Tom Siebel

The world is, in many ways, now coming to us. 

A confident assertion from Tom Siebel, a tech leader who’s never been known for being knowingly undersold. The man who brought us CRM has shifted his attention to the AI sector over the past decade and while the hype cycle generated by ChatGPT has captured mainstream media attention this year, Siebel has every right to lay claim to credentials in this space:

We have been communicating for over a decade that we believe that the market for enterprise AI solutions would be quite large. And now as we enter the summer of 2023, AI has become a dominant theme in technology discussions, government discussions, media reports, defense and intelligence imperatives, and government and business imperatives.

I do not believe that it's an overstatement to say that there is no technology leader, no business leader, and no government leader, who is not thinking about AI daily. AI chipmakers, like NVIDIA, are accelerating production to try to keep up with the very real demand that's out there. And all of this is being accelerated by the advent of generative AI. The interest in AI and in applying AI to business and government processes has never been greater. Business inquiries are increasing, the opportunity pipeline is growing, demand is increasing. 

As the founder and CEO of C3 AI, Siebel argues that the bulk of current demand he’s seeing is in the form of turnkey enterprise AI applications rather than development tools. He points to the fact that some 83% of his firm’s bookings over the past fiscal year were driven by application sales versus 17% by the platform.

The other interesting stat he cites is in regard to the increasing spread of industries that are becoming users. Oil & Gas dominates with 34% of current business, followed by Federal/Defense on 29%. Hi-tech customers make up 13% of the user base, followed by Energy/Utilities.


Inevitably, C3 AI has its own generative AI story to tell, having released the C3 Generative AI offering. It differs from other GPT/Large Language Model solutions in a number of ways, asserts Siebel. These include:

  • It allows enterprises to access all their enterprise data and open source data.
  • It provides traceable deterministic consistent answers.
  • It enforces the corporate information access controls and security protocols that are currently in place.
  • It has no risk of IP or data exfiltration caused by the Large Language Model.
  • It is hallucination free and doesn’t make up answers.

Expanding on these claimed differentiators, Siebel explains:

One of the problems with generative AI is you're limited to the number of data sources that you can use with these Large Language Models. Typically it is text, HTML, and sometimes code, and the Large Language Model will interact directly with the data. But one of the problems is you get random answers. Every time you ask the question, you get a different answer. If two people ask the same question, they get a different answer. And there's no traceability. It doesn't tell you where the answer came from. And finally, if it doesn't know the answer, it makes one up. This is what they call hallucination. It doesn't know, so it just kind of wings it, makes up an answer.

C3 AI's platform is the foundation for what the firm offers:

We’re very good at aggregating enterprise data, extraprise data, code, images, text, sensor data, what have you, into a unified federated image. When we do that, those data are read by a Deep Learning model and they happen to be stored in a vector database.

We have a kind of a firewall between that and the Large Language Model. Now our customer uses any language model they want - be it ChatGPT, be it Palm, be it Bard, be it FLAN-T5, whatever it may be, whatever comes along next - but we built a firewall between the Large Language Model and the data. So every time you ask the question, it will give you the same answer. If two people ask the same question and they have the authority, they will both get the same answer every time.
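Siebel's description maps onto a familiar retrieval pattern: index the enterprise data separately from the language model, filter retrieval through the access controls already in place, return the same sources for the same question every time, and refuse rather than guess when nothing matches. A minimal sketch of that pattern follows - every class and function name here is invented for illustration and does not reflect C3 AI's actual API:

```python
# Illustrative sketch of a retrieval pipeline with the properties Siebel
# describes: deterministic answers, enforced access controls, traceable
# sources, and refusal instead of hallucination. Not C3 AI's implementation.

from dataclasses import dataclass


@dataclass(frozen=True)
class Document:
    doc_id: str
    text: str
    allowed_roles: frozenset  # existing corporate access controls


class EnterpriseIndex:
    """Stands in for the vector database holding aggregated enterprise data."""

    def __init__(self, documents):
        self._docs = list(documents)

    def retrieve(self, query, role):
        # Deterministic keyword overlap as a stand-in for vector similarity:
        # the same query from the same role always yields the same documents.
        terms = set(query.lower().split())
        hits = [
            d for d in self._docs
            if role in d.allowed_roles and terms & set(d.text.lower().split())
        ]
        return sorted(hits, key=lambda d: d.doc_id)  # stable ordering


def answer(index, query, role):
    """Answer only from retrieved documents; cite sources; never guess."""
    hits = index.retrieve(query, role)
    if not hits:
        # No authorized, matching data: refuse rather than hallucinate.
        return "I don't know the answer.", []
    # A language model would summarize `hits` here; because it only ever
    # sees documents the caller is authorized to read, it cannot leak or
    # invent data beyond them.
    sources = [d.doc_id for d in hits]
    return hits[0].text, sources
```

In this sketch the "firewall" is simply the fact that the model never touches the index directly: it only receives the already-filtered retrieval results, and every answer carries the `doc_id` list that makes it traceable.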

On the subject of traceability, Siebel adds:

If [you] click on it, you can see exactly where the data came from. And very importantly, there's no risk of LLM-caused data exfiltration…And finally, there's no risk of LLM-caused hallucination. If it doesn't know the answer, it tells you, ‘I don't know the answer’, rather than making one up. You’d think that would be kind of table stakes, and they are table stakes for any large commercial or government installation. This is something that really distinguishes the C3 generative AI offering. And one of the reasons that we're seeing very high levels of interest.

After releasing the product in March, C3 AI closed three generative AI application deals with large enterprises - manufacturing and distribution firm Georgia Pacific, Flint Hills Resources, and the US Department of Defense’s Missile Defense Agency.

With the generative AI application available on both the AWS marketplace and the Google Cloud marketplace, Siebel says it is difficult to estimate the size of the addressable market for such solutions, but dubs it “extraordinarily large”, noting:

It kind of seems like everybody is interested in this. At the level of the CEO or the person who operates manufacturing and the person to operate sales, they want basically a Google-like interface where they go to a web browser-like interface where they can ask any question about their business…I don't know any industry that will not be taking use of this technology. It's really quite amazing.

My take

I’ve had my run-ins with Siebel over the past 30 years or so - as have others - but I wouldn’t question his enterprise expertise. As the consumer-buzz over ChatGPT shifts over to the enterprise market, the tick list of competitive differentiators that he’s claiming for his firm’s generative AI offering provides a compelling menu for adoption, if they can be delivered ‘on the ground’.

That said, the company has faced criticism for what some have seen as a slow pace of picking-up customers, although in its most recent quarter the firm said it signed 43 deals. Wall Street remains up and down on the firm - the release of earnings last week saw the share price slip on outlook, although the stock has tripled in value this year. How much of that can be attributed to the generative AI hype cycle isn't clear. As Siebel himself observes:

We’re a little bit shocked by the response that we had to C3 generative AI.

Whatever the reason, 2023 looks to be a pivotal one for an enterprise AI veteran.
