Google Cloud Next ’23 - CEO Thomas Kurian on AI pricing, sustainable models and the competitive landscape

Derek du Preez, September 1, 2023
Google Cloud CEO Thomas Kurian provides us with his insights into how the vendor is thinking about making an impact with buyers mulling over their artificial intelligence (AI) options.


Image of Google Cloud CEO, Thomas Kurian

Google Cloud kicked off its annual user event in San Francisco this week with a swathe of AI announcements, aimed in particular at bolstering Vertex (its AI platform) and Duet (its in-application ‘AI collaborator’). diginomica also got the chance to sit down with Google Cloud CEO Thomas Kurian, alongside a select number of other media, to get a deeper understanding of how the vendor sees AI playing out in the market, and for customers. 

The conversation with Kurian was brief but wide-ranging and included his thoughts on building sustainable AI models (something that’s often ignored, given the huge amount of compute that’s required to crunch the data at scale), Google’s pricing of AI and why he thinks Google Cloud is well positioned to capture market share in a highly competitive field. 

Kurian shared that Google Cloud’s AI offering has been a long time in the making, but that it is an extension of the vendor’s initial Google Cloud principle of ‘simplifying technology so that everybody has access’ and that it can be used ‘much more efficiently’. Of course this began with Google Cloud’s infrastructure offering, so that buyers didn’t have to own and operate their own data centers and could gain access to infrastructure through API or browser. 

The second phase, according to Kurian, was the introduction of managed services in order to help address the skills problem in the marketplace, whereby customers were struggling to find the necessary people to operate their cloud environments. 

The third phase is AI. At the infrastructure level Google Cloud is focused on providing cost efficient scaled resources for training and serving models. Kurian said: 

The reason is, if you integrate AI into applications and into your business processes, cost and efficiency is a big concern. So we want to provide that. 

Second to this, Google Cloud wants to provide the building blocks for companies to leverage AI according to their needs, via its Vertex platform. Vertex provides not just a collection of models, but a number of foundational services to make AI effective and trustworthy - such as information retrieval, search, conversations, state management, grounding, watermarking and synthetic data generation. The idea, similarly to how cloud platforms have developed, is that if these foundational platform elements are in place, it’s easier for buyers to run with AI in a way that makes sense for them, rather than having to build those elements themselves. 

The counterpart to Vertex is Google Cloud’s Duet tool, a collaborative, conversational AI assistant integrated into a variety of Google applications to support work - whether that be in Workspace (Slides, Docs, etc), Google Meet, or Looker. The idea, similar to Google Cloud’s competitors, is that you can ask Duet a question and it will use models to deliver more productive content and results. Kurian said: 

With Duet we basically took every role that people have and imagined: what would an AI collaborator be? For example, you’re writing and you have an amazing writer to assist you to write better. 

You're doing Slides and people said ‘Oh, I'm really terrible at doing slides, I would love to have a visual designer’, so we introduced that. 

We thought maybe we should introduce something that takes notes in meetings, assigns action items, summarizes the discussion if you’re late, and, if you don’t attend, sends you everything that was discussed along with the action items. 

So all those are things we built in. It’s the same thing with Google Cloud Platform: we can generate code for you, we can operate the environment for you. 

For an analyst, you can ask a question and say ‘show me my revenue’, and the system not only shows you the revenue, but it tells you why it's going the right way or the wrong way. And it also creates the slide deck that you can use in a meeting to say, either we should be celebrating or we have a problem. 

And so it was taking analytics out of the hands of a handful of analysts, to allow everybody to do analysis. And so every step of the way, it's been broadening. 

Pricing and use cases

Kurian said that AI, in particular generative AI, is prompting every industry and every business function to think through how the technology will change, and improve, how they work. Commenting on companies Google is working with, he said: 

We are seeing the technology being adopted in many, many different parts of companies. If you look at the work we've done with Orange in France, it's about customer service, answering customer questions, using our AI system to answer customer questions. 

If you look at Vodafone, it's in their procurement and back office around customer contracts and how contracts have been written and monitored. If you look at what we're doing with a variety of car companies, it's about the experience people have in using the vehicles. If you look at the work we've done with Carrefour it's in marketing and commerce. 

One of the things we see is that it's in many different departments of companies that solutions are being used and each customer is free to choose what they find will be the most effective thing for them.

Kurian also commented on Google Cloud’s pricing of Duet, which has publicly been announced at $30 per user per month in preview. Some attendees at this week’s conference commented that this appears steep on the surface, given the potential number of users in an enterprise - but Kurian said that the pricing needs to be weighed against the productivity gains that are possible. Google Cloud, according to Kurian, is basing this on extensive testing with more than one million people using Duet in Workspace. 

Using Gmail as an example, Kurian said: 

When a human being uses Gmail, on average, they generate 50% of the email content using the AI model. And this is on average. Most people generate a lot more of that. The average person generates 50% of the content. On average, they're sending 40% more email. If you just did the math, no matter which way you slice it, you’re generating at least 40% to 50% improvement in productivity. 

If you count the percentage of time that a human being does email, it's a significant part of the day. If you count the salaries people are typically paid…let's say you did 30% of your day doing email, and you earn $200,000, a 10% improvement in productivity is a huge pop. And we're doing a lot more than that.

And the price per year is $360, which is a fraction of the productivity improvement. 

Kurian added that Google Cloud has discussed this pricing with many of its large customers who are using Duet in preview and said that the improvement in productivity is “so significant” that the price is a fraction compared to the results being seen. This obviously needs to be independently verified with customers, but Kurian added Google Cloud will continue to listen and will base its future pricing on their feedback. 
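Kurian’s back-of-envelope comparison can be sketched out as follows. The salary, email time share and productivity uplift figures are the illustrative ones from his quote, and the annual cost assumes the announced $30 per user per month:

```python
# Rough check of Kurian's productivity claim (illustrative figures only,
# taken from the quote above - not measured data).
annual_salary = 200_000          # example salary cited by Kurian
email_share = 0.30               # ~30% of the workday spent on email
productivity_uplift = 0.10       # conservative 10% improvement on that time

# Value of the uplift: salary attributable to email, improved by 10%
annual_value = annual_salary * email_share * productivity_uplift

duet_annual_cost = 30 * 12       # $30 per user per month

print(f"Estimated annual value per user: ${annual_value:,.0f}")
print(f"Duet annual cost per user: ${duet_annual_cost}")
print(f"Value-to-cost ratio: {annual_value / duet_annual_cost:.1f}x")
```

On these assumptions the uplift is worth roughly $6,000 a year against a $360 annual cost, which is the “fraction” Kurian is alluding to - though, as noted, the underlying productivity figures are Google’s own and need independent verification.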

However, he also said that customers are basing their AI purchasing decisions on more than just price. In terms of how the competitive landscape will play out, Kurian said that buyers are choosing AI providers based on their platform capabilities. He said: 

The most important thing when we talk to customers, and we've talked to hundreds and hundreds of customers these last six months since we first launched our generative AI products in preview in March…people are not looking at generative AI and AI as a point decision on a model. 

They’re choosing a platform. And when I say they’re choosing a platform: do I have all the services I need to use AI properly in my company? So that is a significant shift, because we see that as opening doors. And we have seen a lot of clients who were with other cloud providers talking to us for AI, because it's a decision they're making independent of where their IT infrastructure is. 

The decision on the platform was: who can run compute for me cheaply? That was what the decision was seven to 10 years ago. That is not the basis of the decision today. 

It’s true that this is a shift from how buyers largely chose infrastructure products - that decision often came down to price, availability and regional factors. However, I put it to Kurian that, when looking at B2B software providers, platform factors have often been a deciding factor in purchasing decisions. Buyers are interested in cross-organizational benefits, where the platform has multiple capabilities to support work needs. Google Cloud certainly may have an advantage here. 

But for Kurian, the key is ultimately productivity and flexibility. He added: 

They want something that will work with all the different business functions in their organization. Every CEO I talk to says ‘I'm starting this domain but I want this to go across’. Particularly in Europe and North America, where the economy, for it to keep growing, they need a productivity lift. 

These are not theoretical numbers. People have actually tested and seen these results, so they see it as a genuine productivity lift. 

We have also done a lot of things to make it much easier for people to adopt it. At the infrastructure layer, we announced Cross-Cloud Interconnect, so you can connect to us, and through us, to everybody. And the value is your cybersecurity, your network protection is in one place, you don't have to distribute it. 

If you use our Anthos product, you can run it anywhere: on our cloud, on Amazon, on Azure. Similarly, if you use our AI models, you can connect it to applications that run anywhere. So that flexibility allows us also to enter a lot more customers. We're very confident that as this evolves, it is going to be a new decision making criteria for companies and we are quite well positioned for that.


diginomica has highlighted previously that whilst the market is currently in an AI frenzy, it can’t be denied that artificial intelligence has a sustainability problem. Generative AI’s requirement for compute power is high: it consumes a lot more energy and, therefore, emits a lot more carbon. As diginomica has noted, for example, training GPT-3, the so-called ‘parent’ model of the more famous ChatGPT, produced 10 times the emissions generated by an average car over its lifetime. Integrating generative AI into a search engine also requires four times more compute power per search than a standalone product.

As the climate emergency worsens, companies can’t (or shouldn’t) shirk their sustainability commitments in a bid for higher productivity. Kurian had two responses to this challenge. Firstly, he pointed to Google’s own ESG commitments. He said: 

We have committed, as Google, publicly that we will be carbon free in 2030. We've also been carbon neutral since 2007. Those goals continue to be very important for our company. 

Secondly, and perhaps more interestingly, Kurian said that the key is performance improvements and optimization of the models themselves. He explained: 

One of the reasons that we talk about performance improvement and optimization is that you need to spend a lot less compute time, and therefore a lot less energy watts, in order to train a model. When we say we can run a model training two times faster and better than other companies in the industry, it means you need half the amount of energy to train our model. 

And finally, Kurian sees AI playing a role in actually helping companies achieve their sustainability goals, by using models to solve energy problems more efficiently. He said:

We are applying our models in a variety of different places. And there are customers actually at this conference talking about how they're using the models in a sustainability context. We see the models as providing, for example, better solutions for sustainable sourcing. 

Just as one practical example, we run our own data centers using our models. The whole thermodynamics, power, air cooling, water cooling, airflow - we use our models to manage that. That is a public study that we have published our research on. And we've made those models available to partners, including sustainable energy companies. 

My take

It’s early days in the field of AI, no matter what vendors will tell you. What Google Cloud has going for it - as Kurian highlighted - is that it can integrate AI into an organization’s business processes, whilst also providing a range of services throughout the whole technology stack. Customers are looking for vendors that can provide flexibility and help solve their AI queries, across their business, with ease. The customers we’ve spoken to this week are particularly supportive of Google Cloud’s willingness to adapt to their specific use cases. The challenge Google Cloud will face is penetrating businesses that are already tied into another vendor for a range of other services (e.g. Microsoft) that is now offering its own AI tooling. Can Google Cloud tempt them away? Quite possibly, but time will tell. What I will say is that a wide range of big name customers have been talking publicly this week about how Google Cloud is proactively supporting them with their AI needs - and that is always a good sign. 