Just over a year ago I said that IaaS buyers, observers and commentators should not be fooled into thinking that Google Cloud would compete with market-leaders AWS and Azure on infrastructure alone (despite having invested heavily in this area too). Upon the announcement of Google Cloud's Anthos in 2019, essentially a tool to help you manage workloads across multiple cloud and on-premise environments, I wrote:
Because Google Cloud - or, I should say, Google more generally here - is one of the largest data companies in the world. Data is about value, not about cheap tin. It recognises that if it has the infrastructure in place, upon which it can build neural networks, then it can create a gravitational pull for companies in industries that it can offer expertise in the form of AI/ML tools.
Google knows and understands how to use data at enterprise scale, which is why it makes sense to go to market with that expertise.
This is important to know within the context of Google Cloud's BigQuery Omni announcement today, as part of this year's (virtual) Next Conference. BigQuery Omni is a multi-cloud analytics solution, which allows customers to carry out analytics (using BigQuery) across their Google Cloud, AWS and Azure environments - without having to copy or move data.
The announcement has been supported by research from Gartner, which has found that 80% of survey respondents using public cloud environments were using more than one Cloud Service Provider.
BigQuery Omni builds on the Anthos announcement by attempting to solve a number of key problems for enterprise businesses - reducing complexity, allowing for customer choice and delivering value add through data. I would argue that Google Cloud hopes that by going to market with products that address these problems, it can create stickiness with customers where AWS and Azure can't.
Couple this with the knowledge that Google's business is essentially deriving value from data and Google Cloud becomes an interesting proposition for buyers higher up the technology stack, beyond infrastructure. It's also worth remembering that Google Cloud has made some interesting hires in recent years - including industry veterans Thomas Kurian and Rob Enslin - who have decades of experience in understanding enterprise buying habits and solving for those problems.
Breaking down silos
During a pre-brief on the announcement, Debanjan Saha, general manager of data analytics at Google Cloud, explained how Google Cloud believes that BigQuery Omni plays directly into Google's bread and butter - data. He said:
In Google, data is in our DNA. There are nine Google applications that have more than 1 billion users. In order to manage these applications we have built a data analytics platform, which processes and analyses massive amounts of information.
Citing the Gartner multi-cloud research, Saha said that Google Cloud believes that customers "deserve choice" and that multi-cloud is the "future", but that it also creates certain challenges for customers. It is hoped that buyers will see BigQuery Omni as one tool to help reduce this complexity. Saha said:
The problem is that when you are in multiple clouds, the data is siloed. And if you want to run analytics on that siloed data, you have to move data from one cloud to another, which is both cumbersome and expensive. Not only that but different clouds have different types of tools and analytic systems, so it makes it very difficult for people to run their analytics when their data sets are siloed across multiple clouds.
To solve that problem we are announcing BigQuery Omni, which is a multi-cloud analytics platform. It lets our customers analyse their data, no matter where their data is. It could be Google Cloud, it could be in AWS (which is now available as a private alpha), and very soon it is going to be available on Azure. Our customers always wanted to run data analytics wherever their data sits and with BigQuery Omni they have it today.
In terms of the benefits, Saha said that this approach breaks down data silos for customers, as they (according to Google Cloud) no longer have to copy data from one cloud to another and can effectively run multi-cloud analytics from one place. He added that it offers a "consistent data experience", which uses the same standard SQL that customers are used to for writing queries, developing dashboards and running analytics. Saha said:
This consistency and familiarity across clouds accelerates time to value and time to insight. Customers can query data without having to worry about various different hardware and infrastructure complexities, thanks to the portability afforded by our Anthos platform. So data analysts and data scientists can now focus on driving value and critical decisions and they don't have to focus on managing infrastructure.
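To illustrate the "consistent data experience" Saha describes: a cross-cloud query in BigQuery Omni is written in the same standard SQL a BigQuery user already knows, regardless of which cloud holds the data. The sketch below is hypothetical - the project, dataset and table names are made up for illustration - but it shows the point: the query itself carries no indication that the underlying data lives in AWS rather than Google Cloud.

```sql
-- Hypothetical sketch: aggregate clickstream events held in a dataset
-- backed by data residing in AWS, using ordinary BigQuery standard SQL.
-- No data is copied out of AWS to run this query.
SELECT
  event_type,
  COUNT(*) AS event_count
FROM `example-project.aws_dataset.clickstream_events`
WHERE event_date >= '2020-01-01'
GROUP BY event_type
ORDER BY event_count DESC;
```

The same query text would run unchanged against a Google Cloud-resident dataset - which is the "consistency and familiarity" argument in a nutshell.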
In addition to the BigQuery Omni announcement, Google Cloud today also announced new security features - primarily Confidential VMs. Again, this announcement shouldn't be viewed in isolation, but rather within the broader Google Cloud strategy of improving data use in an enterprise, thus adding value and creating 'stickiness'.
The Confidential VMs announcement falls under the bracket of 'Confidential Computing', which is intended to help organisations process sensitive data in the cloud. Confidential Computing environments keep data encrypted in-memory and elsewhere outside the central processing unit.
Confidential VMs offer memory encryption to customers so that they can further isolate workloads in the cloud. Sunil Potti, general manager and VP of cloud security at Google Cloud, said that the company's progression with Confidential Computing will help organisations wary of the cloud take the leap. He said:
Customers shouldn't have to choose between usability, performance and confidentiality. You normally have to choose two out of the three - you get usability and confidentiality, but not performance.
[With Confidential VMs] we are able to blend usability, performance and confidentiality in a much more consumable, mainstream adoptive use case for all of our customers. The way that we have implemented this technology offers real-time encryption in use, so that customers can ensure the confidentiality of their most sensitive data in the cloud, even while it's being processed.
It enables the last bastion of workloads. A simple example of that is that you see many hedge funds, or financial services firms, work their most sensitive IP around their algorithmic trading on premise because of the sensitivities around the data processing. Now they are able to fully embrace the power of cloud for all their workloads.
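Part of the "consumable, mainstream" pitch is that, in practice, a Confidential VM is created much like any other Compute Engine VM. A hedged sketch of what that looks like with the gcloud CLI at launch (the feature was in beta, on AMD EPYC-based N2D machine types) - the instance name, zone and image here are illustrative, not from the announcement:

```
# Illustrative example: create a Confidential VM (beta at the time of
# this announcement). The --confidential-compute flag enables memory
# encryption in use; an N2D machine type is required.
gcloud beta compute instances create example-confidential-vm \
    --zone=us-central1-a \
    --machine-type=n2d-standard-2 \
    --confidential-compute \
    --maintenance-policy=TERMINATE
```

The point of the design is that the workload itself needs no modification - the encryption of data in use happens beneath the guest OS, which is what makes the "lift the last bastion of workloads" claim plausible.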
There is no denying that in recent years Google Cloud has been playing catch up with AWS and Azure. And with multi-cloud just coming to the fore, it's too soon to tell which of the three firms is going to win big in this market over the long term (not just in terms of number of customers, but also in terms of the scale of the enterprise deals being done). However, I think what I can say is that I see an understanding from Google Cloud that enterprises aren't just interested in cheap, scalable kit that is easy to use. Enterprise buyers look to vendors to help them solve complex problems - and in terms of complex problems, data is front and centre. If Google Cloud can really execute on this and highlight customer examples of the work it is doing, then it stands a fighting chance of playing a very interesting role in this market.