I asked him at the time whether he saw this as Google simply using Nutanix as a Trojan Horse to muscle in on the enterprise market, where it has not had the success or penetration that rivals AWS and Azure have had. He saw it as a partnership with much more balanced objectives.
The best partnerships are a win-win; they are always complementary. They want to learn about the enterprise, and we want to learn about webscale, so we can see how our customers could punch a hole through the network and get to their webscale services.
But both Pandey and Senior Vice President of Engineering and Product Management, Sunil Potti, see something more profound starting to happen in the enterprise marketplace as the full potential of cloud-delivered services starts to infiltrate large enterprise users.
Pandey, for example, pointed at the recent acquisition by Microsoft Azure of the Chevron data center operation as a perfect case in point. This is not a contract to upgrade the systems, or to off-load workloads.
This is to take over the whole data center operation, including the systems and facilities. This is just like the old days of IBM and the mainframe, when the company would buy back the systems and take on the developer and operations staff. Chevron gets to use and exploit its data, for fees that are additional to the $80 million acquisition cost, but no longer has to play at being ship's engineers and stokers. It also gets access to the latest hardware, apps and tools. As it wants to accelerate its development of data analytics and IoT capabilities, the move arguably makes a good deal of sense.
I’m playing too, says Google
This is exactly the market Google is also pitching at. Indeed, the company has established a new group specifically to get into the energy businesses, using a sales pitch built on the estimate that most of them can exploit only about 5% of the data they generate, and that Google can squeeze the remaining 95% a great deal harder than they can themselves.
We use a lot of open source tools, and we have built a lot of them ourselves. If we were to someday run on a nested architecture, where we are not running on their bare metal but on their hypervisor, then some of those tools become more relevant. Then we become an app on top of Google. The question is then, can we get to the point where we run on Google and users do not have to change anything? That is when all those tools may become more relevant.
Potti feels that AWS and Azure have already broken into the enterprise market and picked up the low-hanging fruit such as new enterprise applications and new dev/test opportunities. This is roughly where we stand today and in his view it is a market worth approximately $50 billion. But he now sees a public cloud market beyond this that should be worth around $250 billion sometime out into the future. It could be two years, it could be 10, but he sees Google being a major player in this area.
Imagine the public cloud is a new neighbourhood, but we live in the old city - the enterprise. There are two options: I can move my family and go to the new neighbourhood or stay where I am. My roots and friends are still there, and I can go and visit the new neighbourhood. But what if the developers of the new neighbourhood then set out to gentrify the old one?
That is what he sees starting to happen with cloud. He sees the likes of AWS, Azure and Google starting to pitch at re-platforming the enterprise. So what if Google partners with a company that is already in the business of re-platforming the enterprise? He sees that as helping both partners.
But even AWS is beginning to realise that, while it has a long tail of new business in its current operations, it is really only going after some 10% of the enterprise market - and failing to get near the remaining 80% of the available business.
One reason is that the 80% will require some heavy re-factoring and re-platforming. And the way this is likely to happen is already underway: the re-appearance of an age-old business model brought bang up to date - outsourcing.
If an enterprise has a five-year contract (seven years in the case of the Azure/Chevron deal), it is much more likely to use the public cloud service to off-load more and more work.
Once you’re in why step back out?
But that is just the start. It also means that, as the time for upgrades of their original data centers looms, they will have built a track record of (hopefully) reliable service provision on systems that are better than the legacy kit, making physical upgrades a less favourable financial and operational alternative.
There is scope to add new services as well. One of the possible next steps Nutanix could consider offering is re-engineering tools for existing legacy applications. These could take the known inputs and the known-good outputs generated by the legacy applications to provide a ready-made test pattern for re-engineering old applications. Potti said:
In the enterprise there are apps that won't allow you to do auto-translation, they are really closed apps and they will be around for a good while yet. So these need to be wrapped in an application bubble that can be plugged in to the new environment.
That way all the licences and legalities are protected. Imagine yourself as the CIO: when you get involved in this sort of work you will need someone you can rely on as a lifelong partner, and the 'IBM equivalent' these days will be the likes of AWS, though for now they don't have the servicing muscle to do it.
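The re-engineering approach described above - capturing a legacy application's known inputs and known-good outputs as a ready-made test pattern - is essentially what developers call characterization (or "golden master") testing. A minimal sketch of the idea follows; the function names and discount logic are hypothetical stand-ins, not anything from Nutanix's actual tooling:

```python
# Characterization ("golden master") testing sketch: record a legacy
# routine's known inputs and known-good outputs, then reuse them as a
# regression suite for the re-engineered replacement.
# All names and the discount rule here are hypothetical illustrations.

def legacy_discount(order_total):
    # Stand-in for the closed legacy routine being replaced.
    if order_total >= 1000:
        return round(order_total * 0.90, 2)
    return order_total

# Step 1: capture known inputs and their known-good outputs.
golden_master = {total: legacy_discount(total)
                 for total in (50, 999, 1000, 2500)}

def new_discount(order_total):
    # Re-engineered implementation that must reproduce legacy behaviour.
    rate = 0.90 if order_total >= 1000 else 1.0
    return round(order_total * rate, 2)

# Step 2: the captured pairs form the ready-made test pattern.
for order_total, expected in golden_master.items():
    assert new_discount(order_total) == expected, (order_total, expected)
```

The appeal for closed legacy apps is that the test pattern is derived purely from observed behaviour, with no need for access to the original source code.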
He reckons Amazon is getting closer now, while Microsoft has had that capability for a long while. This will become like a consultative process, and for Google in particular, that is where the partnership with Nutanix comes into play because it is sitting on the inside of the enterprise market already.
And that is the key to the Google relationship because it does not have the enterprise piece at all for now, but they see what they need and what they are missing. Potti added:
It is like Google has the best subway system in the world, but they have no idea how to sell tickets.
This is the role that OpenStack would claim as its primary bailiwick, but Potti would now put that in the past tense.
Five years ago it was where OpenStack was positioned but it just didn't deliver.
It is also true to suggest that many enterprise users were simply not ready for it back then. The other aspect is that this is only partially a technology problem anyway: what enterprise users are looking for are businesses with the economic muscle and management expertise, as well as the technology. This is now about more than the enterprise simply rolling some new data center architecture into its own facilities; it is about off-loading the facilities so they can concentrate on exploiting the IT, not running it.
In Potti’s view, the true enterprise renaissance starts here. You'll see more of these synergies coming together over the next 12 months, and it will probably be over the next five years that we get close to that $250 billion market size.
All companies, from the biggest to the smallest, are the sum of two things: their data, and their ability to collect, collate and manipulate that data in meaningful ways. From that comes the information they require, the knowledge to work that information in different ways, and the wisdom to select the tasks that achieve the most desired results. The downside of all that, especially for the biggest, is that it has also always brought with it the multi-million-dollar costs of providing the tools and expertise needed to make any of it happen.
Outsourcing has been tried before, though more for budgetary reasons than real business development. Now, with the cloud connecting commodity hardware of prodigious power to run applications offering significant productivity potential, all built around a growing raft of common standards, the time does seem to be right for the next step - where enterprises really start to exploit the resources available within and through the main public cloud service providers and forget about being the providers of the service.
I can remember when big enterprises commonly had their own mini power stations for the provision of energy and heat. None would think of doing it now. So how long before they don’t think about having their own data centers?