Whilst the rest of the technology industry has been wrapped up in the OpenAI leadership debacle over the last few days, with many an executive playing out a Succession-esque psychodrama in public, NVIDIA continues to win big in the AI arms race, with its chips fuelling massive growth at the company.
Announcing its third quarter fiscal 2024 earnings, NVIDIA reported record revenue of $18.12 billion, up 34% from Q2 and up 206% year-over-year. Whilst the rest of the industry fights over a pool of talent to inch ahead of competitors, it seems NVIDIA and its investors are laughing all the way to the bank.
Speaking with analysts this week, EVP and CFO Colette Kress said:
Revenue of $18.1 billion was up 34% sequentially and up more than 200% year-on-year, well above our outlook for $16 billion.
Some of the most exciting generative AI applications are built and run on NVIDIA, including Adobe Firefly, ChatGPT, Microsoft 365 Copilot, Now Assist with ServiceNow and Zoom AI Companion. Our Data Center compute revenue quadrupled from last year and networking revenue nearly tripled.
The earnings call was a who’s who of technology companies cited as buyers, all demanding NVIDIA’s hardware. That’s not to say that there aren’t any challenges ahead. Kress also took time to highlight the impact the US Government’s new chip export regulations will have on the company, which are intended to limit China’s access to the latest hardware, as AI development takes center stage in geopolitical rivalries.
Kress said that regions affected by the new export regulations currently contribute approximately 20% to 25% of NVIDIA’s Data Center revenue (which was $14.51 billion this quarter) and that the new licensing requirements will significantly impact sales in China and other impacted regions. Kress added:
Following the government's clear guidelines, we are working to expand our Data Center product portfolio to offer compliance solutions for each regulatory category, including products for which the U.S. government does not wish to have advance notice before each shipment.
We are working with some customers in China and the Middle East to pursue licenses from the U.S. government. It is too early to know whether these will be granted for any significant amount of revenue.
Sovereign AI clouds
Looking ahead, NVIDIA CEO Jensen Huang had some interesting comments to make regarding the opportunity of nation states investing in their own domestic compute capacity, using their own data to train LLMs and support local generative AI ecosystems. This will of course be a further boon to NVIDIA, which will likely play a central role in providing the hardware for this infrastructure.
The company pointed to India, France, Sweden and Japan as some of the countries already pursuing a sovereign AI cloud. CFO Kress went as far as to say that investing in national compute capacity is a “new economic imperative” and that the sovereign AI infrastructure market represents a “multi-billion dollar opportunity over the next few years”.
CEO Huang said that advancements in AI are transforming hardware, particularly data centers, with investment shifting away from multi-tenant infrastructure towards wholly owned infrastructure built to train AI models. He said:
Unlike the data centers of the past, where you have a lot of applications running, used by a great many people that are different tenants using the same infrastructure, and that data center stores a lot of files…these new data centers are very few applications, if not one application, used by basically one tenant. And it processes data, it trains models and then generates tokens and generates AI.
It feels jarring to refer to cloud-based infrastructure as “data centers of the past”, given that many organizations and nations are still in a cloud-transition phase, but it appears, according to NVIDIA, that we are heading towards single-tenant infrastructure once again - something Huang describes as “AI factories”. He said:
We're seeing AI factories being built out everywhere, and just about by every country.
And so if you look at the way where we are in the expansion, the transition into this new computing approach - the first wave you saw with large language model start-ups, generative AI start-ups and consumer Internet companies.
Meanwhile, while that's being ramped, you see that we're starting to partner with enterprise software companies who would like to build chatbots and copilots and assistants to augment the tools that they have on their platforms.
And the next phase will be focused on nation states developing their own AI infrastructure. Huang added:
You're seeing sovereign AI infrastructures, countries that now recognize that they have to utilize their own data, keep their own data, keep their own culture, process that data and develop their own AI. You see that in India, about a year ago in Sweden, you are seeing it in Japan. Last week, a big announcement in France. But the number of sovereign AI clouds that are being built is really quite significant.
And my guess is that almost every major region will have and surely every major country will have their own AI clouds. And so I think you're seeing just new developments as the generative AI wave propagates through every industry, every company, every region. And so we're at the beginning of this inflection, this computing transition.
You know the common joke that if you could go back in time and invest in Apple stock before the iPhone was invented, you’d be laughing? That feels very similar with NVIDIA, which has seen its share price effectively quadruple over the past couple of years - thanks to AI. Whilst the rest of the industry battles it out for AI relevance, NVIDIA appears to have made some early bets that will see it confidently holding a leading voice for some time to come.