There's no going back: The new data center is dominated by Nvidia

Jay Goldberg

Why it matters: The revenue share figures for data center processors present a clear picture. Nvidia has experienced robust growth for years and currently leads the market. The primary question now is: what is the new normal for their market share?

A week ago, we came across a chart from HPC Guru on Twitter that was so striking we had to make our own version to really believe it. No slight to HPC Guru, it's just that the numbers were eye-opening, and in the end our numbers matched theirs.

Editor's Note:
Guest author Jonathan Goldberg is the founder of D2D Advisory, a multi-functional consulting firm. Jonathan has developed growth strategies and alliances for companies in the mobile, networking, gaming, and software industries.

The chart below shows data center processor revenue market share by quarter going back to 2Q19. It shows the collapse of Intel and the incredible rise of Nvidia. And of course, what really stands out in all of this is just the scale of Nvidia's surge. In the latest quarter, they claimed 73% market share. We knew they were doing well, but as they say, a picture is worth 1,000 words, or $14 billion a quarter.

Data Center Revenue Share – AMD, Intel and Nvidia

In the replies to that thread, Ian Cutress asked the reasonable question of how much the whole data center market has grown in that time. The answer is that the market grew at a 30% CAGR (compound annual growth rate) over this period – Intel shrank at a 6% rate, AMD grew by 27%, and Nvidia grew at 103% a year.
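For reference, the compound annual growth rate is just the ratio of ending to starting revenue, annualized over the length of the period. A minimal sketch in Python, with the revenue figures below chosen as hypothetical placeholders rather than the actual numbers behind the chart:

def cagr(start: float, end: float, years: float) -> float:
    # Compound annual growth rate: (end / start) ** (1 / years) - 1
    return (end / start) ** (1.0 / years) - 1.0

# Hypothetical illustration: quarterly revenue going from $0.7 billion to
# $14 billion over 4.25 years works out to roughly a doubling every year.
print(f"{cagr(0.7, 14.0, 4.25):.0%}")  # ~102%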

Also clear from this data is that Nvidia's truly spectacular growth came in the last year, after the unveiling of ChatGPT wowed the world. But even if we strip out those last 12 months, they were already growing at a 67% CAGR.

We tried plotting this as a common-size graph with 2Q19 set to 100, which is a handy way to compare growth rates.

Common Size Growth in Data Center Revenue

But Nvidia's growth is so strong that it drowns out the signal from all the others. So we plotted it again, without Nvidia. This time, we added TSMC to the mix as another proxy for the growth of the market.

Common Size Growth in Data Center Revenue ex-Nvidia
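For readers who want to rebuild the index themselves, the arithmetic is simple: divide each company's quarterly revenue by its 2Q19 value and multiply by 100. A minimal sketch in Python, using placeholder figures rather than our actual dataset:

def common_size(series: list[float]) -> list[float]:
    # Index every quarter to the first one, scaled so the starting quarter = 100.
    base = series[0]
    return [round(100.0 * value / base, 1) for value in series]

# Placeholder quarterly revenue (in $ billions) for one company, starting at 2Q19.
revenue = [4.0, 4.2, 4.1, 4.6, 5.0]
print(common_size(revenue))  # [100.0, 105.0, 102.5, 115.0, 125.0]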

For those interested in recreating their own charts, this is the raw data. Our sources for the data are company presentations, quarterly reports, and SEC filings. We looked at Data Center segment revenue, but during this period, both Intel and AMD reclassified how they segment their data, so the early quarters are a bit different from the more recent quarters.

For TSMC we used their "HPC" segment reports, which they only began breaking out in 2Q19, which is why we start the series there. Finally, it is worth keeping in mind that all of these segments include more than just CPUs and GPUs, as each company defines the segment a bit differently. This probably means that Intel's early revenue is inflated, as it included networking, memory, and a bunch of other products. So their decline is not quite as steep as it may appear. That all being said, the overall trend is still very clear.
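The share chart itself is just each company's segment revenue divided by the three-company total in a given quarter. A minimal sketch, again with made-up numbers standing in for the actual filings:

# Made-up data center segment revenue for a single quarter, in $ billions.
quarter = {"Intel": 4.0, "AMD": 1.6, "Nvidia": 14.0}

total = sum(quarter.values())
shares = {name: revenue / total for name, revenue in quarter.items()}

for name, value in shares.items():
    print(f"{name}: {value:.0%}")  # Intel: 20%, AMD: 8%, Nvidia: 71%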

We can draw two important conclusions from this. First, after years of effort and some serious share gains, AMD is still a distant third in this market with less than 10% share. They are still half of Intel's size, which is not a surprise, but stands out starkly in the light of Nvidia's rise.

The second, more important conclusion, is that this market is now permanently changed. Again, this is one of those things that everyone knows, but not everyone realizes. It is unlikely that Nvidia can retain 70% of wallet forever, but by the same token Intel had 90% share for over a decade. As much as we talk about the rise of heterogeneous compute, we are now entering a period where Nvidia is the common factor in the data center.

As much as the tech world is excited about the bold future of AI, another way to understand the rise of Large Language Models (LLMs) and AI in general is that the market is undergoing one of its periodic shifts. Just as the rise of Linux and Intel in the data center in the 1990s heralded a seismic shift in the market to a new paradigm of computing which today we call the cloud, the rise of LLMs seems to mark the rise of a new compute paradigm based on Nvidia silicon.

The question for the foreseeable future is not "Can Intel reclaim its data center crown?" (it cannot). Instead, the question is just how dominant Nvidia will be. Can they maintain a 70% share? (Probably not.) Or is 50% the long-term status quo? (Very possibly.)


 
What an insane comparison. Basically showing that selling GPUs costing tens of thousands of dollars generates more profit than server CPUs. Meh, it doesn't take a genius to predict what comes next.
 
What an insane comparison. Basically showing that selling GPUs costing tens of thousands of dollars generates more profit than server CPUs. Meh, it doesn't take a genius to predict what comes next.
While this is only a rumor and my source isn't the best, I've been told one reason Intel is developing their E-cores is that modern workloads benefit from scaling many small cores (GPUs) rather than fast single-threaded performance. We certainly need fast, large cores for the backbone to supply the GPUs with information to process, but the actual information processing is done elsewhere. I've also been told that the Arc GPUs are mostly a proving ground for Intel to start building GPU-like processing units for data center workloads.
 
I'm surprised it took this long. Datacenter applications have LONG benefitted from parallel computing, and GPUs dominate that arena. I suppose it's the result of Nvidia's software suite being more developed, or developers really loving CUDA for some reason.
 
Pretty bold predictions based on short-term AI hype, when Nvidia can charge whatever they want. Soon AMD will have a competitive product, and other companies will too. As we have seen, this kind of thing has happened before in the modern market, but none of those leads have been long-lived.
 
While this is only a rumor and my source isn't the best, I've been told one reason Intel is developing their E-cores is that modern workloads benefit from scaling many small cores (GPUs) rather than fast single-threaded performance. We certainly need fast, large cores for the backbone to supply the GPUs with information to process, but the actual information processing is done elsewhere. I've also been told that the Arc GPUs are mostly a proving ground for Intel to start building GPU-like processing units for data center workloads.

Well, as for rumors: imagine if AMD perfects the chiplet design for GPUs. Then imagine them putting that on a single chip. Two full 6- or 8-core CPU chiplets, with four full RDNA chiplets and an I/O die between them, for consumers. Yes, the CPU socket would become larger and the power draw would be insane, but so would the performance. The scalability alone would be impressive. An IHS the size of Threadripper's, with huge core counts for CPU and GPU on a single chip. It could be game-changing for AMD.

This also lines up with what you're saying; it could be why Nvidia is making their own ARM CPU.
 
I'm surprised it took this long. Datacenter applications have LONG benefitted from parallel computing, and GPUs dominate that arena. I suppose it's the result of Nvidia's software suite being more developed, or developers really loving CUDA for some reason.
I think of CUDA the same way I think of x86/x64. It was the first to get big and has so much development built around it that it's difficult and expensive to get away from. There has to be a major cost benefit in switching to justify development for a new architecture.
 
I think of CUDA the same way I think of x86/x64. It was the first to get big and has so much development built around it that it's difficult and expensive to get away from. There has to be a major cost benefit in switching to justify development for a new architecture.

Even if that architecture is better, the salesman wins. I.e., Winblows.
 
Well, as for rumors: imagine if AMD perfects the chiplet design for GPUs. Then imagine them putting that on a single chip. Two full 6- or 8-core CPU chiplets, with four full RDNA chiplets and an I/O die between them, for consumers. Yes, the CPU socket would become larger and the power draw would be insane, but so would the performance. The scalability alone would be impressive. An IHS the size of Threadripper's, with huge core counts for CPU and GPU on a single chip. It could be game-changing for AMD.

This also lines up with what you're saying; it could be why Nvidia is making their own ARM CPU.
So here is an interesting rumor, and I give this one more credit than the Intel rumor: AMD is working on a product where CPU chiplets share a memory pool with GPU chiplets, and it completely skips the PCIe bus.
Even if that architecture is better, the salesman wins. I.e., Winblows.
Windows isn't doing so well. Linux dominates the server space, and now ChromeOS is infecting schools like the plague. MS has no one to blame but themselves, either. Their forced updates make the OS unstable and unpredictable, and now they have their privacy issues. Windows 10 as it was first released was actually a good OS, but they thought their market share would stop them from falling from the top.

Windows has become so bad that you actually have companies dumping millions of dollars into developing Linux into a true Windows alternative. On top of that, with Android's massive market share, many developers are already familiar with Linux programming.
 
Windows has become so bad that you actually have companies dumping millions of dollars into developing Linux into a true Windows alternative. On top of that, with Android's massive market share, many developers are already familiar with Linux programming.

I do not disagree here. Windows has been shooting itself in the foot for years. They still have the same mentality: we have the biggest herd, and now we're shepherding them over to our new pile of crap to sell them.
 
So here is an interesting rumor, and I give this one more credit than the Intel rumor: AMD is working on a product where CPU chiplets share a memory pool with GPU chiplets, and it completely skips the PCIe bus.

Windows isn't doing so well. Linux dominates the server space, and now ChromeOS is infecting schools like the plague. MS has no one to blame but themselves, either. Their forced updates make the OS unstable and unpredictable, and now they have their privacy issues. Windows 10 as it was first released was actually a good OS, but they thought their market share would stop them from falling from the top.

Windows has become so bad that you actually have companies dumping millions of dollars into developing Linux into a true Windows alternative. On top of that, with Android's massive market share, many developers are already familiar with Linux programming.


The adult world is dominated by Office 365. Most Fortune 500 companies, and frankly most businesses in general, depend on Office 365 for day-to-day operations. SharePoint, Outlook, Azure: all pretty much done best on Windows...

ChromeOS has proven that it isn't making its way out of schools. Kids are forced to use it, then they grow up and use real tools. Sheets, Docs, etc.: the entire Google version of online office sucks for anything more than basic usage. And as a business, it is hard to take anyone who doesn't use industry-standard tools seriously.
 
The adult world is dominated by Office 365. Most Fortune 500 companies, and frankly most businesses in general, depend on Office 365 for day-to-day operations. SharePoint, Outlook, Azure: all pretty much done best on Windows...

ChromeOS has proven that it isn't making its way out of schools. Kids are forced to use it, then they grow up and use real tools. Sheets, Docs, etc.: the entire Google version of online office sucks for anything more than basic usage. And as a business, it is hard to take anyone who doesn't use industry-standard tools seriously.
Office 365 is cloud-based and can be used in any HTML5 browser. It has become OS-agnostic. And while some businesses may use Windows, I know that Target has moved over completely to ChromeOS, and all of the computers the company that I work for has purchased over the last two years have been Chrome-based.

This massive push towards subscriptions and the cloud has made operating system choice almost irrelevant outside of specific applications. Even then, no one at my company has ever noticed that I'm running FreeCAD on Linux Mint.
 
I work from home, and the company I work for is still buying Windows PCs. But they do not have to; all of the programs I use for work are hosted for web-based access, and I can use any browser I want.

While office personnel will most likely use Windows or macOS, the change or addition of ChromeOS will happen because Google played the long game. By placing ChromeOS into schools, every kid will know how to use the basics and be familiar with the OS.
 
Well, as for rumors: imagine if AMD perfects the chiplet design for GPUs. Then imagine them putting that on a single chip. Two full 6- or 8-core CPU chiplets, with four full RDNA chiplets and an I/O die between them, for consumers. Yes, the CPU socket would become larger and the power draw would be insane, but so would the performance. The scalability alone would be impressive. An IHS the size of Threadripper's, with huge core counts for CPU and GPU on a single chip. It could be game-changing for AMD.

This also lines up with what you're saying; it could be why Nvidia is making their own ARM CPU.
That has been AMD's wet dream over the last decade. Their work and efforts on HSA, and their dreams of "vision," failed miserably due to bad business decisions. I'm sorry to say (while I'd be happy if AMD could design such an architecture as you mentioned) that that cake will be eaten by either ARM, Apple, or Nvidia (or a combination of these) in the near future. To this day, I've been wondering why AMD did not create a heterogeneous system architecture, be it on a smaller scale (for desktops, mobile, etc.) or in the shape of a monstrosity like you described. Something like Apple's A1 could easily have been designed by AMD years ago, but no, they did not.
 
Thank you so much for this insightful article! It's clear Nvidia is no longer a gaming company; they're a data centre & AI company.

Their gaming revenue last quarter was "only" $2.8 billion, whereas their data centre & AI revenue was an astounding $18 billion in 1 quarter!!!!

When people comment on their graphics cards, it's not even remotely important for Nvidia anymore. I can imagine a future where they exit the Gaming scene completely because it'll be peanuts compared to their Data Centre & AI revenue.
 
I can imagine a future where they exit the Gaming scene completely because it'll be peanuts compared to their Data Centre & AI revenue.
Maybe, but I'm not so sure. I mean, $2.8 billion is still nothing to sneeze at. You know, a couple of billion here, a couple of billion there, pretty soon you're talking about real money. :)
 