Nvidia Blackwell server cabinets could cost somewhere around $2 to $3 million each

Skye Jacobs

Something to look forward to: Pricing for Nvidia's Blackwell platform has emerged in dribs and drabs, from analyst estimates to CEO Jensen Huang's comments. Simply put, it's going to cost buyers dearly to deploy these performance-packed products. Morgan Stanley estimates that Nvidia will ship 60,000 to 70,000 B200 server cabinets in 2025, translating to at least $210 billion in annual revenue. Despite the high costs, the demand for these powerful AI servers remains intense.

Nvidia has reportedly invested some $10 billion developing the Blackwell platform – an effort involving around 25,000 people. With all the performance packed into a single Blackwell GPU, it's no surprise these products command significant premiums.

According to HSBC analysts, Nvidia's GB200 NVL36 server rack system will cost $1.8 million, and the NVL72 will be $3 million. The more powerful GB200 Superchip, which combines a CPU with GPUs, is expected to cost $60,000 to $70,000 each. Each Superchip pairs two Blackwell GPUs with a single Grace CPU, backed by a large pool of high-bandwidth memory (HBM3e).

Earlier this year, CEO Jensen Huang told CNBC that a Blackwell GPU would cost $30,000 to $40,000, and based on this information Morgan Stanley has calculated the total cost to buyers. With each AI server cabinet priced at roughly $2 million to $3 million, and Nvidia planning to ship between 60,000 and 70,000 B200 server cabinets, the estimated annual revenue is at least $210 billion.
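The $210 billion figure follows from simple multiplication of the numbers above; a quick back-of-envelope sketch of the full range those figures imply (analyst estimates, not official Nvidia numbers):

```python
# Back-of-envelope check of the revenue range implied by the figures
# quoted above (Morgan Stanley / Huang estimates, not official numbers).
cabinets = (60_000, 70_000)       # projected B200 cabinet shipments, 2025
price_per_cabinet = (2e6, 3e6)    # USD per AI server cabinet

low = cabinets[0] * price_per_cabinet[0]
high = cabinets[1] * price_per_cabinet[1]
print(f"${low / 1e9:.0f}B - ${high / 1e9:.0f}B")  # → $120B - $210B
```

Note that the quoted $210 billion corresponds to the top of this range, pairing the higher shipment figure with the higher cabinet price.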

But will customer spending ever justify this? Sequoia Capital analyst David Cahn estimates that the annual AI revenue required to pay back the industry's infrastructure investments has climbed to $600 billion.

But for now there is little doubt that companies will pay the price, no matter how painful. The B200, with 208 billion transistors, can deliver up to 20 petaflops of FP4 compute power. It would take 8,000 Hopper GPUs consuming 15 megawatts of power to train a 1.8 trillion-parameter model.

Such a task would require 2,000 Blackwell GPUs, consuming only four megawatts. The GB200 Superchip offers 30 times the performance of an H100 GPU for large language model inference workloads and significantly reduces power consumption.
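Taking the article's Hopper and Blackwell figures for that 1.8-trillion-parameter training task at face value, the implied savings work out to roughly a 4x reduction in GPU count and a 3.75x reduction in power:

```python
# Rough comparison of the cluster figures cited above for training
# a 1.8-trillion-parameter model (numbers taken from the article).
hopper = {"gpus": 8_000, "power_mw": 15}
blackwell = {"gpus": 2_000, "power_mw": 4}

gpu_ratio = hopper["gpus"] / blackwell["gpus"]            # 4.0x fewer GPUs
power_ratio = hopper["power_mw"] / blackwell["power_mw"]  # 3.75x less power
print(f"{gpu_ratio}x fewer GPUs, {power_ratio}x less power")
```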

Due to high demand, Nvidia is increasing its orders with TSMC by approximately 25%, according to Morgan Stanley. It is no stretch to say that Blackwell – designed to power a range of next-gen applications, including robotics, self-driving cars, engineering simulations, and healthcare products – will become the de facto standard for AI training and many inference workloads.

AI is the biggest bubble I've ever witnessed in 30 years of working in the industry. It will fall hard, and extremely fast, when it does. There just won't be the return on investment to justify the costs. There will be benefits for those that can offer compelling products, but that will be counterbalanced by the massive costs from others that will piss it against the wall.

There are plenty of CEOs who just want to be in the moment, in case they are left out, without even being able to quantify what the returns will be to their business. Once they scale back their AI investments, things will get messy.
 
What makes you say that? I can think of 100 situations where AI will simply be better.

It will also likely displace a huge number of humans, as parts of the workforce are replaced with AI.

Tesla's AI integration, and the training data gathered from millions of Teslas in the first place, is a perfect example.

 
No one is saying that AI won't be useful – I've already found plenty of uses for it. What people are saying is that it is incredibly overvalued; this is what's referred to as a bubble.

Something I want to see is AI in games similar to Oblivion's Radiant AI. Radiant AI, even though it isn't anything like today's AI, created unique experiences. I want sandbox RPGs where we can tune the AI to give the player a desired gameplay experience. You want a Dark Souls experience in Skyrim? Go for it. You want an Oblivion experience in Cyberpunk? Awesome.

I see the biggest uses for AI in three major areas. One is a better version of procedurally generated content in games, without the No Man's Sky BS; another is being able to "talk" to your computer and say things like, "can you organize this data set in Excel for me and make a graph representing it"; and, finally, AI-generated porn.

What will probably happen is that the debt will be "sold" to a whole bunch of shell companies that will file for bankruptcy, so the big tech companies' books look clean.
 
I think the main and only issue here is: how are these companies going to make money from all the money they have thrown at Nvidia? It's not so much whether AI is useful or not. It has its uses, but how much, and how many people, are willing to pay for it? It's all good now, since most AI tools are free. But there is a limit to how much money tech companies can throw at filling up this massive hot-air AI balloon.
 