Nvidia unveils next-gen Blackwell GPUs with more horsepower and better power efficiency

DragonSlayer101

Forward-looking: At Nvidia's GTC 2024 conference on Monday, the company unveiled its Blackwell GPU platform, which it says is designed for generative AI processing. The next-gen lineup includes the B200 GPU and the GB200 Grace "superchip," which offer all the grunt needed by LLM inference workloads while substantially reducing energy consumption.

The new Nvidia B200 GPU packs 208 billion transistors and offers up to 20 petaflops of FP4 performance. It also includes a second-generation transformer engine with FP8 precision. The GB200 Grace Blackwell superchip combines two of these B200 GPUs with an Nvidia Grace CPU, connecting them over an NVLink chip-to-chip (C2C) interface that delivers 900 GB/s of bidirectional bandwidth.

The company claims the new accelerators will enable breakthroughs in data processing, engineering simulation, electronic design automation, computer-aided drug design, and quantum computing.

According to Nvidia, just 2,000 Blackwell GPUs can train a 1.8 trillion parameter LLM while consuming four megawatts of power, whereas the same task would previously have required 8,000 Hopper GPUs and 15 megawatts.
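
For readers who want to sanity-check that claim, here is a rough back-of-the-envelope calculation using only the figures Nvidia quotes above. It assumes both runs take roughly the same wall-clock time, which Nvidia does not state, so treat the energy comparison as an approximation rather than a measured result:

# Back-of-the-envelope check of the training-efficiency claim, using only the figures quoted above
hopper_gpus, hopper_mw = 8_000, 15.0       # previous-gen setup for the 1.8-trillion-parameter run
blackwell_gpus, blackwell_mw = 2_000, 4.0  # Blackwell setup for the same run, per Nvidia

# Average power draw per GPU, in kilowatts (including whatever overhead Nvidia folds into the totals)
print(f"Hopper:    {hopper_mw * 1000 / hopper_gpus:.2f} kW per GPU")        # ~1.88 kW
print(f"Blackwell: {blackwell_mw * 1000 / blackwell_gpus:.2f} kW per GPU")  # ~2.00 kW

# If both runs take about the same wall-clock time (an assumption), energy scales with power
print(f"Power, and roughly energy, reduction: {hopper_mw / blackwell_mw:.2f}x")  # 3.75x

In other words, the headline saving comes from needing a quarter as many GPUs at a similar per-GPU power draw, not from each Blackwell GPU drawing less power than a Hopper GPU.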

The company also claims that on a GPT-3 LLM benchmark with 175 billion parameters, the GB200 delivers a 7x performance uplift over an H100, along with 4x faster training performance. The new chips can potentially reduce operating costs and energy consumption by up to 25x.

Also read: Not just the hardware: How deep is Nvidia's software moat?

Alongside the individual chips, Nvidia also unveiled a multi-node, liquid-cooled, rack-scale system called the GB200 NVL72 for compute-intensive workloads. It combines 36 Grace Blackwell superchips (72 Blackwell GPUs and 36 Grace CPUs) interconnected by fifth-generation NVLink.

The system packs 30TB of fast memory, delivers 1.4 exaflops of AI performance, and serves as a building block for the newest DGX SuperPOD. Nvidia claims it offers 30 times the performance of an H100-based system in resource-intensive applications like the 1.8-trillion-parameter GPT-MoE.
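
Those rack-level numbers line up reasonably well with the per-GPU specs quoted earlier, assuming the 1.4 exaflops figure refers to peak FP4 throughput; the article does not spell that out, so treat the following as a sanity check rather than a confirmed breakdown:

# Cross-check of the GB200 NVL72 rack figure against the per-GPU spec quoted earlier (assumes FP4 peak)
superchips_per_rack = 36
gpus_per_superchip = 2
fp4_petaflops_per_gpu = 20  # peak FP4 throughput per B200, per Nvidia

gpus = superchips_per_rack * gpus_per_superchip       # 72 GPUs
rack_exaflops = gpus * fp4_petaflops_per_gpu / 1000   # petaflops -> exaflops
print(f"{gpus} GPUs, ~{rack_exaflops:.2f} exaflops peak")  # 72 GPUs, ~1.44 exaflops, close to the quoted 1.4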

Many organizations and enterprises are expected to adopt Blackwell, and the list reads like a virtual who's who of Silicon Valley. The biggest US tech companies set to deploy the new GPUs include Amazon Web Services (AWS), Dell, Google, Meta, Microsoft, OpenAI, Oracle, Tesla, and xAI.

Unfortunately for gaming enthusiasts, Nvidia CEO Jensen Huang did not reveal or hint at anything about the Blackwell gaming GPUs that are expected to launch later this year or in early 2025.

The lineup is expected to be led by the GeForce RTX 5090 powered by the GB202 GPU, while the RTX 5080 will be underpinned by the GB203.


 
:rolleyes: Has Nvidia told gamers "So long and thanks for all the fish" or will gamers, on the backs of whom Nvidia has built their business, continue to get recognition? That is the question.

And only 4 megawatts of power for what used to be 15 megawatts? Quite a power savings, yet how many homes, or substantial numbers of anything else, could be powered with 4 megawatts? All for the sake of a BS marketing scam that can still be fooled into creating chaos.

It's no wonder this world is F'd up, IMO.

The world's priorities are definitely in the right place? 🤔
 
According to Nvidia (https://nvidianews.nvidia.com/news/nvidia-blackwell-platform-arrives-to-power-a-new-era-of-computing), just 2,000 Blackwell GPUs can train a 1.8 billion parameter LLM while consuming just four megawatts of power, whereas it would have earlier taken 8,000 Hopper GPUs and 15 megawatts to complete the same task.
The Verge reported this as a 1.8 trillion parameter LLM, so is it billion or trillion?

Either way pretty neat!
 
I said 3 years ago when looking at Ampere/Ada architecture.... that NVidia no longer makes Gaming GPUs.

They are just using Enterprise architecture and trying to market them as "Gaming" to ignorant people who bandwagon RTX-ON marketing. That is the exact reason EVGA left NVidia: Ada Lovelace is not game-specific architecture like RDNA is.

 
All for the sake of a BS marketing scam that can still be fooled into creating chaos.
Not sure I understand this sentiment, or anyone that claims that AI isn't creating immense value. Are there questionable deployments of gen AI and ethical/economic implications thereof? Yes, absolutely. But gen AI is most certainly not some sham tech that's going away or over-hyped like the metaverse was. You can't be a trillion dollar company on hot air alone.
 
I said 3 years ago when looking at Ampere/Ada architecture.... that NVidia no longer makes Gaming GPUs.

They are just using Enterprise architecture and trying to market them as "Gaming" to ignorant people who bandwagon RTX-ON marketing. That is the exact reason EVGA left NVidia: Ada Lovelace is not game-specific architecture like RDNA is.
Non-gaming they might be but… they still wipe the floor with AMD… let’s hope AMD steps it up next generation.
 
Not sure I understand this sentiment, or anyone that claims that AI isn't creating immense value. Are there questionable deployments of gen AI and ethical/economic implications thereof? Yes, absolutely. But gen AI is most certainly not some sham tech that's going away or over-hyped like the metaverse was. You can't be a trillion dollar company on hot air alone.


It is NOT AI... though. Just a talking encyclopedia...
 
Not sure I understand this sentiment, or anyone that claims that AI isn't creating immense value. Are there questionable deployments of gen AI and ethical/economic implications thereof? Yes, absolutely. But gen AI is most certainly not some sham tech that's going away or over-hyped like the metaverse was. You can't be a trillion dollar company on hot air alone.
Nvidia created their own market through Marketing. Need I say more?

While it's semantics, I don't agree with the term "immense." That's relative to individual judgement. AI has, so far, proven useful for medical applications and materials science. Beyond that, I consider its value questionable.

It can't think for you. It cannot substitute for human companionship. It cannot truly create - it can only mash together what it has already found. It can provide answers that are worthless and/or not based in fact, and researchers are finding ways to trick it into ignoring its own programming and providing answers that are potentially harmful. For instance: https://www.techspot.com/news/102304-if-you-teach-chatbot-how-read-ascii-art.html Doubtless, those issues will be patched. However, there is no indication as to how many more exploits exist, whether the fixes will also be exploitable, and so on.

As I see it, the future of AI remains to be seen. Personally, it has no value for me. I see it as a glorified search engine that really does not offer any advantage over ordinary search engines. And then there is its obviously enormous use of energy.

It's far from the solution to problems that its marketing is designed to make everyone believe it is.
 
:rolleyes: Has Nvidia told gamers "So long and thanks for all the fish" or will gamers, on the backs of whom Nvidia has built their business, continue to get recognition? That is the question.

And only 4 megawatts of power for what used to be 15 megawatts? Quite a power savings, yet how many homes, or substantial numbers of anything else, could be powered with 4 megawatts? All for the sake of a BS marketing scam that can still be fooled into creating chaos.

It's no wonder this world is F'd up, IMO.

The world's priorities are definitely in the right place? 🤔
Believe it or not, nvidia is NOT going to leave a $10 BILLION industry just because they invented a new $40 BILLION one. $10b is $10b.

EVGA did not leave because nvidia didn't make "gaming GPUs". That's straight up delusional. EVGA left the market because they couldn't make money, and the longer time goes on, the more comes out that EVGA was horribly mismanaged and wasted money on pet projects that cost them in the long run. Surprise, that doesn't work well. Somehow Asus, Gigabyte, and MSI have no issue making money on GPUs. Hmmmm....

People don't understand how businesses work..... Intel has made server CPUs for decades; are they gonna stop making consumer CPUs as a result? No, that's just dumb. I swear the internet is rotting people's brains.
I said 3 years ago when looking at Ampere/Ada architecture.... that NVidia no longer makes Gaming GPUs.

They are just using Enterprise architecture and trying to market them as "Gaming" to ignorant people who bandwagon RTX-ON marketing. That is the exact reason EVGA left NVidia: Ada Lovelace is not game-specific architecture like RDNA is.
That marketing for their "enterprise only" architecture works well, considering these "non-gaming" GPUs beat the "game-specific" RDNA in efficiency and outright performance. Hmmmm......

Also, again, is the internet rotting people's brains? Enterprise versions of cards with different features have existed for decades. Tesla, Fermi, Kepler, Pascal - all of them had enterprise versions with greater FP performance, ECC memory, etc. Nobody claimed nvidia was dropping out of the market then. Why would they now? Just because the market has exploded? That's the rise and fall of business; no sensible individual cuts themselves out of a market just because it's merely majorly profitable instead of insanely profitable.
 
It is NOT AI... though. Just a talking encyclopedia...
By today's standards, a biased encyclopedia at best. Until they upload a full human consciousness, this is all infancy-level stuff. Can an AI understand humor? Sarcasm, or self-derived empathy while being selfless and not selfish?
Wasn't it ironic when, at the beginning of the show, they said they are in search of endless clean energy while it competes for your energy demands 🤔. Imo*
 
Believe it or not, nvidia is NOT going to leave a $10 BILLION industry just because they invented a new $40 BILLION one. $10b is $10b.

EVGA did not leave because nvidia didn't make "gaming GPUs". That's straight up delusional. EVGA left the market because they couldn't make money, and the longer time goes on, the more comes out that EVGA was horribly mismanaged and wasted money on pet projects that cost them in the long run. Surprise, that doesn't work well. Somehow Asus, Gigabyte, and MSI have no issue making money on GPUs. Hmmmm....

People don't understand how businesses work..... Intel has made server CPUs for decades; are they gonna stop making consumer CPUs as a result? No, that's just dumb. I swear the internet is rotting people's brains.

That marketing for their "enterprise only" architecture works well, considering these "non-gaming" GPUs beat the "game-specific" RDNA in efficiency and outright performance. Hmmmm......

Also, again, is the internet rotting people's brains? Enterprise versions of cards with different features have existed for decades. Tesla, Fermi, Kepler, Pascal - all of them had enterprise versions with greater FP performance, ECC memory, etc. Nobody claimed nvidia was dropping out of the market then. Why would they now? Just because the market has exploded? That's the rise and fall of business; no sensible individual cuts themselves out of a market just because it's merely majorly profitable instead of insanely profitable.
Believe it or not, I'm taking a wait-and-see attitude - especially when it involves the latest money-making fad/scheme - AI.
 
Believe it or not, I'm taking a wait-and-see attitude - especially when it involves the latest money-making fad/scheme - AI.
They'd be fools to leave it... all they have to do is churn out crippled "AI" cards and they'll make a killing. Intel and AMD have nothing to compete with at the high end. It's virtually pure profit for Nvidia and they can charge whatever they please for the 80 and 90 level cards.
 
They'd be fools to leave it... all they have to do is churn out crippled "AI" cards and they'll make a killing. Intel and AMD have nothing to compete with at the high end. It's virtually pure profit for Nvidia and they can charge whatever they please for the 80 and 90 level cards.
🤣 Now, did I say that they were going to leave it? As I see it, there is no certainty as to how long the AI fad will last. It's nowhere near the marketing claims for it - not without a substantial amount of work, and end-users have no control over the AI code itself. It's not like any end-user can modify that code to make AI better. In many cases, AI produces crap that is worse than a normal internet search, and at the cost of significant amounts of power consumed - for what? Crap?
 
What's unfortunate and sad is that Nvidia at this point can charge any price for their GPUs and gamers will still buy them, regardless of how much the price increases....

 
What's unfortunate and sad is that Nvidia at this point can charge any price for their GPUs and gamers will still buy them, regardless of how much the price increases....
Technically, no. There is a finite number of gamers willing to pay more than $999 for GPUs, as shown by the market's rejection of the original 4080 and now its acceptance of the 4080 Super at that mark. Also, the market rejected the 4090 at more than $2,000, and it is currently selling for $1,800. Also, contrary to what is being said, the economy was in a better position when the 4090 launched vs the Blackwell launch window. Imo, if they price the 5090 closer to $2,000, it would be very successful if the performance rumors pan out.
 
Technically, no. There is a finite number of gamers willing to pay more than $999 for GPUs, as shown by the market's rejection of the original 4080 and now its acceptance of the 4080 Super at that mark. Also, the market rejected the 4090 at more than $2,000, and it is currently selling for $1,800. Also, contrary to what is being said, the economy was in a better position when the 4090 launched vs the Blackwell launch window. Imo, if they price the 5090 closer to $2,000, it would be very successful if the performance rumors pan out.

I wish those prices were here, but unluckily the 4090 retails at 3.4k, while the 4080 goes for 1.7k here in Australia.....
 
They'd be fools to leave it... all they have to do is churn out crippled "AI" cards and they'll make a killing. Intel and AMD have nothing to compete with at the high end. It's virtually pure profit for Nvidia and they can charge whatever they please for the 80 and 90 level cards.

The only card Nvidia currently has that is faster than AMD's is the 4090, and even then the 7900 XTX is able to meet or beat it in a few titles if you go by unbiased reviews. If you want to see biased GPU reviews, head over to Hardware Unboxed; for some reason, they always seem to have much lower numbers for AMD cards than what most people get in real life. I know this because I have a 7900 XTX and their numbers are always lower than what my own card gets, whether I have mine overclocked or not. It is a shame, though, because they used to be one of the better places to go for reviews that were not somehow off from what they should be. I still go there for their CPU stuff, though. It was right after they dissed RTX as a complete waste of time and Nvidia threatened to blacklist them that their whole outlook changed, and all of a sudden Nvidia could do no wrong and AMD sucked and was a waste of time in their eyes.

Anyway, back on topic: my XTX is faster than the 80-series cards, even more so because I allow it to run at over 3GHz core and 2800MHz VRAM speed. But no, it is not as fast as a 4090, except in a couple of games where Nvidia does badly; my card will pull ahead then, but that is not often at all. The 4090 is a monster card, and the 5090 is going to rock even more. As for AMD, with RDNA 4 it is too bad; they are saying "oh, we decided not to do the extreme high end this time, but we will for RDNA 5 instead." Translation: we have nothing to match the 5090, so we are not going to even try. It is an excuse, but it is what it is, I guess. It is too bad Nvidia has pulled some pretty dirty tactics going back years, and of course the recent thing they did by trying to blackmail the card makers into using their GPUs only (I forget what everyone was calling that, but it was an asshat move to the extreme), and yet everyone still flocks to them to buy their stuff. I quit buying their GPUs when they pulled that tactic and won't buy their products again.

 
It is NOT AI... though. Just a talking encyclopedia...
It's not general AI (as in strong intelligence), but generative AI (as in, it can extrapolate and create information not found in its training set) is the term everyone uses. Applying the term AI to machine learning algorithms isn't new, either, so it isn't inappropriate.

Nvidia created their own market through Marketing. Need I say more?

While it's semantics, I don't agree with the term "immense." That's relative to individual judgement. AI has, so far, proven useful for medical applications and materials science. Beyond that, I consider its value questionable.

It can't think for you. It cannot substitute for human companionship. It cannot truly create - it can only mash together what it has already found. It can provide answers that are worthless and/or not based in fact, and researchers are finding ways to trick it into ignoring its own programming and providing answers that are potentially harmful. For instance: https://www.techspot.com/news/102304-if-you-teach-chatbot-how-read-ascii-art.html Doubtless, those issues will be patched. However, there is no indication as to how many more exploits exist, whether the fixes will also be exploitable, and so on.

As I see it, the future of AI remains to be seen. Personally, it has no value for me. I see it as a glorified search engine that really does not offer any advantage over ordinary search engines. And then there is its obviously enormous use of energy.

It's far from the solution to problems that its marketing is designed to make everyone believe it is.
I understand your take on the disagreement about the term "immense". I stand by the term, but I would qualify it by saying it is currently selective. As you point out, it has many limitations, and its value is not universal. I do think it will get there (or be near universal), as almost every person uses language, and I can imagine it providing benefits in all kinds of ways in every corner of one's life, even without the audio or visual components of generative AI. I also see that pervasiveness as a gateway to all kinds of privacy violations, but that's a product of commercialization that can be overcome with the right legislation. What is less certain is how the economics play out for those whose jobs are threatened by it.

It has a ways to go; it's not easy to get it to solve many kinds of problems (certainly not in a repeatable, production-grade kind of way). But unlike the over-hyped metaverse, the path for progress and expanding usefulness is much easier to see here. That's why I don't understand the sentiment that the marketing around this is all fluff. Sure, Nvidia plays their cards well, but they are capitalizing on an existing and growing market demand - they aren't manufacturing a market for this (unlike what Meta is trying to do with the metaverse).
 
🤣 Now, did I say that they were going to leave it? As I see it, there is no certainty as to how long the AI fad will last. It's nowhere near the marketing claims for it - not without a substantial amount of work, and end-users have no control over the AI code itself. It's not like any end-user can modify that code to make AI better. In many cases, AI produces crap that is worse than a normal internet search, and at the cost of significant amounts of power consumed - for what? Crap?
On the contrary, there are plenty of open-source models and open-source libraries that make use of these models. We have many projects in the pipeline taking advantage of these at my work, and we aren't using the proprietary services for any of our pipelines (we use the proprietary assistants to help us code, but not as a component of what we are building with language models). You are right that it does take a substantial amount of work; even the relatively benign and boring applications we are focused on (automating things, semantic search, etc.) take considerable effort to get right. It's easy to generate a cool example; it's much harder to get something repeatable into production.

But we can modify these models, we can fine-tune them, and if we pay a pretty penny we can even build our own foundational models. And this isn't restricted to work; you can do these things (well, maybe not building a foundational model of any useful size) on your personal computer, too - a minimal sketch follows at the end of this post.

On the marketing side, if there is a company guilty of marketing this too much, I would say it is Intel and their ambition to create AI PCs. Guess what? We already have them. We don't need an NPU; a GPU is more than capable, and a CPU is fine if you don't mind it being a bit slow. The real limitation, for inference anyway, is memory - not compute. I don't see an NPU changing that.
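
To illustrate the point above about running open models on a personal machine, here is a minimal sketch using the Hugging Face transformers library. The model name is only an example of a small open-weight model (not one mentioned above), and device_map="auto" additionally requires the accelerate package; treat it as a starting point, not a recommendation:

# Minimal local-inference sketch with an open-weight model; pick a model that fits your hardware
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # example small open model; runs on CPU, just slowly
    device_map="auto",                           # uses a local GPU if one is available (needs accelerate)
)

result = generator(
    "Explain in one sentence why memory, not compute, tends to limit local LLM inference:",
    max_new_tokens=60,
)
print(result[0]["generated_text"])

Fine-tuning works along similar lines: because the weights are downloadable, parameter-efficient methods (for example, via Hugging Face's PEFT library) can adapt them on consumer hardware, though anything approaching a foundation-scale training run is indeed out of reach for a personal machine.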

 
Megawatt: one megawatt equals one million watts, or 1,000 kilowatts, roughly enough electricity to meet the instantaneous demand of 750 homes at once.
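
For the curious, the quick arithmetic behind that figure, tied back to the 4-megawatt training run discussed above:

# Quick arithmetic on the megawatt figure above, tied back to the 4 MW training-run example
watts_per_mw = 1_000_000
homes_per_mw = 750

print(f"Implied instantaneous draw per home: {watts_per_mw / homes_per_mw:,.0f} W")  # ~1,333 W
print(f"Homes equivalent to a 4 MW training run: {4 * homes_per_mw:,}")              # 3,000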
 
Can't believe that my hopes are that Intel can pull something out of their... thin air (for GPUs) that's cost-effective and performs well enough (if nothing else).
If they can, they will give a 10% discount and sell them for AI purposes too.
 
Believe it or not, nvidia is NOT going to leave a $10 BILLION industry just because they invented a new $40 BILLION one. $10b is $10b.

EVGA did not leave because nvidia didn't make "gaming GPUs". That's straight up delusional. EVGA left the market because they couldn't make money, and the longer time goes on, the more comes out that EVGA was horribly mismanaged and wasted money on pet projects that cost them in the long run. Surprise, that doesn't work well. Somehow Asus, Gigabyte, and MSI have no issue making money on GPUs. Hmmmm....

People don't understand how businesses work..... Intel has made server CPUs for decades; are they gonna stop making consumer CPUs as a result? No, that's just dumb. I swear the internet is rotting people's brains.

That marketing for their "enterprise only" architecture works well, considering these "non-gaming" GPUs beat the "game-specific" RDNA in efficiency and outright performance. Hmmmm......

Also, again, is the internet rotting people's brains? Enterprise versions of cards with different features have existed for decades. Tesla, Fermi, Kepler, Pascal - all of them had enterprise versions with greater FP performance, ECC memory, etc. Nobody claimed nvidia was dropping out of the market then. Why would they now? Just because the market has exploded? That's the rise and fall of business; no sensible individual cuts themselves out of a market just because it's merely majorly profitable instead of insanely profitable.
EVGA didn't leave the market because they couldn't make money on GPUs; it was because NVIDIA made it so difficult and left very little margin. How is the internet rotting people's brains? I have had nothing but good experiences with EVGA. Just because someone has a different opinion than yours doesn't mean anything is wrong with them. You've provided no objective evidence to support your opinion, just like the other posters and myself. From many points of view, it seems NVIDIA doesn't really want to sell many GPUs. That's the point people are making. If NVIDIA is selling every AI chip that is being fabricated for a lot more profit than their gaming GPUs, which are made at the same fab, why would they waste fab time on low-profit GPUs? Isn't that how business works?
 
EVGA didn't leave the market because they couldn't make money on GPUs; it was because NVIDIA made it so difficult and left very little margin. How is the internet rotting people's brains? I have had nothing but good experiences with EVGA. Just because someone has a different opinion than yours doesn't mean anything is wrong with them. You've provided no objective evidence to support your opinion, just like the other posters and myself. From many points of view, it seems NVIDIA doesn't really want to sell many GPUs. That's the point people are making. If NVIDIA is selling every AI chip that is being fabricated for a lot more profit than their gaming GPUs, which are made at the same fab, why would they waste fab time on low-profit GPUs? Isn't that how business works?
If Nvidia made it difficult and left little margin, then… yes, they left because they couldn't make money on GPUs… that's kind of the same thing…

As a previous poster stated, MSI and ASUS aren’t having any problems selling Nvidia GPUs - and make a fortune at it.

EVGA cited "disrespect" as one of their reasons for leaving Nvidia… when your CEO has a spat with your partner's CEO, it shouldn't affect business - but in reality, I guess it does…

It’s too bad, as EVGA made great cards..
 