Nvidia Blackwell rumors: RTX 5090 to offer 1.7x performance, multi-chiplet design for compute GPUs

midian182

Rumor mill: With the RTX 4000 series rollout now pretty much complete, leaks and rumors are emerging about Nvidia's next generation of consumer graphics cards, the RTX 5000 line. The latest of these offers some performance indicators, including a claimed 1.7x overall uplift for the RTX 5090 compared to its predecessor. Another leak points to Nvidia finally using a multi-chiplet design in the company's high-performance compute GPUs.

Starting with the consumer series, a leaker on the Chiphell forum, Panzerlied, has posted what are claimed to be stats for the RTX 5090: a 50% increase in scale (which presumably refers to cores), a 52% increase in memory bandwidth, a 78% increase in L2 cache, a 15% increase in frequency, and a 1.7x overall performance uplift.

Applying those figures to the RTX 4090 (16,384 CUDA cores, a 2.52 GHz boost clock, and 72MB of L2 cache) would suggest that the successor will pack around 24,000 CUDA cores, a 2.9 GHz boost clock, and 128MB of L2 cache.
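As a rough back-of-the-envelope illustration, here is a minimal sketch of that arithmetic, using the RTX 4090's published specifications as the baseline and the rumored (unconfirmed) uplift percentages:

```python
# Sketch of the rumored RTX 5090 arithmetic; the uplift factors come from the
# Chiphell post and are unconfirmed. Baseline figures are RTX 4090 specs.
rtx_4090 = {"cuda_cores": 16_384, "boost_clock_ghz": 2.52, "l2_cache_mb": 72}
rumored_uplift = {"cuda_cores": 1.50, "boost_clock_ghz": 1.15, "l2_cache_mb": 1.78}

rtx_5090_estimate = {
    spec: round(value * rumored_uplift[spec], 2) for spec, value in rtx_4090.items()
}
print(rtx_5090_estimate)
# {'cuda_cores': 24576.0, 'boost_clock_ghz': 2.9, 'l2_cache_mb': 128.16}
# i.e. roughly 24,500 cores, ~2.9 GHz, and ~128MB of L2, in line with the estimates above.
```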

It's also suggested that the RTX 5090's memory will use GDDR7 and be boosted to 32 Gbps. The AD102 GPU's successor is rumored to include a 512-bit memory bus, though the full width might not be used on the RTX 5090. As VideoCardz notes, the card could instead come with configurations such as 512-bit/24 Gbps or 448-bit/28 Gbps.
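For context, here is a quick sketch of the peak bandwidth each rumored configuration would deliver, alongside the RTX 4090's 384-bit/21 Gbps setup (all RTX 5090 figures are speculative):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(384, 21))  # 1008.0 -- RTX 4090 (GDDR6X)
print(bandwidth_gb_s(512, 32))  # 2048.0 -- rumored full 512-bit bus at 32 Gbps GDDR7
print(bandwidth_gb_s(512, 24))  # 1536.0 -- ~52% more than the 4090
print(bandwidth_gb_s(448, 28))  # 1568.0 -- ~56% more than the 4090
```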

While everyone has their own theory about the next generation of Nvidia cards, it's worth mentioning that Panzerlied has made correct claims in the past. Moreover, these most recent rumors were "confirmed" by prolific hardware leaker Kopite7kimi.

If the figures are true or close to the truth, one has to wonder what sort of price tag Nvidia will slap on the RTX 5090. Team Green was heavily criticized over its Lovelace pricing, but it's hard to imagine this RTX 5090 being cheaper than, or even the same price as, the $1,600 RTX 4090.

Previous RTX 5000-series rumors also pointed to significant performance increases compared to Lovelace. There's no word yet on a release date, though many expect the cards to land next year.

In a related story, Kopite7kimi also made some claims about Nvidia's next-gen products. He says that the Blackwell architecture will be used across both consumer and datacenter GPUs, as opposed to the current Ada Lovelace/Hopper split. Moreover, Nvidia will apparently be following Intel and AMD in using a multi-chiplet design for the first time in its datacenter class of GPU.

"After the dramas of GA100 and GH100, it seems that GB100 is finally going to use MCM," kopite7kimi wrote. "Maybe GB100=2*GB102."

Even if Nvidia does go down the multi-chiplet design route for its compute GPUs, the company is still expected to stick with a monolithic design for consumer products.


 
The 4090 is significantly faster than the 3090. The problem is that this didn't translate to the more mainstream cards, such as the 4070 through 4060, which are only slightly faster and in some cases the same as, or even slower than, the previous gen.
 
I'm planning my next upgrade for 2025, so I will probably time it for when this hits. I hope to god it's a good generation; the 40xx generation was such a damp squib, as the article says, mainly because of the pricing and the performance of the mid-to-high-end cards.

Good thing is I've already started saving!
 
The last card they released, iirc, is the 4060 at 300 bucks. It was also delayed a full what, 10 months I think? We know the architecture is capable of being more affordable, because laptops with a 4050 already exist, but it seems like Nvidia has just decided to cut out its entire entry-level segment altogether and is barely interested in selling any product below the 500 USD mark.

And here we are with Nvidia already lining up the 5090 (because let's get real: I don't believe any leaks are anything but 100% intentionally planted by Nvidia themselves; it's just marketing with a layer of plausible deniability on top), with another price increase no doubt, so they might stop at like 600-700 for the 5070, which will be severely lacking just like the current 4070, and then they might just skip the rest.

This is going to effectively kill AAA PC gaming: there's no way this is sustainable, but as I said in another post, Nvidia doesn't need gamers anymore, especially not now. They can get away with just demanding more and more money while making the x090 line the only product that is actually a significant generational upgrade in performance.

Everybody should just stop upgrading and refuse to play any new game that doesn't run on something like a 2060 to a 3060 tops: limit yourself to indie games and older titles, because you'll pretty much have to at this rate anyway.
 
Our brand new nVidia GeForce RTX 5090 Ti Super Mega Hyper Founders Edition is built with the enthusiast customer in mind.
Enthusiasts, make it yours for only "$ 3000 ".


Or $2000 for that matter. Or even $1500. It's just as ducking ridiculous, no less.


Then I read about nVidia becoming the most valuable company...with huge profits. Well, if they manage to overprice their cards, like they did with the 4090, and still have them flying off the shelves...guess who's the loser getting them there?

First, the dumb consumer paying ridiculous prices, then the companies in the server space. The latter is made possible by the same consumer who accepts these prices and allows nVidia to tell companies that their products are perceived as being worth the money they ask for.
It's us again and again and again.
 
I'm sure it will be at 1.7x the cost and will be a dud for sales, just like most of the 40xx generation, especially since the world is drunk on AI and I am sure that Nvidia is happily gouging the AI accelerator market.
 
No doubt Nvidia will stick with their 12-pin connectors. There's a saying, if it ain't broke, don't fix it, and I'd rather not gamble on the off chance my house burns down. I'm still rocking a 3090 because of it, much to my annoyance with Phantom Liberty right around the corner. I guess I'll have to deal with Pleb Tier ray-tracing for the foreseeable future.
 
Our brand new nVidia GeForce RTX 5090 Ti Super Mega Hyper Founders Edition is built with the enthusiast customer in mind.
Enthusiasts, make it yours for only "$ 3000 ".


Or $2000 for that matter. Or even $1500. It's just as ducking ridiculous, no less.


Then I read about nVidia becoming the most valuable company...with huge profits. Well, if they manage to overprice their cards, like they did with the 4090, and still have them flying off the shelves...guess who's the loser getting them there?

First, the dumb consumer paying ridiculous prices, then the companies in the server space. The latter is made possible by the same consumer who accepts these prices and allows nVidia to tell companies that their products are perceived as being worth the money they ask for.
It's us again and again and again.

As long as people keep buying them, NVidia will find their price gouging justified. With each passing successor having exclusive tech, you bet NVIDIA is riding the planned obsolescence train. Remember when $600 could get you a flagship GPU? Pepperidge Farm remembers!
 
Asus is prepping the market with the 4090 Matrix Platinum at $3,200 😑.
I think that Asus effort will fail for obvious reasons, especially if that 4090 requires their "solution" for the hvpwr connector, which in turn requires Asus' "special" motherboard, which will likely sport "special" pricing, not to mention incompatibility with designs from other manufacturers.

As I understand it, other sectors of the "Tech Industry" are seeing price fatigue. IMO, it is inevitable that the tech industry, in general, will catch up and realize that significant numbers of people are worn out by, and fed up with, price gouging.

All the specialized crap from these PC component manufacturers reminds me of the content providers that thought they could make a killing by starting their own streaming services - services which are now having a very difficult time turning a profit.
 
The 4090 is significantly faster than the 3090. The problem is that this didn't translate to the more mainstream cards, such as the 4070 through 4060, which are only slightly faster and in some cases the same as, or even slower than, the previous gen.

The smaller memory bus hurts their performance in anything that is memory intensive. Combined with not bumping the amount of RAM versus the 3000 series, there are titles that see no benefit (or even a small regression) as a result.
 
As long as people keep buying them, NVidia will find their price gouging justified. With each passing successor having exclusive tech, you bet NVIDIA is riding the planned obsolescence train. Remember when $600 could get you a flagship GPU? Pepperidge Farm remembers!
I would argue that it's not only gamers holding off buying the flagship cards; it's the competition that is needed and lacking. For example, you reference the $600 mark; I recall when Nvidia's GPUs were even cheaper, because AMD was competitive back then. A current example: the 7800 XT caused Nvidia to lower the price of the 4070 to $549. Will AMD step up to the plate? Also, I don't believe the Blackwell flagship will be north of $3,000 unless it's a Titan-class tier. Nvidia has had expensive cards throughout the decades as well, like the 8800 Ultra for $829 in May 2007; it's the competition that keeps them in check, not gamers refusing to buy the flagship cards, imo. I wonder why people are hating on gamers instead of AMD, which is rumored to have canceled its high-end offerings for its next-gen lineup 🤔? Yay, blame the consumer instead of doing a root cause analysis!
 
The smaller memory bus hurts their performance in anything that is memory intensive. Combined with not bumping the amount of RAM versus the 3000 series, there are titles that see no benefit (or even a small regression) as a result.
Had Nvidia stuck to the Ampere method of allocating GPU variants to particular SKUs, none of this would be the case -- if the 4090 and 4080 both used the AD102, with the AD103 for the 4070 models, AD104 for 4060, and so on, we'd have the same bus widths as before but more VRAM.

Unfortunately, due to the combined cost of the N4 manufacturing (compared to Samsung's 8N) and the lack of market competition, Nvidia realised it could hoof almost everything down one tier, then sit back and watch the money roll in.
 
The last card they released, iirc, is the 4060 at 300 bucks. It was also delayed a full what, 10 months I think? We know the architecture is capable of being more affordable, because laptops with a 4050 already exist, but it seems like Nvidia has just decided to cut out its entire entry-level segment altogether and is barely interested in selling any product below the 500 USD mark.

And here we are with Nvidia already lining up the 5090 (because let's get real: I don't believe any leaks are anything but 100% intentionally planted by Nvidia themselves; it's just marketing with a layer of plausible deniability on top), with another price increase no doubt, so they might stop at like 600-700 for the 5070, which will be severely lacking just like the current 4070, and then they might just skip the rest.

This is going to effectively kill AAA PC gaming: there's no way this is sustainable, but as I said in another post, Nvidia doesn't need gamers anymore, especially not now. They can get away with just demanding more and more money while making the x090 line the only product that is actually a significant generational upgrade in performance.

Everybody should just stop upgrading and refuse to play any new game that doesn't run on something like a 2060 to a 3060 tops: limit yourself to indie games and older titles, because you'll pretty much have to at this rate anyway.
They seem to be conceding the entry-to-mid-level gamer to AMD. AMD has now released three GPUs under $500, and as much as everyone seems to hate on AMD (whether it's GPUs or CPUs), they are offering the best prices and performance in the $500-and-under GPU space. AMD now looks set to concede "high-end" gaming ($750 and up) to Nvidia, as they don't seem to have any plans for an 8900 XT or even an 8800 series GPU. It sounds like they will start with an 8700 and work down from there. What are game developers going to target? Personally, I've given up on PC gaming and am sticking to the lower price point (and optimizations) of the Xbox Series S (I no longer care for FPS), so I stick to puzzle games and RTS (I still play Civ on my PCs, as the GPU is not as important as the CPU for single-player gaming). Good luck to the PC gamers (although I still like to keep up with the latest developments).
 
I would argue that it's not only gamers holding off buying the flagship cards; it's the competition that is needed and lacking. For example, you reference the $600 mark; I recall when Nvidia's GPUs were even cheaper, because AMD was competitive back then. A current example: the 7800 XT caused Nvidia to lower the price of the 4070 to $549. Will AMD step up to the plate? Also, I don't believe the Blackwell flagship will be north of $3,000 unless it's a Titan-class tier. Nvidia has had expensive cards throughout the decades as well, like the 8800 Ultra for $829 in May 2007; it's the competition that keeps them in check, not gamers refusing to buy the flagship cards, imo. I wonder why people are hating on gamers instead of AMD, which is rumored to have canceled its high-end offerings for its next-gen lineup 🤔? Yay, blame the consumer instead of doing a root cause analysis!
I'd wager that's more of a symptom than a root problem. Like every AAA franchise or studio these days, NVIDIA is more focused on its shareholders than it is on the gamer with Cheeto stains on his fingertips. These companies aren't pandering to gamers; they've been an afterthought since the crypto boom, truth be told. NV having proprietary tech that's vastly superior to their competition isn't the problem. It's their arrogance in thinking they've got a monopoly. A too-big-to-fail mentality is dangerous to have.

I miss ATI... and I miss the days of having a market that wasn't oversaturated with pointless GPUs. The GTX 200 series era was the golden age of GPUs, imo.
 
I'd wager that's more of a symptom than a root problem. Like every AAA franchise or studio these days, NVIDIA is more focused on its shareholders than it is on the gamer with Cheeto stains on his fingertips. These companies aren't pandering to gamers; they've been an afterthought since the crypto boom, truth be told. NV having proprietary tech that's vastly superior to their competition isn't the problem. It's their arrogance in thinking they've got a monopoly. A too-big-to-fail mentality is dangerous to have.

I miss ATI... and I miss the days of having a market that wasn't oversaturated with pointless GPUs. The GTX 200 series era was the golden age of GPUs, imo.
Yep, gamers also have minimal effect when it comes to disruptive forces in the market, like cryptocurrency mining, and now AI compute. When we have competing products, the ball goes back to our court. Another power gamers hold is the second-hand market. For example, I sold my 3090 XC3 Hybrid for $900 and bought the 4090 Suprim Liquid for $1,749 at launch; that's about $850 out of pocket, an upkeep cost for the latest flagship of roughly $425 annually (rough math sketched below). Has this annual cost gone up since I sold my 2080 Ti HSC Hybrid for the same price in the summer of 2020 ($850) and bought the 3090 HSC Hybrid at $1,619? The answer is yes, slightly: from a rate of $384.50 to $425. Will this rate go up with Blackwell? Probably, but it's actually more in line with inflation if you take into consideration the second-hand market offsetting the out-of-pocket cost. If my prediction is right that in Q1 2025 we will be paying roughly the same performance per dollar at the top end as we did for the 4090 (October 2022 launch), with the natural efficiency gains and feature sets that AI will bring to Blackwell, then the 4090 should hold its value better than previous generations, for two reasons: 1) the lack of current competition in this tier, and 2) the cancellation of the 4090 Ti, which should prevent the 4090 from dropping the way the 3090 did when the 3090 Ti fell by half.
The unknown:
If Nvidia decides to bring out a 4080 Ti or a 4080 Super duper, it might disrupt the lower tiers, but at the cost of competing with itself and lowering profit margins just to improve revenue. The pricing structure is so fragile that Nvidia would try to prevent this at all costs, especially when AMD's 7000 series Radeons have locked-in tiers. In conclusion, lol, enjoy your life, because I can think of many other hobbies that cost tenfold or more than an annual upkeep cost of $400 to $500.

😎
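A quick sketch of the upkeep arithmetic described in the post above, using the (purely anecdotal) prices stated there:

```python
# Annual "upkeep" cost of staying on the flagship: purchase price minus resale of
# the old card, spread over the roughly two years between flagship launches.
def annual_upkeep(purchase_price: float, resale_price: float, years: float = 2.0) -> float:
    return (purchase_price - resale_price) / years

print(annual_upkeep(1619, 850))  # 384.5 -- 2080 Ti -> 3090 (2020)
print(annual_upkeep(1749, 900))  # 424.5 -- 3090 -> 4090 (2022), i.e. roughly $425/year
```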
 
The 4090 is 35% faster than the 7900 XTX in raster at 4K and 75%+ faster than the 7900 XTX at max settings (RT + raster) at 4K.

If the 5090 is 70% faster than the 4090 in raster, that would put it 105%+ faster than the 7900 XTX, and 145%+ faster at max settings (RT + raster) at 4K.

I think I understand why AMD canceled N41 and N42 and decided to do sub-$400 GPUs.

It also suggests AMD will gradually leave the PC GPU market. They silently left the laptop GPU market. Consoles seem to be their main focus.

Very good decision. Go where the profit is.
 
As long as people keep buying them, NVidia will find their price gouging justified. With each passing successor having exclusive tech, you bet NVIDIA is riding the planned obsolescence train. Remember when $600 could get you a flagship GPU? Pepperidge Farm remembers!
So what's stopping AMD from selling you $600 flagship GPUs?
 
On the flip side, people's sense of entitlement is funny to me. As if AAA gaming 'should' be accessible to the masses. It doesn't have to be and nobody is owed access. As with most things in life, purchase power is everything and you gotta pay to play. I can't afford to, but such is life and there is much more to do for enjoyment than look at a screen in whatever capacity. #gooutside
 