Nvidia RTX 5000 graphics cards rumored to launch next year with massive performance increase

Lovelace could have been a great generation. The performance is there, provided they don't skimp on silicon. And the power efficiency can be great too, even for the top cards, as limiting the TDP of the 4090 shows.
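(If you want to try that power-limiting experiment yourself, here's a minimal sketch; it assumes nvidia-smi is on the PATH and that you have admin rights, and the 300 W cap is just an example figure, not a recommendation:)

import subprocess

# Show the current board power draw and configured limit.
print(subprocess.run(
    ["nvidia-smi", "--query-gpu=name,power.draw,power.limit", "--format=csv"],
    capture_output=True, text=True).stdout)

# Cap the board power to 300 W (example value; needs admin rights and
# does not persist across reboots).
subprocess.run(["nvidia-smi", "-pl", "300"], check=True)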
 
Imagine the performance of a card built on 3nm where the silicon was mostly allocated to what gamers can actually notice and therefore actually turn on during real gameplay, meaning straight raster performance.

AMD, are you going to take this opportunity?

Of course, even then they'd have to handle the influencer/PR problem first to avoid reviews like "Although the AMD card has twice the memory and raster performance, we think you should buy the Nvidia card because it has more resources dedicated to ray tracing and artificial frame generation."
 
This time next year the 5080 will be $1500 and the 4080 and 5070 will be around $1100. In five years there won't be such a thing as a GPU under a grand.
 
I think people are just blind to what Nvidia is planning with the 4000 series. The performance leap isn't high, but the power consumption is roughly half; if you match them watt for watt, the performance of a 4000-series card can be double or more. The lack of VRAM is genuinely concerning though, even if the L2 cache is bigger.
I think you might be looking at it from the wrong angle yourself. What NVIDIA is mostly doing this generation is being cheap; the low power usage is just a side effect (well, that and TSMC's 4N process simply being a LOT more power efficient than Samsung's 8nm).

Die sizes:
RTX 4090 608mm², RTX 3090 628mm² - (3.18% smaller)
RTX 4080 379mm², RTX 3080 628mm² - (39.65% smaller)
RTX 4060 Ti 190mm², RTX 3060 Ti 392.5mm² - (51.59% smaller)
RTX 4060 190mm², RTX 3060 276mm² - (31.16% smaller)
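(Those percentages are just one minus the area ratio; a quick check in Python with the die areas listed above:)

# Die-size shrink, new vs old generation, areas in mm² as listed above.
pairs = [
    ("RTX 4090 vs RTX 3090", 608.0, 628.0),
    ("RTX 4080 vs RTX 3080", 379.0, 628.0),
    ("RTX 4060 Ti vs RTX 3060 Ti", 190.0, 392.5),
    ("RTX 4060 vs RTX 3060", 190.0, 276.0),
]
for name, new, old in pairs:
    print(f"{name}: {(1 - new / old) * 100:.2f}% smaller")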

Also note the L2 cache is on-die. Although it reduces the impact of the smaller VRAM, it basically 'wastes' space that could have been used for more straight-up performance. So it's no wonder the RTX 4060 Ti compares so incredibly unfavourably to the RTX 3060 Ti in everything except power usage.
The smaller dies use less power (yeah I know, not necessarily true; for laptops it's often bigger chips at lower clocks to save power). From what I understood, I think from one of the Moore's Law Is Dead videos, the memory controllers sit on the very outer edges of the die, and because the die is so small there's less edge to put them on, which is what leads to the narrow, low-bandwidth bus.
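(To put numbers on that: peak bandwidth is just bus width times effective data rate divided by eight, and the bus widths and memory speeds below are the published specs for the two cards:)

# Peak memory bandwidth = bus width (bits) * effective data rate (Gbps) / 8 bits per byte.
cards = [
    ("RTX 3060 Ti, 256-bit, 14 Gbps GDDR6", 256, 14.0),
    ("RTX 4060 Ti, 128-bit, 18 Gbps GDDR6", 128, 18.0),
]
for name, bus_bits, gbps in cards:
    print(f"{name}: {bus_bits * gbps / 8:.0f} GB/s")

So despite the faster GDDR6, the 4060 Ti ends up at roughly 288 GB/s against the 3060 Ti's 448 GB/s, purely because the bus was halved (the bigger L2 cache is there to paper over that gap).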

Why are all these dies so small (4090 excepted) compared to their predecessors? Because it saves costs. A smaller die = more chips per wafer = cheaper chips. And it's not linear either: because they're cutting rectangles out of a circle, a lot less is wasted around the edge with smaller dies. So NVIDIA is selling tiny dies at higher-than-ever prices. Because they're tiny and not much of an upgrade, the power usage is much lower; doubly so because the memory bandwidth is also drastically lower, which costs a lot less power, and triply so because TSMC's chips use a lot less power than Samsung's (the RTX 3000 series was on Samsung).
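(To see how non-linear it is, here's a rough sketch using the standard dies-per-wafer approximation for a 300 mm wafer; the die areas are the ones listed above, and the formula ignores scribe lines, edge exclusion and yield, so the outputs are ballpark only:)

import math

WAFER_DIAMETER_MM = 300.0

def dies_per_wafer(die_area_mm2: float) -> int:
    # Classic approximation: wafer area / die area, minus an edge-loss term
    # proportional to the wafer circumference over the die diagonal.
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

for name, area in [("~628 mm² (GA102 class)", 628.0),
                   ("~379 mm² (AD103 class)", 379.0),
                   ("~190 mm² (4060 Ti class)", 190.0)]:
    print(f"{name}: ~{dies_per_wafer(area)} candidate dies per wafer")

With those assumptions you get roughly 3.8x as many candidate dies going from ~628 mm² down to ~190 mm², even though the area only shrank by about 3.3x, which is exactly the edge effect described above.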

So rather than as you put it
"I think, people are just blind on what Nvidia is planning on the 4000 series, though the performance leap isn't high but the power consumption is half"

I'd say:
I think NVIDIA is just planning on milking the market with the 4000 series. The performance isn't there because the die sizes are tiny, leading to lethargic memory bandwidth, but at least the power usage is low as a result.

tl;dr: NVIDIA is selling chips with dies so tiny that they're basically moving their entire product line down a class, except for the RTX 4090, which leaves DLSS 3.0 and low power usage as the only selling points.
(They could compensate for this by lowering prices, but instead prices are going up.)
 
The simple reality is that Nvidia no longer cares about or requires volume sales as a business model. They know they can charge literally whatever they want and someone out there will pay.
I do not think that is what gamers are telling them at all. Lower volume sales of the too expensive 20xx and 40xx series reflect that there is a limit to what the wider audience will pay.

Meanwhile Nvidia has benefitted enormously from demand outside of gaming, such as cryptocurrency and AI. If the market determines that for the time being the world's limited silicon manufacturing capacity should be dedicated to these other needs then so be it.

Capacity, supply and demand will eventually even out. When they do for gaming, most of the world's gamers will still be on affordable devices playing whatever games run on them best.

As to Nvidia, they adjusted prices down for the launch of the 30xx series, until it became clear that the pandemic and the crypto boom had brought new buyers. They will eventually adjust again, or someone else will do it for them.
 
I do not think that is what gamers are telling them at all. Lower volume sales of the too expensive 20xx and 40xx series reflect that there is a limit to what the wider audience will pay.

Meanwhile Nvidia has benefitted enormously from demand outside of gaming, such as cryptocurrency and AI. If the market determines that for the time being the world's limited silicon manufacturing capacity should be dedicated to these other needs then so be it.

Capacity, supply and demand will eventually even out. When they do for gaming, most of the world's gamers will still be on affordable devices playing whatever games run on them best.

As to Nvidia, they adjusted prices down for the launch of the 30xx series, until it became clear that the pandemic and the crypto boom had brought new buyers. They will eventually adjust again, or someone else will do it for them.
No. Just no. Lol

I stand by my analysis.
 
My use case is primarily Blender and DaVinci Resolve, so I understand that my needs aren't the same as for games. For Blender, the 40 series was a significant improvement over the 30 series, typically giving more performance and less heat for the same price (with the low end winning mostly on heat and the high end on performance), though the jump from 30 to 40 was admittedly not as big as it had been from 20 to 30. What kept me from getting a 40-series card is the lack of modern video decoding: I still can't decode footage from modern cameras directly on Nvidia GPUs. Should Nvidia fix that in the 50 series and deliver the promised performance jump for what I use, I see myself aiming between a 5070 and a 5080 when those become available.
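(For what it's worth, a quick way to check whether a given clip decodes on the GPU at all, sketched under the assumption that your ffmpeg build has CUDA/NVDEC support, with "clip.mov" standing in for whatever the camera produces; if NVDEC doesn't handle the codec or chroma subsampling, ffmpeg warns and falls back to CPU decoding, which shows up in the log:)

import subprocess

# Try decoding the clip with NVDEC via ffmpeg's CUDA hwaccel and dump the log.
CLIP = "clip.mov"  # hypothetical example file
result = subprocess.run(
    ["ffmpeg", "-hide_banner", "-hwaccel", "cuda", "-i", CLIP, "-f", "null", "-"],
    capture_output=True, text=True)
print(result.stderr)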
 
Given that nearly every Ada card is really named a tier ABOVE what it should actually be, it would be easier to achieve a 2x performance boost if nVidia went back to the norm for RTX 5000.
 
I will take a 600W card, but also at $1k max.
Your decision. But I don't want the sun in my case, I don't want a fire hazard, etc. Performance gains must come from good engineering and NOT from increasing power consumption. No way.
 
Can hear the collective groan in the audience. Strap yourselves in for another generation of gaslighting and price gouging. The more you buy the more you save guys, don't forget!
 
My use case is primarily Blender and DaVinci Resolve, so I understand that my needs aren't the same as for games. For Blender, the 40 series was a significant improvement over the 30 series, typically giving more performance and less heat for the same price (with the low end winning mostly on heat and the high end on performance), though the jump from 30 to 40 was admittedly not as big as it had been from 20 to 30. What kept me from getting a 40-series card is the lack of modern video decoding: I still can't decode footage from modern cameras directly on Nvidia GPUs. Should Nvidia fix that in the 50 series and deliver the promised performance jump for what I use, I see myself aiming between a 5070 and a 5080 when those become available.
So why not buy a PRO card...?

Or do you just like the way nVidia markets gaming cards to creators...? (Because it's a cheaper alternative than actually buying a Pro card.)

And do you think NVidia markets its gaming cards as creator cards so they can claim more people buy their cards for gaming... when it is not true? Because (TODAY) almost all the people I know who buy nVidia cards do so because of non-gaming CUDA requirements.
 
Jensen has that Billionaire lifestyle to afford. He can't be caught living with just 6 houses, 12 cars, and 2 planes; that is for Peasants.
Nvidia stock has basically retired my ***. Sorry you missed out, but I love what Jensen is doing.
 
I do not think that is what gamers are telling them at all. Lower volume sales of the too expensive 20xx and 40xx series reflect that there is a limit to what the wider audience will pay.

Meanwhile Nvidia has benefitted enormously from demand outside of gaming, such as cryptocurrency and AI. If the market determines that for the time being the world's limited silicon manufacturing capacity should be dedicated to these other needs then so be it.

Capacity, supply and demand will eventually even out. When they do for gaming, most of the world's gamers will still be on affordable devices playing whatever games run on them best.

As to Nvidia, they adjusted prices down for the launch of the 30xx series, until it became clear that the pandemic and the crypto boom had brought new buyers. They will eventually adjust again, or someone else will do it for them.
I think that "someone else is doing it for them" is happening with the drop in sales for gaming cards they are experiencing right now. The problem is the boom in sales they are experiencing because of AI, and in the past cryptocurrency. Personally, I think the AI "boom" is more of a fad like crypto was. When people figure out that the present state of AI, for the general consumer, is to deliver crap and fake results when it cannot deliver anything good, there are two things that I see happening; AI either drastically improves, for the general consumer, or the fad dies just like the crypto fad has. I'm betting that due to geed and competition between "AI providers", that it will be the latter. As I see it, there is no reason that current general consumer AI should be delivering fake/misleading/bad/crap results except that everyone providing AI is treating it like they have to beat everyone else to the market no matter the cost.
 