Nvidia GeForce RTX 4090 is 78% faster than RTX 3090 Ti, hits 3.0 GHz in leaked benchmarks

Gimp65

Posts: 75   +157
Key word: initially. Unfortunately, it's the new norm for all hardware and goods now. The good part is that there will be a significant sell-off of high-end cards like 3090 Tis and 3090s, so it's not all gloom and doom. Questions?
I already asked the question, and it still stands unanswered. Why should scalping be a problem this time?

"The good part" is not only that there is a sell-off; that's just one part of it. The other parts are that there are no real supply problems anymore and demand is much lower this time around.
 

godrilla

Posts: 494   +248
Yeah, why again...? Miners are not interested in gaming cards anymore when there are mining cards out from nVidia...
You are assuming that demand will be atypically low, enough to avoid an instant sell-out at launch. High-end hardware sold out instantly even before crypto mining was a thing, and scalpers were scalping PS5s and other consoles that can't mine crypto, so there is that. Measured against all the hardware launches before crypto mining, your counter-argument assumes an atypical one. That is why.
 
Last edited:

McMurdeR

Posts: 574   +739
I heard people say it barely handles some games in 4K. If the 3080 Ti is only just enough for all 4K games, then we aren't there yet.

The reason I'm interested is that I'm wondering whether next year / next gen will be the right time to pull the trigger on a good 4K gaming display.
 

Avro Arrow

Posts: 2,612   +3,200
TechSpot Elite
This isn't really as impressive an improvement as it would initially appear. I'll explain why by using snippets from the article. First let's use the headline:

"Nvidia GeForce RTX 4090 is 78% faster than RTX 3090 Ti, hits 3.0 GHz in leaked benchmarks"
That 78% is pretty impressive-sounding, but there's something that I picked up on here that makes it actually disappointing. Here's why:

"The leak also shows the card reaching a 3,015 MHz frequency. For comparison, the RTX 3090 Founders Edition boosts to 1,695 MHz."
- Ok, so...
3015 ÷ 1695 = 1.77876.... (aka 78%)

Well, this RTX 4090 just looks like a wickedly overclocked 3090. Of course, to maintain a stable overclock, TDP must have an equally dramatic increase. Did that also happen here?
"Being a triple-slot card, the NVIDIA GeForce RTX 3090 Ti draws power from 1x 16-pin power connector, with power draw rated at 450 W maximum."
- RTX 3090 Ti - TechPowerUp GPU Database

Ok, so what's 78% more than 450W?
450W x 1.78 = 801W
Now, what was said about the TDP of the RTX 4090? Oh yes, that's right:
"According to the leaker, it has a default TDP of 450W, though it is designed for 600W to 800W."
I see a maximum of 800W: almost exactly 78% more than the previous generation, and most likely the TDP this leaked card was running at to get these performance numbers. The leaker said it has a default TDP of 450W, but nothing was said about what the card was actually drawing when the posted numbers were produced. A performance increase of 78% paired with a maximum TDP increase of 78% is just too perfect to ignore.
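The back-of-the-envelope math above can be checked in a few lines. This is just a sketch: the clock and TDP figures are the leaked and quoted numbers from the article, not confirmed specs.

```python
# Leaked/quoted figures (not confirmed specifications)
rtx_3090_boost_mhz = 1695   # RTX 3090 Founders Edition boost clock
rtx_4090_boost_mhz = 3015   # leaked RTX 4090 frequency
rtx_3090ti_tdp_w   = 450    # RTX 3090 Ti rated maximum power draw
rtx_4090_max_tdp_w = 800    # top of the leaked 600-800 W design range

clock_ratio = rtx_4090_boost_mhz / rtx_3090_boost_mhz
tdp_ratio   = rtx_4090_max_tdp_w / rtx_3090ti_tdp_w

print(f"Clock increase:   {clock_ratio:.4f}x (~{(clock_ratio - 1) * 100:.0f}%)")
print(f"Max TDP increase: {tdp_ratio:.4f}x (~{(tdp_ratio - 1) * 100:.0f}%)")
```

Both ratios round to roughly 78%, which is the whole point of the argument: if clocks and maximum power budget scale by the same factor, performance-per-watt has not obviously improved.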

Then, of course, to do this you'd have to increase your cooling just as dramatically, and nVidia is claiming that the RTX 4090 will not fit in a mid-tower case. This sounds like a card that will just barely fit (lengthwise) into my gigantic Ultra U12-40670 SuperTower. Also, who knows how many slots it will take up? Maybe four, five or even six? With multi-GPU setups all but dead, there is a lot of leftover room for a card that big. Wouldn't it be funny if it had 78% more cooling surface area than the 3090?

It just sounds like nVidia took Ampere, overclocked the hell out of it and increased the TDP and cooling to compensate. There doesn't appear to be much of a technological improvement with regard to efficiency. The performance-per-watt appears to be exactly the same. That's not incredibly impressive to me.

DISCLAIMER:

Firstly, this analysis is based on the assumption that the leak is accurate which, as we know, is by no means guaranteed. Secondly, I realise that I've bounced back and forth between the RTX 3090 Founders Edition and the RTX 3090 Ti, which are not exactly the same card. However, their performance delta is insignificant, so they are more or less the same card.

According to TechPowerUp, there is only a paltry 7% difference between the RTX 3090 and RTX 3090 Ti. I believe that Steve Walton considers a 3% difference to be statistically a margin-of-error tie so there's just a 4% real difference between them.
 

McMurdeR

Posts: 574   +739
Even a 4090 will have issues handling 4K at high FPS sooner than most expect. We have heard that "this high end" will finally be able to run 4K well, over and over, gen after gen; the fact is there is a long way to go even with a 4090. It just doesn't scale linearly like that: 4K is a gigantic step up from 2560x1440, and there is no equally gigantic step up in raw GPU performance gen over gen.

This is a really good point. However, it seems like the performance demands of games aren't scaling like they used to. There was a time when you had to upgrade your GPU almost annually just to keep up at the same resolution.
 

godrilla

Posts: 494   +248
What's wrong with HDMI 2.1?
HDMI is great, but it caps out at 4K 120Hz with 4:4:4 10-bit color. If you want 4:4:4 10-bit color above 4K 120Hz, you need DP 2.0. All the current high-end 4K monitors running 144Hz to 240Hz over DP 1.4 compromise on picture quality with 4:2:0 chroma compression.
Lastly, the HDMI 2.1 standard is going on almost three years old now.
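As a rough sanity check on that claim, here is the raw pixel data rate for 4:4:4 10-bit 4K at various refresh rates. This sketch deliberately ignores blanking intervals and link-encoding overhead, so real-world requirements are somewhat higher than these numbers; the 48 and 80 Gbit/s caps are the nominal maximum link rates of HDMI 2.1 and DP 2.0 respectively.

```python
def data_rate_gbps(width, height, refresh_hz, bits_per_channel=10, channels=3):
    """Raw uncompressed pixel data rate in Gbit/s (no blanking or encoding overhead)."""
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

HDMI_2_1_GBPS = 48.0  # HDMI 2.1 nominal maximum link rate
DP_2_0_GBPS = 80.0    # DisplayPort 2.0 (UHBR20) nominal maximum link rate

for hz in (120, 144, 240):
    rate = data_rate_gbps(3840, 2160, hz)
    print(f"4K {hz}Hz 10-bit 4:4:4: {rate:.1f} Gbit/s "
          f"(HDMI 2.1 cap {HDMI_2_1_GBPS}, DP 2.0 cap {DP_2_0_GBPS})")
```

Even before overhead, 4K 240Hz at 10-bit 4:4:4 needs roughly 60 Gbit/s of raw pixel data, clearly beyond HDMI 2.1's link rate but within DP 2.0's, which is the gist of the point above.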
 
1500 W PSU
New Full size case
Major water-cooling for the CPU and the GPU
New motherboards (to handle new Intel &AMD CPUs)
New DDR5 RAM,
etc

All this is for what exactly??

Aside from professionals - and beside bragging rights from the spoiled rich kids - what can these new expensive toys do that we can't do with current generation of CPUs and GPUs??

They can render crappy ps4 engine games with physics from 2000
 
Last edited:

loki1944

Posts: 601   +441
I already asked the question, and it still stands unanswered. Why should scalping be a problem this time?

"The good part" is not only that there is a sell-off; that's just one part of it. The other parts are that there are no real supply problems anymore and demand is much lower this time around.
Maybe because the market/scalpers have had a taste of how many pounds of flesh they can get away with, thanks to recent precedents.
 

m3tavision

Posts: 850   +618
You are assuming that demand will be atypically low, enough to avoid an instant sell-out at launch. High-end hardware sold out instantly even before crypto mining was a thing, and scalpers were scalping PS5s and other consoles that can't mine crypto, so there is that. Measured against all the hardware launches before crypto mining, your counter-argument assumes an atypical one. That is why.

The NEED is in the mid to low tiers, not the upper end. There are plenty of 3090s under $1k that nobody wants for 4K gaming... that atypical demand for 1440p and 1080p will fill the used market.
 

MarcusNumb

Posts: 64   +64
So in order to enjoy this new GPU, I'm gonna need:
- A new PSU over 1,000W
- A new case
- A new custom liquid cooling system
- Air conditioning for my room
- An extra 50-70 euros or more on my electric bill every month

Hmm, no thanks. This type of monster is clearly for professional users, not gamers anymore, imo.
 

Athlonite

Posts: 365   +136
If it can't use the same power as the previous series, or less, and still deliver better performance, then someone's not doing their job right.
 

McMurdeR

Posts: 574   +739
HDMI is great, but it caps out at 4K 120Hz with 4:4:4 10-bit color. If you want 4:4:4 10-bit color above 4K 120Hz, you need DP 2.0. All the current high-end 4K monitors running 144Hz to 240Hz over DP 1.4 compromise on picture quality with 4:2:0 chroma compression.
Lastly, the HDMI 2.1 standard is going on almost three years old now.

Good point. However, I doubt much content will break 120Hz at 4K quality settings, even on next-gen hardware.
 

Rocky4040

Posts: 105   +136
I thought this site dealt with facts not rumor mill stuff?

So when all of these rumors turn out to be wrong, and all of these cards from Nvidia and AMD only do half of what the rumors proclaim, will everyone then go on to say such-and-such product sucks? "Oh, they promised us so much more; we did not get what we were promised." Well, news flash: the companies themselves never promised these magic numbers. Some grunts sitting in their parents' basements or attics, with little more to do each day than scour the internet for anything they can make wild guesses about, proclaim they know it all so they look important to those who eat these wild rumors up like cornflakes and have something to gossip about at the water dispenser over coffee.
 

Rocky4040

Posts: 105   +136
HDMI is great, but it caps out at 4K 120Hz with 4:4:4 10-bit color. If you want 4:4:4 10-bit color above 4K 120Hz, you need DP 2.0. All the current high-end 4K monitors running 144Hz to 240Hz over DP 1.4 compromise on picture quality with 4:2:0 chroma compression.
Lastly, the HDMI 2.1 standard is going on almost three years old now.
99% of the world that buys computer stuff would not notice either way, and that's who these companies cater to, not the ones who know or can tell the difference. Only a few companies cater both ways, and when they do, they jack up the prices.

Edit:
Yes, we do need an updated HDMI standard yet again, but by the time that new standard showed up in products it would be outdated again, or surpassed by a new DP standard, and the whole thing would start over. It's a never-ending cycle, it seems.
 

m3tavision

Posts: 850   +618
99% of the world that buys computer stuff would not notice either way, and that's who these companies cater to, not the ones who know or can tell the difference. Only a few companies cater both ways, and when they do, they jack up the prices.

Edit:
Yes, we do need an updated HDMI standard yet again, but by the time that new standard showed up in products it would be outdated again, or surpassed by a new DP standard, and the whole thing would start over. It's a never-ending cycle, it seems.

Not sure what you are on about... nothing really pushes 4K at 120 frames yet. And if you have a $1k+ GPU, then you can afford the HDMI 2.1 4K 120Hz monitors that are already out.

But nobody is buying a 4K monitor that does 144Hz-plus UNTIL a GPU that can push 4K at 144 frames exists!
 

Vanderlinde

Posts: 156   +104
The 4090 scores so well because they significantly increased the L1/L2 cache sizes, taking a page from AMD and their Infinity Cache.
 

loki1944

Posts: 601   +441
You are assuming that demand will be atypically low, enough to avoid an instant sell-out at launch. High-end hardware sold out instantly even before crypto mining was a thing, and scalpers were scalping PS5s and other consoles that can't mine crypto, so there is that. Measured against all the hardware launches before crypto mining, your counter-argument assumes an atypical one. That is why.
Especially since I imagine plenty of PC gamers have held out on upgrading since 2019.