Nvidia GeForce RTX 4090 is 78% faster than RTX 3090 Ti, hits 3.0 GHz in leaked benchmarks

1500 W PSU
New full-size case
Major water cooling for the CPU and the GPU
New motherboards (to handle the new Intel & AMD CPUs)
New DDR5 RAM
etc.

All this is for what exactly??

Aside from professionals - and besides bragging rights for the spoiled rich kids - what can these new expensive toys do that we can't do with the current generation of CPUs and GPUs??
4K/120Hz max settings with RT?

That's what I'm aiming for.... I'll keep upgrading till that's the standard I see across all the games I play.

3080 Ti / 10900K and I'm still not close to it...

The only game I play where I'm even close is Destiny, and even there I have to use render scaling and still don't stay above 120 at all times.
 
And why exactly do you believe scalping is going to be a problem this generation? Literally nothing indicates that it will be a problem.
Scalping will always be a problem when there's demand: if scalpers think they can sell a product, they'll buy it, and people will pay for it.... If you think crypto was the ONLY reason people were paying more for cards in the past, you haven't been paying attention.

I KNOW for a FACT that, going back at least 3 generations, scalpers have been making bank off GPUs for quite some time after release.... Will it be YEARS like it was this last time? Probably not... but months? Absolutely!
 
Even a 4090 will have issues handling 4K resolution at high FPS sooner than most expect. We have heard that "this high end" will finally be able to run 4K well, over and over, gen after gen; the fact is there is a long way to go even with a 4090. It just doesn't scale linearly like that: 4K is a gigantic step up from 2560x1440, and there is no gigantic step up in raw GPU performance gen over gen.
Been chasing "4K" since 2013, and you're absolutely right: in the ultra-high-end circles we've been talking about it coming "next gen" ever since the 780 (though it was in 2x or 3x SLI back then). The 2080 Ti was the first time we started talking about it on single cards, and while the 3080/90 did bring 4K/60 to MOST games, us high-end guys have moved on from 4K/60 being "the goal" to 4K/120.... So the chase continues.

I'm hoping the 40 series brings a card that can do it and I think I might actually finally be able to take a gen or two off without upgrading.
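For a rough sense of how big the 1440p-to-4K (and 60-to-120) jump actually is, here's a quick back-of-the-envelope pixel-count comparison. This is only a sketch: it assumes GPU load scales roughly with pixels per second, which is a crude approximation of real-world performance.

# Rough pixel-count comparison between common gaming resolutions.
# Assumption: required GPU throughput scales roughly with pixels per second,
# which ignores CPU limits, memory bandwidth, RT cost, etc.
resolutions = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels)  # {'1440p': 3686400, '4K': 8294400}

# 4K pushes 2.25x the pixels of 1440p...
print(pixels["4K"] / pixels["1440p"])  # 2.25

# ...and chasing 4K/120 instead of 1440p/60 means ~4.5x the pixels per second.
print((pixels["4K"] * 120) / (pixels["1440p"] * 60))  # 4.5

Under that naive scaling, even a gen-over-gen jump like the rumored ~78% from the headline doesn't cover a 2.25x step on its own, which is roughly the point about it not scaling the way people hope.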
 
So in order to enjoy this new GPU, I'm gonna need:
- A new PSU over 1000 W
- A new case
- A new custom liquid-cooling system
- An air conditioner for my room
- An extra 50-70 euros or more on my electric bill every month

Hmm, no thanks. This type of monster is clearly for professional users, not gamers anymore, imo.
Let's see..

Have a Tower 900 (massive case)

Sold my 3080 Ti/10900K/mobo/memory/1200W Plat PSU for $3,600 a little while back, to prepare

Still have my full water-cooling kit, including pumps, rads, CPU block, and all fittings, tubing, tools, etc.

House is central air, but the PC room has a dedicated split A/C system


Electricity is cheap where I'm at plus have solar...


So yeah, I'm ready to go, and as a "gamer" I have no problem with it.
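On the electricity point above, here's a rough back-of-the-envelope monthly-cost estimate. It's only a sketch: the system wattage, hours per day, and price per kWh below are illustrative assumptions, not figures from the thread.

# Rough monthly electricity cost for a high-end gaming rig.
# All inputs are illustrative assumptions - adjust for your own setup.
system_watts = 600        # assumed wall draw while gaming (GPU + CPU + rest)
hours_per_day = 4         # assumed gaming hours per day
price_per_kwh = 0.40      # assumed electricity price in euros/kWh

kwh_per_month = system_watts / 1000 * hours_per_day * 30
cost_per_month = kwh_per_month * price_per_kwh

print(f"{kwh_per_month:.0f} kWh/month -> {cost_per_month:.2f} EUR/month")
# ~72 kWh/month -> ~28.80 EUR/month at these numbers; double the hours
# (or the price per kWh) and you land in the 50-70 EUR range mentioned above.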
 
Good point. However, I doubt there will be much content that will break 120Hz at 4K quality settings, even on next-gen hardware.
I'm not sure what you mean, but almost every game can benefit from 4K/120 over 4K/60. When you add in things like RT, we absolutely need more powerful cards than what we have if you want to drive today's top displays (like LG OLEDs) at their best.
 
Good point. However, I doubt there will be much content that will break 120Hz at 4K quality settings, even on next-gen hardware.
Depends what you play. I have been playing Vermintide 2 at maximum settings in 4K at 115 fps average on my CX OLED and 3090 XC3 Ultra Hybrid for almost 2 years now. I am holding off on Cyberpunk and RDR2 until I can play at maximum settings with 60 fps minimum, because I don't like second playthroughs.
Although I agree, 4K 120Hz via HDMI 2.1 on an OLED with 1ms response time is currently the sweet spot until DP 2.0 hits the market. You can cap performance at 4K 120Hz for next gen as well, and even use the upcoming trend of DLDSR, where you render at a higher resolution than 4K, like 8K, and improve image quality even further if there is performance on the table.
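As a rough illustration of what DLDSR-style factors translate to on a 4K panel: this is only a sketch, assuming the 1.78x and 2.25x values are the DLDSR options exposed in NVIDIA's driver, and the math is simple area scaling.

# DLDSR/DSR factors are area multipliers: the render resolution scales
# each axis by sqrt(factor), then the result is downscaled to the native panel.
import math

native = (3840, 2160)   # 4K panel
factors = [1.78, 2.25]  # DLDSR factors (assumed from NVIDIA's driver options)

for f in factors:
    s = math.sqrt(f)
    w, h = round(native[0] * s), round(native[1] * s)
    print(f"{f}x DLDSR on 4K -> render at about {w}x{h}, then downscale to 3840x2160")
# 2.25x works out to 5760x3240 - a big extra load, hence "if there is performance on the table".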
 
Won't even fit in a mid-sized case? Between the size and the power needed for it... I don't know.
 
The RTX 4090 will be $1,499+ MSRP... and use a $300 PSU. Nobody really cares about the mid-sized case...
 
For argument's sake, let's assume the rumor is true, and speaking solely from the gaming perspective:

If you slap an i9/R9 with a 4090 into your PC, water cool it, etc., what are you getting in return? We've got ray tracing. Is the next step just pushing photorealism? Seems like diminishing returns at this point, or waiting on game developers/consoles to catch up. Not trying to tear down these advancements, because I think it's pretty cool, but still.

Would like to hear some thoughts.

The big problem is that to get ever-diminishing gains we need to throw significantly more processing power at the problem. That's one reason why we've seen new features like 4K+, new AA modes, VRR, and the like rather than any significant graphical leaps for almost a decade now. We're chasing small gains. At best, you see more games start to use ray tracing, but that's about it; it's not like we're near the point where we can handle ray tracing in environments with multiple independent light sources (which gets comically expensive FAST).
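To put a rough number on why extra light sources get expensive so fast, here's a naive ray-budget estimate. It's a sketch under simple assumptions (one shadow ray per light, per pixel, per frame, no bounces, no denoising or sampling tricks); real engines are far more sophisticated.

# Naive shadow-ray budget for ray-traced direct lighting at 4K.
# Assumption: one shadow ray per light, per pixel, per frame - no bounces,
# no importance sampling, no reuse. Real engines cheat heavily to avoid this.
pixels_4k = 3840 * 2160
fps = 60

for lights in (1, 4, 16):
    rays_per_second = pixels_4k * fps * lights
    print(f"{lights:>2} lights -> {rays_per_second / 1e9:.1f} billion shadow rays/s")
# One light is already ~0.5 billion rays/s; 16 independent lights is ~8 billion,
# before you even add reflections, GI, or a second bounce.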
 
Been chasing "4K" since 2013, and you're absolutely right: in the ultra-high-end circles we've been talking about it coming "next gen" ever since the 780 (though it was in 2x or 3x SLI back then). The 2080 Ti was the first time we started talking about it on single cards, and while the 3080/90 did bring 4K/60 to MOST games, us high-end guys have moved on from 4K/60 being "the goal" to 4K/120.... So the chase continues.

I'm hoping the 40 series brings a card that can do it and I think I might actually finally be able to take a gen or two off without upgrading.

I would say a 3080 Ti can handle 4K/60+ fine, though it obviously can't hit 4K/120 with consistency (at maxed settings, at least). With VRR that's a bit less of an issue, but it does look like the 4000 series could be sufficient for 4K/120 across the board.
 
It amazes me that they would even bother to try air cooling when you're going to spend like 2 grand on this card. Water cooling should be STANDARD if we're talking about this much heat.
At this price, I would personally expect liquid nitrogen cooling as standard ;)
 
HDMI is great, but it caps out at 4K 120Hz with 4:4:4 10-bit color. If you want 4:4:4 10-bit color above 4K 120Hz, you need DP 2.0. All the current high-end 4K monitors running 144Hz to 240Hz over DP 1.4 come with a picture-quality compromise like 4:2:0 chroma subsampling.
Lastly, the HDMI 2.1 standard is going on almost 3 years now.
I completely agree, but that's a very narrow use-case scenario. Last I checked, most PC displays are still 60Hz. Now, I realise that this won't be forever, but I think that HDMI 2.1 will work just fine for over 95% of gamers for probably the next three years at least. For people who are running the big, higher-hertz displays, then yeah, you're going to need DP 2.0.
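For a rough sanity check on those bandwidth claims, here's a payload-only estimate. It's a sketch: it ignores blanking intervals and protocol overhead, and the link-rate figures are the commonly cited effective data rates for HDMI 2.1 FRL and DP 2.0 UHBR20.

# Uncompressed 4:4:4 video payload vs. approximate effective link rates.
# Simplification: ignores blanking/overhead, so real requirements are somewhat higher.
def payload_gbps(width, height, refresh_hz, bits_per_channel=10, channels=3):
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

links = {
    "HDMI 2.1 (FRL, ~effective)": 42.6,
    "DP 2.0 UHBR20 (~effective)": 77.4,
}

for hz in (120, 144, 240):
    need = payload_gbps(3840, 2160, hz)
    fits = [name for name, rate in links.items() if need <= rate]
    print(f"4K {hz}Hz 10-bit 4:4:4 needs ~{need:.1f} Gbps -> fits: {fits}")
# 4K 120Hz (~29.9 Gbps) fits HDMI 2.1; once blanking is added, higher refresh rates
# start needing compression (DSC) on HDMI 2.1, while DP 2.0 has room to spare.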
 
I completely agree, but that's a very narrow use-case scenario. Last I checked, most PC displays are still 60Hz. Now, I realise that this won't be forever, but I think that HDMI 2.1 will work just fine for over 95% of gamers for probably the next three years at least. For people who are running the big, higher-hertz displays, then yeah, you're going to need DP 2.0.
95% of gamers will not buy the RTX 4090 either, and yet we are here 😉. Enthusiasts will be enthusiasts.
FYI, 4K 144Hz displays via DP 1.4 were around more than 3 years ago. Nvidia even had a whole line-up of big-format G-Sync displays at 4K 144Hz, so strictly for the enthusiast niche, the next best thing is currently DP 2.0. Unfortunately, DP 2.0 monitors are still likely a year away. Maybe we'll get lucky and see concepts during CES 23. Until then I will enjoy my 4K 120Hz OLED via HDMI 2.1.
 
I completely agree, but that's a very narrow use-case scenario. Last I checked, most PC displays are still 60Hz. Now, I realise that this won't be forever, but I think that HDMI 2.1 will work just fine for over 95% of gamers for probably the next three years at least. For people who are running the big, higher-hertz displays, then yeah, you're going to need DP 2.0.


People do not buy GPUs to game at 60hz...
 
People do not buy GPUs to game at 60hz...
What makes you think most human eyes can see any difference from 24.9-30Hz up?
Isn't the hand faster than the eye? This has been exploited by illusionists since who knows when.
I know some people are exceptions, like the ones that can hear 20kHz and up, but 99% of humans don't.
 
What makes you think most human eyes can see any difference from 24.9-30Hz up?
Isn't the hand faster than the eye? This has been exploited by illusionists since who knows when.
I know some people are exceptions, like the ones that can hear 20kHz and up, but 99% of humans don't.
Makes sense if you want to get headshotted the maximum number of times in gaming, that is. Most gamers prefer the opposite, where you get to land the maximum headshots and improve your K/D ratio. We aren't talking about an average person here on a 4090 article, are we?
Update: I am certain that 99% of enthusiasts who buy 4090-class cards will have a display capable of at least 120Hz.
Besides gaming, scrolling on a sub-120Hz display is also much worse for productivity purposes. There is substantial evidence, subjective and objective, that faster-refresh-rate displays are better, with a plateau of diminishing returns the higher you go.
 
Makes sense if you want to get headshotted the maximum number of times in gaming, that is. Most gamers prefer the opposite, where you get to land the maximum headshots and improve your K/D ratio. We aren't talking about an average person here on a 4090 article, are we?
Update: I am certain that 99% of enthusiasts who buy 4090-class cards will have a display capable of at least 120Hz.
Besides gaming, scrolling on a sub-120Hz display is also much worse for productivity purposes. There is substantial evidence, subjective and objective, that faster-refresh-rate displays are better, with a plateau of diminishing returns the higher you go.
Let me break it down for you again: from 30Hz up, the human eye can't tell any difference. If the signal and the screen are in 100% sync, you can't see any difference. Just look at consoles and TV screens.
More important in a competitive FPS game is the network latency, aka ping.
If one player has 20ms and another 40ms, the one with 20ms will get the server information refreshed first. And from there, human reflexes make the real difference.
An old CRT TV screen in the '80s and '90s ran PAL, SECAM or NTSC at 25 or 30 fps, depending on the region's mains - 50Hz in Europe or 60Hz in the US. I didn't feel any lag or ghosting on CRT screens for football, basketball or rugby games. That is, until the shlt LCDs came along.
 
Let me break it down for you again: from 30Hz up, the human eye can't tell any difference. If the signal and the screen are in 100% sync, you can't see any difference. Just look at consoles and TV screens.
More important in a competitive FPS game is the network latency, aka ping.
If one player has 20ms and another 40ms, the one with 20ms will get the server information refreshed first. And from there, human reflexes make the real difference.
An old CRT TV screen in the '80s and '90s ran PAL, SECAM or NTSC at 25 or 30 fps, depending on the region's mains - 50Hz in Europe or 60Hz in the US. I didn't feel any lag or ghosting on CRT screens for football, basketball or rugby games. That is, until the shlt LCDs came along.
I understand your subjective reasoning, and that is why I left room for the 1% and didn't say 100%.
 
95% of gamers will not buy the RTX 4090 either, and yet we are here 😉. Enthusiasts will be enthusiasts.
FYI, 4K 144Hz displays via DP 1.4 were around more than 3 years ago. Nvidia even had a whole line-up of big-format G-Sync displays at 4K 144Hz, so strictly for the enthusiast niche, the next best thing is currently DP 2.0. Unfortunately, DP 2.0 monitors are still likely a year away. Maybe we'll get lucky and see concepts during CES 23. Until then I will enjoy my 4K 120Hz OLED via HDMI 2.1.
Enthusiasts will be enthusiasts, I agree. However, enthusiasts are a tiny portion of the PC-owning public. Just look at how many people buy "brand-in-a-box" PCs like Dell, Lenovo and HP. We enthusiast builders are nowhere near the majority.
 
Enthusiasts will be enthusiasts, I agree. However, enthusiasts are a tiny portion of the PC-owning public. Just look at how many people buy "brand-in-a-box" PCs like Dell, Lenovo and HP. We enthusiast builders are nowhere near the majority.
All those vendors have high-refresh-rate monitors: HP was one of the first on the market with a 1440p 240Hz display a few years back, Dell had the first QD-OLED 165Hz ultrawide 3440x1440 gaming monitor, and Lenovo has a 360Hz full-HD monitor as well. Even consoles have 120Hz capability, and console gamers aren't enthusiasts nor high-end PC gamers; they come from the commodity market, which is the way the market is trending.
 
What makes you think most human eyes can see any difference from 24.9-30Hz up?
Isn't the hand faster than the eye? This has been exploited by illusionists since who knows when.
I know some people are exceptions, like the ones that can hear 20kHz and up, but 99% of humans don't.


You have no idea wtf you are talking about.

Back in the day, we had CRTs that could pretty much scan at any frequency up to 266Hz (i.e. medical-grade CRTs), which we used for gaming.

I myself would get tremendous headaches unless the frequency was higher than 80Hz. As a matter of fact, in most business environments those monitors are required to run at 78Hz or higher.

As for "what the eye can see"... that just means YOU do not understand (one bit) what the difference between a 60Hz monitor and a 144Hz monitor is.

It's about FEEL.

It is why competitive gamers seek the highest frequencies: their aim and accuracy rely on the fluid motion you feel as you move your character.

You are not sitting back watching TV at 144Hz... you are playing.
 
Let me break it down for you again: from 30Hz up, the human eye can't tell any difference. If the signal and the screen are in 100% sync, you can't see any difference. Just look at consoles and TV screens.
More important in a competitive FPS game is the network latency, aka ping.
If one player has 20ms and another 40ms, the one with 20ms will get the server information refreshed first. And from there, human reflexes make the real difference.
An old CRT TV screen in the '80s and '90s ran PAL, SECAM or NTSC at 25 or 30 fps, depending on the region's mains - 50Hz in Europe or 60Hz in the US. I didn't feel any lag or ghosting on CRT screens for football, basketball or rugby games. That is, until the shlt LCDs came along.


LOL... once again, in gaming you are moving things on the screen; you are not watching passively like you do television. Are you really this inept at understanding movement within a game at 60Hz vs 144Hz...?

dERP!
 
LOL... once again, in gaming you are moving things on the screen; you are not watching passively like you do television. Are you really this inept at understanding movement within a game at 60Hz vs 144Hz...?

dERP!
In real life, wave your hand quickly in front of your eyes and let me know if you see every hand position or just pieces of the motion. The human eye is not a slow-motion CCD.
Or stick a paint mark on a PC fan blade, spin the fan starting at 1 RPM and then faster, and let me know the RPM at which you no longer see the mark but a circle.
If you can see the motion smoothly, then you will see your 144, 240, 360Hz.

To me it's all marketing bullshlt and human ego over who has the fastest LCD or GPU.
 
In real life, wave your hand quickly in front of your eyes and let me know if you see every hand position or just pieces of the motion. The human eye is not a slow-motion CCD.
Or stick a paint mark on a PC fan blade, spin the fan starting at 1 RPM and then faster, and let me know the RPM at which you no longer see the mark but a circle.
If you can see the motion smoothly, then you will see your 144, 240, 360Hz.

To me it's all marketing bullshlt and human ego over who has the fastest LCD or GPU.
I make the same argument for a 120Hz minimum refresh rate, specifically on a 1ms response time OLED with around 5ms of input lag; anything beyond those specs has minimal return on the hardware premium you have to invest for that experience. Objectively, 60Hz displays have worse input lag and worse response times, contributing to ghosting and motion sickness, so there is that. Objectively, the data correlates with the subjective experience most gamers have: anything sub-120Hz means less competitive advantage and more motion sickness. While I agree there is marketing fluff surrounding premium products with higher-than-120Hz displays and enthusiast-level graphics, I hope you didn't come to a 4090 article just to troll!
 