Nvidia GeForce RTX 4090 is 78% faster than RTX 3090 Ti, hits 3.0 GHz in leaked benchmarks

While that's wicked fast, my current setup with an RTX 3080 Ti handles every maxed-out game I throw at it - is there software coming that will actually be able to take advantage of this speed in, say, the next four years?
I've heard people say it barely handles some games in 4K. If the 3080 Ti is just barely enough for all 4K games, then we aren't there yet.
 
Just out of curiosity, how many watts is too much for a typical household 15/20 amp circuit?
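Rough math, assuming standard 120 V North American circuits and the common 80% rule of thumb for continuous loads (the function below is just an illustrative sketch, not an electrical-code reference):

```python
# Rough estimate of safe continuous load on a household circuit.
# Assumes 120 V (North America) and the common 80% rule for continuous loads.
def safe_continuous_watts(breaker_amps: float, volts: float = 120.0, derate: float = 0.8) -> float:
    return breaker_amps * volts * derate

for amps in (15, 20):
    print(f"{amps} A circuit: ~{safe_continuous_watts(amps):.0f} W continuous "
          f"({amps * 120} W absolute limit)")
# 15 A circuit: ~1440 W continuous (1800 W absolute limit)
# 20 A circuit: ~1920 W continuous (2400 W absolute limit)
```

So a single high-end GPU is fine, but a whole system plus monitor on the same 15 A circuit as other appliances starts eating into that headroom.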
 
While that's wicked fast, my current setup with an RTX 3080 Ti handles every maxed-out game I throw at it - is there software coming that will actually be able to take advantage of this speed in, say, the next four years?
A fair question. Most of the biggest-budget games are designed around the capabilities of the current console generation. As you scale PC GPU power well past that, what you're most likely to enable is higher framerates, more post-processing, more insurance against inefficient ports, and, for the handful of people with 8K or greater TVs/monitors, a chance to try those higher resolutions. Personally, I find a certain amount of all of the above helpful, although diminishing returns apply (especially later in a console generation, when PC GPUs can be even further ahead).



 
While that's wicked fast, my current setup with an RTX 3080 Ti handles every maxed-out game I throw at it - is there software coming that will actually be able to take advantage of this speed in, say, the next four years?
Stop thinking games, then. Think simulation software, like ANSYS Live or other Lagrangian solvers. Those have been using multiple high-speed GPUs for a few years now. They will definitely take advantage of faster GPUs, especially ones with more RAM. RAM on a GPU determines the number of elements in a model, and therefore the resolution. You need that resolution for meshless flow models in sheet-metal assemblies.
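As a very rough illustration of why VRAM caps model size: the bytes-per-element figure below is a hypothetical placeholder, not an actual ANSYS number.

```python
# Back-of-the-envelope: how many solver elements/particles fit in GPU memory.
# BYTES_PER_ELEMENT is a hypothetical placeholder; real solvers vary widely.
BYTES_PER_ELEMENT = 200      # assumed bytes per element (position, state, connectivity, ...)
GIB = 1024 ** 3

def max_elements(vram_gib: float, overhead_fraction: float = 0.2) -> int:
    """Estimate the element budget, reserving some VRAM for the solver itself."""
    usable = vram_gib * GIB * (1.0 - overhead_fraction)
    return int(usable // BYTES_PER_ELEMENT)

for vram in (12, 24, 48):
    print(f"{vram} GiB card: ~{max_elements(vram) / 1e6:.0f} million elements")
```

Whatever the real per-element cost is, doubling VRAM roughly doubles the element count, which is where the resolution gain comes from.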
 
I've heard people say it barely handles some games in 4K. If the 3080 Ti is just barely enough for all 4K games, then we aren't there yet.
DLSS will keep the 3080 going strong for a long time.

But since 120-144Hz 4K monitors are now getting more common, 4K60p won't cut the mustard. I skipped this gen entirely and hopefully we will see 4K120p in AAA titles with RT (with DLSS/FSR).
 
It amazes me that they would even bother to try air cooling when you're going to spend like 2 grand for this card. Water cooling should be a STANDARD if we're talking about this much heat.
I prefer hybrids myself, but the leak was interpreted as 65°C on air at 30°C ambient, which seems fake. Still, TechPowerUp has a great chart for all the 3090 Tis, and only one card comes in sub-70°C: the air-cooled Zotac AMP Extreme 3090 Ti, at 68°C on the core and low-to-mid 80s for memory and hotspot. So I'd say it's possible to have a decent air-cooled card, but it will come with a huge heatsink.
Everyone should watch for independent reviews with long gaming sessions to make sure the cards don't throttle, and not fall for 5-minute benchmarks!
 
I prefer hybrids myself, but the leak was interpreted as 65°C on air at 30°C ambient, which seems fake. Still, TechPowerUp has a great chart for all the 3090 Tis, and only one card comes in sub-70°C: the air-cooled Zotac AMP Extreme 3090 Ti, at 68°C on the core and low-to-mid 80s for memory and hotspot. So I'd say it's possible to have a decent air-cooled card, but it will come with a huge heatsink.
Everyone should watch for independent reviews with long gaming sessions to make sure the cards don't throttle, and not fall for 5-minute benchmarks!

Zotac's AMP! cards seem to have good cooling - massive heatsinks. My 980Ti AMP! Omega rarely broke 70C.
 
Zotac's AMP! cards seem to have good cooling - massive heatsinks. My 980Ti AMP! Omega rarely broke 70C.
It's true, but the closest GPUs we have to next-gen TDP are the 3090 Tis; the 980 Ti has a 250 W TDP, while the 4090 has a rumored 450 W TDP. When the 3090 Tis came out, I said Nvidia was preparing the market for higher-TDP cards and a higher price premium, so the 3090 Tis are a direct foreshadowing of the next-gen flagship.
 
ProjectBeyond: Beyond the bottom 90% to buy and run, thanks to the 2000 USD price, limited availability, scalpers running wild, and 500 W power draw in the middle of a world energy crisis. Running a gaming PC in Germany with a 500 W GPU for 2 hours costs 1 Euro, which is 1 USD thanks to the EU.

Edit: so that is about 30 EUR/USD a month, averaged over 12 months, if you game 2 hours a day on average, lol. Suddenly, NV cloud tiers look attractive, since they rid you of most of the energy cost (the CPU runs at a usual load and the GPU at zero load).
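For reference, a minimal sketch of the arithmetic behind those figures; the ~1 EUR/kWh rate is the one the post implies, and actual German tariffs vary:

```python
# Reproduces the rough cost math from the post above.
# RATE_EUR_PER_KWH is the ~1 EUR/kWh figure the post implies; actual tariffs vary.
GPU_WATTS = 500
HOURS_PER_DAY = 2
RATE_EUR_PER_KWH = 1.0
DAYS_PER_MONTH = 30

kwh_per_day = GPU_WATTS / 1000 * HOURS_PER_DAY      # 1.0 kWh
cost_per_day = kwh_per_day * RATE_EUR_PER_KWH       # ~1 EUR
cost_per_month = cost_per_day * DAYS_PER_MONTH      # ~30 EUR

print(f"{kwh_per_day:.1f} kWh/day -> {cost_per_day:.2f} EUR/day, ~{cost_per_month:.0f} EUR/month")
```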
 
Strictly saying it's "too big" for a mid-sized tower can be a bit misleading. There are a lot of differently sized and designed cases out there that fall into the mid-tower range, and I'm sure some of them could house one of these beasts.

Take my CM HAF XB Evo case - it's not a mid-tower, but it can hold a decently sized GPU without too many constraints. Max GPU length is 13.1". Depending on what else you have inside, that could restrict how long a GPU you can use, but there haven't been many GPUs that wouldn't fit in the case, except for the high-end AIB 3090/3090 Ti cards that run 13.5-14.5".

If the 4090 is as long as those high-end AIB 3090/Ti cards, it'll be pushing 14.5" (~37 cm) or longer, and that's just nuts. If that's the case and I had to have a 4090, I'd have to move back to my full tower that's tucked away in the storage space in my basement.

My ~$60 mid-tower can take a 42 cm long GPU. Most actually can. My current card is 31 cm long, and it's a 1070 Ti.
 
1500 W PSU
New full-size case
Major water cooling for the CPU and the GPU
New motherboards (to handle the new Intel & AMD CPUs)
New DDR5 RAM
etc.

All this is for what, exactly?

Aside from professionals - and besides bragging rights for spoiled rich kids - what can these new expensive toys do that we can't do with the current generation of CPUs and GPUs?
You could say the exact same thing about the generation before the current one, and so on and on. Your point being?
 
DLSS will keep the 3080 going strong for a long time.

But since 120-144Hz 4K monitors are now getting more common, 4K60p won't cut the mustard. I skipped this gen entirely and hopefully we will see 4K120p in AAA titles with RT (with DLSS/FSR).
Even a 4090 will have issues handling 4K at high FPS sooner than most expect. We have heard that "this high end" will finally be able to run 4K well, over and over, gen after gen; the fact is there is still a long way to go even with a 4090. It just doesn't scale linearly like that: 4K is a gigantic step up from 2560x1440, and there is no comparably gigantic step up in raw GPU performance gen over gen.
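A quick pixel-count comparison shows the size of that step; raw pixel count is only a rough proxy for GPU load:

```python
# Pixel counts for common resolutions; raw pixel count is only a rough
# proxy for GPU load, but it shows the size of the jump to 4K.
resolutions = {
    "1440p (2560x1440)": 2560 * 1440,
    "4K (3840x2160)": 3840 * 2160,
}
base = resolutions["1440p (2560x1440)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.1f} MP ({pixels / base:.2f}x 1440p)")
# 4K pushes ~2.25x the pixels of 1440p every single frame.
```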
 
Power usage should go down, not up lol
Sometimes it does, sometimes it doesn't; it depends on a number of factors: chip design, process nodes, how hard the competition is pushing, and how far the GPU designers are willing to go with their top designs. There is nothing wrong with making a 2000 W GPU as long as we also have the choice of low- and mid-power GPUs.
 
As always, I'm looking for at least 50% more performance for the same money when upgrading.
This round I will go with any vendor that can give me 3070 Ti or 6800 performance for $250-350 at 200 W max.
My 2060 is still good for now, but some games are using almost 6 GB of VRAM at 2560x1080, and an 8 GB card will have to last the next couple of years.
Honestly, I don't believe we will get this price/perf ratio this gen.
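A minimal sketch of that upgrade rule as a check; all numbers are hypothetical performance-index values, not benchmarks:

```python
# Checks a candidate upgrade against the "50% more performance per dollar,
# 200 W max" rule from the post above. All numbers here are hypothetical.
def worth_upgrading(old_perf, old_price, new_perf, new_price, new_watts,
                    min_gain=1.5, max_watts=200):
    return (new_perf / new_price) >= min_gain * (old_perf / old_price) and new_watts <= max_watts

# Hypothetical: a 2060 bought for 350 USD (performance index 100) versus a
# candidate card with roughly 3070 Ti-class performance (index ~190) at 300 USD and 200 W.
print(worth_upgrading(old_perf=100, old_price=350,
                      new_perf=190, new_price=300, new_watts=200))   # True
```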
 
what are you getting in return? We’ve got ray tracing.

Ha ha ha. No. Just no. No, we have not "got ray tracing".

What we have now is a cut-down, minimalist attempt at real-time shadows from a few point sources in most games, or some reflections added in others in minimal areas like puddles. 95% of what you see in "ray-traced" games is still old-style pre-rendered shadow maps and raster lighting.

We are nowhere near the quality that traditional non-real-time rendering provides, and that takes minutes to hours per frame to render. Call me when we can do it in 1/60 of a second on some future GPU; only then can you say "we've got ray tracing".
 
And why exactly do you believe scalping is going to be a problem this generation? Literally nothing indicates that it will be.
Key word: initially. Unfortunately, it's the new norm for all hardware and goods now. The good part is there will be a significant sell-off of high-end cards like 3090 Tis and 3090s, so it's not all gloom and doom. Questions?
 
As always, I'm looking for at least 50% more performance for the same money when upgrading.
This round I will go with any vendor that can give me 3070 Ti or 6800 performance for $250-350 at 200 W max.
My 2060 is still good for now, but some games are using almost 6 GB of VRAM at 2560x1080, and an 8 GB card will have to last the next couple of years.
Honestly, I don't believe we will get this price/perf ratio this gen.

Yeah, that means you won't be buying anything for a few years... probably pick up a used 6900 XT sometime in the next three years!
 
Key word: initially. Unfortunately, it's the new norm for all hardware and goods now. The good part is there will be a significant sell-off of high-end cards like 3090 Tis and 3090s, so it's not all gloom and doom. Questions?
Yeah, why again...? Miners are not interested in gaming cards anymore, now that Nvidia has dedicated mining cards out...
 