Nvidia GeForce RTX 4090 is 78% faster than RTX 3090 Ti, hits 3.0 GHz in leaked benchmarks

midian182

Rumor mill: Nvidia looks set to finally give the RTX 4000 series its official unveiling in 11 days. Adding to the hype train is yet another alleged leaked benchmark for what will be the flagship card of the expected three initial releases: the RTX 4090.

Screenshots of an unnamed card appeared on China's Chiphell forums, so take them with the usual pinch of salt. They were shared on Twitter by user HXL (via VideoCardz). The 3DMark Time Spy Extreme graphics score is a massive 20,192, suggesting this is the RTX 4090, which scored around 19,000 in another leaked Time Spy Extreme graphics test in July.

Assuming the screenshots are the real deal, the score makes the RTX 4090 90% faster than its Ampere equivalent, the RTX 3090. That lines up with previous reports of the Lovelace card being around twice as fast as its predecessor. It also makes the RTX 4090 78% faster than the current-gen flagship, the RTX 3090 Ti. Of course, synthetic benchmarks don't always reflect real-world performance, but the numbers are still promising.
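For anyone checking the math, here's a quick sketch of the uplift arithmetic. The 4090 number is from the leak; the baseline graphics scores are assumed typical results for the older cards, not figures from the screenshots:

```python
# Uplift math behind the headline percentages.
# Baseline scores are assumed typical results, not from the leaked screenshots.
leaked_4090 = 20_192       # leaked Time Spy Extreme graphics score
baseline_3090 = 10_600     # assumed typical RTX 3090 graphics score
baseline_3090_ti = 11_300  # assumed typical RTX 3090 Ti graphics score

def uplift_pct(new: float, old: float) -> float:
    """Percentage improvement of `new` over `old`."""
    return (new / old - 1) * 100

print(f"vs RTX 3090:    {uplift_pct(leaked_4090, baseline_3090):.0f}%")     # ~90%
print(f"vs RTX 3090 Ti: {uplift_pct(leaked_4090, baseline_3090_ti):.0f}%")  # ~79%
```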

The leak also shows the card reaching a 3,015 MHz frequency. For comparison, the RTX 3090 Founders Edition boosts to 1,695 MHz. According to the leaker, it has a default TDP of 450W, though it is designed for 600W to 800W. That lines up with a rumor in June claiming the RTX 4090 could have a power limit of 800W but a lower default TDP.
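Interestingly, the clock increase alone lands in the same ballpark as the benchmark gap, though that's context rather than an explanation, since performance depends on far more than frequency:

```python
# Clock comparison from the leak; frequency alone doesn't determine performance.
leaked_boost_mhz = 3_015       # peak frequency shown in the leak
rtx_3090_fe_boost_mhz = 1_695  # RTX 3090 Founders Edition rated boost

pct = (leaked_boost_mhz / rtx_3090_fe_boost_mhz - 1) * 100
print(f"{pct:.0f}% higher clock")  # ~78%
```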

The card's overall Time Spy Extreme score is lower than expected, but that's down to the PC the RTX 4090 is being tested in: its Core i5-12400F processor drags the CPU portion of the result down.
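That follows from how 3DMark combines the two sub-scores. The sketch below uses the weighted harmonic mean UL publishes for Time Spy (assuming the same 0.85/0.15 weighting applies to the Extreme preset; the CPU score here is a made-up illustration):

```python
# 3DMark's overall score is a weighted harmonic mean of the sub-scores,
# so a weak CPU result pulls the total well below the graphics score.
# Weights per UL's published Time Spy formula; CPU score is illustrative.
def overall_score(graphics: float, cpu: float,
                  w_gpu: float = 0.85, w_cpu: float = 0.15) -> float:
    return 1 / (w_gpu / graphics + w_cpu / cpu)

graphics = 20_192  # leaked graphics score
cpu = 5_000        # hypothetical Core i5-12400F CPU score

print(f"Overall: {overall_score(graphics, cpu):.0f}")  # ~13,900, far below 20,192
```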

The leaker also mentions that the card is air-cooled and very large, too big to fit into a mid-sized tower. That's something we expected, given the RTX 4090's reported performance and power consumption.

Nvidia's GeForce Twitter account yesterday announced Project Beyond. We don't know any details yet, though it's almost certainly related to an RTX 4000 announcement coming at GTC on September 20.


 
It amazes me that they would even bother to try air cooling when you're going to spend like 2 grand for this card. Water cooling should be a STANDARD if we're talking about this much heat.
 
The 4080 has only 76 SM units compared to the 128 units of the 4090, so if this is representative of gaming performance, the 4080 will likely not be anywhere close to 2x the 3080 (as some rumors have suggested); instead it might be 20-30% faster, landing just above 3090 Ti-level performance. That is, of course, if this synthetic benchmark represents real-world gaming performance. I read that there is a pretty major reconfiguring of the FP32 + INT cores with Lovelace, which may not be represented well in a synthetic benchmark but could have a bigger impact on gaming performance. I guess we will have to wait a few more weeks to find out.
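A back-of-the-envelope version of that estimate, scaling naively with SM count (real games won't scale linearly, so treat the output as illustrative):

```python
# Naive linear-in-SM-count estimate of where a 76-SM 4080 might land,
# anchored to the leaked 4090 result. Real scaling will be sublinear.
sm_4090, sm_4080 = 128, 76
uplift_4090_vs_3090ti = 1.78  # from the leaked benchmark above

est_4080_vs_3090ti = uplift_4090_vs_3090ti * (sm_4080 / sm_4090)
print(f"Estimated 4080 vs 3090 Ti: {est_4080_vs_3090ti:.2f}x")  # ~1.06x
```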
 
Perhaps Nvidia decided to take the Intel approach of advertising a lower TDP/PL1 while keeping a secret PL2 that hits 600W when there is sufficient thermal headroom. In any case, I am skeptical that the RTX 4090 will perform 2x better than the fastest Ampere card available now. Doubling the core count doesn't scale performance proportionately, and while the cards are clocked faster, a 100% improvement may not be achievable in all cases.
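That sublinear scaling is easy to illustrate with an Amdahl's-law-style model; the 90% parallel fraction below is purely an assumed figure:

```python
# Amdahl's-law-style illustration of why 2x the cores isn't 2x the speed.
def speedup(parallel_fraction: float, core_multiplier: float) -> float:
    serial = 1 - parallel_fraction
    return 1 / (serial + parallel_fraction / core_multiplier)

p = 0.90  # assume 90% of the workload scales with core count
print(f"2x cores -> {speedup(p, 2):.2f}x speedup")  # ~1.82x, not 2x
```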
 
The 4080 has only 76 SM units compared to the 128 units of the 4090, so if this is representative of gaming performance, the 4080 will likely not be anywhere close to 2x the 3080 (as some rumors have suggested); instead it might be 20-30% faster, landing just above 3090 Ti-level performance. That is, of course, if this synthetic benchmark represents real-world gaming performance. I read that there is a pretty major reconfiguring of the FP32 + INT cores with Lovelace, which may not be represented well in a synthetic benchmark but could have a bigger impact on gaming performance. I guess we will have to wait a few more weeks to find out.

Well, this makes sense if you expect the rest of the lineup to target sane total board power.

I mean, the 80-series being within 10 percent of the Titan tier (now renamed the 3090) had never happened before. So now that we're off that **** Samsung process, they can finally push the clocks on the new Titan into the stratosphere (and be much more laid-back with every other card).
 
For argument's sake, let's assume the rumor is true, and speaking solely from the gaming perspective:

If you slap an i9/R9 and a 4090 into your PC, water cool it, etc., what are you getting in return? We've got ray tracing. Is the next step just pushing photorealism? It seems like diminishing returns at this point, or waiting on game developers/consoles to catch up. Not trying to tear down these advancements, because I think it's pretty cool, but still.

Would like to hear some thoughts.
 
For argument's sake, let's assume the rumor is true, and speaking solely from the gaming perspective:

If you slap an i9/R9 and a 4090 into your PC, water cool it, etc., what are you getting in return? We've got ray tracing. Is the next step just pushing photorealism? It seems like diminishing returns at this point, or waiting on game developers/consoles to catch up. Not trying to tear down these advancements, because I think it's pretty cool, but still.

Would like to hear some thoughts.

I definitely am not interested in walking into the uncanny valley of gaming. I'd much rather use a GPU to run more advanced AI or to generate worlds rather than just render them. Why isn't Nvidia using its expertise in these areas to let developers do this with consumer games? If they are, I've not heard about it yet.
 
Mark my words, Ada will be insane on all fronts, prices included.
Listen to the leather jacket man's speech: they will scale prices with the current gen still selling for at least one quarter after Ada's release. Ampere will not see any more discounts for at least six months from now.
 
Way too big to fit into a mid-sized tower? I'm a little confused about this: is the video card bigger than a mid-size tower? Searching for mid-size towers shows cases like these as mid towers: https://graphicscardhub.com/wp-content/uploads/2017/04/mid-tower-case.jpg or http://www.suntekpc.com/image/case-tower-atx-xxx-9689d-detail.jpg

Is it bigger than that? There are many prebuilt desktop computers from HP, Acer, Dell, and so on that are mid towers, and some are just a little bit shorter than a mid tower, like this:

https://www.bhphotovideo.com/images...0k_16gb_256gb_ssd_rtx2080_windows_1527321.jpg

Why would you need a full-size tower like http://www.suntekpc.com/image/case-tower-atx-xxx-777b-detail.jpg?

This is crazy if the new video cards are going to be massively bigger than any video card before.
 
For argument's sake, let's assume the rumor is true, and speaking solely from the gaming perspective:

If you slap an i9/R9 and a 4090 into your PC, water cool it, etc., what are you getting in return? We've got ray tracing. Is the next step just pushing photorealism? It seems like diminishing returns at this point, or waiting on game developers/consoles to catch up. Not trying to tear down these advancements, because I think it's pretty cool, but still.

Would like to hear some thoughts.
Number one thing I'm thinking of is a space heater and a power bill to reflect it. My i9 and 3080 already pull 500+ watts while gaming, and it makes my room very warm, very fast, if there is no natural breeze or AC running.

Better graphics too, of course, but I tend to agree with you about once again waiting for games to catch up and use these resources in a meaningful way.

I feel my 500+ watts is already hard to justify just to play video games; pushing that to near 1,000 watts holds no interest for me in the pursuit of higher refresh rates and resolutions.
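For a sense of scale, here's a rough running-cost sketch; the daily hours and electricity rate are assumptions, so plug in your own numbers:

```python
# Rough monthly electricity cost of a ~1kW gaming rig.
# Usage and rate are assumed figures; adjust for your situation.
watts = 1_000        # hypothetical system draw under load
hours_per_day = 3    # assumed daily gaming time
rate_per_kwh = 0.15  # assumed $/kWh; varies widely by region

kwh_per_month = watts / 1_000 * hours_per_day * 30
cost = kwh_per_month * rate_per_kwh
print(f"{kwh_per_month:.0f} kWh/month, ~${cost:.2f}/month")  # 90 kWh, ~$13.50
```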
 
Way too big to fit into a mid-sized tower? I'm a little confused about this: is the video card bigger than a mid-size tower? Searching for mid-size towers shows cases like these as mid towers: https://graphicscardhub.com/wp-content/uploads/2017/04/mid-tower-case.jpg or http://www.suntekpc.com/image/case-tower-atx-xxx-9689d-detail.jpg

Is it bigger than that? There are many prebuilt desktop computers from HP, Acer, Dell, and so on that are mid towers, and some are just a little bit shorter than a mid tower, like this:

https://www.bhphotovideo.com/images...0k_16gb_256gb_ssd_rtx2080_windows_1527321.jpg

Why would you need a full-size tower like http://www.suntekpc.com/image/case-tower-atx-xxx-777b-detail.jpg?

This is crazy if the new video cards are going to be massively bigger than any video card before.

Strictly saying it's "too big" for a mid-sized tower can be a bit misleading. There are a lot of differently sized/designed cases out there that fall into the mid-sized tower range, and I'm sure some of them could house one of these beasts.

Take my CM HAF XB Evo case: it's not a mid-tower, but it can hold a decently sized GPU without too many constraints. Max GPU length is 13.1". Depending on what else you have inside, that could restrict how long of a GPU you can use, but there haven't been many GPUs that wouldn't work in the case, except for the high-end AIB versions of the 3090/3090 Ti cards that are 13.5-14.5" long.

If the 4090 is as long as those high-end AIB 3090/Ti cards, it'll be pushing 14.5" (~37 cm) or longer, and that's just nuts. If that's the case and I had to have a 4090, I'd have to move back to my full tower that's tucked away in the storage space in my basement.
 
I'd much rather use a GPU to run more advanced AI or to generate worlds rather than just render them. Why isn't Nvidia using its expertise in these areas to let developers do this with consumer games? If they are, I've not heard about it yet.
Here are two links for you to dig into:



Neural radiance fields are probably going to be the most useful for content creators in game development:


Still early days for all of this, but the research is there.
 
Strictly saying it's "too big" for a mid-sized tower can be a bit misleading. There are a lot of differently sized/designed cases out there that fall into the mid-sized tower range, and I'm sure some of them could house one of these beasts.

Take my CM HAF XB Evo case: it's not a mid-tower, but it can hold a decently sized GPU without too many constraints. Max GPU length is 13.1". Depending on what else you have inside, that could restrict how long of a GPU you can use, but there haven't been many GPUs that wouldn't work in the case, except for the high-end AIB versions of the 3090/3090 Ti cards that are 13.5-14.5" long.

If the 4090 is as long as those high-end AIB 3090/Ti cards, it'll be pushing 14.5" (~37 cm) or longer, and that's just nuts. If that's the case and I had to have a 4090, I'd have to move back to my full tower that's tucked away in the storage space in my basement.

It could also be very wide, maybe using 120mm (or larger) fans and an increased heatsink width. Increasing fan size is a relatively cheap way to substantially increase airflow over the heatsink without increasing noise. It could also be some 4-slot monster with a crazy-tall heatsink and 25mm-thick fans.
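The fan-size point is just geometry: swept area grows with the square of the diameter, and airflow roughly tracks swept area at a given RPM and blade design. A quick sketch (hub area ignored, so the ratios are approximate):

```python
# Swept-area comparison of common fan sizes; airflow roughly tracks
# swept area at the same RPM (hub area ignored for simplicity).
import math

def swept_area_mm2(diameter_mm: float) -> float:
    return math.pi * (diameter_mm / 2) ** 2

for d in (92, 100, 120):
    ratio = swept_area_mm2(d) / swept_area_mm2(92)
    print(f"{d}mm fan: {ratio:.2f}x the area of a 92mm fan")
# A 120mm fan sweeps ~1.70x the area of a 92mm fan
```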
 
I definitely am not interested in walking into the uncanny valley of gaming. I'd much rather use a GPU to run more advanced AI or to generate worlds rather than just render them. Why isn't Nvidia using its expertise in these areas to let developers do this with consumer games? If they are, I've not heard about it yet.

Well, they're good for both workloads, but if you think about it, a world generated by AI still has to be rendered!
 
Well, this is how you do a paper launch! Build an actual working product that's totally impractical for 99% of people to run, and that you probably won't be able to buy, but that sends the tech media into a frenzy.

It looks like they may have rowed back on the higher power-limit targets. If that applies across the stack, one wonders how it will impact performance targets.

 
While that's wicked fast, my current setup with an RTX 3080 Ti handles every maxed-out game I throw at it. Is there software coming that will actually be able to take advantage of this speed in, say, the next four years?
 
A 1,500W PSU
A new full-size case
Serious water cooling for the CPU and the GPU
A new motherboard (to handle the new Intel and AMD CPUs)
New DDR5 RAM
etc.

All this for what, exactly?

Aside from professionals, and besides bragging rights for spoiled rich kids, what can these new expensive toys do that we can't do with the current generation of CPUs and GPUs?
 
While that's wicked fast, my current setup with an RTX 3080 Ti handles every maxed-out game I throw at it. Is there software coming that will actually be able to take advantage of this speed in, say, the next four years?

Do you play competitive games? Or is maxed-out eye candy at 60 frames your goal?
 
What's up with these endless comments about Lovelace's power consumption and prices? Is it some kind of psychological conditioning to "embrace the pain"? Or Ngreedia farm bots trying to create the aura of a "unique product"?
 
I'm going to compare them based on price and power. Hopefully TechSpot makes a good comparison when the 4000 series is released.
 