The RTX 3090 Ti is still a monster when limited to 300W

midian182

In brief: Power efficiency is not the first thing that comes to mind when one thinks of the RTX 3090 Ti, yet it seems undervolting really can make the card less power-hungry than some of Nvidia's and AMD's best while maintaining a performance advantage.

Igor's Lab conducted a test on the RTX 3090 Ti that involved undervolting the card, which is rated for 450W of power consumption, down to 300W. Igor achieved this by setting the power limit to 300W in the overclocking tool MSI Afterburner and adjusting the voltage/frequency (VF) curve. He duplicated the curve of the RTX A6000, which also has a 300W maximum power draw and features the same GA102 die, though it uses GDDR6 instead of GDDR6X memory.
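For those curious to try the power-limit half of this at home, the cap (though not the VF-curve reshaping, which happens in Afterburner) can also be set programmatically through Nvidia's NVML library. Below is a minimal sketch assuming the nvidia-ml-py (pynvml) Python bindings and admin rights; the 300W figure simply mirrors Igor's target rather than anything we have validated:

```python
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# NVML reports power limits in milliwatts.
lo, hi = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
target = 300_000  # 300 W, the limit Igor used

if lo <= target <= hi:
    # Requires admin/root; persists until reboot or the next change.
    pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target)
    print(f"Power limit set to {target // 1000} W")
else:
    print(f"300 W is outside this card's allowed range "
          f"({lo // 1000}-{hi // 1000} W)")

pynvml.nvmlShutdown()
```

Note that NVML only exposes the board power limit; the VF curve itself still has to be edited in a tool like Afterburner.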

Igor tested the MSI Suprim X RTX 3090 Ti against the same brand of RTX 3080 10GB/12GB, RTX 3080 Ti, and RTX 3090 cards. He used MSI's Gaming X versions of the AMD RX 6800 XT and RX 6900 XT.

Looking at the 4K benchmarks, we see that the standard RTX 3090 Ti manages 107.4 fps, dropping just over 11 fps to 96.3 fps at 300W, roughly a 10% performance loss for a 33% cut in power.

Image credit: Igor's Lab

The RTX 3090 Ti pulls a maximum of 314W in the tests, while the 3080 Ti reached 409W, yet Igor concludes that the two cards are roughly equal in performance. The undervolted card also manages to beat the Radeon RX 6900 XT (359W), RTX 3080 12GB (393W), RTX 3080 10GB (351W), and Radeon RX 6800 XT (319W).

Of course, this is just a demonstration of what's possible with the RTX 3090 Ti. The card costs around $2,000, while the RTX 3080 Ti is roughly $1,200 and up, and it's going to take a very long time and a lot of use to recoup the $800 difference via power savings. Still, it's certainly an interesting experiment.
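For a rough sense of just how long, here's a back-of-the-envelope sketch. The 95W gap comes from Igor's numbers above; the electricity price and daily gaming hours are assumptions, so substitute your own:

```python
# Back-of-the-envelope payback math. The 95 W gap is from Igor's
# figures (314 W undervolted 3090 Ti vs. 409 W 3080 Ti at roughly
# equal performance); the price and usage numbers are assumptions.
price_gap_usd = 800        # ~$2,000 3090 Ti vs. ~$1,200 3080 Ti
watts_saved = 409 - 314    # W under gaming load
kwh_price_usd = 0.30       # assumed electricity price per kWh
hours_per_day = 3          # assumed daily gaming time

saved_per_hour = watts_saved / 1000 * kwh_price_usd    # ~$0.0285/h
payback_hours = price_gap_usd / saved_per_hour         # ~28,000 h
payback_years = payback_hours / (hours_per_day * 365)  # ~26 years

print(f"~{payback_hours:,.0f} gaming hours (~{payback_years:.0f} years)")
```

At around three hours of gaming a day, that works out to roughly a quarter century, which is why the tweak is best viewed as an efficiency experiment rather than a money saver.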

Make sure to check out our review of the RTX 3090 Ti, which we called fast and dumb.


 
Nice, I wonder what the temp difference is as well.
I already do this with my 1080 Ti (250W).
Undervolted it draws a max of 125-150W, which gives me high-end GTX 1080 performance at a max of 60C on air.
Fans at 30%
 
Yeah, this is nothing new, and it's the reason why people "undervolt".
You will often see higher performance because no throttling happens.

I do it on my 3080 Ti and my 3070.

My 3070 sits around 200 watts and performs like a stock 3070 Ti at +100/+1000 on the GPU/MEM.

My 3080 Ti uses ~325 watts and performs pretty much like a 3090 with +125 on the GPU (a memory OC makes no difference and is pointless, since watt usage goes up but performance stays where it is, because bandwidth is plentiful at stock) :laughing:
 
Undervolting is the way. My 3060Ti performs better than stock while only drawing 155W instead of 200W.
No throttling, much higher sustained clock speeds, cooler temps and, most importantly (to me), inaudible noise while gaming at max performance.
 
Undervolting is the new overclocking. Even a slight undervolt can help reduce temperatures and power consumption, and it increases performance in some cases.
 
Speaking of temperature difference, I remember reading a claim that running a chip above 50 degrees Celsius shortens its lifespan. What would the power draw of a 3090 Ti be if it were set to limit its speed so that it stayed under that temperature?
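One way to actually answer that would be to cap the card and watch the telemetry. Rough sketch, assuming the nvidia-ml-py (pynvml) package; the calls are standard NVML, though I haven't tried this on a 3090 Ti myself:

```python
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

# Log temperature and power once a second while you step the
# power limit down (e.g. in Afterburner) until temps hold at 50 C.
try:
    while True:
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)  # C
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000  # NVML reports mW
        print(f"{temp} C  {watts:6.1f} W")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```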
 
Speaking of temperature difference, I remember reading a claim that running a chip above 50 degrees Celsius shortens its lifespan. What would the power draw of a 3090 Ti be if it were set to limit its speed so that it stayed under that temperature?
I call BS on that; I've seen 16-year-old P4s running at 80-90C with no issues. The real risk is all the other parts on a graphics card that can go; anything under 70C is perfectly fine.
 
Speaking of temperature difference, I remember reading a claim that running a chip above 50 degrees Celsius shortens its lifespan. What would the power draw of a 3090 Ti be if it were set to limit its speed so that it stayed under that temperature?

Chips can easily withstand peaks of 80, 90, even 100C or higher... Voltage, and especially high voltage spikes, is the chip killer, not temps. We have had built-in protection against that for years; cards will simply throttle or shut down if temperature becomes an issue, e.g. from a faulty cooler.

I remember my AMD Thunderbird 1400 caught fire back in the day, when I ran it without a cooler for fun 🤣 Smoke, then fire.
 
...but when all is said and done, a $2000 card is still only about 20% faster than my $650 3080 FE.

What else is new? You always pay a huge premium for top-end stuff. I bought a 3080 on release for $649, sold it for $1,500, and bought a 3080 Ti for $1,500 last year.

20% more performance is a lot when we're talking about 4K+ gaming though, because none of the current cards feel overpowered for that task (in AAA games on high settings). I would say the 3080 Ti is the bare minimum for this, or better yet, the 3090 series.

At 1440p the 3060 Ti is "only" 20-25% slower than the 3080 :laughing: Is 20% a lot? Yes and no.

GA102 is really meant for 4K gaming, and the 3080 with its 10GB is on the low end here, both in terms of raw power and VRAM, especially if you plan to keep it for a few years. I would not recommend a 3080 10GB for 4K gamers, that is for sure. The 4000 series is coming in 4-6 months, and I bet even the 4060 Ti will beat the 3080: 5nm means almost double the core count plus higher clock speeds.

I have a 3080 Ti and a 3070, and the difference at 1440p is really not that huge. Going to 4K makes the performance gap much bigger; GA102 can only really spread its wings at 4K. Tons of cores and high-bandwidth memory shine at a high pixel count.
 
$1400 for a 3090 is a huge premium. $2000 for a Ti is nothing but an outright piss-take of a premium.
Not if you need the VRAM; the 3090 sold well because of this, not because of its gaming performance. Tons of people used it for professional work.

The 3090 series replaced the Titan class. Titans were also terrible perf/value gaming cards, but they could be used for work on the side.
 
Nice, I wonder what the temp difference is as well.
I already do this with my 1080 Ti (250W).
Undervolted it draws a max of 125-150W, which gives me high-end GTX 1080 performance at a max of 60C on air.
Fans at 30%
I underclock and undervolt my GPUs too. Saves power and heat without really degrading performance much.
 
Sure, for pro-level work the 24GB of VRAM justified the purchase, but Nvidia primarily marketed it at the gaming crowd, who fell for it hook, line & sinker.
You imply deception on the part of Nvidia? Why? Do you think including 24GB of VRAM is somehow a scam? And if so, how so?

$1400 for a 3090 is a huge premium.
Where are you seeing that price? You need to add another $1000 to it and you'll be closer to reality. Prices are dropping, but they haven't dropped THAT much...
 
What else is new? You always pay a huge premium for top-end stuff. I bought a 3080 on release for $649, sold it for $1,500, and bought a 3080 Ti for $1,500 last year.

20% more performance is a lot when we're talking about 4K+ gaming though, because none of the current cards feel overpowered for that task (in AAA games on high settings). I would say the 3080 Ti is the bare minimum for this, or better yet, the 3090 series.

At 1440p the 3060 Ti is "only" 20-25% slower than the 3080 :laughing: Is 20% a lot? Yes and no.

GA102 is really meant for 4K gaming, and the 3080 with its 10GB is on the low end here, both in terms of raw power and VRAM, especially if you plan to keep it for a few years. I would not recommend a 3080 10GB for 4K gamers, that is for sure. The 4000 series is coming in 4-6 months, and I bet even the 4060 Ti will beat the 3080: 5nm means almost double the core count plus higher clock speeds.

I have a 3080 Ti and a 3070, and the difference at 1440p is really not that huge. Going to 4K makes the performance gap much bigger; GA102 can only really spread its wings at 4K. Tons of cores and high-bandwidth memory shine at a high pixel count.

The reference 3080 is nearly 40% faster than the 3060Ti at 1440p. link

So, I'm not sure where you're finding your info, but even TS shows the same results for 1440p between a 3060Ti and 3080. link

As for a 3080Ti and 3070, the difference is almost as large as the 3060Ti vs the 3080. You're going to see a 30-35% performance increase going from the 3070 to the 3080Ti at 1440p or 4k.
 
And your point would be?
That for gaming purposes, 24GB of VRAM was, and still is, completely unnecessary in 2022, never mind in 2020 when the cards launched.

Sure, if you're part of the 0.001% of gamers who want to run Skyrim/Flight Simulators with 50 mods installed and 8K texture packs, you would feel like $1400 was well spent...I suppose.
 
The reference 3080 is nearly 40% faster than the 3060Ti at 1440p. link

So, I'm not sure where you're finding your info, but even TS shows the same results for 1440p between a 3060Ti and 3080. link

As for a 3080Ti and 3070, the difference is almost as large as the 3060Ti vs the 3080. You're going to see a 30-35% performance increase going from the 3070 to the 3080Ti at 1440p or 4k.

Check the 1440p difference with the newest drivers and an updated game selection.
The 3080 is nowhere near 40% faster than the 3060 Ti at 1440p.

Most will be CPU-limited at 1080p and 1440p using high-end GPUs. I know I am in most games, and I have both a 3070 and a 3080 Ti ;) 4K/UHD removes the CPU bottleneck almost completely, meaning the difference increases.

GA102 really shines at 4K+; sadly the 3080/10GB lacks the VRAM to age well at this res. The 3090 will age far better than the 3080, because 12-16GB is the VRAM you should have for 4K gaming going forward; 10GB will be enough in _some games_ but not all. 24GB will never be needed for gaming before the 3090 is obsolete anyway, but as I said earlier, some use it for work. Nvidia did not force gamers to buy the 3090...

6GB is fine for 1080p.
8GB is fine for 1440p.
12GB is fine for 2160p.

Today and for a few years into the future. 10GB was and is on the low side for a 4K-capable GPU without using DLSS or FSR. WITH those features enabled, even most 8GB cards will do 4K gaming just fine too.

I use a 3070 in my HTPC, and I have played Elden Ring on my OLED TV at a locked 60 fps at native 4K/UHD. VRAM usage was 4.5GB or so.
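If anyone wants to check their own games, here's a quick readout sketch (assuming the nvidia-ml-py package; keep in mind it reports allocated VRAM, which games often over-reserve, not what they strictly need):

```python
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

# Allocated VRAM, not "required" VRAM: many games reserve more
# than they actively use, so treat this as an upper bound.
mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
print(f"VRAM used: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")

pynvml.nvmlShutdown()
```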
 
12GB is fine for 2160p.

I have played just about every top game released since getting my 3080, at max settings, on my 4K monitor, and not once has 10GB VRAM been the limiting factor, not once.

Even the likes of Cyberpunk & Watch Dogs: Legion with every setting maxed, including ray tracing at 4K, top out at between 8.5GB and 9.25GB used.

Raster performance and memory bandwidth drop the framerate long before running out of VRAM does.
 
I have played just about every top game released since getting my 3080, at max settings, on my 4K monitor, and not once has 10GB VRAM been the limiting factor, not once.

Even the likes of Cyberpunk & Watch Dogs: Legion with every setting maxed, including ray tracing at 4K, top out at between 8.5GB and 9.25GB used.

Raster performance and memory bandwidth drop the framerate long before running out of VRAM does.
You must have used DLSS then, because a 3080 won't max those games with RT at native 4K unless you like 30-45 fps with drops below 30 fps.

Not even a 3090 Ti does a solid 60 in Cyberpunk at 4K maxed out with RT.

In TPU and TechSpot testing, a 3080 gets like 35 fps average, meaning dips way below 30 fps at times, which is not exactly what I personally would consider playable.

I will replace my 3080 Ti with a 4080 in about 6 months for proper 4K gaming, which for me means 60 fps minimum, preferably ~100 fps.
 
And your point would be?

I think the point is that 24GB of VRAM hasn't really been beneficial for gaming. And even though it helps in some professional work, it's a gaming GPU; it doesn't come with pro drivers like Nvidia's pro lineup does.

So that card is made for people that just have money to burn, and to increase Nvidia's margins. The 3080 12GB and 3080 Ti are both much smarter buys for the intended market. But most people that own 3090s will apply whatever mental gymnastics they need to defend that purchase.

And none of the current-generation cards are that great for 4K, since you need to use upscaling to get playable frame rates with RT on. The next gen coming out will finally give you 4K 60 fps without having to compromise.
 

Check the 1440p difference with the newest drivers and an updated game selection.
The 3080 is nowhere near 40% faster than the 3060 Ti at 1440p.

So you're saying that updated drivers have made the 3060Ti much faster and closer to 3080 performance, but the new drivers haven't made any difference for the 3080? Huh....

I don't see the 3060Ti on some of the newer game benchmarks from TPU, but the 3060Ti is roughly 10% faster than the 2080 on average.

If that's true, even if it's 15% faster than the 2080, the 3060Ti is still around 35-40% behind the 3080.

We'll take Total War: Warhammer III (pic)
1440p
RTX 2080 is 45.5fps
RTX 3080 is 73.7fps
RTX 3060Ti 10% over the 2080 = 50.1fps
That's a 47% difference between the 3060Ti and 3080
Even if the 3060Ti is 15% faster than the 2080 = 52.3fps
That's 41% difference between the 3080 and 3060Ti.


Next we'll look at Dying Light 2 (pic)
1440p
RTX 2080 is 62.8fps
RTX 3080 is 111.0 fps
RTX 3060Ti 10% over the 2080, so = 69.1fps
That's a 60% difference.
Even if it's pulling the same fps as the 6700XT at 80.2fps
That's still a 38% difference between the 3060Ti and the 3080.


How about God of War (pic) where they did use the 3060Ti as part of the testing?
3060Ti = 69.8fps
3080 = 104.7fps
That's a 50% difference between the two.

The benchmark tests used will also give different results between the sites you look at. As it stands, the 3080 is still 30-40% faster, on average, than the 3060Ti, regardless of what game or resolution you're using.
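For anyone who wants to verify the arithmetic, the percentages above are just fps ratios. Same math in a few lines of Python; the pct_faster helper is just for illustration, and the fps figures are copied from the charts linked above:

```python
# Percentage one card is faster than another, as used above.
def pct_faster(fast_fps: float, slow_fps: float) -> float:
    return (fast_fps / slow_fps - 1) * 100

# fps figures quoted from the linked charts
print(f"Warhammer III: {pct_faster(73.7, 50.1):.0f}%")   # ~47%
print(f"Dying Light 2: {pct_faster(111.0, 69.1):.0f}%")  # ~61%
print(f"God of War:    {pct_faster(104.7, 69.8):.0f}%")  # ~50%
```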
 