Asus TUF Gaming GeForce RTX 3080 OC Review

It's interesting to see that, despite Nvidia's clever design and engineering with the bespoke PCB structure, new connector, and twin fan layout, nothing really beats big chunks of metal and lots of fans.

I know the heat released is exactly the same, but I'll take a cooler chip any day of the week.
 
It's interesting to see that, despite Nvidia's clever design and engineering with the bespoke PCB structure, new connector, and twin fan layout, nothing really beats big chunks of metal and lots of fans.

It was just a gimmicky design the whole time.
 
I have a pre-order in on the Asus TUF 3090 non-OC, and it's physically pretty much identical to the 3080 version. Temperatures should be a hair higher, and gaming performance only 10% higher until drivers better optimize for the ridiculous shader count, but otherwise it's basically the same card.

I'm hoping the 3090 version exhibits the same positive trait of running cooler than the Founders Edition that the 3080 has. I do know that I'm excited to have snagged the TUF version, even if it's the non-OC.
 
It's interesting to see that, despite Nvidia's clever design and engineering with the bespoke PCB structure, new connector, and twin fan layout, nothing really beats big chunks of metal and lots of fans.

I know the heat released is exactly the same, but I'll take a cooler chip any day of the week.

Did you compare temps inside the case? If I remember correctly, the FE exhausts 50% of the hot air out of the case; I'm not sure if Asus' card does this.

If not, wouldn't that mean increased case temps over the FE, as the heat needs to go somewhere?
 
Did you compare temps inside the case? If I remember correctly, the FE exhausts 50% of the hot air out of the case; I'm not sure if Asus' card does this.

If not, wouldn't that mean increased case temps over the FE, as the heat needs to go somewhere?
You'd have to ask Steve if he did. The Asus is, of course, a traditional 'cook-your-case' graphics card, whereas Nvidia's design is roughly half and half, as you pointed out.

It would be a good exercise to use a thermal camera to monitor the difference in air temperatures coming from the two fan exhausts to see which of the two is expelling the most heat.

Personally, I prefer the FE design over the Asus one - it's more compact, it looks nicer (obviously a very subjective thing) and I appreciate the fan design from an engineering perspective. However, I'd take the hit on case temperatures for lower chip temperatures every time.
 
So far among the several AIB cards I've seen, it seems there isn't much OC headroom left with the power limits in place. That's disappointing, although with standard power draw running consistently at well over 300 watts, allowing something significantly higher might be...fun. Lol
 
Is there a plan to review Red Dead Redemption 2? I would really like to see how the 3080 handles the overall Rockstar client + RDR2 protection, and whether it can reach 80-100 FPS on Ultra (which would be huge).
 
Chip made by Samsung, so don't expect it to overclock like TSMC's 12FFN.
The Vega 10 chip was technically made by GlobalFoundries, using licensed libraries from Samsung. But you're right in that it wasn't the best for overclocking.
 
Did you compare temps inside the case? If I remember correctly, the FE exhausts 50% of the hot air out of the case; I'm not sure if Asus' card does this.

If not, wouldn't that mean increased case temps over the FE, as the heat needs to go somewhere?

It makes no difference to case thermals that we could measure.

I tested the 3950X with an AIO and with the NH-D15, and with both cards I saw the exact same CPU temperatures.
 
Can you undervolt these to allow a higher boost while staying under the power cap? You don't seem to have a boost-frequency chart to show whether the power limit is preventing you from achieving your OC speeds during benchmarks, and given the power draw of this card, I wonder whether it's one of those cards that improves performance with an undervolt.
 
The non-OC is listed for $700, the OC version you reviewed is listed for $750. B&H is selling both for even more, $780 for the non-OC and $800 for the OC version, so I won't list them here.

OC-version:
ASUS store - https://store.asus.com/us/item/202009AM160000001/
Newegg - https://www.newegg.com/asus-geforce-rtx-3080-tuf-rtx3080-o10g-gaming/p/N82E16814126452

Non-OC version:
ASUS store - https://store.asus.com/us/item/202009AM150000004/
Newegg - https://www.newegg.com/asus-geforce-rtx-3080-tuf-rtx3080-10g-gaming/p/N82E16814126453
 
How can you, as a professional, self-respecting reviewer, not see the bottleneck this card gets when not using a better CPU like the 10900K? I mean, why should I read a review that shows 90% of a card's potential? I'm totally an AMD fan, but that doesn't mean we shouldn't give credit where it's due, and the simple fact is that Intel CPUs are needed to show the full potential of the 3080, so using an AMD CPU in your reviews is a bad decision and detracts from giving your audience quality results. If that is fine by you, then well, nothing more to say.
And please don't come saying that the gaming results are the same between Intel and AMD, because they just aren't. There are multiple reviews that show a 10%+ difference at 1080p and 5-10% at 1440p in favor of Intel.
 
How can you, as a professional, self-respecting reviewer, not see the bottleneck this card gets when not using a better CPU like the 10900K? I mean, why should I read a review that shows 90% of a card's potential? I'm totally an AMD fan, but that doesn't mean we shouldn't give credit where it's due, and the simple fact is that Intel CPUs are needed to show the full potential of the 3080, so using an AMD CPU in your reviews is a bad decision and detracts from giving your audience quality results. If that is fine by you, then well, nothing more to say.
And please don't come saying that the gaming results are the same between Intel and AMD, because they just aren't. There are multiple reviews that show a 10%+ difference at 1080p and 5-10% at 1440p in favor of Intel.
Because the Patreon supporters of "Hardware Unboxed", which the above is a transcription of, voted for the AMD platform (and he who pays the piper calls the tune). Anyway, it doesn't matter when you are benchmarking unless you are severely bottlenecked by the CPU (e.g. at 1080p and below), because the point of GPU benchmarking is to compare the GPUs while keeping all other things equal. These are not system benchmarks or CPU benchmarks.
 
It chews up the 2080 Ti and spits it out, especially considering the ridiculous price the 2080 Ti was asking.

And it pays to wait and see more third-party boards come out for comparison, rather than jumping to buy the first card that's released.
 
How can you, as a professional, self-respecting reviewer, not see the bottleneck this card gets when not using a better CPU like the 10900K? I mean, why should I read a review that shows 90% of a card's potential? I'm totally an AMD fan, but that doesn't mean we shouldn't give credit where it's due, and the simple fact is that Intel CPUs are needed to show the full potential of the 3080, so using an AMD CPU in your reviews is a bad decision and detracts from giving your audience quality results. If that is fine by you, then well, nothing more to say.
And please don't come saying that the gaming results are the same between Intel and AMD, because they just aren't. There are multiple reviews that show a 10%+ difference at 1080p and 5-10% at 1440p in favor of Intel.

You're very poorly researched. The 3950X is on average 5% slower than the 10900K at 1440p in our 14 game sample when using the RTX 3080. It was also 4% slower with the RTX 2080 Ti.

TechPowerUp found the RTX 3080 to be 52% faster than the 2080 at 1440p, based on a 23 game sample, and they used a 9900K clocked at 5 GHz with DDR4-4000 memory. That is very much in line with our own data: we found the 3080 to be on average 49% faster than the 2080. Looking around at all the usual suspects, it seems pretty unanimous that overall the RTX 3080 is about 50% faster than the RTX 2080 at 1440p and about 70% faster on average at 4K.

Because the Patreon supporters of "Hardware Unboxed", which the above is a transcription of, voted for the AMD platform (and he who pays the piper calls the tune). Anyway, it doesn't matter when you are benchmarking unless you are severely bottlenecked by the CPU (e.g. at 1080p and below), because the point of GPU benchmarking is to compare the GPUs while keeping all other things equal. These are not system benchmarks or CPU benchmarks.

That's not correct, we didn't ask Patreon members. We created a poll on our YouTube channel and 83% of the 62,000 people who voted wanted to see results with the 3950X.

However, that was just part of the reason why we went with the 3950X.

At the end of the day you're talking about a 1-2% difference in the results overall, and that same margin applies to all GPUs tested on the platform.
 
Please have some content on the HDMI 2.1 upgrade. Forbes is reporting that people are having issues on LG OLEDs. There is limited content on this.
 
You're very poorly researched. The 3950X is on average 5% slower than the 10900K at 1440p in our 14 game sample when using the RTX 3080. It was also 4% slower with the RTX 2080 Ti.
And why exactly do you think it is fair to your readers to hide even that 4%? What about the 10% at 1080p? I am all for AMD, as I said: they have great CPUs, great prices, a great platform, etc. But again, the purpose of a GPU review is to test that GPU without any other bottleneck, which is what you and other reviewers don't do.
If you think that is fine, I have nothing else to say.
 
And why exactly do you think it is fair to your readers to hide even that 4%? What about the 10% at 1080p? I am all for AMD, as I said: they have great CPUs, great prices, a great platform, etc. But again, the purpose of a GPU review is to test that GPU without any other bottleneck, which is what you and other reviewers don't do.
If you think that is fine, I have nothing else to say.

As long as the GPU is utilized close to 100%, there is no bottleneck (nor any other kind of neck). And the percentage increase relative to a known quantity remains the same no matter the CPU: the raw FPS numbers may vary, but the percentage stays the same, or close to it. And 1080p, seriously? People want to play at 4K and you ask for 1080p? That's for budget gamers or CS:GO players who need high frame rates. No self-respecting gamer gets a 3080 to play below 1440p nowadays.
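The ratio argument above can be sketched numerically. This is a toy model with made-up frame rates, not measured data: delivered FPS is capped by whichever component saturates first, so as long as the CPU cap sits above both GPUs' limits, the relative uplift between GPUs is unchanged regardless of which CPU is used.

```python
# Toy model: delivered frame rate is bounded by the slower component.
# All numbers below are illustrative, not benchmark results.

def fps(gpu_limit, cpu_limit):
    """Frame rate delivered when the GPU and CPU each impose a cap."""
    return min(gpu_limit, cpu_limit)

# Hypothetical GPU-bound limits at 1440p (frames per second).
OLD_GPU = 80
NEW_GPU = 120

# Two CPUs with different caps, both above the GPU limits (GPU-bound case).
for cpu_name, cpu_cap in [("faster CPU", 200), ("slower CPU", 190)]:
    uplift = fps(NEW_GPU, cpu_cap) / fps(OLD_GPU, cpu_cap) - 1
    print(f"{cpu_name}: new GPU is {uplift:.0%} faster")  # 50% either way

# At a CPU-bound resolution (e.g. 1080p with a low cap) the gap compresses:
uplift = fps(NEW_GPU, 100) / fps(OLD_GPU, 100) - 1
print(f"CPU-bound: new GPU is {uplift:.0%} faster")  # 25%, not 50%
```

The point of the sketch: GPU-to-GPU percentages only distort when the CPU cap falls below a GPU's limit, which is the severe-bottleneck case mentioned above.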
 
And why exactly do you think it is fair for your readers to hide even that 4%? What about the 10% at 1080? I am all for AMD, as I said, they have great CPUs, great prices, great platform, etc. But again, the purpose of a GPU review is to test that GPU without any other bottleneck, which is what you and other reviewers don't do.
If you think that is fine, I have nothing else to say.

First off, this card is a waste at 1080p. Just a baseline there. It is not worth the money and you should be going to a 3070 (assuming that's also not a waste) and I'm sure you've read other reviews using Intel CPUs which have concluded exactly the same. The 1080p argument is irrelevant with this card.

Secondly, if the relative numbers at 1440p are 4% lower with an AMD CPU across all GPUs, then the comparison of *GPUs* is equally relevant with either CPU, as long as you keep everything else constant. This is not a CPU review; it's a GPU review. Go read a CPU review and enjoy the 4% uplift with an Intel CPU.

Thirdly, TS/HUB is giving you GPU reviews with AMD CPUs which nobody (?) else is doing. That's *more* data, not less. So you can make a more educated decision when building a system. Go read the other reviews for Intel CPU-based GPU comparisons and read TS/HUB for AMD CPU-based GPU comparisons.

More data is good. Take advantage of it.
 