Asus TUF Gaming GeForce RTX 3080 OC Review

The fact that on an Intel system the 3080 gets 5-10% better FPS means one of two things: either it isn't being utilized to 100%, or something in the drivers/software is better optimized. Also, there are quite a few competitive players who want high FPS; where are they supposed to get their information? The argument doesn't hold... all the self-respecting tech outlets are still using an Intel rig because it is simply better for stressing the highest-end GPUs. It has nothing to do with fanboyism, prices or anything else.

@Lew Zealand : In a GPU review I want to see how many FPS I get in modern titles at all the common resolutions of today, which are 1080p, 1440p and 4K. To see how many FPS the 3080 can push, you need the best CPU there is, which is the 10900K. It is pretty logical, and this is why most other reviewers use the 10900K. As for AMD, we know they are usually 5-10% below Intel, sometimes closer and sometimes farther away, but in a GPU review you remove any bottleneck (CPU/RAM/whatever) to give the GPU the best possible platform. Get it?
Peace!
 

Yes, that makes total sense, but only if you play on the exact same system that was tested. Maybe yours matches the test system some site uses, but for the vast majority of people it doesn't. So the specific FPS numbers you see in reviews are not directly relevant to most users out there. That's why it's less useful to look at those specific numbers and more useful to look at the relative numbers. If this GPU is 60% faster in the same test conditions and my processor is similar enough, then I can expect roughly 60% more frames.

Except, of course, that most people's processors are different enough from the tested system that you can't even draw that conclusion. I have an i5-8400 and I expect my system to be thread-bottlenecked in some games, so even if I get a 3080 (which I'd like to, or a 3070), not only am I not going to see the same specific framerate, I can't even assume I'm going to get the same percentage FPS uplift that a 10900K sees.
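To put that relative-numbers reasoning in concrete terms, here is a minimal Python sketch; the function name, the FPS figures and the 110 fps CPU cap are all hypothetical and not taken from the review:

```python
# Minimal sketch (hypothetical numbers and function name) of estimating FPS from a
# review's measured uplift, and why a CPU limit can invalidate the estimate.

def expected_fps(current_fps, review_uplift, cpu_fps_cap=None):
    """Scale current FPS by the uplift a review measured, then clamp to a CPU-bound ceiling."""
    estimate = current_fps * (1 + review_uplift)
    if cpu_fps_cap is not None:
        estimate = min(estimate, cpu_fps_cap)
    return estimate

# A review shows the new GPU being 60% faster; my system currently does 80 fps in a title.
print(expected_fps(80, 0.60))                    # 128.0 fps if nothing else limits it
# If an older CPU tops out around 110 fps in that game, the real gain is much smaller.
print(expected_fps(80, 0.60, cpu_fps_cap=110))   # 110 fps
```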
 
While your relative-numbers argument is fine, it still doesn't cover the entire spectrum of expectations. I want to see if card X can hit 60/144/240 fps in a game, OK? The CPU is not the problem; I just want to see if this card is able to hit my target in some games. Especially for high refresh rate gamers this is important, and by using a slower CPU you actually show less potential than what a 3080 can do. Comparatives are good, but absolute performance is also very important, and that is what you miss by using an AMD platform.
 
In a GPU review I want to see how many FPS I get in modern titles at all the common resolutions of today, which are 1080p, 1440p and 4K.
I don't think any review can show how many FPS you will get. Steve's test system is used for benchmarking only, and specific in-game sections are used in the analysis. One could put together a similar system and still get different results.

The point is that whatever CPU or configuration is used, the platform is common across all tested products. More importantly, if the results show that there is a difference between two CPUs at a given resolution, it shows that the tested graphics card is not fully GPU-bound under those conditions.

One can see this in the 3080 FE review:

[Chart: CPU comparison at 1440p]


The 2080 Ti is almost GPU-bound at 1440p, when averaged across the tests, but it is entirely so at 4K:

[Chart: CPU comparison at 4K]


These results show that for this system, in those particular tests, both cards are only fully GPU-bound at 4K. Thus at something like 1080p the results will be so affected by the rest of the test platform that judging how well either of those cards performs in any other PC is open to broad influence from a significant number of variables.
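As a rough illustration of that check (the function, the 1% tolerance and all FPS figures below are hypothetical, not the review's data), one could compare the same card's averages on two platforms:

```python
# Sketch of the GPU-bound check described above, using made-up averages.
# If the same card's average FPS barely changes between two CPUs, it is GPU-bound there.

def is_gpu_bound(fps_cpu_a, fps_cpu_b, tolerance=0.01):
    """Return (bound, gap): the card counts as fully GPU-bound if the gap is <= tolerance."""
    gap = abs(fps_cpu_a - fps_cpu_b) / max(fps_cpu_a, fps_cpu_b)
    return gap <= tolerance, gap

print(is_gpu_bound(150, 147))  # 1440p example: 2% gap -> (False, 0.02), not fully GPU-bound
print(is_gpu_bound(90, 90))    # 4K example: 0% gap -> (True, 0.0), fully GPU-bound
```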
 
While your relative-numbers argument is fine, it still doesn't cover the entire spectrum of expectations. I want to see if card X can hit 60/144/240 fps in a game, OK? The CPU is not the problem; I just want to see if this card is able to hit my target in some games. Especially for high refresh rate gamers this is important, and by using a slower CPU you actually show less potential than what a 3080 can do. Comparatives are good, but absolute performance is also very important, and that is what you miss by using an AMD platform.

Part of your problem is that you don't understand what the performance data in these reviews is showing you. It's NOT a performance guide for that particular game; the frame rate will likely vary quite a bit beyond what the graphs show. The point is to show the performance difference between GPUs, and that margin will remain fairly consistent.

I'm also not sure why you keep talking about low-resolution testing for a $700 US GPU review; those numbers weren't included for obvious reasons. I'll try this again: the difference at 1440p is 1-2%, so less than the difference you'll see between some AIB cards. The difference at 4K is 0%, as in zero percent, as in you're being a complete boob over a zero percent margin at the focus resolution.
 
I simply don't understand why you keep saying what people want. You should do what people actually want, not what you believe people want. I want to see GPUs, no matter the budget, being tested at 1080p as well, and I want a platform that doesn't bottleneck them. Simple.
You keep using an AMD system, and I'll come back here to read reviews only when AMD is better than Intel for gaming. Until then, there are many other sites with reviews that actually put the GPU in the best light possible. Peace!
 

I gave people what they voted for because it has no impact on our findings. You're being unreasonable and entitled; there is no way you should object this hard to a change that has no effect on you or the results. Any reasonable person wouldn't have an issue with this, and tens of thousands don't. You're in a very small minority, largely occupied by fanboys (or maybe you have Intel investments; either way, I don't care).

You know what makes this all the more ridiculous? If we didn't list the CPU tested and instead just showed the results (the overall margins and cost per frame), you'd have no issue with the review. As far as you'd be concerned, our conclusion was in line with those from other trusted media outlets and our margins/cost per frame were virtually identical.
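For anyone unfamiliar with the cost-per-frame metric mentioned above, it is simply price divided by average FPS. A minimal sketch follows; the prices are launch MSRPs, while the FPS figures are made up rather than taken from the review:

```python
# Cost-per-frame sketch: price divided by average FPS.
# Prices are launch MSRPs; the FPS figures are hypothetical, not taken from the review.

def cost_per_frame(price_usd, avg_fps):
    return price_usd / avg_fps

cards = {"RTX 3080": (699, 150), "RTX 2080 Ti": (1199, 112)}
for name, (price, fps) in cards.items():
    print(f"{name}: ${cost_per_frame(price, fps):.2f} per frame")
```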
 
To be clear, which model was tested?

Asus has two model numbers listed on their site, TUF-RTX3080-10G-GAMING and TUF-RTX3080-O10G-GAMING. The "OC" in the review title leads me to believe this is the O10G, which is the higher-clocked version, but that card is $749, not $699 as mentioned in the review.
 
I gave people what they voted for because it has no impact on our findings. You're being unreasonable and entitled; there is no way you should object this hard to a change that has no effect on you or the results. Any reasonable person wouldn't have an issue with this, and tens of thousands don't. You're in a very small minority, largely occupied by fanboys (or maybe you have Intel investments; either way, I don't care).
I must apologize, since I didn't know people voted for the AMD system. If that is the case, then I retract everything I said. As you saw in your latest article, using a 10900K does hold some advantages, and given that it can be pushed even further in frequency, it is still the king of the hill in gaming, even though it is not the CPU to buy or to recommend from a bang-for-buck point of view. Peace!
 
"Asus also points out that this model features a very robust stainless steel I/O bracket which they say protects against rust while providing a more durable and secure mount. Can’t say we’ve ever had an issue with the standard steel brackets, but if you have, well this will be a welcomed feature."

This metal is also used because of its low coefficient of thermal conductivity, so the outputs don't melt.
 
I simply don't understand why you keep saying what people want. You should do what people actually want, not what you believe people want. I want to see GPUs, no matter the budget, being tested at 1080p as well, and I want a platform that doesn't bottleneck them. Simple.
You keep using an AMD system, and I'll come back here to read reviews only when AMD is better than Intel for gaming. Until then, there are many other sites with reviews that actually put the GPU in the best light possible. Peace!
I don't know what your problem is. Steve Walton is one of the best video card benchmarkers in the world, and it's not like the results here are at odds with the results at other sites like Tom's Hardware, Gamers Nexus, Guru3D and the like. I don't know who you are, and in a way, that's the point: who the hell are you to come in here and troll someone like Steve Walton? The reason Steve knows what people want is that he conducts polls. His viewers voted on whether to use an Intel Core i9-10900K or an AMD Ryzen 9 3950X, and the AMD CPU won in an astonishing 83%-17% AVALANCHE (landslide doesn't do it justice). I'm sorry if little Intel fanboys like you are butthurt about it (actually, that's a lie).

The respect that Steve Walton has from us enthusiasts doesn't come from being a whiny baby like you're being now. Steve EARNED our respect with countless hours and countless dollars spent bringing us honest reviews of products. Whiny NOOBS like you have no idea how tedious it must be to run the same thing over and over and over again and then spend the better part of a day scripting his Hardware Unboxed video and doing hours of editing for a twenty-minute video with the polish that is a hallmark of his work. Hell, he was probably doing this before you even knew what a PC was.

If you have a "better" way, then I suggest you put up or shut up because this "discussion" that you're having with him makes you look like a kid who got good grades in high school physics trying to debate with Stephen Hawking himself.
 
Steve, you're really spoiling us with the quality of your reviews. People are going to start expecting perfection every time.

I wanted to ask if you've heard the talk about custom AIB RTX 3080s crashing at around the 2GHz mark. I wonder if it's just certain cards or if it's just certain situations. If you want to know more, Chris at "The Good Ol' Gamer" was talking about it on YT.

Here's a joke for ya:

Tourist in Australia: Excuse me, can you tell me where to find the bush?
Steve Walton: Yeah, I reckon you can find it on Tim's upper lip!

CHEERS! :p
 
However, that was just part of the reason why we went with the 3950X...
I certainly don't agree with the posters who fail to see the reason why you chose to test with both Intel and AMD. However, I do still wonder about the choice of the 3950X rather than the better-performing (in these benches) and much cheaper 3900XT or 3800XT. I assume it was because choosing anything else would have led to complaints about not using the "top of the line" model.
 
I don't know what your problem is. Steve Walton is one of the best video card benchmarkers in the world, and it's not like the results here are at odds with the results at other sites like Tom's Hardware, Gamers Nexus, Guru3D and the like. I don't know who you are, and in a way, that's the point: who the hell are you to come in here and troll someone like Steve Walton? The reason Steve knows what people want is that he conducts polls. His viewers voted on whether to use an Intel Core i9-10900K or an AMD Ryzen 9 3950X, and the AMD CPU won in an astonishing 83%-17% AVALANCHE (landslide doesn't do it justice). I'm sorry if little Intel fanboys like you are butthurt about it (actually, that's a lie).

The respect that Steve Walton has from us enthusiasts doesn't come from being a whiny baby like you're being now. Steve EARNED our respect with countless hours and countless dollars spent bringing us honest reviews of products. Whiny NOOBS like you have no idea how tedious it must be to run the same thing over and over and over again and then spend the better part of a day scripting his Hardware Unboxed video and doing hours of editing for a twenty-minute video with the polish that is a hallmark of his work. Hell, he was probably doing this before you even knew what a PC was.
I already apologized... it was the popular choice that made him move to the AMD system. Also, let's not evangelize tech reviewers. They are clearly more knowledgeable than the users, but they aren't gods in the tech world and they can also get things wrong, so feedback is very important. Not being able or willing to listen to feedback is not something a smart person would do, so I advise you to think about that.
 
While your relative-numbers argument is fine, it still doesn't cover the entire spectrum of expectations. I want to see if card X can hit 60/144/240 fps in a game, OK? The CPU is not the problem; I just want to see if this card is able to hit my target in some games. Especially for high refresh rate gamers this is important, and by using a slower CPU you actually show less potential than what a 3080 can do. Comparatives are good, but absolute performance is also very important, and that is what you miss by using an AMD platform.
Okay, I get your point. I think what you mean is that this review is important but not comprehensive enough, and that another review aimed at testing the crap out of the GPU using the highest-performing gaming CPU, such as the Intel Core i9-10900K, should be made. All I can tell you is that you needn't complain; I think the better approach would have been to ask whether such a review will be made soon. These guys enjoy benchmarking, and I'm sure that once they get the right equipment, they'll use the Intel Core i9-10900K to test the GPU at 100%.
 
Because there are those who can afford the highest of specs. So they need your attention too. Right?
 
I have a pre-order in on the Asus TUF 3090 non-OC, and it's physically pretty much identical to the 3080 version. Temperatures should be a hair higher, and gaming performance only 10% higher until drivers make better use of the ridiculous shader count, but otherwise it's basically the same card.

I'm hoping the 3090 version exhibits the same trait of running cooler than the Founders Edition that the 3080 has. I do know that I'm excited to have snagged the TUF version, even if it's the non-OC.
Do you know if there are any differences between the two TUF versions PCB-wise? I want to add the non-OC card to my watercooling setup, but blocks for the non-OC card are hard to come by.
 

Unfortunately I won't be able to answer, as I ended up getting the PNY RTX 3090 XLR8 Epic-X instead. PNY made me an offer I couldn't refuse: they literally called me and took my order over the phone, and I paid wholesale. I still had to pay the full MSRP of $1,499.99, but no tax or shipping. As I said, an offer I couldn't refuse.

The story of WHY that all happened is a long one, but suffice it to say, I cancelled the Asus TUF pre-order on 3090 launch day, as my order was already in PNY's system 10 minutes after launch time.

PNY is a top choice for me as well, though. They are the only company Nvidia allows to make Quadro, Titan, and Tesla cards, and all of Nvidia's reference designs are designed by PNY. They are never the top overclocker choice, but they are ALWAYS the top rock-solid stability choice. The recent Ampere capacitor incident is a prime example: PNY cards were unaffected by the crashes hitting Zotac, MSI, Asus, and EVGA owners. Founders Edition and PNY/Palit/Gainward cards had zero crash reports. Reference stability has its advantages :p
 
Thanks. FYI, in case anyone out there needs that info: there is no difference between the two PCB-wise; the only difference is a small clock boost on the OC variant.
 