GeForce RTX 3080 vs. Radeon RX 6800 XT: 50 Game Benchmark

That is a valid real-world observation and holds more merit than fanboys arguing over a 5-10 fps difference.
Or go to YouTube and see how the thumbnails always have an Nvidia GPU present, or how, somehow, there is an RTX GPU visible somewhere in the video even when the video isn't about GPUs.


How do you like OLED gaming?
It's awesome. No lag, great colors, and the blacks are truly black!

Now, this particular TV has left me with a bad impression of LG.

For example, the hardware is more than capable of supporting FreeSync, yet somehow LG only added G-Sync support via a software update and said they will never add FreeSync.

Or how the OS version will always be stuck at 5.xx.xx.

So they can force you to buy a new one if, for example, an app decides to start requiring webOS 6.xx.

So my next TV will most likely be from someone else but using the same LG panels, since it looks like only LG makes them.
 
I'd say the only reason the Radeon is even in the running at all is its higher clock speeds, with some games favouring those much more than anything else.

Run both these cards at the same clock speeds and I bet the 3080 would walk all over the 6800, whether that's running the 6800 at the lower Nvidia clocks or the 3080 (if it were possible) at 6800-level speeds.

Those higher clocks on the 6800 disguise just how inefficient their RDNA2 architecture still is compared to Nvidia's.


That's quite a dumb comparison. I mean, they're two different architectures altogether. The 6800 XT has pretty much the same performance and draws less power too, all with a higher clock rate.

The 3080 isn't capable of those frequencies at all. Obviously downclocking the 6800 XT to 3080 speeds would slow it down, but that's irrelevant. If I slowed the clock speed on my Ryzen 5600X down to match a chip that maxes out at 3 GHz, what would that prove? It would be meaningless, since my chip runs at 4.4 GHz no problem and the other chip can't.
 
It might be more interesting once game engines are built around the new consoles; in two years we might see games run significantly better on RDNA2 than on Ampere thanks to the PS5 and XSX. Currently a lot of games are built on UE4, which favours Nvidia, or on engines built around GCN for the PS4.
 
I'd say the only reason the Radeon is even in the running at all is its higher clock speeds, with some games favouring those much more than anything else.

Run both these cards at the same clock speeds and I bet the 3080 would walk all over the 6800, whether that's running the 6800 at the lower Nvidia clocks or the 3080 (if it were possible) at 6800-level speeds.

Those higher clocks on the 6800 disguise just how inefficient their RDNA2 architecture still is compared to Nvidia's.
I never heard anyone make these arguments when AMD had lower clocks... The idea that they should run at equal clock speeds is baseless. Can the Nvidia cards hold higher clocks in the first place? No. Holding higher clocks while consuming less power is an architectural advantage, not a disadvantage.
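To put some rough numbers on why a clock-for-clock comparison doesn't tell you much, here's a quick sketch. The fps and clock figures below are made up purely for illustration, not taken from any benchmark:

```python
# Hypothetical per-clock comparison. The fps and clock numbers below are
# invented for illustration only; real results depend on shader counts,
# memory bandwidth, cache, drivers and the game itself.

cards = {
    "RTX 3080 (hypothetical)":   {"avg_fps": 100, "game_clock_mhz": 1900},
    "RX 6800 XT (hypothetical)": {"avg_fps": 100, "game_clock_mhz": 2250},
}

for name, c in cards.items():
    per_clock = c["avg_fps"] / c["game_clock_mhz"]
    print(f"{name}: {c['avg_fps']} fps at {c['game_clock_mhz']} MHz "
          f"-> {per_clock:.3f} fps per MHz")

# Dividing by clock speed makes the lower-clocked card look "less efficient
# per MHz", but neither card can actually run at the other's clocks, so the
# number a buyer cares about is the fps each card delivers as shipped.
```

The clocks a card can sustain within its power budget are part of the architecture, not something you can subtract out.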

 
I hope one day TechSpot does its own research instead of copying everything done by other reviewers without even referencing them or giving them any credit.
Does a YouTube channel named Hardware Unboxed sound familiar to you guys?
 
I hope one day TechSpot does its own research instead of copying everything done by other reviewers without even referencing them or giving them any credit.
Does a YouTube channel named Hardware Unboxed sound familiar to you guys?
FYI, the author of this review, Steve, is also the creator of the Hardware Unboxed YouTube channel, so no problem here.
 
I'd say the only reason the Radeon is even in the running at all is its higher clock speeds, with some games favouring those much more than anything else.

Run both these cards at the same clock speeds and I bet the 3080 would walk all over the 6800, whether that's running the 6800 at the lower Nvidia clocks or the 3080 (if it were possible) at 6800-level speeds.

Those higher clocks on the 6800 disguise just how inefficient their RDNA2 architecture still is compared to Nvidia's.
I'm not sure why that would matter. The end result is the end result, and the 6800 XT is less expensive and easier to find. The only disadvantage is the ray tracing.
 
I couldn't imagine buying a flagship GPU for over $1000 and not getting good ray tracing support and DLSS. In particular, DLSS is awesome; I use it all the time. In some games it even makes the game look better, and in most games it makes motion look much clearer, with none of the trails you get from TAA. The Radeon is odd: it seems best at 1080p, but I couldn't imagine spending this much money on a 1080p card, it would cost like 5 times as much as the monitor itself!

Also, after years of being infuriated with AMD's driver support I don't want to go Radeon again for at least a little while. GeForce all the way for me, until they inevitably let me down. But it's been 3 years back on Nvidia and so far it's been vastly superior in terms of driver support to the previous 8 years I spent using Radeon cards.
You know, I have heard this a good bit, so I know there is probably some truth to it. That said, my last cards have been a 580, Vega 56, 5700 XT and now the 6800 XT, and I have had almost no problems. I think the Vega had a weird green-screen driver issue every once in a while, but it was pretty rare. Overall I have had no real issues... but I don't generally buy cards right at launch either.
I think "vastly superior" is a bit of an exaggeration.
I wanted to go with Nvidia this last upgrade for the RTX features, but I have been pretty impressed with my 6800 XT so far. I'm really curious to know what AMD is doing in terms of ray tracing support and whether the 6000-series cards will be included in that support.
If not, I may go with Nvidia next round... I have no brand loyalty.
 
That could all have been said in one line: "I prefer Nvidia."



This is done on purpose on the majority of tech sites. You will see colourful wording that just strokes the fanboys and brings in more eyeballs and clicks.

The majority of people will not be able to tell the difference when both GPUs are giving playable performance. And I don't know many people who play games staring at an fps counter instead of playing the game.
You really think the majority of people can't tell the difference between a 3090 and a 6900 XT in, let's say, Cyberpunk? I really can't see how that could be the case.
 
You really think the majority of people can't tell the difference between a 3090 and a 6900 XT in, let's say, Cyberpunk? I really can't see how that could be the case.
I'd say... yes... I suspect most wouldn't be able to tell the difference between TONS of video cards... kind of like how, unless you are a wine connoisseur, you can't tell the difference between a $50 bottle of wine and a $300 one...
 
I'd say... yes... I suspect most wouldn't be able to tell the difference between TONS of video cards... kind of like how, unless you are a wine connoisseur, you can't tell the difference between a $50 bottle of wine and a $300 one...
Yes, but we are talking about the specific crowd that actually buys $1,000+ GPUs. We are not talking about the average Joe.

One card gets 100 fps at 1440p with RT on ultra and the other one... doesn't.
 
Yes, but we are talking about the specific crowd that actually buys $1,000+ GPUs. We are not talking about the average Joe.

One card gets 100 fps at 1440p with RT on ultra and the other one... doesn't.
Just because you have money doesn't mean you can tell the difference... I would love to see an actual study - but if other products can be used as a guide, I'm pretty sure I'm right.

I recall when they did a study with "audiophiles" - and they couldn't tell the difference between an expensive hi-fi system and an iPod...
 
You really think the majority of people can't tell the difference between a 3090 and a 6900 XT in, let's say, Cyberpunk? I really can't see how that could be the case.
I don't have much else to add here; Squid Surprise is on the ball.

Yes, most people will not be able to tell the difference between a game at 80 fps vs. 100 fps unless they have a counter up on screen. And you trying to cherry-pick Cyberpunk with ray tracing doesn't represent the whole market.

I've been stressing this online forever, but most people just seem to want to argue over 10 fps differences. At the end of the day, playable vs. non-playable performance is all that matters.
 
I don't have much else to add here; Squid Surprise is on the ball.

Yes, most people will not be able to tell the difference between a game at 80 fps vs. 100 fps unless they have a counter up on screen. And you trying to cherry-pick Cyberpunk with ray tracing doesn't represent the whole market.

I've been stressing this online forever, but most people just seem to want to argue over 10 fps differences. At the end of the day, playable vs. non-playable performance is all that matters.
It's not cherry-picking; the reason you would buy a 3090 over a 6900 XT IS RT and DLSS. Of course, if you don't use them you can't tell the difference, but the whole reason for buying the 3090 is to use those specific features. So yes, I stand by what I said. And no, the difference when DLSS is used is definitely not 80 to 100 fps. It's more like 35 to 80. That's the actual difference at 4K.

So I guess we are kind of in agreement: if I wasn't using RT and DLSS I couldn't tell my 3090 from a 6900 XT, but then I wouldn't have bought a 3090 in the first place; I would get the 6800 XT.
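Just to show why that gap matters more than an 80-vs-100 comparison, here is some simple arithmetic on the rough numbers quoted above (nothing more than that):

```python
# Frame-time arithmetic on the rough fps figures from this thread (35, 80,
# 100 fps). Purely illustrative; the inputs are the numbers quoted above,
# not measurements.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent rendering each frame at a given fps."""
    return 1000.0 / fps

for low, high in [(80, 100), (35, 80)]:
    uplift = (high / low - 1) * 100
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} fps: {uplift:.0f}% faster, "
          f"about {saved:.1f} ms shaved off every frame")
```

80 to 100 fps is a 25% bump and about 2.5 ms per frame; 35 to 80 fps is a 129% bump and roughly 16 ms per frame, which is the kind of difference you feel without a counter on screen.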
 
I recall when they did a study with "audiophiles" - and they couldn't tell the difference between an expensive hi-fi system and an iPod...
That is to be expected. Why would there be a difference between an iPod and an expensive hi-fi? DACs are pretty much down to taste; a $200 DAC is as good at reproducing analog sound as a $5k DAC. The difference is in amps, and only if your amplifier doesn't have enough power to drive whatever you are trying to drive, but that's about it. A $10k amp + DAC combo will be 99% identical to a $500 amp + DAC combo, assuming you are not trying to drive something extraordinarily hard (low-sensitivity / high-impedance headphones).

So yeah, one comparison doesn't tell us anything about another. You are basically saying you can't see RT at all and you can't tell the difference between 50 and 100 fps (that's the difference DLSS makes). Um... okay.
 
It's not cherry-picking; the reason you would buy a 3090 over a 6900 XT IS RT and DLSS. Of course, if you don't use them you can't tell the difference, but the whole reason for buying the 3090 is to use those specific features. So yes, I stand by what I said. And no, the difference when DLSS is used is definitely not 80 to 100 fps. It's more like 35 to 80. That's the actual difference at 4K.

So I guess we are kind of in agreement: if I wasn't using RT and DLSS I couldn't tell my 3090 from a 6900 XT, but then I wouldn't have bought a 3090 in the first place; I would get the 6800 XT.
I agree it SHOULD be the reason to buy a 3090... but it ISN'T for most people... most people buy it either:
a) because it's the "best" and they must have the "best", or
b) as a status symbol - kind of like having a Mercedes or a Porsche...

The VERY FEW who buy a 3090 for your specific purpose are in the minority...

I actually think the best reason for getting a 3090 is if you're too cheap to pay for a Quadro and want equivalent performance on the cheap :)
That is to be expected. Why would there be a difference between an iPod and an expensive hi-fi? DACs are pretty much down to taste; a $200 DAC is as good at reproducing analog sound as a $5k DAC. The difference is in amps, and only if your amplifier doesn't have enough power to drive whatever you are trying to drive, but that's about it. A $10k amp + DAC combo will be 99% identical to a $500 amp + DAC combo, assuming you are not trying to drive something extraordinarily hard (low-sensitivity / high-impedance headphones).

So yeah, one comparison doesn't tell us anything about another. You are basically saying you can't see RT at all and you can't tell the difference between 50 and 100 fps (that's the difference DLSS makes). Um... okay.
There IS a difference between an iPod and an expensive hi-fi system... but only a VERY small % of people will notice it... It's the same way most people in a movie theatre won't notice if one of the speakers is busted or has been calibrated incorrectly (I have a friend who always notices and gets us free movie tickets after complaining - I never notice a thing).
 
An additional factor to consider: do you have an expensive G-Sync-only screen that you're happy with and not planning to replace? Then an Nvidia card is the only option for you...
 
I do. After over 8 years on Radeon I definitely prefer Nvidia. Radeon sucks and I'm never paying for another Radeon card again. The things are snake oil.
Well, I've been using Radeons for over 20 years with no major issues. So we can agree to disagree.

It's not cherry-picking; the reason you would buy a 3090 over a 6900 XT IS RT and DLSS. Of course, if you don't use them you can't tell the difference, but the whole reason for buying the 3090 is to use those specific features. So yes, I stand by what I said. And no, the difference when DLSS is used is definitely not 80 to 100 fps. It's more like 35 to 80. That's the actual difference at 4K.

So I guess we are kind of in agreement: if I wasn't using RT and DLSS I couldn't tell my 3090 from a 6900 XT, but then I wouldn't have bought a 3090 in the first place; I would get the 6800 XT.
If one has a library with games that support RT, sure. Out of my library I think I have three games that support it and everything else does not. So for me there was no reason to look at a 3090 and its high price. The number of people with a 3090 gaming at 4K is tiny compared to everyone else, and DLSS is required at 4K because a 3090 isn't powerful enough to run that resolution with RT on without it.

I myself don't care for upscaling at all; I prefer native res, so I would never use FSR/DLSS and have no interest in it.

So yes, if you are gaming at 4K, want DLSS/RT, and can afford to spend $2,000+ on that GPU, then great, to each his own.
 
Just because you have money doesn't mean you can tell the difference... I would love to see an actual study - but if other products can be used as a guide, I'm pretty sure I'm right.

I recall when they did a study with "audiophiles" - and they couldn't tell the difference between an expensive hi-fi system and an iPod...

Excuse me, but what? The exact opposite was found when applying the Harman target to average listeners: every single one could tell the difference between lo-fi and hi-fi products.

Any survey or article you read claiming that audiophiles can't tell the difference between good audio equipment and bad audio equipment is completely bunk. The disparity between good headphones and bad headphones is so great that even the average person will be able to hear it if you make them listen side by side. Each and every blind test has repeated this when it's done properly with good equipment.

You can try this at home: buy some overpriced crap Raycons, then buy something reasonable and decent like TaoTronics at a quarter of the price, and you get a better product.

 
If one has a library with games that support RT, sure. Out of my library I think I have three games that support it and everything else does not. So for me there was no reason to look at a 3090 and its high price. The number of people with a 3090 gaming at 4K is tiny compared to everyone else, and DLSS is required at 4K because a 3090 isn't powerful enough to run that resolution with RT on without it.

I myself don't care for upscaling at all; I prefer native res, so I would never use FSR/DLSS and have no interest in it.

So yes, if you are gaming at 4K, want DLSS/RT, and can afford to spend $2,000+ on that GPU, then great, to each his own.
Well, the disagreement wasn't on whether or not you should buy it, but whether or not you can tell the difference between a 3090 and a 6900 XT. Which you can easily do. You don't even need 4K.

Also, plenty of games look better with DLSS than they do at native resolution because of TAA. Saying you would never use DLSS is akin to saying you don't care about image quality. Well, okay, I guess.
 