GeForce RTX 3080 vs. Radeon RX 6800 XT: 50 Game Benchmark

If you are still in the market for a card, aim for the 3080 Ti if you can’t get a 3090.
 
I'd say the only reason the Radeon is even in the running at all is its higher clock speeds, with some games favouring those much more than anything else.

Run both these cards at the same clock speeds and I bet the 3080 would walk all over the 6800, whether that's running the 6800 at the lower Nvidia clocks or the 3080 (if it were possible) at 6800-level speeds.

Those higher clocks on the 6800 disguise just how inefficient AMD's RDNA2 architecture still is compared to Nvidia's.
 
I think you may be right to say that RDNA2 is still not as good as Ampere from an architecture standpoint, considering that AMD had a sizeable advantage from using TSMC 7 nm while Nvidia cheaped out and went with Samsung's 8 nm (a refined 10 nm). At the end of the day, though, gamers will just look at the performance of the card. If I choose an RX 6800 XT, for example, because I know it is just as fast and uses less power, I won't care whether AMD needed a better node or faster clock speeds to get that kind of performance. I've tried both cards, and from a pure rasterisation standpoint I don't think either of them stands out. It's only when you start considering the extra features that the scales tilt considerably towards the RTX 3000 series.
 
It's too bad your system is AMD instead of Intel... I'd love to see the numbers with an Intel 12900... I suspect that Nvidia's margin of victory would rise a bit...
 
RX 6800 XT:
Transistors: 26,800 million
Die size: 520 mm²
TDP: 300 W

RTX 3080:
Transistors: 28,300 million
Die size: 628 mm²
TDP: 320 W

Let's talk about efficiency again :)
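If anyone wants to put those spec-sheet numbers next to each other, here's a quick back-of-the-envelope sketch in Python. Everything in it is copied from the figures above; no benchmark data involved:

```python
# Spec-sheet figures quoted above -- no benchmark data involved.
cards = {
    "RX 6800 XT": {"transistors_m": 26_800, "die_mm2": 520, "tdp_w": 300},
    "RTX 3080":   {"transistors_m": 28_300, "die_mm2": 628, "tdp_w": 320},
}

for name, c in cards.items():
    density = c["transistors_m"] / c["die_mm2"]  # million transistors per mm²
    print(f"{name}: {density:.1f} MTr/mm², {c['tdp_w']} W TDP")
```

That works out to roughly 51.5 vs. 45.1 million transistors per mm², which arguably says more about the TSMC 7 nm vs. Samsung 8 nm gap mentioned earlier in the thread than about either architecture.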
 
AMD clock speed: 1.70-2.105 GHz
Nvidia clock speed: 1.44-1.71 GHz

Think that was his point...
 
Comparing those GPUs at the same clock speed does not make sense. They use different strategies to deliver the best experience. At the end of the day, what matters when evaluating these GPUs is performance and perf/W.
Agreed - but that was the point of the guy who made it, Rooster... I just gave the numbers :)
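If it helps, here's a minimal perf/W sketch along those lines. The average frame rates are placeholders, not figures from the article, so swap in the real 50-game averages before drawing conclusions; only the TDPs come from the specs above:

```python
# Placeholder averages -- NOT the article's results. Only the TDPs come from the specs above.
avg_fps = {"RX 6800 XT": 100.0, "RTX 3080": 100.0}  # swap in the real 50-game averages
tdp_w   = {"RX 6800 XT": 300,   "RTX 3080": 320}

for name in avg_fps:
    print(f"{name}: {avg_fps[name] / tdp_w[name]:.3f} fps per rated watt")
```

At equal frame rates the 20 W TDP gap is worth roughly 6-7% in perf/W on paper, though actual board power varies a lot from model to model.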
 
And that 320 W TDP on the 3080 is for the FE model at that.
The EVGA FTW3 Ultra variant easily hits 380 W under load and will push up to 400 W with a simple OC, with the option to install a 450 W LN2 OC BIOS on top of that.

/quite happy with it

Again, those spec clock ranges are for FE models.
Typical in-game clocks for me are anywhere between 1835 and 2040 MHz.

I've managed to get a 2100 MHz OC on mine (on air, at 70 °C die / 82 °C hot spot), but only for a minute or two before it downclocks to ~1980 and fluctuates somewhere in between, if not back up to 2100.

Not sure I really have a point with this post, other than showing there's quite a variance between the manufacturers' default spec numbers and what most users will actually see.
It really is an impressive little piece of hardware they've got going on, considering just how much overhead it has compared to the default specs. Shame prices are still so jacked.

Edit: it also shows that yes, AMD is the more efficient if you're comparing TDP to clock speed, but as already mentioned, given the differences between the two architectures that's not an entirely accurate comparison.
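For the "TDP vs. clock speed" framing specifically, here's what the rated numbers quoted earlier work out to (spec-sheet boost clocks and TDPs, not measured draw):

```python
# Rated boost clock (MHz) and TDP (W) from the figures posted earlier in the thread.
specs = {"RX 6800 XT": (2105, 300), "RTX 3080": (1710, 320)}

for name, (boost_mhz, tdp_w) in specs.items():
    print(f"{name}: {boost_mhz / tdp_w:.2f} MHz per rated watt")
```

About 7.0 vs. 5.3 MHz per rated watt, though as said above, clocks-per-watt isn't apples to apples when the two architectures do different amounts of work per clock.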
 
I couldn't imagine buying a flagship GPU for over $1,000 and not getting good ray tracing support and DLSS. DLSS in particular is awesome; I use it all the time. In some games it even makes the image look better, and in most games it makes motion look much clearer, with none of the trails you get from TAA. The Radeon is odd: it seems best at 1080p, but I couldn't imagine spending this much money on a 1080p card, it would cost like five times as much as the monitor itself!

Also, after years of being infuriated with AMD's driver support, I don't want to go Radeon again for at least a little while. GeForce all the way for me, until they inevitably let me down. But it's been three years back on Nvidia, and so far the driver support has been vastly superior to the previous eight years I spent using Radeon cards.
 
That could have all been said in one line: "I prefer NVIDIA".

Drone: OMG, obliterated! Destroyed!

Reading all the facts: the difference is really only 9 FPS between the two, but in this case the wording makes it more sensational (quick numbers below).

Drone: you are a loser!

This is done on purpose on the majority of tech sites. You will see colourful wording that just strokes the fanboys and brings in more eyeballs and clicks.

The majority of people will not be able to tell the difference when both GPUs are giving playable performance. And I don't know many people who play games staring at an FPS counter instead of playing the game.
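To put that 9 FPS in perspective, here's the same gap expressed against a few illustrative baselines. The baselines are made up for illustration; only the 9 fps figure comes from the point above:

```python
# Illustrative baseline averages -- not numbers from the article.
gap_fps = 9
for baseline_fps in (60, 100, 144):
    print(f"At a {baseline_fps} fps average, a {gap_fps} fps gap is {gap_fps / baseline_fps:.0%}")
```

Whether that reads as "obliterated" or "barely noticeable" depends almost entirely on the baseline and the wording.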
 
Oh, trust me, I am more than aware of that.

Better yet, plenty of these places go further than that. For example, look at the article's main photo: the Nvidia GPU "resting" on top of the AMD one. Subtle little thing, right? ;)

Not to brag, but I have a 6900 XT and a Series X on an LG C9, and except for certain parts of some games (Cyberpunk and the Arkham games, for example), I honestly cannot tell the difference between ultra settings at 120 fps on the 6900 XT and the performance mode on the Series X.

So, given the truly plug-and-play nature of the console, I have been playing on it more than the PC.

But that's just me.
 
That is a valid real-world observation and holds more merit than fanboys arguing over a 5-10 fps difference. How do you like OLED gaming?
 