GeForce RTX 3080 vs. Radeon RX 6800 XT: 30 Game Benchmark

The 3080 and the 6800 XT are NOT evenly matched at all. The 3080 can give you real-time ray tracing and DLSS, and it also has far better streaming capabilities. That definitely gives it an edge. It’s odd that ray tracing is not tested, as AMD advertise it on their cards and it’s a mainstream feature that even the consoles now have. It makes TechSpot appear biased against Nvidia for not testing it.

Also, Nvidia's drivers come out more frequently, crash your system less often when installing, resolve issues faster, and oh, they don’t randomly turn your screen black.
 
"This latest GPU shootout sees the faster and more expensive GeForce RTX 3080 pitted against the Radeon RX 6800 XT."

Personally, I'd have worded that opening sentence differently.

"We’d rather spend $150 on something like a second hand RX 570 and play some older games, or esports titles with lower quality visuals and wait it out."

100% agree, stock and prices are stupid in the current market. I'm tempted to sell my RX 470 and double the money I paid for it used in 2019.
 
It’s odd that ray tracing is not tested, as AMD advertise it on their cards and it’s a mainstream feature that even the consoles now have. It makes TechSpot appear biased against Nvidia for not testing it.
From the article itself (my emphasis):
Features like ray tracing support and DLSS can make the RTX 3080 more appealing to you, especially if the games you play use those features. In fact, if you care about ray tracing, the RTX 3080 is a much better option.

Although we didn't spend much time showing how the two GPUs compare in this regard, we have dedicated articles about this, and we're particularly enthusiastic about DLSS 2.0 support coming to more games. As we’ve noted in the past, the list of games that support ray tracing is slowly growing, but there are few games where the feature truly shines.

You will almost always want to enable DLSS 2.0 in any game that supports it, though... we consider DLSS a strong selling point of RTX Ampere GPUs and it’s something AMD will need to counter sooner rather than later.

Don't forget the video encoding engine: NVENC vs. VCN. NVENC quality in Turing and Ampere is far superior, and it also supports 4:4:4 chroma subsampling, B-frames, etc. You also get cool things like NVIDIA Broadcast (see the encoding sketch below).
Again, from the article itself:
If you’re into streaming and wish to use your GPU for all the heavy lifting, then the RTX 3080 is the obvious choice, too. AMD offers only poor encoding support which is something they should have addressed by this point, especially if they want to charge top dollar for their GPUs.
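
For anyone wondering what "using your GPU for all the heavy lifting" looks like in practice, here is a minimal sketch of handing an H.264 encode off to NVENC via ffmpeg, driven from Python. This is only an illustration, not anything from the article: it assumes a recent ffmpeg build with NVENC enabled and an NVENC-capable GPU, and the file names and bitrate targets are placeholders.

```python
# Minimal sketch: assumes ffmpeg (with NVENC support) is on PATH and an
# NVENC-capable GPU is present; "gameplay.mkv" / "nvenc_out.mp4" are
# hypothetical file names used only for illustration.
import subprocess

subprocess.run(
    [
        "ffmpeg", "-y",
        "-i", "gameplay.mkv",              # hypothetical recorded gameplay clip
        "-c:v", "h264_nvenc",              # encode video on the GPU's NVENC block, not the CPU
        "-preset", "p5",                   # NVENC preset (p1 fastest .. p7 best quality)
        "-rc", "vbr",                      # variable bitrate rate control
        "-b:v", "8M", "-maxrate", "12M",   # placeholder bitrate targets
        "-b_ref_mode", "middle",           # B-frames as reference (Turing/Ampere NVENC feature)
        "-c:a", "copy",                    # pass the audio track through untouched
        "nvenc_out.mp4",
    ],
    check=True,                            # raise if ffmpeg exits with an error
)
```

The same idea applies to streaming tools such as OBS, which simply expose NVENC as a selectable encoder so the CPU stays free for the game.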
 
What I find interesting is that the 6800 XT's additional VRAM doesn't seem to help. The 6800 XT is clearly the better performer at 1080p, but as you increase the resolution, the 3080 erases that advantage. One would think the additional frame buffer would benefit higher resolutions, but apparently not. Maybe 10 GB is enough for 4K, at least for now.
 
Two comments:

At least in Germany, the 6800 XT is available (in stock) at several e-tailers (a good bit above MSRP), whereas the RTX 3080 is completely MIA and has been since launch. As for being "listed" at MSRP: stores can list all they want, but if the cards are not in stock, that's worthless.

Good to see you are getting increasingly positive towards RT and DLSS (unlike the apparently worthless SAM, which is now supported on Intel motherboards as well as on 3xx and 4xx chipsets, and does work with Zen 2 and Zen+), so I guess there won't be an issue with Nvidia review samples going forward. It gives the impression (although I may be completely wrong) that their little email served its intended purpose.

I'm not contesting your overall summary, as that's supported by the benchmark results and DLSS plus RT are bonus features that you get on top; what I'm contesting is the tone.
 
I would love to buy an RTX 3080, but I refuse to pay scalper prices, and I want a selection of models rather than being forced into buying whatever is available. Nvidia will probably have shuffled the model lineup by the time you can freely buy them.

It reminds me of the Fermi GTX 480, where after the initial availability of cards there was zero stock right up until the GTX 580 replaced it 6 months later, mainly because the yield of those chips was so horrendous they barely made any of them. I have to question the yield and production speed at Samsung, four months after launch.

There are lots of factors, like the gaming market explosion and crypto again, but I would also say this is partly another result of Turing's failure to have a big impact on the market. Lots of people went to Pascal, then didn't move on to RTX 2000-series Turing because it wasn't a good enough step up.

Now that Ampere is a significant step, every man and his Pascal-owning dog wants an upgrade.
 
I too click on the title, click on the comment section and leave a comment while skipping the entire article.


I would only get the 6800 XT if I could find it cheaper than the 3080, which it is supposed to be at MSRP. I would probably play at 1440p, and the fact that it has more VRAM might help it in the future.
 
6% faster might not sound like much, but with faster RT and DLSS support, the RTX 3080 is clearly the better deal.

Some games, like The Medium for example, are so demanding (even without RT) that DLSS is needed if you want good fps. Same thing with Cyberpunk 2077.

So the RTX 3080 is an easy winner for me, as long as the price gap between the two is not big.

Also, even if you have a 1080p screen, you can still run games above 1080p via supersampling and it will look better than native 1080p, especially in open-world games (shimmering and aliasing are reduced when you supersample). I don't think many people will play at only 1080p on something as powerful as an RTX 3080.
 
Yes, I did read the article. It mentions the advantages that Nvidia has, then goes on to claim the parts are evenly matched.

That’s like saying “Team A scored more goals than Team B, so we have concluded that the game is a tie”.

By the way, Digital Foundry have just released a video about how Control performs with ray tracing on PS5. It's mainstream, it's on AMD hardware, and it's absolutely absurd that TechSpot seem to be ignoring it.

The difference between RT on and off is often visually bigger than the difference between low and high settings. So why are we testing at high settings? We may as well test at low.
 
And that's how you school a Shadowboxer. Too easy, mate.
I read the article and yes, you have written about Nvidia's advantages within it, but then claimed the matchup is a tie. This is wrong; Nvidia offers more features (as you have mentioned in your article).

I can definitely understand why Nvidia want to cut you off. You are deliberately hiding their advantage in ray-traced gaming - a now mainstream feature.
 
You are deliberately hiding their advantage in ray-traced gaming - a now mainstream feature.
When just 5 out of the 30 games tested offer a ray tracing feature, I'm not sure that one can class that as being mainstream. In time, it certainly will be, just as anti-aliasing, tessellation, ambient occlusion, etc. have all become, and by that time, testing will almost certainly include it - as it will be a standard setting when putting the graphics options to 'Ultra'.

As to why one can say that the RTX 3080 and RX 6800 XT match - this is down to the primary function for which the bulk of purchases will arguably be used: gaming. Exactly what portion of owners of either card routinely encode/stream or predominantly play ray-tracing enabled titles, I couldn't say, but I would suggest that neither scenario accounts for a significant majority.

For the latter, this is down to (a) the paucity of titles offering ray tracing and (b) the sizeable performance impact while using it, with or without DLSS. In the case of the former, while it might seem to the casual observer that everyone is streaming every moment of their gameplay to the world at large, the portion of those with top-end GPUs is likely to match the overall distribution of sector sales (i.e. they're most likely to have mainstream cards).

The article clearly points out that the RTX 3080 has advantages over the RX 6800 XT (better 4K performance, better feature set, better encoding), but in the primary task for which they're bought, they average out to be the same.
 
When just 5 out of the 30 games tested offer a ray tracing feature, I'm not sure that one can class that as being mainstream. In time, it certainly will be, just as anti-aliasing, tessellation, ambient occlusion, etc. have all become, and by that time, testing will almost certainly include it - as it will be a standard setting when putting the graphics options to 'Ultra'.

As to why one can say that the RTX 3080 and RX 6800 XT match - this is down to the primary function for which the bulk of purchases will arguably be used: gaming. Exactly what portion of owners of either card routinely encode/stream or predominantly play ray-tracing enabled titles, I couldn't say, but I would suggest that neither scenario accounts for a significant majority.

For the latter, this is down to (a) the paucity of titles offering ray tracing and (b) the sizeable performance impact while using it, with or without DLSS. In the case of the former, while it might seem to the casual observer that everyone is streaming every moment of their gameplay to the world at large, the portion of those with top-end GPUs is likely to match the overall distribution of sector sales (i.e. they're most likely to have mainstream cards).

The article clearly points out that the RTX 3080 has advantages over the RX 6800 XT (better 4K performance, better feature set, better encoding), but in the primary task for which they're bought, they average out to be the same.
The primary task of these GPUs is to play games. These are expensive GPUs that people pay a lot for, yet you have decided to max out every setting except one: real-time lighting. And the only reason I can think of for this is to hide Nvidia's dominance.

RT is just a visual setting. It's not exclusive, it's not proprietary tech, it's not a gimmick, and it's coming to lots of upcoming AAA games on the consoles, let alone PC.

Turning it off in standard testing is absurdly biased at this point. And that's before I even get started on DLSS. You tested Control and Death Stranding - these games look objectively visually better with DLSS on. Yet you turned DLSS off for some reason.

Can you explain to me (and your readers) who would buy an Nvidia card and turn DLSS off in Control and Death Stranding?

This is exactly why Nvidia have a beef with HUB/TechSpot. It's a good thing I don't get all my tech news from here. If I did, I'd have no idea how much Nvidia are dominating AMD in the latest visual technologies...
 
It's at the end; they link their dedicated articles for the ray tracing comparisons.
Considering ray tracing is available on the new consoles and has been available for over two years on the PC, it should be considered standard with “ultra” settings in mainstream reviews.

At this point, RT-off comparisons should be relegated to bespoke dedicated articles for users who wish to actively ignore and turn off the latest visual features.

Right now what TechSpot is doing is giving AMD a free pass for lagging behind in RT performance in their PC solutions, despite the fact that AMD advertise it to consumers and shareholders and even provide it on the consoles. Right now users are potentially seeing better visual fidelity settings on a console than what's been tested here.
 
Trolling impressively, @Shadowboxer. There's a whole context to the article and the conclusion. The GPUs are evenly matched in performance/cost per frame, except in the games where DLSS/RT is a factor (not hundreds of games, not even dozens) and for streaming. It's all there.
I’m not trolling.

Can you tell me why the real-time lighting settings are not maxed out when you test at “ultra settings”?

The GPUs are apparently only matched if you turn certain specific visual improvements off. That’s all I can get from this article.
 
The primary task of these GPUs is to play games. These are expensive GPUs that people pay a lot for, yet you have decided to max out every setting except one: real-time lighting. And the only reason I can think of for this is to hide Nvidia's dominance.

RT is just a visual setting. It's not exclusive, it's not proprietary tech, it's not a gimmick, and it's coming to lots of upcoming AAA games on the consoles, let alone PC.

Turning it off in standard testing is absurdly biased at this point. And that's before I even get started on DLSS. You tested Control and Death Stranding - these games look objectively visually better with DLSS on. Yet you turned DLSS off for some reason.

Can you explain to me (and your readers) who would buy an Nvidia card and turn DLSS off in Control and Death Stranding?

Straight from the article:

TechSpot said:
As you lower the resolution, even to 1440p, the image becomes slightly blurry as DLSS has less data to work with, and at 1080p, even in Cyberpunk 2077, the quality DLSS option isn't great and, in our opinion, noticeably worse than native 1080p.

I play at 1440p, and if I buy the 3080, which I want to do someday, I will not be playing with DLSS on, as I don't want to *LOWER* my visual quality. To do that would just be stupid; the FPS are high enough already.

This is exactly why Nvidia have a beef with HUB/TechSpot. It's a good thing I don't get all my tech news from here. If I did, I'd have no idea how much Nvidia are dominating AMD in the latest visual technologies...

Nvidia has significant advantages in a minority of games and in a minority of circumstances. As Steve and others have said, when those games become more numerous, and if AMD does not improve its support for those technologies, only then will Nvidia dominate as you claim.
 