GeForce RTX 3080 vs. Radeon RX 6800 XT: 30 Game Benchmark

I do. And I am right. At some point TechSpot is going to have to test ray tracing; other tech outlets already do.

Imagine thinking we shouldn’t test a feature that gives the highest image quality on flagship GPUs lmao!
You're raving about many different things; don't tell me it's only ray tracing you're upset about. And TS has ray tracing benchmarks and articles, which makes me question why I'm even replying to a clearly mentally ill individual. They have also stated their opinion on ray tracing multiple times, so there's no way not to know their stance on it.
 
The 3080 and the 6800 XT are NOT evenly matched at all. The 3080 can give you real-time ray tracing and DLSS. It also has far better streaming capabilities. This definitely gives it an edge. It's odd that ray tracing is not tested, as AMD advertises ray tracing on its cards and consoles, and it's a mainstream feature that even the consoles now have. It makes TechSpot appear biased against Nvidia for not testing it.

Also, Nvidia's drivers come out more frequently, crash your system less when installing, resolve issues faster, oh, and they don't randomly turn your screen black.


Ahh, the old driver chestnut. Nonsense - I've run AMD cards for 20 years and never had a driver issue.

Personally, I won't buy a $700 card with 10GB of RAM. It's just ridiculous, and the Nvidia fanboys lap it up. But those fanboys also buy the new $1500 Nvidia card every year so Jensen can have his 6th Ferrari, so maybe it doesn't matter to them, because by the time 10GB is limiting they'll already have whatever the new Nvidia money grab is. It does to me. 16GB is more important than DLSS and RT times ten. The new consoles have 16GB, so if you want to outperform them over the next few years, 10GB just isn't enough.

I'd be saying the same thing on the flip side, even though I prefer AMD. If AMD came out with a $700 card with only a paltry 10GB of memory, there is no way I would touch it. What's hilarious is that my two-year-old phone (OnePlus 7 Pro, bought in May 2018) has 12GB of memory - more than Nvidia's top-of-the-line, billions-of-transistors, 20+ teraflop video card. Pathetic. Nvidia has been doing this for many years and it's always been pathetic. Put 20GB on the thing by default, like a luxury $700+ card should have, and I don't have this problem. But they're too cheap.
 
What's hilarious is that my two-year-old phone (OnePlus 7 Pro, bought in May 2018) has 12GB of memory - more than Nvidia's top-of-the-line, billions-of-transistors, 20+ teraflop video card. Pathetic. Nvidia has been doing this for many years and it's always been pathetic. Put 20GB on the thing by default, like a luxury $700+ card should have, and I don't have this problem. But they're too cheap.
Your phone has 12 GB of LPDDR4X, in the form of a single 32-bit wide, 4266 Mbps, low-power RAM module that's been around since 2016 - it offers around 17 GB/s of bandwidth. The RTX 3080 has 10 GB of GDDR6X, in the form of ten 32-bit wide, 19 Gbps, high-power RAM modules that have only been around since last Fall and are only made by one company.

It’s not in the least bit cheap - each GDDR6X module is more than twice the price of a single large capacity LPDDR4X module, and the 3080 needs ten of them. Or if you’re expecting 20 GB, you’ll need twenty modules.
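To put some very rough numbers on that difference, here's a back-of-envelope sketch using the figures quoted above (peak theoretical bandwidth = bus width × per-pin data rate ÷ 8; real-world throughput will be lower):

```python
# Back-of-envelope peak bandwidth, using the figures quoted above.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

# Phone: one 32-bit LPDDR4X channel at 4266 Mbps per pin.
lpddr4x = bandwidth_gb_s(32, 4.266)

# RTX 3080: ten 32-bit GDDR6X channels (320-bit bus) at 19 Gbps per pin.
gddr6x = bandwidth_gb_s(10 * 32, 19.0)

print(f"LPDDR4X phone module: ~{lpddr4x:.0f} GB/s")
print(f"RTX 3080 GDDR6X:      ~{gddr6x:.0f} GB/s")  # roughly 45x the phone
```

Same capacity ballpark, completely different class of memory subsystem.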
 
The 3080 and the 6800 XT are NOT evenly matched at all. The 3080 can give you real-time ray tracing and DLSS. It also has far better streaming capabilities. This definitely gives it an edge. It's odd that ray tracing is not tested, as AMD advertises ray tracing on its cards and consoles, and it's a mainstream feature that even the consoles now have. It makes TechSpot appear biased against Nvidia for not testing it.

Also, Nvidia's drivers come out more frequently, crash your system less when installing, resolve issues faster, oh, and they don't randomly turn your screen black.

But I don't care about any of the extra features Nvidia is bringing. I don't stream, I don't play games with RT, and RT is pretty much unplayable with this generation if you're talking about things being maxed out. Take note that Nvidia has telemetry in its drivers. No thx.

AMD is still the better graphics card here. Like it or not.
 
When just 5 out of the 30 games tested offer a ray tracing feature, I'm not sure that one can class that as being mainstream. In time, it certainly will be, just as anti-aliasing, tessellation, ambient occlusion, etc. have all become, and by that time, testing will almost certainly include it - as it will be a standard setting when putting the graphics options to 'Ultra'.

As to why one can say that the RTX 3080 and RX 6800 XT match - this is down to the primary function for which the bulk of purchases will arguably be used: gaming. Exactly what portion of owners of either card routinely encode/stream or predominantly play ray-tracing-enabled titles, I couldn't say, but I would suggest that neither scenario accounts for a significant majority.

For the latter, this is down to (a) the paucity of titles offering ray tracing and (b) the sizeable performance impact while using it, with or without DLSS. In the case of the former, while it might seem to the casual observer that everyone is streaming every moment of their gameplay to the world at large, the portion of those doing so with top-end GPUs is likely to match the overall distribution of sector sales (i.e. they're most likely to have mainstream cards).

The article clearly points out that the RTX 3080 has advantages over the RX 6800 XT (better 4K performance, better feature set, better encoding), but for the primary task for which they're bought, they average out to be the same.
I second this - if ray tracing were available in most games and the performance hit when you turn it on weren't huge, then it could easily be said that the 3080 beats the 6800 XT. However, as we all know, only a small selection of games feature RT, so it makes sense to compare these cards without these features (until they become mainstream).

To me, it’s pretty simple - if you value RT and DLSS and think they are important, then go with the 3080. If not, the 6800XT. In other words, just buy the card that has the features you want. To argue over which card is better is pretty pointless in my opinion. Both are extremely capable cards in their own right. It is purely up to the individual to buy the card that has the features that they think are worthwhile.
 
Last edited:
I second this - if ray tracing were available in most games and the performance hit when you turn it on weren't huge, then it could easily be said that the 3080 beats the 6800 XT. However, as we all know, only a small selection of games feature RT, so it makes sense to compare these cards without these features (until they become mainstream).

To me, it’s pretty simple - if you value RT and DLSS and think they are important, then go with the 3080. If not, the 6800XT. In other words, just buy the card that has the features you want. To argue over which card is better is pretty pointless in my opinion. Both are extremely capable cards in their own right. It is purely up to the individual to buy the card that has the features that they think are worthwhile.

Thing is, the 6800 XT/non-XT are priced too high for their performance and features. That is like Kia wanting to sell their cars for the same prices as Mercedes or BMW in the same segment because Kia's cars are just as fast.
If you don't care about RT and DLSS right now, just skip the 6800 XT and wait for RDNA3, where those techs are integrated. With the market right now, that's the best choice ;).
 
Thing is, the 6800 XT/non-XT are priced too high for their performance and features. That is like Kia wanting to sell their cars for the same prices as Mercedes or BMW in the same segment because Kia's cars are just as fast.
If you don't care about RT and DLSS right now, just skip the 6800 XT and wait for RDNA3, where those techs are integrated. With the market right now, that's the best choice ;).
But at the end of the day, if you think the cards are too expensive, then you have a choice not to buy them. For me personally, RT and DLSS are not features that I think are worthwhile. That’s why I went for a 6800XT.
 
But at the end of the day, if you think the cards are too expensive, then you have a choice not to buy them. For me personally, RT and DLSS are not features that I think are worthwhile. That’s why I went for a 6800XT.

Well, if you want to spend 1,000 USD on a GPU just to play games at "Medium" settings (RT is the new Ultra), that's all good. I'm sure you will upgrade when RDNA3 comes out anyway.
 
Thanks for the review. However, I think I will skip the current-gen video cards altogether. Or maybe Bitcoin crashes hard and I get an offer I can't resist...
Personally, I'd love a Gigabyte 3080 Waterforce, since I have a 1080 Ti Waterforce right now.
But most likely not.
 
Well, if you want to spend 1,000 USD on a GPU just to play games at "Medium" settings (RT is the new Ultra), that's all good. I'm sure you will upgrade when RDNA3 comes out anyway.
I would agree that RT is the new Ultra IF all games featured it. So based on that argument, the vast majority of games only run at medium settings? Not quite sure I see the logic in that. And I actually paid 1,150 USD for my 6800 XT (gotta love Japan and its rip-off GPU prices).
 
I would agree that RT is the new Ultra IF all games featured it. So based on that argument, the vast majority of games only run at medium settings? Not quite sure I see the logic in that. And I actually paid 1,150 USD for my 6800 XT (gotta love Japan and its rip-off GPU prices).

For any game that features RT, RT is the true Ultra setting because it offers better visuals.

Why do you care if RT is mainstream anyway? It is new technology after all (well, hybrid RT/raster is new). If everyone only cared about mainstream technology, there would be no progress at all. Tesla would have gone bankrupt by now if everyone thought fossil fuel was the only way to go, but ta-da, Tesla has become the biggest car maker by market value, yet electric cars make up only about 0.1% of all cars out there.
 
For any game that features RT, RT is the true Ultra setting because it offers better visuals.

Why do you care if RT is mainstream anyway? It is new technology after all (well, hybrid RT/raster is new). If everyone only cared about mainstream technology, there would be no progress at all. Tesla would have gone bankrupt by now if everyone thought fossil fuel was the only way to go, but ta-da, Tesla has become the biggest car maker by market value, yet electric cars make up only about 0.1% of all cars out there.
The reason I mentioned the mainstream thing is that some people believe that the 3080 is a better card because it performs better in games that support RT and DLSS.
RT and DLSS are not yet mainstream because they are only featured in a limited number of games. In games that do not have those features, the cards are of similar performance.
 
The reason I mentioned the mainstream thing is that some people believe that the 3080 is a better card because it performs better in games that support RT and DLSS.
RT and DLSS are not yet mainstream because they are only featured in a limited number of games. In games that do not have those features, the cards are of similar performance.

RT and DLSS are a bet on the future, except that you don't lose anything when you lose, since Ampere and RX 6000 have the same P/P.
Now, if RT and DLSS take off in the future and 50% of new games coming out from 2021 feature either RT or DLSS or both, then it's obvious that the 3080 is the superior option.
It's easy to tell that the 3080 is a win-win bet.
 
I'm still unsure on DLSS. On one hand, it allows performance boosts thanks to internally rendering at a lower resolution. Since version 2.0, the image quality seems to be pretty good as well; some have argued better than traditional methods, which is pretty impressive.

On the other hand, is it not a step in the wrong direction? Tailoring GPUs to run quickly at 1080p and then just upscaling?

The next gen of GPUs uses 100 watts less and has half the cores because DLSS will just upscale everything? I don't know, sounds backwards to me.

Also, surely there has to be a limit? It is just re-creating an image; it cannot add details that aren't there, and 1080p isn't exactly a high resolution by today's standards.

I'd like to hear from someone like @neeyik and/or @Steve. Do you reckon DLSS is actually the future? Everything is internally rendered at 1080p and then just upscaled? Or do you reckon it's just a bandage for the current (and immediate future) crop of GPUs to help hit that 4K marketing buzzword?
 
RT and DLSS are a bet on the future, except that you don't lose anything when you lose, since Ampere and RX 6000 have the same P/P.
Now, if RT and DLSS take off in the future and 50% of new games coming out from 2021 feature either RT or DLSS or both, then it's obvious that the 3080 is the superior option.
It's easy to tell that the 3080 is a win-win bet.
If 50% of games released in 2021 feature RT and DLSS - is this a fact or just a prediction? I agree that RT and DLSS are a bet on the future - as in, future generations of cards will be more powerful and able to run at high FPS with RT turned on. Right now, for me at least, RT kills performance to the extent that it is not worthwhile - I didn't pay 1,000+ USD to play at 60 FPS regardless of how good the game looks.
 
Last edited:
I'd like to hear from someone like @neeyik and/or @Steve. Do you reckon DLSS is actually the future? Everything is internally rendered at 1080p and then just upscaled? Or do you reckon it's just a bandage for the current (and immediate future) crop of GPUs to help hit that 4K marketing buzzword?
Upscaling, be it in the form of DLSS or something less proprietary, is definitely going to be more prevalent, for the simple reason that it allows developers to push ever better levels of graphical fidelity to a wider userbase and not worry about it being limited to a select few with top-end graphics cards. For those working on cross-platform titles, it's especially good news.

However, current forms aren't perfect - either by the nature of the system used or because of how it's been implemented into the game. But along with variable rate shading, it'll be as commonplace as tessellation and anti-aliasing within a number of years (although how many is anyone's guess).

From a GPU vendor's perspective, it's a godsend -- high resolution rendering, especially when using ray tracing, is very data intensive. One can throw as many shaders and RT cores at the problem as one likes, but increasing these just places more and more demand on the register file, cache, and internal interconnect system. These scale less well than logic does, so the likes of DLSS help to mitigate this problem.
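As a rough back-of-envelope illustration of where the savings come from (the internal resolutions below are assumptions, loosely matching typical 'quality' and 'performance' upscaling modes at a 4K output; actual figures vary by game and mode):

```python
# Rough sketch: per-pixel shading work scales roughly with the number of
# pixels actually rendered, so rendering internally at a lower resolution
# and upscaling to the display resolution cuts that work substantially.
def pixel_count(width: int, height: int) -> int:
    return width * height

native_4k = pixel_count(3840, 2160)

# Assumed internal resolutions when targeting a 4K output.
internal_modes = {
    "1440p internal": pixel_count(2560, 1440),
    "1080p internal": pixel_count(1920, 1080),
}

for name, pixels in internal_modes.items():
    print(f"{name}: {pixels / native_4k:.0%} of the 4K pixel count to shade")
```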
 
Upscaling, be it in the form of DLSS or something less proprietary, is definitely going to be more prevalent, for the simple reason that it allows developers to push ever better levels of graphical fidelity to a wider userbase and not worry about it being limited to a select few with top-end graphics cards. For those working on cross-platform titles, it's especially good news.

However, current forms aren't perfect - either by the nature of the system used or because of how it's been implemented into the game. But along with variable rate shading, it'll be as commonplace as tessellation and anti-aliasing within a number of years (although how many is anyone's guess).

From a GPU vendor's perspective, it's a godsend -- high resolution rendering, especially when using ray tracing, is very data intensive. One can throw as many shaders and RT cores at the problem as one likes, but increasing these just places more and more demand on the register file, cache, and internal interconnect system. These scale less well than logic does, so the likes of DLSS help to mitigate this problem.
This makes far more sense to me now. Yeah, fine, upscaling is probably the way forward. Very good point about access to a wider userbase; never thought of that.
 
Wait-- how is dual-rank memory faster than single-rank memory?? Everything I've been able to find indicates it's the other way around...
 
Wait-- how is dual-rank memory faster than single-rank memory?? Everything I've been able to find indicates it's the other way around...
That very much used to be the case, due to increased electrical loading, and while it still can be, modern CPUs handle multiple ranks far better than they ever used to. Only one rank can be read from/written to at a time, but with dual or quad ranks, the others can be 'prepped' for use whilst the selected rank is being accessed. Essentially, this shaves off a few nanoseconds of latency, and in memory-intensive situations, even tiny gains can result in notable benefits.
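To illustrate the idea, here's a toy model with invented timings that exaggerate the effect - not a real DRAM simulator, and real-world gains are far smaller than this:

```python
# Toy model of rank interleaving: while the active rank is bursting data,
# the other rank's row activation can be issued in parallel, hiding some
# of that latency. Timings and access pattern below are made up.
ACTIVATE_NS = 15    # assumed time to open ('prep') a row before reading
BURST_NS = 10       # assumed time to transfer the data for one access
N_ACCESSES = 1000   # accesses that each land on a not-yet-open row

# Single rank: every access pays its own activation, then the data burst.
single_rank_ns = N_ACCESSES * (ACTIVATE_NS + BURST_NS)

# Two ranks, accessed alternately: after the first activation, the next
# rank is always being prepped while the current one transfers data,
# so each access only costs the longer of the two steps.
dual_rank_ns = ACTIVATE_NS + N_ACCESSES * max(ACTIVATE_NS, BURST_NS)

print(f"single rank: {single_rank_ns / 1000:.1f} us")
print(f"dual rank:   {dual_rank_ns / 1000:.1f} us")
print(f"saving:      {1 - dual_rank_ns / single_rank_ns:.0%}")
```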
 
If 50% of games released in 2021 feature RT and DLSS - is this a fact or just a prediction? I agree that RT and DLSS are a bet on the future - as in, future generations of cards will be more powerful and able to run at high FPS with RT turned on. Right now, for me at least, RT kills performance to the extent that it is not worthwhile - I didn't pay 1,000+ USD to play at 60 FPS regardless of how good the game looks.

Hybrid RT will always kill performance, the same as any other setting that improves visuals at the cost of performance.
The current RT solution is a hybrid one, which uses RT on top of rasterization. The only way to increase RT performance is to increase rasterization performance. That, and Nvidia's solution (with dedicated RT cores) currently imposes less of a performance tax with RT than AMD's solution.
 
But I don't care about any of the extra features Nvidia is bringing. I don't stream, I don't play games with RT, and RT is pretty much unplayable with this generation if you're talking about things being maxed out. Take note that Nvidia has telemetry in its drivers. No thx.

AMD is still the better graphics card here. Like it or not.
But it's not, though. Even if you don't use all the advantages Nvidia provides, the 6800 XT is better at 1080p, matched at 1440p, and slower at 4K.

Somehow I think that most people buying a 6800XT aren't planning on 1080p gaming...
 
Your phone has 12 GB of LPDDR4X, in the form of a single 32-bit wide, 4266 Mbps, low-power RAM module that's been around since 2016 - it offers around 17 GB/s of bandwidth. The RTX 3080 has 10 GB of GDDR6X, in the form of ten 32-bit wide, 19 Gbps, high-power RAM modules that have only been around since last Fall and are only made by one company.

It’s not in the least bit cheap - each GDDR6X module is more than twice the price of a single large capacity LPDDR4X module, and the 3080 needs ten of them. Or if you’re expecting 20 GB, you’ll need twenty modules.
That's very true, but we've seen from the comparisons between the GeForce RTX 30 and Radeon RX 6000 series cards that there is little to no advantage to that very expensive letter "X" at the end of GDDR6X. This doesn't surprise me, because having a larger VRAM buffer has always had a more profound effect on gaming performance than the speed of the VRAM itself or the bandwidth that it has to use (to a point, of course).

I have two Sapphire R9 Furies with 4GB of HBM1, but I'd much rather they had 6 or 8GB of GDDR5. Let's remember that HBM1 had an astonishing 4096-bit bus - 16x as wide as today's RX 6800 XT's and almost 13x as wide as the RTX 3080's. That didn't stop the GTX 980 from outperforming it with regular GDDR5. It is my opinion that these fancy VRAM types are, for gaming, nothing more than an expensive gimmick. This is because they only really see proper utilisation in professional-level graphics applications. Games are designed to be playable on as many GPUs as possible to maximise the size of the potential customer base (to a point, of course) and so aren't coded with these exotic VRAM types in mind.
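For what it's worth, a quick sketch using the commonly published specs (peak bandwidth = bus width × per-pin rate ÷ 8) shows why the enormous bus alone didn't guarantee a win - the per-pin rate matters just as much:

```python
# Peak theoretical bandwidth from bus width and per-pin data rate,
# using the widely published specs for the cards mentioned above.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

cards = {
    "R9 Fury (HBM1)":     (4096, 1.0),   # very wide bus, low per-pin rate
    "GTX 980 (GDDR5)":    (256, 7.0),
    "RX 6800 XT (GDDR6)": (256, 16.0),
    "RTX 3080 (GDDR6X)":  (320, 19.0),
}

for name, (width, rate) in cards.items():
    print(f"{name:20s} {width:4d}-bit @ {rate:4.1f} Gbps -> "
          f"{bandwidth_gb_s(width, rate):4.0f} GB/s")
```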
 
RT and DLSS are a bet on the future, except that you don't lose anything when you lose, since Ampere and RX 6000 have the same P/P.
Now, if RT and DLSS take off in the future and 50% of new games coming out from 2021 feature either RT or DLSS or both, then it's obvious that the 3080 is the superior option.
It's easy to tell that the 3080 is a win-win bet.
They said exactly the same thing about the RTX 20 series. How did that pan out? I don't buy tech for the future; I buy it for the now. Buying tech for the future only guarantees two things:
  1. You'll be paying way too much
  2. What you get will be sub-par by the time the tech matures

People bought the RTX 2080 Ti as a "bet for the future". How did that turn out?
 
They said exactly the same thing about the RTX 20 series. How did that pan out? I don't buy tech for the future; I buy it for the now. Buying tech for the future only guarantees two things:
  1. You'll be paying way too much
  2. What you get will be sub-par by the time the tech matures

People bought the RTX 2080 Ti as a "bet for the future". How did that turn out?

1. You are not paying any premium for Ampere, not when AMD decided to charge similar prices for their RX 6000 series.
2. The more DLSS matures, the better RTX Turing ages; 4K 60 FPS Ultra is possible even for the 2060 in DLSS-supported games.
There is really nothing against buying RTX Turing right now if you can get it at a better P/P than Ampere; they are functionally the same.
You will find lots of people putting their old Turing cards into SFF cases and playing on their 4K TVs - totally playable on Turing with the help of DLSS. I'm playing CP2077 on a 4K TV connected to my 2070 Super laptop just fine.

Even this guy finds DLSS awesome
And he's getting burned in the comments
 
Last edited:
1. You are not paying any premium for Ampere, not when AMD decided to charge similar prices for their RX 6000 series.
2. The more DLSS matures, the better RTX Turing ages; 4K 60 FPS Ultra is possible even for the 2060 in DLSS-supported games.
There is really nothing against buying RTX Turing right now if you can get it at a better P/P than Ampere; they are functionally the same.
You will find lots of people putting their old Turing cards into SFF cases and playing on their 4K TVs - totally playable on Turing with the help of DLSS. I'm playing CP2077 on a 4K TV connected to my 2070 Super laptop just fine.

Even this guy finds DLSS awesome
And he's getting burned in the comments
This is true because of AMD's pricing and the lack of availability but only in this specific case. Under normal conditions, it doesn't hold up. I would definitely agree that right now, with all things being (uncharacteristically) equal, from a pure monetary standpoint, the RTX 3080 would appear to be a better choice than the RX 6800 XT.

Now, I wouldn't be getting an RTX 3080, but that's a matter of personal ethics, not a matter of which card offers more for the money. To me, it's about getting parity in the market to keep both companies honest. As it is, the pricing we're seeing is what I like to call "non-cooperative market collusion", because Nvidia's prices influenced AMD's prices even though the two companies didn't exactly get together and discuss it. This is the problem with duopolies, and it's also why the competition bureaus of several nations (especially the US, where most of this is based) have completely failed. I suspect that this failure wasn't by accident but a result of political corruption and bad-faith lobbyists.

It has become painfully clear that without the other players that used to be involved, like Orchid, Diamond, Matrox, and S3, the remaining duopoly might destroy itself as Nvidia prices itself ever higher and AMD follows suit. Intel joining the fray won't help matters much, because Intel has long demonstrated that it has no qualms about screwing the entire industry for its own financial gain.

I really hope that S3 comes up with something new because even though Matrox exists, it's currently thriving in its own niche market and probably won't re-enter the gaming GPU marketplace. I just don't know if S3 will consider it worthwhile because its latest GPU design, the S3 Chrome, is at least a decade behind the big two.

As an aside, I know Timmy Joe personally and he's 100% on the mark here. Being burned in the comments section is something that every YouTuber experiences. This is because of:
  1. Know-nothing know-it-alls (We've all seen them and know who they are).
  2. Arseholes who deliberately troll because they're arseholes.

Hell, even Jim from AdoredTV, one of the greatest (if not the greatest) tech gurus that I've ever seen or heard about, became so vexed by the crap that he had to deal with in the comments section (along with YouTube's less-than-stellar behaviour) that he hung up his keyboard and hasn't made a tech video since October 21, 2020. To say that this was a major loss to the community would be the understatement of the year.
 
Last edited: