GeForce RTX 3080 vs. Radeon RX 6800 XT: 30 Game Benchmark

Lmao

This is the guy who literally got cut off from Nvidia for not covering their cards well enough... He's always had a heavy AMD bias (but even with that he's still a reasonable person and isn't a delusional fanboy).

The article just goes to show that it's almost impossible not to admit that Nvidia has done a better job once again overall between features, future tech, pricing and availability.

All of it is showing without question that Nvidia is the best option.

We evaluate every product individually and if you take a proper look at our review history you'll find significantly more positive Nvidia product reviews than AMD.

It's the same old crap and we often heard the opposite during the AMD FX era, "ohh they've got a strong Intel bias". Didn't matter that Intel made significantly better products, nope that couldn't be it.

Anyway your AMD favoritism argument falls apart when you start to look at our content. Go find the pro Radeon VII coverage, what about the Vega 56 and 64 coverage? How about the RX 590 day one review? Don't be lazy with your opinions, do some actual research first.
 
From the article:

"Therefore it’s time to call it: AMD has failed to deliver. "

"On the Nvidia side, at least there are RTX 3080 AIB models listed at the MSRP, even if they are out of stock"

Isn't that essentially the same but worded differently? Splitting hairs has become a fine art.
 
Well no, because people have bought the GeForce GPUs at the MSRP; that's not even close to possible on the AMD side. Moreover, once supply does meet demand we can hold Nvidia to those MSRP prices; we can't do that for AMD.
 
This is also a very lazy comment. We evaluate every product individually, and if you take a proper look at our review history you'll find significantly more positive Nvidia product reviews than AMD; that's just a fact, and it's why your comment is very lazy.

It's the same old crap and we often heard the opposite during the AMD FX era, "ohh they've got a strong Intel bias". Didn't matter that Intel made significantly better products, nope that couldn't be it.

Anyway your AMD favoritism argument falls apart when you start to look at our content. Go find the pro Radeon VII coverage, what about the Vega 56 and 64 coverage? How about the RX 590 day one review? Don't be lazy with your opinions, do some actual research first.

Edit: Also, you can edit your posts, condensing them into a single post instead of something that looks like spam.


I'm not speaking to years or decades of coverage. Your bias seems to follow the flow of "community opinion"; it tends to reflect what is likely to generate the most views / likes / interactions / etc. I have no horse in the race, but your paycheck is yours.

Every one of you is chasing the almighty algorithm, and having watched your content for years, I've watched you ride that wave.

No hate, I know you're not ignorant or even a "fanboy", just another person chasing clout / financial gain.

Also, as far as my repeated posts go, I wanted to do it better but the editing tools aren't the best (at least on mobile). Sorry, I had a lot of interaction to give you and wanted to have actual input on all the discussions going on.

Ultimately it's just helping you even more.

Edit:

Thanks for removing the empty duplicates though, I was tired of looking for the place to do it.
 
This is a pretty pathetic justification for acting like a fanboy. So it can't possibly be that we call it as it is?

Also, you haven't provided a single bit of evidence to support those claims, and presumably all the evidence you need is present on this very website. For your claim to hold water you'd need to provide evidence of where we chop and change based on what we think will bring in the most views; this should be extremely easy to prove.

Within the same generation we said the Radeon VII was crap for gamers but the 5700 XT was excellent value. So within the same generation we're favoring and then not favoring AMD... for views? Did I understand that correctly?

My evidence here: https://www.techspot.com/review/1848-radeon-vii-vs-geforce-rtx-2080/#:~:text=Traditionally AMD has done well,RTX 2080 is Battlefield V.
 
Let's see, should I bother getting involved in an obvious ranting from a fanboy? Hmmmm......

Na, everyone is entitled to their opinions. To answer your question, yes, I paid MSRP. Does that blow your mind? Because, no one could possibly obtain one without paying a scalper. lol.

It's called working in the industry, and knowing people. Your black-and-white thinking does not serve you. I've just met you and you want to nerd rage all over me for sharing my opinion? So be it, have at it.

I've used Nvidia for years and years. Honestly, having tried both products, I can honestly say I prefer my 6800 XT. If you don't agree, do us all a favor and keep your opinion to yourself, because you obviously can't have an opinion and be civil about it.

Saying Nvidia is still better after the review fundamentally did its best to keep Nvidia's advantages out of these numbers (like moving DLSS / RT to a separate article) just shows that Nvidia is clearly the winner.

When Nvidia gets handicapped in the things that absolutely are their main focus and they still come out on top, that just speaks to the value.

Steve has always leaned towards an AMD bias, but he's also not a delusional person / shill.

You say you got your 6800 XT Nitro+, but what you didn't say is what you paid...
More than likely (I can practically guarantee it) you didn't pay anywhere near the $649 MSRP they "claim".

You can be biased (most people are, especially when they spent the money you did to acquire something that's so hard to get), but objectively the 3080 overall has been the better card, from value to cost to availability.

Enjoy what you like and convince yourself any which way you want/need, that's fine... It's actually rational.

But at the end of the day, Steve's conclusion isn't one that can really be argued with, and with it coming from the guy known as the Nvidia fighter and AMD supporter that he is, we know it's likely only a personal opinion that could argue against it.
 
Shadowboxer - you underestimate the readership here - we come to tech sites because we love tech - just about none of us would walk into BigBarnPC and buy what some sales assistant tells us - we may seek their advice on reliability etc - possible deals, or they may know of something in the pipeline.

This review didn't hide anything - it gave caveats - as I said, I bought the RTX 3080 for my son - as I know I'm getting an LG950 screen, so it's faster in 4K - with DLSS and RT my son can Minecraft with all the bling - plus I think Nvidia can do some sound thingy as well (minor thought).

Both are great cards - if someone in the future can get the AMD card with 2 or 3 AAA titles thrown in, why not.

So no one is being fooled ( is it only YOU that can't be fooled - and you need to protect all us other ignorant readers )

Even new readers are catered for - I wish I'd known when I started ripping CDs to throw away the DVD drive and get one that speeds through scratched-up CDs with no problems.
Most folks spending $700, especially for the first time, will probably seek out quite a few reviews, reddit, their gaming forums etc.

The only main point of contention you could argue about is RT - which, as others say, is an extra at the moment. If you plan on having your card for some time it will become more and more important. Except for a complete noob, everyone - I mean everyone - knows Nvidia > AMD for RT.

Anyway, AMD have been amazing - new cards for consoles, new PC CPUs, new mobile, new server CPUs, collabs with Samsung on its mobile GPU - new GPUs that are quite comparable, better drivers - plus I think new ARM-like stuff etc - to advance across a wide spectrum all at once is quite incredible.
 
The RTX 3080 is aimed at 4K gaming and is faster at 4K. Also, the conclusion needs a little work. In the past, if two cards were equal at 1080p and 1440p but one went ahead at 4K, the card faster at 4K would be called the faster card, and we would talk about why the other card bottlenecked at 4K. The 6800 XT is meant to go head to head against the RTX 3080; it is marketed and was priced that way, 6800 XT vs. 3080. The 3080 pulls ahead at 4K (in almost every game), pulls ahead even more with DLSS at any resolution, and leads in DXR games, which are the future. This puts the RTX 3080 firmly ahead of the 6800 XT.

This is also architectural and not fixable with drivers. DLSS uses tensor cores, which are faster than compute shaders for that task, so super-resolution run on compute will never be as fast. Tensor cores execute the matrix work in one clock cycle, not many cycles like compute (4 cycles on Nvidia cards). The same goes for DXR: Nvidia's performance lead is architectural and can't be fixed with drivers. Both the ray tracing and rasterization pipelines operate simultaneously and cooperatively in the hybrid rendering model on Turing GPUs.
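
To put rough numbers on that cycle-count claim, here is a back-of-the-envelope sketch using the comment's own 1-vs-4 cycle figures. The unit count and clock speed below are hypothetical round numbers chosen for illustration, not real GPU specifications.

```python
# Back-of-the-envelope sketch of the cycle-count argument above.
# The 1-vs-4 cycle figures come from the comment; the unit count and
# clock speed are HYPOTHETICAL round numbers, not real GPU specs.

clock_hz = 1.5e9      # hypothetical 1.5 GHz core clock
units = 100           # hypothetical number of execution units

cycles_tensor = 1     # matrix op retired every cycle (per the comment)
cycles_compute = 4    # same work on general compute takes ~4 cycles

ops_tensor = units * clock_hz / cycles_tensor
ops_compute = units * clock_hz / cycles_compute

print(f"tensor-core path:    {ops_tensor:.2e} ops/s")
print(f"compute-shader path: {ops_compute:.2e} ops/s")
print(f"speedup: {ops_tensor / ops_compute:.0f}x")  # 4x under these assumptions
```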

RT Cores in Ampere accelerate BVH traversal, whereas in RDNA 2 this is done via compute shaders using the SIMD32 units. In RDNA 2, the texture units and the ray accelerators (RA) both use the texture processor. https://www.freepatentsonline.com/y2019/0197761.html https://images.anandtech.com/doci/15994/202008180220211.jpg This means the RA, which is in use at the start, has to finish before the card can use its texture units. The RA uses the texture cache for RT storage and BVH caching. https://static.techspot.com/images2/news/bigimage/2019/06/2019-06-28-image-18-j.webp On Nvidia's cards you can do both RT and raster work at the same time. https://static.techspot.com/articles-info/2151/images/2020-12-03-image-2-p.webp
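
For anyone wondering what "BVH traversal via compute shaders" means in practice, below is a minimal Python sketch of the classic stack-based traversal loop. On RDNA 2, something like this runs as ordinary shader instructions on the SIMD units (with the ray accelerators doing the box/triangle tests), while Ampere's RT cores execute the equivalent loop in dedicated hardware. The node layout and helper names are illustrative only, not any vendor's actual format.

```python
# Minimal stack-based BVH traversal: the loop that must run as compute-shader
# code when there is no fixed-function traversal hardware. The data layout
# and helpers are illustrative only, not a real driver or ISA format.

def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    """Standard slab test: does the ray intersect the axis-aligned box?"""
    tmin, tmax = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1, t2 = (lo - o) * inv, (hi - o) * inv
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse(nodes, origin, direction):
    """Iterative BVH walk with an explicit stack (GPUs avoid recursion)."""
    inv_dir = tuple(1.0 / d if d != 0.0 else float("inf") for d in direction)
    hits, stack = [], [0]                 # start at the root node
    while stack:
        node = nodes[stack.pop()]         # on RDNA 2 each iteration is SIMD
        if not ray_aabb_hit(origin, inv_dir, *node["bounds"]):
            continue                      # instructions; an RT core does the
        if "tris" in node:                # whole loop in fixed function
            hits.extend(node["tris"])     # leaf: ray-triangle tests go here
        else:
            stack += node["children"]     # inner node: visit both children
    return hits

# Tiny two-leaf tree: the root box spans both child boxes.
nodes = [
    {"bounds": ((0, 0, 0), (4, 4, 4)), "children": [1, 2]},
    {"bounds": ((0, 0, 0), (2, 4, 4)), "tris": [0]},
    {"bounds": ((2, 0, 0), (4, 4, 4)), "tris": [1]},
]
print(traverse(nodes, (1.0, 1.0, -1.0), (0.0, 0.0, 1.0)))  # -> [0]
```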
 
The choice isn't hard at all... I don't know why you are trying to sugarcoat it. The Nvidia card has better drivers, more performance, more features (like DLSS) and better ones (like RT), plus the whole software package is better with Nvidia. I would start to consider an AMD card when they come out with a product like the 9700 Pro, which beat everything. Otherwise, I wouldn't pay 700 bucks for a GPU with crappy drivers.
 
How long realistically will 10GB VRAM last when playing at 4K?

Is there any kind of trend we can look at that plots minimum or recommended VRAM against time in AAA PC games? Could there be an extrapolation?
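
One way to attempt that extrapolation: collect the recommended-VRAM figure from AAA system requirements by release year and fit a trend. A minimal sketch of the idea is below; the data points are made-up placeholders for illustration, not real survey numbers.

```python
# Sketch of the VRAM-trend idea: fit recommended VRAM (GB) against release
# year and project forward. The data points are PLACEHOLDER values for
# illustration only; the real exercise would use actual system requirements.
import numpy as np

years = np.array([2014, 2016, 2018, 2020])     # hypothetical sample years
rec_vram_gb = np.array([2.0, 4.0, 6.0, 8.0])   # hypothetical recommended VRAM

# VRAM demand tends to grow multiplicatively, so fit a line in log2 space
# (an exponential trend in linear space).
slope, intercept = np.polyfit(years, np.log2(rec_vram_gb), 1)

for year in (2022, 2024):
    projected = 2.0 ** (slope * year + intercept)
    print(f"{year}: ~{projected:.1f} GB recommended (extrapolated)")
```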
I thought about this and concluded it doesn't matter to me: the 3080 is the best mainstream card at 4K and it still struggles at top settings (not to mention ray tracing). Given how demanding 4K is, there is no long-term 4K GPU. If we want 60 fps at 4K in the latest games, we will have to upgrade to the -80 or -80 Ti variant every generation for a while, so if 10 GB is enough now, it'll be enough until a 4080 is necessary to deliver the 4K high-quality 60 fps goodness.
 
Agreed - to quote the original 1080Ti review "The GeForce GTX 1080 Ti is now the ultimate 4K gaming solution"
Guess the world has moved on a bit since then...
 
Haha yeah, for all the fans arguing about their favourite team, I’m just pretty impressed with how fast graphics cards are improving each generation (compared to other components).

At least at 4K 60 Hz we just have to keep buying new GPUs; the 144 Hz 1440p guys have to keep upgrading both GPU AND CPU (+MB)!
 
The article goes out of its way to mention RT & DLSS, which technically aren't even really relevant for this comparison, and the fanboys are STILL complaining that this is biased against nVidia...
 
If you're unbiased and have your own opinion on things you will get attacked from both sides, because they don't like it when you're not on their side. That's why it's impossible not to be controversial when you think for yourself.
 
This is also a very lazy comment. We evaluate every product individually, and if you take a proper look at our review history you'll find significantly more positive Nvidia product reviews than AMD; that's just a fact, and it's why your comment is very lazy.

It's the same old crap and we often heard the opposite during the AMD FX era, "ohh they've got a strong Intel bias". Didn't matter that Intel made significantly better products, nope that couldn't be it.

Anyway your AMD favoritism argument falls apart when you start to look at our content. Go find the pro Radeon VII coverage, what about the Vega 56 and 64 coverage? How about the RX 590 day one review? Don't be lazy with your opinions, do some actual research first.

Reading this comment made me decide to register an account so I could add my 2 cents, so to speak.

This review is incredibly lazy. Firstly, you didn't even bother to show charts of the average of the 1% lows across the 30 game benchmarks at 1440p or 4K, which is as important a metric to consider when purchasing video cards as the average frame rates.
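
For reference, "1% lows" are typically derived from per-frame render times roughly as sketched below; the exact method varies between capture tools, so treat this as an approximation.

```python
# Rough sketch of how a "1% low" FPS figure is commonly derived from a
# frame-time log: average the slowest 1% of frames, then convert to FPS.
# Exact methodology differs between capture tools.

def one_percent_low_fps(frame_times_ms):
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # worst 1% of all frames
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms                   # ms per frame -> FPS

# Example: a run that averages ~59 FPS but has a few 40 ms stutters.
frames = [16.7] * 990 + [40.0] * 10
print(f"average FPS: {1000.0 / (sum(frames) / len(frames)):.1f}")  # ~59.1
print(f"1% low FPS:  {one_percent_low_fps(frames):.1f}")           # 25.0
```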

I think it's also safe to say that most people who buy these cards, or are considering buying them, aren't purely interested in traditionally rasterised games and settings. If you don't care about 4K or 1440p gaming with RT enabled and everything turned up to max, or about ultra-high frame rates, buying one of these cards would be a complete waste of money. So your testing using only standard rasterised games/settings is totally flawed, as a large proportion of purchasers of these cards do actually care about features such as RT; to leave it out of your "shootout" is either lazy or just dumb. These aren't mainstream graphics cards. Techspot can do much better.

Finally, your conclusion that this is a tie is a joke. This is clearly a win for the RTX 3080 for the vast majority of consumers interested in purchasing a high-end graphics card, with its superior performance at 4K, far superior RT performance and AI supersampling, not to mention features like RTX Broadcast and good video encoding, which the 6800 XT is lacking.
 
Cool, I'm glad the low-single-digit percentage of people who play at 4K have DLSS (which the rest of us also have, and which looks like trash at actually popular resolutions), and that the ever-unusable, barely noticeable ray tracing is so very sublime. Always good to have lots of fringe tech covered, making the comparison really dumb and easy so you can just skip to the end for these same awful tropes again. The fun part is going back up to read that no, actually, it's not much of a tossup, but at least everyone can go on about butt-ugly NVENC, which is h.264/5 with a blurry, piggish coat of mascara.

Guess I'll keep my 2060 until next gen, when these amazing advantages Nvidia has might not be total dog**** in reality, or AMD actually stocks something other than Ryzen for once.
 
Budget HDR monitors are terrible and look worse with HDR on; this is well known, and plenty of reviews prove you need to spend decent money to get a screen with a decent HDR implementation.

Er, I don't know, my HDR TV is an InFocus one, bought years ago, made by Foxconn (they make value stuff, except iPhones), with a Japan 10th-gen Sakai 4K 75" 10-bit IPS panel from Sharp. Not expensive (for where I am from), about 1200 USD, but it gets the job done and delivered the HDR ray tracing and reflections goodness fine. I've never seen anything look as real in a game as Cyberpunk with these features on; RTX reflections and HDR especially seem to do magic with the skins, skins finally look like skins... With that said, if I were in the US I would get an OLED TV for sure... a 75" OLED here is like 3 times more expensive than what you can get in the US... 9 times more expensive than my InFocus IPS...
 
"This latest GPU shootout sees the faster and more expensive GeForce RTX 3080 pitted against the Radeon RX 6800 XT."

I'd have worded that starting sentence differently personally.

"We’d rather spend $150 on something like a second hand RX 570 and play some older games, or esports titles with lower quality visuals and wait it out."

100% agree, the stock and prices are stupid in the current market. I'm tempted to sell my RX 470 and double what I paid for it used in 2019.

But then what do you play with?
 
Er, I don't know, my HDR TV is an InFocus one, bought years ago, made by Foxconn (they make value stuff, except iPhones), with a Japan 10th-gen Sakai 4K 75" 10-bit IPS panel from Sharp. Not expensive (for where I am from), about 1200 USD, but it gets the job done and delivered the HDR ray tracing and reflections goodness fine. I've never seen anything look as real in a game as Cyberpunk with these features on; RTX reflections and HDR especially seem to do magic with the skins, skins finally look like skins... With that said, if I were in the US I would get an OLED TV for sure... a 75" OLED here is like 3 times more expensive than what you can get in the US... 9 times more expensive than my InFocus IPS...
I don't disagree with you, I was specifically talking about computer monitors.

There are some gems out there in the TV world as you've listed but even so, you tend to have to spend a bit of money for a proper worthy HDR implementation.

There is no such thing in the Monitor world, they all suck so far unless you spend serious money.
 
If you have a television that is FreeSync or "G-Sync Compatible" but not officially supported by Nvidia, then AMD definitely holds an advantage there. I had an RX 5700 and FreeSync worked perfectly; with the RTX 3080, while G-Sync definitely works, it flickers a lot (really, some of the frames get brighter, giving a flickering effect, not flickering black), especially if the framerate ever drops below 50 fps. I really like my RTX 3080, but I am disappointed that neither my 120 Hz 4K television nor my 144 Hz 1440p monitor is officially supported.

My only other complaint about the RTX 3080 is that I bought Nvidia rather than trying to get the RX 6800 XT because of the RTX and DLSS support. But, so far, very few games support RTX in general, and the ones that do often still run too poorly to enjoy. "The Medium" is a stutterfest with RTX Ultra @ 4K and DLSS Performance, even though these are Nvidia's "optimal" settings. In one particular area the game runs at 12 fps with ultra RT! I turned RT down to 'normal' or just 'on', but there was still a ton of stuttering, even if the framerates appeared to be acceptable. When I finally just turned RT off, the game's performance improved a lot, though the game still has performance issues even with no RT. Control is the only game I have played thus far where RT really worked, albeit DLSS is a necessity @ 4K.
 
No adaptive refresh rate on mine, simply straight 60 Hz, but I found turning motion blur to high helps; it looks natural too, and tearing becomes not as distracting. If G-Sync is needed on a 4K TV, LG OLED seems to be the only way to go.

To me Cyberpunk alone was worth it... more games should have RTX/reflections/HDR implementations as great as Cyberpunk's... and yeah, DLSS is defo necessary for 4K RTX Cyberpunk too; the balanced setting works great. Haven't tried The Medium. Control is alright, but the RTX in it is just OK. Watch Dogs Legion seems legit, and I'm looking forward to Dying Light 2 now, but no hurry, because I really need some rest after Cyberpunk heh..
 