GeForce RTX 3080 vs. Radeon RX 6800 XT: 2023 Revisit

It'll be interesting to revisit this next year, once games are fully built for the current generation of consoles; technically we're still in the cross-gen development phase. I will say AMD was right to set the MSRP $50 below the RTX 3080 (roughly 10% less), as the data shows it is priced right. The issue now is that AMD needs to sell through the remaining RX 6800 XT stock before it can launch the RX 7800 XT and 7700 XT.
 
Great article, smart conclusions.
After reading this article, customers can make a better decision on which video card to buy at the right price.
Also, from your review, we can better understand why Nvidia priced its RTX 4070 at $600.

1. I suggest a small correction to: "The current state of DLSS support and quality is excellent, this truly is the key benefit of GeForce GPUs and we've been sure to make this known for some time now – basically ever since DLSS got good with its second iteration."
Better to write "benefit of GeForce RTX GPUs", because DLSS 2 is not supported by Nvidia on its GeForce GTX GPUs.

2. I have another suggestion for deciding which video card is better when the performance differences are very close.
For this we can look at both the average frame rate and the 1% lows.
Let's say the average frame rate, or the 1% lows, for video card 1 is higher than video card 2 by only 3-5%.
Usually we would call it a tie due to the margin of error in testing.
But I consider 1% lows more important than the average frame rate in deciding which card offers the better gameplay experience, especially at lower frame rates. So, when the results are within the margin of error, I suggest taking into consideration which video card has the better 1% lows and calling it "better".
From this point of view, in Atomic Heart I would say the RTX 3080 is (slightly) better at 4K, because its 1% lows are higher, even if the difference is close to the margin of error (45 vs 42 fps).
In Hogwarts Legacy at 4K, the 6800 XT is slightly better than the RTX 3080 because its 1% lows are higher (39 vs 37 fps).

In general, even if video card 1 has a higher average frame rate than video card 2, but its 1% lows are lower by more than the ~5% margin of error, then I would say video card 2 is better at offering smooth gameplay (see the sketch after point 3 below).

3. Regarding "The only real outlier here is Hogwarts Legacy where the RTX 3080 runs out of VRAM, but the 6800 XT wasn't able to provide playable performance here either, so technically that result is void".
I would rule out the 4K results for A Plague Tale: Requiem too, for both video cards, because neither card offers minimally playable performance.
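
To make the rule from point 2 concrete, here is a minimal sketch (my own, not from the review) of how the tie-break could be expressed in code; only the Atomic Heart 1% lows (45 vs 42) come from the data quoted above, while the average frame rates in the example are hypothetical placeholders.

```python
# Minimal sketch of the tie-breaking heuristic from point 2 (not from the article).
# Assumption: results within ~5% are treated as a margin-of-error tie, and the
# card with the higher 1% lows is then called "better" for smoother gameplay.

def pick_smoother_card(card1, card2, margin=0.05):
    """Return the name of the card judged better by the 1% lows heuristic."""
    avg_gap = abs(card1["avg"] - card2["avg"]) / max(card1["avg"], card2["avg"])
    if avg_gap > margin:
        # Clear winner on average frame rate: no tie-break needed.
        return card1["name"] if card1["avg"] > card2["avg"] else card2["name"]
    # Within the margin of error: prefer the higher 1% lows.
    return card1["name"] if card1["low_1pct"] >= card2["low_1pct"] else card2["name"]

# Atomic Heart 4K example: 1% lows (45 vs 42) are from the review; averages are placeholders.
rtx_3080 = {"name": "RTX 3080", "avg": 60, "low_1pct": 45}
rx_6800_xt = {"name": "RX 6800 XT", "avg": 61, "low_1pct": 42}
print(pick_smoother_card(rtx_3080, rx_6800_xt))  # -> RTX 3080
```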
 
I was able to get a 3070 FE on day one, but just a couple of weeks later I was able to purchase a 3080, so I sold the 3070, which at the time actually paid for my 3080, though nothing like the crazy prices it commanded later. I'm glad I decided to go with the 3080. The 6800 XT was clearly a good choice as well, but it was nearly impossible to get from launch until just a few months ago.
 
My daily driver is an RTX 3080, and these two articles grabbed my attention for that reason. I'd really love to see the same tests run on a non-overclocked Intel 13900K with an ASUS HERO board and XMP timings.

The reason is that when I'm benchmarking at home on my development rig, the RTX runs about 10-20% better than your graphs show, so I think your testing method might be a little off.
 
10GB is on the edge, but let's scream for 12GB and 16GB minimums anyway and accuse Nvidia of maliciously planning this! What a joke. Ever think AMD has more VRAM because they can't compete evenly in RT, so that's the next best thing they can do? It's not even G6X! I guess it worked, because everyone was losing their ish over the 3070 and every other 8GB card on the planet. Meanwhile, no one is buying Radeon! Have you seen their market share? You used their failing practices and expected Nvidia to follow their lead? Radeon isn't a leading brand! What the heck?

Because no one could have just said the 3070 is over the edge, but that luckily, after two years, it's not the end of the world to start dropping image quality settings. That's exactly what Steve replied when asked where the 4070 12GB would be two years from now in a HUB Clips video.

4070 has 12GB. 4080 has 16GB.
Can we put this bs to rest now???

/rant
 
Well, whether we want/need it or not, it seems like 12GB of VRAM is being heavily promoted as the minimum requirement, just like how NVIDIA is trying hard to sell the RT feature. I will just sit back and relax, waiting for the day we can buy a 12GB G6X GPU for $400/€400; then I'll think about upgrading xD
 
1. I suggest a small correction to: "The current state of DLSS support and quality is excellent, this truly is the key benefit of GeForce GPUs and we've been sure to make this known for some time now – basically ever since DLSS got good with its second iteration."
Better to write "benefit of GeForce RTX GPUs", because DLSS 2 is not supported by Nvidia on its GeForce GTX GPUs.

Don't waste your time.

Something has happened to HUB and Steven: for some strange reason, in every single damned review he has to paint DLSS as this magical must-have selling point, yet he conveniently ignores facts like the one you described or, worse, never calls it out for what it really is: proprietary tech to keep everyone locked into Nvidia hardware, stifling the gaming industry of alternatives since it isn't an open standard.
 
Don't waste your time.

Something has happened to HUB and Steven: for some strange reason, in every single damned review he has to paint DLSS as this magical must-have selling point, yet he conveniently ignores facts like the one you described or, worse, never calls it out for what it really is: proprietary tech to keep everyone locked into Nvidia hardware, stifling the gaming industry of alternatives since it isn't an open standard.
Thank you for your support.

This post is for the forum and also for the TechSpot editors and HUB.
Both Steven and Tim have called it out from time to time, and were, and are, among the first to take a stand.
But because Nvidia effectively tried to blackmail them, they are now more careful about how they present the advantages and benefits of Nvidia products like DLSS, so as not to be painted as anti-Nvidia or, worse, as AMD fanboys. And Nvidia has a bigger army of passionate fanboys (which is OK), but also an army of trolls (which is not OK), than others.
What Nvidia did to HUB, dictating to them how to write a review, is the most disgusting and worst thing a corporation can do to a passionate and honest reviewer.
And because of that, we now know what kind of evil company Nvidia can be from this point of view. For some it may be quite shocking how Nvidia has managed to bring us great products and technologies and at the same time be the bad guy, but facts always speak louder than any of their PR.

My post about DLSS and Tim's articles may be perceived by many as quite fierce, but I hope my points stand up to objective scrutiny. My intention was to make the forum and the TechSpot editors aware that shoving too much DLSS down readers' and customers' throats is enough, and comes too close to unintentional commercial PR. Especially when many articles praised DLSS's strong points, which is OK in proper quantities, but it is too much, and not OK, when they did not discuss, or blatantly disregarded, the weaknesses and significant limitations that the same DLSS has. And all of this while not applying the same standards and methodology to AMD FSR (which is also a great technology).

Another important remark I want to make is that for us, the readers of TechSpot, it is much easier to express our amateur or semi-professional opinions, but it is much harder to step into the reviewers' shoes and do the hard work they are doing.
I appreciate their hard work and commitment, and from this point of view I can confidently say that the TechSpot editors, Steve and Tim, are among the best reviewers I have found on the net. They are not perfect, and they are entitled to make mistakes, as we all are. I do not ask or look for such imaginary perfection.

Summarizing my ideas, I asked myself what I can do to bring more positive value to TechSpot, to this forum and to this amazing hardware community.
Writing some posts is a small step in this direction, but it is not enough. I found a better and more effective solution: supporting, through a decent paid subscription, the websites and/or reviewers' channels that are passionate, skillful and bring great content and reviews to us.

So I encourage everybody, as much as they feel is appropriate and can afford, to support TechSpot and/or reviewers like HUB through a subscription.
They well deserve it from my point of view, and that's why I subscribed; by doing so we will continue to receive more and better content.


I can tell you, after many years of constant research, reading, watching and analyzing hardware websites and YouTube reviewers, that TechSpot and HUB are among the best sources of information about computers and hardware in the world.
I can say that because when readers engaged with and reacted to their content, or pointed out what they thought was not good, a mistake, or even something wrong, TechSpot and HUB, for example, more often than not responded in a positive and constructive way and made extra efforts to improve their content and/or correct their mistakes.

Another reason I love TechSpot is that here I found the best forum and community to share our passion for computers, IT, science, and sometimes even politics. That's why I love this chat and community and engage with you.
Even if we may disagree sometimes, by engaging in civilized and constructive debate we can share our opinions, find new information and different points of view, and thus improve ourselves and expand our knowledge, experience and hobbies.

P.S. I have found other great sources of information besides TechSpot and HUB, like Hilbert Hagedoorn from Guru3D and many more; apologies if I don't mention more of them. Thank God they are not alone.

P.S.2 Oh, and the sense of humor and the article titles from TechSpot are quite often liquid gold, the forum included.
:)
 
My 3080 works just fine for the time being, but it is a real shame Nvidia did not put more VRAM on this card!
I feel cheated, even though I bought it second-hand really cheap in December 2022.
The GPU is very capable, even today, and it would be a real shame to see it hampered by lack of VRAM...
This card deserved 12 GB from the start, if not more!
 
I'd also add that DLSS is so far ahead of FSR that it should be given more consideration. Indeed, XeSS on an Intel GPU looks as though it's beating FSR in image quality.

I was lucky to pick up the 3080 TUF OC (£720) at launch, knowing it was a one-generation card, but it's on the shelf now as it just feels a little low on horsepower even at 1440p. I went with a 4090, again at launch, for the huge RT bump, but this time I'm hoping it lasts two generations.
 
Team Red is so used to favourable coverage that when coverage is neutral and the strengths of each product are covered, it feels pro-Nvidia.

Looks like team green are the ones experiencing fine wine lol; the gap has widened and DLSS is better than ever. At least AMD has VRAM, eh.
 
Dude. The 4080 is $1200.

Twelve. Hundred. Dollars.
If I remember correctly, the first time we saw the GeForce GTX Titan it was about $1,000 and everyone was shouting about how expensive it was... Now it looks like a $1,000 GPU is the standard. Not a good time to be a PC gamer if you ask me.
 
Shared this today in our weekly newsletter (we usually send it twice per week and include a fun gallery photo in every issue)...

View attachment 88991
Loool, and thank you for helping me start my workday with the best joke of the day already.
You should have seen me laughing out loud in front of my monitors. My work mood has skyrocketed.
Please keep sharing the fun gallery with us too. :laughing:

P.S. Indeed, an inspired image is worth a thousand words. Or many hours of debate. :laughing:
 
Dude. The 4080 is $1200.

Twelve. Hundred. Dollars.

Lol that's bad but not that bad.

Admittedly, May 2021 was a bad time to be left without a GPU, never mind that it was the year I'd planned for my five-year upgrade. Nor to reach 3440x1440 comfortably and with reassuring longevity, something denied to me by similar (though not as extreme) price madness in late 2016, when somehow my premium 1070 was under £500 but a similarly premium 1080 was £900... Anyway, as is well remembered, everything was nuts back in 2021, but some things were more nuts than others. I decided on a 6800 XT (a Sapphire Nitro+ SE, because I like nice cards with pedigree) at £1,200, yes, way more than MSRP but also immediately available, over sticking with Nvidia and a 3080 that ranged from £1,800-2,400, kept that margin for months, and wasn't immediately available, with stock mostly TBA for months (indeed I know of people who ordered their 3080 at that time and weren't served until near Christmas). So, out of the worst year for it, I got the best I could, and we could call that 6800 XT £900, as smart buying and deals elsewhere on that upgrade saved me £300 on the PCPartPicker prices (which also gave me a lot more money to go higher on the other parts versus my options with the 3080).

And that 6800 XT, along with AMD's other improvements, has turned out to be a winner versus Nvidia's missteps. And... well, VRAM and raster versus price, when RT is niche, inconsistent and has little coverage, and when DLSS is either needed to claw performance back where RT is used or to make up for a lack of VRAM (or poor optimisation) where it isn't, keeps AMD as a first-look option for me at least. Aside from those points (minor ones, made large via advertising, hype and mindshare), AMD is even when it comes to simply getting the best out of gaming.

And even now the price-to-performance difference for the 6800 XT vs the 3080 (£500-600 vs £700-800) is around the same as it is for the 7900 XTX vs the 4080. The main difference is that my 6800 XT is going to keep slugging away for much longer on VRAM alone, as the 7900 XTX will over the 4080 in the years to come. Nvidia's somewhat better RT, with still too few dedicated assets to use it without major caveats (even at the high end), along with lower VRAM overall, just isn't worth paying more for.

Next gen? Maybe it'll be a closer choice again... but as far as I know, Nvidia are already taking a leaf out of AMD's book for that. All I know for sure right now is that if I had a 3080 I'd feel more pressed to upgrade sooner to stay ahead at 3440x1440 than I do with the 6800 XT, and if losing ~5 fps or so in the odd game is the cost (when I average 80-100 fps on ultra in AAAs anyway), then fair enough.
 
I have a 3080 12GB and later bought a 6950 XT when it fell below $650 and came with a free game. Both GPUs are very powerful and both run great; I have never had a problem with either. The Last of Us was the free game I got with my GPU; the game was a disaster, with crashes and performance issues on both GPUs. The 6800 XT seems like a great card as well.
I don't think fanboying over either team is necessary. Get what works for you, don't get mad at others for the decision they make, just game on and mind your own business.
 
I was able to get a 3070 FE on day one, but just a couple of weeks later I was able to purchase a 3080, so I sold the 3070, which at the time actually paid for my 3080, though nothing like the crazy prices it commanded later. I'm glad I decided to go with the 3080. The 6800 XT was clearly a good choice as well, but it was nearly impossible to get from launch until just a few months ago.

I bet you were sorry you sold the 3070 too early ;)
 
Even second-hand 3080s are way dearer than the 6800 XT in Australia, and for photo editing and the like the 16GB is required. My AI processing runs much faster on the 6800 XT than on my 2080 Super.
 
In the near future the lack of VRAM will kill the 3080's performance in RT, which was its selling point for some.

The 3080 will be 3 years old very soon and still does RT better than most AMD cards, so what's your point? Do you expect a $699 GPU to run games maxed out for 5 years with RT enabled? I sure don't...

My 3080 still runs very well, and not once has VRAM been an issue.
I don't use RT at all, though. Couldn't care less about RT, actually. However, I like all the other RTX features, DLDSR especially.

I will upgrade later this year when the 4080 Ti launches (either buy a cheaper 4080 or just buy the 4080 Ti). Maybe the 7900 XTX will be considered if the price drops below $800, but I am not sure I can live without DLDSR again. It's an awesome feature for single-player games, delivering close to native 4K quality on my 1440p panel with very little performance impact. Games look way sharper and crisper, and it functions as anti-aliasing as well.
 