AMD Radeon RX 6800 XT Review

Big Navi's main selling points:

1. No silly PSU adapter cable

2. Efficient power usage (lower temps)

3. Small GPU form factor without compromising cooling, compared to the 3080 etc.

4. More overclocking headroom, even at 4K

5. USB Type-C port!

To be honest, there's enough reason to go all-in for AMD this year!
 
The extra memory will benefit it once games start using more than the 3080's 10GB or the 3070's 8GB at 4K/1440p (or whatever resolution). So if in 2 years (or whenever)...
On account of "whatever": nearly all modern games support memory profiling, which can be selected based on the hardware in use. It could be that the new video card simply isn't listed for automatic memory profiling yet, but adding it and deploying a patch shouldn't take more than 5 minutes. It won't need 2 years :)
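Roughly what I mean, as a toy Python sketch (the VRAM thresholds and preset names are made up for illustration, not from any real engine): the "profile" is often just a lookup from the detected hardware to a settings preset, so covering a new card is a tiny patch.

```python
# Hypothetical sketch of hardware-based settings profiling, not real engine code.
# VRAM thresholds and preset names are illustrative assumptions.
PROFILES = {
    4:  "medium",
    8:  "high",
    10: "ultra",
    16: "ultra_plus",  # a new 16 GB card only needs one extra entry like this
}

def pick_profile(detected_vram_gb: int) -> str:
    """Return the preset for the largest VRAM threshold the detected card satisfies."""
    eligible = [v for v in PROFILES if v <= detected_vram_gb]
    return PROFILES[max(eligible)] if eligible else "low"

print(pick_profile(16))  # -> "ultra_plus"
print(pick_profile(6))   # -> "medium"
```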
 
Well done?

When you have had no competing flagship GPU for consecutive years, you'd better come hard. That should be expected, not praised.

Five years without a new CPU, you'd better come hard when you can; sooner rather than later is preferable. Again, the performance gained after that amount of time is expected to be competitive, not praised.

Good job AMD. It's about freaking time.
Keep it real.
When you are competing with a tenth of the budget, split between CPU and GPU development, this type of advancement is to be praised, not merely expected.

Fantastic job AMD!
 
On account of "whatever": nearly all modern games support memory profiling, which can be selected based on the hardware in use. It could be that the new video card simply isn't listed for automatic memory profiling yet, but adding it and deploying a patch shouldn't take more than 5 minutes. It won't need 2 years :)

Profiling in a game like Doom Eternal just restricts which settings can be used. Memory profiling doesn't help the RX 480 4GB versus the RX 480 8GB; the 4GB model simply loses performance and/or stutters when it exceeds its VRAM, until settings and/or resolution are reduced. It's all about lowering settings and/or resolution, especially if the VRAM is exceeded by a large amount.

Texture settings especially do this (even more so when combined with high resolution), and if a game exceeds the VRAM buffer by a large amount, as on the RX 480 4GB model, you get FPS drops or stutters until you lower that VRAM usage. The same applies to all GPUs when VRAM is exceeded by a large amount.
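To make that concrete, here's a toy Python sketch of that kind of restriction (the per-preset VRAM costs are invented numbers, not from Doom Eternal or any real engine): the game steps texture quality down until the estimated usage fits the card's buffer.

```python
# Illustrative only: per-preset VRAM estimates in GB are made up for this example.
TEXTURE_COST_GB = {"ultra": 6.5, "high": 5.0, "medium": 3.0, "low": 1.5}
ORDER = ["ultra", "high", "medium", "low"]

def highest_preset_that_fits(vram_gb: float, other_usage_gb: float = 1.0) -> str:
    """Step texture quality down until the estimated total fits in VRAM."""
    for preset in ORDER:
        if TEXTURE_COST_GB[preset] + other_usage_gb <= vram_gb:
            return preset
    return "low"  # nothing fits comfortably; expect stutter regardless

print(highest_preset_that_fits(4.0))  # 4GB card  -> "medium"
print(highest_preset_that_fits(8.0))  # 8GB card  -> "ultra"
```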
 
On account of "whatever": nearly all modern games support memory profiling, which can be selected based on the hardware in use. It could be that the new video card simply isn't listed for automatic memory profiling yet, but adding it and deploying a patch shouldn't take more than 5 minutes. It won't need 2 years :)

But honestly, I'm not meaning to panic anybody, and I wouldn't worry about it as a 3080/3070 owner if I were one right now; it would still be a few years away. And if worst comes to worst, dropping textures from Ultra to Very High/High (and maybe some other settings?) to compensate is honestly not a train smash. I have done it many times over the years, and it was never enough to upset me, because at least I could still play the games just fine at reasonable settings.

And it's all future talk; it's honestly nothing worth worrying about now, but something to consider if purchasing today. I was only explaining how the 16GB on the 6000 series will actually benefit it: in the future, when new, demanding games actually take advantage of the extra VRAM, not now with current games.
 
And now let's see how AMD handles keeping drivers up to date, something they've really struggled with in the past.
 
Great write-up as usual, with one glaring error in my opinion.
Steam October hardware survey:
66% of gamers are playing at 1080p, 9% at 2K, and 2.3% at 4K. Yet 1080p is left out.
 
"The advantages of the GeForce GPU may be more mature ray tracing support and DLSS 2.0, both of which aren’t major selling points in our opinion unless you play a specific selection of games. "

This sounds like an uninformed YouTube or Reddit comment. There are several dozen games already using both of those technologies. The parroting of "not enough games" might have been true 2 years ago, maybe even 1 year ago, but certainly not now, when 4 of the 5 biggest games releasing at the end of 2020 have either ray tracing or DLSS: Watch Dogs, Cyberpunk, Call of Duty, World of Warcraft.

RTX and DLSS are absolutely a selling point for buying new cards. Why would you buy one without them, especially now that consoles have this tech too? It makes no sense.

You don't need DLSS in every random game; it's enough to have it in the heaviest hitters of the year, the games that require top-of-the-line hardware to run well. Since AAA games aren't released in the hundreds per month, of course it's not going to be a huge number of games when you state it in a vacuum. But that number is a fair one as of now, and it's getting bigger with each passing month.
Completely agreed; it smells like bias on TechSpot's behalf. Looking at the numbers, it's very clear that DLSS would heavily tip the balance in Nvidia's favour. But TechSpot don't want consumers to know too much about that, apparently.
 
Great review as always Steve.
Nice job by AMD. Incredible performance, considering the resource deficit they have relative to Nvidia in staffing and R&D.
I'm with Steve here on ray tracing in games. We are still 3 or 4 generations away from full ray tracing and the realism that some games warrant, and by that time games will come on a 4TB NVMe stick due to their sheer size.
From what I can see there is virtually no difference, performance-wise, between the red and the green, but I do like the Nvidia FE design... but that's just me.
When you are spending £600-700 on a graphics card, £50 either way is not going to make much difference to your buying decision, although AMD have a chance here to take market share away from Nvidia due to their current woes with yields, and more power to their elbow if they do.
 
Completely agreed; it smells like bias on TechSpot's behalf. Looking at the numbers, it's very clear that DLSS would heavily tip the balance in Nvidia's favour. But TechSpot don't want consumers to know too much about that, apparently.
I'm sure anyone forking out £700+ on a graphics card already knows all about DLSS and its availability or otherwise.
 
Very impressive, and yeah, idgaf about RTX performance, so that's fine. Based on other reviews I just read, the reference card sounds really nice; I'd only go for a high-end AIB card to overclock it.
 
Great write-up as usual, with one glaring error in my opinion.
Steam October hardware survey:
66% of gamers are playing at 1080p, 9% at 2K, and 2.3% at 4K. Yet 1080p is left out.
The average 1080p result is shown, but it could be gleaned from the 1440p/4K figures anyway: the 6800 XT is in the 3080's ballpark at 1440p and slightly behind at 4K, so at the very least it would be on par with the 3080 at 1080p.
 
The average 1080p result is shown, but it could be gleaned from the 1440p/4K figures anyway: the 6800 XT is in the 3080's ballpark at 1440p and slightly behind at 4K, so at the very least it would be on par with the 3080 at 1080p.
Don't get me wrong I agree with you.
It's just that seeing how each game performs at 1080p could be a deciding factor.
 
RTX and DLSS are absolutely a selling point for buying new cards. Why would you buy one without them, especially now that consoles have this tech too? It makes no sense.
I completely disagree with one of your points and half-agree with the other.

I disagree with your assertion that RTX matters because DirectX ray tracing is now a thing. Developers can support RTX but they must support DirectX. Ergo, RTX is irrelevant.

As for upsampling, I agree that it is a major selling point for buying new cards and a phenomenal technology that, unlike ray tracing, is a REAL game-changer. However, I disagree that it's relevant for top-end cards like this, because they can already game natively at 4K. While upsampling does look good, it still doesn't match native resolution, so top-end cards won't really be using it. On mid-to-lower-end cards, though, good upsampling tech will be absolutely invaluable, and THAT is the price point at which most video cards are sold. This makes upsampling tech very important.

IF technologies like nVidia's DLSS and ATi's FidelityFX turn out to be as good as it appears they will be, they might be screwing themselves over, because what gamer is going to pay big money for a top-end card like the RTX 3080 or RX 6800 XT when upsampling technology can give an almost identical gaming experience on a high-end card like the RTX 2070 or RX 5700 XT? Hell, one day upsampling tech may be so good that you'd be able to game at 8K with a GTX 1660 Ti's or 5600 XT's level of rasterisation performance.
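A quick back-of-envelope on why that pays off so much on cheaper cards (the per-axis scale factors below are my assumed approximations of typical "quality"/"performance" upscaling modes, not official figures):

```python
# Rough pixel-count math. The internal-resolution scale factors are assumed
# approximations of common upscaling modes, not official values.
OUT_W, OUT_H = 3840, 2160  # 4K output

for mode, scale in [("quality", 0.67), ("performance", 0.50)]:
    in_w, in_h = int(OUT_W * scale), int(OUT_H * scale)
    shaded = (in_w * in_h) / (OUT_W * OUT_H)
    print(f"{mode}: renders {in_w}x{in_h} internally, ~{shaded:.0%} of the 4K pixel count")
# A "performance" style mode shades only about a quarter of the output pixels,
# which is why a much weaker GPU could plausibly drive a 4K (or one day 8K) output.
```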

I can't tell you how much I look forward to that day. :D
 
Completely agreed, it smells like bias on techspots behalf. Looking at the numbers it’s very clear that DLSS would heavily tip the balance in Nvidias favour. But Techspot don’t want consumers to know too much about that apparently.
They address DLSS in the beginning of this article:

The advantages of the GeForce GPU may be more mature ray tracing support and DLSS 2.0, both of which aren’t major selling points in our opinion unless you play a specific selection of games. DLSS 2.0 is amazing, it’s just not in enough games.

This even includes a link to the DLSS 2.0 article for those inclined to check it out.

What I like about Tech Spot's reviews is the good mix of games and how they point out which titles are sponsored by which GPU maker. Don't recall seeing this at sites such as TPU.
 
Big Navi's main selling points:

1. No silly PSU adapter cable

2. Efficient power usage (lower temps)

3. Small GPU form factor without compromising cooling, compared to the 3080 etc.

4. More overclocking headroom, even at 4K

5. USB Type-C port!

To be honest, there's enough reason to go all-in for AMD this year!
I've been doing it since 2009 and have had no regrets. :D
 
I was pleasantly surprised originally, at the time of the announcement, that AMD came out with competitive cards. While the review shows the 6800 XT beating the 3090 at 1440p in most games, it seems to me that no one really needs the enormously high frame rates attained there, so the 4K results, where the 6800 XT consistently lags the 3080, would be more relevant.
However, the 6800 XT has more memory, and that should help somewhere. NVIDIA, of course, is ahead in the software they provide for their cards, and that adds to their value.

I think the numbers will matter to everyone using 1440p monitors, which, if I were to guess, outnumber 4K monitors.


They address DLSS in the beginning of this article:

This even includes a link to the DLSS 2.0 article for those inclined to check it out.

What I like about Tech Spot's reviews is the good mix of games and how they point out which titles are sponsored by which GPU maker. Don't recall seeing this at sites such as TPU.

We already knew what his choice was going to be before this article dropped so I wouldn't even bother.

If I were him I would be focusing on upgrading from a Haswell-based CPU and getting onto a modern platform rather than worrying about these high-end GPUs.
 
It doesn't, the 3080/3090 suffer at 1440p. Check the 6800 XT scaling with the 2080 Ti for example.
I have subtracted the 2160p FPS from the 1440p FPS, and the differences are as follows:
Shadow of the Tomb Raider: 6800 XT 169, RTX 3080 149, RTX 2080 Ti 120, 5700 XT 100, RTX 2070S 92
Horizon Zero Dawn: 6800 XT 51, RTX 3080 45, RTX 2080 Ti 38, 5700 XT 33, RTX 2070S 29
Resident Evil 3: 6800 XT 92, RTX 3080 92, RTX 2080 Ti 70, 5700 XT 50, RTX 2070S 52

As you can see in the first two games, RDNA cards seem to suffer more at 4K, let alone in other games where the margins are bigger; maybe that explanation regarding the bandwidth is correct?
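As a side note on reading those deltas (a quick Python sketch; the FPS pairs below are placeholder values, not the review's actual results): absolute differences naturally shrink for slower cards, so the share of 1440p performance retained at 4K is arguably a fairer way to check for a bandwidth limit.

```python
# Placeholder (1440p, 4K) FPS pairs purely to show the calculation;
# plug in the review's real numbers to check the scaling yourself.
results = {
    "6800 XT":  (220, 120),
    "RTX 3080": (210, 125),
}

for card, (fps_1440p, fps_4k) in results.items():
    retained = fps_4k / fps_1440p
    print(f"{card}: keeps {retained:.0%} of its 1440p performance at 4K")
```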
On Tom's Hardware they noticed that ray tracing in Watch Dogs doesn't look the same; have you noticed anything similar?
Thanks for replying.
 
They address DLSS in the beginning of this article:



This even includes a link to the DLSS 2.0 article for those inclined to check it out.

What I like about Tech Spot's reviews is the good mix of games and how they point out which titles are sponsored by which GPU maker. Don't recall seeing this at sites such as TPU.

I think the mix of games is odd and doesn't include a lot of the most-played PC games right now, or even the big new title that was released the other day: COD. It has RTX features though, so it probably performs better on Nvidia. Can't have people knowing that though, can we? Does it even run better on Nvidia? I wouldn't know; I guess I will have to go to another website. Of course they benchmarked Dirt 5 and F1 2020, because apparently that's more relevant than COD.


Death Stranding was tested for some reason with DLSS off, with no mention that performance and visuals can be boosted with it. Can anyone tell me who in their right mind would turn DLSS off in Death Stranding? It looks better and runs better with it on. Incidentally, my RTX 2080 runs Death Stranding better than the 3080 does in this benchmark. Of course, that's with DLSS on.
 
Great write-up as usual, with one glaring error in my opinion.
Steam October hardware survey:
66% of gamers are playing at 1080p, 9% at 2K, and 2.3% at 4K. Yet 1080p is left out.

If you correlate those percentages with the amount of stock at launch, TechSpot is spot on in omitting 1080p.
 
On account of "whatever": nearly all modern games support memory profiling, which can be selected based on the hardware in use. It could be that the new video card simply isn't listed for automatic memory profiling yet, but adding it and deploying a patch shouldn't take more than 5 minutes. It won't need 2 years :)

Sorry, I just reread this post of yours and I think I might have misunderstood you. Do you mean that game devs could adjust existing profiles, or make new ones for current games, to make use of the 16GB on the 6000 series now rather than in the future? Because if that is what you meant, then I completely misunderstood you; my apologies for the confusion.

But if that is what you meant, then maybe? I am really not sure; your guess is as good as mine as to whether game devs can adjust or create profiles to take advantage of more, or all, of the 16GB now. I am not sure that would benefit performance either, if games at Ultra quality already fit into 8GB and 10GB VRAM buffers. It is more about future games with higher-quality textures and graphics, etc. At least that is how it has always worked.

Any time a GPU comes along with more VRAM than games require, it doesn't benefit them until games actually require that extra VRAM organically, through increased texture quality, detail levels and so on. It is not really about increasing FPS with more VRAM; it is about using the highest settings without losing as much performance due to insufficient VRAM.

Games becoming prettier, with higher-resolution textures that are larger in size, is the prime example, because texture quality is one of the least demanding settings FPS-wise but the most demanding VRAM-wise. Other graphics settings use VRAM as well, such as detail levels and draw distance, just not nearly as much as textures do, and using a higher resolution also increases VRAM usage. The upcoming DirectStorage API might also bring some performance benefits when paired with a GPU with more VRAM, but that is just speculation on my part, so don't take it as fact. And with both next-gen consoles having just released, we will likely see a sharp increase in game requirements as next-gen games start releasing over the next year or two, so that extra VRAM could be made use of in some games by then. I am honestly not sure, though; we will have to wait and see.
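For a rough sense of the scale involved (back-of-envelope numbers only, assuming roughly 1 byte per texel for a block-compressed format such as BC7, plus about a third extra for mipmaps; these are approximations, not figures from any particular game):

```python
# Back-of-envelope texture memory, assuming ~1 byte per texel for a
# block-compressed format plus roughly 33% extra for the mipmap chain.
def texture_mb(size_px: int, bytes_per_texel: float = 1.0, mip_overhead: float = 1.33) -> float:
    return size_px * size_px * bytes_per_texel * mip_overhead / (1024 ** 2)

for size in (2048, 4096, 8192):
    print(f"{size}x{size}: ~{texture_mb(size):.0f} MB")
# Each doubling of texture resolution roughly quadruples VRAM use,
# while the FPS cost of the higher-quality texture stays small.
```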

I just don't want to go too far into speculation about the 16GB's potential future benefits, because I don't want to worry anyone with other GPUs unnecessarily. I am sure the 16GB will prove useful at higher resolutions and settings sometime in the future; I just can't say when with certainty. It could be useful for some new games in 2 years, could be sooner, could be later; I honestly can't give you a date. But GPUs with less VRAM (e.g. the RTX 2080/3070/3080) could always just lower some settings and/or resolution to compensate, same as always: you lower visual fidelity to increase or restore performance.
 