AMD Radeon RX 6800 XT Review

Thanks for the review - I'm more interested to see their midrange paired with a 5600X.
Maybe I'll build my son a nice PC with one of those or a 3060 - all being equal, I'd probably give my money to AMD. My PC has a 2060, so that's good enough for me.

Good games don't need 4K - they need to be well mastered and immersive. Your brain filters out detail all the time - that's why making water look like water was a big deal, and hair look like hair. I would much rather play in a 1080p jungle with ambient sounds, light, insects, and bird calls or howler monkeys than insipid 4K detail. Anyway, 1440p gaming looks really good now - add an HDR screen.
Most phones can shoot in 4K - that doesn't mean your videos will look dreamy like well-mastered movies.
 
You just count them by hand; there's no official source (see the sketch after the list for one way to keep such a tally). As I've said in my post, I've included games with DLSS and ray tracing - ones that have either feature or both.

Anthem
Amid Evil
Battlefield V
Bright Memory
Call Of Duty: Black Ops Cold War
Control
Cyberpunk 2077
Death Stranding
Deliver Us The Moon
Dirt 5
Edge Of Eternity
Enlisted
F1 2020
Final Fantasy XV
Fortnite
Ghostrunner
Justice
Marvel’s Avengers
MechWarrior 5: Mercenaries
Metro Exodus
Minecraft
Mortal Shell
Mount & Blade II: Bannerlord
Monster Hunter: World
Pumpkin Jack
Ready Or Not
Shadow of the Tomb Raider
Xuan-Yuan Sword VII
Watch Dogs Legion
War Thunder
Wolfenstein Youngblood
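
A minimal sketch of how such a hand-counted tally could be kept, since there is no official source. The feature flags below are illustrative examples only, not an authoritative database:

# Hand-maintained tally of per-game feature support (illustrative flags only).
games = {
    "Control":  {"dlss": True,  "rt": True},
    "Fortnite": {"dlss": True,  "rt": True},
    "Dirt 5":   {"dlss": False, "rt": True},
    "Anthem":   {"dlss": True,  "rt": False},
}

# Count titles that support either feature, or both.
either = [g for g, f in games.items() if f["dlss"] or f["rt"]]
both = [g for g, f in games.items() if f["dlss"] and f["rt"]]
print(f"{len(either)} games with DLSS or RT, {len(both)} with both")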


It's not in the low 20s, and games that support either DLSS, ray tracing, or both are now coming out every month. The argument that there aren't enough games simply doesn't hold water anymore. Like I've said, you will get new games with these features every month. You can't have a gigantic list of games supporting this, because only a fixed number of games come out each month and not all of them need DLSS, only the largest games. The way Steve leans towards AMD in his article and ignores and shits on RTX and DLSS is frankly embarrassing, and people are already calling him out in forums.

How many of the games listed are worth buying and playing? I would argue that the majority of the games you listed are garbage. The most recent games run best on RDNA 2, simply because of the consoles. AMD will leverage the optimizations that Sony and Microsoft are developing and using, and over time AMD will gain a clear advantage in next-gen titles. Exclusive features like DLSS and RTX will eventually lose traction, since the development process requires Nvidia hardware to run them. The overwhelming majority of future games will be made targeting the lowest common denominator, the consoles. Three heads are better than one. AMD just released this hardware, and it's very early in terms of drivers and features. I think DLSS is great, but AMD said they will have some up-sampling tech coming in January. Nvidia does have a clear lead in RT, but again, how will it age compared to RDNA 2? If you look at AMD's last-generation hardware, the RX 5700 XT, you'll notice that over the last year or so AMD has gained ~10% performance vs Turing.
 
And now let's see how AMD handles keeping drivers up to date. Something they've really struggled with in the past.
Yep. That's one of the reasons why, while I'm cautiously optimistic right now, I'm not really flipping out either. I can wait a few months to see how things shake out with availability and support for both contenders...
 
Your preference of choice does not mean that AMD doesn't deserve a "well done" (which was clearly the point I was responding to) for finally bringing full competition to the high end, with a showing of better value in dollars per frame and performance per watt, and 1080p performance unmatched even by the competitor's $1500 product.

Do you really think that it does?

It's not blowing Nvidia out of the water (except at rasterized 1080p), but for almost everyone, except perhaps the most die-hard green fans and those who only play RT games (or 4K MS Flight Simulator ;)), it deserves full consideration and respect based on this showing.
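
To make the value comparison concrete, here is a minimal sketch of the dollars-per-frame math. The MSRPs are the real launch list prices; the average-FPS figures are hypothetical placeholders, so substitute the averages from the review:

# Dollars per average frame at launch MSRP. The FPS values below are
# hypothetical placeholders - plug in the review's own benchmark averages.
cards = {
    "RX 6800 XT": {"price": 649, "avg_fps": 100},  # placeholder FPS
    "RTX 3080":   {"price": 699, "avg_fps": 100},  # placeholder FPS
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['avg_fps']:.2f} per frame of average performance")

With identical placeholder FPS the two tie; the review's actual averages are what decide the value ranking.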
lol@deserves consideration and respect
Not Big enuff Navi does nothing NVIDIA isn't already doing.
Dropping price is all AMD has as an advantage - again.
 
lol@deserves consideration and respect
Not Big enuff Navi does nothing NVIDIA isn't already doing.
Dropping price is all AMD has as an advantage - again.

It was Nvidia that dropped their prices dramatically from RTX 20-series levels. AMD's pricing is what has been consistent, and consistently better for the consumer, and now they have the high-end performance to boot and better value in every category... and that irks you, because why? (rhetorical question... we all know why)
 
Well, I feel even more vindicated in my decision to go 6800 XT + 5900X for my new build, and this is with first-release drivers and no equivalent of DLSS yet.

However, yet again another pathetic launch where cards sold out in minutes, despite the BS about much, much better stock levels than Nvidia. I doubt I'll be able to get my PC rebuilt before March if, god forbid, I only want to pay RRP.
 
It was Nvidia that dropped their prices dramatically from RTX 20-series levels. AMD's pricing is what has been consistent, and consistently better for the consumer, and now they have the high-end performance to boot and better value in every category... and that irks you, because why? (rhetorical question... we all know why)
They finally have high-end performance...
 
I came specifically to see the 4K comparisons for Microsoft Flight Simulator 2020.
The 6800 is plenty capable, but the 3080/3090 is where you want to be.
The 3080 maybe (although it's not a good long-term buy with its low VRAM allocation); the 3090 is still a vanity card at that price, I'm afraid. The 3080 Ti and 6900 XT are gonna be the ones to have, once they're released and you can actually buy them. I picked up a 3080 recently and am very happy with it. I'll likely move on to the next round in a couple of years though, due to that low VRAM.
 
Well, I feel even more vindicated in my decision to go 6800 XT + 5900X for my new build, and this is with first-release drivers and no equivalent of DLSS yet.

However, yet again another pathetic launch where cards sold out in minutes, despite the BS about much, much better stock levels than Nvidia. I doubt I'll be able to get my PC rebuilt before March if, god forbid, I only want to pay RRP.
Moore's Law Is Dead on YouTube thinks a decent supply of 6800s will come by the end of the year, so there's hope you could get one in December or January.
 
Nice write-up, thank you. I'm curious about the article's position on ray tracing as well. I agree that at this exact moment there are very few compelling implementations of ray tracing. Ironically, though, the author picked some of the very worst examples of RT to make his point. Was that intentional?

Anyway, we're already seeing rapid RT adoption this new console generation, so clearly there will be lots more games with ray-traced effects coming down the pipe.

Should we ignore Control, Watch Dogs, Cyberpunk and all the RT games that will drop in the next few years when deciding to purchase a 6800xt today? Or should we not worry about that for now and just buy new hardware if and when those games actually make it to market?

The advantages of the GeForce GPU may be more mature ray tracing support and DLSS 2.0, both of which aren’t major selling points in our opinion unless you play a specific selection of games.
 
...
Should we ignore Control, Watch Dogs, Cyberpunk and all the RT games that will drop in the next few years when deciding to purchase a 6800xt today?
...

I'm not sure I would ignore those games just because my card can't do the uber-ultra shadows at a high fps. Would you?

I don't think anyone is going to say, "Well I don't have the best raytracing options available, so I guess I can't play the games I wanted to!"

4K looks a fair bit better than 1080p, but I've never heard anyone say they can't play games because they don't have a 4K monitor or a fast enough card to drive it...

The 6800 XT looks to be roughly on par with a 2080 Ti at RT (no DLSS), and that was fairly acceptable before Ampere launched. I don't think anyone with a 2080 Ti is going to skip RT titles just because that's the performance they have.

We also have to wait and see how much the not-ready-yet "super resolution" feature will improve RT performance. If RT is somewhat important to someone on the fence between Radeon and RTX, they should maybe wait for that feature to see if it is any good before deciding. (Not that they have much choice, considering stock levels from both camps.)
 
I hope someone eventually reviews H.264 encoding performance, as well as compute. H.264 AMF/VCE was a fail on Navi (5700 XT etc.).

Big Navi has the potential to clean up the content creation and streaming market if they nail H.264 encode, especially with that 16GB of VRAM.

*Waving at EposVox*
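
For anyone wanting to test the encoder themselves, here is a minimal sketch of an encode-speed check, assuming an ffmpeg build that includes the h264_amf (AMF/VCE) encoder; input.mp4 is a placeholder clip:

# Rough H.264 AMF/VCE encode-speed check via ffmpeg's h264_amf encoder.
# Assumes an ffmpeg build with AMF support; input.mp4 is a placeholder.
import subprocess
import time

cmd = [
    "ffmpeg", "-y", "-i", "input.mp4",
    "-c:v", "h264_amf",   # AMD hardware H.264 encoder
    "-b:v", "6M",         # target bitrate
    "-f", "null", "-",    # discard output; we only want encode speed
]

start = time.time()
subprocess.run(cmd, check=True)
print(f"Encoded in {time.time() - start:.1f}s")

Comparing the wall-clock time (or ffmpeg's reported encode fps) between a 5700 XT and a Big Navi card would show whether the encoder actually improved.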
 
"The advantages of the GeForce GPU may be more mature ray tracing support and DLSS 2.0, both of which aren’t major selling points in our opinion unless you play a specific selection of games. "


This sounds like an uninformed YouTube or Reddit comment. There are several dozen games already using both of those technologies. The parroting of "not enough games" might have been true two years ago. It might even have been true one year ago. It certainly isn't now, when four of the five biggest games releasing at the end of 2020 have either ray tracing or DLSS: Watch Dogs, Cyberpunk, Call of Duty, World of Warcraft.

RTX and DLSS are absolutely a selling point for buying new cards. Why would you buy one without them, especially now that consoles have this tech too? It makes no sense.

You don't need DLSS in every random game; it's enough to have it in the heaviest hitters of the year, the games that require top-of-the-line hardware to run well. Since AAA games aren't released in the hundreds per month, of course it's not going to be a huge number of games when stated in a vacuum. But that number is a fair one as of now, and it's getting bigger with each passing month.

What IF somebody is not planning on playing any of the games you mentioned?
 
Great write-up as usual, with one glaring error in my opinion.
Steam October hardware survey: 66% of gamers are playing at 1080p, 9% at 2K, and 2.3% at 4K. Yet 1080p is left out.

That's because nobody is buying these cards to play at 1080p. Even if you play at high refresh rates, you still wouldn't buy a $700 GPU for that, because you can get away with a $300 one :)
 
That's because nobody is buying these cards to play at 1080p. Even if you play at high refresh rates, you still wouldn't buy a $700 GPU for that, because you can get away with a $300 one :)
The refresh rate is one of the things I was thinking about. Competitive gamers run some very high-end stuff and then play at 1080p for the high framerates.

Also, I haven't built a desktop for myself in almost 17 years. I buy very high-end laptops for my business and my gaming, though at home when I game I use an external monitor. It's just that with laptop panels the choice is 1080p at high refresh rates or 4K at 60 fps. And honestly, 4K on a 17" panel is just silly in my opinion. Plus, 60 fps? No thanks.
 
How many of the games listed are worth buying and playing? I would argue that the majority of the games you listed are garbage. The most recent games run best on RDNA 2, simply because of the consoles. AMD will leverage the optimizations that Sony and Microsoft are developing and using, and over time AMD will gain a clear advantage in next-gen titles. Exclusive features like DLSS and RTX will eventually lose traction, since the development process requires Nvidia hardware to run them. The overwhelming majority of future games will be made targeting the lowest common denominator, the consoles. Three heads are better than one. AMD just released this hardware, and it's very early in terms of drivers and features. I think DLSS is great, but AMD said they will have some up-sampling tech coming in January. Nvidia does have a clear lead in RT, but again, how will it age compared to RDNA 2? If you look at AMD's last-generation hardware, the RX 5700 XT, you'll notice that over the last year or so AMD has gained ~10% performance vs Turing.


That's fine if you think they're garbage. The list is growing on a monthly basis as new games come out.

No, new games aren't running better on RDNA 2. Only AC Valhalla does, and that game runs particularly poorly on Nvidia for some reason. Watch Dogs Legion runs better on Nvidia, Godfall runs the same, Horizon runs better, Desperados 3 runs far better on Nvidia, and Detroit: Become Human far better on Nvidia.


There are no ifs or buts, the 3080 is faster. How will ray tracing age on Nvidia? Seeing as it's twice as fast or more, it's going to age extremely well, unless certain devs **** things up specifically for Nvidia, like the Godfall and Dirt 5 devs are doing.
 
Good that the reviews are finally out. And now that we know where things stand, I can proceed to look for a 3080 with confidence.
 
I'm not sure I would ignore those games just because my card can't do the uber-ultra shadows at a high fps. Would you?

I don't think anyone is going to say, "Well I don't have the best raytracing options available, so I guess I can't play the games I wanted to!"

4K looks a fair bit better than 1080p, but I've never heard anyone say they can't play games because they don't have a 4K monitor or a fast enough card to drive it...

The 6800 XT looks to be roughly on par with a 2080 Ti at RT (no DLSS), and that was fairly acceptable before Ampere launched. I don't think anyone with a 2080 Ti is going to skip RT titles just because that's the performance they have.

We also have to wait and see how much the not-ready-yet "super resolution" feature will improve RT performance. If RT is somewhat important to someone on the fence between Radeon and RTX, they should maybe wait for that feature to see if it is any good before deciding. (Not that they have much choice, considering stock levels from both camps.)

Sorry, I may have phrased my post poorly. Of course those games are very playable without RT. I was asking whether the author is suggesting that we ignore RT support in current and upcoming games when deciding to purchase a 6800 XT over a 3080.
 
Hopefully this results in price drops from Nvidia, but as long as their cards keep flying off the shelves (including the ridiculous 3090), there won't be much incentive for that.

AMD claiming both the high end and bang-for-buck (on average) for once is a big accomplishment - BUT they also have to prove their drivers, software and general reliability are now up to standard.
The latter is a common concern I find; Radeons have been a nightmare for some in the past, which means those people will happily pay a substantial premium for an equivalent GeForce card, or settle for less performance, for peace of mind.

Anyone reading this who is familiar with the 5700 / XT please reply with your experience. Has AMD turned a corner? Is it more a case of early-adopter vs buy-mature?
 
It's mostly the 8GB of the 3070 that people don't like, since it also has lower bandwidth.

Most of the complaints I hear about the 3070's 8GB are about the lower capacity, the same as with the 3080's 10GB (though to a lesser extent for the 3080).

Even Steve alludes to this in this very article, where he mentions the future benefits of the 16GB capacity as well:

"The 16GB VRAM buffer is almost certainly going to prove beneficial down the track, think 1-2 years"

So not in a bandwidth sense, more in a capacity sense. The lower bandwidth of the 3070's 8GB may be a complaint for some people as well, but the primary complaint I hear most often is about the limited size of the 3070's 8GB, and the impact that limited capacity will have in future games at higher/highest settings.

And at the end of Steve's RX 6800 review on the Hardware Unboxed YouTube channel today, Steve says that even for 1440p gaming, 8GB will be less than ideal in the not too distant future. He also says that he would not spend $500 on a GPU that features only 8GB in late 2020. Skip to the final thoughts section.

So it is more of a capacity thing with the 3070, not a bandwidth thing. It is about having enough VRAM to play new games at the higher/highest settings (such as textures) in the future. I am not saying when it will become an issue; I am just saying it is more about capacity than bandwidth.
 
Most of the complaints I hear about the 3070's 8GB are about the lower capacity, the same as with the 3080's 10GB (though to a lesser extent for the 3080).

Even Steve alludes to this in this very article, where he mentions the future benefits of the 16GB capacity as well:

"The 16GB VRAM buffer is almost certainly going to prove beneficial down the track, think 1-2 years"

So not in a bandwidth sense, more in a capacity sense. The lower bandwidth of the 3070's 8GB may be a complaint for some people as well, but the primary complaint I hear most often is about the limited size of the 3070's 8GB, and the impact that limited capacity will have in future games at higher/highest settings.

And in the Hardware Unboxed RX 6800 review on YouTube today, Steve says that even for 1440p gaming, 8GB will be less than ideal in the not too distant future. He also says that he would not spend $500 on a GPU that features only 8GB in late 2020. Steve will surely say the same thing in TechSpot's RX 6800 review later today as well.

So it is more of a capacity thing, not a bandwidth thing. It is about having enough VRAM to play new games at the higher/highest settings in the future.
Capacity is definitely a problem for the 3070, but bandwidth does help with memory swaps and with keeping the GPU fed. Ideally the 3080 would have a minimum of 16GB and the 3070 12GB, and I think we'll see higher capacities with the refresh cards (Super/Ti, whatever they'll call them).
 
Capacity is definitely a problem for the 3070, but bandwidth does help with memory swaps and with keeping the GPU fed. Ideally the 3080 would have a minimum of 16GB and the 3070 12GB, and I think we'll see higher capacities with the refresh cards (Super/Ti, whatever they'll call them).

The 3070's bandwidth is not the real issue though, because it is still fast enough to keep the GPU fed, especially at 1440p. Even though the 3070's 448 GB/s is less bandwidth than the 2080 Ti's 616 GB/s, it still keeps up with the 2080 Ti just fine, so its bandwidth is already sufficient. More importantly, settings like textures will not overly affect performance when increased, as long as the GPU is already powerful enough (sufficient TMUs etc.), has sufficient bandwidth (in this case 448 GB/s), and has enough VRAM capacity (8GB in the 3070's case). You can check online benchmarks to see just how little higher texture quality settings affect performance when there is sufficient VRAM to handle them, or test it yourself at home. So I think the 3070 is most definitely able to handle the higher texture quality settings of new games in 2-3 years' time, but is the VRAM capacity enough to allow them?

We are seeing the same thing with the 8GB vs 4GB models of the RX 480/580 in Doom Eternal. The 8GB models do great at 1080p Ultra settings, thanks to that 8GB. But the 4GB models are restricted by the game itself to high settings at 1080p, unless you lower the textures. So it is not that the 4GB models are not powerful enough, or have insufficient bandwidth, to run Ultra settings; they simply have insufficient VRAM to allow Ultra settings, and Ultra textures specifically (textures will always be the most demanding setting VRAM-wise, but other settings such as detail and draw distance also do their bit to increase VRAM usage, as does raising the resolution).

So if the 3070 had 16GB (or 12GB), it would be able to use settings such as the highest textures, which benefit tremendously from increased VRAM capacity while not adversely affecting performance in most cases. More VRAM benefits every setting that increases VRAM consumption, of course, but textures are a prime example of one that uses lots of VRAM without hurting performance, as long as capacity is sufficient. And if VRAM limitations do come into play in the next 2-3 years, you will be lowering settings such as textures, not because the 3070 is unable to render them, but because its VRAM capacity is limited.
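
To put rough numbers on why textures dominate VRAM, here is a back-of-the-envelope sketch. The texture count, sizes and compression assumptions are hypothetical illustrative values, not measurements from any game:

# Back-of-the-envelope texture VRAM estimate (illustrative numbers only).
def texture_mib(size, bytes_per_texel=1.0, mip_overhead=4/3):
    """One square texture with a full mip chain (~1.33x the base level).
    bytes_per_texel=1.0 assumes BC7-style block compression."""
    return size * size * bytes_per_texel * mip_overhead / 2**20

# Hypothetical scene: 300 resident textures, at 2K vs 4K texture resolution.
for res in (2048, 4096):
    total_gib = 300 * texture_mib(res) / 1024
    print(f"300 x {res}x{res} textures: ~{total_gib:.2f} GiB")

Under these toy assumptions, stepping the same scene from 2K to 4K textures roughly quadruples the footprint (~1.6 GiB to ~6.3 GiB), which is exactly the kind of jump an 8GB card has to absorb.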

But bandwidth does also have a role to play, you are absolutely correct. In this case, though, more VRAM is the right solution, as the 3070 is already sufficiently fed to use larger textures, and settings like textures will only continue to grow in VRAM requirements as the generation moves on. Hopefully we do see some higher-capacity models like you mentioned. Nothing wrong with competition from AMD pushing NVIDIA into a more competitive state; that just benefits consumers in the end. AMD pushes NVIDIA to be more competitive, and in return NVIDIA pushes AMD to be more competitive. Awesome, great for all consumers.

And once again, I don't know precisely when the lower capacity will become an issue; it could be one, two or more years. I don't want anyone to panic either. I am just trying to explain the benefit of the higher VRAM capacity, which is longevity: being able to use VRAM-demanding settings in upcoming games.
 