AMD Radeon RX 6800 XT Review

Vulcanproject

Posts: 1,270   +2,113
High-end competition is back on the agenda! This time the balance in some respects tips toward AMD, power consumption for one. The reference design is another big positive.

It's fast, really fast. Typical AMD performance where it loves some games and struggles in others, slightly more inconsistent than an RTX 3080. I'd guess AMD drivers as per usual, plus some games favouring Navi as an architecture. There's also the trade-off between the amount of memory it has over an RTX 3080 and, on the flip side, the significant deficit in memory bandwidth.

Smart Access Memory is very interesting; at 4K your gains are limited, but they are there. That's a little bonus for people already on, or eyeing up, a newer AMD platform. AC: Valhalla's gains are notable: 6 FPS seems like nothing at 4K, but that's 15 percent and not to be sniffed at.
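For what it's worth, the arithmetic on that SAM uplift checks out. A quick sanity check, assuming a baseline of roughly 40 fps at 4K (a hypothetical round number; the review's exact figure may differ slightly):

```python
# Sanity-check the Smart Access Memory uplift quoted above.
# Assumed baseline: ~40 fps at 4K (hypothetical round number).
baseline_fps = 40
gain_fps = 6

percent_gain = 100 * gain_fps / baseline_fps
print(f"+{gain_fps} fps on {baseline_fps} fps is a {percent_gain:.0f}% uplift")
# prints: +6 fps on 40 fps is a 15% uplift
```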
 

neeyik

Posts: 1,443   +1,590
Staff member
I don't understand why the RDNA architecture suffers particularly at 4K.
At 4K, it's not just about being able to process as many calculations for the millions of pixels as possible, it's also about being able to read/write a small mountain of data too. And the RTX 3080 has an advantage here, having 10 memory controllers to the 6800 XT's 8, and GDDR6X has a higher transfer rate than GDDR6. If it wasn't for the 128 MB of Infinity Cache, the 6800 XT would be struggling more at 4K (not that it's actually bad, of course).
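To put rough numbers on that, a back-of-envelope sketch using the published specs: a 320-bit bus (10 × 32-bit controllers) with GDDR6X at 19 Gbps for the 3080, versus a 256-bit bus (8 × 32-bit controllers) with GDDR6 at 16 Gbps for the 6800 XT.

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate (Gbps)
def bandwidth_gb_s(controllers, bits_per_controller, data_rate_gbps):
    bus_bits = controllers * bits_per_controller
    return bus_bits / 8 * data_rate_gbps

rtx_3080 = bandwidth_gb_s(10, 32, 19)   # GDDR6X at 19 Gbps -> 760.0 GB/s
rx_6800xt = bandwidth_gb_s(8, 32, 16)   # GDDR6 at 16 Gbps  -> 512.0 GB/s
print(f"RTX 3080: {rtx_3080} GB/s, RX 6800 XT: {rx_6800xt} GB/s")
```

That's roughly a 48 percent raw bandwidth advantage for the 3080, which is the gap the Infinity Cache has to paper over.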
 
"The advantages of the GeForce GPU may be more mature ray tracing support and DLSS 2.0, both of which aren’t major selling points in our opinion unless you play a specific selection of games. "


This sounds like an uninformed YouTube or Reddit comment. There are several dozen games already using both of those technologies. The parroting of "not enough games" might have been true two years ago. Might even have been true one year ago. Certainly not now, when four of the five biggest games releasing at the end of 2020 have either ray tracing or DLSS: Watch Dogs, Cyberpunk, Call of Duty, World of Warcraft.

RTX and DLSS are absolutely a selling point for buying new cards. Why would you buy one without them, especially now that the consoles have this tech too? It makes no sense.

You don't need DLSS in any random game; it's enough to have it in the heaviest hitters of the year, the games that require top-of-the-line hardware to run well. Since AAA games aren't releasing in the hundreds per month, of course it's not going to be a huge number of games when you state it in a vacuum. But that number is a fair one as of now, and it's getting bigger with each passing month.
 

Irata

Posts: 966   +1,414
TechSpot Elite
Thanks for the review.

Very well done by AMD. The jump from the 5700 XT is impressive. They really did deliver as promised, and as a plus the reference design appears to be really good.

Overall a nice mix of performance, price and power consumption, particularly the system power consumption.

Let's see how they will do wrt features going forward as there's still some catching up to do.

I'm also curious about Big Navi's video core.
 

Vulcanproject

Posts: 1,270   +2,113
I don't understand why the RDNA architecture suffers particularly at 4K.
Overall it is a really good card, and finally AMD is more efficient than Nvidia
Probably memory bandwidth. Also on the ray tracing tests here, the performance collapses hard.

It's day one, so drivers and work with developers can only improve performance, but the early rumours are already starting to look like they have a fair bit of truth to them. The ray-traced performance isn't quite up to snuff against Nvidia's.

Likely a combination of Nvidia's more mature drivers and developer help, second gen dedicated RT hardware being better and all that extra memory bandwidth the 3080 is packing.

Further evidence of this comes from the consoles: with no RT, the Xbox Series X was similar to an RTX 2070 Super or 2080. However, with significant RT effects enabled, as in Watch Dogs: Legion, it couldn't beat an RTX 2060 Super in direct comparisons.
 
  • Like
Reactions: redhat

Achaios

Posts: 61   +203
Νgreedia be hurtin' a lot today, mon.

On top of everything else, Lisa Su gave Leatherman Tsao of Ngreedia the coup de grace too, in the form of Smart Access Memory.

Boy, I can picture Leatherman now sweatin' a lot in his leather jacket although it's November.

And uh, thanks to AMD, the "Intel" Ngreedia tried to pull off with the "3080 10 GB" came back to bite them where the sun don't shine. Serves them right.

Honorable Mention: GTX 580 1.5 GB, lawl.
 

hahahanoobs

Posts: 3,074   +1,231
Thanks for the review.

Very well done by AMD. The jump from the 5700 XT is impressive. They really did deliver as promised, and as a plus the reference design appears to be really good.

Overall a nice mix of performance, price and power consumption, particularly the system power consumption.

Let's see how they will do wrt features going forward as there's still some catching up to do.

I'm also curious about Big Navi's video core.
Well done?

When you have no competing flagship GPU for consecutive years, you better come hard. That should be expected, not praised.

Five years without a new CPU, you better come hard when you can. Sooner rather than later is preferable. Again, the performance gained after that amount of time is expected to be competitive, not praised.

Good job AMD. It's about freaking time.
Keep it real.
 
  • Like
Reactions: Charles Olson
AMD is really benefiting from TSMC making their chips. This wouldn't be this close if Nvidia were on TSMC instead of Samsung. It's a good job though. I had always said there was no reason AMD couldn't catch up, but I kind of lost faith. If they had made larger GPUs with the 5700 XT series, they probably would have been this close.
 
  • Like
Reactions: Charles Olson

Shadowboxer

Posts: 974   +574
Curious as to how much of an advantage DLSS gives Nvidia. But even without it, looking at the numbers, and given my poor recent experience with AMD drivers, I'd pick the 3080.
 

mAdmAnDingo

Posts: 59   +56
So it is good at 1440p, but at 4K - not really. So much for the extra memory "benefit".
The extra memory will benefit it once games start using more than the 3080's 10 GB and the 3070's 8 GB at 4K/1440p (or whatever resolution). So if in two years (or whenever) new games at max settings can use more than those 10 GB/8 GB buffers, thanks to max texture quality and/or other settings and/or resolution, then we will see a difference. Just like the RX 470 4 GB vs the RX 470 8 GB now: the 4 GB model performs worse (FPS drops/stutters) than the 8 GB model whenever settings exceed its 4 GB of VRAM, which happens in many new games these days depending on settings and resolution.

And games such as Doom Eternal won't even allow 4 GB GPUs to use Ultra settings at 1080p; you need at least 5.2 GB of VRAM to do so, as shown by TechSpot. So 4 GB GPUs are forced down to High settings at 1080p.

Lowering settings such as texture quality (and other settings to a lesser degree) and/or playing at a lower resolution brings VRAM usage back down and restores stable performance. It is very likely that future games will benefit from the 16 GB of VRAM (depending on settings and resolution); it is just a question of when new games ship with larger textures and overall higher VRAM usage. That could be one year, two years, or more, I am honestly not sure, but as the next generation progresses, games will become more demanding, so it is just a matter of time.

So it depends on when that 16 GB will be used by games, but games always get prettier and more demanding as time goes on, and they end up using more VRAM as well. The 6000 series, with its 16 GB, will likely age well because of it. I am not trying to cause a panic over the VRAM amount on the 3080/3070 or other GPUs either; I am just explaining why the 16 GB is not showing any benefit as of yet, but why it will be one in the future, especially at the higher/highest settings and/or resolutions (1440p/4K).
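To illustrate why texture quality is the big VRAM lever, here is a deliberately simplified estimate of what a single uncompressed texture costs. Real games use block compression (roughly 4:1 to 8:1) and stream mipmaps, so actual budgets are far lower; the numbers are purely illustrative.

```python
# Simplified VRAM cost of one uncompressed RGBA texture, including the
# ~1/3 overhead of a full mipmap chain. Illustrative only: real engines
# use block compression and texture streaming, cutting this drastically.
def texture_mb(width, height, bytes_per_pixel=4, mip_overhead=4/3):
    return width * height * bytes_per_pixel * mip_overhead / (1024 ** 2)

print(f"2K texture (2048x2048): {texture_mb(2048, 2048):.0f} MB")
print(f"4K texture (4096x4096): {texture_mb(4096, 4096):.0f} MB")
print(f"8K texture (8192x8192): {texture_mb(8192, 8192):.0f} MB")
```

Quadrupling texture resolution quadruples the footprint, which is why a jump in asset quality can swallow gigabytes at once.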

And in this review, Steve mentions the 16 GB's future benefit as well:

" The 16GB VRAM buffer is almost certainly going to prove beneficial down the track, think 1-2 years "

So it is a future thing, but something worth considering if purchasing a new GPU today. 16 GB VRAM benefits aside, I am personally quite pleased with the 6800 XT from a raw performance standpoint, and I am very interested in picking up an RX 6000 GPU to go with my 5950X for some SAM action in an all-AMD build. I will wait for all the models to be out and reviewed before I decide which one, but the 6800 XT looks like the one so far.
 
  • Like
Reactions: TempleOrion

quadibloc

Posts: 230   +133
I was pleasantly surprised originally, at the time of the announcement, that AMD came out with competitive cards. While the review shows the 6800 XT beating the 3090 at 1440p in most games, it seems to me that no one really needs the enormously high frame rates attained there, so the 4K results, where the 6800 XT consistently lags the 3080, would be more relevant.
However, the 6800 XT has more memory, and that should help somewhere. NVIDIA, of course, is ahead in the software they provide for their cards, and that adds to their value.
 

Irata

Posts: 966   +1,414
TechSpot Elite
Well done?

When you have no competing flagship GPU for consecutive years, you better come hard. That should be expected, not praised.

Five years without a new CPU, you better come hard when you can. Sooner rather than later is preferable. Again, the performance gained after that amount of time is expected to be competitive, not praised.

Good job AMD. It's about freaking time.
Keep it real.
Yes, well done *because* they were so far behind in the top end for years.

Before that, they were competing with the 2070 (Navi) and 2060 (Vega 64, but only on performance, never mind efficiency).

Now you have a card that is actually a bit more efficient, the reference model is quieter, temps are great (remember the "hot and loud" reference cards?) and performance is pretty even averaged across resolutions and titles.

Are you seriously saying that this amount of catching up (with fewer resources) is not a job well done?
 