AMD Radeon RX 6900 XT Review: Can AMD Take the Performance Crown?

What I don't understand is why AMD doesn't fully exploit Infinity Cache now that it knows the cache can elevate GPU performance. Even if their ray tracing is still poor, exceptional 4K results could add even more value to these GPUs; no stone should be left unturned. It performed relatively well at 4K, but I expected the Infinity Cache to be larger on the 6900 XT. I hope I was right about that.
 
It has become clear by now that these Navi chips suffer from memory-bandwidth starvation. That's why AMD doesn't push the power envelope any higher: they know that even if the chip could clock much faster, it wouldn't matter anyway.

Maybe AMD couldn't afford GDDR6X because it would cut their margins even further, on top of the single-supplier dependency issue.
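
To put rough numbers on the bandwidth-starvation point, here's a back-of-envelope sketch of how Infinity Cache changes effective bandwidth. The 256-bit bus and 16 Gbps GDDR6 are the card's published specs; the cache bandwidth and 4K hit rate below are assumptions for illustration only, not AMD's exact figures.

```python
# Back-of-envelope model of "effective" memory bandwidth with Infinity Cache.
# The GDDR6 figures (256-bit bus, 16 Gbps) are the RX 6900 XT's published specs;
# the cache bandwidth and hit rate are assumed values, for illustration only.

bus_width_bits = 256
data_rate_gbps = 16                              # GDDR6 per-pin data rate
dram_bw = bus_width_bits / 8 * data_rate_gbps    # = 512 GB/s raw GDDR6 bandwidth

cache_bw = 1900     # assumed on-die Infinity Cache bandwidth, GB/s
hit_rate = 0.58     # assumed hit rate at 4K (AMD has quoted figures in this range)

# Simple weighted model: hits are served from the cache, misses go out to GDDR6.
effective_bw = hit_rate * cache_bw + (1 - hit_rate) * dram_bw

print(f"Raw GDDR6 bandwidth:          {dram_bw:.0f} GB/s")
print(f"Modelled effective bandwidth: {effective_bw:.0f} GB/s")
# Under these assumptions the effective figure lands well above 1 TB/s, which is
# how a 256-bit card hangs with 320/384-bit competitors -- but as the hit rate
# drops at 4K, the card leans more on its modest 512 GB/s of raw bandwidth.
```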
 
That "reviewer" is Steve Walton, the best benchmarker in the business. One of the things that makes him the best is his clear lack of bias. He ripped into the RTX 3090 just as badly, if not worse, when he reviewed it. He believes, quite rightly, that neither of these cards should exist.


But he is NOT correct in this sort of thinking. It's simplistic and wrong. CPUs, GPUs, and tech in general always need the 'stupid high end' that will never make much sense to 99% of people. Guess what? Those products *are not aimed at you*. Please try to understand this.
They exist for very good reasons. You and Steve can stamp your feet all day about it, or you can actually try to understand the R&D behind why these sorts of products exist in many areas across society. If you don't want to take the time to understand why, that's fine, but you both remain wrong.
 
If I had either an RTX 3090 or an RX 6900 XT, I'd be embarrassed to admit it because nothing says "I'm clearly trying to compensate for something!" more than buying one of those cards.
If I had one of those cards I'd mention it constantly. Geez.
 
The solution is simple: those who want to buy a £1,000 (or more) card to play Minecraft with and are unwilling to play without ray tracing can buy the 3090.

Personally, I can't even imagine myself playing Minecraft, let alone paying over $1,000 for a video card to play it... but to each their own.
Well yes if you don’t want to play minecraft you shouldn’t be buying a card just to do so. But you don’t need a 3090 to play it, my 2080 manages over 60fps at 1440p and a friend of mine just picked up a laptop with a 2070 in it and manages fine at 1080p60. (Of course this is with DLSS on but you 100% can’t tell the difference in minecraft).

But Minecraft is the world's most popular game, and RTX has now been made an option on its most common version, so it's a bit of a miss to not be able to deliver this on these expensive Big Navi GPUs. Especially as it's apparently coming to the Xbox Series X and PS5 sometime down the line, both of which have much weaker AMD-based GPUs than this $999 monster.

I must say though, as a long-term PC gamer (my first rig had an Intel 486DX2), Minecraft RTX is the single biggest leap I have ever seen in PC gaming graphics. It's phenomenal, and when the hardware improves enough to see fully ray-traced AAA titles we'll be in for even bigger visual gains. It's odd because the RTX demos are a bit boring, but as soon as you start up a proper survival world with RTX on and you're building your own creations, it's far more impressive. I spent the first few hours just playing around with the lighting in my base!
 
Am I right that if AMD hadn't decided to offer these, the slim fraction of chips that came out defect-free enough would just have been sold as 6800 XTs with the extra cores or whatever turned off?

If the underlying physical reality is that a tiny supply of near-perfect chips will be produced and something ought to be done with them, I don't see any problem letting a handful of whales get their trophy and contribute extra industry revenue. I don't see how it takes anything away from the rest of us. There are plenty of rare items that sell for high prices that don't have any performance or VRAM at all (although most of those hold their value much better and are prettier to look at).
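
To put a rough number on that intuition, here's a quick sketch using a basic Poisson yield model. The ~520 mm² die area is the commonly cited Navi 21 figure; the defect density is a made-up assumption, so treat the output as illustrative only.

```python
import math

# Rough sketch of why fully-enabled dies are the minority, using a basic Poisson
# yield model. Die area is the commonly cited Navi 21 figure; the defect density
# is an assumed, illustrative number.
die_area_mm2 = 520
defect_density_per_mm2 = 0.0015          # assumed (~0.15 defects per cm^2)

expected_defects = die_area_mm2 * defect_density_per_mm2
p_perfect = math.exp(-expected_defects)  # P(die has zero defects)

print(f"Expected defects per die:      {expected_defects:.2f}")
print(f"Share of dies with no defects: {p_perfect:.0%}")
# Under these assumptions roughly half the dies have at least one defect before
# clock/power binning trims the fully-enabled pool further. A defect that lands
# in a compute unit doesn't scrap the die: disable that CU and the chip can
# still ship as a cut-down 6800 XT or 6800, exactly the scenario above.
```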
 
Minecraft RTX came out of Beta into full release today. I can’t imagine paying £1000 for a graphics card and finding it doesn’t run that game very well.

Ray tracing is relevant for some of us. The way I see it is that if you're paying top money for a card, you really should be getting it.
I can't imagine paying top money for a GPU to play.... Minecraft.

But you do you.
 
But he is NOT correct in this sort of thinking. It's simplistic and wrong. CPUs, GPUs, and tech in general always need the 'stupid high end' that will never make much sense to 99% of people. Guess what? Those products *are not aimed at you*. Please try to understand this.
They exist for very good reasons. You and Steve can stamp your feet all day about it, or you can actually try to understand the R&D behind why these sorts of products exist in many areas across society. If you don't want to take the time to understand why, that's fine, but you both remain wrong.
Huh? I was responding to someone who falsely stated that Steve Walton was biased, and I was explaining why he was completely unimpressed with the RX 6900 XT. You may not know it, but you actually honour me by putting me in the same category as Steve Walton.

Steve has to be completely honest, and for the overwhelming majority of his viewers, this would be a terrible purchase to make. This is one of the secrets of his success and part of what makes him great at what he does. Steve is one of Techspot's senior editors and there's a reason for that. If someone wants to buy one, they can go ahead, but that doesn't mean that a TechTuber has to recommend it when there's something that makes 10,000x more sense for his audience, like the RX 6800 XT or RTX 3080.

I don't think that anyone has a problem with these products existing, I sure don't. I do take issue with the insane pricing level involved because this is a relatively new thing. Cards costing $1000 and up used to be specialty cards that were limited-run or prosumer cards ONLY. For a good long time, the most powerful, absolute top-end gaming cards in the world were $700 or less over several generations.

Steve and I have both been around tech for well over 20 years, so while you may see this as just normal, he and I both see it as a new and troubling trend. Jacking up prices on cards for no reason other than to cash in on the insecurities of a few people is a terrible thing to do, especially when the games that come out require more and more hardware performance to run properly. At some point, people are going to say "SCREW THAT!" and stop buying, which will contract and then collapse the market.

Nothing in history has killed more corporations that were once thought to be invincible than their own greed. The younger set doesn't know this but Gen-X'ers like me and Steve remember things that were once ubiquitous but will never be seen again. Driving prices ever higher when the opposite should be happening is going to damage the PC gaming sector and neither Steve nor I want that, even if most people are oblivious to it.
 
The solution is simple: those who want to buy a £1,000 (or more) card to play Minecraft with and are unwilling to play without ray tracing can buy the 3090.
Or alternatively, get an Xbox Series X for $499.

Personally, I can't even imagine myself playing Minecraft, let alone paying over $1,000 for a video card to play it... but to each their own.

My kid quite enjoys it. It's amazing what you can build in the game, especially with Redstone. It's visual plus logic design.

Funny thing is my kid exclusively plays the Java version and I suppose that won't come with RT.
 
If I had one of those cards I'd mention it constantly. Geez.
That's because you're young and young people are impressed by things like this. As you get older, you realise that it's a complete waste of money because it will never hold its value over time. I'll explain what I mean.

I have a Blue Rolex Submariner and it makes these cards look like absolute chump change. I don't crow about it (this is actually the first time I've ever mentioned that I had it on a non-horology forum) because it doesn't define my sense of self-worth. Now, some people would say "Why would anyone pay so much for a watch?" and the answer is that, unlike these cards, a Blue Rolex Submariner will keep its value or perhaps increase because blue ones are relatively rare (most are black). That makes a Rolex watch an investment and an uncommon (but beautiful) colour configuration a great investment.

These cards will be almost completely worthless in less than five years, kind of like how the GTX 980 and R9 Fury series were so expensive in 2015 but weren't even worth looking at three years later. Video cards are not an investment; they are a very quickly-depreciating liability. Think of the guys who crowed about the power of the RTX 2080 Ti cards they paid $1,200 for a few years ago that aren't really worth even $500 now.

Such is the fate of the gamer who pays an extra $300 for a barely-perceptible performance increase now, instead of banking that $300 and spending it 3-5 years from now when it will have an exponentially greater performance impact.
Young people don't think of this stuff because they haven't been around long enough to think long-term. I know this because I was the same way, so I do understand it. I made some stupid purchases in my 20s, like the time I bought a BMW 3-series because I thought they were cool. I had no idea at the time what pieces of garbage they really are; I learnt my lesson the hard way, but I DID learn it.

Spend all you want on things that retain their value or go up over time but spend only as much as you have to on things that will be worthless in five years. That's a major key to long-term success. Even if you disagree now, one day (assuming you even remember this), you'll slap yourself upside the head and think "He was right!". Everyone over the age of 30 will agree with me because we have a better grasp on how the world works.

The best way that I can explain it is like this. Think about how much you knew when you were fifteen years old. You know, back when you thought that you knew everything. The older we get and the more that we learn, the more we realise just how little we know. That's why we stop crowing about stupid things like we did in our youth. We understand that it's far more impressive to know what we have without having to tell everyone else to make ourselves feel special. When we see other people do it, it comes across as childish.
 
Or alternatively, get an Xbox Series X for $499.



My kid quite enjoys it. It's amazing what you can build in the game, especially with Redstone. It's visual plus logic design.

Funny thing is my kid exclusively plays the Java version and I suppose that won't come with RT.
I'm pretty certain that the RT version isn't going to make your kid any better at visual or logical design. The skill set that Minecraft cultivates doesn't depend on RT capability. The things I've seen built in Minecraft can be just mind-blowing, and that was long before the RT version existed. Your kid will be just as well off with the Java version. :D
 
The solution is simple: those who want to buy a £1,000 (or more) card to play Minecraft with and are unwilling to play without ray tracing can buy the 3090.

Personally, I can't even imagine myself playing Minecraft, let alone paying over $1,000 for a video card to play it... but to each their own.

Yup, the majority of people I know who play Minecraft are little children. And the adults I know who do play do so with their kids on a private server. None of them will look at any of the RTX cards.
 
Simple. It kicks the 3090's a$$ at a cheaper price, but it's poor value compared to the 6800 XT.

The 6800 XT won the round this gen.

Couple the 6800 XT with a Ryzen 9 5900X and you've built the best value-for-performance PC setup for now.
 
That's because you're young and young people are impressed by things like this. As you get older, you realise that it's a complete waste of money because it will never hold its value over time. I'll explain what I mean.

I have a Blue Rolex Submariner and it makes these cards look like absolute chump change. I don't crow about it (this is actually the first time I've ever mentioned that I had it on a non-horology forum) because it doesn't define my sense of self-worth. Now, some people would say "Why would anyone pay so much for a watch?" and the answer is that, unlike these cards, a Blue Rolex Submariner will keep its value or perhaps increase because blue ones are relatively rare (most are black). That makes a Rolex watch an investment and an uncommon (but beautiful) colour configuration a great investment.

These cards will be almost completely worthless in less than five years, kind of like how the GTX 980 and R9 Fury series were so expensive in 2015 but weren't even worth looking at three years later. Video cards are not an investment; they are a very quickly-depreciating liability. Think of the guys who crowed about the power of the RTX 2080 Ti cards they paid $1,200 for a few years ago that aren't really worth even $500 now.

Such is the fate of the gamer who pays an extra $300 for a barely-perceptible performance increase now, instead of banking that $300 and spending it 3-5 years from now when it will have an exponentially greater performance impact.
Young people don't think of this stuff because they haven't been around long enough to think long-term. I know this because I was the same way, so I do understand it. I made some stupid purchases in my 20s, like the time I bought a BMW 3-series because I thought they were cool. I had no idea at the time what pieces of garbage they really are; I learnt my lesson the hard way, but I DID learn it.

Spend all you want on things that retain their value or go up over time but spend only as much as you have to on things that will be worthless in five years. That's a major key to long-term success. Even if you disagree now, one day (assuming you even remember this), you'll slap yourself upside the head and think "He was right!". Everyone over the age of 30 will agree with me because we have a better grasp on how the world works.

The best way that I can explain it is like this. Think about how much you knew when you were fifteen years old. You know, back when you thought that you knew everything. The older we get and the more that we learn, the more we realise just how little we know. That's why we stop crowing about stupid things like we did in our youth. We understand that it's far more impressive to know what we have without having to tell everyone else to make ourselves feel special. When we see other people do it, it comes across as childish.

Very level-headed, adult response there.

And yes, children and those who still live at home just don't have the life experience yet to get it. You will see pages and pages of children arguing in forums over one GPU getting 10 fps more than the other. I find that as we progress and computers get easier to use and build, you have far fewer technical users and more children. And anything CPU-vs-CPU or GPU-vs-GPU attracts them like flies to honey.

It also doesn't help that some review sites run clickbait titles; they just seem to want the fighting for the extra clicks and ad revenue it brings, I guess.

With age comes wisdom: a lot of this stuff is just not worth the time to argue over. I get it, people are bored in lockdown due to COVID, but there are better things one can do with their time than obsessing over hardware and what other people choose to buy.
 
That's because you're young and young people are impressed by things like this. As you get older, you realise that it's a complete waste of money because it will never hold its value over time. I'll explain what I mean.

I have a Blue Rolex Submariner and it makes these cards look like absolute chump change. I don't crow about it (this is actually the first time I've ever mentioned that I had it on a non-horology forum) because it doesn't define my sense of self-worth. Now, some people would say "Why would anyone pay so much for a watch?" and the answer is that, unlike these cards, a Blue Rolex Submariner will keep its value or perhaps increase because blue ones are relatively rare (most are black). That makes a Rolex watch an investment and an uncommon (but beautiful) colour configuration a great investment.

These cards will be almost completely worthless in less than five years, kind of like how the GTX 980 and R9 Fury series were so expensive in 2015 but weren't even worth looking at three years later. Video cards are not an investment; they are a very quickly-depreciating liability. Think of the guys who crowed about the power of the RTX 2080 Ti cards they paid $1,200 for a few years ago that aren't really worth even $500 now.

Such is the fate of the gamer who pays an extra $300 for a barely-perceptible performance increase now, instead of banking that $300 and spending it 3-5 years from now when it will have an exponentially greater performance impact.
Young people don't think of this stuff because they haven't been around long enough to think long-term. I know this because I was the same way, so I do understand it. I made some stupid purchases in my 20s, like the time I bought a BMW 3-series because I thought they were cool. I had no idea at the time what pieces of garbage they really are; I learnt my lesson the hard way, but I DID learn it.

Spend all you want on things that retain their value or go up over time but spend only as much as you have to on things that will be worthless in five years. That's a major key to long-term success. Even if you disagree now, one day (assuming you even remember this), you'll slap yourself upside the head and think "He was right!". Everyone over the age of 30 will agree with me because we have a better grasp on how the world works.

The best way that I can explain it is like this. Think about how much you knew when you were fifteen years old. You know, back when you thought that you knew everything. The older we get and the more that we learn, the more we realise just how little we know. That's why we stop crowing about stupid things like we did in our youth. We understand that it's far more impressive to know what we have without having to tell everyone else to make ourselves feel special. When we see other people do it, it comes across as childish.
True. Many of the young 'uns, especially millennials, have started to adopt the mindset that the most expensive is the best, or at least they buy to brag, which is absolutely childish and wasteful. Wisdom and greedy foolishness never go together.

Being sold on the "luxury" moniker actually makes the buyers look like dunces.
 
It has become clear by now that these Navi chips suffer from memory-bandwidth starvation. That's why AMD doesn't push the power envelope any higher: they know that even if the chip could clock much faster, it wouldn't matter anyway.

Maybe AMD couldn't afford GDDR6X because it would cut their margins even further, on top of the single-supplier dependency issue.
GDDR6X hasn't really shown itself to be the huge advantage that nVidia touted it to be. If it were, nVidia would win at everything, but it doesn't. This suggests that GDDR6X is barely more of an advantage than HBM was (if at all). It's as much of a marketing buzzword as HBM was for AMD back in the day. At the end of the day, the same questions and answers apply to GDDR6X as they did to HBM and HBM2:
  • Q1: Is it faster than standard GDDR?
  • A1: Hell yeah!
  • Q2: Does it matter?
  • A2: Hell no!
To date, I've never seen a clear example of faster VRAM making a significant impact. ATi was the first to introduce GDDR5, but its speed advantage over the GDDR3 that nVidia was using at the time didn't translate into better gaming performance.

The HBM that AMD used on the R9 Fury series was far faster than the GDDR5 used on the GTX 980 and GTX 980 Ti. The GTX 980 Ti used a 384-bit VRAM bus (the GTX 980 made do with 256-bit), which is quite a respectable bus width for a video card. However, comparing it to the width of the HBM VRAM bus in the R9 Fury and Fury X was like comparing a back alley to an expressway, because HBM uses a staggering 4096-bit bus. Regardless, the Fury series still found itself sandwiched between the GTX 980 and GTX 980 Ti in gaming performance.

Later on, along came the much-anticipated Radeon VII with its massive 16GB of HBM2. Its four stacks of HBM2 sit on a 4096-bit bus delivering roughly 1 TB/s, which is gigantic. I don't need to tell you what a flop the Radeon VII was in gaming, struggling to pull ahead of even the older GTX 1080 Ti.

The more I look at these cards with their "fancy" VRAM types, the more I think that gaming performance comes from having a very fast GPU and VRAM that is fast enough not to bottleneck the card. In that regard, GDDR6 is plenty fast.
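
To put numbers behind this, here's a quick sketch of the theoretical peak bandwidth of the cards mentioned, computed from bus width times per-pin data rate. The data rates are the commonly quoted reference-spec figures; treat this as a rough illustration rather than a benchmark.

```python
# Theoretical peak bandwidth = (bus width in bits / 8) * per-pin data rate (Gbps),
# which yields GB/s. Data rates are the commonly quoted reference-spec figures.
cards = {
    "GTX 980 (GDDR5)":    (256,  7.0),
    "GTX 980 Ti (GDDR5)": (384,  7.0),
    "R9 Fury X (HBM)":    (4096, 1.0),
    "Radeon VII (HBM2)":  (4096, 2.0),
    "RX 6900 XT (GDDR6)": (256, 16.0),
    "RTX 3090 (GDDR6X)":  (384, 19.5),
}

for name, (bus_bits, gbps_per_pin) in cards.items():
    bandwidth_gbs = bus_bits / 8 * gbps_per_pin
    print(f"{name:22s} {bandwidth_gbs:6.0f} GB/s")
# The Fury X had more than double the GTX 980's raw bandwidth, and the Radeon VII
# roughly matched or beat everything of its era, yet neither translated that
# into a gaming win -- which is exactly the point being made above.
```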
 
If I had either an RTX 3090 or an RX 6900 XT, I'd be embarrassed to admit it because nothing says "I'm clearly trying to compensate for something!" more than buying one of those cards.
Hi, just to say that I bought a 3090 and I'm not trying to compensate for anything. Nor am I pushing it in anyone's face that I have a 3090.

I was going to buy a 3080, but I got fed up waiting to replace my nearly five-year-old 1080, so I bought the 3090 instead. Is it overpriced? You betcha. Would I have bought it under normal circumstances? Not a chance...

However, these are not normal circumstances, and graphics cards (and tech in general) at all levels are very hard to come by - almost everything is sold out, at least in the UK. Life is short (global pandemics have a way of sharpening one's focus) and I did not want to wait until, say, April next year to turn up settings in the games I play (I'm at 5120x1440). I wanted an upgrade now and I could afford it, so I bought the 3090. I know quite a few people in the same boat as me who bought for similar reasons.

BTW I started playing on the ZX Spectrum so that gives my age away - I'm no Millennial with an e-peen.
 
4K gaming for $500 less is a win in my book. Unless RT is a priority, I imagine any high-end gamer would take either the NV or the AMD card if they could find one, lol.

Once AMD has their version of DLSS, even the RT performance will be a non-issue.
 
Well yes if you don’t want to play minecraft you shouldn’t be buying a card just to do so. But you don’t need a 3090 to play it, my 2080 manages over 60fps at 1440p and a friend of mine just picked up a laptop with a 2070 in it and manages fine at 1080p60. (Of course this is with DLSS on but you 100% can’t tell the difference in minecraft).
...

Yeah, the RT in Minecraft looks pretty neat. That said, my kids play it just fine on an Intel iGPU. You don't even need a video card to play it, let alone an expensive one.
 
Just in case someone tries to edit the articles later, here are the conclusions for the 3090 and the 6900 XT. See if you can spy a huge difference in tone, despite the 3090 arguably being a WORSE value proposition than the 6900 XT:

" In short, don’t buy it, doing so will simply ensure that the next GPU generation is even more expensive. "

How interesting that this line was only found in the $1,000 6900 XT review and not the $1,500 RTX 3090 review.
How interesting that, in that very quote, he specifically states that "the Radeon GPU is the better option," literally telling you that, given the choice, he thinks most people should choose the 6900 XT over the 3090... which doesn't make me think bias towards nVidia when he says to choose AMD instead (though he would rather you not choose either :D)
 
Or alternatively, get an Xbox Series X for $499.



My kid quite enjoys it. It's amazing what you can build in the game, especially with Redstone. It's visual plus logic design.

Funny thing is my kid exclusively plays the Java version and I suppose that won't come with RT.
If your kid wants RT in Java, you can buy Sonic Ether's Path Traced Global Illumination shaders on Patreon. It works on all GPUs, even ones without RT cores/accelerators such as a GTX 1060, and the settings can be tweaked for visuals and performance. In my opinion, it also looks better, as it doesn't look so glossy and white.
 
GDDR6X hasn't really shown itself to be the huge advantage that nVidia touted it to be. If it were, nVidia would win at everything, but it doesn't. This suggests that GDDR6X is barely more of an advantage than HBM was (if at all). It's as much of a marketing buzzword as HBM was for AMD back in the day. At the end of the day, the same questions and answers apply to GDDR6X as they did to HBM and HBM2:
  • Q1: Is it faster than standard GDDR?
  • A1: Hell yeah!
  • Q2: Does it matter?
  • A2: Hell no!
To date, I've never seen a clear example of faster VRAM making a significant impact. ATi was the first to introduce GDDR5, but its speed advantage over the GDDR3 that nVidia was using at the time didn't translate into better gaming performance.

The HBM that AMD used on the R9 Fury series was far faster than the GDDR5 used on the GTX 980 and GTX 980 Ti. The GTX 980 Ti used a 384-bit VRAM bus (the GTX 980 made do with 256-bit), which is quite a respectable bus width for a video card. However, comparing it to the width of the HBM VRAM bus in the R9 Fury and Fury X was like comparing a back alley to an expressway, because HBM uses a staggering 4096-bit bus. Regardless, the Fury series still found itself sandwiched between the GTX 980 and GTX 980 Ti in gaming performance.

Later on, along came the much-anticipated Radeon VII with its massive 16GB of HBM2. Its four stacks of HBM2 sit on a 4096-bit bus delivering roughly 1 TB/s, which is gigantic. I don't need to tell you what a flop the Radeon VII was in gaming, struggling to pull ahead of even the older GTX 1080 Ti.

The more I look at these cards with their "fancy" VRAM types, the more I think that gaming performance comes from having a very fast GPU and VRAM that is fast enough not to bottleneck the card. In that regard, GDDR6 is plenty fast.

Thanks for the thoughtful feedback. I'm just wondering: could it be that with HBM, it was the memory that was starved for data from the chip? Just wondering. ;)
 