AMD Radeon RX 6500 XT Review: A Bad, Really Bad Graphics Card

Mr Majestyk

Posts: 1,212   +1,107
Triple facepalm for Lisa Su. God, how pathetic will the 6400 be? Honestly, the fact that a $200 US, 2022-era GPU could ship with a 64-bit bus and 4GB defies all belief. It needed to be a 96-bit, 6GB card with 2x the IC for that money.
 

RedBear

Posts: 48   +40
I feel sorry for the sand, but thanks for the review. AMD has confirmed that they will stop at nothing if there's a chance to make a larger profit. All the blabbing about the 4GB VRAM being a counter-Ethereum measure was pointless; it was obvious to everyone but the die-hard fanboys that, combined with the x4 PCIe link, it would result in performance even lower than an RX 5500 XT on systems limited by PCIe Gen 3.
 

Watzupken

Posts: 597   +500
Thank you for the very thorough review, Steve - it must have been a ton of work (and one you most certainly didn't enjoy).

I know I will be branded public enemy no. 1 after this, but I think this GPU is not that bad IN THE CURRENT MARKET. There is nothing at the $200 price point, or even in its vicinity. Nothing. Is it fair? Hell no. But that's where we are. And one can game with this card (though preferably on PCIe 4.0).

My biggest gripe is the x4 PCIe limitation; I can't imagine it brings enough savings to make it worthwhile. The 64-bit memory bus is a bummer too, but I think the bandwidth is enough (just barely, by the skin of its teeth) for this GPU anyway.
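As a sanity check on that bandwidth point, here's a quick back-of-envelope sketch. The 18 Gbps GDDR6 figure for the 6500 XT and the 5500 XT's 14 Gbps / 128-bit configuration are the commonly quoted specs; treat them as assumptions:

```python
# Peak theoretical memory bandwidth:
# bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8
def peak_bandwidth_gbps(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

# RX 6500 XT: 18 Gbps GDDR6 on a 64-bit bus
print(peak_bandwidth_gbps(18, 64))    # 144.0 GB/s
# RX 5500 XT, for comparison: 14 Gbps GDDR6 on a 128-bit bus
print(peak_bandwidth_gbps(14, 128))   # 224.0 GB/s
```

The Infinity Cache is supposed to paper over part of that gap, but the raw numbers show why the 64-bit bus raises eyebrows: the previous-gen card in the same segment has over 50% more raw bandwidth.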

But if someone needs something to game on and has a PCIe 4.0 motherboard, this card can be a lifesaver (there, I said it; let the stones fly at me). I'm sorry, but the cost-per-frame analysis is completely out of this world (talking about the eBay one here: the MSRP is a theoretical value, like my ideal waist size: once it was true, for a very short while, and it will never return). But seriously: a 1650 for $170? Where? It is more than twice that here, IN POUNDS. $330 for a 3060? Really? Sign me right up, I will take it immediately (it is more like 600-700 quid).

The only thing remotely reflecting reality is the price of a used RX 570. Still off, but not by multiples, just by a percentage. Sorry Steve, but in the current market, if this card sells for $200, that is an excellent price/performance ratio. I don't say this is how it should be (and I cherish my little 1070 JetStream), but the market is what it is. For a 4.0 motherboard owner, this is a viable option.
I disagree. The card is not bad IF you are running PCIe 4.0. But don't forget, this is meant to be a budget card. You should ask around and see how many people are running budget systems with shiny PCIe 4.0 support. AMD themselves are churning out "budget" CPUs in the form of the Ryzen 4xxxG and 5xxxG with only PCIe 3.0 support, which limits the link speed even if your motherboard supports PCIe 4.0. There are also rumours that AMD will re-release Renoir to compete against Intel in the lower-end segment, which potentially means PCIe 3.0 support only.

Price-wise, the MSRP is reasonable, I don't dispute that. But the question is whether you can get it anywhere close to MSRP. If a GTX 1650 4GB GDDR6 is going for 300 USD on Amazon, it is unlikely we'll see this card at MSRP for long, if at all.

Overall, I support Steve's conclusion on this card, which I have been very unhappy with since before launch. It sounds bad, and it is bad: a terrible card for the segment they are trying to sell it to, for the reasons I gave above. Some sites try to downplay the issue by quoting an average performance loss moving from PCIe 4.0 to 3.0, but you can't really tell whether newer games will be hit badly by the bus limit.

I can't recall if I said it here in the forum, but this is one of the worst GPUs I can remember in at least the last decade. It is unexciting, barely outperforms the previous-gen card in the same segment (when running on PCIe 4.0), and is bare-bones to the max, with everything stripped off. Are the missing features critical to gamers? Maybe, maybe not. But just because a feature isn't critical doesn't mean people won't use it.

For example, I game and watch videos as well. Imagine I have a budget rig with a lesser CPU: the AV1 decoder would be useful, because otherwise the CPU has to do the heavy lifting. Yet the feature is missing, and you wonder why the video is dropping frames. If a cheap ARM device like an Amazon Fire Stick offers AV1 support, how much would it actually have cost AMD to include it?
 

Watzupken

Posts: 597   +500
Triple facepalm for Lisa Su. God, how pathetic will the 6400 be? Honestly, the fact that a $200 US, 2022-era GPU could ship with a 64-bit bus and 4GB defies all belief. It needed to be a 96-bit, 6GB card with 2x the IC for that money.
I won't bother to read the review of the RX 6400 series, nor even consider the card, for sure. The features AMD cut from this card can be found even in iGPUs, which shows how miserable it is. For example, the Vega 8 on the APU I am using supports 3x display out, and I am in fact running 3 displays. The Intel Xe iGPU supports AV1 decode and, I believe, also supports 3x display out. It is gaming-capable, I give it that, but the target market for this card doesn't always have a PCIe 4.0 slot to avoid bottlenecking it. AMD could have spruced up the specs, but I feel the main killer for this card is the crippled memory bus more than anything.
Anyway, I am done bashing this card. The product is a done deal and will not change, so I won't frustrate myself by looking at or thinking about it anymore. Lol.
 

Stardude82

Posts: 17   +5
This isn't as bad as all those DDR3 Kepler cards. The OEM-only GTX 745... wow. That was a bad card. But I guess even that card could output to 3 displays.
 

Vanderlinde

Posts: 138   +93
I give it that, but the target market for this card doesn't always have a PCIe 4.0 slot to avoid bottlenecking it. AMD could have spruced up the specs, but I feel the main killer for this card is the crippled memory bus more than anything.


What are you on? If you buy a set of (new) hardware, it comes with PCIe 4.0 as standard. PCIe 4.0 is supported pretty much across the 3xxx to 5xxx range of CPUs. The generations before that tick along at PCIe 3.0.

Most system builders will assemble components with PCIe 4.0 parts, so that really shouldn't be an issue. And enabling SAM apparently adds another 5%. So if you have the hardware "right", it looks like a good stand-in for a discontinued RX 580 or so.

It's just the price tag of $300 that is absurd.
 

Grinnie Jax

Posts: 29   +58
@Steven Walton

This card was created for TON MINING, period. Worth mentioning, as it's the only thing where it's good and efficient per cost.
 

Dimitrios

Posts: 1,058   +863
More proof that this is the worst moment to buy a GPU. Now I'm gonna deep-clean my 3070 Ti to show how much I cherish it xD

LOL, gonna do the same, but I only have an RX 580; bought it for $80 from a friend last year. I guess he's a true friend.
 

amghwk

Posts: 1,187   +1,110
It's not THAT bad. It's not even last on the list. Why the tantrum and exaggeration? The price? Everything is bloated nowadays, not just this.

AMD is going nowhere though, with its STILL arrogant pricing. AMD still hasn't woken up from the slumber of the dominance it has enjoyed since Ryzen's debut.

That said, I am still very satisfied with my 5700 XT. It cruises through even the new games at 1440p maxed out, at 60fps minimum, effortlessly. And best of all, I got it at a time when the scalper phenomenon was not this bad, and I actually got it at MSRP.
 

takaozo

Posts: 60   +65
Damn, this thing came into stock for "only" 365 euros.

I'm so glad I sold my RX 570 8GB for 400, considering I got it for 120. I put in another hundred and got an RTX 2060. So 120 + 100 = 220 for a 2060; not bad for y2k22. Also, I spent about 15 euros on another pair of fans, thermal pads and thermal grease. I will never buy something AMD made again, including CPUs.

Duck them.
 

dragosmp

Posts: 68   +74
Just bagged one for £180; in comparison, GTX 1650s currently sell for £250-280 in the UK. I figured worst case I'll sell this in a few weeks for £250.
That's the thing: for 200 £/$/€, if you have PCIe 4, it looks decent compared to what is actually available. The trick is not to exceed the framebuffer.
Great detailed review, btw. I agree with the conclusions too (for the US market), but in EU land Nvidia cards are even more overpriced. Curious how it's going to pan out in a few months. I'm all for trying things to decouple consumer graphics cards from mining; if it flops, it flops, at least they tried.
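Since cost per frame keeps coming up in this thread, here's a minimal sketch of how the metric works. The prices and fps figures below are made-up placeholders for illustration, not benchmark results:

```python
# Cost per frame: street price divided by average fps in your benchmark suite.
# Lower is better; the metric shifts with street prices, which is why
# MSRP-based and eBay-based versions give such different rankings.
def cost_per_frame(price: float, avg_fps: float) -> float:
    return price / avg_fps

# Hypothetical UK street prices (GBP) and hypothetical average fps
cards = {
    "RX 6500 XT": (180, 60),
    "GTX 1650":   (250, 55),
}
for name, (price, fps) in cards.items():
    print(f"{name}: £{cost_per_frame(price, fps):.2f} per frame")
```

With numbers like these, the cheaper card wins on cost per frame even at similar performance, which is the whole argument for judging by what's actually on shelves rather than by MSRP.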
 

Irata

Posts: 2,107   +3,635
Those guys at ATi who designed this are either dumb as rocks (doubtful) or were mandated by AMD to make no compromises when it came to cheapness. It would have been far better to make the card PCI Express 3.0 x8 than what they've done, and it probably wouldn't have cost much more (if it cost more at all).

This is an entry-level card, which means there's a much higher chance that its buyer doesn't have the latest and greatest hardware. In fact, I'm willing to bet a good number of people are still using FX processors, and the video card slots on even the flagship northbridge for the FX series, the 990FX, are only PCI Express 2.0 x16. With this card, they'll be running at PCI Express 2.0 x4. I wonder just how BAD that will be for them, and I shudder.

This is something that I would expect from nVidia, not AMD.

Navi 24 was designed as an entry level laptop companion dGPU, essentially an MX competitor. In this light, the specs make perfect sense.

Selling to OEMs, every dollar you can shave off the cost matters; just think of the sub-par HSFs they use in desktop systems where, according to Steve at GN, they could have gotten a noticeably better HSF for a few bucks more.

Of course, releasing this as a desktop graphics card will inevitably bring drawbacks. Imho, in a normal market this would not have been released at all, or only as a sub-75W RX 550 replacement.
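For a rough sense of the interface gap being discussed, here's a quick sketch using the standard per-lane PCIe throughput figures (approximate usable bandwidth after encoding overhead: 8b/10b for Gen 1/2, 128b/130b for Gen 3 onwards). The lane counts are the configurations mentioned in this thread:

```python
# Approximate usable bandwidth per PCIe lane, in GB/s
PER_LANE_GBPS = {
    1: 0.25,   # 2.5 GT/s, 8b/10b encoding
    2: 0.5,    # 5 GT/s, 8b/10b encoding
    3: 0.985,  # 8 GT/s, 128b/130b encoding
    4: 1.969,  # 16 GT/s, 128b/130b encoding
}

def link_bandwidth(gen: int, lanes: int) -> float:
    """Total usable link bandwidth in GB/s for a given generation and lane count."""
    return PER_LANE_GBPS[gen] * lanes

print(link_bandwidth(4, 4))  # ~7.9 GB/s: the configuration the 6500 XT was designed for
print(link_bandwidth(3, 4))  # ~3.9 GB/s: half that, on a Gen 3 board
print(link_bandwidth(2, 4))  # 2.0 GB/s: the FX-era worst case
```

Each older generation halves the link, which is why a card that spills past its 4GB framebuffer over an x4 link suffers so much more on older platforms.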
 

MarcusNumb

Posts: 42   +40
LOL, gonna do the same, but I only have an RX 580; bought it for $80 from a friend last year. I guess he's a true friend.
Last night I was sleeping while holding my 3070 Ti tightly xD (just a joke). Anyway, I can't imagine the prices of the 4000 series xD
 

paul1122

Posts: 224   +238
"How to start this review. I thought of going for a fun little gag mocking the 6500 XT before completely tearing into it, but this thing is so bad it’s really spoiled the mood for me. I’ll cut right to it. In my opinion, this is the worst GPU release since I can remember, and I’ve been doing this job for over two decades."

Wait, didn't you guys review another AMD product a week ago? So it hasn't been 2 decades since you felt that way. That's what Nvidia told me, anyway.
 

Stardude82

Posts: 17   +5
The 1650 does a lot for what it is: a slot-powered video card. To this day it is the best option for systems with no upgrade path for the PSU. Nvidia completely messed up its marketing, but I don't understand the reviewer hate.
 

Robertrogue

Posts: 111   +71
This is the point in history where people of all kinds see that a "budget GPU" is $300. Like the "budget" steaks at the grocer are now $11.99 a pound, or ground beef is on sale for $5.99 a pound! Good Lord, $300 is a ridiculous amount of money for a "budget" GPU! Especially one that cannot outperform a five-year-old "budget" GPU on a newer setup. AMD should be ashamed of offering leftover crap that wouldn't work in a Playskool digital toy and assuming buyers will jump at a $200 (oh wait, $300) piece of wasted energy.
 

lostinlodos

Posts: 190   +47
Is it all that surprising, when everything is so expensive at the moment due to both resource shortages and supply chain failures?

Even if you manage to get all the materials and build something… it’s going to sit on a barge for 6 months!

$3.99 USB floppy drives are suddenly $20!

I think this is less of a "screw you" and more of a "we need to stay alive somehow".
 

Irata

Posts: 2,107   +3,635
So, a quick question @Steve: since you included PCIe 3 scaling tests in the 6500 XT review, which is indeed very useful for those still on that platform, will you also include tests of driver-overhead effects on older/weaker CPUs when you review the 3050 Ti?

This would also be very useful for upgraders.
 

Alfatawi Mendel

Posts: 203   +303
It's gotten to the point where GPU reviews are becoming meaningless. They're too expensive, too rare, and don't offer a meaningful performance uplift. This is bound to affect the wider PC market too... After all, what is the point of spending £1,000 on a system upgrade when it costs at least that again for a decent GPU? Maybe Intel's entry into the GPU market will buck things up... but I don't think so, given their previous behaviour. My next purchase will be a console.
 

BadThad

Posts: 1,037   +1,191
Every GPU iteration, ALL manufacturers release what I call a "cheap scam card" for the ignorant. It looks pretty on a retail shelf, and they sell. They're usually sold on brand recognition and often proudly boast their crappy specs on the box to make the card look "more powerful". These are the cards that folks in the know walk past, laughing a little as the noob and his friend hold the box and exclaim, "GeForce! (or Radeon!) Yeah, I hear those are good."

AMD said they were taking a top-down approach this time, and they are following through. This card was inevitable and timed perfectly to make some real margins for both AMD and resellers. Normally this segment has very thin margins because, in reality, this is a $75 card!