AMD Radeon RX 6500 XT Review: A Bad, Really Bad Graphics Card

Aaron Fox

Posts: 153   +90
This card supports raytracing so where are the raytracing tests?

I know that sounds like a joke but remember that AMD put raytracing into this card and will sell the cards to ignorant buyers based on the card's 'awesome' capability.

Fooling naive buyers isn't cool. It's bad business, no matter how hip it makes people in the know feel. That includes what AMD did when it put the FX-9590 on the market and charged an obscene price for it, while telling enthusiasts it was a proper 8-core chip with a huge 5 GHz clock. In reality, it generally couldn't hit 5 GHz, and it had only four FPUs, so it was bad for gaming. If it ever did manage to hit its 5 GHz turbo, it would hold it for two seconds or less. And put it into an ASRock board certified for the 9000 series and it would catch fire.

It's a good thing for AMD that I didn't review this card because I would have started with charts showing its raytracing-enabled performance, versus the other cards that support raytracing — all the way up to the 3090. Those charts, of course, would be with this card running in PCI-e 3 mode. After that we could see its raytracing glory running in PCI-e 4.

Then, I would show non-raytraced charts with the card running only in a PCIe 3.0 board. I would include the Fury X, which is a more powerful 4GB card. AMD nevertheless chose to prematurely stop providing driver updates to owners of the Fury X, Fury, and Nano, any of which are better than this card. This, despite the shortage and Windows 11 coming to market.

After all that, I would show the card's non-ray performance in PCI-e 4.

Then, I would find the used eBay prices for the Fury X, 980, and 980 Ti from before this latest mining craziness. People were getting those cards for less than what this card is selling for. So a price-to-performance chart would show what people were able to get used, versus the performance of this card. We could add the 1070 as well.

The next chart would show the prices the 470 and 480 routinely sold for, new. Slickdeals constantly had those dirt cheap, and new. Those bargain prices became the functional MSRPs. So we'd see how much performance one would have gotten for that money, versus this card.
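That price-to-performance comparison is easy to mock up. The sketch below uses made-up placeholder prices and frame rates (none of these numbers come from the review or from real listings); the point is only the fps-per-dollar ranking mechanism you'd plug real data into:

```python
# Hypothetical price-to-performance comparison of the kind described above.
# All prices and average-FPS figures are invented placeholders, NOT measured
# data -- substitute real used/retail prices and benchmark results.

cards = {
    # name: (price_usd, avg_fps_1080p)
    "RX 6500 XT (street)": (300, 60),
    "Fury X (used, pre-spike)": (150, 62),
    "GTX 980 Ti (used, pre-spike)": (180, 70),
    "RX 480 8GB (Slickdeals-era new)": (170, 58),
}

def fps_per_dollar(price, fps):
    """Higher is better: average frames per second per dollar spent."""
    return fps / price

# Rank the cards from best to worst value.
ranked = sorted(cards.items(),
                key=lambda kv: fps_per_dollar(*kv[1]),
                reverse=True)

for name, (price, fps) in ranked:
    print(f"{name:35s} {fps_per_dollar(price, fps):.3f} fps/$")
```

With these placeholder numbers, the used previous-generation cards land at the top of the ranking and the 6500 XT at the bottom, which is the shape of the argument being made.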

All of this is off the cuff. The main point is that when AMD makes a point of selling a product with a feature, like it does here with raytracing, its performance should be held to account.

And, literally, this is what Chacos of PCWorld said about this card and raytracing:

'Finally! AMD’s Radeon RX 6500 XT takes ray tracing under $200. Budget graphics cards are back! Cue the “It’s finally happening” GIFs. Well over a year after the launch of this GPU generation and after 12 months of excruciating supply woes, the first new-school graphics card that normal people might be able to afford is finally here. Hallelujah.

This GPU features 16 compute units and ray tracing accelerators along with 16MB of Infinity Cache, a radical RDNA 2 innovation that helps Radeon GPUs play games faster at the resolutions they’re tuned for.

Those ray accelerators mean the 6500 XT can handle real-time ray tracing, though frame rates will undoubtedly be low unless you flip on AMD’s FidelityFX Super Resolution, or the new Radeon Super Resolution feature that works in almost any game.'

Hallelujah, indeed.
 

Avro Arrow

Posts: 2,204   +2,591
TechSpot Elite
Navi 24 was designed as an entry level laptop companion dGPU, essentially an MX competitor. In this light, the specs make perfect sense.

Selling to OEMs, every dollar you can shave off the cost matters - just think of the subpar HSFs they use for desktop systems, where - according to Steve at GN - they could have gotten a noticeably better HSF for a few bucks more.

Of course, releasing this as a desktop graphics card will inevitably come with drawbacks. IMHO, in a normal market this would not have been released at all, or only as a sub-75W RX 550 replacement.
Well yeah, I get all that, but unlike Dell and HP customers, people who buy video cards are tech-savvy enough to install them without breaking their motherboards.

People who buy Dell and HP are the same people who think that Best Buy is the place to go for computer stuff and have often never heard of Canada Computers or Microcenter. They just use computers for stupid **** like Facebook.

AMD had to know that their customers would realise that something was wrong. On the other hand, someone with a Dell could probably spend hours upon hours on the phone with an equally clueless Dell customer service rep and never figure out the problem.

Then of course, there's the Techtubers who review these things...
 

Avro Arrow

Posts: 2,204   +2,591
TechSpot Elite
It's not THAT bad. It's not even at the bottom of the list. Why the tantrum and exaggeration? The price? Everything is bloated nowadays, not just this.

AMD is going nowhere, though, with its STILL arrogant pricing. AMD has not yet woken up from the dominance slumber it fell into back when Ryzen 3 debuted.

That said, I am still very satisfied with my 5700 XT. It cruises through even the new games at 1440p maxed out, at 60 fps minimum, effortlessly. And best of all, I got it at a time when the scalper phenomenon was not this bad, and I actually got it at MSRP.
Yeah, I got pretty lucky with my RX 5700 XT as well. Here in Canada, the going rate for the RX 5700 XT was CA$580+tax, but I found my XFX on Amazon for CA$490 back in August of 2020. Normally, I don't buy a card right before the next gen is released, because they tend to drop the prices of the previous generation when the new gen is introduced or, in the case of nVidia, they screw people over like they later did to RTX 2080 Ti owners. However, I knew that I wasn't going to get better than CA$90 below the going price, and at the time it was actually cheaper in Canada than in the USA. I only wish I had two, because it turns out that the RX 5700 XT is a damn good Ethereum miner, better even than the RX 6800 XT.
 

Danny101

Posts: 2,026   +838
Hopefully, no one buys this. Of all the issues, the lack of hardware encoding is probably the worst. It's not even HTPC-capable.
 

Lounds

Posts: 1,082   +955
Hi Steve,
I was wondering why you only showed one game with SAM enabled? The retailer Overclockers UK did a video on the Sapphire Pulse RX 6500 XT where they showed PCIe 3.0 vs 4.0, but then also did the same run again with SAM enabled, and certain titles saw massive gains. CS:GO, for example, saw a 70% improvement.

I'm not saying the RX 6500 XT is the best product but I encourage anyone interested to watch the video.

 

Axeia

Posts: 38   +40
infamous black screen bug (which still isn't fixed btw)
That never affected the RX580 or any Polaris card though. That was an issue with Vega cards.

The only issue I have with mine is that by default it's kinda loud. My own fault for going with the cheapest possible card I could find (some PowerColor model). But it undervolts like a champ, which in turn runs it nice and quiet. No regrets about getting this card; it demolishes this RX 6500 XT in performance, and with my undervolt it's not even that far off power-efficiency-wise. It also outperforms the NVIDIA GTX 1060 my girlfriend got around the same time. The RX 580 aged much better, 8GB vs 3GB, even though the prices were nearly identical at the time. And AMD squeezed up to something like 20% more performance out of the drivers over the years, whilst for NVIDIA it mostly remained the same.

I definitely wouldn't consider AMD unreliable. Sometimes you just get unlucky. I've had more bad luck in that regard with NVIDIA (a GTX 970 that had to be underclocked to prevent crashing, a laptop that started games on the iGPU instead of the NVIDIA graphics, etc.).

The Vega black screens however were pretty bad so if you had a Vega card it's definitely understandable to want to move away from that.

Atm I wouldn't recommend brand loyalty to anyone. "Get what you can get" is what it has come to. Even if it's this awful 6500 XT (and you have a PCIe 4.0 motherboard).
 

Grinnie Jax

Posts: 29   +58
Pathetic behavior from AMD. I've been their supporter for over two decades (my first CPU was the Thunderbird 1400, which I successfully pushed to 1.8 GHz). I even bought laptops with AMD CPUs/GPUs to support them when they were at their weakest (Bulldozer, etc.). But now I am ashamed to see that this card has a $200 MSRP, when it should have been called the 6300 or 6400 and priced at $100 maximum. Shame on you, AMD. My old fart RX 580 provides more punch than that. PCIe-limited, bus-limited, memory-limited, but costing more than what I paid for my RX 580 four years back (!!).
 

Makste

Posts: 152   +104
A score of 20... Yikes. It scores about 40 in my book.
It is an underwhelming product nevertheless.
 

pencea

Posts: 270   +233
What a bittersweet piece to read: AMD has truly reached Nvidia or Intel levels of arrogance and consumer abuse, and in record time, because they haven't even financially grown past any of their competitors. AMD is just pure, undistilled hubris, and I hope enough customer backlash forces them to leave the consumer GPU market altogether. They can't compete on anything beyond integrated graphics, and at this point I'm beginning to think both Sony and Microsoft would see nothing but improvements if they switched their consoles to Nvidia in the future.

If AMD leaves the GPU market, then it will be even worse for us consumers. Nvidia will charge even more and can do whatever they want. Intel GPUs don't count.
 

Danny101

Posts: 2,026   +838
This card can be made to work if you are informed. But most buyers won't be, and they'll end up installing it in a PCIe 3.0 system, and that's where AMD drops the ball. It's also overpriced even for what it can do. Steve is right to give AMD the blisters that they deserve.
 

Shadowboxer

Posts: 2,071   +1,650
That never affected the RX580 or any Polaris card though. That was an issue with Vega cards.

The only issue I have with mine is that by default it's kinda loud. My own fault for going with the cheapest possible card I could find (some PowerColor model). But it undervolts like a champ, which in turn runs it nice and quiet. No regrets about getting this card; it demolishes this RX 6500 XT in performance, and with my undervolt it's not even that far off power-efficiency-wise. It also outperforms the NVIDIA GTX 1060 my girlfriend got around the same time. The RX 580 aged much better, 8GB vs 3GB, even though the prices were nearly identical at the time. And AMD squeezed up to something like 20% more performance out of the drivers over the years, whilst for NVIDIA it mostly remained the same.

I definitely wouldn't consider AMD unreliable. Sometimes you just get unlucky. I've had more bad luck in that regard with NVIDIA (a GTX 970 that had to be underclocked to prevent crashing, a laptop that started games on the iGPU instead of the NVIDIA graphics, etc.).

The Vega black screens however were pretty bad so if you had a Vega card it's definitely understandable to want to move away from that.

Atm I wouldn't recommend brand loyalty to anyone. "Get what you can get" is what it has come to. Even if it's this awful 6500 XT (and you have a PCIe 4.0 motherboard).
No, the black screen bug mostly affected the RDNA cards, but it did also affect Vega and Polaris to a lesser extent. Although it could well be two separate bugs for all I know. AMD just seem to leave users to the mercy of Reddit rather than offer any meaningful support.

I was on Radeon for years and they definitely are more unreliable. The market knows it too: right now people are paying a lot more for an RTX 3060 than for an RX 6600 XT, considerably more, despite the 6600 XT having better frame rates. Users know that there is more to a GPU than average frame rates.
 

Avro Arrow

Posts: 2,204   +2,591
TechSpot Elite
I feel sorry for the sand, but thanks for the review. AMD has confirmed that they will stop at nothing if there's a chance to make a larger profit. All the blabbing about the 4GB of VRAM being a counter-Ethereum solution was pointless; it was obvious to everyone but the die-hard fanboys that, combined with the x4 PCIe link, it would result in performance even lower than an RX 5500 XT on systems limited to PCIe Gen 3.
You could have replaced the word "AMD" with "Intel" or "nVidia" and it still would have sounded correct.

The RTX 3050 has 8GB of VRAM. I wonder just how long until miners discover how to circumvent nVidia's disabling algorithm like they have before on other LHR cards. Of course nVidia knows this will happen, and that's why the RTX 3050 has 8GB of VRAM while the RTX 3080, at the clear other end of the spectrum, has 10GB. It's a low-power card with only 2GB less VRAM than the RTX 3080. They know that this will be a mining card. You know what this means?

It means:
nVidia has confirmed that they will stop at nothing if there's a chance to make a larger profit, and all the blabbing about being on the side of gamers against mining was just that. Instead of using 4GB of VRAM to halt mining, or even 6GB to slow it down, they once again relied on a software-based inhibitor, something that miners have already demonstrated they can hack.

We can expect some miner to figure out how to hack the RTX 3050, and suddenly they'll be as expensive as (or more expensive than) the RX 6600, but without the availability. I'm sure that's Big Huang's wet dream.

AMD's release of the RX 6500 XT wasn't an act of evil, it was an act of lunacy. I say this because ultimately, AMD will be hurt the most by the RX 6500 XT. They had come so far by matching nVidia, but they just flushed it down the toilet. AMD isn't going to profit from this; it's going to bleed. It's going to bleed because very few people are loyal to ATi GPUs, while nVidia fans get the same look in their eyes that Apple fans do. :laughing:
 

Shadowboxer

Posts: 2,071   +1,650
Until someone figures out how to disable the disabler and bring the card up to FHR. It WILL happen.
Currently, LHR cards can mine at up to about 70% of their max, and that's with a hacked BIOS, which isn't a huge jump from the 50% mining performance LHR cards start with. And you have to remember the card is operating at 100% power to do this, making it very inefficient. Mining difficulty and energy costs have gone up, emphasising efficiency at the moment. Where I live, the only cards consistently in stock are Nvidia LHR parts like the 3060, 3070 Ti or the 3080 Ti. They sell for big markups but are always there to buy. I think LHR has done a lot, tbf, as you can't even smell a 3070, 3080 or 3060 Ti anywhere anymore.

The thing is, the 3050 performs dreadfully at mining. Even with a hacked BIOS allowing it 20% more mining performance, it's going to be considerably less profitable to mine on than a Radeon 6600. And at the prices I've seen, it's going to cost more than a Radeon 6600 too.

You have to remember, Nvidia sell dedicated mining cards. They do have more than a passing interest in making sure these LHR GeForce parts get to gamers over miners.
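The efficiency argument above works out roughly like this. A quick sketch with invented round numbers (the 60 MH/s and 220 W figures are placeholders, not measurements; only the 50%/70% LHR ratios come from the discussion), assuming the hacked card really does draw full power:

```python
# Back-of-the-envelope sketch of the LHR efficiency argument.
# Hashrate and wattage figures are illustrative placeholders, not measurements.

def efficiency(hashrate_mhs, power_w):
    """Mining efficiency in MH/s per watt -- the figure that matters
    once electricity costs and mining difficulty dominate."""
    return hashrate_mhs / power_w

full_rate, full_power = 60.0, 220.0  # hypothetical unrestricted card

unrestricted = efficiency(full_rate, full_power)
lhr_stock    = efficiency(0.5 * full_rate, full_power)  # LHR out of the box
lhr_hacked   = efficiency(0.7 * full_rate, full_power)  # hacked BIOS, still full power

print(f"unrestricted: {unrestricted:.3f} MH/s per W")
print(f"LHR stock:    {lhr_stock:.3f} MH/s per W")
print(f"LHR hacked:   {lhr_hacked:.3f} MH/s per W")
```

Under the full-power assumption, the hacked LHR card only ever reaches 70% of the unrestricted card's efficiency, which is the inefficiency being described; note the later reply disputes the full-power assumption, which would change these numbers.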
 

RedBear

Posts: 42   +35
You could have replaced the word "AMD" with "Intel" or "nVidia" and it still would have sounded correct.

The RTX 3050 has 8GB of VRAM. I wonder just how long until miners discover how to circumvent nVidia's disabling algorithm like they have before on other LHR cards. Of course nVidia knows this will happen, and that's why the RTX 3050 has 8GB of VRAM while the RTX 3080, at the clear other end of the spectrum, has 10GB. It's a low-power card with only 2GB less VRAM than the RTX 3080. They know that this will be a mining card. You know what this means?

It means:
nVidia has confirmed that they will stop at nothing if there's a chance to make a larger profit, and all the blabbing about being on the side of gamers against mining was just that. Instead of using 4GB of VRAM to halt mining, or even 6GB to slow it down, they once again relied on a software-based inhibitor, something that miners have already demonstrated they can hack.

We can expect some miner to figure out how to hack the RTX 3050, and suddenly they'll be as expensive as (or more expensive than) the RX 6600, but without the availability. I'm sure that's Big Huang's wet dream.

AMD's release of the RX 6500 XT wasn't an act of evil, it was an act of lunacy. I say this because ultimately, AMD will be hurt the most by the RX 6500 XT. They had come so far by matching nVidia, but they just flushed it down the toilet. AMD isn't going to profit from this; it's going to bleed. It's going to bleed because very few people are loyal to ATi GPUs, while nVidia fans get the same look in their eyes that Apple fans do. :laughing:
Nvidia really did what you're talking about with the RTX 2060 12GB, a GPU that they released stealthily and with no stock whatsoever available to gamers. I'm not sure about the RTX 3050; allowing reviews one day before release seems to imply that they're confident there will be some stock and that they want gamers to know about the GPU. As for the 8GB of VRAM: I'm an RTX 2060 owner and sometimes 6GB feels barely enough for that GPU. AMD itself used to say that 4GB was not enough to game on in 2020, and I guess that's even more the case right now. Personally, I'm getting sceptical about mining being the main cause behind the shortages felt by gamers. The RX 6500 XT is already selling north of $300 even though it's incapable of mining Ethereum; there's too much demand and/or too many scalpers going around for quick-and-dirty solutions like limiting VRAM on lower-end cards to work.
 

Puiu

Posts: 5,464   +4,395
TechSpot Elite
Currently, LHR cards can mine at up to about 70% of their max, and that's with a hacked BIOS, which isn't a huge jump from the 50% mining performance LHR cards start with. And you have to remember the card is operating at 100% power to do this, making it very inefficient. Mining difficulty and energy costs have gone up, emphasising efficiency at the moment. Where I live, the only cards consistently in stock are Nvidia LHR parts like the 3060, 3070 Ti or the 3080 Ti. They sell for big markups but are always there to buy. I think LHR has done a lot, tbf, as you can't even smell a 3070, 3080 or 3060 Ti anywhere anymore.

The thing is, the 3050 performs dreadfully at mining. Even with a hacked BIOS allowing it 20% more mining performance, it's going to be considerably less profitable to mine on than a Radeon 6600. And at the prices I've seen, it's going to cost more than a Radeon 6600 too.

You have to remember, Nvidia sell dedicated mining cards. They do have more than a passing interest in making sure these LHR GeForce parts get to gamers over miners.
Nah, they don't run at 100% power, which is why people used them even at 50%.
 

Avro Arrow

Posts: 2,204   +2,591
TechSpot Elite
Nvidia really did what you're talking about with the RTX 2060 12GB, a GPU that they released stealthily and with no stock whatsoever available to gamers. I'm not sure about the RTX 3050, allowing reviews one day before release seems to imply that they're confident that there will be some stock and they want gamers to know about their GPU.
The thing is, the deals made with miners are made beforehand. That's why the cards are nowhere to be seen on launch day.
About the 8GB of VRAM, I'm an RTX 2060 owner and sometimes 6GB feel barely enough for that GPU, also AMD itself used to say that 4GB were not enough to play in 2020, I guess it's even more the case right now.
Well, I don't think that AMD put only 4GB on the card because of gaming. They did it because having only 4GB of VRAM cripples a card badly when it comes to mining Ethereum. This is also one of the reasons that I think the RX 6800 XT is a much better choice than the RTX 3080: we're talking about a VRAM difference so huge that it equals the total VRAM of the RTX 2060. As you say, your RTX 2060 feels hampered by having 6GB, but just imagine how bad it would be if your card had the power to perform but was held back by a game's VRAM requirements.

I know that my RX 6800 XT will live a long and fruitful life because 16GB won't be too little for at least a decade. I'd be pretty pissed off at nVidia if I had spent what some are spending on an RTX 3080 (just mind-blowing) and it already couldn't use the HD textures in Far Cry 6 because it has 2GB too little VRAM. Mind you, at this point, if you buy an nVidia card, you're just begging for more of their consumer abuse. :laughing:
Personally I'm getting sceptical about mining being the main cause behind the shortages felt by gamers, the RX 6500 XT is already selling north of $300 even though it's incapable to mine Ethereum, there's too much demand and/or scalpers going around for quick and dirty solutions like limiting VRAM in lower end cards.
In the case of the RX 6500 XT, I quite agree with you. I believe that the RX 6500 XT was snapped up by "starving gamers" who wanted to replace something really old like an R9 290, or by people who wanted to get into PC gaming for the first time but were unwilling to pay $1,000+ for a video card.

As for your skepticism however, the evidence that miners ARE the primary reason for the shortage is quite clear (and widespread):
[attached image: mining operation]

Now, imagine how many more cards would be in the wild if just this operation didn't exist. The pro-mining crowd tries to claim that mining isn't the main issue, but the fact remains that without them, stores would have tonnes of cards to sell. Don't believe your ears, believe your eyes:
[attached image: mining farm]

[attached image: GeForce RTX 3080 mining rig]

[attached image: mining farm]

And then there's this:
[attached chart: GPU prices vs. Ethereum value over time]

The green is the price of GeForce cards, the red is the price of Radeon cards, and the yellow is the value of Ethereum at the time.

If you consider the effect of not having those colossal mining farms and the fact that the card prices are clearly based on the value of Ethereum, you might not be so skeptical anymore.
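For what it's worth, the "card prices track Ethereum" claim is testable: compute the correlation between the two series. A minimal sketch with invented numbers (real data would be actual GPU street prices and ETH closing prices):

```python
# Sketch: correlation between GPU street prices and the ETH price.
# The monthly numbers below are invented for illustration only.
import statistics

eth_usd   = [730, 1300, 1840, 2770, 2430, 2280]   # hypothetical ETH price
gpu_price = [900, 1250, 1600, 2100, 1950, 1880]   # hypothetical street price

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(eth_usd, gpu_price)
print(f"correlation: {r:.2f}")  # values near 1.0 mean prices track ETH
```

A coefficient near 1.0 would support the chart's implication; a value near zero would undercut it. Correlation still wouldn't prove causation, but it would at least quantify the relationship the chart shows visually.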
 

qtbrn

Posts: 9   +7
AMD has just wiped out years of effort. I'm quite convinced their product marketing team is a clueless bunch of relics. They have zero idea why they were doing badly for years, and they are equally big-headed and misled about why they are still surviving today.
 

HardReset

Posts: 1,625   +1,275
Now for the manufacturing-costs claim:

"AMD claims these sacrifices were made in order to get the die size as small as they have, which allowed them to hit the $200 price point. This I believe is a lie, and even if it isn’t, it’s hard to justify releasing a product that is worse than products you had out in the market 5 years ago."

Accusing AMD of lying is a pretty bold statement. Without any arguments, it is a very bold one. Now here's something:

[attached chart: cost breakdown]

What that chart does not include is development costs. If it is even somewhat accurate, it seems your claim about AMD lying is wrong. And as for releasing a worse product than five years ago, here's some breaking news: costs are now much higher than they were five years ago.

When I commented on this and other flaws in the review, Steve ignored almost all of my arguments and said that "Our reviews are based on very in-depth testing and analysis."

Source: https://www.techspot.com/community/...ability-is-not-guaranteed.273330/post-1943462

Well, I cannot see any in-depth analysis backing up the claim that AMD is lying about production costs. In fact, I cannot see any analysis for that at all. The cost factor is a very important thing when considering this product.

Even better, Steve comments that I'm a fool if I don't think AMD is making good money with this: "If you don’t think AMD’s making good money off the 6500 XT silicon, well frankly you’re a fool."

Source: https://www.techspot.com/community/...ability-is-not-guaranteed.273330/post-1943471

Arguments for those claims are missing. The calculations I provided might of course be wrong, but at least there is something to support my opinion. Not just an "I believe".
 

Avro Arrow

Posts: 2,204   +2,591
TechSpot Elite
AMD has just wiped off years of effort. Quite convinced their Product Marketing team is a clueless bunch of relics. They have zero idea why they were doing badly for years and equally big-headed and misled why they are still surviving today.
What difference does it make? Even when AMD was doing everything right, the sheep still bleated and overpaid for green cards. If you're running an nVidia card right now, you're being an absolute hypocrite.

I own ten video cards and of those, nine are red. The one green card is an old Palit 8500 GT 1GB. I've had nothing but red since and I've been a perfectly happy gamer. If I'm not complaining, you shouldn't be either.

Now, if you DO own a Radeon, then you can complain because at least you gave them a shot. People who own GeForce cards only have no grounds to talk about Radeons.
 