The Radeon RX 6500 XT will reportedly cost more than its MSRP in some regions

midian182

Posts: 7,774   +79
Staff member
Why it matters: As predicted, one of the items unveiled by AMD during its Product Premiere livestream this week was the Radeon RX 6500 XT. The card arrives on January 19 with an MSRP of $199, and while AMD previously said it's expected to be the most accessible and affordable card of the last two years, it already looks like the suggested selling price was, unsurprisingly, optimistic.

A report from French site CowCotland claims that the Radeon RX 6500 XT will sell for 299 Euros in the country, making it around 50% more expensive than the European MSRP, according to VideoCardz. That's around the same as or slightly less than the MSRP increase applied to other RDNA 2 cards in Europe.

In an interview with PCWorld about the Radeon RX 6500 XT, AMD CEO Dr. Lisa Su said, “We’re positioning the launch such that – and I know, you guys always say, ‘Well, yeah, they’re just saying that’ – but we really are positioning the launch at a $199 price point. It is sort of affordable to the mainstream. You know, we intend to have a lot of product out there.”

Image courtesy of 3DCenter

As for the card itself, the RX 6500 XT, built on TSMC's new 6nm process, features a PCIe 4.0 x4 interface, 16 compute units, 16 MB of Infinity Cache, a 2.6 GHz game clock, and 4GB of GDDR6 on a 64-bit bus. While many may balk at the paltry amount of VRAM, using 4GB was an intentional move by AMD to make the card less appealing to miners, thereby improving availability and keeping street prices closer to MSRP. That's the plan, anyway.

It’ll certainly be interesting to see how Nvidia’s response to the RX 6500 XT, the desktop RTX 3050, will fare when it arrives in late January. Team green’s card has 8GB of VRAM and a $249 MSRP, but it will likely face more demand from miners, potentially pushing up the price and exacerbating availability problems.

You can watch AMD's full Product Premiere livestream here and Nvidia's CES keynote here.


 

Dimitriid

Posts: 2,084   +3,986
You guys are missing the other major story: there are reports that, like the 6600 and 6600 XT, this card has its PCIe lanes cut down to save money, except much, much worse: it's only a PCIe 4.0 x4 card. If you don't have a PCIe 4.0 board, it drops all the way down to a PCIe 3.0 x4 interface.

The impact wasn't tremendous on the 6600 XT and 6600, but that's because they were cut down to "only" 3.0 x8. This card goes down to 3.0 x4, which even for the limited speeds of the 6500 XT is far too constricted.

EDIT: Not missing it, but not focusing on it enough, in my view; a potential performance shortfall for most people is even more important than the price gouging we all fully expected.
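For context on the numbers behind that concern, here's a rough back-of-the-envelope sketch (not from the thread; it uses the standard per-lane PCIe transfer rates and 128b/130b encoding, and ignores packet/protocol overhead, so real-world throughput is somewhat lower) of the theoretical one-way bandwidth each configuration offers:

```python
# Rough theoretical PCIe bandwidth per direction, in GB/s.
# Assumes 128b/130b line encoding (PCIe 3.0 and newer) and
# ignores packet/protocol overhead, so these are upper bounds.
GT_PER_SEC = {"3.0": 8, "4.0": 16}  # giga-transfers per second, per lane

def pcie_bandwidth_gbps(gen: str, lanes: int) -> float:
    """Approximate usable one-way bandwidth in GB/s."""
    return GT_PER_SEC[gen] * lanes * (128 / 130) / 8

for gen, lanes in [("4.0", 4), ("3.0", 4), ("3.0", 8), ("3.0", 16)]:
    print(f"PCIe {gen} x{lanes}: {pcie_bandwidth_gbps(gen, lanes):.2f} GB/s")
```

By this estimate, a 6500 XT on a PCIe 3.0 board gets about 3.9 GB/s, half of what it gets on a 4.0 board and a quarter of what a conventional 3.0 x16 card would have.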
 
Last edited:

Irata

Posts: 2,036   +3,456
I am missing where they got this information from. The article says

store price wise, the first offers should be available for €299

Not being too familiar with the French market, I'm hoping it will at least be available at or very near MSRP in the US and Germany on the first day, like the 6600 XT and 6600 were.

That would allow those who really need one, and can live with its limitations, to get one at a not overly terrible price on launch day.

You guys are missing the other major story: there are reports that, like the 6600 and 6600 XT, this card has its PCIe lanes cut down to save money, except much, much worse: it's only a PCIe 4.0 x4 card. If you don't have a PCIe 4.0 board, it drops all the way down to a PCIe 3.0 x4 interface.

The impact wasn't tremendous on the 6600 XT and 6600, but that's because they were cut down to "only" 3.0 x8. This card goes down to 3.0 x4, which even for the limited speeds of the 6500 XT is far too constricted.

That would actually make it a decent starter card for anyone building a new budget gaming system right now - an 11400/12400 or a 5600X + B550 - since all of those are PCIe 4.0, and in the case of Intel (non-F) the iGPU already comes with a video encode block.

As for PCIe 3.0 systems - it really depends on the actual performance and how large the losses are. I am sure reviewers will test this in detail.

In the end it comes down to customer needs and priorities, how much they are willing to pay, and whether a given product meets those needs.
 

Dimitriid

Posts: 2,084   +3,986
That would actually make it a decent starter card for anyone building a new budget gaming system right now - an 11400/12400 or a 5600X + B550 - since all of those are PCIe 4.0, and in the case of Intel (non-F) the iGPU already comes with a video encode block.

As for PCIe 3.0 systems - it really depends on the actual performance and how large the losses are. I am sure reviewers will test this in detail.

In the end it comes down to customer needs and priorities, how much they are willing to pay, and whether a given product meets those needs.
That's a very narrow set of circumstances: statistically, I think most people are far more likely to still be on PCIe 3.0, so most people will run into the lane issue... if it is an issue, that is: it was within the margin of error for the 6600 and 6600 XT cards, so I could be totally wrong, and this GPU may already be so modest in performance that it truly does just fine capped at PCIe 3.0 x4 speeds.

In any case, it would be advisable to just fully avoid day-one purchases. Even if there is a small chance to get one at the $200 MSRP, you might not want something that's basically just another 1650 (non-Super, non-Ti) once the PCIe lane constraints kick in; only independent testing will tell the full story.
 

Irata

Posts: 2,036   +3,456
That's a very narrow set of circumstances: statistically, I think most people are far more likely to still be on PCIe 3.0, so most people will run into the lane issue... if it is an issue, that is: it was within the margin of error for the 6600 and 6600 XT cards, so I could be totally wrong, and this GPU may already be so modest in performance that it truly does just fine capped at PCIe 3.0 x4 speeds.

In any case, it would be advisable to just fully avoid day-one purchases. Even if there is a small chance to get one at the $200 MSRP, you might not want something that's basically just another 1650 (non-Super, non-Ti) once the PCIe lane constraints kick in; only independent testing will tell the full story.
Totally agree on not buying it without checking reviews first.

Don't really see this as an upgrade card, because which graphics card would this even be worth upgrading from on a still-relevant system?

And don't get me wrong - I don't think this is a particularly desirable card. Looking at all its specs, it's clear to me that Navi 24 was designed as a companion chip for Ryzen-based laptops, where x4 lanes are enough and save power, and the processor has the necessary video decode and encode functionality.

On desktop, this would be a good RX 550 / 560 successor as a no-frills, sub-75W entry-level GPU - in the same price range, adjusted for current increased memory, component, and shipping costs, so closer to €/$150.

But the market is what it is, so it might end up being the least bad deal for those building a new PC or whose current graphics card has died.
 

Dimitriid

Posts: 2,084   +3,986
Totally agree on not buying it without checking reviews first.

Don't really see this as an upgrade card, because which graphics card would this even be worth upgrading from on a still-relevant system?

And don't get me wrong - I don't think this is a particularly desirable card. Looking at all its specs, it's clear to me that Navi 24 was designed as a companion chip for Ryzen-based laptops, where x4 lanes are enough and save power, and the processor has the necessary video decode and encode functionality.

On desktop, this would be a good RX 550 / 560 successor as a no-frills, sub-75W entry-level GPU - in the same price range, adjusted for current increased memory, component, and shipping costs, so closer to €/$150.

But the market is what it is, so it might end up being the least bad deal for those building a new PC or whose current graphics card has died.
I think it is an upgrade over a 1050 Ti or something like the RX 470 or below, but that's cutting it far closer. Also, at least on paper, the feature set and driver support are far greater on the 6500 XT, with AMD promising driver-level FSR on everything.

Granted, it's not anywhere near as useful as the 3050 getting DLSS 2.x instead, but everybody expects those to go straight to miners anyway, so that advantage might be nullified in practice.

Also, just a comment: you mentioned some common new-rig scenarios, but there are others that don't support PCIe 4.0. If you go budget AMD with a B450 board and a 3600, or even brand-new lower-end parts like an A520 board plus a 5600G, you still end up with PCIe 3.0 only. So not even all new systems will be fine with the 6500 XT's potential limitation, and since this is a budget card very likely to be paired with budget systems, you can easily build a brand-new rig that would be impacted by the 3.0-only x4 lanes.
 

Lew Zealand

Posts: 2,106   +2,517
TechSpot Elite
You guys are missing the other major story: there are reports that, like the 6600 and 6600 XT, this card has its PCIe lanes cut down to save money, except much, much worse: it's only a PCIe 4.0 x4 card. If you don't have a PCIe 4.0 board, it drops all the way down to a PCIe 3.0 x4 interface.

The impact wasn't tremendous on the 6600 XT and 6600, but that's because they were cut down to "only" 3.0 x8. This card goes down to 3.0 x4, which even for the limited speeds of the 6500 XT is far too constricted.

Even quartering the bandwidth of the RTX 3080 only cut its performance by 5%. On a card as low-end as the 6500 XT, the impact will likely be negligible.

[Chart: average FPS at 1920x1080]


https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-pci-express-scaling/26.html
 

Dimitriid

Posts: 2,084   +3,986
Even quartering the bandwidth of the RTX 3080 only cut its performance by 5%. On a card as low-end as the 6500 XT, the impact will likely be negligible.

[Chart: average FPS at 1920x1080]


https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-pci-express-scaling/26.html
The difference is the VRAM: the 6500 XT is also starved for VRAM at just 4GB. If it were just "limited lanes but decent VRAM" or "limited VRAM but full PCIe x8/x16 speeds," then the impact would probably be minor, yes.

But we know the card is roughly at least as capable as a 1650 Super, so while some post-processing effects are too much for it, given proper VRAM and bus speeds it *could* actually be good enough for high-detail textures at 1080p, which is a very noticeable improvement.

I can concede that the hit may end up being very small, but I still think it matters, because the card also offers very little performance increase over what low-end customers likely already have or could get, like the aforementioned 1650 Super. In the current market, that card is likely to cost the same as or even less than the 6500 XT. Once we account for settings that weren't cherry-picked by AMD for their presentation (again, like AMD intentionally lowering textures to hide the VRAM and PCIe speed limitations), why would you bother?

I would honestly recommend people steer away from the 6500 XT in favor of the 1650 Super, since we expect only a 5 to 10% performance uplift in the best of cases, so even minor performance hits make it pointless. The exception is the very specific scenario of someone who somehow has a 5600X or a 12600/other Alder Lake system up and running but is still carrying around a 1050 Ti or an RX 570 from years ago; for those people, yes, the 6500 XT. But as I said above, in most scenarios where people are limited to PCIe 3.0, just get an Nvidia card instead.
 

Lew Zealand

Posts: 2,106   +2,517
TechSpot Elite
The difference is the VRAM: the 6500 XT is also starved for VRAM at just 4GB. If it were just "limited lanes but decent VRAM" or "limited VRAM but full PCIe x8/x16 speeds," then the impact would probably be minor, yes.

But we know the card is roughly at least as capable as a 1650 Super, so while some post-processing effects are too much for it, given proper VRAM and bus speeds it *could* actually be good enough for high-detail textures at 1080p, which is a very noticeable improvement.

I can concede that the hit may end up being very small, but I still think it matters, because the card also offers very little performance increase over what low-end customers likely already have or could get, like the aforementioned 1650 Super. In the current market, that card is likely to cost the same as or even less than the 6500 XT. Once we account for settings that weren't cherry-picked by AMD for their presentation (again, like AMD intentionally lowering textures to hide the VRAM and PCIe speed limitations), why would you bother?

I would honestly recommend people steer away from the 6500 XT in favor of the 1650 Super, since we expect only a 5 to 10% performance uplift in the best of cases, so even minor performance hits make it pointless. The exception is the very specific scenario of someone who somehow has a 5600X or a 12600/other Alder Lake system up and running but is still carrying around a 1050 Ti or an RX 570 from years ago; for those people, yes, the 6500 XT. But as I said above, in most scenarios where people are limited to PCIe 3.0, just get an Nvidia card instead.

Here's some better data, with an AMD Navi 2 card this time, and the difference is bigger:

[Chart: relative performance at 1920x1080]


7% down at the same bandwidth as PCIe 3.0 x4, but on a more capable card, so the 6500 XT will still likely land in the 4-5% performance reduction range. Not horrible, but not ideal. The price is the annoying part, though...

For this 6600 XT comparison, I looked at the games I play: going from PCIe 4.0 to 3.0 (I'm all 3.0 at home) made no difference in half the games, a 2% difference in most of the rest, and a 5-7% difference in two games. So the hit will be large in a few individual titles and pretty much nothing in the rest.
 

hahahanoobs

Posts: 4,308   +2,311
4 PCIe lanes and 4GB is just not a good look for a mainstream dGPU in 2022. That explains the 2.6GHz core clock and the emphasis that it's only a 1080p card. lol
Is anyone really upset they can't buy it?
 
Last edited:

Irata

Posts: 2,036   +3,456
The difference is the VRAM: the 6500 XT is also starved for VRAM at just 4GB. If it were just "limited lanes but decent VRAM" or "limited VRAM but full PCIe x8/x16 speeds," then the impact would probably be minor, yes.

But we know the card is roughly at least as capable as a 1650 Super, so while some post-processing effects are too much for it, given proper VRAM and bus speeds it *could* actually be good enough for high-detail textures at 1080p, which is a very noticeable improvement.

I can concede that the hit may end up being very small, but I still think it matters, because the card also offers very little performance increase over what low-end customers likely already have or could get, like the aforementioned 1650 Super. In the current market, that card is likely to cost the same as or even less than the 6500 XT. Once we account for settings that weren't cherry-picked by AMD for their presentation (again, like AMD intentionally lowering textures to hide the VRAM and PCIe speed limitations), why would you bother?

I would honestly recommend people steer away from the 6500 XT in favor of the 1650 Super, since we expect only a 5 to 10% performance uplift in the best of cases, so even minor performance hits make it pointless. The exception is the very specific scenario of someone who somehow has a 5600X or a 12600/other Alder Lake system up and running but is still carrying around a 1050 Ti or an RX 570 from years ago; for those people, yes, the 6500 XT. But as I said above, in most scenarios where people are limited to PCIe 3.0, just get an Nvidia card instead.
The interesting part is that being VRAM-starved is not necessarily the reason for the performance degradation. The hit is actually lower at higher resolutions, where the game needs more VRAM. TechPowerUp offer this explanation, which also shows that the degradation depends on how the game was written:

"The reason why this effect is much bigger at 1080p than 4K is because the data transfer is performed for every single rendered frame. Since 1080p runs much higher FPS than 4K, the observed effect is much stronger at lower resolutions."

So, coincidentally, lower fps could also mean lower degradation.

And:

"The underlying reason we're seeing these effects in some games is that nearly all titles are developed for consoles first, which have just one kind of memory that's shared between CPU and GPU. This also means that moving data between the CPU and GPU is incredibly fast and doesn't incur the latency or bandwidth penalties we're seeing on the PC platform."
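TechPowerUp's point can be illustrated with a toy model (purely illustrative; the 0.5 ms per-frame transfer cost is a made-up number, not a measured one): if every frame incurs a fixed bus-transfer delay, the relative fps loss grows as the base frame rate rises, which is why 1080p suffers more than 4K.

```python
# Toy model: a fixed per-frame transfer delay over the PCIe link
# costs proportionally more fps the higher the base frame rate is.
def fps_with_bus_overhead(base_fps: float, transfer_ms: float) -> float:
    """Effective fps when each frame adds a fixed transfer delay."""
    frame_ms = 1000.0 / base_fps
    return 1000.0 / (frame_ms + transfer_ms)

for base in (60, 144, 240):
    eff = fps_with_bus_overhead(base, transfer_ms=0.5)
    loss = 100 * (1 - eff / base)
    print(f"{base} fps -> {eff:.1f} fps ({loss:.1f}% loss)")
```

With the same assumed 0.5 ms delay, the percentage loss roughly quadruples going from 60 fps to 240 fps, matching the "stronger effect at higher FPS" behavior described above.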

 

Dimitriid

Posts: 2,084   +3,986
OK, I can concede that the above posts could tell a different story. I think it's better to just remain skeptical of day-one purchases, as one should, but otherwise it might truly not be a big deal beyond the usual objections, like a $200 MSRP for a card that should normally cost under $150.
 

PetrolHead

Posts: 68   +36
Hardware Unboxed saw a difference between PCIe 3.0 and PCIe 4.0 in some games with the 5500 XT, and that card had double the lanes (x8), so it will be very interesting to see how much of a difference the PCIe version makes in this case. Even if it doesn't make a huge difference, I don't think the optics of this design choice are very good. Looking purely at PCIe throughput numbers, the 6500 XT is comparable to dGPUs that are 15 years old. That doesn't really sound like something you'd like to fork out €300 for...
 

Sausagemeat

Posts: 1,592   +1,413
AMD cheaped out big time on this card. They know their customers will buy anything at the moment. This thing doesn't even have an encoder on board. It's PCIe 4.0 x4, which saves metal and production costs as there are fewer traces to run.

AMD execs have just gone "throw them a bone, claim we will charge $199 but actually charge $299."

In many ways this card is worse than the RX 480 I bought for £200 in 2016. The RX 480 has double the VRAM and an encoder; it even has a higher teraflop compute rating.
 

Nobina

Posts: 3,724   +4,099
Well, it ain't gonna be cheaper than MSRP, for sure. 4GB means miners will probably avoid it, but so will gamers. It's a tremendous bottleneck for new games.
 

Irata

Posts: 2,036   +3,456
Well, it ain't gonna be cheaper than MSRP, for sure. 4GB means miners will probably avoid it, but so will gamers. It's a tremendous bottleneck for new games.
If only there were a way to reduce memory usage, like, say, reducing quality settings.

Amazingly, while 4GB on what will probably be a $300-ish 1080p card is a huge issue, 10GB on a $1,600 4K card is not an issue.
 

EdmondRC

Posts: 240   +324
I would not be surprised to see the 5500 XT actually outperform the 6500 XT in some games. AMD is really releasing a turd here. I don't mean to be crude, but I couldn't think of a better term for it. Only 4 PCIe lanes (the 5500 XT had 8; why reduce this from 16 at all?), 4GB of VRAM, and 18 Gbps memory is not going to overcome the fact that many 1080p games already require more than 4GB. I realize this is an entry-level gaming card, but that doesn't mean it has to be this bad. The 5500 XT could not clock nearly as high, but it had 22 compute units vs 16 in the 6500 XT. The RTX 3050, on the other hand (at least the 8GB version), should be a pretty decent entry-level card capable of high settings at 1080p with max textures. The 3050 will even be decent at 1440p, I'll bet. For $50 more, the 3050 is definitely the entry-level card to get if you can. And that is the crux, huh? Maybe AMD is betting this card will be so bad for mining that it will actually be available to gamers, and sell anyway despite being equally bad for them? No matter the reason, every AMD GPU released after the initial launch of RDNA 2 has been disappointing, but this one takes the cake.
 

Axeia

Posts: 38   +40
Don‘t really see this as an upgrade card because which graphics card would this even be worth upgrading from on a still relevant system ?
Think the simplest way to answer that is: if your card has less memory than this one, it will be an upgrade. If it has the same amount, it will either be a slight upgrade or nearly the same.

If you have a card with more memory, then why the heck consider this overpriced card?
 

Irata

Posts: 2,036   +3,456
Think the simplest way to answer that is: if your card has less memory than this one, it will be an upgrade. If it has the same amount, it will either be a slight upgrade or nearly the same.

If you have a card with more memory, then why the heck consider this overpriced card?
My point was that I really only see this as a card for someone building a new PC or needing a replacement card.

Yes, it's overpriced, although at or near MSRP it's far less so than pretty much everything else on the market, if you look at it in absolute terms.

So, do you buy a $300 3060 for $800 or a $150 6500 XT for $250?

This all comes down to what you expect graphics card prices to do in the coming months and which performance level you really want, i.e. whether even a 3060 would mean settling for a lower performance level than you wanted.
 

Theinsanegamer

Posts: 3,317   +5,508
I would not be surprised to see the 5500 XT actually outperform the 6500 XT in some games. AMD is really releasing a turd here. I don't mean to be crude, but I couldn't think of a better term for it. Only 4 PCIe lanes (the 5500 XT had 8; why reduce this from 16 at all?), 4GB of VRAM, and 18 Gbps memory is not going to overcome the fact that many 1080p games already require more than 4GB. I realize this is an entry-level gaming card, but that doesn't mean it has to be this bad. The 5500 XT could not clock nearly as high, but it had 22 compute units vs 16 in the 6500 XT. The RTX 3050, on the other hand (at least the 8GB version), should be a pretty decent entry-level card capable of high settings at 1080p with max textures. The 3050 will even be decent at 1440p, I'll bet. For $50 more, the 3050 is definitely the entry-level card to get if you can. And that is the crux, huh? Maybe AMD is betting this card will be so bad for mining that it will actually be available to gamers, and sell anyway despite being equally bad for them? No matter the reason, every AMD GPU released after the initial launch of RDNA 2 has been disappointing, but this one takes the cake.
I've said this on some other news articles: the 560X from 2017 is already limited by its 4GB framebuffer in some titles - DOOM, Cities: Skylines, and now Halo Infinite at 1080p.

4GB of VRAM on a 64-bit bus with a PCIe x4 interface is a recipe for disaster. The worst part? It pulls about 100 watts, around the same power draw as the 5500 XT, for likely the same performance.

AMD has utterly dropped the ball on everything from the 6700 XT down.
 

Axeia

Posts: 38   +40
Yes, it's overpriced, although at or near MSRP it's far less so than pretty much everything else on the market, if you look at it in absolute terms
European prices are in, and OEM cards aren't going to be any less than a minimum of €300.
The new market is dead to me; I'll get a second-hand GTX 1080.
 

Irata

Posts: 2,036   +3,456
European prices are in, and OEM cards aren't going to be any less than a minimum of €300.
The new market is dead to me; I'll get a second-hand GTX 1080.
We'll see when they are released. In the case of the 6600 series cards, you could at least get models at or near MSRP for a good while on launch day. I hope it's the same for the 6500 XT, allowing those who really need a graphics card not to overspend by too much.