PCI Express Bandwidth Test: PCIe 4.0 vs. PCIe 3.0 Gaming Performance & Limited VRAM Memory...

That's why I cannot emphasise enough that AMD have gimped this card to the point of redundancy. Sure, gamers can buy it because of good availability, but that's only because nobody is buying it. It is a poor choice for gamers, especially if it costs anything more than MSRP. It is useless for miners because of the 4GB VRAM. It is less desirable for HTPC users because it lacks an up-to-date decoder. It is too expensive for driving multiple displays for work, since a cheap GT 1050 works fine for that. So the question is, who did AMD create this train wreck of a card for? I sound harsh, but that's the fact. I think if one is looking for a cheap card, one might as well consider a GTX 1650: it doesn't have an up-to-date video decoder either, but it's got an encoder, it's got a PCI-E 3.0 x16 connection, it is also often cheaper than this RX 6500/6400 from what I have seen, and it runs most games well enough at 1080p on mid/low graphics settings.
 
It really is a ridiculously crippling design decision. Even the £99 GTX 1050 (non-Ti) 2GB had 16 lanes (link), so it can't be "cost cutting"...
 
I remember several people telling me this wasn't going to be a big deal, even when I pointed out that cutting to both just x4 AND 3.0 would cripple this card. Now we can say for certain I was right to be cautious: nobody should be buying this turd at release.
 
Even if everything did work fine, or they gave it x8 or x16, it's apparently only ~26% faster than the RX 570, which launched nearly 5 years ago at around the same price.
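For anyone wondering what the lane cut actually costs in raw numbers, here's a back-of-envelope sketch of per-direction link bandwidth. It only accounts for the 128b/130b line encoding on Gen3/Gen4; real-world throughput is lower once protocol overhead is included.

```python
# Raw per-lane transfer rates from the PCIe spec, in GT/s.
RAW_GT_S = {"3.0": 8.0, "4.0": 16.0}

def pcie_gbps(gen, lanes):
    """Approximate usable GB/s per direction for a generation and lane count."""
    return RAW_GT_S[gen] * lanes * (128 / 130) / 8  # bits -> bytes

for gen, lanes in [("3.0", 16), ("4.0", 4), ("3.0", 4)]:
    print(f"PCIe {gen} x{lanes}: {pcie_gbps(gen, lanes):.2f} GB/s")
```

So a 4.0 x4 link has roughly half the bandwidth of the plain 3.0 x16 slot on the old budget cards, and on a 3.0 board it drops to a quarter.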
 
So the question is, who did AMD create this train wreck of a card for?

"Non-tech-savvy gamers".

It's a gamer-only card, and they made it 4GB so as to shut out the miner whale spam buyers.

You and I won't buy it, but there are millions of gamers out there who will, and who will be happy to play their games at 720p or so.

Despite what the OP says, I am quite confident this card can do over 300 FPS in any game at 1024x768 or so.

Even better, it's a card released with planned obsolescence in mind, which means its unfortunate owner will want something new after a year or so, once he realises he "bought the farm".
 
I don't buy the argument that limiting VRAM usage in games to 4GB is hard or impossible. Just dropping the top mip level off textures will reduce the VRAM used by textures by roughly two thirds, and textures usually constitute the majority of VRAM usage.
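A quick sanity check on that figure, assuming square power-of-two textures with full mip chains. For a full chain the geometric series actually gives about 75% savings from dropping the top level, in the same ballpark as the estimate above:

```python
def mip_chain_bytes(width, height, bytes_per_texel, skip_levels=0):
    """Total memory for a full mip chain, optionally skipping the largest levels."""
    total = 0
    w, h, level = width, height, 0
    while True:
        if level >= skip_levels:
            total += max(w, 1) * max(h, 1) * bytes_per_texel
        if w <= 1 and h <= 1:
            break
        w //= 2
        h //= 2
        level += 1
    return total

# Illustrative case: a 4096x4096 RGBA8 texture (4 bytes per texel).
full = mip_chain_bytes(4096, 4096, 4)
reduced = mip_chain_bytes(4096, 4096, 4, skip_levels=1)
print(f"savings from dropping the top mip: {1 - reduced / full:.0%}")
```

Each mip level is a quarter the size of the one above it, so the base level alone is three quarters of the whole chain.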
 
I get the 4GB of VRAM to make the card unappealing to miners, but the x4 PCIe is just too much.
Probably afraid it would cut into their 6600 XT sales.
 
Waste of sand. They will rot on the shelves in my country along with all the other overpriced AIB Radeon GPUs. I tried looking at my favorite online store and they don't even have any 6500 XT models listed, but they sure have a crap ton of 3050s listed already (no pricing, though), stating January 27th as the launch date.
 


The reviews tested Cyberpunk 2077 on the medium setting with medium textures, and PCIe 3.0 x4 still took a performance hit.

Far Cry 6 was tested on the medium preset, and PCIe 4.0 x4 took a big performance hit (especially in the 1% lows), while PCIe 3.0 x4 took an even bigger one.

So medium settings might not fix the problem in some games (especially if you use a PCIe 3.0 motherboard).
 
I find it amusing that people genuinely believe that AMD was trying to do gamers a favour by cutting their costs and only bundling 4GB of VRAM with the 6500XT. The execs at AMD must be laughing like pirates!
 

We have to see. The cheapest 1650 on Newegg right now is $324, but it performs a good bit below an RX 570.

So in the end, what matters is a) street price vs. performance and b) what you want to use the card for. We'll see very shortly.

 
I'm wondering how much influence Infinity Cache has on this. Theoretically, it should reduce data transfer over the PCIe bus, so the performance drop should be smaller.

Additionally, there might be an unusually large difference between Resizable BAR enabled and disabled on this card. You should test that too.
 
I'm not too hopeful about the new card's performance, as not only is it bandwidth-constrained by the x4 PCIe link, it's also got half the memory bus of the 5500 XT, at 64 bits.

Thanks for revisiting the 5500 XT, but you left out a critical factor: the 5500 XT has a 128-bit bus, and the information/rumours indicate the 6500 XT has half that, a 64-bit bus. It's also going to use more power, since it appears to draw at least as much as the 5500 XT at 130W, and we all know the 1050 Ti uses less than 75W total, as it's powered by the slot alone.
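To put that bus cut in perspective, peak VRAM bandwidth is just bus width times per-pin data rate. Assuming 14 Gbps GDDR6 on the 5500 XT and 18 Gbps on the 6500 XT (the commonly reported clocks; treat them as assumptions):

```python
def vram_bandwidth_gbs(bus_width_bits, effective_gbps_per_pin):
    """Peak GDDR6 bandwidth: data pins times per-pin rate, converted to bytes."""
    return bus_width_bits * effective_gbps_per_pin / 8

print(f"5500 XT (128-bit @ 14 Gbps): {vram_bandwidth_gbs(128, 14):.0f} GB/s")
print(f"6500 XT (64-bit @ 18 Gbps): {vram_bandwidth_gbs(64, 18):.0f} GB/s")
```

Even with the faster memory clock, halving the bus leaves the newer card with roughly two thirds of the raw bandwidth, which is presumably what the Infinity Cache is meant to paper over.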

From the standpoint of someone who is tech-aware and gets asked about stuff like this, I can't even recommend this waste of sand right now, as it's overpriced. It shouldn't be more than $50 if they wanted to offer a budget card, but I guess what we got used to performance-wise for $50 is now going to cost three or four times that, at $150 or higher for this junk.
 
What we learned: do not ever buy a 4GB card. I honestly can't believe it's a thing in 2022 on a card with an RRP of US$200. Even worse, the 64-bit bus of the 6500 XT is a sad, pathetic joke IMO. I'm amazed the 6400 doesn't have a 32-bit bus, 2GB, and 4MB of IC.
 
4GB of VRAM is definitely pretty limited for a modern GPU, and PCIe 3.0 x4 will impact performance somewhat even when not exceeding 4GB of VRAM usage, but a lot of games will check the amount of VRAM available and adjust the default settings appropriately, even if that means low-quality texture settings. I haven't done the testing, but I would be surprised if there are many games that can't be adjusted to run in 4GB.
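That auto-detect logic usually amounts to something like this hypothetical sketch: pick the highest texture preset whose estimated footprint fits the reported VRAM budget. The preset sizes and the reserved amount are made-up, illustrative numbers, not any particular engine's values.

```python
# Assumed per-preset texture footprints in MB (illustrative only).
TEXTURE_PRESETS = {
    "ultra": 6000,
    "high": 3500,
    "medium": 2200,
    "low": 1200,
}

def pick_texture_preset(vram_mb, reserved_mb=800):
    """Return the highest preset that fits after reserving room for
    framebuffers, geometry, and other non-texture allocations."""
    budget = vram_mb - reserved_mb
    for name in ("ultra", "high", "medium", "low"):
        if TEXTURE_PRESETS[name] <= budget:
            return name
    return "low"  # fall back to the smallest preset regardless

print(pick_texture_preset(4096))   # 4GB card -> "medium"
print(pick_texture_preset(8192))   # 8GB card -> "ultra"
```

With numbers like these, a 4GB card defaults to medium textures, which matches the behaviour the post describes.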
 
In my opinion, there is nothing wrong with 4GB VRAM cards, especially if the price is right. But when you have limited VRAM, it is common sense that there is no avoiding spillover into system RAM, and that is where a faster bus becomes meaningful. In this case, the combination of limited VRAM and a PCIe x4 link (I still cannot believe AMD cut costs to this extent) can be disastrous. The performance hit in the review above highlights the issue clearly. No matter how good the card's specs, you will run into the PCIe bottleneck very quickly. AMD is advertising this card for use with FSR, and I can see the reason for doing so.
 