AMD Radeon RX 5500 XT 8GB Review: Navi at $200

So yeah, the 5500 XT is the model to go for in the near future, yet it got beaten by the 1660 lol.

In this particular game, the 4 GB 5500 XT beats the 4 GB 1650 Super by 1 fps and is beaten by the 6 GB GTX 1660 by 1 fps, which in turn is beaten by the 8 GB 5500 XT by 21 fps (+36%).

The 5500 XT is also faster than the 1660 Ti and Super by quite a margin in this one test, so I'm not sure you picked a good example to make your point.

What am I missing for the "lol" part?
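For anyone checking the "+36%": it's just the fps gap divided by the slower card's result. A quick sketch with illustrative numbers (assumed for this example, not taken from the review) that are consistent with a 21 fps gap:

```python
# Illustrative fps figures (assumed), consistent with a 21 fps / +36% gap.
gtx_1660_fps = 58
rx_5500xt_8gb_fps = 79

gap = rx_5500xt_8gb_fps - gtx_1660_fps              # 21 fps
percent_faster = gap / gtx_1660_fps * 100           # relative to the slower card
print(f"{gap} fps faster, +{percent_faster:.0f}%")  # -> 21 fps faster, +36%
```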
 

That was a comparison between the 4 GB and 8 GB 5500 XT models, not a comparison to the 1650S or 1660, as Guru3D may have used old data from their 1650S and 1660 reviews (not updated with newer drivers).
Here at TechSpot the results against the 1650S and 1660 are very different; however, TechSpot did not bench the 4 GB version of the 5500 XT. Many reviews have mentioned that there are instances where the 4 GB 5500 XT model simply falls flat, though.
[Chart: Gears 5, 1080p]
[Chart: Assassin's Creed Odyssey, 1920x1080]
[Chart: Wolfenstein II, 1920x1080]


Weird that the 1650S was fine in Odyssey but suffered the same fate as the 5500 XT 4GB in Wolfenstein II lol.
Seems like 6 GB is the minimum VRAM for no-compromise 1080p gaming.
 
Well, in their charts Guru3D was comparing all the listed cards. Yes, the review was about the 5500 XT, but that does not mean the other cards' numbers cannot be used.

Considering that the 1650 Super has not been on the market for that long, I do not think drivers would affect performance that much for this particular card. Were there any performance updates for Gears 5 in the recent drivers?

The reviews are different since Guru3D uses 1080p Ultra whereas TechSpot uses 1080p High Quality with - oddly - lower results.

Still don't get the "LOL" part though, as you are adding a second game (Wolfenstein II) where the fps difference between the 5500 XT (8GB) and the 1650 Super is very large.

 

Yeah, there is something off about Guru3D's numbers:
[Chart: Gears 5, 1920x1080]


Kitguru Gears 5
But yeah, as I showed you, Odyssey and Wolfenstein II punish the 5500 XT 4GB severely.

Anyway, the LOL part is for when 4 GB is not enough and performance just falls flat; that happens more often with the 5500 XT 4GB than with the 1650S. The GTX 1660 with 6 GB of GDDR5 happily beats the 5500 XT with 8 GB of 14 Gbps GDDR6, though. The 5500 XT 8GB's direct competition is the GTX 1660, while the 5500 XT 4GB competes with the GTX 1650S.

Check out this video that compares the 5500 XT 4GB vs 8GB.

On the other hand, all 1650 Super models use 14 Gbps GDDR6 chips capped at 12 Gbps (12,000 MHz effective), which makes them overclocking monsters.
EVGA 1650 Super Overclocking
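To put that headroom in numbers: memory bandwidth is the effective per-pin data rate times the bus width divided by 8. A rough sketch, assuming the 1650 Super's 128-bit bus (figures are approximate and just for illustration):

```python
# Bandwidth (GB/s) = effective data rate (Gbps per pin) * bus width (bits) / 8
def bandwidth_gb_s(rate_gbps: float, bus_width_bits: int) -> float:
    return rate_gbps * bus_width_bits / 8

bus = 128  # assumed GTX 1650 Super bus width in bits
print(bandwidth_gb_s(12, bus))  # stock 12 Gbps        -> 192.0 GB/s
print(bandwidth_gb_s(14, bus))  # chips' rated 14 Gbps -> 224.0 GB/s (~17% more)
```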
 
TechSpot gave the GTX 1660 "95/100", and it's roughly the same performance for more money / less value.

Mind you, that was quite a way back, before the "Supers" and Navi came along ... it seems our expectations for value are considerably higher now ...

The 1660 Super and RX 590 are the same price, and the 590 gets destroyed.
Look where the 5500 sits. Typical AMD. The 5500 XT is between the 580 and 590 the whole time. Three cards within 5 fps of each other. What a complete waste of time and money for the red team - again.

Fury X - overdeveloped, underdelivered
Radeon VII - rinse and repeat
5500 XT - rinse, rinse, repeat.

The 5500 XT should have had a 60/100 score.
 
I support competition, I support AMD, but this is just a terrible offering. Nvidia has a node disadvantage yet GeForce cards still offer good efficiency. AMD should sell the GPU division; they can't compete, period.
 
... Three cards within 5 fps of each other.

Well, the Polaris cards are outgoing and Navi is incoming ... there has to be some crossover.

I'm not sure what your main disappointment is; a low-mid card has to slot into the low-mid segment. A 5600 XT is incoming, so the 5500 XT has to sit below that, and the 5600 XT has to sit below the 5700 non-XT.

The fact that AMD still has two outgoing Polaris cards in that region should be of no concern; just take your pick of the segment that best suits your needs. Nvidia has ~100 SKUs all within a few fps of each other -- that doesn't seem to be an issue to anyone except Steve Burke and Jayz. ;)

So wanting to give it a lower score because it's close to other, older cards doesn't seem valid. A 2070 Super is roughly a 1080 Ti ... I don't think the 2070 Super should get pooh-poohed because of that.

The issue is that the price/performance isn't compelling enough to make a person buy it over Nvidia parts with the exact same value. But overall it's roughly competitive. 75 is probably about right, considering the clearly worse-value, equal-performing 1660 got a 95.
 

AMD is making poor choices with Radeon, yet they somehow always get a pass. I'm not so forgiving. They have overdeveloped three cards now, only to come in second each time: Fury X, Radeon VII, and now the 5500 series. It's a complete waste of time and money to go after a market with inferior products. Why buy the 5500 series when any RX 580 or 590 is literally just as good?

The GTX 1660 isn't within 2 fps like the 580, 590 and 5500 XT all are. It's an easy choice over the 5500 XT.

AMD needs consistency. They almost have it with Zen 2, but RDNA is not getting the same treatment. Too much on their plate if you ask me. Too many 7nm products in so little time...
 
"This reduces the amount of data written out to memory and transferred from memory to the L2 cache and reduces the amount of data transferred between clients (such as the texture unit) and the frame buffer".
That means that by reducing the amount of data transferred, the effective bandwidth is increased. Say you compress a 100 GB file into a 90 GB zip and transfer it over a gigabit network: the transfer time drops because the physical size is smaller. That said, lossless compression does not reduce the physical size by all that much.
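To put the analogy in numbers, the effective bandwidth is the uncompressed size divided by the time the compressed transfer actually takes. A quick sketch of the 100 GB -> 90 GB zip example (figures are illustrative):

```python
# Gigabit link: 1 Gbit/s = 0.125 GB/s raw
link_gb_per_s = 1 / 8

uncompressed_gb = 100
compressed_gb = 90  # ~10% lossless compression

transfer_time_s = compressed_gb / link_gb_per_s   # 720 s instead of 800 s
effective_bw = uncompressed_gb / transfer_time_s  # logical data delivered per second
print(f"{effective_bw:.3f} GB/s")                 # ~0.139 GB/s vs 0.125 GB/s raw (~11% more)
```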

Also, there is a wide difference between the 4 GB and 8 GB models in Gears 5, even at 1080p:
[Chart: Gears 5, 5500 XT 4GB vs 8GB, 1080p]


So yeah, the 5500 XT 8GB is the model to go for in the near future, yet it is too expensive for what it offers.

"NVIDIA GPUs utilize several lossless memory compression techniques to reduce memory bandwidth demands as data is written out to frame buffer memory. The GPU’s compression engine has a variety of different algorithms which determine the most efficient way to compress the data based on its characteristics. This reduces the amount of data written out to memory and transferred from memory to the L2 cache and reduces the amount of data transferred between clients (such as the texture unit) and the frame buffer. Turing adds further improvements to Pascal’s state-of-the-art memory compression algorithms, offering a further boost in effective bandwidth beyond the raw data transfer rate increases of GDDR6. As shown in Figure 10, the combination of raw bandwidth increases, and traffic reduction translates to a 50% increase in effective bandwidth on Turing compared to Pascal, which is critical to keep the architecture balanced and support the performance offered by the new Turing SM architecture."

"A framebuffer (frame buffer, or sometimes framestore) is a portion of random-access memory (RAM)[1] containing a bitmap that drives a video display. It is a memory buffer containing a complete frame of data.[2] Modern video cards contain framebuffer circuitry in their cores. This circuitry converts an in-memory bitmap into a video signal that can be displayed on a computer monitor. "


The framebuffer does not occupy a significant portion of the RAM. As mentioned time and time again in the Nvidia technical document, compression is 100% about reducing bandwidth requirements. You are looking at 8.33 MB (depending of course on compression, quality, etc.) to 33.8 MB. After all, frames are nothing more than rasterized images. Even if Nvidia had a 300% lead in compression, it would not make much of a difference when it comes to overall memory space.

The items that take up a lot of space in RAM, the textures, are compressed beforehand. Or did you honestly believe your GPU was compressing large files on the fly? Yeah, no.
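For reference, the back-of-the-envelope math behind figures like those is just width x height x bytes per pixel; a minimal sketch, assuming a single uncompressed 32-bit colour buffer with no MSAA:

```python
def framebuffer_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one uncompressed colour buffer in decimal megabytes."""
    return width * height * bytes_per_pixel / 1_000_000

print(f"{framebuffer_mb(1920, 1080):.1f} MB")  # ~8.3 MB at 1080p
print(f"{framebuffer_mb(3840, 2160):.1f} MB")  # ~33.2 MB at 4K
```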
 
GTX 1660 ... It's an easy choice over the 5500XT.

If that would be your choice ... they are within 2-3 fps of each other on average, and the 1660 costs $20-$30 more, going by Amazon and what the article listed.

The 1660 is even more overpriced than the 5500 XT, which is itself already overpriced if you consider the 1660 Super's value for just $30 more. Not sure how that metric makes the 1660 better, but to each their own, I guess.

I think it might be your expectations that have let you down to the degree you are expressing. I am also disappointed, but only that it's not $30 less expensive. It would be an incredible value then.

If a 5900 XT launches, then I'll expect it to compete with the 2080 Ti, but a low-end card should be expected to perform like a low-end card; if it didn't, it wouldn't be a low-end card. The issue is price/performance.
 

The 1660 is better in every way, since it's the same price or cheaper, and beats the 580, 590 and 5500 XT. Not bad for a non-Super...
 
1660 ... since it's the same price or cheaper,

I guess that depends on what sort of deal you can strike (or what country you live in?) ... on Amazon.com all the 1660s I saw were $220 to $250 USD (edit ... wait, I found some garbage single-fan ones for $209 - still higher than the MSRP of the 5500 XT). The article reported it at $220. Performance is within the margin of error, 2-3 fps?

I'm not really seeing what you are seeing ... there's no way I'd buy the 1660, especially since the 1660 Super exists, but it's your money.
 

The article said it was overpriced and underdelivered, and I agree. So you want to come at me? lol. The card is trash.

AMD used 7nm and GDDR6 to do what 12nm and GDDR5 were already doing, and wants more money for it. The 5700 XT cards are so poorly cooled that only two are worth buying. A 5% loss in dGPU market share in the quarter after the 5700 series launch. Trying to charge NVIDIA prices for AMD hardware that isn't better in any way. What a joke. 60/100.
 
The article said it was overpriced and underdelivered, and I agree. So you want to come at me?

Am I coming at you? I'm pretty sure my point from the start was that the price is too high ... I just didn't agree with all that other unrelated and unimportant rhetorical opinion. I thought I was being pretty clear about that ...? That makes me "coming at you"? ... weird.
 
The article said it was overpriced and underdelivered, and I agree. So you want to come at me? lol. The card is trash.

AMD used 7nm and GDDR6 to do what 12nm and GDDR5 were already doing, and wants more money for it. The 5700 XT cards are so poorly cooled that only two are worth buying. A 5% loss in dGPU market share in the quarter after the 5700 series launch.

"Basically the Radeon RX 5500 XT 8GB comes in offering the same level of value as the GTX 1660 Super, at $2.27 per frame based on our 12 game sample. "

"If you were expecting AMD to come in and rock the boat you will be dissapointed, as we were. The Radeon RX 5500 XT was never meant to blow your socks off from a performance perspective, but we do take issue with pricing. AMD’s made no attempt to undercut Nvidia, rather they’ve slotted the 5500 XT into Nvidia’s existing pricing structure. In other words, they’ve rocked up to the party late, and offered us nothing new. "

Underdelivered? Yes. Overpriced? Yes. Trash? No. It delivers performance in line with Nvidia's own products, as quoted above and shown in the article. The right answer is: pick whichever card offers the best performance for your budget.
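For anyone wondering where a figure like $2.27 per frame comes from, it's simply the card's price divided by its average frame rate across the benchmark sample. A quick sketch (the fps figure below is assumed for illustration, not taken from the review):

```python
def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    """Price divided by average frame rate across a benchmark sample."""
    return price_usd / avg_fps

# Assumed example: a $200 card averaging ~88 fps works out to about $2.27 per frame.
print(f"${cost_per_frame(200, 88):.2f} per frame")
```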


Trying to charge NVIDIA prices for AMD hardware that isn't better in any way. What a joke. 60/100.

Seems to me you'd simply take points off a review based solely on the name.
 
"I just didn't agree with all that other unrelated and unimportant rhetorical opinion."

And I don't agree with yours. Deal with it.
 
"NVIDIA GPUs utilize several lossless memory compression techniques to reduce memory bandwidth demands as data is written out to frame buffer memory. The GPU’s compression engine has a variety of different algorithms which determine the most efficient way to compress the data based on its characteristics. This reduces the amount of data written out to memory and transferred from memory to the L2 cache and reduces the amount of data transferred between clients (such as the texture unit) and the frame buffer. Turing adds further improvements to Pascal’s state-of-the-art memory compression algorithms, offering a further boost in effective bandwidth beyond the raw data transfer rate increases of GDDR6. As shown in Figure 10, the combination of raw bandwidth increases, and traffic reduction translates to a 50% increase in effective bandwidth on Turing compared to Pascal, which is critical to keep the architecture balanced and support the performance offered by the new Turing SM architecture."

In the same wiki article:

"RAM on the video card:
Video cards always have a certain amount of RAM. This RAM is where the bitmap of image data is "buffered" for display. The term frame buffer is thus often used interchangeably when referring to this RAM.
The CPU sends image updates to the video card. The video processor on the card forms a picture of the screen image and stores it in the frame buffer as a large bitmap in RAM. The bitmap in RAM is used by the card to continually refresh the screen image."

As the word "compression" implies, Nvidia stores compressed data in VRAM, thus reducing the amount of data transferred between the VRAM and the GPU and boosting the overall effective bandwidth. The Nvidia white paper even says "the combination of raw bandwidth increases, and traffic reduction translates to a 50% increase in effective bandwidth on Turing compared to Pascal".

It becomes clear if you look at any side-by-side comparison between AMD and Nvidia: Nvidia will use less VRAM, sometimes only a few MB less, but sometimes as much as 1 GB less. Could this be the reason why some people say AMD has better colors than Nvidia?



So yeah, some games can choke the hell out of the 5500 XT 4GB while the 1650S performs just fine.
 

Allocated RAM does not equal used RAM. For all we know, the AMD driver could be allocating RAM more aggressively based on perceived need. There's also the game to consider, as certain games allocate memory differently for different cards. I don't know the technical details of AMD's drivers, so I can't really say whether it really is the compression or just something in the driver.
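For what it's worth, the VRAM numbers that monitoring tools report generally come from the driver's allocation counters rather than from what the game actually touches. A minimal sketch of reading them on an Nvidia card with the pynvml bindings (assuming they are installed; AMD exposes similar counters through its own tools):

```python
import pynvml  # NVML bindings, e.g. `pip install nvidia-ml-py3`

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)

# "used" is memory the driver has allocated, not memory the game is actively using.
print(f"total: {info.total / 2**20:.0f} MiB")
print(f"used (allocated): {info.used / 2**20:.0f} MiB")
print(f"free: {info.free / 2**20:.0f} MiB")

pynvml.nvmlShutdown()
```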

"Could this be the reason why some people say AMD has better colors than Nvidia ?"

I haven't had an AMD card in my main rig in years, but from the ones I've tested recently they certainly appear to me to have slightly better colors. I should probably mention that my monitor is professionally calibrated, though, and I could understand 60% of people not even noticing.
 

Well, anyway, Steve would be wrong to think that the 4 GB and 8 GB models of the 5500 XT are the same, though.


It has also been established in many reviews that the 5500 XT 4GB is actually slightly slower than the 1650 Super (a 1% difference) while costing $10-20 more. Not to mention that all 1650 Super models use 14 Gbps VRAM chips clocked at 12 Gbps, which gives them some serious overclocking headroom. Overall, AMD's GPU market share is just going to tank in the upcoming months.
 
I can't speak for you, so you tell me ... I never had an issue to begin with.

There's some really sensitive people on this forum ...

People? So you've done this before?
You openly acknowledged my original comment was my opinion, and you're upset I didn't change it for you.
*sigh*
 