Radeon RX 5500 XT 4GB vs. 8GB VRAM Test

4GB on a video card is an affront to God!! (Andraste)

Gather your torches and pitchforks, the video card must be destroyed!
 
Regarding the 580 being a better buy: if you are constrained by budget, look at used cards. There are a metric load of Vega 56 cards going for under $200 on eBay right now. Those would be a far better deal than any 580, offering significantly more performance than the 580 or 5500 XT. You can find Vega 64s for $230-250, same with GTX 1070s. Hell, the 290X, which slots in between the 570 and 580, can be had for $75-90!

I know the whole meme about "BuT uSeD gPuS" and "BuT mInInG". The last 5 GPUs I've used in my gaming PC have all been bought used for a fraction of the price. My second 550 Ti, the 770s, my 480, my Vega 64, all bought used, all without issue. Given how much I've saved, I'd say it's well worth it. If I were on a budget, I'd grab an old 290X and wait out the next 1-2 years of GPUs, as the GPUs released early in a console's lifespan don't do very well by the end of said lifespan. With new consoles coming out with more RAM, if VRAM requirements jump significantly again, none of the current 8GB cards will be comfortable. Why waste your money?
 
Just out of interest:

Could you also test the following (I know this takes time, but one can ask):

- other card features like e.g. the Video de- / encode block and what else there is ?
Really curious to see what the improvements there are (over Polaris but also a GTX 16xx vs. 5500 XT)
- do a test on lower / mid range Intel and AMD systems with the cards listed in the above charts ?

I would be curious to see what the performance differences are on a less than high end CPU and especially curious if there is any difference in Intel vs. AMD CPU.

Thanks.
 
The real crime is that you still need a $300+ video card in this day and age to play all games at 1080p ultra graphics with AA. Every one of those cards should be under the $150 price point.
 
Regarding the 580 being a better buy: if you are constrained by budget, look at used cards. There are a metric load of Vega 56 cards going for under $200 on eBay right now. Those would be a far better deal than any 580, offering significantly more performance than the 580 or 5500 XT. You can find Vega 64s for $230-250, same with GTX 1070s. Hell, the 290X, which slots in between the 570 and 580, can be had for $75-90!

I know the whole meme about "BuT uSeD gPuS" and "BuT mInInG". The last 5 GPUs I've used in my gaming PC have all been bought used for a fraction of the price. My second 550 Ti, the 770s, my 480, my Vega 64, all bought used, all without issue. Given how much I've saved, I'd say it's well worth it. If I were on a budget, I'd grab an old 290X and wait out the next 1-2 years of GPUs, as the GPUs released early in a console's lifespan don't do very well by the end of said lifespan. With new consoles coming out with more RAM, if VRAM requirements jump significantly again, none of the current 8GB cards will be comfortable. Why waste your money?
I like your strategy. I typically go new because I generally keep things longer, but staying on the used market could keep one up-to-date at a lower cost. Only early adopters have to pay full price, but they can afford it anyway.
 
So the RX 480, released in 2016 for $200, isn’t all that much slower than this part. Goes to show how aggressively priced that was and how badly priced this is. Looking at the charts, and given how much depends on AMD’s driver team, I would say that $150 is the most anyone should pay.

Of course supply issues will be blamed, but it’s quite clear to me that AMD has prioritised its supply for OEMs like Apple, who are bundling these chips in a lot of their systems now. Good for AMD, I guess; in the meantime, custom builders will choose an Nvidia card.
 
The real crime is that you still need a $300+ video card in this day and age to play all games at 1080p ultra graphics with AA. Every one of those cards should be under the $150 price point.

Why, because you say so?

That's up to the game devs, not the hardware manufacturers. You expect them all to get together and decide on one single engine to use for all future titles for a given generation so that the users don't have to go out and buy more expensive hardware to use higher quality settings at a given resolution? Should Metro Exodus be expected to run in the same performance bracket as Rocket League?

lol wtf
 
I like your strategy. I typically go new because I generally keep things longer, but staying on the used market could keep one up-to-date at a lower cost. Only early adopters have to pay full price, but they can afford it anyway.
I prefer to keep my hardware as long as possible too. Staying at 1200p as long as I did helps a lot. Only one of those upgrades was "unnecessary".

The second 550 Ti was purchased three months before the 680 came out, to boost performance in Battlefield 3 to a playable level. The 770s replaced the 550 Tis when they simply became too weak and too limited by 1GB of VRAM (I should have learned my lesson here). These were bought after the release of the 900 series. The 480 replaced the 770s because, between the 2GB framebuffer and the death of SLI, I couldn't keep decent performance in newer titles like Wolfenstein: The New Order.

The Vega 64 I bought to go with my upgrade to 1440p. If I didn't have that monitor, I'd likely still be on that 480, and the i5-3570K too. I just get that upgrade itch all the time, which is why I stick to used parts.

In hindsight, I would have stuck with my 1200p monitor, swapped out the 550 Tis for a single 290X when they were going for $250 shortly after I bought the 770s, and kept it until today along with my old Ivy Bridge motherboard. To its credit, that Ivy Bridge system went 7 years, and the only reason I replaced it was to get NVMe boot a year ago; it still runs perfectly.
 
Why, because you say so?

That's up to the game devs, not the hardware manufacturers. You expect them all to get together and decide on one single engine to use for all future titles for a given generation so that the users don't have to go out and buy more expensive hardware to use higher quality settings at a given resolution? Should Metro Exodus be expected to run in the same performance bracket as Rocket League?

lol wtf

Yes, I say so when they (the manufacturers) are trying to push 4K in games now. Do game developers set hardware prices? There are a lot fewer game engines than there were several years ago. Oh, and there's this DX12 thing. It's almost 2020; 1080p should be an afterthought by now. Don't be so obtuse.
 
Regarding the 580 being a better buy: if you are constrained by budget, look at used cards. There are a metric load of Vega 56 cards going for under $200 on eBay right now. Those would be a far better deal than any 580, offering significantly more performance than the 580 or 5500 XT.

One catch is that all the cheap Vega 56 cards appear to be the reference version with the loud blower cooling. Gamers who don't want a loud computer might prefer the much quieter 5500 XT.

The other catch is that both the Vega 56 and the RX 580 are based on the older GCN architecture rather than RDNA. That means that they will probably stop getting driver updates well before the 5500 XT does. If you're the kind of owner who keeps hardware for a while, that might be a drawback.

Finally, I'd like to see somebody test all of these cards on things besides games. We all know what their relative performance is in games now, but what if you also use your computer for a bit of video editing or GPU computing? How will performance compare in those applications? It doesn't necessarily track gaming performance; one of the reasons those Vega cards were so popular for mining is that they substantially outperformed NVIDIA cards in that workload, even though they were left behind in gaming.
 
Yes, I say so when they (the manufacturers) are trying to push 4K in games now. Do game developers set hardware prices? There are a lot fewer game engines than there were several years ago. Oh, and there's this DX12 thing. It's almost 2020; 1080p should be an afterthought by now. Don't be so obtuse.

Let's completely ignore the fact that games have become several orders of magnitude more complex than before, and that there are many more things to consider than just pixel count when it comes to determining performance. All games should be able to arbitrarily surmount the 30/60 FPS threshold (or whatever you consider acceptable) on hardware that does not exceed the arbitrary magical price point of $150 that you've established, just because.
 
This is a little off topic, but I think the current dilemma (visuals vs. performance) exists because developers have been too focused on rasterization, which long ago passed the point of diminishing returns. Now you need a GPU with 2x the horsepower (R9 290X to 5700 XT) just to make games look barely better (compare current games to Crysis 3; personally I haven't found any game that looks better than The Witcher 3).

So yeah, now everyone (AMD, Microsoft, Sony) is investing in RT because it is the cheaper approach to improving visuals. I expect that once RT becomes mainstream we can have cheap GPUs that handle high visual settings at acceptable FPS (3-4 more years, maybe).
 
One catch is that all the cheap Vega 56 cards appear to be the reference version with the loud blower cooling. Gamers who don't want a loud computer might prefer the much quieter 5500 XT.

The other catch is that both the Vega 56 and the RX 580 are based on the older GCN architecture rather than RDNA. That means that they will probably stop getting driver updates well before the 5500 XT does. If you're the kind of owner who keeps hardware for a while, that might be a drawback.

Finally, I'd like to see somebody test all of these cards on things besides games. We all know what their relative performance is in games now, but what if you also use your computer for a bit of video editing or GPU computing? How will performance compare in those applications? It doesn't necessarily track gaming performance; one of the reasons those Vega cards were so popular for mining is that they substantially outperformed NVIDIA cards in that workload, even though they were left behind in gaming.
Counter-argument:
1: The blower isn't that loud. I have a stock blower Vega 64. The stock fan curve tops out at 65% at 75°C; I have mine set to hit 100% at 70°C, and believe me, it hits that during intense gameplay. Is it loud? Yeah, but it's in a desktop on the floor under the desk, so it hardly matters, and it is no louder than my Pentium 4 rig from the early 2000s. I can see it being a bigger issue if your computer sits on the desk next to you, but I've always found this issue overblown. If low noise is paramount to you over performance, you should be looking at something like a third-party Arctic Cooling silent GPU heatsink, not relying on an AIB cooler. (A rough sketch of setting a custom curve from software is at the end of this list.)

2: AMD has a long history of doing just this, but if that is a concern for you, you wouldn't be buying AMD in the first place. Nvidia has its good points, and one is absurdly long-term driver support. I don't see AMD cutting GCN support anytime soon; they just don't have the GPU sales to justify that, especially as it is taking them upwards of a year to get a full GPU stack out the door. We still don't have a proper high-end RDNA chip, or the 5600 XT, and RDNA 2 will be coming out in 2020. AMD is still relying on those GCN GPUs for a large chunk of their existing base, and they won't cut those people off unless they want to give Nvidia a nice Christmas present.

3: If you are looking to perform GPU compute tasks or video rendering on your GPU, you probably are not looking at $200 GPUs in the first place. If you ARE, for whatever reason, then a used FirePro or Quadro card will do far better than a $200 gamer card, full stop.
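
On the fan-curve bit in point 1: on Windows that's just a custom curve in WattMan/Adrenalin, and on Linux with the amdgpu driver you can do the same thing through a couple of hwmon sysfs files. Rough sketch below; the card0/hwmon path and the 50-70°C ramp are just example values from my setup, not a recommendation, and it needs root:

#!/usr/bin/env python3
# Minimal fan-curve sketch for an amdgpu card on Linux (run as root).
# Assumptions: the card's hwmon node exposes temp1_input (millidegrees C),
# pwm1 (0-255) and pwm1_enable (1 = manual); the card0 path is just an example.
import glob, time

HWMON = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*")[0]

def read(name):
    with open(f"{HWMON}/{name}") as f:
        return int(f.read().strip())

def write(name, value):
    with open(f"{HWMON}/{name}", "w") as f:
        f.write(str(value))

def duty_for(temp_c):
    # Example curve: 40% at or below 50 C, ramping linearly to 100% at 70 C.
    if temp_c <= 50:
        return 0.40
    if temp_c >= 70:
        return 1.00
    return 0.40 + 0.60 * (temp_c - 50) / 20

write("pwm1_enable", 1)              # take manual control of the fan
while True:
    temp_c = read("temp1_input") / 1000
    write("pwm1", int(duty_for(temp_c) * 255))
    time.sleep(2)

Writing 2 back to pwm1_enable hands control back to the driver's automatic curve.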
 
I notice that only specific newer AAA games released after 2018 require 8 GB of VRAM, like RDR2, AC Odyssey, etc., or the games that already struggle to run at 4K 60 FPS ultra with an RTX 2080 Ti.
Most newer and older games (Rainbow Six Siege, World of Tanks, BF V) and less demanding games run pretty well with 4GB of VRAM.
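
If anyone wants to check this on their own system instead of taking the charts' word for it: on Linux the amdgpu driver exposes VRAM counters in sysfs, so something like the rough sketch below can log usage while a game runs (the card0 path is an assumption; on Windows the Adrenalin overlay or Task Manager shows the same number). Keep in mind the driver reports what's allocated, not what the game strictly needs, so a full 4GB doesn't automatically mean stutter.

#!/usr/bin/env python3
# Quick VRAM logger for an amdgpu card on Linux, to see how close a game gets to 4GB.
# Assumption: the driver exposes mem_info_vram_used / mem_info_vram_total under
# the card's sysfs node; the card0 path is just an example.
import time

DEV = "/sys/class/drm/card0/device"

def read_bytes(name):
    with open(f"{DEV}/{name}") as f:
        return int(f.read().strip())

total = read_bytes("mem_info_vram_total")
while True:
    used = read_bytes("mem_info_vram_used")
    print(f"VRAM: {used / 2**30:.2f} / {total / 2**30:.2f} GiB ({100 * used / total:.0f}%)")
    time.sleep(5)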
 
Yes, I say so when they (the manufacturers) are trying to push 4K in games now. Do game developers set hardware prices? There are a lot fewer game engines than there were several years ago. Oh, and there's this DX12 thing. It's almost 2020; 1080p should be an afterthought by now. Don't be so obtuse.
No, but the game developers DO set the hardware REQUIREMENTS. You can use a 5500 to play games at 4K just fine; it's just that developers are perfectly fine sacrificing 50% of performance for visuals that are barely better, especially when it comes to ultra settings. If the hardware were better, the developers would just make more demanding games.
 
Ultimately, the Ti and Titan series give you the most VRAM and coincidentally command the highest prices. If you buy anything less than a high-end card, you'll regret it.

Therefore, I recommend you save up the money and just get the Ti model.

The 1080 Ti and 1660 Ti were the best of the GTX line.

Now you need a 2080 Super or a 2080 Ti.
 
No, but the game developers DO set the hardware REQUIREMENTS. You can use a 5500 to play games at 4K just fine; it's just that developers are perfectly fine sacrificing 50% of performance for visuals that are barely better, especially when it comes to ultra settings. If the hardware were better, the developers would just make more demanding games.

Great, but what does that have to do with 1080p? So you agree that we should be able to play 1080p ultra games today for $150, but hardware AND software are holding us back. I concur.
 
Great, but what does that have to do with 1080p? So you agree that we should be able to play 1080p ultra games today for $150, but hardware AND software are holding us back. I concur.
It has EVERYTHING to do with 1080p. If the hardware was faster, that would just mean that the games would be that much more demanding and you STILL wouldn't be able to play 1080p at ultra settings in graphically intensive games with a $150 GPU.
 
It has EVERYTHING to do with 1080p. If the hardware was faster, that would just mean that the games would be that much more demanding and you STILL wouldn't be able to play 1080p at ultra settings in graphically intensive games with a $150 GPU.

That makes no sense at all. According to that flawed logic, we shouldn't be able to, because of the straw-man argument "well, if they do this, then someone might do that"! If that were true, then we wouldn't be able to do 720p ultra for $150 either, because they keep upping the software specs. Pick any resolution and dollar amount you want. Your reasoning doesn't add up. I don't understand consumers (posters) defending higher hardware prices. Wow.
 
That makes no sense at all. According to that flawed logic, we shouldn't be able to, because of the straw-man argument "well, if they do this, then someone might do that"! If that were true, then we wouldn't be able to do 720p ultra for $150 either, because they keep upping the software specs. Pick any resolution and dollar amount you want. Your reasoning doesn't add up. I don't understand consumers (posters) defending higher hardware prices. Wow.
I'm not defending hardware prices; I'm just pointing out that developers tune their games according to what's available. You're not going to get graphically demanding new games running at 1080p ultra on a $150 GPU until 1440p or 4K is the standard. What you WILL get is high or medium settings that look like today's ultra. This isn't a difficult concept to grasp, honestly.
 
I'm not defending hardware prices; I'm just pointing out that developers tune their games according to what's available. You're not going to get graphically demanding new games running at 1080p ultra on a $150 GPU until 1440p or 4K is the standard. What you WILL get is high or medium settings that look like today's ultra. This isn't a difficult concept to grasp, honestly.

Apparently what is a difficult concept for folks to grasp is that 1440p and 4K ARE the standard now for PC gaming. Even console makers are focusing more on UHD with the current generations. No reviews and/or benchmarks focus on 1080p statistics; it's all higher resolutions. What is available is good enough for 1080p ultra, yet the prices are kept artificially high. People seem to think every PC game that comes out now has new whiz-bang gimmicks to crush video card performance, and that just isn't so. Thank you for making my point.
 
Apparently what is a difficult concept for folks to grasp is that 1440p and 4K ARE the standard now for PC gaming. Even console makers are focusing more on UHD with the current generations. No reviews and/or benchmarks focus on 1080p statistics; it's all higher resolutions. What is available is good enough for 1080p ultra, yet the prices are kept artificially high. People seem to think every PC game that comes out now has new whiz-bang gimmicks to crush video card performance, and that just isn't so. Thank you for making my point.
Well, yes and no. Last I checked, most people are still using 1080p screens on their computers.
 