AMD Stagnation: Five Years of Mainstream Radeon GPUs Tested

This is clickbait. Everyone is treating the low end like it is the high end. The low end is filled with compromises; that's what it takes to make a low-end product. People don't buy a base-model Honda Civic and expect it to perform like a Mercedes. I'm getting tired of the sensationalism. When I first saw the title, I thought this was going to compare the 9070 against other cards in its class.

Further, I have never known anyone who bought an entry-level GPU and didn't have to tweak the settings. These cards were never meant to play at high/ultra, so the fact that they perform as well as they do at these settings is commendable. When I was buying an entry-level graphics card in the 2000s, I fully expected to be playing at low settings and even dropping the resolution.

Maybe we should point out that entry-level cards can give a playable experience at high/ultra, something that wasn't possible even just 10 years ago.
 
This is clickbait. Everyone is treating the low end like it is the high end. The low end is filled with compromises; that's what it takes to make a low-end product. People don't buy a base-model Honda Civic and expect it to perform like a Mercedes. I'm getting tired of the sensationalism. When I first saw the title, I thought this was going to compare the 9070 against other cards in its class.

Further, I have never known anyone who bought an entry-level GPU and didn't have to tweak the settings. These cards were never meant to play at high/ultra, so the fact that they perform as well as they do at these settings is commendable. When I was buying an entry-level graphics card in the 2000s, I fully expected to be playing at low settings and even dropping the resolution.

Maybe we should point out that entry-level cards can give a playable experience at high/ultra, something that wasn't possible even just 10 years ago.
People buy a Honda Civic and expect it to work. If you bought a Civic and it couldn't climb a hill without having to take the doors off, you'd have something to say about that.

It isn't clickbait. 8GB VRAM limitations are holding the entire sector back. These GPUs are severely bottlenecked by their 8GB VRAM buffers, and I'm tired of pretending we can justify it. We can't.

A 550 Ti or a GTX 960 didn't leave a third of their performance on the table over VRAM limitations. The 970 left some, and there was hell to pay for it. Let's stop justifying these gimped cards and call them out for the e-waste they are. The 8GB 9060 should not exist. Cut the GPU in half and sell it as an 8GB 9050 for $120; now we're talking. That's justifiable. When you have to tune settings down at 1080p on a card that can do 2K/4K gaming, that is gimped.
 
People buy a Honda Civic and expect it to work. If you bought a Civic and it couldn't climb a hill without having to take the doors off, you'd have something to say about that.

It isn't clickbait. 8GB VRAM limitations are holding the entire sector back. These GPUs are severely bottlenecked by their 8GB VRAM buffers, and I'm tired of pretending we can justify it. We can't.

A 550 Ti or a GTX 960 didn't leave a third of their performance on the table over VRAM limitations. The 970 left some, and there was hell to pay for it. Let's stop justifying these gimped cards and call them out for the e-waste they are. The 8GB 9060 should not exist.
VRAM is holding things back if you fill it with massive textures. Nearly every VRAM problem can be solved by turning the textures down.

And let's not let our success as adults make us forget about people with "hard budgets." I remember being a broke twenty-something, and $200 was the max I had to spend. There was not an extra $20 for a slightly better card. For someone who might be looking at a 7600, the idea of an 8GB 9060 in your price range would be a godsend. And considering that the 16GB model is going for about $30 over its MSRP while the 8GB can be found for $10-20 under its MSRP, the real-world price difference is closer to $80-100.
 
AMD did a great job with the 9060 XT.
It basically doubled the performance of the card from two generations back. I call that a win.

Regarding the 8GB (non-)issue:
It exists mostly due to unoptimized video games. If you're going to be angry with someone, be angry with the developers.
Using extremely high-polygon 3D models for everything, even for things not in the main focus; using 4K textures for objects that aren't even secondary in the scene, objects that will never fill the screen; scenes with way too many objects just for the sake of it. That eats up your VRAM faster than you think.
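
To put rough numbers on that (a back-of-envelope sketch; the bytes-per-texel figures assume uncompressed RGBA8 versus BC7 block compression, with a full mip chain adding about a third on top):

```python
# Approximate VRAM cost of a single texture, including its mip chain.
# RGBA8 = 4 bytes/texel uncompressed; BC7 = 1 byte/texel compressed.

def texture_vram_mb(side_px, bytes_per_texel, mip_chain=True):
    base = side_px * side_px * bytes_per_texel
    total = base * 4 / 3 if mip_chain else base  # mips add ~33%
    return total / (1024 ** 2)

for label, side in [("4K", 4096), ("2K", 2048), ("1K", 1024)]:
    u = texture_vram_mb(side, 4)  # uncompressed RGBA8
    c = texture_vram_mb(side, 1)  # BC7-compressed
    print(f"{label} texture: ~{u:.0f} MB uncompressed, ~{c:.0f} MB BC7")

# 4K texture: ~85 MB uncompressed, ~21 MB BC7
# 2K texture: ~21 MB uncompressed, ~5 MB BC7
# 1K texture: ~5 MB uncompressed, ~1 MB BC7
```

A hundred 4K textures resident at once is already a couple of gigabytes even compressed, which is why slapping 4K maps on background props adds up so quickly.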

And I call it a non-issue because AMD left you a choice: if 8GB is not enough for you, there is a 16GB model for, on average, 50 bucks more. There you go; you do not need to suffer so much.
And please, if you have 300 to pay for a graphics card, you can find 350 too.
People who do not have money at all do not even have 300 to begin with.



 
VRAM is holding things back if you fill it with massive textures. Nearly every VRAM problem can be solved by turning the textures down.
So again, we're taking the doors off our Civic to climb the hill while proclaiming it works just as well as the Mercedes.

The 16GB 9060 performs noticeably better and can run at resolutions higher than 1080p. Saddling the card with 8GB limits its potential and makes it a terrible purchase, one that can be used to artificially inflate the price of the 16GB models.
And let's not let our success as adults make us forget about people with "hard budgets." I remember being a broke twenty-something, and $200 was the max I had to spend. There was not an extra $20 for a slightly better card. For someone who might be looking at a 7600, the idea of an 8GB 9060 in your price range would be a godsend. And considering that the 16GB model is going for about $30 over its MSRP while the 8GB can be found for $10-20 under its MSRP, the real-world price difference is closer to $80-100.
All that justification when you could just... get rid of the 8GB card, sell the 16GB card for $300 instead, and all those with "hard budgets" could buy a card that is a genuine improvement and will last them a long time without breaking the bank.

But no, we gotta meatshield the greedy corpos instead...
AMD did a great job with the 9060 XT.
It basically doubled the performance of the card from two generations back. I call that a win.

Regarding the 8GB (non-)issue:
It exists mostly due to unoptimized video games. If you're going to be angry with someone, be angry with the developers.
Using extremely high-polygon 3D models for everything, even for things not in the main focus; using 4K textures for objects that aren't even secondary in the scene, objects that will never fill the screen; scenes with way too many objects just for the sake of it. That eats up your VRAM faster than you think.

And I call it a non-issue because AMD left you a choice: if 8GB is not enough for you, there is a 16GB model for, on average, 50 bucks more. There you go; you do not need to suffer so much.
And please, if you have 300 to pay for a graphics card, you can find 350 too.
People who do not have money at all do not even have 300 to begin with.
8GB is insufficient for this GPU, and it is artificially raising the prices of 16GB models.

Yah nah, I'll be angry at the multi-billion-dollar corporations using artificial market segmentation to boost their profits, TYVM.

Some people REALLY cannot let go of 8GB, 11 years later. It's safe to upgrade, and to demand that GPU makers do better.
 
Man, just sticking it to AMD lately, aren't you guys?

You could just drop Nvidia in here as well, since both teams have shown horrid improvement from the previous gen to this gen, and from the gen before that.
 
Man, just sticking it to AMD lately, aren't you guys?

You could just drop Nvidia in here as well, since both teams have shown horrid improvement from the previous gen to this gen, and from the gen before that.
Ah, there it is; took over two hours this time! Guess you missed that they did this same thing with Nvidia, eh?
 
So again, we're taking the doors off our Civic to climb the hill while proclaiming it works just as well as the Mercedes.
We absolutely are not; the very suggestion is absurd. At best we're talking about a used car instead of a new one.

These cards were never meant to perform at max settings, and the fact that they do at all is remarkable.

Turning the textures down fixes basically all VRAM issues; people need to stop acting like this is the end of the world. Turn the textures down and the difference between 8GB and 16GB cards disappears.
 
Regarding the 8GB (non-)issue:
It exists mostly due to unoptimized video games. If you're going to be angry with someone, be angry with the developers.
Using extremely high-polygon 3D models for everything, even for things not in the main focus; using 4K textures for objects that aren't even secondary in the scene, objects that will never fill the screen; scenes with way too many objects just for the sake of it. That eats up your VRAM faster than you think.
Because optimizing tends to be tedious, boring work with a substantial skill requirement, games will either miss out on other things or get more expensive. I'd rather throw some extra money at a card (which should be about $20-30, going by VRAM prices) than throw $10 extra at every game. It doesn't make sense to skimp on it. The chips are cheap, and VRAM is literally the only thing that needs to change on the board to make the card perform better in a lot of current and pretty much all future titles.

This is clickbait. Everyone is treating the low end like it is the high end. The low end is filled with compromises; that's what it takes to make a low-end product. People don't buy a base-model Honda Civic and expect it to perform like a Mercedes. I'm getting tired of the sensationalism. When I first saw the title, I thought this was going to compare the 9070 against other cards in its class.
60(0)-class cards have been the sweet spot of compromise: maximum bang for the buck. Pay more and you don't get linear performance increases; go below and the compromises start affecting your gaming experience. It was the recommended entry point for gaming.

That's how it's been for a long time. Now, however, that 50(0)-class performance compromise has arguably become part of the 60 class, and the prices don't reflect it, because they sure haven't come down.

Further, I have never known anyone who bought an entry-level GPU and didn't have to tweak the settings. These cards were never meant to play at high/ultra, so the fact that they perform as well as they do at these settings is commendable. When I was buying an entry-level graphics card in the 2000s, I fully expected to be playing at low settings and even dropping the resolution.
Yes, but Moore's Law does not apply to screens. Entry-point monitors only went from 1280x1024 to 1920x1080, not a massive difference, while graphics cards have become immensely more powerful. Back then VRAM also doubled every few years; it doesn't now.

Maybe we should point out that entry-level cards can give a playable experience at high/ultra, something that wasn't possible even just 10 years ago.
Not quite 10 years ago (9, to be precise) we had the RX 480 as the entry point: a card that ran games with no compromises just fine, not just at launch but long after.

Turning the textures down fixes basically all VRAM issues; people need to stop acting like this is the end of the world. Turn the textures down and the difference between 8GB and 16GB cards disappears.
But textures MASSIVELY affect image quality. It's the one setting that can make a game look way, way better without affecting performance; all you need is VRAM (and some bandwidth to fill it up). I'll gladly lower basically every other setting to get more performance if it's lacking, but there are two I won't touch unless it's an absolute last resort: resolution and texture quality. Both are things the entry level of the market now forces down in some cases, all for the lack of one single thing... VRAM.
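
For a concrete sense of why the texture slider maps so directly onto VRAM, here's a rough sketch, assuming the common scheme where each quality step down simply drops the top mip level (quartering per-texture memory); the 500-texture scene and BC7 compression rate are illustrative assumptions:

```python
# Each texture-quality step down typically drops the top mip level,
# which quarters that texture's memory footprint. Assumes BC7
# (1 byte/texel) and a full mip chain (~4/3 of the top mip).

def budget_mb(top_mip_px, texture_count, bytes_per_texel=1.0):
    per_tex = top_mip_px * top_mip_px * bytes_per_texel * 4 / 3
    return per_tex * texture_count / (1024 ** 2)

# A hypothetical scene streaming 500 material textures:
for quality, top in [("Ultra", 4096), ("High", 2048), ("Medium", 1024)]:
    print(f"{quality:6s} (top mip {top}px): ~{budget_mb(top, 500):,.0f} MB")

# Ultra  (top mip 4096px): ~10,667 MB
# High   (top mip 2048px): ~2,667 MB
# Medium (top mip 1024px): ~667 MB
```

One step on that slider is the difference between blowing past an 8GB buffer and fitting inside it, which is exactly the trade-off being argued about here.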
 
The thing is, the BOM difference between 8 and 16GB of GDDR6 for someone like AMD is maybe 10 or 15 bucks. The other thing is that consoles have had more than 8GB of VRAM for some time now, so that's kind of the lowest common denominator you target.
The whole issue is that the 8GB variants of the 9060 XT / 5060 Ti really do exist to upsell the 16GB variants. The costly part is the actual die, not the VRAM. They could also sell variants with slower VRAM but the same capacity, which would be better for the consumer without hurting their bottom line on BOM cost, but then they probably couldn't sell the more expensive, faster-VRAM cards in volume.
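
Rough math behind that 10-15 bucks (a sketch; the per-module price and clamshell overhead below are illustrative assumptions, not quoted prices):

```python
# Back-of-envelope BOM delta for an 8GB -> 16GB GDDR6 card on a
# 128-bit bus: 16GB means clamshell mode, i.e. 4 extra 16Gb (2GB)
# modules plus a bit of extra PCB routing/assembly cost.

modules_8gb, modules_16gb = 4, 8
price_per_module = 3.00      # assumed spot price per 16Gb module, USD
clamshell_extra = 2.00       # assumed PCB/assembly overhead, USD

delta = (modules_16gb - modules_8gb) * price_per_module + clamshell_extra
print(f"Approx. BOM delta: ${delta:.2f}")  # Approx. BOM delta: $14.00
```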
 
Look at what I just found... are we going to have an 8GB crusade against Nvidia this time?

[Image: gigabyte-featured-1.jpg]
 
Because optimizing tends to be tedious, boring work with a substantial skill requirement, games will either miss out on other things or get more expensive. I'd rather throw some extra money at a card (which should be about $20-30, going by VRAM prices) than throw $10 extra at every game. It doesn't make sense to skimp on it. The chips are cheap, and VRAM is literally the only thing that needs to change on the board to make the card perform better in a lot of current and pretty much all future titles.
Optimizing is one of the basic tasks in any respectable software company (gaming or not).
Since I actually work in such a company, I can tell you that it is about planning and managing properly.
It starts with your 3D artists, whom you ask to create several LOD levels for every object.
Same goes for textures: make sure their resolution (and the models' polygon counts) is appropriate for the scene and its focus.
And finally, once you are editing a scene, you can always bring up the stats HUD and see just how many polygons and how much VRAM you are using at the given detail settings. Then you just cut what is not needed.
Like a 4K texture for a small 3D object in the scene that would, at best, take up 100 pixels on screen. Or the number of lights.
Or cut the number of trees and their detail, especially those in the background. Not to mention leaf topology. And there are many, many examples.
Optimizing your software while developing it is one of THE MAIN jobs.

But lately, they tend to rush and just throw things in without much consideration.
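
A minimal sketch of that screen-coverage idea (the function name and mip-selection rule are illustrative, not from any particular engine):

```python
# Pick a texture mip / model LOD from the object's projected size on
# screen: an object covering ~100 px never needs a 4096 px texture.
import math

def pick_lod(projected_px: float, full_res_px: int = 4096) -> int:
    """LOD 0 = full detail; each level halves the texture resolution."""
    if projected_px <= 0:
        return int(math.log2(full_res_px))  # off-screen: minimum detail
    ratio = full_res_px / projected_px
    return max(0, int(math.log2(ratio))) if ratio > 1 else 0

for px in (2048, 512, 100):
    lod = pick_lod(px)
    print(f"{px:5d} px on screen -> LOD {lod} ({4096 >> lod} px texture)")

#  2048 px on screen -> LOD 1 (2048 px texture)
#   512 px on screen -> LOD 3 (512 px texture)
#   100 px on screen -> LOD 5 (128 px texture)
```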
 