AMD Stagnation: Five Years of Mainstream Radeon GPUs Tested

Texture quality and resolution are absolutely linked. If you have fewer pixels to view a texture, then you see less of the texture.
Except that's not how it works, because one texel isn't equal to one pixel on the screen. I literally explained this in my previous comment already. On objects that are distant from the camera, sure, there's a point where texture resolution is lost to the screen resolution. On objects that are close to the camera, the opposite happens: even very high-res textures can get close enough that texels become larger than pixels, and you definitely see the loss in texture quality.
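To put some rough numbers on that, here's a quick back-of-the-envelope sketch. The 2m object width, 2048px texture, 90° FOV, and 1080p screen are all made-up illustrative values, and it assumes a simple pinhole camera with the surface facing the viewer:

```python
import math

def pixels_per_texel(distance_m, tex_res=2048, object_width_m=2.0,
                     screen_px=1920, hfov_deg=90.0):
    """How many screen pixels one texel covers at a given camera distance.

    > 1.0 means texels are magnified (blur becomes visible);
    < 1.0 means texture detail is being lost to the screen resolution.
    """
    # Width of the visible world at this distance, for a pinhole camera.
    visible_width_m = 2.0 * distance_m * math.tan(math.radians(hfov_deg) / 2.0)
    pixels_per_m = screen_px / visible_width_m
    texels_per_m = tex_res / object_width_m
    return pixels_per_m / texels_per_m

for d in (0.5, 2.0, 10.0, 50.0):
    print(f"{d:5.1f} m -> {pixels_per_texel(d):.2f} px per texel")
```

With these assumed numbers, the 2048px texture already magnifies past one texel per pixel at about half a meter from the camera, while at 10m+ most of its detail never reaches the screen, which is exactly the near/far asymmetry described above.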

Why does everyone think they need 4K textures to have a good gaming experience?
Nobody is saying that you "need" 4K textures. They're saying that those GPUs (4060, 5060, 7600, 9060 8GB) are VRAM-starved relative to how fast their GPU cores are, and that makes them terrible value. Again, as I said in the previous comment, consoles like the PS5 and Series X have slower GPUs than those cards but more VRAM, and they put that VRAM to good use by offering higher texture quality (and thus higher visual quality) than what these 8GB GPUs are capable of. Those 8GB GPUs need more VRAM to make sense.

I would also like to remind everyone that the 8GB is under MSRP and that the 16GB is over MSRP, if you can find it at all. The real-world price difference is large enough that it's forgivable.
Yeah, they are under MSRP because nobody wants them. Nobody wants them because they cannot run modern games without ugly compromises, while the consoles (which, again, have weaker GPUs than these) don't have to make those compromises, for no other reason than that they aren't VRAM-starved.
 
Look at what I just found... are we going to have an 8GB crusade against Nvidia this time?

[Attached image: gigabyte-featured-1.jpg]

That looks like a mistake in the image - I found an article using that same image, and it linked to an Amazon listing for a 5060, not a 5070. Do some background checking on a source before posting an image without any independent context.

Except the 16GB doesn't perform noticeably better. The gap is a mild overclock at best, until you artificially bump up VRAM usage with insane textures and upscaling to 4K. The 8GB cards are fine for 1080p and light 1440p, which is what they've always been aimed at.

It's about not running out of VRAM and the card having to spill into system RAM - that's where the terrible 1% lows in some games come from. When the 16GB card shows no advantage, it's because the game either doesn't need over 8GB of VRAM or never has to swap textures out. 16GB is simply the better option and won't become a limitation.
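To give a rough sense of why the spill hurts so much, here's the bandwidth gap with ballpark figures - a 128-bit GDDR6 bus at 18 Gbps is typical for these mainstream cards, and the PCIe number assumes a 4.0 x8 link, which several of them use:

```python
# Rough illustration of why spilling past VRAM hurts: once assets
# overflow into system RAM, they are fetched over PCIe instead of the
# card's own memory bus. Numbers below are ballpark, not measured.

gddr6_bus_bits = 128          # e.g. a typical 8GB mainstream card
gddr6_gbps_per_pin = 18       # GDDR6 data rate, assumed
vram_bw = gddr6_bus_bits / 8 * gddr6_gbps_per_pin   # GB/s

pcie4_x8_bw = 16              # ~GB/s usable on a PCIe 4.0 x8 link

print(f"VRAM bandwidth:   {vram_bw:.0f} GB/s")
print(f"PCIe 4.0 x8 link: {pcie4_x8_bw} GB/s")
print(f"Spilled assets are ~{vram_bw / pcie4_x8_bw:.0f}x slower to reach")
```

An order-of-magnitude-plus penalty on every overflowed fetch is exactly the kind of thing that shows up as awful 1% lows rather than a lower average.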

The fact that the 3060 can beat the 3070 in some tests, purely because it has more VRAM, should end any argument: 8GB cards should be, and effectively already are, EOL.
 
Texture quality and resolution are absolutely linked. If you have fewer pixels to view a texture, then you see less of the texture.

Why does everyone think they need 4K textures to have a good gaming experience?

I would also like to remind everyone that the 8GB is under MSRP and that the 16GB is over MSRP, if you can find it at all. The real-world price difference is large enough that it's forgivable.
Because a higher-quality texture = more detail in the image = looks better close up. There's no direct correlation to your screen's resolution. Just because they can both be '4K' does not mean you need one to benefit from the other.

From a GamersNexus article on it:
[Attached image: blops-texture-comparison-2.jpg]

The higher the resolution of the texture, the crisper the image. That's why it's the one setting that can make a massive difference with (almost) no impact on performance - all you need is VRAM:
[Attached image: blops-3-texture-quality-bench.png]
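For a sense of scale on the "all you need is VRAM" point, here's a rough texture-memory estimate. Sizes are idealized - real engines use block compression like BC7 and stream mips, so treat these as ballpark figures:

```python
def texture_mib(res, bytes_per_texel, mipmaps=True):
    """Approximate GPU memory for one square texture.

    bytes_per_texel: 4 for uncompressed RGBA8, 1 for BC7-compressed.
    A full mip chain adds roughly one third on top of the base level.
    """
    base = res * res * bytes_per_texel
    if mipmaps:
        base = base * 4 // 3
    return base / (1024 * 1024)

for res in (1024, 2048, 4096):
    raw = texture_mib(res, 4)
    bc7 = texture_mib(res, 1)
    print(f"{res}x{res}: ~{raw:5.0f} MiB RGBA8, ~{bc7:4.0f} MiB BC7")
```

Even compressed, a 4K texture runs on the order of 20 MiB, so a scene with a few hundred of them chews through an 8GB budget quickly - yet drawing from them costs the shader cores almost nothing, which is why the setting is nearly free as long as the textures fit.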

The "real world" price difference is a straw-man argument imo. Supply and demand as always...
The reason the 8GB is below MSRP is that it's unwanted.
The reason the 16GB is above MSRP is that it's wanted.

The price difference for AMD/NVIDIA would be $20-30 per card, although that's at prices we consumers can see. There's a fair chance the difference is even lower when you bulk-order hundreds of thousands of the things. It's no more expensive for the pick-and-place machine in the factory to place a chip with more memory; the cost is in the chips themselves, and it's minor.
If the graphics card makers just made the higher-VRAM model, there'd be no reason for it to be much more expensive, other than there simply being higher demand.
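As a sanity check on that claim, here's the arithmetic with an assumed spot price - the $2.50/GB figure is a guess for illustration, not a quoted industry number:

```python
# Illustrative only: spot prices vary; $2.50/GB for GDDR6 is an assumed
# ballpark, not a quoted figure from the thread or from AMD/Nvidia.
assumed_gddr6_per_gb = 2.50          # USD per GB, assumption
extra_vram_gb = 8                    # the 8GB -> 16GB upgrade
bom_delta = assumed_gddr6_per_gb * extra_vram_gb

retail_delta = 50                    # shelf-price gap cited elsewhere in the thread
print(f"Extra memory BOM:  ~${bom_delta:.0f}")
print(f"Retail price gap:   ${retail_delta}")
print(f"Markup on upgrade: ~{retail_delta / bom_delta:.1f}x")
```

Under these assumptions the memory itself lands in the $20-30 range quoted above, with the rest of the retail gap being margin and positioning rather than parts.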

Everyone's perception of the world seems to be different, though. My eyes seem to be attuned to fine detail: when I walk past a tent in a game and can see the lines in the fabric - love it. I point it out to my girlfriend and she struggles to see it. Meanwhile, it might take me a while to notice that my monitor has somehow reset to 60Hz, whilst my girlfriend notices instantly.
 
Great article as always, Steven! Thanks!

I'm still hanging on to my $900 RX 6800 16GB I bought during the crunch. I'm betting it could still beat all the cards on this list. Even though it was expensive, it has proven to be an excellent card even today.

There's a lot to be said for longevity; I'm in 100% agreement with you. Buying an 8GB card simply guarantees you'll be buying a new video card sooner, as you start hitting walls. The games aren't getting any less demanding.
 
Turning the textures down fixes basically all VRAM issues; people need to stop acting like this is the end of the world. Turn the textures down and the difference between 8GB and 16GB cards disappears.

This is true. I used to have a Quake 3 friend who basically played MP with stick figures and sprites on screen, chasing higher FPS. It looked awful, like 16-color DOS awful... but he could play. Hahahaha.

I think I'd choose a lower resolution over dropping textures. Modern games are pretty damn cool visually, and compromising that reduces the overall enjoyment.
 
AMD did a great job with the 9060 XT.
It basically doubled the performance of the card from two generations back. I call that a win.

Regarding the 8GB (non-)issue:
It exists mostly due to unoptimized video games. If you're going to be angry with someone, they're the ones.
Using extremely high polygon-count 3D models for everything, even for things not in the main focus; using 4K textures for objects that aren't even secondary in the scene, objects that will never fill the screen; scenes with way too many objects just for the sake of it. That eats up your VRAM faster than you think.
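To put a rough number on the geometry side alone, assuming a common 32-byte vertex layout (position, normal, tangent, UV) - real meshes also carry index buffers and often several LODs, so these are lower bounds:

```python
# Ballpark vertex-buffer cost of props at different polygon budgets.
bytes_per_vertex = 32   # assumed layout: position + normal + tangent + UV

for verts in (10_000, 200_000, 2_000_000):
    mib = verts * bytes_per_vertex / (1024 * 1024)
    print(f"{verts:>9,} verts -> ~{mib:6.1f} MiB")
```

A single two-million-vertex hero asset costs tens of MiB before textures are even counted, so handing that polygon budget to background clutter burns VRAM with no visible payoff.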

And I call it a non-issue because AMD left you a choice: if 8GB is not enough for you, there is a 16GB model for, on average, 50 bucks more. There you go - no need to suffer so much.
And please, if you have $300 to pay for a graphics card, you can find $350 too.
People who have no money at all don't even have the $300 to begin with.
A comment of reason. I'm so bored of hearing complaints about 8GB GPUs holding the gaming industry back. It's the gaming industry holding the gaming industry back. Many people who don't game need GPUs for other things. We supply them for 3D X-ray scanners, and we don't even need 8GB to run the very latest 3D rendering software - 4GB is just fine. We would have to charge our clients more for something they really don't need if 8GB cards stopped being available.

This is just another case of this tech community wanting to hate on the tech companies and coming up with an excuse to do so.
 
Why are people defending the 8GB models? It's like saying, "Hey AMD and Nvidia, can you make a worse product next gen?"

"I'm too poor to buy the 16GB card" - if you can afford the card at all, $20 extra is fine. You do realize they could easily have sold the 16GB model for $20 less, since an extra 8GB doesn't even cost AMD or Nvidia $20.
 
Already happened.
Not even close to the extent they went after Frank Azor's comment.

Sorry, but they didn't go after Nvidia until their narrative against AMD forced them to. It is about 4 months late. They didn't do it at the release of the 5060 Ti, so all this is just pure hypocrisy.

Even there, Steve admitted he was gaslighting with his own numbers and methodology.

This chart proves that Frank Azor's comment was 100% correct.

[Attached image: Untitledf.jpg]
 
Why are people defending the 8GB models? It's like saying, "Hey AMD and Nvidia, can you make a worse product next gen?"

"I'm too poor to buy the 16GB card" - if you can afford the card at all, $20 extra is fine. You do realize they could easily have sold the 16GB model for $20 less, since an extra 8GB doesn't even cost AMD or Nvidia $20.
Because... it is not a limiting factor, it is an option. You are not forced to buy the 8GB SKU. You can buy the 16GB one if the 8GB doesn't suit your needs.

This is why going on a crusade against an option is ridiculous. If the 9060 XT had been 8GB only, then that would have been a problem, and the drama would have been justified, since the GPU can push some interesting 1440p results.

However, if you play at 1080p and you are on a tight budget, maybe being limited to 8GB is the least important of your compromises.

Lastly, $20 on a $300 card is a good 7% of the total cost. So if you think AMD or Nvidia makes 30% margins on these low-end GPUs, then you understand that this is not JUST $20.
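To illustrate that margin-stacking point with made-up numbers - none of these margins are actual AMD, Nvidia, or retail figures:

```python
# Sketch of how a BOM increase compounds through the supply chain,
# using assumed margin figures just to show the mechanism.
bom_delta = 20.0                 # extra memory cost, USD
margins = [0.30, 0.12, 0.10]     # chipmaker, board partner, retailer (assumed)

price_delta = bom_delta
for m in margins:
    price_delta /= (1.0 - m)     # each stage keeps its percentage margin

print(f"${bom_delta:.0f} of parts -> ~${price_delta:.0f} at the shelf")
```

Under these assumptions, $20 of memory turns into roughly $36 at retail once every stage preserves its percentage margin - so "it only costs them $20" doesn't translate to a $20 shelf-price gap.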

And if you think GDDR7 costs the same as GDDR6, then you are making another inaccurate shortcut.
 
Not even close to the extent they went after Frank Azor's comment.

Sorry, but they didn't go after Nvidia until their narrative against AMD forced them to. It is about 4 months late. They didn't do it at the release of the 5060 Ti, so all this is just pure hypocrisy.

Even there, Steve admitted he was gaslighting with his own numbers and methodology.

This chart proves that Frank Azor's comment was 100% correct.

[Attached image: Untitledf.jpg]
Except his comment doesn't hold up.
The most-played games are eSports titles, sure. But do all of those people play eSports exclusively, with no other games on their radar that would benefit from more VRAM?
I mean, I mostly play HotS and StarCraft 2. Because those are my most-played games, I should get 8GB? Oh wait, now my performance in Cyberpunk is terrible.

16GB, same GPU, no compromise? My wallet disagrees. Fair enough if the card were $20 or $30 extra, but it's more like $100, and even north of that.

By the same logic, if those eSports players don't need the 16GB, then why sell them a GPU die that large? Fewer FPS would be fine too. People are mad at the 8GB model because the horsepower is there under the hood but it's hamstrung by the VRAM, while the proper 16GB model costs far more than the bill of materials justifies. It's an upsell - the same thing people get mad at Apple for: charging silly prices for memory.
 
I love these comparison charts that show performance alongside inflation-adjusted and real-world prices; they tell a good story about how the technology and the market have changed over time.

A lot of people are getting hung up on the 8GB vs 16GB issue, and to me, it's less about the VRAM capacity or pricing and more about naming. Both AMD and Nvidia should make their product naming obvious enough to the non-technical consumer that there is a difference between the two; otherwise, an unsuspecting buyer could easily be misled into thinking they are getting one product when in reality they are getting another. Let people make their own choices about what capacity and budget are right for them, but don't let the companies off the hook for the way they named these products. In that sense, I'm absolutely fine with reviewers roasting the products or companies accordingly.
 