Let's put four generations of AMD Radeon GPUs to the test to see how the $300 segment has evolved over the past five years. From the 5600 XT to the new 9060 XT 8GB, here's how they stack up.
> This is click bait. Everyone is treating the low end like it is the high end. The low end is filled with compromises; that's what it takes to make a low-end product. People don't buy a base model Honda Civic and expect it to perform like a Mercedes. I'm getting tired of the sensationalism. When I first saw the title, I thought this was going to be comparing the 9070 vs other cards in its class.
>
> Further, I have never known anyone who bought an entry-level GPU and didn't have to tweak the settings. These cards were never meant to play at high/ultra, so the fact that they perform as well as they do at these settings is commendable. When I was buying an entry-level graphics card in the 2000s, I fully expected to be playing at low settings and even dropping the resolution.
>
> Maybe we should point out that entry-level cards can give a playable experience at high/ultra, something that wasn't possible even just 10 years ago.

People buy a Honda Civic and expect it to work. If you bought a Civic and it couldn't climb a hill without having to take the doors off, you'd have something to say about that.
> People buy a Honda Civic and expect it to work. If you bought a Civic and it couldn't climb a hill without having to take the doors off, you'd have something to say about that.

VRAM is holding things back if you fill it with massive textures. Nearly every VRAM problem can be solved by turning the textures down.

And let's not let our success as adults make us forget about people with "hard budgets." I remember being a broke 20-something when $200 was the max I had to spend. There was not an extra $20 for a slightly better card. For someone who might be looking at a 7600, the idea of an 8GB 9060 in their price range would be a godsend. And considering that the 16GB model is going for about $30 over its MSRP and the 8GB can be found for $10-20 under its MSRP, the real-world price difference is closer to $80-100.
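To sanity-check that street-price arithmetic - a minimal sketch, assuming the launch MSRPs of $299 (8GB) and $349 (16GB); the premiums and discounts are the commenter's figures, not verified here:

```python
# Street-price gap between the two 9060 XT models, taking the comment's
# premiums/discounts at face value. MSRPs are an assumption ($299/$349).
msrp_8gb, msrp_16gb = 299, 349

street_16gb = msrp_16gb + 30                  # "about $30 over its MSRP"
street_8gb = (msrp_8gb - 20, msrp_8gb - 10)   # "$10-20 under its MSRP"

msrp_gap = msrp_16gb - msrp_8gb               # $50 on paper
street_gap = (street_16gb - street_8gb[1],    # $90
              street_16gb - street_8gb[0])    # $100

print(f"MSRP gap: ${msrp_gap}")
print(f"Street gap: ${street_gap[0]}-${street_gap[1]}")
```

That lands at roughly $90-100, in the same ballpark as the $80-100 the comment cites.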
> VRAM is holding things back if you fill it with massive textures. Nearly every VRAM problem can be solved by turning the textures down.

It isn't clickbait. 8GB VRAM limitations are holding the entire sector back. These GPUs are severely bottlenecked by their VRAM capacity, and I'm tired of pretending we can justify it. We can't.

A 550 Ti or a GTX 960 didn't leave a third of its performance on the table to VRAM limitations. The 970 left some, and there was hell to pay for it. Let's stop justifying these gimped cards and call them out for the e-waste they are. The 8GB 9060 should not exist.

So again, we're taking the doors off our Civic to climb the hill while proclaiming it works just as well as the Mercedes.
> And let's not let our success as adults make us forget about people with "hard budgets." […]

All that justification when you could just... get rid of the 8GB card, sell the 16GB card for $300 instead, and all those with hard budgets could buy a card that is a genuine improvement and will last them a long time without breaking the bank.
> AMD did a great job with the 9060 XT. It basically doubled the performance of the card from two generations back. I call that a win.
>
> Regarding the 8GB (non-)issue: it exists mostly due to unoptimized video games. If you are going to be angry with someone, they are the ones. Using extremely high-polygon 3D models for everything, even for things not in the main focus; using 4K textures for objects that are not even secondary in the scene and will never fill the screen; packing scenes with way too many objects just for the sake of it. That eats up your VRAM faster than you think.
>
> And I call it a non-issue because AMD left you a choice: if 8GB is not enough for you, there is a 16GB model for, on average, 50 bucks more. There you go, you do not need to suffer so much.
>
> And please, if you have $300 to pay for a graphics card, you can find $350 too. People who do not have money at all do not even have $300 to begin with.

8GB is insufficient for this GPU, and it is artificially raising the prices of 16GB models.
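To put rough numbers on the quoted comment's point about 4K textures eating VRAM - a back-of-the-envelope sketch, assuming uncompressed RGBA8 texels with full mip chains (real engines use block compression such as BC7, so treat these as upper bounds):

```python
# Approximate VRAM footprint of a square texture: size * size * bytes
# per texel, times ~4/3 for a full mipmap chain. Uncompressed RGBA8
# assumed; compressed formats are several times smaller.
def texture_mib(size, bytes_per_texel=4, mipmaps=True):
    base = size * size * bytes_per_texel
    return base * (4 / 3 if mipmaps else 1) / 2**20

print(f"4K texture: {texture_mib(4096):.0f} MiB")  # ~85 MiB each
print(f"2K texture: {texture_mib(2048):.0f} MiB")  # ~21 MiB each

# Uncompressed 4K textures alone would exhaust an 8 GiB card quickly,
# before counting framebuffers, geometry, and everything else in VRAM.
print(f"{(8 * 1024) / texture_mib(4096):.0f} fit in 8 GiB")  # ~96
```

Each texture tier down halves the resolution in both dimensions, so it quarters the footprint - which is also why the "just turn textures down" advice elsewhere in the thread frees as much VRAM as it does.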
> Man, just sticking it to AMD lately, aren't you guys?

Ah, there it is - took over 2 hours this time! Guess you missed that they did this same thing with Nvidia, eh?
You could just drop Nvidia in here as well, since both teams have had horrid improvement from the previous gen to this one, and from the gen before that.
> So again, we're taking the doors off our Civic to climb the hill while proclaiming it works just as well as the Mercedes.

We absolutely are not; the very suggestion is absurd. At best we're talking about a used car instead of a new one.
> Regarding the 8GB (non-)issue: it exists mostly due to unoptimized video games. […]

Because optimizing tends to be tedious and boring work with a substantial skill requirement. So games will miss out on other things, or they'll be more expensive. I'd rather throw some extra money at a card (which should be about $20-30 going by VRAM prices) than throw $10 extra at every game. It doesn't make sense to skimp on it. The memory chips are cheap, and they're literally the only thing that needs to change on the board to make the card perform better in a lot of current and pretty much all future titles.
> This is click bait. Everyone is treating the low end like it is the high end. The low end is filled with compromises; that's what it takes to make a low-end product. […]

60(0)-class cards have been the sweet spot of the compromise: maximum bang for the buck. Pay more and you don't get linear performance increases; go below and the compromises start affecting your gaming experience. It was the recommended entry point to gaming.
> Further, I have never known anyone who bought an entry-level GPU and didn't have to tweak the settings. […] When I was buying an entry-level graphics card in the 2000s, I fully expected to be playing at low settings and even dropping the resolution.

Yes, but Moore's Law does not apply to screens. Entry-level monitors only went from 1280x1024 to 1920x1080 - not a massive difference - while graphics cards became immensely more powerful. Back then VRAM also doubled every few years; it doesn't now.
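For scale, the pixel arithmetic behind that monitor comparison - editor's numbers for the two resolutions named above:

```python
# Entry-level monitor resolutions then vs. now, in raw pixel counts.
sxga = 1280 * 1024   # ~1.31 MP, common entry-level monitor in the 2000s
fhd = 1920 * 1080    # ~2.07 MP, entry-level today
print(f"{fhd / sxga:.2f}x more pixels")  # ~1.58x - modest growth,
# while GPU throughput grew by orders of magnitude over the same span.
```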
> Maybe we should point out that entry-level cards can give a playable experience at high/ultra, something that wasn't possible even just 10 years ago.

Not quite 10 years ago (9, to be precise) we had the RX 480 as the entry point - a card that ran games just fine with no compromises, not just at launch but long after.
> Turning the textures down fixes basically all VRAM issues; people need to stop acting like this is the end of the world. Turn the textures down and the difference between 8GB and 16GB cards disappears.

But textures MASSIVELY affect image quality. It's the one setting that can make a game look way, way better without affecting performance - all you need is VRAM (and some bandwidth to fill it up). I'll gladly lower basically every other setting to get more performance if it's lacking, but there are two I won't touch unless it's a super last resort:
> Look at what I just found... are we going to have an 8GB crusade against Nvidia this time?

Already happened.
> Because optimizing tends to be tedious and boring work with a substantial skill requirement. […]

Optimizing is one of the basic tasks in any respectable software company (gaming or not).