AMD Radeon RX 9060 XT 8GB Review: Not Good

What I like from Steve is his numbers; his opinions are as bad as any tech influencer's out there. I am an engineer, so I form my own opinion after seeing the numbers and the methodology.

That's the only thing I like from Hardware Unboxed. All the rest, especially Tim's bias, is absolutely unbearable. The guy gives Nvidia a pass on anything, but if AMD does something similar, he goes on a crusade against them.

That 8GB drama is the perfect example. Where was all the backlash against Nvidia for releasing the 5060 Ti and 5060 in 8GB variants? Ah yes, they didn't bother going after Nvidia to the same extent, even though those GPUs are more expensive for the same performance... and their drivers are still broken... and Nvidia didn't provide review drivers for the 5060 launch.

I would even argue that the 5060 non-Ti has the same MSRP as the 9060 XT 8GB...
Exactly. This was not a review; it was an opinion piece that shows no real quantifiable data, and neither does his YouTube video. A proper review would be a comparison with the 5060 at 1080p and 1440p, and at 1440p with FSR/DLSS (if I'm not mistaken, the 9060 XT has access to FSR 4, correct?).

I prefer the numbers as well, but here he deliberately takes a card, pushes it past limits he KNOWS it has, and then gets upset that it doesn't perform the same as the 16GB version.
 
Also from AMD's Computex press release, to quote:
Designed to unlock ultra-smooth gaming at 1440p, the Radeon RX 9060 XT is built for players who expect more. Equipped with up to 16GB of GDDR6 memory and 32 AMD RDNA 4 compute units, the GPU doubles ray tracing throughput compared to the previous generation, providing gamers with more realistic lighting, shadows, and reflections that bring virtual worlds to life.

AMD does make a distinction on their Radeon site by specifically mentioning the 16GB model and outright omitting the 8GB model from their comparison.
 
This is a very bad example.

For starters, it would actually be difficult to find a 1080p TV, and you would probably pay more for one than for a cheap 4K TV.

If anything, it's as if Steve were suggesting everyone should buy 8K TVs because they are future-proof, and calling a standard, cheap 4K TV bad because it cannot show 6K or 8K video.

A closer analogy would be the internet speed you need in order to watch higher-quality video.
It's like saying you should get 1 Gbit or more or else you will not enjoy a movie, when in reality you are fine even at 20 Mbit and will still enjoy it (that is, if the movie is actually good).
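
A quick back-of-envelope check of that claim (the bitrates are typical published streaming figures, used here purely for illustration):

```python
# Rough data budgets for a 2-hour movie at common streaming bitrates.
# Bitrate figures are typical, illustrative values, not measurements.
bitrates_mbps = {"1080p stream": 8, "4K stream": 20, "4K Blu-ray": 80}

for label, mbps in bitrates_mbps.items():
    gb = mbps / 8 * 2 * 3600 / 1000  # Mbps -> MB/s, times 2 hours, -> GB
    print(f"{label}: {mbps} Mbps is about {gb:.0f} GB per 2-hour movie")
```

Even a 4K stream sits around 20 Mbps, so a 20 Mbit line really is enough; the gigabit pitch buys headroom, not enjoyment.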

People have different life situations and expectations. Steve went way beyond what the majority's situation or financial capability allows.

Of course everyone would like a 5090; in real life, the 3060 sells the most.
No, it is not a bad example. It is a matter of content, and most content nowadays is in 4K. So you buy a 4K TV; if you have a few shows here and there at 1080p, they will still play fine. It's the same with games: most games nowadays are played at 1440p. Sure, you can play them at 1080p, but why would you, just because Nvidia and AMD decided to save on the RAM? It is not a chip limitation; if it were, I would understand. As it is, it is like having a car that won't shift into fourth gear.

And then, what happens when even at 1080p 8GB is not enough? If you are happy with 8GB, it makes more sense to buy an older card, save some money, and probably get the same performance.
 
Hmm. So he's comparing 1440p with FSR Quality between the 8GB and 16GB models.

Using *lower than 1080p* internal render resolution (1706x960)

And that's beyond the capability of the 8GB cards? From the looks of it, *every* test should have been well within the capabilities of an 8GB card, and yet they fail repeatedly when the *only* variable is the size of the VRAM pool.
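
For reference, that 1706x960 figure falls straight out of FSR's Quality-mode scale factor of 1.5x per axis (per AMD's FidelityFX documentation); a minimal sketch:

```python
# FSR "Quality" mode renders internally at 1/1.5 of the output
# resolution on each axis, then upscales to the target resolution.
def fsr_internal_resolution(out_w: int, out_h: int, scale: float = 1.5):
    return int(out_w / scale), int(out_h / scale)

w, h = fsr_internal_resolution(2560, 1440)
print(f"{w}x{h}")  # 1706x960 -- fewer pixels than native 1080p (1920x1080)
```

So these tests ask the cards to render fewer pixels than a plain native-1080p test would.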

He's right to call out that if you're already an esports player, you'll gain nothing over the 6650 XT you already have.

But even if it were an esports-only card, it would need to be priced like one: $200 max, with $300 for the full-VRAM card.

Let the 16GB card crush the 5060 in *every* scenario at an equal price point, and offer a competitive esports card at 2/3 the price.
 
Kudos! Best review in a long time, Steve!

I completely agree with the verdict of the article. 8GB should not exist as a variant of a card like this. Nvidia and AMD are making us play 8GB-optimized games on PC for another couple of years, at least until a new console generation with more unified memory is released and PC ports become more VRAM-demanding.

This is another moment where AMD failed to grasp an opportunity while Nvidia is not at its best. It's almost a pattern now.

Shortsighted "8GB is enough to play any game" comments in 3, 2, 1...
 
No, it is not a bad example. It is a matter of content, and most content nowadays is in 4K. So you buy a 4K TV; if you have a few shows here and there at 1080p, they will still play fine. It's the same with games: most games nowadays are played at 1440p. Sure, you can play them at 1080p, but why would you, just because Nvidia and AMD decided to save on the RAM? It is not a chip limitation; if it were, I would understand. As it is, it is like having a car that won't shift into fourth gear.

And then, what happens when even at 1080p 8GB is not enough? If you are happy with 8GB, it makes more sense to buy an older card, save some money, and probably get the same performance.
Most games aren't played at 1440p; most people play at 1080p, and even consoles run aggressive upscaling, sometimes from as low as 900p up to 1440p.

I don't like the idea that there will still be 8GB GPUs in 2025; it's ridiculous... but the discussion has to be based on facts.
 
Final output resolution is not the biggest consumer of VRAM. It matters much less than texture resolution (and texture count, of course), anisotropic filtering, ray tracing, FSR/DLSS...
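
A rough illustration of why (the buffer count and the texture-pool claim are assumptions for the sake of the example, not measured figures):

```python
# Render targets scale with output resolution; texture pools mostly don't.
# All figures below are illustrative assumptions, not measurements.
def render_target_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / 1024**2

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
    # assume ~10 full-resolution buffers (G-buffer, depth, post-processing)
    total = 10 * render_target_mb(w, h)
    print(f"{name}: ~{total:.0f} MB of render targets")
# 1080p: ~79 MB, 1440p: ~141 MB -- a ~60 MB delta, while a modern game's
# streamed texture pool alone can occupy several GB at any resolution.
```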

So, in 2025, a GPU with 8GB of VRAM should have no RT, no FSR, no AI, just raster performance, and should cost 100 dollars. Then I'd understand its existence.
 
Just PLEASE stop with this "8GB of VRAM is bad" BS.

The overwhelming number of 1080p screens and GPUs with 8GB or less proves people are OK playing with these VRAM sizes. No one ever said, "Oh gee, I can not run this game at 4K and ultra settings with my card like Steve did, so even though I like it, I will just not play it." That is why there are graphics settings in games in the first place. Not everyone is spoiled; people will adapt to their situation and enjoy the games they love.

Also, you took a card not intended for 1440p and ultra-high settings and showed:
Look, it's not working well at those settings!

Duh...

THE ONLY problem is pricing. This should be $200-250 MSRP, not $300.
But from what I see, people are buying it in droves.
The free market decides in the end, not someone's feelings (including mine).
If it sells well, they got it right; if not, they failed. And guess what they will do then?
It is a VERY hard question...
They will drop the prices.

The market decides in the end; that is the reason I got my 5090 below MSRP.
Please stop defending stupidity and greed. 8GB on any GPU besides maybe a 5030 is not just bad; it hurts the industry as a whole.

FYI, resolution plays a very small role in how much VRAM you use. If a card is limited at 1440p, chances are it's still limited at 1080p.

FYI 2: the Steam survey results include a vast number of laptops, school PCs, office PCs, and internet cafés, all with cheap 1080p screens. That skews the results a lot.

"The market decides in the end; that is the reason I got my 5090 below MSRP." - no wonder you are out of touch with average gamers.
 
Come on, I see your point, and it is a valid one. However, you are stacking the deck to make the 8GB cards look bad. Where is the 1080p data at medium or high settings? You are testing at 1440p, usually at the highest or nearly highest visual quality. I would argue that this class of card is really a 1080p card, and the 16GB versions can be stretched to 1440p in some cases. Of course these 8GB cards are not going to perform well at 1440p. And, as redgarl pointed out in a previous post, your very own results show the 8GB and 16GB 5060 Ti giving equal results at 1080p. Are the 8GB cards going to require adjusting settings and be less future-proof? Of course, but I think there is a place for them. The problem is the deceptive branding by both Nvidia and AMD. There should be 8GB cards (5060, 9060) priced significantly cheaper, and Ti or XT cards with 16GB only.
 
It's not a review. It's a warning to keep gamers from buying it. And they are right.
People can make their own choices. If they play at 1080p and are on a tight budget, as I would assume for people buying entry-level GPUs, then compromises were always part of the deal.
 
People can make their own choices. If they play at 1080p and are on a tight budget, as I would assume for people buying entry-level GPUs, then compromises were always part of the deal.
Yes, they can. The problem is labeling both cards the same, misleading people, or stuffing them into PCs that well-meaning parents will buy for their children, thinking it's a good purchase because the name on the spec sheet is the one their children mentioned. It is just a scam for a few bucks; there is no need for that.
And in this case, AMD would earn a lot of respect from customers and improve its brand if it did not go down this path. The $50 is not worth it.
 
Please stop defending stupidity and greed. 8GB on any GPU besides maybe a 5030 is not just bad; it hurts the industry as a whole.

FYI, resolution plays a very small role in how much VRAM you use. If a card is limited at 1440p, chances are it's still limited at 1080p.

FYI 2: the Steam survey results include a vast number of laptops, school PCs, office PCs, and internet cafés, all with cheap 1080p screens. That skews the results a lot.

"The market decides in the end; that is the reason I got my 5090 below MSRP." - no wonder you are out of touch with average gamers.

I am not defending those companies; I am tired of people saying "8GB is bad".
By default, all companies are "greedy". You decide by buying or not buying their products.

Also, I literally said that the price is the actual problem. The card should cost $50-100 less.

I have another computer with an old Nvidia 3070 that has 8GB of VRAM. It plays just fine at 1080p.

I get the cards that I get because of the job I do (my 5090 is used maybe 5-10% for gaming). It just happens that I use Unreal for the applications I work on (yes, Unreal can be used for applications too).

So, I think people who pay $3K just to play games are the ones out of touch; I buy mine to work on. Those people are spoiled. The guy who wrote the review gets them for free. So you tell me...
 
People can make their own choices. If they play at 1080p and are on a tight budget, as I would assume for people buying entry-level GPUs, then compromises were always part of the deal.
Yes, but it could be a very bad choice in 2025. The tradeoff is too significant: if there isn't enough VRAM, performance will drop, not to mention that frame generation and FSR use it too...

Sometimes you have to save your money, wait, or buy second-hand (a 6800 16GB should go for $250 and is a much better choice)...
 
I am not defending those companies; I am tired of people saying "8GB is bad".
By default, all companies are "greedy". You decide by buying or not buying their products.

Also, I literally said that the price is the actual problem. The card should cost $50-100 less.

I have another computer with an old Nvidia 3070 that has 8GB of VRAM. It plays just fine at 1080p.

I get the cards that I get because of the job I do (my 5090 is used maybe 5-10% for gaming). It just happens that I use Unreal for the applications I work on (yes, Unreal can be used for applications too).

So, I think people who pay $3K just to play games are the ones out of touch; I buy mine to work on. Those people are spoiled. The guy who wrote the review gets them for free. So you tell me...
I think you are missing the point here: the problem is not whether you can play fine at 1080p, but that you have a chip that ends up severely limited just to save a few bucks on RAM. Why buy a 9060 at all, then?

Besides, as the review points out, the naming is misleading; the whole thing is being done to catch unsuspecting people who don't know much about this. It is basically a scam. Why not name it something else, so that people are aware of what they are buying?
 
While I feel this "review" is subjective at best, there are a couple of things I agree with:

The naming scheme for the 8GB and 16GB cards should not both fall under the RX 9060 XT nomenclature - the 8GB should be named as the RX 9060, with RX 9060 XT used for the 16GB version. This would align with how the RX 7600 and RX 7600 XT were separated to distinguish between cards with different VRAM allocations but the same GPU die.

The pricing for 8GB GPU variants should be markedly lower than the 16GB variants.

Now as to the debate around 8GB being enough - it's worth noting that TPU's review shows the RX 9060 XT 8GB version besting the RX 7600 XT by 28.8% @ 1080p and 31.6% @ 1440p when running at native resolution for the games they tested. https://www.techpowerup.com/review/powercolor-radeon-rx-9060-xt-reaper-8-gb/34.html
 
New reviews from TechPowerUp today; it sees your article and raises. Do you want to go all in, sir?
 
I am genuinely surprised at the comments here supporting the idea that 1080p gaming is still the standard and 8GB is enough. I argued way back in the RTX 3000/Radeon 6000 era that the VRAM should have doubled to 16 GB as a standard in the same way the GTX 1000/Radeon 580 series doubled the previous GTX 900/Radeon 290 series 4 GB VRAM standard. (And it only took one generation to jump from 2 GB to 4 GB VRAM prior to that.)

That was literally a one-generation, two-year timeframe, and doubling the VRAM was deemed necessary back then. So why does anyone think the 8 GB VRAM standard should be held onto at this stage? We are now 5 generations and 10 years later, into the RTX 5000/Radeon 9000 series. As a consumer standard, we should be at 16 GB VRAM at the very minimum right now, and realistically it should be 32 GB.
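
Putting rough numbers on that cadence (the doubling intervals are my own illustration, not from the post):

```python
# Illustrative projection of a mainstream VRAM "standard", starting from
# the GTX 1000-era 8 GB baseline (2016) and doubling every N years.
def projected_vram_gb(year: int, base_gb: int = 8, base_year: int = 2016,
                      years_per_doubling: int = 4) -> int:
    doublings = (year - base_year) // years_per_doubling
    return base_gb * 2**doublings

print(projected_vram_gb(2025, years_per_doubling=4))  # 32 GB: a doubling every ~2 gens
print(projected_vram_gb(2025, years_per_doubling=2))  # 128 GB: the historical 1-gen pace
```

Even the slower cadence lands at 32 GB by now; the historical one-generation pace would put us far beyond it.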

I get the comments above saying that it's more than enough for certain situations; fair enough. Whether gamers actually use the VRAM to its full potential is another story, as per all the comments above, but IMHO that's not the point. This is not just about the technology itself; this is now a major anti-consumer issue. TVs have been 4K standard for a long time now, and consoles have caught up to the 16 GB memory standard (albeit in architectures shared with the CPU). GPUs are now literally holding back the general PC, display monitor, and gaming markets through lack of advancement, lack of availability, and lack of affordability.
 
This launch should have been the 9060 XT 16GB at $350, with the 8GB version sold as the 9060 8GB at $250 (dropping the "XT"). That would have followed what they did for the 7000 series. Then they could battle the B580 at $249, as both seem sufficient for 1080p. The B580 has more VRAM, but the 9060 XT is just plain faster, and the most important part of a GPU is the chip itself, then the VRAM, especially at lower resolutions.

This 8GB version could have been the clear value play over the RTX 5060, which is also $299. Silly what they chose here; it could have been positioned so much better. The 16GB is a pretty great option, all things considered.
 