AMD Radeon RX 9060 XT 8GB Review: Not Good

What I like from Steve is his numbers; his opinions are as bad as any tech influencer's out there. I am an engineer, so I form my own opinion after seeing the numbers and the methodology.

That's the only thing I like from Hardware Unboxed. All the rest, especially with Tim's bias, is absolutely unbearable. The guy gives Nvidia a pass on anything, but if AMD does something similar, he goes on a crusade against them.

That 8GB drama is the perfect example. Where was all the backlash against Nvidia for releasing their 5060 Ti and 5060 in 8GB variants? Ah yes, they didn't bother going to the same extent, even though those GPUs are more expensive for the same performance... and their drivers are still broken... and they didn't provide review drivers for the 5060 launch.

I would even argue that the 5060 NON-Ti has the same MSRP as the 9060 XT 8GB...
Exactly. This was not a review, it was an opinion piece that shows no real quantifiable data, and neither does his YouTube video. A proper review would be a comparison with the 5060 at both 1080p and 1440p, plus 1440p with FSR/DLSS (if I'm not mistaken, the 9060 XT has access to FSR4, correct?).

I prefer the numbers as well, but he deliberately took the card, pushed it past limits that HE KNOWS it has, and then got upset that it doesn't perform the same as the 16GB version.
 
Is this a review, or just a test aimed at a specific selection of games chosen to show the cases where 8GB of VRAM is problematic?

No direct competitors, no realistic resolutions; it can't be called a review. lol
 
Also from AMD's Computex press release, to quote:
Designed to unlock ultra-smooth gaming at 1440p, the Radeon RX 9060 XT is built for players who expect more. Equipped with up to 16GB of GDDR6 memory and 32 AMD RDNA 4 compute units, the GPU doubles ray tracing throughput compared to the previous generation, providing gamers with more realistic lighting, shadows, and reflections that bring virtual worlds to life.

AMD does make a distinction on their Radeon site by specifically mentioning the 16GB model and outright omitting the 8GB model from their comparison.
 
This is a very bad example.

For starters, it would actually be difficult to find a 1080p TV, and you would probably pay more for one than for a cheap 4K TV.

If anything, it's as if Steve were suggesting everyone should buy an 8K TV because it is future-proof, and calling a standard, cheap 4K TV bad because it cannot show 6K or 8K video.

A closer analogy would be the internet speed you should get in order to watch better-quality video.
It's like saying you should get 1Gbit or more or else you will not enjoy a movie, while in reality you are fine even at 20Mbit and will still enjoy it (that is, if the movie is actually good).

People have different life situations and expectations. Steve's advice is way out of step with the majority's circumstances and financial capability.

Of course everyone would like a 5090; in real life, the 3060 sells the most.
No, it is not a bad example. It is a matter of content, and most content nowadays is in 4K. So you buy a 4K TV; if you have a few shows here and there at 1080p, they will play fine. Just like games: most games nowadays are played at 1440p. Sure, you can play them at 1080p, but why would you, just because Nvidia and AMD decided to save on the RAM? It is not a chip limitation; if it were, I would understand. As it is, it is like having a car that won't engage fourth gear and up.

And then, what will happen when 8GB is not enough even at 1080p? If you are happy with 8GB, it makes more sense to buy an older card, save some money, and probably get the same performance.
 
Hmm. So he's comparing 1440p with FSR quality between the 8GB and 16GB models.

Using a *lower than 1080p* internal render resolution (1706x960).
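For anyone checking the math: FSR's Quality preset upscales by a factor of 1.5 per axis, which is exactly where that number comes from (the 2560x1440 output resolution is my reading of the test setup, not something stated here). A quick sketch:

```python
# FSR "Quality" preset renders internally at 1/1.5 of the output
# resolution on each axis (the documented 1.5x upscale factor).
def fsr_internal_res(out_w: int, out_h: int, scale: float = 1.5) -> tuple[int, int]:
    """Internal render resolution for a given output size and scale factor."""
    return int(out_w / scale), int(out_h / scale)

w, h = fsr_internal_res(2560, 1440)
print(f"{w}x{h}")                   # 1706x960
print(w * h < 1920 * 1080)          # True: fewer pixels than native 1080p
```

So the 8GB card was being fed fewer pixels per frame than a native 1080p workload and still fell over.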

And that's beyond the capability of the 8GB card? From the looks of it, *every* test should have been well within the capabilities of an 8GB card, and yet they fail repeatedly when the *only* variable is the size of the VRAM pool.

He's right to call out that if you're an esports player, you'll gain nothing over the 6650 XT you already have.

But even if it were an esports-only card, it would need to be priced like one: $200 max, $300 for the full-VRAM card.

Let the 16GB card crush the 5060 in *every* scenario at an equal price point, and offer a competitive esports card at 2/3 the price.
 
Kudos! Best review in a long time, Steve!

I completely agree with the verdict of the article. 8GB should not exist as a variant on a card like this. Nvidia and AMD will make us play 8GB-optimized games on PC for another couple of years, at least until a new console generation with more unified memory is released and PC ports become more VRAM-demanding.

This is another moment where AMD failed to grasp an opportunity while Nvidia is off its game. It's almost a pattern now.

Shortsighted "8GB is enough to play any game" comments in 3, 2, 1...
 
Most games aren't played at 1440p; most people play at 1080p, and even consoles run with aggressive upscaling, sometimes from 900p up to 1440p.

I don't like the idea that there are still 8GB GPUs in 2025; it's ridiculous... but the discussion has to be based on facts.
 
Final resolution is not the biggest consumer of VRAM. It uses much less than texture resolution (and texture count, of course), anisotropic filtering, ray tracing, FSR/DLSS...

So, in 2025, this GPU with 8GB of VRAM should have no RT, no FSR, no AI, just raster performance, and should cost 100 dollars. Then I'd understand its existence.
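To put a number on how little the final resolution itself costs, here is a back-of-the-envelope sketch; the byte-per-pixel and texture sizes are illustrative assumptions, not measurements from any particular game:

```python
# Rough VRAM cost of a render target at the final resolution, versus a
# single large texture. All sizes below are illustrative assumptions.
def buffer_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    """Size in MiB of one uncompressed buffer/texture of the given dimensions."""
    return width * height * bytes_per_pixel / 1024**2

# Even a 4K RGBA8 (4 bytes/pixel) back buffer is only ~32 MiB:
print(round(buffer_mb(3840, 2160, 4), 1))   # 31.6

# One uncompressed 4096x4096 RGBA8 texture is already twice that:
print(round(buffer_mb(4096, 4096, 4), 1))   # 64.0
```

Dropping from 1440p to 1080p render targets saves only a few dozen MiB in total, which is why texture quality, not output resolution, dominates an 8GB budget.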
 