AMD defends 8GB VRAM on GPUs... by admitting they are primarily for esports

Daniel Sims

A hot potato: Although Nvidia has caught the most flak for continuing to sell mid-range $400+ graphics cards with just 8GB of VRAM, AMD has persisted with the same approach in the budget-performance segment. Independent benchmark data reveals the ongoing quality and performance sacrifices associated with smaller VRAM pools, yet Team Red continues to defend its lower-tier products with statements that, while technically accurate, obscure the true value propositions of modern GPUs.

AMD's Frank Azor recently defended the company's decision to sell an 8GB variant of the Radeon RX 9060 XT amid growing criticism of mid-range and mainstream graphics cards featuring limited VRAM. While Nvidia is more frequently guilty of this trend and AMD cards often offer more memory, the most affordable and popular products from both companies suffer from the same issue.

Following a Computex unveiling of the RX 9060 XT, which offers 8GB and 16GB configurations, Azor responded to a question regarding the cheaper version by claiming 8GB of VRAM is sufficient for 1080p, the most popular PC gaming resolution.

Our reviews of comparable GPUs like Nvidia's RTX 5060 Ti 8GB and RTX 5060 reveal that, while this is true, limited VRAM significantly handicaps a card compared to otherwise identical hardware with more memory.

While virtually any title in 2025 is playable with 8GB of VRAM at the right graphics settings, the size of the VRAM pool can still significantly impact the user experience. Comparing the 8GB and 16GB versions of the RTX 5060 Ti reveals that while average frame rates are often similar, running out of memory can dramatically worsen one-percent lows, leading to noticeable stuttering. Some games perform worse overall on the 8GB model, and others – like Indiana Jones and the Great Circle – crash at settings where the 16GB GPU runs smoothly.

Warhammer 40,000: Space Marine 2 highlights a different issue. Both cards deliver nearly identical frame rates at ultra settings with the 4K texture pack enabled, but the 8GB version struggles to render high-resolution textures.

In another benchmark, the 8GB RTX 5060 Ti even falls behind Intel's Arc B580 – a 12GB card that costs over $100 less and targets a lower performance tier. These issues can arise even at 1080p, a resolution still used by 55% of surveyed Steam users.

Azor also noted that the mainstream GPU market is aimed largely at esports players, who likely represent the largest segment of users. Steam's most popular games – esports titles like Counter-Strike 2, Marvel Rivals, Dota 2, and Apex Legends – can still achieve high frame rates on 8GB cards.

Related reading: 4GB vs. 8GB: How Have VRAM Requirements Evolved?

However, as GPUs become more expensive, many users are turning to budget hardware for playing demanding AAA titles, which benchmarks show they can handle surprisingly well. With sufficient VRAM and upscaling technology, mainstream GPUs are perfectly capable of 4K gaming.

Nvidia's approach to reviews of 8GB GPUs also suggests manufacturers are aware of the shortcomings. The company withheld 8GB RTX 5060 Ti review units from independent outlets and restricted access to RTX 5060 drivers for reviewers unwilling to benchmark the card under favorable conditions.

One such condition was enabling quadruple frame generation, a feature that distorts raw performance metrics and consumes additional VRAM. Meanwhile, ray tracing – a feature Nvidia frequently markets as a core benefit – is notably memory-intensive.

AMD's first 8GB card, the Radeon RX 480, launched nearly nine years ago at $229. It's remarkable that manufacturers still sell GPUs with the same VRAM capacity for over $300. Outside of esports, these products are likely to age poorly, especially if next-generation consoles, which are expected to feature more than 20GB of memory, launch within these cards' life spans.


 
TechSpot poll shows 45% have 8GB or less, but it's only been an hour since this article's been up. Curious to see where it's at in 12+ hours. I'm probably in the minority, but I also think $300 is fair.
I'm with Franky boy on this one.
 
8GB is not a problem in any game... assuming you're not playing games at max settings. In most cases, it's fixed by just lowering a few settings (mainly textures).


Most reviews test games on max settings and then claim 8GB is not enough (even when fps is also poor on the 16GB card)... Even in the most demanding games, 1440p DLSS + high settings or a mix of medium/high is all you need to fix the issue on 8GB cards... People need to stop exaggerating.
 
It really depends on what games you play, what visual quality you want, and what price you’re willing to pay. Personally, I’ve been going a bit retro and get a laugh when loading up Doom 3 or Quake 4. There’s a warning about enabling high quality textures, which may severely impact performance even with a 512MB video card…
 
An 8GB GPU is simply not future proof. I get it, we're talking about a $300 GPU, but remember when the 1060 came out? If you saved a bit of money and chose the 3GB option instead of the 6GB, you'd definitely regret it. TechSpot even did an article 5 years after the card came out.

https://www.techspot.com/review/2291-geforce-gtx-1060-3gb-vs-6gb/

Before our wrap up, below is a look at the 17 game average and as you can see the 3GB version of the GTX 1060 is faring much worse than it did a few years ago. It's no longer 7% slower on average at 1080p. In fact, using the lowest quality settings, it was on average 14% slower, then 20% slower using medium settings, and a whopping 32% slower with high quality settings.

In the case of the 3GB vs 6GB battle, it took at least 3 years before the lower capacity model started to regularly run into issues and about 5 years before it was unusable in some games. You could argue that 3GB of VRAM wasn't considered "a lot" back in 2016, and perhaps 4GB vs 8GB would be a more valid comparison, but at the end of the day I think the results would be much the same.

So in short, 8GB may seem okay now, but you'll regret it after a few years.
 
Maybe technically true, but I also think that the majority of budget cards shouldn't be handicapped like this. It was only 2 gens back that AMD came out with a 12GB 60-series card, and Nvidia responded by giving their 3060 12GB. That was comical because the 3080 only had 10GB, which forced a 3080 Ti release.

Anyway, my main point was to say, having 2 SKUs also costs time and money - just give up on the handicapped 8GB versions already!
 
Gotta love this debate. I remember debating 2GB vs. 4GB in the GTX 670 days. History repeats again and again.
People were adamant you'd never need 4GB, even though there were already plenty of cases where enthusiasts ran into problems.
 
I still use an RX 470, a blower Sapphire with 4GB!! (I won't say the rest of my system), and I play most of what interests me (although I've been playing little for about 4 years; I dedicate the time to certain projects), even Cyberpunk 2077. I don't play the latest games at full graphics? True, but the settings I use are enough for me, and that's at 1080p (I don't play any esports titles). I just need what I play to run at 30fps or more (anyway, I limit most games to 30fps, even if they can go much faster, and use Radeon Chill to extend the card's lifespan).
My buying guide is that the hardware should deliver, now, at maximum quality and at the resolution I use, a minimum of 45fps in the games I'm interested in; then, adjusting the game settings will also extend its useful lifespan.
And what I'm going to buy now is a second-hand RTX 3060 for around $200, which will last me a few more years. I was thinking about a 3060 Ti, which gives me a bit more power, but I need a bit more memory for some AI experiments I'll be doing. I've used this same RX 470 quite a bit for AI tasks (mainly training, believe it or not) over the past few years, but I need a bit more now. And I'll stick with 1080p. I'll soon buy another primary monitor (my old secondary one is dead) and it'll also be a 1080p 60Hz one. I don't see any reason for more.
 
I don't really have a problem with 8GB cards, I have a problem with the price. $300 max is what I think an 8GB card is worth. I know that ESO, EVE, and most of the other games I play are fine with 8GB at 4K, but my 6700 XT has fairly frequently used ~10GB in games, and I'd have massive frame drops if it were limited to 8GB.

What they need to do is just stop making 8GB cards, raise prices by $20-30 across the board, and work on making 12-16GB the new standard. Instead, what we are going to see is 16GB cards out of stock or $50 above MSRP when they are in stock, and a surplus of 8GB cards below MSRP that no one wants, which end up getting dumped on OEMs at or near a loss to AMD or the board partners.

The biggest problem isn't that it's an 8GB card, it's that these GPUs are actually a lot more capable than their VRAM will allow them to be.
 
With all settings on low, 5 EVE clients don't even use 4GB, but that's barely playable due to how it looks. 8GB is fine for a few clients, but EVE is an ancient game; there's no way 8GB will be enough in a year or two.
 
This is not a problem with AMD selling 8GB cards; this is a problem with how markets and buyers operate, and it's not just GPUs. Suppose they released a version of these cards but built it on 7nm or an older node to save costs. You could do it, since half the shaders take up less space. You'd lose a bit of clock speed and power efficiency, and the tech press would be screaming bloody murder about making such a backwards card. Like it or not, all of these chips are built on the same node, and one is not substantially cheaper. Cars, appliances, etc. all get more expensive with every new model, and the entire spectrum from entry level to premium goes up in price. You can blame Nvidia, you can blame AMD, you can blame TSMC, but the fact is that supply and demand wins.

AMD is not forcing ANYONE to buy these cards, and yes, it sucks, but I'd rather they offer the choice and let me decide than not have the option.

What are we really arguing about? That by offering 8GB cards, higher-end gaming is being delayed and stagnating? Or are we arguing that the gaming enthusiast doesn't have enough games with uber eye candy, and the buying public, drones who don't read this and don't know any better, are keeping them from using THEIR cards to the fullest?
 
I repeat it again and again... besides a handful of games at ultra settings, 8GB is enough even at 1440p.

There is a difference of mere percentage points in maximum and 1% low frame rates between an 8GB and a 16GB GPU at 1440p.

[Image: relative performance chart, 2560x1440]
 