Is More VRAM Just Used for Marketing?

Of the eleven games tested, five showed absolutely no change in performance regardless of the quality settings: Crysis 3, Battlefield 4, The Witcher 3, Dragon Age: Inquisition and Dying Light.

There are some seriously good-looking and very demanding games in that list.

Crysis 3 only used 2.4GB of VRAM, and yet both the GTX 960 and R9 380 struggled to deliver playable performance using the best graphics settings at 1080p. It was a similar story when testing The Witcher 3 and Dragon Age: Inquisition. Interestingly, Dying Light consumed up to 3.5GB of VRAM when there was enough at its disposal, and yet there was no performance difference between the 2GB and 4GB graphics cards.
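
As a side note, for readers who want to check VRAM consumption on their own systems, here's a minimal sketch (assuming an NVIDIA card with the standard nvidia-smi utility on the PATH, and not necessarily the tooling used for this article): it polls memory usage once per second and reports the peak on exit.

```python
# Minimal VRAM-usage logger. Assumption: NVIDIA GPU with nvidia-smi on the
# PATH; this illustrates one way to capture figures like those above.
import subprocess
import time

def sample_vram_mb() -> int:
    """Return current VRAM usage in MB as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])  # first GPU only

peak = 0
try:
    while True:              # run alongside the game, Ctrl+C to stop
        peak = max(peak, sample_vram_mb())
        time.sleep(1)
except KeyboardInterrupt:
    print(f"Peak VRAM usage: {peak} MB ({peak / 1024:.1f} GB)")
```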

Then we ran into games such as Assassin's Creed Syndicate and Just Cause 3, where the 4GB models did provide additional performance, but only under unplayable conditions. For example, in Assassin's Creed Syndicate at 1080p with the R9 380 and GTX 960 on "Ultra High" settings, frame rates went from the low 20s on the 2GB models to the high 20s on the 4GB models.

Once the graphics settings were set back to "High", both the 2GB and 4GB models provided very playable performance, and the GTX 960 delivered the exact same frame rates regardless of memory capacity.

The only games that genuinely benefited from a larger 4GB frame buffer were Star Wars Battlefront, Shadow of Mordor (if only slightly) and Rainbow Six Siege. Of those, Rainbow Six Siege was the big one: while the 2GB R9 380 and 2GB GTX 960 still delivered very playable performance at 1080p using maxed-out settings, the 4GB models were much faster. The 4GB R9 380 enjoyed 23% more performance, while the GTX 960 was 16% faster.

This was a common theme: while the R9 380 benefited from the larger 4GB frame buffer in games such as Grand Theft Auto V, Just Cause 3 and Star Wars Battlefront, the GeForce GTX 960 didn't. In fact, we saw very few cases where the GTX 960 4GB made sense, and there were certainly no instances where the 2GB frame buffer crippled performance.

As expected, the GTX 960 and R9 380 simply aren't powerful enough to warrant anything bigger than a 2GB memory buffer. If more games performed like Tom Clancy's Rainbow Six Siege, then there could certainly be an argument for the 4GB cards, but as it stands, spending the extra $20 on the 4GB model seems like a waste of money.

Also, if you are a gamer who will back the quality settings off one notch to achieve 60fps+ performance, then the 4GB model has absolutely nothing to offer. Grand Theft Auto V with the R9 380 is a perfect example of this.

At 1080p using the highest possible visual quality settings, the 4GB model was 41% faster, though it averaged just 31fps with 25fps minimums, which is not exactly smooth performance. Backing off the quality settings a little delivered a massive boost, and both the 2GB and 4GB models then averaged the same 68fps with minimums of over 50fps.
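
For those wondering where the "percent faster" figures come from, they're simple ratios of average frame rates. The sketch below illustrates the arithmetic with the GTA V numbers above; note the 2GB average isn't quoted directly in the text, so it's backed out from the 41% figure purely for illustration.

```python
def percent_faster(fps_new: float, fps_old: float) -> float:
    """Speedup of fps_new over fps_old, expressed as a percentage."""
    return (fps_new / fps_old - 1) * 100

fps_4gb = 31.0             # 4GB R9 380 average at max settings (from the article)
fps_2gb = fps_4gb / 1.41   # implied 2GB average, ~22fps (not quoted directly)

print(f"Implied 2GB average: {fps_2gb:.0f}fps")
print(f"4GB advantage: {percent_faster(fps_4gb, fps_2gb):.0f}%")  # ~41%
```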

Granted, there isn't a significant cost difference between the 2GB and 4GB models.

As for the 4GB R9 290 vs. 8GB R9 390 comparison, there was nothing to see here. Most games didn't consume more than 4GB of VRAM, and in cases such as Assassin's Creed Syndicate where usage reached almost 5GB, there was still not a single frame of difference. Moreover, in that Assassin's Creed Syndicate example the 290 and 390 were good for just 26fps on average, so the quality settings will clearly need to be reduced to achieve playable performance, and consequently memory usage will drop.

The 390 and 390X are really graphics cards we never wanted. At the time of the 390 series' release, the Radeon R9 290 and 290X were exceptional buys: the 290X cost just $330, while today the 390X costs around $100 more for no additional performance, and it's no different with the 290 and 390.

We see plenty of gamers claiming that the 390 and 390X are excellent buys because their 8GB frame buffer ensures they are "future-proof," but that simply isn't the case, as neither GPU has the horsepower to efficiently crunch that much data. Perhaps the only valid argument here is that the larger frame buffer could better support Crossfire, but we haven't seen any concrete evidence of this yet.

Just as a few select games favored the larger 4GB models over the 2GB cards, the same may eventually prove true of 8GB cards over 4GB ones, though we feel it is even less likely.

So in the end, not much has changed from previous years. Akin to the megapixel and megahertz races in cameras and processors, the amount of memory on graphics cards that aren't fast enough to utilize it serves mostly as a marketing gimmick. On the bright side, the premium isn't a significant amount of money.