4GB vs. 8GB: How Have VRAM Requirements Evolved?

I am currently using 3440x1440 and let me tell you, you want that GPU to have at least 16 GB. It's slowed down, thank goodness, but it had me worried there for a bit.

I have been gaming on 3440x1440 since 2017 with a GTX 1080 on an Acer X34 Predator monitor.

I certainly never needed a 16 GB VRAM GPU and I still don't.

In fact, in all these years the most VRAM I have seen a game require was Civ VI at 3440x1440 Ultra, which used 5.2 GB of VRAM.

The only other game I play, WoW Classic, never went above 3-4 GB of VRAM, even though I use NVIDIA's downsampling tech to render it at 4K and then downscale it to 3440x1440 for even better image quality, and I am playing at Ultra settings.

So do I really need 16 GB to game at 3440x1440? Nah, I totally don't, nor will I ever.
 
MSRP is irrelevant if you cannot buy at MSRP. What matters is what I am getting for my money right now. I would have bought a 3080 at $700 to replace my 1080; I had planned for it and had the money, and $700 was a good price for that GPU.

There were none available at that price. It's as simple as that.

I waited and eventually got a 6800 XT at $560 when its price competitor was the 3070. The 3080 was between $900-1000. Pretty easy decision there.

Did you forget that both GPUs were released during the cryptocrapfest? Nvidia's MSRPs were lower because they were set just as crypto took off, and AMD's were higher because they were released and priced later, during the worst of crypto. Which is why these MSRPs are useless: neither reflects what buyers actually paid.

All the rest of the market share and enterprise blah blah is irrelevant and deviating from the point.

I did not forget anything, as I bought several 3070s at MSRP on release for people. I ordered 3 on release day and got them 1 day later, for $498-510 each. I also bought like 5 3080s on release for clients + a few 3090s. Not a problem.

The 6700 XT released much later and was caught in the crypto boom, just like every other GPU. That happened well after the RTX 3000 launch, and Nvidia took huge market share by being first to market + using Samsung instead of TSMC for massive output.

The 6700 XT launched about 6 months after the 3070 and was priced essentially the same (20 dollars lower), yet still lost by 12-15% and had lackluster features and RT performance.

The 3070 was never priced on par with the 6800 XT, unless you compared the cheapest 6800 XT with the most expensive 3070, which is pointless.

That was when AMD tried to sell off their huge post-mining GPU stock. That's why the 6700, 6800 and 6900 series were massively discounted, and why many 7000 series GPUs were heavily delayed (the 7700 XT and 7800 XT especially). AMD wanted to sell old tech rather than push new, because of HUGE INVENTORY.

No, Nvidia's prices were low because they used Samsung 8nm instead of TSMC, which is much cheaper. It was enough to beat AMD anyway.

I bought 100s of GPUs over the last two generations; I think I know what the prices have been.

If what you are saying were true (for the entire world), then AMD would have tons of GPU market share. Yet they don't.

If you are in doubt, go check Steam HW Survey or these links:

AMD was in full panic-mode trying to sell 6700, 6800 and 6900 inventory. Massive discounts. Yet they did not really sell well.

Second-hand prices on AMD hardware are also WAY LOWER, so the money you "save" is not really saved in the end. Nvidia cards keep their value much better. Just like iPhones vs Android.

Why? Demand is MUCH higher, and Nvidia/Apple don't cut prices all the time like AMD does. AMD is all about pricing, and every single piece of hardware from them gets its price lowered multiple times over a generation or two, making old AMD hardware close to unsellable when you are done with it. This is a fact and has been true for all CPU and GPU generations from AMD.

AMD = launch MSRP is set high, allowing for price reductions, which are bound to happen.
 
My 8GB GPUs are plenty @ 3440x1440 Ultra; High for 6GB, Medium to Low for 3GB.
True. Not a single game needs 16GB VRAM for 3440x1440.

3440x1440 has pretty much the same VRAM requirement as regular 1440p; it is only about a third more pixels, and both are far from the 4K/UHD pixel count.
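
For a sense of scale, the raw pixel math (plain arithmetic, nothing GPU-specific) looks like this:

```python
# Pixel counts for the resolutions being compared.
resolutions = {
    "2560x1440 (16:9 QHD)": 2560 * 1440,
    "3440x1440 (ultrawide)": 3440 * 1440,
    "3840x2160 (4K/UHD)": 3840 * 2160,
}

base = resolutions["2560x1440 (16:9 QHD)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x QHD)")

# 2560x1440 (16:9 QHD): 3,686,400 pixels (1.00x QHD)
# 3440x1440 (ultrawide): 4,953,600 pixels (1.34x QHD)
# 3840x2160 (4K/UHD): 8,294,400 pixels (2.25x QHD)
```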

People just don't know what allocation is. Tons of game engines allocate 80-90% of available VRAM regardless of how much is actually needed. That has nothing to do with the VRAM requirement.

Software readings of VRAM usage are pretty much worthless, especially if you compare GPU X with 20-24GB VRAM against GPU Y with 8-12GB VRAM. The numbers will vary wildly.
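
To illustrate the point, here is a minimal sketch (assuming an NVIDIA card with the standard nvidia-smi tool on the PATH) that reads the driver-reported memory figures; "used" here means allocated/reserved by processes, not the minimum a game actually needs:

```python
import subprocess

# Query driver-reported VRAM figures via nvidia-smi.
# memory.used is memory that processes have allocated/reserved,
# NOT the minimum a game needs to run well.
out = subprocess.check_output(
    [
        "nvidia-smi",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader,nounits",
    ],
    text=True,
)

first_gpu = out.strip().splitlines()[0]          # one line per GPU
used_mib, total_mib = (int(v) for v in first_gpu.split(", "))
print(f"Allocated: {used_mib} MiB of {total_mib} MiB "
      f"({used_mib / total_mib:.0%})")

# An engine that grabs 80-90% of whatever card it finds will report a
# big number on a 24GB card and a smaller one on an 8GB card for the
# exact same scene, so the reading says little about requirements.
```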

The only time you need 16GB or more VRAM is if you push native 4K/UHD or higher while running Ray Tracing on top, or Path Tracing, which is even more demanding. Which no AMD users will be doing, because RT performance is miserable on AMD and native 4K with RT is not happening.
 
never priced on par with 6800XT. Unless you went with the cheapest 6800XT and the most expensive 3070, which is pointless.

I'll just address this one statement as an example of motivated reasoning or maybe willful blindness? I already posted TS's graphs of the very prices you're denying in a previous post.

You can look at years of TS articles and the associated HUB YT videos here, detailing the state of GPU pricing at the time these GPUs were available, and the 3080 was always way overpriced compared to its logical competitor, the 6800 XT. Instead, the 3070 or sometimes the 3070 Ti was priced at or above the 6800 XT.

Yes, for a single day or so at launch people bought some Ampere cards at stock price, which is great. But from the following days/weeks until Ada was released, you could not, for >99% of their run.

Again, what matters is pricing and features when you buy, and RDNA2 had better pricing per FPS for pretty much Ampere's entire run. Of course some people valued Ampere's features more, so they paid the extra money. Choices for everyone.
 
Nice to see that Horizon Forbidden West doesn't consume more than 9GB of VRAM even when fully maxed out. Nixxes said they designed a system that moves texture data not needed for the current scene to system RAM when VRAM starts to fill up.
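
Nixxes hasn't published how that system works internally, so the snippet below is only a hypothetical, heavily simplified sketch of the general idea (an LRU-style residency scheme with a made-up TextureResidency class; real engines stream at the mip level and do this asynchronously):

```python
from collections import OrderedDict

class TextureResidency:
    """Toy LRU residency manager: keeps recently used textures in VRAM
    and demotes the least recently used ones to system RAM once a VRAM
    budget is exceeded. Purely illustrative, not actual engine code."""

    def __init__(self, vram_budget_bytes: int):
        self.budget = vram_budget_bytes
        self.in_vram = OrderedDict()   # texture_id -> size in bytes
        self.in_sysram = {}            # textures demoted to system RAM
        self.used = 0

    def touch(self, tex_id: str, size: int) -> None:
        """Called whenever the current scene needs a texture."""
        if tex_id in self.in_vram:
            self.in_vram.move_to_end(tex_id)   # mark as recently used
            return
        self.in_sysram.pop(tex_id, None)       # promote back if demoted
        self.in_vram[tex_id] = size
        self.used += size
        self._evict_if_needed()

    def _evict_if_needed(self) -> None:
        # Demote least recently used textures until we fit the budget.
        while self.used > self.budget and len(self.in_vram) > 1:
            old_id, old_size = self.in_vram.popitem(last=False)
            self.in_sysram[old_id] = old_size
            self.used -= old_size

# Example: a 9GB budget with 64 MiB textures streaming in as they are used.
mgr = TextureResidency(vram_budget_bytes=9 * 1024**3)
mgr.touch("rock_albedo_4k", 64 * 1024**2)
mgr.touch("sand_normal_4k", 64 * 1024**2)
```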
 