4GB vs. 8GB: How Have VRAM Requirements Evolved?

The 6500 XT is a steaming pile, true. 4GB of VRAM is too low. 6GB can work for some, but 4? Nah. Only for older games or indies.

I remember when AMD launched the Fury X with 4GB of HBM and called it future-proof because it was HBM, only for it to run into VRAM issues soon after. Meanwhile they were praising how great 8GB was on the 390 refresh series 😂 The 980 Ti with its 6GB of VRAM aged much, much better (and overclocked like a champ, gaining 30-40% performance with an OC), while Lisa Su called the Fury X an "overclocker's dream" on stage, yet it barely managed a 1% gain from overclocking.


Most PC gamers today are fine with 8GB, since 99% use 1440p or lower. 12GB is more than plenty, even for 3440x1440. If you want to push settings hard, the GPU will buckle before VRAM anyway.

Big difference between allocation and actual requirement. It's crazy how many people don't understand how allocation works.

Allocation does NOT reflect actual requirement, especially not when you compare GPUs with different VRAM pool sizes.

The PS5 and XSX have 16GB of shared RAM in total. The OS and game use like 8-10GB, meaning graphics get 6-8GB, with a 4K/UHD target.
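
Quick napkin math with those numbers (my rough guesses, not official figures):

```python
# Back-of-the-envelope split of a console's 16GB unified pool,
# using the rough numbers above (assumptions, not official figures).
TOTAL_GB = 16

for os_plus_game in (8, 10):  # OS reserve + game code / CPU-side data
    graphics = TOTAL_GB - os_plus_game
    print(f"If OS + game data take {os_plus_game}GB, graphics get ~{graphics}GB")
```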
 
It's always the consoles that dictate base memory usage for the majority of games. They introduce the steep jump in requirements shortly after they launch. Assuming you want to at least match console settings and resolutions, which are also now typically beyond the 1080p tested here.

When you have a new console generation that has at least 8GB of video memory available for developers (probably a bit more) you know that's what most games on PC are going to end up needing for the very bottom end. That has quickly become apparent.
 
So I've been less interested in maximum raw performance than I am in power efficiency. My friend was showing me his new 4090 rig and the AC in his room was running constantly. I hear people say that if you're worried about the power bill then you can't afford a 4090, which has a good bit of truth to it, but having the AC running, and the noise with it, is annoying.
 
It's always the consoles that dictate base memory usage for the majority of games. They introduce the steep jump in requirements shortly after they launch. Assuming you want to at least match console settings and resolutions, which are also now typically beyond the 1080p tested here.

When you have a new console generation that has at least 8GB of video memory available for developers (probably a bit more) you know that's what most games on PC are going to end up needing for the very bottom end. That has quickly become apparent.

They don't have "at least" 8GB for graphics; they use 50% of the 16GB of RAM tops. Many games use 2-4GB, some use 6-8GB. I've never heard any PS5/XSX developer talk about using more than 8GB, and I watch a lot of behind-the-scenes / coding and hardware talks.

Many PC games use just 1-2GB more going from 1080p to 2160p/4K UHD on identical settings, even though the resolution quadruples. Example:

https://www.techpowerup.com/review/atomic-heart-benchmark-test-performance-analysis/5.html
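
Rough napkin math on why 4x the pixels only adds a GB or two: only the resolution-dependent render targets scale with pixel count, while textures and geometry stay the same. The bytes-per-pixel figure below is a made-up ballpark, just to show the shape of it:

```python
# Only resolution-dependent buffers (G-buffer, depth, HDR color, TAA history,
# post-processing chain) scale with pixel count. 200 bytes/pixel is an
# assumed ballpark for all of them combined; real engines vary a lot.
BYTES_PER_PIXEL = 200

def render_target_gb(width, height):
    return width * height * BYTES_PER_PIXEL / 1024**3

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_gb(w, h):.1f}GB of resolution-dependent targets")

# Textures, meshes etc. cost the same at both resolutions, so the total VRAM
# figure only moves by roughly the difference between those two numbers.
```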

Some of the most beautiful games on PC use less than 8GB on max settings at 4K/UHD. It's called good optimization (texture compression etc.) and proper coding. Rushed console ports are often a mess in this regard, at least on release.

Many console ports are rushed big time; some use far more VRAM than they should, others just run like garbage or have tons of bugs. Waiting 6-12 months usually fixes this, and you pay 50% less for the game on top. Never preorder console ports.

None of the so-called "demanding" console ports actually look great. The most impressive one is Horizon Zero Dawn, and it uses less than 8GB at 4K.

Some engines just allocate all the VRAM they can. That has nothing to do with requirement. Avatar, for example: https://www.techpowerup.com/review/avatar-fop-performance-benchmark/5.html

The 3070 still beats the 6700 XT with ease in 4K/UHD at ultra settings, minimum fps included, even though a 4090 shows up to 15GB of VRAM in use. Allocation is the keyword.
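
If you want to see the allocation numbers yourself, here's a quick sketch (assuming an NVIDIA card and the pynvml bindings, pip install nvidia-ml-py). Keep in mind both figures it prints are allocations, not what the game actually needs to run well:

```python
# Prints device-level and per-process VRAM *allocation* on an NVIDIA GPU.
# Assumes the pynvml bindings are installed (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Device: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GB allocated")

# A 24GB card will happily show a game holding 15GB that an 8GB card
# would simply never allocate in the first place.
for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = "n/a" if p.usedGpuMemory is None else f"{p.usedGpuMemory / 1024**2:.0f} MB"
    print(f"PID {p.pid}: {used}")

pynvml.nvmlShutdown()
```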
 
So I've been less interested in maximum raw performance than I am in power efficiency. My friend was showing me his new 4090 rig and the AC in his room was running constantly. I hear people say that if you're worried about the power bill then you can't afford a 4090, which has a good bit of truth to it, but having the AC running, and the noise with it, is annoying.
What CPU does your friend have?
I have the 4090 Suprim Liquid at a 3GHz OC and I don't have this problem even during long gaming sessions. The CPU might be contributing to the uncomfortable ambient room temperature. Also, I cap my performance in rasterization games at 4K 120Hz (my display's max refresh rate) to mitigate wasted power.
The CPU I have is the 7800X3D with Curve Optimizer set in the BIOS to peak at 70°C.
 
What CPU does your friend have?
I have the 4090 Suprim Liquid at a 3GHz OC and I don't have this problem even during long gaming sessions. The CPU might be contributing to the uncomfortable ambient room temperature. Also, I cap my performance in rasterization games at 4K 120Hz (my display's max refresh rate) to mitigate wasted power.
The CPU I have is the 7800X3D with Curve Optimizer set in the BIOS to peak at 70°C.
13900KS I think. But even if it was a 7800X3D it would still be using as much power as a space heater. The whole idea of performance at any cost is getting annoying, with some people housing their computers (the box) in another room entirely. I miss the days when a 500-watt PSU could run a high-end CPU and GPU. Power requirements are getting absurd, but everyone keeps talking about how efficient everything is getting. Now it feels like water cooling is almost mandatory if you don't want to be thermally throttled.
 
They don't have "at least" 8GB for graphics, they use 50% of 16GB RAM tops.

You say "use" here.

So do you mean allocate or use? When I said they have 8GB 'available' essentially it's the same as 'allocation.' Developers have at least 8GB they can utilise for video memory purposes. That doesn't mean they all use it, but in that case you're basically just repeating and agreeing with what I said!

Obviously what any game might 'use' at any given time is naturally something different according to the type of game and developer priority. It can literally vary mode to mode, what with selectable graphics modes being par for the course on console these days.

In terms of where I said "probably a bit more than 8GB," determination of the maximum amount of video memory is down to developer allocation of the available system resources outside of what is walled off by the OS. Having unified memory has downsides; one upside is that you hand this pool of memory to a developer and say "do whatever you want with this."

So in some cases this can exceed 50 percent of a console's total available memory, believe it or not. It is not as common as it used to be, but on older systems like the Xbox 360 it was a critical aspect of that machine's ability to outperform the PS3's texture quality. It was pretty common to see games designed to utilise ~300MB of the 512MB of unified memory for textures and the like, something essentially impractical on PS3 because it had split 256/256 memory pools, AND video memory walled off for the GUI.

In any case, it's very safe to assume that when a developer sits down and works on PS5 hardware, for example, they can count on at least 8GB of the unified pool being available for video usage if they need it. That roughly carrying over to PC is inevitable these days.

There are lots more variables that impact PC performance and game allocation. Consoles reserve memory for the background GUI etc. On PC, though, just displaying an average desktop at idle can eat as much as 1GB of your video card's memory. Your 8GB card is only really good for 7GB or so in the game itself, which is how games using less than 8GB still run into performance barriers on 8GB cards. So console games pushing 7GB are already marginal on PC with 8GB video cards, let alone if you want better settings...
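
If you want to check that for yourself, measure while sitting at the desktop before launching anything (a minimal sketch, assuming an NVIDIA card and the pynvml bindings):

```python
# Run this while idling at the desktop to see how much VRAM is already gone
# before a game even starts. Assumes an NVIDIA card and pynvml (nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
pynvml.nvmlShutdown()

idle_gb = mem.used / 1024**3    # browser, compositor, overlays, etc.
total_gb = mem.total / 1024**3
print(f"Desktop at idle is already holding ~{idle_gb:.1f}GB")
print(f"That leaves ~{total_gb - idle_gb:.1f}GB of {total_gb:.0f}GB for the game itself")
```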

Lots more factors but this post is long enough already.
 
13900KS I think. But even if it was a 7800X3D it would still be using as much power as a space heater. The whole idea of performance at any cost is getting annoying, with some people housing their computers (the box) in another room entirely. I miss the days when a 500-watt PSU could run a high-end CPU and GPU. Power requirements are getting absurd, but everyone keeps talking about how efficient everything is getting. Now it feels like water cooling is almost mandatory if you don't want to be thermally throttled.
I have the air-cooled Noctua D15. The root cause is probably the CPU. Even a custom loop with a great 480mm radiator has to dump its heat into the ambient air and will eventually recycle that air and throttle. Although if the room is small and closed off, any system will eventually throttle as well (some faster than others). Yes, the 4090 can scale both ways too, from power hungry to very power efficient, depending on the title and power limits.
A decent power limit of, let's say, 300 watts gives you 85% of the peak performance you'd get at 450 watts, and combined with an efficient CPU that uses around 50 watts during gaming, it should improve the ambient room temperature and give you more consistent frames (less frame variance from throttling). The whole system needs to be analyzed, including the room conditions, the type of game, etc. This will all probably be irrelevant next year with 4K 240Hz monitors and 1440p 480Hz monitors.
 
Why doesn't anyone ever question the evolution of GPUs in general? It used to just be some extra video RAM. Nice and simple. Now there are so many bullshit layers involved, all getting patched all the time. Why does every AAA game release necessitate new driver patches? Why can't devs adhere to the original driver specs? They're all using the same few engines anyway FFS. To me it's just become bullshit.

So I've been less interested in maximum raw performance than I am in power efficiency. My friend was showing me his new 4090 rig and the AC in his room was running constantly. I hear people say that if you're worried about the power bill then you can't afford a 4090, which has a good bit of truth to it, but having the AC running, and the noise with it, is annoying.

My Lian Li towers, consoles, and UPS units are all on a shelf in the basement with 16ft HDMI and USB cables coming up through the floor. It's so nice NOT having to hear any of it running.
 
This is really more of a weird re-review of the 6500XT than it is anything to do with the RAM argument on mainstream cards (the issue being NV selling users a $400 card with 8GB of RAM).

It's a question of bad pricing. You'd hear far fewer complaints if 8GB cards occupied the $200 territory; one would expect to turn a few settings down at that price point, but not at $400.

Also, the 6500 XT looks like it really should have launched with 8GB of RAM as the default config; it would have been received much better than the 4GB variety. I wonder if the larger framebuffer bolsters the card at PCIe 3.0 as well, since the system makes way fewer trips across the cramped bus, which is bad enough even on PCIe 4.0.
 
So with enough VRAM even the trash-tier 6500 XT can get 45-80 FPS in almost all 2023+ games at 1080p native with High-Ultra textures and look good, except in the broken Immortals of Aveum (and Lords of the Fallen, see TPU).

Which means a reasonably specced ~$200 GPU would do a great job for entry-level gaming, but neither AMD nor Nvidia seems interested in making so little profit/margin. The 4060 and 7600 do mid-80s FPS at 1080p Ultra for $270-300, and that's as low as they want to go. There's loads of room for something below these with 8GB targeting mid-60s FPS at Med-High for <=$200, but what do stockholders want with that?

In their place, both companies sell previous gen gimped GPUs. FU low-end gamers.
 
Most PC gamers today are fine with 8GB, since 99% use 1440p or lower. 12GB is more than plenty, even for 3440x1440. If you want to push settings hard, the GPU will buckle before VRAM anyway.
Nope.

First, it's not just the 1%ers with high res monitors and TVs. This is just a fact.

Second, you completely missed the entire point of this article and the last one. Go read or watch it. The "GPU will buckle before VRAM" myth is busted with science and example after example (even at 1080p).

 
99% of games run on 2-3GB of VRAM even today; just keep the resolution in check and don't use HD textures. Even my 2012/13 GTX 600/700-series 2GB and 3GB cards can play most titles.
LMFAO no. If you're fine playing games at 640x480 with no lighting or textures that's great for you, but most of us left the N64 era YEARS ago.
 
They don't have "at least" 8GB for graphics; they use 50% of the 16GB of RAM tops. Many games use 2-4GB, some use 6-8GB. I've never heard any PS5/XSX developer talk about using more than 8GB, and I watch a lot of behind-the-scenes / coding and hardware talks.

Many PC games use just 1-2GB more going from 1080p to 2160p/4K UHD on identical settings, even though the resolution quadruples. Example:

https://www.techpowerup.com/review/atomic-heart-benchmark-test-performance-analysis/5.html

Some of the most beautiful games on PC use less than 8GB on max settings at 4K/UHD. It's called good optimization (texture compression etc.) and proper coding. Rushed console ports are often a mess in this regard, at least on release.

Many console ports are rushed big time; some use far more VRAM than they should, others just run like garbage or have tons of bugs. Waiting 6-12 months usually fixes this, and you pay 50% less for the game on top. Never preorder console ports.

None of the so-called "demanding" console ports actually look great. The most impressive one is Horizon Zero Dawn, and it uses less than 8GB at 4K.

Some engines just allocate all the VRAM they can. That has nothing to do with requirement. Avatar, for example: https://www.techpowerup.com/review/avatar-fop-performance-benchmark/5.html

The 3070 still beats the 6700 XT with ease in 4K/UHD at ultra settings, minimum fps included, even though a 4090 shows up to 15GB of VRAM in use. Allocation is the keyword.
All that text, and you miss the point. PC games will typically use as much VRAM as consoles have RAM. This has been true every generation. The only thing holding back this gen is the trash Series S with its 10GB of total memory.

The 8GB GPU era is over. It's time to upgrade.
 
So I've been less interested in maximum raw performance than I am in power efficiency. My friend was showing me his new 4090 rig and the AC in his room was running constantly. I hear people say that if you're worried about the power bill then you can't afford a 4090, which has a good bit of truth to it, but having the AC running, and the noise with it, is annoying.
But the 4090 is the most efficient GPU. Money notwithstanding, if you limit it to 200W, it's the fastest 200W GPU in existence.
 
I'm very surprised that Horizon Forbidden West still includes 4GB GPUs in the minimum specs, considering how terrible they were for the Zero Dawn port: missing geometry and low-LOD textures and models everywhere on my old RX 570. Then again, FW is a cross-gen title available on PS4 as well, so I guess it can still scale down pretty well.
 
I've got a GTX 1650 (4GB) and ran TLOU Part I on there, running through Wine and VKD3D. My FPS was OK, like 30 or 40, until I got to a few areas that would suffer mega-jank (like catastrophically low 1% lows). I think some areas are made into those claustrophobic tunnels specifically to avoid this issue, but in a few of the wide-open spots with significantly different stuff ahead of and behind you, I'd look around behind me every so often and there'd be this massive, like 1/4 to 1/2 second, delay as it (I assume) replaced half the textures in VRAM all at once.

Don't know if it does this on Windows, but in the graphics settings TLOU Part I informs me it's using like 6.8 out of 12GB (on my 4GB card). nvidia-smi reveals that it's actually soaking up like 3.9GB (4GB minus 128MB, typically) on my 4GB card, pre-allocated; it's using that much by the time the spinning dog tag pops up on screen.
 
But the 4090 is the most efficient GPU. Money notwithstanding, if you limit it to 200W, it's the fastest 200W GPU in existence.
I know that's true, but I've been daily driving a Steam Deck for the last 3 weeks. The idea that I can play games at 20 watts that I played 15 years ago at ~800 watts is way more impressive to me than a $2000 GPU running at 200 watts instead of 450-500.

I've always been a gamer, but I've also always been a hardware nerd. The wattage specs on the Steam Deck seem way cooler to me than the overall specs on top-tier hardware these days. I ran top-tier hardware for over 15 years, but I just don't find it as interesting as the lower-end stuff anymore.

It's not even about price; I can buy a 4090 if I want, but the idea of the Steam Deck as a platform, as battery tech improves, is really what draws my interest these days.
 
I'm very surprised that Horizon Forbidden West still includes 4GB GPUs in the minimum specs, considering how terrible they were for the Zero Dawn port: missing geometry and low-LOD textures and models everywhere on my old RX 570. Then again, FW is a cross-gen title available on PS4 as well, so I guess it can still scale down pretty well.

I played HZD for a while on a 1050 Ti and an RX 6400 (even tried it on a GTX 745 4GB!) and I don't remember missing textures or LOD problems on the 6400, but I had to lower quality on the slower 1050 Ti enough that LOD may have ended up being a non-concern relative to other settings.

One thing HZD was unique for was VRAM corruption on the 1050 Ti when I'd OC the VRAM too much. I'd get consistent major errors, with foliage and other things making ridiculous shapes in-game. Clock the VRAM down a notch, restart, and all is good. No other game has done that with this GPU.
 
I know that's true, but I've been daily driving a Steam Deck for the last 3 weeks. The idea that I can play games at 20 watts that I played 15 years ago at ~800 watts is way more impressive to me than a $2000 GPU running at 200 watts instead of 450-500.
The Steam Deck hardware is impressive in its efficiency and performance. I haven't owned one, but I had a Dell with a Ryzen 3450U (which is very close to the APU used in the Steam Deck), a 15W TDP, and games ran pretty nicely on there. I now have an Asus with an Intel Tiger Lake (the i3-1115G4, the lowest-end one) with Intel Xe graphics... so the Tiger Lake gets about 2/3rds the CPU performance and 2/3rds the GPU performance using about double the power.
 
The Steam Deck hardware is impressive in its efficiency and performance. I haven't owned one, but I had a Dell with a Ryzen 3450U (which is very close to the APU used in the Steam Deck), a 15W TDP, and games ran pretty nicely on there. I now have an Asus with an Intel Tiger Lake (the i3-1115G4, the lowest-end one) with Intel Xe graphics... so the Tiger Lake gets about 2/3rds the CPU performance and 2/3rds the GPU performance using about double the power.
I have a 5700X3D and a 6700 XT, and I find my Steam Deck just a way more enjoyable user experience. It's a really interesting platform and I'm excited to see where it's going.
 