Why Are Modern PC Games Using So Much VRAM?

The jab at Hogwarts Legacy was low; it's an amazing-looking game (IMHO).

Why not point that vitriol at something like The Last of Us, which can trace itself back to the PS3?

And although it's not a bad game at all, why not add something like Forspoken to tests like these? Not only is it the only title with DirectStorage enabled (the new consoles' trump card), it uses a texture streaming technique that very visibly affects the textures depending on GPU VRAM amount; on a 4080 it'll go as high as 14 GB in usage.
 
I would argue that while the visual design of Hogwarts Legacy is excellent, the actual rendering fidelity leaves a lot to be desired for the level of demand it places on the hardware. At least with The Last of Us remake, one can visibly see the benefits of using maximum settings, whereas the former arguably doesn't offer the same.

As for Forspoken, I thought about purchasing it to include in the testing, but having already spent many hours on the other games, I felt I had enough information for the article. All open-world games use asset streaming, and Forspoken isn't doing anything different compared to the others; DirectStorage doesn't help reduce VRAM demand, it just improves loading performance.
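To make the texture-streaming point concrete, here is a minimal, purely illustrative C++ sketch of the general mechanism; the struct, names, and budget figures are invented for this example and aren't taken from Forspoken or any real engine. The idea is simply that when the resident texture set exceeds the VRAM budget, the streamer drops mip levels, which is the visible quality loss smaller cards show.

```cpp
#include <cstdint>
#include <iostream>
#include <vector>

// Hypothetical texture record: fullMipSizeBytes is the footprint of the
// full-quality mip chain; each dropped mip roughly quarters that size.
struct StreamedTexture {
    const char* name;
    uint64_t fullMipSizeBytes;
    int droppedMips = 0;   // 0 = full quality
};

// Keep degrading the most expensive texture until the resident set
// fits inside the VRAM budget the engine has been given.
void fitToBudget(std::vector<StreamedTexture>& textures, uint64_t budgetBytes) {
    auto residentSize = [](const StreamedTexture& t) {
        uint64_t size = t.fullMipSizeBytes;
        for (int i = 0; i < t.droppedMips; ++i) size /= 4;   // quarter per dropped mip
        return size;
    };

    for (;;) {
        uint64_t total = 0;
        for (const auto& t : textures) total += residentSize(t);
        if (total <= budgetBytes) break;

        StreamedTexture* biggest = &textures[0];
        for (auto& t : textures)
            if (residentSize(t) > residentSize(*biggest)) biggest = &t;
        ++biggest->droppedMips;   // trade texture quality for memory
    }
}

int main() {
    std::vector<StreamedTexture> scene = {
        {"terrain_albedo", 512ull << 20},   // 512 MB
        {"hero_character", 256ull << 20},   // 256 MB
        {"skybox",         128ull << 20},   // 128 MB
    };
    // A card with a small budget ends up with visibly lower-resolution mips.
    fitToBudget(scene, 600ull << 20);       // ~600 MB budget for these assets
    for (const auto& t : scene)
        std::cout << t.name << ": dropped " << t.droppedMips << " mip(s)\n";
}
```

Real streamers do this per material and asynchronously, but the budget-driven degradation is the same basic idea.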
 
One more thing that's being overlooked: the shared RAM pool on consoles is efficient and easier to write for.

On a console, when you load an asset into RAM, it makes one jump (SSD -> RAM) and is immediately available to both the CPU and GPU.

On a PC, with separate memory pools, the asset has to be loaded into main RAM, then into GPU RAM. That's another pipe with bandwidth that must be managed, and as a result the game engine has to know what needs to go to the GPU, and when.

DirectStorage may eventually help with this, but in the meantime it means increased overall RAM usage and inefficient, poorly optimized games trying to overuse VRAM.
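To illustrate the extra hop, here is a rough, self-contained C++ model of the two paths; it isn't real DirectX/Vulkan code, and the pool names and sizes are invented. It just counts the copies and bytes each path moves.

```cpp
#include <cstdint>
#include <iostream>
#include <vector>

// Toy model of the two load paths. Real engines use upload heaps and DMA
// transfers; this only tracks how many times the asset's bytes get copied.

struct Asset { std::vector<uint8_t> data; };

// Console-style unified memory: one pool visible to both CPU and GPU,
// so the asset is usable as soon as it lands in RAM (one copy: SSD -> RAM).
uint64_t loadUnified(const Asset& onDisk, std::vector<uint8_t>& unifiedPool) {
    unifiedPool = onDisk.data;          // SSD -> shared RAM
    return onDisk.data.size();          // bytes moved once
}

// PC-style split memory: the asset is read into system RAM first, then
// copied again across PCIe into VRAM (two copies: SSD -> RAM -> VRAM).
uint64_t loadSplit(const Asset& onDisk,
                   std::vector<uint8_t>& systemRam,
                   std::vector<uint8_t>& videoRam) {
    systemRam = onDisk.data;            // SSD -> system RAM (staging copy)
    videoRam  = systemRam;              // system RAM -> VRAM over PCIe
    return 2ull * onDisk.data.size();   // same asset, twice the traffic
}

int main() {
    Asset texture{std::vector<uint8_t>(64ull << 20)};   // a 64 MB texture

    std::vector<uint8_t> unified, sysRam, vram;
    std::cout << "Unified pool, bytes moved: " << loadUnified(texture, unified) << "\n";
    std::cout << "Split pools,  bytes moved: " << loadSplit(texture, sysRam, vram) << "\n";
    // On the split path the asset also sits in RAM *and* VRAM until the engine
    // releases the staging copy, which is part of why PC ports need more total
    // memory for the same content.
}
```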
 
A reason the same games needed much larger amounts of RAM and VRAM on PC was also the lack of optimization for that particular platform. You could play AC: Black Flag on an Xbox 360 with 512 MB at 720p30 just fine, but a similarly specced PC (256 MB RAM, 256 MB VRAM) would not have been able to run those games.
 
Funny how some games excel at and focus on certain aspects. I have been tinkering with Metal Gear Solid 4 emulation recently, and it has very nice-looking character and weapon models even today, 15 years later, but the environment textures are hilariously bad: PS2-era quality in many cases.

We are just seeing this step up in VRAM usage because developers are starting to dedicate more time to the latest consoles.
 
"Taking the worst VRAM offender, The Last of Us"

^ A game I played (the remastered version at 1800p/60 fps) years ago on a PS4 Pro, and it looks pretty much identical to the new PC release from what I have seen. Funny how that works...
 
Personally, I would keep buying new GPUs for as long as graphics kept evolving and looking almost real. But entire genres, including some of the best-selling games, have potato graphics.
Games take more VRAM because someone "gifted" us PS4 game ports.
 
"Taking the worst VRAM offender, The Last of Us"

^ A game I played (the remastered version at 1800p/60 fps) years ago on a PS4 Pro, and it looks pretty much identical to the new PC release from what I have seen. Funny how that works...
It's weird, right? It's like they artificially bloated it when it came to PC.

Then there are games like Doom Eternal that can run on relics yet still look amazing either way, and in some ways Rockstar's games too; GTA V is old, and it was one of the first games I installed when I upgraded my GPU.
 
PC gaming "master race" likes to flex on console peasants by buying $1500 GPUs with a lot of VRAM, so devs are making sure these GPUs are not wasted.
 
There are several 290Xs with 8 GB.
There were indeed, but not until 2014 -- when the 290X launched in 2013, the densest GDDR5 modules in use were 2 Gbit (256 MB), which is why the first 290Xs were 4 GB despite having 16 modules. It wasn't until nearly a full year later that the likes of Sapphire and MSI used 4 Gbit (512 MB) modules.
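As a quick sanity check on that arithmetic, here is a tiny C++ snippet; the chip count and densities follow from Hawaii's 512-bit bus being fed by sixteen 32-bit GDDR5 chips, while the variable names are just for illustration.

```cpp
#include <iostream>

int main() {
    // Capacity is simply chip count x chip density (Gbit), divided by 8 for GB.
    const int chips = 16;                   // sixteen 32-bit chips on a 512-bit bus
    const double launchDensityGbit = 2.0;   // 2 Gbit parts in 2013
    const double laterDensityGbit  = 4.0;   // 4 Gbit parts roughly a year later

    std::cout << "Launch 290X: " << chips * launchDensityGbit / 8 << " GB\n";  // 4 GB
    std::cout << "Later 290X:  " << chips * laterDensityGbit  / 8 << " GB\n";  // 8 GB
}
```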
 
Great article.

I think it's reasonable to expect to be able to play games in 4K with the detail fairly cranked up on a brand-new $500 graphics card, but apparently nSidious thinks not.

I bought a cheap 3060 for the lounge PC for half this price, and even that fairly lowly older-generation card has 12 GB. 8 GB is just a weird decision.
 
4 GB of VRAM was also sufficient for most of the PS4/Xbox One generation, though you did have to reduce texture and/or shadow quality a notch in games like Resident Evil 7 and Rise of the Tomb Raider to avoid stutter. Horizon Zero Dawn, however, had really ugly textures on 4 GB cards, or would straight up not load the higher-detail assets for character models.
 
A lot of high-tech 3D graphics, yet the games are devoid of any innovation. It's just re-skins of old games, or sequels to titles that sold well.

And the "new" games with 'amazing' graphics somehow feel... empty. The gameplay is, unfortunately, disappointing.
 
It would be nice to see devs actually try to make these games look great and run well. I can appreciate that some VRAM buffers will be a limitation, like 8 GB at 4K, but no game should stutter and crash because of a texture setting, and this article makes it seem obvious that it can be done.
 
FYI, the 4090 by PNY is selling at $1,549.99 on Amazon:
PNY GeForce RTX® 4090 24GB XLR8 Gaming VERTO EPIC-X RGB™ Triple Fan Graphics Card https://a.co/d/0uTa1qB

The 4080 is $1,139.99:
PNY GeForce RTX™ 4080 16GB XLR8 Gaming VERTO EPIC-X RGB™ Overclocked Triple Fan Graphics Card DLSS 3 https://a.co/d/g2EUB48

AMD at Microcenter:
7900 XTX by ASRock: $999.99, SAVE $40.00 -> $959.99
PowerColor AMD Radeon RX 7900 XT Triple Fan 20GB GDDR6 PCIe 4.0 Graphics Card: $849.99, SAVE $80.00 -> $769.99

Prices are definitely falling.
 
"The developers had similar memory restrictions for the PC versions of the games they worked on, but a common design choice back then was to allow graphics settings to exceed the capabilities of graphics cards available at launch. The idea was that users would return to the game when they had upgraded to a newer model, to see the benefits that progress in GPU technology had provided them."

This is how I remember PC gaming. There were settings you could never turn on without the absolute top-of-the-range GPU of that generation, and part of the fun of getting a new GPU was going back to old games and cranking up the settings.

I was away for a while playing Xbox One etc. (after PC gaming from 1995 to 2012, with a bit of Xbox 360 thrown in for good measure) until 2020, when I upgraded my Nehalem i7 to an AMD 3600 with a 1660 Super (then a jump to a 3060 Ti), and I was very surprised that I could turn every game up to ultra on the 3060 Ti at 1440p and still get beyond 80 fps (until ray tracing). Looking back, that was the end of the previous console generation, so games weren't doing anything on console that couldn't be done on a PC. Now it's a new console generation with more shared memory and NVMe storage, still benefiting from less OS overhead.

So the max settings on some new games can only be played with the very top-range GPUs again. There are arguments about pricing, gouging, and planned obsolescence. All valid. But it's quite interesting that we are back where we were 10 years ago, with only the top GPUs able to turn on all the eye candy with acceptable performance.

If I had a long beard I would be stroking it right now, puffing on a long, winding tobacco pipe while rocking in my porch chair, muttering "interesting indeed" to myself over and over and over...
 
"The developers had similar memory restrictions for the PC versions of the games they worked on, but a common design choice back then was to allow graphics settings to exceed the capabilities of graphics cards available at launch. The idea was that users would return to the game when they had upgraded to a newer model, to see the benefits that progress in GPU technology had provided them."

This is how I remember PC gaming. There were settings you could never turn on without the absolute top of the range GPU of that generation and part of the fun of getting a new GPU was going back to old games and cranking up the settings.

I was away for a while playing Xbox One etc (after PC gaming from 1995 to 2012 with a bit of xbox 360 thrown in for good measure) until 2020 when I upgraded my i7 nehalem to an AMD 3600 with a 1660 super (then a jump to a 3060ti) and I was very surprised that I could turn every game up to ultra on the 3060ti @ 1440p and still get fps beyond 80+ (until ray tracing). Looking back, that was the end of the previous console generation so games weren't doing anything on console that couldn't be done on a PC and now its a new console generation with more shared memory and nvme storage, still benefitting from less OS overhead.

So the max settings on some new games can only be played with the very top range gpus again. There are arguments of pricing, gouging, planned obscelescence. All valid. But its quite interesting that we are where we were 10 years ago, with only the top gpus able to turn on all the eye candy with acceptable performance.

If I had a long beard I would be stroking it right now, supping on a long winding tobacco pipe while rocking in my porch chair, muttering "interesting indeed" to myself over and over and over..
I agree, and you make a good point. I think the issue is the price of these new GPUs: the top end was $600 and now it's $1,600, so there must be a lot of folks making 166.6% more money, I guess.
 