Sony recommends 32GB of RAM and 100GB of storage to run The Last of Us Part 1 on PC

Daniel Sims

Staff
Something to look forward to: Another PC port recommends 32GB of RAM for gameplay above 1080p. While not all recent major PC releases suggest such high memory usage, it is becoming a concerning trend as the industry fully transitions into a new console generation, setting a new technical floor for PC performance.

Sony released details on the PC system specs and features for The Last of Us Part I this week, just in time for the smash-hit HBO TV adaptation to conclude its first season. Although based on similar graphics technology, the GPU and memory requirements are significantly heavier than Naughty Dog's last PC port, Uncharted.

The Last of Us Part I requires at least 16GB of RAM and 100GB of storage – much more than the PlayStation 5 version's 79GB storage requirement. Furthermore, the system specs chart doesn't mention an HDD. The game will most likely still boot from one, but take the omission as a sign that Sony strongly recommends an SSD.

Like other recent blockbusters, the game suggests an Nvidia GeForce GTX 970, GTX 1050 Ti, or AMD Radeon RX 470 at minimum for 30-frame-per-second gameplay at 720p. It also follows ongoing trends in recommending a GeForce RTX 2070 Super, RTX 3060, or Radeon RX 6600 XT for 1080p 60fps gameplay at high settings.

Sony's system requirement chart erroneously recommends a Radeon RX 5800 XT – a graphics card that doesn't exist. Judging by the other cards in the "recommended" tier, it probably meant the 5700 XT.

For playing at 1440p or 4K, Sony recommends 32GB of system memory and a recent high-end GPU. While unusually high, the requirement echoes recent titles like Forspoken and Sony's Returnal.

Like Sony's other PC ports, The Last of Us Part I supports DLSS, FSR 2.2, 21:9 and 32:9 ultrawide, wired DualSense features, and 3D audio. Adjustable PC graphics settings include texture quality, shadows, reflections, ambient occlusion, and more. Like the PS5 version, the game doesn't feature ray tracing.

The Last of Us Part I arrives on Steam and the Epic Games Store on March 28. The standard edition is $59.99.


 
Maybe, to be less aggressive with my SSD, they should release the regular game for those who play in FHD and a high-resolution texture pack for those looking for 4K.

8TB SSD still isn't cheap. :p
 
Unless they rebuilt the game for PC, there is absolutely no reason this game should be this demanding at all.
A 7900 XT or 4080 for 4K/60... gtfo.
That is absolutely ridiculous. I played the remake on PS5 at 4K and it wasn't anything exciting to look at. What a joke.
 
Unless they rebuilt the game for PC, there is absolutely no reason this game should be this demanding at all.
A 7900 XT or 4080 for 4K/60... gtfo.
That is absolutely ridiculous. I played the remake on PS5 at 4K and it wasn't anything exciting to look at. What a joke.

that's a 7900 XT WITH FSR to run at 60fps@4K... I mean 60fps@1440p
 
This better look amazing with those kind of specs.

So someone explain this recent RAM thing to me. I've been building computers for over 30 years, and for a very long time you could NEVER get enough RAM. Then I remember upgrading to 16 gigs back in 2012 thinking, "alright, I'll probably never need all this, but it's cool and I want it." Now that's just on my desktop; I have servers that will suck down as much RAM as I can give them. The problem is, I don't see what it is these programs are DOING with this much RAM. Functionally, what is it for? What are these programs doing that they need this much RAM? Is it truly lazy development or is there some innovation that requires more resources?
 
This is becoming a trend. It seems most new games released are resource hungry and yet don't perform well, i.e., low average FPS. Requiring an RX 7900 XT or RTX 4080 to get to 4K@60FPS shows how poorly optimized the game is when it runs at 4K@30FPS on a PS4 Pro. The GPU in the PS4 Pro is nowhere near what an RX 7900 XT can deliver. Granted, it does not run at native resolution, due to checkerboard rendering and dynamic resolution, but it still ran well for the very aged GPU and terrible AMD Jaguar SoC.
 
This is becoming a trend. It seems most new games released are resource hungry and yet don't perform well, i.e., low average FPS. Requiring an RX 7900 XT or RTX 4080 to get to 4K@60FPS shows how poorly optimized the game is when it runs at 4K@30FPS on a PS4 Pro. The GPU in the PS4 Pro is nowhere near what an RX 7900 XT can deliver. Granted, it does not run at native resolution, due to checkerboard rendering and dynamic resolution, but it still ran well for the very aged GPU and terrible AMD Jaguar SoC.
This isn't "The Last of Us Remastered" though; this is "The Last of Us: Part I," which was completely remade in a newer game engine for the PS5 only. So the graphics card and CPU to actually compare it to would be Zen 2 cores and a 5700 XT, but being a console, it's more optimised and not plagued with DRM slowing everything down.
 
Maybe, to be less aggressive with my SSD, they should release the regular game for those who play in FHD and a high-resolution texture pack for those looking for 4K.

8TB SSD still isn't cheap. :p

Couldn't agree more. What should we do when game producers suddenly decide to go for 8K for the lucky few? Users should be given the option to download what they need, when they need it.
 
We need to wait and see just how well (or poorly) this is ported to PC. I am hopeful that, given it is a PC-specific release, it's going to be very well done (rather than something like Hogwarts Legacy, which was plagued with issues). I feel like hardware requirements are much more about the quality of the developer's optimisation than the visual fidelity of the game.

In a dream world, it needs this much power because it is very well optimised and absolutely visually staggering. In a more realistic world, it's got two layers of DRM, some bad optimisation, and a mediocre ray tracing implementation.
 
Everyone is going crazy about 32GB of RAM, but the simple answer is that devs don't know what SSD you might have, so to help with asset streaming, chances are the game will just load an entire level into system memory and swap the files into VRAM as and when it needs them. It's lazy, yes, but from their perspective it guarantees a level of performance similar to the PS5.
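
To put rough numbers on that trade-off, here's a minimal, purely illustrative Python sketch. The asset sizes and transfer speeds below are assumptions picked for the example, not measured figures from this game:

```python
# Purely illustrative sketch of the trade-off described above: preload a
# whole level into system RAM, then swap assets into VRAM on demand, instead
# of streaming each asset straight from disk. All sizes and speeds here are
# assumptions for the sake of the example, not measured figures.

LEVEL_ASSETS_GB = 14.0   # hypothetical total asset size for one level
RAM_COPY_GBPS = 20.0     # assumed effective RAM -> VRAM copy speed
NVME_SSD_GBPS = 5.0      # assumed fast NVMe read speed
SATA_SSD_GBPS = 0.5      # assumed slow SATA SSD read speed

def swap_in_ms(asset_gb: float, source_gbps: float) -> float:
    """Milliseconds to bring one asset into VRAM from a given source."""
    return asset_gb / source_gbps * 1000.0

# Swapping in a 0.5 GB texture set mid-gameplay:
asset = 0.5
print(f"from system RAM: {swap_in_ms(asset, RAM_COPY_GBPS):7.1f} ms")
print(f"from NVMe SSD:   {swap_in_ms(asset, NVME_SSD_GBPS):7.1f} ms")
print(f"from SATA SSD:   {swap_in_ms(asset, SATA_SSD_GBPS):7.1f} ms")

# Preloading makes the worst case predictable no matter what storage the
# player has, at the cost of keeping LEVEL_ASSETS_GB resident in system RAM.
# That is one plausible source of 32GB recommendations.
```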
 
Far Cry 2, a great-looking sandbox FPS, weighed in at less than 4 gigs for such a massive game. What is really in your 100 gigs?
FC2 uses a fair degree of procedural data generation, which is why it's big but also mostly empty and lacking significant detail. Textures and shadow maps were low resolution, a lot of the lighting was pre-baked, and mesh structures were pretty basic.

[Image: Far Cry 2 screenshot]

Only town/villages areas had any noticeable degree of fine detail to them, but it was all CTRL+C, CTRL+V across the world map. The developers wanted the game to run on the PS3 and Xbox 360, but not be a total hardware hog on the PC, unlike Crysis (which came out the year before).

Compared to something like RDR2, where the devs threw everything at the PC version, it's no wonder FC2 is only 4GB in size.
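
For what it's worth, here's a tiny, hypothetical Python sketch of why seeded procedural placement costs almost nothing on disk. None of this is FC2's actual tooling, just the general idea:

```python
import random

# Hypothetical illustration of seeded procedural placement, in the spirit of
# the FC2 point above (not Ubisoft's actual tooling). An entire region's tree
# layout is regenerated from one integer seed, so almost nothing ships on disk.

def scatter_trees(region_seed: int, count: int = 1000) -> list[tuple[float, float]]:
    """Deterministically place `count` trees in a 1 km x 1 km region."""
    rng = random.Random(region_seed)
    return [(rng.uniform(0.0, 1000.0), rng.uniform(0.0, 1000.0)) for _ in range(count)]

# Two loads of the same region reproduce the identical world...
assert scatter_trees(42) == scatter_trees(42)

# ...yet the on-disk cost is a few bytes of seed, not thousands of stored
# coordinates. Unique hand-authored detail (the RDR2 approach) can't be
# regenerated like this, which is one reason install sizes ballooned.
print(scatter_trees(42)[:3])
```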



So someone explain this recent RAM thing to me. I've been building computers for over 30 years, and for a very long time you could NEVER get enough RAM. Then I remember upgrading to 16 gigs back in 2012 thinking, "alright, I'll probably never need all this, but it's cool and I want it." Now that's just on my desktop; I have servers that will suck down as much RAM as I can give them. The problem is, I don't see what it is these programs are DOING with this much RAM. Functionally, what is it for? What are these programs doing that they need this much RAM? Is it truly lazy development or is there some innovation that requires more resources?
Can't speak for other applications, but when it comes to games, they all still follow the same process: all of the assets required for rendering have to be stored in system memory first and then copied across to VRAM. In open-world titles, where there is a lot of asset streaming taking place, the system RAM also needs to store all of the immediate and surrounding materials, which is why the memory footprint of such games has ballooned.

If the current cell of the 3D world is using, say, 8GB of assets, the surrounding cells may require an additional 2 to 3GB of extra materials. It's impossible to tell exactly what data may be required next beyond a handful of frames, so a well-optimized game will have 10GB or more of data in system RAM, and 8GB in VRAM.
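
To make that concrete, here's a small, hypothetical Python sketch of the cell-prefetch bookkeeping. The grid, cell keys, and per-cell sizes are made up to match the figures above:

```python
# Rough sketch of the cell-based prefetch idea described above. The grid,
# cell positions, and asset sizes are all hypothetical, chosen to match the
# 8GB / 10GB+ figures in the surrounding text.

# Hypothetical asset size (GB) per world cell, keyed by (x, y) grid position.
CELL_ASSETS_GB = {
    (0, 0): 8.0,                                             # current cell
    (1, 0): 1.2, (-1, 0): 0.9, (0, 1): 1.1, (0, -1): 0.8,    # neighbours
}

def ram_needed(player_cell: tuple[int, int]) -> float:
    """System RAM needed: the current cell plus every adjacent cell."""
    x, y = player_cell
    nearby = [(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
    return sum(CELL_ASSETS_GB.get(cell, 0.0) for cell in nearby)

# Only the current cell's ~8 GB sits in VRAM, but system RAM must also hold
# the surrounding cells so any of them can be swapped in within a few frames.
print(f"VRAM-resident: {CELL_ASSETS_GB[(0, 0)]:.1f} GB")
print(f"RAM-resident:  {ram_needed((0, 0)):.1f} GB")
```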
 
I'm sure the game itself is great. I enjoy watching the show. But does anyone else have a hard time playing single-player games? I feel like I'm putting myself in some sort of timeout or punishment. Like Resident Evil Village or whatever, whhhyyyyyy can't they make games like that co-op? Maybe it's adult ADD or something.
 
I'm sure the game itself is great. I enjoy watching the show. But does anyone else have a hard time playing single-player games? I feel like I'm putting myself in some sort of timeout or punishment. Like Resident Evil Village or whatever, whhhyyyyyy can't they make games like that co-op? Maybe it's adult ADD or something.

Nope, just the opposite. I'll gladly play a single-player game over any of the multiplayer crap we have nowadays.
Play at my own pace (usually), get to actually enjoy the game, and not have to deal with random internet people acting like a bunch of dickwads for whatever reasons (or no reason at all), never mind not having to deal with cheaters...

Edit: ah, co-op though... I can take it or leave it myself. I have enjoyed several Back 4 Blood run-throughs with my brother over the last year, and did enjoy L4D 1 & 2 with a small regular group; it has its moments for sure.
That said, not every game needs multiplayer, be it versus or co-op. More players doesn't always mean a better game, story, experience, or even more fun. Unless it's been a design decision from the very start of a game's production, with the focus built around that design, it usually ends up detracting from the experience.
 
I'm sure the game itself is great. I enjoy watching the show. But does anyone else have a hard time playing single-player games? I feel like I'm putting myself in some sort of timeout or punishment. Like Resident Evil Village or whatever, whhhyyyyyy can't they make games like that co-op? Maybe it's adult ADD or something.
Sometimes but it depends on the game.
This particular game is pretty linear, so yeah.
 