The Last of Us Part I GPU Benchmark

Interesting that the 3080 Ti was almost equal to the 4070 Ti at 4K ultra, then significantly faster at 4K high and slightly faster at 4K medium. Looks like the higher memory bandwidth of the 3080 Ti (and likely the 3080 12GB) might be a benefit in VRAM-constrained cases, up to a point. Granted, it's only one game, but it will be interesting to monitor how these cards age relative to each other as VRAM needs increase over time.
 
The PS3 sure was a monster; it ran this game fine years ago!

Sarcasm aside, any PC gamer who waited this long to play this dull game should wait a bit longer for it to hit a sale. Quit giving money to these studios every time they drop garbage on your desk.
 
OK, this mostly makes sense. However, the 7900 XTX, 3090 Ti and other cards with more VRAM fall behind the 4080 with 16GB. Why is that? And other AMD cards with 16GB don't perform as well as Nvidia cards with 12GB, so this can't all be just VRAM.
 
The PS3 sure was a monster; it ran this game fine years ago!

Sarcasm aside, any PC gamer who waited this long to play this dull game should wait a bit longer for it to hit a sale. Quit giving money to these studios every time they drop garbage on your desk.

Pretty sure the PS3 struggles to maintain 30 fps throughout this game. BUT it's still a great experience. I'm currently playing through again on a 77” OLED sitting like 6 ft away and can't believe what Naughty Dog was able to pull out of the PS3's goofy architecture (not to mention the Uncharted games as well). The OLED's color depth is breathing new life into a slow game with lots to look at.

Dull? Hmm, it's definitely a game with lots of different pacing. But I like it: sometimes you're slowly finding a way out, searching every nook and cranny of an old building for supplies, traversing beautiful terrain, stealthily killing faction members, stealthily slipping by clickers or running for your life. It's fun and feels like you're playing through a book or movie with its storytelling.
 
Today we're taking a look at the game's GPU performance, though the controversy centers around its poor performance, with many gamers taking the time to express their disappointment in the form of a negative Steam review (over 10k negative reviews as of writing). This is unusual, because gamers are normally a really positive bunch.

"this is unusual because gamers are normally a really positive bunch" made me LMAO!

Anyways, I'm sure they can optimize it with patches so 8GB can handle Ultra. I'm still waiting for Unreal Engine 5 games, which will hopefully be super efficient... and what about that DirectStorage NVMe optimization that allows the GPU to load textures faster? Hopefully that'll also eliminate the need for tons of VRAM.
 
The game is a mess. It's buggy, video memory allocation is absurd, and CPU usage and asset streaming are out of control. Looks like a rushed port that had virtually no consideration for PC.

On 8GB cards the game walls off something like 1.6GB of VRAM for no apparent reason; idle at the desktop, you might only really need about 500MB.
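For anyone who wants to sanity-check the idle number on their own machine, here's a minimal sketch using the nvidia-ml-py (pynvml) bindings. It only covers NVIDIA cards, and the ~1.6GB / ~500MB figures above are estimates from this post, not something the script verifies:

```python
# Minimal check of how much VRAM is actually in use before launching the game.
# Requires the nvidia-ml-py package (pip install nvidia-ml-py); NVIDIA GPUs only.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # total/used/free, in bytes
    to_gib = lambda b: b / 1024**3
    print(f"total: {to_gib(mem.total):.2f} GiB")
    print(f"used : {to_gib(mem.used):.2f} GiB")     # ~0.5 GiB idle, per the figure above
    print(f"free : {to_gib(mem.free):.2f} GiB")
finally:
    pynvml.nvmlShutdown()
```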

It's not like the PS5 itself has an enormous amount of video memory either: 16GB plus 512MB, and we know at least 2GB of the main pool is reserved for the UI/OS. You can't use all of the remaining 14GB just for video either; of course you have things like sound, geometry, game engine logic, etc. to consider.

On PC this thing busts through 14GB of usage here, and while it's reasonably pretty, let's face it, it's not Crysis 2023!
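To put rough numbers on that budget argument, here's a back-of-the-envelope sketch. The 2GB OS reservation and the ~14GB PC usage come from the post above, and the 3GB set aside for non-video data is purely a hypothetical placeholder:

```python
# Rough back-of-the-envelope numbers (GiB) based on the figures quoted above;
# none of these are official platform specs.
ps5_unified = 16.0      # unified GDDR6 pool shared by CPU and GPU
ps5_os_reserved = 2.0   # "at least 2GB" kept for the OS/UI
other_budgets = 3.0     # hypothetical: sound, geometry, engine logic, CPU-side data

ps5_video_budget = ps5_unified - ps5_os_reserved - other_budgets
pc_port_usage = 14.0    # what the PC port reportedly burns through at max settings

print(f"PS5 video-ish budget: ~{ps5_video_budget:.0f} GiB")  # ~11 GiB
print(f"PC port VRAM usage  : ~{pc_port_usage:.0f} GiB")     # more than the console can even offer
```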
 
This is even worse than Hogwarts Legacy because the problems exist even at 1080p ultra settings.

It looks like 10GB is the minimum needed to avoid stuttering altogether, so the RX 6700 should be fine as well, even though it wasn't in the list. I don't blame Steve for that; the RX 6700 is relatively rare and wasn't well-publicised by AMD (for some odd reason).

I have mixed feelings for the people who own RTX 3060 Ti, 3070 and 3070 Ti cards. I feel sorry for them that they spent so much money only to have this happen, because I know that I would be livid in their shoes. I know a lot of them bought the card because, at the time, it may have been all that they could get. For the ones who actually had a choice and chose to pay more because they only want Nvidia, it's their own stupid fault. The Radeon cards in that market segment (RX 6700, 6700 XT, 6750 XT and 6800) are all objectively better buys and have been for a long time now. I only hope that they've learnt their lesson and will look at the amount of VRAM with a much more critical eye from now on.

A card's performance and features are useless if the card can't actually employ them, regardless of the colour of the box they come in.
 
I have mixed feelings for the people who own RTX 3060 Ti, 3070 and 3070 Ti cards. I feel sorry for them that they spent so much money only to have this happen, because I know that I would be livid in their shoes. I know a lot of them bought the card because, at the time, it may have been all that they could get. For the ones who actually had a choice and chose to pay more because they only want Nvidia, it's their own stupid fault. The Radeon cards in that market segment (RX 6700, 6700 XT, 6750 XT and 6800) are all objectively better buys and have been for a long time now. I only hope that they've learnt their lesson and will look at the amount of VRAM with a much more critical eye from now on.
Ha ha ha, I'm using a 3070 Ti here, but don't feel bad for me 'cause I have no interest in this game xD
 
Pretty sure the PS3 struggles to maintain 30 fps throughout this game. BUT it's still a great experience. I'm currently playing through again on a 77” OLED sitting like 6 ft away and can't believe what Naughty Dog was able to pull out of the PS3's goofy architecture (not to mention the Uncharted games as well). The OLED's color depth is breathing new life into a slow game with lots to look at.

Dull? Hmm, it's definitely a game with lots of different pacing. But I like it: sometimes you're slowly finding a way out, searching every nook and cranny of an old building for supplies, traversing beautiful terrain, stealthily killing faction members, stealthily slipping by clickers or running for your life. It's fun and feels like you're playing through a book or movie with its storytelling.
I never beat the game, and I'm an Uncharted uberfan. I tried to play TLOU a few times but something about it was always off to me; then when the TV show was made it hit me: TLOU really wanted to be a movie. IMO, for the bit I played, it looked great but was boring. It's a game that gets so much praise for its story, but no one ever talks about the gameplay... 'cause it was meh.

One of the things that really stood out to me was that although the world looked so real and things seemed so gritty, Ellie was a busted character, not story- or personality-wise, but gameplay-wise. I remember a point where you have to sneak through a place and Joel has to be dead quiet or one of them clicky dudes will get him. But that didn't apply to Ellie; she was just talking and walking around, making noises and stuff. I was like, wtf is this?!

If I want to play a zombie game in a rough, broken world I'll play Days Gone (which is underrated), which also happens to run like butter on PC. But I will give it to Naughty Dog for how they know how to create beautiful worlds. Seems strange they didn't handle this port though, considering how much Sony trots this IP out to get some cash. But maybe that's the point: they knew it would make bank even if it was broken on release. What are gamers gonna do, not buy it? It's the, what, like 4th release so far? Of course gamers will buy it. And get even more broken ports in the future.
 
Seems strange they didn't handle this port
Stranger still that Nixxes wasn't used for the port. I can only assume that the team just didn't have the capacity at the time, hence they went with Iron Galaxy instead - a company experienced at handling ports, though not one with the best track record of making good ones.
 
"this is unusual because gamers are normally a really positive bunch" made me LMAO!

Anyways, I'm sure they can optimize it with patches so 8GB can handle Ultra. I'm still waiting for Unreal Engine 5 games, which will hopefully be super efficient... and what about that DirectStorage NVMe optimization that allows the GPU to load textures faster? Hopefully that'll also eliminate the need for tons of VRAM.

Efficient, but not free :) and definitely not in terms of memory, so don't expect UE5 games to run buttery smooth with a low-memory configuration. The new tech is enabled by advances in technology, and that includes faster and more memory.
 
Anyways, I'm sure they can optimize it with patches so 8GB can handle Ultra. I'm still waiting for Unreal Engine 5 games, which will hopefully be super efficient... and what about that DirectStorage NVMe optimization that allows the GPU to load textures faster? Hopefully that'll also eliminate the need for tons of VRAM.
DirectStorage could help, but it could also make devs lazy, as it allows them to fill the VRAM almost instantly, reducing load times with less optimization, yay! At least this is what it felt like with the first game I got to experience the feature in; granted, that was just a beta, so let's hope D4 can get it together by launch.
 
In 2020, if you went into the Call of Duty: Warzone settings and had 6GB of VRAM, you would see it wasn't enough. I thought those settings weren't anything special for my RTX 2060... Kinda raised some red flags for me. I sure didn't think spending $100 more for an 8GB 3070 sounded like a great idea over a 12GB 6750 XT in 2022. Glad I didn't drink the Kool-Aid; I definitely had a feeling it would end up tasting like "Huang" in the long run.
 
I'm glad. I don't want anyone to suffer; I just hope that this opens their previously closed eyes.
Honestly, I did not think much about VRAM when I bought the 3070 Ti, although I did want to buy the 6800 XT instead. Now I'm happy with what I have, and I don't see myself upgrading my graphics card in the next 2 to 3 years. My personal point of view is that this type of toy should last 3 to 5 years before changing; otherwise I feel like I'm throwing too much money at a hobby xD
 
What a freaking joke.

4K at 30 fps on a 6800 XT? Uhh... my 6800 XT runs new games at 60 fps at 4K.
Wow, these guys really f%$*ed the PC fans.

I hate Naughty Dog as it is, but this was the nail in the coffin for me.

 
Honestly, I did not think much about VRAM when I bought the 3070 Ti, although I did want to buy the 6800 XT instead. Now I'm happy with what I have, and I don't see myself upgrading my graphics card in the next 2 to 3 years. My personal point of view is that this type of toy should last 3 to 5 years before changing; otherwise I feel like I'm throwing too much money at a hobby xD
Well, to be honest, I think that most of us paid way too much for this hobby as it is over the past two years, regardless of which card we bought. :laughing:
 
Interesting that the 3080 Ti was almost equal to the 4070 Ti at 4K ultra, then significantly faster at 4K high and slightly faster at 4K medium. Looks like the higher memory bandwidth of the 3080 Ti (and likely the 3080 12GB) might be a benefit in VRAM-constrained cases, up to a point. Granted, it's only one game, but it will be interesting to monitor how these cards age relative to each other as VRAM needs increase over time.
Yeah, the 4070 Ti also has fewer ROPs/TMUs, and it looks like its L2 cache isn't big enough to make up for the reduced VRAM bandwidth relative to the 4080 with its much larger 64MB L2.
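For a rough sense of the bandwidth gap being described, here's a quick sketch using the publicly listed memory configurations (treat the data rates, bus widths and L2 sizes as reference figures rather than anything measured here):

```python
# Peak memory bandwidth = (effective data rate in Gbps per pin) * (bus width in bits) / 8.
# Memory configs below are the publicly listed specs for each card.
cards = {
    "RTX 3080 Ti": (19.0, 384),   # GDDR6X data rate (Gbps), bus width (bits)
    "RTX 4070 Ti": (21.0, 192),
    "RTX 4080":    (22.4, 256),
}

for name, (gbps, bus_bits) in cards.items():
    bandwidth_gbs = gbps * bus_bits / 8
    print(f"{name}: {bandwidth_gbs:.0f} GB/s")

# RTX 3080 Ti: ~912 GB/s -- nearly double the 4070 Ti's ~504 GB/s, which is the
# gap Ada's larger L2 is meant to paper over (48 MB on the 4070 Ti vs 64 MB on
# the 4080, per the commonly cited figures).
```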
 
The truly bizarre result of this benchmark is the RTX 3050 outperforming the RX 6600/6650 cards in some settings. Normally even the non-XT RX 6600 is comfortably 30% or so faster than the 3050.
 
I can in no way comprehend how a game running on a console with a rickety CPU at the level of a Celeron and a GPU at the level of an RX 580 runs so badly on PCs with much better hardware.

Unless the "translation" of the code is subpar. I could understand it if it were ported from the PS3 version, BUT NOT from the improved PS4 version.
 
This is even worse than Hogwarts Legacy because the problems exist even at 1080p ultra settings.

It looks like 10GB is the minimum needed to avoid stuttering altogether, so the RX 6700 should be fine as well, even though it wasn't in the list. I don't blame Steve for that; the RX 6700 is relatively rare and wasn't well-publicised by AMD (for some odd reason).

I have mixed feelings for the people who own RTX 3060 Ti, 3070 and 3070 Ti cards. I feel sorry for them that they spent so much money only to have this happen, because I know that I would be livid in their shoes. I know a lot of them bought the card because, at the time, it may have been all that they could get. For the ones who actually had a choice and chose to pay more because they only want Nvidia, it's their own stupid fault. The Radeon cards in that market segment (RX 6700, 6700 XT, 6750 XT and 6800) are all objectively better buys and have been for a long time now. I only hope that they've learnt their lesson and will look at the amount of VRAM with a much more critical eye from now on.

A card's performance and features are useless if the card can't actually employ them, regardless of the colour of the box they come in.
I think this is a greatly unfair assumption. During the crypto boom it was practically impossible to buy anything unless it was greatly inflated; I managed to buy a 3060 Ti in 2021 for £540, but an RX 6700 XT was selling at the time for around £750 on average, with the 3060 Ti at around £650. If I was just gaming normally, the RX 6700 XT or the 6800 would have been my main choice, but I wanted to get into PCVR, and for wireless streaming to the Quest 2 the Nvidia cards have a better video encoder that's quicker and higher quality. All I wanted to do since the start of the pandemic was play Half-Life: Alyx; I'd been waiting for a Half-Life game since 2007!
I knew 8GB of VRAM would eventually not be enough, the same as with my 7870 XT LE 2GB from 2013 (not the standard 7870 but a cut-down 7950 that performed the same at 1080p). That was a good card up until cross-gen games stopped being made around 2015/2016, and it's history repeating itself again, but my options were utter **** because of the pandemic and the crypto boom.
 
In 2020, if you went into the Call of Duty: Warzone settings and had 6GB of VRAM, you would see it wasn't enough. I thought those settings weren't anything special for my RTX 2060... Kinda raised some red flags for me. I sure didn't think spending $100 more for an 8GB 3070 sounded like a great idea over a 12GB 6750 XT in 2022. Glad I didn't drink the Kool-Aid; I definitely had a feeling it would end up tasting like "Huang" in the long run.
Depends on the resolution. CoD Warzone ran flawlessly on my RX 470 4GB at 1080p; I'm sure I was running medium textures and medium settings throughout, so it looked and played better than my PS4, with frame rates at 60-80 fps if memory serves. Warzone was definitely built around AMD architecture, though, as the game excels on AMD hardware over Nvidia.
 