The Last of Us Part I GPU Benchmark

I have mixed feelings about the people who own RTX 3060 Ti, 3070 and 3070 Ti cards. I feel sorry that they spent so much money only to have this happen, because I know that I would be livid in their shoes. I know a lot of them bought the card because, at the time, it may have been all that they could get. For the ones who actually had a choice and chose to pay more because they only want nVidia, it's their own stupid fault. The Radeon cards in that market segment (RX 6700, 6700 XT, 6750 XT and 6800) are all objectively better buys and have been for a long time now. I only hope that they've learnt their lesson and will look at the amount of VRAM with a much more critical eye from now on.
I bought the missus a 3060 Ti for £300 off eBay; it had been ripped out of some eGPU enclosure, hadn't been used much and wasn't even 6 months old. The 8GB of VRAM sucks, but I did get the card cheaper than any competing product at the time.

Her previous card was a 970, so it was a massive upgrade for her. Hogwarts Legacy runs absolutely fine to be honest; she hasn't complained once.

Definitely do not feel bad for anyone who bought a GPU in the last 3 years, they have money ;)
 
Since the 23.3.2 driver, the 7900 XTX hasn't been as good as it was on 23.2.2. Random crashes unless I set the GPU frequency and voltage manually, even in undemanding games.
 
The game is a mess. It's buggy, the video memory allocation is absurd, and CPU usage and asset streaming are out of control. It looks like a rushed port that had virtually no consideration for PC.

On 8GB cards the game walls off something like 1.6GB of VRAM for no apparent reason; when idle at the desktop you might only really need about 500MB.

It's not like the PS5 itself has an enormous amount of video memory either. It's 16GB + 512MB, and we know at least 2GB of the main pool is reserved for the UI/OS. You can't use all of the remaining 14GB just for video; of course you have considerations like sound, geometry, game engine logic, etc.
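
Just to put that into rough numbers, here's a quick back-of-the-envelope sketch in Python. It uses the figures above; the CPU/GPU split is only an illustrative guess on my part, not an official figure:

```python
# Back-of-the-envelope PS5 memory budget, using the numbers from the post
# above (16GB unified GDDR6 + 512MB DDR4, ~2GB of the main pool for the OS/UI).
# The CPU/GPU split below is an illustrative assumption, not an official figure.
unified_gb = 16.0        # unified GDDR6 pool shared by CPU-side and GPU-side data
os_reserved_gb = 2.0     # approximate OS/UI reservation

game_budget_gb = unified_gb - os_reserved_gb   # ~14GB left for the whole game

cpu_side_gb = 4.0        # audio, geometry, engine logic, etc. (assumed split)
gpu_side_gb = game_budget_gb - cpu_side_gb     # what's plausibly left for "VRAM" duty

print(f"Game-usable pool: {game_budget_gb:.0f} GB")
print(f"Plausibly GPU-facing: ~{gpu_side_gb:.0f} GB")
```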

On PC this thing busts through 14GB of usage here, and while it's reasonably pretty, let's face it: it's not Crysis 2023!
I have a 7" display that monitors my system stats... On boot-up it uses 0.2 GB; with Chrome open it's 0.3 GB with 5 tabs, or 0.4 GB if one or more of those tabs is YouTube, but that's because I have RTX Super Resolution enabled.
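
If you don't have a stats display, you can spot-check the same idle numbers with something like the sketch below. It assumes an Nvidia card with nvidia-smi on the PATH and only looks at GPU 0; the sample count and interval are arbitrary:

```python
# Rough sketch: sample VRAM usage a few times via nvidia-smi (first GPU only).
# Assumes an Nvidia card with nvidia-smi available on the PATH.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

def vram_usage_mib():
    """Return (used, total) VRAM in MiB for the first GPU."""
    line = subprocess.check_output(QUERY, text=True).strip().splitlines()[0]
    used, total = (int(v) for v in line.split(","))
    return used, total

if __name__ == "__main__":
    for _ in range(5):                  # five samples, five seconds apart
        used, total = vram_usage_mib()
        print(f"VRAM used: {used} MiB / {total} MiB (~{used / 1024:.1f} GB)")
        time.sleep(5)
```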
 
So you need a 4080+ to run this maxed out at 4K/60. This overrated, linear, small-map, 10-year-old console game takes a 4090 to get 70fps at 4K. A 40freaking90... the fastest card on the market?
Talk about spitting in the face of PC gamers. People need to stop buying this crap when they hear about the performance. Man, this pisses me off...
 
The PS3 sure was a monster; it ran this game fine years ago!
To be fair, the Cell was an absolute beast of a CPU on paper; as I noted *many* times over the years, the Cell was at maximum throughput about twice as powerful as the XB1/PS4 CPUs. The problem was using the Cell in a way that let you get that performance.

Regardless: consoles are just PCs now. They use the same high-level APIs you see on PCs, on hardware that is basically identical to PCs. You aren't going to see the low-level optimizations you used to have, because that only makes ports more expensive. Which (paradoxically) makes ports worse: because it's easy to get the port completed, no one bothers to invest in things like "testing across multiple configurations" and "performance optimizations".
 
This is even worse than Hogwarts Legacy because the problems even exist at 1080p ultra settings.

It looks like 10GB is the minimum needed to avoid stuttering altogether so the RX 6700 should be fine as well, even though it wasn't in the list. I don't blame Steve for that, the RX 6700 is relatively rare and wasn't well-publicised by AMD (for some odd reason).

A card's performance and features are useless if the card can't actually employ them, regardless of the colour of the box they come in.

Meanwhile in the real world, the 3060ti is within 4 fps of the 6700xt at 1440p High. LOL
 
I went with my 3070 Ti because when I bought it I was still using a 1080p TV, and Nvidia features like DLDSR worked better than AMD's versions (which I actually never got to work when I tried their cards). Now I have a 4K HDR TV and, once again, my 3070 Ti works better with it than the 6800 I borrowed from a friend to try out (it seems they treat HDR differently).

BUT, I don't agree with Nvidia's VRAM amounts or prices at all and do consider them rip-offs. In my case, though, Nvidia hardware has been more stable for me over the years, and I bet it's the opposite for some AMD users who had terrible Nvidia hardware.

At this point it seems some devs who port to PC will just release janky work and rely on brute horsepower and frame generation to finish the job.
 
I'll repeat myself, but it seems to me that with most modern games you're actually a paying beta tester for the first three months after a game is launched. :/
3 months? You're being pretty generous. The first DLC is when I assume half the problems are worked out. I'm so burned by big game studios that I won't buy new anymore. For a year, 2042 wouldn't even load for me; I reinstalled it, tried a different SSD, different processors. Yet when I buy from indie developers, their games work.

Buying games new, especially at the gouging prices they sell them for outside the US, is insane. Redfall at $89.99? Give me a break... Call of Duty: Black Ops Cold War at $79.99! Diablo 4 at $89.99!
 
The game ran fine on my system with a 6900 XT and its 16 GB of VRAM. I played it with everything on Ultra at 4K with FSR Quality, always capped at 60 fps. I realized how important VRAM is when I had texture loading issues in HZD on my 6600 XT and someone on a PS5 didn't have them.

Now I see I was right. Always have enough memory. And Nvidia is doing this on purpose, so it's your decision whether you want to support that with your purchase or buy something else.
 
A Plague Tale requires 6.5GB of VRAM at 4K Ultra. TLOU requires 9.5GB at 720p Low and 13GB at 720p Ultra.

'Nuff said, I think...
 
I only buy new if a group of friends and I want to play something for our co-op game nights; otherwise I dig into the huge library of games I haven't played, or I just wait for sales.
 
I think this is a greatly unfair assumption. During the crypto boom it was practically impossible to buy anything unless it was greatly inflated. I managed to buy a 3060 Ti in 2021 for £540, but an RX 6700 XT was selling at the time for around £750 on average, with the 3060 Ti at around £650. If I were just gaming normally, the RX 6700 XT or the 6800 would have been my main choice, but I wanted to get into PCVR, and for wireless streaming to the Quest 2 the Nvidia cards have a better video encoder that's quicker and higher quality. All I wanted to do since the start of the pandemic was play Half-Life: Alyx; I'd been waiting for a Half-Life game since 2007!
I knew 8GB of VRAM would eventually not be enough, the same as with my 7870 XT LE 2GB from 2013 (not the standard model but a cut-down 7950 that performed the same at 1080p). It was a good card up until cross-gen games stopped being made around 2015/2016, and it's history repeating itself again, but my options were utter **** because of the pandemic and the crypto boom.
From my post (that you quoted):
"I know a lot of them bought the card because at the time, it may have been all that they could get."

So yeah, I am aware that this was the case.
 
Oh don't get me wrong, I don't believe that nVidia's drivers are problematic and I never have. I just haven't really had problems with ATi drivers either, well, except for a little overscan that sometimes happened with my R9 Fury but I just had to toggle GPU scaling to fix it. It wasn't exactly something that bothered me too much.
 
AMD drivers are also no problem. Whenever you hear that nonsense, it's from people who had an issue with an AMD card 10+ years ago, and that's where their conclusion that AMD is bad comes from. The AMD drivers are stable, and if anything Nvidia makes themselves look bad in this case, as they had issues with crashes in this game.

Everything else is propaganda by nvidia.
 
Also, people need to realize, like Steve already said in his tweets, that history repeats itself. Now we have the PS5 with 16 GB of memory, and the last few titles clearly had the PS5's VRAM as the baseline. Too bad Nvidia ripped its consumers off with that planned obsolescence, but if people still buy it, it's on them.
 
Yeah, it does feel like having 8GB of VRAM now is like buying a 2GB VRAM card in 2013-2014 when the PS4 first launched; by 2016 most titles required vastly more VRAM.
 
"this is unusual because gamers are normally a really positive bunch" made me LMAO!

Anyways, I'm sure they can optimize it with patches so 8GB can handle Ultra. I'm still waiting for Unreal Engine 5 games, which will hopefully be super efficient... and what about that DirectStorage NVMe optimization that lets the GPU load textures faster? Hopefully that'll also eliminate the need for tons of VRAM.

The company already said the new Unreal Engine will demand a lot of VRAM, something north of 12GB.
 