The Last of Us Part I GPU Benchmark

My 3080 hasn’t aged well.

DLSS is ageing incredibly well (check out HUB's latest comparison). You can drop the newest DLL into any game, and even mod it into some FSR games where AMD locks it out. The card has enabled over a dozen amazing RT experiences for me at 4K on my C2 OLED too.

I've not had to turn down textures from the maximum setting once, but admittedly I don't preorder and play garbage ports as soon as they launch.

Mine's ageing fantastically, what's wrong with yours?

A Plague Tale doesn't have the fidelity of The Last of Us. Sorry.

Yeah, TLOU looks significantly worse, doubly so at equal VRAM allocation / usage.
 
What's the point of testing a broken game? This game was released basically as a "paid beta". There is NO REASON for 9 GB of VRAM usage at medium quality in this game. This is a PlayStation game, and not even an impressive one, technically speaking. This is a clickbait article/video, riding the "8 GB isn't enough" drama so popular on YouTube today. Let them fix the game, and then test it again. You did the same with Hogwarts Legacy, and today that game is quite different after a few huge patches.
 
The PS3 sure was a monster, it ran this game fine years ago!

Sarcasm aside, any PC gamer who waited this long to play this dull game should wait a bit longer for it to hit a sale. Quit giving money to these studios every time they drop garbage on your desk.
Exactly.
This game should not be bought until the developer fixes it. There is not a single good reason those textures are so big.
 
This is even worse than Hogwarts Legacy because the problems even exist at 1080p ultra settings.

It looks like 10GB is the minimum needed to avoid stuttering altogether so the RX 6700 should be fine as well, even though it wasn't in the list. I don't blame Steve for that, the RX 6700 is relatively rare and wasn't well-publicised by AMD (for some odd reason).

I have mixed feelings about the people who own RTX 3060 Ti, 3070 and 3070 Ti cards. I feel sorry that they spent so much money only to have this happen, because I know I would be livid in their shoes. A lot of them bought the card because, at the time, it may have been all they could get. For the ones who actually had a choice and chose to pay more because they only want Nvidia, it's their own stupid fault. The Radeon cards in that market segment (RX 6700, 6700 XT, 6750 XT and 6800) are all objectively better buys and have been for a long time now. I only hope that they've learnt their lesson and will look at the amount of VRAM with a much more critical eye from now on.

A card's performance and features are useless if the card can't actually employ them, regardless of the colour of the box they come in.
Radeon cards are not an option. No one wants a Radeon, and for good reasons: there's no point in having good hardware if your software is a mess… Nvidia is at fault here, as usual: 8 GB mid-range cards were a bad choice from the beginning. But it just takes a competent developer to fix this. TLOU is not even a next-gen game. It's just a poorly coded port. Hogwarts Legacy, after a few patches, runs smoothly on 8 GB cards at 1440p.
 
Radeon cards are not an option. No one wants a Radeon, and for good reasons: there's no point in having good hardware if your software is a mess…
#404 Software mess not found.

I've just switched from a GTX 1070 to an RX 6800 XT, and the Radeon software suite is simply better than the bloatware Nvidia has.
The reason I switched to Radeon is simply that it's better value at the moment.
 
I have a GTX 1050 Ti and I'm playing at 1440p on low settings. It doesn't stutter at all, it plays just fine, and I even have a screen recorder going (ZD Screen Recorder). It kicked me back to the main menu a couple of times in one part of the game, but that's it. I have a video on YouTube of me playing about an hour of the game, showing my experience with it. I kinda suck at the game, but I'm having fun playing it. My username is LinkSquish. I'm not looking for subscribers or attention; I just think it's interesting that there's a bunch of issues with the game, yet I'm playing fine with only a GTX 1050 Ti. The rest of my PC specs: ASUS TUF Gaming Z690-Plus WiFi D4, Intel 12th-gen i9-12900K, 32 GB 3600 MHz DDR4 RAM, a 1 TB WD Black NVMe SSD, and the game installed on a separate 1 TB WD Blue NVMe SSD.
 
The math is simple:

- Graphics card makers somehow pressure ($$$) AAA studios to raise the hardware "bar".

- AAA studios also benefit from that ($$$$$) because in people's minds it's the same as the megapixel count, "the more the better" (which isn't necessarily true), so if a game needs more hardware, "it must be better".

- The last point means less need for optimization, less time spent, earlier release, so more $$$. And (they think) PC gamers gladly spend on newer hardware, so no worries.

In the end, game studios have very little motivation to optimize PC games; on consoles, on the other hand, the hardware is fixed, so they *have to* optimize the game to run well. Sadly, this is the truth about making games for a console (which nowadays is a PC with a different OS and case) versus a PC.

There's an easy solution, as I've seen in some games: effects and resolution on one side, textures on the other. If you have 8 GB of VRAM, just max out everything and leave textures on Very High instead of Ultra. That way you can still play at maximum detail apart from the textures, no harm done.

It's the same as HD vs. FHD vs. 4K vs. 8K: above FHD there's little difference, so if I have to use Very High textures instead of Ultra, I won't see the difference anyway. What I don't like is a game studio artificially limiting my gaming because of $$$ interests.
 
My system, an Aorus Master 6800 XT (max OC) paired with a Ryzen 7 5800X3D and 32 GB of 3600 MHz CL16 RAM, gets completely different results. At 1440p Ultra, my fps in exteriors stays around 75 and in interiors around 85, with some drops to 83 and some jumps past 90. With FSR 2 on Quality, it stays mostly over 100 fps.
No idea how you got 50 to 60-ish fps....
 
What's the point of testing a broken game? This game was released basically as a "paid beta". [...] This is a clickbait article/video, riding the "8 GB isn't enough" drama [...] Let them fix the game, and then test it again. You did the same with Hogwarts Legacy, and today the game is quite different ...
I can't say you're right here, but they are.

The game was released and they're asking money for it, which means they consider the game done. TS just tested the mess the company thinks is "ready" (so it can start bringing in money sooner).

If they did it your way, more paid betas and alphas would be released as "ready", and then after a few years you'd get the final game, after you'd already paid big bucks, played it, and grown tired of it.

Would you pay €/$60,000 for a car:
- OPTION A: in 2023, during the beta testing phase, running into many issues: slowdowns, breakdowns on the street, higher consumption, glitches, etc., and then, after a year of updates, issues, and letdowns, you finally get a car that works well?

- OPTION B: in 2024, when *the company itself* has done the testing at its own cost, the car is 99.9% OK, and it then receives only minor fixes and improvements?

Would you pay for option A? (Basically paying to be a beta tester.) I wouldn't and won't.

As long as people are willing to pay a lot to do beta testing, play rushed-out material, and pay for new hardware to cover for bad programming, these companies will just keep producing garbage. That is the advantage of game consoles: the hardware is fixed, so the software has to adapt to the hardware.
 
Radeon cards are not an option. No one wants a Radeon, and for good reasons: there's no point in having good hardware if your software is a mess… Nvidia is at fault here, as usual: 8 GB mid-range cards were a bad choice from the beginning. But it just takes a competent developer to fix this. TLOU is not even a next-gen game. It's just a poorly coded port. Hogwarts Legacy, after a few patches, runs smoothly on 8 GB cards at 1440p.
You don't know what you're talking about. Have you seen the textures in Hogwarts? No? That's because there are none every few seconds when the VRAM spills over. ^^
 
Imo it does make sense. If you play it, you'll notice there's a ton of detail everywhere in the levels. You really have to look for a texture that's bad. It's really impressive.
 