Nvidia GeForce RTX 5060 Ti 8GB Review: Instantly Obsolete

Without the blue bar charts and comparative overview, it's quite hard to see at a glance how it performs compared to, say, the 4060.
The point is that it is trash compared to the 16GB version and shouldn't be purchased by anyone. As the only difference is the VRAM, there is no room for all the excuses fanboys give about "the card not being strong enough for -insert settings here- so the lack of VRAM doesn't matter".

So a comparative overview is unnecessary and frankly counterproductive. The 5060 will have bar charts and the 5060 Ti 16GB already does. Buy either of those, or a 4060 on eBay, but don't buy the 5060 Ti 8GB.
 
I can't understand which one is correct. Here https://www.techpowerup.com/review/gainward-geforce-rtx-5060-ti-8-gb/25.html in Spider-Man 2 the 5060 Ti 8GB is at 94 FPS, while here at TechSpot it is at 50 FPS. Both at 1080p.
I found the differences surprising too.

Techpowerup 1080p "highest setting unless otherwise stated" (native, no RT)
16GB: 104 FPS
8GB: 94

HUB/Techspot
1080p Very High (native, no RT)
16GB: 73 FPS
8GB: 48

1080p High (native, no RT)
16GB: 84 FPS
8GB: 59

Some variance from different locations is going to happen, but these are BIG differences.
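To put numbers on how big, here's a quick back-of-the-envelope in Python (a minimal sketch; the FPS values are the ones quoted above, and since the two outlets test different scenes, treat the percentages as rough):

```python
# 16GB vs 8GB FPS from the two sets of results quoted above.
results = {
    "TPU 1080p Highest":   {"16GB": 104, "8GB": 94},
    "HUB 1080p Very High": {"16GB": 73, "8GB": 48},
    "HUB 1080p High":      {"16GB": 84, "8GB": 59},
}

for test, fps in results.items():
    # Percentage the 8GB card loses relative to the 16GB card.
    drop = (fps["16GB"] - fps["8GB"]) / fps["16GB"] * 100
    print(f"{test}: 8GB is {drop:.0f}% slower than 16GB")

# TPU 1080p Highest: 8GB is 10% slower than 16GB
# HUB 1080p Very High: 8GB is 34% slower than 16GB
# HUB 1080p High: 8GB is 30% slower than 16GB
```

A ~10% gap versus a ~30-34% gap for the same two cards is far more than scene-to-scene variance usually explains.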
 
All models with 8GB are out of stock on Newegg.
None on Amazon.
Because at the end of the day, they are the second-cheapest cards this year.
 
The idea that developers optimized better in the past is largely a myth.

The difference today is that GPUs did not respond to the added VRAM in consoles like they have in the past. Historically, around two years after a new generation of consoles dropped, GPUs responded with more VRAM to stay competitive.

Today, COVID slowed upgrades, followed by AI, which fundamentally broke the incentives for VRAM. That is, AI loves VRAM, so responding to consoles' added VRAM not only makes a GPU more competitive for gaming, it also makes it more competitive for AI, where it can be sold for MUCH more money. So Nvidia is foremost protecting its AI cash cow by limiting VRAM on gaming GPUs. The better margins and earlier obsolescence on gaming hardware are more of a secondary bonus for them. Put another way, it's more about not losing a huge chunk of 90% of their revenue than padding the other 10%.

It doesn't change the fact that the RTX 3070 has 8GB of VRAM, the RTX 4060 has 8GB of VRAM, the RX 6600 has 8GB; plenty of cards released recently use just 8GB of VRAM. The devs knew this and ignored it; this is on them, and I will not change my mind on it. Nvidia may be stingy, but the game devs' job is to make the game playable on the most hardware within reason, and they failed.
 
It doesn't change the fact that the RTX 3070 has 8GB of VRAM, the RTX 4060 has 8GB of VRAM, the RX 6600 has 8GB; plenty of cards released recently use just 8GB of VRAM. The devs knew this and ignored it; this is on them, and I will not change my mind on it. Nvidia may be stingy, but the game devs' job is to make the game playable on the most hardware within reason, and they failed.
Devs made the game playable on $500 consoles.

Nvidia and AMD selling $500 GPUs unable to keep pace is on them.

If they had responded to the PS5's launch the same way they did to the PS4 and PS3, 12GB would have been the new minimum for the RTX 4060. But they didn't, and even worse, repeated it again for the 5000 series. Devs want to make a game once, not twelve times just because hardware companies want to squeeze out a bit more profit.
 
Devs made the game playable on $500 consoles.

Nvidia and AMD selling $500 GPUs unable to keep pace is on them.

If they had responded to the PS5's launch the same way they did to the PS4 and PS3, 12GB would have been the new minimum for the RTX 4060. But they didn't, and even worse, repeated it again for the 5000 series. Devs want to make a game once, not twelve times just because hardware companies want to squeeze out a bit more profit.
When the PS4 came out, the console market was bigger than PC; these days it's the opposite. The most popular cards are 8GB at 35.52%, 12GB is only at 18.42%, and 6GB is at 11.98%. Since PC gaming is now bigger than console gaming and 8GB is the most common VRAM amount, and it's not even close, it seems to me this is a case of the game devs not doing the work.

 
The point is that it is trash compared to the 16GB version and shouldn't be purchased by anyone. As the only difference is the VRAM, there is no room for all the excuses fanboys give about "the card not being strong enough for -insert settings here- so the lack of VRAM doesn't matter".

So a comparative overview is unnecessary and frankly counterproductive. The 5060 will have bar charts and the 5060 Ti 16GB already does. Buy either of those, or a 4060 on eBay, but don't buy the 5060 Ti 8GB.
At least we could have seen how low in the chart this card would sit. Surely there would have been some worse-performing cards in the comparison too.
 
This has got to be the lowest score I've seen on TechSpot, and rightly so. I can't believe 8GB of VRAM was made standard in the midrange 9 years ago and we're still getting 8GB in the midrange in 2025. Nvidia deserves as much negative discourse about this as possible. Gamers and system builders, please do not buy this crap... Vote with your wallet.
 
So they could've gone for 16GB of GDDR6 instead; the price could have been kept at the same low level, and performance would be better.
 
8GB is enough for an xx50-class card aimed at low-power gaming. My boo has an 8GB 3050 and it's fantastic for what she wants to play, which is mainly stuff like Cozy Grove and Sims 4.
"low power gaming" - this is a 380$ card in 2025 (in reality 450-500$, the MSRP is a complete fabrication). 8GB is not just a joke, it's an insult to everyone.

As we've seen in the tests, the 16GB version can handle 1440p and 4K in AAA games. This is a crippled card at even 1080p.

Nvidia has shown you the middle finger and here you are asking them to show the other middle finger too.
 
I'm going to again note that the people purchasing xx60-class cards are people who are not "gamers", but people using budget rigs that more often than not will be running 1080p at default (medium) settings. For that use case, the card is more than sufficient (outside of a few cases like Spider-Man).

I'm not defending 8GB, but it is important to remember what the market is for this class of card.
 
This is appalling, and it's trickery.

People who buy these very average-level cards (xx60 Ti) are, for the most part, unlikely to be enthusiasts.

So they can't be blamed (which I am sure some people will do) if they get an 8GB card because they didn't check. At this level, people just want to game decently on their PC and don't care about, or know much about, performance and how it links to the tech.

NV know this. They will get away with this. I feel sorry for the casual non-tech person who buys a mid-range, latest-gen card expecting decent but not brilliant performance.

Those who get the 8GB won't even be able to play some games. The 16GB is not a bad card. The 8GB is just straight-up disgusting.

Shame on NV. (And they know it - no review samples given out.) Just shame on them.
 
I'm going to again note that the people purchasing xx60-class cards are people who are not "gamers", but people using budget rigs that more often than not will be running 1080p at default (medium) settings. For that use case, the card is more than sufficient (outside of a few cases like Spider-Man).

I'm not defending 8GB, but it is important to remember what the market is for this class of card.
The market is the majority of gamers. This card can run games even at 4K with 16GB of VRAM. It's not a 5030, it's a 5060 Ti.
 
I found the differences surprising too.

Techpowerup 1080p "highest setting unless otherwise stated" (native, no RT)
16GB: 104 FPS
8GB: 94

HUB/Techspot
1080p Very High (native, no RT)
16GB: 73 FPS
8GB: 48

1080p High (native, no RT)
16GB: 84 FPS
8GB: 59

Some variance from different locations is going to happen, but these are BIG differences.

I would expect that "highest setting" refers to the highest preset. In a lot of cases these days, that automatically turns on DLSS. If so, it's not native performance.
 
I'm going to again note that the people purchasing xx60-class cards are people who are not "gamers", but people using budget rigs that more often than not will be running 1080p at default (medium) settings. For that use case, the card is more than sufficient (outside of a few cases like Spider-Man).

I'm not defending 8GB, but it is important to remember what the market is for this class of card.
If they're not "gamers", why would they even be picking up this GPU? Why would they spend £400+ on it and not just get an Intel B580, use their onboard GPU, or buy a console?

The market for this card is PRECISELY gamers and gamer-focused. It's not for workstations, video editing, AI, CGI, CAD, or anything else; it's very specifically for gamers.

The price is also very high; that's not even budget-gamer territory, that's mid-range territory. 8GB for the price they're asking is insulting, especially when Intel is giving double that for less money.
 
When the PS4 came out, the console market was bigger than PC; these days it's the opposite. The most popular cards are 8GB at 35.52%, 12GB is only at 18.42%, and 6GB is at 11.98%. Since PC gaming is now bigger than console gaming and 8GB is the most common VRAM amount, and it's not even close, it seems to me this is a case of the game devs not doing the work.

The problem with this logic is that you would never increase VRAM: most people have 8GB, so we should target 8GB, which means that in the future most people will still have 8GB, so we should target 8GB...
 
I'm going to again note that the people purchasing xx60-class cards are people who are not "gamers", but people using budget rigs that more often than not will be running 1080p at default (medium) settings. For that use case, the card is more than sufficient (outside of a few cases like Spider-Man).

I'm not defending 8GB, but it is important to remember what the market is for this class of card.

There is no market for this card. It's a $380 GPU, not a $200 budget card. There is zero good reason to spend $80 more than the base 5060 for the 5060 Ti if all you do is 1080p at default settings.
 
At least we could have seen how low in the chart this card would sit. Surely there would have been some worse-performing cards in the comparison too.
I'm sure some older cards that are no longer for sale are worse ... but so what?

What does knowing how old cards compare to a card no one should buy accomplish? Get the 16GB or the non-Ti, get the AMD equivalent, or wait for the refreshes in a year.
 
I would expect that "highest setting" refers to the highest preset. In a lot of cases these days, that automatically turns on DLSS. If so, it's not native performance.
TechPowerUp's setup page says they run native without RT unless otherwise stated, and when they do use either, they put those results on a separate page. I don't have SP2, but I think Very High is the highest setting. If it isn't, then it's even weirder, because TPU is getting higher FPS than HUB.

Exact test location within the game makes a difference, but the 1080p data has TPU at double the FPS of HUB, and that is way outside the norm.
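As a rough sanity check on the "double" claim (a minimal sketch using the same Spider-Man 2 1080p figures quoted earlier; assumes both outlets' numbers are otherwise comparable):

```python
# How much higher TPU's Spider-Man 2 1080p results are than HUB's, per card.
tpu = {"16GB": 104, "8GB": 94}
hub = {"16GB": 73, "8GB": 48}

for card in ("16GB", "8GB"):
    print(f"{card}: TPU is {tpu[card] / hub[card]:.2f}x HUB")

# 16GB: TPU is 1.42x HUB
# 8GB: TPU is 1.96x HUB
```

If the gap were only test-scene choice, you'd expect both cards to shift by roughly the same factor; the 8GB card nearly doubling while the 16GB moves only ~1.4x is what points at VRAM pressure in HUB's scene.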
 
Seems the whole 5000 series is a flop, a sham! Nvidia is trying to sell a supposedly new series that is little more than a 4000-series poser. I refuse to pay the outrageous prices asked even for 4000-series cards nowadays!
 
I'm going to again note that the people purchasing xx60-class cards are people who are not "gamers", but people using budget rigs that more often than not will be running 1080p at default (medium) settings. For that use case, the card is more than sufficient (outside of a few cases like Spider-Man).

I'm not defending 8GB, but it is important to remember what the market is for this class of card.
Why do you think they will be running at 1080p? Even budget monitors are now 1440p and any TV is 4K.

And most games don't default to medium. They either run a quick HW scan and then default to settings that should get you 60-ish FPS, or they default to 720p with lowest everything (older and some indie games).

I don't understand why gamers are always grouped into 4K Max Everything and 1080p Low-Medium. If this were really true, they could just sell two GPUs every generation. There is a lot of middle ground, even for casual gamers who just swap between settings presets.
 
I found the differences surprising too.

Techpowerup 1080p "highest setting unless otherwise stated" (native, no RT)
16GB: 104 FPS
8GB: 94

HUB/Techspot
1080p Very High (native, no RT)
16GB: 73 FPS
8GB: 48

1080p High (native, no RT)
16GB: 84 FPS
8GB: 59

Some variance from different locations is going to happen, but these are BIG differences.
I found TLOU Part II at 1440p: 5060 Ti 8GB at 60 FPS and 16GB at 71 FPS.
 