Nvidia GeForce RTX 5060 Ti 16GB Review: Not Great, Not Terrible

The only real reason I see any manufacturer releasing 8GB RAM versions of GPUs these days is to capture people still in the 1080p market.

Which shouldn't exist given the relative affordability of 1440P monitors these days. That market segment should be contracting, not expanding or staying flat. Even if people choose to game at 1080P for performance reasons, the baseline experience should be 1440P now.
 
The only real reason I see any manufacturer releasing 8GB RAM versions of GPUs these days is to capture people still in the 1080p market.
I'm not even convinced of that anymore. Indiana Jones, for example, can't even run at 1080p Ultra on the 8GB 4060 Ti because it runs out of VRAM. Sure, you can turn down settings to make it run, but you shouldn't have to when the 16GB 4060 Ti runs it just fine. If more and more games start requiring ray tracing, then this will become more common, even if the rest of the card gets increased performance.
 
It's actually one of the best value cards around now. Would be good at 1080p, and enough to do 1440p pretty decently without too much culling of the settings.

Only really the 9070 series is a better bang for your buck. An MSRP 5070 is slightly better but the lack of memory on it would make me avoid that specific model. See the total collapse on Indiana Jones.

I can't say I would be particularly excited for a 5060 Ti 16GB, as it is just another marginal gain over last generation; however, it looks like the best budget card you can now buy. If you can stomach the fact that $430 is a budget card...
 
Can't beat an RTX 4070 and barely ekes out a 4060 Ti.

I wouldn't call this card "Not great, not terrible", I'd call it bad. Maybe I'm expecting too much from Nvidia and AMD this gen, but so far nothing has really stood out.

The 9070 XT giving you 7900 XT/XTX performance, but with better RT capabilities (that I don't care about)... not really exciting. It's okay if you can get a card at or near MSRP.

Pretty much any Nvidia card gets you 10-15% over the previous model, but at a similar or higher price. The only real winner is the 5090, but once you look at the price you realize they're just doing this to screw consumers.
 
The performance in RT games is not bad. Unfortunately, even though it has gone through two generations, I was hoping the x60 series could surpass the raster performance of the x80, but it seems that is not the case.
 
The only real reason I see any manufacturer releasing 8GB RAM versions of GPUs these days is to capture people still in the 1080p market.
This notion that VRAM usage is tied to resolution really needs to die, that's not how any of this works. The difference that smaller framebuffers for lower resolutions like 1080p makes is a few hundred MBs worth of VRAM at most. It made sense to talk about this when we were talking about 2 GB cards vs 4 GB cards 10 years ago, it makes absolutely zero sense now.

VRAM consumption is 90% textures, and texture quality doesn't depend on screen resolution. So no, 8 GB is not enough for 1080p, 1080p on 8 GB will still suffer degraded texture quality and worse pop-in in modern games just like 1440p and 4K would.

8 GB is not enough for anybody who intends to play current games, irrespective of resolution. New 8 GB cards today are essentially e-waste.
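
To put rough numbers on the framebuffer point, here's a quick back-of-the-envelope sketch. Everything in it (bytes per pixel, render-target count) is an assumed round figure, not measured from any game, but it shows why the resolution-dependent slice of VRAM is tiny next to textures:

```python
# Back-of-the-envelope framebuffer estimate per resolution.
# Assumptions (not measured): 4 bytes per pixel and ~6 full-resolution
# render targets (G-buffer, depth, swapchain images, etc.).
BYTES_PER_PIXEL = 4
RENDER_TARGETS = 6
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in RESOLUTIONS.items():
    mb = w * h * BYTES_PER_PIXEL * RENDER_TARGETS / (1024 ** 2)
    print(f"{name}: ~{mb:.0f} MB of resolution-dependent buffers")

# Prints roughly 47 MB (1080p), 84 MB (1440p), 190 MB (4K) -- a spread of
# a couple hundred MB, versus multiple GB of textures.
```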
 
The performance in RT games is not bad. Unfortunately, even though it has gone through two generations, I was hoping the x60 series could surpass the raster performance of the x80, but it seems that is not the case.
I remember when, two generations out, the new mid-range card would be around the performance of a top-end card from two generations before. Look at the 2060 vs the 980 Ti, the 3060 vs the 1080 Ti, or the GTX 1060 vs the GTX 780 Ti.

My 3080 Ti still has around 25% more performance than the 5060 Ti. I'll just happily keep gaming on my card and be content that there is zero reason to upgrade anytime soon.
 
A great article as usual, thank you, but am I the only one missing the comparison to Intel B580? Sure, its performance is more or less on par with RTX 4060 and RX 7600 (not included either), but considering it's a new card and the closest equivalent from the second competitor, it would be nice to see it at least in the Cost per Frame / Power Consumption comparisons. Even if just to keep readers aware there is a third player in the graphics card market, for the time being anyway.
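
For anyone who wants to approximate it themselves, cost per frame is just price divided by average FPS. A minimal sketch below, with placeholder prices and FPS rather than the article's actual data:

```python
# Cost per frame = price / average FPS. Prices and FPS here are placeholders,
# not the article's 16-game averages -- plug in the review's numbers instead.
cards = {
    "RTX 5060 Ti 16GB": {"price_usd": 430, "avg_fps": 80},  # hypothetical FPS
    "Arc B580":         {"price_usd": 250, "avg_fps": 60},  # hypothetical FPS
}

for name, c in cards.items():
    print(f"{name}: ${c['price_usd'] / c['avg_fps']:.2f} per frame")
```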
 
At least it has a decent gain over the previous 4060 Ti.
But forgive a novice, please: what I'm seeing in the 16-game average is that the two-and-a-half-year-old 7900 XTX is faster than the current AMD flagship 9070 XT.
Is AMD planning a faster 9000 series that I can't find, or is the 9000 series more about slightly better RT?
 
This notion that VRAM usage is tied to resolution really needs to die, that's not how any of this works. The difference that smaller framebuffers for lower resolutions like 1080p makes is a few hundred MBs worth of VRAM at most. It made sense to talk about this when we were talking about 2 GB cards vs 4 GB cards 10 years ago, it makes absolutely zero sense now.

VRAM consumption is 90% textures, and texture quality doesn't depend on screen resolution. So no, 8 GB is not enough for 1080p, 1080p on 8 GB will still suffer degraded texture quality and worse pop-in in modern games just like 1440p and 4K would.

8 GB is not enough for anybody who intends to play current games, irrespective of resolution. New 8 GB cards today are essentially e-waste.
Let's not forget that upscaling, a necessity, takes up a decent bit of VRAM and so does framegen. Mix in RT and all those things together mean that cards like a 5060 could easily use 12GB or more VRAM these days. And textures don't take up tons of GPU power relative to the amount of VRAM they need. I'm not against 8GB cards, but they have to be priced in a way that it's understood you aren't getting a premium or modern experience.
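
As a rough illustration of how those pieces stack up, here's a hypothetical 1440p VRAM budget; every number is an assumption made up for the argument, not a measurement:

```python
# Hypothetical VRAM budget for a modern game at 1440p. Every figure is an
# assumption for illustration, not a measurement -- the point is that textures
# dominate and RT/upscaling/frame generation stack on top.
budget_gb = {
    "textures (high/ultra)":       6.0,
    "geometry + misc buffers":     1.0,
    "render targets":              0.3,
    "RT acceleration structures":  1.0,
    "upscaler + frame generation": 0.7,
}
total = sum(budget_gb.values())
print(f"total ~= {total:.1f} GB")  # ~9 GB, comfortably past an 8GB card
```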
 
At least it has a decent gain over the previous 4060 Ti.
But forgive a novice, please: what I'm seeing in the 16-game average is that the two-and-a-half-year-old 7900 XTX is faster than the current AMD flagship 9070 XT.
Is AMD planning a faster 9000 series that I can't find, or is the 9000 series more about slightly better RT?

AMD said they were only doing midrange and below this generation, like the 5700/5600/5500/XT RDNA1 gen. The 9070 is 70-class; there are no 80- or 90-class GPUs.
 
Such a waste-of-time article. You said literally nothing - no info about hardware, no disassembly photos, no VRM showcase... you just showed some charts with no MFG/FG comparison... I will buy two of these cards for local LLMs and will play with MFG... There will IMHO be great FPS at 4K, but from this article I can just guess.
 
Such a waste-of-time article. You said literally nothing - no info about hardware, no disassembly photos, no VRM showcase... you just showed some charts with no MFG/FG comparison... I will buy two of these cards for local LLMs and will play with MFG... There will IMHO be great FPS at 4K, but from this article I can just guess.
What VRM showcase is possible without an 8GB card to review?

You can just multiply the terrible 4K FPS by 3.8 (4X minus overhead) to get the Fake Frames FPS. And you don't have to guess about how janky games with sub-30 rendered FPS will feel.
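
A trivial sketch of that estimate, using the ~3.8x figure above and made-up base frame rates:

```python
# Rough MFG 4X estimate using the ~3.8x multiplier above (4X minus overhead).
# Base frame rates are made up; substitute the review's 4K results.
MFG_SCALE = 3.8

for base_fps in (25, 35, 45):
    print(f"{base_fps} rendered FPS -> ~{base_fps * MFG_SCALE:.0f} displayed FPS "
          f"(input latency still tracks the {base_fps} FPS base)")
```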
 
While not great for potential buyers, I was happy to see my 3080 still winning even with its VRAM handicap, since it may be a while until sanity returns to the GPU market.

(But I really want to play Spider-Man 2 with RT at 120+ FPS sooner rather than later; alas, it's the Steam backlog until an MSRP-priced upgrade.)
 
Such a waste-of-time article. You said literally nothing - no info about hardware, no disassembly photos, no VRM showcase... you just showed some charts with no MFG/FG comparison... I will buy two of these cards for local LLMs and will play with MFG... There will IMHO be great FPS at 4K, but from this article I can just guess.
He focused on performance and value, which is what these reviews are about.

Disassembly and VRM showcase are irrelevant to almost everybody when it comes to this GPU class.
 
Where are the 1080p results?

This card isn't a 4K card, so why test at 4K? There is no reason.
 
Not great, not terrible seems to be the best way to describe the card. There isn't anything really bad about it, pricing seems to be okay. Performance and VRAM is fine. It's just very "boring" compared to the new Radeon cards.
 
Which shouldn't exist given the relative affordability of 1440P monitors these days. That market segment should be contracting, not expanding or staying flat. Even if people choose to game at 1080P for performance reasons, the baseline experience should be 1440P now.
I don't disagree, but there are still many people out there using 1080p monitors who still game and don't want, or feel the need, to upgrade to a higher resolution - so there is still a market for 1080p GPUs.
 
This notion that VRAM usage is tied to resolution really needs to die, that's not how any of this works. The difference that smaller framebuffers for lower resolutions like 1080p makes is a few hundred MBs worth of VRAM at most. It made sense to talk about this when we were talking about 2 GB cards vs 4 GB cards 10 years ago, it makes absolutely zero sense now.

VRAM consumption is 90% textures, and texture quality doesn't depend on screen resolution. So no, 8 GB is not enough for 1080p, 1080p on 8 GB will still suffer degraded texture quality and worse pop-in in modern games just like 1440p and 4K would.

8 GB is not enough for anybody who intends to play current games, irrespective of resolution. New 8 GB cards today are essentially e-waste.
I understand what you're saying, but it really depends on the resolution the game developer is targeting, and whether or not they create different sized textures to cater for differing screen resolutions.

Texture quality will be affected by the base resolution of the textures and the scaling techniques used to generate the required size textures for the display resolution. If your base resolution doesn't match your display resolution you will obviously use some more VRAM during the scaling phase, but this comes back to my first point about the developer's target resolution.
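
For a rough sense of the texture side of this, here's a small sketch; the per-texel cost and mip overhead are assumed typical values for block-compressed textures, not figures from any particular game:

```python
# Rough texture-memory estimate. Assumptions: ~1 byte per texel for
# block-compressed textures (BC7-class) and ~33% extra for the full mip chain.
def texture_mb(width, height, bytes_per_texel=1.0, mip_overhead=1.33):
    return width * height * bytes_per_texel * mip_overhead / (1024 ** 2)

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mb(size, size):.1f} MB per texture")

# ~1.3 MB at 1K, ~5.3 MB at 2K, ~21 MB at 4K -- a few thousand material
# textures at high quality add up to several GB regardless of screen resolution.
```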

The other thing to consider is the type of games people play. Not everyone plays the latest 3D, open world, ray-traced games.
 
Seems like you're best off finding a second-hand 4060 Ti in 16GB, or even a 3060 Ti in 8GB if you aren't really planning on playing the latest games at high resolution anyway, given there is minimal boost with this card.

Then again, if you have one of those cards, you're probably not going to bother upgrading, so there won't be many on the market.

Kind of a **** sandwich situation for most gamers, really. You can get the same performance as five years ago at higher pricing, or just keep what you have. And as pointed out in the conclusion of this article, nothing has really advanced much at all at this price point for a long time, which sucks.
 