Nvidia RTX 4060 Ti Review: 8GB of VRAM at $400 is a No-Go

What is this garbage that's barely faster than a 3060 Ti? The performance against that two-and-a-half-year-old card is simply horrible. You offer barely a 10 percent performance uplift and no extra memory, for the same price as a card that launched in early December 2020. You can make up most of that gap just by overclocking a 3060 Ti!

This is the most ridiculous thing I have seen in a long time. This should have had 16GB of memory as standard for $399; then at least Nvidia could point to that as a slight overall improvement over the old 3060 Ti. It doesn't even have that going for it.
DLSS 3 and 40 fewer watts is Nvidia's answer, heard with a hollow echo.
 
8GB of VRAM is plenty for the 99% of users who are @1080p and don't bother maxing anything out; even my GTX 780 Ti 3GB, R9 290X 4GB, and R9 Fury 4GB cards can play modern games at medium-to-low settings @1080p, and my GTX 980 Ti 6GB and Vega 64 can do high-to-medium @1080p/1440p.

But you don't want to pay $400 for those cards or this one either. That's the point.

This would be a perfectly fine, even very good, $270 to $300 card. But it's not one; it's a crap $400 card.
 
Good article, but it would be nice if you could highlight the new card's position in the comparison tables. It would also be nice to include Intel's cards in the comparisons, just so customers know what their options are. Fingers crossed that the RX 7600 will be a much more affordable card; otherwise I'll just stick with my 1060 6GB.
 
I saw this coming, but it's still utterly shameful. Good to see that even tech sites are refusing to shill this garbage. JayzTwoCents tried to, got massively downvoted, pulled his video, and is probably going to go the other direction now.

Personally, I've thought for some time now that only HUB and GN are worth it. It's a hobby we share, and journalism shouldn't be the corpos' friend; it's supposed to advocate for us, the readers. A lot of these other places just don't get that.
 
Steve, you nailed this one. Thanks for continuing to harp on NVIDIA’s attempt to force consumers into a continuous upgrade cycle via planned obsolescence with their VRAM specs.

The 4060 Ti reminds me a lot of Intel’s 11th gen release. Bad value with mixed performance including some regressions versus the previous generation.

To use a quote from the “other tech Steve” (Gamers Nexus), the 4060 Ti is a waste of sand.
 
Nvidia GPU here scored 60? I'm seeing things. I was expecting 80, maybe 70, but 60?

I suppose it's well deserved considering everything.

I'd rather nab a 6750 XT for around $350 than this overpriced POS.
 
Average fps at 1440p, taken from the various tests Steve has done with these cards and then averaged again, plotted against launch MSRP.

[attached image: rubbish_ada_graph.png]

What can one take away from this? Well, perhaps if Nvidia had used the same pricing strategy with the 4090 and 4080 as it did with the 4070 Ti/4070/4060, the 4090 might have been around $1200 and the 4080 around $1000.

Also, the performance spacing between the five models doesn't bode well for any future versions, although the 4090 is pretty CPU limited in a lot of tests at 1440p.

Edit: This is what it looks like at 4K:

[attached image: rubbish_ada_graph_2.png]
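
For anyone who wants to reproduce the idea, here's a minimal sketch of the method in Python. The fps figures below are made-up placeholders, not Steve's actual averages; only the MSRPs are the real launch prices:

```python
# Sketch of the fps-per-dollar idea above. The fps values are placeholders,
# NOT actual review results -- substitute your own averages from the charts.
cards = {
    # card: (hypothetical average 1440p fps, launch MSRP in USD)
    "RTX 4090":    (160.0, 1599),
    "RTX 4080":    (130.0, 1199),
    "RTX 4070 Ti": (110.0,  799),
    "RTX 4070":    ( 95.0,  599),
    "RTX 4060 Ti": ( 75.0,  399),
}

for name, (avg_fps, msrp) in cards.items():
    print(f"{name:12}  {avg_fps:6.1f} fps  ${msrp:5}  "
          f"{avg_fps / msrp * 100:5.2f} fps per $100")
```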
 
Seems Nvidia is really trying to push what should have been an xx50-class card up a tier. I mean, a 128-bit bus? PCIe x8? Targeted at 1080p? Barely any performance improvement over its predecessor without DLSS 3, plus the same amount of VRAM unless you pay $100 more? Laughable. Even the 4080, if you ask me, should have been a 4070 Ti at best, with a further cut-down AD102 die used for the 4080.
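
To put that bus cut in numbers, a quick back-of-the-envelope on raw memory bandwidth (this ignores the much larger L2 cache Nvidia added on Ada to compensate):

```python
# Raw memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8.
# Ignores Ada's much larger L2 cache, which offsets part of the cut.
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(f"RTX 3060 Ti: {bandwidth_gbs(14, 256):.0f} GB/s")  # 448 GB/s
print(f"RTX 4060 Ti: {bandwidth_gbs(18, 128):.0f} GB/s")  # 288 GB/s
```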
 
Is this one of the worst examples of execution in GPU history, or a deliberate and wholly unjustified attempt to unilaterally reprice gaming GPUs across the board? Looking at the entire 4000 series, I'm guessing it's the latter.

I've been gaming on PC for over 25 years. I have a high-end PC right now, but I paid a pretty penny for it. I'm looking at this article while watching my son play a PS5 on a 4K HDR OLED display across the room. It looks absolutely beautiful, and I have to acknowledge, painfully, that there's currently little appreciable difference in experience between the $500 consoles and high-end gaming PCs. I have to say: high-end PC gaming is now in a very deep crisis, and Nvidia and their AIB partners are the vanguard of those to blame.
 
Good article, but it would be nice if you could highlight the new card's position in the comparison tables. It would also be nice to include Intel's cards in the comparisons, just so customers know what their options are. Fingers crossed that the RX 7600 will be a much more affordable card; otherwise I'll just stick with my 1060 6GB.

In preparing the review and presentation we forgot to color the bars; the graphs have now been reuploaded.
Watch out for another GPU review tomorrow.
 
8GB of VRAM is plenty for the 99% of users who are @1080p and don't bother maxing anything out; even my GTX 780 Ti 3GB, R9 290X 4GB, and R9 Fury 4GB cards can play modern games at medium-to-low settings @1080p, and my GTX 980 Ti 6GB and Vega 64 can do high-to-medium @1080p/1440p.
If you can play the games you want at the settings you want, then you wouldn't be buying a new GPU.

Notice how you are both happy and using old cards. You are not the target market.

However, people that *do want to buy a new GPU* want to get something reasonable for the money being asked. At these prices and this level of performance, one is better off getting a console.
 
If you can play the games you want at the settings you want, then you wouldn't be buying a new GPU.

Notice how you are both happy and using old cards. You are not the target market.

However, people that *do want to buy a new GPU* want to get something reasonable for the money being asked. At these prices and this level of performance, one is better off getting a console.
I have newer cards as well, but I just reran these older cards, along with a GTX 580 3GB and GTX 780 6GB, through a couple dozen games alongside a newer GTX 1080 Ti, 2080 Ti, 3080 Ti, and RX 7900 XT; bottom line is that even 2013-era GPUs are enough for modern games @1080p, and even top-end Fermi can do well in many modern titles @1080p/720p.
 
I have newer cards as well, but I just reran these older cards, along with a GTX 580 3GB and GTX 780 6GB, through a couple dozen games alongside a newer GTX 1080 Ti, 2080 Ti, 3080 Ti, and RX 7900 XT; bottom line is that even 2013-era GPUs are enough for modern games @1080p, and even top-end Fermi can do well in many modern titles @1080p/720p.
2013 flagship cards do well in some modern games you won't list, at high-ish settings at 1080p and, lol, 720p.

Compelling argument.
 
Devs *should* continue to account for 8GB cards when releasing AAA console ports.

Should 8GB VRAM gamers expect lower texture quality vs. a comparable card with more VRAM? Yes, absolutely.
Should they expect the game to be a buggy, flawed mess that is unplayable because of it? No, of course not.

When all of the actual game content already exists and much of the budget for the PC release is spent getting it to run on that platform, I would absolutely expect 8GB VRAM gamers to be able to play those games without serious, game- and immersion-breaking playability issues when paying circa $70 USD, provided of course that their system meets or exceeds the minimum spec and the expectations around performance/quality are made clear. Game *needs* more? 8GB holding your game back? That's OK; publish it in the required specs. That's what they're for.

None of this is equal to, or any justification for, the overpriced, under-specced garbage released today for $400 USD. It's an actual insult, and the message to Nvidia should be loud and clear when people vote with their wallets.
 
there's currently little appreciable difference in experience between the $500 consoles and high-end gaming PCs.
Sure there is. The consoles load & resume much more quickly, stutter less often, [Sony] has better exclusives, and there are fewer hassles re: game & driver bugs & configuration.

It annoys me that PCs fall behind in those ways, but I stick with mine for keyboard & mouse support and better modding & multi-tasking.
 
Nvidia would have saved a lot of grief if the specs of the "single" Ti were:

12GB of 21Gbps GDDR6, 2.6GHz clocks, a 192-bit bus, 48MB of L2 cache, the full AD106 core count, and 3070 Ti performance at $429.

The 4060 non-Ti should then have been 10GB of 18Gbps GDDR6 on a 160-bit bus with 40MB of L2 cache and 15% fewer cores than the Ti, with performance between the 3060 Ti and 3070 but stronger RT, at $379.

The 4070 Ti should have been a cut-down 4080 with 16GB on a 256-bit bus at $699, leaving the 4070 as it is but at $549.
 
GPUs like this are perhaps indicative of the AI-ification of Nvidia's focus. It seems like they're pushing the DLSS 3 numbers hard in the marketing material and dropping the 'native' raster numbers into small print in the footnotes.

To me, this is the only way the “1080p gaming” marketing line makes any sense. DLSS 3 is Nvidia's cheat code for proclaiming impressive 1440p numbers: “Look, the 4060 with DLSS 3 is so much faster than the 3060 with DLSS 2 (*conditions apply, see your physician if pain persists).”
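
A rough way to see why frame-generated fps aren't comparable to native fps, as a simplified model (real frame pacing and Reflex complicate this, but the gist holds):

```python
# Simplified model: frame generation inserts one interpolated frame between
# each pair of rendered frames, so presented fps roughly doubles, but input
# latency still tracks the rendered rate (frame gen actually adds a bit of
# extra latency on top; ignored here).
def with_frame_generation(rendered_fps: float) -> tuple[float, float]:
    presented_fps = rendered_fps * 2         # the number on the marketing slide
    frame_latency_ms = 1000 / rendered_fps   # what your inputs actually feel
    return presented_fps, frame_latency_ms

fps, latency = with_frame_generation(60)
print(f"{fps:.0f} fps on the chart, but ~{latency:.1f} ms per real frame")
```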
 
2013 flagship cards do well in some modern games you won't list, at high-ish settings at 1080p and, lol, 720p.

Compelling argument.
Games like Chernobylite, Hitman 3, For Honor, Destiny 2, The Division, Vermintide 2, and Age of Conan; and they're definitely fantastic in CSGO.

Moreover, even the first mainstream i7, the 2008 i7 920 I have at 3.8GHz, is plenty for modern games @1080p/1440p, with the exception of AC Origins' DRM implementation, and it STILL shows scaling with my 3080 Ti and 7900 XT GPUs. Sure, it's lower overall, but it's plenty usable almost 15 years later, and unlike my C2Q QX6700 and Q9650, it is not CPU-limited to around 30FPS as you increase GPU performance.
 
Great article, Steve. We need more brutally honest tech reviews like this to keep these companies in check.

I'm a 3060 Ti owner, and what I liked about the 3060 Ti was that its performance was very close to the 3070's. Its performance was also above the 2080 Super's, at half the price. The 4060 Ti has a huge gap between it and the 4070 that even the 16GB model won't be able to close, and then to only perform between a 3060 Ti and a 3070 is plain shameful.

You were right to ignore Nvidia's push to review with DLSS 3 on, as it's just an unfair comparison that looks good for Nvidia in terms of marketing. DLSS 3 does have its benefits, but it totally hides the raw performance data that consumers such as myself need to make an informed decision about buying such a product.

I look forward to future testing to see if the card ages better, but I can't see it improving enough to warrant its purchase at the prices set.

It does look worrying for the standard 4060 now; I feel it's going to be a good 20% slower, sitting just 5% above the 3060.

This has got to be the smallest gen-on-gen improvement since the GTX 460 to the GTX 560, no?
 
nVidia: "Here you go AMD, the mainstream market is all yours."
AMD: "Thanks, now watch us balls it up with some equally bone-headed launch pricing designed to ruin our review scores, then eventually claim that win once we discount the MSRPs and improve performance through driver updates."


 
When buying a new card in 2023, people are looking for something that will last a few years: 1440p at the bottom of the stack, 4K high/very high quality in the midrange, and 4K/8K at the very top.

Rather than staying limited to 1080p and lower settings because of a small VRAM amount, people are better off buying a last-gen GPU at the very good prices they're going for now.

I will skip this gen entirely.
 
In the latest patch, the difference in quality between high and ultra textures in The Last of Us is very small, and it fixed the issue with 8GB cards... It is not as big a deal as this review tries to make it.

The only reason the lows in the 1440p average graph are low is The Last of Us.
 