Nvidia RTX 4060 Ti Review: 8GB of VRAM at $400 is a No-Go

Yeah, a last-gen GPU at a good price was the thing for me too. The 3070 I got for $300 handles TLoU at 1440p at around 60 fps with patch 1.0.4. Haven't tried 1.0.5 yet.
 
Mwahahaha, what a joke. I wasn't expecting a 4050 Ti-tier card to almost beat the last-gen 3070.
With half the memory bus, half the PCIe lanes and 70% less power, impressive.
Now if only the price was below $250 ($300 adjusted), maybe it would make some sense.
The 4060 Ti Founders Edition performs 4% below the 3070 at 1440p, per TechPowerUp.
It uses 25-30% less power, not 70%.

It beats the 3060 Ti by 9%.

At 1080p, though, the 4060 Ti performs 1% below the 3070 and 11% better than the 3060 Ti.

Fixed.

The 16GB version will change nothing and will be pointless. Might as well go 4070 and get 32-35% higher performance.
 
Only because most PC gamers are still using very old cards. The GTX 1060 is still the most popular card in the Steam survey. What gamers want is a reasonably priced replacement for much older cards that CAN let them play at high settings. I replaced my 1060 with an AMD 6700 XT and can run Forza Horizon 5 at 1440p with maxed-out/Ultra settings. I couldn't even play AC: Odyssey at 1080p at 30fps without turning settings down on a 1060! The AMD 7600 only having 8GB and higher power usage (due to node size) is disappointing.
Nvidia's x60 series always sells like hotcakes. This is nothing new. AMD barely has any GPUs in Steam's top 20.

Actually, the GTX 1650 overtook the 1060 :D
 
Only because most PC gamers are still using very old cards. The GTX 1060 is still the most popular card in the Steam survey. What gamers want is a reasonably priced replacement for much older cards that CAN let them play at high settings. I replaced my 1060 with an AMD 6700 XT and can run Forza Horizon 5 at 1440p with maxed-out/Ultra settings. I couldn't even play AC: Odyssey at 1080p at 30fps without turning settings down on a 1060! The AMD 7600 only having 8GB and higher power usage (due to node size) is disappointing.
Did you expect to play AC Odyssey maxed out at 1440p on a 7-year-old mid-range GPU, though?

Try maxing out a demanding AAA game in 2028 on your 6700 XT. You will see the same result.

A GPU is typically outdated after ~5 years, if you are playing new games on high settings, that is. Even flagship GPUs will feel slow by that point.

My 1080 Ti felt slow back in 2020, which is why I went with a 3080.

980 Ti -> 1080 Ti -> 3080 -> 4080 has been my route since 2015. Big upgrades every time.

VRAM won't future-proof much when the GPU core is the limiting factor. I actually hate the word future-proof, because it's not possible. Both Nvidia and AMD put little to no focus on GPUs more than two generations old, meaning you get next to zero optimization for new games at that point = wonky performance and/or visual artifacts.

This is why I upgrade every few years. Staying on a somewhat new arch, preferably the newest = full optimization focus from both Nvidia/AMD and the game developers.

Most game devs are using Nvidia GPUs for testing PC versions of games. Why? Because 4 out of 5 PC gamers are using an Nvidia GPU, and because Nvidia actually gives many big game companies GPUs for free for testing and optimizing.
 
The 4060 Ti Founders Edition performs 4% below the 3070 at 1440p, per TechPowerUp.
It uses 25-30% less power, not 70%.

It beats the 3060 Ti by 9%.

At 1080p, though, the 4060 Ti performs 1% below the 3070 and 11% better than the 3060 Ti.

Fixed.

The 16GB version will change nothing and will be pointless. Might as well go 4070 and get 32-35% higher performance.
I wanted to say it uses 70% of the power that the 3070 uses, but I got it wrong :).
It's still an impressive result from a castrated GPU.
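A quick back-of-the-envelope on the wording, using the commonly quoted spec-sheet board power figures (roughly 160 W for the 4060 Ti and 220 W for the 3070; real draw varies by card and load, so treat them as assumptions):

```python
# "Uses ~70% of the power" vs "70% less power" are very different claims.
# Board power figures are the commonly quoted spec numbers, used here
# purely for illustration; partner cards and actual game loads will differ.
power_4060ti_w = 160
power_3070_w = 220

ratio = power_4060ti_w / power_3070_w   # ~0.73 -> draws ~70-75% of the 3070's power
reduction = 1 - ratio                   # ~0.27 -> i.e. roughly 25-30% less

print(f"4060 Ti draws ~{ratio:.0%} of the 3070's board power")
print(f"which is about {reduction:.0%} less, not 70% less")
```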
 
Yeah, I was hoping we'd be surprised by the 4060 Ti's GPU performance when it wasn't limited by VRAM, but they couldn't let it compete with the 4070. I guess the 32MB of cache they put on it doesn't make up for the narrow memory bus.

Considering relative performance to last-gen cards, the 4060 Ti really seems like a 50-series card.
Did you expect the 4060 Ti to beat the 4070? :joy:

The 4060 Ti is not the upgrade path for people with a 3070.

It delivers 3070 performance with fewer watts and a lower price tag, which was to be expected.

The 4060 non-Ti will probably deliver 3060 Ti performance.

+ DLSS 3 support.

If you have a last-gen GPU that delivers, good for you. Many people don't, and this is why cards like this sell. The $300-400 range is typically what most PC gamers buy.

Just look at the Steam Hardware Survey if in doubt. The 1650, 1060, 2060 and 3060 dominate. They sell like hotcakes.
 
I wanted to say it uses 70% of the power that the 3070 uses, but I got it wrong :).
It's still an impressive result from a castrated GPU.
It's not a GPU for me either, haha. In terms of performance per watt, though, it looks decent, especially for 1080p and 2560x1080 gamers, and it does 1440p decently as well.
 
Polls like that just don't represent the average PC gamer, though, because average gamers tend to use Nvidia GPUs, and hardware forums have a much higher share of AMD users than the average Steam gamer.

Ask any PC gamer who is not into tech/hardware which GPU they have. 9 out of 10 times it's Nvidia, and most often they either had AMD before with a very mixed experience or have never owned an AMD GPU. This is my experience.

I have used many AMD GPUs in the past; I think they stagnated, though - too expensive, lacking features, etc.

It's fine to focus on raster performance only and lack some features when the price reflects this.

In some games an AMD GPU is fine, in others not so fine. With Nvidia it's just an overall good experience with a lot of superior features. Nvidia is the big player, AMD is the small player. The smaller player should always represent much better value if they have any intention of actually selling anything. However, AMD's GPU output eats away at their CPU and APU output. I think this is the reason why AMD doesn't really have a lot of focus on desktop GPUs.

This is the problem for AMD. CPUs, APUs and GPUs are all produced on the same process node, and they earn more per wafer by making CPUs and APUs (including the custom console ones).

There's simply not much money in gaming dGPUs for AMD when all is said and done, especially because most AMD GPU buyers are buying the cheap ones, often on sale.

AMD cares more about the low- to mid-range GPU market, which is what Intel aims at as well. Nvidia has dominated the high end for years, and really the mid-range as well, since x60 cards always sell like hotcakes.
 
Polls like that just don't represent the average PC gamer, though, because average gamers tend to use Nvidia GPUs, and hardware forums have a much higher share of AMD users than the average Steam gamer.

Ask any PC gamer who is not into tech/hardware which GPU they have. 9 out of 10 times it's Nvidia, and most often they either had AMD before with a very mixed experience or have never owned an AMD GPU. This is my experience.

I have used many AMD GPUs in the past; I think they stagnated, though - too expensive, lacking features, etc.

It's fine to focus on raster performance only and lack some features when the price reflects this.

In some games an AMD GPU is fine, in others not so fine. With Nvidia it's just an overall good experience with a lot of superior features. Nvidia is the big player, AMD is the small player. The smaller player should always represent much better value if they have any intention of actually selling anything. However, AMD's GPU output eats away at their CPU and APU output. I think this is the reason why AMD doesn't really have a lot of focus on desktop GPUs.
I'm sorry, I don't know what you are on about. My AMD GPU has been an amazing experience no matter what game I play: no issues with software or anything, and no lack of features here either (RT included). Can we stop promoting Nvidia and insulting people's hardware choices (AMD)?
 
Seems Nvidia is really trying to push what should have been an xx50-class card up a tier. I mean, a 128-bit bus? PCIe x8? Targeted at 1080p? Barely any performance improvement over its predecessor without DLSS 3, plus the same amount of VRAM unless you pay $100 more? Laughable. Even the 4080, if you ask me, should have been a 4070 Ti at best, with a further cut-down AD102 die used for the 4080.
To be fair, this started with AMD's RDNA 2. The emphasis on cache over bus width basically means the mid- to low-end cards are going to be badly gimped when attempting to run games above the target resolution. It is probably cheaper to throw more cache at the problem than to make the chip more complex with a wider bus.
The review results are to be expected here. The card is just too bandwidth-starved.
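A rough sketch of why the big L2 only papers over the narrow bus at lower resolutions. The bus widths and data rates below are the commonly cited specs; the L2 hit rates are made-up illustrative numbers (the working set grows with resolution, so hits fall), not measured values:

```python
# Raw GDDR bandwidth: bus width in bytes x per-pin data rate (Gbps) = GB/s.
def dram_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

bw_4060ti = dram_bandwidth_gbs(128, 18)   # ~288 GB/s (128-bit GDDR6 @ 18 Gbps)
bw_3070 = dram_bandwidth_gbs(256, 14)     # ~448 GB/s (256-bit GDDR6 @ 14 Gbps)

# If a fraction of memory traffic hits the on-die L2, only the misses touch
# DRAM, so the bandwidth the shaders effectively see is dram / miss_rate
# (assuming the L2 itself is never the bottleneck).
def effective_bandwidth_gbs(dram_gbs: float, l2_hit_rate: float) -> float:
    return dram_gbs / (1 - l2_hit_rate)

# Hypothetical hit rates: the 32MB L2 covers less of the working set as
# resolution (and with it buffer and texture footprint) goes up.
for res, hit in [("1080p", 0.50), ("1440p", 0.35), ("4K", 0.20)]:
    print(f"{res}: ~{effective_bandwidth_gbs(bw_4060ti, hit):.0f} GB/s effective "
          f"vs {bw_3070:.0f} GB/s raw on the 3070")
```

Once the hit rate drops, the card falls back toward its raw 288 GB/s, which lines up with the gap to the 3070 growing as the resolution goes up.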
 
Sure there is. The consoles load & resume much more quickly, stutter less often, [Sony] has better exclusives, and there are fewer hassles re: game & driver bugs & configuration.

It annoys me that PCs fall behind that way, but I stick with mine for keyboard & mouse support and better modding & multitasking.
Define "stutter less" and what it means for you, because that's definitely not true. Unless you are talking about some specific title, an exception.
 
I'm sorry, I don't know what you are on about. My AMD GPU has been an amazing experience no matter what game I play: no issues with software or anything, and no lack of features here either (RT included). Can we stop promoting Nvidia and insulting people's hardware choices (AMD)?
Ehh? Not insulting anyone, just stating facts. AMD can't match Nvidia's features, the end. If you are only playing popular titles using pure raster and don't care about features that can revamp older games, then sure, AMD is fine, for the most part.

RTX with its Tensor and RT cores just allows for awesome stuff that you simply won't get to experience on an AMD GPU.

RT modding / RTX Remix, DLSS, DLDSR, DLAA, Reflex, NVENC/ShadowPlay, Ansel, just to name a few. AMD has nothing that matches these features.

FSR was recently shown not to beat DLSS in any title (Hardware Unboxed on YouTube). DLSS easily won overall and FSR did not beat DLSS in a single game. Plus, AMD has no answer to DLSS 3, which can transform your performance with heavy RT, and Nvidia already has way superior RT performance. DLSS 3 with Reflex is great in heavy-RT single-player games. Obviously not talking about multiplayer gaming here.

Why do you think AMD is talking about FSR 3 now, with just simple frame interpolation? AMD almost panicked when DLSS 2 arrived and still can't match it at all. I can use DLSS in far more games than FSR. FSR works on my Nvidia GPU, but the end result is never better than DLSS.

Call me a fanboy, whatever. I just have experience with the Nvidia features that I talk about.

DLDSR was insanely awesome in Elden Ring, improving graphics a lot without a huge performance hit, and the image quality was much better than native. Another highly underrated feature that AMD users barely know exists.
 
Ehh? Not insulting anyone, just stating facts. AMD can't match Nvidia's features, the end.
You mean features like RT, which is hardly ever used in games, and fake-frame DLSS 3, which skipped last-gen Nvidia cards so you have to buy the latest series just to have those gimmicks, or features as you call them. I just play games to enjoy them; I don't buy into "features". I have both AMD and Nvidia cards. Your "facts" are your opinion, and this is just mine.
 
Define "stutter less" and what it means for you, because that's definitely not true. Unless you are talking about some specific title, an exception.
I mean that out-of-the-box, the console titles I play tend to immediately work and play smoothly, albeit at perhaps lower absolute performance.

On PC, I agree that after the day 1 and week 1 patches, after the driver updates, and after you've done some tweaking, you'll probably find settings that work well. But I'm counting those periods and that general fussiness against the PC experience, because it's a hassle I don't have as much on the consoles. That's why my overall impression is that my PC glitches more often than my console.

I'll admit it's not a fair fight, in that I'm demanding much more of my PC and usually eventually getting it. But it doesn't change my impression that, averaging it all together, I experience more stutters, glitches, crashes, and "must download patch/update now" BS on the PC.
 
It's really unfathomable what Nvidia is doing here. Why are they pricing these cards so high?
The simple answer, as I see it: corporate hubris, and being drunk on Nvidia fanboys who used to shell out as much money as they could, and far more than they should have, just for bragging rights - IMO.
 
I mean that out-of-the-box, the console titles I play tend to immediately work and play smoothly, albeit at perhaps lower absolute performance.

On PC, I agree that after the day 1 and week 1 patches, after the driver updates, and after you've done some tweaking, you'll probably find settings that work well. But I'm counting those periods and that general fussiness against the PC experience, because it's a hassle I don't have as much on the consoles. That's why my overall impression is that my PC glitches more often than my console.

I'll admit it's not a fair fight, in that I'm demanding much more of my PC and usually eventually getting it. But it doesn't change my impression that, averaging it all together, I experience more stutters, glitches, crashes, and "must download patch/update now" BS on the PC.
"immediately work and play smoothly"

If you had followed Digital Foundry's YouTube channel, you would know that's simply not true.
 
The 4070 seems like the most reasonable choice at the moment. I mean, if you need serious gaming power while not being a cousin of Elon Musk, Jeff Bezos or the Sultan of Brunei.
 
I used to upgrade regularly, but not anymore. I'm still using my 1060, and that will be the last card I buy unless prices return to being sensible. Luckily, most good games now are from small companies and run on low-end PCs, while AAA games are just the same mediocre games remade over and over.
 
I wonder if GPU developers ever thought of creating a kind of VRAM slot so we could upgrade it afterwards just like we do with RAM.
Data reads/writes from/to DRAM are by far the slowest part of rendering -- this is why GPUs have so many cores, so that they can continue working on new instruction threads while previous ones are stalled waiting for data. Adding any additional latency to the memory system is something the likes of AMD, Intel, and Nvidia actively go out of their way to avoid. Adding any kind of RAM slot to a graphics card would add some latency, albeit not a huge amount.

Other reasons for not doing it are that it would make cooling the whole shebang really awkward, and therefore more expensive; there's no standard system for such a thing in the industry, and AMD, Intel, and Nvidia are unlikely to agree on a common format; it would add cost and complexity to PCB manufacturing; and there's a risk of installing the wrong RAM, which could easily bork your very expensive graphics card.
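To put a rough number on that latency-hiding point, here's a Little's-law style sketch; the bandwidth, latency, and request-size figures are illustrative assumptions, not measurements of any particular card:

```python
# Little's law: to keep the memory system saturated, a GPU needs roughly
# bandwidth x latency worth of data in flight at all times.
def outstanding_requests(bandwidth_gbs: float, latency_ns: float,
                         request_bytes: int = 128) -> float:
    bytes_in_flight = bandwidth_gbs * latency_ns  # GB/s * ns = bytes
    return bytes_in_flight / request_bytes

soldered = outstanding_requests(288, 300)   # ~300 ns round trip, assumed
socketed = outstanding_requests(288, 330)   # +10% latency from a hypothetical slot

print(f"soldered: ~{soldered:.0f} outstanding 128-byte requests to stay busy")
print(f"socketed: ~{socketed:.0f} -> more warps parked waiting, or bandwidth left unused")
```

Even a small latency bump means the GPU has to keep more work in flight just to stand still, which is exactly the kind of margin the vendors don't want to give away.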
 
Nvidia stock popped 25% on earnings this morning. Up 166% year to date. If you think they are going to lower prices, you are sorely mistaken. Everyone just needs to get used to higher prices for GPUs. I don't like it any more than you do, but the shareholders have spoken. They don't care about gamers. AI is now their core profit driver. Hail to the AI king, baby!
 