Nvidia GeForce RTX 4060 Ti 16GB Reviewed, Benchmarked

"Today, we're taking our first look at the 16GB version of the much-loved GeForce RTX 4060 Ti. It seems that this is indeed the case, as you all can't stop talking about your feelings towards the 4060 Ti, and we can't stop benchmarking it, so we must all love it."
....
"For now, though, it's a terribly priced graphics card that you can ignore."

That's all you have to read from this article.

Now let's all spread our love towards Nvidia.
 
Thank you for the review!

What I do not get is this:
- on one hand, you recommend that users mostly use HIGH settings because ultra adds little to nothing in terms of quality
- on the other hand, you go out of your way to show that 8GB is not enough for ULTRA settings
So maybe this is related to futureproofing, but as long as the market is full of 8GB cards, game developers need to make their games work on 8GB in order to address a large enough market.

Also there is this affirmation in the article:
"We're not sure where it originates or what the exact theory is, but the belief seems to be that there simply isn't enough memory bandwidth to effectively use a 16GB memory buffer."
As far as I understand graphics computing, the two are not dependent on each other, but both are limiting factors. You can have 16GB of memory for large textures, but you also need a fast bus to bring them to the GPU. So with the 16GB card Nvidia addressed the buffer problem but did not address the bus. The higher the resolution, the greater the need for memory bandwidth. This is somewhat mitigated by the 32MB L2 cache, but once the resolution is increased and more data needs to be brought in from memory, the cache becomes insufficient and it shows that the bus is too slow. You can see that at 1080p the 16GB card is a bit faster than the 8GB one because the limiting factor was the buffer, while at 1440p they are almost the same, as I believe the bandwidth becomes the bottleneck. Also, the 3060 Ti, which has more raw bandwidth, does not suffer the same degradation from 1080p to 1440p.
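To put rough numbers on that bus argument, here is a minimal sketch using the commonly listed specs (128-bit bus with 18 Gbps GDDR6 on the 4060 Ti, 256-bit bus with 14 Gbps GDDR6 on the 3060 Ti); these figures are assumptions for illustration, not taken from the review:

```python
# Rough peak-bandwidth arithmetic for the bus argument above.
# Assumed specs (commonly listed, not taken from the review):
#   RTX 4060 Ti (8GB/16GB): 128-bit bus, 18 Gbps GDDR6
#   RTX 3060 Ti:            256-bit bus, 14 Gbps GDDR6

def peak_bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) * per-pin data rate."""
    return (bus_width_bits / 8) * gbps_per_pin

cards = {
    "RTX 4060 Ti (128-bit, 18 Gbps)": (128, 18.0),
    "RTX 3060 Ti (256-bit, 14 Gbps)": (256, 14.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: ~{peak_bandwidth_gb_s(bus, rate):.0f} GB/s peak")

# ~288 GB/s vs ~448 GB/s: the 16GB card adds capacity, not bandwidth.
# The 32MB L2 hides part of that gap, but less so as resolution rises.
```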
 
So, in the "unoptimized" games, AKA the games made for modern 16GB consoles, the 16GB 4060 Ti shows a noticeable advantage, and going forward, that will continue to be the case.

In a year we've gone from "8GB is fine, 16GB is useless" to "one or two games won't run right on 8GB" to "8GB is useless in an increasing number of games".

Now, if the 4060 Ti 16GB were the $300 4050 Ti it should have been (and is), then it might have actually been decent.
 
Thank you for the review!

What I do not get is this:
- on one hand, you recommend that users mostly use HIGH settings because ultra adds little to nothing in terms of quality
- on the other hand, you go out of your way to show that 8GB is not enough for ULTRA settings
So maybe this is related to futureproofing, but as long as the market is full of 8GB cards, game developers need to make their games work on 8GB in order to address a large enough market.
How long do you expect game developers to kneecap themselves for old hardware? You can run games on 8GB, just use low settings at 720p, like the Xbox Series S.

At some point you have to move on. The onus is on the hardware makers; 8GB was on $250 GPUs 7 years ago. There is no reason the low end shouldn't have at least 10-12GB today.
 
"As for the 16GB RTX 4060 Ti, obviously you shouldn't buy it at the current price – for $400 maybe, but we don't expect it to hit that price any time soon."
My two cents, it's going to hit that price soon. They can keep pushing this bullsh1t pricing, but when nobody buys, they'll start losing big. Supply and demand, you mfers...
 
How long do you expect game developers to kneecap themselves for old hardware? You can run games on 8GB, just use low settings at 720p, like the Xbox Series S.

At some point you have to move on. The onus is on the hardware makers; 8GB was on $250 GPUs 7 years ago. There is no reason the low end shouldn't have at least 10-12GB today.
I expect developers to optimize games for 8GB as long as the vast majority of the target market consists of 8GB video cards. I do not really know the market composition, but from what I can see in the Steam surveys (which I don't fully trust, but I have no alternative source), the >8GB market share is below 20%. Also, keep in mind that of that 20%, 11.5 percentage points are 12GB cards, a considerable number of which are 3060 12GB cards, which are by no means suitable for ULTRA settings. So, while I expect game developers to bake in textures that make use of those large buffers, they can't address only that ~15% of the market.
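For what it's worth, a back-of-the-envelope version of that share arithmetic (the percentages are the rough figures quoted above, not fresh survey data):

```python
# Back-of-the-envelope market-share arithmetic using the rough figures
# quoted above (illustrative only, not fresh Steam survey data).
share_above_8gb = 0.20    # cards with more than 8GB of VRAM
share_12gb      = 0.115   # portion of that which is 12GB cards, many of them 3060 12GB

share_8gb_or_less = 1.0 - share_above_8gb
share_above_12gb  = share_above_8gb - share_12gb

print(f"8GB or less: ~{share_8gb_or_less:.0%}")   # ~80%
print(f"12GB bucket: ~{share_12gb:.1%}")
print(f"Above 12GB:  ~{share_above_12gb:.1%}")    # ~8.5%
# The point being made above: the overwhelming majority of the installed
# base still sits at 8GB or less, so developers can't target only the
# big-buffer slice of the market.
```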

As for the buffer size, the problem might be that scaling RAM cells does not work as well on newer processes as it did in the past. So while RAM speed has increased in recent times, density has not increased that much, and so it does not offset the cost increase of the new processes. In order to have larger buffers, we would need more memory dies and thus a higher price.
 
Nvidia card you should ignore gets 65/100.

AMD card that offers the best value at a certain price gets 20/100.

"(y) (Y)"

The 6500 XT was unforgivably awful. It deserved the score it got.

“Best value” my rear end. You had to have a current-gen platform to even get the most out of it (largely defeating the purpose of its existence), and it was severely stripped down in functionality because it was a repurposed laptop GPU. It was the “best value” largely thanks to the GPU shortage, somehow managing to stay on shelves despite said shortage (I wonder why?). In reality, even at its $200 MSRP, it was twice as expensive as it should ever have been. It was also no faster than its predecessor!
 
Finally, a review I can use as a reference to promote the $450 6800 for 1440p gaming. Who knows, maybe by this Black Friday Steve's preferred $350 price for the 4060 Ti with 16 gigs of VRAM might come to fruition.
 
Perhaps this review should have been titled - "16GB RTX 4060 Ti - How Nvidia continues to think it can gouge its customers and how this card will also fail in the market."
 
One thing in the video (and screen grabs) that isn't explicitly mentioned is that 12GB (and often 10GB) would fix all of these stuttering games.

It highlights how the latest consoles are really driving this, and with 16GB of shared RAM, targeting 10-12GB of VRAM makes a lot of sense. So while "future proofing" is always a consideration, the odds that games go past 12GB at 1080p before the next generation of consoles are pretty low, which makes that a good level for mainstream cards for the next few years.
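As a rough illustration of why 10-12GB falls out of the console math, here is a sketch; the OS-reservation and game-data figures are assumptions based on commonly reported numbers, not anything from the review:

```python
# Rough console-derived VRAM target. The reservation/split figures below
# are assumptions based on commonly reported numbers, not from the review.
total_shared_gb  = 16.0   # current-gen console unified memory
os_reserved_gb   = 2.5    # roughly what the OS keeps for itself (assumed)
game_cpu_data_gb = 3.0    # game code, logic, audio, etc. (assumed)

vram_like_budget = total_shared_gb - os_reserved_gb - game_cpu_data_gb
print(f"Graphics-side budget on console: ~{vram_like_budget:.1f} GB")  # ~10.5 GB

# Which is roughly why 10-12GB is a sensible target for mainstream PC
# cards handling console ports, as argued above.
```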
 
I'm shocked by these benchmarks! Shocked, I tell you! All the ***** fanboy experts were saying adding an extra 8GB of RAM would turn the RTX 4060 Ti into an RTX 4090. How could they be so wrong... again?
 
I'm shocked by these benchmarks! Shocked, I tell you! All the ***** fanboy experts were saying adding an extra 8GB of RAM would turn the RTX 4060 Ti into an RTX 4090. How could they be so wrong... again?
Nobody was saying that, 0/10 terrible strawman argument. What they DID say was: "An extra 8GB will fix the issues where the 4060 Ti is running out of VRAM and, as a result, performance is terrible." And in those situations, the 16GB card DID fix the issues. Funny how that works.
I expect developers to optimize games for 8GB as long as the vast majority of the target market consists of 8GB video cards. I do not really know the market composition, but from what I can see in the Steam surveys (which I don't fully trust, but I have no alternative source), the >8GB market share is below 20%. Also, keep in mind that of that 20%, 11.5 percentage points are 12GB cards, a considerable number of which are 3060 12GB cards, which are by no means suitable for ULTRA settings. So, while I expect game developers to bake in textures that make use of those large buffers, they can't address only that ~15% of the market.
News flash: console sales make up the majority of video game sales, usually 60-70%, sometimes higher. THEY are the benchmark, not people running 5-year-old budget cards. Developers will target consoles for baseline performance; PC users with slow hardware can play at low settings and resolution if they can't even match a Series S in performance.

"high" and "ultra" settings are not going to run well on budget hardware. They never have. for some reason, people have forgotten this. If you want to play at high settings, you better have as much ram as current consoles. Its not that hard.
 
Nobody was saying that, 0/10 terrible strawman argument. What they DID say was: "An extra 8GB will fix the issues where the 4060 Ti is running out of VRAM and, as a result, performance is terrible." And in those situations, the 16GB card DID fix the issues. Funny how that works.
Don't try to walk back your claims once you got pantsed; have the courage to admit you were wrong (again), and live and learn.
 
Big engineering shortcoming there. If they had just gone with a 192-bit bus and cheap, slow GDDR6, 12GB could have been a sweet spot.
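For context on that 192-bit point: GDDR6 capacity steps directly with bus width, since each 32-bit channel normally gets one 16Gb (2GB) module, and clamshell mounting doubles it. A quick sketch:

```python
# GDDR6 capacity options by bus width, assuming standard 16Gb (2GB)
# modules, one per 32-bit channel; clamshell mounting doubles it.
MODULE_GB = 2  # one 16 Gbit GDDR6 module

def vram_options_gb(bus_width_bits: int) -> tuple[int, int]:
    channels = bus_width_bits // 32
    normal = channels * MODULE_GB
    return normal, normal * 2  # (single-sided, clamshell)

for bus in (128, 192, 256):
    single, clam = vram_options_gb(bus)
    print(f"{bus}-bit bus: {single} GB, or {clam} GB in clamshell")

# 128-bit -> 8 GB or 16 GB (the two 4060 Ti variants)
# 192-bit -> 12 GB, the "sweet spot" described above
```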
 
What I do not get is this:
- on one hand you recommend users to mostly use HIGH settings because ultra brings little to nothing to quality
- on the other hand, you go out of your way to show that 8GB is not enough for ULTRA settings
"Ultra" settings for things like geometry, shadows, shader effects, post-processing and so on are usually inefficient in terms of how much visual improvement they bring compared to how much performance they cost.
The same does not apply to textures. Higher-resolution textures have no effect whatsoever on FPS: as long as you have enough VRAM for them, you can turn the setting to "ultra" and it won't affect your performance at all. For textures, all that matters is the amount of VRAM you have.
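To put rough numbers on the "textures cost VRAM, not FPS" point, here is a sketch of what a single block-compressed texture occupies; the BC7 rate of one byte per texel and the ~33% mip-chain overhead are standard figures used here purely for illustration:

```python
# Rough VRAM footprint of one block-compressed (BC7, ~1 byte per texel)
# texture including its mip chain (~33% extra). Illustrative only.
def texture_vram_mb(width: int, height: int, bytes_per_texel: float = 1.0) -> float:
    base = width * height * bytes_per_texel
    return base * 4 / 3 / (1024 ** 2)   # full mip chain adds roughly a third

for size in (2048, 4096, 8192):
    print(f"{size}x{size} BC7 texture: ~{texture_vram_mb(size, size):.0f} MB")

# ~5 MB, ~21 MB, ~85 MB per texture: higher texture quality is a
# capacity question, not a per-frame compute question, which is the
# point being made above.
```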

So maybe this is related to futureproofing, but so long as the market is full of 8GB cards, then game developers need to make the games work on 8GB in order to address a large enough market.
They do. If your card doesn't have enough VRAM, you can turn your texture settings down. The issue here is that consoles have 16 GB of shared memory, with 10+ GB being able to be used as VRAM, and 8 GB cards will simply not be able to match the visual quality of the consoles (as far as textures go). It's pathetic that Nvidia is launching $300, $400 GPUs that can't match the quality settings consoles use.

Also there is this affirmation in the article:
"We're not sure where it originates or what the exact theory is, but the belief seems to be that there simply isn't enough memory bandwidth to effectively use a 16GB memory buffer."
As far as I understand graphics computing the two are not dependent but both are limiting factors. While you can have 16GB memory for large textures, you also need a fast bus to bring them to the GPU.
Your understanding is wrong then. Texture quality settings are not bandwidth intensive, and texture filtering settings have a very modest bandwidth requirement (we got past the point where anisotropic filtering is "free" performance-wise on PC over 15 years ago). If a card has 16 GB of VRAM, it can benefit from it to use high-res textures, regardless of how narrow its bus is.
It's compute and shaders that have high bandwidth requirements. That's why having caches in the order of MBs helps RDNA and Ada GPUs. What goes into those caches is shader code and data (vectors/matrices) the GPU is doing compute work on, not textures. The textures that go into a 3D scene don't fit into those caches to begin with.

The larger the resolution, the larger the need for memory bandwidth.
Yes, because shaders (especially pixel shaders) now have to work with even more data and even larger framebuffers. It's not because of textures.
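A quick sketch of the render-target side of that argument, assuming plain 4-byte RGBA8 targets (real games use several targets per frame, often in wider formats):

```python
# Size of a single render target at different resolutions, assuming a
# plain 4-byte RGBA8 format (real games use several, often wider, targets).
def target_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / (1024 ** 2)

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
for name, (w, h) in resolutions.items():
    print(f"{name}: ~{target_mb(w, h):.1f} MB per RGBA8 target")

# ~7.9 MB, ~14.1 MB, ~31.6 MB, and each target is read and written many
# times per frame by the shaders, which is where the bandwidth pressure
# at higher resolutions comes from.
```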

Also, keep in mind that from these 20%, 11.5% are 12GB, of which a considerable amount are 3060 12GB, which are by no means suitable to run with ULTRA settings.
The 3060 is perfectly capable of handling ultra settings for textures, because 1) texture settings don't affect framerate, and 2) it has enough VRAM to do it.

As for the buffer size, the problem might be that scaling RAM cells does not work as well on newer processes as it did in the past. So while RAM speed has increased in recent times, density has not increased that much, and so it does not offset the cost increase of the new processes. In order to have larger buffers, we would need more memory dies and thus a higher price.
DRAM prices right now are the lowest they have been in ages. Recent estimates are that 16 Gigabit GDDR6 modules are at around $6 to $7 each, meaning 8 GB worth of modules costs less than $30 to AIB partners.
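Using that module-price estimate, the bill of materials for the extra capacity is easy to approximate; the per-module price is the rough figure quoted above, and the $100 retail gap is the difference between the two 4060 Ti MSRPs:

```python
# Rough BOM cost of the extra 8GB, using the module-price estimate quoted
# above (roughly $6-7 per 16Gb module; estimates, not confirmed figures).
MODULE_GB = 2            # a 16 Gbit GDDR6 module holds 2 GB
PRICE_PER_MODULE = 7.0   # upper end of the quoted estimate, in USD

extra_gb = 8             # the 8GB -> 16GB step on the 4060 Ti
modules_needed = extra_gb // MODULE_GB
print(f"{modules_needed} extra modules, ~${modules_needed * PRICE_PER_MODULE:.0f} in DRAM")
# 4 extra modules, roughly $28 in DRAM, versus the $100 retail premium
# between the 8GB and 16GB models.
```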
 
The 6500 XT was unforgivably awful. It deserved the score it got.

“Best value” my rear end. You had to have a current-gen platform to even get the most out of it (largely defeating the purpose of its existence), and it was severely stripped down in functionality because it was a repurposed laptop GPU. It was the “best value” largely thanks to the GPU shortage, somehow managing to stay on shelves despite said shortage (I wonder why?). In reality, even at its $200 MSRP, it was twice as expensive as it should ever have been. It was also no faster than its predecessor!
Even Techspot considered it to have the best value, here you go:

For those of you looking to spend as little as possible the cheapest new graphics card is the unfortunate Radeon RX 6500 XT. Essentially, it's a bad product with numerous issues that costs too much for what it is. However, there's no alternative for under $450, so if you desperately need a new graphics card and the used market is out of the question, this is it.
You and Techspot have the same problem: you fail to see the bigger picture. No matter how "bad", "outdated" or "expensive" the 6500 XT was, it wasn't suitable for cryptomining, which meant it was widely available. And no matter how expensive it was compared to its predecessor, it still offered good value at the time it was released. By that logic basically every GPU release since 2019 sucks, because the pricing is horrible.

It's funny that AMD got hate when they released something that you could actually buy, while Nvidia got praise when they released something "better" (the 3050) that was nowhere to be found.

Those who say the 6500 XT was a bad release should understand that many people had just two choices: 1. Buy a 6500 XT. 2. Don't buy anything. Taking option 1 away seemed to be better for some reason 🤦‍♂️
 
Even Techspot considered it to have the best value, here you go:


You and Techspot have the same problem: you fail to see the bigger picture. No matter how "bad", "outdated" or "expensive" the 6500 XT was, it wasn't suitable for cryptomining, which meant it was widely available. And no matter how expensive it was compared to its predecessor, it still offered good value at the time it was released. By that logic basically every GPU release since 2019 sucks, because the pricing is horrible.

It's funny that AMD got hate when they released something that you could actually buy, while Nvidia got praise when they released something "better" (the 3050) that was nowhere to be found.

Those who say the 6500 XT was a bad release should understand that many people had just two choices: 1. Buy a 6500 XT. 2. Don't buy anything. Taking option 1 away seemed to be better for some reason 🤦‍♂️


Availability is no indicator of quality. And if you continue past the cited paragraph of the article you linked, USED RX 570, RX 5500 XT, and 1650S cards were strongly recommended over the poor 6500 XT. It was just an awful product. It was completely and unnecessarily neutered beyond its utter lack of mining performance (celebrating that lack also means celebrating the existence of a $200+ 4GB card released this decade), and the real Achilles' heel was the paltry FOUR PCIe lanes it had to work with, which basically killed performance unless you had a newer platform. Like I said, for $100, maybe even $150, fine. But $200? It would have to have at least eight lanes, hardware decode, and three display outputs to even remotely justify a card that provided no generational performance boost over its predecessor! Ironically, AMD beat Nvidia in setting this terrible precedent by a full generation in this instance. But I'm sure this isn't the only unfortunate time such a phenomenon has occurred (except for GPU launches that were clearly just refreshes).


Ultimately, as AMD/nVidia have shown with their latest generation of GPUs, they have absolutely no issue taking a page out of Intel’s 4-core-hell-book during the years of Bulldozer. Ironically, it now seems that Intel may be the only answer to this same phenomenon occurring in the GPU market.
 
Availability is no indicator of quality. And if you continue past the cited paragraph of the article you linked, USED RX 570, RX 5500 XT, and 1650S cards were strongly recommended over the poor 6500 XT. It was just an awful product. It was completely and unnecessarily neutered beyond its utter lack of mining performance (celebrating that lack also means celebrating the existence of a $200+ 4GB card released this decade), and the real Achilles' heel was the paltry FOUR PCIe lanes it had to work with, which basically killed performance unless you had a newer platform. Like I said, for $100, maybe even $150, fine. But $200? It would have to have at least eight lanes, hardware decode, and three display outputs to even remotely justify a card that provided no generational performance boost over its predecessor! Ironically, AMD beat Nvidia in setting this terrible precedent by a full generation in this instance. But I'm sure this isn't the only unfortunate time such a phenomenon has occurred (except for GPU launches that were clearly just refreshes).
You do not seem to remember how bad the situation was. Any, repeat, ANY card that you could buy new and that was capable of 1080p gaming was a good card. When options are limited, even crap becomes good quality.

At that time, the recommendation was always "buy a used one", but many still wanted a new card, for obvious reasons. If you continue reading that article, there were many recommendations for new cards in other price categories, for obvious reasons. The article even states those used ones are better IF you can find them.

All of those technical shortcomings came from the fact that the 6500 XT was never meant to be a desktop part. No matter how much it was supposed to suck, it sold like hotcakes above MSRP. That is what I mean: no matter how good or bad a product is, it's the best if nothing else is available. Like it or not.
Ultimately, as AMD/nVidia have shown with their latest generation of GPUs, they have absolutely no issue taking a page out of Intel’s 4-core-hell-book during the years of Bulldozer. Ironically, it now seems that Intel may be the only answer to this same phenomenon occurring in the GPU market.
To be fair, there is still some oversupply of older products, and AMD/Nvidia would have to accept big losses if they released cards that were better buys than the unsold older-gen cards. Nvidia even had to pay for 5nm capacity in advance. Basically, Nvidia is currently selling parts that were likely manufactured around a year ago. AMD, on the other hand, is still selling off 6700/6800 cards launched over 2.5 years ago.

I don't blame either Nvidia or AMD for this generation (4000 series for Nvidia, 7000 series for AMD); the reasons are mostly elsewhere. If the same crap still happens with the next-gen cards, that's a different story.
 
Even Techspot considered it to have the best value, here you go:


You and Techspot have the same problem: you fail to see the bigger picture. No matter how "bad", "outdated" or "expensive" the 6500 XT was, it wasn't suitable for cryptomining, which meant it was widely available. And no matter how expensive it was compared to its predecessor, it still offered good value at the time it was released. By that logic basically every GPU release since 2019 sucks, because the pricing is horrible.

It's funny that AMD got hate when they released something that you could actually buy, while Nvidia got praise when they released something "better" (the 3050) that was nowhere to be found.

Those who say the 6500 XT was a bad release should understand that many people had just two choices: 1. Buy a 6500 XT. 2. Don't buy anything. Taking option 1 away seemed to be better for some reason 🤦‍♂️
What is this consooomer mindset? It's available, thus it has value?

Bruh no. The 6500 and 6400 were objectively bad values, no matter how you twist it.

AMD got "hate" because they released a 1650, 4 years later, at a 33% HIGHER price, with a gimped PCIe bus on top of it, that has no real place in the stack. Sorry, but being the underdog is no excuse to release half baked products. Do better.
 
You do not seem to remember how bad the situation was. Any, repeat, ANY card that you could buy new and that was capable of 1080p gaming was a good card. When options are limited, even crap becomes good quality.
No, crap is still crap; you are convincing yourself that it doesn't smell so you can justify consooming product instead of just... not doing that.
 