Nvidia RTX 4060 Ti specs confirmed by die shot and retail listing

Daniel Sims

Something to look forward to: As the anticipated release dates of the GeForce RTX 4060 Ti and Radeon RX 7600 draw closer, ushering in a new generation of mainstream gaming GPUs, additional leaks seem to confirm previous rumors about Nvidia's chip. The upcoming GPU, based on the AD106 core, will controversially feature 8GB of VRAM.

Reliable leaker MEGAsizeGPU has posted a die shot of the upcoming desktop variant of the RTX 4060 Ti, confirming it is based on the AD106-350 GPU. Retail listings of MSI pre-built systems featuring the card also confirm it includes 8GB of GDDR6 VRAM, which could be a concern given the demands of recent high-profile PC game releases.

Previous leaks suggested that, in addition to these specs, the 4060 Ti will feature 4,352 CUDA cores, 18 Gbps VRAM speed, 32 MB of L2 cache, and a 128-bit memory bus for 288 GB/s of bandwidth, with a 160W TGP. Performance may be similar to the RTX 3070 Ti if its 8GB of VRAM doesn't hold it back.
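The rumored bandwidth figure is easy to sanity-check: peak bandwidth is simply the bus width in bytes multiplied by the effective per-pin data rate. A minimal sketch, using the leaked (unconfirmed) numbers:

```python
# Peak memory bandwidth in GB/s: bus width (bits) divided by 8 bits-per-byte,
# times the effective per-pin data rate in Gbps.
def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# Leaked RTX 4060 Ti figures: 128-bit bus, 18 Gbps GDDR6.
print(memory_bandwidth_gbs(128, 18))  # 288.0, matching the rumored 288 GB/s
```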

Nvidia's choice of memory for the card could prove problematic, as some recent high-end PC games have consumed substantial amounts of VRAM. Titles such as Hogwarts Legacy, the Dead Space Remake, and The Last of Us Part 1 encountered performance issues and other problems on GPUs with less than 12GB of VRAM.

The recently launched Star Wars Jedi: Survivor might be the worst case of poor optimization yet. The game appears to eat so much memory that, in some situations, even the mighty 24GB RTX 4090 struggles with it. CPU issues are also to blame, and developer Respawn Entertainment has promised to improve the game's performance.

The release date for the RTX 4060 Ti remains uncertain, but it may debut at Computex 2023 at the end of May alongside the Radeon RX 7800 XT, 7700 XT, and 7600. While there is no concrete information on the 4060 Ti's price, a DigiTimes report suggests it will be $399 – the same price as its predecessor.

Details about the 4060 Ti's younger sibling, the desktop 4060, are also still under wraps. Rumors from February indicate it will also have 8GB of VRAM and be based on the same GPU as its laptop counterpart. The specifications suggest that the 4060 has less memory and fewer CUDA cores than its predecessor, the 12GB RTX 3060, although it will feature higher clocks and faster VRAM.


 
I don't think 8GB of VRAM is "controversial," but I do think charging $400 for a card and only putting 8 gigs on it is. VRAM is CHEAP. Putting 16 gigs instead of 8 on a card isn't going to make the 4060 perform like a 4090. Just put the effing VRAM on the card. The return rate on the 40 series is already sky high, 4070s are sitting on shelves, and then they're going to charge $400 for an 8-gig 4060? At $400, you can absolutely absorb the extra $20 it's going to cost to put 16 gigs of VRAM on the card. For this money you can nearly buy a PlayStation 5, AND THAT HAS 16 gigs of VRAM.
 
I don't think 8GB of VRAM is "controversial," but I do think charging $400 for a card and only putting 8 gigs on it is. VRAM is CHEAP. Putting 16 gigs instead of 8 on a card isn't going to make the 4060 perform like a 4090. Just put the effing VRAM on the card. The return rate on the 40 series is already sky high, 4070s are sitting on shelves, and then they're going to charge $400 for an 8-gig 4060? At $400, you can absolutely absorb the extra $20 it's going to cost to put 16 gigs of VRAM on the card. For this money you can nearly buy a PlayStation 5, AND THAT HAS 16 gigs of VRAM.
The PS5 has 16GB of GDDR6, which is shared between graphics memory and system memory. So the PS5 doesn't have 16GB of VRAM.
 
So in three months, it's a 1080p GPU for medium/high settings, without ray tracing...

Yeah, I know I'm being a little "extreme," but for the price... unbelievable, and I'm sure it'll still be a top seller...
 
You should stop comparing console RAM or VRAM... devs have to optimize for one fixed console hardware spec; for PC... you already know, they couldn't care less. Either they pay incompetent third parties to do the job, or they assume that with FSR and DLSS the GPU makers will do the job for them. That's why this upscaling isn't always "good": it moved the optimization work from the devs to the GPU producers. In the old days, you never saw "special optimized drivers" when a AAA game hit the shelves; it was the devs' job to optimize for the hardware, not the hardware's job to be optimized for the game. That's why we used to get driver updates every three months or more... now there are new drivers every week...

(tired today sorry for my bad english )
 
The PS5 has 16GB of GDDR6, which is shared between graphics memory and system memory. So the PS5 doesn't have 16GB of VRAM.
What's the G for in GDDR?

Jokes aside, the PS5 skips the "I have to store things in main memory" issue and puts it all on the GPU side. You can either store 10 gigs of assets in system memory AND on the graphics card, or take a more direct approach with some smart coding and keep it all in the GPU's memory. The PS5 also has interesting direct-access high-speed storage, which was meant to work around the GDDR memory issue. The PS5 does have 512MB of separate system memory, just so you know.

EDIT:
And I would like to point out that while the VRAM on the PS5 may be used for other things, it still has 16 gigs of VRAM for $400.
 
The PS5 can allocate 10-12GB, or maybe more, of its RAM as VRAM. Technically, a PC GPU needs 12GB of VRAM to compete with consoles on texture quality at 1440p in new titles. That's why the 4060 Ti will be dead on arrival at $400.
The 4060 Ti will be a tough choice for consumers. It has DLSS 3 but doesn't have enough VRAM.
The 3080 12GB has the VRAM but doesn't have DLSS 3.
It is not funny or enjoyable at all.
Let's hope that AMD's FSR 3 will make it at least enjoyable for Nvidia video cards too, if it works as advertised.
 
Nvidia simply has no ethics; it is fully anti-consumer and downright predatory in its practices and business strategies where mainstream consumers are concerned.

The solution is simple: stop buying Nvidia. Not just in the short term, but commit to never patronizing the brand again.
 
It's only recently that games have started requiring huge amounts of graphics memory, but is this simply because the highest-end hardware offers it? Since it appears Nvidia - and the buying public's response to its more premium offerings - is effectively and maybe deliberately locking the mainstream gaming market to 8GB of VRAM, perhaps this is a cue for developers to make a better effort to stay within that limit, and thus not render obsolete the many thousands of cards that are still being sold new with that 'limited' amount of VRAM.

Or, since AMD is responsible for the internal specs of the current major consoles, it could step up to the plate, declare 12GB to be the new minimum, and wipe the floor with Nvidia as cross-platform titles all become "Optimized for AMD (min. spec. 10GB VRAM)" - a spec it ensures gamers can reach for $250-300...
 
The 4060 Ti will be a tough choice for consumers. It has DLSS 3 but doesn't have enough VRAM.
The 3080 12GB has the VRAM but doesn't have DLSS 3.

As a 3080 12GB owner, I'll gladly take the extra VRAM over DLSS 3. For what I play (FPS titles), I couldn't care less about fake frame generation.

That aside, there’s zero excuse for not bumping up the VRAM specs on these new cards, other than NVIDIA wanting to manufacture a continuous upgrade cycle.
 
I wouldn't buy this GPU and I think anyone that buys it over a PS5/X-Series is a complete fool.

If you're dead set on PC, then accept that you have to pay a ridiculous premium and get the 4090. If you don't want to pay that - and to me that's just too silly an amount just to play video games - then you have to accept that PC gaming for new AAAs is not for you. Either get an older used GPU and play older games/indies, or skip PC altogether. The iGPU solutions nowadays can actually play virtually every indie game and can perform well on many older titles as well.
 
I wouldn't buy this GPU and I think anyone that buys it over a PS5/X-Series is a complete fool.

If you're dead set on PC, then accept that you have to pay a ridiculous premium and get the 4090. If you don't want to pay that - and to me that's just too silly an amount just to play video games - then you have to accept that PC gaming for new AAAs is not for you. Either get an older used GPU and play older games/indies, or skip PC altogether. The iGPU solutions nowadays can actually play virtually every indie game and can perform well on many older titles as well.
That's ridiculous, you don't need a $1600 flagship to have a better experience than the consoles. Consoles have GPUs equivalent to a RTX 3060 or RX 6600 XT (which are the current equivalents to the RTX 2070/RTX 2070 Super, which are the GPUs that usually match the consoles in the tests done by Digital Foundry). If you want a PC, all you need is to ensure you have a GPU with 12 GB of VRAM or more. The RX 6700 XT for $350 already fits the bill and outperforms the console GPUs by a considerable margin (~30% faster than the 6600 XT in reviews), or any other card above the 6700 XT. For Nvidia, you can get a RTX 3060, which matches the performance of the consoles, or the RTX 4070 which is roughly twice as fast as them (~100% faster than the 6600 XT in reviews).
 
There's a lot I like about console gaming, but there are just too many PC-specific advantages that, when they matter to you, make it easy to feel stuck with PC:

- keyboard & mouse for FPS aiming
- multi-tasking, such as having your Discord, browser, and video player active simultaneously on screen with good audio mixing
- modding
- ultrawide support

Ultimately for me it's tough to replace one with the other, it's more that ideally you have them both available for different moods/needs.

Re: the pricing comparison - while the shared GDDR architecture is interesting to understand technically, the point is that the PS5 Digital is a complete, self-contained gaming system for $399, including not just the shared 16GB of GDDR6 but also the APU, motherboard, power supply, case, cooling system, storage, operating system, and a controller. That does make an 8GB mid-range card at the same price harder to stomach, at least for me.
 
It's only recently that games have started requiring huge amounts of graphics memory, but is this simply because the highest-end hardware offers it? Since it appears Nvidia - and the buying public's response to its more premium offerings - is effectively and maybe deliberately locking the mainstream gaming market to 8GB of VRAM, perhaps this is a cue for developers to make a better effort to stay within that limit, and thus not render obsolete the many thousands of cards that are still being sold new with that 'limited' amount of VRAM.

Or, since AMD is responsible for the internal specs of the current major consoles, it could step up to the plate, declare 12GB to be the new minimum, and wipe the floor with Nvidia as cross-platform titles all become "Optimized for AMD (min. spec. 10GB VRAM)" - a spec it ensures gamers can reach for $250-300...
Or maybe they can spend the extra $20 and put a proper amount of VRAM on $400+ cards.
 
There's a lot I like about console gaming, but there are just too many PC-specific advantages that, when they matter to you, make it easy to feel stuck with PC:

- keyboard & mouse for FPS aiming
- multi-tasking, such as having your Discord, browser, and video player active simultaneously on screen with good audio mixing
- modding
- ultrawide support

Ultimately for me it's tough to replace one with the other, it's more that ideally you have them both available for different moods/needs.

Re: the pricing comparison - while the shared GDDR architecture is interesting to understand technically, the point is that the PS5 Digital is a complete, self-contained gaming system for $399, including not just the shared 16GB of GDDR6 but also the APU, motherboard, power supply, case, cooling system, storage, operating system, and a controller. That does make an 8GB mid-range card at the same price harder to stomach, at least for me.
More than that, it's the fact that Microsoft and Sony are losing money on console hardware... they can, because they make their cash on games, online services, peripherals, etc. It's nearly impossible to compare this to a GPU price on PC.
 
EDIT:
And I would like to point out that while the VRAM on the PS5 may be used for other things, it still has 16 gigs of VRAM for $400.
In FY2022, Sony took hardware revenues in its gaming sector of roughly $6.4 billion. AMD's financial year runs Dec-Dec (Mar-Mar for Sony), but it saw revenue of $6.8 billion for its comparable sector. Approximately $3.8 billion of that was Sony, so this paper-napkin exercise puts roughly half the price of a PS5 toward covering the cost of the APU alone. So yes, it does indeed have 16 GB of GDDR6, but the unit itself makes no, or next to no, profit. One can't expect that with a graphics card.
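For transparency, here is the arithmetic behind that napkin estimate, using the revenue figures quoted above (the $3.8 billion Sony share is an estimate from the comment, not a reported figure):

```python
# All figures in $ billions, taken from the comment above - rough estimates, not audited numbers.
sony_gaming_hw_revenue = 6.4   # Sony gaming-sector hardware revenue, FY2022
amd_revenue_from_sony = 3.8    # estimated Sony share of AMD's $6.8B comparable sector

apu_share = amd_revenue_from_sony / sony_gaming_hw_revenue
print(f"{apu_share:.1%}")  # 59.4% - i.e., "roughly half" of hardware revenue
```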

Jokes aside, the PS5 skips the "I have to store things in main memory" issue and puts it all on the GPU side. You can either store 10 gigs of assets in system memory AND on the graphics card, or take a more direct approach with some smart coding and keep it all in the GPU's memory.
In both a PS5 and a PC, the GPU only works with specific memory - the unified RAM in the case of the former, and the onboard local RAM with the latter. The fact that all bar one PC game stores a copy of the assets in system memory doesn't affect the actual VRAM usage.

Or maybe they can spend the extra $20 and put a proper amount of VRAM on $400+ cards.
Even with bulk discounts, 8 GB of high-speed GDDR6 is unlikely to cost $20. Compared to the rest of the DRAM market, such products are relatively low-volume sellers (in comparison, not in raw figures) - Jon Peddie estimated that 38 million discrete graphics cards were shipped in 2022, whereas 286 million PCs were shipped in that period, and that doesn't account for discrete DIMM sales either.

But even if it were super cheap, adding more RAM does have further costs. Having 16 GB on a 128-bit bus means using 8 modules in clamshell mode, so the PCB needs to account for the additional traces, and the power delivery to the RAM has to handle double the demand. Minor things, of course, but when graphics card sales have been steadily declining for the past decade, vendors aren't going to constantly absorb additional costs when everything else has increased in price.
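To make the clamshell point concrete, here is a rough sketch of the module math, assuming 16 Gb (2 GB) GDDR6 dies with 32-bit interfaces - the standard configuration at the time:

```python
# GDDR6 modules expose a 32-bit interface; 16 Gb (2 GB) is the common top density.
BUS_WIDTH_BITS = 128
MODULE_IFACE_BITS = 32
MODULE_CAPACITY_GB = 2

channels = BUS_WIDTH_BITS // MODULE_IFACE_BITS   # 4 channels on a 128-bit bus
normal_gb = channels * MODULE_CAPACITY_GB        # one module per channel -> 8 GB
clamshell_gb = 2 * normal_gb                     # two modules share each channel -> 16 GB

print(channels, normal_gb, clamshell_gb)  # 4 8 16
```

So without clamshell (or higher-density dies), an 8GB cap falls straight out of the 128-bit bus choice.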

While the significant markup on high-end graphics cards can cover such things, every cent counts at the other end of the scale. AIB vendors typically have very low operating margins, and as Intel's graphics revenues have shown, it's easy to make a substantial loss in this particular market.

As frustrating as it is to see yet another 8 GB card come out, AMD, Intel, and Nvidia are probably feeling very jittery about sales right now and are going to keep costs down as much as they can.
 
DOA. 8GB won't cut it, any way you slice it. DLSS 3 vs 2 is not worth it - just slightly nicer fake frames. Man, I have a feeling we're gonna have a big laugh at Nvidia's quarterly report. If AMD thinks it's not worth cutting prices because people are still going to massively buy Nvidia, it is far off. This is the perfect time to strike in the midrange, where RT and other cool stuff just don't matter at 2K and up, and currently Nvidia has a piece of sh1t line of cards. Make the 7800 XT 16GB 500-550 bucks, AMD, and triple your profits!
 
The good news is that the gimped performance in the latest games will come up in many reviews.

This is no longer a “future proofing” issue but a play today’s games issue.
 
The PS5 has 16GB of GDDR6, which is shared between graphics memory and system memory. So the PS5 doesn't have 16GB of VRAM.
It's an extremely efficient 16GB of RAM shared between CPU and GPU, unlike a PC. Show me a port of a PS5/Xbox Series X game that runs with 8GB of RAM and 8GB of VRAM on PC.

You need double or triple that to run like the consoles.
 