Nvidia RTX 4060 Ti and RTX 4050 launch dates have leaked, but no word about the RTX 4060

DragonSlayer101

Something to look forward to: Nvidia announced the RTX 4090 and RTX 4080 last September, followed by the RTX 4070 Ti earlier this year. The RTX 4070 is tipped to go official on April 13, and a new leak has now seemingly revealed the launch time frames for the RTX 4060 Ti and the RTX 4050.

According to reliable tipster Zed Wang, the RTX 4060 Ti could debut by the end of May, while the RTX 4050 might drop in June. While earlier rumors suggested that the standard RTX 4060 would make its way to store shelves in May, Wang says Nvidia hasn't yet decided on its launch window.

In another tweet, Wang posted an image that's said to be the RTX 4060 Ti box design template. Packaging templates are a standard precursor to every Nvidia GPU launch: Nvidia supplies the design to its board partners, who use it to create the packaging for their customized versions of the card. The company sends the templates out just before the card enters mass production.

Although earlier leaks revealed some crucial specifications of the 4060 Ti, the product description on the box does not seem to confirm any of that. Instead, it lists some of the card's software features and connectivity options.

According to various leaks, the RTX 4060 Ti is based on the AD106-350-A1 GPU, has 4,352 CUDA Cores, 8GB of GDDR6 VRAM at 18Gbps, and a rated TGP of 160W. The card might also have a 128-bit memory bus, giving it a 288GB/s bandwidth.
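For context, that bandwidth figure follows directly from the rumored bus width and memory speed. A quick back-of-the-envelope check, using the leaked (and unconfirmed) numbers above:

```python
# Memory bandwidth check using the rumored (unconfirmed) RTX 4060 Ti figures.
BUS_WIDTH_BITS = 128   # rumored memory bus width
DATA_RATE_GBPS = 18    # rumored GDDR6 speed per pin, in Gbps

# Bandwidth (GB/s) = bus width (bits) x data rate (Gbps) / 8 bits per byte
bandwidth_gbs = BUS_WIDTH_BITS * DATA_RATE_GBPS / 8
print(f"{bandwidth_gbs:.0f} GB/s")  # -> 288 GB/s, matching the leak
```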

There are no more details about the upcoming GPU, but gamers have expressed their displeasure about the rumored specs since they became public. The most concerning is the 8GB of VRAM, which is insufficient for many modern games. Some have also questioned Nvidia's decision to use the AD106 GPU instead of AD104, which would have made for a more powerful card.

Meanwhile, earlier rumors suggested that the standard RTX 4060 would debut in June, so there is speculation about whether Nvidia will announce both the 4060 and the 4060 Ti simultaneously. It remains to be seen how everything will pan out, but it sure looks like we're getting close to more Nvidia 40-series graphics cards launching in the coming weeks and months.


 
Hooray! nVidia just came out with the RX 5600 XT and RX 6600 (XT). For $500+. Hooray!
As a current 5700 XT owner I can attest those cards will be trash, considering this card is trash. 1080p gaming for me is a joke. Dead by Daylight on low with 100% texture resolution = almost 70% gpu usage....in the lobby. It Takes Two, all low, about 70% or more. The new Resident Evil 4 demo with "recommended" settings and FSR, nearly 100% gpu usage while my 12400 sits at no more than 30% in games. Truly garbage hardware that aged like fine dog crap. I've owned an MSI Ventus 3070 LHR that also fried, along with the motherboard. That card was a joke too.
 
The "Ti" versions should have been maintained only with the xx80 series. Putting "Ti" after (almost) every lesser cards degrades the face value of it. Usually it stood for the "xx90 killer".
 
The 4050 with a 96-bit bus will be rendered obsolete by next-gen iGPUs. The 4060 Ti is looking at around 3070-ish performance for $450+, and the 4050 is rumoured to be $250. In years gone by, this would have been classified as an xx30-class card.
 
Nvidia is keen not to repeat their "mistake" from the Pascal era. Those cards aged well in large part because they had abundant VRAM for their time (the 1060 with 6GB and the 1070 with 8GB... in 2016!).

This is planned obsolescence at its worst.
 
The 4050 with a 96-bit bus will be rendered obsolete by next-gen iGPUs. The 4060 Ti is looking at around 3070-ish performance for $450+, and the 4050 is rumoured to be $250. In years gone by, this would have been classified as an xx30-class card.

The lowered bandwidth is going to be a problem for older PCIe 3.0 builds though. TBH, if someone has a newer CPU with PCIe 4.0 they also have Resizable BAR, so if they're looking for mid-range performance, Intel Arc would be the play, especially if Battlemage is improved and the drivers continue to mature.

Otherwise the 3060 Ti might actually be a better purchase than the 4060 Ti, especially depending on prices. The 4070 if you're willing to pay the premium, but it might be a massive premium. All of this is a mess imo.
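To put the PCIe 3.0 concern in rough numbers, here's a minimal sketch, assuming (as rumors suggest, but Nvidia hasn't confirmed) the card uses an x8 electrical link like other recent mid-range GPUs:

```python
# Approximate usable PCIe link bandwidth per lane (GB/s), after 128b/130b encoding.
PER_LANE_GBS = {"PCIe 3.0": 0.985, "PCIe 4.0": 1.969}
LANES = 8  # assumed x8 link -- rumored for mid-range Ada cards, not confirmed

for gen, per_lane in PER_LANE_GBS.items():
    print(f"{gen} x{LANES}: ~{per_lane * LANES:.1f} GB/s")
# PCIe 3.0 x8: ~7.9 GB/s  (roughly half of what the same card gets on a 4.0 board)
# PCIe 4.0 x8: ~15.8 GB/s
```

The halved host bandwidth on a 3.0 board matters most when the GPU spills past its 8GB of VRAM and has to shuffle data over the bus.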
 
As a current 5700 XT owner I can attest those cards will be trash, considering this card is trash. 1080p gaming for me is a joke. Dead by Daylight on low with 100% texture resolution = almost 70% gpu usage....in the lobby. It Takes Two, all low, about 70% or more. The new Resident Evil 4 demo with "recommended" settings and FSR, nearly 100% gpu usage while my 12400 sits at no more than 30% in games. Truly garbage hardware that aged like fine dog crap. I've owned an MSI Ventus 3070 LHR that also fried, along with the motherboard. That card was a joke too.
Not sure what you mean. GPU usage is supposed to be 98-100%; that just means it is being fully utilized. It being lower would mean it is being limited or bottlenecked by something. It's as if you have been giving the GPU as little work as possible, and then you are complaining it is working too hard while it's hardly doing anything.
Low frame rate (fps) is what indicates low performance.

And CPU usage around 30% is perfect: there's headroom left over so tasks don't pile up and create stuttering. 100% CPU usage would be an unplayable stuttering mess, because tasks would not be completed in time to hand the finished data to the GPU to create frames.
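As a rough illustration of the point being argued here, a toy sketch (all timings are made up, not measurements): frame time is set by whichever of the CPU or GPU takes longer on a frame, so one side sitting near 100% just means it is the limiter while the other side idles.

```python
# Toy model of a render loop: each frame needs some CPU work and some GPU work,
# and the frame time is roughly set by whichever side takes longer.
# All timings below are made-up illustrations, not measurements.

def frame_stats(cpu_ms: float, gpu_ms: float) -> None:
    frame_ms = max(cpu_ms, gpu_ms)        # the slower side sets the pace
    fps = 1000.0 / frame_ms
    cpu_busy = 100.0 * cpu_ms / frame_ms  # share of the frame each side spends working
    gpu_busy = 100.0 * gpu_ms / frame_ms
    print(f"CPU {cpu_busy:.0f}% busy, GPU {gpu_busy:.0f}% busy -> {fps:.0f} fps")

frame_stats(cpu_ms=5.0, gpu_ms=15.0)   # GPU-bound: GPU ~100%, CPU ~33%, ~67 fps
frame_stats(cpu_ms=15.0, gpu_ms=5.0)   # CPU-bound: CPU ~100%, GPU ~33%, ~67 fps
```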
 
Too many 3060s and 3060 Tis left to sell; the 4060 won't come before that inventory is gone.

The 4070 was heavily delayed too, because of 3070 and 3080 inventory. However, they are almost gone now. Perfectly timed: the 4070 releases in about 2 weeks.

Same for AMD, and that's the reason the 7900 series was priced high. Prices are steadily coming down for the 7900 XT and XTX as well, mostly for the XT though, which was priced far too high (the XTX was better perf per dollar, which is just wrong for a flagship).

The RTX 4080 has dropped about 10% in price since launch as well.
When the 4080 Ti hits, it will probably drop to the 899-999 USD range.

Bet we will see a 4080 Ti with a 320-bit bus and 20 GB of memory in H2.

I don't think AMD will have anything better than the 7900 XTX (the chip is maxed out unless they can add more modules, and what would the naming be then?), so they need to compete on performance per dollar instead, which is fine. A 7900 XTX for 799-849 USD would be a good deal for people mostly into raster perf, which is the majority.
 
Too many 3060s and 3060 Tis left to sell; the 4060 won't come before that inventory is gone.

The 4070 was heavily delayed too, because of 3070 and 3080 inventory. However, they are almost gone now. Perfectly timed: the 4070 releases in about 2 weeks.

Same for AMD, and that's the reason the 7900 series was priced high. Prices are steadily coming down for the 7900 XT and XTX as well, mostly for the XT though, which was priced far too high (the XTX was better perf per dollar, which is just wrong for a flagship).

The RTX 4080 has dropped about 10% in price since launch as well.
When the 4080 Ti hits, it will probably drop to the 899-999 USD range.

Bet we will see a 4080 Ti with a 320-bit bus and 20 GB of memory in H2.

I don't think AMD will have anything better than the 7900 XTX (the chip is maxed out unless they can add more modules, and what would the naming be then?), so they need to compete on performance per dollar instead, which is fine. A 7900 XTX for 799-849 USD would be a good deal for people mostly into raster perf, which is the majority.

The 4070 Ti was delayed because it was a rebadge of the 4080 12GB…
 
Having used a 3060 Ti 8GB and recently tested it on games like the Resident Evil 4 remake and The Last of Us, it is clear the core is willing but the memory capacity is weak. This GPU is plenty fast enough, even at 1440p for the most part, to deliver a good experience, but it is distinctly handicapped by the lack of VRAM. There are more examples recently too.

The 4060 Ti looks like a disaster waiting to happen. A card that should theoretically be faster than the 3060 Ti but hampered by the same memory limitations? It's unacceptable for a $450 (perhaps even $500) card to have only 8GB of video memory today, and things will only get worse with every new release.....
 
For me, anything below 192-bit and 12GB of VRAM is a waste of time and money. All the recent cards I've owned had at least a 192-bit bus. Cards below that point didn't age that well. Launch-day performance charts don't matter; all of them will become obsolete quite fast with the newer games to come. At current prices, 256-bit and 16GB of VRAM is preferable if you plan to keep the card for more than 2-3 years.
 
While the first releases of the 4060 Ti, 4060, and 4050 are going to be low in VRAM capacity, there's nothing stopping AIB vendors from developing PCBs that can field double the number of GDDR6 modules.

The first releases will only have four (or three, in the case of the 4050) modules, but GDDR6 supports clamshell mode which allows for two modules to be paired to a single 32-bit memory controller. So, theoretically, it's possible to have a 16 GB 4060 and a 12 GB 4050.

Whether we actually see them at some point in the future is another matter, though.
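A minimal sketch of the capacity math behind that, assuming standard 2 GB (16 Gbit) GDDR6 modules and the rumored 128-bit and 96-bit buses:

```python
# VRAM capacity for the rumored bus widths, assuming 2 GB (16 Gbit) GDDR6 modules.
# Clamshell mode pairs two modules on each 32-bit memory controller, doubling
# capacity without widening the bus.
MODULE_GB = 2

for name, bus_bits in {"128-bit (4060 Ti / 4060)": 128, "96-bit (4050)": 96}.items():
    controllers = bus_bits // 32
    normal = controllers * MODULE_GB          # one module per controller
    clamshell = controllers * 2 * MODULE_GB   # two modules per controller
    print(f"{name}: {normal} GB normal, {clamshell} GB clamshell")
# -> 8 GB / 16 GB on a 128-bit card, 6 GB / 12 GB on a 96-bit card
```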
 
The 4070 Ti was delayed because it was a rebadge of the 4080 12GB…

Not only that. The price was reduced as well; it was 899 USD before the re-launch, and why do you think that was? They had to delay it and lower the price to sell out the remaining last-gen cards first. Logic 101.

Mining crash = buyers stop buying = a flood of both Nvidia and AMD cards, HENCE the high prices on the 4000 and 7900 series. Instead of selling THOUSANDS if not millions of GPUs at reduced prices, they sold last gen near MSRP and priced the new gen HIGH. No Sherlock needed here.
 
Affordable garbage. 8GB of VRAM? 128-bit? Seriously? 500 bucks and you'll probably get 10 more fps than a 3060 Ti at 1080p. Higher resolutions would struggle. But wait, apply DLSS 3 and you'll get 30 more and voila, a shiny new overpriced turd by Nvidia.
 
The 40 series is trash outside the 4090, and I strongly suspect the same song and dance will happen with the 50 series, except I bet prices are at least 25% higher by then across the board. It will be about 3 years until the 50 series releases. I suspect the GPU companies all expect another crypto bubble and want to have higher prices this time around to take advantage.
 
Not sure what you mean. GPU usage is supposed to be 98-100%; that just means it is being fully utilized. It being lower would mean it is being limited or bottlenecked by something. It's as if you have been giving the GPU as little work as possible, and then you are complaining it is working too hard while it's hardly doing anything.
Low frame rate (fps) is what indicates low performance.

And CPU usage around 30% is perfect: there's headroom left over so tasks don't pile up and create stuttering. 100% CPU usage would be an unplayable stuttering mess, because tasks would not be completed in time to hand the finished data to the GPU to create frames.
No. Nononono. You don't want *either* to be at 100% because the load isn't constant. 100% means you're getting bottlenecked by whatever is at 100%, because it means there's more work than can be processed.
 
As a current 5700 XT owner I can attest those cards will be trash, considering this card is trash. 1080p gaming for me is a joke. Dead by Daylight on low with 100% texture resolution = almost 70% gpu usage....in the lobby. It Takes Two, all low, about 70% or more. The new Resident Evil 4 demo with "recommended" settings and FSR, nearly 100% gpu usage while my 12400 sits at no more than 30% in games. Truly garbage hardware that aged like fine dog crap. I've owned an MSI Ventus 3070 LHR that also fried, along with the motherboard. That card was a joke too.

What are you talking about?
It Takes Two lists a GTX 980 as the recommended GPU for 1080p.
The RX 5700 XT is at least 40% faster than that. It can easily do 1080p 60 at high settings.

If you are getting low fps, maybe it is because of your RAM or super resolution.
 
No. Nononono. You don't want *either* to be at 100% because the load isn't constant. 100% means you're getting bottlenecked by whatever is at 100%, because it means there's more work than can be processed.
How can the GPU be a bottleneck if it is being fully utilized... That is the goal; it can't do more.
 
These cards sound like a poor upgrade. I went from a GTX 660Ti to my current GTX 1060 6GB. That was 10 years ago. $260 bought me a lot of graphics card back then. Now NVIDIA is shafting the most important segment of the GPU market with crap that is way overpriced. Why the hell doesn't the "RTX 4060Ti" have double the memory of my current card? 8GB sounds like a step backwards. Is this NVIDIA's strategy to force you to pay hundreds more for higher tier models? Or is that me being cynical?
 