Both the Radeon RX 7600 and GeForce RTX 4060 Ti could arrive in about five weeks

midian182

Rumor mill: Those looking for a new graphics card might have two new options in a few weeks, with both Nvidia's RTX 4060 Ti and AMD's Radeon RX 7600 rumored to arrive around the same time Computex takes place in Taiwan: May 30 – June 2.

News of the mid-range cards' release comes from Igor's Lab. Igor Wallossek writes that, according to his sources, the RTX 4060 Ti will launch around the end of May or the beginning of June. That coincides with Computex, so expect the card to appear during the event. It also matches previous leaks about the card's release date.

Nvidia has come under a barrage of criticism for its Ada Lovelace pricing. Even its cheapest entry, the RTX 4070 with its $600 MSRP, is proving too big an ask for most consumers at a time of economic uncertainty. The poor demand has reportedly seen Nvidia inform AIB partners that it would pause production of RTX 4070 GPUs for at least one month to prevent overstocking, allowing retailers more time to clear existing inventory.

According to a report by DigiTimes, Nvidia is finally paying attention to the negative feedback on its previous releases and is considering launching the RTX 4060 Ti at the same $399 price as its last-generation predecessor, the RTX 3060 Ti.

The RTX 4060 Ti is believed to be based on the AD106-350-A1 GPU and to come with 4,352 CUDA cores, 8GB of GDDR6 VRAM at 18Gbps, 32MB of L2 cache, and a rated TGP of 160W. It might also feature a 128-bit memory bus for 288GB/s of bandwidth. We're expecting performance similar to the RTX 3070 Ti, but if these specs prove accurate, consumers will likely be unhappy about the 8GB of VRAM at a time when so many big games are demanding more video memory.
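For reference, the quoted 288GB/s figure follows directly from the rumored memory specs: peak bandwidth is the bus width (in bytes) multiplied by the effective data rate. A quick sanity check, assuming the leaked 128-bit bus and 18Gbps GDDR6 hold up:

```python
# Peak GDDR bandwidth in GB/s = bus width (bits) / 8 * effective data rate (Gbps).
def gddr_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(gddr_bandwidth_gb_s(128, 18))  # rumored RTX 4060 Ti: 288.0 GB/s
print(gddr_bandwidth_gb_s(256, 14))  # RTX 3060 Ti (256-bit, 14Gbps), for comparison: 448.0 GB/s
```

If those figures are right, the new card would actually have less raw memory bandwidth than the outgoing RTX 3060 Ti's 448GB/s, with the much larger L2 cache presumably meant to compensate.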

As for the Radeon RX 7600, Wallossek writes that AMD board partners will show the card off at Computex. However, these will be manufacturers who only sell Radeon cards, such as Sapphire. Companies that make both AMD and Nvidia products are reportedly holding off on producing the new RDNA 3 entry until they determine whether it will be worth it.

There's no word on how much the RX 7600 might cost, though under $350 is a general estimate. It's expected to use the Navi 33 GPU, feature 32 compute units, 1,792 shaders, 8GB of GDDR6, and a 128-bit memory bus, and to perform around the same as the RX 6600 XT.

Computex starts in just over five weeks, so we should find out all the details about Nvidia's and AMD's next cards during the event.


 
According to a report by DigiTimes, Nvidia is finally paying attention to the negative feedback on its previous releases and is considering launching the RTX 4060 Ti at the same $399 price as its last-generation predecessor, the RTX 3060 Ti.

The RTX 4060 Ti is believed to be based on the AD106-350-A1 GPU and to come with 4,352 CUDA cores, 8GB of GDDR6 VRAM at 18Gbps, 32MB of L2 cache, and a rated TGP of 160W. It might also feature a 128-bit memory bus for 288GB/s of bandwidth.

So, in other words: something close to or slightly better than RTX 3070 performance, for $100 less than that same level of performance cost two years ago (crypto insanity notwithstanding), and no bump in VRAM.

Not hard to figure out why the community is rejecting the 40 series (and probably AMD’s 7600 as well).
 
The 7600 needs to be $250 or below to be even remotely attractive; you can get the ASRock Challenger 6650 XT for $259 right now! There are plenty of other 6650 XT options under $299.

The 4060 Ti is pointless because it only has the VRAM to be a 1080p offering, but it will be priced for 1440p. You're better off going with a 6700 XT if you want a good 1440p card, and you can get one for under $350 right now. The 4060 Ti should be a $299 card and the 4060 should be $250 if the VRAM is going to be limited to 8GB.
 
The 7600 deserves to be DOA while the 6600/6650 XT are plentiful at $300 or less for 8GB, because people buying low-end or mainstream cards aren't bothered about power efficiency, and that's the only card AMD will have to play when trying to sell this not-an-upgrade.

After their recent shot across Nvidia's bow, AMD should put their money where their mouth is: put a minimum of 12GB on the 7600 for $300 as the mainstream 1080p offering, pile the shelves high with them, and watch the market share come their way from value-seekers who don't fancy getting Last-of-Us-ed into upgrading a year-old card.
 
If they really believe that, they'll be surprised yet again when this one doesn't sell either. They'll have to do $100 Steam vouchers within a week of release to move them.
 
"There's no word on how much the RX 7600 might cost, though under $350 is a general estimate ... and perform around the same as the RX 6600 XT."

That's a weird price estimate when the 6600 XT currently costs around $230. Or did you mean the 6700 XT?
 
RT RT RT RT is all I see guys talking about. I care about frame rates, not some goofy ahh technology that kills my frames for some barely better lighting you won't even pay attention to when actually playing the game instead of sightseeing. RT is obsolete.
 
The 7600 needs to be $250 or below to be even remotely attractive; you can get the ASRock Challenger 6650 XT for $259 right now! There are plenty of other 6650 XT options under $299.

The 4060 Ti is pointless because it only has the VRAM to be a 1080p offering, but it will be priced for 1440p. You're better off going with a 6700 XT if you want a good 1440p card, and you can get one for under $350 right now. The 4060 Ti should be a $299 card and the 4060 should be $250 if the VRAM is going to be limited to 8GB.

I see lmao

The 3070 beats the 6700 XT across 25 games in 4K by 14% in 1% low fps at high settings, using the newest drivers for both cards :joy:

https://www.techpowerup.com/review/asus-geforce-rtx-4070-tuf/36.html
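For anyone unfamiliar with the metric being cited here: "1% lows" are usually computed by averaging the slowest 1% of frames in a benchmark run, which is why they surface stutter (and VRAM pressure) that an average-fps number hides. A rough sketch of the common calculation, with made-up frame times for illustration (exact methods vary between reviewers):

```python
# "1% low" fps: average framerate over the slowest 1% of frames in a capture.
def one_percent_low_fps(frame_times_ms: list[float]) -> float:
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    count = max(1, len(worst) // 100)              # worst 1%, at least one frame
    return 1000.0 / (sum(worst[:count]) / count)

# Hypothetical run: mostly ~10 ms frames (~100 fps average) with a few 30 ms stutters.
frame_times = [10.0] * 990 + [30.0] * 10
print(round(one_percent_low_fps(frame_times), 1))  # ~33.3 fps 1% low
```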
 
I see lmao

The 3070 beats the 6700 XT across 25 games in 4K by 14% in 1% low fps at high settings, using the newest drivers for both cards :joy:

https://www.techpowerup.com/review/asus-geforce-rtx-4070-tuf/36.html
The latest games are putting 8GB squarely in the 1080p category. The 4060 Ti with 8GB of VRAM is a 1080p card. It will perform well in older titles requiring less VRAM, but the 6700 XT has already proven to be a better option when VRAM requirements exceed 8GB, which newer titles do at 1440p. Plus, it's cheaper. It's not ideal, but it's still a better buy. It also would not surprise me if AMD drops the price of the 6700 XT to $300 when the 4060 Ti is released, which would make it an even better buy. The 4060 Ti would be worth the $400 price tag at 10GB; hopefully the rumors are wrong, but it's probably too late for a correction. I imagine there are already cards sitting in warehouses at this point. Nvidia's VRAM choices will come back to bite them this generation, and rightfully so. They'll learn, and I imagine the 50 series will be a much better option when it releases. Or possibly even a 40 series refresh next year with more VRAM; I think that is a real possibility given the reception of the 4070.

You always have to look at current games, not games from 2.5 years ago. What is releasing now is what you should base your purchase on if you want to be able to play newer releases at higher settings. If you are okay with medium settings and a limited set of games where you'll be able to enable RT, or fake frames, then the 4060 Ti might be a better buy for you.
 
What about 7700xt and 7800xt?
The 7800 XT is rumored to only have 60 CUs compared to the 72 in the 6800 XT. It looks to only be around the 6900 XT in performance if that is the case. The only thing you'd really be getting is better RT performance. Considering you can get the 6900 XT for $550 and the 6950 XT for $600 at the moment, and that the 7800 XT is likely to be at least $649 like its predecessor, I'm not sure there is a compelling reason to bring it to market. Obviously those are rumored specs, but so far these rumors have all been dead on. It seems to me that 60 CUs should make for a 7800 non-XT, and maybe that will end up being the case. Perhaps they'll get 72 CUs out of a cut-down Navi 31 and release that as the 7800 XT (or 7800 XTX) instead.
 
The latest games are putting 8GB squarely in the 1080p category. The 4060 Ti with 8GB of VRAM is a 1080p card. It will perform well in older titles requiring less VRAM, but the 6700 XT has already proven to be a better option when VRAM requirements exceed 8GB, which newer titles do at 1440p. Plus, it's cheaper. It's not ideal, but it's still a better buy. It also would not surprise me if AMD drops the price of the 6700 XT to $300 when the 4060 Ti is released, which would make it an even better buy. The 4060 Ti would be worth the $400 price tag at 10GB; hopefully the rumors are wrong, but it's probably too late for a correction. I imagine there are already cards sitting in warehouses at this point. Nvidia's VRAM choices will come back to bite them this generation, and rightfully so. They'll learn, and I imagine the 50 series will be a much better option when it releases. Or possibly even a 40 series refresh next year with more VRAM; I think that is a real possibility given the reception of the 4070.

You always have to look at current games, not games from 2.5 years ago. What is releasing now is what you should base your purchase on if you want to be able to play newer releases at higher settings. If you are okay with medium settings and a limited set of games where you'll be able to enable RT, or fake frames, then the 4060 Ti might be a better buy for you.
There are tons of current games in that list, and Atomic Heart is one of the best-looking games ever, yet it runs just fine on 8GB cards, even in 4K.


Try again.

Name a single game that isn't flawed (like the TLOU console port) that needs more than 8GB at 1440p, let alone 1080p, from a proper source. I am waiting.

Simply BS, and the 3070 still easily beats the 6700 XT in any game.

The 3070 beats the 6700 XT by 25% in terms of minimum fps in Atomic Heart using high settings in 4K... And you think 8GB is obsolete for 1080p :joy: :joy:

AMD only tries to push the 6000 series because they have tons of inventory left. The Nvidia 3000 series sold like hotcakes in comparison. This is why AMD is not rushing the 7800 series and will release the 7600 and 7700 series first. Logic 101.

The 7600 series will get 8GB.
The 7700 series will get 12GB.

AMD started this VRAM panic, and mostly AMD owners believe it.
Why? Because AMD has nothing else on Nvidia right now. Nvidia sits at 85% dGPU market share, and climbing. AMD is doing worse than ever in terms of GPU sales. MCM was not the lifesaver AMD thought it would be. They are in full panic mode, hence trying to stop buyers from buying Nvidia over the amount of VRAM.

You don't have to be a rocket scientist to understand this. In reality, 8GB is more than fine for 1440p gaming.
 
RT RT RT RT is all I see guys talking about. I care about frame rates, not some goofy ahh technology that kills my frames for some barely better lighting you won't even pay attention to when actually playing the game instead of sightseeing. RT is obsolete.
Not one mention of RT in this article, and you're the first to bring it up in the forums. Why do people continue to use hyperbole to argue against something no one is talking about? People said the same thing about automobiles, but as they improved, horses were used less and less by people who needed to get someplace.
 
The 3070 beats the 6700 XT by 25% in terms of minimum fps in Atomic Heart using high settings in 4K... And you think 8GB is obsolete for 1080p :joy: :joy:
I am sorry, but who buys a 3070 or 6700 XT for 4K? Obviously this is not the target for those cards, so how is this relevant to the topic at hand? Just like 4GB cards, 8GB is eventually becoming obsolete (for more demanding games), and 10GB just does not make sense, let alone 12GB, in new AAA games.
 
I am sorry, but who buys a 3070 or 6700 XT for 4K? Obviously this is not the target for those cards, so how is this relevant to the topic at hand? Just like 4GB cards, 8GB is eventually becoming obsolete (for more demanding games), and 10GB just does not make sense, let alone 12GB, in new AAA games.
No one buys or recommends those cards for 4K. The GPUs are too slow, yet VRAM is not the issue. It was a statement to prove that 8GB is more than fine for 1440p... Think!

It still proves that the 3070 beats the 6700 XT - EVEN IN 4K - with 4GB less VRAM, in MINIMUM FPS, which will reveal ANY VRAM issue. The 3070 is 25% faster. Not 2%, not 5%, but TWENTY FIVE PERCENT.

By the time 8GB cards are obsolete for 1440p gaming, they won't be running anywhere near max settings anyway, because the GPU is too slow.

Atomic Heart uses ~6GB for 1440p maxed out. The game looks better than 99.9% of games, and you think 8GB is obsolete for 1440p just because a rushed console port like TLOU came out? LMAO!

Don't believe everything AMD wants you to believe. After all, they are doing worse than ever in terms of GPU sales.

Also, don't believe random YouTubers who want to make you watch their crap so they can earn money. Panic = views = money.

Reality is that 8GB is fine for 1440p gaming outside of bugged games and rushed ports, yet most people with 8GB cards won't be maxing out settings anyway, because the GPU is too slow to do it, meaning VRAM usage drops.
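For what it's worth, anyone who wants to test a claim like "~6GB at 1440p" on their own system can read VRAM usage directly while a game is running. A minimal sketch for Nvidia cards, using nvidia-smi's standard query flags (AMD users would need a different tool, and note that allocated VRAM isn't necessarily the same as what a game strictly needs):

```python
# Minimal sketch: report current VRAM usage per Nvidia GPU via nvidia-smi.
# Assumes nvidia-smi is on PATH; run it while the game is loaded.
import subprocess

def vram_used_mib() -> list[int]:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return [int(line) for line in out.stdout.splitlines() if line.strip()]

for i, used in enumerate(vram_used_mib()):
    print(f"GPU {i}: {used} MiB of VRAM in use")
```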
 
Don't believe everything AMD wants you to believe. After all, they are doing worse than ever in terms of GPU sales.

Also, don't believe random YouTubers who want to make you watch their crap so they can earn money. Panic = views = money.
I don't believe anyone's opinion; I trust my own opinion (judgement), because everyone always has one. I am no fan of either anyway: I have an AMD desktop and an Nvidia laptop. I'm not raving about how much better AMD is than Nvidia, I just believe in common sense. Let people decide what is best for them.
 
No one buys or recommends those cards for 4K. The GPUs are too slow, yet VRAM is not the issue. It was a statement to prove that 8GB is more than fine for 1440p... Think!

It still proves that the 3070 beats the 6700 XT - EVEN IN 4K - with 4GB less VRAM, in MINIMUM FPS, which will reveal ANY VRAM issue. The 3070 is 25% faster. Not 2%, not 5%, but TWENTY FIVE PERCENT.

By the time 8GB cards are obsolete for 1440p gaming, they won't be running anywhere near max settings anyway, because the GPU is too slow.

Atomic Heart uses ~6GB for 1440p maxed out. The game looks better than 99.9% of games, and you think 8GB is obsolete for 1440p just because a rushed console port like TLOU came out? LMAO!

Don't believe everything AMD wants you to believe. After all, they are doing worse than ever in terms of GPU sales.

Also, don't believe random YouTubers who want to make you watch their crap so they can earn money. Panic = views = money.

Reality is that 8GB is fine for 1440p gaming outside of bugged games and rushed ports, yet most people with 8GB cards won't be maxing out settings anyway, because the GPU is too slow to do it, meaning VRAM usage drops.
If you watch the Hardware Unboxed video on the A4000 vs the 3070, you'll see that the 3070 is kneecapped by its lack of VRAM. TLOU Part I and other games run just fine on the A4000, albeit slower than a 3070 with 16GB would have been able to run them. At the end of the video, Steve even tells you why this is: Nvidia holds back on VRAM in its gaming lineup in order to sell its productivity cards for a lot more. Why would you buy an A4000 when you could install Studio drivers on a much cheaper 3070 with 16GB? Well, because the 3070 is intentionally capped at 8GB just so you have to pay a lot more.

Also in the video, he mentions that developers are telling him that, going forward, the recommended requirements for even 1080p/1440p will be 12GB+ of VRAM. It's not that every game is poorly optimized; it's that games really are moving up in required VRAM for ultra/high textures. I'm sure 8GB will be supported for medium/low textures for quite some time, so it's not like your 3070 is a brick, but the fact that so much performance is being left on the table simply because of insufficient VRAM is a huge slap in the face to Nvidia customers. I feel this myself with a 10GB 3080, realizing that the card is easily capable of 1440p/ultra in upcoming titles except for the fact that it won't have enough VRAM in some of those titles.

It's Nvidia you should be upset with, not those stating the obvious: the 3070, 3070 Ti, 3060 Ti, and 3080 all suffer from a lack of VRAM that will force you to lower settings the GPU itself would otherwise be capable of handling. That's the real issue; you are not able to get the most out of the GPU because it's handicapped by low VRAM.
 
The 7600 needs to be $250 or below to be even remotely attractive; you can get the ASRock Challenger 6650 XT for $259 right now! There are plenty of other 6650 XT options under $299.

The 4060 Ti is pointless because it only has the VRAM to be a 1080p offering, but it will be priced for 1440p. You're better off going with a 6700 XT if you want a good 1440p card, and you can get one for under $350 right now. The 4060 Ti should be a $299 card and the 4060 should be $250 if the VRAM is going to be limited to 8GB.
Let's recognize that a 1440p or 4K gaming monitor is out of the price range of someone buying a $400 card. A good 1440p or 4K gaming monitor generally runs several hundred dollars and can cost more than twice as much as a comparable 1080p monitor, especially at 4K, so someone shopping for a budget or midrange graphics card won't be looking at those monitors anyway.
Also, you don't really need anything above 1080p for gaming. Sure, it might be nice to watch TV on, but for gaming purposes, which is what this card is intended for, you don't need more than 1080p, and it could be considered wasteful to buy a higher resolution just for gaming. High-resolution gaming is almost as useless as RT.
I am sorry, but who buys a 3070 or 6700 XT for 4K? Obviously this is not the target for those cards, so how is this relevant to the topic at hand? Just like 4GB cards, 8GB is eventually becoming obsolete (for more demanding games), and 10GB just does not make sense, let alone 12GB, in new AAA games.
Can you provide some examples of games at 1080p on normal settings that a 3060 Ti, let alone the upcoming 4060 Ti, wouldn't be able to run at a reasonable framerate? For the vast majority of games, that extra VRAM means next to nothing. Most games are pretty easy for modern GPUs to run, so it really shouldn't matter that much.
 
Can you provide some examples of games at 1080p on normal settings that a 3060 Ti, let alone the upcoming 4060 Ti, wouldn't be able to run at a reasonable framerate? For the vast majority of games, that extra VRAM means next to nothing. Most games are pretty easy for modern GPUs to run, so it really shouldn't matter that much.
So my comment was about 4K, not 1080p normal settings. I for one did not build my PC to play on normal; I want to get the most out of the games I play, so it matters to me. People build their PCs to their preference. We have come a long way since 1GB VRAM cards, so if 16GB or 24GB did not matter, all cards would still have 8GB of VRAM, but they don't.
 
So my comment was about 4K, not 1080p normal settings. I for one did not build my PC to play on normal; I want to get the most out of the games I play, so it matters to me. People build their PCs to their preference. We have come a long way since 1GB VRAM cards, so if 16GB or 24GB did not matter, all cards would still have 8GB of VRAM, but they don't.
Note the earlier part about how pointless 4K is in general, and especially for the price point of a 4060 Ti shopper.
 