Upcoming Radeon RX 7600 could reach 6750 XT levels of performance with a lower power draw

nanoguy

Rumor mill: We're still several weeks away from the official reveal of the RX 7600. Gamers have been waiting for a mainstream graphics card to shake things up in the GPU market, and an early leak suggests the new AMD card has the potential to do just that. That is, if Team Red is willing to price it close to the $300 mark.

Earlier this week, we learned that AMD's traditional AIB partners like Sapphire and PowerColor are gearing up for the launch of the Radeon RX 7600 graphics card. Supply chain insiders believe the new Team Red offering will break cover at Computex, so it won't be too long before we see a more affordable RDNA 3 product.

In the meantime, the rumor mill is abuzz with hints about the capabilities of the RX 7600, which is likely to be positioned as a competitor for Nvidia's RTX 4060. The already-launched laptop version of the Navi 33 GPU suggests it will feature 32 compute units (1792 unified shaders) paired with eight gigabytes of GDDR6 over a 128-bit memory bus.

According to Moore's Law is Dead, an engineering sample of the new card performs at least 11 percent better than the RX 6650 XT using pre-release drivers. MLID claims to have spoken to a person with direct access to the new hardware, but like all rumors, this should be taken with a grain of salt.

If true, however, it could mean the final product has a chance of slotting in between the RX 6700 and RX 6750 XT in terms of performance using optimized drivers. MLID's source also claimed the RX 7600 sample achieved boost clocks above 2.6 GHz and drew around 175 watts during gaming benchmarks. For reference, the RX 6650 XT draws around 180 watts, meaning the RX 7600 could also need just one 8-pin PCIe connector for external power.

Still, the price is what will make or break this product, as gamers are no longer rushing to buy GPUs at exorbitant prices. Given the smaller die size of the Navi 33 GPU (204 sq mm) compared to Navi 23 (237 sq mm), AMD should be able to fit more dies onto each wafer, meaning the newer GPU will at least be more cost-effective to manufacture.
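As a rough back-of-the-envelope illustration (assuming a standard 300 mm wafer and ignoring edge losses, scribe lines, and defect yield, so real counts will be lower), the die-size difference alone works out to roughly 16 percent more candidate dies per wafer:

```cpp
#include <cstdio>

// Back-of-the-envelope dies-per-wafer estimate. Assumes a standard
// 300 mm wafer; ignores edge loss, scribe lines, and defect yield,
// so real-world counts are lower. The ratio between the two is the point.
int main() {
    const double kPi = 3.14159265358979323846;
    const double wafer_radius_mm = 150.0;  // 300 mm wafer
    const double wafer_area = kPi * wafer_radius_mm * wafer_radius_mm;  // ~70,686 sq mm

    const double navi33_area = 204.0;  // sq mm
    const double navi23_area = 237.0;  // sq mm

    std::printf("Navi 33: ~%.0f dies per wafer\n", wafer_area / navi33_area);  // ~346
    std::printf("Navi 23: ~%.0f dies per wafer\n", wafer_area / navi23_area);  // ~298
    return 0;
}
```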

We expect the MSRP to fall somewhere in the $300 to $350 range, especially since eight gigabytes of VRAM are now decidedly low-end. Ideally, we'd like to see this only on sub-$200 products, but that's hardly possible when companies like Nvidia are doing everything in their power to prevent prices from dropping to more acceptable levels for most consumers.


 
Steam fun facts:
The most popular VRAM amount is 8GB, and 6GB and 8GB had the 2nd and 3rd highest gains for the month.
The most popular GPUs on Steam for the last few generations have been 6GB cards.

 
If it's priced around $300, I wonder what that's going to do to RX 6600 - 6750 XT prices.

Note: The cheapest new 6750 XT (MSI Mech) is $329 with a free game at Newegg right now.
 
The market clearly needs a card that ups the value. Considering that you can get a 6750 XT right now for $330, a 7600 that matches that performance but has only 8GB of VRAM for just a little less money doesn't do it. It might be a good $250 option for 1080p. A real AMD value card would need 6800 XT-level performance at $400 with 16GB of VRAM.
 
If developers are making games that don't run well with 8GB of VRAM, they are doing a bad job. The vast majority of GPUs have 8GB or less. Most of the examples of games that don't run well with 8GB of VRAM are bad console ports (The Last of Us) or games that launched in a broken state (Callisto Protocol). I agree that you probably shouldn't buy an expensive GPU with 8GB of VRAM, but for $300, I think it's fine.
 
Things that used to be done by the CPU have been moved to the GPU. It's no longer about graphics quality; core game features now run on the GPU.
I understand that it's possible, but I'd imagine doing this would be a very expensive option for developers, both in development time and in how it would restrict the number of potential players with compatible computers (i.e. CUDA or OpenCL). Do you know any example games that do this?
 
Come on, throw us a bone AMD. It doesn't really cost that much more to include 16GB of RAM.
 
Which ones, if you don't mind me asking?
Not at all. From the news I read and the people I listen to (I won't pretend to be well versed on this subject), things like AI (game AI, not GPT-type AI), projectile trajectories, and the math behind weapon effects and damage calculations have been offloaded to the GPU. It is my understanding that the CPU is mainly used now for managing resources between the hard drive, memory, and the GPU, with the GPU handling most tasks in games.

I'm not an expert, but it makes sense to me. If you're interested in learning more, I feel that's a good place to start.

Edit:
I've also been told that people who work with graphics cards are familiar with CUDA programming and similar languages, so they naturally program things to run on the GPU instead of the CPU.
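For illustration, here's a minimal, hypothetical sketch (made-up code, not from any actual game) of why that kind of work suits a GPU: each projectile updates independently of the others, so the loop body below is exactly the sort of thing that can be moved into a CUDA/OpenCL kernel, with one GPU thread per projectile instead of a serial CPU loop.

```cpp
#include <cstdio>
#include <vector>

struct Projectile {
    float px, py, pz;  // position
    float vx, vy, vz;  // velocity
};

// Per-projectile update. Each element is independent of the others, so
// on a GPU this function body would become the kernel, executed by one
// thread per projectile rather than iterated serially on the CPU.
void update_projectile(Projectile& p, float dt, float gravity) {
    p.vz -= gravity * dt;  // simple ballistic trajectory
    p.px += p.vx * dt;
    p.py += p.vy * dt;
    p.pz += p.vz * dt;
}

void update_all(std::vector<Projectile>& projectiles, float dt) {
    const float gravity = 9.81f;
    for (auto& p : projectiles)  // on a GPU: one parallel dispatch, no loop
        update_projectile(p, dt, gravity);
}

int main() {
    std::vector<Projectile> shots(3, Projectile{0.f, 0.f, 0.f, 10.f, 0.f, 5.f});
    update_all(shots, 0.016f);  // one 16 ms simulation step
    std::printf("first shot z after one step: %.3f\n", shots[0].pz);
    return 0;
}
```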
 
Considering the RX 6700 (non-XT) is currently $300 and has 10GB of VRAM, this needs to come in below that price to be more appealing than RDNA2 cards. $250 and it would be a home run!
 
This card makes no sense over RDNA2. A second-hand 6750 XT has 12GB, will still be faster, and is good for 1440p high settings.

Wouldn't pay more than $250.
 
A bit below $300 would be decent. I'm still not sold on 8GB being too low. But I bet if Nvidia had a $300 card like this, everyone would collectively lose their minds over it.
 
A bit below $300 would be decent. I'm still not sold on 8GB being too low. But I bet if Nvidia had a $300 card like this, everyone would collectively lose their minds over it.
Are you sure? The only card that will get close to $300 is a gimped 4050. The 4060 will probably be $350, and the 4060 Ti is only said to equal the 3070 Ti, still with 8GB, for $400-450.
 
Are you sure? The only card that will get close to $300 is a gimped 4050. The 4060 will probably be $350, and the 4060 Ti is only said to equal the 3070 Ti, still with 8GB, for $400-450.
My point was that if Nvidia released the RTX 4060 for $300, it would be praised, but the AMD equivalent, the RX 7600, at the same price is considered too expensive.

Nvidia got people thinking high GPU prices are normal. At some point AMD will try to capitalise on that and will do the same.
 
Things like AI (game AI, not GPT-type AI), projectile trajectories, and the math behind weapon effects and damage calculations have been offloaded to the GPU.
This is going to be very game-dependent, as there are thousands of titles released every year that still do all of this on the CPU, especially games that are primarily developed for consoles.
It is my understanding that the CPU is mainly used now for managing resources between the hard drive, memory, and the GPU, with the GPU handling most tasks in games.
I suspect that some developers have said this either as a somewhat flippant remark or as a judgment on how they view the role of the CPU in the titles they've worked on. There is still an awful lot of work in a game that a GPU simply can't do, irrespective of how it's programmed -- examples would be input polling and processing, handling audio and video streams, managing command lists for the GPU, running drivers + API calls, and processing the engine tick. For example, here's a 30-millisecond CPU workload capture from The Last of Us:

[Image: TLOU_CPU_PIX_Capture.jpg, 30 ms CPU workload capture from The Last of Us]

There are 8 primary threads generated by this game (there are dozens more, but in this particular capture they were mostly stalled during this time period), and the colored bars for the cores relate to which thread they're working on - where a bar is blank, the core is working on a thread not associated with the game (e.g. managing an Explorer thread).

Grey sections in the game's thread timelines are where it's stalled, the small white blocks underneath are API commands being processed, and the red ones are where context switches are taking place (where a core switches from one thread to another).

There's still plenty for the CPU to do!
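To make that concrete, here's a heavily simplified, hypothetical sketch of the per-frame CPU work listed above (the names are illustrative placeholders, not any real engine's API):

```cpp
#include <cstdio>

// Heavily simplified, hypothetical sketch of per-frame CPU-side engine
// work. All names are illustrative placeholders, not a real engine API.
// Every step below runs on the CPU no matter how GPU-heavy the game is.
struct Engine {
    void poll_input()      { /* read keyboard/mouse/controller state */ }
    void tick_simulation() { /* game logic, scripting, the engine tick */ }
    void mix_audio()       { /* fill and submit audio buffers to the OS */ }
    void submit_frame() {
        // Build the command list that tells the GPU what to draw, then
        // hand it to the driver; the API calls themselves are CPU work.
        std::puts("frame submitted");
    }
};

int main() {
    Engine engine;
    for (int frame = 0; frame < 3; ++frame) {  // stand-in for the game loop
        engine.poll_input();
        engine.tick_simulation();
        engine.mix_audio();
        engine.submit_frame();
    }
    return 0;
}
```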
 
Needs to be at $250 if it is to be successful, with an option for AIBs to do a 16GB model at $330-350, and then you might have a winner. Also, a lot of people mentioning the RX 6700 XT are forgetting that the 7600 will have AV1 encoding, so streaming and wireless VR will be higher quality than on the 6700 XT.
 

There is still an awful lot of work in a game that a GPU simply can't do, irrespective of how it's programmed... There's still plenty for the CPU to do!

I think the issue with the TLOU remaster is that the asset streaming decompresses the textures and game assets on the CPU; most gameplay has shown 80%+ CPU utilisation, which can lead to performance bottlenecks for the GPU.
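As a purely illustrative sketch of why that hurts (hypothetical names, not TLOU's actual code): if the streaming path decompresses assets on the CPU, those cycles show up as high CPU utilisation, and when decompression can't keep up, the GPU sits idle waiting for textures.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical texture-streaming path, for illustration only. The costly
// step is decompress(): it burns CPU cycles, and if it falls behind, the
// GPU stalls waiting for data (a CPU-side bottleneck).
std::vector<uint8_t> decompress(const std::vector<uint8_t>& compressed) {
    std::vector<uint8_t> out;
    out.reserve(compressed.size() * 2);
    for (uint8_t byte : compressed) {  // stand-in for a real codec's work
        out.push_back(byte);
        out.push_back(byte);
    }
    return out;
}

void stream_texture(const std::vector<uint8_t>& from_disk) {
    // 1. The CPU decompresses the asset (the bottleneck in this scenario).
    std::vector<uint8_t> pixels = decompress(from_disk);
    // 2. Only afterwards can the driver copy it into VRAM for the GPU.
    // upload_to_gpu(pixels);  // hypothetical API call; also CPU work
    (void)pixels;
}

int main() {
    std::vector<uint8_t> fake_asset(1024, 0xAB);  // pretend file contents
    stream_texture(fake_asset);
    return 0;
}
```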
 
There's still plenty for the CPU to do!
That's really interesting; I wish I was more familiar with the subject. I've talked to devs and my Google contact about this, but a lot of this stuff is so over my head that I essentially have to nod and take them at their word.
 
With only 8GB of VRAM, this card is DOA at $300. Why pay $300 for what amounts to an 8GB RX 6750 XT? I'd sooner pay an extra $40 for an RX 6700 XT because:

1. The RX 7600 will have a small positive GPU performance impact
2. The RX 7600 will have a small power savings impact
3. The RX 7600 will have a HUGE detrimental VRAM performance impact

We can consider that the RX 7600 will behave similarly to the RTX 3070 because they have similar performance and the same 8GB frame buffer. Let's take a look and remember what that means:
[Image: TLOU_High-p.webp, The Last of Us benchmark chart at high settings]

This shows that in some games, it won't even be able to handle 1080p high, let alone ultra or 1440p. I can guarantee you that the RX 6700 XT wouldn't have this problem and that is more than worth the extra $40 to me.

If the card has only 8GB, it shouldn't cost more than $250 because it's already crippled as soon as you buy it. Because of its VRAM limitation, it will perform no better than an RX 6650 XT in some situations and those situations will only increase in frequency.

I predict that most, if not all, AAA games released in 2025 will either force this card down to 1080p low settings or won't run on it at all. By 2027, it might still be able to play some games at 720p potato settings.

I mean, sure, it'll still play games like Overwatch, WoW, Valorant, Rocket League, Rainbow Six Whatever, Fortnite, PUBG and Counter Strike: Potato Offensive (credit to Steve Walton for that one, it never gets old!) but so will an R9 Fury.

Once again, AMD has managed to snatch defeat from the jaws of victory. How is it that AMD managed to do so well with AM4, RDNA1 and (especially) RDNA2, only to completely bugger up AM5 and RDNA3? The execution of the RDNA3 launch has been so bad that it reminds me of Vega!

This chiplet design was advertised as a cost-cutting measure, so where are the cut costs? AMD can kiss my arse because they've offered only two somewhat compelling products (RX 7900 XTX and RX 7900 XT) this generation and they're only compelling compared to nVidia because nVidia's pricing is completely out to lunch.

AMD has failed to compete with their own parts from last gen, and that's absolutely pathetic! I completely understand why only ASRock, PowerColor, Sapphire and XFX were willing to even attempt to sell this thing. It's going to be a bigger dud than the RTX 4070, and AMD will have nobody to blame for its failure but themselves.

All they had to do to make this card a MASSIVE success was give it 12GB of VRAM but they were too stupid and too greedy for that. If this card was $300 with 12GB of VRAM, it would've out-sold all other RX 7000 cards combined. Instead, AMD managed to royally screw themselves. They've managed to miss the mark so many times that I'm really starting to wonder if their executive staff are all on drugs.
 