AMD Radeon RX 7600 XT Review: Can AMD Kill the RTX 4060 Competition?

What did we learn?
AMD just pulled an Nvidia move: see the RTX 4060 Ti 8GB vs. 16GB, both on a 128-bit bus.
Both cards lack the horsepower to use that framebuffer.
 
> This card should be $200, $250 max.
Well, with the increase in memory prices they are likely breaking even on the price hike. Originally the card was supposed to be $300, but they raised it to $320 at the last minute because memory prices are going up.

I know everyone says that the card can't use 16GB of VRAM, and that's true, but with a 128-bit bus you really have two options: 8GB or 16GB. The 6700 XT is still widely available for $350, so grab that instead; 8GB will become a bottleneck here shortly. Honestly, what they should do is make all their chips 192-bit so they can use the cheaper 2GB modules and make all their cards 12GB. That would increase performance and keep costs down. Well, it might push the price to $340-350, but a 192-bit 12GB card would beat the living crap out of their 128-bit 16GB card.
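The bus-width arithmetic behind this can be sketched out. A rough model, assuming 32-bit-wide 16Gbit (2GB) GDDR6 modules, which is what these cards use:

```python
# Rough GDDR6 capacity/bandwidth math behind the 128-bit vs 192-bit argument.
# Assumes one 32-bit, 2GB GDDR6 module per channel, doubled in clamshell mode.

def capacity_gb(bus_width_bits, gb_per_module=2, clamshell=False):
    """VRAM capacity: one 32-bit module per channel, two per channel in clamshell."""
    channels = bus_width_bits // 32
    return channels * gb_per_module * (2 if clamshell else 1)

def bandwidth_gbps(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate / 8."""
    return bus_width_bits * data_rate_gbps / 8

# A 128-bit bus really does leave only two options with 2GB modules:
print(capacity_gb(128))                  # 8 GB (normal)
print(capacity_gb(128, clamshell=True))  # 16 GB (clamshell, like the 7600 XT)

# A 192-bit bus lands on 12GB without clamshell, plus 50% more bandwidth:
print(capacity_gb(192))                  # 12 GB
print(bandwidth_gbps(128, 18))           # 288.0 GB/s (7600 XT: 128-bit @ 18 Gbps)
print(bandwidth_gbps(192, 16))           # 384.0 GB/s (6700 XT: 192-bit @ 16 Gbps)
```

Which is why the 6700 XT, on paper, out-muscles the newer card on memory despite being a generation older.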
 
To me it is more interesting to see how it stacks up against Intel's offerings. My last six cards for personal builds have been Nvidia, going all the way back to a GeForce 2 MX, but they have gone so anti-mainstream consumer that I may be done with them for my price range.
 
I think it is time to stop bashing and trash-talking lower-spec AMD GPUs and move on.

Just recently we saw Steven condemning AMD's underperforming card *again*.

P.S. I am using an Nvidia card now.
 
Well, I think AMD is going to be done with low-spec GPUs now. Their APUs actually provide pretty good performance, and there is an APU with 7600-level performance in the works. However, I don't understand why people bash MAINSTREAM GPUs, because they, by volume, are the biggest sellers. I used to buy the biggest and baddest every year, but I haven't seen a game worth spending that kind of money on in years. I currently have a 6700 XT and it serves my needs fine. I'd probably still have my 1070 Ti if it hadn't died on me.
 
True. IMO, it's better to have more horsepower than more VRAM.
Better yet, lots of horsepower with lots of VRAM.
Except when you have games like Forspoken, for example, or Resident Evil Village, where even on low settings 8GB results in stuttering and crashes.
> Well, I think AMD is going to be done with low-spec GPUs now. Their APUs actually provide pretty good performance, and there is an APU with 7600-level performance in the works. However, I don't understand why people bash MAINSTREAM GPUs, because they, by volume, are the biggest sellers. I used to buy the biggest and baddest every year, but I haven't seen a game worth spending that kind of money on in years. I currently have a 6700 XT and it serves my needs fine. I'd probably still have my 1070 Ti if it hadn't died on me.
No APU will hit 7600 XT performance until it has 7600 XT memory bandwidth, by which time the hypothetical 7600 XT successor will have significantly more.
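A quick back-of-envelope check on that bandwidth gap, assuming a dual-channel DDR5-6000 desktop APU (an illustrative common configuration, not a specific product):

```python
# Compare system-RAM bandwidth available to an APU against the 7600 XT's GDDR6.
# DDR5-6000 dual-channel is an assumed illustrative setup, not a specific APU.

def ddr5_bandwidth_gbs(mt_per_s, channels=2, bus_bits_per_channel=64):
    """System RAM bandwidth in GB/s: channels x channel width (bytes) x transfer rate."""
    return channels * (bus_bits_per_channel / 8) * mt_per_s / 1000

def gddr6_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """GDDR6 bandwidth in GB/s: bus width (bytes) x per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

apu = ddr5_bandwidth_gbs(6000)      # dual-channel DDR5-6000
gpu = gddr6_bandwidth_gbs(128, 18)  # RX 7600 XT: 128-bit @ 18 Gbps

print(f"APU:  {apu:.1f} GB/s")    # 96.0 GB/s
print(f"dGPU: {gpu:.1f} GB/s")    # 288.0 GB/s
print(f"gap:  {gpu / apu:.1f}x")  # 3.0x
```

Roughly a 3x deficit, before the APU even shares that bandwidth with the CPU cores.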
 
> Except when you have games like Forspoken, for example, or Resident Evil Village, where even on low settings 8GB results in stuttering and crashes.
8GB is sometimes not enough for some games, but sometimes the problem is that some games are not optimized properly, not the GPU itself.
And there are still lots of games that can be played without needing 8GB of VRAM.
 
> Well, with the increase in memory prices they are likely breaking even on the price hike. Originally the card was supposed to be $300, but they raised it to $320 at the last minute because memory prices are going up.
>
> I know everyone says that the card can't use 16GB of VRAM, and that's true, but with a 128-bit bus you really have two options: 8GB or 16GB. The 6700 XT is still widely available for $350, so grab that instead; 8GB will become a bottleneck here shortly. Honestly, what they should do is make all their chips 192-bit so they can use the cheaper 2GB modules and make all their cards 12GB. That would increase performance and keep costs down. Well, it might push the price to $340-350, but a 192-bit 12GB card would beat the living crap out of their 128-bit 16GB card.

I believed this principle myself… until I saw the 4070 Ti SUPER. Nowadays a deliberately restricted power envelope limits potential performance gains more than anything else…
 
Great article as always, Steve!

Quick question: maybe I'm missing something, but the chart for Cyberpunk 2077 at 1080p just looks strange. You show the 6600 beating the 7600 XT, the 7600, and the 6650 XT. The next chart, for CP2077 at 1440p, has the 6600 at the bottom, behind all of those other cards and more in line with what I would expect. Is there an error in your 1080p chart, or is there an explanation for how the 6600 is surpassing these other cards?
 
> 8GB is sometimes not enough for some games, but sometimes the problem is that some games are not optimized properly, not the GPU itself.
> And there are still lots of games that can be played without needing 8GB of VRAM.
I love this argument. The current-gen consoles have 16GB. People have a real hard time understanding that even properly optimized games are going to push past 8GB today. It's been like that for every game console generation going back to the PS2/Xbox days.

The first 8GB GPUs can be traced back to 2014, a decade ago. It's time to move on. IDK why people are so attached to the 8GB figure, but just like 512MB GPUs in 2012, they have reached the end of their service life.
 