GeForce RTX 4060 vs. Radeon RX 7600: $300 (or Less) GPU Upgrade

RT is not a good argument: every time a new generation of GPUs is released, the devs simply increase the number of rays in the scene, making the older GPUs run terribly.

I've seen the 7600 for $249 these days. While 8 GB tends to limit the GPU's potential in the years to come, it's still enough for most games at 1080p and good quality settings. I think that's acceptable.
 
This is truly pathetic.

"However, if you want to enable ray tracing, the RX 7600 might not be the best choice; in fact, it isn't, with the RTX 4060 delivering more than twice as much performance at both resolutions."

Yeah man, the 4060 @ 21 FPS with RT is so playable and worth the win!! /s

How does TS save Nvidia's skin, even if it's a ludicrous margin / meaningless scenario?? Throw RT in the review!!
 
This is truly pathetic.

"However, if you want to enable ray tracing, the RX 7600 might not be the best choice; in fact, it isn't, with the RTX 4060 delivering more than twice as much performance at both resolutions."

Yeah man, the 4060 @ 21 FPS with RT is so playable and worth the win!! /s

How does TS save Nvidia's skin, even if it's a ludicrous margin / meaningless scenario?? Throw RT in the review!!


TS literally said they thought RT was pointless at this price point… calm down.
 
To be honest, even without ray tracing the RTX 4060 is faster in more games, even if by a small margin. The roughly 10% difference in price about evens it out, with a slight edge to AMD. Because these cards are budget oriented, I believe the RX 7600 is the better choice, since budget and RT don't go hand in hand. In my region, the price difference is more than 10%.
 
To be honest, even without ray tracing the RTX 4060 is faster in more games, even if by a small margin. The roughly 10% difference in price about evens it out, with a slight edge to AMD. Because these cards are budget oriented, I believe the RX 7600 is the better choice, since budget and RT don't go hand in hand. In my region, the price difference is more than 10%.

The price difference is around $40-$50. So $300 for a 4060 is a hard no when there's the 6700XT/6750XT for $300-$320. And then $400/$500 4060ti just doesn't make any sense at all when you have the $430 6800 and $480 6800XT. This is the final conclusion. End of story.
 
This is truly pathetic.

"However, if you want to enable ray tracing, the RX 7600 might not be the best choice; in fact, it isn't, with the RTX 4060 delivering more than twice as much performance at both resolutions."

Yeah man, the 4060 @ 21 FPS with RT is so playable and worth the win!! /s

How does TS save Nvidia's skin, even if it's a ludicrous margin / meaningless scenario?? Throw RT in the review!!

Did you read the review, or just look at the graphs?
 
The price difference is around $40-$50. So $300 for a 4060 is a hard no when there's the 6700XT/6750XT for $300-$320. And then $400/$500 4060ti just doesn't make any sense at all when you have the $430 6800 and $480 6800XT. This is the final conclusion. End of story.

That would be great if RDNA2 GPUs were actually available to buy in my region. Here, the 6700 XT goes for $500, the 6800/XT are completely unavailable, and the only GPUs in stock are the RTX 3000/4000 series and the RX 7900/7600 at near MSRP.

Yes, RDNA2 prices are extremely competitive in the US and a no-brainer over the 4060, but AMD still massively screwed over other regions by not releasing any mid-range product at all, while Nvidia is at least releasing new cards.
 
Prices as of 8/13/23 (source: PCPartPicker):
7600 by MSI going for $254.99
4060 by PNY for $289.99
6700 non-XT by XFX for $269.99
6700 XT by Sapphire and ASRock at $319
6750 XT by ASRock for $359.99
FYI
 
Power consumption is important, especially when the price of energy is rising. Also, more power requires better cooling and a bigger PSU. So you may get something cheap, but in three years you'll be at a loss because of the higher power consumption, the added case fans, or a new case/PSU.
 
Did you read the review, or just look at the graphs?
You mean the article? I don't know where you got the idea it was a review; in any case, I did read it...

"Although we're completely uninterested in ray tracing support at this performance tier, we'd likely still choose the GeForce RTX 4060 over the Radeon RX 7600."

And yeah, TS recommends a GPU that costs 25% more than the 7600 yet performs about the same in raster... even after telling us RT isn't important at this tier...

I really wonder who isn't making sense here, because from my POV, it's definitely TS and you...
 
This is truly pathetic.

"However, if you want to enable ray tracing, the RX 7600 might not be the best choice; in fact, it isn't, with the RTX 4060 delivering more than twice as much performance at both resolutions."

Yeah man, the 4060 @ 21 FPS with RT is so playable and worth the win!! /s

How does TS save Nvidia's skin, even if it's a ludicrous margin / meaningless scenario?? Throw RT in the review!!
Don't forget it isn't even 21 FPS in a realistic rig, which probably has half the RAM and a 5600X or 7600X, not the fastest gaming CPU available.
 
Who cares? Power consumption RX6xxx vs RTX3xxx and Ryzen vs Intel didn't interest anyone.
Power consumption is a huge aspect for many people, including myself. When you have two or three space-heater gpus running in one room, the excess A/C cost alone outweighs the price difference between the cards by a factor of 100 or more.

Yeah man, the 4060 @ 21 FPS with RT is so playable and worth the win!! /s
Choose a different title and reso, and the raytracing performance rises to 120+ FPS. Cherry-pick much?

Worse for the inane "RT sux" argument is that the worst Hollywood film at 1080p@30 fps is vastly more realistic than the best video game at quadruple the resolution and quintuple the frame rate. When you understand why this is so, you'll understand why ray-tracing is the future.
 
Power consumption is important , especially when the price of energy is rising . Also , more power requires better cooling and higher PSU . So , you may get something cheap but in 3 years you ll be at loss because of the higher power consumption , the added case fans or the new case/PSU .

A 45 W difference isn't that important in terms of typical PSU requirements. Some GPUs, like the 6700 XT and above, can generate power spikes well above their rated consumption (+100 W or so), but that happens under specific test conditions rather than in gaming. In any case, it doesn't apply to the cards tested in the article.

I personally went from a 570 to a 1070, then to a 6600 XT, and then to a 6700 XT, all while using a 65 W AMD Zen 3 CPU with a 500 W 80+ Gold PSU. I used a watt meter to track the whole system's power draw, and it's typically under 300 W when gaming, sometimes a bit more (in Cyberpunk). I should also subtract the PSU's own ~10% losses.

Idle power consumption is about 50 W, which is close to the difference between the 7600 and the 4060. I mean, if you care about such small numbers, you may end up with an integrated graphics solution :)

I think 50 W is not important at all for a modern budget gaming PC.
 
FTFY: "RT is the reason why we choose Nvidia every time"

The price difference is around $40-$50. So $300 for a 4060 is a hard no when there's the 6700XT/6750XT for $300-$320. And then $400/$500 4060ti just doesn't make any sense at all when you have the $430 6800 and $480 6800XT. This is the final conclusion. End of story.

🤦‍♂️

Let’s go over what was covered in this article as well as its intended purpose…

In this case, the 4060 is 11% more expensive for 9% better rasterization, 20% better RT (for whoever cares at this price point), superior upscaling, much lower power consumption, and an objectively better ancillary feature set. There have been discounts on both sides and prices fluctuate on a daily basis, so there’s really nothing wrong with going off of MSRP.

Ultimately, the purpose of this article is to compare 4060 and 7600. Which by the way, were BOTH reviled by many tech publications, including TS. The point is, 6700XT and 6800XT have NOTHING TO DO WITH THIS ARTICLE. And let’s not forget, there are additional downsides to purchasing last gen parts, which is again beyond the scope of this article. So please… calm down man… Lisa Su isn’t going to be evicted from her house any time soon!
 
To be honest, even without ray tracing the RTX 4060 is faster in more games, even if by a small margin. The roughly 10% difference in price about evens it out, with a slight edge to AMD. Because these cards are budget oriented, I believe the RX 7600 is the better choice, since budget and RT don't go hand in hand. In my region, the price difference is more than 10%.
The real difference is DLSS vs. FSR 2, and better use of 8 GB of VRAM on Nvidia cards.
 
Power consumption is a huge aspect for many people, including myself. When you have two or three space-heater gpus running in one room, the excess A/C cost alone outweighs the price difference between the cards by a factor of 100 or more.


Choose a different title and reso, and the raytracing performance rises to 120+ FPS. Cherry-pick much?

Worse for the inane "RT sux" argument is that the worst Hollywood film at 1080p@30 fps is vastly more realistic than the best video game at quadruple the resolution and quintuple the frame rate. When you understand why this is so, you'll understand why ray-tracing is the future.
And please tell me why someone would actually want to play a game other than Minecraft creative mode with RT on. Try playing an FPS game with reflections everywhere. It definitely has its uses, but they'd be specific to adventure games. Video games aren't movies, and aren't meant to be Hollywood quality. Seeing light accurately depicted in a game like COD is pointless. In a game where you need to focus, RT is either a distraction or gets completely ignored in favor of actually playing. Cool gimmick. Useful for films, which actually need that level of realism. Games have never been realistic, and if devs and tech companies really want something with an actual impact on gameplay, maybe they should look at physics before marketing entire GPU lines on the basis of funny light paths.

For the power consumption:
Based on their TDPs of 115 W and 165 W, that's a 50 W difference.
Assuming an extreme 16 hours of use per day, every day, and the average US cost of 23 cents per kWh, the difference in running cost between these two cards over an entire year is about $67. And remember, that's at 16 hours per day, which is completely unrealistic to begin with.
Also, I keep my 7600X + 6700 XT PC right next to my leg. It doesn't radiate heat even under load. The idea that you need to pay for A/C just to keep a PC from heating up your room is completely ridiculous, especially when you're talking about a 6700. Maybe if you make it all exhaust and get a 4090 you could do that?
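The arithmetic in that post checks out. Here's a quick sketch of the calculation using the post's own figures (50 W TDP delta, 16 hours/day, $0.23/kWh); the function name is just for illustration:

```python
# Annual running-cost difference between two GPUs, using the figures
# from the post above: 50 W TDP delta, 16 h/day, $0.23 per kWh.
def annual_cost_delta(watts_delta: float, hours_per_day: float, usd_per_kwh: float) -> float:
    kwh_per_year = watts_delta / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

cost = annual_cost_delta(50, 16, 0.23)
print(f"${cost:.2f} per year")  # → $67.16 per year
```

At a more realistic 4 hours/day, the same formula gives about $17 per year, which puts the "50 W matters" argument in perspective.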
 
please tell me why someone would actually want to play a game other than minecraft creative mode with rt on. Try playing an fps game with reflections everywhere. It definitely has its uses, but they would be specific to adventure games. Video games aren't movies, and aren't meant to be Hollywood quality. Seeing light being accurately depicted in a game like COD is pointless.
By this logic, you should be playing all your games in 16-color block graphics. Get rid of all shading and those annoying textures. Replace all those detailed human-model opponents in COD with large red squares. Much less distracting!


For the power consumption - that's a 50w difference, Assuming an extreme 16hrs per day, every day usage, and the average us cost per kwh of 23 cents, the difference in an entire year between these two cards in terms of cost to run is 67 dollars...
Incorrect assumptions. 50 W at the card level is (due to PSU losses) 60-65 watts at the socket. That's 65 W of heat, which must be exhausted from your home whenever your A/C is operating.

Now, if that heat were evenly distributed throughout the home, it would require only another 40 W or so of cooling, say 100 W total. But it isn't evenly distributed. In most homes, cooling one room requires cooling the entire house (or, in a zonal system, a large portion of it). This can easily triple that 100 W figure, for an annual cost of around $250. If you keep the card five years, you've spent an extra $1,250. Or you can sit and sweat in a room several degrees warmer than the rest of your home.

Now you're correct that most people don't play 12 hours a day. But they do tend to use their computer in the afternoon and evening hours, when the A/C is running the hardest. So the difference isn't that large.
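For what it's worth, the ~$250/year claim can be reproduced under this commenter's own assumptions, though it's sensitive to the usage hours; the 10 h/day figure below is my assumption chosen to match the quoted number, not something stated in the thread:

```python
# Rough check of the cooling-cost chain argued above. All inputs are
# the commenter's assumptions except hours_per_day, which is a guess.
card_delta_w = 50
socket_delta_w = card_delta_w / 0.8   # ~80% efficient PSU -> 62.5 W at the wall
total_cooling_w = 300                 # "triple that 100 W figure"
hours_per_day = 10                    # assumed; not stated in the thread
usd_per_kwh = 0.23

annual = total_cooling_w / 1000 * hours_per_day * 365 * usd_per_kwh
print(f"${annual:.0f} per year")  # → $252 per year
```

Note how much the answer depends on the "triple the heat load" multiplier and the daily hours; halve either one and the figure drops well below the price gap between the cards.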

Also, I keep my 7600x+6700xt pc right next to my leg. It doesn't radiate heat even under load.
You have a graphics card that defies the laws of physics? If it consumes 150 watts, it produces 150 watts of heat. Period.
 