AMD Ryzen 5 7600X Review: Mainstream Zen 4

AMD might be competitive in gaming, but it will lose against Raptor Lake in multi-threading (especially against the budget 13th-gen CPUs).

Leaks are showing that the 13600K is slightly faster than the 12700K in multi-threading. If it has the same price as the 12600K/12600KF, or is $10 more, then Intel will be unbeatable in performance per dollar.

Non-K i5s will also get a core-count increase. The i5-13400 will be 6P + 4E cores, while the i5-13500 and i5-13600 will be 6P + 8E cores. Imagine if the i5-13400F costs less than $200; that would be monster value for anyone doing productivity.

Zen 4 does not seem to be a big threat to Intel.


Aside from low-resolution gaming, the 7600X doesn't even beat the 12600K and 5800X. So, not that impressive for something that costs $300.
Both Intel and AMD could sell some CPUs at a loss, so pricing is not really about die size or manufacturing costs.

Zen 4 not a big threat to Intel? Intel still hasn't got anything against Zen 2, released in 2019 🤦‍♂️ Because Intel cannot do anything on the server side, they try to get at least something on desktop. AMD focuses on servers, so Ryzens are basically cut-down Epycs. AMD does not really care about desktop since servers are much more profitable.

Also, I'll gladly take a CPU where all cores have the same architecture, not some Intel "these cores are good and these are trash" combo that has already created major problems. I think there will be more to come.
 
Why are we still only talking about purchase costs for CPUs/GPUs instead of running costs? I get that many readers here are from the US, where electricity is quite cheap, but even there it is getting more expensive. Where I live, if someone were to use this CPU for gaming at full power for 2 hours a day, it would cost 100 euros per year in electricity costs just for the CPU.

What we need from reviewers are dynamic CPU/GPU/system performance/cost charts, where users could input their desired power draw (if achievable by undervolting or whatever), average system usage time and their electricity tariffs. Obviously nobody in the server world is purchasing anything without doing such calculations, so why aren't we?

I suspect this would send Intel systems out of the picture. :)
 
Where I live, if someone were to use this CPU for gaming at full power for 2 hours a day, it would cost 100 euros per year in electricity costs just for the CPU.
So a 105 W CPU running at that level for 2 hours per day, 365 days a year, would consume 76.65 kWh per annum. If one is being forced to pay €1.31 per kWh (the EU average is €0.24), then worrying about new CPUs and their power consumption is the least of one's worries.

But seriously though, just use a frame rate limiter, rather than bothering with undervolting or underclocking.
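For anyone who does want to run the math with their own numbers (the kind of calculation the earlier post is asking reviewers to chart), here's a minimal sketch. The wattage, usage hours and tariffs below are just the figures quoted in this thread, not anything measured in the review:

```python
# Back-of-the-envelope running-cost calculator (assumed figures from this thread,
# not measurements from the review).

def annual_energy_kwh(watts: float, hours_per_day: float, days: int = 365) -> float:
    """Energy used per year, in kWh, for a part drawing `watts` while in use."""
    return watts * hours_per_day * days / 1000

def annual_cost(watts: float, hours_per_day: float, tariff_per_kwh: float) -> float:
    """Yearly electricity cost for that usage at a given tariff (currency per kWh)."""
    return annual_energy_kwh(watts, hours_per_day) * tariff_per_kwh

kwh = annual_energy_kwh(105, 2)      # 76.65 kWh per year at 105 W for 2 h/day
print(annual_cost(105, 2, 0.24))     # ~18.4 EUR/year at the ~0.24 EUR/kWh EU average
print(100 / kwh)                     # ~1.30 EUR/kWh implied by a 100 EUR/year bill
```

Plug in your own wattage (or an undervolted figure), hours and tariff to get the running-cost number for your situation.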
 
My old R5 5600X reports 64 W for the core die and another 19 W for the I/O die when put under a stress test. That totals 83 W, but it's advertised as 65 W.
This is as reported by HWiNFO. I don't have an extension cable for the CPU power connector to put an ammeter on and get a real reading.
 
At higher resolutions, differences in FPS are dictated more by changes in GPU as the workload becomes GPU bound. Therefore, when comparing CPUs, looking at the higher resolutions will show many of the CPUs to be performing at the same level. Even if you have two CPUs that show considerably different performance at 1080p, if the workload becomes GPU bound then the CPU difference in performance doesn't translate to any change in the already-saturated GPU performance, and so the two CPUs appear to be even.
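As a rough illustration of that effect, here's a toy model; the frame-rate numbers are made up purely for the example and don't come from any benchmark:

```python
# Toy bottleneck model: the delivered frame rate is roughly capped by whichever
# of the CPU-limited or GPU-limited rates is lower (illustrative numbers only).

def delivered_fps(cpu_limit_fps: float, gpu_limit_fps: float) -> float:
    return min(cpu_limit_fps, gpu_limit_fps)

fast_cpu, slow_cpu = 220.0, 160.0   # hypothetical CPU-limited frame rates
for resolution, gpu_limit in [("1080p", 300.0), ("1440p", 180.0), ("4K", 90.0)]:
    print(resolution, delivered_fps(fast_cpu, gpu_limit), delivered_fps(slow_cpu, gpu_limit))
# 1080p: 220 vs 160 (CPU gap visible); 4K: 90 vs 90 (both GPU bound, CPUs look "even")
```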

In this situation, it's best to compare GPUs. It may still be helpful to compare specific CPU features (DDR5 or resizable BAR for example), but that's more about a specific CPU-GPU-RAM configuration, and there are too many combinations to include those in every GPU/CPU review, so the reviews only highlight the comparisons that show meaningful differences.

If you want an example, this article shows several games where the RTX 3080 and RTX 2080 Ti perform near-identically at lower resolutions, but as the resolution increases the differences become clear. https://www.techspot.com/review/2110-nvidia-rtx-2080-1440p-gaming-bottleneck/

Although this article is older, you can see in some cases how this plays out here: https://www.techspot.com/review/2041-ryzen-2700x-vs-3700x/

There's probably an article somewhere on TechSpot that goes into this in detail, although I couldn't find one with a quick search.
Not that I'm disagreeing; I believe you are right. However, since we are using a 3090 Ti here, it would seem that being GPU bound would be less of an issue, and it could show what we all know: at a certain point, the CPU isn't the issue. In other words, why buy an i9-12900K or 7600X when a lesser CPU will get the job done?
 
The RAM benchmarks have the Riftbreaker graph twice! Pretty sure the second graph is supposed to be Spider-Man! Please fix this mistake ASAP!
 
Temps, pricing, no DDR4 support.

zen41.gif


zen42.gif


zen43.gif
 
AMD said 95C is NOT hot and is safe 0.0. Well, it seems awfully close to the unsafe 96C :)


Like I said above, even over 95 degrees was not a problem for Intel for years.

I'd like to hear what temperature for a CPU is NOT a problem. 90? 85? 80? 75? Why is 95 magically a problem when Intel CPUs and motherboards have not had problems for 9 years or so?

Intel CPUs have not been running at 95C by default, as a continuous steady state, for 9 years or so. Where did you get that? Unless you mean that handful of computers designed with insufficient cooling that operate at 95C and fail after a couple of years. Running at 95C for a few minutes and running at 95C for years on end are different things. If you don't believe me, sit in a 95C (203F) room for a few hours instead of a few seconds. You'll notice the difference.
 
Strange decision to have 10th gen in the charts but skip 11th gen, calling it a flop, when in your latest CPU-focused feature review (Spider-Man) it was easily holding its ground against the Zen 3 series and was ahead of 10th gen by a measurable margin.
They have 12th gen included, so they are not trying to show Intel as worse. And besides, 11th gen was mostly a flop. Spider-Man is only one game.
 
Just bad IHS design because of those capacitors. The IHS is so thick to keep mechanical integrity, at a big cost to temperatures. Der8auer delidded Zen 4 and got a 20C temp drop.
Going from 40C to 95C in 8 seconds just points to bad thermal transfer.
I wonder what the thermals would be if you just ground 1-1.5 mm off the IHS.
 
AMD said 95C is NOT hot and is safe 0.0. Well, it seems awfully close to the unsafe 96C :)




Intel CPUs have not been running at 95C by default, as a continuous steady state, for 9 years or so. Where did you get that? Unless you mean that handful of computers designed with insufficient cooling that operate at 95C and fail after a couple of years. Running at 95C for a few minutes and running at 95C for years on end are different things. If you don't believe me, sit in a 95C (203F) room for a few hours instead of a few seconds. You'll notice the difference.
Intel quad cores from Haswell onwards reach throttling temperatures with the stock cooler in about two seconds under heavy load. Those temps are pretty normal, then. Also, I doubt these new Ryzens run at 95 degrees at idle, and neither do Intel CPUs. But still, 95 degrees is pretty normal under heavy load and has been for a long time.

I doubt a 100 W heater could heat a room to 95 degrees unless the room is very small. Also, CPUs are not humans.
Going from 40C to 95C in 8 seconds just points to bad thermal transfer.
I wonder what the thermals would be if you just ground 1-1.5 mm off the IHS.
Progress; with the Intel stock cooler that usually takes under two seconds...
 
Progress; with the Intel stock cooler that usually takes under two seconds...
I know that. I had a 4790 and changed the "toothpaste" after the warranty ended.
But after that delid, plus grinding, lapping and a cooler mod, it was no more than 59-60C in a 22-24C room.
Same thing with the 5600X now; waiting for the warranty to end.
Damn, I miss the Athlon days and direct-die cooling.
 
They have 12th gen included, so they are not trying to show Intel as worse. And besides, 11th gen was mostly a flop. Spider-Man is only one game.
It is one game indeed, but the situation is not that much different in the others in terms of performance either. My point was that I'd skip 10th gen instead of 11th.
 
Not that I'm disagreeing; I believe you are right. However, since we are using a 3090 Ti here, it would seem that being GPU bound would be less of an issue, and it could show what we all know: at a certain point, the CPU isn't the issue. In other words, why buy an i9-12900K or 7600X when a lesser CPU will get the job done?
That would depend on your use case. If a cheaper CPU will indeed get the job done, then by all means save your money and go for a cheaper one. On the other hand if you want maximum performance, particularly in content creation or productivity and not just gaming, then a more expensive CPU is mandatory.

Games also tend to get more CPU intensive over time. For example, I have an old Threadripper 2950X and an RTX 2080 Ti, and while the CPU wasn't the fastest at gaming it serves my needs quite well. However, I now play Microsoft Flight Simulator, which is heavily CPU bound, so I could get a big FPS improvement if I were to upgrade my CPU and keep my GPU as-is (they are working on a DX12 implementation, but they have a long way to go). Other games (as I get around to them, since my backlog is a decade long) will also benefit, I'm sure.

So, for future proofing, you definitely want a CPU that has great single-threaded performance, and you can often only get that towards the high end, even if you don't need all the extra multicore performance. Note that for Ryzen, the single-threaded performance tends to drop off when you go over 16 cores because all the cores produce so much heat that the clock speeds just don't go as high. The reviewers will also mention that getting more than about 8 cores when all you do is gaming is a waste of money, and that is mostly true.

Regardless, back to the original question, including the CPU performance at the higher resolutions might be interesting, but I don't think they would add much to the discussion about which CPU you should get, which is why they aren't included in the review.
 
I know that. I had a 4790 and changed the "toothpaste" after the warranty ended.
But after that delid, plus grinding, lapping and a cooler mod, it was no more than 59-60C in a 22-24C room.
Same thing with the 5600X now; waiting for the warranty to end.
Damn, I miss the Athlon days and direct-die cooling.
Agreed. The problem with Athlon XPs (or should I say Socket A CPUs) was "professionals" who broke the CPU into several pieces. So perhaps we have to accept low-quality IHS solutions.
 
Agreed. The problem with Athlon XPs (or should I say Socket A CPUs) was "professionals" who broke the CPU into several pieces. So perhaps we have to accept low-quality IHS solutions.
There were a few other factors involved: the mounting mechanism was a two-point design and the metal was too hard on some coolers. It required both hands, one to keep the cooler fixed and the other to fasten the clamp.
I had my Barton 2500+ chipped on all four sides, but still working, because of the Thermaltake Volcano 7+.
volcano7+.jpg
 
There were a few other factors involved: the mounting mechanism was a two-point design and the metal was too hard on some coolers. It required both hands, one to keep the cooler fixed and the other to fasten the clamp.
I had my Barton 2500+ chipped on all four sides, but still working, because of the Thermaltake Volcano 7+.
volcano7+.jpg
Yeah, that mounting system brings back bad memories. Newer solutions had either a rotating "button" or a much wider locking mechanism. However, that one was still quite easy. This cooler (Thermaltake Golden Orb) cracked tons of CPUs:

gPplITv.jpeg


Legendary times *nerd*
 
That would depend on your use case. If a cheaper CPU will indeed get the job done, then by all means save your money and go for a cheaper one. On the other hand if you want maximum performance, particularly in content creation or productivity and not just gaming, then a more expensive CPU is mandatory.

Games also tend to get more CPU intensive over time. For example, I have an old Threadripper 2950X and an RTX 2080 Ti, and while the CPU wasn't the fastest at gaming it serves my needs quite well. However, I now play Microsoft Flight Simulator, which is heavily CPU bound, so I could get a big FPS improvement if I were to upgrade my CPU and keep my GPU as-is (they are working on a DX12 implementation, but they have a long way to go). Other games (as I get around to them, since my backlog is a decade long) will also benefit, I'm sure.

So, for future proofing, you definitely want a CPU that has great single-threaded performance, and you can often only get that towards the high end, even if you don't need all the extra multicore performance. Note that for Ryzen, the single-threaded performance tends to drop off when you go over 16 cores because all the cores produce so much heat that the clock speeds just don't go as high. The reviewers will also mention that getting more than about 8 cores when all you do is gaming is a waste of money, and that is mostly true.

Regardless, back to the original question, including the CPU performance at the higher resolutions might be interesting, but I don't think they would add much to the discussion about which CPU you should get, which is why they aren't included in the review.
Future proofing is always a consideration when building your own PC. That can be a bit of a challenge at times. Consider last gen AMD CPUs. If you built a PC in the past year or two, your upgrade path is somewhat limited because the latest gen uses a different socket. And, you'll have to get new RAM as well. I try to look at the CPU, RAM, GPU that will do what I need it to do without paying a big premium to have the top-of-the-line parts.

Generally I go with an N-1 philosophy. What I mean is that if the i9-12900 is the top dog, I would usually go with the i7. If the 3090 Ti is the best, I'd go 3080. That is usually driven by price, meaning I look for that sweet spot of great performance without paying top dollar.

I guess this whole article is out of date now. Let's see how these comparisons change with 13th gen. While I think AMD might win the GPU wars this round, I think Intel may have avoided disaster with its 13th-gen CPUs. Of course, we'll need to see some real benchmarks to do a real analysis.
 
Temps, pricing, no DDR4 support.
I've seen recent undervolting tests that show you can get and keep the temps down on the new AMD CPUs, but otherwise, yeah, Intel seems to have gotten their CPU launch mostly right. Assuming you can trust their performance numbers.
 
So if you're primarily a gamer and you already have an AM4 board + DDR4 (and happen to game at 1080p with a 3090 Ti), you could spend ~$430 on a still-recent 5800X3D for an average of ~203 FPS. Or you could spend ~$870 on a 7600X + AM5 board + DDR5 for... 8 more frames!

I'd like to see another article on that with more games and at 1440p.
Perhaps without 'Remastered-90s-Half-Life-Mod-Turned-Pointlessly-High-Framerate-Generator (2012)'.
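Putting rough numbers on that trade-off, here's the cost-per-frame math as a quick sketch; the prices and frame rates are only the approximate figures quoted above (this poster's estimates, not review data):

```python
# Cost-per-frame comparison using the rough figures quoted above:
# ~$430 for a drop-in 5800X3D averaging ~203 FPS vs ~$870 for a
# 7600X + AM5 board + DDR5 averaging ~211 FPS (poster's estimates).

upgrades = {
    "5800X3D (drop-in on AM4)": (430, 203),
    "7600X + AM5 board + DDR5": (870, 211),
}
for name, (cost_usd, avg_fps) in upgrades.items():
    print(f"{name}: ${cost_usd / avg_fps:.2f} per average FPS")

extra_cost, extra_fps = 870 - 430, 211 - 203
print(f"Marginal cost of the platform switch: ${extra_cost / extra_fps:.0f} per extra frame")
# -> roughly $2.12/FPS vs $4.12/FPS, or about $55 for each additional frame
```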
 
Future proofing is always a consideration when building your own PC. That can be a bit of a challenge at times. Consider last gen AMD CPUs. If you built a PC in the past year or two, your upgrade path is somewhat limited because the latest gen uses a different socket. And, you'll have to get new RAM as well. I try to look at the CPU, RAM, GPU that will do what I need it to do without paying a big premium to have the top-of-the-line parts.

Generally I go with an N-1 philosophy. What I mean is that if the i9-12900 is the top dog, I would usually go with the i7. If the 3090 Ti is the best, I'd go 3080. That is usually driven by price, meaning I look for that sweet spot of great performance without paying top dollar.

I guess this whole article is out of date now. Let's see how these comparisons change with 13th gen. While I think AMD might win the GPU wars this round, I think Intel may have avoided disaster with its 13th-gen CPUs. Of course, we'll need to see some real benchmarks to do a real analysis.
I know what you mean with the N-1 approach, and that will save you tons of money. I think that when I upgrade I'll probably do the same instead of going after top tier parts (but considering my current desktop was my first one, I think I had a right to splurge a little, but it's not something I intend to continue, especially with prices as crazy as they are). In fact, even N-2 might be the way to go these days. It depends greatly on how the competition and pricing goes.
 
@Stephen I just have one question: why are you testing Cyberpunk at medium settings? The game is incredibly light on the CPU when you test with RT off. The real CPU test is with RT maxed out; lots of CPUs struggle to hold 60 FPS with that. And since you are using a 3090 Ti for the gaming tests, why not enable DLSS to avoid any GPU bottleneck in a max-RT scenario?
 
If someone wants an upgrade, the 5800X3D is the way to go.

Switching to the 7600X is stupidly expensive for now: 350 euros for the CPU, 300+ euros for a motherboard, 300 euros for decent DDR5, 200 euros for a new PSU (considering the new video cards and future-proofing your system), and 150 euros for a better cooler or AIO. That's roughly 1,300 euros all in.


While someone on AM4 might only need to change the CPU to a 5800X3D.
 
While someone on AM4 might only need to change the CPU to a 5800X3D.
If you want the best on the AM4 platform, yes.
Otherwise a 5600X can work just fine up to an RTX 3080.
I don't want more than 3070 or 3070 Ti performance for this platform with its PCIe 4.
 