AMD Ryzen 5 5600X Review: 6-Core Gaming Beast

One thing I will say in AMD's defense is that they brought competition to Intel, and that comes at a cost. The price increase is justifiable if we are going to continue to see these kinds of performance increases. I will gladly pay that price.

Also, the price hike is what continues to pay for the R&D that keeps AMD competitive; the alternative is going back to overpaying for small incremental upgrades to Intel CPUs.
 
You know, it's getting to the point that there is literally NO reason for a gamer to get more than a hexacore CPU. Look how fast tech is moving now, with new CPUs performing much better than far more expensive CPUs from just one year prior.

Making a heavier investment in a higher-end CPU makes no sense when it will just be matched or beaten one year later by a CPU that is far less expensive. Professional apps are different, because they MAKE you money and the more expensive CPUs become relevant, but for gaming, the hexacore Ryzen 5 is the only CPU that should be considered right now. Since they all have similar gaming performance, you won't be upgrading any less often by getting a Ryzen 7 or Ryzen 9, but you WILL be paying a LOT more to do so each time.

Agreed, for the most part, but...

When you pay for a whole build of CPU, RAM, motherboard, video card, etc., $100 doesn't make much of a difference (in the $300-ish CPU market). Moreover, when you buy something you usually spend some time choosing components, so the next time you'll pay that cost again for the same thing. It's OK to spend some time once every several years, but not on a regular basis (like Steve and Tim do). And again, it's a bit risky, I mean installing a CPU and so on. It's easier to do it once, and then again only when it's really required.

But when you don't have a high-end GPU, there's definitely no need to buy anything above a 6-core/12-thread CPU right now. Unless you really need it for something besides gaming.
 
Well, the next question is for the author: what happened to the 1% low graphs?

If they've been dropped, I missed it.

About the price hike: I guess they (at AMD) have lots of Zen 2 parts on the market, I mean the 3600X and 3600XT, and they need to clear the stock. Btw, thanks to COVID-19, we have a bit of a price hike here as well. The 3600 was $175; now it's $200 again. No reason to expect lower demand in the near future, I guess. And after that, there will be time to focus on next-gen Intel products, and then the sun will come out...
 
$300 for a 6-core/12-thread chip is a bit of an overpay these days.
I'm not happy about the $50 premium over the 3600X, but at the same time, it does outperform the competition's $400 8-core solution in gaming, isn't that far behind in heavy multi-threaded workloads, and beats it in lightly threaded apps.

CPU pricing isn't just about the number of cores; you also pay for IPC and gaming performance ... as Intel clearly demonstrated previously, as well as for more advanced platform benefits (PCIe 4.0, etc.).

Unfortunately, with public companies, stockholders demand that the consumer lose so they can win. Nothing new here ...
 
Why is the score only 85/100? Once again I should point out how Nvidia cards regularly get 100/100 ( https://www.techspot.com/review/1174-nvidia-geforce-gtx-1080/ , https://www.techspot.com/review/1182-nvidia-geforce-gtx-1070/ ) or 95/100 ( https://www.techspot.com/review/2124-geforce-rtx-3070/ , https://www.techspot.com/review/1811-nvidia-geforce-gtx-1060/ ) despite those cards being either overpriced or having many undeniable weaknesses. Or both.

Now AMD releases a CPU with around a 20% IPC improvement and 400 MHz higher boost clocks while retaining the same TDP as the previous-generation product.
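As a rough sanity check on what those two numbers add up to: single-thread performance scales roughly with IPC × clock, so the gains compound. A back-of-envelope sketch (assuming AMD's own claimed ~19% Zen 3 IPC gain and the 4.2 GHz → 4.6 GHz boost step from the 3600 to the 5600X):

```python
# Back-of-envelope single-thread uplift: performance ~ IPC * frequency.
# Assumptions: ~19% IPC gain (AMD's Zen 3 claim) and the boost clocks below.
ipc_gain = 1.19        # Zen 3 vs Zen 2 IPC improvement (claimed)
old_boost_ghz = 4.2    # Ryzen 5 3600 max boost
new_boost_ghz = 4.6    # Ryzen 5 5600X max boost

uplift = ipc_gain * (new_boost_ghz / old_boost_ghz) - 1
print(f"Estimated single-thread uplift: {uplift:.0%}")  # ~30%
```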

And that is worth 85/100?

What in the heck should AMD offer to get at least 95/100? A 50% IPC improvement, 1 GHz higher boost clocks, and a 50% lower price vs the last generation?
 
Will there be a price war? Not sure with Intel - not the smartest cookies in the box - I think they will hold, and it will cost them market awareness, plus they might lose automatic loyalty.
How much stock does AMD have of the 3600? What are their production plans?
If they priced the above chip at $260, then the 3600 would have to drop to at least $180, etc.
So I think AMD is probably making good margins and they have OPTIONS.
Anyway, hoping for a discount in 6 months as supply stocks build and the cream has already been taken (e.g. moving the next buyers off the rank).

Imagine getting a 3600 + motherboard + 16 GB for somewhere between $300 and $350 - for most PC users that would be a corker of a machine.
I got mine in Feb/Mar this year: R5 3600 = £153, MSI B450 Tomahawk Max = £95, 16 GB 3600 CL18 RAM = £85. However, I have to point out that this was just before the Covid madness, and a pure fluke, as I'm usually an ***** at predicting tech pricing and usually pay over the odds. So I'll sit here in my temporary smugness until I blow it all on some overpriced storage!
 
Agreed, for the most part, but...

When you pay for a whole build of CPU, RAM, motherboard, video card, etc., $100 doesn't make much of a difference (in the $300-ish CPU market).
Well, I was thinking more of banking the difference and using it to supplement the next build, but I wasn't clear about that and I should've been.
Moreover, when you buy something you usually spend some time choosing components, so the next time you'll pay that cost again for the same thing. It's OK to spend some time once every several years, but not on a regular basis (like Steve and Tim do).
That's probably a situation where I'm not typical. Having been building PCs since 1988, it takes me mere minutes to decide what parts to use in a build because I generally ignore brand names and look at specs instead. I know what to look for and the rest is just pricing.
And again, it's a bit risky, I mean installing a CPU and so on. It's easier to do it once, and then again only when it's really required.
Again, my level of experience isn't typical. I could install a CPU with my eyes closed 100 times and the computer would POST 100 times. When I do a standard platform upgrade, it's usually a CPU-mobo-RAM affair. All I do is mount the CPU (w/cooler) and RAM to the motherboard, mount it all in my case as a single unit and connect the cables. Thanks to AMD, my most recent platform upgrade was just a CPU drop-in and a BIOS update to go from my R7-1700 to my R5-3600X. Once you're used to it, it's as easy as pie.
But when you don't have a high-end GPU, there's definitely no need to buy anything above a 6-core/12-thread CPU right now. Unless you really need it for something besides gaming.
Another thing that I didn't include in my post (sorry about that) but did post elsewhere is that I have the same philosophy concerning video cards. The two work very well together because you only pay for what you want, not for what you may or may not use. It's like buying a pickup truck because you MAY need to tow a boat at some point; meanwhile, you're paying huge gas costs to do the same thing that you could do in a Civic.

The Coles Notes version of it is that if you only buy Nvidia cards that end in 70, you'll game very well and save hundreds of dollars over time. You'll also still be able to sell your cards when you're done with them, which saves you even more cash.
 
Kinda "opened my eyes" with that last explenation. I have been one of those
"You have to have the number of cores it can utilize" people. Just because that's normaly what you need to have the nessesary processing power. But of course it's the combined processing power of the utilized cores/threads that matter, be it one or 16.
While the 5600X and 5800X are very good, I would wait for the better-value non-X parts, assuming they are coming. Or for the price adjustments if Intel's next gen closes the gap.
 
Why is the score only 85/100?
If it was priced like we were all hoping, then I'm sure it would have scored higher. It's definitely a great processor, but it's no bargain - the cost per frame chart shows it being slightly more expensive than both the i5 and the i7.
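For anyone who hasn't looked at those charts: cost per frame is just the launch price divided by the average FPS across the tested games. A quick sketch of the math (the prices are ballpark launch MSRPs and the FPS values are made-up placeholders, not TechSpot's measured results):

```python
# Cost per frame = launch price / average FPS across the benchmark suite.
# NOTE: avg_fps values are hypothetical placeholders for illustration only.
cpus = {
    "Ryzen 5 5600X": {"price_usd": 299, "avg_fps": 200},
    "Core i5":       {"price_usd": 262, "avg_fps": 190},
}

for name, d in cpus.items():
    print(f"{name}: ${d['price_usd'] / d['avg_fps']:.2f} per frame")
```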
 
If it was priced like we were all hoping, then I'm sure it would have scored higher. It's definitely a great processor, but it's no bargain - the cost per frame chart shows it being slightly more expensive than both the i5 and the i7.

I'd like to agree but I don't. When you look at the GeForce 1080 or 1070, both had quite poor cost-per-frame ratios compared to other cards. So cost per frame doesn't matter at all, and that 15 points (out of 100) disappeared for some other reason.
 
I'd like to agree but I don't. When you look at the GeForce 1080 or 1070, both had quite poor cost-per-frame ratios compared to other cards. So cost per frame doesn't matter at all, and that 15 points (out of 100) disappeared for some other reason.
When it was introduced, the 1070 offered the same performance as the previous year's best cards and cost only $50 more than its predecessor. It was not only faster than AMD's Fury X but also 40% cheaper. I suspect that's why it got the score it did. I got all that from reading the link you supplied ;)

The 5600X is a fine processor. It's giving i5/i7 performance at a $20 discount, which is good but not earth-shattering. I'm hoping the 5700X will offer better value; otherwise I'll just say damn it and splash out on the 5900X in my next build.
 
When it was introduced, the 1070 offered the same performance as the previous year's best cards and cost only $50 more than its predecessor. It was not only faster than AMD's Fury X but also 40% cheaper. I suspect that's why it got the score it did. I got all that from reading the link you supplied ;)

The 5600X is a fine processor. It's giving i5/i7 performance at a $20 discount, which is good but not earth-shattering. I'm hoping the 5700X will offer better value; otherwise I'll just say damn it and splash out on the 5900X in my next build.

The AMD Fury X was using two-generations-older manufacturing tech, so that is not a valid comparison at all. I'm not blaming you, but that logic is simply flawed.

Using the same logic, the 3600X should be compared against Sandy Bridge (i7-2600K). I would say the 3600X is quite earth-shattering against the 2600K.

So the formula for receiving a perfect score seems to be: publish pure crap using 5-year-old manufacturing tech, then do a die shrink, skipping over one manufacturing process. Get 100/100 👏
 
The AMD Fury X was using two-generations-older manufacturing tech, so that is not a valid comparison at all. I'm not blaming you, but that logic is simply flawed.
It was a current GPU at the time of the 1070's release; that's why reviews of the time made that comparison. AMD ended up having to halve its price to shift them. That's why the 1070 got the high review score and that's why people bought them. I can't see which alternative AMD GPU you think was better value:
  • The Vega 56 came out a year later, was marginally better but cost 50% more.
  • The Vega 64 also came out a year later, was 25% better but cost 100% more.
  • The 5700 XT was 20% better but cost 100% more and came out 3 years later.
  • The RX 590 came out 2 years later and was the same price but offered 25% less performance.
  • The R9 Fury X came out the year before and offered less performance for a far higher price.

I got most of the data from UBM as it was easier to collate. The really telling thing UBM shows is market share - roughly 10x as many people bought the 1070 compared to the other (roughly) equivalent AMD cards. I'm guessing more people agreed with that 10/10 score than you think.
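Putting rough numbers on the list above, here's an illustrative perf-per-dollar comparison relative to the 1070 (treating "marginally better" as ~5%; these are my ballpark readings of the figures, not measured data):

```python
# Relative value vs the GTX 1070, from the rough figures quoted above.
# rel_perf / rel_cost > 1.0 would mean better perf-per-dollar than the 1070.
rivals = {
    "Vega 56": {"rel_perf": 1.05, "rel_cost": 1.50},  # "marginally better", +50% cost
    "Vega 64": {"rel_perf": 1.25, "rel_cost": 2.00},  # +25% perf, double the cost
    "5700 XT": {"rel_perf": 1.20, "rel_cost": 2.00},  # +20% perf, double the cost
    "RX 590":  {"rel_perf": 0.75, "rel_cost": 1.00},  # -25% perf, same price
}

for name, d in rivals.items():
    print(f"{name}: {d['rel_perf'] / d['rel_cost']:.2f}x the 1070's perf-per-dollar")
```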
 
It was a current GPU at the time of the 1070's release; that's why reviews of the time made that comparison. AMD ended up having to halve its price to shift them. That's why the 1070 got the high review score and that's why people bought them. I can't see which alternative AMD GPU you think was better value:
  • The Vega 56 came out a year later, was marginally better but cost 50% more.
  • The Vega 64 also came out a year later, was 25% better but cost 100% more.
  • The 5700 XT was 20% better but cost 100% more and came out 3 years later.
  • The RX 590 came out 2 years later and was the same price but offered 25% less performance.
  • The R9 Fury X came out the year before and offered less performance for a far higher price.

The Fury X was 28nm; the GTX 1070 was 16nm. End of comparison. There's absolutely no point comparing those, as 28nm was 2011 tech and 16nm was 2016 tech. A much more valid comparison is the RX 480 vs the GTX 1070, and the RX 480 offers a better FPS/price ratio. That reason alone is enough to make the GTX 1070 not a 100/100 card.

The review score was given when the review was made, so anything that happens after that doesn't matter at all. When the score was given, there was no real knowledge of what was coming in the future, so...

I got most of the data from UBM as it was easier to collate. The really telling thing UBM shows is market share - roughly 10x as many people bought the 1070 compared to the other (roughly) equivalent AMD cards. I'm guessing more people agreed with that 10/10 score than you think.

We are talking about the review score here, not about how many people bought what. Again, the score was 100/100, not 10/10, and you are entirely missing the point.

The GTX 1000 series was basically nothing more than a die shrink (with some small improvements) of the GTX 900 series. So how does just a die shrink justify a 100/100 score? If there had been a die shrink WITH architecture improvements AND a huge number of new features (like ray tracing acceleration - still quite useless, but still a good new feature), then what would the score have been? 200/100? The problem is that giving 100/100 indicates nothing could have been done better. With the GTX 1070, there was very much that Nvidia could have done better.

Another thing is that Nvidia got rewarded for crappy work (skipping 20nm entirely), so the 1000 series was much better than the 900 series. Using this same logic, let's assume that AMD launches Zen 4 (5nm) and Zen 5 (3nm) at some point in the future. AMD would then get a better score for Zen 5 if they just skipped Zen 4 entirely, because that way the boost compared to the predecessor is larger *nerd*

So overall, you have some points, but they don't hold up within this scoring system.
 
It might be a good idea to change the color of the part being reviewed in the charts so that it stands out among the rest.
I hear you. It's a tad late, but since this review will serve as a future reference for many, we've updated the graphs to highlight the 5600X. We will do the same with the 5900X review. Small but important detail for sure; it was just a ton of work getting the reviews ready back to back, but thanks for the feedback as always.
 
Fury ... blah blah blah
In all honesty, I've gotten a bit bored of this. It just seems a bit silly debating GPUs from 2016 when the review is about a middle-of-the-road CPU from 2020. If you want to give this CPU a 10/10 then you do that; what do reviewers know anyway?
 
In all honesty, I've gotten a bit bored of this. It just seems a bit silly debating GPUs from 2016 when the review is about a middle-of-the-road CPU from 2020. If you want to give this CPU a 10/10 then you do that; what do reviewers know anyway?

Not stupid at all. You do understand that lazy people look at nothing but review scores? So those scores should really represent how "good" a product actually is. If review scores are badly flawed, they should be left out.

I didn't say this CPU should receive 100/100. But if the GTX 1070 received 100/100, then this should receive at least 120/100. The problem is that the GTX 1070 got way too high a score without real merit.
 
I just got my 5600X in the mail yesterday! I was using a 3600X and a 1080 Ti. Playing Watch Dogs Legion at 1440p High settings, I went from about 45-65 fps to 80-100 fps (not very scientific testing, haha). And I already sold my 3600X on eBay for more than I paid for it.

All this from a 4770K about six months ago. As far as gaming goes, I didn't notice any difference between the 4770K and the 3600X in the few games I was playing at the time (Control, Witcher, Halo MCC).
 
I hear you. It's a tad late, but since this review will serve as a future reference for many, we've updated the graphs to highlight the 5600X. We will do the same with the 5900X review. Small but important detail for sure; it was just a ton of work getting the reviews ready back to back, but thanks for the feedback as always.
That small detail makes the graphs miles better. Thanks for the update, and for listening!
 
You may want to correct this: "we did say that the 7600K’s days were counted as...". It should be "days were numbered", if you want to use the correct phrase.

Easy translation mistake to make.
 