AMD Ryzen 5 3600 Review: Best All-Round Value CPU

Absolutely hilarious to see noobs like this guy still whining about "cut and dry" great products like the 3600.
There is ALWAYS a premium to be paid for flagship products. Do I have to remind you that your beloved Intel 9900K is within 10 fps of the 8700K?
https://www.techspot.com/review/1730-intel-core-i9-9900k-core-i7-9700k/page4.html
Conclusion: the 8700K is awesome, but mistakes were made when designing the 9900K.

*rubs hands together*

The 9900K and 8700K are both flagships from different architectures, aren't they? They were released a year apart, weren't they? Are the 3600 AND 3900X both flagships? Do they have different architectures? Were they released a year apart? The 9900K is a hot mess, but Intel is stuck on 14nm, so what is AMD's excuse?

Ryzen 3000 series performance is too close together, and the 3900X is not even good for gaming when you look at the price difference between it and the 3600. The 3600 beats it at times, albeit with SMT off, but it does beat it sometimes. A flagship part should never perform below a part that sits 4 or 5 chips down the stack, should it? There are about 5 Intel chips, including an i5 or two, that are faster than the 3900X. i5s!!! That's not how flagships are supposed to perform. Gamers, gaming, gamers! You're probably tired of hearing that, but in reality, people who email, share photos, surf the net, and use Netflix, Facebook and YouTube are the majority. Next is gaming. After that come professionals. Guess which ones bring in the most revenue. Guess!

3900X vs 3700X: within 3 fps of each other in EVERY game and resolution tested here.
3700X vs 3600: within 10 fps of each other in EVERY game and resolution tested here.

The 3600 is great, but I'm not impressed with the rest of the Ryzen lineup. If you're a gamer or a more casual user, there is NO reason to buy anything higher than a 3600. Not even overclocking helps, and that's not good for AMD, because they needed hits and they got too many misses. In fact, manually overclocking Ryzen is worse than enabling PBO! That's just ridiculous.

The people buying chips like the 3900X are the minority. Now that's usually the case with any flagship, except even fewer will buy it, because it's not a good gaming CPU for what it costs, how it performs and who it's marketed for.

And ffs no, people are not going to start coming up with reasons to buy a slow 8 or 12 core part when they need 4 or 6 fast cores. Ryzen gamers aren't suddenly going to become streamers, and moms and dads aren't going to learn to use Blender and Photoshop just because they got a 12 core chip for cheap. It just doesn't work that way.
 

Most of this is right. To point out one thing, though... no, the 8700K and 9900K *are* the same architecture. Literally. They're both Coffee Lake.
 
Leaving your attitude aside, I think you're raising good questions. But it's just your point of view, and it doesn't necessarily reflect the world any better than AMD's view does.

If I understand your points correctly, you find Intel's product segmentation (not its design) better than AMD's. The i5s are enough for gaming right now, so there's no need to give them more physical cores or Hyper-Threading (or a larger L3 cache), and if someone wants better 1% lows or high-refresh gaming, they need to step up to an i7, which is considerably better in that regard.

AMD's approach is a bit different: it splits products sharply only below the 6-core/12-thread parts. If you pay the sweet-spot price, you get the best all-round CPU for everything. When you consider spending more (people usually think harder when they spend more), you do get certain benefits, but probably not in every aspect of the performance/price ratio, and not in every task.
 

Give attitude, get attitude. Look at the comment I was replying to. I was pretty tame compared to that one.

My comment doesn't need clarification, so I will not repeat myself here.
 
Yeah, I skipped mentioning that totally ridiculous statement. How can a 6-thread CPU be comparable with a 12-thread CPU?

Because they are in the same price range? Is it ridiculous to compare two products in the same price range when one of them offers only 6c/6t and the other 6c/12t? It's the same reason Jay compares the 5700 XT with the 2060 Super and the 5700 with the 2060: they are in the same price range. It's a dollar-to-dollar comparison, not a tier-to-tier comparison.

I was referring more to the use of, for example, the i7-8700K plus a 3rd-party cooler in the comparisons rather than the much cheaper i7-8700 with its included cooler. In fact, there isn't a single non-K part in any of the comparison tables.

I guess it's just for the sake of comparison, then? I don't know why he doesn't benchmark the 8700 instead of the 8700K; maybe it's the same logistics problem. But like I said, it's fair to compare them to the 9600 (in this case represented by the 9600K). You could say an out-of-the-box 9600K (without an OC) is basically the best-case scenario for a 9600.
 
The answer is simple: because Intel intentionally gimps the performance of its CPUs as you go down the stack, with things like removing HT. Remember when the i7 line used to have HT? If the 9600K had it enabled, it would have been on the same level as the 9700K/9900K, just like the 3600 is similar to the 3700X/3900X, with only the small differences in clock speeds being the differentiator in games.

I have no idea why you are supporting Intel on this. It's like saying you want mid-range products to be worse just so the premium you pay for the top-end products is justified.

You are making zero sense. By your standards AMD should price its CPUs based only on gaming results, which is stupid at best (and this is me being nice to you). To you, the simple fact that AMD's CPUs scale tremendously in multithreaded workloads and smoke Intel in productivity means nothing.

The 3900X already scores 100% more than the 3600 in some multithreaded workloads. How much bigger would the difference have been if AMD artificially disabled SMT like Intel does?
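
For a sense of scale, here's a back-of-the-envelope sketch of what near-linear core scaling alone would predict; it's a simplification that ignores clock speed, SMT, cache and memory differences:

```python
# Back-of-the-envelope core scaling, ignoring clock, SMT and memory differences.
cores_3600, cores_3900x = 6, 12

ideal_speedup = cores_3900x / cores_3600  # 2.0x with perfect scaling
print(f"Ideal scaling: {ideal_speedup:.1f}x "
      f"({(ideal_speedup - 1) * 100:.0f}% more multithreaded throughput)")
# A ~100% lead in well-threaded workloads is roughly what perfect scaling
# from 6 to 12 cores predicts, before clock or SMT effects are counted.
```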

I've seen this argument made so many times, and every time the one who made it was publicly shamed by others. (I really hope it wasn't you in the past, since I'm too busy to go back and read old comments.)
 
When I kept seeing the 3600 within 10 fps of the 3900X, I cringed, because that's not how it's supposed to work. Flagships are supposed to be the far better performer. The 3900X is $500, ffs! The only conclusion is that 12 cores are too much for gaming; the 3900X should be an HEDT part. If the 9700K were within 10 fps of the 9900K there would be pitchforks, but it's AMD, so they get a pass? Um, okay.

Example: if the $200 3600 is doing 60 fps and the $500 3900X is doing 70 fps, what does that say about every chip in between? Why should anyone buy anything but the 3600 and 3900X for the foreseeable future? How is this going to help AMD dominate if they only have two chips worth buying for consumers?
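
To put that hypothetical in cost-per-frame terms, here's a quick sketch using the illustrative $200/60 fps and $500/70 fps figures from the example above (not measured review data):

```python
# Rough cost-per-frame comparison using the illustrative numbers above
# (hypothetical figures from the example, not benchmark results).
chips = {
    "Ryzen 5 3600":  {"price_usd": 200, "fps": 60},
    "Ryzen 9 3900X": {"price_usd": 500, "fps": 70},
}

for name, c in chips.items():
    print(f"{name}: ${c['price_usd'] / c['fps']:.2f} per fps")
# 3600: ~$3.33/fps, 3900X: ~$7.14/fps -- more than twice the cost per frame.
```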

Yes, the 3600 is awesome, but mistakes were made when designing Ryzen 2.

Oh, you mean the same situation as the i5-8400 vs. the i7-8700K. See here:

https://www.techspot.com/review/1502-intel-core-i5-8400/page3.html

The 8700K was such a cringey product because of that. Who'd pay twice as much for pretty much the same gaming performance? Those fools.

Your arguments are forced, and here's something AMD does differently from Intel: Intel arbitrarily disables HT on the 8400. That creates the necessary divide between the two products so Intel can keep the price up on the 8700K for productivity loads.

Bravo, Intel, for showing AMD exactly how to win at business: don't sell an actually great midrange product; instead, artificially gimp it to create a reason to squeeze more money out of your customers for a now clearly higher-end product. Isn't that what we all laud Apple for?
 
The same could be said of the 9700K and 9900K. If you overclock the 9600K you can surpass their performance as well; after all, it's not that big a gap to begin with.

That said, overclocking requires a Z-series motherboard, a decent CPU cooler and a decent power supply, as OCing that 9600K is going to increase the CPU's power requirements. You're also going to need a case able to take care of all that hot air, especially during the summer. I've had customer builds that would max out at 63°C with a moderate overclock when I was testing them, yet at the customer's house on a hot summer day the motherboard's CPU temperature warning would go off (old-school ASUS Sabertooth P67).

Point being, you are looking at a lot of added cost for a CPU that is already $50 or more than the competition. That's before considering the convenience of simply installing the CPU with its included cooler, which is exactly what people buying at this price point want.
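
To illustrate how that adds up, here's a rough sketch; the board and cooler premiums are placeholder assumptions for the example, and only the roughly $50 CPU premium comes from the point above:

```python
# Hypothetical extra outlay for the overclocked-9600K route vs. a stock 3600.
# All component premiums are illustrative assumptions, not actual prices.
oc_9600k_extras = {
    "CPU premium over the 3600": 50,   # "already $50 or more" (see above)
    "Z-series board premium":    60,   # assumed extra over a budget board
    "aftermarket cooler":        40,   # assumed; no cooler in the 9600K box
}
stock_3600_extras = {}                 # bundled Wraith cooler, no OC parts

extra = sum(oc_9600k_extras.values()) - sum(stock_3600_extras.values())
print(f"Estimated extra platform cost for the OC route: ~${extra}")
```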

Going along with the budget theme of this processor, are most gamers at this price point even in a position to take advantage of any potential FPS increase from overclocking? Given that you need a 2080 Ti to get those extra FPS, I'm going to say no. The number of budget gamers with 2080 Tis is zero, because you are not a budget gamer if you are dropping $1,200 on a video card alone. The majority of people running these budget CPUs will have GPUs that bottleneck in most games before the CPU ever does. That's not even considering whether they have the high-refresh-rate monitor needed to see the extra FPS.

All of these factors are important to budget-oriented customers, as is convenience, which the 3600 has in spades over the 9600K.

That's all true for budget gamers. I guess I'm just not one. When I build a PC I tend to spend about $1,500-$2,000 overall, as I typically start over, though that's usually after 4 or more years.
I plan on getting a 2080 Super within the next month or so, assuming they aren't impossible to get, lol.
The 9600K isn't a budget model, for one, so I guess the comparison isn't entirely fair, since the 3600X is meant for budget builds. K models are meant for overclocking, and if you compare them head to head, the winner is the 9600K based on gaming performance. Yes, the 3600 will win in other areas like productivity, but it also doesn't win in every productivity category. That's how I see it, anyway.
 
AMD isn't Intel. Therefore, having too many parts performing the same and still not beating Intel's 14nm chips in gaming isn't how you make a comeback. All you're doing is flooding the market with mediocre chips. AMD tried this already with Polaris...

Your HT example is poor at best. How is HT on some parts any different from the different core counts on Ryzen parts?
 
A 6-core Zen 2 part is an 8-core (single-chiplet) Zen 2 part with one or two defective cores that AMD can still sell at a lower price. Are all of them failed 8-core parts? I assume not, but maybe they really are. The fact that reviewers are getting lower maximum single-threaded clocks from the 3600 than from the 3800X or 3900X certainly argues for lower silicon quality on the 6-core parts, and thus some failed cores.

But disabling HT on all the i5s, and now the 9th-gen i7s as well, is mostly a feature decision. Is Intel suddenly only able to sell 9th-gen 6-core chips without HT because its screening is broken? AMD also does this at the very low end, as the 1200 and 2200G have no SMT either, which shows that these threading choices are made purely for product segmentation.

If there were so many dies that failed HT/SMT, why doesn't AMD sell 6c/6t and 8c/8t CPUs as well to recoup even more costs? It seems likely that this type of failure is vanishingly rare, and that AMD chooses not to gimp its midrange CPUs to force people into higher price tiers. The simple answer is that Intel has us trained to accept this, so we do, and AMD knows it can't (yet) get away with the same anti-consumer practice.
 
The 3600 is an insane CPU, comparable to how good the 1600 was two years ago. The 8700K (which I own) is a completely different beast, and I don't think the two are comparable. The 8700K, and all high-end K chips, are for the enthusiast market that's going to pay $200+ for a motherboard and $150+ for a good cooler, plus a delid on top, and take it to the extreme. Mine is currently running at 5.1 GHz core / 4.7 GHz cache with 4000 CL16 RAM.

The 3600 competes in a different market. It's for people who want to pay half the price and get 90% of the performance with no hassle over coolers, delidding, OCing, etc. I would recommend it to basically everyone; it gets the job done about as fast as anything else no matter what the job actually is, be it gaming or productivity. Kudos, AMD.
 
Ultimately, that's the point. As a general recommendation, the 3600 is a great product for almost all uses. There are certainly use cases where you get more performance, even per dollar, from an Intel or another AMD CPU, but as a simple recommendation it'll get almost any job done well at a great price point.
 
Common misconception: with the new BIOS the 3900X can now boost one core at a time to 4.6 GHz, whereas before it couldn't. In no way can a 3900X hit 4.6 GHz on all cores at the same time.
I believe the reviewed 3600 was able to hit 4.2 GHz all-core with an aftermarket cooler, which is very good.
Hmm, yes, I suppose it was unreasonable to expect an all-core overclock of 4.6 GHz, then. Intel CPUs in recent years hit their boost clocks on all cores automatically with sufficient cooling, so I kind of expected the same. Do these chips still boost one core to 4.6 GHz if you manually overclock to 4.2 GHz? If they don't, it would be interesting to see whether manually overclocking actually has a negative effect on applications that are often bottlenecked by a single core.

But I don't think 4.2 GHz is all that impressive; it's not much of an improvement over the previous generation and way short of Intel's stock clocks. The performance numbers do make up for it: the IPC is clearly there on Ryzen 2, which makes it more disappointing, because if these chips could sustain an all-core overclock of 4.6/4.7 GHz they would have matched or even beaten Intel's single-core performance. However, these CPUs are still clearly better buys than the Intel parts and perform faster in most cases. I must say this 3600 is the one I'm most impressed with; the more expensive Ryzen 2 parts only seem to offer more cores for quite a bit more money, and I don't think most users really need more than 6/12 at the moment.

"would be nice" and "reasonably expected" are vastly different things, the 4.2 odd Ghz range and above becomes VERY expensive in terms of electricity and to the silicon as well as whatever solder etc they used and the like.

Intel is "mainly" all about raw IPC focus on a single core at a time
(to keep the explain simple)
AMD went the other way and kept clock speed down just crammed more cores, threads, better solder, overall better subsystems (pcie 4 and all the other magic stuff)

so, AMD "cannot" sit there at 4.6Ghz+ all day long like SOME SOME (repeat again) SOME other actual high performance silicon is able, even fewer at the 5+ghz range..Intel even with their seemingly "amazing" TDP the power use goes up MAGNITUDES when they go above 4.5 let alone 4.8-5+Ghz..there is no longer "close" to its rated TDP like the 3800x @105w, nah, try 180+w cpu alone.

I honestly do not see a "need" to burn that much extra energy to finish encoding say .30seconds quicker than one using ~1/3 less overall power CPU alone..certainly for us little folks, to burn excess 180w etc for the cpu alone when they (GPU etc) at higher res are absolutely perfectly content with one running a wee bit slower ( not to mention much less costly)
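
To put rough numbers on that trade-off, here's a sketch under assumed conditions: a 10-minute encode on a chip drawing 180 W versus the same job taking 30 seconds longer on a chip drawing about a third less:

```python
# Energy for a hypothetical encode: 180 W finishing 30 s sooner vs. ~1/3 less
# power. The 10-minute baseline and exact wattages are assumptions.
fast = {"power_w": 180, "time_s": 600}   # 10 min at 180 W
slow = {"power_w": 120, "time_s": 630}   # 30 s longer at ~1/3 less power

def energy_wh(chip):
    return chip["power_w"] * chip["time_s"] / 3600  # watt-seconds -> Wh

e_fast, e_slow = energy_wh(fast), energy_wh(slow)
print(f"fast: {e_fast:.0f} Wh, slow: {e_slow:.0f} Wh "
      f"({(e_fast / e_slow - 1) * 100:.0f}% more energy to save 30 s)")
```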

Seems to me AMD learned from the Bulldozer era; Intel never learned in all the years after Pentium 4, AMD kept stumbling, and Nvidia keeps mucking things up constantly... the only one of the "big boys" that seems to have learned, and is trying to propel everyone forward, is

AMD.

4.2 GHz all-core, 6+ cores / 12+ threads, at 150 W TOTAL board (not GPU) power... sorry, but I call that a significant win, considering the price point and the overall platform, which these days Intel makes you pay through the nose for to get all the "good stuff". AMD is effectively saying,
"It costs us pennies on the dollar, so why the #$% shouldn't we make them the best we can? It costs nothing extra, and it gives the consumer a solid product at a very fair price for years to come."
 
"would be nice" and "reasonably expected" are vastly different things, the 4.2 odd Ghz range and above becomes VERY expensive in terms of electricity and to the silicon as well as whatever solder etc they used and the like.

Intel is "mainly" all about raw IPC focus on a single core at a time
(to keep the explain simple)
AMD went the other way and kept clock speed down just crammed more cores, threads, better solder, overall better subsystems (pcie 4 and all the other magic stuff)

so, AMD "cannot" sit there at 4.6Ghz+ all day long like SOME SOME (repeat again) SOME other actual high performance silicon is able, even fewer at the 5+ghz range..Intel even with their seemingly "amazing" TDP the power use goes up MAGNITUDES when they go above 4.5 let alone 4.8-5+Ghz..there is no longer "close" to its rated TDP like the 3800x @105w, nah, try 180+w cpu alone.

I honestly do not see a "need" to burn that much extra energy to finish encoding say .30seconds quicker than one using ~1/3 less overall power CPU alone..certainly for us little folks, to burn excess 180w etc for the cpu alone when they (GPU etc) at higher res are absolutely perfectly content with one running a wee bit slower ( not to mention much less costly)

seems to me AMD learned from Bulldozer era, Intel did not over the course of many years after pentium 4 after AMD mucking up after Nvidia muck constantly.... seems the only one of the "big boys" who has learned and is trying to propel everyone forward is

AMD

4.2 Ghz all core 6+c 12+thread at 150w TOTAL board (not gpu) power....sorry, but I call that a significant win, considering the price point and the overall subsystem which these days Intel makes you pay through the nose to get all the "good stuff" AMD is effectively saying
"it costs us pennies on the dollar, why in the #$% shouldn't we make them the best we can, it costs nothing extra and it makes the consumer enjoy a solid product at a very fair price purchase for years to come"
I guess it depends on what you're using it for. I'd prefer fewer cores and higher clocks myself, for my use case, if that's the trade-off. But if you're encoding, then I imagine you'd happily sacrifice a clock bump for more cores. The 3600 only has 6 cores, though, so I didn't think it was unreasonable to expect it to clock higher than the larger core-count models. And for the past decade, chips from both manufacturers have managed to hit their boost clocks on all cores automatically just by adding better cooling, so it's not unreasonable at all to expect the same this time round.

Clock speeds are only half the story. The old Athlon CPUs clocked way lower than Intel parts and delivered more performance. In this case, though, I do feel that being able to overclock all cores to 4.6/4.7 GHz might well have let these chips edge out all but Intel's most expensive in gaming. Instead we see tests where the Intel parts, screaming along at over 5.0 GHz, pull away in lightly threaded applications.

Also, I agree with a lot of the comments here: this 3600 is so good and so cheap that it kind of undercuts its own more expensive siblings. Unless you really need those extra cores, buy this and save a lot of money. It destroys the value argument for a 3700X or 3900X over Intel in gaming too, as they come so close. In fact, I think all but gamers wearing a Rolex and dollar-sign sunglasses should be buying Ryzen now.

Oh, and buy a cooler too; the Wraith Stealth this comes with is pants. Your CPU hits over 80°C and gets thermally limited to around 4.025 GHz at most. Get a Noctua or something.
 
I don't really agree with this, and I'm an 8700K owner too. I run mine on a $100 entry-level Z370 board and a $25 Cooler Master 212+ heatsink. It runs at 5 GHz, by the way, non-delidded.

That being said, the 3600 is insane value compared to the 8700K: $200 vs $350 for similar IPC and slightly lower boost clocks.

The only saving grace for the 8700K (and any Coffee Lake 'K' chip) is that they all reach 5 GHz with relative ease, whereas Ryzen 3000 tops out at 4.2-4.3 GHz. You're paying a huge premium for that extra 700-800 MHz, though!
 
Explain to us what is "mediocre" about AMD, and how Intel gimping the performance of its mid-range CPUs is better for the consumer. You keep writing the same thing over and over with nothing to substantiate your claims. All I hear from you is "gaming, gaming, gaming" like a broken record, even though even there Intel no longer has any significant lead. The days when you could brag about a 20+% lead in game X for the high-end Intel CPUs are over.

Is gaming the only thing you rate a CPU on? Is AMD having good gaming performance across its entire CPU stack really that bad? They should have disabled SMT like Intel does, to better justify the more expensive CPUs, right? That would definitely be a "win" for consumers. If the 9600K had HT, it would have fixed some of the issues it has with low 1% results in games like BFV.
 
Excellent IPC, excellent multi-core performance, excellent gaming performance, excellent platform support, excellent power figures, so I think this CPU really deserved a score of 100.
 
Yes, but you were just lucky. Mine, without a delid, was hitting 99°C at anything above 1.2 V, even in Cinebench. I delidded it and, yeah, there was paste and glue everywhere; some Intel magic right there.

Also, I doubt your temps are manageable in anything that stresses it.
 
I was lucky? I don't think so. The average 8700K OC is between 5.0 and 5.1 GHz. Mine needs 1.36 V for 5.0, which is pretty typical. Temps are in the 60s during gaming, which is good enough for me. It's not a productivity rig, it's a gaming rig, after all.

Rather, I think you got unlucky in that your 8700K had a really bad connection to the IHS and genuinely needed a delid to OC well. I'm sure my 8700K could be pushed further with a delid too, maybe 5.1 or 5.2 if I'm lucky and upgrade my cooling, but that's a lot of effort and cost for a tiny gain in performance. I'd rather put that money towards something else... like maybe a new Ryzen 3000 chip ;)
 
Well, sure, you can see it both ways, but that's the point. It's the luck of the draw, and not just the silicon lottery but the IHS lottery too!

Yesterday I tested my 8700K at stock against the 3600 in the SOTR demo benchmark, and... the 3600 came out on top. By 1-2%, but still. Of course, when OCed to 5.1 GHz I was about 20-25% ahead. My RAM is 4000 CL17.
 
You say the idle temperature was 33°C using the box cooler. I use a Noctua NH-D9L cooler with dual NF-A9 fans (spinning at ~700 rpm / 30%) and Noctua NT-H1 compound, and the lowest I see is 42°C over 30 minutes of monitoring. I used CPUID HWMonitor; what utility did you use? /jb
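
For anyone who wants to compare idle temperatures using the same method rather than different monitoring utilities, here's a minimal logging sketch using Python's psutil. Note that psutil.sensors_temperatures() is only available on Linux/FreeBSD, so this is an alternative approach, not what either utility above does:

```python
# Minimal idle-temperature logger: sample for 30 minutes, report min/average.
# psutil.sensors_temperatures() works on Linux/FreeBSD only; sensor labels
# vary by driver ("k10temp" is the usual label for Ryzen CPUs on Linux).
import time
import psutil

samples = []
end = time.time() + 30 * 60            # 30-minute window
while time.time() < end:
    groups = psutil.sensors_temperatures()
    readings = groups.get("k10temp") or next(iter(groups.values()), [])
    if readings:
        samples.append(readings[0].current)
    time.sleep(5)                      # one sample every 5 seconds

if samples:
    print(f"min {min(samples):.1f} °C, avg {sum(samples)/len(samples):.1f} °C")
```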
 