Core i5-8400 vs. Overclocked Ryzen 5 1600

The 720p benchmarks have been shown to be irrelevant time and again. Take a look at the recent Gamers Nexus review of the 8350K. Overclocked, it beats the crap out of the 8400 in pretty much everything they tested. So going by the future-proofing argument, the OC'd 8350K is going to play future games better than the 8400. Is it, though? I don't think so.
They are only irrelevant to AMD fanboys who go crying about the performance results; everyone else finds great relevance in them, as most professional review sites now include them. You can continue watching AMD fanboys posting from mom's basement on YouTube about how much faster Ryzen is, though... I'm sure that is helpful
 
Wasn't the purpose of DirectX 12 to reduce the load on the CPU and send it to the GPU? So why are we testing Battlefield 1 in DX12? And DX12 in this game is broken.

Also, use multiplayer in games for testing (I know it's hard to measure FPS, but you should at least find out the average FPS). Battlefield 1 multiplayer is not the same as the single player. Test a 64-player map and see which CPU is the best. Where are games like Crysis 3 and Assassin's Creed Origins? Those are all heavily threaded games. If you test multi-core CPUs, then use multithreaded games; otherwise you may as well use apps.
 
They are only irrelevant to AMD fanboys who go crying about the performance results
They are also irrelevant to everyone who doesn't care about performance in today's games with future graphics cards, because that's all 720p results show you. I already demonstrated their irrelevancy for every other purpose. An overclocked 8350K beats the 8400 at 720p in most games; check the Gamers Nexus review that just came out. Is it more future-proof? I don't think so.

Calling me or anyone else an AMD fanboy is not an argument. If anything, it's an admission of defeat. Frankly, you are the only one who brought AMD into the mix; I didn't. I was purely comparing Intel CPUs. That shows me that, if anything, you are the Intel fanboy here.

everyone else finds great relevance in them, as most professional review sites now include them.
Most professional review sites also include 4K; those results are still irrelevant for CPU benchmarking. So, why 720p? Why not 320p? Or 240p? After all, we want to remove GPU bottlenecking, right?

You can continue watching AMD fanboys posting from mom's basement on YouTube about how much faster Ryzen is, though... I'm sure that is helpful

Again, calling me an AMD fanboy doesn't count as an argument. I can call you an Intel fanboy back and we are left exactly where we began. I didn't mention AMD in any shape or form in my argument; you are the one who wanted to turn this into a versus match. I guess you are an Intel fanboy posting from your mom's basement about how much faster the 8350K is, right?

PS: Never mind, I just checked your previous posts. You are an Intel fanboy indeed, turning every post, relevant or not, into Intel vs AMD even when no one mentions AMD. And you dare call other people fanboys! The irony is astonishing.
 
What about machines that pull dual gaming + productivity/rendering duties? Ryzen's specs (12 threads) suggest that it might be better for this scenario.

Yep: "Gaming aside, the R5 1600 is the superior CPU for productivity even before it's overclocked (vastly superior in some tests)."

Cheers Steve, I missed that line in the conclusion (it was a pretty comprehensive, information-dense conclusion!).

I don't think I could go wrong choosing either processor. What hurts right now is RAM prices, with no relief in sight.
 
With (presumably) cheap-ish B360 mobos in the near future, the non-K Intels will likely gain a slight advantage when it comes to price/performance gaming systems.
 
Again, calling me an AMD fanboy doesn't count as an argument. I can call you an Intel fanboy back and we are left exactly where we began. I didn't mention AMD in any shape or form in my argument; you are the one who wanted to turn this into a versus match. I guess you are an Intel fanboy posting from your mom's basement about how much faster the 8350K is, right?

You mean you want me to repost the graph up top proving the Intel quad is faster in gaming? I thought you checked my previous posts? Misinformed and confused is no way to go through life.
 
Most professional review sites also include 4K; those results are still irrelevant for CPU benchmarking. So, why 720p? Why not 320p? Or 240p? After all, we want to remove GPU bottlenecking, right?

No reviewer who takes their job seriously will test CPUs at 4K; I don't have the words to describe how dumb that is. As for the low-resolution testing, either get over it or go away; we're all sick of reading this nonsense.
 
Please link to an i5-8400 for $190. I've got a Fallout 4 build waiting on one, thanks.

Grabbed a Ryzen 5 1400 with an ASRock AB350M Pro4 from Micro Center for a customer build yesterday; $196.36 after tax.

I'd love to see that @ 4.0GHz on your performance-per-dollar chart LOL
 
Please link to an i5-8400 for $190. I've got a Fallout 4 build waiting on one, thanks.

Grabbed a Ryzen 5 1400 with an ASRock AB350M Pro4 from Micro Center for a customer build yesterday; $196.36 after tax.

I'd love to see that @ 4.0GHz on your performance-per-dollar chart LOL

I feel like you're trying to be smart with this comment, but you've made it without reading the entire article. I quite clearly stated numerous times that you can't buy the Core i5-8400 right now.

The Ryzen 5 1400 is a really nice budget CPU; cost per frame is great, and we've shown this countless times already. That said, it's nowhere near the same league as the Core i5-8400 for serious gaming.
 
Most professional review sites also include 4K; those results are still irrelevant for CPU benchmarking. So, why 720p? Why not 320p? Or 240p? After all, we want to remove GPU bottlenecking, right?

No reviewer who takes their job seriously will test CPUs at 4K; I don't have the words to describe how dumb that is. As for the low-resolution testing, either get over it or go away; we're all sick of reading this nonsense.
Hi, I really don't care about the whole Intel vs AMD thing, but I am one of those users with a 4K monitor who is interested in seeing 4K results. Maybe I was dumb buying a 4K monitor, but I really do like it, especially seeing results done with tweaked settings. I knew going into it that it was going to be hard to drive, which is why I like seeing results for it; a lot of those so-called "dumb, non-serious" reviewers do extra runs with tweaked settings, like dropping things down to medium, to give an idea of getting good frames in fullscreen.

Oh, also, in terms of CPU, those dumb reviewers did show the FX-8350 being the worst in 4K results when I was looking for a budget GPU for my 4K monitor, so the CPU does matter at 4K. Granted, they were all weak at ultra, but the FX-8350 was around 10-15 FPS below the others. Not in every game; there were some where the 8350 showed around a 5 FPS difference or none at all, but there were ones that showed it heavily below, especially when setting graphics below ultra. That's where the CPU comes more into play.
 
People give examples between Intel processors, about the 720p conclusions, never mentioning the word AMD, and people either ignore them or call them AMD fanboys.

Nice. Objectivity...
 
People give examples between Intel processors, about the 720p conclusions, never mentioning the word AMD, and people either ignore them or call them AMD fanboys.

Nice. Objectivity...

Not sure which people you are talking about, but I agree the thread has been hijacked. Low-resolution testing back in 2011 comparing the 2500K and 2600K proved that the 2600K was at least 20% faster. Yet everyone who tested at 1080p or higher said the 2600K was no faster than the 2500K for gaming.

I am one of those people who messed up and only tested at 1080p and 1600p. https://static.techspot.com/articles-info/353/bench/Gaming_03.png

Here is the Hard|OCP testing: https://www.hardocp.com/article/2011/01/03/intel_sandy_bridge_2600k_2500k_processors_review/4
 
No reviewer who takes their job seriously will test CPUs at 4K, I don't have the words to describe how dumb that is. Can you get over the low resolution testing, either get over it or go away we're all sick of reading this nonsense.
I have no problem with low-resolution testing. Actually, I agree that it's the only way to test a CPU. The problems I have are with the conclusions drawn from the testing.
 
Not sure which people you are talking about, but I agree the thread has been hijacked. Low-resolution testing back in 2011 comparing the 2500K and 2600K proved that the 2600K was at least 20% faster. Yet everyone who tested at 1080p or higher said the 2600K was no faster than the 2500K for gaming.

I am one of those people who messed up and only tested at 1080p and 1600p. https://static.techspot.com/articles-info/353/bench/Gaming_03.png

Here is the Hard|OCP testing: https://www.hardocp.com/article/2011/01/03/intel_sandy_bridge_2600k_2500k_processors_review/4
A thread where people express a different opinion is NOT hijacked. Yes, it is disappointing and tiresome, after so much testing, to also have to explain things you might consider self-explanatory, but that's not thread hijacking. Just different opinions, different perspectives.

What I see from that Hard|OCP testing.
The 2600K is 4 cores/8 threads; the 2500K is 4 cores/4 threads. In those tests the i7 920, a much older processor, was also faster than the 2500K when overclocked to 3.6GHz, and the i7 920 is again a 4-core/8-thread processor. So the extra threads helped in low-resolution testing, even in games from that period.

Fast forward to 2015 and something seems to have changed.
https://www.techspot.com/article/1039-ten-years-intel-cpu-compared/page5.html

Based on your testing, the 2700K, a faster version of the 2600K, scores almost the same as the 2500K. So that "20% faster" you cite, based on Hard|OCP's low-res testing in 2011, ends up being a fantasy four years later. It never happened, or it was erased by advancements in how games are programmed.

Also, while you don't have an i7 920 in that testing, you do have an i7 870, which is faster than a stock 920 (Intel, WT...?). Well, the advantage the 920 was enjoying in Hard|OCP's low-resolution testing is also gone. In 2015 the 870 is slower than the more modern 2500K, when in 2011, at ultra-low resolutions, the 920, or any processor with similar specs, looked much, much more future-proof than the 4-core/4-thread 2500K.

YOU CAN NOT draw conclusions from low-res testing in PRESENT games. It looks logical, but things change: minimum requirements change, the way games are programmed changes, and what hardware companies push into the market changes. Based on TODAY, the 8400 is a miracle. But this miracle will fade much faster than your low-resolution testing makes us believe.

Please don't see this as hijacking, just as a different opinion.
 
A thread where people express a different opinion is NOT hijacked. Yes, it is disappointing and tiresome, after so much testing, to also have to explain things you might consider self-explanatory, but that's not thread hijacking. Just different opinions, different perspectives.


I've improved many of my testing methods; this is a much better example...
https://www.techspot.com/review/1325-intel-pentium-g4560/page4.html

The 2600K is clearly faster than the 2500K today, just as the 7700K is faster than the 7600K.

Finally, my Metro results above are heavily GPU-bound on the modern quad-cores, or really on any Intel quad-core released in the last six years.
 
I've improved many of my testing methods; this is a much better example...
https://www.techspot.com/review/1325-intel-pentium-g4560/page4.html

The 2600K is clearly faster than the 2500K today, just as the 7700K is faster than the 7600K.

Finally, my Metro results above are heavily GPU-bound on the modern quad-cores, or really on any Intel quad-core released in the last six years.
I don't have a problem with your testing. I love your tests. I have been reading TechSpot for years and I usually link to its articles and reviews because they are well written. The site is also well designed; you don't get lost like on other sites. You know where to look and you find what you are searching for.

I only have a different opinion on low-res testing and on drawing conclusions based on it. The 2600K is 8 threads, the 2500K is 4 threads; the 7700K is 8 threads, the 7600K is 4 threads. DX12, Vulkan, modern programming: all will favor the processor with more threads (I am talking about processors with the same architecture; AMD's FX line proved that no matter how many threads you have, you can still fail miserably, and AMD is still paying for this with lower Ryzen sales).
With Intel going 6 cores this year and 8 cores next year for its MAINSTREAM platform, 6 cores will become a minimum requirement in no time. The next 3 years will be the complete opposite of what we saw in the last 7.
 
With Intel going 6 cores this year and 8 cores next year for its MAINSTREAM platform, 6 cores will become a minimum requirement in no time. The next 3 years will be the complete opposite of what we saw in the last 7.



1) Please link us the Intel article stating the i5-9400 will be an 8-core desktop part; otherwise you are simply lying, making your entire argument inaccurate.

2) Below is a breakdown of physical CPUs from Steam's September hardware survey; notice the physical six-core market percentage. Six months into Ryzen, the 6-core+ market share is 2%. May 2017 was the first time quad cores passed dual cores in the hardware survey; it took ten years. It won't take six cores as long, but it's not going to happen in 2-3 years either.


Physical CPUs    Share     Change
1 CPU             1.32%    +0.04%
2 CPUs           36.83%    -0.08%
3 CPUs            1.83%    -0.38%
4 CPUs           57.94%    +0.58%
5 CPUs            0.00%    -0.01%
6 CPUs            1.42%    -0.17%
7 CPUs            0.00%     0.00%
8 CPUs            0.59%     0.00%
10 CPUs           0.02%    +0.01%
11 CPUs           0.00%     0.00%
12 CPUs           0.02%     0.00%
14 CPUs           0.00%     0.00%
16 CPUs           0.01%     0.00%
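As a quick sanity check of the "2%" six-core-and-up figure above, the relevant rows of the survey excerpt can be summed (a minimal Python sketch; the percentages are copied verbatim from the table above, not fresh survey data):

```python
# Sum the 6-core-and-up rows from the Steam survey excerpt
# to check the "6 core+ market percentage is 2%" claim.
six_plus_shares = {6: 1.42, 8: 0.59, 10: 0.02, 12: 0.02, 16: 0.01}  # cores -> share (%)

total = sum(six_plus_shares.values())
print(f"6+ core share: {total:.2f}%")  # prints "6+ core share: 2.06%"
```

So the rounded "2%" figure checks out against the table as posted.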
 
What I don't get is the fuss about PC resources that have improved incrementally, yet the near-complete disregard for a vital component that has improved exponentially: NVMe storage.

Indicatively, storage read speed progression has been roughly 100MB/s for a SATA HDD, 500MB/s for a SATA SSD, and 3400MB/s for Samsung NVMe SSDs.

I.e. after NVMe, your next best option (a SATA SSD) is roughly 6x slower.

It defeats me that folks don't regard the matter as important. Me, I will prefer the best NVMe drive and the option of more.

Yet the pros and cons of AMD vs Intel rigs regarding NVMe are almost never discussed, AFAIK.

The huge catch is that these high-bandwidth devices have snuck up on the industry. Each NVMe drive needs 4 PCIe lanes, and sub-Threadripper systems simply don't have them to spare.

It's murky, but AMD is a clear winner, and it comes down to having four extra PCIe 3.0 lanes available.

Both AMD and Intel allow NVMe port(s) on the chipset, but that's not full strength, as it's 4 lanes of chipset bandwidth shared with many resources, and it also lags due to chipset overheads.

AMD, however, allows an additional NVMe drive which IS full strength, using dedicated lanes direct to the CPU.

To the extent that a game, for example, needs to interact with storage, the fastest storage you can make available by far is an NVMe drive on an AMD Ryzen motherboard.
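The read-speed tiers quoted above can be put in perspective with a back-of-the-envelope sketch (the MB/s values are the rough figures from this post, and the 10 GB asset size is a hypothetical example, not a benchmark):

```python
# Rough sequential-read comparison using the figures quoted above:
# ~100 MB/s SATA HDD, ~500 MB/s SATA SSD, ~3400 MB/s NVMe SSD.
SPEEDS_MBPS = {"SATA HDD": 100, "SATA SSD": 500, "NVMe SSD": 3400}
ASSET_MB = 10_000  # hypothetical 10 GB sequential load

for name, mbps in SPEEDS_MBPS.items():
    seconds = ASSET_MB / mbps                  # time to read the asset
    slowdown = SPEEDS_MBPS["NVMe SSD"] / mbps  # how much slower than NVMe
    print(f"{name}: {seconds:.1f} s for 10 GB ({slowdown:.1f}x slower than NVMe)")
```

This puts the SATA SSD at about 6.8x slower than NVMe for sequential reads, consistent with the roughly 6x figure above.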
 
I feel like you're trying to be smart with this comment, but you've made it without reading the entire article. I quite clearly stated numerous times that you can't buy the Core i5-8400 right now.

The Ryzen 5 1400 is a really nice budget CPU; cost per frame is great, and we've shown this countless times already. That said, it's nowhere near the same league as the Core i5-8400 for serious gaming.

I wasn't trying to be smart at all; the 8400 destroys Ryzen in Fallout 4 when paired with fast RAM, and I'm stuck with a $199 preorder unless you know where I could place a better preorder ($190, as you stated).

Also, I find the Ryzen 5 1400 @ 3.9GHz to be excellent with a Vega 56 and 75Hz 1440p FreeSync for "serious gaming." Outside of a few niche scenarios such as 144Hz gaming, I feel like Intel barely has a leg to stand on these days, though I don't mind recommending them for appropriate use cases such as Fallout (with 3773MHz DDR4).
 
Really good review as usual

Actually, a very poor review. The test should have had 16GB of memory in all three machines. Some game titles benefit significantly from more memory, even going from 16GB to 32GB. Since the machines did not have equal memory, the results are complete garbage. I expect that if all three machines had 16GB of memory, the Intel machines' FPS results would be much lower and they would be in the same ballpark.
 
Really good review as usual

Actually, a very poor review. The test should have had 16GB of memory in all three machines. Some game titles benefit significantly from more memory, even going from 16GB to 32GB. Since the machines did not have equal memory, the results are complete garbage. I expect that if all three machines had 16GB of memory, the Intel machines' FPS results would be much lower and they would be in the same ballpark.

Steven has posted multiple times why he only uses 16GB in the AMD test system, as well as providing a detailed test showing that going beyond 16GB does not affect gaming in any way. Show us a professional review that proves him wrong (not "I know it's true" or fanboy-from-mom's-basement tests posted on YouTube)? We will wait....

https://www.techspot.com/article/1043-8gb-vs-16gb-ram/
 
At 2560x1080 with a GTX 1070, which of those two CPUs would you recommend? My PC is only for gaming.
I'm going for the R5 1600 for the socket longevity. AM4 will be compatible with Zen+ in February and will live up to 2020, but Intel changes sockets every 2 generations or 2 years (sometimes less).

Greetings
 
Good review. I'm going for the R5 1600; I got the motherboard last month (MSI B350 Tomahawk). Maybe I'll get an RX 560 or 1050 for 1080p 60fps gaming, enough for me. The other use is video editing, where 16GB of 3000MHz RAM will help.

Greetings
 
I'm liking my Ryzen 1700X and I just bought a Quad Core 1500X that hasn't arrived yet. I have no problems supporting AMD's efforts with Ryzen. They work fine for my uses.

That said, My Intel CPUs (6800K, 7700K, 6600K, 6700K and 7900X) are pretty potent, although expensive.

I have no problem buying AMD CPUs, but I like NVIDIA GPUs and this is where AMD needs to concentrate their efforts for a while.
 