Core i5-8400 vs. Overclocked Ryzen 5 1600

What about machines that pull dual gaming + productivity/rendering duties? Ryzen's specs (12 threads) suggest it might be better for that scenario.
 
Another good review again, Steve. 1% mins (in addition to avg fps) plus multiple resolutions should really be the standard for all tech sites.
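Since "1% mins" come up a lot, here's a minimal sketch of how that metric differs from average fps (the frame times are invented purely for illustration, not taken from the review):

```python
# Sketch: average fps vs. 1% low, computed from per-frame times (ms).
# The frame-time data here is made up purely for illustration.

def avg_fps(frame_times_ms):
    # Average fps = total frames / total seconds.
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def one_percent_low(frame_times_ms):
    # Take the slowest 1% of frames and report the fps of the
    # 99th-percentile frame time.
    worst = sorted(frame_times_ms, reverse=True)
    cutoff = max(1, len(worst) // 100)
    return 1000.0 / worst[cutoff - 1]

# 99 smooth frames at ~10 ms plus one 50 ms stutter.
times = [10.0] * 99 + [50.0]
print(round(avg_fps(times), 1))        # high average, stutter invisible
print(round(one_percent_low(times), 1))  # stutter shows up here
```

The point of reporting both: a single 50 ms hitch barely moves the average but tanks the 1% low, which is exactly the judder you feel in-game.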

Here's an idea for a future budget article: by how much do Ryzen and Coffee Lake differ on older games? I'm thinking pre-2014 stuff like the BioShock trilogy, Deus Ex: Human Revolution, Half-Life 2, Portal 2, Oblivion / Skyrim, etc. Hell, it's that time of year when I'm more inclined to fire up Amnesia: The Dark Descent (2010) or SOMA (2015), or have a blast at F.E.A.R. (2005) again, than to struggle to maintain interest in some recent 2017 titles. A lot of us play a wider mix of old and new games than most mainstream tech sites tend to represent, and I haven't seen a single site do something like an R3 1300X vs. i3-8100 comparison for older games outside of the usual "12-game bubble" (BF1, Civ VI, GTA V, Hitman, Overwatch, Tomb Raider, Witcher 3, etc.).

In theory, "all old games will run fine on modern CPUs"; in practice, some open-world titles like Morrowind / Oblivion / Operation Flashpoint with large draw distances can absolutely bring a modern CPU to its knees by maxing out only 1-2 cores. In theory Intel would win on IPC (based on Cinebench 1T scores), but the question is by how much those synthetics translate to older games that use 1-4 cores in actual practice. It'd be interesting to do something different and pick a few oldies to test as part of a budget gaming article series.
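The "maxing out only 1-2 cores" point can be made concrete with Amdahl's law: if only a small fraction of a frame's work parallelizes, extra threads barely help and per-core speed dominates. A rough sketch (the parallel fractions are assumptions, not measured numbers):

```python
# Amdahl's law: speedup from n cores when only a fraction p of the
# work parallelizes. Old engines often have p close to zero.

def amdahl_speedup(p, n):
    # Serial part (1 - p) runs at full cost; parallel part divides by n.
    return 1.0 / ((1.0 - p) + p / n)

# A mostly-serial old game (10% parallel) vs. a modern engine (80%).
for p in (0.1, 0.8):
    print(f"p={p}: 6 cores -> {amdahl_speedup(p, 6):.2f}x, "
          f"12 threads -> {amdahl_speedup(p, 12):.2f}x")
```

With p = 0.1, going from 6 to 12 threads gains almost nothing, which is why a 12-thread Ryzen's advantage can evaporate in a 2005-era engine while IPC and clocks decide everything.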
 
Great article, and timely for me. Thanks, Steve. And... you covered all the bases, you Intel-AMD fanboy; just what the hell are we supposed to argue about now? :p

If I were upgrading today (and I probably will in the next 3-4 months), I would have to go with Ryzen, based on the video card I currently have, the amount of encoding I do, and the little bit of graphics work. The only thing that would make me consider the 8400 is if I get the itch to grab an expensive video card while I'm upgrading. Mine works fine for what I play now, so Ryzen is looking good.
 
The other side is that it hides how much faster the Intel CPUs are, and with faster GPUs that margin will open up in the future.

On the one hand, I agree with this statement. Intel still has superior single-core performance, which is still important years later. And as we saw with the Phenom II chips, the "almost as fast" chip really does show its age a lot faster than the faster chip. The first-gen Core i5s are much better for gaming than any Phenom II today.

OTOH, many of the games tested were released when Ryzen was either brand new or hadn't been released yet. Those games were optimized for Intel because AMD had zero competitive parts. Civ VI also showed that games preferring multiple threads run much better on Ryzen than on Intel.

Given that Zen+ and Zen 2 will be on AM4 as well, I'd be more confident buying a Ryzen platform. Ryzen isn't as good as the Core i lineup, but it is an impressive step for AMD: its multi-core performance, which is becoming more important in newer games, is on par with or exceeds Intel's, and future chips will most likely close the IPC/clock-rate gap.
 
From the beginning this review creates the wrong impression. 720p testing, really? Yes, yes, I know: "no GPU bottleneck, shows how processors will play in the future" and stuff. Well, wrong. Future games will probably be heavier on graphics, and game engines will be updated to ask for at least six cores, if not more, as a recommended setting. Do you remember how great the unlocked Pentium was for gaming and how fast it fell on its face? How about throwing a quad-core Kaby Lake i5 in there and seeing what it does in modern games? Does it verify those conclusions made a couple of years ago, again based on 720p results, or does a quad-core look like a bottleneck today?
I also like the choice of words.
Battlefield 1 (Vega, DX12): look at how fast the 8400 is. And look at that low MSRP. What a great value.
Civilization VI (Vega again, DX12 again): it's an anomaly. Maybe it's Vega, or DX12. So with an Nvidia card or another API things could be the opposite? Just asking.
 
From the beginning this review creates the wrong impression. 720p testing, really? Yes, yes, I know: "no GPU bottleneck, shows how processors will play in the future" and stuff. Well, wrong. Future games will probably be heavier on graphics, and game engines will be updated to ask for at least six cores, if not more, as a recommended setting. Do you remember how great the unlocked Pentium was for gaming and how fast it fell on its face? How about throwing a quad-core Kaby Lake i5 in there and seeing what it does in modern games? Does it verify those conclusions made a couple of years ago, again based on 720p results, or does a quad-core look like a bottleneck today?
I also like the choice of words.
Battlefield 1 (Vega, DX12): look at how fast the 8400 is. And look at that low MSRP. What a great value.
Civilization VI (Vega again, DX12 again): it's an anomaly. Maybe it's Vega, or DX12. So with an Nvidia card or another API things could be the opposite? Just asking.
So, you are complaining that games were toned down to test CPU limitations, because newer games may require more CPU power?

Dude, your argument holds no water. Do you want the author to just not test at all, or bring games from the future to test?

If you don't think this kind of testing is relevant, why even bother commenting? This article was testing CPUs that are available RIGHT NOW, with games that are also available RIGHT NOW, not games five years from now. And said article did a wonderful job.
 
I know the article is primarily the 1600 vs. the 8400, but then you do a cost breakdown of all the platforms, including an overclocked 1600. Yet the 7700K is at stock speeds; there should be a platform cost for the 7700K @ 5 GHz.
 
I know the article is primarily the 1600 vs. the 8400, but then you do a cost breakdown of all the platforms, including an overclocked 1600. Yet the 7700K is at stock speeds; there should be a platform cost for the 7700K @ 5 GHz.

The Core i7-7700K is now a horrible buy and one of the biggest shaftings of 2017. We don't need to compare its value at 5 GHz; it's no good.
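For what it's worth, the platform-cost complaint is easy to make concrete: totals only compare like with like if each configuration is priced with the board and cooler its clocks require (a Z-series board and a capable cooler for a 5 GHz 7700K). A sketch with placeholder prices (every figure below is invented for illustration; none are the article's numbers):

```python
# Platform cost = CPU + motherboard + cooler. Overclocked configs must
# be priced with whatever board/cooler the overclock actually requires.
# All prices are illustrative placeholders, not the article's figures.

platforms = {
    "R5 1600 (stock)":   {"cpu": 190, "board": 80,  "cooler": 0},   # boxed cooler
    "R5 1600 @ 4.0 GHz": {"cpu": 190, "board": 100, "cooler": 30},
    "i5-8400 (stock)":   {"cpu": 190, "board": 120, "cooler": 0},   # Z370 only at launch
    "i7-7700K @ 5 GHz":  {"cpu": 330, "board": 140, "cooler": 60},  # Z270 + big air/AIO
}

for name, parts in platforms.items():
    print(f"{name}: ${sum(parts.values())}")
```

The structure is the point, not the numbers: once the overclocking tax (board + cooler) is counted on every side, the value rankings can shift.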
 
From the beginning this review creates the wrong impression. 720p testing, really? Yes, yes, I know: "no GPU bottleneck, shows how processors will play in the future" and stuff.
Yes, really. You max out CPUs by removing the GPU bottleneck, and you max out GPUs by removing the CPU bottleneck. The only way the review would be biased or unfair is if Steve gave only 720p scores and created the illusion that they hold at all higher resolutions. Instead he's given you all three data sets (720p + 1080p + 1440p), which is both a lot more useful for gauging how CPU/GPU dynamics scale and something few other sites can be bothered to do. Honestly, this has been done to death and explained in depth in previous CPU reviews too.

No one said they're a 100% perfect predictor of 2022 games, but they DO actually max out the CPUs being tested, rather than hiding non-maxed-out (to varying degrees) CPUs behind other component bottlenecks (which, amusingly, is what now passes for "fair CPU benchmarking")...
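The "remove the GPU bottleneck" logic can be sketched as a simple min() model: delivered fps is capped by whichever component is slower, so CPU differences only show where the GPU cap is high, i.e. at low resolution. All numbers below are invented for illustration:

```python
# Toy bottleneck model: delivered fps = min(CPU frame-rate cap,
# GPU frame-rate cap at the tested resolution). Numbers are illustrative.

CPU_CAP = {"fast_cpu": 160, "slow_cpu": 120}          # CPU-limited fps
GPU_CAP = {"720p": 300, "1080p": 150, "1440p": 90}    # GPU-limited fps

for res, gpu_fps in GPU_CAP.items():
    a = min(CPU_CAP["fast_cpu"], gpu_fps)
    b = min(CPU_CAP["slow_cpu"], gpu_fps)
    print(f"{res}: fast={a} slow={b} gap={a - b}")
```

In this toy model the two CPUs tie at 1440p and the full 40 fps gap only appears at 720p, which is exactly why a CPU review drops the resolution: not to predict 2022, but to measure the CPU cap at all.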
 
How about throwing a quad-core Kaby Lake i5 in there and seeing what it does in modern games? Does it verify those conclusions made a couple of years ago, again based on 720p results, or does a quad-core look like a bottleneck today?

It's been done: the Kaby Lake beats the Ryzens in every CPU gaming test at both 720p and 1080p. Not really sure what you mean about a budget $60 Pentium from 2015 falling on its face, as it holds up considerably well against the multi-core AMDs of the time. Truth hurts, but denying the truth hurts even more.
[chart: perfrel_1280_720.png, relative performance at 1280x720]

[chart: CPU_01.png]
 
Oh boy, this is the first time I've read such a long conclusion, lol.

Kudos Steve, nice review as usual. When people realize, through the facts, that they may not have made the best decision, they try to rationalize why it was the best decision they could ever have made; because of this, they will defend their brand of choice, whatever it may be, with whatever illogical thinking they can.

The 8400 looks like a mighty contender and a solid choice in the BBB category.
 
@Theinsanegamer
This article was testing CPUs that are available RIGHT NOW, with games that are also available RIGHT NOW, not games five years from now.
Really? Because those 720p results try to draw conclusions about five years from now, with games that are available today. Isn't that supposed to be the reasoning behind testing at 720p? That's what I am saying. You can't draw conclusions about what a processor will be capable of in 2-3 years when that processor comes with the minimum number of cores considered necessary today for avoiding bottlenecks. The 8400 will start dropping in the charts pretty fast from next year, when Intel starts selling 8-core mainstream processors. And while it will probably never fall to Ryzen levels, which have their own restrictions in gaming, the difference will be much smaller than what is shown here.

@BSim500
The review is already unfair when 720p gets all the attention, for the reasons I already posted. And even if we agree that Civ VI is an anomaly, you can't start finding excuses for why Intel is losing in that benchmark when, in the previous benchmark, with the same GPU and the same API, you didn't feel the need to post a list of possible reasons for the results.

@dirtyferret
We just started seeing the first six-core Intel processors. Four cores are dead and everyone knows it, probably even you. Your second chart shows what I am talking about: just compare the Pentium with the i3 results. Huge difference in the minimum FPS.
 
Here's an idea for a future budget article: by how much do Ryzen and Coffee Lake differ on older games? I'm thinking pre-2014 stuff like the BioShock trilogy, Deus Ex: Human Revolution, Half-Life 2, Portal 2, Oblivion / Skyrim, etc. Hell, it's that time of year when I'm more inclined to fire up Amnesia: The Dark Descent (2010) or SOMA (2015), or have a blast at F.E.A.R. (2005) again, than to struggle to maintain interest in some recent 2017 titles. A lot of us play a wider mix of old and new games than most mainstream tech sites tend to represent, and I haven't seen a single site do something like an R3 1300X vs. i3-8100 comparison for older games outside of the usual "12-game bubble" (BF1, Civ VI, GTA V, Hitman, Overwatch, Tomb Raider, Witcher 3, etc.).

In theory, "all old games will run fine on modern CPUs"; in practice, some open-world titles like Morrowind / Oblivion / Operation Flashpoint with large draw distances can absolutely bring a modern CPU to its knees by maxing out only 1-2 cores. In theory Intel would win on IPC (based on Cinebench 1T scores), but the question is by how much those synthetics translate to older games that use 1-4 cores in actual practice. It'd be interesting to do something different and pick a few oldies to test as part of a budget gaming article series.

I'd be interested in this too, if only because I've had some shockingly bad performance in early-2010s games and I'm wondering if Ryzen is the reason. I can't maintain 60 fps with a 1080 Ti in Far Cry 3: Blood Dragon or Serious Sam 3: BFE, for example (although the Fusion version runs fantastically under Vulkan), and I don't really have anything to compare it to.
 
The review is already unfair when 720p gets all the attention, for the reasons I already posted.
Gets all the attention where? The only people I see obsessed with commenting on 720p benchmarks, review after review, are the ones demanding they not exist "because Intel vs. AMD". Personally, I find TechSpot's all-resolution reviews a hell of a lot more useful than nonsensical "CPU" benchmarks like Hexus's, which test all of two games at 1440p only. Oh goody, another "53.4 fps 1950X vs. 53.5 fps R3 1300X" GPU-bottlenecked benchmark. Very 'helpful' in showing me how quad-cores are the future and 6-16 cores are useless, if you cripple them all behind a big enough GPU bottleneck. Personally, I think every site should do 4K Ultra only. Then we can all save a load of money buying $70 Pentium G4560s...
 
Guys... you are complaining and fighting for the sake of it... just let it go. Unsubscribing to yet another thread.
 
It is really starting to irk me, these low-res benchmarks so we can all proclaim Intel the champion of the known universe. I understand the concept of these tests: you lower the demand on the GPU and now have a good indicator of the CPU's performance. However, this approach is incredibly flawed, and the discrepancies in results between different games are the dead giveaway. If one game claims to show the AMD CPU ahead of the Intel, and the next doesn't, how can you make bold proclamations that "this is the best CPU EVER!"?

What these benchmarks instead show is how lazily these games are produced and how poorly optimized they are. I find it ironic that synthetic benchmarks are largely overlooked and classified as "not indicative of real-world usage" when they are the only things capable of fully taxing a given CPU. As if running a game on a very expensive Vega 64 at 720p /is/ indicative of real-world usage? Riiiiight...

What really needs to happen is these coders need to get off their *** and make a killer multithreaded game to really show what a CPU is capable of. Oh, wait... there are games like that, but we instead keep benchmarking BF1 at 720p as if that were something meaningful.

Now, to be fair, I am admittedly an AMD fanboy. I only build AMD systems unless there is a particular reason I cannot. That said, I'm not so small-minded as to deny there are, or can be, better options out there. If Intel truly deserves the crown, then by all means give it to them. But I'm sick of sitting idly by and watching these review sites weasel around and come up with ridiculous tests to produce disingenuous results.

All I'm asking is for these sites to come up with a different testing protocol. We're all sick of it; please stop the nonsense.
 
Excellent article, Bravo!

I had this same decision to make and plumped for the 1600/B350 board.

Mainly because I don't have cash to burn and only upgrade when absolutely necessary; the old Phenom 965 had been struggling for a while.

The number one reason for me was the AM4 socket staying relevant for longer, meaning I can just drop a Zen 2 chip in.
The extra threads are also nice to have.
 
The 720p benchmarks have been shown to be irrelevant time and again. Take a look at the recent Gamers Nexus review of the 8350K: overclocked, it beats the crap out of the 8400 in pretty much everything they tested. So, going by the future-proofing argument, the OC'd 8350K is going to play future games better than the 8400. Is it, though? I don't think so.
 
@dirtyferret
We just started seeing the first six-core Intel processors. Four cores are dead and everyone knows it, probably even you. Your second chart shows what I am talking about: just compare the Pentium with the i3 results. Huge difference in the minimum FPS.

AMD's Phenom II X6 failed to beat Intel quads in gaming.
AMD's Bulldozer failed to beat Intel quads in gaming.
AMD's Ryzen fails to beat Intel quads in gaming.

Quads lead six-core CPUs in the Steam hardware survey 60% to 1% market share, but yeah, quads are dead...
 