Intel Core i7-8700K, i5-8600K, 8400 versus AMD Ryzen 7 1800X, R5 1600X, 1500X

Hello Steve,

May I ask how the 9 game average is calculated? It seems the math is not quite correct. The chart shows this:
https://static.techspot.com/articles-info/1505/bench/Average_720p.png

I just spot-checked the AMD Ryzen 5 1600X results at 720p. If I add up the following average frame rates:
154, 98, 94, 140, 190, 109, 97, 183, 122, I get 1187; divided by 9 that's 131.89, or 132 rounded up.

That does not match what the chart is showing, which is 158. Is there something I am not understanding?
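For anyone who wants to double-check, the arithmetic above is a few lines of Python (just a sketch re-running the sum from this post; the frame rates are the ones listed above, nothing else is assumed about the chart):

```python
# Spot-check the 9-game average using the 720p frame rates quoted above.
rates = [154, 98, 94, 140, 190, 109, 97, 183, 122]

total = sum(rates)            # 1187
average = total / len(rates)  # 131.888...

print(total, round(average))  # prints: 1187 132
```

Either way it comes out to 132, not the 158 the chart shows.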

Thanks.

The "9 game average" charts for all three resolutions are showing the same results. :(
Something is wrong here.

Try CTRL+F5, sometimes the cache takes ages to refresh.
 
Back in real life, where people use 1080p, 2560x1080, and 2560x1440 FreeSync monitors, excellent AM4 motherboards were 20% off at Newegg (only $60 shipped for ASRock's excellent AB350M Pro4) and the Ryzen 5 1600 was $169.99, but let's go ahead and do everything we can to pretend Intel is still relevant in price/performance...
When will AMD fans realize they will never have the best processor? You can have all the value you want; it doesn't mean anything when Intel still has the CROWN. Being KING is what it's all about. There are plenty of people in this world who care about having the best, not the best value. Best value is for people who can't afford the best. AMD will try hard and come up short, as that's always been their motto for almost 20 years. The little engine that couldn't beat Intel. Sure, they can FINALLY compete, good for them. Know what that really means? Nothing. Intel users/fans will still buy the best, which is Intel, and that won't be changing anytime soon.

It seems that you forgot that buying something is a trade. Why would you want to go for an inferior trade option?
 
Ya, that was a terrible example; a Core 2 Quad will get destroyed by a current-gen dual-core CPU with HT.
lol, I should know. For a year or two my dinosaur QX6850 (basically an overclocked Q6600) overlapped with my GTX 970. When I finally moved up to a 6700K, I saw a big jump in consistency of performance, especially for stuff like GameStream! :p
 
Back in real life, where people use 1080p, 2560x1080, and 2560x1440 FreeSync monitors, excellent AM4 motherboards were 20% off at Newegg (only $60 shipped for ASRock's excellent AB350M Pro4) and the Ryzen 5 1600 was $169.99, but let's go ahead and do everything we can to pretend Intel is still relevant in price/performance...
When will AMD fans realize they will never have the best processor? You can have all the value you want; it doesn't mean anything when Intel still has the CROWN. Being KING is what it's all about. There are plenty of people in this world who care about having the best, not the best value. Best value is for people who can't afford the best. AMD will try hard and come up short, as that's always been their motto for almost 20 years. The little engine that couldn't beat Intel. Sure, they can FINALLY compete, good for them. Know what that really means? Nothing. Intel users/fans will still buy the best, which is Intel, and that won't be changing anytime soon.

Maybe you are too young to remember, but that "little engine" had the best CPUs from 2001 to 2006, and not just for gaming.
Intel, with its higher-clocked Pentium 4s, could not compete with AMD Athlons clocked 400-500 MHz lower.

You can protect your investment, but clearly Intel is a bit worried about Ryzen. And that means they don't suck the way someone here is describing them.

Additional information:
The Intel Core i7-8700K is not the BEST CPU today (overall).
You'll have to pair it with the right hardware at 1080p to make it THE BEST, and that is for gaming only.
If you had unlimited money you would not buy the 8700K, but even if you did, you would probably have a 4K monitor, or at least 1440p with a 1080 Ti, in which case the CPU choice would not matter much.
 
Would be nice to see a review where CPU gaming performance is compared while multitasking with a non-game-related process, such as video encoding while gaming, or streaming + gaming.
 
Steve,

Thanks for another interesting review. You mentioned you would be switching from the R5 1500X to the R5 1400 for the overclocking results. Any chance you could stick with the R5 1500X and maybe do just a couple of reference benchmarks with the R5 1400? The reason I'm asking is the L3 cache. While, for example, the R5 1600X and R5 1600 should perform identically at identical clock speeds, the same is not exactly true for the R5 1500X and R5 1400. Even though the difference in many real-world cases is undoubtedly negligible, it should be measurable with the Vega 64, especially at 720p. It would be interesting to see more data on the theoretical differences between the 1500X and 1400, but if you only have time for one, the 1500X would be the more consistent choice and would ensure easy comparability with this review's results. With the 1400 we'd have to factor in (maybe) 0-15% lower figures than what could have been achieved with the 1500X, and you know what we enthusiasts are like; nobody likes to speculate. ;)
 
So that theory just fails. The reason it fails is that you are comparing CPUs with widely different characteristics. The Intels are single-thread beasts, but the Ryzens have more overall power. The more core-aware a game is, the better it performs on the overall more powerful CPU. Take for example Crysis 3, which maxes out every single core your PC has. In the heavy scenes, the R5 1600 demolishes the 8400, and not by a small margin.
Your statements have very little substance behind them.

AMD CPUs, regardless of the threads/cores a game demands, have always lost to Intel in gaming benchmarks, besides a few examples here and there.
You made some statement about older games versus newer games at 720p, but they are still mostly built on the same game engines with similar architectural support.

Also, your FX comment is not quite spot on. The 8300 wasn't the only FX in town; there was also the FX 6100/6300, which right now outperforms the 3rd-gen i3 that was its competition.
I am willing to bet something like an i3-3240 will still match or beat the FX 6300 in gaming benchmarks, new and old games alike.
 
Back in real life, where people use 1080p, 2560x1080, and 2560x1440 FreeSync monitors, excellent AM4 motherboards were 20% off at Newegg (only $60 shipped for ASRock's excellent AB350M Pro4) and the Ryzen 5 1600 was $169.99, but let's go ahead and do everything we can to pretend Intel is still relevant in price/performance...

I picked up my i5-7600K & Z270 for $250 over the summer...
 
Would be nice to see a review where CPU gaming performance is compared while multitasking with a non-game-related process, such as video encoding while gaming, or streaming + gaming.

Agreed Steve, WTF?! Also, please downclock the Intel CPU to 2.5 GHz, shut down half its cores, and use 4GB of RAM in the Intel test system. Make no mention of that in your test results! Stop being so damn Intel biased.

On a different note;

I went back through all the games TechSpot reviewed graphically in 2016 (just two tests were done for 2017, and none had a CPU test). The FX-4 and Intel i3 (second or third generation) showed 45+ FPS in pretty much every game. The Intel 2500K at stock showed 60+ FPS in pretty much every game. For 1080p 60 Hz gaming, if you have a 6600K or better (or a CPU with similar IPC) you have more than enough horsepower for modern gaming.

I checked out the Steam hardware survey: just 6.5% of users game at a resolution above 1080p, while 60% game at 900p, 1050p, or 1080p.
 
Oh god... I had to read through walls of text that, as far as I can tell, were just restating the article and arriving at the same conclusion... why do you feel the need to say "yes, I believe it's correct"???

The day has come; I never thought Intel was going to win the BFB category. The i5-8400 is packed with mighty awesomeness for budget builders, and I'm liking this (also the head-to-head comparison with the "Obama" of CPUs, the Ryzen line; we all remember how he was received when he took over, right?).
 
So just exactly when did anyone ever say to themselves "I want the best frame rate per dollar" and ignore actual gameplay? In the real world, no one spends over $300 on a GPU to play at 720p, so while scientific and all, this alternate-reality testing is of little value. Nothing in this comparison has a takeaway that reflects real-world value. Any of the current AMD Ryzen or Intel CPUs, cost-matched with a GPU, will provide hours of smooth, responsive gameplay after a few adjustments to game settings.
YouTube videos show players the settings used to achieve good results, along with recordings of the gameplay, which has real-world takeaway.
 
Would be nice to see a review where the CPU gaming performance is compared when multitasking with a non-game related process such as doing a video encode whilst gaming or streaming + gaming.
The number of people who do that, though, is tiny. Streamers still make up <1% of gamers (and even then use tools like ShadowPlay or external game-capture HDMI boxes that significantly reduce or eliminate CPU load). No one is ever realistically going to start compressing video during a highly competitive 144 Hz FPS multiplayer session and then complain about artificially low frame rates. At worst, they'd set the video encode to "Idle Priority" and the game to "High Priority" (same FPS in games, but longer encode times); at best, they'd just use common sense and encode whatever video they had (the amount of which is also regularly exaggerated for the average person who isn't a YouTube/Twitch "celebrity") while eating dinner, overnight, or at work or school, i.e., during the other 20 hours a day when they're not gaming and the PC would otherwise sit idle or off.

This comment isn't aimed at anyone personally, but "I just bought a 16C/32T Threadripper, and let's say I believe the average person now wants to encode 8x videos, run 3x virus scans, and work on Excel spreadsheets whilst gaming" isn't a particularly realistic "real-world" test. It's one of those things that seems to be about filling up the unused cores of a recently purchased premium CPU with exaggerated demands, making lower-end CPUs perform artificially worse than how the people who own them actually use them, to hide the fact that many games don't scale well beyond a point.
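For what it's worth, on Linux the "Idle Priority encode" trick is a one-liner. This is just a sketch: the hashing command stands in for a real encoder (nothing from the thread), and `nice -n 19` is the standard way to run a job at the lowest scheduling priority so a foreground game keeps the cores.

```shell
# Run a CPU-heavy job (stand-in for a video encode) at the lowest
# scheduling priority; the interactive game keeps CPU time.
nice -n 19 sh -c 'sha256sum /dev/null > /dev/null'

# 'nice' run with no command prints the niceness it inherits,
# confirming the offset was applied (prints 19 when starting
# from the default niceness of 0):
nice -n 19 nice
```

Windows has the same knob via Task Manager's "Set priority" or `start /low`, so "same FPS, longer encode" is easy to get on either OS.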
 
And you now have a dead-end motherboard and a 4-core CPU...

My 4-core CPU runs at 4.7 GHz and screams through everything I toss at it, since gaming is the most demanding thing I do on my gaming PC. If I ever feel the need, I will add a used 6700K or 7700K. Regardless, when the time comes for me to upgrade I would need to purchase a new mobo even if I went with AMD, since A) there is no benefit in gaming to going beyond an overclocked 1600, and B) looking at history, future AM4 chips will simply be overclocked Ryzen chips (think Piledriver's upgrade over Bulldozer, or worse, the Phenom II X6's upgrade over the Phenom II X4).
 
Lol, I did the X4-to-X6 upgrade; lesson learned there.

And I'm not anti-AMD, as I've built multiple PCs with the Athlon XP, Athlon X2, Athlon II X3 & X4, and Phenom II X2 and X4. My old gaming PC was a Phenom II X4 955 overclocked to 3.8 GHz (it's now my wife's PC, with an SSD to make it snappier). AMD came out with the Phenom II X6 and I was like WTF, I want better IPC, not lower, plus two extra cores that only do something in synthetic benchmarks. When Intel came out with the 2500K soon after, it was a no-brainer for me. The only allegiance I show is to my wallet.
 
Actually it does. It may not be a perfect 1:1 prediction for every single game, but it absolutely does highlight how much overhead a CPU can have on average.

Actual data, though, proves you wrong. The 7600K performs way better than the R5 1600X in 3+ year old games at 720p, so according to your argument that highlights that it has more CPU overhead. But does it? Nope, since in today's games it generally performs worse.
 
Your statements have very little substance behind them.

AMD CPUs, regardless of the threads/cores a game demands, have always lost to Intel in gaming benchmarks, besides a few examples here and there.

That's completely irrelevant to my point. My point is you can't judge future performance based on current performance.

Still, you are wrong. The R5 1600X is a better gaming CPU than the 7600K, and they launched at the same time.

I am willing to bet something like an i3-3240 will still match or beat the FX 6300 in gaming benchmarks, new and old games alike.

The FX 6300 performs similarly to the i3-4130 even in single-thread-intensive games (like GTA V, Arma, etc.). Of course it wipes the floor with it in more multithreaded games like BF1, Civ 6, Crysis 3, et cetera.
 