Intel Core i7-8700K Review: The New Gaming King

I liked that they didn't make unsupported conclusions. They really didn't make conclusions (other than that the value of high-end CPUs has significantly improved). Techspot (you) offered no benchmarks at 1440p. Specifically, you offered no benchmarks for BF1 at 1440p. Are you disputing this?

How can you make conclusions about 1440p without benchmarks? It's reckless speculation for a site trying to give impartial reviews. If you wished to show a linear change in frames, prove it. Otherwise readers will not know when the bottleneck really happens (and where they should start investing more in GPU).

I liked your comment about the 1600. That comment and the end line from Anand were the two I gave to my wife when summarizing the new Intel lineup. [I've also liked how strongly your site has come out against prior high-end AMD chips].

I think you may be jumping to conclusions here (which I don't blame you for, considering how partisan these boards can be). My last build was a Core i5-3570K. My current build is a Ryzen 1700. I've run both ATI/AMD and Nvidia cards throughout my building. Right now I'm running a GeForce 1070. My laptop is an Acer Helios 300 (Intel i7 + GeForce 1060). I run an AOC Agon 165Hz G-Sync IPS monitor. I think I'm going to get the new i3 for a side build (I need to see more benchmarks and compare it against entry-level Ryzens). Pretty weird lineup for a fanboy?

Don't take my criticism too personally. You guys are okay at new releases, just not compared to more comprehensive sites (Anand being the top one out there IMO). I also really dislike it when Techspot labels something as a 'review' when it's just an aggregation of other sites' reviews. Some of your stuff is excellent -- especially the retro stuff and everything aimed at esports (too many sites only focus on high-end products when almost everyone is using mainstream stuff).

Why are you fixated on resolution? If a gamer has a high refresh rate panel and they want to utilize it, they will adjust quality settings accordingly to try and max out the display's refresh rate. You don't need 144fps+ for a 144Hz panel; 100-120 will be fine, but these gamers are focusing more on frame rates than quality settings. If a Ryzen processor limits them to 100 fps but they really want 120 fps or more, that's worth noting.

So for Battlefield 1 we saw a 137fps minimum with the 8700K and a 111fps minimum with the R5 1600; as you pointed out, that's at 1080p. At 1440p, gamers with a high refresh rate panel will merely reduce the quality settings to high or even medium to maintain those high frame rates.

Finally, I drew conclusions based on the testing I had done.
 
I'm fixated on empirical results in order to confirm your conclusions. This is plain shoddy journalism. I think you have a point when it comes to 1440p having more dependence on the CPU than 4K. The question is: by how much? How well does it scale? These are empirical questions which you did not take the time to answer. You simply guessed.

Saying you can change the quality is a very poor excuse. It adds more variables into the equation (first, it evades the very issue you're trying to support; second, different settings may play differently with different CPUs). But let's say you really wanted to do that. Do you support it empirically within your benchmarks? No.

These are all hasty conclusions drawn from insufficient data. At least Anand gave 4K benchmarks (making it possible for viewers to make very rough guesses on how they scale). I understand that you can't benchmark everything, but you didn't seem to try to benchmark almost anything. This article is rushed and incomplete, which wouldn't be so bad if you had limited your conclusions to what you verified, or maybe suggested possible inferences (which, to be fair, Techspot often does quite well). Not here. You just rushed into conclusions.

As for your comments, I'm not sure why you defended your analysis so strongly when you hadn't even addressed 1440p in the article. Now you are trying to change the issue by discussing settings -- another issue not addressed in the article.

Who says Ryzen will limit them to 100fps at 1440p? Not your article. Who says Intel will go over 100fps at 1440p? Not your article. Who says to what extent setting modifications will help? Not your article.
 
I've done the testing, buddy. But anyway, I'm out.
 
Thanks for the great review as per usual. But to me the i5 8400 and i5 8600K review is going to be far more interesting, especially when compared to the Ryzen 1600 and 1700.
 
Thanks mate, that data is incoming. I'll probably cover the Core i3-8100 and 8350K in detail in the next day or two, then the 8400 and then the 8600K. It's going to be a busy few days ;)
 
Nice gaming chip, but look at the power draw! Who would have thought AMD would be the green option :) Looking forward to the i5 tests, though my heart still wants a Threadripper (though a lottery win may be required first).
 
@kapital98, did you even think to go back to the original GTX 1080 Ti review (https://www.techspot.com/review/1352-nvidia-geforce-gtx-1080-ti/), since that was not only the GPU they used for this bit of testing, but when it was tested it was paired with the i7-7700K OC'd to 4.9GHz? Let's do a quick compare:
  • Battlefield 1: in this particular article, the i7-8700K had identical performance compared to the i7-7700K. So by & large, it should be expected to perform similarly at the other resolutions.
  • Based on the 7700K's performance, you should expect an 8700K OC'd to 4.9GHz to top off at 137FPS average/117FPS minimum @ 1440p, & 73FPS average/66FPS minimum @ 4K.
  • Since the performance is based more & more on the GPU as the resolution goes up, the margin between the 7700K/8700K & the Ryzen CPUs is going to narrow at 1440p, & almost certainly disappear by the time you get to 4K.
Again, demanding that CPUs be "tested" in situations where they're not the limiting factor is a waste of time: a waste of your time to even argue about it, a waste of the reviewer's time to have to respond to it, & a waste of our time to have to read it/respond to it.
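The narrowing-margin argument above can be sketched numerically. This is only a rough illustrative model, assuming the delivered frame rate is simply capped by the slower of the CPU and GPU (real frame delivery is messier); the 1440p/4K ceilings reuse the 7700K + GTX 1080 Ti minimums quoted above, and the 1080p ceiling is an arbitrary "no GPU limit" stand-in:

```python
# Rough sketch: delivered frame rate is capped by whichever of the
# CPU or GPU is slower in a given game/resolution. Illustrative only.

def delivered_fps(cpu_fps, gpu_fps):
    """Frame rate is limited by the slower component."""
    return min(cpu_fps, gpu_fps)

# 1080p minimum FPS from the Battlefield 1 numbers quoted in this thread
cpu_min = {"i7-8700K": 137, "R5 1600": 111}

# Assumed GTX 1080 Ti ceilings: 1440p/4K are the 7700K-paired minimums
# quoted above; the 1080p value is an arbitrary "no GPU limit" stand-in.
gpu_ceiling = {"1080p": 999, "1440p": 117, "4K": 66}

for res, ceiling in gpu_ceiling.items():
    intel = delivered_fps(cpu_min["i7-8700K"], ceiling)
    amd = delivered_fps(cpu_min["R5 1600"], ceiling)
    print(f"{res}: 8700K {intel} fps, R5 1600 {amd} fps, margin {intel - amd}")
```

Under this simple model the margin shrinks from 26fps at 1080p to 6fps at 1440p and disappears entirely at 4K, which is the point being made.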
 
Great chip; you have to thank AMD for this. If it wasn't for them disrupting the market with Ryzen, Intel would have taken a lot longer to release 6 cores into the mainstream.
 
Let me summarize it for you:
- The 8700K is indeed the new out-of-the-box performance king, but if you already have a 7700K (or a 6700K for that matter) you can simply raise frequencies to 8700K levels and get the same performance, so nothing special.
- For computing, you have to be blind to buy the 8700K over the R7 1700. The R7 1700 is as fast or even faster in computing tasks and has quite a bit of headroom to overclock to 4GHz, whereas the 8700K is already pushed to over 4.5GHz in stock form. The big difference is the price of the CPU ($300 for AMD vs. $360 for Intel), and platform costs are also lower with AMD.
In the end, let's not forget that everything we see from Intel now is because AMD forced them to do so with the Ryzen lineup. I think AMD is worth the credit and the money, since their products are no longer inferior to the competition from a performance/consumption standpoint.
And don't forget, the test bed used 3200MHz DDR4. Ryzen has been shown to scale CPU performance in multi-core (and to a lesser extent single-core) workloads with memory speeds up to at least 4000MHz thanks to its Infinity Fabric. So on top of an OC, faster memory will squeeze an extra ~10% out of Ryzen.

The 8700K seems kind of pointless IMO. It offers too little for too high a price, and all the fun OCing is already done for you. And now that Ryzen seems to be able to keep up with Intel in the memory bandwidth scene, Intel doesn't even have that advantage anymore.
 
Please show me a Ryzen CPU that can handle DDR4-4000. Most of the lower-end chips have issues with 3200.
 
Missing from this review is a benchmark comparing the i7-8700K with the six-core Xeon E5645 running in my venerable Dell Precision T3500, first shipped in 2011. Let's see how far Intel has really come in the last six years of hex-core void. And yes, there are faster LGA1366 six-core Xeons. I do not happen to have one, but they are cheap now, maybe $25. The i7-8700K would probably beat the Xeon E5645 hands down on lowest power consumption and least amount of heat that needs to be dissipated. But for raw performance difference? Meh! Not worth cutting my electricity bill, and I have heat in my office in the winter. Not that much heat; the six cores are running cool right now, between 79 and 93 degrees Fahrenheit. It's always worth saving a lot of bucks by skipping Intel CPU generations.
 
While I appreciate your responses, at this point it feels like you're fighting an unwinnable fight with those unwilling to understand the point of the article (not for you but for the detractors):

"I still feel that the majority of gamers will be better served by the R5 1600, but before you take my full word we have to check out the Coffee Lake Core i5 range first (soon!). The 8700K makes the most sense for those going after extreme frame rates with the latest and greatest GPUs, and not those playing CS:GO on a GTX 1060."

8700K for best frame rates, R5 1600 for best value (and upgrade path).
 
There is no gaming upgrade path past an OC'd 1600; they all provide the same performance. TPU reviewed both the i3-8350K and i5-8400 and showed them to be superior to anything Ryzen has for gaming. So today a Ryzen makes sense, since the Coffee Lake CPUs seem to be a paper launch, but the Coffee Lake CPUs offer better gaming performance and a better upgrade path, as proven by every professional review website, not fanboy hyperbole. Facts hurt but they are the truth.

[Chart: TechPowerUp relative gaming performance at 1920x1080]
For gaming, things are different. Here, the i5-8400 breezes past all AMD Ryzens thanks to its high per-thread performance and the boost clock of 4.0 GHz. I find it surprising that there is very little difference between the i5-8400, i5-8600K, and i7-8700K in gaming, even at the highly CPU-limited scenario of 720p. This suggests that today's games see limited gains from more than four cores. It is good news for gamers on a budget because a Core i5-8400 will be completely sufficient to not bottleneck even the fastest graphics cards.
https://www.techpowerup.com/reviews/Intel/Core_i5_8400/19.html
 
Can't say I agree with the power consumption being okay, let alone 'impressive'. No one found the power consumption of the FX CPUs impressive just because they could reach high clocks out of the box. Compared to the 7700K, this CPU is not impressive at all; the only thing that makes up for it is the two additional cores. But if this were an AMD CPU, everyone would be complaining about how much more power hungry it is than the similarly performing 7700K, and saying to get the 7700K instead because those two cores are not needed and the power consumption is not worth it. But since this is Intel, they get a free pass on this and are even praised for it. We're talking about a 50-ish W difference, and AMD CPUs (and GPUs for that matter) get slammed constantly for consuming that same amount of power in excess, even when performing similarly. And yes, people are generally silent regarding how efficient Ryzen is.
 
You're making assumptions, when the purpose of doing benchmarks is to create an objective measurement of the data, and then to base conclusions on that data.

This is why other sites build databases of uniform benchmarks if they want to make grand conclusions (which requires a 'control' in the form of uniform parts and nearly uniform drivers). Other sites do this quite well (Techreport being a great example). Techspot doesn't. What would you think if Techspot randomly changed the RAM clock and timings, the SSD, etc. when comparing the 7th- and 8th-generation Cores? Yeah, it may be the same -- but it may be different. That's why you benchmark.

This is also a fundamental issue of statistical analysis. As soon as you start creating problems of autocorrelation, heteroscedasticity, and multicollinearity your results are worthless (unless you then control, as best as possible, for these problems).

Saying 'trust me' is not a way to analyze tech products.
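To make the 'control' point concrete, here is a hypothetical sketch of the repeat-and-average discipline being argued for: run each benchmark several times on an identical test bed and report the mean plus the run-to-run spread rather than a single pass. The function name and the numbers are made up for illustration, not taken from any review:

```python
# Hypothetical sketch: summarize repeated benchmark passes on one
# fixed configuration (same CPU, GPU, RAM timings, SSD, drivers).
import statistics

def summarize_runs(fps_runs):
    """Mean and run-to-run spread for a list of per-run average FPS."""
    mean = statistics.mean(fps_runs)
    spread = max(fps_runs) - min(fps_runs)
    return round(mean, 1), round(spread, 1)

# e.g. three passes of the same game on the same test bed
mean, spread = summarize_runs([136.2, 138.0, 136.8])
print(f"avg {mean} fps (run-to-run spread {spread} fps)")
```

If the spread is large relative to the margin between two CPUs, the comparison is noise, which is exactly why single-pass results can't support grand conclusions.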
 
Interesting. A very different conclusion from the majority of reviewers out there, who seem to be applauding the i5 8400 as the new value-for-money king, coming in cheaper than an R5 1600 but beating it in most tests, quite heavily in many. It seems that you pay about 10% more to have the best, which is traditionally a very small cost; unless you really only encode, I think that's worth paying. Personally, I would rather buy this new Intel stuff than Ryzen.
 
You wouldn't be happy with anything Steve had to say at this point. He is most definitely not obligated to report anything. If you have a request then by all means put in the request and let it go.
 
After looking at all of the scores... I'd rather buy a Threadripper and Vega. I need to give AMD another shot. I'll be waiting for Zen+/their refresh line for their processors and GPUs.
 
The article title does say 'gaming', but does no one want to do HDR and/or UHD Blu-ray on PC? From what I've read, Intel 7th and 8th gen (i7-8700K) can do this using the onboard video. I checked the specs on some Gigabyte GTX 1070, 1080 and 1080 Ti cards (since who is going to use onboard video?) and they support HDR/HDCP 2.2 as well as the HDMI 2.0a and DisplayPort 1.4 interfaces/cables, both of which are required. Pioneer already makes an internal UHD Blu-ray player, and HDR-capable monitors are starting to come out. I don't know about gaming and HDR though; can anyone enlighten me?
AMD CPUs have no onboard video, and I don't know if their cards like Vega support HDR (guess Google will have to be my friend here). AMD CPUs seem to work better with AMD cards.
 
Thank you for the good review. Some improvements for sure, and I had originally planned to upgrade to this platform. That is, until I paired a 1080 Ti with my OC'd 4790K. Here in Canada, upgrading to the new platform would cost the same as the 1080 Ti. From my experience thus far, I would recommend those who have a similar processor spend the money on the video card instead.
 
Just picked up a Ryzen 1600 with B350 motherboard for a total of $185 at MicroCenter (open box on both). At that price, I don't care what bone Intel has finally decided to toss to people after all this time, it's too little, too late.
 
Key word - open box.
 
Yes, open box helped, but right now, a Ryzen 1600 with B350 board at MicroCenter after $10 rebate is only $205. Still a good deal.

Ryzen 1600 = $170
B350 = $35 (with $30 combo discount and $10 rebate) = $205 (+ tax)
(Brand new, not an open box price)
 
I have been moving away from AnandTech in the last few years. I find Techspot and Tom's more current and relevant to my experience.

After reading your exchanges, I think that kapital can come across as a bit obnoxious in the way they deliver the argument.

However, they reveal a flaw in the Coffee Lake 8700K: there isn't a market for it.

I understand that you use low resolutions to test CPUs to reduce GPU-bottleneck scenarios. However, very few people buy a $400 CPU just to game at 1080p. We are looking at just PUBG and CS:GO at 144Hz.

In most gaming scenarios, enthusiasts generally eyeball the 1440p/2160p resolutions. In practice, most people won't see the real-world benefits of the 8700K's marginal IPC lead.

For streaming, users will benefit from more cores/threads. Same goes for content creation and productivity tasks.

Which brings us to a bit of a problem: in any practical sense, the 8700K only takes the lead in low-resolution, high-fps scenarios, such as 1080p/144Hz.

Not to mention that it consumes more power than the Ryzen 8c/16t.
 
I think it's funny that you're calling for consistency & claiming there are problems with different variables being applied to the different setups, when the site you claimed did it better (Techreport.com) does the same thing with their reviews (http://techreport.com/review/32642/intel-core-i7-8700k-cpu-reviewed/5):
  • Just like in Steve's review, they keep the platforms as close as possible (although I did notice that Techreport apparently changed the timings on their RAM, as the timings for the Ryzen tests were slightly faster than the Intel ones -- something that Techspot didn't do)
  • As far as making sure they had a large sample size: Steve didn't say how many times he ran the tests, but in prior testing we've seen that Techspot reviewers run them multiple times & present the averages in their results (as opposed to just picking the best one). Techreport does the same thing, as they ran each test 3 times & used the averages
  • Just like in Steve's review, all game tests for the CPUs were run with a GTX 1080TI set at 1080p resolution. I would imagine that they would give the same reason that Steve & all other Techspot reviewers give for using those settings, should you ask them.
And it's not like we don't already have multiple sites showing that, once you hit 1440p & 4K resolutions, the gap in performance between CPUs shrinks down quite a bit (& depending on the game may even vanish):
The concise conclusion you can draw from this data, though, is that individual testing results may vary, but the general consensus is that benchmarking at 1440p & 4K resolutions is definitely GPU-dependent.
 