4th-Gen Core i7 vs. 8th-Gen Core i7: Is It Worth the Upgrade for PC Gamers?

Great article, but most techies worth their weight in gold already knew it wasn't really worth it to upgrade. However, I believe the real question should be, "Do we, as consumers, have a choice in the matter?" Over 95 percent of the time, a CPU will outlast a motherboard by years. Right now, buying a new gaming motherboard for a 4th-gen i7 CPU will cost more than buying an 8th-gen i7 CPU with a gaming motherboard. I believe that consumers' lack of choice in this matter is part of the reason technology gains are so small with every new generation of CPUs.

Nothing we can do about that, mate. Hopefully, with competition from AMD, things will change over the next few years. This article was created because it's been the most heavily requested test of the past few weeks.
 
It's an interesting article. I like dizmo's post midway through page 2:

"A 300mm wafer is ~$400.

The yield per wafer is apparently a guarded number; AMD gets around 150 per 200mm wafer? So for the sake of a number we'll say a 300mm wafer gets 225 (could be more). So ~120 CPUs (graphics is a separate die). $3.33 per CPU, for a price increase of $0.66. Even at a multiple of 10, it's a $6.60 increase. Or a nice coffee.

Very, very rough numbers, but still. I didn't want to look that much into it


People are overreacting. Silicon, as you've said, is a small part of the process. This doesn't increase the bigger costs: shipping, distribution, marketing, packaging, etc."
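For what it's worth, dizmo's back-of-envelope arithmetic can be sketched like this. Every input below is his rough, unverified assumption from the quoted post, not real yield or pricing data:

```python
# Sketch of the quoted back-of-envelope silicon-cost math.
# All figures are the poster's rough assumptions, not verified numbers.

wafer_cost = 400.0      # his estimate for a 300 mm wafer, in USD
cpus_per_wafer = 120    # his rough guess at usable CPU dice per wafer

cost_per_cpu = wafer_cost / cpus_per_wafer
print(round(cost_per_cpu, 2))        # 3.33

price_increase = 0.66   # his quoted per-chip silicon cost increase
retail_multiple = 10    # his generous markup multiplier

print(round(price_increase * retail_multiple, 2))  # 6.6
```

Which is the point of the quote: even with a 10x markup on the raw silicon delta, the increase is coffee money.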

Now that I think about it, though: waiting 60 more days to save $30.00 on a motherboard... eh... I could go either way right now. Maybe I'll flip a coin like Two-Face does to make my decision for me. :p
The increased cost of memory chips doesn't justify the prices of the final products either. Does it?
You shouldn't just consider the retail prices, but how the profit margins are affected and whether these price hikes can be a nice excuse for companies to increase them. Also, not all chips from a wafer end up working, and not all that work end up being top quality.
AMD sells at only a 35% profit margin; Intel lost 1-2% these last months and, together with Nvidia, sells at around 60%, and those percentages are probably not taken from the final retail price. And the fewer the nanometers in the manufacturing process, the higher the costs. Companies do know it, and they could pass the extra costs, even more than the ten-times multiple in your example, into the final product price.

I am not an expert and you could well be right here, but I don't believe it will be just an extra $6.60. Companies DO look for excuses lately to push prices up: either to recover financially (AMD), or to constantly keep their shareholders happy (Intel), or just to get enough cash to push their products into as many markets as possible (Nvidia).

And these companies, with the exception of Intel, do not make their chips in their own factories. So we should also consider the profit margins of the factories manufacturing those chips. There are probably plenty of companies involved, each with their own profit margins to consider.
 
I disagree. You’re comparing two different things. Therefore, doing things in a formulaic fashion isn’t the best way to go.

The question here being, is upgrading from a 4th gen Core i7 processor to the 8700K worth it? This question entails many things but ultimately you want to see if the cost of buying a new CPU, motherboard and memory is worth the investment.

If you play at low resolutions with medium-quality settings on an extreme GPU like the GeForce GTX 1080 Ti and want 200fps+, then yes, the upgrade is beneficial. Overclocking the 8700K will also improve performance further here, but we have our answer all the same.

If you play at 1080p using ultra quality settings on anything less than a GTX 1080 Ti (a GTX 1070 was used in my example), then no, the upgrade isn't worth it. You can overclock the 8700K to 7 GHz on LN2 if you want; it still won't yield any extra performance with a mid-range graphics card.

720p testing isn't particularly useful here as we're not interested in how much faster the 8700K is with all speed limits removed using an extreme GPU. Rather, we're interested in finding out if someone with a 4-year-old Core i7 has anything to gain right now by upgrading to an 8700K under realistic gaming conditions, and the answer is no. The 8700K will no doubt end up being the better gaming CPU in years to come, but you're not upgrading from the 4770K to the 8700K for it to be a good investment in a few years' time; you'd just upgrade in the future once that's proven.

So again, comparing two 'new' CPUs on two new platforms is entirely different from what we were doing here.

It’s like buying a new car vs second hand.

When buying a second-hand car you want to ask things like "how many k's are on the clock?" That's useful information for determining the car's condition and, ultimately, its value.

Walking into a dealership and asking “how many k’s are on the clock” probably won’t lead to any useful information ;)

It depends. For the majority of people out there, even a Core 2 Quad or a quad Phenom II will be enough, if paired with an SSD, plenty of RAM and a graphics card that offers all kinds of modern acceleration for graphics, like a GT 1030 or an RX 550 for example.

But for people who chase the absolute highest performance, even at higher settings and/or higher resolutions, the 8700K is the only option. It's not the only option once we start looking at costs, but to be honest, someone who went for a NEW 4770K and owns a 1080 Ti probably doesn't care so much about the costs, unless he's building a high-end platform to keep for 5-10 years. But for most of those looking at absolute performance, the difference, especially if you consider the overclocking headroom of the 8700K, could easily justify the price. If the demand weren't there, AMD wouldn't be apologizing every week for not being able to build enough Ryzen processors to cover it. And before Coffee Lake, the 7700K was Intel's best-selling CPU for a reason.

In the 8400 vs 1600 article the impression given was different. Look at the 720p scores: the 8400 matters greatly. I understand what you say about having two different scenarios, but they are not that different. People can sell their 4770K and in the end pay less compared to someone who just goes out and pays the full price for a new system. We can also assume that the majority of buyers of 8400 and 1600 CPUs will not be looking at a GTX 1080 or greater GPU, so even after 3-5 years they will be using a 1080-level GPU at best, and at 1440p at best. Still, that review was trying to promote the best-case scenario for the 8400, focusing on that "future-proof 720p" angle while using a high-end GPU at the beginning of the article. The conclusion in that article is balanced and detailed, but that first 720p chart, where "the Core i5-8400 is 40fps faster", is what most people will see in a hurry.

The only REAL difference in this review, compared to the 8400 vs 1600 one, is that it addresses a different audience. That review was addressing an audience not willing to pay for the top models, like a 1700 or an 8700. This article addresses an audience that at least once went for the highest CPU model.

Anyway, I guess we still agree to disagree.

Thanks for the review. While I have objections, and probably will continue to have them in the future, I do consider TechSpot's reviews extremely valuable.
 
As a side note, you can of course overclock the 8700K for greater performance when not GPU bound, but as we mentioned, gamers are almost always GPU bound with high-end Core i7 CPUs. We hope that addresses questions about why we haven't complicated things by also overclocking the 8700K.

^^

That doesn't answer anything, tbh.

When you do an A vs. B product comparison, you do it equally!

You overclock A but keep B at stock? That totally invalidates your conclusion!
 
That doesn't answer anything, tbh.

When you do an A vs. B product comparison, you do it equally!

You overclock A but keep B at stock? That totally invalidates your conclusion!

While it would be nice to see overclocking results for both just for completeness' sake, there's no real reason to expect they would change the conclusions. The explanation given is a perfectly reasonable answer to why the 8700K was not overclocked and if you read the article properly, you should be able to come to the same conclusion even without it. This doesn't mean there aren't cases where overclocking the 8700K would widen the gap, but it is a reasonable assumption that these are exceptions to the rule more than anything else.
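One hedged way to picture the GPU-bound argument: treat the frame rate as capped by whichever of the CPU or GPU is slower. The numbers below are purely illustrative placeholders I've picked to show the shape of the argument, not benchmark data from the article:

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Toy bottleneck model: the slower pipeline caps the frame rate."""
    return min(cpu_fps, gpu_fps)

# Illustrative numbers only (assumptions, not measured results).
gpu_limit_1070 = 90   # what a mid-range GPU might manage at 1080p ultra
stock_8700k = 140     # frames a stock 8700K might prepare per second
oc_8700k = 160        # the same CPU overclocked

print(effective_fps(stock_8700k, gpu_limit_1070))  # 90 -> GPU bound
print(effective_fps(oc_8700k, gpu_limit_1070))     # 90 -> overclock gains nothing
```

While the GPU is the cap, raising the CPU's ceiling only adds headroom you can't use; the overclock shows up on a chart only when the CPU line drops below the GPU line (low resolution, extreme GPU).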
 
The increased cost of memory chips doesn't justify the prices of the final products either. Does it?
You shouldn't just consider the retail prices, but how the profit margins are affected and whether these price hikes can be a nice excuse for companies to increase them. Also, not all chips from a wafer end up working, and not all that work end up being top quality.
AMD sells at only a 35% profit margin; Intel lost 1-2% these last months and, together with Nvidia, sells at around 60%, and those percentages are probably not taken from the final retail price. And the fewer the nanometers in the manufacturing process, the higher the costs. Companies do know it, and they could pass the extra costs, even more than the ten-times multiple in your example, into the final product price.

I am not an expert and you could well be right here, but I don't believe it will be just an extra $6.60. Companies DO look for excuses lately to push prices up: either to recover financially (AMD), or to constantly keep their shareholders happy (Intel), or just to get enough cash to push their products into as many markets as possible (Nvidia).

And these companies, with the exception of Intel, do not make their chips in their own factories. So we should also consider the profit margins of the factories manufacturing those chips. There are probably plenty of companies involved, each with their own profit margins to consider.

The ship for buying DDR4 memory at a reasonable price has already sailed, I think. So if I purchased a new Coffee Lake system now to save a few dollars in the future, I would be paying ridiculous prices for my DDR4 memory versus the forecast ludicrous prices for memory chips in the near future.
I didn't realize how much memory prices have risen in the past 2 years. I still have a Patriot Viper 2x8GB DDR4-3000 memory kit which I bought 2+ years ago that I could recycle from my Skylake system when I upgrade to Coffee Lake.

And I don't need features like dual LAN. So I'm waiting, at the risk of spending an extra $6.60. I may take this time to snatch up a Kaby Lake dual-core just so I have 2 PCs. I get nervous when I'm down to just 1. :)
 
Haswell is from 2013/14, it's not very old.
It is a 5-going-on-6-year-old CPU. When Sandy Bridge came out, the horrendous Pentium D was only 6 years old.

It is amazing how well CPUs last these days, or sad how little software takes advantage of new hardware, depending on your view. My wallet likes it, but my upgrading spirit does not.

Props to TechSpot for doing this kind of review. So many tech sites don't look at older hardware enough. And the mantra is the same: upgrade your GPU, keep your CPU until it dies.
 
As a former hardware freak (I always looked for value for money, a bit), I have a 3930K @ 4.25GHz 24/7 with 3x GTX 570, and yes, I still play at 1080p. The GTXs died on me and I started looking at replacing all of my stuff and changing my watercooling setup, or just buying an off-the-shelf watercooled video card. The conclusion was remarkably simple.
To really get back to the top of the pile I would have to spend about €2,000, or about €1,200 to be where I'm at now with my trusty setup and a GTX 1080. Since I use my PC only for gaming and easy office-like tasks, I really see no need at this point to replace my system. I do, however, use a lot less power now, having replaced the 3x 570. :)
 
"As a side note, you can of course overclock the 8700K for greater performance when not GPU bound, but as we mentioned, gamers are almost always GPU bound with high-end Core i7 CPUs."

This theory is flawed. There are games that are CPU bound no matter the GPU/CPU combination. I wish more reviewers would benchmark WoW as well. Sure, it's old, but many people still play it. Its biggest issue is that it's heavily dependent on single-core performance. I bet Blizzard would make a benchmark if enough outlets asked for one.
 
Thanks for your article. I'm running a 3770K @ 4.5GHz paired with an R9 Fury and am still running the latest games well. I suspected the 8700K wouldn't make any meaningful upgrade for my gaming with my current GPU, and your GTX 1070 tests confirm that: there is virtually no performance gain upgrading from previous-gen i7s to the 8700K unless you are running a 1080 Ti.

Since I wish to stick with FreeSync due to my gaming monitor, I'm limited to AMD GPUs for now in terms of upgrades. Do you think my 3770K @ 4.5GHz would suffice for a Vega 64? Based on your testing I'm thinking it will hold up relatively well, though there are tests showing the Vega 64 coming close to a 1080 Ti in certain DX12 games, so that's a slight concern in terms of my CPU possibly bottlenecking in games.
 
It's sad, but there was a time when CPUs would double in performance every two or three years. Not any more.

Even in the old days, going from a P2 to a P3 was like a three-times performance gain.
 
I did make the jump from a 4790K @ 4.7GHz with a 1080 Ti and a 1440p 144Hz monitor to an 8700K @ 5GHz, and I don't regret it one bit. I do regret, though, jumping on the 8700K, as only several months later the 9900K came out, and that would have been the better option.

I also like the jump to DDR4 and M.2 support, since I have quite a few NVMe M.2 drives.

Don't forget, the new MB design and RGB features are worth 10 FPS.
 
"As a side note, you can of course overclock the 8700K for greater performance when not GPU bound, but as we mentioned, gamers are almost always GPU bound with high-end Core i7 CPUs."

This theory is flawed. There are games that are CPU bound no matter the GPU/CPU combination. I wish more reviewers would benchmark WoW as well. Sure, it's old, but many people still play it. Its biggest issue is that it's heavily dependent on single-core performance. I bet Blizzard would make a benchmark if enough outlets asked for one.

WoW player here: the 8700K jump did net more FPS over Haswell with the same 1080 Ti at 2K res.
 