4th-Gen Core i7 vs. 8th-Gen Core i7: Is It Worth the Upgrade for PC Gamers?


1. PUBG is notoriously poorly optimized, so not really a great benchmarking game.
2. You shouldn't base any decisions on benchmarking just one game, unless you _only_ play that one game.
3. You should take a critical look at your results and ask yourself, "do these make sense?" You're getting higher FPS with a higher CPU load and a higher resolution, which means there's something wrong with your benchmark (it could be related to point #1).
4. If the games have internal frame caps, then the choice of CPU matters even less.
5. In CS:GO you're basically CPU limited even with a GTX 1060 6GB @1080p with everything maxed out. Your suggested benchmark would not reveal anything new, since all of these CPUs should be enough to push the average FPS to 300+.

1. Agreed. That's the point. We are all running tests on games with engines that are 2-5 years old, which are not optimized for the 8700K, or even for the 7700K. We are also testing GPU-dependent games. For comparison, take a glance at 1080 Ti vs 1080 Ti SLI in Project Cars @1080p Medium: 340 fps vs 354 fps. So does that mean SLI is no good? Nope, that is just a shitty comparison, because @2160p Ultra it's 124 fps vs 195 fps. And Battlefield 1 can't even SLI properly, showing fewer fps than a single GPU.
2. My friends were asking for this test, because this is the only game we have played for the last 5 months. Top 200 EU, tryharding, you know...
3. A bit strange, yes. But I assume the i5-4670 is bottlenecking the 1080 Ti (or bottlenecking itself somehow) @1080p, but not @1440p. During the 1080p tests the CPU load was 60-71% and the GPU about 70%; during the 1440p tests the CPU was @92% and the GPU @99%.
4. And again, that is the point. We could also pick two dozen different processors and run tests on games @800*600, all low, and then say, "Hey, look, the i5-650 can also do 300 fps." Why the hell are we comparing different generations of new processors, built for 1440p and 2160p gaming, at 1080p?
5. Yes and no. The perfect conditions for CS:GO are "< 1080p resolution, minimal graphics, minimum cores, maximum frequency per core," and suddenly the i5-8600K loses to the i3-8350K. RivaTuner shows the i3-8350K loading its 1st and 4th cores at 89%, and its 2nd and 3rd at 15-20% (a way to log exactly that is sketched below). Fair enough? Makes sense?
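
For anyone who wants to reproduce that per-core observation without RivaTuner, here's a minimal sketch in Python; it assumes psutil is installed (pip install psutil), and the 1-second interval and 60-second duration are arbitrary choices:

```python
import time
import psutil

def log_core_loads(duration_s=60, interval_s=1.0):
    """Print per-core CPU utilization once per interval, then per-core averages."""
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        # Blocks for interval_s, then returns one utilization % per logical core.
        loads = psutil.cpu_percent(interval=interval_s, percpu=True)
        samples.append(loads)
        print("  ".join(f"core{i}: {load:5.1f}%" for i, load in enumerate(loads)))
    # Averages over the whole run make uneven loading obvious
    # (e.g. two cores near 90% while the other two idle at 15-20%).
    for i, col in enumerate(zip(*samples)):
        print(f"core{i} average: {sum(col) / len(col):.1f}%")

if __name__ == "__main__":
    log_core_loads()
```

Run it in the background during a match; if the averages show two cores pinned and two near idle, the game is frequency-bound rather than core-bound, which is exactly why an i3 with a higher clock can win.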
 
Any chance we can stop showing Ashes of the Singularity results? It's not a game and no one runs that piece of software. It's more like a demo to show off very specific features.
 
Is it worth me moving to a new setup from my o/c'd i5 2500K yet? I guess waiting for the new motherboards next year is the trick.

Yes. I went from the same i5 to the 7700 and the gains on a 1070 were quite noticeable. A 1080 or higher would show even better gains. Of course, if you're upgrading, go for Coffee Lake; Intel gave me the shaft by releasing a new platform so soon after Kaby Lake without backwards compatibility. Still, the 7700 is a beast of a gamer CPU, so I'm still happy.
 
@Steve:

Thanks for the great benchmarks, Steve! I was curious to see how little games actually needed from the CPU, so I tested a 2600K @ stock on an H61 motherboard with 8GB of DDR3 at 1333MHz. No game dipped below 60fps.

This is the reason so many viewers are confused by the testing methodology: everyone has some specific use case. But in reality, for gaming, any 8-thread or higher ring-bus CPU (Sandy Bridge onward) is more than enough.

People are just trying to find an excuse to upgrade when it's unnecessary. SLI/CrossFire are dead, and if you have a 1080 Ti your gaming rig is maxed out. I've realized most people feel the need to continuously dump more money into their PCs for no reason.

And if productivity and gaming are both a concern, Xeon E5-1650 v2 chips (6c/12t ring-bus Ivy Bridge CPUs with a soldered IHS and an unlocked multiplier) can be had for $130, making the 8700K a very poor value. New but cheap X79 motherboards can be had for $130 +/-, and quad-channel DDR3 @1600MHz offers the same bandwidth as dual-channel DDR4-3200 (a quick check is below). On top of that, 16GB of server ECC memory can be had for $30 vs $150+ for DDR4.
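
For what it's worth, that bandwidth parity is easy to sanity-check: theoretical peak is channels x transfer rate x 8 bytes per 64-bit transfer, and real-world throughput will be lower on both setups:

```python
# Theoretical peak memory bandwidth: channels x MT/s x 8 bytes per transfer.
def peak_bandwidth_gbs(channels: int, mega_transfers: int) -> float:
    return channels * mega_transfers * 8 / 1000  # GB/s

print(peak_bandwidth_gbs(4, 1600))  # quad-channel DDR3-1600 -> 51.2 GB/s
print(peak_bandwidth_gbs(2, 3200))  # dual-channel DDR4-3200 -> 51.2 GB/s
```

Same 51.2 GB/s on paper; latency and real-world efficiency differ, but the headline number matches.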

Adding up all the factors in the current market, the only reasonable upgrade is to used Xeons, as even X79 supports up to 12-core CPUs for further upgrades in the future. Ryzen isn't bad either, but DDR4 simply isn't worth the price right now.

So don't worry about people not understanding your methodology. Just ignore them. They clearly don't understand how to benchmark; instead, you should suggest they buy their own equipment and do their own testing to compare results (because if a viewer hasn't spent 40+ hours benchmarking a couple of rigs, they won't understand anyway).
 
Any chance we can stop showing Ashes of the Singularity results? It's not a game and no one runs that piece of software. It's more like a demo to show off very specific features.

Probably not. It's one of the handful of games (if not the only one) that not only uses more than 4 threads, but also shows noticeable gains on CPUs equipped with more than 8 threads. And it doesn't seem to favor AMD or Intel CPUs over the other. So it's kind of useful as a "what would it be like if other games took more advantage of CPU cores/threads?" data point.
 
One article was comparing two new CPUs with the aim of trying to work out which one will serve you best. This article is trying to decide if upgrading to a new CPU from an old one is worth the investment. Surely you can tell the difference.
I already said that they are different, exactly as you describe. But as I also said, I still don't like the distinction.

Why overclock the 4770K (even if just to emulate the 4790K) and keep the 8700K at defaults, while testing at reasonable resolutions? Overclock the 8700K to 5.2GHz as well, run tests at 720p, and you'd probably come to a different conclusion, for example: "The 8700K is significantly faster, your 4770K is slow in gaming and it will get worse fast, it's time for you to upgrade. Upgrade today!"

What I am saying is that even in cases where the reviews try to answer a different question, they should always keep some constants. In my opinion, we can't have a review focusing on 720p that calls a 130 vs 110 fps difference significant, and then the next week have another review focusing on 1080p that calls a 130 vs 110 fps difference insignificant, while also giving an extra advantage to the slower model by not overclocking the faster one. In one review the difference is significant, game development will not change, go for the 8400. In the next, the same difference is insignificant; ignore that in the most recent game even the overclocked 4770K is getting beaten badly. It's over 100fps anyway.
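
For context, here is that 130 vs 110 fps gap restated as frame times, just raw arithmetic:

```python
# An ~18% fps gap is only ~1.4 ms of render time per frame.
for fps in (130, 110):
    print(f"{fps} fps = {1000 / fps:.2f} ms per frame")
gap_ms = 1000 / 110 - 1000 / 130
print(f"gap: {gap_ms:.2f} ms per frame, {(130 - 110) / 110:.1%} more fps")
```

Whether ~1.4 ms per frame counts as "significant" is exactly the judgment call that keeps flip-flopping between reviews.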

If I was seeing 720p numbers and an overclocked 8700K, and also comments about how 130 vs 110 is a noticeable difference or how the 4770K is falling behind in the most modern titles, I would have no reason to post anything that looks like an objection. But others, 4770K owners probably, would.
 
I am going to keep this web page so that when my son asks for a new computer, I can tell him that his Intel Core i7-4770K with 24GB of memory and a GTX 1070 card will run fast enough that it does not make sense to look at something new. It also has a 480GB SSD boot drive and a 2TB data drive.
 

Hmm. It seems that the answer I was going to post is so long that it's classified as spam/inappropriate. Here's the gist of what I wanted to say:

I think you're not really understanding the scope of the article. The point was not to test the absolute speed differences of the CPUs, but whether the new CPUs are worth upgrading to or not. Both the chosen games and the methodology fit the scope pretty well. If anything, we could argue that pretty much no one stuck with an old Intel CPU will own a 1080Ti. Instead, the tests should be run at 1080p using a GTX 980 Ti or something similar, in which case the end result would have been the same, just more pronounced. The benchmark results would also have been very boring. Same thing with 1440p and 4K results, since they're even more heavily GPU limited. So, the test methodology is a bit of a compromise, but I wouldn't say it's a bad one.

I get your point about CS:GO, but you're talking about a very specific use case and, let's be honest, for most people it doesn't really matter which of the tested CPUs they'd use, since input lag is not one of their biggest issues. Pro gamers in pro games? Maybe. 99.99% of the player base? Not really. This is why I disagree with your suggestion that CS:GO would somehow be a fairer benchmark. Within the scope of the article and considering the typical use case, CS:GO results aren't really that interesting.

P.S. About the PUBG results: if it were the i5 bottlenecking the GPU, the CPU should be at 100%. To me those numbers first and foremost indicate poor code optimization. It would be interesting to test other variables like memory speed at 1080p in an effort to pin down the cause, but at the moment I wouldn't draw any conclusions from those results other than "PUBG doesn't behave as it should". Kudos for being Top 200, btw. Nothing wrong with tryharding.
 
Great article; this reinforces what I mentioned in the Coffee Lake review and the decision I made to pair a 1080 Ti with the 4790K (4.7GHz) instead of upgrading to a new platform.
 
Hmm, in other words, as I have been thinking for a long time: a 10-20% drop in some games and in productivity.
Other than that, people on Haswell LGA 1150 shouldn't see the need to abandon ship.
Unless, of course, it's wear and tear, or the system is having problems, as I see off and on.
 
I'm running a 4790K (not overclocked) with a 1080 Ti and everything set to Ultra at 1440p, and I get great frame rates in PUBG, R6: Siege, and Wolfenstein.
 
Glad to see Haswell still kicking butt. Keep these reviews coming; they're far more interesting than reading about the latest and greatest overpriced releases. One quick question: if you are going to OC one CPU for results, why not just OC them all (obviously a time constraint, but a few would be nice)? There are plenty of stock CPU results to look at on the internet.
 
Hmm, in other words, as I have been thinking for a long time: a 10-20% drop in some games and in productivity.
Other than that, people on Haswell LGA 1150 shouldn't see the need to abandon ship.
Unless, of course, it's wear and tear, or the system is having problems, as I see off and on.

I don't see a lot of people here acknowledging the option of selling your old parts and applying that money towards an upgrade. I just sold my Haswell-based parts for a little over 50% of what I paid. That will go towards upgrading to the Coffee Lake platform (next year, when the H-series chipsets come out).

So, I'll be getting an 8th-gen system for almost the exact same money (actually, definitely cheaper) than what I paid for a 4th-gen CPU + motherboard combo 2 years ago. Thanks to AMD's Ryzen success, I'll be getting more bang for my buck from Intel than I got 2 years ago. And the parts will be brand new.

In other words, even if there aren't "oh my God" performance differences, it still seems like it's worth doing. All these tests are also based on using a super powerful discrete GPU, which I don't really care much about anymore. So, I go from the HD 4600 iGPU to the HD 630 iGPU.

I do understand the article is asking a hypothetical question and answering it for a specific group of interested parties. But as I say, if those same parties sold their Haswell parts and went to Coffee Lake for cheaper than what they invested in their Haswell systems, why not? But for me personally, I'm happy with my decision.

Although I'm stuck using a backup dual-core Skylake for the next few months, I think I'll be okay.
 
(next year, when the H-series chipsets come out).

You might want to upgrade now. You wouldn't have the option of the 8-core Coffee Lake, but if things get worse with DRAM and start getting ugly with other chips too, like CPUs and GPUs, or even the chipsets on motherboards, then what you'd pay today for a top Z370 motherboard and a 6-core Coffee Lake could be what you'll pay tomorrow for a much cheaper H-series motherboard and probably the same Coffee Lake but less memory, or the same amount of memory but a cheaper Coffee Lake model.
https://linustechtips.com/main/topic/859620-price-of-silicon-wafer-rising-sharply/
 
Really? You pick up an 8700K and a 1080 Ti and make comparisons at 1080p? Even I, with my i5-4670 and 1080 Ti, play most games @4K. Sometimes QHD. But never 1080p.
The problem here is that people in the upper 20% of the income bracket have no understanding that not everybody lives according to the same means they do. Often, these kinds of comments come from very well-to-do households, and more often than not from somebody with well-to-do parents. Even when it's actually a self-sustaining individual, the problem is they have no understanding that the MAJORITY of the population does not live by the same financial guidelines that they do.

Even now, most cannot AFFORD just a 4K display, much less the hardware to drive one. 1080p displays are fairly cheap these days, and this is what you will see 75-80% of gamers using, even if those running higher-resolution systems are more vocal and therefore SEEM to be a larger portion of the user base. As with anything in life, folks who can't afford the same nice things they see the in-crowd with are even less likely to throw their voices into the mix, but since they can't hide their specs from the quiet information gathering done by Steam and other sources, we have a much better idea of what is actually trending in this regard.

To assume that everybody else is running a 1440p or 4K configuration just because you are simply goes to show the irrefutable fact that those WITH in this country are blinded by their self-entitlement and affluenza to the fact that most of the rest of the world does not, and cannot, live as they do.

Further, it seems that some of you do not understand that the only reason the very highest-end graphics cards like the 1080 Ti are used in these tests is so that the results will show only the effect of using different CPUs, since they will not be GPU limited. It is NOT because everybody is running a 1080 Ti.
 
It would have been nice to see the 4670K and 4690K in those benchmarks, especially since you mentioned them in the article itself. To go along with my earlier statement about the majority not using high-resolution displays: there are a LOT more people running the more affordable i5s than there were, or are, running the much more costly i7s. Including comparisons for those CPUs would seem a no-brainer to me, especially since the current-gen i5s were included.
 
I already said that they are different, exactly as you describe. But as I also said, I still don't like the distinction.

Why overclock the 4770K (even if just to emulate the 4790K) and keep the 8700K at defaults, while testing at reasonable resolutions? Overclock the 8700K to 5.2GHz as well, run tests at 720p, and you'd probably come to a different conclusion, for example: "The 8700K is significantly faster, your 4770K is slow in gaming and it will get worse fast, it's time for you to upgrade. Upgrade today!"

What I am saying is that even in cases where the reviews try to answer a different question, they should always keep some constants. In my opinion, we can't have a review focusing on 720p that calls a 130 vs 110 fps difference significant, and then the next week have another review focusing on 1080p that calls a 130 vs 110 fps difference insignificant, while also giving an extra advantage to the slower model by not overclocking the faster one. In one review the difference is significant, game development will not change, go for the 8400. In the next, the same difference is insignificant; ignore that in the most recent game even the overclocked 4770K is getting beaten badly. It's over 100fps anyway.

If I was seeing 720p numbers and an overclocked 8700K, and also comments about how 130 vs 110 is a noticeable difference or how the 4770K is falling behind in the most modern titles, I would have no reason to post anything that looks like an objection. But others, 4770K owners probably, would.

I disagree. You’re comparing two different things. Therefore, doing things in a formulaic fashion isn’t the best way to go.

The question here being, is upgrading from a 4th gen Core i7 processor to the 8700K worth it? This question entails many things but ultimately you want to see if the cost of buying a new CPU, motherboard and memory is worth the investment.

If you play at low resolutions with medium type quality settings with an extreme GPU like the GeForce GTX 1080 Ti and want 200fps+ then yes, the upgrade is beneficial. Overclocking the 8700K will also improve performance further here but we have our answer all the same.

If you play at 1080p using ultra quality settings on anything less than a GTX 1080 Ti, a GTX 1070 was used in my example, then ‘no’ the upgrade isn’t worth it. You can overclock the 8700K to 7 GHz on LN2 if you want, it still won’t yield any extra performance with a mid-range graphics card.

720p testing isn’t particularly useful here as we’re not interested in how much faster the 8700K is with all speed limits removed using an extreme GPU. Rather, we’re interested in finding out if someone with a 4-year-old Core i7 has anything to gain right now by upgrading to an 8700K, under realistic gaming conditions, and the answer is no. The 8700K will no doubt end up being a better gaming CPU in years to come, but you’re not upgrading from the 4770K to the 8700K for it to be a good investment in a few years’ time; you’d just upgrade in the future once it’s proven.

So again comparing two ‘new’ CPUs on two new platforms is entirely different to what we were doing here.

It’s like buying a new car vs second hand.

When buying a second hand car you want to ask things like “how many k’s are on the clock”; that’s useful information for determining the car’s condition and ultimately its value.

Walking into a dealership and asking “how many k’s are on the clock” probably won’t lead to any useful information ;)
 
Really? You pick up an 8700K and a 1080 Ti and make comparisons at 1080p? Even I, with my i5-4670 and 1080 Ti, play most games @4K. Sometimes QHD. But never 1080p.

Why do I keep seeing comments like this, on a tech site of all places? I've run out of willpower to explain this so can I please tag someone else to tackle these?

Agreed...sick of people whining about not including higher res/4K benchmarking...

Though I have the ability to run my games at 2K comfortably, I still game at 1080p. It looks very nice (yeah, I've seen 4K, but I'm not interested), plays smoothly, and puts less stress on my graphics card.

Edit: Wish the stock 4790K was also tested and listed there for direct comparison.
 
Agreed...sick of people whining about not including higher res/4K benchmarking...

Though I have the ability to run my games at 2K comfortably, I still game at 1080p. It looks very nice (yeah, I've seen 4K, but I'm not interested), plays smoothly, and puts less stress on my graphics card.

Edit: Wish the stock 4790K was also tested and listed there for direct comparison.

Yep ;) You can expect a stock 4790K to sit directly between the stock and overclocked 4770K.
 
I don't see a lot of people here acknowledging the option of selling your old parts and applying that money towards an upgrade. I just sold my Haswell-based parts for a little over 50% of what I paid. That will go towards upgrading to the Coffee Lake platform (next year, when the H-series chipsets come out).

So, I'll be getting an 8th-gen system for almost the exact same money (actually, definitely cheaper) than what I paid for a 4th-gen CPU + motherboard combo 2 years ago. Thanks to AMD's Ryzen success, I'll be getting more bang for my buck from Intel than I got 2 years ago. And the parts will be brand new.

In other words, even if there aren't "oh my God" performance differences, it still seems like it's worth doing. All these tests are also based on using a super powerful discrete GPU, which I don't really care much about anymore. So, I go from the HD 4600 iGPU to the HD 630 iGPU.

I do understand the article is asking a hypothetical question and answering it for a specific group of interested parties. But as I say, if those same parties sold their Haswell parts and went to Coffee Lake for cheaper than what they invested in their Haswell systems, why not? But for me personally, I'm happy with my decision.

Although I'm stuck using a backup dual-core Skylake for the next few months, I think I'll be okay.

That may be true, but when you don't have enough money to purchase your main parts, it is a good idea to sell off good used parts.
Like I keep telling people over the years on different forums, and now here: that's the best way.
Also, I agree it's not a big deal to have the newest, grandest thing. Now, if the **** breaks and I need a part or a computer,
then I'll consider stepping up to better things, especially if the warranty is gone.
I had been eyeing Intel this year before AMD released Ryzen, and they are looking good.
AMD really improved, but it's not enough to stop me from making a new Intel purchase.
They still lack in gaming areas, and in desktop productivity Intel keeps trumping them with every new release.
https://www.amazon.com/dp/B012M8M7TY/?tag=httpwwwtechsp-20
If I find this used and on sale for 160 bucks between the Christmas holidays and the last week of January next year,
I am buying it; the motherboard I have now leaves me with very few options.
It'll take me till March to move to Skylake, but it's really worth it.
 
You might want to upgrade now. You wouldn't have the option of the 8-core Coffee Lake, but if things get worse with DRAM and start getting ugly with other chips too, like CPUs and GPUs, or even the chipsets on motherboards, then what you'd pay today for a top Z370 motherboard and a 6-core Coffee Lake could be what you'll pay tomorrow for a much cheaper H-series motherboard and probably the same Coffee Lake but less memory, or the same amount of memory but a cheaper Coffee Lake model.
https://linustechtips.com/main/topic/859620-price-of-silicon-wafer-rising-sharply/

It's an interesting article. I like "dizmo"'s post midway down page #2:

"A 300mm wafer is ~$400.

The yield per wafer is apparently a guarded number; AMD gets around 150 per 200mm wafer? So for the sake of a number we'll say a 300mm wafer gets 225 (could be more). So ~120 CPUs (graphics is a separate die). $3.33 per CPU, for a price increase of $0.66. Even at a multiple of 10, it's a $6.60 increase. Or a nice coffee.

Very, very rough numbers, but still. I didn't want to look that much into it :P


People are overreacting. Silicon, as you've said, is a small part of the process. This doesn't increase the greater costs: shipping, distribution, marketing, packaging, etc."
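
A rough sanity check of those numbers (the per-wafer cost and die count are dizmo's figures; the ~20% rise is an assumption based on the topic of the linked thread):

```python
# dizmo's figures: ~$400 per 300mm wafer, ~120 CPU dies per wafer.
# The ~20% price rise is an assumption taken from the linked thread's topic.
wafer_cost = 400.0
dies_per_wafer = 120
silicon_per_cpu = wafer_cost / dies_per_wafer   # ~$3.33 of wafer cost per CPU
increase = silicon_per_cpu * 0.20               # ~$0.67 extra per CPU
print(f"${silicon_per_cpu:.2f} per CPU, +${increase:.2f} after a 20% rise")
```

So even a sharp wafer price hike adds well under a dollar of silicon cost per CPU, which is the quote's point.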

Now that I think about it, though: waiting 60 more days to save $30.00 on a motherboard... eh... I could go either way right now. Maybe I'll flip a coin like Two-Face does to make my decision for me. :p
 
Great article, but most techies worth their weight in gold already knew it wasn't really worth it to upgrade. However, I believe the real question should be, "Do we, as consumers, have a choice in the matter?" Over 95 percent of the time, a CPU will outlast a motherboard by years. Right now, trying to buy a new gaming motherboard for a 4th-gen i7 CPU will cost more than buying an 8th-gen i7 CPU with a gaming motherboard. I believe that consumers' lack of choice in this matter is part of the reason that technology gains are so small with every new generation of CPUs.
 