2nd-Gen Core i7 vs. 8th-Gen Core i7: RIP Sandy Bridge?

Back in 2013 my Core 2 Quad (Q8200) failed to cope with the release of Battlefield 4, making it at times unplayable. At that point I upgraded to an i7-3770. Looking at the results above, none of the games tested are unplayable with a Sandy Bridge i7, never mind an Ivy Bridge, so there really is no need to upgrade yet. When I do upgrade (probably in the new year) it will be to the most cores I can afford, as I'll have to justify it (man maths) on productivity grounds.
 
The title, "RIP Sandy Bridge," went way too far considering your conclusion. Actually, even the conclusion itself went too far. Frames never dropped below 40, and rarely below 60, which is the only thing that matters to the average gamer.

The title was a bit overkill so we changed that. The conclusion, however, hit the nail on the head in my opinion. The 2600K overclocked to at least 4.5 GHz is fine with a GTX 1070; anything faster and you're wasting money without a CPU upgrade.
 
The title, "RIP Sandy Bridge," went way too far considering your conclusion. Actually, even the conclusion itself went too far. Frames never dropped below 40, and rarely below 60, which is the only thing that matters to the average gamer.

The title was a bit overkill so we changed that. The conclusion however is on point in my opinion and hit the nail on the head. The 2600K overclocked to at least 4.5 GHz is fine with a GTX 1070, anything faster and you're wasting money without a CPU upgrade.

Thanks for the response. However, once you have a CPU that doesn't bottleneck the 1070, wouldn't you say the 1070 is a bottleneck and upgrade it, and so on? [Edit: The conclusion actually makes sense for those who haven't bought the hypothetical 1070 yet.]
 
The point is, if you have a GTX 1070 you don't need to do anything about the CPU right now, not until you upgrade to a faster GPU. That's what we meant when we said this...

"What does this mean for you if you're a Core i7-2600K owner? Currently, probably not a lot. If you're running a graphics card that's equal to or slower than the GTX 1070, you don't really need to do anything for now assuming you've overclocked that sucker to at least 4.5GHz."

You only need to upgrade the CPU if you have or plan to acquire a GTX 1080 or an even faster GPU, a scenario that will become more likely next year.
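For anyone unsure which side of that line they're on, a quick sanity check is resolution scaling: run a benchmark pass at native resolution and another at a much lower one, and see whether fps climbs. Here is a minimal sketch of that heuristic; the function name, the 10% threshold, and the fps readings are all hypothetical, and in practice the numbers would come from a benchmark run or an overlay such as MSI Afterburner:

```python
# Rough heuristic: if fps barely changes when you lower the resolution,
# the GPU wasn't the limit at native res -- the CPU was.
def likely_bottleneck(fps_native: float, fps_low_res: float,
                      tolerance: float = 0.10) -> str:
    """Compare average fps at native resolution vs. a much lower one."""
    if fps_low_res > fps_native * (1 + tolerance):
        return "GPU-bound at native resolution (fps scales with resolution)"
    return "CPU-bound (lowering resolution didn't help)"

# Hypothetical readings from two benchmark passes:
print(likely_bottleneck(fps_native=85.0, fps_low_res=142.0))  # GPU-bound
print(likely_bottleneck(fps_native=85.0, fps_low_res=88.0))   # CPU-bound
```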
 
Great article, Steve. Helped me a lot. I've been wanting to change my rig (i5-3570 / 8GB / GTX 970) for a while. Now I'm certain that it is time to sell it before it gets too old and buy a new one.
 
I don't see anything particularly surprising here. Anyone running old gear clearly understands that buying a much newer CPU will bring a noticeable improvement.

Until I get my Ryzen upgrade I still have an AMD FX-8350 paired with a GTX 1080. I can play every game in my 450+ Steam library at max quality settings, with no trouble, at solid frame rates. Upgrading my CPU will just be icing on the cake.

Clearly I'm not fully utilizing my GPU, but is it held back by the CPU enough to really bother me? Absolutely not! Like I said, I haven't found a single game that I own that I can't completely max out quality-wise.

I’m looking forward to the next Ryzen chips coming out next year. THAT’s when I will upgrade. I’ve just been patiently waiting for the right upgrade.

In actuality, we had our second baby this year and that pushed my full upgrade plans out the window... haha!
 
A few notes on the test:

First and foremost, awesome review. I find these reviews pitting past tech against the latest far more valuable than the typical latest-and-greatest review.

Eurogamer swears by upgrading Sandy Bridge platforms from 1600MHz to 2133MHz RAM based on their tests. I find it interesting you guys show minimal benefit in most games. BTW, the few other sites I've seen that have done similar tests often show minimal benefit as well.

The 2600K is still a beast IMO, as it provides over 70 fps in practically every test. Hell of an investment if you picked one up around launch and are still holding on to it.

Overclocked Intel K chips give you almost a 20% fps increase in many games. That should give second thoughts to anyone getting a non-overclockable Intel board and/or chip.
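(Rough napkin math, assuming the 2600K's stock max turbo of 3.8GHz: the 4.5GHz overclock used here is about an 18% clock bump, since 4.5 / 3.8 ≈ 1.18, which lines up with those ~20% gains in the CPU-bound titles.)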

That said, it would be cool to see OC results for the 7600K and 7700K. Yes, I know it's a lot of work, but it would be kind of cool to see.

That said, part 2: the 7700K is damn impressive in many of the tests against the 8600K.

That is all
 
In my situation it seems I am GPU-bound with decent fps, so reading this should just ease my mind and have me stick with my 2600K for a while. But one thing I have not seen in Sandy Bridge vs. Coffee Lake comparisons: how is latency on a P67 system vs. a Z370? If the P67's output is several frames behind a Z370's, that might make a way bigger impact than 10-30% lower fps, wouldn't you agree? Any opinions on this?
 
Human perception question: when does frame rate become 'annoying' (not just 'noticeable')? I have read that it is more an issue of 'inconsistency' (change in frame rate) than the actual frame rate. So apparently 30fps is fine as long as it is smooth, but >60fps can be inadequate if there are significant fluctuations, even if the minimum stays above 60fps. (See the second post's "long answer": https://steamcommunity.com/discussions/forum/11/530645446316451261/ )

The minimum frame rate in the article is an attempt to capture this idea, but does it? Is there a better test for 'annoying'?
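One candidate that gets closer to 'annoying' than a raw minimum is working from frame times rather than frame rates, e.g. the 1% low and the spread of frame times, so a stuttery stretch shows up even when the average looks fine. A minimal sketch, assuming you've captured per-frame times in milliseconds with something like FRAPS or PresentMon (the function name and sample data are hypothetical):

```python
import statistics

def smoothness_report(frame_times_ms: list[float]) -> dict:
    """Summarize a capture of per-frame times (in milliseconds).

    The '1% low' is the fps implied by the slowest 1% of frames,
    a stutter metric that a plain average hides.
    """
    ordered = sorted(frame_times_ms)                    # slowest frames last
    worst_1pct = ordered[-max(1, len(ordered) // 100):]
    return {
        "avg_fps": 1000.0 / statistics.mean(frame_times_ms),
        "1%_low_fps": 1000.0 / statistics.mean(worst_1pct),
        "frame_time_stdev_ms": statistics.pstdev(frame_times_ms),
    }

# Hypothetical capture: mostly 16.7 ms frames (60 fps) plus a few 40 ms stutters.
sample = [16.7] * 200 + [40.0] * 3
print(smoothness_report(sample))
```

In that sample the average works out to roughly 59 fps while the 1% low sits at 25 fps, which is exactly the 'fluctuating 60 feels worse than a smooth 30' case the Steam post describes.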
 
I just slapped a water cooler on my 3770K (and boosted the clock a tad) because I had an itch to upgrade everything... Still not enough of a jump between that and an 8-series chip to warrant it, in my opinion.
A 3770K and a GTX 1070 still make a terrific pair for 1440p gaming.
 
I don't agree with the conclusions in this article. Who uses a 1080 Ti to game at 1080p!?! I have a 2500K at 4.4GHz and a 1080 Ti with a 55-inch LG OLED. I've benchmarked Shadow of War at 4K and the fps is IDENTICAL to the results I've seen on gaming websites using the latest Intel chips. Every triple-A game is GPU-limited at 4K and probably will be for the next few years. GPUs are slow as ****, including my 1080 Ti.

Please explain to me how a CPU/motherboard/RAM upgrade compares to buying a 55-inch B7A? The OLED is outrageously beautiful, and every game is GPU-limited at 4K anyway. HDR 4K gaming in Destiny 2 and Shadow of War is the prettiest PC gaming there is.

PLEASE ADD 4K BENCHMARKS! PLEASE! And Shadow of War... it's amazing.
 
If someone is playing... benchmarks, they should upgrade yesterday.
If someone is playing games, the 2600K is fine and will probably remain a good solution as long as those 8 threads are enough. Games will probably start asking for 6 threads in the near future, and we will have to see how the 2600K's 8 (4 physical + 4 Hyper-Threaded) threads perform then. If games start asking for more than 6 threads, the 2600K will start looking like a huge bottleneck.
 
Sandy Bridge is arguably Intel's best generation. Next is Haswell. Just look around at refurbished PCs and laptops: most of them are still going strong and selling well to average home users who don't want to spend a lot of cash.
 
The 2600K puts on a damn good show when heavily overclocked, but it's clearly showing its age in the games that lean a little more on the CPU.

I'm more surprised you didn't include the 2500K. While I appreciate that this was comparing the high-end Sandy Bridge chip, everyone bought the 2500K.
 