Intel Core i7-6700K Review: Skylake arrives with the latest 'tock'

Steve


Last month we finally got our first look at Intel's latest 14nm technology for desktop computers when we reviewed the Core i7-5775C. Whereas Broadwell was a 'tick' in Intel's "tick-tock" manufacturing and design model, offering a few minor tweaks, Skylake is a 'tock', which means the redesign should provide greater CPU and GPU performance with reduced power consumption.

A number of significant changes are introduced with Skylake, chief among them the new LGA1151 socket, which isn't backward compatible with anything previously available and instead requires a new motherboard that supports an Intel 100 Series chipset, also known as Sunrise Point.

The memory controller found on Skylake processors has been upgraded to support DDR4 memory, though it will be limited to dual-channel operation. Skylake processors will support DDR3 memory as well, so it will be up to the motherboard manufacturers to decide which standard they're going to use.

On hand today for our tests we have the Intel Core i7-6700K, a quad-core processor operating at a base clock of 4.0GHz.

Read the complete review.

 
Great name for the last page of the article. Outside of a better chipset, there is zero reason to buy this thing. Wasn't Skylake supposed to have some big improvements CPU-wise?

Guess it'll be another 5 years before the 3570K gets upgraded, unless AMD pulls a rabbit out of their hat with Zen.
 
Now, who's spending 500 euros of their hard-earned money to upgrade their four-year-old 2500K, which was cheaper than Skylake when it was new? It's gonna be a hard year for PC manufacturers...
 
I got an i7-2700K for 275 euros when it went EOL.
Great overclocker and still very competitive.

I'm waiting for the killer 6-core or 8-core, but it looks like I'll be hanging on to this CPU a lot longer.
 
Great name for the last page of the article. Outside of a better chipset, there is zero reason to buy this thing. Wasn't Skylake supposed to have some big improvements CPU-wise?

Guess it'll be another 5 years before the 3570K gets upgraded, unless AMD pulls a rabbit out of their hat with Zen.

I just received an updated BIOS from ASRock that has improved Core i7-6700K performance ever so slightly. Worst case, I am now seeing it match the 4790K, and it is often a few percent faster, so the changes aren't significant, but it looks a little better.

The ASRock Z97 motherboard I used was also quite aggressive with the 4790K's Turbo function, which slightly inflated its results.

Looking at other sites, it seems the Core i7-6700K is 0-5% faster than the 4790K.
 
At least I now know that my Z77/Core i7-3770 platform is still viable next to Skylake and the 100 series chipset. I had a feeling that it'd be like this for those of us running Ivy Bridge and Haswell setups, but anyone with an older platform should make the jump.
 
This is becoming just like the mobile smartphone cycle: more power, more performance, HIGHER price. You get it, and find out it is all "fluff" for the average user, suckered into paying for a sidegrade that really does not benefit the consumer. Most consumers will never come close to maxing out the performance of these new chips, memory, computers and smartphones, considering what it costs to upgrade.
 
My $0.02 - Looks like Intel pulled an AMD. I've been sitting on an i7-3820 for some time. Really, nothing Intel has come up with since then has inspired me to want to build a new machine. I would get more bang for the buck by simply upgrading my GTX 580.

That said, it seems like Intel has really done nothing innovative in the past few years except go to 14nm. They are clearly milking every cent they can out of the current base design. As I see it, this is creating an opening for AMD - not that I am holding my breath for AMD to release an Intel killer. But having worked for a former photography giant that became complacent and in that complacency evaporated to a shadow of its former self, I have to say to Intel: wake up, guys. Being a leader is great; being complacent is just stupid.
 
At least I now know that my Z77/Core i7-3770 platform is still viable next to Skylake and the 100 series chipset. I had a feeling that it'd be like this for those of us running Ivy Bridge and Haswell setups, but anyone with an older platform should make the jump.
Yeah, it's sad to me that in most of the tests it stacks up about as well as an AMD 8350... I'm keeping what I have for a few more years...
 
My $0.02 - Looks like Intel pulled an AMD. I've been sitting on an i7-3820 for some time. Really, nothing Intel has come up with since then has inspired me to want to build a new machine. I would get more bang for the buck by simply upgrading my GTX 580.

That said, it seems like Intel has really done nothing innovative in the past few years except go to 14nm. They are clearly milking every cent they can out of the current base design. As I see it, this is creating an opening for AMD - not that I am holding my breath for AMD to release an Intel killer. But having worked for a former photography giant that became complacent and in that complacency evaporated to a shadow of its former self, I have to say to Intel: wake up, guys. Being a leader is great; being complacent is just stupid.


You do know we are talking about the only company to be sued for selling old hardware as new, right??? Yeah, Intel has been overrated for years... It's sad I'm still rocking an AMD 555 unlocked to a quad core at 3.8GHz (really a locked 955)... but I can still get 50 to 60 frames per second in GTA 5 with this CPU on a 9 series motherboard with a 770 4GB...
 
Still rocking my Sandy Bridge i7-2700K and I see no reason to upgrade anytime soon. And by anytime soon, I mean the next 5 years.

No worries, your CPU won't last that long. I have a 2600K and I'm sure I won't be able to justify a replacement before it breaks either. I need at least double the performance of my old CPU before I can consider it.
 
This is sad. Two gens later and still no reason to upgrade over my Haswell, let alone other people's Sandy and Ivy Bridge CPUs. I can't wait to see what AMD unveils next. This is a joke.
 
My $0.02 - Looks like Intel pulled an AMD. I've been sitting on an i7-3820 for some time. Really, nothing Intel has come up with since then has inspired me to want to build a new machine. I would get more bang for the buck by simply upgrading my GTX 580.

That said, it seems like Intel has really done nothing innovative in the past few years except go to 14nm. They are clearly milking every cent they can out of the current base design. As I see it, this is creating an opening for AMD - not that I am holding my breath for AMD to release an Intel killer. But having worked for a former photography giant that became complacent and in that complacency evaporated to a shadow of its former self, I have to say to Intel: wake up, guys. Being a leader is great; being complacent is just stupid.


You do know we are talking about the only company to be sued for selling old hardware as new, right??? Yeah, Intel has been overrated for years... It's sad I'm still rocking an AMD 555 unlocked to a quad core at 3.8GHz (really a locked 955)... but I can still get 50 to 60 frames per second in GTA 5 with this CPU on a 9 series motherboard with a 770 4GB...

Intel overrated? Is that what you call having the best performing CPUs for the last 10 years?

A Phenom II is a very old CPU now and will bottleneck a 770 regardless. Look at the results of a modern system with a 770 GPU, then compare them to your own. I'm glad you are happy with it, but don't be blind to the fact that AMD currently has no answer for Intel when it comes to CPUs.

This is sad. Two gens later and still no reason to upgrade over my Haswell, let alone other people's Sandy and Ivy Bridge CPUs. I can't wait to see what AMD unveils next. This is a joke.

There is no reason to upgrade from Haswell. I don't understand why people with the last-generation product make statements like that; if you upgrade every generation, you are doing it wrong.

For anyone on Sandy Bridge, Ivy Bridge or anything older (Nehalem, Core 2 Quad), this would be a good time to upgrade your platform.
 
The biggest scam of all was when Intel split their releases into consumer and enthusiast lines with their "E" versions... and then released the "E" versions AFTER the consumer ones!!

Honestly, if you are paying top dollar for an "E" CPU, you should be getting the latest chipset as well, not last year's (or one from two years ago)...

For instance, those of us sporting X99 (Haswell-E) are now a chipset behind, but there is nothing to upgrade to... Skylake-E should have come out first!!
 
When will they release consumer-level chips without on-die graphics?

Sadly, never. Intel's vision of the "consumer" is not someone using a dedicated GPU... those are enthusiasts.
The consumer is an office worker who plays Minesweeper and Solitaire when he has spare time from extensive Excelling and PowerPointing. And the iGPU is quite enough for that.
 
And I am so sad about the iGPU performance... it's a complete flop compared to Broadwell - WHAT WAS INTEL THINKING?

Mr. Steven, could you now do an [Updated] version of this article with your new BIOS and the newest testing software, like PCMark 8, not the four-year-old PCMark 7?
 
The 5775C is by far the most appealing processor to me. It's a beast in certain bandwidth-hungry benchies thanks to the huge L4 cache, the Iris GPU is nuts, it's unlocked, and it doesn't require a mobo upgrade...

Skylake is... underwhelming, to put it mildly. Give us Xeon E5-2687W v3 levels of performance for mainstream prices and we'll call that progress. Not the 5-10% we've been spoon-fed every generation for 10 years now...
 
Love how they claim there is an overclocking industry, one they formed by artificially locking features and then jacking up the prices on gear. What value have they actually added to the higher-priced chips?

But yes, lack of competition. It's good to be king.
 
I'm actually looking forward to the i3 desktop offerings in Skylake. With Ivy Bridge, they released a Core i3 with the high-end graphics package. This was the i3-3225, and it provided Intel's best-at-the-time "HD 4000" graphics.

Since several generations have passed since that particular graphics engine, I pretty much don't care if the CPU itself isn't much faster; the graphics certainly would be much better.
 
I do have a few questions though. Why does everyone here think Moore's Law was handed down to them personally, on a set of stone tablets?

Why does everyone here think they know how Intel should proceed, or what problems will arise, with regard to the infinitesimally small transistor pathways being dealt with today?

The whole lot of you couldn't go into somebody's garage and come up with even a working Pentium II.

And FWIW, the reason AMD is likely falling behind is that they simply don't have the excess capital to piss away on new fabs and research.
 