Intel leaks desktop Kaby Lake processor and chipset details

Scorpus


Intel wasn't expected to officially reveal their line-up of Kaby Lake desktop processors until early next year, as the company announced at IDF several months ago. However, they have essentially leaked details of their own unannounced CPUs through a product change notification (PCN) document, which was intended for manufacturing partners but published publicly.

In the PCN document, Intel lists ten desktop-class Kaby Lake Core processors and one Xeon product, along with their base clock speeds and product codes. As expected, Intel is using a similar naming scheme to Skylake: K-designators for unlocked 95W parts, T-designators for low-power 35W parts, and no suffix for regular 65W parts.

We can also expect to see a similar core configuration here, with Core i7 products receiving four cores and eight threads, while Core i5 CPUs get four cores and four threads. Cheaper Core i3 parts will be dual-core with four threads.

The main change moving from Skylake to Kaby Lake appears to be a slight increase in base clock speeds, somewhere in the 100 to 300 MHz range depending on model. Intel's refined 14nm+ manufacturing process allows these clock speed gains at no cost to power consumption, and users can also look forward to minor Speed Shift improvements as well.

A separate document has revealed Intel's upcoming chipset names, and again they aren't a massive surprise.

At the top end we're looking at the Z270 chipset, while we can also expect H270, B250, Q270 and Q250. Additionally, Intel revealed the C422 and X299 chipsets, which could be next-generation products for server and enthusiast processors respectively.


 
Speed gains in relation to what? My 4790 from two years ago runs at 3.6 to 4 GHz. The 7700, the highest Kaby Lake part not meant for overclockers, has the same base speed.

Let's face it, Intel is spoon-feeding us because they can, with no real competition in sight. Zen may or may not change that (probably not), but the truth is your five year old i7 2600, paired with a nice card, can still run every modern game without a sweat.

New chipsets every couple of months as well... If you're into desktop activities, keeping up with newly launched technologies has become the main upgrade incentive these days, not the boost you get from processing power.
 
but the truth is your five year old i7 2600, paired with a nice card, can still run every modern game without a sweat.

Agreed. I'm still on an i5-2500K and have never seen the need to upgrade in the last 5 years. At this pace, I won't be upgrading for another 4-5 years...if ever, the way Intel is going. Hopefully Zen will be amazing enough to put some serious pressure on Intel to give us some drastic performance gains.
 
Here's hoping that Zen at least makes Intel's CPUs cheaper. I don't need them to win, just to be competitive enough to make Intel adjust their pricing.
 
"Leak".

The frequency with which Intel "accidentally" leaks details leads me to suspect they aren't really leaks. Either that, or there are some very incompetent sales/PR people.

And agreed, my 5yo Sandy Bridge i7 MBP is still sufficient for anything I throw at it. I was considering upgrading to the new MBP until I saw the atrocious hardware decisions. And they wonder why computer sales are dropping?
 
"but the truth is your five year old i7 2600, paired with a nice card, can still run every modern game without a sweat."
Indeed. The gap shrinks even more for non-K chips:-

Maximum Non-K Turbo Speeds over 6 generations:-

i5-2500 - 3.4 / 3.5 / 3.6 / 3.7 -> 3.8 / 3.9 / 4.0 / 4.1 (+4-bin limited OC)
i5-3570 - 3.6 / 3.7 / 3.8 / 3.8 -> 4.0 / 4.1 / 4.2 / 4.2 (+4-bin limited OC)
i5-4690 - 3.7 / 3.8 / 3.9 / 3.9 - (From Haswell, +4-bin non-K limited OC feature is nerfed)
i5-5675R - 3.5 / 3.5 / 3.6 / 3.6
i5-6600 - 3.6 / 3.7 / 3.8 / 3.9
i5-7600 - 3.5 / 3.6 / 3.7 / 3.8 ???

So a new i5-6600 may be +15% faster per clock than an i5-3570, but the latter is clocked 11% higher = barely a 5% gain after three generations (Ivy -> HW -> BW -> Skylake). Likewise, the current "mid" non-K clocks are 300MHz slower than the highest ones (i5-6500 vs i5-6600), versus only 100-200MHz for the older ones (3550/3470 vs 3570, 4590 vs 4690, etc). And the new Kaby Lake i5-7600 looks to be 300MHz slower than the i5-7600K (whereas i5-6600K vs 6600, 4690K vs 4690, 3570K vs 3570 and 2500K vs 2500 have all been at parity). The only non-K chips not consistently crippled are the i3's, and even then it's only the slowest ones (4170 / 6100) with half-sensible pricing.
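A quick back-of-the-envelope check of that math (a rough sketch; the 15% and 11% figures are this post's estimates, not measured benchmarks):

```python
# Estimates from this post, not benchmarks: the i5-6600 gains ~15%
# per clock over the i5-3570, but the i5-3570 (with its +4-bin
# limited OC) runs ~11% higher clocks, eroding most of that gain.
per_clock_gain = 1.15   # i5-6600 vs i5-3570, performance per clock
clock_advantage = 1.11  # i5-3570's clock edge over the i5-6600

# Net real-world gain once the older chip's clock edge is applied:
net_gain = per_clock_gain / clock_advantage
print(f"Net gain across three generations: {(net_gain - 1) * 100:.1f}%")
# → Net gain across three generations: 3.6%
```

So the "barely 5%" above is, if anything, generous.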

Literally the only reason to upgrade has been hardware failure / platform features (DDR4, M2, etc). Makes you miss the good old 486 era when "x86 CPU" = pin compatible IBM, Cyrix, VIA, SGS Thomson, UMC, Texas Instruments, etc, in addition to Intel vs AMD.
 
Speed gains in relation to what? My 4790 from two years ago runs at 3.6 to 4 GHz. The 7700, the highest Kaby Lake part not meant for overclockers, has the same base speed.

Let's face it, Intel is spoon-feeding us because they can, with no real competition in sight. Zen may or may not change that (probably not), but the truth is your five year old i7 2600, paired with a nice card, can still run every modern game without a sweat.

New chipsets every couple of months as well... If you're into desktop activities, keeping up with newly launched technologies has become the main upgrade incentive these days, not the boost you get from processing power.
Yup. Still on an Ivy Bridge 3570K. Literally zero reason to invest in a new system, outside of getting NVMe support. I really want to use a Samsung 960 Pro M.2.

The way I figure, if AMD releases a slam dunk in the form of a $350 8-core monster, I'll upgrade to that and keep it as long as I can get parts. Probably buy a spare motherboard on sale just in case. If not, I'll just keep my Ivy for a few more years. There just isn't much in the way of improvements anymore.
 
Agreed. I'm still on an i5-2500K and have never seen the need to upgrade in the last 5 years. At this pace, I won't be upgrading for another 4-5 years..
You think that's bad?
I still roll with an i7 930 @ 4.0GHz 24/7.
For games, it's still right there; my only complaint is that 4.2GHz is the highest I can get it to go safely.
 
Agreed. I'm still on an i5-2500K and have never seen the need to upgrade in the last 5 years. At this pace, I won't be upgrading for another 4-5 years..
You think that's bad?
I still roll with an i7 930 @ 4.0GHz 24/7.
For games, it's still right there; my only complaint is that 4.2GHz is the highest I can get it to go safely.
I know some friends that are STILL on Core 2 Quads. A close friend just upgraded to an i7 rig because he needed more than 8 gigs of RAM. The CPU itself was fine.

It is scary how much things have stagnated. OTOH, makes gaming PCs that much cheaper.
 
Speed gains in relation to what? My 4790 from two years ago runs at 3.6 to 4 GHz. The 7700, the highest Kaby Lake part not meant for overclockers, has the same base speed.

Let's face it, Intel is spoon-feeding us because they can, with no real competition in sight. Zen may or may not change that (probably not), but the truth is your five year old i7 2600, paired with a nice card, can still run every modern game without a sweat.

New chipsets every couple of months as well... If you're into desktop activities, keeping up with newly launched technologies has become the main upgrade incentive these days, not the boost you get from processing power.

While you're right that even seven-year-old processors can still hang with most modern applications, it is not because Intel is "spoon-feeding" us. It is because we are pushing up against the limits of silicon computing. Both Intel and AMD have stated that they expect to hit a hard limit around 5-7nm - any smaller than that, and electron tunneling occurs too frequently for any amount of reliable error correction. They also both anticipate 10nm being the only interim step between 14nm and 5-7nm.

The 90s saw massively inflated transistor counts, and the 2000s saw us begin to reach the maximum operating frequencies that silicon architectures are capable of (with practical cooling). The late 10s and early 20s are going to see decreases in power consumption. By the time Intel and AMD hit 5-7nm, there will be almost no difference in TDP between desktop, laptop, and mobile phone processors with comparable transistor counts. Yes, you can in theory get better performance, but it is not economical to do so - you'd never recover the R&D costs. Instead, they are both hoping that some company or university cracks the quantum computing nut, because that is ultimately the next jump for computing.
 
True, but that's not necessarily the end of the road. Improvements can be made in many other directions, and Intel, being at the forefront of microchip design, has definitely tested some of them. Waiting for someone else to achieve a breakthrough for you won't win you any favors with your investors.
 
I'm using a 6950x.

This desktop will last at least 4 or 5 years.

Not sure what PC gaming will require then, but I'll wait till then to buy another desktop.

I can't even imagine how powerful computers will be 5 years from now.
 
You think thats bad?
I still roll with an i7 930 @ 4.0GHz 24/7.
For games, its still right there, only complaint is 4.2GHz is the highest I can get it to go safely.

Lol, the 930 is a solid chip too!

I know some friends that are STILL on Core 2 Quads. A close friend just upgraded to an i7 rig because he needed more than 8 gigs of RAM. The CPU itself was fine.

It is scary how much things have stagnated. OTOH, makes gaming PCs that much cheaper.

Holy damn, I wouldn't be able to use a C2Q for gaming. I came from a Q9450 to the 2500K and got massive FPS gains in games. A C2Q bottlenecks the GPU so much, especially in online multiplayer games like Battlefield.
 
Where's the power improvement?

Intel started well: 130W (1st gen), 95W (2nd gen), 77W (3rd gen), but then 87W (4th gen) and 95W (6th gen)? What's next? lol

Intel has had years on 14nm and isn't getting the best out of it. AMD will be using 14nm for the first time, SMT for the first time, and its first octa-core SoC, while limiting itself to 95W. If AMD ends up with greater performance per watt than Intel (using all threads) and only about 10-15% lower performance per clock in single-threaded scenarios, then... what will you buy?
 
then AMD will have greater performance per Watt than Intel
Some dreams may come true, but until they do, they are just that: dreams.

I would be so bold as to say Intel has not been stagnant in development, even though their market has been stagnant. There is no telling what ace they are holding up their sleeve if AMD somehow matches their current offering.
 
I would be so bold as to say Intel has not been stagnant in development, even though their market has been stagnant. There is no telling what ace they are holding up their sleeve if AMD somehow matches their current offering.

Yeah, Intel will bribe manufacturers like they did the last time AMD had the better product.

Intel has not developed a new desktop CPU architecture since 2000 (Atoms and other low-voltage offerings are a different thing), so they simply don't have any "aces" other than money-related ones. That includes price cuts and such.
 
Intel has not developed a new desktop CPU architecture since 2000 (Atoms and other low-voltage offerings are a different thing), so they simply don't have any "aces" other than money-related ones. That includes price cuts and such.
The 3D transistor or 'tri-gate' technology that released with Ivy Bridge was a huge improvement on efficiency and chip design.
 
The 3D transistor or 'tri-gate' technology
That's a fairy tale in HardReset's reality.

I laugh my *** off every time I hear someone say "Intel hasn't developed anything since 2000" as a sales pitch for AMD. If that were the case, AMD would be the top dog!
 
The 3D transistor or 'tri-gate' technology that released with Ivy Bridge was a huge improvement on efficiency and chip design.

That's manufacturing improvement, not architectural.

That's a fairy tale in HardReset's reality.

I laugh my *** off every time I hear someone say "Intel hasn't developed anything since 2000" as a sales pitch for AMD. If that were the case, AMD would be the top dog!

Read again.
 
Tri-gate transistor technology is all about manufacturing process, nothing about CPU architecture.
Manufacturing a chip means making it, not creating it. It takes place once they know how to build the new tri-gate technology. The 3D transistor tech had to be developed, tested, and implemented into the chip design after the technology was created/finalized. It took them years, and it was a huge breakthrough for chip design/architecture. Now that the tech is ready, it can be manufactured.
The new Coyote motor in the Mustang was created/developed/designed over several years before it was put in the 2011 Mustang GT, and now that motor is manufactured.
I don't think you know what "manufacture" means; it has nothing to do with design, only the creation of that product.
 