Slides reveal Intel's entire 10th-gen series: Up to 5.3 GHz and 10 cores

I can't believe I just read someone say Intel is throwing efficiency out the window, after all these years of supporting AMD and their core counts. Intel doesn't currently have anything even remotely close to the crap Bulldozer was. And speaking of core count, last I looked AMD still needs a higher core count in order to compete.

The 3950X has DOUBLE the core count and uses LESS power than the 9900K. If you consider power efficiency, core count, performance, and platform (double the PCIe bandwidth, native Gen 4 lanes from the CPU, USB 3.2, etc.) as factors, then yeah, I'd say Intel is essentially at Bulldozer levels. If this is what Intel has going up against AMD's 4000 series, it will certainly be at Bulldozer levels when that comes out. What will it be at that point? Performance isn't the only metric by which to measure a processor; Intel is behind in several dimensions.

FYI, the 3950X beats the 9900K in single-threaded IPC tests like Cinebench that aren't gaming related. You should know that, being a frequent reader of TechSpot. Intel's only advantage is a tiny sliver in gaming; everywhere else it loses, often by a wide margin.
 
Nope. Intel's chips are power hogs and not competitive on very heavily multi-threaded apps, but those are not what the vast majority of users need or use.

I think it's not that no one needs or uses multitasking / multi-threading performance, because that would be incorrect, but rather that those people who do use their CPU for multitasking no longer have an option with Intel unless they like to pay more to get less.

If people only need CPUs for high-refresh gaming, and AMD's processors are only any good at multitasking, which no one needs or uses, then why is AMD rapidly taking market share, hot on Intel's heels and maybe even ahead in terms of current desktop CPU sales? (Amazon's top 11 best-selling CPUs right now are all AMD.)

Is everyone buying AMD to get less of what they want and need to use their CPU for? That doesn't make any sense ... I think the premise of your statement is flawed.
 
I think it's not that no one needs or uses multitasking / multi-threading performance, because that would be incorrect, but rather that those people who do use their CPU for multitasking no longer have an option with Intel unless they like to pay more to get less.

If people only need CPUs for high-refresh gaming, and AMD's processors are only any good at multitasking, which no one needs or uses, then why is AMD rapidly taking market share, hot on Intel's heels and maybe even ahead in terms of current desktop CPU sales? (Amazon's top 11 best-selling CPUs right now are all AMD.)

Is everyone buying AMD to get less of what they want and need to use their CPU for? That doesn't make any sense ... I think the premise of your statement is flawed.
I was thinking of a more typical user scenario, not an enthusiast, but you make a good point. Enthusiasts buy individual processors to build their custom systems around, and clearly have a preference for the core and thread counts from AMD based on Amazon's sales list. They're looking for the best for their money, and that's clearly AMD now.

I guess I'm used to average users (I work in IT support) who buy a PC and frankly can't tell the difference between a cheap 2C4T and a high-clocked 6C12T because they do nothing to challenge their PC. Some can barely tell the performance difference between an HDD and an SSD.
 
I was thinking of a more typical user scenario, not an enthusiast, but you make a good point. Enthusiasts buy individual processors to build their custom systems around, and clearly have a preference for the core and thread counts from AMD based on Amazon's sales list. They're looking for the best for their money, and that's clearly AMD now.

I guess I'm used to average users (I work in IT support) who buy a PC and frankly can't tell the difference between a cheap 2C4T and a high-clocked 6C12T because they do nothing to challenge their PC. Some can barely tell the performance difference between an HDD and an SSD.
Working in IT myself, I can confirm this, and I feel your pain.
 
I was thinking of a more typical user scenario, not an enthusiast, but you make a good point. Enthusiasts buy individual processors to build their custom systems around, and clearly have a preference for the core and thread counts from AMD based on Amazon's sales list. They're looking for the best for their money, and that's clearly AMD now.

I guess I'm used to average users (I work in IT support) who buy a PC and frankly can't tell the difference between a cheap 2C4T and a high-clocked 6C12T because they do nothing to challenge their PC. Some can barely tell the performance difference between an HDD and an SSD.

I also work in IT, and yes, most people don't know their hardware at all. But at the same time I get constant complaints that the laptops are slow; no matter how fast a laptop I buy them, it's always too slow, or else too big. There's no satisfying people, lol. I suppose I spoil my people with pretty good hardware; we have almost nothing older than 5th-gen Intel, and everyone gets dual monitors. :)

But I am also a "megatasker": I do video and photo editing for the company and work with massive spreadsheets, so I'm slightly biased toward needing the headroom, as that is my experience; at home I do 3D modelling and animation rendering.

I also assume that most people reading here, having this type of interest, are enthusiasts.

My point was simply that AMD is killing it in sales, so people must be thinking they need more than a few percent improvement in low-res gaming or single-threaded tasks.

And I do agree that Intel is not quite at Bulldozer level with its 14nm. Pentium 4 level? Yes, maybe. :)
 
Unlike AMD's chips, these chips all, even the highest-end ones, have an integrated GPU. And they're all monolithic; the new multi-chip module tech Intel talks a lot about isn't ready yet. Plus, they're on 14nm; Intel hasn't quite overcome its 10nm problems yet, even if it's shipping some 10nm parts in volume. (Of course, it's mainly because they're monolithic that the integrated GPU is a potential issue, since it reduces yields.)

With all these disadvantages (and at least one of them, the integrated GPU, is fully under Intel's control), the fact that they are still expected to be competitive with AMD with this new series of chips is... impressive, and somewhat surprising.

Of course, every year that passes before Intel gets its act together is an opportunity for AMD to make some hay, so it should enjoy it while it can. (And, in any case, the sad Bulldozer era was something AMD did to itself; Intel won't be able to knock AMD that far back. AMD should stay in the game, but perhaps at the modest level of the earlier Ryzen generations, even when Intel gets its game back.)
 
I also work in IT, and yes, most people don't know their hardware at all. But at the same time I get constant complaints that the laptops are slow; no matter how fast a laptop I buy them, it's always too slow, or else too big. There's no satisfying people, lol. I suppose I spoil my people with pretty good hardware; we have almost nothing older than 5th-gen Intel, and everyone gets dual monitors. :)

But I am also a "megatasker": I do video and photo editing for the company and work with massive spreadsheets, so I'm slightly biased toward needing the headroom, as that is my experience; at home I do 3D modelling and animation rendering.

I also assume that most people reading here, having this type of interest, are enthusiasts.

My point was simply that AMD is killing it in sales, so people must be thinking they need more than a few percent improvement in low-res gaming or single-threaded tasks.

And I do agree that Intel is not quite at Bulldozer level with its 14nm. Pentium 4 level? Yes, maybe. :)

Intel still outsells AMD in OEM systems, so it's not all rosy for AMD.

It makes sense that PC builders would lean towards AMD. A majority of sales are in the mid and low end of the market, and that's where Intel's slight IPC lead is reduced to nothing. Even with a 2080 Ti, the 9600K beats the 3600 by only a few percentage points, and people in this price range aren't pairing these CPUs with a 2080 Ti anyway; dropping the GPU to something sensible for these lower-end processors reduces the lead to nothing. Meanwhile, the 3600 beats the 9600K in everything else, including lightly threaded productivity applications. In heavily threaded applications the 3600 gets anywhere from a 30% lead to double the performance of the 9600K.


If you look at the power consumption section, the 3600 achieved about a 35% quicker render time than the 9600K while consuming roughly the same power, meaning the 3600 achieves much better performance per watt as well.
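For anyone who wants to sanity-check that claim, the back-of-envelope math is simple; the figures below are illustrative stand-ins, not TechSpot's exact numbers:

```python
# Rough performance-per-watt comparison (illustrative figures, not measured review data).
# If two CPUs draw roughly the same power but one finishes a render ~35% sooner,
# it also uses correspondingly less energy per render.

def energy_per_render_wh(power_watts: float, render_seconds: float) -> float:
    """Energy in watt-hours consumed to complete one render."""
    return power_watts * render_seconds / 3600

# Hypothetical numbers: both parts drawing ~90 W, with the 9600K taking ~35% longer per render.
r5_3600  = energy_per_render_wh(power_watts=90, render_seconds=600)         # ~15.0 Wh
i5_9600k = energy_per_render_wh(power_watts=90, render_seconds=600 * 1.35)  # ~20.3 Wh

print(f"3600:  {r5_3600:.1f} Wh per render")
print(f"9600K: {i5_9600k:.1f} Wh per render")
print(f"The 3600 uses about {100 * (1 - r5_3600 / i5_9600k):.0f}% less energy for the same job")
```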

Add in the fact that the 3600 includes a cooler in the box, just works with no overclocking required, has a superior platform, and offers future upgradeability, and I don't see a single reason why people would buy a 9600K over a 3600.

Remember TechSpot's revisit of the 7700K three years later, and how it lost roughly 20% off the top of its gaming performance? I expect the same to happen to Intel's 9600K and a few of their other 9000-series processors as well. If the above reasons were not enough to put it at Bulldozer level, taking a nosedive three years from now in the only metric where they do match the Ryzen CPUs, gaming performance, certainly will. Six threads is barely enough now, let alone in the future.
 
What's so solid about another generation without PCIe 4.0?

You've been able to buy an AMD board with Wi-Fi 6, 2.5G Ethernet and PCIe 4.0 for a few months now.

M.2 drives for PCIe 4.0 already take very good advantage of the faster bus. And rumor has it, so will the new Nvidia cards due to be announced soon. Who will be buying Intel then?
LOL, GPUs really aren't going to use anywhere near the bandwidth PCIe 3.0 x8 already offers, so do you really think it matters that much? The only thing the manufacturers are doing is claiming they are compatible with the PCIe 4.0 standard. I ran many games and applications on my system and never found a gain in performance, whether I put the card in an x8 slot or an x16 slot; hell, even an old motherboard with PCIe 2.0 slots runs every game exactly the same and shows no performance loss at all (I only tested an x16 slot there). The only difference that made a dent in performance was the somewhat lower clock speed of that CPU, and even that was minor.
However, when you use pro GPUs for compute workloads you might see a small dip in performance.
But hardly anyone uses those cards in a normal gaming PC, especially because they are priced at a totally different level.
I see minor improvements going to PCIe 4.0 SSDs, but on the other hand these first products used much the same controller, so maybe we'll see a bigger difference in the future. However, unless you put a very heavy workload on these drives, you will never see an increase outside of synthetic benchmarks showing fantastic ***** numbers you're never going to reach in normal usage.
Fact is, my brand-new PCIe 4.0 SSD is at exactly the same speed as its PCIe 3.0 sibling and can't even load the simplest programs faster than any previous product.
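For context, the raw numbers behind both sides of this argument look roughly like the sketch below (approximate per-lane figures after encoding overhead; the workload comments are my own assumptions):

```python
# Approximate one-direction PCIe throughput per lane, after encoding overhead (GB/s).
PER_LANE_GBPS = {"2.0": 0.5, "3.0": 0.985, "4.0": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Approximate usable bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

for gen, lanes in [("2.0", 16), ("3.0", 8), ("3.0", 16), ("4.0", 4), ("4.0", 16)]:
    print(f"PCIe {gen} x{lanes}: ~{link_bandwidth(gen, lanes):.1f} GB/s")

# PCIe 3.0 x8 (~7.9 GB/s) already exceeds what most games stream across the bus,
# which is why moving a GPU between x8 and x16 slots rarely changes frame rates.
# A PCIe 4.0 x4 NVMe link (~7.9 GB/s) doubles the 3.0 x4 ceiling (~3.9 GB/s),
# but only sustained sequential workloads get anywhere near saturating it.
```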
 
What are you on about? Latest rumour circulating is that it'll hit 5.5 GHz on air!! All core OC!! Just wait and see!! ;)
Sure, it'll hit those speeds for a few seconds but after that it'll have to throttle to the freakin' basement to get the temps under control again. The 14nm process that Intel is still using is way past old and it shows in how freakin' hot these things run when you put them under any kind of real load.

I have an 8700K with an AIO, and when I really start to push things, temps can rise to 75°C, which isn't exactly what I would consider cool. And just think: that 8700K is a six-core CPU. Add four more to make a ten-core processor and, holy hell, man... I don't even want to think what it's going to take to cool that sucker when you really ramp up the load on it.
 
What are you on about? Latest rumour circulating is that it'll hit 5.5 GHz on air!! All core OC!! Just wait and see!! ;)
lol, really, you clearly have quite an imagination; these CPUs are not even on the market and you assume they will clock that high.
The tech is running at its limits; those high clocks can't be used all day long unless you're that one extremely lucky guy. In my country it's impossible to get a good-clocking CPU; they are all binned throughout the whole supply chain, so to get one you need deep pockets. I saw one guy offering his 5.3 GHz i7 and he refused a €1,700 bid.
Those are ***** prices compared to the €350 for the normal version.
Anyway, I saw this morning that he sold it, so apparently people do buy these at such insane prices.
 
Nope. Intel's chips are power hogs and not competitive on very heavily multithreaded apps, but those are not what the vast majority of users need or use. Intel is competitive/superior at gaming and lightly threaded tasks though the margin is small, below what the vast majority of users will notice.
Actually, looking across their whole new line of offerings, the i3 series looks to be the best "bargain" (**) of the bunch. With a quite decent IGP, good clock speeds, and moderate TDP, they look like an HTPC or home office workhorse. But yes, I'm sure they'll suck at "Crysis", or whatever meme game the kidz are using as a yardstick these days. But I've checked the temps on all(?) my current Intel systems, and in what I call "normal use", with aftermarket air coolers, they average about 30 to 35°C.

With that said, they can't be miserable "power hogs". Although, I will say I bought a twin-fan, uber-clocked GTX 1050 Ti "FTW" which I haven't actually gotten hot enough to even start those fans. Who knows, they may be broken, but the card hasn't gone up in smoke yet, so who cares?

Bulldozer was not competitive at practically anything, and users should notice the difference. Though I wonder: if you put a user in front of a similarly configured FX and a Sandy Bridge-to-Haswell i5, doing typical office tasks, would they actually notice?
That was a rhetorical question, wasn't it?

(**) Yes, this does boldly assume you're open-minded enough to call any Intel offering "a bargain".
 
Actually, looking across their whole new line of offerings, the i3 series looks to be the best "bargain" (**) of the bunch. With a quite decent IGP, good clock speeds, and moderate TDP, they look like an HTPC or home office workhorse. But yes, I'm sure they'll suck at "Crysis", or whatever meme game the kidz are using as a yardstick these days. But I've checked the temps on all(?) my current Intel systems, and in what I call "normal use", with aftermarket air coolers, they average about 30 to 35°C.

With that said, they can't be miserable "power hogs". Although, I will say I bought a twin-fan, uber-clocked GTX 1050 Ti "FTW" which I haven't actually gotten hot enough to even start those fans. Who knows, they may be broken, but the card hasn't gone up in smoke yet, so who cares?

That was a rhetorical question, wasn't it?

(**) Yes, this does boldly assume you're open-minded enough to call any Intel offering "a bargain".
Intel chips are power hogs; everyone seems to forget that Intel and AMD specify TDP in different ways. It's like comparing apples with oranges.
I've noticed this several times. Intel has (or had) more effective power control, but when I crank up the load I see that the Intel machine, which should use less power since it claims a 95 W TDP, actually uses more, especially if you crank up the speed by overclocking.
I see people claiming all kinds of nonsense, but if you have your machine connected to a power meter you can clearly see the higher power draw. As I said, I tried to OC my i7 to 4.5 GHz and had to clock it back down because a) it became hot as hell, 99°C with a quad-fan water-cooling setup, and b) it burned insanely high power to get above its normal 4.2 GHz clock.
Over a two-year period I bought a few hundred of the same CPU from different shops and none of them ever reached 4.7 GHz. Meanwhile, hundreds of people claim they clocked theirs to 5.2 or 5.3 GHz, but when I said I'd buy it and wanted proof, the excuses started: they had another motherboard, or different RAM, or had just sold their machine.
None of them ever showed it running at the claimed speed. In most cases I was looking at pictures downloaded from the internet, often ones I had seen before.
Anyway, nobody has ever shown me a highly overclocked Intel CPU that wasn't a power-hungry monster; the ones that were overclocked often reached 4.5 GHz and burned a lot of power. Compared to normal usage it's of course no problem, but acting like they use little power is plain nonsense.
 
Sure, it'll hit those speeds for a few seconds but after that it'll have to throttle to the freakin' basement to get the temps under control again. The 14nm process that Intel is still using is way past old and it shows in how freakin' hot these things run when you put them under any kind of real load.
...

lol, really, you clearly have quite an imagination; these CPUs are not even on the market and you assume they will clock that high.
...


Oops, I guess I forgot to use the sarcasm font. ;) You guys must not have seen many of my other posts. I was just repeating someone else's rhetoric.

I totally agree with you guys: anyone believing or hoping these will work at all with anything other than the very best air cooler, or better, even at stock frequencies, is clearly not thinking straight.

We may even see a situation similar to Zen 2, where just because one core can hit X frequency for a second or two doesn't mean you'll be able to OC all cores to that frequency. I have a feeling Intel will be taking that one from AMD's book to really try to squeeze a bit more clock speed out of 14nm, hence why we might see 5.3 if this leak is accurate (I have seen some conflicting info on the clock speeds already).

My prediction is that for an all-core OC on the 10900K, clocks will be no higher than what you can get with a 9900K(S), but you'll have two extra cores' worth of massive heat output to contend with. A 360 AIO would be the minimum; a custom loop recommended. Intel should never have made the 10-core; let AMD have these few rounds for heavy workloads and focus on 6- and 8-core gaming chips.

With a 95 W 9900K able to pull >200 W at the socket in some loads, a 125 W, higher-clocked, 10-core 10900K will likely be able to pull close to 300 W under specific AVX loads. Yikes. OK for gaming, but don't expect to use all those cores without hitting thermal limits unless you have serious cooling capabilities.
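The arithmetic behind that guess is crude but easy to follow; every number below is an assumption for illustration, not a measured 10900K figure:

```python
# Back-of-envelope estimate of all-core package power (assumed figures, not measurements).
# Starting point: a "95 W" 9900K has been observed pulling >200 W at the socket under heavy load.
measured_9900k_watts = 200
cores_9900k = 8
per_core_watts = measured_9900k_watts / cores_9900k     # ~25 W per core under AVX load

# Assume the 10-core part runs slightly higher clocks/voltage, adding ~10-20% per core.
cores_10900k = 10
estimate_low = cores_10900k * per_core_watts * 1.10     # ~275 W
estimate_high = cores_10900k * per_core_watts * 1.20    # ~300 W

print(f"Estimated 10900K all-core AVX package power: ~{estimate_low:.0f}-{estimate_high:.0f} W")
```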
 
Hahaha, next time make sure to mark the sarcasm or we'll jump on you again ;)
Most people here are tech junkies, but I'm often on other forums where people constantly say things that make my hair stand straight up ;)
Claiming things as fact that are actually big fables or fantasy.
Most of those people are just brand junkies and have no clue what they are talking about :).
 
Actually, looking across their whole new line of offerings, the i3 series looks to be the best "bargain" (**) of the bunch. With a quite decent IGP, good clock speeds, and moderate TDP, they look like an HTPC or home office workhorse. But yes, I'm sure they'll suck at "Crysis", or whatever meme game the kidz are using as a yardstick these days. But I've checked the temps on all(?) my current Intel systems, and in what I call "normal use", with aftermarket air coolers, they average about 30 to 35°C.

With that said, they can't be miserable "power hogs".

Heh, I'm using a Core i5-8400 for most of my home use for the very reason that it's power efficient, so that's me not taking everything into account. Six cores at a 3.8 GHz all-core Turbo is well within Intel's efficiency range, as it uses 63 W at 56°C with a Hyper 212 at 100% CPU use in Handbrake.

It's when you run 6- and 8-core Intel CPUs over 4.5 GHz all-core that they become power hogs. AMD keeps it below 4.5 GHz all-core and outdoes Intel on IPC with Zen 2 to maintain per-core parity, and then exceeds Intel with core count.
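The reason the curve turns ugly past roughly 4.5 GHz is that dynamic CPU power scales roughly with frequency times voltage squared, and voltage has to climb along with clock speed. A rough sketch with assumed voltage points:

```python
# Why power climbs so steeply past ~4.5 GHz: dynamic CPU power scales roughly as C * V^2 * f,
# and higher clocks need higher voltage, so the two effects compound.
# The frequency/voltage pairs below are illustrative assumptions, not measured Intel or AMD data.

def relative_power(freq_ghz: float, volts: float, base_freq: float = 3.8, base_volts: float = 1.00) -> float:
    """Dynamic power relative to a baseline operating point, assuming constant capacitance."""
    return (freq_ghz / base_freq) * (volts / base_volts) ** 2

for freq, volts in [(3.8, 1.00), (4.5, 1.15), (5.0, 1.30), (5.3, 1.40)]:
    print(f"{freq:.1f} GHz @ {volts:.2f} V -> ~{relative_power(freq, volts):.2f}x baseline power")
```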
 
My prediction is that for an all-core OC on the 10900K, clocks will be no higher than what you can get with a 9900K(S), but you'll have two extra cores' worth of massive heat output to contend with. A 360 AIO would be the minimum; a custom loop recommended. Intel should never have made the 10-core; let AMD have these few rounds for heavy workloads and focus on 6- and 8-core gaming chips.

That is completely sensible and exactly what Intel should do, but unfortunately from the Intel boardroom's perspective, not marketable. The company heads need some halo product they can crow about and the 10c20t chip shows "progress" and will maintain the small lead in certain gaming circumstances.

Now you might think that we could just buy the 8c16t or 6c12t and OC that to get the gaming edge, *but* the best silicon is now reserved for the 10C chip. So the 8C and 6C will be getting lower quality stuff and therefore may not OC as high as the 9900K or 8700K.

This is apparently already happening: after the 9900KS came out, the best silicon was reserved for those chips, so when you buy a 9900K now you do not get the same OC potential. Basically, the 9900KS is Intel doing Silicon Lottery's job and leaving the junk 9900K chips for the regular consumer.
 
This is apparently already happening: after the 9900KS came out, the best silicon was reserved for those chips, so when you buy a 9900K now you do not get the same OC potential. Basically, the 9900KS is Intel doing Silicon Lottery's job and leaving the junk 9900K chips for the regular consumer.
So in other words, unless you're willing to pay the big bucks for their ultra-premium silicon, Intel doesn't care about you. Got it. Meanwhile, you have AMD where they're firing on all cylinders and giving value and performance across the board.
 
Intel chips are power hogs; everyone seems to forget that Intel and AMD specify TDP in different ways. It's like comparing apples with oranges.
I've noticed this several times. Intel has (or had) more effective power control, but when I crank up the load I see that the Intel machine, which should use less power since it claims a 95 W TDP, actually uses more, especially if you crank up the speed by overclocking.
I see people claiming all kinds of nonsense, but if you have your machine connected to a power meter you can clearly see the higher power draw. As I said, I tried to OC my i7 to 4.5 GHz and had to clock it back down because a) it became hot as hell, 99°C with a quad-fan water-cooling setup, and b) it burned insanely high power to get above its normal 4.2 GHz clock.
Over a two-year period I bought a few hundred of the same CPU from different shops and none of them ever reached 4.7 GHz. Meanwhile, hundreds of people claim they clocked theirs to 5.2 or 5.3 GHz, but when I said I'd buy it and wanted proof, the excuses started: they had another motherboard, or different RAM, or had just sold their machine.
None of them ever showed it running at the claimed speed. In most cases I was looking at pictures downloaded from the internet, often ones I had seen before.
Anyway, nobody has ever shown me a highly overclocked Intel CPU that wasn't a power-hungry monster; the ones that were overclocked often reached 4.5 GHz and burned a lot of power. Compared to normal usage it's of course no problem, but acting like they use little power is plain nonsense.
If you're going to quote me, please respond to what I said, which was a reference to the new i3 lineup. This is my 14th year here, and I'm not a gamer. So I'm acutely aware of all the "facts" you're putting out. They simply don't pertain to my usage, nor to a great many others'.

But try to keep in mind that, as workaday machines, the new i3 CPUs will do just fine. I have no doubt they'll run Photoshop perfectly well, at sensible power usage and temperatures.

If you want to compare a stock air-cooled i3 or i5 at stock speed to some overclocked monstrosity, be my guest, but please don't try to justify it to me.

As for your comparison in general, "apples and oranges" doesn't even begin to encompass the magnitude of your folly.
 
....[ ]....It's when you run 6- and 8-core Intel CPUs over 4.5 GHz all-core that they become power hogs. AMD keeps it below 4.5 GHz all-core and outdoes Intel on IPC with Zen 2 to maintain per-core parity, and then exceeds Intel with core count.....[ ]..
You know, when Intel released the Core 2 Duo E6300, they almost put AMD out of business. In fact, AMD moved to smaller quarters, and at the time couldn't even afford to have the landscaping done at their offices. So they "punted", in a manner of speaking, and put out a bunch of crap in the meantime. A wild guess is that their "dry spell" lasted almost a decade.

And now they're back, kudos to them!

I built my last system around an i5-6600K and a Gigabyte Z170 board. Got both pieces on closeout for $240.00. I'm quite happy with it. It runs my last copy of Win 7 Pro, with updates completely shut down.

As far as Intel vs. AMD superiority goes, all of human history tells us that empires rise and fall, and that is mirrored in corporate power structures.

As far as any new machine goes, I might still use an Intel CPU, a 10 nm Intel CPU, that is. :rolleyes:

Maybe I'll have to wait until after 14 nm generation 69, and get over my repulsion to Windows 10 in the meantime. I hope I live that long.
 
...
Now you might think that we could just buy the 8c16t or 6c12t and OC that to get the gaming edge, *but* the best silicon is now reserved for the 10C chip. So the 8C and 6C will be getting lower quality stuff and therefore may not OC as high as the 9900K or 8700K.

This is apparently already happening: after the 9900KS came out, the best silicon was reserved for those chips, so when you buy a 9900K now you do not get the same OC potential. Basically, the 9900KS is Intel doing Silicon Lottery's job and leaving the junk 9900K chips for the regular consumer.

Hmmm ... yes indeed.

Two things on that point ... Unfortunately, AMD already does this and Intel will just be copying them, for whatever reason ... Why the hell can we not have an 8-core Ryzen that single-core boosts to 4.7, has a base of like 4.0, and an all-core boost of 4.25? There is Ryzen silicon good enough to do this without creating a fire, but it all went to the 3950X and maybe TR.

Second point is we already saw that with the 9900K and 9900KS ... pre-9900KS, almost all 9900Ks could all-core OC to 5.0 easily ... 5.1 with great cooling. Post-9900KS, a lot fewer 9900Ks actually can.

In fact, the YouTuber "Bitwit" has a video complaining that his older 9900K was a better bin than his shiny new 9900KS; it could reach an all-core 5.0 with lower voltages and 5.1 stable, while his 9900KS couldn't really do 5.1 100% stable.

And by stable I don't mean a screenshot of CPU-Z; I mean 20 minutes minimum of wPrime on all threads, and an hour would be better. It bugs me when people see a screenshot of some clock speed in CPU-Z and then claim it overclocks to whatever it said, on whatever cooling they had. Big difference between "stable" and "bootable" or "gameable".
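To make that bar concrete, "stable" means something like the sketch below: every logical thread saturated for a sustained run, not a momentary screenshot. This is a generic Python load loop standing in for wPrime or any real stress test, nothing more:

```python
# A minimal, generic all-thread load generator: one busy worker per logical CPU for a fixed
# duration. This only illustrates the idea of a sustained all-core load, standing in for
# wPrime or any real stress test; it does no error checking or voltage monitoring.
import multiprocessing as mp
import time

def burn(stop_at: float) -> None:
    """Spin on integer math until the deadline passes."""
    x = 0
    while time.time() < stop_at:
        x = (x * 31 + 7) % 1_000_003

if __name__ == "__main__":
    duration_s = 20 * 60                       # 20 minutes minimum; an hour is better
    deadline = time.time() + duration_s
    workers = [mp.Process(target=burn, args=(deadline,)) for _ in range(mp.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print("Sustained all-thread load completed without a crash or reset.")
```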
 