Intel 12th-gen Core CPUs are official: Performance preview, Alder Lake models and specs

Well, the 3080 was the first flagship that showed a major difference between PCIe 2.0 x16 and 3.0 x16 (a 5% difference), and the last 2.0 platform was Sandy Bridge in 2011. So I think you'll be fine.
If a 6600XT can be gimped by x8 PCIe3, then a card more than twice as powerful can be gimped by x16 PCIe3, and I would hope that in 3 years' time or so we are getting double the performance of a lowly 6600XT!

If you have PCIe3 now, I do think you might run into some problems with that in a couple of GPU generations' time.
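For reference on the bandwidth arithmetic: each PCIe generation doubles per-lane throughput, so PCIe 4.0 x8 has the same ceiling as PCIe 3.0 x16, and an x8 card on a PCIe 3.0 board runs at half that. A quick sketch using nominal per-lane figures (one direction, ignoring protocol overhead):

```python
# Approximate one-direction PCIe bandwidth per lane, in GB/s.
PER_LANE_GBPS = {"2.0": 0.500, "3.0": 0.985, "4.0": 1.969, "5.0": 3.938}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Nominal link bandwidth in GB/s for a given generation and lane count."""
    return PER_LANE_GBPS[gen] * lanes

for gen, lanes in [("2.0", 16), ("3.0", 8), ("3.0", 16), ("4.0", 8), ("4.0", 16)]:
    print(f"PCIe {gen} x{lanes}: ~{link_bandwidth(gen, lanes):.1f} GB/s")
```

That is why an x8 card like the 6600XT loses more on a PCIe 3.0 board: it only gets ~7.9 GB/s instead of the ~15.8 GB/s a full x16 PCIe 3.0 link provides.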
 
OK, just so we're clear: "advertising" and "integrity" are antonyms, NOT synonyms. (And most times it doesn't even matter who's doing the talking.)
I can be a cynical dude but I must not be captain-level cranky yet because I do interpret most things I hear based on who's doing the talking.

I'm certainly with you as far as general principles of caveat emptor, be suspicious, the ad isn't meant to fully inform you, etc etc.

Ads are for the benefit of the seller, but even so the most effective ones usually emphasize the real selling points. That's especially true in B2B advertising and relationships. If Intel has a good supply of a good performer at an attractive price, they've got a good story to work with. I'm surprised they'd want to play games by making a chart showing performance gains they know are about to evaporate before most customers can even buy the chip, especially when the primary audience for that chart is savvy tech reviewers who are absolutely going to do their own testing anyway. It just feels like the cheap stunt of a hack, and should have been beneath the dignity of a long-term market leader with a lot of real assets.
 
OK Sausagemeat, here's my praise and thanks for Intel getting back in the game and hopefully turning price competition mode back on. Genuinely happy to see it, although I'm not planning on throwing out my 5950X just yet. Even if they're just another source of supply for a desirable part, I'm grateful for that too.

Also, here's my hat tip to the tin foil hat crowd who noted the Windows 11 AMD performance bug could coincide nicely with Intel Alder Lake reviews. I thought that was a stretch, but sure enough here's Intel publishing graphs taking advantage of it and apparently not even acknowledging the short-lived nature of the issue or that it is now fixed. Even if it's an intentional gambit I still don't think it will matter -- any good reviewer will do their own investigation not based on the bug and the target market for this chip will see those reviews -- but it does make me think less of Intel's integrity to publish these graphs knowing full well they are not reflective of actual silicon performance differential. And if I were an AMD lawyer I might want to send a letter to Microsoft & Intel asking them to confirm the AMD bug was not created on purpose for this stunt and to update the materials.
That was you talking to the tin foil hat crowd when you suggested a conspiracy of your own?

Um....
 
The best news from this is the PRICE. I'm glad to see some pressure on AMD which will eventually circle back onto Intel when AMD comes out with their next CPU.

Overclocking the higher-end Intel CPUs - I doubt there's much to be gained. An overclocked, fully loaded CPU running all out 24/7 is certain to involve some high temperatures and loud fans fighting to keep the heat in check. I doubt the new CPU is thermally a whole lot different from my 10850K.
 
If you don't mind waiting until 2022 Q1, the new AMD processor is going to be faster. I'd also take these benchmarks with a grain of salt, as they were done comparing it to a nerfed AMD CPU running 15% slower.

"Intel shows showing the 12900K being anywhere from slightly slower than the Ryzen 9 5950X, to being 30 percent faster. However, Intel does admit that these benchmarks were captured on Windows 11 before the performance patch for AMD CPUs was available, so the results aren’t as meaningful as they would have been had they tested the 5950X in its best performing mode, such as using Windows 10 or waiting for the patch to be available. "

https://www.guru3d.com/news-story/a...n-4-later-that-year-with-pcie-5-and-ddr5.html
He was on the same CPU for 7 years.
Nuff said.
 
Not buying Intel ever again. Not supporting the corp that brought us tech stagnation for 10 years, the company that would gladly still sell mainstream customers single-core and dual-core CPUs if they could get away with it, and the company that put in my Haswell CPU the absolute worst TIM they could find on the market just so it wouldn't OC well, as a planned obsolescence feature, so the end user had to delid.

No matter what they do, I think Intel and their shady business practices and mafia-like manipulation of their partners (so they only buy Intel CPUs for their laptops and prebuilts) must be appropriately punished by the end customer so they will learn a lesson.
Stupid Intel. They should have done what AMD did. Sell nothing for 5 years until they were ready...

... but then you'd blame AMD for jacking their prices in the meantime.

Intel struggled hard with 10nm. You people need to stop getting your info from bad journalism and comments.
 
Are the Gracemont 'efficient' cores there because the power draw is so high overall and they help boost performance per watt? Or because Intel is going cheap on us by using overstock Atom parts? Or are they a small step in single-core performance to string consumers along while remaining competitive enough? I guess I don't understand the benefits of making the architecture more complex when it's not exactly needed for desktop CPUs. Gracemont seems like a laptop CPU core.
 
RIP Ryzen 5000 lol. I’ve had my 4790K for 7 years and I genuinely thought my next CPU would be a Ryzen part with the way Intel were going. But by the looks of this I will almost certainly be getting a 12600K.

I look forward to all the AMD fans praising Intel for forcing AMD to lower their prices as they undoubtedly will after this launch.
Are you aware of the fact that the patch wasn't installed for the Ryzens? :D
 
Sucks for anyone living in CA. They probably won't be sold there, since that law passed not allowing Alienware rigs to be sold because they eat up too much energy.
 
Seems odd to lead a complaint about exaggerated benchmarks with a completely unfounded opinion about the exact same thing. Especially when this was noted in the analysis:

"...at no point during Intel’s presentation did they compare 12th-gen CPUs to AMD’s competitors in productivity workloads, and made no claims about being the world’s fastest chip for productivity, like they did for gaming. This suggests that Intel are unlikely to beat AMD in productivity."

Well, it's really not odd when both contradictory arguments are in favor of the same brand...

Aside from the leaked benchmarks that are out showing the 12600K/12700K crushing the 5600X and 5800X, it is not hard to make an educated guess that Alder Lake will be faster, even without a single benchmark.

Intel does not need a big boost over the 11900K to surpass the 5800X in productivity... We know the 12700K has 8 big cores (and we can safely assume these are more powerful than the 8 cores on the 11900K)... Now add 4 E-cores, which will boost the multi-core score even more... And we can easily conclude that the 12700K will be faster.

Even if the P-cores have barely any performance boost over Rocket Lake, the 4 extra cores in the 12700K alone should put it above the 5800X... The 12700K will be faster even if you go by the worst estimates.

I don't care about Intel or AMD slides, and you don't need slides to know which one is faster. It is not hard to figure out that the 12700K will be faster than the 5800X overall in multi-threaded and productivity workloads.
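As a rough illustration of that back-of-envelope logic, here's a minimal sketch. The per-core weights are pure assumptions for illustration (treating a Golden Cove P-core as slightly faster than a Zen 3 core and a Gracemont E-core as roughly half of one); they are not benchmark numbers:

```python
# Back-of-envelope multi-threaded throughput estimate.
# All per-core weights are illustrative assumptions, not measurements.

ZEN3_CORE = 1.00  # baseline: one Zen 3 core
P_CORE    = 1.05  # assumed Golden Cove throughput relative to Zen 3
E_CORE    = 0.50  # assumed Gracemont throughput relative to Zen 3

ryzen_5800x = 8 * ZEN3_CORE                # 8 Zen 3 cores
i7_12700k   = 8 * P_CORE + 4 * E_CORE      # 8 P-cores + 4 E-cores

print(f"5800X relative MT throughput:  {ryzen_5800x:.1f}")
print(f"12700K relative MT throughput: {i7_12700k:.1f}")
print(f"12700K advantage: {(i7_12700k / ryzen_5800x - 1):.0%}")
```

Even if you zero out the assumed P-core advantage (P_CORE = 1.00), the four E-cores alone put the estimate roughly 25% ahead, which is the point being made above.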
 
Are the Gracemont 'efficient' cores there because the power draw is so high overall and they help boost performance per watt? Or because Intel is going cheap on us by using overstock Atom parts? Or are they a small step in single-core performance to string consumers along while remaining competitive enough? I guess I don't understand the benefits of making the architecture more complex when it's not exactly needed for desktop CPUs. Gracemont seems like a laptop CPU core.

These Gracemont cores supposedly have IPC equivalent to Skylake cores, the architecture that was Intel's flagship from 2016 to 2021. That's as recently as earlier this year.

I've been casually following Atom-class core performance, and the most recent ones were similar in IPC to Core 2 Duos, so whatever Intel's done to this current version has clearly resulted in big gains. IMO calling them 'LITTLE' is only relevant in comparison to the big cores as, if we can believe Intel, in terms of performance these are merely downclocked versions of their previous flagship architecture.

IMO well worth including in the CPU for a lot more performance while keeping power use relatively low, even in parallel with the current big cores. I wonder how the whole package will do in terms of efficiency with a bit of an undervolt and a slight underclock.
 
... I guess I don't understand the benefits of making the architecture more complex when it's not exactly needed for desktop CPUs. Gracemont seems like a laptop cpu core
The same for me; S|A also asks the question: why is big.LITTLE needed on desktop? One guess is that they want to have 16 cores on desktop to match AMD, but it's not possible to place them on a single die (even at 10 nm), so... make it 8+8, with 8 useless E-cores. Why not make it a simple 10-12 P-cores, with no need for Thread Director, Win 11, and other BS? I'm afraid that on Win 10 (or another OS without TD software support) actual performance could degrade if E-cores are enabled in the BIOS. Disable them and you have a normal 8-core CPU...
 
The same for me; S|A also asks the question: why is big.LITTLE needed on desktop? One guess is that they want to have 16 cores on desktop to match AMD, but it's not possible to place them on a single die (even at 10 nm), so... make it 8+8, with 8 useless E-cores. Why not make it a simple 10-12 P-cores, with no need for Thread Director, Win 11, and other BS? I'm afraid that on Win 10 (or another OS without TD software support) actual performance could degrade if E-cores are enabled in the BIOS. Disable them and you have a normal 8-core CPU...
From my point of view, why have 16 cores? Most users don't need that and would probably benefit more from having a CPU which can use its efficiency cores to deliver more power efficiency. At least that's the idea.

It looks to me like the 12900K isn't trying to directly compete with a 5950X; it's quite a bit cheaper, and Intel have made no mention of it beating the 5950X in productivity. It looks like they are going after the 5900X, 5800X and the 5600X with these products, with the 12600K beating the 5600X on price and performance and the 12700K and 12900K beating the 5800X and 5900X in performance for a slight premium.

Of course, this architecture could massively benefit laptop users when it comes to power consumption, and they can't release 2 different architectures, one for laptops and one for desktops. The laptops drive the market, so I guess that's why we have E-cores on the desktop.
 
My YouTube streaming PC is a Core i7 5960X with 32GB DDR4 and a 3090 FTW3.
It runs everything: Cyberpunk, Far Cry 6, etc. at max settings.
I definitely want to upgrade to a 12th, 13th or 14th generation - although I want an all-new DDR5 motherboard - but considering my 5960X is doing its job so well, my thought is that games really aren't very demanding right now. An 8-core CPU - even a Ryzen mobile 8-core - seems to be able to run games fine. The GPU seems to be the limiting factor.

RAM, not so much a limiting factor. Once you have 16GB it seems most games don't demand more.
Depends on how much you value high-refresh gaming, like 144fps or higher. Then you start running into CPU limitations.
 
If a 6600XT can be gimped by x8 PCIe3, then a card more than twice as powerful can be gimped by x16 PCIe3, and I would hope that in 3 years' time or so we are getting double the performance of a lowly 6600XT!

If you have PCIe3 now, I do think you might run into some problems with that in a couple of GPU generations' time.
The 6600XT only gets gimped in situations where VRAM is insufficient. Apples-to-oranges comparison.

Unless you buy a card with insufficient VRAM, it's not going to be an issue.

In 2011 the 580 was the fastest GPU on the planet. It took from the GeForce 580 to the 3080 for 2.0 to finally be an issue. You REALLY think that we're gonna see the same explosive growth in the next 3 years? That is beyond optimistic.
 
From my point of view, why have 16 cores? Most users don't need that and would probably benefit more from having a CPU which can use its efficiency cores to deliver more power efficiency. At least that's the idea.
....

Efficiency cores make no sense on desktop (and very little sense on laptop): you get only a small benefit in power, like 5-10 W at idle or light load. Even if you run your PC 24/7, I doubt you'd notice it. And this comes at the cost of high scheduling complexity; designing a proper scheduler for big.LITTLE is not an easy task: Samsung still has trouble with it, Qualcomm did a better job.
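For what it's worth, on an OS without Thread Director support you wouldn't necessarily have to disable the E-cores in the BIOS; you could pin a latency-sensitive process to the P-cores by hand. A minimal sketch using psutil, assuming the commonly reported Alder Lake enumeration where logical CPUs 0-15 are the 8 hyper-threaded P-cores and 16-23 are the E-cores (that layout is an assumption; check your own machine's topology):

```python
# Restrict the current process to the P-cores only.
# ASSUMPTION: logical CPUs 0-15 are the 8 hyper-threaded P-cores and
# 16-23 are the 8 E-cores; verify against your own topology first.
import psutil

P_CORE_CPUS = list(range(16))   # logical CPUs 0..15

proc = psutil.Process()         # the current process
proc.cpu_affinity(P_CORE_CPUS)  # scheduler may now only use the P-cores
print("Restricted to logical CPUs:", proc.cpu_affinity())
```

Obviously a manual stopgap rather than a substitute for a scheduler that actually understands the hybrid topology, which is the complaint above.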
 
Pricing is good, especially for the i5 and i7.

AMD fanboys now suddenly care about gaming only? Even though most of these CPU-limited gaming tests (using the best GPU and running at a low resolution) don't have any actual benefit for gamers in the real world.

What about 3D rendering, Photoshop, compression...? Nobody cares about that anymore? The i5 and i7 will easily crush the 5600X and 5800X in these.
Two problems here:

- Actual retail prices are a good bit higher than Intel's official prices.

Here's Newegg, and they are cheaper than Microcenter:

12900K / KF - $650 / no price given
12700K / KF - $450 / $430
12600K / KF - $320 / $300

- Those who cared about MT performance are already on AM4, so switching to Alder Lake does not make sense for them.

Either way, we'll have to wait for third-party reviews to see how good Alder Lake's MT performance actually is, particularly when factoring power consumption in.
 
The 6600XT only gets gimped in situations where VRAM is insufficient. Apples-to-oranges comparison.

Unless you buy a card with insufficient VRAM, it's not going to be an issue.

In 2011 the 580 was the fastest GPU on the planet. It took from the GeForce 580 to the 3080 for 2.0 to finally be an issue. You REALLY think that we're gonna see the same explosive growth in the next 3 years? That is beyond optimistic.
The truth is we don't know. We all said PCIe4 was irrelevant, and then AMD rolled out an x8 card. You wouldn't be too happy about that if you, say, purchased an i5-10400 over a Ryzen 3600, for example. There may be more x8 cards in the future.

I definitely think PCIe4 is something buyers should look for over PCIe3 today. I'm not as sure about PCIe5, but it always makes sense to have the latest spec, just in case.
 
The truth is we don't know. We all said PCIe4 was irrelevant, and then AMD rolled out an x8 card. You wouldn't be too happy about that if you, say, purchased an i5-10400 over a Ryzen 3600, for example. There may be more x8 cards in the future.

I definitely think PCIe4 is something buyers should look for over PCIe3 today. I'm not as sure about PCIe5, but it always makes sense to have the latest spec, just in case.
I think the first x8 card they rolled out was the RX 460.
 
I can be a cynical dude but I must not be captain-level cranky yet because I do interpret most things I hear based on who's doing the talking.
Well, I've been involved with so many of these AMD vs Intel threads that I've been keeping my opinion on the two products to myself. (And god only knows how much opinion, antagonism, accusation, hero worship, and self-assured self-righteousness is involved.)

I also watch broadcast TV and other ad-sponsored media sources. Right now, there's a trio of products being pushed the hardest: proprietary drugs, online gaming, and personal injury law firms (in no particular order). Literally a constant barrage of that bullsh!t. I have a humble degree in photography, and I know something about sociology and psychology.

I always refer back to the Wikipedia articles on propaganda, in order to keep them fresh in my mind while interpreting them against the other factors and inputs I'm working with.

Seen that way, for instance, I can ascertain that certain racial objectives, sexual orientations, chronic neuroses, heterosexual tensions (male/female), and human greed aren't merely being pandered to, but rather being reinforced as normal and healthy.

My last post didn't even scratch the surface of my cynicism.

In fact, I may treat myself to a copy of the latest psychiatric diagnostic manual (DSM-5) for my birthday. :rolleyes:

Hey, how about those new Intel processors, pretty snazzy, huh?
 
Intel is putting up a fight again. Pat Gelsinger is saying and doing smarter things. It's exciting times.
Given that the 12th-gen Alder Lake was conceived and developed LONG before Pat Gelsinger became CEO... I'm not sure how much of a difference he's made in his short tenure.
 
I'm so confused. We have 8 performance cores and 8 efficiency cores, and the power consumption tops out at 240W?? Assuming it does lose to the 5950X/5900X in multithreading, this definitely is not a win.
 
Intel is back at the top end but it has neglected the bread and butter. The 11th Gen did not have any i3s and it looks like the 12th Gen is heading the same way. So those budget users will be stuck on 10th Gen processors for the foreseeable future.

The transition to DDR5 will be no different from past experience. The initial DDR5 modules will be low-speed, high-latency modules until manufacturers fine-tune their designs and manufacturing. Remember, DDR4 started at 2133MHz and slowly progressed to 3200MHz for the 11th Gen CPUs. I would probably wait a year or two before faster DDR5 modules make the upgrade worthwhile.
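To put the latency point in concrete terms: a module's true CAS latency in nanoseconds is CL × 2000 / (transfer rate in MT/s), so a fast-sounding early DDR5 kit can respond more slowly than mature DDR4. A quick sketch (the DDR5-4800 CL40 spec is a typical early-module example, used purely for illustration):

```python
def cas_latency_ns(cl: int, mts: int) -> float:
    """True CAS latency in ns: CL cycles at a clock of (MT/s / 2) MHz."""
    return cl * 2000 / mts

# Mature DDR4 kit vs a typical early DDR5 kit (illustrative specs).
for name, cl, mts in [("DDR4-3200 CL16", 16, 3200), ("DDR5-4800 CL40", 40, 4800)]:
    print(f"{name}: {cas_latency_ns(cl, mts):.1f} ns")
```

By that measure the DDR4 kit answers in ~10 ns versus ~16.7 ns for the early DDR5 kit, even though the DDR5 kit's raw bandwidth is 50% higher.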

AMD made hay while the sun shone, and those Ryzen 5000 CPUs were priced so high that AMD's reputation as a budget CPU maker was smashed. Intel is now pricing the 12th Gen to ensure that AMD's next-gen processor prices don't go through the roof again. Competition is great for consumers.

Now, all Intel needs to do is update their Core i3, Pentium Gold and Celeron lines and continue to keep AMD out of these budget segments!
 
Given that the 12th-gen Alder Lake was conceived and developed LONG before Pat Gelsinger became CEO... I'm not sure how much of a difference he's made in his short tenure.
It's not Alder Lake. It's the thinking out loud in public, the opening of the Intel Kremlin by speaking frankly about the company's directions, ambitions and failures. The messaging was muddled before but makes sense now. And the pricing tells us they're not going to take it lying down.
 