Leak shows overclocking Intel's Core i9-12900K Alder Lake CPU to 5.2 GHz turns it into...

I'll just copy-paste this from somewhere else, because it makes sense:

Nice FAIL, Intel.
The more I see of this Alder Lake CPU, the less impressed I am.

Horrible efficiency, barely faster, a year later, very expensive (CPU + motherboard + DDR5), and it needs help from Win11 and DDR5 to get there. That's a lot of minuses... I don't like it one bit.

Also, does no one see this?
The 11900K scores higher than the 5950X in CPU-Z and other synthetic benchmarks, but gets beaten in gaming benchmarks.

So Alder Lake's actual lead in gaming benchmarks will be much smaller than it looks in these synthetics. Not to mention that if you take away Win11 and DDR5, I expect it won't win at all anymore.

And then comes Zen 3D. Yeah, Alder Lake is a FAIL for me, if these leaks are true.

Even if I were an Intel fanboy, I would skip this first gen and wait for Raptor Lake, which will have matured tech and process, better DDR5 and PCIe 5.0 prices, options and support, matured Win11 drivers, etc.
Everything looks better for Raptor Lake, not only because it's the next step, but because of external factors too. This Alder Lake is like Zen 1, but somehow even less impressive. Meh...

The only thing I like about Alder Lake is that it pushes AMD to do better and (hopefully) lower prices.
 
I know it might be something unpopular to say with some of you (and irrationally popular with others), but it makes sense that this big/little architecture move is just a desperate attempt to stay competitive with AMD before Intel is completely left in the dust. So while these chips might be competitive in raw performance, most PC users would probably be better off sticking with 5-10% less performance and 50% less power draw (or worse) on Zen 3, let alone Zen 3+ or Zen 4 in the near future.
I feel there are merits to using a big/little core config, especially having observed how it benefits ARM SoCs. While it is true that power consumption is less of a concern on desktop, I feel there is no need to waste power on light loads (a rough comparison is sketched below).
Having said that, I do agree that going with this big/little config is a way for Intel to try and catch up on core count with the competition, not just AMD. It is also a way to hide the high power consumption of the P-cores, especially in workloads that are not that heavy. If Alder Lake is pulling this much power, I can imagine how much more power hungry Raptor Lake will be after doubling the efficiency cores.
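To put a rough number on the "no need to waste power on light loads" point, here is a minimal back-of-the-envelope sketch. Every wattage in it is an invented placeholder for illustration, not a measured Alder Lake value:

```python
# Toy energy comparison: a light, all-day load on a P-core vs an E-core.
# All wattages are invented placeholders, NOT measured Alder Lake figures.

P_CORE_WATTS = 15.0   # assumed package draw with the light load on a P-core
E_CORE_WATTS = 5.0    # assumed package draw with the same load on an E-core
HOURS_PER_DAY = 8.0   # a working day of light background load

def daily_energy_wh(watts: float, hours: float) -> float:
    """Energy in watt-hours for a constant draw over the given hours."""
    return watts * hours

p_wh = daily_energy_wh(P_CORE_WATTS, HOURS_PER_DAY)
e_wh = daily_energy_wh(E_CORE_WATTS, HOURS_PER_DAY)
print(f"P-core day: {p_wh:.0f} Wh | E-core day: {e_wh:.0f} Wh | "
      f"saved: {p_wh - e_wh:.0f} Wh ({1 - e_wh / p_wh:.0%})")
```

On these made-up numbers the E-core day uses about a third of the energy; the real ratio depends entirely on the actual core power curves.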
 
Wait a year and buy Zen 4 or the new Intel then - this must be a stop-gap CPU.

Nah, this is brand-new tech - tech that needs to mature - meaning Raptor Lake is going to be insane (DDR5 matures, Windows 11 matures, and Intel uses a better fab for Raptor Lake too).

A stop-gap solution would be a Zen 3 refresh with 3D cache.

Raptor Lake vs Zen 4 is going to be fun. I'd put my money on Intel, but we'll see.
 
Like Ryzen - it took three gens to be worth buying.
 
It's funny how people act like AMD is miles ahead when it comes to power draw.

When you overclock a 5900X with PBO2 and unlimited power, it will easily draw 200+ watts at 4.7 GHz all-core if the motherboard VRMs are capable.

Even a 5800X can hit 200 watts, and a lot of people consider that a hot CPU, unless you run it at stock in a power-limited state.

However, you will never see this kind of power draw outside of synthetic testing/burn-in, and the same goes for Intel chips. My 9900K has been running at 5.2 GHz since 2018 and can hit ~250 watts, but in gaming, with the clock speed locked at 5.2, it's nowhere near that - more like half.

High-end GPUs can peak at 600+ watts today by comparison - both the 6900 XT and the 3090 can do it - so who cares if a CPU peaks at 150, 200, 250 or even 300 watts? Most people who overclock and turn off the limiters have good cooling anyway.
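As for why an all-core overclock balloons power draw the way this leak shows: to a first order, dynamic CPU power scales with voltage squared times frequency, so a clock bump that also needs a voltage bump compounds fast. A minimal sketch; the baseline and overclock numbers below are invented placeholders, not real 12900K or 5900X measurements:

```python
# First-order dynamic power scaling: P proportional to C * V^2 * f.
# All baseline/overclock numbers are invented for illustration only.

BASE_POWER_W  = 125.0  # assumed stock all-core package power
BASE_FREQ_GHZ = 4.3    # assumed stock all-core frequency
BASE_VOLT     = 1.10   # assumed stock core voltage

OC_FREQ_GHZ   = 5.2    # the all-core overclock from the leak
OC_VOLT       = 1.35   # assumed voltage needed to hold that clock

def scaled_power(p0: float, f0: float, v0: float, f1: float, v1: float) -> float:
    """Scale power linearly with frequency and quadratically with voltage."""
    return p0 * (f1 / f0) * (v1 / v0) ** 2

oc_power = scaled_power(BASE_POWER_W, BASE_FREQ_GHZ, BASE_VOLT, OC_FREQ_GHZ, OC_VOLT)
print(f"Estimated overclocked draw: ~{oc_power:.0f} W")
# ~21% more clock plus ~23% more voltage -> roughly 1.8x the power (~228 W here).
```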

If Alder Lake's performance is on point, it will be a success. If single-thread perf really improved 20-25%, then there's a good chance of seeing great real-world performance.

Raptor Lake will further improve on this: +25% IPC, a better node (Intel 4), DDR5 improved in latency/timings, and the hybrid-CPU issues sorted out (the Windows 11 scheduler and software in general).
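If those rumored numbers held up, the gains would compound across the two generations. A quick back-of-the-envelope using only the percentages above (rumors, not confirmed figures):

```python
# Compounding the rumored single-thread gains (rumors, not confirmed figures).
ADL_ST_GAIN  = 0.225  # midpoint of the rumored 20-25% Alder Lake uplift
RPL_IPC_GAIN = 0.25   # the rumored +25% Raptor Lake IPC

combined = (1 + ADL_ST_GAIN) * (1 + RPL_IPC_GAIN) - 1
print(f"Two-generation single-thread uplift: ~{combined:.0%}")  # ~53%
```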

I am not interested in Alder Lake, but I am looking forward to seeing real-world and especially gaming performance. It will give a hint of what Raptor Lake will bring.

However, I will probably get an Alder Lake laptop, since we upgrade our laptops yearly at work anyway. The hybrid CPU and Windows 11 should bring a nice bump in battery life, I expect. Most of my work is done over remote sessions, so I might be able to run on the efficiency cores all day long.
 
As though anyone will be running a 100% CPU load at 5.2 GHz continuously. Yes, this is bad, but it will never occur in real-life gaming or other apps. Also, don't OC - seriously, why bother? We need real-world loads at stock clocks to see how good or bad the new architecture performs.

Anyway, no matter how good Alder Lake is, I won't touch it; I'll wait for Raptor Lake vs Zen 4 late next year.

Apparently you don't know about distributed computing. I've been running DC projects for over 20 years, and they will max out every core at 100% utilization, 24/7. The faster you can clock your cores, the more work you get done.
 
The Ryzen 1000 and 2000 series were very average CPUs: poor latency, poor motherboards, and slower at gaming than my now 7-year-old 4790K. The 3000 series was half decent, at least competitive with Intel, and the 5000 series edged Intel out; it's the first series of AMD processors worth buying over its Intel competition since the Phenom. But it's only been 12 months and Intel already looks to be coming back. Quite impressive.

This article is quite amusing. An overclocked chip using lots of power? Well I never...

Power consumption is important, but it's not that important. Most of us would buy the faster part even if it used a lot more power. These are desktop CPUs; they plug into a wall, so you aren't limited on power. And at 300 W max load, half an hour or so of full tilt uses about the same energy as it takes to boil a kettle. So if it concerns you, drink less tea before you switch your CPU.
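The kettle comparison is easy to sanity-check, assuming a typical 1.7 L kettle heated from 20 °C to boiling with no heat loss (idealized numbers, just for scale):

```python
# Sanity check: energy to boil one kettle vs a CPU at full load.
# Assumes 1.7 litres heated from 20 C to 100 C with no losses.

LITRES        = 1.7
SPECIFIC_HEAT = 4186.0  # J/(kg*K) for water; 1 litre ~ 1 kg
DELTA_T       = 80.0    # 20 C -> 100 C

boil_kwh = LITRES * SPECIFIC_HEAT * DELTA_T / 3.6e6  # joules -> kWh

CPU_WATTS = 300.0
minutes = boil_kwh / (CPU_WATTS / 1000.0) * 60.0
print(f"One boil ~ {boil_kwh:.2f} kWh ~ {minutes:.0f} min of {CPU_WATTS:.0f} W load")
# ~0.16 kWh per boil, i.e. roughly half an hour of 300 W full load.
```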
 
I just wanted to point out that, even though the Ryzen 1000 and 2000 series were mediocre in principle, they single-handedly forced Intel to raise core counts. If AMD hadn't launched $200 6-cores, Intel would still be giving us $350 4-core CPUs.

AMD made productivity and heavy multitasking accessible to a large chunk of the market, and that's without even mentioning Threadripper. The Ryzen 1000 series was a 47% single-generation IPC lift - absolutely unheard of from Intel.

It's too early to claim Intel is winning with Alder Lake; Zen 3 with 3D cache is launching in Q1 2022 and is compatible with AM4.

The investment needed to benefit from Intel's Alder Lake platform, where you will need a brand-new motherboard as well, will make any gains much more expensive (if they have indeed finally caught up).

Give credit where it is due: AMD terraformed the CPU market despite being a fraction of Intel's size.

Don't dismiss it.

 
That's a fallacy. Intel had 6-core mainstream i7s out just a few months after Ryzen launched, and these products take 5+ years to develop. To claim AMD forced Intel's hand is simply incorrect. Maybe AMD forced Intel to launch a few months ahead of time, but they were going to give us 6 cores with or without Ryzen.

You want to thank AMD for Intel's chips, and that's a bit bizarre. You could just as well argue that Intel's dominance finally forced AMD to pull their finger out and make Ryzen. You ought to thank Intel for that, lol.
 

Desktop CPUs are not judged only on gaming performance, as the first part of your "Ryzen 1000 and 2000 were very average CPUs" post seems to imply. Ryzen 1st and 2nd gen were better desktop CPUs than their Intel counterparts.

When you say "most of us would buy the faster part even if it uses a lot more power", you are speaking for yourself, not for "most of us". Maybe you are speaking for a portion of gamers; I'm not sure how many.
 
Gaming performance is everything in this industry. That's why Ryzen 5000 cost so much more than the 3000 series: it was the fastest at gaming. The 3000 series was already faster at multithreaded work, but it couldn't command a premium. When news emerged of Ryzen 1000's poor gaming performance, AMD's stock value took a hit. People get enthusiastic about spending their own money on a gaming PC, not so much when it's a compute machine.

And you're right, I don't speak for others. But considering more people have purchased a 3090 than the entire Radeon 6000 series, I would say users clearly put performance above power consumption.

Just watch: if Alder Lake beats Ryzen at gaming, it will cost more and it will outsell it. It won't even matter if it's drawing 300 W.
 

I see and understand your point when you say that "gaming performance is everything in this industry". Still, AMD increased their market share steadily over time, starting from Ryzen 1000, and there are reasons for that: cost/performance, upgradability, and multitasking, for example. AMD was able to offer a product different from Intel's, something Intel didn't have. There is honor in that.

Threadripper was another successful product, and obviously not a gaming one, but let's leave it aside.

Speaking for myself, I bought a 2600X a month after launch. Gaming-wise it was not the top, but it was a great CPU overall for a new PC. I don't regret that choice, and honestly I don't mind sacrificing a few percent of FPS, nor am I after the fastest video card, the highest-Hz monitor, or fancy lights inside the case. For me a PC is not just a gaming machine; games are part of it, but they are not the whole PC. And just to let you know, I play video games on a daily basis.

You say the PC industry is driven by video games. To me it seems more like a trend: today they like to put fancy lights inside the case, tomorrow who knows. I also have an RGB-ready motherboard, but do I care about it? Not really. That's just how motherboards sell today.

Between yesterday and today I assembled my father's PC, a hardware upgrade to an Athlon 3000G on a B450 Aorus board. I set up two airflow channels, with two front fans and one at the rear. The case was an old one, so I had to remove the front CD-ROM bays to make space for the front fans, and I did careful cable management to optimize the airflow. Finally, I posted a short video of it in a chat with some friends... and friends of friends. One of them replied with a video of his PC: liquid cooling, lights everywhere, an expensive video card, and so on. I thought it looked like an '80s disco, but it is certainly trendy.
 
Technically the performance difference between Ryzen 1000 and Ryzen 5000 was a bigger jump in a shorter period of time.
Good to know. I like learning stuff.

But I meant the jump from the last 486, the DX2, which actually came out after the first Pentium 60 but, even with a 40 MHz clock advantage, was completely dominated by the Pentium. It even bested the X5-133.

The 1000 and 5000 series were 3½ years apart.
 
Oh, sorry. I am old enough to remember the 486 DX2 - my dad brought home an IBM Aptiva with one inside, and I played games on it. But I was about 10 years old and wasn't terribly into benchmarks, so I wouldn't know!

It does seem to me that we get smaller jumps in performance these days. I remember back then you'd buy a new video card and it would often be more than double the speed of the old one. And everything was beige.
 
Since you like learning: the last 486 was actually a DX4 at 120 MHz.
 
I respect your response about Intel having 6-core chips in development regardless of Ryzen, and I agree in principle. However, it supports my argument: Intel always had 6-, 8-, 10- and 12-core parts, they just didn't offer them the way AMD did, and in this way AMD forced Intel to give us 6, 8 and 10 cores in the mainstream segment. Earlier, that was reserved for HEDT on the X99 and X299 platforms, for an arm and a leg I might add.

This is good for the industry, and at the end of the day the consumer wins.

Personally, after all the shenanigans Intel pulled trying to push AMD out, I am done supporting a company that puts the industry and consumers last.

The fact remains that AMD pushed the industry in a way that saved consumers at least a decade of further stagnation.
 
It's also a fact that prices of mainstream-socket CPUs have nearly tripled since 2017. In 2017 the 7700K was the most expensive chip on Intel's mainstream socket at $300. At Ryzen 5000's launch last year, the cheapest part on the mainstream socket was $300 - the 5600X - and the 5950X was $750.
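As a quick arithmetic check on those launch figures (the numbers are the ones quoted above, not independently verified):

```python
# Ratio check on the flagship mainstream-socket price jump (figures from this post).
FLAGSHIP_2017 = 300.0  # i7-7700K launch price as quoted above
FLAGSHIP_2020 = 750.0  # Ryzen 9 5950X at the Ryzen 5000 launch

print(f"Flagship price grew {FLAGSHIP_2020 / FLAGSHIP_2017:.1f}x since 2017")
# 2.5x - so "nearly tripled" is in the right ballpark on these numbers.
```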

I'm not trying to say that competition is bad; it's good. But AMD's intentions are just as dishonourable as Intel's. These companies are as bad as each other: neither cares about you, they just want your money. AMD in particular massively ramped up pricing the moment they got a performance edge over Intel, while Intel cut its prices in the mid to low end to win buyers back. It's like they just swapped places.
 