Upcoming Intel CPUs could push maximum temperature limits to 105 Celsius

Daniel Sims

Staff
Rumor mill: Chipzilla's Lunar Lake and Arrow Lake CPU lineups are expected to begin shipping this fall, with Panther Lake following next year. Recent leaks have teased their labeling, core configurations, wattages, and other details, while new information suggests that Intel is quite confident in the heat tolerances of these chips.

Leaker "Jaykihn" has made multiple posts this week sharing internal information on Intel's upcoming Lunar Lake, Arrow Lake, and Panther Lake processors. One post claims that Arrow Lake and Panther Lake will raise the TJMax of the company's CPUs to 105 degrees Celsius.

If the information is accurate, it represents a notable increase from Raptor Lake's 100-degree TJMax. Meanwhile, Lunar Lake is expected to remain at 100 degrees Celsius.

TJMax is the highest temperature a processor can reach before it begins throttling to prevent permanent damage, a safeguard that reduces clock speeds and therefore performance. A higher TJMax means a CPU can tolerate more heat and potentially sustain higher clock rates.
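To make the relationship concrete, here is a minimal sketch of how a TJMax threshold drives throttling. This is purely illustrative, not Intel's actual firmware behavior; the function name, the 5-degree margin, and the linear ramp are all hypothetical.

```python
# Illustrative sketch only (NOT Intel's real thermal-management logic):
# how a TJMax limit translates into clock throttling. The margin and
# ramp shape are made-up assumptions for demonstration.

def throttle_factor(temp_c: float, tjmax_c: float, margin_c: float = 5.0) -> float:
    """Return a clock multiplier in [0.5, 1.0].

    Below (tjmax - margin) the CPU runs at full speed; as the die
    temperature approaches TJMax, clocks scale down linearly.
    """
    if temp_c < tjmax_c - margin_c:
        return 1.0
    if temp_c >= tjmax_c:
        return 0.5  # hard throttle once the limit is reached
    # Linear ramp inside the margin window below TJMax
    over = temp_c - (tjmax_c - margin_c)
    return 1.0 - 0.5 * (over / margin_c)

# Raising TJMax from 100 C to 105 C widens the full-speed window:
print(throttle_factor(99.0, 100.0))  # already throttling with TJMax = 100
print(throttle_factor(99.0, 105.0))  # still full speed with TJMax = 105
```

Under this toy model, a die sitting at 99 degrees is already losing clocks against a 100-degree TJMax but runs unthrottled against a 105-degree one, which is why a higher limit can translate into sustained boost clocks.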

Jaykihn also shared a chart containing numerous details on the Arrow Lake-S lineup. The series' top-end Core Ultra 9 variant is expected to feature eight performance cores, 16 efficiency cores, and four extreme efficiency cores. The two listed Core Ultra 7 models include only 12 efficiency cores, and the lower-end version has no extreme efficiency cores.

At the bottom of the stack, the seven Core Ultra 5 processors shown each feature six performance cores. Among those, the two K models and all upper-tier variants combine these with eight efficiency cores, but only the top-ranked Core Ultra 5 units include all four extreme efficiency cores. Three lower-end non-K processors bring the efficiency core count down to four.

The K models feature a TDP of 125W, while the non-K processors are 65W and the T variants sit at 35W. The chart doesn't contain exact model names, making it difficult to line the core counts up with the six Arrow Lake-S SKUs that leaked earlier this month.

A final chart compares the DC electrical specifications of the Arrow Lake-S power rails against those of Intel's 14th-generation Raptor Lake Refresh-S. Across the board, voltages increase from around 1.5V to roughly 1.7V.

The latest information indicates that Intel aims to introduce Lunar Lake in September with Arrow Lake-S coming the following month. Jaykihn previously posted the core configurations for some Panther Lake CPUs, which are expected to succeed Lunar Lake and Arrow Lake in 2025.

Wow. Looks like any higher end PCs we build for customers will use AMD only CPUs. On the low end, Intel N100 is keeping low budget customers happy.
 
Raising the temperature limit to 105 Celsius and adding in features like Fast Throttle tech as well basically foretells that Arrow Lake will be another hot processor.
 
If I need a new space heater, I'll consider one of these CPUs. /s
Heating is a function of total power consumption, not temperature. It's rather odd you'd be so concerned about the heat of a 125w CPU, but not a 300, 400, or even 600 watt graphics card.
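The physics behind this comment can be shown with simple arithmetic: essentially all the electrical power a chip draws ends up as heat in the room, regardless of how hot the die itself runs. A quick sketch, using the wattages mentioned in the thread purely as illustrative inputs:

```python
# Sketch of the point above: room heating tracks power draw, not die
# temperature. Essentially all power a chip consumes is dissipated as heat.

def heat_kwh(watts: float, hours: float) -> float:
    """Energy dumped into the room as heat, in kilowatt-hours."""
    return watts * hours / 1000.0

print(heat_kwh(125, 8))  # 125 W CPU over 8 hours -> 1.0 kWh of heat
print(heat_kwh(600, 8))  # 600 W GPU over 8 hours -> 4.8 kWh of heat
```

By this measure, the hypothetical 600 W graphics card heats the room nearly five times as much as the 125 W CPU, whatever TJMax either chip is rated for.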

I'm at a loss for words. Massive face palm.
A lot of drama queens in this thread. It's one unconfirmed rumor of a 5 degree change. Quite possibly false, and, even if true, I strongly suspect the earth won't suddenly stop spinning on its axis.
 
Good point, the 13900K and 14900K are also 125W processors and everyone runs those at that wattage with no problems whatsoever, right?
If you believe oblique references to Intel's stability issues refute the point, you'll want to reread my post. I was decrying the hypocrisy of those who crow over their 1200 watt power supplies and 600 watt graphics cards, then claim they don't want a 125w "space heater" in their homes.

In any case, your post is doubly off base, as Intel has other CPUs running stably at 125w, AMD has CPUs up to 250w, and NVidia's new Blackwell chips may consume 1000+ watts.
 
If you believe oblique references to Intel's stability issues refute the point, you'll want to reread my post. I was decrying the hypocrisy of those who crow over their 1200 watt power supplies and 600 watt graphics cards, then claim they don't want a 125w "space heater" in their homes.

You kinda missed the logic.

The 300-400W GPUs you mentioned are in the same range of power use as the "125W" Intel CPUs I mentioned, which means it's quite reasonable to expect this new "125W" Intel CPU will likewise consume 300-400W out of the box. It's a logical deduction based on recent Intel history.

And I didn't see the person mention at all what GPU they use or that they're unconcerned about high GPU power use. Instead you made up a strawman of the seeming inconsistency of power choice between GPUs and CPUs.
 
You kinda missed the logic.

The 300-400W GPUs you mentioned are in the same range of power use as the "125W" Intel CPUs...
Oops! I didn't simply mention "300-400w" GPUs. I mentioned 600w GPUs and CPUs using 250 watts -- all of which run stably. And, with the right choice of memory and graphics card, a system with a 125w CPU can use less than 250 watts. Add in a high-powered graphics card, and some overclocked memory to go with it, and that same system can easily top 1000 watts.

There was no logic in your post. You implied a 125w CPU must be unstable, using a classic 'guilt by association' fallacy.
 
Oops! I didn't simply mention "300-400w" GPUs. I mentioned 600w GPUs and CPUs using 250 watts -- all of which run stably.

LOL "But actually!" 600W is irrelevant, as you brought it up to accuse someone of defending it who never mentioned it. Is that a strawman or moving the goalposts? Both?

And, with the right choice of memory and graphics card, a system with a 125w CPU can use less than 250 watts. Add in a high-powered graphics card, and some overclocked memory to go with it, and that same system can easily top 1000 watts.

There was no logic in your post. You implied a 125w CPU must be unstable, using a classic 'guilt by association' fallacy.

LOL wrong.

I implied that a couple of CPUs listed at 125W TDP are unstable.

"125W" is irrelevant because they can draw 300-400W at full CPU usage on most motherboards, and that is what's leading to the instability.

This is a documented fact.

If they were actually run at 125W they would be stable. But that's not what happened.

So if Intel announces a new 125W CPU in the same tier, it is a very easy assumption that it will also run way over its listed TDP, and we know how that went the last two times. I assume Intel is competent enough to learn from their mistakes and fix it, but that's still to be revealed by the future, isn't it?

And do you honestly think they'll run their flagship CPU at only 125W PL2? Zero chance.

Also: my original post is a jab, a bit of fun. No need to take everything seriously and round the wagons. Having some fun at the expense of billion dollar corpos is a thing people do.
 
LOL wrong. I implied that a couple of CPUs listed at 125W TDP are unstable.
Come now. You're more honest than this. You implied much more than that.

... they can use 300-400W at full CPU usage in most motherboards which is leading to instability because of that. This is a documented fact ... If they were actually run at 125W they would be stable.
Except this "documented fact" is pure supposition. And one that evidence leads us to believe otherwise. From THG:

"...this problem has also been prevalent in the data center... The discovery [of server chip problems] confirms that Intel's stability problems with Raptor Lake Refresh are more complicated than ever. The server-based motherboards used by these 13900K and 14900K servers are focused entirely on stability and running the chips within specifications, with no way to overclock these chips. The fact that Intel's 13th and 14th Gen chips are still crashing suggests that the chips themselves have problems..."

These server boards don't use the "extreme power profile" that causes the 14900KS to consume 300+ watts -- yet the chips are still crashing.

No need to take everything seriously and round the wagons. Having some fun at the expense of billion dollar corpos is a thing people do.
I've repeatedly stated I use AMD in my home machines for many years now. I'm a big fan of truth and accuracy, however. And as a student of history and economics, I recognize our happiness and standard of living depends on those "billion dollar corporations", and the perils that come from the relentless hackneyed, ignorant attacks on them. Places like Burundi, Slovenia, North Korea, and my former USSR didn't have billion-dollar firms to poke fun at. All the worse for them.
 
Come now. You're more honest than this. You implied much more than that.

I was referring to the fact that those 2 top end processors are never run at 125W, which is the problem. And if Intel makes any new "125W" top end processors, they will also not run at 125W and potentially also have the same problems. Likely they won't because Intel is aware of a problem and will fix it but that's for the future to reveal.

Except this "documented fact" is pure supposition. And one that evidence leads us to believe otherwise. From THG:

"...this problem has also been prevalent in the data center... The discovery [of server chip problems] confirms that Intel's stability problems with Raptor Lake Refresh are more complicated than ever. The server-based motherboards used by these 13900K and 14900K servers are focused entirely on stability and running the chips within specifications, with no way to overclock these chips. The fact that Intel's 13th and 14th Gen chips are still crashing suggests that the chips themselves have problems..."

These server boards don't use the "extreme power profile" that causes the 14900KS to consume 300+ watts -- yet the chips are still crashing.

I don't know how (or why, when Xeon and Epyc exist) these chips are being used in servers, or what's different about those motherboards, but Intel's recommended mitigation for desktops is to limit power to 253W, which suggests power is a major part of the problem.

I'm not 'rounding the wagons'; I've repeatedly stated I use AMD in my home machines for many years now. I'm a big fan of truth and accuracy, however. And as a student of history and economics, I recognize the value to our happiness and standard of living of those "billion dollar corporations", and the perils that come from the relentless hackneyed, ignorant attacks on them. Places like Burundi, Slovenia, North Korea, and my former USSR didn't have billion-dollar firms to poke fun at. All the worse for them.

When corps make mistakes they will need to manage criticism and joking at their expense. If their future products are good then all is forgiven, Bulldozer to Zen being a good example. Or 1997 Apple to 2010 Apple. Both got a lot of deserved criticism and joking at their expense and both were good enough to plan properly and recover with products that didn't suck.
 
I was referring to the fact that those 2 top end processors are never run at 125W, which is the problem. And if Intel makes any new "125W" top end processors, they will also not run at 125W and potentially also have the same problems. Likely they won't because Intel is aware of a problem and will fix it but that's for the future to reveal.
Fair enough. I concede.
 
Upping the TJMax is not the only concern here. Doing so will also put more stress on motherboard components and cause video cards and SSDs to run hotter when the system is under heavy load.
 
Intel's chip designs are quickly losing viability, and there is a ceiling they have basically already hit.
 
Heating is a function of total power consumption, not temperature. It's rather odd you'd be so concerned about the heat of a 125w CPU, but not a 300, 400, or even 600 watt graphics card.


A lot of drama queens in this thread. It's one unconfirmed rumor of a 5 degree change. Quite possibly false, and, even if true, I strongly suspect the earth won't suddenly stop spinning on its axis.

Indeed...
Apple did it with M2 chips and everything is OK...
 
Different manufacturing process, CPU architecture optimized for low frequencies, much lower overall power consumption ...
Oops! Apple's M3 uses the same node as these upcoming CPUs. Nor do transistors care about "overall power consumption", but only the localized temperature they themselves experience. Cerebras makes a CPU that doesn't simply consume 100 watts or even 1,000, but a staggering 23,000 watts. It works just fine ... because the temperature at any point on that (extremely large) chip is similarly constrained.
 