Intel prepping 350-watt Turbo Mode for high-end Raptor Lake CPUs

nanoguy

In brief: Next-gen GPUs are expected to reach new heights in terms of power consumption, but it looks like upcoming Intel CPUs won't be too far behind. Team Blue is reportedly preparing a new overclocking profile for upcoming enthusiast-grade Raptor Lake CPUs that will enable more performance at the cost of increased power consumption and heat output.

Intel's Raptor Lake lineup is expected to debut sometime in October, with the first models shipping later this fall. Leaked internal documents have confirmed the new processor designs won't stray too far from their Alder Lake counterparts, so they're likely refined versions of existing 12th-gen parts manufactured on the Intel 7 process node.

All hints, rumors, and leaks so far suggest that Raptor Lake's performance gains will come from higher clocks and larger thermal envelopes. Recently, a popular Chinese tech content creator got his hands on qualification samples of the upcoming Intel Core i9-13900KF, Core i7-13700K, and Core i5-13600K, and his findings confirm that, given adequate cooling and power delivery, Raptor Lake processors can reach impressive clock speeds.

The upcoming flagship Intel CPU reached a whopping 6.1 GHz using liquid cooling, and power users will no doubt appreciate the overclocking headroom. That said, this comes at the obvious cost of reduced energy efficiency, which may or may not matter to you depending on your use case and energy prices in your area.

According to Hungarian publication ProHardver (via VideoCardz), Team Blue is currently testing a new factory overclocking profile for its upcoming 13th-gen Core i9 series processors. The profile raises the power limit to 350 watts, significantly higher than the default PL2 limit of 253 watts.

When we looked at Alder Lake CPUs, we found that higher-end models such as the Core i9-12900KS were not only difficult to keep cool but also didn't see much benefit from higher clocks. The torch now passes to Raptor Lake to extract more performance out of what is by now a more mature architecture.

Also read: Intel 12th-Gen Core Alder Lake Architectural Benchmark

ProHardver writes that the 350-watt mode on the Core i9-13900K qualification sample can deliver up to 15 percent better performance, but you won't be able to use it unless you pair the chip with a high-end 700-series motherboard and one of the best AIO water-cooling kits out there. Enthusiasts might find the added cost a bit harder to swallow, especially since Intel has confirmed that price increases will soon apply to its consumer product offerings.
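Some quick arithmetic puts that trade-off in perspective (a minimal sketch: the 15 percent figure is ProHardver's, and the simple linear comparison is our assumption):

```python
# Back-of-the-envelope perf-per-watt check for the reported 350 W mode
# vs. the default 253 W PL2 limit. The "up to 15%" figure is ProHardver's;
# the rest is simple arithmetic.
PL2_DEFAULT_W = 253
PL2_TURBO_W = 350
PERF_GAIN = 1.15  # up to 15 percent better performance

power_ratio = PL2_TURBO_W / PL2_DEFAULT_W       # ~1.38, i.e. ~38% more power
perf_per_watt = PERF_GAIN / power_ratio         # ~0.83, i.e. ~17% worse

print(f"Extra power: {power_ratio - 1:.0%}")             # 38%
print(f"Perf-per-watt vs. stock: {perf_per_watt:.0%}")   # 83%
```

In other words, the headline mode spends roughly 38 percent more power to buy, at best, 15 percent more performance.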

Intel isn't alone in driving power consumption up to squeeze more performance out of its silicon. AMD's Ryzen 7000 series processors will likely offer better performance-per-watt, but they too are expected to be more power-hungry at the higher end. The AM5 socket will accommodate processors with TDPs of up to 170 watts, and actual power consumption in heavy workloads may even go north of 230 watts.

Furthermore, AMD has hinted that Zen 4 CPUs will be a boon for overclocking enthusiasts, with special features designed to squeeze the most out of DDR5 memory. It makes perfect sense for Intel to prepare overclocking profiles for enthusiasts who are considering an upgrade to the Raptor Lake platform.

That said, Intel should prioritize platform stability and software support over raw performance. As we've seen with Alder Lake, E-cores aren't exactly useful for gaming, as they cause DRM issues and can even reduce frame rates. More recently, Alder Lake driver issues caused lower performance when using Chromium-based browsers like Microsoft Edge and Google Chrome, further highlighting the importance of proper software optimization.


 
Desperate times call for desperate measures, I guess. I'm not just ragging on Intel for stuff like this but AMD as well. Both companies are so desperate to be the fastest that they are throwing extreme power targets at their CPUs to make them run a few MHz faster in hopes they can be the fastest.

Fast is great, but not needing a nuclear reactor to power your system would also be nice. lol
 
The 3090 Ti can go up to 450 W, allegedly. Considering a high-end game will probably push both the processor and the video card into those upper reaches, and next-gen cards are supposed to match the 3090s at the entry level, how exactly are people supposed to power this without appliance hookups or blowing a fuse somewhere?
 
The 3090 Ti can go up to 450 W, allegedly. Considering a high-end game will probably push both the processor and the video card into those upper reaches, and next-gen cards are supposed to match the 3090s at the entry level, how exactly are people supposed to power this without appliance hookups or blowing a fuse somewhere?

Time to run yourself a 220 V circuit, or we'll see the re-emergence of multi-PSU systems.
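Back-of-the-napkin wall math (every number below is a guess for illustration):

```python
# Rough wall-outlet math for a worst-case gaming PC. All component
# figures are illustrative assumptions, not measured values.
cpu_w = 350            # rumored Raptor Lake turbo-mode power limit
gpu_w = 450            # 3090 Ti-class board power
rest_w = 150           # rough guess: board, RAM, drives, fans
psu_efficiency = 0.90  # assumed for a Gold-rated unit under load

wall_draw_w = (cpu_w + gpu_w + rest_w) / psu_efficiency  # ~1056 W

# Continuous loads are commonly derated to 80% of circuit capacity.
us_15a_circuit_w = 120 * 15 * 0.8   # 1440 W on a US 15 A / 120 V circuit
eu_16a_circuit_w = 230 * 16 * 0.8   # 2944 W on a 230 V / 16 A circuit

print(f"Estimated wall draw: {wall_draw_w:.0f} W")
print(f"US circuit headroom: {us_15a_circuit_w - wall_draw_w:.0f} W")
```

One such box still fits on a US 15 A circuit, but not with much else plugged in.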
 
Somewhere at Intel HQ:
- Consumers are noticing that our CPUs are not as advanced as AMD's.
- Let's create a story about how our 13th-gen CPUs on 10 nm lithography are more advanced than AMD's on 5 nm.
- But our CPUs also need to be faster.
- That's not a problem, we will just increase power consumption by 500%, because why not?

It's kinda lame to chase the "fastest CPU" crown while ignoring global warming.
 
Somewhere at Intel HQ:
- Consumers are noticing that our CPUs are not as advanced as AMD's.
- Let's create a story about how our 13th-gen CPUs on 10 nm lithography are more advanced than AMD's on 5 nm.
- But our CPUs also need to be faster.
- That's not a problem, we will just increase power consumption by 500%, because why not?

It's kinda lame to chase the "fastest CPU" crown while ignoring global warming.
Absolutely! "Mo MHz" is so 1990s. :rolleyes:
 
Looking forward to reviews showing performance at full blast and, right next to it, the by-now completely meaningless PL1 TDP.

So thanks to many reviewers, this is actually a pretty smart move by Intel.
 
The 3090 Ti can go up to 450 W, allegedly. Considering a high-end game will probably push both the processor and the video card into those upper reaches, and next-gen cards are supposed to match the 3090s at the entry level, how exactly are people supposed to power this without appliance hookups or blowing a fuse somewhere?
The claim of RTX 4000 cards hitting 600+ watts has already been debunked; everyone just assumes that TDP can go up forever.

The 3090 Ti cooler BARELY works, and it gets loud if you dare to OC. The simple physics of cooling GPUs will limit TDPs.

And I seem to remember this same "TDP is gonna melt electric boxes" talk when the 10th gen came out. And the 11th. And the 12th.
 
Not sure how much power you can draw from that dual 8-pin EPS? I've seen somewhere around 300-330 W per 8-pin.
What VRM cooling will you need for this?
If 12th gen bent the socket and PCB, will 13th gen melt it?
Is water cooling enough for this beast?

UEFI option for CPU boost?
1. normal
2. turbo
3. volcano
4. supernova
5. intel 13 gen
 
The claim of RTX 4000 cards hitting 600+ watts has already been debunked; everyone just assumes that TDP can go up forever.

The 3090 Ti cooler BARELY works, and it gets loud if you dare to OC. The simple physics of cooling GPUs will limit TDPs.

And I seem to remember this same "TDP is gonna melt electric boxes" talk when the 10th gen came out. And the 11th. And the 12th.

The 600 W+ figure was for the 4090 Ti, never the 4090. The 4090 is already 420 W, so a 600 W+ third-party 4090 Ti OC is very probable. One can see a 13900KS + 4090 Ti easily requiring a 1500 W PSU, and the 4090 Ti will no doubt have >1 kW power spikes, so good luck with that.
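Quick sizing sketch with those numbers (all of them claims or round-number guesses, nothing confirmed):

```python
# PSU sizing sketch using the figures claimed above (none confirmed).
cpu_w = 350          # 13900KS-class turbo mode
gpu_w = 420          # claimed RTX 4090 board power
gpu_spike_w = 1000   # claimed transient spikes for a 4090 Ti
rest_w = 150         # rough allowance for everything else

sustained_w = cpu_w + gpu_w + rest_w     # 920 W
spike_w = cpu_w + gpu_spike_w + rest_w   # 1500 W

# Rule of thumb: spikes should stay within the PSU's rating, and
# sustained load should sit under roughly 80% of it.
needed_w = max(sustained_w / 0.8, spike_w)
print(f"Sustained {sustained_w} W, spikes ~{spike_w} W -> ~{needed_w:.0f} W PSU")
```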
 
We should not dismiss high-power CPUs or GPUs; the mere fact that such high performance can be reached in a chip is amazing. I hope they manage that level of performance with lower power in the future, but for now, imagine the thermal management needed to move the watts from billions of transistors to the cold plate. 350 W at just over 1 V is over 300 A, all carried by thin copper in the motherboard. Just amazing.
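The math is simple (the ~1.1 V core voltage is a round-number guess, not a spec):

```python
# I = P / V: current into the package at full turbo power.
# The 1.1 V core voltage is an assumed round number, not a spec.
power_w = 350
vcore_v = 1.1
print(f"{power_w / vcore_v:.0f} A")  # ~318 A
```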

That said, for the next gen, if you want to spend money on a 100 W CPU at most, maybe that should be an AMD. Same at 150 W, and maybe even 250 W. If you have the budget for 350 W per chip, then maybe Intel is the best and only option.
 
Advertised TDP vs. real TDP is not the same thing for either vendor. I can't imagine what the real TDP for a 350 W Intel part would look like on the spec sheet.
Even AMD puts only the CPU die TDP on the specs, leaving out the I/O die.
For a 65 W part there is about 18 W more, and for a 105 W part the same 18 W more.
From my tests and all the reviews I've read, it's about the same.
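In other words (the ~18 W I/O-die figure above is my rough estimate, not an official number):

```python
# Sketch of the point above: AMD's spec-sheet TDP covers the CPU die,
# and the I/O die adds roughly 18 W on top (a rough estimate, not an
# official figure).
IO_DIE_W = 18
for tdp_w in (65, 105):
    print(f"Advertised {tdp_w} W -> roughly {tdp_w + IO_DIE_W} W package power")
```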

Now put a 4090 and this baby inside the same case :joy:
That's about 1000 W transformed into heat, roughly 3,400 BTU/h.

My 5600X and a 2060 at near-idle loads (office tasks and internet reading) can raise room temperature by 3 degrees Celsius in 4 hours in the summer.

At this rate we will have to install water cooling with the radiator outside the house, just like datacenters.
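For reference, here's the conversion plus a worst-case bound (the room size and zero-loss assumption are mine):

```python
# 1 W dissipated continuously is about 3.412 BTU/h, so a 1000 W system
# is roughly a small space heater.
system_w = 1000
print(f"{system_w * 3.412:.0f} BTU/h")  # ~3412

# Crude upper bound: heating only the air in a sealed 4 x 4 x 2.5 m room
# (assumed size) with zero heat loss. Real rooms lose heat through walls,
# furniture, and ventilation, which is why the actual rise is far smaller.
room_m3 = 4 * 4 * 2.5
air_kg = room_m3 * 1.2             # air density ~1.2 kg/m^3
joules_per_hour = system_w * 3600
dT_per_hour = joules_per_hour / (air_kg * 1005)  # c_p of air ~1005 J/(kg*K)
print(f"~{dT_per_hour:.0f} degC/hour with zero heat loss")
```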
 
Power requirements are getting silly. When does the trend of optimization start, now that Moore's law, and at this rate Ohm's law, has ended?
Begging your highness' pardon, but "Moore's law" was bullsh!t to begin with. OTOH, Ohm's law doesn't end in the conventional sense; it just goes up in a puff of smoke if incorrectly applied. :rolleyes: 🤣
 
Begging your highness' pardon, but "Moore's law" was bullsh!t to begin with. OTOH, Ohm's law doesn't end in the conventional sense; it just goes up in a puff of smoke if incorrectly applied. :rolleyes: 🤣
the incense burners in my palace are powered with magic blue smoke
 