Intel Core i9-14900KS leak reveals 6.2 GHz boost clock and over 400W of power consumption

DragonSlayer101

Staff
The big picture: Intel is expected to launch its flagship Core i9-14900KS Limited Edition CPU later this year, and a recent leak appears to reveal some of its key technical specifications. The forthcoming chip is believed to be a slightly enhanced version of the Core i9-14900K, which was introduced last year as part of the Raptor Lake Refresh lineup.

The Core i9-14900KS specifications were disclosed when it appeared on the OCCT database. According to the listing, the 14900KS will boast 24 cores, comprising eight P-cores and 16 E-cores. It will also support 32 threads, making it nearly identical to the existing 14900K in terms of core and thread count.

The new chip is reported to feature a 3.2 GHz base clock, mirroring that of the 14900K. Other notable specifications of the 14900KS include 36 MB of L3 cache and 32 MB of L2 cache, which align with the specifications of the 14900K.

The listing, however, also reveals a few upgrades from the standard 14900K, such as a 6.2 GHz boost speed, which is 200 MHz higher than that of the existing variant. The upcoming chip will also feature a 150W base TDP, representing a 25W increase from the 125W TDP of the 14900K. However, the peak power consumption of the 14900KS could reach up to a substantial 410W, with an average power rating of 330W.

Due to the high power consumption, the 14900KS is expected to run fairly hot, reaching temperatures of up to 101 degrees Celsius. These high temperatures suggest that users will need to invest in premium cooling solutions to ensure they can fully leverage the capabilities of their flagship CPU without experiencing thermal throttling.

As of now, there's no official word on when Intel will release the new Raptor Lake Refresh CPU, but it's likely to be sooner rather than later. Online speculation suggests that it could be priced around the $800 mark, similar to the 13900KS, but we can't confirm this until it's officially announced by Intel. Either way, the 14900KS is expected to be a powerhouse, and we eagerly await its official debut.


 
Absolutely ridiculous. Struggling to match the 7800X3D in games, slurping up 400W in the process, and still suffering from stuttering and poor 1% lows thanks to E-cores.

Cooling this will be an utter nightmare.
Before you know it we're going to see people pulling the radiator out of their car to cool these things. That, or phase change cooling will make a comeback.
 
Before you know it we're going to see people pulling the radiator out of their car to cool these things. That, or phase change cooling will make a comeback.
*echoes of early 2000s water cooling*

Honestly the issue isn't the radiators now. The biggest issue is just getting the heat out of the CPU. The rad on my 9700K never got particularly warm, whereas it was noticeably warm on my older Ivy Bridge-era build.
 
*echoes of early 2000s water cooling*

Honestly the issue isn't the radiators now. The biggest issue is just getting the heat out of the CPU. The rad on my 9700K never got particularly warm, whereas it was noticeably warm on my older Ivy Bridge-era build.
The issue now, as I see it, is that the contact area between the CPU and the waterblock can't shed heat effectively for sustained periods at ambient temperatures while running at max power.

One reason I'm trying what I'm calling the "Steam Deck challenge" is that I'm tired of absurd power and cooling requirements. I see 360mm radiators listed as "recommended" for what are supposed to be consumer CPUs.
 
The issue now, as I see it, is that the contact area between the CPU and the waterblock can't shed heat effectively for sustained periods at ambient temperatures while running at max power.

One reason I'm trying what I'm calling the "Steam Deck challenge" is that I'm tired of absurd power and cooling requirements. I see 360mm radiators listed as "recommended" for what are supposed to be consumer CPUs.
Right, but the reason it can't do that is the thermal density of modern silicon vs. old-school silicon. See also: removing heat from the FX series vs. a modern Zen chip. Direct-die cooling can help, but even then the thermal density of modern silicon is just insane. AMD's dead die area was a major help in transferring heat to the IHS. Intel will need to start doing something similar.
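To put some rough numbers on that thermal-density point, here's a quick back-of-the-envelope sketch. The die areas are approximate public figures (not from the article), and the power draws are the peak numbers mentioned in this thread, so treat the results as illustrative only:

```python
# Rough average heat-flux comparison: why modern dies are harder to cool
# even at similar total wattage. Die areas are approximate figures and the
# power numbers are the peak draws mentioned in the thread.

def heat_flux(watts: float, die_mm2: float) -> float:
    """Return average heat flux across the die in W/mm^2."""
    return watts / die_mm2

# (peak power draw in W, approximate die area in mm^2)
chips = {
    "FX-9590 (32nm Vishera)":   (270, 315),  # ~315 mm^2 die (approx.)
    "i9-13900K (Raptor Lake)":  (320, 257),  # ~257 mm^2 die (approx.)
    "i9-14900KS (leaked peak)": (410, 257),  # same die as the 13900K
}

for name, (watts, area) in chips.items():
    print(f"{name}: {heat_flux(watts, area):.2f} W/mm^2")
```

Even with these rough figures, the leaked 14900KS would be pushing nearly twice the watts per square millimeter of the FX-9590, which is why the same total wattage that a 240mm rad once handled now saturates the die-to-block interface first.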
 
Right, but the reason it can't do that is the thermal density of modern silicon vs. old-school silicon. See also: removing heat from the FX series vs. a modern Zen chip. Direct-die cooling can help, but even then the thermal density of modern silicon is just insane. AMD's dead die area was a major help in transferring heat to the IHS. Intel will need to start doing something similar.
Well, from a hobby perspective, I'd love to see phase-change coolers and that grease-monkey level of engineering come back. I always wanted an FX-60; I was actually looking at used ones on eBay a while back. It'd be fun to put together a Socket 939 system with SLI.
 
Intel has fallen really far in recent years. It wasn't long ago they were touting power efficiency, but that phrase is no longer in their vocabulary. All they care about now is pulling ahead of AMD by a measly 1-2% in gaming just so they can use it as a marketing gimmick even if it takes a kilowatt of power to achieve it.

Wow, 6.2 GHz boost clock, that sounds amazing. How many seconds can it stay at that level though? Rumor has it Intel will release a new chip next year with 7 GHz boost clock and 800 watt power consumption.

I just made that up, but you probably believed it for a moment.
 
Well, from a hobby perspective, I'd love to see phase-change coolers and that grease-monkey level of engineering come back. I always wanted an FX-60; I was actually looking at used ones on eBay a while back. It'd be fun to put together a Socket 939 system with SLI.
While I would also like a classic, the FX I was referring to was the 8000 series from the early 2010s. An FX-9590, at its insane 270+ watt power draw, could be easily cooled by a 240mm rad or a twin-fan 120mm rad.

Meanwhile, the modern 320-watt 13900K can boil an egg even with a 360mm rad.
 
While I would also like a classic, the FX I was referring to was the 8000 series from the early 2010s. An FX-9590, at its insane 270+ watt power draw, could be easily cooled by a 240mm rad or a twin-fan 120mm rad.

Meanwhile, the modern 320-watt 13900K can boil an egg even with a 360mm rad.
I must be getting old; when you said FX I didn't even consider Bulldozer, and that's famous for how bad it was. I still like to think back on the Pentium 4 days as the last nuclear reactor we had, but Bulldozer was certainly a crazy one. That was the first and only time I bought an Intel CPU: I ended up getting a 3770K that fried at 5 GHz. By that point the 1800X was out, and that's what I replaced it with. But that 3770K ran at 5 GHz for a few months in a Zalman 9700 before it quietly shut off and just never powered on again...
 
Nvidia uses their halo product to grab attention and prop up their image as the performance king.

Intel uses their halo product to have people crack jokes about how much power it uses, how difficult it is to cool, and how small the performance increase is relative to power consumption, price, etc.
 
400W plus a good RTX card? Oh boy.
I'd recommend TS base their future benchmarks on power costs first.
If the power consumption for so-called RTX isn't good, why do I need the frame rate?
After all, I don't know any game with good/catchy gameplay that requires any of that nonsense.
 
400W plus a good RTX card? Oh boy.
I'd recommend TS base their future benchmarks on power costs first.
If the power consumption for so-called RTX isn't good, why do I need the frame rate?
After all, I don't know any game with good/catchy gameplay that requires any of that nonsense.
If you are worried about power cost, you cannot afford a $400 CPU or $1600 GPU. Period.
I cannot recommend a single Intel CPU for the last 5 gens...
Their i5s represent viable budget platforms, especially as AMD has stubbornly refused to release proper $150 replacements for the 3600/5600.
 
If you are worried about power cost, you cannot afford a $400 CPU or $1600 GPU. Period.
Their i5s represent viable budget platforms, especially as AMD has stubbornly refused to release proper $150 replacements for the 3600/5600.
At $180 the 8500G should be very good value; perf/$ is similar to the 14100.
 
Gelsinger really is a clown. How pathetically desperate is he to boast that Intel is faster than AMD, no matter the cost? The only reason Alder/Raptor Lake is so power hungry is that they had to try to win at any cost and pushed the CPUs well past their peak efficiency. AMD had no choice but to return serve, lest the media crucify them.
 