Retailer lists Intel Core i9-14900KS with 6.2GHz stock speed, a consumer CPU record

midian182

Posts: 9,745   +121
Staff member
What just happened? Intel's Core i9-14900K Raptor Lake Refresh chip is an absolute speed demon of a CPU, but Team Blue has an even faster processor lined up – the Core i9-14900KS, which can hit 6.2 GHz out of the box. The unannounced chip has just shown up in some prebuilt systems on an Israeli retail site, suggesting an official reveal by Intel is imminent.

The Core i9-14900KS was always expected, given that Intel also released the previous-generation Core i9-13900KS. Israel-based retailer PC-Online seems to have confirmed this by listing several pre-built PCs featuring the binned variant of the Core i9-14900K, which, with a 6.2 GHz stock clock speed, is thought to be the fastest consumer processor ever.

PC-Online lists the Core i9-14900KS as having a 200 MHz advantage over the Core i9-14900K, the same speed bump the Core i9-13900KS has over the Core i9-13900K. Expect the new 14th-gen chip to have the same 24-core/32-thread configuration as its standard counterpart, while the 68MB of cache (32MB L2, 36MB L3) also appears unchanged.

The Core i9-14900KS is also expected to have a 150W PBP like the Core i9-13900KS and a 253W maximum turbo power, the same as the Core i9-14900K, though actual draw will probably peak at around 300W - 320W.

Considering an extreme overclocker has already pushed the Core i9-14900K to a record-breaking 9 GHz, don't be surprised to see the Core i9-14900KS capturing some new records.

An unusual element of the listings is that all the systems come with DDR4-3200 RAM, whereas one would expect DDR5 to be paired with the Core i9-14900KS. The likely explanation is that these are business-focused machines rather than gaming PCs, which would also explain the lack of a dedicated graphics card and the office-style cases. The other possibility is that the listings are fake, or simply a mistake on the retailer's part.

With Intel launching the Core i9-14900K at the same $599 price as its Core i9-13900K predecessor, one would expect the Core i9-14900KS to cost the same $699 as the Core i9-13900KS. We might hear an official announcement from Intel about the Core i9-14900KS either on December 14 at its AI Everywhere show where the Meteor Lake / Emerald Rapids chips will be unveiled, or at CES 2024 in January, the same event where Nvidia looks likely to show off the long-rumored RTX 4000-series Super cards.


 
A perfect CPU would be an Apple M3 Ultra that supported Nvidia external GPUs; too bad it won't be happening. Intel CPUs are just heaters these days. I will never buy an Intel CPU again until they improve the thermals and power consumption.
 
6.2 GHz is only for a single core, and I suspect owners of the chip are unlikely to see it maintain that clock speed. That single core will likely run into its thermal limit almost instantly.
The whole thing is just a marketing gimmick. Intel desperately needs a win against AMD _somewhere_, so why not restart the pointless GHz war? Back when Intel was leading, GHz didn't matter, but now it suddenly does. Funny how that works.
 
My oven is more energy efficient and makes less heat...
Then don't run your Intel CPU unlimited running Cinebench. Come on, it's not that hard. As TPU showed, power limited it's the most efficient CPU on the planet
 
Then don't run your Intel CPU unlimited running Cinebench. Come on, it's not that hard. As TPU showed, power limited it's the most efficient CPU on the planet
You can power limit and undervolt AMD CPUs too. For example, when the 7800x3D released TPU added PBO + Undervolt to their results. And I'm not talking about just CB, but about normal usage like gaming.

And I've seen the TPU article. What's the point of buying a 14900k if you are just going to limit it to something that performs worse than a 14700k and still doesn't beat AMD in power efficiency?
 
You can power limit and undervolt AMD CPUs too. For example, when the 7800x3D released TPU added PBO + Undervolt to their results. And I'm not talking about just CB, but about normal usage like gaming.

And I've seen the TPU article. What's the point of buying a 14900k if you are just going to limit it to something that performs worse than a 14700k and still doesn't beat AMD in power efficiency?
Of course whenever power draw is the discussion, the argument changes to gaming and the 3d. So typical.

The 7800x 3d is a fine CPU for gaming out of the box. That's it. Nothing more. It's not a high-end CPU, and when you get into tuning it's not even the top gaming CPU.

When it comes to Mt workloads the 14900k is incredibly efficient so I'm not sure what your argument is here. It is literally sharing first place in efficiency with the 7950x / 7950x 3d in MT workloads, while being faster and more efficient in ST workloads. For most users the Intel chips are more efficient than AMD chips because most people don't loop Cinebench, and even when they do, efficiency is similar.

The 14900k is for people who either want the fastest in everything (st, Mt, gaming) or people that can tune and make it incredibly efficient while also being the fastest gaming cpu. In 4k gaming it doesn't draw more than 70-80w on the heaviest of games so...
 
Of course whenever power draw is the discussion, the argument changes to gaming and the 3d. So typical.

The 7800x 3d is a fine CPU for gaming out of the box. That's it. Nothing more. It's not a high-end CPU, and when you get into tuning it's not even the top gaming CPU.

When it comes to Mt workloads the 14900k is incredibly efficient so I'm not sure what your argument is here. It is literally sharing first place in efficiency with the 7950x / 7950x 3d in MT workloads, while being faster and more efficient in ST workloads. For most users the Intel chips are more efficient than AMD chips because most people don't loop Cinebench, and even when they do, efficiency is similar.

The 14900k is for people who either want the fastest in everything (st, Mt, gaming) or people that can tune and make it incredibly efficient while also being the fastest gaming cpu. In 4k gaming it doesn't draw more than 70-80w on the heaviest of games so...
"Of course whenever power draw is the discussion, the argument changes to gaming and the 3d. So typical."

Sorry dude, but aren't you the one who said: "Then don't run your Intel CPU unlimited running Cinebench." ???

I can't talk about CB, I can't talk about games, do you want to talk about youtube videos and netflix?

You are desperately trying to find some excuses for the huge power draw difference.

"When it comes to Mt workloads the 14900k is incredibly efficient so I'm not sure what your argument is here." - no it isn't. The article you yourself mentioned clearly shows this.

I'm assuming that you are talking about the final MT efficiency score where the 95W (and lower) scores are at the top, but those aren't realistic numbers, they're purely academic because of the performance loss. Nobody is running that config, not even "people that can tune".

Imagine what that chart would look like with a 7950x3D or 7800x3D that had power limits in place and was also undervolted.

In general the vast majority will keep the CPU at stock. The 200W power limit is much more realistic to keep temps down a notch and the 125W is just for people who can't cool their CPU at all (bought/reused a bad cooler for it) -> nobody buys a 14900k to make it run like the 14700k or worse.

Another thing you said that makes zero sense in this context: "In 4k gaming it doesn't draw more than 70-80w" - well duuh, the games are GPU bottlenecked.

It's like saying that my oven doesn't draw any power if I turn it off. According to TPU (since you like their results), at stock, the 7950x3d draws on average less than 80W in games and the 7800x3d less than 50W. (1080p not 4K, so much more CPU load) And in time games will 100% use more of your CPU even at 4K (we already have some heavy games this year).
 
"Of course whenever power draw is the discussion, the argument changes to gaming and the 3d. So typical."

Sorry dude, but aren't you the one who said: "Then don't run your Intel CPU unlimited running Cinebench." ???

I can't talk about CB, I can't talk about games, do you want to talk about youtube videos and netflix?

You are desperately trying to find some excuses for the huge power draw difference.

"When it comes to Mt workloads the 14900k is incredibly efficient so I'm not sure what your argument is here." - no it isn't. The article you yourself mentioned clearly shows this.

I'm assuming that you are talking about the final MT efficiency score where the 95W (and lower) scores are at the top, but those aren't realistic numbers, they're purely academic because of the performance loss. Nobody is running that config, not even "people that can tune".

Imagine what that chart would look like with a 7950x3D or 7800x3D that had power limits in place and was also undervolted.

In general the vast majority will keep the CPU at stock. The 200W power limit is much more realistic to keep temps down a notch and the 125W is just for people who can't cool their CPU at all (bought a bad cooler for it) -> nobody buys a 14900k to make it run like the 14700k or worse.

Another thing you said that makes zero sense in this context: "In 4k gaming it doesn't draw more than 70-80w" - well duuh, the games are GPU bottlenecked. It's like saying that my oven doesn't draw any power if I turn it off.
Just don't feed the trolls. Strawman knows that the 14th gen can't hope to compete with the 7000-series chips in efficiency, and will redirect any argument to some corner where he can pull a technicality.
 
"Of course whenever power draw is the discussion, the argument changes to gaming and the 3d. So typical."

Sorry dude, but aren't you the one who said: "Then don't run your Intel CPU unlimited running Cinebench." ???

I can't talk about CB, I can't talk about games, do you want to talk about youtube videos and netflix?

You are desperately trying to find some excuses for the huge power draw difference.

"When it comes to Mt workloads the 14900k is incredibly efficient so I'm not sure what your argument is here." - no it isn't. The article you yourself mentioned clearly shows this.

I'm assuming that you are talking about the final MT efficiency score where the 95W (and lower) scores are at the top, but those aren't realistic numbers, they're purely academic because of the performance loss. Nobody is running that config, not even "people that can tune".

Imagine what that chart would look like with a 7950x3D or 7800x3D that had power limits in place and was also undervolted.

In general the vast majority will keep the CPU at stock. The 200W power limit is much more realistic to keep temps down a notch and the 125W is just for people who can't cool their CPU at all (bought a bad cooler for it) -> nobody buys a 14900k to make it run like the 14700k or worse.

Another thing you said that makes zero sense in this context: "In 4k gaming it doesn't draw more than 70-80w" - well duuh, the games are GPU bottlenecked. It's like saying that my oven doesn't draw any power if I turn it off.
I never said you can't talk about cinebench. I'm saying if you are running cinebench at 4000 watts of course it's not going to be efficient lol. Why would you ever do that if you want efficiency? My 14900k needs 380 watts for a score of 43k, and 220 watts for a score of 41k. So... Duh?

If the vast majority run their CPUs out of the box then the most efficient cpus, by far, are intel t and non k lineup. Not a single amd cpu will be on the top 20 efficiency list. Kid you not. So wtf are we talking about here? The point is, the 14900k at the same power limit as the 7950x or the 7950x 3d offers similar performance, therefore it's exactly as efficient as those 2.
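The perf-per-watt claim above is easy to sanity-check with quick arithmetic. A minimal sketch, using the Cinebench scores and wattages quoted in this thread (the poster's own figures, not independent measurements):

```python
# Rough points-per-watt comparison using the numbers quoted above
# (Cinebench multi-core score and package power, as stated by the poster).
configs = {
    "14900K unlimited (380 W)": (43_000, 380),
    "14900K limited (220 W)": (41_000, 220),
}

for name, (score, watts) in configs.items():
    print(f"{name}: {score / watts:.1f} points/W")

# The 220 W limit keeps ~95% of the score (41000/43000) while cutting
# power by ~42%, so points-per-watt improves by roughly 65%.
```

That asymmetry (small score loss, large power saving) is the whole basis of the power-limiting argument on both sides of this thread.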
 
Just don't feed the trolls. Strawman knows that the 14th gen can't hope to compete with the 7000-series chips in efficiency, and will redirect any argument to some corner where he can pull a technicality.
According to TPU, the 14900k at the same power limit is as efficient as the 7950x 3d. What the heck are you talking about? Lol
 
I never said you can't talk about cinebench. I'm saying if you are running cinebench at 4000 watts of course it's not going to be efficient lol. Why would you ever do that if you want efficiency? My 14900k needs 380 watts for a score of 43k, and 220 watts for a score of 41k. So... Duh?

If the vast majority run their CPUs out of the box then the most efficient cpus, by far, are intel t and non k lineup. Not a single amd cpu will be on the top 20 efficiency list. Kid you not. So wtf are we talking about here? The point is, the 14900k at the same power limit as the 7950x or the 7950x 3d offers similar performance, therefore it's exactly as efficient as those 2.
The numbers don't lie. Every single outlet that reviewed it has numbers that dispute your claims. Every single one of them.
 
Yeah, I knew you only looked at that and ignored the rest. Let me translate that with the context of the entire article used to draw a conclusion: you are making your CPU noticeably slower to get close to a stock AMD in terms of perf/W.

You are going from 38.7k stock to 31.2k points in CB when limiting to 125W (using CB since that's what the chart you linked has). That's an almost 20% drop dude. Below the 14700k and well below AMD. (35k for the 7950x3D)

When using perf/W you can get weird results if you don't also look at the absolute numbers too. Who the hell wants such a massive drop in performance just to get close to AMD CPUs?
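The drop quoted above checks out arithmetically. A quick sketch using the thread's figures (38.7k stock and 31.2k at 125W for the 14900k, 35k for a stock 7950X3D; all as stated in the posts, not verified):

```python
# Check the performance-drop figures quoted in the thread
# (Cinebench multi-core scores as stated by the posters, not verified).
stock_14900k = 38_700    # 14900K, stock power limits
limited_14900k = 31_200  # 14900K capped at 125 W
stock_7950x3d = 35_000   # 7950X3D at stock

drop = (stock_14900k - limited_14900k) / stock_14900k
print(f"Score lost to the 125 W cap: {drop:.1%}")  # ~19.4%
print(f"Limited 14900K vs stock 7950X3D: {limited_14900k / stock_7950x3d:.1%}")  # ~89.1%
```

So the "almost 20%" figure is accurate, which is why perf/W charts need to be read alongside the absolute scores.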

Since I can't find proper benchmarks for the 7950x3d in terms of power scaling, here's one for the 7950x:
 
Yeah, I knew you only looked at that and ignored the rest. Let me translate that with the context of the entire article used to draw a conclusion: you are making your CPU noticeably slower to get close to a stock AMD in terms of perf/W.

You are going from 38.7k stock to 31.2k points in CB when limiting to 125W (using CB since that's what the chart you linked has). That's an almost 20% drop dude. Below the 14700k and well below AMD. (35k for the 7950x3D)

When using perf/W you can get weird results if you don't also look at the absolute numbers too. Who the hell wants such a massive drop in performance just to get close to AMD CPUs?

Since I can't find proper benchmarks for the 7950x3d in terms of power scaling, here's one for the 7950x:
I'm looking at the 200w limits. The 14900k is more efficient than the 7950x at 200w while offering very similar performance.

The anandtech link you posted is obviously wrong, since the power draw on AMD is significantly higher. It's not drawing the wattage that the graph is saying it is.
 
I'm looking at the 200w limits. The 14900k is more efficient than the 7950x at 200w while offering very similar performance.

The anandtech link you posted is obviously wrong, since the power draw on AMD is significantly higher. It's not drawing the wattage that the graph is saying it is.

Your whole argument about "most people" had me crying...

Most people don't fiddle with BIOS, or OC, or even attempt any of that. But what MOST PEOPLE can do is click PBO... they have no use for the old-school (ie: archaic) method of manually OCing their system.

I get it, you are an elite OC & ultra enthusiast and a know-it-all like the rest of us, but in 2023 even advanced people like me just flip the PBO switch... because who can be bothered by all the Intel tedium?



Secondly, nobody here cares about top multitasking and benchmarking and maintaining extremely high throughput under MT loads. Nobody games with that type of workload, and nobody building a PC here is trying to build a workstation... because Threadripper is for that.
 
Your whole argument about "most people" had me crying...

Most people don't fiddle with BIOS, or OC, or even attempt any of that. But what MOST PEOPLE can do is click PBO... they have no use for the old-school (ie: archaic) method of manually OCing their system.

I get it, you are an elite OC & ultra enthusiast and a know-it-all like the rest of us, but in 2023 even advanced people like me just flip the PBO switch... because who can be bothered by all the Intel tedium?



Secondly, nobody here cares about top multitasking and benchmarking and maintaining extremely high throughput under MT loads. Nobody games with that type of workload, and nobody building a PC here is trying to build a workstation... because Threadripper is for that.
It doesn't matter what most people do. I fully acknowledge that out of the box the 14900k is stupidly set up, to the point of it being almost unusable. But that doesn't say much about the CPU in technology terms. Even if for the sake of argument I grant that the 7950x / 7950x 3d is more efficient in MT workloads, the 14900k takes 2nd place. So it's the 2nd most efficient CPU; how can you call that not efficient? It blows my mind.

Also, you don't need to fiddle with the BIOS. The first time you try to get into the BIOS (to set your XMP, for example) it will ask you what cooler you have and will set the limits accordingly. Well, if you set that to "unlimited wattage" you can't then complain that the CPU isn't efficient. I mean, duh.
 