Early Core i9-13900K sample shows massive improvement in minimum framerates, boosts to 5.8 GHz

Tudor Cibean

In a nutshell: Intel's Core i9-13900K will reportedly feature higher clock speeds than its predecessor and twice the number of E-cores, boosting multi-threaded performance tremendously. However, it will still use the Intel 7 node, so those gains come at the cost of a chip that's even more power-hungry and difficult to cool than current Alder Lake parts.

Hardware reviewer ExtremePlayer has posted some benchmarks of a Core i9-13900K qualification sample they managed to obtain. This CPU should offer near-identical performance to shipping products, as it features final clock speeds, unlike the engineering samples we've seen before.

This late sample features a 5.5 GHz all P-core turbo clock and boosts up to a whopping 5.8 GHz when only two cores are loaded. In both instances, that's 300 MHz faster than the pre-binned i9-12900KS.

Raptor Lake will rely on the same Intel 7 manufacturing process as Alder Lake, so the higher clock speeds come at the cost of higher power consumption and thus more heat. During testing, ExtremePlayer saw package power consumption jump to over 400W, with a 360mm AIO unable to keep the CPU from throttling.

The Core i9-13900K does shine in synthetic tests, seeing a 10 percent performance boost in single-threaded benchmarks compared to a current-gen i9-12900K. This improvement seems to come mainly from the increased clock speeds, as the IPC performance difference was within the margin of error.
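
As a quick sanity check on that attribution, here's the clock-speed arithmetic as a minimal Python sketch. Note the 5.2 GHz figure is the i9-12900K's peak boost clock from Intel's public specs, an assumption rather than a number from the leaked benchmarks themselves:

```python
# Rough check: how much single-thread uplift would clock speed alone predict?
# 5.8 GHz is the 13900K's reported two-core boost; 5.2 GHz is the i9-12900K's
# peak boost clock (Intel spec sheet, assumed here rather than taken from the leak).
rl_boost_ghz = 5.8
adl_boost_ghz = 5.2

clock_uplift = rl_boost_ghz / adl_boost_ghz - 1
print(f"Clock-speed uplift alone: {clock_uplift:.1%}")  # ~11.5%
# Close to the ~10% single-thread gain measured, consistent with the IPC
# difference being within the margin of error.
```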

Meanwhile, multi-threaded performance is up a whopping 35 percent on average, although this can be attributed to the eight extra E-cores. These small cores also boost up to 4.3 GHz, 400 MHz more than the ones in the 12900K.

Gaming performance was also tested using 32GB of DDR5-6400 memory and an RTX 3090 Ti. At 1080p, the Raptor Lake processor provided about a 28 percent improvement in minimum frame rates, while average FPS was only about 4 percent higher.

Intel might announce the Raptor Lake lineup at its Innovation event set for September 27, with retail availability expected in October.


 
Well, for the tiny percentage of people who will buy 13900K-powered rigs, power and cooling requirements are of little to no concern.

They will throw whatever $$$ it takes at it, to contain the beast.
 
I like performance.

I don’t like the thought of actually needing a 1kW PSU to power this and something like the purported RTX 4090. Even the multi-GPU rigs of yesteryear didn't get this insane.

This must stop.
Well, if you don't like the increase in power consumption, you have other options.
You can always buy an RTX 4070 and a 13700 or the AMD equivalent. Problem solved.
 
Couple of things to be mindful of here. Firstly, Intel uses the terms engineering sample and qualification sample to mean the same thing.

Secondly, use of the same process node doesn't automatically mean significantly higher power consumption at higher clock speeds in complex chips.

Take the Radeon RX 5700 XT and RX 6800, for example: the latter has twice as many transistors as the former, and base and boost clocks a few hundred MHz higher, but only consumes 25W more power. There’s little difference between the fundamental architectures in the two chips, but a major difference in terms of efficiency.

So there is scope for Raptor Lake to have been tweaked in that regard. It is, of course, an Intel chip, and they're kind of 'unique' in that respect, though…
 
Intel has said the underlying changes to RL were aimed at lowering power usage. I would not assume it uses more power, or at least not much more, especially in the more rational 13700 and lower products. The 13900K looks like a desperate, RTX 4090-like attempt to ensure bragging rights over AMD at any cost.
 
We already knew that the P-cores won't change much compared to the current generation, with the biggest change being the increased L2 cache. The IPC increase should be very low.

The extra cache and clock speed should help with gaming (which is why we keep hearing that AMD's 3D V-Cache CPUs will come this year).
 
The UK is having a heatwave right now; no way would I want to buy a CPU that puts out this amount of heat, and there's no way I'm investing in an aircon unit only to be used for a few weeks of the year.
 
Seriously, I wish reviewers would just say ‘yea, we ain’t testing with anything with more cooling potential than an NH-D15, because that’s not reasonable. Also, Blender is gonna run for half an hour before we run the multithreaded part of our test suite, because for people who need multithreaded performance, a lot of them are gonna need multithreaded performance ALL DAY’.
 

Pretty soon that NH-D15 will be used for chipset cooling; if we see 400W+ CPUs and GPUs, I'm sure the motherboard will need some cooling too. Water cooling will use a car/truck-sized radiator and fan.

 
And the energy cost in the UK would mean this costs a fortune. I worked out that just switching my Xbox One into power-saving mode saves me £1-2 per month (and all I have to do is wait 30 seconds for it to turn on). Glad I undervolted my GPU for the extra power savings as well.
 
Microsoft quotes 0.5W for regulatory standby and 11W for instant on, for the Xbox One S.

So that’s a 10.5W difference. Assuming 24 hours and 30 days, that’s saving 7.56 kWh of energy, per month. With the average UK electricity cost currently at 18.6 pence per kWh, that energy difference amounts to £1.41 — but you can save even more, an extra 7p per month, by doing just one thing.

Switch off the console, at the wall socket, when you’re not using it :) Seriously though, if you’re looking to save as many pennies as you can, properly switch off everything at the wall.
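
For anyone who wants to plug in their own tariff, here's that arithmetic as a minimal Python sketch. The wattages are the Xbox One S figures quoted above; the tariff is a parameter, since rates vary (a later reply mentions 26p):

```python
# Monthly cost of a device drawing constant standby power, in pence.
# Wattages are the Xbox One S figures quoted above; the tariff is an input.

def monthly_cost_pence(watts: float, rate_p_per_kwh: float,
                       hours: float = 24 * 30) -> float:
    """Cost in pence of drawing `watts` continuously for `hours`."""
    kwh = watts * hours / 1000
    return kwh * rate_p_per_kwh

RATE = 18.6  # pence per kWh (UK average cited above; substitute your own)

instant_on = monthly_cost_pence(11.0, RATE)  # 'instant on' standby
regulatory = monthly_cost_pence(0.5, RATE)   # energy-saving standby

print(f"Instant on:    {instant_on:.1f}p/month")  # ~147.3p
print(f"Energy saving: {regulatory:.1f}p/month")  # ~6.7p
print(f"Mode switch saves: {instant_on - regulatory:.0f}p/month")  # ~141p, i.e. ~£1.41
# Switching off at the wall saves the remaining ~7p on top.
```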
 
True, but it's a question of how far you'll go to save, I suppose. Also, the new price cap is a fair bit higher than 18.6p; mine is 26p. And it's spending money on timers to make sure you remember, versus one setting.
 
I've undervolted my 3060 Ti so the most it pulls at full load is around 160-180W depending on the workload, and if I'm playing single-player games I'll just cap at 60fps, which lowers power usage even more. On my Ryzen 5 3600, I used CTR 2.0 to automatically kick in and lower the maximum CPU voltage while still running at max boost clocks whenever the load goes over 20 percent.

UK energy costs are a nightmare at the moment; even if someone managed to secure a lower cost per unit, their standing charge (daily connection charge) would be 40p or something crazy. During 2020 I managed to secure a rate of 6p for electricity and gas.
 
It will only stop when people stop buying.
People will stop buying when there’s no reason to buy it, or the cost puts it out of reach of normal people.

If all game developers adopted Rimworld or non-RT Minecraft style graphics, nobody would need much more than a good APU for gaming except for bragging rights. But we know that’s not gonna happen.

Bottom line is, AAA developers will keep pushing graphics to higher and higher levels because, frankly, gameplay peaked a while ago, so that's pretty much all they have left to sell you the next sequel and all the included microtransactions. And that is why people will keep buying more and more demanding hardware: so they can play the latest shooter in all its 4K/240fps/RT glory, because they need that dopamine hit.

People with no AC coming from something like a 9700K/2080 and going to a 13700K/4080 are going to be in for a rude awakening if they plop that down in a bedroom. I have a 12900K/3090 sim rig, and it heats up my living room to the point where I don't want to use it on a hot summer day, because AC isn't a thing in my country. Fun times ahead for power users.
 
Intel's high power consumption is because they have been backporting designs since 11th gen. Give them time; it's not like Intel wants it this way. They want their process lead back.

GPUs scale far better than CPUs with higher power consumption, so I imagine the performance we get from next gen will determine how okay people are with the rise going forward. Perf is king, so if the ratio is nice, cards and PSUs will sell.
 
Perhaps the headline on this article should be

"Intel releases space heater that doubles as CPU" 🤣
 
Screenshot voltages: 1.146V vs 1.341V. That's a massive increase for a CPU like this. The top Intel CPUs can't be effectively overclocked without extreme cooling; once you get past 1.3V, temperatures start getting crazy high under load!
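
To put a rough number on why that voltage jump matters: dynamic CPU power scales approximately with frequency times voltage squared (the textbook CMOS approximation, ignoring leakage). A minimal sketch, assuming the two screenshot voltages are directly comparable operating points:

```python
# Textbook dynamic-power approximation: P ~ C * f * V^2 (ignores leakage).
# Voltages are from the screenshot mentioned above; treating them as
# directly comparable operating points is an assumption.
v_old, v_new = 1.146, 1.341

voltage_factor = (v_new / v_old) ** 2
print(f"From voltage alone: {voltage_factor:.2f}x dynamic power")  # ~1.37x

# Folding in the 300 MHz boost-clock gap the article cites (5.5 -> 5.8 GHz):
freq_factor = 5.8 / 5.5
print(f"Voltage + frequency: {voltage_factor * freq_factor:.2f}x")  # ~1.44x
```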
 
Actually, on the chipset the opposite is true: most modern CPUs are more SoCs than simple processors, which takes much of the load off the chipset. Where there's going to be a massive increase in the need for cooling is in the power-delivery components feeding the CPU. Simple passive heatpipe cooling isn't going to cut it any more for those parts.
 