Intel Core i9-13900K Review: Hot and Hungry

GREAT article! That was a lot of work!

100% agree, the high-end Intel CPUs are power hogs and HOT! Personally, I discovered the weakness of these CPUs with my 10850K (10 cores) when I tried to overclock it. No matter what I tried over a period of months, it ran too hot for my tastes. In the end, I put everything back to stock to control the temps - I even had to force the Asus BIOS to use "Intel recommended settings", because even the Asus defaults produced too much heat (they add too much voltage out of the box).
 
There's a bit of an error, in that this text from the conclusion...

Here the 13900K was just 9% faster than the 12900K and 7900X, making it 13% slower than the 7950X, for an overall average result.

...does not match the graph. It should be the other way around: the 13900K is faster than the 7950X, at least according to the graph. (Ed. note: thanks, corrected!)
 
Yeah, but most of them seem to agree - except you and TechPowerUp. And we're not talking margin of error here; the difference is huge. The 13900K at your 125W limit is slower than a 12900K at 125W in Cinebench R23. That just doesn't make sense.
So first things first, just taking the stock R23 Cinebench MT figures:

Computerbase - 39551
Guru3D - 37624
Club386 - 38126
igorsLAB - 37599

That's a mean score of 38225 and a sample standard deviation of 917. We got 38003 - that's not a huge difference at all. It's within 0.24 standard deviations of the others' mean. TechPowerUp's figure of 32248 does seem unusually low, though.
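
For anyone who wants to check that arithmetic, here's a minimal Python sketch using just the scores quoted above:

```python
from statistics import mean, stdev

# Stock Cinebench R23 MT scores quoted above.
others = {
    "Computerbase": 39551,
    "Guru3D": 37624,
    "Club386": 38126,
    "igorsLAB": 37599,
}

m = mean(others.values())   # 38225.0
s = stdev(others.values())  # ~917 (sample standard deviation, n-1)

# How far the two remaining results sit from that four-outlet mean:
for name, score in (("TechSpot", 38003), ("TechPowerUp", 32248)):
    z = (score - m) / s
    print(f"{name}: {score} -> {z:+.2f} standard deviations")
```

Our 38003 lands at about -0.24 standard deviations; TechPowerUp's 32248 lands at roughly -6.5, which is why it stands out.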

Second thing, about the 13900K performing worse at 125W than the 12900K - well, they're different chips, and the fact that the 13900K's PL2 is 12W higher than the 12900K's might suggest that the former doesn't perform as well at lower power levels as its predecessor. Or it may well be a motherboard-related issue.
 
Fantastic review. Shame the same can't be said for the Intel part. How adding more of those "efficient" E-cores led to this is quite a mystery.

Intel (and the whole industry, frankly) needs to stop going balls to the wall on out-of-box clocks. Same for AMD; its 65-watt results were far more impressive than its 170-watt ones, but Intel has seriously lost the plot if it thinks a CPU pulling this much juice is the right way forward. What the hell are they going to do next year?

Thank goodness for the GOAT, the 5800X3D.

edit: seriously, that power scaling is hot garbage. What happened to Intel's "big.LITTLE design saves power" BS? When AMD gets nearly twice the performance per watt without "E" cores, you know you screwed up.
If the "E" cores were not there, the performance per watt difference would be even higher, especially for the MT benchmarks.
 
So if I'm upgrading my CPU and I want "the best" for gaming or productivity, I'll have to swap my 360mm AIO for a custom loop, buy a new PSU, and set my air conditioner to 18 degrees every day in the hope that my "best CPU" will run below 90 degrees, plus add "a few" percent to my electric bill every month. That's amazing, thank you Intel!!!

Funny to see that, thanks to Intel, the Ryzen 7000 series now doesn't look too bad.
 
I feel sad for Intel and for the people wanting to purchase this CPU. The money that will have to be spent to run and cool these things will be laughable for what you get. Intel's i9-13900K uses 493 watts compared to AMD's Ryzen 7 5800X3D, which uses 192 watts - a difference of 301 watts. And for what? You get 12 more frames at 1440p. I can get the 5800X3D for $390, compared to $630 for the 13900K. I think I will pass on this one, Intel.
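
For what it's worth, the trade-off described in this post boils down to a couple of divisions. This sketch uses only the figures quoted above (total-system watts and street prices, so treat it as rough, not package power):

```python
# The arithmetic behind this post, using only the figures quoted above.
intel_watts, amd_watts = 493, 192      # all-core total system draw
intel_price, amd_price = 630, 390      # quoted USD prices
extra_fps_1440p = 12                   # quoted average gap at 1440p

extra_watts = intel_watts - amd_watts      # 301 W
extra_dollars = intel_price - amd_price    # $240

print(f"+{extra_watts} W and +${extra_dollars} buys {extra_fps_1440p} extra fps")
print(f"~${extra_dollars / extra_fps_1440p:.0f} and "
      f"{extra_watts / extra_fps_1440p:.1f} W per additional frame")
```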
 
How is the 7950X not throttling when it runs at 5.3 and not 5.7GHz?! Seems like it is throttling... Maybe Intel should just take a page out of AMD's book, ditch TDP and state monitoring as we know them, and rename those things so reviewers like them more. It's working well for AMD so far. TDP is meaningless on the new AMD parts, max CPU speed is also meaningless; hell, everything is meaningless on AMD, since every reviewer uses standards set by Intel to judge both AMD and Intel, even though AMD is almost completely avoiding truthful sensor reporting (they are not lying, they are just using the "better" data for show). 170W TDP for the 7950X? Yeah, sure. And it's not throttling, because the max temp AMD would consider throttling is 95C... Yet the part is marketed as 5.7GHz max boost. I have yet to see those speeds outside of overclocking it. But that's fine, because it's benevolent AMD, not Intel. In reality, the 7950X is: not meeting the target boost of 5.7GHz, not staying within the max 170W TDP (I know Intel was the first to do this), not reporting that it's not running at max speed (throttling and pretending everything is fine, until a lawsuit hits them, like it did some 12 months ago), not reporting limits, not reporting incorrect memory "training"...

Of course, we all look at the power-to-performance ratio and decide which brand-new top-of-the-line CPU to buy, as always. (I have a feeling this portal invented that measure to push reviews towards AMD's side no matter what. As soon as it backfires in Intel's direction, they will stop using it. :p That's my personal belief, arrived at from this portal over the past two or so years... AMD simply could not make a worse CPU than Intel, even when it's obviously worse to everyone else.)

I know I sound like an Intel fanboi, but the truth is - I don't care. My mantra in regards to CPUs always was and still is that my budget for mobo, RAM and CPU is <450~500 EUR. I need it to be faster in gaming and real-world apps, and right now some B550 DDR4 3600X combo would be my purchase (or a 5600), IF I couldn't get a 10600K or 11400K for cheaper.

edit: a more detailed and realistic power review of the 7950X and 12900K
 
Fantastic review. Shame the same can't be said for the Intel part. How adding more of those "efficient" E-cores led to this is quite a mystery.

Intel (and the whole industry, frankly) needs to stop going balls to the wall on out-of-box clocks. Same for AMD; its 65-watt results were far more impressive than its 170-watt ones, but Intel has seriously lost the plot if it thinks a CPU pulling this much juice is the right way forward. What the hell are they going to do next year?

Thank goodness for the GOAT, the 5800X3D.

edit: seriously, that power scaling is hot garbage. What happened to Intel's "big.LITTLE design saves power" BS? When AMD gets nearly twice the performance per watt without "E" cores, you know you screwed up.

Just wait for the AMD Ryzen 9 7950X3D, Ryzen 9 7900X3D, and Ryzen 7 7800X3D to be released. :)
 
So first things first, just taking the stock R23 Cinebench MT figures:

Computerbase - 39551
Guru3D - 37624
Club386 - 38126
igorsLAB - 37599

That's a mean score of 38225 and a sample standard deviation of 917. We got 38003 - that's not a huge difference at all. It's within 0.24 standard deviations of the others' mean. TechPowerUp's figure of 32248 does seem unusually low, though.

Second thing, about the 13900K performing worse at 125W than the 12900K - well, they're different chips, and the fact that the 13900K's PL2 is 12W higher than the 12900K's might suggest that the former doesn't perform as well at lower power levels as its predecessor. Or it may well be a motherboard-related issue.
Great, can you explain this here too? The following text is quoted from the review:

Under an all-core workload, the 13900K pushed total system usage to 493 watts, and remember that's with the GPU doing nothing, this is primarily CPU load.


So it would be fair to say the actual CPU consumed around 400 watts in Blender? At stock!! FOUR HUNDRED? I don't know how that makes sense, but okay.
 
What a great and detailed review, thanks and congrats on the excellent work!

I own a Ryzen 3600, an MSI B450 Tomahawk Max, and a good air cooler (be quiet! Dark Rock 4), along with an Nvidia 3060 Ti.
I am satisfied with my gaming performance, tbh, but it seems my best choice if I want an upgrade would be to just buy the 5800X3D, correct?

Cheers
 
Such an AMD-tinted review. Yes, it's hot and power-hungry, which is the usual case for halo products. Plus, your wattage metrics look at peak and not sustained consumption (around 250ish, looking at all the other reviews). Even then, the conclusion and the comparison to the 7700X make no sense when the context completely lacks the 13600K, which beats the 7700X in all scenarios. Nice benchmarks, nice graphs, but very little attention to context in the writing. You should've published the review once you'd tested the 13600K as well.
 
Great, can you explain this here too? The following text is quoted from the review:


Under an all-core workload, the 13900K pushed total system usage to 493 watts, and remember that's with the GPU doing nothing, this is primarily CPU load.


So it would be fair to say the actual CPU consumed around 400 watts in Blender? At stock!! FOUR HUNDRED? I don't know how that makes sense, but okay.
The entire system is using 493 watts. The GPU isn't doing any 3D rendering, of course, but like every other component in the machine, it's still active -- even just displaying Windows, it'll consume around 20 to 25W. Now add in the motherboard, fans, cooling system, RAM, SSD, PSU efficiency losses, etc. It's also worth noting that with the CPU drawing shed loads of current, the motherboard's power consumption isn't going to be just a few watts.
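
To make that concrete, here's a rough back-of-envelope sketch; every non-CPU number in it is an illustrative assumption, not a measured figure from the review:

```python
# Rough estimate of CPU package power from a wall-socket reading.
# Only the 493W figure comes from the review; everything else is assumed.
wall_watts = 493           # measured total system draw under all-core load
psu_efficiency = 0.90      # assumption: ~90% efficient PSU at this load

dc_watts = wall_watts * psu_efficiency  # power actually delivered inside the case

# Assumed draw of everything that isn't the CPU package:
gpu_idle = 22         # GPU just displaying the desktop
board_and_vrm = 40    # motherboard/VRM losses climb under heavy CPU current
ram_ssd_fans = 25     # memory, storage, fans, pump

cpu_estimate = dc_watts - (gpu_idle + board_and_vrm + ram_ssd_fans)
print(f"Estimated CPU package power: ~{cpu_estimate:.0f} W")  # ~357 W
```

Under those assumptions the package lands meaningfully below 400W, which is the point of the breakdown above.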

The important thing to take away from the Blender data is that with the 13900K in the setup, the machine's consuming 122 more watts than when it had the 12900K inside it.

Again, this could be something that's unique to that particular motherboard. Further testing, which Steve will invariably do, will give a better insight as to just what's going on with the new CPUs.
 
To say we're disappointed with the Core i9-13900K would be an understatement.

Nothing like showing a little bias here, right? The 13900K bested or equalled the 7950X in most tests, and yet you give the 7950X a score of 95 and the 13900K a score of 75. The 7950X costs more and performs worse.
 
Thx for the great review and the smart conclusions, Steve.
On paper the 13900K looks good; in reality, not so much. It wins some and loses some, but there are too-big discrepancies between what Intel claims and what it delivers. The best examples are power consumption and prices.
The biggest retailers sell the 13900K for anywhere from $569 (a special offer at Microcenter) and $659 (Newegg) up to $889 (Amazon)? A big increase from the $589 that Intel fed the media at their paper launch.
https://www.microcenter.com/site/content/intel-gen13.aspx?linkId=100000157113057
https://www.amazon.com/stores/page/...4d33-803f-ae33749adfc0&linkId=100000157113056
At these prices for the 13900K, the Ryzen 7950X is an easy winner.
Also, I am pleasantly amazed at how the AMD Ryzen 5800X3D is fighting both the 13900K and the 7000 series in gaming.
So why did Intel choose to deceive customers with fake prices and claims?
Do their paper claims make more sense for investors? Did they need to tick all the boxes for investors?
Lower prices on paper than the 7950X? Checked.
"Better" gaming performance than the 7950X? Checked.
Power, temperatures and efficiency? Doesn't matter to investors, and there's "friendly" advice for reviewers not to talk about them, or better, to pretend they don't exist.
My conclusion is that the 13900K was made to please investors on paper, to the detriment of customers.
Customers who intend to buy the 13900K may discover that they have to pay almost the same or more than for a 7950X, while being locked into a dead and inferior platform instead of the better, future-proof AM5 platform.
As a funny joke, Intel launched the 13900K at the best time for them, when the weather is colder, so the 13900K can run for two minutes before throttling. During winter the 13900K may be quite useful as a fancy oven too. With the 13900K, Intel will have to explore new markets like Alaska, Antarctica and Greenland.
Oh, and the Intel 13900K is suffering from Premature TempJack. Hero for two minutes max? Even the best AIO coolers cannot cure the 13900K of Premature TempJack.
And if I quote Steve and call a spade a spade, Intel managed to make the 13900K a sad "joke" instead of the "best (gaming or not) processor".
In the end, the 13900K may be the best gaming processor today, but only with huge caveats, and once Zen 4 3D releases in 3-4 months it won't matter anymore.
 
If the "E" cores were not there, the performance per watt difference would be even higher, especially for the MT benchmarks.
You say that, but every time Intel adds more "E" cores, the power usage of their chips flies further off the handle.

Meanwhile, the lower-end Alder Lake chips with only P-cores do very well in power-limited scenarios, such as the 35-watt T lineup.

Rocket Lake P-cores are near identical to Alder Lake's.
Nothing like showing a little bias here, right? The 13900K bested or equalled the 7950X in most tests, and yet you give the 7950X a score of 95 and the 13900K a score of 75. The 7950X costs more and performs worse.
The 7950X drew 170 watts; this monster drew over 350. That's mega-GPU level. And it was barely any faster than the 12900K.

Somehow, saying that is bad is "bias". Will the internet's stupidity know no bounds?
How is the 7950X not throttling when it runs at 5.3 and not 5.7GHz?! Seems like it is throttling... Maybe Intel should just take a page out of AMD's book, ditch TDP and state monitoring as we know them, and rename those things so reviewers like them more. It's working well for AMD so far. TDP is meaningless on the new AMD parts, max CPU speed is also meaningless; hell, everything is meaningless on AMD, since every reviewer uses standards set by Intel to judge both AMD and Intel, even though AMD is almost completely avoiding truthful sensor reporting (they are not lying, they are just using the "better" data for show). 170W TDP for the 7950X? Yeah, sure. And it's not throttling, because the max temp AMD would consider throttling is 95C... Yet the part is marketed as 5.7GHz max boost. I have yet to see those speeds outside of overclocking it. But that's fine, because it's benevolent AMD, not Intel. In reality, the 7950X is: not meeting the target boost of 5.7GHz, not staying within the max 170W TDP (I know Intel was the first to do this), not reporting that it's not running at max speed (throttling and pretending everything is fine, until a lawsuit hits them, like it did some 12 months ago), not reporting limits, not reporting incorrect memory "training"...

Of course, we all look at the power-to-performance ratio and decide which brand-new top-of-the-line CPU to buy, as always. (I have a feeling this portal invented that measure to push reviews towards AMD's side no matter what. As soon as it backfires in Intel's direction, they will stop using it. :p That's my personal belief, arrived at from this portal over the past two or so years... AMD simply could not make a worse CPU than Intel, even when it's obviously worse to everyone else.)

I know I sound like an Intel fanboi, but the truth is - I don't care. My mantra in regards to CPUs always was and still is that my budget for mobo, RAM and CPU is <450~500 EUR. I need it to be faster in gaming and real-world apps, and right now some B550 DDR4 3600X combo would be my purchase (or a 5600), IF I couldn't get a 10600K or 11400K for cheaper.

edit: a more detailed and realistic power review of the 7950X and 12900K
Because the 7950X was power-limited, not temp-limited? Can you even read?
 
Proof that Intel's upcoming locked CPUs are going to be a hit. der8auer locked the 13900K at 90W and it still mopped the floor with AMD's new lineup in regards to gaming.

 
@Theinsanegamer:

I agree with a lot of what you wrote. But, :)

Because the 7950X was power-limited, not temp-limited? Can you even read?
I can, and that statement is wrong. A power-limited 7950X stops at its 170W TDP, right? ;) Nope, it's happy to go to 300W. The 7000-series Ryzens are all temp-limited to 95C, where, according to AMD, the CPU's boost logic will start dropping frequency as the temp goes over 95C, up to whatever they claim it can go to - to supernova, I guess, because AMD is that perfect. Jokes aside, that's what AMD says. That's why if you cool your 7000-series Ryzen well and keep it under 95C (preferably under 80C), you should get max boost - but you don't. Not with every CPU (the 7600, 7700, 7900, etc. all behave a bit differently from one another in regards to boost).

Somehow, saying that is bad is "bias". Will the internet's stupidity know no bounds?
It's biased because I can, right now, link you to two other, bigger portals and reviewers that will crush that theory and show TechSpot wrong. I linked one up there already. It shows how lazily the power testing here is done, by TechSpot, or Hardware Unboxed. Then they take those results and use them as a reference, and the review gets high praise from Intel haters.
Now, if Gordon - who has been an IT journalist since almost before Steve was born - if he is wrong and Steve is right, I will stop listening to Gordon. They can't both be right. And Gordon just used a power meter and measured power draw at the wall, and guess what: the 7950X uses as much power as the 12900K does, sometimes more. And it's not 170W, it's almost twice that, depending on the workload (Cinebench and Handbrake will push it, Photoshop won't - and then Steve takes 50W from the Photoshop test and 300W from the CB test and says: this is a 250W CPU, because (A+B)/2 = whatever I aim for it to be).
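
To make the "(A+B)/2" complaint concrete, here's a tiny sketch with made-up numbers (the wattages and durations are illustrative, not anyone's test data): a two-point mean ignores how much time the CPU actually spends in each workload.

```python
# Made-up illustrative numbers: (workload, average watts, minutes spent in it).
readings = [
    ("Photoshop", 50, 50),
    ("Cinebench", 300, 10),
]

# Naive two-point mean: just averages the two wattages.
naive = sum(w for _, w, _ in readings) / len(readings)

# Time-weighted mean: weights each wattage by how long it was sustained.
weighted = sum(w * t for _, w, t in readings) / sum(t for _, _, t in readings)

print(f"Naive two-point mean: {naive:.0f} W")    # 175 W
print(f"Time-weighted mean:   {weighted:.1f} W") # ~91.7 W
```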

The AMD statistics that show up in HWiNFO or wherever are pure misinformation. Deliberate misinformation that AMD wants the people who review hardware to believe... To conclude, the 7950X uses 300W no problem, stock. In the same application, the Intel 12900K uses ~290W. In that respect, the 13900K is using only ~30% more power and delivers what it delivers (in some places great results, in some places barely faster than the 12900K).

Intel Alder Lake and Raptor Lake are a waste of sand. P-cores, E-cores - that's the last of them; I am fairly sure they will drop that circus and continue "normally" with the 14 series.

The 7950X drew 170 watts; this monster drew over 350. That's mega-GPU level. And it was barely any faster than the 12900K.
According to Steve. According to Gamers Nexus and Gordon's PCWorld, that is simply untrue. All of it.

edit: same conclusion at Gamers Nexus. Every reputable HW reviewer except Steve has different results. To keep it simple: the 7950X uses more power than Steve's tests show.
 
TechPowerUp has a more positive view of the Intel 13900K. It's a bit faster than the AMD 7700X, but I don't think it's worth the extra heat and power consumption. Not to mention, throttling will reduce the performance shown in these reviews. TechPowerUp recorded a max temperature of 117C in their tests with power limits removed, and 90s in gaming. It's 🔥 🔥 🥵
 