Intel Core i9-13900K Review: Hot and Hungry

TechPowerUp has a more positive view of the Intel 13900K. It's a bit faster than the AMD 7700X, but I don't think it's worth the extra heat and power consumption. Not to mention, throttling will reduce the performance shown in these reviews. TechPowerUp recorded a 117C max temperature in their tests with the power limits removed, and 90s in gaming. It's 🔥 🔥 🥵
Who the h3ll removes the power limits on this chip when gaming?
 
Mostly Intel fanboys, and now they just got slapped in the face by their lord and saviour lol.

Thermal throttling after 17 seconds with a 360mm rad is crazy.
So, I guess based on power draw, a Core i5-12600K outperforms a 7950X in gaming by 2 to 1. The Core i5 uses 95W whereas the 7950X uses 195W.

Power draw is but one factor in what makes any CPU/GPU good or bad. The fact is the 13900K outperformed the 7950X most of the time in benchmarking, even if it did thermal throttle. And when it comes to cooling, they aren't that much different. Average temps for the 13900K are about 83C compared to the 7950X at 87.7C, almost a 5-degree difference, much like 95C for the 7950X and 100C for the 13900K at peak. Hmm, not such a clear-cut win here, is there? If the 13900K is going to be hard to cool, what makes anyone think the 7950X won't be equally hard?

In overall compute performance the CPUs are essentially tied, though the 13900K shows a slight ~5% advantage in gaming (depending on the benchmark mix). And the Intel CPU comes in slightly cheaper at $660 versus $700. In other words, I think the choice here is a coin toss. You can build a cheaper Intel system if you're willing to go with DDR4, but overall, if you're building a new system, the costs should be similar, if not minimally cheaper for an Intel build.

Interestingly, this article omitted any data on the Core i5-13600K or the Core i7-13700K, which appear to best the 7950X in gaming (though not productivity), use less power on average, and do so at a lower cost.


 
Proof that Intel's upcoming locked CPUs are going to be a hit. der8auer locked the 13900K at 90W and it still mopped the floor with AMD's new lineup in gaming.

 
Thank you, TechSpot, for the work & reviews...

Seems like the moral of the story is that the Ryzen 7700X is the chip to get if you are a gamer and don't care about productivity.

Buying anything Intel right now makes zero sense, because it's all EOL as a new socket is coming in just 8 months.
No, I do not think that is the moral of the story. Look at the i5 and i7 benchmark numbers. They are on average 10% faster (at 1080p) and, in the case of the Core i5, 20% cheaper. At 1440p the 7700X and 13600K are neck and neck, but the 13600K is still cheaper.

This is based on Tom's Hardware's review.

https://www.tomshardware.com/news/intel-core-i9-13900k-core-i5-13600k-cpu-review
 
According to Steve, that is. According to Gamers Nexus and Gordon's PCWorld, that is simply untrue. All of it.

edit: same conclusion from Gamers Nexus. Every reputable HW reviewer except Steve has different results. To keep it simple: the 7950X uses more power than Steve's tests show.
Interesting and concerning. I wonder if they could arrange an equipment swap and see if each could replicate the other's results.
 
This isn't a performance-per-watt chart, but it is an energy consumption one from the Blender data: it shows how many kJ of energy were required to do the test. (View attachment 88605)

Thank you for adding this. Performance per watt is an important metric. The review's Blender charts for the 7950X vs the 13900K show that the former completed the task in 88% of the time of the latter, while drawing 72% of the power (full system). 0.88 x 0.72 = 0.63, which is consistent with your added graph.

Meaning the Intel system takes at least 50% more energy to do that job. Worse are the normalized-score CB23 runs, during which Intel drew 70% more power.
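As a quick sanity check on that arithmetic (a rough sketch only; the 0.88 and 0.72 ratios are just the approximate values read off the review's charts, as quoted above):

    # energy = average power x time, so the two ratios multiply
    time_ratio = 0.88    # 7950X finished the Blender run in ~88% of the 13900K's time
    power_ratio = 0.72   # ...while drawing ~72% of the full-system power
    energy_ratio = time_ratio * power_ratio   # ~0.63: the 7950X system uses ~63% of the energy
    intel_extra = 1 / energy_ratio - 1        # ~0.58: the 13900K system needs ~58% more energy
    print(f"AMD/Intel energy ratio: {energy_ratio:.2f}; Intel extra energy: {intel_extra:.0%}")

That lines up with the "at least 50% more energy" figure above.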

So at least 50% more heat to be removed by room cooling (itself far from 100% efficient). And more system and room cooling noise.

I'm AMD-only for big-picture reasons, but these results are damning for Intel's latest tech.
 
A snippet from Gordon's review on power usage (he measures full-PC power usage at the wall, same GPU, same RAM size... I left a video a bit above)

[attached chart: full-PC power usage from Gordon's review]


In its default state (without going into advanced settings in the UEFI), it's in AMD territory. I'd say it performed poorly, but it did not draw some obscene amount of power (compared to the 7950X's draw; we are otherwise in obscene power usage territory already)... The 500W figure comes from removing the power limits, which is effectively overclocking. You can't take that as a reference power usage. In most games it uses less power than the 7950X and delivers more performance.
 
In most games it uses less power than the 7950X and delivers more performance.
Then by all means, keep buying Intel.
Nobody has to convince you or tell you what to do with your money.

Even if the Intel CPU were really that much better, which it is not, I would still refuse to buy anything from them, since the last thing I need is Intel back on top with no competition.
I remember what they did to me as a customer when they kept me in 4-core hell for a decade, how they laughed in my face when I said I needed more cores and their answer was "buy a US$1K CPU (Extreme Edition)", and how they have lied over and over with false, manipulated demos and "reviews".

Or forcing a motherboard replacement every other gen, because rea$on$.

Lastly, my moral compass won't let me give money to a company that has behaved, and perhaps still behaves, the way Intel has, resorting to, and being found guilty of, illegal actions.
 
How is the 7950X not throttling when it runs at 5.3GHz and not 5.7GHz?! Seems like it is throttling... Maybe Intel should just take a page out of AMD's book and ditch TDP and state monitoring as we know it, and rename those things so reviewers like them more. It's working well for AMD so far. TDP is meaningless on new AMD parts, max CPU speed is also meaningless; hell, everything is meaningless on AMD, since every reviewer uses standards set by Intel to judge both AMD and Intel, even though AMD is almost completely avoiding truthful sensor reporting (they are not lying, they are just using the "better" data for show). 170W TDP for the 7950X? Yeah, sure. And it's not throttling, because the maximum temperature AMD would consider throttling is 95C... Yet the part is marketed as 5.7GHz max boost. I have yet to see those speeds outside of overclocking it. But that's fine, because it's benevolent AMD, not Intel. In reality, the 7950X is not meeting the target boost of 5.7GHz, is not staying within its 170W TDP (I know Intel was the first to do this), is not reporting that it isn't running at max speed (throttling and pretending everything is fine, until a lawsuit hits them, like 12-something months ago), is not reporting limits, and is not reporting incorrect memory "training"...

Of course, we all look at the performance-per-watt ratio and decide which brand-new top-of-the-line CPU we will buy, as always. (I have a feeling this site invented that measure to push reviews toward AMD's side no matter what. As soon as it starts backfiring in Intel's favour they will stop using it. :p That's my personal belief, arrived at from reading this site over the past two or so years... AMD simply could not make a worse CPU than Intel, even when it's obviously worse to everyone else.)

I know I sound like an Intel fanboi, but the truth is I don't care. My mantra in regard to CPUs always was and still is that my budget for mobo, RAM and CPU is <450~500 EUR. I need it to be faster in gaming and real-world apps, and right now some B550 DDR4 3600X combo (or a 5600) would be my purchase, IF I couldn't get a 10600K or 11400K for cheaper.

edit: a more detailed and realistic power review for the 7950X and 12900K
"How is 7950X not throttling when it works @5.3 and not 5.7GHz?!" -> you are confusing single core boost clocks with all core boost clocks.

FYI, the 5.3GHz all-core clock speed for the 7950X is actually above what AMD said it can do (I think they mentioned 5.1GHz), which means that with enough thermal headroom you gain performance.

How do you know when a CPU is thermal throttling? When its performance noticeably drops after the system reaches heat saturation. This is what's happening with the 13900K: under an intensive workload it can only sustain high clock speeds for a short while, even with an expensive, ultra-high-end cooler (games don't stress it like this).
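To illustrate that idea only (a hypothetical sketch with made-up sample numbers and thresholds, not any reviewer's methodology), you could flag throttling as a sustained clock drop after the initial burst:

    # Hypothetical sketch: detect a sustained clock-speed drop after heat saturation.
    # The samples (one per ~5 s) and the 10% threshold are invented for illustration.
    samples_mhz = [5800, 5800, 5790, 5780, 5400, 5150, 5100, 5100, 5100]

    def looks_throttled(samples, drop_fraction=0.10, sustained=3):
        peak = max(samples[:3])                                  # clocks during the initial burst
        low = [c for c in samples if c < peak * (1 - drop_fraction)]
        return len(low) >= sustained                             # the drop must persist, not just blip

    print(looks_throttled(samples_mhz))  # True: clocks settled well below the early peak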
 
A snippet from Gordon's review on power usage (he measures full-PC power usage at the wall, same GPU, same RAM size... I left a video a bit above)

[attached chart: full-PC power usage from Gordon's review]


In its default state (without going into advanced settings in the UEFI), it's in AMD territory. I'd say it performed poorly, but it did not draw some obscene amount of power (compared to the 7950X's draw; we are otherwise in obscene power usage territory already)... The 500W figure comes from removing the power limits, which is effectively overclocking. You can't take that as a reference power usage. In most games it uses less power than the 7950X and delivers more performance.
Define "obscene" because even with power limits it's still drawing a lot more power.

As for the games, from what I've seen it depends on the title. But in most games it draws more power (especially in those that can make use of multiple threads).
 
I'm waiting for regulatory bodies to step in and limit sales of PSUs over, let's say, 500 watts, so that both hardware and software creators actually start to work on optimising their products, rather than making the industry the mess it is.
This is maybe off topic, but I think we are observing a possible end of x86-64, because they simply can't keep adding transistors.
They need to, maybe, somehow, break free of the legacy cr@p and start clean.

I don't know how, to be honest; I am not Jim "The G.O.A.T." Keller, but something like ARM needs to step up, for example.
 
Proof that Intel's upcoming locked CPUs are going to be a hit. der8auer locked the 13900K at 90W and it still mopped the floor with AMD's new lineup in gaming.

Far Cry, PUBG, etc. In those kinds of titles it's to be expected that if the Intel CPU can still hit its boost clocks on 2 cores, then it will be fine.

In general, both CPUs dropped by similar amounts in most titles when power was limited. In games that AMD wins, it remains on top (like AC Valhalla), and vice versa.
 
Anyone else notice the small variance between the 1080p and 1440p numbers?

The 4090 is such a monster that even these top-of-the-line CPUs are the bottleneck.

Pretty cool.

Would have preferred max settings and 4K benchmark results myself.
 
No, I do not think that is the moral of the story. Look at the i5 and i7 benchmark numbers. They are on average 10% faster (at 1080p) and, in the case of the Core i5, 20% cheaper. At 1440p the 7700X and 13600K are neck and neck, but the 13600K is still cheaper.

This is based on Tom's Hardware's review.

https://www.tomshardware.com/news/intel-core-i9-13900k-core-i5-13600k-cpu-review
Why would anyone buy an LGA1700 mobo and an EOL platform just to be close to the 7700X's performance?

What are you going to do with your brand-new iNTEL rig next year when the new CPUs are out?

I bought a 7700X because I know I will be able to buy whatever (10-core, 12-core, 14-core or 16-core) CPU AMD comes out with over the next 6 years... and just drop it in.

While 13th-gen owners get laughed at... for being drunk on the Kool-Aid. (Didn't AM4 teach you 8700K owners anything?)
 
I'm waiting for regulatory bodies to step in and limit sales of PSUs over, let's say, 500 watts, so that both hardware and software creators actually start to work on optimising their products, rather than making the industry the mess it is.
How about letting consumers do that with their wallets, which, based on comments here and elsewhere, does not seem like an idle threat (not at 500W specifically, but at "too much").

Meanwhile, keep in mind we are talking about a relatively small niche of the overall computing device market. Most of the volume is in phones, tablets, laptops, cheap desktops, etc., most of which have gotten a great deal of power-efficiency attention over the past decade.
 
The top AMD and Intel chips are not gaming CPUs; anyone buying these solely for gaming is silly.
Imagine a 4090 + 13900K together: does the 850W PSU Nvidia recommends still apply?

Intel already has some great budget gaming CPUs.

Remember the proviso: you have a 4090, so you have a 4K monitor, so you will run games at 4K or 1440p, where most top CPUs even out.

Wait for AMD's 3D V-Cache chips next year and a GPU price war as RDNA 3 is released.

For encoding/video, formula bashing, etc., AMD all the way: it will run for days, cooler and with less power draw.

Apple will have a laugh comparing some video app: M3 vs 13900K for power draw.
 
There was a chart with "Cinebench R23, Power Scaling" showing multi-core score versus wattage for the 13900K, and the 7950X slaughtered that chip... Why was it taken down? I see no reference to the chart anymore...?
 
Why would anyone buy an LGA1700 mobo and an EOL platform just to be close to the 7700X's performance?

What are you going to do with your brand-new iNTEL rig next year when the new CPUs are out?

I bought a 7700X because I know I will be able to buy whatever (10-core, 12-core, 14-core or 16-core) CPU AMD comes out with over the next 6 years... and just drop it in.

While 13th-gen owners get laughed at... for being drunk on the Kool-Aid. (Didn't AM4 teach you 8700K owners anything?)
That's not a strong argument for me. One could say the same about people who built AMD PCs last year using Zen 3. By the time I need to upgrade the mobo, there will likely be 2 or 3 new CPU generations out, not to mention all the other tech advancements I'll want. And I'll believe 6 years when I see it: AMD says 2025 and maybe beyond. Maybe.

I'm sure you'll like the 7700X, but I don't think that's a slam dunk decision.
 
This is maybe off topic, but I think we are observing a possible end of x86-64, because they simply can't keep adding transistors.
They need to, maybe, somehow, break free of the legacy cr@p and start clean.

I don't know how, to be honest; I am not Jim "The G.O.A.T." Keller, but something like ARM needs to step up, for example.
That's where Apple is putting their money, and they are not wrong. There are still plenty of places for x86, but ARM is certainly the future, especially as we become even more mobile-centric than we already are. Down the road you may not even need a high-end GPU if you're streaming games.
 
Zen 4 with V-Cache will definitely be worth waiting for after seeing this. We are promised virtually identical clock speeds this time around compared to the regular models, the V-Cache is far more refined, and we can expect possibly even larger uplifts in gaming and some productivity software. 7900X3D, here we come.
 
So, I guess based on power draw, a Core i5-12600K outperforms a 7950X in gaming by 2 to 1. The Core i5 uses 95W whereas the 7950X uses 195W.

Power draw is but one factor in what makes any CPU/GPU good or bad. The fact is the 13900K outperformed the 7950X most of the time in benchmarking, even if it did thermal throttle. And when it comes to cooling, they aren't that much different. Average temps for the 13900K are about 83C compared to the 7950X at 87.7C, almost a 5-degree difference, much like 95C for the 7950X and 100C for the 13900K at peak. Hmm, not such a clear-cut win here, is there? If the 13900K is going to be hard to cool, what makes anyone think the 7950X won't be equally hard?

In overall compute performance the CPUs are essentially tied, though the 13900K shows a slight ~5% advantage in gaming (depending on the benchmark mix). And the Intel CPU comes in slightly cheaper at $660 versus $700. In other words, I think the choice here is a coin toss. You can build a cheaper Intel system if you're willing to go with DDR4, but overall, if you're building a new system, the costs should be similar, if not minimally cheaper for an Intel build.

Interestingly, this article omitted any data on the Core i5-13600K or the Core i7-13700K, which appear to best the 7950X in gaming (though not productivity), use less power on average, and do so at a lower cost.


Compare apples to apples. The 7950X and 13900K are productivity chips. Don't compare productivity chips to gaming chips like the i5s or the Ryzen 7700X.

In terms of cooling, excess electricity usage/heat production is a bigger problem than an overly thick IHS. Intel has a problem with excess power use and heat output, which is harder to fix than AMD's problem of an overly thick IHS. The latter has more options for achieving lower temperatures: a better HSF, delidding, or playing around with undervolting. Fixes for Intel can be along similar lines, but they won't mitigate the problem as well.

For example, der8auer saw a 20C temperature drop when he delidded the 7000-series chip... You won't see that type of improvement if the problem is just too much power draw, as with Intel chips. AMD's much lower overall power consumption also allows a wider range of coolers and won't force you into bigger liquid cooling.
 