Retail Core i9-13900K CPU reviewed, provides impressive performance uplift with power...

Tudor Cibean

Staff
The big picture: Details about Intel's upcoming Raptor Lake series have slowly been leaking over the past few months. After today's review of a retail unit of the i9-13900K, the only piece missing from the puzzle is pricing, which will hopefully be competitive against Team Red's Zen 4 lineup.

Hardware leakers ECSM and OneRaichu recently posted an in-depth review of a retail unit of Intel's upcoming Core i9-13900K flagship CPU. They have since removed the article (likely at Intel's request), but the charts and results are fortunately still circulating on the web.

First up, a few notes about the test systems used. The reviewers compared the Raptor Lake processor with its predecessor, the i9-12900KF, and tested them with both DDR5-6000 CL30 and DDR4-3600 CL17 memory. A beefy 360mm AIO cooler kept temperatures (somewhat) under control, while an AMD Radeon RX 6900 XT was used for the gaming benchmarks.

ECSM tested the CPUs with their power limits disabled in the BIOS, which saw the i9-13900K draw up to 343W during AIDA64's FPU test (vs. the i9-12900KF's 236W). Users wanting to manually overclock the i9-13900K will likely require sub-ambient cooling solutions.

Apart from the core count and frequency increases we've talked about before, the reviewers noted the Raptor Lake CPU has a new ring bus design working at a higher frequency. The i9-13900K has more L2 and L3 cache, with bandwidth and latency also seeing improvements.

In Cinebench R23's single-threaded test, the i9-13900K's P-core (Raptor Cove) is 13 percent faster than its predecessor using the Golden Cove architecture, with the E-cores also seeing a similar uplift. Meanwhile, multithreaded performance was, on average, 42 percent higher on the 13th-gen chip. The new CPU provides decent gains in games as well, with CSGO seeing the biggest improvement, especially when it comes to minimum framerates.

Intel will officially announce the Raptor Lake lineup at its Innovation event next week, with retail availability starting in late October. The new CPUs will compete with AMD's Zen 4 series, which launches on September 27.


 
"Users wanting to manually overclock the i9-13900K will likely require sub-ambient cooling solutions."

Here we go... phase-change cooling for gamers and overclockers :( This is not where we need to go, Intel/AMD.
We'll see, but I think that's being a little dramatic.
 
That moment when I realize that the 13900K draws almost as much power as my Threadripper 3970X and has significantly lower multi core performance.

Not that "significant," if we can judge by what we've seen so far.
The 13900K (power limits unlocked) scores >40K in CB R23, which isn't miles away from TR 3970X results.
 
A 13900KS + 4090 Ti will require a 1.5 kW ATX 3.0 PSU with 100% spike protection. Oh well, I guess Intel's hype about lowering Raptor Lake's energy use was all BS. Technically it's probably a bit more efficient per core at full tilt than the 12900K (343/24 = 14.29 W/core vs. 236/16 = 14.75 W/core), but that's a lame difference. Imagine cRaptor Lake in laptops. Phoenix Point will destroy them on battery life.
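The commenter's per-core figures can be verified with a quick sketch, dividing each chip's reported peak draw (AIDA64 FPU test, power limits off) by its total core count (24 = 8P + 16E for the 13900K, 16 = 8P + 8E for the 12900KF):

```python
# Rough watts-per-core check based on the leaked AIDA64 FPU figures.
chips = {
    "i9-13900K": (343, 24),   # peak watts, total cores (8P + 16E)
    "i9-12900KF": (236, 16),  # peak watts, total cores (8P + 8E)
}
for name, (watts, cores) in chips.items():
    print(f"{name}: {watts / cores:.2f} W/core")
# i9-13900K: 14.29 W/core
# i9-12900KF: 14.75 W/core
```

Note this treats P-cores and E-cores as equal consumers, which they are not, so it is only a coarse comparison, as the commenter concedes.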
 
Funny, it seems everyone has the same takeaway from this. Too power hungry and too hot.

When you coast for years on a development environment where you do little because you are the "king" (e.g. Intel, Nvidia...), it is normal that most improvements come from the node:

- if you improve the node (even if it doesn't get smaller), you can pack in more transistors and draw more power = more performance

- with that, you just have to make sure nothing burns up

As most nodes (due to physics) aren't advancing as fast as before, there are no huge improvements in speed or power consumption. At least Apple and AMD are showing bigger improvements in that area.

ATM I find the energy consumption of Intel and Nvidia chips ridiculous just so I can game. I have a personal and social life and don't live for gaming, so I can wait until something shows up with good performance per watt (i.e., that doesn't deplete my wallet buying it, powering it, and cooling down the room).
 
Everyone is concerned about power draw. I understand where these comments are coming from but they're not true for real world use. Unless you're rendering video or doing anything else extremely intensive the CPU will never draw that much power during normal use and gaming.
 
Everyone is concerned about power draw. I understand where these comments are coming from but they're not true for real world use. Unless you're rendering video or doing anything else extremely intensive the CPU will never draw that much power during normal use and gaming.
Well, the time has come for Intel to understand that karma caught up with them fast. A few years ago, before the Zen era, Intel was shouting about how efficient it was and stigmatizing AMD processors as power hungry. Now the 13900K is a power-hungry oven and Intel is tasting its own medicine (a trend that started climbing with the 11900K and 12900K). And it's terrible considering how expensive electricity is nowadays.
 
When 13th gen arrives, we'll swap out our 9th-12th gen chips fast. And yet you can still play on a Pentium 2/3/4 at half the FPS. But PCIe 4.0-5.0 beats PCIe 1.0 and AGP 4x/8x.
 
Everyone is concerned about power draw. I understand where these comments are coming from but they're not true for real world use. Unless you're rendering video or doing anything else extremely intensive the CPU will never draw that much power during normal use and gaming.
I always used PSUs rated below the recommended wattage, even though I have a lot of fans and drives connected to them.
Judging by my experience, I can have the latest CPU and GPU and still pair them with a good-quality 800 W PSU, and it will work just fine.
 
Not that "significantly" if we can judge now.
13900k (unlimited) scores >40k in CB23 which isn't miles away from TR3970x results.

Here are my 3970X CB R23 results at stock power limits, cooled with only a 280mm radiator. What 13900K leaked result shows a 50,000+ CB R23 score?
 