Intel Core i9-9900KS Review: Limited Special Edition

Who Is It For?

Gamers and only gamers who want the fastest gaming CPU and are willing to pay a pointless premium for bragging rights.

Essentially you will perceive no difference in games versus a standard 9900K. If you want about 95 percent of the same gaming performance for 70 percent of the price, as your graphs demonstrate, you buy the 9700K for $160 less.

Even in a highly threaded game like Battlefield, I find it pretty unlikely you'll visibly tell much difference between a knocked-down $190 2700X and this chip, although it does streak away somewhat in Tomb Raider, an Intel stronghold.
 
It runs cooler than the 3800X, so QQ more.

You are talking about a difference of more than 100W under load. It can't possibly run cooler (science and stuff). Are you confusing this with something else?

Heat and temperature are not the same thing. It is entirely reasonable for the 3800X to consume less power and still run hotter.

Power => heat, and heat/area (naively) => temperature.

The 3800X consumes less power but is cramming all of its transistors into a smaller area, so the temperature is higher. Go watch Buildzoid's video on 3700X boost behavior where he's cramming 40W into a few square millimeters of die space by running an intense single threaded load and getting temperatures of around 70C.

*EDIT* To add, power consumption is what determines how much your room will heat up (because it is equivalent to the amount of heat generated by each component), so yes, the 9900KS WILL heat your room up more than any other processor in this test.
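To put very rough numbers on that power-density point, here is a minimal back-of-the-envelope sketch. The die areas (roughly a ~74 mm² Zen 2 compute chiplet versus a ~180 mm² monolithic 8-core Coffee Lake die) and the load power figures are illustrative assumptions, not measurements from this review.

```python
# Back-of-the-envelope power density comparison (illustrative numbers only).
# Assumed die areas: ~74 mm^2 for a Zen 2 compute chiplet, ~180 mm^2 for the
# monolithic 8-core Coffee Lake die. Power figures are rough load estimates.
chips = {
    "3800X (one CCD)": {"power_w": 90.0,  "area_mm2": 74.0},
    "9900KS":          {"power_w": 200.0, "area_mm2": 180.0},
}

for name, c in chips.items():
    density = c["power_w"] / c["area_mm2"]  # W/mm^2 of heat to pull out of the silicon
    print(f"{name}: {c['power_w']:.0f} W over {c['area_mm2']:.0f} mm^2 "
          f"= {density:.2f} W/mm^2")

# The chip with the higher W/mm^2 can report higher core temperatures even
# though it dumps less total heat into the room.
```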
 
Power = Heat. It's how things work, it's physics. You can't ignore 100W. What people are describing is the difference between coolers and how well they dissipate the heat.
The die area is also not a good argument, since it's actually Intel that has a monolithic design, not AMD. And please don't bring TDP into the discussion, we all know that it's irrelevant.

edit: fixed embarrassing typos :p
 
"For those with a more modest graphics card and a desire to do things outside of gaming, AMD’s 3rd-gen Ryzen is the better choice."

Yeah, it's definitely not that cut and dried.

I'm curious to see a head-to-head using more popular pieces of software people actually use. How about an Intel Quick Sync versus Ryzen comparison in Photoshop, or Intel and AMD both using CUDA in Blender, or Ryzen's insane core count against Intel's fewer cores at the same price point?

Something like that would interest me more than Cinebench and WinRAR scores.

We keep benching actual gameplay to test CPUs and GPUs, but for some reason only synthetic productivity apps for CPUs.
 
"For those with a more modest graphics card and a desire to do things outside of gaming, AMD’s 3rd-gen Ryzen is the better choice."

Yea, it's definitely not that cut and dry.

I'm curious to see a head to head using more popular pieces of software people actually use. Like how about an Intel Quick sync versus Ryzen comparison in Photoshop, or Intel and AMD both using CUDA in Blender. Or Ryzens insane core count against Intel's fewer cores at the same price point.

Something like that would interest me more than Cinebench and WinRAR scores.

We keep benching actual gameplay to test CPU's and GPU's, and only synthetic productivity apps for CPU's for some reason.

Probably because more people will use WinRAR or 7-Zip than Photoshop? The subset of computer users that have to uncompress files is most likely larger than the subset of users that are producing professional-grade photo manipulations.
 
Power = Heat. It's how things work, it's physics.

Yes, that's what I said.

You can't ignore 100W. What people are describing is the difference between coolers and how well they dissipate the heat.

That's decidedly not what they're talking about.

The die area is also not a good argument, since it's actually Intel that has a monolithic design, not AMD. And please don't bring TDP into the discussion, we all know that it's irrelevant.

Whether the design is monolithic or not isn't particularly relevant here. I should technically have said transistor density rather than die area, but the two are essentially interchangeable considering the comparable transistor counts and the knowledge that they're running two different processes (with TSMC's 7FF being quite a bit more dense than Intel's 14++.)
 
I would love to see TechSpot do a review where they take 4-5 of their staff and set up 4-5 rigs, all pushing the same exact 144Hz 1080p/1440p monitor; I don't care what model it is as long as they are identical.
Then take an AMD Ryzen 3600, 3700X and 3900X plus an Intel 9700K and 9900KS to build the rigs with, keeping all other components as similar as possible, obviously all pushing the same GPU. Each staff member plays each rig for about 15 minutes, then switches. I would like to see, at roughly 100-150 FPS with the monitor locked in at 144Hz, whether a user notices a difference, or a noticeably smoother experience, in a game like Tomb Raider or The Division on the Intel chips. If your LCD is locked at 144Hz and you have one rig running 125 FPS and another running 140 FPS, it could make a difference (see the quick frame-time math below).
I get how anything over 60 FPS is very smooth, and there are various factors/differences to consider, but you know what I mean.
Would love to see this and hear honest opinions from real-world users, not biased online yuppies spouting the same brand-loyal nonsense over and over, as if to convince themselves.
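To put that 125 vs. 140 FPS scenario in frame-time terms, here is a quick sketch; the FPS values are simply the ones picked in the post above, not measurements:

```python
# Frame-time gap between the two average frame rates used in the example above.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

slow, fast = 125.0, 140.0
print(f"{slow:.0f} FPS -> {frame_time_ms(slow):.2f} ms per frame")   # 8.00 ms
print(f"{fast:.0f} FPS -> {frame_time_ms(fast):.2f} ms per frame")   # 7.14 ms
print(f"Difference: {frame_time_ms(slow) - frame_time_ms(fast):.2f} ms per frame")  # ~0.86 ms
```

Whether a gap of under a millisecond per frame is actually perceptible is exactly what a blind test like the one proposed above would have to answer.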
 
Fully unleashed without TDP restrictions, the 9900K was already a power-hungry processor, but the 9900KS sucks down even more power for a minor performance gain.

Wait. The KS model should hit all-core 5 GHz at stock, i.e. out of the box with the power limit set to its own TDP (127W). We've learned how the K model performs with the power limit set to its TDP (95W). What about a stock KS?

In the 9900K re-review, Steve said:
Looking ahead into future reviews, we plan to stick to show the typical out of the box experience.

Why is there no such data in the KS article? Does that mean unlimited power is the typical out-of-the-box experience on Intel-based products? I'm not an Intel product owner at the moment.
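For anyone wondering what "stock" is even supposed to mean here, below is a simplified sketch of Intel's PL1/PL2/tau turbo-budget scheme. PL1 is the 127W TDP mentioned above; the PL2 multiplier and the 28-second tau follow Intel's generic guidance and are assumptions rather than confirmed 9900KS defaults, and many boards simply ignore these limits out of the box, which is exactly the question being asked.

```python
# Simplified sketch of Intel's PL1/PL2/tau turbo budget (illustrative values).
# PL1 is the 127 W TDP quoted above; PL2 = 1.25 * PL1 and tau = 28 s follow
# Intel's generic guidance and are assumptions, not confirmed KS defaults.
PL1, PL2, TAU = 127.0, 127.0 * 1.25, 28.0   # watts, watts, seconds
DT = 1.0                                     # simulation step in seconds

def allowed_power(requested_w: float, avg_w: float) -> float:
    """Clamp the requested package power according to the turbo rules."""
    if avg_w < PL1:
        return min(requested_w, PL2)  # budget left: bursts above TDP, capped at PL2
    return min(requested_w, PL1)      # budget spent: fall back to the TDP (PL1)

avg = 0.0
for t in range(120):
    drawn = allowed_power(200.0, avg)        # a heavy all-core load asking for ~200 W
    alpha = DT / TAU
    avg = (1 - alpha) * avg + alpha * drawn  # exponentially weighted average over ~tau
    if t in (0, 10, 30, 60, 119):
        print(f"t={t:3d}s  package={drawn:6.1f} W  rolling avg={avg:6.1f} W")
# With the limits enforced, sustained draw settles near PL1 (127 W); with the
# limits lifted, as many boards do by default, the chip draws whatever it needs.
```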
 
Whether the design is monolithic or not isn't particularly relevant here. I should technically have said transistor density rather than die area, but the two are essentially interchangeable considering the comparable transistor counts and the knowledge that they're running two different processes (with TSMC's 7FF being quite a bit more dense than Intel's 14++.)
Density is irrelevant in this case. I seriously have no idea why you are pushing this argument, especially when CPUs limit their max operating temps anyway. And sure, density went up, but power usage went down. That's why you see the 100W difference.

In this particular case, the boxed cooler is enough for AMD, while you can't put the same cooler on this CPU and expect anything other than a broken, thermally throttling mess.

FYI die area is extremely important for heat dissipation.
 
I would love to see TechSpot do a review where they take 4-5 of their staff and set up 4-5 rigs, all pushing the same exact 144Hz 1080p/1440p monitor...

Tom's Hardware tried to do that in the past but failed to do a double-blind study or even properly conceal the machines. It's a lot more effort than you think to ensure the results are unbiased.

Unless they are gathering a rather large sample size across a broad cross-section of games with a good selection of different test subjects, I don't really think just TechSpot's staff would be enough people. I'd also say you are more or less getting a narrow perspective.
 
I would love to see TechSpot do a review where they take 4-5 of their staff and set up 4-5 rigs, all pushing the same exact 144Hz 1080p/1440p monitor...
I like your idea, but would suggest a different approach:

Build several systems for the same price, using different mainboards (e.g. base vs. premium), different GPUs (e.g. a cheaper CPU plus a higher-end GPU and vice versa), CPUs and memory.

Cooling (case and CPU), PSU... should all be included in the price calculation.

The end result would be to see which system gives the best bang for the buck in specific price ranges for productivity and gaming.

The cool thing would also be to see what trade-offs there are, and that this would be a true real-user case, not yuppies with unlimited funds.
 
The TDP of the Core i9-9900KS is rated at 127 watts, which is higher, and since Intel doesn't report TDP the same way AMD does (who are more honest), the gap is even bigger. GamersNexus clocked it at 200 watts at 5 GHz.

More honest? Like what kind of oxymoron is that? You are either honest or you are not.

The 3900X uses up to 220W while its TDP is rated at 105W.
Same with the 2700X: up to 180W.
Same with the 1800X and 1700X: up to 200W.

Update your knowledge and don't follow the herd.
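Laying the numbers from this exchange side by side (the AMD peaks are the figures claimed above, the 9900KS peak is the GamersNexus measurement cited above, and the rated TDPs are the published specs; none of these are my own measurements):

```python
# Rated TDP vs. the peak package power figures thrown around in this thread.
claims = {               # chip: (rated TDP in W, claimed peak draw in W)
    "9900KS": (127, 200),
    "3900X":  (105, 220),
    "2700X":  (105, 180),
    "1800X":  ( 95, 200),
}
for chip, (tdp_w, peak_w) in claims.items():
    print(f"{chip}: TDP {tdp_w} W, claimed peak {peak_w} W "
          f"({peak_w / tdp_w:.1f}x the rating)")
```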
 