Ryzen 9 3950X vs. Core i9-9900KS Gaming, Feat. Tuned DDR4 Memory Performance

Ryzen is a great bargain, but for gaming, it's below average and, to be perfectly honest, not that impressive when you compare them core for core, thread for thread, like comparing the 3700X/3800X to a 9900K.

I have no idea why you think comparing core for core, thread for thread is more important. You are very definitely the outlier here. Most people on this forum compare price for price: 3900X against 9900K.

Even in non-gaming benchmarks the 9900K bests the 3700X/3800X in a few tests, about matches them in all, and SMOKES THEIR COOKIES in gaming benchmarks.

No. The 9900K doesn't "about match them in all". It is 20% slower across most applications than a 3800X, which is $85 cheaper.

If you're a gamer, it's worth the extra $60.
Techspot had trouble admitting it, but for raw gaming performance they came through and gave the 9900K its due.
The 9700K is not far behind.
A 6-8% average is massive.
Yes, in some games that means only a 5 FPS difference; in others it's 18 FPS.

Again, you are speaking in absolutes. To you an 8% FPS difference is massive. The biggest variance in this test was 15 FPS, and I would bet $1000 that you couldn't reliably tell the difference in a blind test between Intel's 177 FPS and AMD's 162 FPS in Metro Exodus, mainly because I'm betting you haven't upgraded to a 240Hz monitor yet. I still don't think you could tell the difference even on a 240Hz monitor.
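For perspective, the gap in those Metro Exodus numbers is easy to work out. A quick sketch, using the figures quoted above:

```python
intel_fps, amd_fps = 177, 162

# Relative gap: how much faster the Intel result is, as a
# percentage of the AMD result
gap_pct = (intel_fps - amd_fps) / amd_fps * 100

# Per-frame time difference in milliseconds (1000 ms / FPS)
frametime_diff_ms = 1000 / amd_fps - 1000 / intel_fps

print(f"{gap_pct:.1f}% faster")                  # ~9.3% faster
print(f"{frametime_diff_ms:.2f} ms per frame")   # ~0.52 ms per frame
```

In other words, the 15 FPS gap works out to roughly half a millisecond per frame, which is the scale of difference a blind test would have to detect.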
 
I have no idea why you think comparing core for core, thread for thread is more important. You are very definitely the outlier here. Most people on this forum compare price for price: 3900X against 9900K.

The 3900X is $529. (newegg)
The 9900K is $480. (newegg)

The 9900K is $50 cheaper and a faster gamer.
Don't want to hear that crap about the cooler.

No. The 9900K doesn't "about match them in all".

If you compare the 9900K to a 3700X/3800X, all of the benchmark comparisons are close, with the 9900K being faster in a few. If you compare them in games, the 8/16 9900K wipes the floor with the 8/16 3700X/3800X.


I still don't think you could tell the difference even on a 240Hz monitor.
I'm speaking as to what is the better gaming CPU now and for the foreseeable future, and it's not the AMD chip.
15-25 FPS could be a difference when gaming at 144Hz or so, like I do.
A gamer wants every frame he can get, whether it's 7 FPS faster or 19 FPS faster.
Ryzen is slower and costs more when it comes to gaming, and it doesn't overclock.
The 9700K is faster than the 3900X (before it's overclocked) in gaming and costs almost $200 less.
For gaming, which is far and away more important than any other test for 50-70% of consumers, Intel is still the best at most price points. However, for a bargain build I would still recommend the 3600; that little 6/12 rocks for the coin.
 
Tip to the OP: next time add Neverwinter and Star Trek Online. It's a free game; sure, you might not like it, but that's all I play, along with Asphalt 9: Legends on Windows 10. Otherwise your review is excellent, very detailed information. To me, whatever is cheap and runs well, I'll buy. My i3 laptop, now with an SSD, has surely sped up for me. Saves a lot of battery as well. My desktops run Windows 10 Home or Windows 10 Pro. I've got some dual-core and quad-core systems here and 2 APUs, DDR2 and DDR3. Haven't messed around with DDR4 too much, only on the HP Elite Series 800.
 
I stopped reading your post after this.
It's very obvious I'm fully aware that those tests are neutral, fair and comparative. It doesn't change the fact that it's pitting a new GPU against an overpriced GPU, a 16-core against a 10-core, and a more tweaked AMD setup against a lightly tweaked Intel setup.

Why does the number of cores matter though? Are you comparing GPUs based on their core count as well, or are you just doing that for the CPU? I'm sorry, but it doesn't make sense.

The number of cores or any other specification is irrelevant. The price and the performance are all that matter. Officially the 3950X and the 9900KS have very similar pricing and they are the top mainstream CPUs from each company, so it makes sense to compare them. It's not AMD's fault it's giving you double the number of cores for a similar price. Complain to Intel.
 
The 9700K will whoop its a$$ in games too.
That was an 8/16 vs 8/16 comparison to show that, even while the 9900K costs a little more, it makes up for it by whooping the 3700X and 3800X in games, although they have the same core/thread count.
And the person who called me a fanboy is obviously a clown, no one cares what people like that have to say.
I never said Ryzen wasn't a good buy, or very impressive.
It's just average when it comes to gaming, and it takes AMD's absolute best, more expensive chip to hang with Intel's 4th or 5th best gaming CPU.
Sorry this bothers you; it's the truth, so tough sh!t.
The whimpering is annoying, deal with it.


The 3900X is a $500-600 CPU that loses to a $350 CPU in gaming.
Who cares about everything else? Nobody.
The market share tipped a little bit, but the people I speak to, who are mostly gamers, are quite happy with their Intels and don't plan on jumping ship.
Ryzen is great, but not that great and worse/below average in gaming performance.
The $350 9700K is matching/beating AMD's best effort, which costs 3-4 times more.
This post is pure gold. The Ryzen is just average? Cause it loses to what, the 2 best gaming CPUs? So the 3rd fastest gaming CPU is "average"? Also, why is the cost relevant when it comes to gaming performance? Do you think that CPUs are priced according to their gaming performance? Cause there are some Xeons out there that cost $10k and get whooped by the $80 R5 1600 in gaming.

NOBODY buys a 12/24 or a 16/32 just to play games. It's just stupid. The R5 3600 is the go-to for gaming and general-usage PCs. The higher-end chips are for very specific people doing very specific tasks.
 
This post is pure gold. The Ryzen is just average?
Below average in gaming.
Compare the 8700K 6/12 against the 3600 6/12.

The number of cores or any other specification is irrelevant.
So is this comment.
Comparing an 8/16 vs an 8/16 that are in similar price ranges is a great comparison for a gamer, as the new consoles are 8/16s, and this is the most important test to look at in that respect.
Anyone who tries to say different is just trying to look the other way because they are too caught up in the hype. The 9900K makes up for its cost over the 3700X/3800X in gaming because it blows their doors off, and matches up well across the board in every benchmark as well.

The price and the performance are all that matters.
This comment applies to nothing.
We all get the AMD value ideology here of getting a bunch of cores for less.
They are still slower in games.

Officially the 3950X and the 9900KS have very similar pricing and they are the top mainstream CPUs from each company
The 9900KS is not the top of mainstream for Intel; however, the 9900KS is a faster gamer than the 3950X and costs less. I've said this about 4 times already, but what's a fifth? Yes, it's not AMD's fault Intel doesn't have equal core-to-core comparisons.
The 9700K is as fast or faster in games and costs significantly less, so having a bazillion cores is COMPLETELY AND UTTERLY USELESS FOR GAMES NOW AND FOR THE FORESEEABLE FUTURE.
Ryzen is great overall value, but if you're just concerned about gaming, which is ALL I care about and care to compare, it's slower.
 
There's more performance that can be squeezed out of that 3950X. Using a 16GB kit (2x8GB) actually lowers performance by 5-10% versus a 32GB (2x16GB or 4x8GB) kit, since you're preventing the processor from being able to alternate memory ranks, which allows for even better performance. You also want to try to get the Infinity Fabric up to 1866MHz while running 3733MHz memory, or 1900MHz with 3800MHz memory if you have a particularly good chip (most 3950Xs should be able to handle 1900MHz fabric clocks). That will also yield another low-single-digit boost in performance. Combined, this would bring the Ryzen 3950X's performance up another 10% or so in total.
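The 1:1 ratio described above is simple arithmetic: DDR4 is double data rate, so its rated transfer rate is twice the actual memory clock, and Zen 2 runs best when the Infinity Fabric clock (FCLK) matches that memory clock. A minimal sketch (the function name is illustrative, not from any tool):

```python
def fclk_for_1to1(ddr4_rating):
    """Return the FCLK (in MHz) needed for a 1:1 ratio with DDR4 memory.

    DDR4 transfers data on both clock edges, so e.g. DDR4-3733 runs an
    actual memory clock of 3733 / 2 = 1866.5 MHz; setting FCLK to match
    keeps the fabric and memory controller in sync on Zen 2.
    """
    return ddr4_rating / 2

print(fclk_for_1to1(3733))  # 1866.5 -> the ~1866 MHz fabric clock mentioned above
print(fclk_for_1to1(3800))  # 1900.0
```

Past DDR4-3733/3800, most Zen 2 chips can no longer hold the 1:1 ratio, and the fabric drops to a 2:1 divider, which is why those speeds are the practical sweet spot.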

The question is really whether it's worth the money you need to spend on high-quality RAM for that last 10% of performance when, in real-world situations, you're more likely to be GPU-limited. Maybe not for a use case like games, but if you're doing some heavy rendering work, a 10% uplift can be a lot of saved time.

If budget is an issue, I actually think 32GB (2x16GB dual-rank sticks) of Micron Rev. E DDR4 is the best value. Virtually all of these can hit 3600MHz C16, and the better kits (silicon lottery, of course) will do 3733 and 3800 at C16. The subtimings won't be the absolute best, but they're cheap (about $130 or so on a regular basis), give you 32GB of RAM, and really close the gap with Intel's best even in gaming. Now, to be entirely fair, if you just care about gaming, then just going ahead and overclocking the crap out of a 9900KS is probably still your best bet. Both of these are super overkill for just gaming lol
 
Here is what I said verbatim: "There should be a major disclaimer on every article, and/or a mix of resolutions / settings in the results ..."

There was a thread over at Tom's where some poor guy saved up a lot of money and bought a new CPU instead of a GPU, because he looked at the reviews and saw that he could get a 25% increase in FPS at 1080p. He gamed at 1080p, so it seemed like he was getting good advice from a tech review article. If you listen to AMStech or half the noobs in this forum, you see them parroting the same nonsense. It's ubiquitous.

This guy bought his new CPU and then asked on the forum why his FPS didn't change at all. He was upset, out of money and he got his knowledge from a review article that solely displayed only bottlenecked CPU graphs with no disclaimer that they do not represent real life gaming experiences.

Do you know why his FPS didn't change after he just spent $400 on a new CPU? Can you guess why? It was an Intel CPU and they are 20% faster in gaming ... aren't they? He wasn't gaming at 1440p or 4K ... so resolution had nothing to do with it at all. Tell me why.

Let me ask you this ... how would you solve the problem of people reading those charts and then believing the data applies to their gaming experience so you are not misleading your viewers? How would you do that in order to be responsible and fair to your readers if you were the author?

Or would you not care because the bias that such a situation creates aligns with your own?

Think about it ...

And to disclaim ... there's a reasonable chance that Zen3 and the cache changes will steal the "bottleneck gaming" crown from Intel ... I'll be lecturing all the AMD fanbois on this too if that happens ... and I imagine all the Intel fanbois will change their tune 180 degrees and chime in with me ... won't they ... ?
Because his CPU was not the bottleneck at 1080p, his GPU was...?
I fully agree that at least a disclaimer or more resolutions should be inserted.

I fully understand that in a cpu review you must see the differences between different cpus.

But I also would like to see real-life scenario reviews, not scientific crap with 1% real-life applicability.

If I and the majority of users had the funds to buy a 3950X or a 9900K and to pair it with a 2080 Ti and 16GB or more of low-latency DDR4, would we game at 1080p or even 1440p? The answer is 100% no. We would game at 4K resolution on, let's say, an OLED 120Hz HDMI 2.1 panel with G-Sync already enabled.

Sure, in this case the review is pointless, as it will be an even field for even more CPUs, but this is the reality. And it should be explained in any review... as a disclaimer or whatever...
 
Or you could just READ a little bit more than 1 article and understand why the reviews are done that way... no one wants to read an article comparing a bunch of CPUs just to have the conclusion be “well, it doesn’t matter because the bottleneck on everything is the GPU anyways, so buy whatever you want”.

When real-world performance depends on something other than the part in question, one must make artificial circumstances to test that part.

After all, perhaps one day there will come a time when the GPU isn’t the bottleneck - and you might want to know which CPU games better :)
 
Or you could just READ a little bit more than 1 article and understand why the reviews are done that way... no one wants to read an article comparing a bunch of CPUs just to have the conclusion be “well, it doesn’t matter because the bottleneck on everything is the GPU anyways, so buy whatever you want”.

When real-world performance depends on something other than the part in question, one must make artificial circumstances to test that part.

After all, perhaps one day there will come a time when the GPU isn’t the bottleneck - and you might want to know which CPU games better :)
I understand the reason, as I previously said. And I did not ask for such a conclusion. I asked for an explanation at the beginning, or a disclaimer explaining what you just said.
It must be explained to the majority of users that in real-life scenarios for these configurations (i.e. CPU, GPU, and so on) there is no difference at all, as the bottleneck is on the GPU. However, in order to differentiate the CPUs: if, in the future, more powerful GPUs have no bottlenecks with 4K, ultra details, 4x AA gaming, then the bottleneck will return to the CPUs, and that is where Intel will have a slight edge. It is a big difference from what is presented here. And it is a big IF for at least 2 generations of GPUs, at least for what really matters in the gaming experience, meaning 1% low frame rates.
 
There are tons of reviews on this site - and almost all of them have comments similar to yours... again, readers just need to READ! Don’t bother coming to a tech site if you can’t comprehend simple comparisons.
 
Because his CPU was not the bottleneck at 1080p, his GPU was...?
I fully agree that at least a disclaimer or more resolutions should be inserted.

I fully understand that in a cpu review you must see the differences between different cpus.

But I also would like to see real-life scenario reviews, not scientific crap with 1% real-life applicability.

If I and the majority of users had the funds to buy a 3950X or a 9900K and to pair it with a 2080 Ti and 16GB or more of low-latency DDR4, would we game at 1080p or even 1440p? The answer is 100% no. We would game at 4K resolution on, let's say, an OLED 120Hz HDMI 2.1 panel with G-Sync already enabled.

Sure, in this case the review is pointless, as it will be an even field for even more CPUs, but this is the reality. And it should be explained in any review... as a disclaimer or whatever...
I have seen several benchmarks showing the 9900KS getting better 0.1% lows even at 4K resolution. While games do become more GPU-bound the higher you scale the resolution, it seems you may still benefit from better frame variance or 0.1% lows when using the 9900KS, although those tests were done without memory tweaks.
 