Ryzen 9 3950X vs. Core i9-9900KS Gaming, Feat. Tuned DDR4 Memory Performance

Just to make sure, this is an 8c/16t vs a 16c/32t, correct? If so, is AMD really struggling this badly still? Just asking

Games pretty much don't use more than 8 cores. AMD is doing great. Coming close enough in gaming that the difference hardly matters (especially at the resolutions people buying these CPUs actually run), and then solidly outperforming Intel when anything that isn't a game utilizes all of the cores AMD is offering but Intel is not.
 
I like Techspot, and I have a lot of respect for the reviewer, great guy, but it's been a while since I've seen a fair article on Techspot.
AMD's $400 GPU against Nvidia's old 2060 Super (which was a crappy GPU with an overpriced release), AMD's 16/32 3950X against Intel's 10/20 CPU, and now tweaked-timing AMD chips against basically untweaked Intel setups.
Let's see a review of a $400 8/16 3800X vs a $450 8/16 9900K in only games, so we can see how, core for core, Intel's old stuff still wipes the floor with AMD's new stuff in games and still bests it in some benchmarks.
I like Ryzen, it's a stellar architecture with great latency for its speed and impressive multicore performance, but core for core, it's really only marginally better, sometimes slower in application benchmarks, and much slower in gaming.

That "old" 2060 super as an MSRP of 400$ too, that's why they were compared. It was also requested by the audience.

What's wrong with comparing HEDT platforms? Ryzen always had more cores than Intel Core in that segment. Comparing flagship parts is wrong now?

Look up the 3700X if you want an 8c/16t comparison between Zen 2 and 9th gen.
 
In my 25 years as a tech enthusiast, I've never seen memory latency benchmarks listed as "higher is better". Couldn't get past that and stopped reading.

Maybe something has changed and I am missing something? But higher latency has always been bad. Wtf?

25 years of being a tech enthusiast and you can't even realize that this is just a copy/paste error?
 
Would you be happier if we called it "memory overclocking", since OCing is pretty much brought up as a strength of Intel CPUs all the time and consumers shouldn't have to do that either?

Selecting a single XMP profile is hardly comparable to the software tools and settings that were changed to tune these kits....
 
Selecting a single XMP profile is hardly comparable to the software tools and settings that were changed to tune these kits....

My point was that some posters - including you - mention overclockability as a strong point of Intel CPUs.

That is a totally valid point, but would you consider that something consumers should do vs. tweaking memory settings, which is something consumers shouldn't (have to) do?

 
Consumers shouldn't have to do this. Maybe AMD will get it right with Zen 3. Maybe.

They don't, but customers that do will see gains on both AMD and Intel. FYI, the same argument can be made for K-model Intel CPUs. Customers shouldn't have to manually overclock their CPU; it should just work out of the box like both AMD's and NVIDIA's boost algorithms. I didn't see you complain about the Intel OC results in the 9900K review. Perhaps TechSpot should start including memory overclocking like they do core overclocking. After all, that would be fair.
 
The best DDR4 kit for Zen 2 would be a 3600MHz dual-rank (2x16GB) kit; overclock the memory and memory controller to 3800MHz and it beats any 3600MHz single-rank kit with tight timings any day. And who would buy a 3950X with only 16GB of RAM lol (can't run tight timings with a 4x8GB kit).
I thought the same. I get that it is ammo for geeky arguments, but it is pointless as guidance to buyers, who would rarely go with only 16GB (not just because extra RAM capacity is very affordable ATM, but for performance too on a dual-channel, quad-rank setup AFAIK).
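To put rough numbers on the 3800MHz suggestion, here's a minimal sketch of the clock relationship involved (assuming Zen 2's usual 1:1 FCLK:MCLK coupling; the function and values are illustrative, not taken from the review):

```python
# Illustrative only: Zen 2 runs the Infinity Fabric clock (FCLK) 1:1 with the
# memory clock (MCLK) at these speeds, so a DDR4-3800 kit implies FCLK 1900 MHz,
# which is near the ceiling most Zen 2 samples can hold before the ratio
# falls out of 1:1.
def zen2_clocks_mhz(ddr4_rate_mts: int):
    """Return (MCLK, FCLK) in MHz for a 1:1 ratio at the given DDR4 data rate."""
    mclk = ddr4_rate_mts / 2   # DDR4 transfers data twice per memory clock
    fclk = mclk                # 1:1 coupling assumed
    return mclk, fclk

print(zen2_clocks_mhz(3600))   # (1800.0, 1800.0)
print(zen2_clocks_mhz(3800))   # (1900.0, 1900.0)
```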
 
My point was that some posters - including you - mention overclockability as a strong point of Intel CPUs.

That is a totally valid point, but would you consider that something consumers should do vs. tweaking memory settings, which is something consumers shouldn't (have to) do?

Consumers don't have to do it. They have XMP set for them in the computers they buy from OEMs.

No OEM is going to manually tune memory for every Ryzen system.

The two are not comparable. I'm talking about memory timings and you're talking about overclocking? I can't wait for your next irrelevant reply....
 
I like Techspot, and I have a lot of respect for the reviewer, great guy, but it's been a while since I've seen a fair article on Techspot.
AMD's $400 GPU against Nvidia's old 2060 Super (which was a crappy GPU with an overpriced release), AMD's 16/32 3950X against Intel's 10/20 CPU, and now tweaked-timing AMD chips against basically untweaked Intel setups.
Let's see a review of a $400 8/16 3800X vs a $450 8/16 9900K in only games, so we can see how, core for core, Intel's old stuff still wipes the floor with AMD's new stuff in games and still bests it in some benchmarks.
I like Ryzen, it's a stellar architecture with great latency for its speed and impressive multicore performance, but core for core, it's really only marginally better, sometimes slower in application benchmarks, and much slower in gaming.
Saying you like this publication and respect the reviewer, and then calling them cheating hacks is a perfect microcosm of your kind of thinking. To summarize, it's flawed, shallow and in the smallest of minorities.
I don't know if it was meant to be funny, but this is a comical reply. If it was meant to be a serious reply then you need to get out more often. MOST games have been tweaked and optimized for DECADES for Intel chips. Plus, Intel has been tweaking this very chip for almost a decade!!!! AMD's chip JUST RELEASED, and has hardly been tweaked at all yet. And if you go back and look at the empirical history of these chips, you will see that they ALWAYS get better with time. So that's the first ridiculous thing. Next is how pathetically OVER-MATCHED the 9900KS is at almost EVERYTHING else. The 3600X, 3700X, 3800X, 3900X and of course the 3950X beat the ever-lovin' dogcrap out of the 9900KS in almost every single productivity bench, and in other applications and apps outside of a few games. They are ALL better chips for the money in all-around computing, and are way more future-proofed than the crappy old Intel chip you are so in love with.
Lastly, THERE IS MORE TO LIFE THAN GAMING!!!!!! THERE IS MORE TO COMPUTING THAN GAMING!!!!!! So while gaming is cool, and I enjoy it as much as the next guy, those of you who are infatuated with it and ONLY game on a computer are wasting your life and that computer, in my opinion. There are thousands of BETTER things to do with a computer, and with your time, than gaming. And for MOST reasonable people, gaming is ONLY one thing they do with their systems. So, considering that the AMD chips can run circles around your pathetic OLD Intel chip in thousands of applications across hundreds of genres, and are only a tiny bit slower in a handful of games that were optimized for Intel chips, I'd say anyone with a functioning brain would prefer the AMD processors. Here's another point: once you get over 90fps, you can NOT see any discernible difference on the screen of a game in action with a difference of 15-25fps. And while I might sound like an AMD fanboy to you, worldwide the AMD processors have been KILLING the antiquated OLD Intel chips in sales since their release in July. So the world seems to be agreeing more with me, and sending a clear message to the likes of you...
 
Actually this is a poor test. A 16-core CPU can of course handle gaming, but that would not be the primary purpose of buying it. If you don't have the cash for 3rd-gen Threadripper, then the Ryzen 9 3950X would be perfect for mid-level animation, Photoshop work, etc.
 
4 percent at a CPU-bound low resolution is nothing; you would barely notice it. At 1440p or above you definitely wouldn't notice it. It's probably 2-3 percent at most.

You would probably notice it at tasks outside of just gaming though, where the 16 cores obviously dismiss the 9900KS without trouble.

Really, that's the key. You wouldn't buy the 3950X for gaming specifically, but if you DID game on it, you know that it's not so much slower than the best Intel gaming processor that it matters or that you'd notice.
 
Saying you like this publication and respect the reviewer, and then calling them cheating hacks is a perfect microcosm of your kind of thinking.
I stopped reading your post after this.
It should be very obvious that I'm fully aware those tests are neutral, fair and comparative, but it doesn't change the fact that it's pitting a new GPU against an overpriced GPU, a 16-core against a 10-core, and a more heavily tweaked AMD setup against a lightly tweaked Intel setup.

Lastly, THERE IS MORE TO LIFE THAN GAMING!!!!!! THERE IS MORE TO COMPUTING THAN GAMING
I haven't PC gamed since about 2010 or '11, although I do dabble here and there. Still, PC gaming is a massive part of the market, and 7/10 PCs bought or built by consumers are for gaming. Most people don't care how fast a file zips or copies, or how long it takes to encode/convert video, not to mention any decent CPU can do all of those things rather well anyway.
 
This is to make sure that the bottleneck is the CPU, not the GPU...

But how many people don't know that this is an artificially induced situation that no one games within and that the graphs don't actually represent any type of real world gameplay?

I know how many ... every single Intel fanboi, as you hear them going on and on about "10% faster in games!!" and posting these charts as proof, influencing poor people who also don't know that all these numbers are artificial, who then in turn spend more money than they have, saving up for a $500 CPU upgrade instead of a GPU upgrade, thinking they will get a 20% FPS increase, then finding out that it didn't change their game experience at all, because they never had a CPU bottleneck to begin with.

It's really disingenuous and causes people to make poor buying choices. As a tech community we are supposed to be doing the opposite ... There should be a major disclaimer on every article, and/or a mix of resolutions/settings in the results. But that would be being neutral and responsible - not something tech journalists are known for, apparently.


All that said, as I have been saying, tweaking RAM on Ryzen closes any advantage Intel thinks they had, even if the CPU is bottlenecked in an unrealistic scenario.
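A minimal sketch of the bottleneck argument, with made-up frame times purely for illustration (nothing here is measured data from the review): delivered FPS is set by whichever side, CPU or GPU, takes longer per frame.

```python
# Hypothetical frame times (ms), chosen only to illustrate CPU- vs GPU-bound cases.
def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    # The slower of the two stages caps the delivered frame rate.
    return 1000 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Review scenario: 1080p with a very fast GPU, so the CPU sets the ceiling.
print(fps(cpu_ms_per_frame=7.0, gpu_ms_per_frame=5.0))   # ~143 fps, CPU-bound
print(fps(cpu_ms_per_frame=6.3, gpu_ms_per_frame=5.0))   # ~159 fps, a faster CPU shows up

# Typical mid-range GPU at the same settings: the GPU sets the ceiling.
print(fps(cpu_ms_per_frame=7.0, gpu_ms_per_frame=12.0))  # ~83 fps, GPU-bound
print(fps(cpu_ms_per_frame=6.3, gpu_ms_per_frame=12.0))  # ~83 fps, same CPU upgrade changes nothing
```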
 
I like Techspot, and I have a lot of respect for the reviewer, great guy, but it's been a while since I've seen a fair article on Techspot.
AMD's $400 GPU against Nvidia's old 2060 Super (which was a crappy GPU with an overpriced release), AMD's 16/32 3950X against Intel's 10/20 CPU, and now tweaked-timing AMD chips against basically untweaked Intel setups.
Let's see a review of a $400 8/16 3800X vs a $450 8/16 9900K in only games, so we can see how, core for core, Intel's old stuff still wipes the floor with AMD's new stuff in games and still bests it in some benchmarks.
I like Ryzen, it's a stellar architecture with great latency for its speed and impressive multicore performance, but core for core, it's really only marginally better, sometimes slower in application benchmarks, and much slower in gaming.
None of this is making sense to you because you are making up prices. How are you complaining about the 5700 XT being compared to the 2060? I am looking on PCPP at 2060 and 2060 SUPER cards, at the same price point of $400, so I'm confused. I'm not sure which Intel 10-core you are talking about in regards to the 3950X benchmarks, but I see Intel 10-core i9s going for $600-$700, while the 3950X retails for $750. The R7 3800X is $380 and the 9900K is $490, more than a $100 price difference, not the made-up numbers you posted.

With your fabricated prices, I could easily see how you would think Techspot is unfair.
 
Consumers don't have to do it. They have XMP set for them in the computers they buy from OEMs.

No OEM is going to manually tune memory for every Ryzen system.

The two are not comparable. I'm talking about memory timings and you're talking about overclocking? I can't wait for your next irrelevant reply....
Being thick on purpose?
My point was that you complain about noticeable performance gains from tweaking the memory because that is something consumers should not need to do, yet in other posts you point out that overclocking is a strength of Intel CPUs, so I was asking if you feel this (overclocking) is something consumers should do.

Is it *that* hard to get this?
 
I stopped reading your post after this.
It should be very obvious that I'm fully aware those tests are neutral, fair and comparative, but it doesn't change the fact that it's pitting a new GPU against an overpriced GPU, a 16-core against a 10-core, and a more heavily tweaked AMD setup against a lightly tweaked Intel setup.

Are you even talking about this article? Can you, or do you, even read? From the article opening: "In other words, this is a comparison of Intel and AMD's top desktop CPUs that do not belong to their HEDT series, with both processors tweaked for maximum memory performance."

Top mainstream desktop parts from each team ... that's fair.

Both systems with same memory speed and timings ... sounds fair to me.

You cry about "16 core vs 8 core" (well you said 10, but I guess thats an example of your knowledge) 1) who's fault is it Intel will only give you 8 cores? -2) this is a gaming test ... are you now going to change your tune and express how more than 8 cores improves gaming results, or maybe you should just take that BS back and pretend you never said it, it would help with your image.

You cry about "more tweaked vs lightly tweaked", yet it states right there "both processors tweaked for maximum memory performance" Who's fault is it that Intel doesn't respond very well to memory tuning?

Why don't you go hide under a rock for a bit and stop embarrassing yourself with your BS?
 
But how many people don't know that this is an artificially induced situation that no one games within and that the graphs don't actually represent any type of real world gameplay?

I know how many ... every single Intel fanboi, as you hear them going on and on about "10% faster in games!!" and posting these charts as proof, influencing poor people who also don't know that all these numbers are artificial, who then in turn spend more money than they have, saving up for a $500 CPU upgrade instead of a GPU upgrade, thinking they will get a 20% FPS increase, then finding out that it didn't change their game experience at all, because they never had a CPU bottleneck to begin with.

It's really disingenuous and causes people to make poor buying choices. As a tech community we are supposed to be doing the opposite ... There should be a major disclaimer on every article, and/or a mix of resolutions/settings in the results. But that would be being neutral and responsible - not something tech journalists are known for, apparently.


All that said, as I have been saying, tweaking RAM on Ryzen closes any advantage Intel thinks they had, even if the CPU is bottlenecked in an unrealistic scenario.
So you’d rather see a benchmark done at 4K or 1440p that shows they perform virtually identically? That’s a waste of everyone’s time...

When the GPU is the limiting factor, there is no point benchmarking the CPU! That’s why the benchmark uses a test system with a 2080ti - to try and eliminate the GPU as much as possible from the equation.

The real question you should be asking is why not benchmark at 720p!
 
So you’d rather see a benchmark done at 4K or 1440p that shows they perform virtually identically? That’s a waste of everyone’s time...

When the GPU is the limiting factor, there is no point benchmarking the CPU! That’s why the benchmark uses a test system with a 2080ti - to try and eliminate the GPU as much as possible from the equation.

The real question you should be asking is why not benchmark at 720p!
Here is what I said verbatim: "There should be a major disclaimer on every article, and/or a mix of resolutions / settings in the results ..."

There was a thread over at Tom's where some poor guy saved up a lot of money and bought a new CPU instead of a GPU, because he looked at the reviews and saw that he could get a 25% increase in FPS at 1080p. He gamed at 1080p, so it seemed like he was getting good advice from a tech review article. If you listen to AMStech or half the noobs in this forum you see them parroting the same nonsense. It's ubiquitous.

This guy bought his new CPU and then asked on the forum why his FPS didn't change at all. He was upset, out of money, and he got his knowledge from a review article that displayed only CPU-bottlenecked graphs with no disclaimer that they do not represent real-life gaming experiences.

Do you know why his FPS didn't change after he just spent $400 on a new CPU? Can you guess why? It was an Intel CPU and they are 20% faster in gaming ... aren't they? He wasn't gaming at 1440p or 4K ... so resolution had nothing to do with it at all. Tell me why.

Let me ask you this ... how would you solve the problem of people reading those charts and then believing the data applies to their gaming experience so you are not misleading your viewers? How would you do that in order to be responsible and fair to your readers if you were the author?

Or would you not care because the bias that such a situation creates aligns with your own?

Think about it ...

And to disclaim ... there's a reasonable chance that Zen 3 and the cache changes will steal the "bottleneck gaming" crown from Intel ... I'll be lecturing all the AMD fanbois on this too if that happens ... and I imagine all the Intel fanbois will change their tune 180 degrees and chime in with me ... won't they ... ?
 
In the memory latency graph (ns), it says that higher is better; shouldn't lower be better? It shows the latency to be higher on the AMD chip as expected, but that is not better, it is worse, no? Just pointing it out so you could edit that. I know, writing these things up, it must be an easy mistake to make or easy to miss lol. But yeah, just for those who aren't in the know, when it comes to latency, you want it lower, not higher.
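As a rough illustration of why lower is better: the chart's nanosecond figures measure the whole memory subsystem, but even a single component like CAS latency follows the same rule. This is a generic DDR4 back-of-envelope, not numbers from the review:

```python
# CAS ("first word") latency for DDR4. CL is counted in memory-clock cycles,
# and DDR4 transfers twice per clock, so one cycle lasts 2000 / data_rate ns.
def cas_latency_ns(cl: int, data_rate_mts: int) -> float:
    return cl * 2000 / data_rate_mts

print(cas_latency_ns(16, 3600))   # ~8.9 ns
print(cas_latency_ns(14, 3600))   # ~7.8 ns, tighter timing means lower (better) latency
```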
 
Here is what I said verbatim: "There should be a major disclaimer on every article, and/or a mix of resolutions / settings in the results ..."

There was a thread over at Tom's where some poor guy saved up a lot of money and bought a new CPU instead of a GPU, because he looked at the reviews and saw that he could get a 25% increase in FPS at 1080p. He gamed at 1080p, so it seemed like he was getting good advice from a tech review article. If you listen to AMStech or half the noobs in this forum you see them parroting the same nonsense. It's ubiquitous.

This guy bought his new CPU and then asked on the forum why his FPS didn't change at all. He was upset, out of money, and he got his knowledge from a review article that displayed only CPU-bottlenecked graphs with no disclaimer that they do not represent real-life gaming experiences.

Do you know why his FPS didn't change after he just spent $400 on a new CPU? Can you guess why? It was an Intel CPU and they are 20% faster in gaming ... aren't they? He wasn't gaming at 1440p or 4K ... so resolution had nothing to do with it at all. Tell me why.

Let me ask you this ... how would you solve the problem of people reading those charts and then believing the data applies to their gaming experience so you are not misleading your viewers? How would you do that in order to be responsible and fair to your readers if you were the author?

Or would you not care because the bias that such a situation creates aligns with your own?

Think about it ...

And to disclaim ... there's a reasonable chance that Zen 3 and the cache changes will steal the "bottleneck gaming" crown from Intel ... I'll be lecturing all the AMD fanbois on this too if that happens ... and I imagine all the Intel fanbois will change their tune 180 degrees and chime in with me ... won't they ... ?
If someone is stupid, they deserve the foolish purchase they made...
I haven’t seen anyone on this website posting about how they blame Techspot for a foolhardy purchase - perhaps the readers here are slightly smarter than the norm?
Hard to believe after seeing some of the replies in this thread I guess...
 
Let's see a review of a $400 8/16 3800X vs a $450 8/16 9900K in only games, so we can see how, core for core, Intel's old stuff still wipes the floor with AMD's new stuff in games and still bests it in some benchmarks.

You've gotta be kidding, right? It's like saying "let's compare a $200,000 sports car vs a $600,000 sports car and see which one is faster". And then it turns out the more expensive one is faster. What have you proven? Nothing.

If you have a budget of $400 you wanna find the fastest chip for that money. Biggest bang for your buck. Or don't you understand the concept of money?
 
If someone is stupid, they deserve the foolish purchase they made...
I haven’t seen anyone on this website posting about how they blame Techspot for a foolhardy purchase - perhaps the readers here are slightly smarter than the norm?
Hard to believe after seeing some of the replies in this thread I guess...

I guess then, every Intel fanboi (or should I say "Intel enthusiast") that broadly claims Intel is 10% better for gaming is stupid, not only for believing that, but doubly so for spreading that misinfo. Aren't they? Because it would take a "stupid" person to think that Intel would actually be faster at 1080p gaming in real life, just because that's what all the reviews show ... just like that guy ... right?

The guy in question wasn't stupid. He read reviews across several publications that showed the Intel CPU was 20% faster at gaming at 1080p ... That's what the graph and the review indicated. It was 1080p ... you indicated that 1080p is where Intel processors show themselves as better. That was the resolution of his monitor.

You never answered any of my questions, why? Why was his CPU not any faster at 1080p? It wasn't 1440p and it wasn't 4K.

Actually, I think you inadvertently answered this question: "Or would you not care because the bias that such a situation creates aligns with your own? " -- thanks for sharing a bit about yourself.
 
Let me ask you this ... how would you solve the problem of people reading those charts and then believing the data applies to their gaming experience so you are not misleading your viewers? How would you do that in order to be responsible and fair to your readers if you were the author?

Easy: do your own research.

There's so much information on this all over the internet.
 