Ryzen 9 3950X vs. Core i9-9900KS Gaming, Feat. Tuned DDR4 Memory Performance

amstech

IT Overlord
You cry about "16 core vs 8 core"
Overall I couldn't care less; 8/16 vs. 8/16, the 9900K rocks the 3700X and 3800X in games, and gaming performance is all that you, and most people, truly care about (even though I just kind of dabble :D). The Intel chips have other tweaks plus overclocking headroom, which Ryzen does not.
No one cares about file copy and file zipping performance. NOBODY.
You're obviously bothered and emotional; it's just silicon, dude. Again, I couldn't care less.
Everything I've stated about those comparisons is spot on. It's not a knock on or praise for AMD or Intel; it's fair and comparative, sure, but that doesn't make it equal.
Judging by your reply, your tone, and your use of the word 'fanboi' and other nonsense, I would take a break.
I never said the reviewer did a bad job, or that his intentions were a certain way; I said certain comparisons aren't genuinely equal sometimes, and recent history and other factors played a beneficial role in some of these results.
The author's mobo tweaks and results with the 3950X are pretty cool; it fits the bill nicely as a $1000 CPU.
 

Squid Surprise

TS Evangelist
I guess, then, every Intel fanboi (or should I say "Intel enthusiast") who broadly claims that Intel is 10% better for gaming is stupid not only for believing that, but doubly so for spreading that misinfo. Aren't they?

The guy in question wasn't stupid. He read reviews across several publications showing that at 1080p the Intel CPU was 20% faster at gaming ... That's what the graph and the review indicated. It was 1080p ... you indicated that 1080p is where Intel processors show themselves as better.

You never answered any of my questions. Why? Why was his CPU not any faster at 1080p? It wasn't 1440p, and it wasn't 4K.

Actually, I think you inadvertently answered this question: "Or would you not care because the bias that such a situation creates aligns with your own? "
Well, you didn’t give the specs of his PC, so I assumed you were just trolling... but sure, I’ll give it a shot :)

When all other hardware is the same, and assuming that no other part is acting as a bottleneck, the Intel 9900 will give the best gaming performance. That’s why every benchmark on pretty much every reliable site uses top-of-the-line hardware on its test bed - to prevent anything other than the tested part from limiting performance. Not to mention they all come to the same conclusion - that the Intel CPUs are better for gaming (but JUST gaming, not application/workstation use).

I’m going to assume that this “not stupid” person has a setup that was limited by some other PC part - such as his GPU, SSD, RAM, etc...

As anyone with even a modicum of PC knowledge SHOULD know, your choice of CPU is far less relevant for gaming than your choice of GPU.

An older PC - with let’s say a Haswell CPU - will still perform better than a 9900 if the Haswell has a 2080ti and the 9900 has an AMD GPU (pick any model you like, even the best will lose to the Haswell/2080ti combo).
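The bottleneck logic above can be sketched as a toy model: the delivered frame rate is roughly the minimum of what the CPU and the GPU can each sustain. The frame-rate caps below are made-up round numbers for illustration only, not benchmark results.

```python
# Toy bottleneck model: delivered FPS is capped by the slower component.
# All numbers here are invented for illustration, not real benchmark data.

def delivered_fps(cpu_cap: float, gpu_cap: float) -> float:
    """Frame rate is limited by whichever part finishes its work slower."""
    return min(cpu_cap, gpu_cap)

# Fast GPU at 1080p: the CPU cap is exposed, so a faster CPU raises FPS.
print(delivered_fps(cpu_cap=160, gpu_cap=300))  # 160 -> CPU-bound

# Mid-range GPU at ultra settings: the GPU is the limit, so swapping a
# 150-FPS-capable CPU for a 165-FPS-capable one changes nothing.
print(delivered_fps(cpu_cap=150, gpu_cap=90))   # 90 -> GPU-bound
print(delivered_fps(cpu_cap=165, gpu_cap=90))   # 90 -> same result
```

This is why reviewers pair every CPU with the fastest GPU available: it pushes the GPU cap high enough that differences between CPU caps actually show up in the results.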

If you’re actually not trolling, please list his hardware specs and I’m sure I (or some other poster) can tell you where he went horribly wrong.

And I'm not an "Intel Fanboi" or any such thing. Until fairly recently, Intel was simply the only game in town. With AMD's Ryzen, the game has changed, and now the only reason to purchase an Intel CPU is if you use your PC solely for gaming. Not really an opinion - it's basically a fact - backed up by pretty much every tech site there is, including this one!
 

JimboJoneson

TS Booster
Well, you didn’t give the specs of his PC, so I assumed you were just trolling... but sure, I’ll give it a shot :)

...

As anyone with even a modicum of PC knowledge SHOULD know, your choice of CPU is far less relevant for gaming than your choice of GPU.
...
Now we're getting somewhere ... unless you have a 2080 Ti (or, say, a 1st-gen Bulldozer processor and a high-end card), you will never get a CPU bottleneck. And even if you do, unless you have a 240Hz monitor, it won't matter.

He had something like a 1070-level card (I can't recall) and played at very high or ultra quality settings, so there was zero chance of inducing a CPU bottleneck at 1080p whatsoever.


My point is that there's a lot of people who read these tech reviews BECAUSE they don't know a lot about these things. I think it behooves the tech community to not throw these people under the bus - they make a good portion of readers. They aren't "stupid", they just aren't enthusiasts ... yet. That's why they come to review sites -- to learn.

So back to my initial point ... review journalists should post a disclaimer that these CPU gaming reviews are an artificially induced situation, and that unless you have a 2080 Ti AND game at 1080p AND have a 240Hz monitor, these results DO NOT apply - they are merely a rough reference showing a specific CPU load and are not indicative of what you will get while gaming ... or just show some other settings or GPUs in the mix and let readers figure it out.

Does it sound that bad for reviewers to just be honest and clear about what a user can expect in real life when it comes to CPU gaming reviews? That was my initial claim you responded to.

I know that many people who you might claim aren't "stupid" still fall for the rhetoric that "Intel is 10% faster in gaming" - they completely believe it outright and go spreading that misinfo ... I could name a few right here in this thread ... for sure ... as I said, this spreading of inaccurate information is ubiquitous; Intel "enthusiasts" seem not to have a clue about any of this, and at worst people are being misled into buying the wrong parts. There's proof everywhere that this is true - even a lot here at Techspot, where there apparently aren't many "stupid" people ...

But like I said, I'll be saying this to all the AMD fanbois if Zen 3 takes away that little lead in this one artificial situation ... and I promise you, all the Intel "enthusiasts" will change their tune 180 degrees to say exactly what I am saying now.
 

Squid Surprise

TS Evangelist
Now we're getting somewhere ... unless you have a 2080 Ti (or, say, a 1st-gen Bulldozer processor and a high-end card), you will never get a CPU bottleneck. And even if you do, unless you have a 240Hz monitor, it won't matter.

He had something like a 1070-level card (I can't recall) and played at very high or ultra quality settings, so there was zero chance of inducing a CPU bottleneck at 1080p whatsoever.


My point is that there's a lot of people who read these tech reviews BECAUSE they don't know a lot about these things. I think it behooves the tech community to not throw these people under the bus - they make a good portion of readers. They aren't "stupid", they just aren't enthusiasts ... yet. That's why they come to review sites -- to learn.

So back to my initial point ... review journalists should post a disclaimer that these CPU gaming reviews are an artificially induced situation, and that unless you have a 2080 Ti AND game at 1080p AND have a 240Hz monitor, these results DO NOT apply - they are merely a rough reference showing a specific CPU load and are not indicative of what you will get while gaming ... or just show some other settings or GPUs in the mix and let readers figure it out.

Does it sound that bad for reviewers to just be honest and clear about what a user can expect in real life when it comes to CPU gaming reviews? That was my initial claim you responded to.

I know that many people who you might claim aren't "stupid" still fall for the rhetoric that "Intel is 10% faster in gaming" - they completely believe it outright and go spreading that misinfo ... I could name a few right here in this thread ... for sure ... as I said, this spreading of inaccurate information is ubiquitous; Intel "enthusiasts" seem not to have a clue about any of this, and at worst people are being misled into buying the wrong parts. There's proof everywhere that this is true - even a lot here at Techspot, where there apparently aren't many "stupid" people ...

But like I said, I'll be saying this to all the AMD fanbois if Zen 3 takes away that little lead in this one artificial situation ... and I promise you, all the Intel "enthusiasts" will change their tune 180 degrees to say exactly what I am saying now.
Well, I'm glad you think we've gotten somewhere... I fail to see the need for a disclaimer - the Intel chip IS better for gaming - and every tech site agrees.

Most people understand that a PC is not simply the CPU, and purchase accordingly... If you're spending hundreds (or thousands) of dollars on a gaming rig, do your homework!

Again, no one is complaining on this site - or any other sites other than your "not stupid" guy on Tom's... I'd love a link to that thread so I can see for myself - I'm beginning to suspect, however, that it either never happened, or that you were the poster in question...
 

JimboJoneson

TS Booster
... the Intel IS better for gaming ...
Well, I gave you a real-life example where it wasn't at all -- and I could go around and poll 100 people on their gaming rigs, and I think it would not be true in 100% of those cases if they upgraded to a high-end Intel CPU. So in what real-life situation is Intel better for gaming again? Oh wait, I keep confusing fanboi fantasy and real life. Silly me.


You proved my point completely ... Thanks!

I did agree with one point you made ... about people who think upgrading to an Intel CPU will actually increase their FPS being "stupid" -- I do agree that people who think that way are indeed stupid. You got me on that one ... ;)
 

Squid Surprise

TS Evangelist
Well, I gave you a real-life example where it wasn't at all -- and I could go around and poll 100 people on their gaming rigs, and I think it would not be true in 100% of those cases if they upgraded to a high-end Intel CPU. So in what real-life situation is Intel better for gaming again? Oh wait, I keep confusing fanboi fantasy and real life. Silly me.


You proved my point completely ... Thanks!
Very nice...one “real life” example which is invalidated by your refusal to provide a link to it... and your fictional poll of 100 people which will never happen... that’s wonderful evidence...

How about reviews from EVERY SINGLE tech site stating what you are unwilling or unable to comprehend? That the Intel 9900 is the best gaming CPU on the planet!

Yes, it generally doesn’t matter what CPU you have (hence the recommendations from these same sites to buy AMD!), but the facts stay the same!

As for a “Real Life” example: there actually ARE people who bought the Nvidia 2080 Ti - google Nvidia sales figures if you don’t believe me - and maybe they want the best gaming PC they can buy?

Now stop trolling and find yourself a threadripper to purchase.
 

JimboJoneson

TS Booster
Very nice...one “real life” example which is invalidated by your refusal to provide a link to it... and your fictional poll of 100 people which will never happen... that’s wonderful evidence...

How about reviews from EVERY SINGLE tech site stating what you are unwilling or unable to comprehend? That the Intel 9900 is the best gaming CPU on the planet!

Yes, it generally doesn’t matter what CPU you have (hence the recommendations from these same sites to buy AMD!), but the facts stay the same!

Now stop trolling and find yourself a threadripper to purchase.
Let's test your knowledge ... RTX 2060 ... $200 R5 3600 or $500 9900KS ... 1080p with quality settings on highest. Which one has higher FPS? ... :) Go on, tell me, smart guy.
 

JimboJoneson

TS Booster
Not relevant... we’ve already ascertained the bottleneck! Do you even bother to read my posts?
Oh right, the CPU has to have an artificial bottleneck or else Intel isn't faster at gaming - as if I haven't been saying that all along! Silly me! ... if only there were a way to be accurate about that info ... I guess it's impossible! lol ...

No reviewer can do the impossible and be clear about actual real-life performance ... sigh ... too bad we have to live with the misleading advice ... and there's nothing anyone can do to relay the actual truth. I guess it's an impossible problem to solve then ...

PS: remind me not to hire you to solve any real-life problems. :)

PPS: only about 1 in a few thousand gamers owns a 2080 Ti and games at 1080p ... so out of 100 people that I polled, the chances of any of them having a bottlenecked CPU are slim to none. Wouldn't you agree? I mean, who purposely bottlenecks their CPU to game? You? AMSTech? Anyone? Bueller?
 

Squid Surprise

TS Evangelist
Oh right, the CPU has to have an artificial bottleneck or else Intel isn't faster at gaming - as if I haven't been saying that all along! Silly me! ... if only there were a way to be accurate about that info ... I guess it's impossible! lol ...

No reviewer can do the impossible and be clear about actual real-life performance ... sigh ... too bad we have to live with the misleading advice ... and there's nothing anyone can do to relay the actual truth.
It’s not an artificial bottleneck - I think you genuinely don’t understand this ... there are REAL GPUs out there that DO benefit from the slight improvement an Intel 9900 gives ... a tech website exists to give you the information ... if you don’t want it, or don’t understand it, then please go elsewhere!
 

Burty117

TechSpot Chancellor
I think you genuinely don’t understand this...
I just (painfully) read through all this and yeah, he genuinely doesn't understand or can't comprehend what a bottleneck is. Below is for @JimboJoneson


Hope this helped!
 

JimboJoneson

TS Booster
It’s not an artificial bottleneck - I think you genuinely don’t understand this ... there are REAL GPUs out there that DO benefit from the slight improvement an Intel 9900 gives ... a tech website exists to give you the information ... if you don’t want it, or don’t understand it, then please go elsewhere!
Calm down. I only asked that reviewers add a disclaimer indicating that this "benefit" you speak of applies ONLY with one GPU - the 2080 Ti - at 1080p resolution, and where your display's refresh rate is not exceeded. This is true - this is why no other configuration is used in these tests across all reviewers ... Do you dispute this fact? Is this not "the only" situation where this bottleneck occurs? If you don't think so, please explain ...

It's artificial for 99.99% of all gaming-rig scenarios ... How many people do you know who own a 2080 Ti and game at 1080p?

Is it unreasonable to ask for that, considering it clears up a very common misunderstanding for less tech-savvy readers who might be at a tech site to learn more? I mean, that was literally my comment that you responded to ...

If you think that is unreasonable ... thanks for letting us know.
 

Burty117

TechSpot Chancellor
it explains a very common misunderstanding that some less tech savvy readers who might be at a tech site to learn more?
So you're saying "less tech-savvy readers" cannot fathom that their 2060 is worse than the 2080 Ti? How did that "less tech-savvy" reader end up with a 2060? Must have read some GPU reviews - reviews that were probably run with an overclocked 9900K to remove the CPU as a bottleneck...

Do you see where I'm going with this?
 

Squid Surprise

TS Evangelist
Calm down, man. I only asked that reviewers add a disclaimer indicating that this benefit you speak of applies ONLY with one GPU - the 2080 Ti - at 1080p resolution, and where your display's refresh rate is not exceeded. This is true ... Do you dispute this fact? Is this not the situation where this bottleneck occurs? If you don't think so, please explain.

Is it unreasonable to ask for that, considering it clears up a very common misunderstanding for less tech-savvy readers who might be at a tech site to learn more? I mean, that was literally my comment that you responded to ...

If you think that is unreasonable ... thanks for letting us know.
It doesn’t only apply to the 2080 Ti, though ... it would also apply to the Titans, the 1080 Ti, the 2080 Super ... and once again, there is no need for a “disclaimer” - just don’t be stupid and you’ll do fine...
 

JimboJoneson

TS Booster
I just (painfully) read through all this and yeah, he genuinely doesn't understand or can't comprehend what a bottleneck is. Below is for @JimboJoneson


Hope this helped!
What is it you think I don't understand? When a CPU is bottlenecked and when it isn't? What?
 

JimboJoneson

TS Booster
It doesn’t only apply to the 2080 Ti, though ... it would also apply to the Titans, the 1080 Ti, the 2080 Super ... and once again, there is no need for a “disclaimer” - just don’t be stupid and you’ll do fine...
It only applies with those cards (except the Titan, which no one owns) paired with old CPUs that you can't buy new anymore ... my reference was about misleading people's forward thinking, including their purchasing habits, if you don't recall.
 

JimboJoneson

TS Booster
Perhaps you simply don’t understand the purpose of a tech site....
Please let me know ...

I only asked that reviewers add a disclaimer.
Is it unreasonable to ask for that, considering it clears up a very common misunderstanding for less tech-savvy readers who might be at a tech site to learn more?
 

JimboJoneson

TS Booster
So you're saying "less tech savvy readers" cannot fathom that their 2060 is worse than the 2080Ti?

Do you see where I'm going with this?
No, that's not what I said anywhere (you obviously didn't read through sht) ... I indicated that a 2060 owner doesn't know that their CPU will never be a bottleneck, and that this means their CPU makes no difference as long as it's modern.

They would obviously think that the performance scales linearly from the 2080 Ti 1080p low-settings graphs, because there is never any indication otherwise, which brings me back to my original point about the need for transparency of information ... you two are sure fighting hard to promote the continued obfuscation of when a CPU is and is not bottlenecked. That's pretty weird.

Is it unreasonable for reviewers to disclaim this or provide tests across different GPUs and settings? Why is that such a problem if you don't have a bias?

That was what I originally asked for, which Squid responded to ... would it be a problem for you if reviewers did that?
 

Squid Surprise

TS Evangelist
Please let me know ...

I only asked that reviewers add a disclaimer.
Ahhh... I thought you didn't understand.... let me help then :)
"The purpose of an information centric website is to convey specific, helpful information to a specific user/audience so that the reader learns something new or understands a topic better. "
The specific user/audience in this case is clearly not you - as you refuse to actually learn...
Is it unreasonable to ask for that, considering it clears up a very common misunderstanding for less tech-savvy readers who might be at a tech site to learn more?
Yes, it IS unreasonable ... because it is NOT a "common misunderstanding". It's a misunderstanding that only seems to apply to YOU and your "not stupid" guy, who I'm still fairly sure is ALSO YOU.
 

JimboJoneson

TS Booster
Ahhh... I thought you didn't understand.... let me help then :)
"The purpose of an information centric website is to convey specific, helpful information to a specific user/audience so that the reader learns something new or understands a topic better. "
The specific user/audience in this case is clearly not you - as you refuse to actually learn...
Well, that's funny ... because my original post that you commented on expressed my desire to help learners understand more accurately what the charts and graphs represent, because someone who learned from and trusted the things a tech site posted, just like this site does, made a bad purchasing decision as a result ... you said: "If someone is stupid, they deserve ... (blah blah)" ... so is a tech site for people to learn from, or to be insulted as "stupid" on if they use the info provided on said tech site to make decisions? Which one? ... or should I ask, which time were you lying when you said that, this one or that one? Because you can't claim both contradictions.

Yes, it IS unreasonable ... because it is NOT a "common misunderstanding". It's a misunderstanding that only seems to apply to YOU and your "not stupid" guy, who I'm still fairly sure is ALSO YOU.
This misunderstanding only applies to me? So what you are saying is that my misunderstanding is that review graphs only represent a bottleneck situation, which 99.9 percent of all gamers don't game within, and that I don't understand that this isn't reality?

Seriously? This is now what you are arguing? Are you feeling ok man?

Let's assume it was me ... do you think the purpose of a tech site is to mock people like me? To make fun of people who took information at face value, as presented, and made a decision based on that? Is that what you are here for?

You tell me ...
 

mailpup

TS Special Forces
That's enough personal comments and arguing. If you feel the need to continue your conversation, please do so through PM. Thank you.
 

Squid Surprise

TS Evangelist
One can think of this benchmark kind of like a "World's Strongest Man" competition. We want to find out who the strongest man is - so we give them a bunch of tests to see who can do them better...

Someone might ask, "When is a man ever actually going to be pulling an airplane?!?!?"

So we change the competition so that everyone has to pull a paperback book instead.... funny... now, even I have a decent chance of winning the competition!
 

markrb

TS Rookie
Overall I couldn't care less; 8/16 vs. 8/16, the 9900K rocks the 3700X and 3800X in games, and gaming performance is all that you, and most people, truly care about (even though I just kind of dabble :D). The Intel chips have other tweaks plus overclocking headroom, which Ryzen does not.
No one cares about file copy and file zipping performance. NOBODY.
You're obviously bothered and emotional; it's just silicon, dude. Again, I couldn't care less.
Everything I've stated about those comparisons is spot on. It's not a knock on or praise for AMD or Intel; it's fair and comparative, sure, but that doesn't make it equal.
Judging by your reply, your tone, and your use of the word 'fanboi' and other nonsense, I would take a break.
I never said the reviewer did a bad job, or that his intentions were a certain way; I said certain comparisons aren't genuinely equal sometimes, and recent history and other factors played a beneficial role in some of these results.
The author's mobo tweaks and results with the 3950X are pretty cool; it fits the bill nicely as a $1000 CPU.
You keep going back to the price argument and applying it unfairly (which is what got you labelled a Fanboi, IMO). You wanted the 3800X ($380 on Amazon) to be compared against the 9900K ($465 on Amazon), when it is priced similarly to the 9700K ($400 on Amazon). The 3900X is priced against the 9900K.

I have no idea where you have seen the 3950X priced at $1000. If I didn't know better I would think you were just making this stuff up :)

As per a previous TechSpot review, the 3900X is 6% slower than the 9900K (1080p gaming), and they are the same price. The 3900X is faster at EVERYTHING else. The difference in gaming performance narrows even further when you go to 1440p, which is what people with $400+ CPUs will be using.

More importantly, the 9900K is slower than the 3900X while gaming and streaming simultaneously. Again, you will probably respond with "NOBODY does that." Meanwhile, in my experience, about half of my friends are streaming/recording their games now. They are all using Ryzen 7s and 9s because they are faster at the video encoding needed while streaming.

Back to this article, most people will take the following away:

For an extra $200 on top of the price of a 9900KS you get
1) within 5FPS on most games at 1080p, less than 5FPS at 1440p or higher
2) 20-40% faster video encoding times for HEVC media center PCs / Adobe Premiere encodes
3) 60% faster performance for software compilation - but NOBODY uses Linux and compiles their own software, right? ;)
4) almost 100% performance improvement for 3D rendering Autodesk Maya / 3ds Max / Blender
5) the ability to use a PCIe 4.0 motherboard and get an RTX 2080 Ti running with 3 NVMe drives in RAID and still not saturate the PCIe bus. Try doing that on a Z390!

It is fine if you don't care about those things, but I consider my Linux Kernel compilation time to be just as important as my Fallen Order FPS performance.

You also said: "The Intel's have other tweaks + overclocking headroom, which Ryzen does not."

You don't seem to realise that you are saying this on a 9900KS article ... the 9900KS already MCEs to 5GHz ... most people are silicon-maxed at 5.1GHz ... that's 100MHz of overclocking potential!!!

Finally, you are flat-out wrong when you say "NOBODY" cares about file compression performance. Just because YOU, for example, don't take large backups and compress them doesn't mean nobody does.
 

amstech

IT Overlord
You keep going back to the price argument and applying it unfairly (which is what got you labelled a Fanboi, IMO). You wanted the 3800X ($380 on Amazon) to be compared against the 9900K ($465 on Amazon), when it is priced similarly to the 9700K ($400 on Amazon). The 3900X is priced against the 9900K.
The 9700K will whoop its a$$ in games too.
That was an 8/16 vs. 8/16 comparison to show that, even though the 9900K costs a little more, it makes up for it by whooping the 3700X and 3800X in games despite having the same core/thread count.
And the person who called me a fanboy is obviously a clown, no one cares what people like that have to say.
I never said Ryzen wasn't a good buy, or very impressive.
It's just average when it comes to gaming, and it takes AMD's absolute best, more expensive chip to hang with Intel's 4th- or 5th-best gaming CPU.
Sorry this bothers you; it's the truth, so tough sh!t.
The whimpering is annoying, deal with it.

As per a previous TechSpot review, the 3900X is 6% slower than the 9900K (1080p gaming), and they are the same price. The 3900X is faster at EVERYTHING else. The difference in gaming performance narrows even further when you go to 1440p, which is what people with $400+ CPUs will be using.
The 3900X is a $500-600 CPU that loses to a $350 CPU in gaming.
Who cares about everything else? Nobody.
The market share tipped a little bit, but the people I speak to, who are mostly gamers, are quite happy with their Intel chips and don't plan on jumping ship.
Ryzen is great, but not that great, and worse/below average in gaming performance.
The $350 9700K is matching/beating AMD's best effort, which costs 3-4 times more.


More importantly, the 9900K is slower than the 3900X while gaming and streaming simultaneously.
First I have heard of this.

You don't seem to realise that you are saying this on a 9900KS article ... the 9900KS already MCEs to 5GHz ... most people are silicon-maxed at 5.1GHz ... that's 100MHz of overclocking potential!!!
5.1GHz is not the limit; many folks are running 8700Ks, 9700Ks and 9900Ks @ 5.2/5.3GHz.
Finally, you are flat-out wrong when you say "NOBODY" cares about file compression performance.
Okay, not everyone - just the vast majority.
Ryzen is a great bargain, but for gaming it's below average and, to be perfectly honest, not that impressive when you compare them core for core, thread for thread, like comparing the 3700X/3800X to a 9900K.
Even in non-gaming benchmarks the 9900K bests the 3700X/3800X in a few tests, about matches them in all, and SMOKES THEIR COOKIES in gaming benchmarks.
If you're a gamer, it's worth the extra $60.
Techspot had trouble admitting it, but for raw gaming performance they came through and gave the 9900K its due.
The 9700K is not far behind.
A 6-8% average is massive.
Yes, in some games that means only a 5FPS difference; in others it's 18FPS.
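To make that arithmetic concrete: a fixed relative gap translates into very different absolute FPS deltas depending on the baseline frame rate. The baseline numbers below are made-up round figures for illustration, not benchmark results.

```python
# How the same ~7% relative gap turns into different absolute FPS deltas.
# Baseline frame rates are invented for illustration only.

def fps_delta(base_fps: float, pct_gap: float) -> float:
    """Absolute FPS difference implied by a relative gap over a baseline."""
    return base_fps * pct_gap

# At a 70 FPS baseline, a 7% gap is only ~5 FPS...
print(round(fps_delta(70, 0.07), 1))   # 4.9

# ...while at a 250 FPS baseline the same 7% gap is ~18 FPS.
print(round(fps_delta(250, 0.07), 1))  # 17.5
```

Same percentage either way - which is why averaged percentage leads can look small in one game's chart and large in another's.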
 