AMD Ryzen 9 3950X Review: The New Performance King

Actually, my point about gaming (mostly high refresh rate) was that even at 5GHz I'll bet AMD will not match Intel. The ring bus beats Infinity Fabric for latency, and games want low latency. AMD may need 5.5GHz (good luck with that), as 5.1 to 5.2GHz is pretty common for the 9700K and 9900K.

On this architecture, if you added 15 percent higher clocks to the AMD parts, I suspect the gap in the majority of games would close quite a lot. Worst-case scenarios are a 20 percent deficit at 1080p; the average is about 10 percent. The above tests show that IPC is very close.
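A rough back-of-the-envelope check of that claim (just a sketch: it assumes gaming performance scales linearly with clock speed when CPU-bound, which is optimistic, and it uses the ~10%/~20% deficits mentioned above):

```python
# Back-of-the-envelope check of the claim above. Assumes gaming performance
# scales linearly with clock speed when CPU-bound, which is an optimistic
# simplification; the deficit figures are the ones quoted in this thread.
amd_average = 0.90   # ~10% average deficit at 1080p
amd_worst   = 0.80   # ~20% deficit in the worst cases
clock_boost = 1.15   # hypothetical 15% clock increase

print(f"average case: {amd_average * clock_boost:.2f}x Intel")  # ~1.03x
print(f"worst case:   {amd_worst * clock_boost:.2f}x Intel")    # ~0.92x
```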

For some heavily multithreaded games AMD has proven to be closer, and this is one key component of the future for games. When the new consoles arrive, I think multi-core performance will certainly end up being better utilised on PC.

At the moment the consoles might use 7 threads, but they are all pitifully slow. When those games are moved to PC, any decent 4C/8T CPU smashes it out of the park. That will rapidly cease. With such a large baseline (console) increase in CPU performance available to developers, it will definitely translate into greater multithreaded CPU demand over the next few years.

Then there is the next step for Zen 3, which looks like it should unify the L3 cache across a full 8-core chiplet instead of the current 4-core CCX design. There are definitely significant latency gains to be had, even if there is only limited scope for clock speed improvements.

To this end, it seems apparent that AMD is making bigger gains than Intel each generation. Intel might still be squeezing out a lead 12 months from now; however, that lead seems to shrink with each iteration AMD produces. Intel needs 10nm on the desktop to work, and very soon.
 
At the moment the consoles might use 7 threads but they are all pitifully slow
Mostly because the CPU isn't really an 8-core chip - both the PS4 and Xbox One use a processor with two 4-core modules on the same die, so any thread requiring data from the other module's L2 cache suffers a significant performance penalty. This is why, even now, the main game engine loop runs on a single thread, which in turn dispatches concurrent jobs onto other threads. The next generation of consoles will be far superior in this respect.
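For what it's worth, here is a minimal Python sketch of that "one main thread dispatches jobs to workers" pattern (the function and entity names are made up for illustration, not taken from any real engine):

```python
# Minimal sketch of the "one main thread dispatches jobs to workers" pattern
# described above. All names are illustrative only, not from a real engine.
from concurrent.futures import ThreadPoolExecutor

def update_ai(entities):         # hypothetical per-frame job
    return [f"{e}:ai_done" for e in entities]

def update_physics(entities):    # hypothetical per-frame job
    return [f"{e}:physics_done" for e in entities]

def main_game_loop(frames=3):
    entities = ["player", "npc_1", "npc_2"]
    # The main thread only submits work and waits for it; the workers don't
    # share mutable state with each other, which keeps cross-core cache
    # traffic (the penalty described above) to a minimum.
    with ThreadPoolExecutor(max_workers=4) as pool:
        for _ in range(frames):
            ai_job = pool.submit(update_ai, entities)
            physics_job = pool.submit(update_physics, entities)
            ai_job.result()      # join this frame's jobs before rendering
            physics_job.result()
            # ...render the frame here, then start the next one...

main_game_loop()
```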
 
Bah humbug. Come on, AMD! 8 cores @ 5GHz. The majority aren't looking for more cores; we are looking for more speed. Shoot, I'd be happy with 4 cores at 5GHz. Still waiting...

AMD 8 cores, or maybe just 4 cores @ 5GHz, eh? I know just the CPU for you ... they called it "Bulldozer" -- the great thing is that it is both an 8-core and a 4-core in one! ;-)
 
In a few years, when the price of second-hand chips reaches 250, I will upgrade to this 16-core monster.
For now, 16 threads will be enough.
 
That is not what I said. The only way your example would compare to what I said is if rape wasn't illegal yesterday.
It compares perfectly well - there is a thing called antitrust law in many countries, and Intel was fined under those laws - also in many countries.

It almost seems like you are insinuating that there is no rule of law in western countries...

To be clearer: no new laws were created because of Intel's behavior - they were fined under existing laws. This can all be easily found.
 
Question for you guys: Did you implement the new microcode updates for Intel's CPUs?

The update is to deal with ZombieLoad version 2 and the Jump Conditional Code bug (the Kiss Intel bug, pronounced Kiss n' tell). That affects Intel CPUs from Haswell up to Coffee Lake. Some sources are showing it loses approximately another 5 to 7% depending on the workload.
It should not affect desktop workloads (1-2% at most). For server workloads it's different though. I don't think the charts will change.
It doesn't matter anyway; Intel doesn't really have anything that can compete properly with this CPU.
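For anyone wondering how to check for themselves: on a reasonably recent Linux kernel you can read the mitigation status straight from sysfs (a quick sketch only; on Windows you would check the OS build and microcode revision instead):

```python
# Quick way to see whether the CPU vulnerability mitigations are active,
# on a reasonably recent Linux kernel. This sysfs interface doesn't exist
# on Windows; there you'd check the OS build and microcode revision instead.
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
if vuln_dir.is_dir():
    for entry in sorted(vuln_dir.iterdir()):
        # e.g. "tsx_async_abort" corresponds to ZombieLoad v2 (TAA)
        print(f"{entry.name}: {entry.read_text().strip()}")
else:
    print("No vulnerabilities sysfs interface on this kernel")
```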
 
That is not what I said. The only way your example would compare to what I said is if rape wasn't illegal yesterday.
Paying off vendors' CEOs billion$ to not buy competitors' products, then trying to interfere with the investigation, destroying evidence, etc. is not legal under anyone's definition and certainly didn't suddenly become illegal after that lawsuit. It's clear that the level of information you actually have on this topic, and on business law in general, is pretty thin.
 
You're mostly correct; however, I would like to mention a few things:
A) There is a huge drop-off with AMD CPUs in multiplayer games with 64 players, and it showed up in multiple titles. This is a huge deal for gamers. Someone posted some benchmarks from Battlefield; I'll see if I can dig them up.
B) While I mostly agree with you that most of those benchmarks are for bragging rights, those of us who play at 1440p/144Hz will benefit from another 10-20 FPS. If you compared 35 games, you would get the same results Techspot just did, only with more examples of each: a few more games running 10-20 FPS faster, a few 5-10 FPS faster, and a few with no difference.

One could also argue that Ryzen has been out long enough and has mostly matured, and that some of those Intel chips like the 9700K can overclock to 5.2-5.4GHz, picking up another 3-8 FPS in some games, while you're pretty much maxed out with Ryzen. It all comes down to price range: if I were building a budget gaming PC, the Ryzen 3600 is a stud. But if I were spending around $300, I'd get Intel and overclock it.
It will be years before gaming results truly change and use more than 8 cores; the new consoles (PS5/Xbox Scarlett) are rumored to have 8-core/16-thread processors, and that's what most developers will be shooting for. Nothing wrong with being future-proof either, but at the same time, when will folks who only care about gaming actually need more than 8 cores? Not now, that's for sure, and that's with newer titles.

A) I believe the Battlefield 64-player issue was chalked up to outdated Windows drivers. Either way, this wouldn't be a hardware or architectural issue without showing up clearly in much less obscure examples.
B) Most of the reviews I've read have been much closer than that at anything over 1440p. At some point it just isn't a good trade to give up a significant multithreaded advantage for a few fps in select games. If all you do is game and you're planning to overclock, I agree with you; why care whether you're purchasing a mature platform? It'll probably be a while before PCIe 4.0 adds anything significant to gaming, so it's not like you're future-proofing by going with AMD. There is the argument of constant mitigations, though; eventually those have to add up.
 
A) I believe the Battlefield 64-player issue was chalked up to outdated Windows drivers. Either way, this wouldn't be a hardware or architectural issue without showing up clearly in much less obscure examples.
B) Most of the reviews I've read have been much closer than that at anything over 1440p. At some point it just isn't a good trade to give up a significant multithreaded advantage for a few fps in select games. If all you do is game and you're planning to overclock, I agree with you; why care whether you're purchasing a mature platform? It'll probably be a while before PCIe 4.0 adds anything significant to gaming, so it's not like you're future-proofing by going with AMD. There is the argument of constant mitigations, though; eventually those have to add up.

Techspot got a 4% difference in average FPS between 1080p and 1440p on a 2080 Ti at very high or ultra settings between, I think it was, the 3600 (it may have been the 3700X) and the 9700K. 4% is within margin-of-error territory, and with anything less than a 2080 Ti that number drops to parity. I don't know why this is so hard for some people to understand. That post wasn't worth responding to, IMO.

To 99.99% of real-life gamers, there wouldn't be any visible FPS difference switching from a $200 R5 3600 to a $375 i7 9700K, because only 0.001% of real gamers play with a bottlenecked CPU in reality, nor do they need to ... it just wastes the capabilities of the GPU you spent good money on. Now, taking that $175 and putting it into faster RAM (with Ryzen) and/or a better GPU will actually get you higher FPS.
 
To 99.99% of real-life gamers, there wouldn't be any visible FPS difference switching from a $200 R5 3600 to a $375 i7 9700K, because only 0.001% of real gamers play with a bottlenecked CPU in reality, nor do they need to ... it just wastes the capabilities of the GPU you spent good money on. Now, taking that $175 and putting it into faster RAM (with Ryzen) and/or a better GPU will actually get you higher FPS.
10-20 FPS is quite significant, and 4-8 FPS across many games is still an advantage. You can say it's 1% or 4% or whatever, but for gaming, Intel still controls the board: in some cases its chips are 7-12% faster, and in many cases they are no faster.
For folks like me playing at 1440p/144Hz, the difference between getting 118 FPS from a 3600 and 136 FPS from a 9700K is pretty substantial.
Intel's $300 9700K is beating AMD's $700 CPU across the board in gaming, and it also has room to overclock.
I'm not saying AMD isn't a great buy, and in many cases there will be little to no difference, but there is a difference. If building a high-end gaming PC, bias aside, I would absolutely go Intel, because for gaming they are better, period.
I'm also interested in Comet Lake, which is rumored to hit 5.2GHz in stock form, meaning it may be capable of 5.4-5.5GHz, which would add another 5-10 FPS across the board.
 
Does higher FPS translate into lower latency? Isn't that the reason people are clamoring for more FPS? It also looks like faster CPUs are becoming less of a factor than faster GPUs. I suppose it depends on the game. If I understand it correctly, faster GPUs give better framerates while faster CPUs process more characters on screen. I can definitely say that faster hard drives have helped tremendously in open-world games when moving into new sections. In the early days, games gained a definite uplift from faster CPUs, but today that doesn't necessarily seem to be the case, which changes the thinking on upgrade strategy. As someone who has to be budget conscious, I'm just looking for good gameplay at 720p. Can anyone explain this better than I can?
 
Does higher FPS translate into lower latency? Isn't that the reason people are clamoring for more FPS? It also looks like faster CPUs are becoming less of a factor than faster GPUs. I suppose it depends on the game. If I understand it correctly, faster GPUs give better framerates while faster CPUs process more characters on screen. I can definitely say that faster hard drives have helped tremendously in open-world games when moving into new sections. In the early days, games gained a definite uplift from faster CPUs, but today that doesn't necessarily seem to be the case, which changes the thinking on upgrade strategy. As someone who has to be budget conscious, I'm just looking for good gameplay at 720p. Can anyone explain this better than I can?

Anyone who's been into PC gaming for as long as I have scratches their head at how little understanding there is of what a CPU does and does not contribute to gaming.

The GPU is the bottleneck in 99.99% of real-life gaming scenarios. A bottlenecked GPU essentially removes the CPU from the equation altogether, and it doesn't really matter what CPU you have as long as it's not a Pentium 4 or Bulldozer - any modern mid- to high-end CPU from either AMD or Intel will make almost no difference if the GPU is the bottleneck.

When is a CPU not bottlenecked? If you have a modern CPU (Ryzen / 6th-gen Core or newer) with at least 6 threads and your GPU is anything below a 2080-series card, you will never have a CPU bottleneck.

If you game at 4K with any video card, including a 2080 Ti, you will never have a CPU bottleneck.

If you game at 1440p with any card below an RTX 2080, you will never have a CPU bottleneck.

If you game at 1080p with a card below an RTX 2070, you will never have a CPU bottleneck.

If you like to crank all the visual settings to ultra, you'll have almost no CPU bottleneck with any card below a 2080 Ti.

If you use ray tracing features, you will never have a CPU bottleneck.

Now deduce whether you will have a CPU bottleneck or not ... if not, then the difference between any modern AMD and Intel CPU will be ZERO.

If you don't have a CPU bottleneck, your CPU makes almost no difference.

If your monitor is below 120Hz, then even if you DO have a CPU bottleneck, your display's refresh rate will cap everything anyway; and if your FPS is below that refresh rate, you don't have a CPU bottleneck.
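To make those rules of thumb concrete, here is a toy Python encoding of them (these are the heuristics stated above, not measured data; the GPU "tier" numbers and names are purely illustrative):

```python
# A literal encoding of the rules of thumb above. These are forum heuristics,
# not measured data; the GPU "tier" numbers are purely illustrative.
GPU_TIER = {"GTX 1060": 1, "RTX 2070": 2, "RTX 2080": 3, "RTX 2080 Ti": 4}

def cpu_bottlenecked(gpu: str, resolution: str, ultra: bool = False,
                     raytracing: bool = False) -> bool:
    """Apply the rules above; assumes a modern CPU with at least 6 threads."""
    tier = GPU_TIER.get(gpu, 1)
    if raytracing or resolution == "4K":
        return False                         # never CPU-bound per the rules
    if resolution == "1440p" and tier < GPU_TIER["RTX 2080"]:
        return False
    if resolution == "1080p" and tier < GPU_TIER["RTX 2070"]:
        return False
    if ultra and tier < GPU_TIER["RTX 2080 Ti"]:
        return False
    return True                              # the remaining (rare) combinations

print(cpu_bottlenecked("GTX 1060", "1080p"))     # False
print(cpu_bottlenecked("RTX 2080 Ti", "1080p"))  # True
```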

I find it hard to believe that people on this forum struggle so hard to understand this, when Techspot is actually pretty good about including a variety of graphs showing what performance is like with lesser cards or higher resolutions. Do you guys just discard all that to focus on the artificially induced bottlenecked CPU numbers that only represent about 0.001% of all gaming setups? (sorry, rhetorical question - I'm addressing this response to the public in general, not specifically to you Danny101)


At 4K, even with a 2080 Ti, the $200 R7 2700X (last-gen AMD) is equal to a $550 9900K ... anyone want to waste $350 on 4K gaming? Anyone? Anyone? Bueller? Bueller?

[Chart: Battlefield V 64-player benchmark at 4K, R7 2700X vs i9-9900K]



As far as the grasping-at-straws "latency factor" goes, the difference between 150fps and 165fps isn't going to affect your latency. Besides, we now have latency reduction as a BUILT-IN feature in all modern GPUs ...
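Put in frame-time terms (simple arithmetic, also using the 118/136 fps figures quoted earlier in the thread), the gaps are well under a couple of milliseconds per frame:

```python
# Frame-time view of the latency argument: convert fps to ms per frame.
for fps in (150, 165, 118, 136):
    print(f"{fps} fps -> {1000 / fps:.2f} ms per frame")
# 150 -> 6.67 ms vs 165 -> 6.06 ms: about 0.6 ms apart
# 118 -> 8.47 ms vs 136 -> 7.35 ms: about 1.1 ms apart
```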

Now, don't even get me started on how average FPS isn't even the deciding factor where gaming experience quality and smooth gameplay are concerned. 120fps with 60fps 1% lows is inferior to 90fps with 75fps 1% lows.

Average FPS doesn't tell you what the gameplay quality and smoothness are like at all. There's more to how well you experience a game than the average FPS number.
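For anyone unfamiliar with how "1% lows" are computed, here is a rough sketch using one common definition (the average frame rate of the slowest 1% of frames; different tools define it slightly differently, and the capture data here is synthetic):

```python
# Sketch of one common way to derive average FPS and "1% low" FPS from a
# capture of per-frame times. The frame-time data below is synthetic.
import numpy as np

rng = np.random.default_rng(0)
frame_times_ms = rng.normal(8.3, 1.5, 10_000).clip(4, 40)  # fake capture

avg_fps = 1000 / frame_times_ms.mean()
slowest_1pct = np.sort(frame_times_ms)[-len(frame_times_ms) // 100:]
low_1pct_fps = 1000 / slowest_1pct.mean()

print(f"average: {avg_fps:.0f} fps, 1% low: {low_1pct_fps:.0f} fps")
```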

Steve Burke from Gamers Nexus now recommends against the 9600K because, while the average FPS numbers are good, the poor 1% lows in some games cause stutter, ruining gameplay. He notes that the R5 3600 doesn't have this issue and considers it a superior gaming processor to the 9600K.



Actually, I DO know why there is this complete lack of clarity regarding what a CPU does and does not contribute to gaming. When Ryzen first hit the scene, it offered 100% more cores than Intel's flagship mainstream desktop part and whooped it at a whole lot of things. Intel's equivalent 8-core CPU was HEDT and cost $700 for the chip alone - almost double the AMD offering.

Before this point, every enthusiast was well aware that gaming performance was dictated by the GPU, not the CPU (which it still is), with the exception of inducing an artificial bottleneck that doesn't represent real life ... this was well known, especially the "not real life" part.

Intel needed a new marketing (propaganda) strategy in light of Ryzen, so they emphasized the importance of the CPU to gaming (to cast the widest net possible) and incentivized reviewers to focus their Ryzen CPU comparisons on artificially bottlenecked gaming instead of actual CPU work.

You can still see the effect today, with some reviewers taking workstation-targeted CPUs, doing 75% of their testing on bottlenecked gaming, and not really testing the CPU in the applications it was designed for. 12-, 16- and 18-core CPUs are not designed for gaming; that is not their main purpose, yet the reviews still focus almost entirely on bottlenecked gaming that isn't even real world ...

Don't underestimate Intel's ability to make people think the way they want them to think. They fool their investors constantly with paid-for headlines.

They didn't hire an army of tech writers for nothing ... (they actually did that)
 
Don't underestimate Intel's ability to make people think the way they want them to think. They fool their investors constantly with paid-for headlines.
They didn't hire an army of tech writers for nothing ... (they actually did that)
You have a great post here, but your chart @ 4K has the GPU doing all the work, which does a great job showing that you only need a certain CPU to hit a certain mark IF you have a certain GPU.
The benchmarks showing Intel faster than AMD are the lower-resolution benchmarks at 1080p, and even some at 1440p.

This shows you which CPUs are running which games faster.
If you look at the list of games at 1080p and 1440p with the 9900K vs the 3900X, the 3900X gets beaten by 10-20 FPS in many games and by 5-10 FPS in many others. AMD has improved some, but as this article shows with the 3950X, Intel's chips are still noticeably faster, and without dips or stuttering. I'm not making a case against AMD - Ryzen's performance is right there - but Intel is still leading when it comes to gaming.
 
You have a great post here, but your chart @ 4K has the GPU doing all the work, which does a great job showing that you only need a certain CPU to hit a certain mark IF you have a certain GPU.

Yes ... only IF you have a 2080ti or lower GPU .... lol.

"Whooosh!!!"

I really don't know if you're just trolling me at this point ... ?

A 2080 Ti and lower GPU ... that's every GPU ever made since GPUs were invented.
 
Yes ... only IF you have a 2080ti or lower GPU .... lol.
No, it's lower across the board with various AMD CPUs pushing various GPUs at various resolutions. AMD's Ryzen is right there in most cases, but when it comes to gaming, Intel is faster, with more OC headroom.
This particular discussion is over until Comet Lake arrives, which is rumored to be capable of 5.4-5.5GHz.
The race is on, and it's great; AMD has a significant and truly powerful architecture with Ryzen, and god bless the world for competition.
 
You've shown that at very high resolutions the GPU does most of the work.

Resolution has nothing to do with it ... it's about whether the CPU is bottlenecked or not.

When a CPU isn't bottlenecked, the GPU does all the work ... it's not just about the resolution at all -- a GTX 1060 (the card most people in the world own) will never be able to bottleneck a modern gaming CPU at ANY playable resolution - even low ones ... NEVER.

You clearly didn't read my posts (obviously you are more interested in trolling than learning the truth), where I clearly outlined when a CPU is bottlenecked and when it isn't.

If you had read that post without your fanboi goggles, you would realize that in 99.99% of all real-life gaming scenarios ... the GPU is doing all the work, reducing the CPU difference to zero.


Anyway, it's clear you are just trolling me and have no interest in actually knowing how a CPU does and does not impact gaming ... Until you learn this, you probably shouldn't speak on the subject, so as not to appear uninformed to people who know better.

I think this point has been well proven ... "Don't underestimate Intel's ability to make people think the way they want them to think."
 
... the GPU is doing all the work, reducing the CPU difference to zero.
If you're gaming at 4K.
Your BF5 64-player chart @ 4K is completely useless here.
None of your posts or information counters anything I've said, so unless you've got something we all haven't seen, let's just stop, please. Here is a tidbit from the 1080p benchmark in this very article, from the author:
3950X is a whisker faster than the 3900X and that made it just a fraction slower than the 8700K and 9700K while it was 12% slower than the 9900K
At 1080p (in this game), the 3950X is 12% slower than the 9900K.
12 percent is SIGNIFICANTLY SLOWER.


a GTX 1060 (the card most people in the world own) will never be able to bottleneck a modern gaming CPU at ANY playable resolution - even low ones ... NEVER.
This does not counter the results showing that, in many cases with the same GPU, the Intel chips are faster than AMD's offerings, and that the same GPU performs better with a 9900K than it does with even a 3950X: sometimes 10-20 FPS faster in some games, 5-10 FPS faster in many, and next to zero difference in many as well. For a gaming build at certain price points, Intel is the chip to get, with more overclock headroom, if all the builder cares about is gaming. Most will not find the difference to be enough, so both choices are going to be just fine.
 
<Gets an explanation of how review benchmarks are misleading because they don't typically represent any scenario people actually play in ... then comes back claiming said benchmarks are proof of real-life gameplay ...>

:double facepalm: buddy you're not doing yourself any favours here, trust me ... :)

You don't know when to stop digging the hole, do you? lol. I'm fairly new here, but I suspect you're one of the special ones that everyone just ignores. And now me too. Have fun using that level of logic in real life after you grow up.
 