AMD Ryzen 9 3900X and Ryzen 7 3700X Review: Kings of Productivity

Here you have the same game, with an i5-8400 and a GTX 1060. Compare the two videos and draw your own conclusions.

Yes, because comparing gameplay from 2 different streamers is the best way to choose a gaming CPU...

Dude, it's getting beyond sad at this point. You are really digging deep with these foreign unknown YouTube videos.
 
Tfw 1/4 of the posts in this thread are about you damage-controlling for Intel.
 
Lol why do you guys allow LogiGaming to post on this site. The guy adds nothing to the topic; it's just straight Intel shilling. Like some 40-year-old man child living in his parents' basement.

But but but but Gaming.....

Now back to the topic on hand.
I want to see the results of the 3800X; I'm going to be doing a new Ryzen build in about a month once all the dust settles. Fortunately for me, I do more than play games on my rig.

Almost forgot: thank you TechSpot for the review and the time put in.

You surely added a lot by getting personal with another user and letting everyone know that you are special by not only using a PC for games and wanting a 3800X. Thanks for letting us know xD

There is no respect on this website. People can't have opinions; the usual suspects will attack other users with "shill", "retarded", "40-year-old in a basement". To me this is sad. Feel free to disagree with someone, but going low is unnecessary.

With that being said, and as an R5 2600 owner: yes, Intel is still ahead in games, and drastically ahead in some cases. I know what kind of benchmarkers I like to follow, certain high-refresh competitive gaming scenarios, and Intel has some steady leads, like Escape from Tarkov at 1080p with 75% resolution scale (200 fps average vs 147 fps on the 3900X), or Quake Champions on Church of Azathoth (160 fps on the 3900X vs 210 fps on the 9600K).

This is crucial for me, as high fps = better aim in these competitive games. So it is a specific case which I value, and few reviewers analyze this aspect.

For the casual player, and almost everyone, AMD is more than enough. But please do not say Intel is worthless, because that is not true. My next upgrade will be Intel, to pair with my recent Alienware AW2518HF 240 Hz monitor, which I got for €260 on Amazon.

And before someone says "meh, gaming": yes, I only use my PC for entertainment, online gaming and watching movies. I have a life apart from being on my PC. Don't see anything wrong with that.
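The fps figures quoted above translate directly into per-frame time budgets, which is where the "high fps = better aim" argument comes from. A quick sketch (the labels and fps numbers are just the ones quoted in this post, and the formula is simply 1000/fps):

```python
def frame_time_ms(fps: float) -> float:
    """Average time budget per frame, in milliseconds."""
    return 1000.0 / fps

# Figures quoted above for Escape from Tarkov at 75% resolution scale.
for cpu, fps in [("Intel", 200), ("Ryzen 3900X", 147)]:
    print(f"{cpu}: {fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

At 200 fps each frame takes 5.0 ms versus 6.8 ms at 147 fps; lower frame times mean the image on screen is fresher, though total input-to-photon latency involves more than just frame time.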
 
You are forgetting a big thing in your flawed conclusions. Most of those pro gamers that are on 1080p use lower resolution scales, from 50% to 75%, which are the most common ones. And that's the same as playing at lower resolutions: the effective res is still 1080p, but internally downscaled, so the GPU handles a lot less. Resolution scales are not on those charts.

But hey, you also said I'm not Portuguese and that I use Google Translate to speak Portuguese, trying to win an argument, so everything is possible :D
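The resolution-scale point above is easy to put in numbers. A minimal sketch, assuming the scale is applied per axis (which is how most engines implement it; some scale total pixel count instead):

```python
def internal_resolution(width: int, height: int, scale: float) -> tuple[int, int]:
    """Internal render-target size for a given per-axis resolution scale."""
    return int(width * scale), int(height * scale)

w, h = internal_resolution(1920, 1080, 0.75)
print(f"Output 1920x1080 at 75% scale renders internally at {w}x{h}")
print(f"That is {(w * h) / (1920 * 1080):.0%} of the native pixel load")
```

So a "1080p, 75% scale" run is really a roughly 1440x810 GPU load (about 56% of the native pixels), which is why it behaves like a lower-resolution, more CPU-bound test.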
Yeah, no. An incredibly small number might still use 4:3 sometimes, but nobody is using a lower resolution in games anymore, beyond maybe potato systems and cheap laptops. You can go and ask any CS:GO pro about it; this debate ended a long time ago.
FYI, in CS:GO, Zen 2 is doing really well, so your entire argument falls flat on its face, and I know of no other esports game where people used to play like this at a pro level, beyond maybe Quake 10-15 years ago when it still had a small esports scene.
 
With that being said, and as an R5 2600 owner: yes, Intel is still ahead in games, and drastically ahead in some cases. Intel has some steady leads, like Escape from Tarkov at 1080p with 75% resolution scale (200 fps average vs 147 fps on the 3900X), or Quake Champions on Church of Azathoth (160 fps on the 3900X vs 210 fps on the 9600K).

You keep mentioning Quake Champions and weird numbers that make no sense.
Here dude, here are your favourite 720p results for Quake (color me unimpressed that you lied again about the 3900X):

[Image: Quake.png — 720p Quake Champions benchmark chart]


Please post more numbers so that people better understand you and your "problems", and how lowering the resolution gives you low FPS on your AMD system.
 
One thing to note is that these chips (as mentioned many times) somehow can't hit their advertised max clocks. I'd laugh my a** off if AMD pulls an "HD 7970 GHz Edition" with these chips via BIOS updates. R9 3900X @ 4.6 GHz on all cores, anyone?
 
It seems that new BIOS versions are being released after launch (AGESA 1003a/b), and these improve some of the small boost issues some people are seeing. GN didn't have problems with boost clocks, and the new BIOS improved the results by only 1-2% at most. There are variations between motherboards and chips. Other reviewers might see slightly bigger improvements, but I doubt it will be more than 2%.
 
Ah, the "it should be faster because it uses fewer nm" argument.

The biggest advantage of 7nm is the power savings and they are nothing short of extraordinary.

Hate to break it to you, but reducing node size typically doesn't improve performance as much as node refinement, given the same architecture. Just compare Sandy Bridge to Ivy Bridge, or Haswell to Broadwell. For further evidence, compare Kaby Lake-R to 10nm Cannon Lake...
Funny really, because I have no problem breaking the news to you that you are incorrect. Die shrinks often bring performance improvements. Look at the CPU in this review: it performs faster than the previous incarnation at 12nm. Look at the Radeon VII: it performs better than Vega 64 after the die shrink, and that didn't even have great efficiency improvements.

Although yes, I would agree that die shrinks generally just improve efficiency; however, manufacturers often use these efficiency improvements to deliver more performance. Smaller dies can often result in greater yields too, meaning bigger profits. The more you know, eh!
 

Given the same architecture is the important point here. The Radeon VII and Ryzen 3000 would have been better than their predecessors without the die shrink.
 
The fact that I haven't seen this many comments on an article about AMD on this site before, plus so many claiming things like "I'm disappointed" and "Intel is still king in gaming", just tells me how much AMD has improved: the Intel fans had to come out to defend their decision to be scammed. In just two years Intel has gone from 4 to 8 cores, and why was that?
 
What's with Vulkan being so CPU intensive lately? World War Z and Rage 2 both seem to give AMD more trouble than most games.
The Zen 2 architecture doesn't seem to be struggling any worse/better than Intel is in Rage 2, according to these test results. Comparing the 3900X to the 9900K, there's no difference in average fps worth talking about, and the 1% min difference is less than 6% (Intel vs AMD) at 1080p. Different story with WWZ, but it's a different game; Vulkan is nowhere near as easy to work with as OpenGL or Direct3D 11, as there is far less hand-holding by the rules and structures of the API. WWZ could be the way it is purely because the developers don't have enough experience with it yet.
 
You surely added a lot by getting personal with another user and letting everyone know that you are special by not only using a PC for games and wanting a 3800X. Thanks for letting us know xD

I never said I didn't game.
It just isn't the only thing I do with my PC since I have a career in IT.

I just find the one-track mind of gamers hilarious, even more so when you spam arguments over multiple pages about a 10 fps difference that most people won't notice in most games.

And I get it: all you and he do is play games on a PC, and that is fine. But to belittle a brand-new product that is introducing competition that we need in this space... is, well, juvenile.

For the amount of crap he posted, you would think he is a competitive player who actually makes money playing games, which we all know is not true.
 
I truly do not understand why everyone is jumping the gun on judgement here regarding these CPUs. They just came out, dang it! Give them a second to mature, for the BIOS and drivers to be updated to better support these chips, and you will see marked improvements. So typical of the culture of today. Friggin' Burger King society: your way right away or forget it. AMD has always had immature drivers and BIOSes for their chips and chipsets at launch. I do not honestly believe for a second that you guys can be that moved by tests that were performed on immature software. Get a grip, people!
 
... Give them a second to mature, for the BIOS and drivers to be updated to better support these chips, and you will see marked improvements. So typical of the culture of today. Friggin' Burger King society: your way right away or forget it. ...

Does AMD let us get the chip for a down payment and pay them in full when they are mature? Didn't think so. So what's with this sad attempt at damage control?

Whatever happened to products getting full QA cycles, solid testing, and not offloading/crowdsourcing beta testing onto the customers? Seriously, quality control is out the window, and there is just too much garbage to sort through. The price is too high for people to suffer through this much!
 
No benchmark can claim perfect accuracy, which is why margin of error exists. For CPU tests the current methodology has a test resolution of 3%. So if two different processors are within that 3% margin of error, neither can be called better than the other, as the test is not accurate enough to say that the difference cannot be otherwise attributed to variance from factors out of its control. The variance for the 3900X review has been a bit higher than normal due to a BIOS issue preventing max boost clocks. Hopefully we get another look, as the amount of variance exceeds the typical margin of error.

Multiple runs do not eliminate the margin of error. That 3% margin-of-error figure already assumes that you are doing multiple runs. Not doing multiple runs would greatly inflate the margin of error, if not make the data entirely worthless.

I'm not saying the 3900X is within margin of error of the 9900K all the time. I was replying to your comment on 1440p performance, where the 3900X is in fact well within margin of error on average. Heck, at 1080p it's 4.7% away (according to TechPowerUp).

https://www.techpowerup.com/review/amd-ryzen-9-3900x/22.html
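The margin-of-error argument above boils down to a simple check. A minimal sketch, assuming the 3% test-resolution figure under discussion (the fps values below are hypothetical, just to illustrate the comparison):

```python
def within_margin(fps_a: float, fps_b: float, margin: float = 0.03) -> bool:
    """True if the relative gap between two averages falls inside the
    test's margin of error, i.e. the benchmark cannot separate them."""
    return abs(fps_a - fps_b) / max(fps_a, fps_b) <= margin

# Hypothetical 1440p averages: a ~2% gap is inside a 3% margin,
# a ~6.7% gap is not.
print(within_margin(143.0, 140.0))  # True
print(within_margin(150.0, 140.0))  # False
```

By this logic, only gaps larger than the test's resolution support a "CPU A is faster" claim; anything smaller is a tie as far as that benchmark can tell.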

I haven't claimed that benchmarks are 100% accurate. Please give me reference for your 3% claim.

Multiple publications/sites, many test runs, and Ryzens are consistently slower than Cores in gaming. If it were due to "real" variance, we'd be seeing those CPUs leapfrog each other from site to site or test to test. The consistent trend, however, is that Ryzens are slower. Even if I grant you that anything below a 3% difference doesn't count (which I do not, until you provide some reference for that 3% claim), we have instances where the Cores' advantage goes beyond that. So we have a situation where a Core CPU is going to be as good as or better than the equivalent Ryzen. Sounds like an advantage to me.

I haven't said that doing multiple runs eliminates margin of error but hey, nice strawman.

If the 9900K (and 9700K for that matter) is above that margin in some games and never drops below it, then in fact it has an advantage over the 3900X. Moreover, please explain to me why that variance you speak of always works against Ryzens? The Core CPUs have consistently been shown on multiple sites, test setups, test samples, etc. to be faster in gaming than Ryzen CPUs. That's a fact. Which then leads to the natural conclusion that they have an advantage in gaming. Deal with it.

A small sample size does not disprove the whole.

https://prosettings.net/overwatch-pro-settings-gear-list/
https://prosettings.net/best-fortnite-settings-list/
https://prosettings.net/pubg-pro-settings-gear-list/
https://prosettings.net/apex-legends-pro-settings-gear-list/
https://prosettings.net/rainbow-6-pro-settings-gear-list/

Even by your own admission, it is only "some", and only in a single game. So in fact, by saying pros play at 1080p I was indeed correct, and no argument of semantics will change that.

A small sample size? CS:GO is one of the biggest esports titles around. It's consistently in the top 5 (top 3 more often than not) for concurrent players on Steam and sees millions of players each month (https://blog.counter-strike.net/index.php/category/updates/), and yet that's a small sample. OK.

Nice cherry-pick, btw. I said some play at 800x600 to point out how low some players go. I also said many play at 1024x768, but you conveniently left that part out. In fact, only around 6% of CS:GO pros play at resolutions of 1080p or higher. This is as close to a strawman without being a strawman as one can get. Nice job.

Now, let's check out your candidates. Let's see what percentage of players game at resolutions below 1080p. I have excluded players who have no resolution stated in the respective tables. We have: Fortnite - over 22%, PUBG - around 40%, Apex Legends - around 29%, R6 - around 8%, Overwatch - around 7%. Please tell me again how small the sample of pro players playing below 1080p is.

No, you were not correct, and you saying otherwise doesn't change anything. It's not semantics, it's reality and facts, some of which you have even provided. When you say "pros play at 1080p and anything below is not realistic", the only thing one has to do is provide one pro player who does not, and your point falls apart. Unfortunately for you, it's not just one pro player, it's a lot of them. Get your facts straight, and maybe learn the meaning of semantics.
 
I haven't claimed that benchmarks are 100% accurate. Please give me reference for your 3% claim.

Multiple publications/sites, many test runs, and Ryzens are consistently slower than Cores in gaming. If it were due to "real" variance, we'd be seeing those CPUs leapfrog each other from site to site or test to test. The consistent trend, however, is that Ryzens are slower. Even if I grant you that anything below a 3% difference doesn't count, we have instances where the Cores' advantage goes beyond that.

https://www.gamersnexus.net/guides/...ability-standard-deviation-of-game-benchmarks

You do not seem to understand what margin of error is. It doesn't matter if the i9-9900K is consistently 2.4% ahead in benchmarks; the tests being run simply do not have the resolution to declare one better than the other. CPU benchmarks are not accurate within 3%, and the only thing that would change this is improvement to the testing methodology itself. More data does not equal better data, as it is all limited by how the testing was conducted.

Here's another review from GN that shows the margin of error bars

https://www.gamersnexus.net/hwreviews/3474-new-cpu-testing-methodology-2019-ryzen-3000-prep

"They’re all about the same. The top half of results – that’d be the i5-8600K through the i7-8700K – are all within margin of error"

Given that the 9900K is actually a bit worse than the 8700K, that's enough said on the subject.
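For reference, the run-to-run variability that shows up as error bars in reviews like GN's comes from repeating the same benchmark pass and taking the mean and standard deviation. A minimal sketch with made-up run data (not real measurements):

```python
import statistics

def error_bar(runs: list[float]) -> tuple[float, float]:
    """Mean and sample standard deviation of repeated benchmark runs."""
    return statistics.mean(runs), statistics.stdev(runs)

# Hypothetical fps results from five repeated passes of one test.
runs = [141.2, 143.5, 140.8, 142.9, 141.6]
mean, sd = error_bar(runs)
print(f"{mean:.1f} fps +/- {sd:.1f} (1 sigma)")
```

Two CPUs whose mean-plus-or-minus-deviation ranges overlap can't be separated by that test, which is the point being made about the 8600K-through-8700K cluster above.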


A small sample size? CS:GO is one of the biggest esports titles around. It's consistently in the top 5 (top 3 more often than not) for concurrent players on Steam and sees millions of players each month (https://blog.counter-strike.net/index.php/category/updates/), and yet that's a small sample. OK.

Oh please, could you possibly manufacture any more false outrage?

CS:GO is the only esport where playing below 1080p is common, and you tried to pass it off as if every pro player does it. This is just you creating a strawman (which you ironically accused me of) so that you don't have to answer why every other esport has a standard resolution of 1080p, as shown in the links I provided. FYI, Steam doesn't represent the whole PC scene at all.


Nice cherry-pick, btw. I said some play at 800x600 to point out how low some players go. I also said many play at 1024x768, but you conveniently left that part out. In fact, only around 6% of CS:GO pros play at resolutions of 1080p or higher. This is as close to a strawman without being a strawman as one can get. Nice job.

Now, let's check out your candidates. Let's see what percentage of players game at resolutions below 1080p. I have excluded players who have no resolution stated in the respective tables. We have: Fortnite - over 22%, PUBG - around 40%, Apex Legends - around 29%, R6 - around 8%, Overwatch - around 7%. Please tell me again how small the sample of pro players playing below 1080p is.

No, you were not correct, and you saying otherwise doesn't change anything. It's not semantics, it's reality and facts, some of which you have even provided. When you say "pros play at 1080p and anything below is not realistic", the only thing one has to do is provide one pro player who does not, and your point falls apart. Unfortunately for you, it's not just one pro player, it's a lot of them. Get your facts straight, and maybe learn the meaning of semantics.

:joy:

You cherry pick one game
I provide links to every other eSport with a standard res of 1080p

Yep I'm the one cherry picking :joy:

Now, let's check out your candidates. Let's see what percentage of players game at resolutions below 1080p. I have excluded players who have no resolution stated in the respective tables. We have: Fortnite - over 22%, PUBG - around 40%, Apex Legends - around 29%, R6 - around 8%, Overwatch - around 7%. Please tell me again how small the sample of pro players playing below 1080p is.

Um, I appreciate the evidence that supports my point. If you take the average, that's 21.4% of players who play below 1080p. Yes, my point that 1080p is the standard for esports players is more valid now than ever.

No, you were not correct, and you saying otherwise doesn't change anything. It's not semantics, it's reality and facts, some of which you have even provided. When you say "pros play at 1080p and anything below is not realistic", the only thing one has to do is provide one pro player who does not, and your point falls apart. Unfortunately for you, it's not just one pro player, it's a lot of them. Get your facts straight, and maybe learn the meaning of semantics.

:joy::joy::joy::joy::joy::joy::joy::joy::joy:

Yep, and my dog loves chocolate, so I guess it's OK for all dogs to eat chocolate. You don't seem to realize how utterly illogical your statement is. By the way, in case you really don't know, don't feed your dog chocolate. My dog is a freak that can eat a whole pan of brownies and be A-OK, but a single example or a few examples does not disprove the overwhelming amount of evidence of chocolate's effect on small animals' hearts. Your statement is exactly this: you are trying to disprove the vast majority with the minority.

You can disprove my statement with a single example? Where exactly did I say "all pros play at 1080p"? That's right, I didn't. When you read "pros play at 1080p" you assumed it meant "all pros", so at this point you are arguing against what you assumed I said. The sentence "pros play at 1080p" is pretty vague in and of itself; I didn't specify the quantity of "pros", so the reader can assume any number. A normal person would assume "a majority" or an "average" amount, which would be the normal reading of a statement like that. But this is the internet, and you had to assume the extreme end of the spectrum because you wanted to pick a fight. I'm not going to bother being explicit with my nouns on the internet. There will always be trolls who assume the irrational to try to prove their point. It's honestly funnier to watch them try to rationalize their extreme interpretation of a statement they made a bunch of assumptions about. Suffice it to say, if this were a conversation in real life you would have agreed with my statement long ago, most likely because people would think you need help if you were jumping to wild conclusions like that.
 
Gaming: Coming from a 4690K, I thought all these new AMD chips would OC to their boost speeds on 6-8 cores without a problem. AMD did show us at Computex that the Intel chip provided higher fps. But the boast of best gaming CPU? Nope. And the 15% IPC limit at 3800 RAM, bah.

So for my own ignorance on AMD this is disappointing.
 
Yay, I can finally sidegrade my 7700K to a 3700X and get about the same performance. At least I can stop worrying about the Intel problems with Meltdown, Spectre, etc.
You are saying that going from 4 cores to 8 or 12 is just a small upgrade O_o ?

Literally it is not an upgrade but a SIDE grade. Game performance comes first. Extra cores are extra, just like 64 bits were extra when stuff went 64-bit and people didn't pay extra for it. Remember when the first Core 2 Duos came out? They were priced at the same price points as the old P4s (single cores), and you did NOT pay extra for the second core. Extra cores do not command top dollar, and they never should.

The question is whether it is really worth the time, money, and effort to sidegrade now, given that the 1700X is now like $120 and the 2700X is now like $190 (after mobo discount); see:
https://www.microcenter.com/product...-am4-boxed-processor-with-wraith-prism-cooler

When and how much of a price drop is going to be applied to the 3700X? And even then, all this effort just to fortify against Spectre/Meltdown so you can sidegrade: it is not a pleasant thought. Moar cores does not relieve any of the pain either. Plus there are all the horror stories of fried Ryzens with bad BIOSes, etc.

So the price has got to come down to incentivize the transition.
 
Literally it is not an upgrade but a SIDE grade. Game performance comes first. Extra cores are extra, just like 64 bits were extra when stuff went 64-bit and people didn't pay extra for it. Remember when the first Core 2 Duos came out? They were priced at the same price points as the old P4s (single cores), and you did NOT pay extra for the second core. Extra cores do not command top dollar, and they never should.


Please do not compare the number of cores to 32-bit vs 64-bit; they are nothing alike. 64-bit was mainly introduced so you can exceed 4 GB of addressable RAM; it was not intended to boost performance.
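The 4 GB figure above is just pointer-width arithmetic: a flat 32-bit pointer can address 2^32 bytes. A quick check:

```python
# Address-space ceiling implied by pointer width: 2**bits bytes.
for bits in (32, 64):
    gib = 2 ** bits // 2 ** 30  # bytes -> GiB
    print(f"{bits}-bit pointers: {gib:,} GiB addressable")
```

That's 4 GiB for 32-bit and about 16 exbibytes for 64-bit (real x86-64 CPUs implement fewer physical address bits). To be fair, x86-64 also doubled the general-purpose register count, so the move wasn't purely about memory, but the addressing limit is the headline reason.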

Extra cores DO command more money, or would you care to explain the entire server market to me? Or how about the HEDT market? The only difference with those products is the additional cores and the much higher prices.

When and how much of a price drop is going to be applied to the 3700X? And even then, all this effort just to fortify against Spectre/Meltdown so you can sidegrade: it is not a pleasant thought. Moar cores does not relieve any of the pain either. Plus there are all the horror stories of fried Ryzens with bad BIOSes, etc.

So the price has got to come down to incentivize the transition.

The price has got to come down? That must be why it's been sold out everywhere since launch....

Please, if Intel and AMD didn't charge more for additional cores they would be out of business. It's a ridiculous assertion.

Plus there are all the horror stories of fried Ryzens with bad BIOSes, etc.

So the price has got to come down to incentivize the transition.

If you can submit a link with a news outlet covering this, I'd like to see it. Otherwise it's hearsay.
 
BTW, we know how launch-day availability goes: stuff is never everywhere with unlimited availability. Intel's overpriced crap is always sold out too, but that does NOT mean they are not looking to gouge you.
 
Really?? It is still on the shelves at Microcenter, even with mobo discounts; see:
https://www.microcenter.com/product...-am4-boxed-processor-with-wraith-prism-cooler

Perhaps you should apply the following to yourself:

Providing proof is no problem for me, here:

[Image: driSepr.png — screenshot of sold-out Ryzen listings]


Hey, and I'm not saying it's not in stock at your local Microcenter. It's just that for the 99% of the country who don't have access to a Microcenter, the CPU is completely sold out, and even for those who do have a Microcenter nearby, many of the stores are in fact sold out.

Also, I don't see any proof of "fried Ryzens" as you claimed; you did not address this at all in your comment. It's funny that you mention these unverified claims, yet the numerous major security holes confirmed directly by Intel and security researchers don't faze you.

And I am not the only one who thinks AMD stuff is priced too high; see:

https://www.techpowerup.com/review/amd-ryzen-7-3700x/23.html
"Could be cheaper
Still not as fast as Intel in gaming
No integrated graphics"

You are really not understanding what the reviewer meant when he said "could be cheaper". Of course, you are only reading the small blurb at the end of the review and not the context in which it is provided.

"Priced at $330, the Ryzen 7 3700X is relatively affordable and cheaper than the $410 Intel Core i7-9700K. Unlike the Intel processor, AMD was kind enough to include a heatsink with their processor, so you can get your new rig set up immediately. The included heatsink is not some cheap low-quality heatsink, but a great cooler that can handle the processor's heat output with ease. Pricing of the processor itself has remained flat over the generations. The 3700X launches at the same $329 as the 2700X did. What has changed, though, is platform cost. AMD X570 chipset motherboards are significantly pricier than boards based on X470. Luckily, these processors offer backwards-compatibility with older platforms. Guess AMD no longer has to compete on price alone—the Ryzen 7 3700X is an excellent choice that's almost seeing eye to eye with Intel."

Mind you, no one says you even have to buy X570, as even a B450 motherboard can handle a 3900X, let alone a 3700X. The only category the 9700K wins in is gaming, and that's by margin of error with a 2080 Ti at 1080p; in every other category the 3700X either wins or wins big. If you have money to spend on high-end cooling and an OC motherboard, you should not be buying the 9700K in the first place; you should be getting the 9900K. For everyone else, the 3700X is preferable.
 
According to some people in the comments section, my Ryzen is no good for games or something. The reality is that it's quite good for games, excellent even. And I play a wide variety of games. I don't recall having any stutter either.
 