Ryzen 9 3900X vs. Core i9-9900K: 36 Game Benchmark

A couple of things: while it's implied, there is no actual statement of what cooling was used in either of the i9 tests. Stock cooling? Explain.

How many times was each benchmark run? Was this an average of results? Were the same parts used except where stated? Was it an open test bench? Were the tests performed at the same time?

Is 1080p even considered a meaningful benchmark anymore? Why use a 2080 Ti for 1080p? Can we get CPU and GPU (GPU especially) usage percentages (high/low/average)?

Jesus, people act like it's tough to properly report findings. This just comes off as sloppy; I expected better, honestly.
 
Please do the test correctly: you used 3200-speed memory for the AMD chip and gave Intel 3600 speed. Give me a break. Use the Asus Strix motherboards (they make boards for both CPUs) and use the same speed memory too. It's already known that the Ryzen 3000 series CPUs are memory sensitive: the faster the memory, the better they work. Sure, the 9900K has the better clock speed and all-core boost. If you guys are such tech geniuses, then compare apples to apples and price to price, please. Please stop acting like salesmen for Intel or AMD. I hate these kinds of articles that aren't honest.

Reread the initial part of the comparison. They used 3200-speed memory for BOTH at stock and 3600 for BOTH when overclocked. And Zen 2 isn't as memory dependent as previous iterations of Zen.
 
Here are results from TechSpot's review, a mix of 1080p and 1440p results from the 8-core/16-thread parts.
There's a 15-20 FPS difference, and sometimes more, in many games... these are just 7 examples and, most importantly, this is Intel @ stock clocks. Add another 10-20 FPS when it's overclocked.

Still reposting this totally inaccurate claim, which is refuted in the exact article you're replying to, eh? When overclocked, the delta between the 9900K and the 3900X actually decreased. Well within noise levels, of course, but your claim of a 10-20 FPS increase is flat-out wrong.

And your data is old: World War Z gives the same framerate on both after a patch, but why bother keeping up to date? Keep spouting old, inaccurate data.


Hitman 2
9900K = 89/119
3700X = 83/111

World War Z
9900K = 123/151
3700X = 111/135

Far Cry New Dawn
9900K = 96/123
3700X = 88/112

The Division
9900K = 108/172
3700X = 107/158

Shadow of the Tomb Raider
9900K = 89/123
3700X = 72/102

Battlefield 5
9900K = 125/168
3700X = 107/155

Total War: Three Kingdoms
9900K = 107/128
3700X = 106/123
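
To put a number on that "15-20 FPS" characterization, here's a quick Python sketch that computes the per-game and mean deltas from exactly the seven results above (assuming the two figures per game are 1% low / average FPS, which is how TechSpot usually presents them):

```python
# Deltas from the seven results quoted above.
# Format: game -> ((9900K 1% low, 9900K avg), (3700X 1% low, 3700X avg))
results = {
    "Hitman 2":                  ((89, 119), (83, 111)),
    "World War Z":               ((123, 151), (111, 135)),
    "Far Cry New Dawn":          ((96, 123), (88, 112)),
    "The Division":              ((108, 172), (107, 158)),
    "Shadow of the Tomb Raider": ((89, 123), (72, 102)),
    "Battlefield 5":             ((125, 168), (107, 155)),
    "Total War: Three Kingdoms": ((107, 128), (106, 123)),
}

for game, ((i_low, i_avg), (a_low, a_avg)) in results.items():
    print(f"{game}: +{i_low - a_low} FPS (1% low), +{i_avg - a_avg} FPS (avg)")

n = len(results)
mean_low = sum(i[0] - a[0] for i, a in results.values()) / n
mean_avg = sum(i[1] - a[1] for i, a in results.values()) / n
print(f"Mean 9900K lead: +{mean_low:.1f} FPS (1% low), +{mean_avg:.1f} FPS (avg)")
```

Over these seven games, the mean 9900K lead works out to about +9 FPS in the 1% lows and +12.6 FPS in the averages.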
 
While I support AMD and their efforts, they are barely competing on 7nm while Intel is still on 14nm. When Intel moves to 10nm, the IPC gap will widen again, with Intel maintaining its lead. AMD needs to go back to the drawing board on IPC. On the bright side, AMD does have a good handle on multi-core performance. If game developers were able to write better multi-threaded code, then it may all be moot and won't matter who you go with.

IPC means instructions per clock, and AMD is winning in IPC. Intel is winning in clock speed and nothing else. Clock speeds often drop with new nodes. AMD will further mature on this node, leading to gains like we saw with second-gen Ryzen. Intel likely won't catch up until 2021, IMO.
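
The first-order model behind this argument: performance scales roughly as IPC times clock frequency, so a lead in one can offset a deficit in the other. A toy sketch, with made-up IPC and clock values purely for illustration:

```python
# First-order model: performance ~ IPC x clock.
# The IPC and clock values below are hypothetical placeholders, not measurements.
def relative_perf(ipc: float, ghz: float) -> float:
    return ipc * ghz

chip_a = relative_perf(ipc=1.08, ghz=4.3)  # higher IPC, lower clock
chip_b = relative_perf(ipc=1.00, ghz=5.0)  # lower IPC, higher clock
print(f"A vs. B: {chip_a / chip_b:.2f}x")  # ~0.93x: the clock lead wins this one
```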
 
These are mostly closed-loop tests of single-player scenarios, with a lot of console ports in the mix too.

If only this test had been made with multiplayer games on competitive settings (low/medium): Battlefield V Conquest Large, Black Ops 4, Apex, Quake Champions, Battalion, World War 3, CoD WW2, PUBG, Arma 3, etc. The story would be different because, as we can see from other benchmarks like JayP Tek's, Intel delivers 200+ FPS in BF V Conquest, for example, on every map, while his Ryzen chip couldn't sustain a steady 160 FPS. Same in Blackout and Quake Champions.
 
While I support AMD and their efforts, they are barely competing on 7nm while Intel is still on 14nm. When Intel moves to 10nm, the IPC gap will widen again, with Intel maintaining its lead. AMD needs to go back to the drawing board on IPC. On the bright side, AMD does have a good handle on multi-core performance. If game developers were able to write better multi-threaded code, then it may all be moot and won't matter who you go with.

AMD needs to go back to the drawing board concerning IPC? Uh, what? Do you even understand how complicated microprocessor technology is?

Owner of an 8th-gen Intel processor here. Your comment omits several details and is a half-truth.

Games already are developed for multi-threaded processors, and that's why the i5-9600K has been rendered obsolete. Processors with multi-threading handily outperform processors without it.

Single-core IPC is almost tied. Optimization prevents better results in gaming, and AMD has completely rendered all Intel SKUs worthless. That's not just my opinion, but the opinion of nearly every reviewer. AMD doesn't need to lead in IPC.

I think saying "9600k obsolete" is too harsh. That chip, specially when overclocked, still eats 90% of the games for breakfast with great performance.... with that being said, I would rather get a R5 3600 for obvious reasons, but let´s not exaggerate. Plus 9600k is out for some time now, Ryze 3000 just released. Someone that bought a 9600k 9 months ago, has no reasons to switch. Someone who is buying now, has reasons to buy R5 3600 instead, but that´s how tech evolves.
 
So much fanboying.

An extra 200-300 MHz will not help AMD catch Intel at the top end; Ryzen's latency (while improved) will take more than that to overcome, and that's not going to happen anytime soon. So if you have a 2080 Ti and game at 1440p or lower, and maybe even at 4K with lower settings, you'll get more FPS from the 9900K (and 9700K?).

There's a 10-15% 9900K advantage in a few games as long as:

You spent $1200 for your GPU, and either:
You game at 1080p or lower
You game at 1440p at lower visual quality settings

That's useful to a few people out there, but the vast majority of gamers are not spending that kind of money to play at potato settings. Instead they're using less expensive video cards and playing at resolutions reasonable for those cards, making them GPU-bound.

As Steve said, cutting down to even a $700 card puts the FPS differences within noise for the vast majority of games, and that's only at 1080p. At 1440p? No difference between these CPUs. Choose whichever one you like.
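
The reasoning here is the basic bottleneck model: delivered framerate is capped by the slower of the CPU and the GPU. A toy sketch (all the caps are invented for illustration):

```python
def delivered_fps(cpu_cap: float, gpu_cap: float) -> float:
    """The frame pipeline runs at the speed of its slowest stage."""
    return min(cpu_cap, gpu_cap)

# Huge GPU headroom (think 2080 Ti at 1080p): the CPU difference shows.
print(delivered_fps(cpu_cap=165, gpu_cap=240))  # 165
print(delivered_fps(cpu_cap=150, gpu_cap=240))  # 150 -> 15 FPS gap visible

# Mid-range card or 1440p: the GPU cap hides the CPU gap entirely.
print(delivered_fps(cpu_cap=165, gpu_cap=90))   # 90
print(delivered_fps(cpu_cap=150, gpu_cap=90))   # 90 -> no difference
```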

The most played games on PC right now are mostly online competitive ones... e-sports titles, as we call them. They're played by both professionals and non-professionals who still like to climb the ranks and get good results.

We can see this by checking the SteamCharts or Statista rankings, where the most played games right now are:

League of Legends, CS:GO, Dota 2, PUBG, Apex Legends, Fortnite, Rainbow Six, DayZ, Arma 3, GTA V, WoW, Blackout, Escape from Tarkov, etc.

This drives most of the PC gaming community. If you check single-player eye-candy games like Assassin's Creed, Tomb Raider, Metro or Far Cry, they struggle to reach 1,000 players daily... and their peak numbers are 20 times lower than any of the games I mentioned.

So in fact I do believe a lot of gamers use high-end cards (not necessarily a 2080 Ti, of course) to play at 1080p with competitive sliders. This takes the load off the GPU, and the CPU becomes the key to sustaining high framerates.

This is where Intel still excels. In some games it isn't any better; in others it has as much as a 20-40 FPS advantage in the 1% lows in high-framerate scenarios. This is important; this sells hardware. This is where Intel still has a place in the market.
 
A couple of things: while it's implied, there is no actual statement of what cooling was used in either of the i9 tests. Stock cooling? Explain.

How many times was each benchmark run? Was this an average of results? Were the same parts used except where stated? Was it an open test bench? Were the tests performed at the same time?

Is 1080p even considered a meaningful benchmark anymore? Why use a 2080 Ti for 1080p? Can we get CPU and GPU (GPU especially) usage percentages (high/low/average)?

Jesus, people act like it's tough to properly report findings. This just comes off as sloppy; I expected better, honestly.

There is no stock cooling for the 9900K. You will *always* pay for aftermarket cooling, as none is included with the CPU or the motherboard.
 
IPC means instructions per clock, and AMD is winning in IPC. Intel is winning in clock speed and nothing else. Clock speeds often drop with new nodes. AMD will further mature on this node, leading to gains like we saw with second-gen Ryzen. Intel likely won't catch up until 2021, IMO.
Do you realize that Intel doesn't really have to change a thing in its design? Just moving to a smaller node will likely let it smoke AMD in the next round. They will have the clock speed advantage at any comparable node size. Not taking away from what AMD has done; I myself only buy AMD. I'm just recognizing the stiff competition that is Intel.
 
I think saying "9600k obsolete" is too harsh. That chip, specially when overclocked, still eats 90% of the games for breakfast with great performance.... with that being said, I would rather get a R5 3600 for obvious reasons, but let´s not exaggerate. Plus 9600k is out for some time now, Ryzen 3000 just released. Someone that bought a 9600k 9 months ago, has no reasons to switch. Someone who is buying now, has reasons to buy R5 3600 instead, but that´s how tech evolves.

It isn't too harsh. Gamers Nexus, LTT, etc. have all stated Intel's entire product line has been "rendered obsolete". The far higher 1% lows that the Ryzen 5 3600 delivers over the 9600K cannot be ignored; overclocking the 9600K often provides little to no performance gain over the R5 3600 in titles that utilize more threads. You'll actually be thread-limited in certain titles, creating a bottleneck.
 
Here are results from TechSpot's review, a mix of 1080p and 1440p results from the 8-core/16-thread parts.
There's a 15-20 FPS difference, and sometimes more, in many games... these are just 7 examples and, most importantly, this is Intel @ stock clocks. Add another 10-20 FPS when it's overclocked.

Hitman 2
9900K = 89/119
3700X = 83/111

World War Z
9900K = 123/151
3700X = 111/135

Far Cry New Dawn
9900K = 96/123
3700X = 88/112

The Division
9900K = 108/172
3700X = 107/158

Shadow of the Tomb Raider
9900K = 89/123
3700X = 72/102

Battlefield 5
9900K = 125/168
3700X = 107/155

Total War: Three Kingdoms
9900K = 107/128
3700X = 106/123


It's from the 3700X review done a week ago.
This simple cut-and-paste into every post becomes more inaccurate each day.

World War Z performs exactly the same on both CPUs, based on more recent tests from this website.
Your claim of 10-20 FPS more from overclocking is completely refuted by this very article.

Nobody's gonna take you seriously unless you update your claims.
 
Do you realize that Intel doesn't really have to change a thing in its design? Just moving to a smaller node will likely let it smoke AMD in the next round. They will have the clock speed advantage at any comparable node size. Not taking away from what AMD has done; I myself only buy AMD. I'm just recognizing the stiff competition that is Intel.

Those promises are 3 years old now and Intel still has nothing to show for them. I'll believe it when it happens (and it will!). But I'm still waiting, and will probably still be waiting a year from now, when AMD will have yet another batch of new CPUs on offer.
 
The most played games on PC right now are mostly online competitive ones... e-sports titles, as we call them. They're played by both professionals and non-professionals who still like to climb the ranks and get good results.

We can see this by checking the SteamCharts or Statista rankings, where the most played games right now are:

League of Legends, CS:GO, Dota 2, PUBG, Apex Legends, Fortnite, Rainbow Six, DayZ, Arma 3, GTA V, WoW, Blackout, Escape from Tarkov, etc.

This drives most of the PC gaming community. If you check single-player eye-candy games like Assassin's Creed, Tomb Raider, Metro or Far Cry, they struggle to reach 1,000 players daily... and their peak numbers are 20 times lower than any of the games I mentioned.

So in fact I do believe a lot of gamers use high-end cards (not necessarily a 2080 Ti, of course) to play at 1080p with competitive sliders. This takes the load off the GPU, and the CPU becomes the key to sustaining high framerates.

This is where Intel still excels. In some games it isn't any better; in others it has as much as a 20-40 FPS advantage in the 1% lows in high-framerate scenarios. This is important; this sells hardware. This is where Intel still has a place in the market.

That all sounds reasonable. I'd love to have actual statistics on how many of those competitive players run high framerates at 1080p (or lower?) for the advantage in online play. I'd also love to know how much 144 FPS matters versus 100 FPS (or whatever), given the latency of a typical ping. All unanswerable, of course, unless you work at Steam and have direct access to their data.
 
IPC means instructions per clock, and AMD is winning in IPC. Intel is winning in clock speed and nothing else. Clock speeds often drop with new nodes. AMD will further mature on this node, leading to gains like we saw with second-gen Ryzen. Intel likely won't catch up until 2021, IMO.
Do you realize that Intel doesn't really have to change a thing in its design? Just moving to a smaller node will likely let it smoke AMD in the next round. They will have the clock speed advantage at any comparable node size. Not taking away from what AMD has done; I myself only buy AMD. I'm just recognizing the stiff competition that is Intel.
Are you sure Intel can get a new node well optimized on the first try? Shrinking a node is more complicated than we thought; even this architecture needed a third generation to get to where Coffee Lake is.
 
For me, it's the efficiency coupled with the performance that is amazing. The gap is 5% on average, and that's partly because some very old games optimized for Intel skew the stats. 142 watts for the 3900X at full load versus 250 watts for an overclocked 9900K is truly impressive; if AMD had wished, they could have raised the TDP to 200 watts to go after Intel and easily beaten them hands down.

Prime95 with AVX instructions (205 W stock, 250 W overclocked): https://www.tomshardware.com/reviews/intel-core-i9-9900k-9th-gen-cpu,5847-11.html
(In the power consumption graphs at that link, they mention using an industrial chiller for the 5 GHz OC, as AIOs were already pushed to their limits at stock. Pretty funny for a consumer-grade CPU review.)
Not to mention the cost of the beefy AIO needed to keep the Intel chip safe.
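
Taking the power figures in this sub-thread at face value (142 W quoted above for the 3900X at full load; 205 W stock / 250 W overclocked for the 9900K under Prime95 AVX, per the Tom's Hardware link), the ratios are easy to work out:

```python
# Power-draw ratios from the figures quoted in this thread (full MT load).
ryzen_3900x_w = 142
i9_9900k_stock_w, i9_9900k_oc_w = 205, 250

print(f"9900K stock:   {i9_9900k_stock_w / ryzen_3900x_w:.2f}x the 3900X")  # ~1.44x
print(f"9900K @ 5 GHz: {i9_9900k_oc_w / ryzen_3900x_w:.2f}x the 3900X")     # ~1.76x
```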
 
These are mostly closed-loop tests of single-player scenarios, with a lot of console ports in the mix too.

If only this test had been made with multiplayer games on competitive settings (low/medium): Battlefield V Conquest Large, Black Ops 4, Apex, Quake Champions, Battalion, World War 3, CoD WW2, PUBG, Arma 3, etc. The story would be different because, as we can see from other benchmarks like JayP Tek's, Intel delivers 200+ FPS in BF V Conquest, for example, on every map, while his Ryzen chip couldn't sustain a steady 160 FPS. Same in Blackout and Quake Champions.

Hmm. Wonder what video that was, since the last thing on his channel, from yesterday, was "this is what I hoped for from Zen 2" and "both the 9900K and 3900X will let the GPU drop into 80% usage sometimes". He did say you should use the Ryzen DRAM calculator for your memory, but once you do, the CPUs are largely equivalent for high-refresh gaming.
 
I think saying "9600k obsolete" is too harsh. That chip, specially when overclocked, still eats 90% of the games for breakfast with great performance.... with that being said, I would rather get a R5 3600 for obvious reasons, but let´s not exaggerate. Plus 9600k is out for some time now, Ryzen 3000 just released. Someone that bought a 9600k 9 months ago, has no reasons to switch. Someone who is buying now, has reasons to buy R5 3600 instead, but that´s how tech evolves.

It isn't too harsh. Gamers Nexus, LTT, etc. have all stated Intel's entire product line has been "rendered obsolete". The far higher 1% lows that the Ryzen 5 3600 delivers over the 9600K cannot be ignored; overclocking the 9600K often provides little to no performance gain over the R5 3600 in titles that utilize more threads. You'll actually be thread-limited in certain titles, creating a bottleneck.
...They never said that. In fact, they stated the 9600K and 9700K make sense from a pure gaming perspective...

From GN's 3600 review:

The i5-9600K outperforms the 3600 in most of our game benchmarks as games have been slow to adapt to CPUs with more than 8 threads, and the 5GHz+ overclocking potential of the 9600K makes it an even clearer winner for exclusively gaming, but the R5 3600 is the more versatile and potentially cheaper option at $200 MSRP.
 
Depending on the game, yes it could; quite a feat considering how much more expensive the Ryzen competition was at the time. And when the 8100 wasn't outperforming, it was at least on par.

"Outperforming all Zen products in gaming" is a blanket statement implying this was the superior gaming CPU. Depending on the game? Sure, maybe you can find cases where this applies, but as a general statement it was not performing all Zen products in gaming. It wasn't even always outperforming all 4-core Zen products, although Ryzen 3 was pretty much always a bit slower - at least when running it stock.

You didn't have to pair it with overclocked RAM to get those results, and I'm talking about after the H and B boards were released but before the Zen+ launch.

Any benchmarks you could link? With H- and B-series boards you were stuck with 2400 MHz RAM, and paired with 2400 MHz the i3-8100 lost to the stock 1500X, 1600 and 1700 in all the games TechSpot / Hardware Unboxed tested (just a handful were tested, though). It benefited notably from 3200 MHz RAM - with it, the i3-8100 actually matched the stock i5-7600K in F1 2017 - but buying a Z-series board to run a budget CPU only made sense if the idea was to use it as a placeholder until one could buy a better CPU second-hand.
 
Really looking for 1440p benchmarks. Top-end purchasers realistically don't play games at outdated 1080p anymore. With those incredible frame rates either processor is fine, but the Ryzen part destroys in all other, non-gaming areas. Therefore it is the best value for the money, IMHO. I mean, who only games?!

You apparently skipped over the first paragraph of this particular article:

For more context on the capabilities of the latest Ryzen series, check out our day-one reviews of the 3900X, 3700X and the mainstream Ryzen 5 3600.
(emphasis added).

Those day-one reviews already included 1440p test results...
 
In the words of the great philosopher band Midnight Oil and WoW military leader "Dives"

a fact's a fact, handle it!

If your CPU is not ranked how you like in the gaming charts below, deal with it! Tossing a hissy fit on the internet (however comical) does not make it any faster. If you fail to understand the charts and why it's important to test at 720p and 1080p, that is a you problem. Get educated, then come back and make a post rather than embarrassing yourself.

[Charts: relative gaming performance at 1280x720, plus two further benchmark charts]
 
Facts are facts: Intel still has the upper hand not only in clock speed but also in core-to-core latencies, which still seem to be a weak spot for AMD. On top of that, you have to cross the Infinity Fabric to access DRAM, and that's where latencies skyrocket: the new design has the I/O die separate from the compute chiplets, so you traverse the Infinity Fabric even more than in the past, adding yet more latency on every DRAM access.

Sure, AMD has done quite a bit to eliminate much of that performance hit by including more on-chip cache and some other clever optimization tricks, but in the end, if you need to go over the Infinity Fabric to reach DRAM you incur a penalty, and that is really what hurts AMD's Ryzen platform.
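The latency penalty described above is the kind of thing a pointer-chasing microbenchmark exposes: each access depends on the previous one, so the loop runs at memory latency, not bandwidth. A rough Python sketch of the idea (interpreter overhead dominates the absolute numbers, and this doesn't isolate the Infinity Fabric specifically; only the relative slowdown as the working set outgrows the caches is meaningful):

```python
import random
import time

def cyclic_permutation(n: int) -> list:
    """Sattolo's algorithm: a random single-cycle permutation, so the chase
    visits all n slots instead of getting trapped in a short sub-cycle."""
    perm = list(range(n))
    for i in range(n - 1, 0, -1):
        j = random.randrange(i)  # j < i guarantees one big cycle
        perm[i], perm[j] = perm[j], perm[i]
    return perm

def ns_per_access(n: int, iters: int = 1_000_000) -> float:
    perm = cyclic_permutation(n)
    idx, t0 = 0, time.perf_counter()
    for _ in range(iters):
        idx = perm[idx]  # serialized, latency-bound dependent loads
    return (time.perf_counter() - t0) / iters * 1e9

for slots in (4_096, 262_144, 16_777_216):  # ~cache-resident ... DRAM-resident
    print(f"{slots:>10} slots: {ns_per_access(slots):6.1f} ns/access")
```
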

Like I said before, Intel still has the upper hand when it comes to raw clock speeds (and will for the foreseeable future), and some of us need high clock speeds. Besides, if you combine a 9900K with a good 280mm radiator in a liquid cooling loop, you can tame those high temperatures with ease and hit 4.8 to 5 GHz quite easily.
 
When even 7-Zip performs better on Intel, AMD can't take the productivity crown. If you need raw performance you have no choice but Intel. I really hate saying that, since I really did wish AMD would hand Intel their head on a silver platter, but nope, Intel is still where the performance is.
Please stop smoking or update your glasses. AMD wins in all productivity tests, period.
 
In the words of the great philosopher band Midnight Oil and WoW military leader "Dives"

a fact's a fact, handle it!

If your CPU is not ranked how you like in the gaming charts below, deal with it! Tossing a hissy fit on the internet (however comical) does not make it any faster. If you fail to understand the charts and why it's important to test at 720p and 1080p, that is a you problem. Get educated, then come back and make a post rather than embarrassing yourself.

If you need 7% more frames when gaming at 720p using a $1200 graphics card, then by all means buy the 9900K.

For the rest of the people out there gaming with a measly <$1000 graphics card at 1440p or similar, feel free to buy an AMD or Intel CPU as there's no difference.
 
Nothing like the TRUTH simply stated! AMEN!

In the words of the great philosopher band Midnight Oil and WoW military leader "Dives"

a fact's a fact, handle it!

If your CPU is not ranked how you like in the gaming charts below, deal with it! Tossing a hissy fit on the internet (however comical) does not make it any faster. If you fail to understand the charts and why it's important to test at 720p and 1080p, that is a you problem. Get educated, then come back and make a post rather than embarrassing yourself.

[Charts: relative gaming performance at 1280x720, plus two further benchmark charts]
 
If you need 7% more frames when gaming at 720p using a $1200 graphics card, then by all means buy the 9900K.

For the rest of the people out there gaming with a measly <$1000 graphics card at 1440p or similar, feel free to buy an AMD or Intel CPU as there's no difference.

Like I said in my initial post, you have a bunch of people arguing about one $500 CPU they won't be buying against another $500 CPU they won't be buying.
 