Ryzen 9 3900X vs. Core i9-9900K: 36 Game Benchmark

There is no credibility when your sample size is 1 against 1.
Give me some numbers when it is a minimally statistically significant sample size.
That would be 20 vs 20 systems.

Not all gamers run overclocked. They may have to do it for some games that are poorly written.
Overclocking is just seeing which system is more of a toaster.

You did not reference the wattage used by the Intel system versus the AMD system. I bet that, over the long run, you can heat your home with the former system.
 
There is no credibility when your sample size is 1 against 1.
Give me some numbers when it is a minimally statistically significant sample size.
That would be 20 vs 20 systems.

Not all gamers run overclocked. They may have to do it for some games that are poorly written.
Overclocking is just seeing which system is more of a toaster.

You did not reference the wattage used by the Intel system versus the AMD system. I bet that, over the long run, you can heat your home with the former system.
...why don't you just look at 20 different reviews rather than make juvenile comments like requiring a 40-system test of identical hardware because you don't like the results...
 
I'd imagine that two 4-core, non-SMT CPUs could handle the higher clocks and heat very well.

Probably couldn't, though. I don't know if the problem is power density, the quality of the silicon, both, or something else, but based on what I've seen, something like 5.5 GHz is totally unreachable with Zen 2, even with a single core and liquid nitrogen. Intel's i7-9700K, on the other hand, needs something better than water cooling to reach those speeds and stay stable while doing anything serious.

As for the production crowd, I don't see any reason why an 8-core, 8-thread CPU couldn't still handle production, publishing, and creator needs.

Many workloads scale very well with cores and threads, even if not all of them do, and in those cases a core/thread deficit cannot be balanced out by any realistic clock speed increase.
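To put a rough number on that, here's a back-of-the-envelope sketch in Python (hypothetical clock figures; assumes equal IPC and perfect core scaling, neither of which holds in practice):

# Clock an 8-core chip would need to match a 12-core chip on a
# perfectly parallel workload, assuming equal IPC and ideal scaling.
cores_big, clock_big = 12, 4.0      # hypothetical 12-core part at 4.0 GHz all-core
cores_small = 8
required_clock = cores_big * clock_big / cores_small
print(f"{required_clock:.1f} GHz")  # -> 6.0 GHz, far beyond any realistic overclock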
 
Biggest factor to note: neither CPU is considered unplayable, even in the worst-case scenario of StarCraft II, and if the frame rates are what you would consider too low, Intel is hardly any better.

I didn't think we'd hit a point where CPUs just didn't matter anymore before 2020.
 
Gaming benchmarks on 2560x1440 would be most interesting.

I couldn't care less about 1080p gaming because this is not what people buy a top CPU with a top GPU for.
Have you not heard of 1080p 240Hz gamers? Pretty sure a lot of people buy the 9900K + 2080 Ti combo for that setup because it's basically required...
 
Six percent slower but 30% less expensive. Not really a contest IMO. Put the extra $150 towards a better GPU = clear win for team red. Not to mention sheer dominance in threaded apps.
 
Six percent slower but 30% less expensive. Not really a contest IMO. Put the extra $150 towards a better GPU = clear win for team red. Not to mention sheer dominance in threaded apps.
Not sure where you are looking, but on Amazon and Newegg, as of today, the i9 is $15 cheaper.
 
All I'm looking for is a 6-8 core CPU with the fastest speeds. Why AMD only releases the higher-speed CPUs at 12 cores but the 4- and 6-core parts at the lowest speeds makes no sense to me. Not everyone needs that many cores. Maybe they're saving that for the Athlon line, I suppose. Gaming at 720p at 120Hz on max detail is plenty for me. Maybe it's my eyesight, but I've yet to discern much of a difference between 720p and 1080p. 1080p is still great for static browsing, office programs, and pictures. Maybe I'll consider 2K for the static stuff, but anything above that I'm just not going to pay the premium for with not much of a return. The refresh rates of 120, 144, and 240Hz, however, seem worth shelling out for. I do have a 120Hz TV, but unfortunately it isn't real; it was a ploy Samsung played in their marketing. I didn't know enough about the differences at the time.
 
Gaming benchmarks on 2560x1440 would be most interesting.

I couldn't care less about 1080p gaming because this is not what people buy a top CPU with a top GPU for.
Well, guess this test isn't based on what you care about... Any competitive gamer currently plays at 1080p, more or less, for the highest possible frame rate on their 144-240Hz monitors, and then rocking a 2080 Ti and an Intel makes perfect sense. Also, when comparing CPUs it makes no sense to go to a higher resolution; as we all know, the second it is the GPU that bottlenecks, the CPU makes no difference, so why would you even bother making a comparison between them?

Sure though, if you like to go 1440p or even 4K and don't care much about max FPS, then you also likely don't care about which of the CPUs performs better, as the GPU will be your limiting factor.
 
All I'm looking for is a 6-8 core CPU with the fastest speeds. Why AMD only releases the higher-speed CPUs at 12 cores but the 4- and 6-core parts at the lowest speeds makes no sense to me. Not everyone needs that many cores. Maybe they're saving that for the Athlon line, I suppose. Gaming at 720p at 120Hz on max detail is plenty for me. Maybe it's my eyesight, but I've yet to discern much of a difference between 720p and 1080p. 1080p is still great for static browsing, office programs, and pictures. Maybe I'll consider 2K for the static stuff, but anything above that I'm just not going to pay the premium for with not much of a return. The refresh rates of 120, 144, and 240Hz, however, seem worth shelling out for. I do have a 120Hz TV, but unfortunately it isn't real; it was a ploy Samsung played in their marketing. I didn't know enough about the differences at the time.

The marketing Hz on TVs is often based on output frequency, not input - always check the input frequency. A TV can often output 480 frames, for instance, but only receive 60, so if your PC creates 480 images it will still only update the TV 60 times a second. The TV will then make 420 fake images, stacked in between the ones it has, to fake smooth motion - but these images will not update correctly if new objects enter the screen between two frames of the 60.
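As a crude illustration of that point (purely hypothetical panel figures, just restating the numbers above in Python):

input_hz = 60      # frames per second the TV actually receives from the source
output_hz = 480    # frames per second the panel claims to display
fake_per_real = output_hz // input_hz - 1    # interpolated frames inserted between real ones
print(fake_per_real)                         # -> 7 fabricated frames between every two real frames
print(output_hz - input_hz)                  # -> 420 of the 480 displayed frames are interpolated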
 
I'll admit I don't understand your comparisons. Surely for gaming you'd do better using the i7-9700K? Most games don't fully use more than 4 cores, and hyperthreading only makes the processors run hotter, which then restricts overclocking. The 3900X barely allows any overclocking, and the i9 only overclocks to 5 GHz, while the i7-9700K, without hyperthreading, goes to 5.4 GHz.

The 9700K is better at gaming at stock speeds than the 3900X but then also allows a 20% bump in performance through overclocking. It's also 25% cheaper than the 3900X. All of this only makes a difference if you have the very best GPU available; otherwise you'll just be bottlenecked by the GPU. For most of us on a budget, the Intel i5-9600K still beats the 3900X in most games but also allows a decent overclock if required. These AMD processors are great, though, if you mainly play Cinebench ;)

(Gaming benchmarks from https://www.techspot.com/review/1869-amd-ryzen-3900x-ryzen-3700x)
 
It's really the difference between a slightly lower gaming experience and great in everything else (AMD) vs the best gaming experience and not as great in everything else (Intel). While gaming is big, everything else is bigger. Out of the full breadth of gamers, most are on consoles. Long term it would seem that AMD has the advantage here.
 
Well, guess this test isn't based on what you care about... Any competitive gamer currently plays at 1080p, more or less, for the highest possible frame rate on their 144-240Hz monitors, and then rocking a 2080 Ti and an Intel makes perfect sense. Also, when comparing CPUs it makes no sense to go to a higher resolution; as we all know, the second it is the GPU that bottlenecks, the CPU makes no difference, so why would you even bother making a comparison between them?

Sure though, if you like to go 1440p or even 4K and don't care much about max FPS, then you also likely don't care about which of the CPUs performs better, as the GPU will be your limiting factor.

How many gamers are singletasking competitive high-refresh shooty shooter gamers? vs 1440p shooter gamers? vs 1080p shooter with midrange gpu gamers? vs multitasking 1080p rts, rpg, old game gamers? vs streaming multitasking 1080p shooty shooter gamers?
 
It's really the difference between a slightly lower gaming experience and great in everything else (AMD) vs the best gaming experience and not as great in everything else (Intel). While gaming is big, everything else is bigger. Out of the full breadth of gamers, most are on consoles. Long term it would seem that AMD has the advantage here.
Strictly speaking, that should read "the best gaming experience and not as great in everything else if it can make use of more than 8 cores (Intel)". My addition in bold. There just aren't that many applications outside of video processing that can use that many cores. For anything else, the faster cores on an i7-9700K would be better. If you rarely use more than 6 cores, then the i5-9600K would be both better and cheaper. If you are heavily into video production, then these new processors do make sense but, in all honesty, how many of us are?
 
Strictly speaking, that should read "the best gaming experience and not as great in everything else if it can make use of more than 8 cores (Intel)". My addition in bold. There just aren't that many applications outside of video processing that can use that many cores. For anything else, the faster cores on an i7-9700K would be better. If you rarely use more than 6 cores, then the i5-9600K would be both better and cheaper. If you are heavily into video production, then these new processors do make sense but, in all honesty, how many of us are?
True. Keep in mind, though, software follows hardware. There will always be a lag between the two. AMD fell behind Intel in selling their hardware to developers and had to play second fiddle, processing software optimized for Intel processors. It was Intel, after all, who invented the x86 platform, so it stands to reason they would lead. Just like ATI had fallen behind against Nvidia. In that case, Nvidia just did a better job of selling their hardware and backing it up with quality parts. It's admirable that AMD has achieved what they have being number two. I don't really see this changing for the foreseeable future, just more of the same: Intel = premium, AMD = budget. The competition is fun to watch, and the arguments about who's better are unnecessary.
 
Gaming benchmarks on 2560x1440 would be most interesting.

I couldn't care less about 1080p gaming because this is not what people buy a top CPU with a top GPU for.
Couldn't agree more. 2008 called, TechSpot; they want their benchmarks back.

Running 1080p benchmarks with that kind of hardware would be like buying an F1 car to enter a kids' go-kart tournament.
 
The article is a direct one-vs-one CPU comparison, using a variety of games to determine the respective performance in the workloads those games generate. Ideally, you'd want to be using the kind of resolutions that people likely to be buying such a high-end CPU actually game at, but you also want to ensure that as few variables as possible affect the measured frame rates.

So, if you start going beyond 1080p resolution, even with a 2080 Ti, the relative CPU performance delta becomes reduced due to GPU dependency. Once you're at 1440p or higher, outright CPU power simply becomes less relevant compared to GPU strength.

You can see this in the recent Ryzen 3900X review:

https://www.techspot.com/review/1869-amd-ryzen-3900x-ryzen-3700x/

At 1080p, in Battlefield V, the CPU results are:

9900K: 125 fps 1% low | 168 fps av
3900X: 111 fps 1% low | 155 fps av

That puts the 9900K at 12% and 8% faster, respectively.

However, at 1440p, the results are:

9900K: 110 fps 1% low | 135 fps av
3900X: 104 fps 1% low | 130 fps av

The difference now becomes roughly 6% and 4%. At 4K, the delta would be even smaller and the results would tell you little about how two CPUs compare when processing the instructions and data generated in games.
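For the record, those percentages are just relative fps deltas; a minimal Python sketch of the arithmetic, using the figures quoted above:

def delta_pct(faster, slower):
    # Percentage by which the faster result leads the slower one.
    return (faster - slower) / slower * 100

# 1080p, 9900K vs 3900X
print(f"{delta_pct(125, 111):.1f}%")   # ~12.6% on the 1% lows
print(f"{delta_pct(168, 155):.1f}%")   # ~8.4% on the averages

# 1440p, 9900K vs 3900X: the gap narrows as the GPU takes over
print(f"{delta_pct(110, 104):.1f}%")   # ~5.8% on the 1% lows
print(f"{delta_pct(135, 130):.1f}%")   # ~3.8% on the averages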
 
How many gamers are singletasking competitive high-refresh shooty shooter gamers? vs 1440p shooter gamers? vs 1080p shooter with midrange gpu gamers? vs multitasking 1080p rts, rpg, old game gamers? vs streaming multitasking 1080p shooty shooter gamers?
Your quote: 'I couldn't care less about 1080p gaming because this is not what people buy a top CPU with a top GPU for.' - yes, many of us buy a top CPU and GPU exactly to play at 1080p, and thus the test is very much useful. The percentage is easily found through the Steam stats if you care to get the figures; easy to look up (spoiler alert: 1080p stands very strong). That does not mean, however, that 1440p and 4K numbers are not important - but there are tons of benchmarks out there covering those as well, though they show less of the difference the CPU makes to FPS, since a higher resolution means you're GPU-bound sooner.
 
Your quote: 'I couldn't care less about 1080p gaming because this is not what people buy a top CPU with a top GPU for.' - yes, many of us buy a top CPU and GPU exactly to play at 1080p, and thus the test is very much useful. The percentage is easily found through the Steam stats if you care to get the figures; easy to look up (spoiler alert: 1080p stands very strong). That does not mean, however, that 1440p and 4K numbers are not important - but there are tons of benchmarks out there covering those as well, though they show less of the difference the CPU makes to FPS, since a higher resolution means you're GPU-bound sooner.

I am not the one that said: "I couldn't care less about 1080p gaming because this is not what people buy a top CPU with a top GPU for."
I'm betting the person who said that is probably right, though. GPUs below a 2080 Ti are already considered fast enough for 1440p. Your comment about 1080p standing strong is silly because only ~10% of surveyed Steam users even have something better than a GTX 1070! Half of one percent have a 2080 Ti, and around 8% of users play at 1440p or higher. I think that proves it right there. People who buy the most powerful cards to play at 1080p are a tiny, tiny minority.
 
Gaming benchmarks on 2560x1440 would be most interesting.

I couldn't care less about 1080p gaming because this is not what people buy a top CPU with a top GPU for.
What if they want to play at 165fps? Still going to suggest 1440p?
 
What if they want to play at 165fps? Still going to suggest 1440p?
That does bring up a question: how many FPS are needed to sustain smooth and responsive gameplay? Enough to support a resolution upgrade and yet still maintain smooth and responsive gameplay. I'm perfectly happy at 720p. But if I'm consistently pushing 165 fps, then I might as well upgrade the resolution. That's like running a transmission rated for a 4-cylinder with a turbo V8 engine. That's a lot of unused power left on the table.
 
That does bring up a question: how many FPS are needed to sustain smooth and responsive gameplay? Enough to support a resolution upgrade and yet still maintain smooth and responsive gameplay. I'm perfectly happy at 720p. But if I'm consistently pushing 165 fps, then I might as well upgrade the resolution. That's like running a transmission rated for a 4-cylinder with a turbo V8 engine. That's a lot of unused power left on the table.
You don't buy a top tier CPU so you can raise your resolution. This isn't a GPU test.
 
You don't buy a top tier CPU so you can raise your resolution. This isn't a GPU test.

No, but I am sure that people who buy expensive, large, high-res monitors and several-hundred-dollar graphics cards are usually getting a nice high-end CPU too. He is also spot on about excess frames possibly being redundant. Only 0.5% of recently surveyed Steam users have a 2080 Ti, but 8% of users in the same survey play at 1440p. You think those 2080 Ti users are doing 1080p and lower-end card users are doing 1440p?
I know some shooty-shooter gamers are all about those ultra-high fps numbers (betting it's more than they can perceive in game for a lot of them, too). I'd wager a lot more all-around gamers with a high-end card will run at a higher resolution, which I think is evident in that Steam survey.
 
No, but I am sure that people who buy expensive, large, high-res monitors and several-hundred-dollar graphics cards are usually getting a nice high-end CPU too. He is also spot on about excess frames possibly being redundant. Only 0.5% of recently surveyed Steam users have a 2080 Ti, but 8% of users in the same survey play at 1440p. You think those 2080 Ti users are doing 1080p and lower-end card users are doing 1440p?
I know some shooty-shooter gamers are all about those ultra-high fps numbers (betting it's more than they can perceive in game for a lot of them, too). I'd wager a lot more all-around gamers with a high-end card will run at a higher resolution, which I think is evident in that Steam survey.
Again. This is a CPU test. Not a GPU test.

I never said any stupid ish about using a 2080 Ti at 1080p and a lower-spec card at 1440p.

Someone might not be able to tell in a 5-minute blind test whether they like excess frames for the latency cutdown. It may take over a week on high excess frames for your body to adjust enough to discern a difference of a few ms. But it is very much possible. Don't let a couple of slowies writing articles tell you what you can or cannot perceive.
 
Again. This is a CPU test. Not a GPU test.

I never said any stupid ish about using a 2080 Ti at 1080p and a lower-spec card at 1440p.

Someone might not be able to tell in a 5-minute blind test whether they like excess frames for the latency cutdown. It may take over a week on high excess frames for your body to adjust enough to discern a difference of a few ms. But it is very much possible. Don't let a couple of slowies writing articles tell you what you can or cannot perceive.

Right, it's a CPU test. From my perspective, I'll take a 6% average deficit in present gaming at 1080p on a 2080 Ti and go for the CPU that has 40% more power. I think over the long term I'll notice the 40% more than the ethereal gains that my body adjusts to in gaming.
 