4GHz CPU Battle: Ryzen 3900X vs. 3700X vs. Core i9-9900K

If things work out for me this upcoming year, I'm planning on building a new rig to replace my aging i5 4690K build. I'm starting to think I'd consider AMD, even though Intel is still beating them at gaming, which would be my main concern, and is not too far behind in other applications.

However, where I live AMD Ryzen CPUs are now severely overpriced, to the point that Intel CPUs currently give much better bang for the buck. Funny how a few years ago it was the opposite. Gotta love this competition.
 
Ew. This comment section... Fanboys at it again...

In any case, this is a great test. It really shows the advances AMD has made.
 
Why would I invest in a dead Intel platform for slightly better gaming performance while everything else is in Ryzen's favor?

Unless you're Shroud, you will not notice the extra frames a 9900K gets at 1440p 144Hz.

For the cost of an 8700K I got an ENTIRE R5 3600 system: B450 board, G.Skill 16GB 3200.

That's something Intel just can't compete with. Coming from an i7 3770K, which retailed for $350 in 2012, that's a complete upgrade, and exactly the kind of consumer competition we want.
 
Battlefield V in DX11? Why not DX12? DX12 utilizes the CPU, and then the GPU, noticeably better, and yes, I tested it myself with a Turing card (not an RTX one, though).
 
So the test was restricted to 4GHz across all cores with only 8 cores active. Scaling IPC, the Ryzen 3900X was on average around 10% faster than the Intel 9900K. Moving this forward, it would be safe to say that the Ryzen 3900X can actually run all-core at 4.3GHz, while the Intel 9900K can run at 5GHz all-core. The gap closes substantially when you add the conservative IPC uplift of 10%: in effect, the Ryzen 3900X runs at an equivalent of 4.73GHz all-core. And when the additional 4 cores come into real-world play, it levels up and often pushes past the 9900K. You have to look past the headline maximum clock speed of any Intel processor when comparing with AMD, because AMD may have a lower clock speed but has a higher IPC.
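A rough sketch of that "effective clock" estimate. Note that all the inputs here are the commenter's assumptions (the ~10% IPC uplift, 4.3GHz and 5GHz all-core clocks), not measured values:

```python
# Back-of-envelope "effective clock" estimate from the post above.
# Assumptions (from the comment, not measured): ~10% IPC advantage
# for Zen 2, 4.3 GHz all-core for the 3900X, 5.0 GHz for the 9900K.
IPC_UPLIFT = 1.10       # assumed Zen 2 IPC relative to Coffee Lake
RYZEN_ALL_CORE = 4.3    # GHz, assumed 3900X all-core clock
INTEL_ALL_CORE = 5.0    # GHz, assumed 9900K all-core clock

# Express the Ryzen's per-core throughput as an equivalent Intel clock.
ryzen_effective = RYZEN_ALL_CORE * IPC_UPLIFT
print(f"3900X effective clock: {ryzen_effective:.2f} GHz "
      f"vs 9900K at {INTEL_ALL_CORE:.1f} GHz")
# 4.3 * 1.10 = 4.73 GHz, the figure quoted in the comment
```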
 
All of these benchmarks for the Intel i9-9900K are @4GHz. How do you make it run at 4GHz? I just built a system using the i9-9900K and an ASUS Prime Z390-A motherboard, and for the life of me, I cannot make it run at anything below 5GHz, at <32 degree temps!
Is it just me, or does there seem to be a tidal wave of bias towards AMD these days that takes hold of these writers' faculties?
 
This was an experiment to determine IPC when clock speeds are the same. There is no "bias", just simple facts based on replicable results. Downclocking is not that hard to achieve (you can do it too), the harder part is to optimise the voltages to be as low as possible and still be stable.

I hope I'm not misunderstanding, but it seems you are trying to convince us that your CPU runs at 5GHz@32 degrees C which is just not achievable with any normal cooling solution I know of. You might have a faulty temp sensor or god knows what else.

FYI AMD deserves to be praised for what they managed to achieve in just 3 short years with Intel having an R&D budget higher than what AMD is worth in total.
We would still be on 4 cores if AMD didn't force Intel's hand, with the 7700K being re-re-released every year with a small bump in clock speeds because Intel still doesn't have 10nm high performance CPUs.

For gaming the 9900K is still the better solution; nobody is going to take that away from you (at least until AMD releases Zen 3 and Intel the 10900K). For someone like me, having 12 cores for under $500 is a godsend (I'm waiting for prices to drop after the next-gen CPUs get released).
 
It's not 3 years; they did not wake up one morning with the complete design for the new architecture. By saying that, you are not doing AMD any favors, you are taking away a lot of years of hardship from them.

And the better IPC is on the CPU that can do >5GHz. Note that that doesn't make the CPU better overall.
 
FYI AMD deserves to be praised for what they managed to achieve in just 3 short years with Intel having an R&D budget higher than what AMD is worth in total.
We would still be on 4 cores if AMD didn't force Intel's hand

I believe this is the crux of the matter. It's remarkable how far AMD has come with a fraction of the resources Intel has. It's kind of revolutionary: all software being developed, or yet to be developed, by the industry is going to try to take advantage of the compute power enabled by AMD's vision of trading high clock speeds for more cores. In my opinion, this is exactly why AMD should be supported: they took things a step further and are pushing all of us in that direction.
 
It's not 3 years; they did not wake up one morning with the complete design for the new architecture. By saying that, you are not doing AMD any favors, you are taking away a lot of years of hardship from them.

And the better IPC is on the CPU that can do >5GHz. Note that that doesn't make the CPU better overall.
single core performance = IPC x clock speed
I can only count the years since they re-entered the market. Yes, development took longer.

Knowing the difference in IPC at the same clock speeds is very important because you can infer performance across the entire product stack without having to look up benchmarks.

Cheaper Intel CPUs have closer clock speeds to what AMD is offering. For example, knowing what Intel and AMD can achieve at 4GHz you can easily calculate what the 9400F is capable of at 4.1GHz boost clocks vs the 3600 at 4.2GHz (with very small real world differences).
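As a sketch, that inference is just the formula quoted above, single-core performance ≈ IPC × clock. The normalised IPC values below are illustrative assumptions (Coffee Lake set to 1.00, Zen 2 to 1.10, the rough uplift the 4GHz test suggests), not measured numbers:

```python
# single-core performance ~ IPC x clock speed (formula quoted above)
def single_core_score(relative_ipc: float, clock_ghz: float) -> float:
    """Relative single-thread throughput; units are arbitrary."""
    return relative_ipc * clock_ghz

# Illustrative only: IPC normalised so Coffee Lake = 1.00 and Zen 2 = 1.10.
i5_9400f = single_core_score(relative_ipc=1.00, clock_ghz=4.1)  # boost clock
r5_3600 = single_core_score(relative_ipc=1.10, clock_ghz=4.2)   # boost clock
print(f"9400F: {i5_9400f:.2f}, 3600: {r5_3600:.2f}")
```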

I seriously don't understand why people are so confused about what was tested in this article and why.
 
IMO this is business. No praise, just good and bad/overpriced products.

You should buy AMD chips if they do the job you need better than Intel's. For most desktop PC uses, that is currently the case. It wasn't the case a little over 3 years ago, which is why almost everyone bought Intel back then.

The fact that Intel has allowed AMD to catch up and pass them in most desktop metrics is embarrassing, but does it really matter when Intel has close to 100% of the OEM sales in its pocket? OEM sales are huge; our org continues to buy Dell Optiplexes and Latitudes, which are 100% Intel.

Laptops are a larger part of the market than desktops and Intel is king there. We'll see about Ryzen 4000, but there is no data either way so far.

I do find it remarkable that AMD has overtaken Intel on the desktop, but that's not why I built 2 AMD desktops. I built them because they were the better option, because AMD designed the better mobo & CPU platform with Ryzen. I do hope AMD can continue their success elsewhere (laptops, GPUs, *OEMs*), as competition always works in my favor.
 
I believe this is the crux of the matter. It's remarkable how far AMD has come with a fraction of the resources Intel has. It's kind of revolutionary: all software being developed, or yet to be developed, by the industry is going to try to take advantage of the compute power enabled by AMD's vision of trading high clock speeds for more cores. In my opinion, this is exactly why AMD should be supported: they took things a step further and are pushing all of us in that direction.
Fortunately, with Zen 2 we've moved beyond the point of trading single-core performance for cores.
 
Why so many lies? Are you an Intel employee? The charts themselves show that the IPC of the new Ryzen is significantly ahead of the 9900K. The tone of your message very clearly shows that you are an Intel fanatic, and even when the charts tell you the obvious, you explain that it is different. :D

TechSpot, did you install the latest chipset drivers released a few days ago?

Point being: 'invest' in an RTX 2080 Ti and game (or more accurately, 'benchmark') at 1080p to justify your 'investment' in Intel.
 
The compiler used for application development will favor Intel when the people developing are on Intel processors. I will say that in games the drivers have a huge impact, as shown in the charts, but where the application is not using the GPU or other direct writes, it is a win for AMD. I use both processors in software development, and I can say a few tweaks of code will make Intel or AMD fast or slow; that is just how things are, period. Remember, if everything is running at 4GHz, then it is the application that decides the numbers. The question is: how fast is fast?
 
This was an experiment to determine IPC when clock speeds are the same. There is no "bias", just simple facts based on replicable results. Downclocking is not that hard to achieve (you can do it too), the harder part is to optimise the voltages to be as low as possible and still be stable.

I hope I'm not misunderstanding, but it seems you are trying to convince us that your CPU runs at 5GHz@32 degrees C which is just not achievable with any normal cooling solution I know of. You might have a faulty temp sensor or god knows what else.

I'm not trying to convince you of anything; this is what ASUS AI Suite 3 is reporting. I found it hard to believe myself, but I took my own temp measurement and it was very close. Now, I'm not saying that I'm playing video games on it or anything, just web browsing, Spotify playing in the background, Outlook, the HDHomeRun TV program, and GOM Player playing an MP4, but still, those numbers are hard to believe. I wish there were a way to insert an image from a file, because I have a feeling no one will believe me.
 
So... what you are saying is that it stays cool while idle. OK, that helps us a lot in understanding why you are so confused, but I still don't understand why you are trying to confuse us too. :D
 
How is that IDLE?? How is 5+ GHz at below 32 degrees not impressive? I've been reading your posts above... What are you, an AMD employee or something??
I like AMD, I never use anything other than AMD graphics cards. I'm just relaying what I'm seeing; if that confuses you, you've got much bigger problems!
 
Because it isn't impressive. It's just 1 or 2 cores being used in very short workloads.

You can't convince anyone who knows a thing or two about computers that your CPU runs at 5GHz at 32 degrees; it's impossible. You are either misreading the temps or you have a sensor issue. Most web browsing involves some form of GPU acceleration, as does video playback.

Run any modern game on it, and without a good cooler you are looking at some pretty "hot temps". The 9900K is known to run hot, especially when not capped at 95W by cheap motherboards.

FYI you CAN insert an image here.
FYI2 here are some real results that everybody in the world can replicate: https://www.techspot.com/review/1744-core-i9-9900k-round-two/
 
For gaming, Intel is still better, and that probably won't change this year with Zen 3, but it might... hard to say. I think those getting the 10900K will be happy, even if it doesn't have some of the latest chipset features (rumors say no PCIe 4.0). 10 cores: 8 for games, 2 for the OS... a nice sweet spot for years to come.

That being said, Zen 3 might surprise and become the de facto gaming CPU, but I doubt it.
 
I wonder what the reason is for Intel's continued dominance in gaming benchmarks.

Non-stop.

I mean... yeah, not by much, but when it comes to gaming, Intel STILL leads AMD in the benchmarks, even if it is just by single digits.

Any one sentence can explain this?

I'm sure no practical users care about the non-game benchmarks...


I've been an Intel user so far, but I will be getting a next-gen Ryzen Threadripper for my next upgrade. Not because of benchmarks, but because I'm tired of using Intel these past few years. And Threadripper has been closing the gap very quickly lately.

Problem is...Threadrippers are more expensive.
 
Because of Intel's low core and cache memory latency (scroll up and see AIDA's test results). If I'm not wrong, this is down to the ring bus present in Intel's CPUs, which allows it to keep those numbers really low, but at the cost of lower core counts. Add in recent drivers and a faulty Windows core scheduler, and you get these kinds of results.
 