AMD Ryzen 7 5800X vs. Intel Core i7-11700K: 32 Game CPU Battle

Going for a last-gen 10700K would save me €28 (€39 for the 10700), or about 5.5%. So in my market, the question is: why would I do that? Getting last gen that's behind in everything should come at a much lower cost.
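Rough back-of-the-envelope on those numbers (the ~€510 baseline is only what "€28 ≈ 5.5%" implies, not a quoted price):

```python
# Back-of-the-envelope: what a €28 / €39 saving works out to in percent.
# The ~€510 current-gen price is implied by "€28 ≈ 5.5%", not a quoted figure.
price_11700k = 510        # assumed approximate 11700K price in EUR
saving_10700k = 28        # quoted saving for the 10700K
saving_10700 = 39         # quoted saving for the 10700

print(f"10700K: {saving_10700k / price_11700k:.1%} cheaper")  # -> 5.5%
print(f"10700:  {saving_10700 / price_11700k:.1%} cheaper")   # -> 7.6%
```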

last gen that's behind in everything

How did you come to that conclusion, when the 10700K is performing better than the 11700K in most games (and matching or surpassing the 5800X in a few cases)?
 
To be fair, as someone who owned the 5800X and now owns a 10850K, power draw is not an issue for the i9 while gaming. It's still higher than AMD's, of course, but I doubt you could actually see the difference on your electricity bill unless you game 18 hours a day. As for cooling, my stock Ryzen 5800X with PBO enabled ran hotter than my i9 overclocked to 5.0 GHz on the same 360 mm AIO.
I'm not worried about power draw in terms of the electricity bill, but I do dislike inefficiency.
Plus, higher power consumption means overall higher system cost, as all components need to be able to handle it.

CPU heat dissipation is another matter, since smaller dies can dissipate less heat, but that does not mean you need a beefier power supply or motherboard VRM.

Just compare Zen+ chiplets to Zen 2 and Zen 3 chiplets. The latter consume a good bit less power, but they are also a lot smaller.
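As a rough illustration of that system-cost point, here's a back-of-the-envelope PSU sizing sketch; all the wattages below are hypothetical examples, not figures from this thread or the article:

```python
# Rough PSU sizing sketch: CPU power feeds directly into the size (and cost) of the PSU.
# All wattages below are hypothetical examples.
gpu_w = 320          # example GPU board power
rest_w = 75          # example budget for motherboard, RAM, drives, fans
headroom = 1.3       # ~30% margin for transients and efficiency sweet spot

for cpu_name, cpu_w in (("lower-power CPU", 125), ("higher-power CPU", 225)):
    recommended = (cpu_w + gpu_w + rest_w) * headroom
    print(f"{cpu_name}: ~{recommended:.0f} W PSU recommended")
```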
 
I'm not worried about power draw in terms of the electricity bill, but I do dislike inefficiency.
Plus, higher power consumption means overall higher system cost, as all components need to be able to handle it.

CPU heat dissipation is another matter, since smaller dies can dissipate less heat, but that does not mean you need a beefier power supply or motherboard VRM.

Just compare Zen+ chiplets to Zen 2 and Zen 3 chiplets. The latter consume a good bit less power, but they are also a lot smaller.

I usually don't care about inefficiency. When I buy parts for my PC I always spend extra on the PSU, motherboard and cooling regardless of what CPU goes in there, but of course not everyone does this, so I understand why most people do care about these things. My wife runs a 3700X with PBO enabled and it runs really cool on a 360 mm AIO, so much cooler than the 5800X was 😅😅
 
You didn't understand. For benchmarks at 1080p, the Radeon 6900 XT is much faster than the 3090. The benchmarker wants to remove the possibility of a GPU bottleneck at that resolution. Why at 1080p? Because it is the best resolution to compare CPUs. This article is not about GPUs. Hopefully you understand now.
Thanks for the pontificating, but it is still beside the point.

Pairing an AMD CPU and GPU together may (or may not) have certain advantages that are not realized if an Nvidia GPU or an Intel CPU is used instead. Testing with Nvidia hardware would confirm or rule out that possibility.

Second, the conclusion that the 5800X is better by x% on the 6900 XT is a best-case scenario, but most people don't have a 6900 XT, and in fact, Nvidia is the more popular brand. So what is the performance differential between the two CPUs on Nvidia hardware?

There are more layers to this than "what's the fastest 1080p GPU," especially given the exorbitant prices of the 6900 XT. Microcenter just quoted me $1800 for a Red Devil 6900 XT the other day, and they're even more expensive on Amazon, eBay, StockX, etc. I'd argue that most people with the deep pockets to purchase a 6900 XT aren't gaming at 1080p anyway. So if you want to have a complete discussion of 1080p, why not measure the performance difference on hardware that most folks who game at 1080p will actually be interested in purchasing?

Just because there's an x% difference in framerate between the two CPUs on the 6900 XT doesn't mean the same is true on a 3070 or even a 6700 XT. If there isn't any appreciable difference at all on those other GPUs, then that should be pointed out, rather than the unsupported conclusion that "because the 5800X is faster at 1080p on a 6900 XT, the same is necessarily true on a 6700 XT or 3060 or 3070."

The only way to know for sure is to test the other hardware. And we know these guys have the other hardware. That's why I left the comment in the first place.
 
To be fair, as someone who owned the 5800X and now owns a 10850K, power draw is not an issue for the i9 while gaming. It's still higher than AMD's, of course, but I doubt you could actually see the difference on your electricity bill unless you game 18 hours a day. As for cooling, my stock Ryzen 5800X with PBO enabled ran hotter than my i9 overclocked to 5.0 GHz on the same 360 mm AIO.
You should have tried using Ryzen tuner; I used it and lowered my temps by nearly 20 °C at full load.
 
Thanks for the pontificating, but it is still beside the point.

Pairing an AMD CPU and GPU together may (or may not) have certain advantages that are not realized if an Nvidia GPU or an Intel CPU is used instead. Testing with Nvidia hardware would confirm or rule out that possibility.

Second, the conclusion that the 5800X is better by x% on the 6900 XT is a best-case scenario, but most people don't have a 6900 XT, and in fact, Nvidia is the more popular brand. So what is the performance differential between the two CPUs on Nvidia hardware?

There are more layers to this than "what's the fastest 1080p GPU," especially given the exorbitant prices of the 6900 XT. Microcenter just quoted me $1800 for a Red Devil 6900 XT the other day, and they're even more expensive on Amazon, eBay, StockX, etc. I'd argue that most people with the deep pockets to purchase a 6900 XT aren't gaming at 1080p anyway. So if you want to have a complete discussion of 1080p, why not measure the performance difference on hardware that most folks who game at 1080p will actually be interested in purchasing?

Just because there's an x% difference in framerate between the two CPUs on the 6900 XT doesn't mean the same is true on a 3070 or even a 6700 XT. If there isn't any appreciable difference at all on those other GPUs, then that should be pointed out, rather than the unsupported conclusion that "because the 5800X is faster at 1080p on a 6900 XT, the same is necessarily true on a 6700 XT or 3060 or 3070."

The only way to know for sure is to test the other hardware. And we know these guys have the other hardware. That's why I left the comment in the first place.
You still didn't understand. The benchmarker is using the best GPU at 1080p (the fastest one at that resolution) to remove the possibility of a GPU bottleneck. Why remove the GPU bottleneck? So the GPU won't be altering the comparison between CPUs. It doesn't matter whether it is Nvidia or AMD; it just has to be the fastest GPU at 1080p, and for now the 6900 XT is the fastest GPU at 1080p. Once the benchmarker made sure that the GPU was not distorting the results, he proceeded to benchmark the CPUs. OK, this is benchmarking 101. Two links for you to check:
https://www.videocardbenchmark.net/high_end_gpus.html (benchmarks at 1080p)
https://www.gpucheck.com/gpu-benchmark-graphics-card-comparison-chart (at different resolutions)
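A toy model of why you want the fastest GPU, with made-up numbers (none of them from the article): the frame rate you measure is roughly capped by whichever of the CPU or GPU is slower, so only a fast enough GPU lets the CPU gap show.

```python
# Toy model of a CPU gaming benchmark: the measured frame rate is limited
# by the slower of the two components. All fps numbers are made up.
def measured_fps(cpu_limit: float, gpu_limit: float) -> float:
    return min(cpu_limit, gpu_limit)

cpu_a, cpu_b = 180, 160            # hypothetical CPU-limited frame rates
fast_gpu, slow_gpu = 250, 150      # hypothetical GPU-limited frame rates at 1080p

# With a fast GPU the CPU difference is visible: 180 vs 160 fps.
print(measured_fps(cpu_a, fast_gpu), measured_fps(cpu_b, fast_gpu))
# With a slower GPU both CPUs read 150 fps and the difference is hidden.
print(measured_fps(cpu_a, slow_gpu), measured_fps(cpu_b, slow_gpu))
```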
 
5800X for gaming? Still no.
You can get a 10700 for $399 CAD and a Z590 for $220 in Canada, for example, or a B560 for around $140. The 5800X alone is $609.

More than 8 cores won't help you in games either, so there goes your upgrade path if it's a gaming machine. For gaming, get a 5600X and leave the 5800X alone unless it's HEAVILY discounted and you just have to have 8 cores.

I feel sorry for those who still live in a third-world country. For those who don't want to click the link: that is an AMD Ryzen 7 5800X processor for $399.99. LOL

https://www.microcenter.com/product...800x-vermeer-38ghz-8-core-am4-boxed-processor
 
You still didn't understand. The benchmarker is using the best GPU at 1080p (the fastest one at that resolution) to remove the possibility of a GPU bottleneck. Why remove the GPU bottleneck? So the GPU won't be altering the comparison between CPUs. It doesn't matter whether it is Nvidia or AMD; it just has to be the fastest GPU at 1080p, and for now the 6900 XT is the fastest GPU at 1080p. Once the benchmarker made sure that the GPU was not distorting the results, he proceeded to benchmark the CPUs. OK, this is benchmarking 101. Two links for you to check:
https://www.videocardbenchmark.net/high_end_gpus.html (benchmarks at 1080p)
https://www.gpucheck.com/gpu-benchmark-graphics-card-comparison-chart (at different resolutions)
Agree to disagree and let's move on! Thank you!
 
So, for us normal people who can't get our hands on expensive CL14-3800 memory, how would we extrapolate real-world performance from this? Do both CPUs scale with memory the same way?

I understand the methodology of 'removing bottlenecks' for a direct comparison, but how about a realistic one for the average end user contemplating an upgrade? 3600 CL16 is hard enough to get at a decent price.
 
I'm not worried about power draw in terms of the electricity bill, but I do dislike inefficiency.
Besides agreeing with that, I will add that the brand fanboys will use the power consumption and temperature numbers when it fits their brand worship.

For example, the person you replied to is the same type who would say those things DO matter if it were the competitor that had those deficiencies.
 
To be fair, as someone who owned the 5800X and now owns a 10850K, power draw is not an issue for the i9 while gaming. It's still higher than AMD's, of course, but I doubt you could actually see the difference on your electricity bill unless you game 18 hours a day. As for cooling, my stock Ryzen 5800X with PBO enabled ran hotter than my i9 overclocked to 5.0 GHz on the same 360 mm AIO.
As someone who used to own a Ryzen 7 5800X, I'd say the heat issue is really down to too much power being fed into the single chiplet/CCX. And being a chiplet, its surface area is too small to cool effectively compared to a monolithic chip like Comet Lake/Rocket Lake.

If you are savvy enough to OC the Intel chip, messing around with the PBO settings is probably easier. Just by reducing the PPT to 115 W instead of 140 W, I saw a drop from 90 °C looping Cinebench R20 to 78 °C, with an Arctic Liquid Freezer II 360 and a room temperature of slightly over 27 °C.
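For what it's worth, a crude steady-state estimate (treating the cooler and IHS as one fixed thermal resistance, which is a simplification) lines up with those numbers:

```python
# Crude steady-state check: temperature rise over ambient ≈ package power × thermal resistance.
# Figures are the ones quoted above; the constant-resistance assumption is a simplification.
ambient = 27.0                            # room temperature, °C
t_at_140w = 90.0                          # observed at PPT = 140 W
r_th = (t_at_140w - ambient) / 140.0      # implied thermal resistance ≈ 0.45 K/W

t_at_115w = ambient + r_th * 115.0
print(f"Predicted at 115 W: {t_at_115w:.0f} °C")  # ≈ 79 °C, close to the observed 78 °C
```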
 
The 5800X is a great processor if you can get it on sale. I got mine for $410. Finding a 5900X was impossible at the time, and 5600Xs were available, but for $350+.
It seems the stock situation is getting better for the Ryzen 5xxx series, with all four SKUs available this week at MSRP. Of course the 5900X, being the best value, is quickly snapped up within an hour.
 
Besides agreeing with that, I will add that the brand fanboys will use the power consumption and temperature numbers when it fits their brand worship.

For example, the person you replied to is the same type who would say those things DO matter if it were the competitor that had those deficiencies.
Oh, I've heard that for years, back when efficiency and power consumption seemed to matter a lot (for CPUs and GPUs), versus now, when it's often "who cares about power consumption". Heck, laptop CPUs are now being praised for scaling well with high power use (70+ W).

Either way, I disliked inefficiency even back then, which is why e.g. Polaris was out of the question for me in spite of its attractive price/performance.

I am not saying that high power consumption disqualifies a product if performance is good enough, but it does cost additional money when building a PC.
 
My next build will be with DDR5 RAM and whatever generation of Intel Core is available at the time. Probably 13th or 14th.

Intel for gaming and content creation.

AMD for benchmarkers and apps I don't use.
"Intel for gaming and content creation." Did you even read the article, or all the others? The conclusion is clear: in general terms, across all (real-life) performance metrics, AMD is no. 1 and Intel is no. 2.
But hey, let's buy Intel, lol.
 
Nice to see I made the right call picking up a 5800X in December. I was a little worried that the 11700K would be faster. You do get Thunderbolt and an iGPU on the Intel side, but it is a lot hotter.

It's definitely odd to use a 6900 XT to test these CPUs; Radeon has about a 10% market share, so the vast majority of people would be using different drivers from what's being used here. But this is probably just a symptom of Steve's well-documented beef with Nvidia.

Also, it's savage that the 6900 XT can't deliver Metro Exodus Enhanced Edition at 4K60. A 3070 appears to manage just fine according to other sources, and a 2070 Super does it with DLSS 2.1. If only TechSpot would test it properly. But no, we will probably get another boring office laptop review instead.
 
Nice to see I made the right call picking up a 5800X in December. I was a little worried that the 11700K would be faster. You do get Thunderbolt and an iGPU on the Intel side, but it is a lot hotter.

It's definitely odd to use a 6900 XT to test these CPUs; Radeon has about a 10% market share, so the vast majority of people would be using different drivers from what's being used here. But this is probably just a symptom of Steve's well-documented beef with Nvidia.

Also, it's savage that the 6900 XT can't deliver Metro Exodus Enhanced Edition at 4K60. A 3070 appears to manage just fine according to other sources, and a 2070 Super does it with DLSS 2.1. If only TechSpot would test it properly. But no, we will probably get another boring office laptop review instead.
Another guy who doesn't understand. Maybe Steve can explain why he used the Radeon 6900 XT here? I explained this to another guy, but he still couldn't understand me.
 
Another guy who doesn't understand. Maybe Steve can explain why he used the Radeon 6900 XT here? I explained this to another guy, but he still couldn't understand me.
I think it's you who has failed to grasp what the problem is here. Games perform differently with different drivers; Steve himself showcased this not that long ago, showing the Nvidia driver to have higher CPU overhead than the Radeon driver. This means we would likely see bigger differences between these CPUs if you used an Nvidia card. And since roughly 90% of the market uses Nvidia, it would have given more relevant results to use Nvidia.

It’s very simple to understand.
 