AMD Ryzen 7 5800X vs. Intel Core i7-11700K: 32 Game CPU Battle

AMD removed that offset reporting not long after Ryzen 1000 came out because it caused problems for people trying to measure temperatures...

On top of that, I've used a 2700X and a 5800X, and my wife is currently using a 3700X with a Corsair Capellix 360mm AIO running at 800 RPM; her CPU temperature in demanding games with PBO enabled is 54C. Either I had a really bad 5800X or they just run hot by nature due to the small die area :-).
Yes, but there are many unknowns AMD does not really report.

54 degrees is not much. It's low enough that the thermal sensor could be very inaccurate at that temperature. To put it another way: the FX series had a max temperature around 62 degrees, and its temp sensor was very inaccurate below 40 degrees (that is, a 22-degree gap between the max temp and the inaccurate range). Ryzen should handle 95 degrees with no problems; the same 22-degree gap would mean the sensor is accurate down to about 73 degrees.

Probably not, but who actually cares about the 50-degree range? Anything over 70 is what matters. The 50-degree range is just for automatic fan-curve adjustment and doesn't need to be accurate.
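The arithmetic behind the FX analogy above is just two subtractions; a quick sketch (the 95-degree Ryzen limit is taken from the post, not from a spec sheet):

```python
# Back-of-the-envelope version of the FX -> Ryzen sensor analogy.
# All figures come from the post above; the 22-degree "gap" is just
# (max temp) - (bottom of the accurate range).

fx_max_temp = 62          # FX-series max temperature (degrees C)
fx_accurate_from = 40     # FX sensor was unreliable below this
gap = fx_max_temp - fx_accurate_from   # 22 degrees

ryzen_max_temp = 95       # assumed Ryzen limit, as stated in the post
ryzen_accurate_from = ryzen_max_temp - gap

print(gap)                 # 22
print(ryzen_accurate_from) # 73
```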
 

Ryzen is not FX, and what you're saying makes no sense. AMD got much better at reporting temperatures, and I believe the 54C I see in Ryzen Master, just like I believed the 85C reported when I had the 5800X or the 68C when I overclocked my 2700X. Ryzen CPUs have been out for four years now, and the only time people reported issues with temperature sensors was at the beginning, when AMD used that +20° offset on X SKUs to ramp up the cooler and ensure boost clocks would be reached. They later removed it because it caused too much trouble and gave these CPUs a bad name for apparently running hot when in fact they were not.
 
It's much easier to make a temperature sensor that is accurate in the 70-95 degree range than one that is accurate across 50-95 degrees. That is exactly why FX CPUs didn't read accurately below 40 degrees. The other reason is that nobody cares about the 50-degree range when the limit is somewhere around 95 degrees.

Also, AMD still does not specify an exact (only an approximate) TjMax for Ryzen 5000 CPUs. That alone suggests the reported temperatures are not necessarily exact.

AMD GPUs report high temperatures because AMD now uses more than one sensor; AMD does not really seem to care.
 
AMD removed that offset reporting not long after Ryzen 1000 came out because it caused problems for people trying to measure temperatures...

On top of that, I've used a 2700X and a 5800X, and my wife is currently using a 3700X with a Corsair Capellix 360mm AIO running at 800 RPM; her CPU temperature in demanding games with PBO enabled is 54C. Either I had a really bad 5800X or they just run hot by nature due to the small die area :-).
The 5800X is a 105W TDP part, correct? It runs at higher frequencies than the 3700X, so yes, it runs warmer; the base clock alone is 200MHz higher.

Now, here is why more heat doesn't necessarily mean higher wattage. You are correct that the 7nm chiplet, when running at higher frequencies, is going to be hotter. The transistors are more densely packed: there are many more transistors per sq. mm than on something like an Intel 14nm die, which is what Intel 10th and 11th gen use. The heat sensors sit in areas with a higher density of transistors, and the heat is concentrated in a much more compact space than on Intel CPUs, at least until Intel ships a 10nm desktop CPU (Intel's 10nm has about the same density as TSMC's 7nm).
However, over the entire area of the SoC, there is less total heat on the Zen 2 and Zen 3 SoCs than on the Intel 10th and 11th gen SoCs when you're pushing all the cores. There is no sensor that shows the TOTAL heat of the entire SoC, which would basically be an average over the whole die; you'd need millions of sensors for that kind of information.

The same reason Zen 3 CPUs running at higher frequencies show higher temps than Intel parts is also why the Zen 2 and Zen 3 parts show higher temps than Zen+. Zen+ is on a 12nm die, I believe, though it's been a while since I checked and I could be wrong. Once again, the transistors are more condensed, and therefore so is the heat. Yet the 5800X, I believe, uses less power than the 2700X when both are being pushed.
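The density point can be put in rough numbers. The die areas and all-core power figures below are ballpark assumptions for illustration, not measurements (and Zen 3 package power is split between chiplet and I/O die, which this ignores):

```python
# Rough power-density comparison: similar power squeezed into a much
# smaller die area means hotter local sensor readings, even when the
# total heat output is lower. All figures are approximate assumptions.

def power_density(watts, area_mm2):
    """Average power per square millimetre of die."""
    return watts / area_mm2

zen3_ccd_area = 81        # ~mm^2, one Zen 3 compute chiplet (assumed)
zen3_power = 120          # ~W all-core load attributed to the CCD (assumed)

rocket_lake_area = 276    # ~mm^2, monolithic 11700K die (assumed)
rocket_lake_power = 200   # ~W all-core load, limits lifted (assumed)

print(round(power_density(zen3_power, zen3_ccd_area), 2))        # 1.48 W/mm^2
print(round(power_density(rocket_lake_power, rocket_lake_area), 2))  # 0.72 W/mm^2
```

So even with lower total power, the smaller die can show a hotter sensor reading.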
 
AMD is cool and all, but where tf am I going to get ahold of a 3800 CL14 kit?

The weird RAM situation AMD still has going on is what keeps me away from them. I can get a 5800X easily, but not a fast CL14 kit.
 
My next build will be with DDR5 RAM and whatever generation of Intel Core is available at the time. Probably 13th or 14th.

Intel for gaming and content creation.

AMD for benchmarkers and apps I don't use.

My next build will be AMD.

AMD for best performance.

Intel is not best.

 
So AMD is clearly faster in gaming, according to the article. So what's the point of what you're saying?

This article does. Not all articles do; I hope people gather info from more than one place. Also, on 14nm+++++++++ Intel is losing by a frame or five out of a million frames at 1080p. That doesn't tell me AMD is superior by much; it tells me AMD will soon have nothing to offer (well, it will have something to offer, just not for gaming, unless they drop the 5600X to $200 to undercut Intel again).
 
🤦‍♂️

The purpose of 14nm+++++++++ was to make CPUs that reach high clock speeds, so they are fast on crappy software that only uses one or two cores.

There is one problem, though: 14nm+++++++++++++++ is very power hungry. Intel's 10nm process is not as power hungry, so why do you think Intel makes Rocket Lake on 14nm+++++++++++ instead of 10nm? Because 14nm++++++++++++ gives much higher clock speeds.

Now, if Intel wants to match AMD's power consumption, it has to use at least the 10nm process at much lower clock speeds. But wait, then AMD is much faster at gaming :cool:
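The clocks-versus-power trade-off follows from the usual dynamic-power rule of thumb, P ∝ C·V²·f: higher clocks generally also need higher voltage, so power grows much faster than frequency. A sketch with made-up illustrative voltages (not measured figures):

```python
# Dynamic CPU power scales roughly with capacitance * voltage^2 * frequency.
# The voltage/frequency pairs below are invented for illustration only.

def dynamic_power(capacitance, voltage, freq_ghz):
    return capacitance * voltage**2 * freq_ghz

base = dynamic_power(1.0, 1.10, 4.0)   # modest clock, modest voltage
oc   = dynamic_power(1.0, 1.35, 5.0)   # 25% more clock, but more voltage too

print(round(oc / base, 2))  # 1.88 -> ~88% more power for 25% more clock
```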
 
AMD is cool and all, but where tf am I going to get ahold of a 3800 CL14 kit?

The weird RAM situation AMD still has going on is what keeps me away from them. I can get a 5800X easily, but not a fast CL14 kit.

You don't need CL14 3800MHz RAM; you can use anything you want. There is nothing weird about their RAM support.
 
This article does. Not all articles do; I hope people gather info from more than one place. Also, on 14nm+++++++++ Intel is losing by a frame or five out of a million frames at 1080p. That doesn't tell me AMD is superior by much; it tells me AMD will soon have nothing to offer (well, it will have something to offer, just not for gaming, unless they drop the 5600X to $200 to undercut Intel again).
First of all, Intel's 14nm+++++++ is somewhat comparable to TSMC's 7nm; proof: https://m.hexus.net/tech/news/cpu/145645-intel-14nm-amdtsmc-7nm-transistors-micro-compared/

So don't expect Intel to really squeeze out much more. Another sign is that they decided to rename their process nodes to sound more impressive: https://www.kitguru.net/components/...naming-scheme-to-match-the-industry-standard/

Secondly, they kept using 14nm+++++ for the purpose of reaching those oh-so-desired 5.0-5.3 GHz; good luck getting such frequencies on Intel's 10nm. And Intel can't match AMD at the same frequency right now: even in the article above (and other articles on the web), the 11700K can't match the 5800X, even though its boost is ~300MHz higher than AMD's.
 
This article does. Not all articles do; I hope people gather info from more than one place. Also, on 14nm+++++++++ Intel is losing by a frame or five out of a million frames at 1080p. That doesn't tell me AMD is superior by much; it tells me AMD will soon have nothing to offer (well, it will have something to offer, just not for gaming, unless they drop the 5600X to $200 to undercut Intel again).
Most articles test only 4-6 games, so it is hard to make a proper decision; a single game can swing the results by a lot. Just make an Excel file, put all the results there, and see whether there are differences between testing methodologies, etc.

This is why I like the huge gaming benchmarks Steve puts together. You get a much clearer picture of performance.
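Once all the results are in one place, the usual way to summarise many games is a geometric mean, so one outlier title can't dominate the average. A minimal sketch (the FPS numbers are invented):

```python
import math

# Geometric mean of per-game FPS results; invented numbers for illustration.
def geomean(values):
    return math.exp(sum(math.log(v) for v in values) / len(values))

cpu_a = [144, 210, 98, 120]   # hypothetical FPS results, CPU A
cpu_b = [150, 190, 102, 118]  # hypothetical FPS results, CPU B

print(round(geomean(cpu_a), 1))
print(round(geomean(cpu_b), 1))
```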
 
Seriously, where exactly are you getting your tech info, dude? You are either reading the wrong things (like UserBenchmark) or intentionally saying misleading things.
What? I used 3200MHz CL16 with my 2700X, and my wife is using 3600MHz CL16 with no problems.

Have you ever used a Ryzen CPU?
Y'all over here acting like 1:1 Infinity Fabric ain't a thing. Yeah, I can find CL16 kits easily; that's not what's optimal for Ryzen, though. Your 3600 CL16 on a 5800X will not perform the same as a 3800 CL14 kit. It's the sought-after kit, the one you should have your eyes on as a Ryzen buyer. 4000+ kits are pretty much out of the question.
 
It is a thing, but 3800MHz won't give you much besides a very small boost. You are better off tightening the timings on your 3200/3600MHz RAM if you want to max things out without going bankrupt.



PS: Intel also benefits from such memory optimisations.

Edit: unless you have a rare chip that can run 1:1 with 4000MHz RAM, don't buy such kits.
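One way to compare kits like 3200 CL14, 3600 CL16, and 3800 CL14 is first-word latency: CAS cycles divided by the actual memory clock (half the DDR transfer rate). A quick sketch:

```python
# First-word latency in nanoseconds: CAS latency / memory clock.
# DDR transfers twice per clock, hence the transfer rate / 2.

def first_word_latency_ns(transfer_rate_mts, cas):
    clock_mhz = transfer_rate_mts / 2
    return cas / clock_mhz * 1000

print(round(first_word_latency_ns(3200, 14), 2))  # 8.75 ns
print(round(first_word_latency_ns(3600, 16), 2))  # 8.89 ns
print(round(first_word_latency_ns(3800, 14), 2))  # 7.37 ns
```

By this metric a cheap 3600 CL16 kit is barely behind 3200 CL14, which is why tightening timings is usually the better value.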
 
Y'all over here acting like 1:1 Infinity Fabric ain't a thing. Yeah, I can find CL16 kits easily; that's not what's optimal for Ryzen, though. Your 3600 CL16 on a 5800X will not perform the same as a 3800 CL14 kit. It's the sought-after kit, the one you should have your eyes on as a Ryzen buyer. 4000+ kits are pretty much out of the question.
Nobody is acting like that. I had a Ryzen system, I still have one in my house that my wife is using, and I've built a few PCs for friends with Ryzen CPUs, so I know how they work. You may not know this, but 3800MHz CL14 would also be best for Intel; that doesn't mean you can't use anything else. I would take 3600MHz CL16-19-19-36 over 3800MHz CL14 at half the price any day, for any PC, Intel or AMD.

BTW, Infinity Fabric syncs 1:1 with your RAM only up to 3800MHz; go over that and it drops to half speed. The same thing happens on 11th gen, but there it's after 3600MHz.
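The coupling described above can be sketched as follows; the 3800 MT/s cutoff is the commonly cited limit for typical Ryzen 5000 chips (an assumption, not a guarantee), and past it the memory controller falls back to a 2:1 divider:

```python
# Rough model of the RAM/fabric coupling on Ryzen 5000: the memory clock
# is half the DDR transfer rate, and the controller runs 1:1 with it up
# to roughly 3800 MT/s on typical chips (assumed cutoff); beyond that it
# drops to a 2:1 divider, with a large latency penalty.

def effective_controller_mhz(transfer_rate_mts, one_to_one_limit_mts=3800):
    mclk = transfer_rate_mts / 2
    if transfer_rate_mts <= one_to_one_limit_mts:
        return mclk        # 1:1 mode
    return mclk / 2        # 2:1 mode: half speed

print(effective_controller_mhz(3600))  # 1800.0
print(effective_controller_mhz(4400))  # 1100.0
```

This is why a 4400MHz kit can end up slower in practice than a 3600-3800MHz kit running 1:1.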
 
I would like to see a comparison with the non-K model, the 11900.
I lifted the power limits on this non-K model.
Here are the results (screenshots): Cinebench R23, 3DMark Time Spy, CPU-Z, Passmark.
It can compete with the Ryzen 5800X on equal terms.
In Japan, the 5800X and the non-K 11900 are currently almost the same price.

It might have similar scores, but why would anyone buy it over the 5800X when AM4 is the better platform?
 
Because it's cheaper? Reliability?

But it isn't cheaper. If the CPUs are the same price, you're still most likely getting a better B550 board for the price of a B560 board. Intel has to be cheaper to be considered, and not just by a few dollars.
 
I would buy it if it were a bit cheaper than the 5800X. It's all about the features you need and your use cases. Reliability is not an issue with any of these CPUs.
 
You're telling me Ryzen never had USB connectivity issues? Can I go buy a 4400MHz kit and rely on a 5800X to run it?
But it isn't cheaper. If the CPUs are the same price, you're still most likely getting a better B550 board for the price of a B560 board. Intel has to be cheaper to be considered, and not just by a few dollars.
It is cheaper. The motherboards don't require fans to cool them, and you can put in 4400MHz kits. This isn't like a few years ago, when AMD wasn't even in the running as the cheaper alternative. AMD took the slightest little lead and decided to throw a $450 8-core at you, and you ate it up.
 
Yeah, it had USB issues for a short while until updates were released; most motherboards have no USB bug now. Are you telling me Intel never had issues? Are you going to nitpick here, when the issues on both platforms are pretty much known?

Here, nitpick this:

As for the 8-core comment... you have the 11900K released by Intel. Enough said.
 