AMD Ryzen 7 7800X3D vs. Intel Core i9-14900K

For me it was the 7800X3D at launch earlier this year, for gaming. It was also largely the knowledge that not only do I have a CPU that is class-leading and incredibly power efficient, it is on a platform that will probably last at least two more CPU generations. I do expect Intel to respond with large-cache models next time around, but they really need a newer process node more than anything.

AM5 had some launch teething issues, not least the infamous smoking CPUs, but in my experience it feels like a much more stable environment now than when I set up this system nine months ago.
 
If you are into a small form factor build without the risk of throttling, the 7800X3D is the best CPU you can get. Because Intel essentially rebranded the i9-13900K and still didn't dethrone the 7800X3D in gaming, they are getting double the beatdown exposure-wise. Not a smart move by Intel, IMO. While the i9-14900K is a great CPU for everything outside of gaming, it's a gaming CPU last. Meaning if you are only gaming, this is the CPU to avoid: a higher premium, higher RAM cost, higher cooling cost, worse efficiency, and, lastly, it is not ideal for a small form factor build.
Unfortunately the price of the 7800X3D went back up to $399, unless you live near a Microcenter, where it's $349, or $499 for a combo of the AMD Ryzen 7 7800X3D, an MSI B650-P Pro WiFi DDR5 board, and a G.Skill Flare X5 Series 32GB DDR5-6000 kit (computer build bundle).

With AMD recently re-emphasizing support for AM5 until 2025+, early adopters can only hope for potential Zen 6 3D compatibility, but I wouldn't hold my breath.
 
@Steven "we have the 4K data, which, in our opinion, is not very valuable for CPU testing" That's true for high-end CPUs, but what about low-end ones?
In one of your benchmark reviews comparing the Ryzen 5 1600 and an i5-7xxx (I could not find it anymore, to be honest), I remember that at 720p the Intel part was the faster one, but at 1080p it was the AMD one. From what I have seen, raising the resolution raises the CPU demand too (though of course, instead of a 4K test, you can easily use Cinebench to check how powerful the CPU is).

BTW, the Ryzen 7 7800X3D is good for productivity too. Not the best, but if it's more gaming and less productivity, I think it's a no-brainer that it is "The One CPU".
 
Step 1) If money is no object, you buy the 14900K.

Step 2) If money is a problem and you just want to play games, then between the two the 7800X3D is the obvious choice.

Step 3) Problem is, there are other CPUs that offer great gaming performance and are much cheaper than either of these two, so unless you are playing at 1080p with a 4090 (in which case money is obviously no object, so go to step 1), you are better off with something cheaper (e.g., the 13600K).

Step 4) If you want an all-around fast CPU, then again, for the same money as the 7800X3D there are much better options (13700K, 14700K, 7900X).

When you are gaming at 4K with a 4090, as you should, 85% of the system power draw is the GPU. The difference in power draw between CPUs will be something like 30 watts at most. It's not even worth thinking about.
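To put that claim in rough numbers, here is a minimal back-of-envelope sketch. The 450 W GPU figure and the 20 W "rest of system" figure are illustrative assumptions, not measurements; only the 30 W CPU delta comes from the argument above.

```python
# Rough sanity check of the "CPU power delta barely matters at 4K" argument.
# All wattages below are illustrative assumptions, not measured values.

gpu_power_w = 450        # assumed 4090 draw under a GPU-bound 4K load
cpu_low_w = 60           # assumed draw of an efficient CPU while GPU-bound
cpu_high_w = 90          # assumed draw of a hungrier CPU (+30 W, as argued above)
rest_of_system_w = 20    # board, RAM, SSD, fans (assumption)

system_low = gpu_power_w + cpu_low_w + rest_of_system_w
system_high = gpu_power_w + cpu_high_w + rest_of_system_w

print(f"GPU share of system draw: {gpu_power_w / system_low:.0%}")        # ~85%
print(f"30 W CPU delta as a share of the whole system: "
      f"{(system_high - system_low) / system_low:.1%}")                   # ~6%
```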
 
The 7800X3D isn't the better CPU, though. If money were no object and Intel did not exist, I would still buy a 7950X3D over the 7800X3D. Because what can you do, it's not a particularly good CPU; it's pathetically slow at everything but games.

Let me wait 40 minutes every time I want to play some RPCS3 because the 8-core chip is struggling to build the shaders. Yeah, makes sense.
 
Step 1) If money is no object, you buy the 14900K.

Step 2) If money is a problem and you just want to play games, then between the two the 7800X3D is the obvious choice.

Step 3) Problem is, there are other CPUs that offer great gaming performance and are much cheaper than either of these two, so unless you are playing at 1080p with a 4090 (in which case money is obviously no object, so go to step 1), you are better off with something cheaper (e.g., the 13600K).

Step 4) If you want an all-around fast CPU, then again, for the same money as the 7800X3D there are much better options (13700K, 14700K, 7900X).

When you are gaming at 4K with a 4090, as you should, 85% of the system power draw is the GPU. The difference in power draw between CPUs will be something like 30 watts at most. It's not even worth thinking about.
Intel and their paid actors need to go away.
 
The 7800X3D isn't the better CPU, though. If money were no object and Intel did not exist, I would still buy a 7950X3D over the 7800X3D. Because what can you do, it's not a particularly good CPU; it's pathetically slow at everything but games.

Let me wait 40 minutes every time I want to play some RPCS3 because the 8-core chip is struggling to build the shaders. Yeah, makes sense.
Do you have any articles and/or videos showing shader compilation scaling in RPCS3? I am very interested in this. I've played more than a handful of games with shader compilation that takes a few seconds to one minute maximum on my 7800X3D.
 
7800X3D and i9-14900K users generally play games at 1440p or even 4K resolution, where the FPS difference may be negligible... then the choice is just a matter of price.
However, if what you want is to play games and also do work that requires high productivity, then the choice can be a challenge.
 
Do you have any articles and/or videos showing shader compilation scaling in RPCS3? I am very interested in this. I've played more than a handful of games with shader compilation that takes a few seconds to one minute maximum on my 7800X3D.
How many shaders are they compiling? For example, in GT6 I've only reached 63k, and I've heard that other games like Metal Gear Solid hit over 400k. Haven't tested Metal Gear, but I assume that's going to take a LOT.
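For a rough sense of scale only: assuming build time grows roughly linearly with shader count (a big simplification; RPCS3 compile times also depend on the game, driver, and core count), the jump from ~63k to ~400k shaders would look something like the sketch below. The five-minute baseline is a hypothetical number, not a measurement.

```python
# Naive proportional estimate of shader cache build time vs. shader count.
# Assumes linear scaling, which is a simplification; the baseline time is hypothetical.

gt6_shaders = 63_000         # shader count mentioned above for GT6
mgs_shaders = 400_000        # reported ballpark for Metal Gear Solid
gt6_build_min = 5            # hypothetical baseline build time in minutes

ratio = mgs_shaders / gt6_shaders
print(f"Shader count ratio: {ratio:.1f}x")                          # ~6.3x
print(f"Naive linear estimate: {gt6_build_min * ratio:.0f} minutes")
```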
 
An i9 for gaming? If you do this, you clearly have a problem; most of the cores are not even used... The only sensible choice for gaming, if that is all you do, is an X3D, period.
The only sensible choice for playing games is a CPU that costs €400. Right, yeah.
 
7800X3D and i9-14900K users generally play games at 1440p or even 4K resolution, where the FPS difference may be negligible... then the choice is just a matter of price.
However, if what you want is to play games and also do work that requires high productivity, then the choice can be a challenge.
Actually, the resolution depends on the GPU one is using. You need to remember that with upscaling technology like DLSS and FSR, running anything below Quality mode at 4K means it's rendering at roughly 1080p. If you are gaming at 1440p, Quality mode on either upscaling tech means roughly 1080p native or lower. So we are running games at 1080p more often than we realize with the very GPU-taxing games introduced in 2023.
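To make the render-resolution point concrete, here is a small sketch using the commonly cited per-axis scale factors for DLSS/FSR 2 quality modes (Quality ≈ 1/1.5, Balanced ≈ 1/1.7, Performance = 1/2); individual games and upscaler versions can differ.

```python
# Internal render resolution for common upscaler quality modes.
# Scale factors are the commonly cited per-axis defaults for DLSS / FSR 2;
# specific games and upscaler versions may use different ratios.

modes = {"Quality": 1 / 1.5, "Balanced": 1 / 1.7, "Performance": 1 / 2.0}
outputs = {"1440p": (2560, 1440), "4K": (3840, 2160)}

for out_name, (w, h) in outputs.items():
    for mode, scale in modes.items():
        print(f"{out_name:5s} {mode:11s} -> {round(w * scale)} x {round(h * scale)}")
```

With those factors, 1440p Quality lands at about 1707 x 960, and 4K Performance at exactly 1920 x 1080, which is what the post above is getting at.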
 
Actually, the resolution depends on the GPU one is using. You need to remember that with upscaling technology like DLSS and FSR, running anything below Quality mode at 4K means it's rendering at roughly 1080p. If you are gaming at 1440p, Quality mode on either upscaling tech means roughly 1080p native or lower. So we are running games at 1080p more often than we realize with the very GPU-taxing games introduced in 2023.
Imagine it being 2023 and you just paid $2K for a 4090 to run games at 1080p because Nvidia wants you to use RT and DLSS.
 
A 130-watt difference, and the X3D averages 5% faster, so the GPU is under higher load. The watt-slurping approach is not paying dividends for Intel anymore.

The heat is really no joke. A co-worker bought a 14900K, against advice, and constantly struggles to keep the thing from throttling, even with a water cooler. And it heats up his room to a ridiculous degree. I can't imagine what he will do this summer. My own machine with a 6800 XT gets the room uncomfortably warm in the summer, and it's pulling a total system draw comparable to that 7800X3D build.
 
A 130-watt difference, and the X3D averages 5% faster, so the GPU is under higher load. The watt-slurping approach is not paying dividends for Intel anymore.

The heat is really no joke. A co-worker bought a 14900K, against advice, and constantly struggles to keep the thing from throttling, even with a water cooler. And it heats up his room to a ridiculous degree. I can't imagine what he will do this summer. My own machine with a 6800 XT gets the room uncomfortably warm in the summer, and it's pulling a total system draw comparable to that 7800X3D build.
Lmao, what a bunch of BS. Sure, if you run Cinebench with 32 threads at max overclocks, the power consumption is ridiculous. However, when playing normal games that utilize a few cores, my 6.2GHz P / 4.8GHz E 14900K consumes about 65W. At idle it's about 12W, and in single-core games around 35W. In very heavy games the power consumption does jump a bit, but that's a pretty unrealistic scenario, to be honest. SOTTR can load 16 cores to the max, and in that scenario it was 165W. Still a third of a stock 4090, so I don't really know if you can say it "heats up the room to a ridiculous degree".

As for cooling, tell him to check that his cooler is mounted properly. 100°C at stock is not normal.

5.7GHz: (screenshot attached)
 
A 130-watt difference, and the X3D averages 5% faster, so the GPU is under higher load. The watt-slurping approach is not paying dividends for Intel anymore.

The heat is really no joke. A co-worker bought a 14900K, against advice, and constantly struggles to keep the thing from throttling, even with a water cooler. And it heats up his room to a ridiculous degree. I can't imagine what he will do this summer. My own machine with a 6800 XT gets the room uncomfortably warm in the summer, and it's pulling a total system draw comparable to that 7800X3D build.
Cool story bro. Wish it was true though.

If your coworker is running Cinebench on a loop and he doesn't want it to thermal throttle, he can power limit it, right? Set it to 200W; it will be incredibly easy to cool and still the second-fastest CPU in MT. But what water cooler does he have? I'm scoring 42K in CB R23 on a U12A, so it's kind of hard to believe your story.
 
Lmao, what a bunch of BS. Sure, if you run Cinebench with 32 threads at max overclocks, the power consumption is ridiculous. However, when playing normal games that utilize a few cores, my 6.2GHz P / 4.8GHz E 14900K consumes about 65W. At idle it's about 12W, and in single-core games around 35W. In very heavy games the power consumption does jump a bit, but that's a pretty unrealistic scenario, to be honest.
Cool story bro. Wish it was true though.

If your coworker is running Cinebench on a loop and he doesn't want it to thermal throttle, he can power limit it, right?
I shall once again point you both to this:
Once again, over a 12-game average, the 14900K is eating 130 watts more than the 7800X3D system while being (on average) 5% slower.
Set it to 200W
Why would anyone spend all that money on the 14900K only to then limit it even further? Why would you not just buy the 7800X3D instead?
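Taking those two averages at face value, the implied gaming efficiency gap is easy to work out. A minimal sketch, where the 390 W baseline for the X3D system and the normalized 100 FPS are illustrative assumptions; only the 130 W delta and the 5% deficit come from the chart being discussed.

```python
# Implied gaming efficiency gap from the quoted averages:
# the 14900K system draws ~130 W more while averaging ~5% lower FPS.
# The 390 W baseline and 100 FPS normalization are illustrative assumptions.

x3d_system_w = 390
x3d_fps = 100.0
i9_system_w = x3d_system_w + 130
i9_fps = x3d_fps * 0.95          # "5% slower" on average

x3d_eff = x3d_fps / x3d_system_w
i9_eff = i9_fps / i9_system_w
print(f"7800X3D system: {x3d_eff:.3f} FPS/W")
print(f"14900K system:  {i9_eff:.3f} FPS/W ({i9_eff / x3d_eff:.0%} of the X3D's)")
```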
 
I shall once again point you both to this:
Once again, over a 12-game average, the 14900K is eating 130 watts more than the 7800X3D system while being (on average) 5% slower.

Why would anyone spend all that money on the 14900K only to then limit it even further? Why would you not just buy the 7800X3D instead?
The guy is apparently not talking about games. You think his coworker is thermal throttling with an AIO while playing games? Apparently he is talking about MT workloads, in which case even at 200W the 14900K is literally twice as fast as the 7800X3D.

Regarding gaming, are you playing at 1080p with a 4090? At 4K, 85% of the power draw of your system is your GPU. All CPUs will consume very similar amounts of power. In fact, I just tried a 12900K at 4K in Cyberpunk with PT + DLSS Quality on a 4090, and it averages around 65W. Does the X3D consume less? Probably. But how much less, 10 watts? 15? Wow, that's really great... Meanwhile the 4090 is hitting 480 watts, and the total power draw including the monitor and after PSU losses is probably over 700 watts, but hey, those 10 watts are really going to save me.
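Running the same kind of arithmetic on those numbers: a sketch where the GPU and CPU wattages are the figures quoted above, while the "rest of system", monitor, and PSU efficiency values are assumptions.

```python
# Share of total wall draw that a 10-15 W CPU saving represents at 4K.
# GPU and CPU figures are the ones quoted above; the rest are assumptions.

gpu_w = 480               # 4090 draw quoted above
cpu_w = 65                # 12900K draw quoted above
rest_w = 40               # board, RAM, fans, SSD (assumption)
monitor_w = 45            # assumption
psu_efficiency = 0.88     # assumption

wall_draw = (gpu_w + cpu_w + rest_w) / psu_efficiency + monitor_w
cpu_saving_w = 15                                  # hypothetical X3D saving
saving_at_wall = cpu_saving_w / psu_efficiency

print(f"Estimated total wall draw: {wall_draw:.0f} W")              # ~710 W
print(f"A {cpu_saving_w} W CPU saving (~{saving_at_wall:.0f} W at the wall) "
      f"is {saving_at_wall / wall_draw:.1%} of that")                # ~2.4%
```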
 
The guy is apparently not talking about games. You think his coworker is thermal throttling with an AIO while playing games? Apparently he is talking about MT workloads.
Man, this is utterly painful. I'll screenshot it then: (screenshot attached)
Where does it say this graph isn't about games? Where does it say it's specifically MT workloads?
You are both wrong; the 14900K does eat substantially more power when gaming, while also being slower. There are no ifs or buts, that is a fact.

I won't be responding past this post; until you're able to grasp the basic facts in this article, there's nothing further to discuss.
 
Man, this is utterly painful. I'll screenshot it then: (screenshot attached)
Where does it say this graph isn't about games? Where does it say it's specifically MT workloads?
You are both wrong; the 14900K does eat substantially more power when gaming, while also being slower. There are no ifs or buts, that is a fact.

I won't be responding past this post; until you're able to grasp the basic facts in this article, there's nothing further to discuss.
You're not missing much. Strawman is a well-known Intel fanboi. So is Antsu, apparently. I'll never understand it, but some will always meat-shield for corporations for zero compensation. I do love that their solution to the overheating that totally doesn't happen when gaming is to cut the TDP down to 200W, kneecapping the performance of the K chip.
 
Man, this is utterly painful. I'll screenshot it then: (screenshot attached)
Where does it say this graph isn't about games? Where does it say it's specifically MT workloads?
You are both wrong; the 14900K does eat substantially more power when gaming, while also being slower. There are no ifs or buts, that is a fact.

I won't be responding past this post; until you're able to grasp the basic facts in this article, there's nothing further to discuss.
I'm not talking about the graph, but about the post I replied to.
 