AMD Ryzen 7 9800X3D Review: The New Gaming CPU King

Testing a major gaming CPU only @1080p is a farce, but I guess including other resolutions would take a bit of the shine off that super duper amazing 11% uplift. Smh.

It's undeniably a great processor, but the X3D hype train is really off the charts. These are not cheap CPUs (especially once you factor in new platform costs), and the real uplift only comes with a high-end card (and not always even then), so it's not as clear cut as these reviews suggest.

That's before we even get to the current CPU market situation, which is ridiculous. I'm not saying it's exactly AMD's fault, but it just seems like the X3D chips are made of unobtainium. The new one costs an arm and a leg, and the old ones either paradoxically do too, or are just unavailable.

Back in the day, it was normal for new hardware to bring the prices of older stuff down, but that just isn't happening anymore. Also, a reasonable second-hand market seems to have evaporated.
There’s always one 🙂 They test at 1080p to minimise GPU bottlenecks.
 
My take on this is that the 7800X3D remains the star. Using 33% to 60% more power to achieve 11% more performance doesn’t sound that great to me. Hopefully any remaining 7800X3D stock will be sold off cheaper again 👍
 
IKR, I’ve been asking for performance per watt charts for years, but nope, they just don’t care.
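For what it's worth, the efficiency math is easy to run yourself. Here's a back-of-the-envelope sketch in Python using the rough +11% performance / +33-60% power figures quoted above (illustrative numbers, not measured data):

```python
# Relative performance-per-watt from the rough figures quoted above.
# These numbers are illustrative, not measurements.
perf_gain = 1.11  # 9800X3D vs 7800X3D, overall gaming uplift

for power_gain in (1.33, 1.60):
    perf_per_watt = perf_gain / power_gain
    print(f"+{(power_gain - 1) * 100:.0f}% power -> "
          f"{perf_per_watt:.2f}x perf/W "
          f"({(1 - perf_per_watt) * 100:.0f}% worse efficiency)")
```

By that crude measure the new chip would be roughly 17-31% less efficient in games, which is exactly the kind of chart being asked for.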
 
Since I've gone all in on Linux, AMD GPUs are my only option. Nvidia said they're going to work on improving their Linux drivers, but I'll believe it when I see it. So, for the foreseeable future, AMD GPUs it is. I see ray tracing as a disappointment outside of Cyberpunk, and I do play a lot of CP. Aside from that, I get ~90 fps in most games at 4K with my 6700XT. I'm hoping that the 8800XT will give 4080 levels of performance for ~$500. AMD doesn't need a flagship, they need a midrange monster.
Why is AMD your only option on Linux? We’ve been using NV on several hundred workstations and servers (up to 10 GPUs per node) for a decade, and they work just fine.

I’d go as far as to say NV drivers are much easier to install and upgrade than AMD ones.
 
Because I have a gaming desktop. If all I wanted to do was build compute clusters then, yeah, NV is great. I want to use my computer to argue with strangers on the internet and play videogames while also having a stable experience. That's difficult with Nvidia GPUs (on Linux).

Also, it's not just AMD GPUs; AMD chipset features and Linux don't get along the best. It's been my experience that Intel setups with AMD GPUs provide the best experience on Linux. That said, I still have 5 AMD systems on the rack in my homelab and my desktop is a 3800X, but my laptop is an i5-11300H and that has by far been the most stable.

Still, many of the problems I had with Linux gaming on my 1080 Ti have been non-existent on the 6700XT.
 
Good to see an improvement finally, but it's too expensive to justify an upgrade from a 5800X3D on AM4. When you add the very high cost of a motherboard and memory, it's a lot of money for the gain you get. Close, but I think I'll wait another generation.
 
Interesting, thanks for the response. So I'd take that advice to mean, check your GPU usage, and if it's mostly pegged at 99% you've got enough CPU, and if it's not, then you don't. I should probably start paying more detailed attention, but my general sense is that I'm usually not that high. On the other hand, my CPU is almost never near 99% either, certainly not as an all-core total, but I think usually not on any individual core either. Maybe platform bandwidth (not sure if that's the right word but I mean general ability to move data around the system) is a limiting factor?

You also need to consider that even multithreaded games will typically have the majority of the load on only a few threads. So your CPU usage, which is the aggregate across all cores, might only say the CPU is 50-60% loaded, but the primary thread of the game is maxing out a single core and is therefore bottlenecking the framerate. This is especially the case for older game engines that are less capable of evenly distributing the load across multiple threads.

So, just because your CPU isn't at 99%, doesn't mean you aren't CPU limited in a particular game. You really need to look at benchmarks from other reviewers using your GPU to see what your GPU could provide if you had a faster processor.
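If you want to check this on your own machine, here's a minimal sketch (assuming Python 3 with the psutil package installed, which is not something the post itself mentions) that contrasts the aggregate figure with the busiest single core:

```python
# Contrast aggregate CPU usage with the busiest single core.
# Assumes Python 3 with psutil installed (pip install psutil).
import psutil

# Sample per-core utilisation over a 1-second window while your game runs.
per_core = psutil.cpu_percent(interval=1.0, percpu=True)
aggregate = sum(per_core) / len(per_core)

print(f"aggregate CPU usage: {aggregate:.0f}%")
print(f"busiest core:        {max(per_core):.0f}%")
# An 8-core chip with one core pegged at 100% and the rest at ~45%
# reports only ~52% aggregate -- yet that one core caps your framerate.
```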
 
Your CPU usage is irrelevant. It can be at 10% and still bottleneck your GPU. It's your GPU usage that matters.
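If you'd rather log GPU usage than eyeball an overlay, here's a minimal sketch assuming an Nvidia card and the nvidia-ml-py bindings (an assumption on my part; AMD users would need a different tool):

```python
# Poll GPU utilisation for a few seconds while a game is running.
# Assumes an Nvidia GPU and nvidia-ml-py (pip install nvidia-ml-py).
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

for _ in range(10):
    util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu
    # Sustained ~99% -> GPU-bound; noticeably lower -> likely CPU-bound.
    print(f"GPU utilisation: {util}%")
    time.sleep(1.0)

pynvml.nvmlShutdown()
```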
 
My take on this is that the 7800X3D remains the star. Using 33% to 60% more power to achieve 11% more performance doesn’t sound that great to me. Hopefully any remaining 7800X3D stock will be sold off cheaper again 👍

In terms of value maybe; in terms of performance, absolutely not. The 9800X3D tears the 7800X3D apart, and application performance is up 20-30%, which is massive, mostly due to running 5.2 GHz on all cores at all times, with the option to hit 5.4-5.6 GHz with an OC and beat it even more.

The +11% is the overall average across games. Some games show up to +30% improvement in avg. fps and 1% lows.

This chip is obviously for people who need the absolute best gaming performance and don't want to compromise on performance outside of games. The 5800X3D/7800X3D is a big compromise here; the 7800X3D loses to the Ryzen 7700 non-X in applications.

This was the huge problem with earlier 3D cache chips, and 2nd gen 3D cache fixed the clockspeed issues. Some games also prefer clockspeed over cache, and the 7800X3D lagged behind there; the 9800X3D delivers on both clockspeed and cache. Best of both worlds. The new gaming king has arrived, and this time with no drawbacks.

Chip gets 90/100 for a reason. Best rated CPU release this year. Best gaming CPU on the planet.

The 9800X3D uses slightly more watts but runs cooler anyway, because the 3D cache is placed at the bottom and the cores can use the IHS properly to shed heat. This is what AMD wanted 3D chips to be from the start, yet couldn't pull off until now, as it required a complete redesign.

I own a 7800X3D and will upgrade. I can use the 7800X3D in my HTPC anyway.
I will be running 5.4+ GHz on my 9800X3D for sure. That is like 1 GHz higher than what my 7800X3D can drop to in demanding games and applications.
 
My ideal setup would be: 9800X3D + 8800XT + 1440p OLED 240 Hz monitor. I'm still wondering whether the 9800X3D is worth it for 1440p gaming, or whether I should opt for a cheaper CPU. Anyone using a 7800X3D for 1440p who can give me some advice?
 
9800X3D no doubt if you want to maximize fps and make full use of that 240 Hz monitor.
 
They test at 1080p to minimise GPU bottlenecks.
Whoa, such wisdom! Who'd've thunk?
Since you are clearly in-the-know, maybe you could answer a couple of questions that bother me a bit:
-who exactly is this review for, apart from the handful of e-sports people and the "muh science" crowd? Seeing as it's not much use for me or other people who'd like to hear about real-life scenarios and decide if it really is a worthy upgrade?
-how come other sites such as Digital Foundry or TechPowerUp do bother with including higher rez tests? I presume they are wasting their time?
 
Thanks for the good review, again.. you're happy to call a spade a spade, and some shovels.. well a deserving turd... which I appreciate. It seems like this one delivers - a likely future upgrade for me.

(no I'm not being sarcastic! ha)
 
I was able to secure one before Microcenter opened.

Update: it sold out before the store even opened.
AMD's website too.
 

Whoa, such wisdom! Who'd've thunk?
Since you are clearly in-the-know, maybe you could answer a couple of questions that bother me a bit:
-who exactly is this review for, apart from the handful of e-sports people and the "muh science" crowd? Seeing as it's not much use for me or other people who'd like to hear about real-life scenarios and decide if it really is a worthy upgrade?
-how come other sites such as Digital Foundry or TechPowerUp do bother with including higher rez tests? I presume they are wasting their time?

What on earth are you babbling about? It's been explained a million times, including in the post you quoted, that low-resolution game testing separates out the CPUs' performance. At 4K you can assume they will all be identical or thereabouts, because at that point the GPU is the bottleneck.

TechPowerUp's review shows the difference between the 9800X3D (fastest) and 9950X (slowest) at 2% in their 4K charts, so it's a pointless test that just takes time. They only include it for completeness, so they don't have to spoon-feed this explanation every time they do a CPU review.

Finally, in case you've missed it, this is a tech site that reviews the latest tech for the "muh science" crowd, as you eloquently describe it.
 
It doesn't matter how many times you repeat that bottleneck mantra. It still doesn't explain why the results which actually matter to most people are omitted. If you want to claim we should "assume" it, then your desire to be in with the science crowd somehow doesn't gel. I thought scientists base their findings on facts, not assumptions? Advising people to just believe that 4K results will forever be the same, just because the reviewers couldn't be bothered to do their job, is quite something.

It also boggles the mind that a test which shows that upgrading to the 9800X3D could actually be a waste of money for quite a lot of gamers - you realize that a lot of people who spend this kind of money probably don't game at 1080p? - is described as a waste of time. Somebody send a memo to those dum-dums at TS and DF, please enlighten them that their efforts are "pointless".

This is even before we get to 1440p, which your deep analysis of my post somehow omits. Should I also assume it's pointless? And if so, how does that square with the claim that this is a "new gaming CPU king"? Shouldn't such royalty deliver massive uplifts across all resolutions?

The king is actually naked, and all you and your fellow scientists can do is repeat a kindergarten-level excuse for extremely flawed reviewing methods (the reason I don't even mention the 4090 - previously an outlier - here is because it will soon become much more normalized and affordable).
 
It doesn't matter how many times you repeat that bottleneck mantra. It still doesn't explain why the results which actually matter to most people are omitted.
4K gamers are the smallest crowd, not the largest.

Anyway, the fact that you can't work out whether your GPU will be your bottleneck is not the reviewers' fault. That's why it's mostly a complete waste of time to bother testing at higher resolutions, and as the results from the sites you quoted show, it doesn't tell you anything.

If you're looking to the future, when you upgrade your GPU for more frames at 4K, then buying the weaker CPU now just means you'll be bottlenecked at the CPU rather than the GPU once that upgrade happens, so you might as well buy the better CPU now.

"Oh, but how would I know which CPU is better" I hear you cry, well if we test at a lower resolution... We can find out which ones are better when we remove the GPU as a bottleneck...
 
Hey, brother, clearly reading comprehension is not your strongest suit, so perhaps you could go back, read what I said again, and tell me more about how 1440p gamers are also "the smallest crowd"? How about the percentage of gamers who spend a fortune on a 4090 and a 9800X3D and then play at 1080p, have you got any numbers on that crowd?

The "advice" (lol) to "just buy the better CPU now" is so ridiculous that I'm not sure what to say. You mean it's better to spend nearly 500 USD now, even if it will give me no visible benefits if I'm a 4K gamer, and perhaps small ones if I'm a 1440p one? Are you aware that prices go down (well, at least they used to), and that new CPUs keep being released? Here's another radical idea: people have other cards than the 4090! Crazy, I know! And guess what, these CPUs work much better with high-end cards and yield even smaller gains with weaker ones. But I suppose I should still place my order pronto, because the "New King" "obliterates" the competition.

If you're too blind to see that testing at higher rez actually does say something - namely whether it's worth upgrading or not *right now*, which is kinda the main point of these articles - then there really isn't much to add. Maybe also that not all of us have cash to splash around and don't care about losing two or three hundred smackers here or there, but hey, who cares.
"Oh, but how would I know which CPU is better" I hear you cry, well if we test at a lower resolution... We can find out which ones are better when we remove the GPU as a bottleneck...
You're kinda right, this is so daft that it does nearly make me cry :) If so, why not test at 720p, or better yet, just fire up Quake at 640x480? Your boTTlEnEck would be removed even more this way.

Ok, this is getting really silly now, so for the next genius who'd like to reply, please don't bother unless you can at least try to answer the following questions without any of the mental gymnastics I've already replied to here previously:

-who exactly is this review for?
-how come other sites do include higher rez tests?
 
To me the review provides helpful information on its own and I'm glad to have it. It just doesn't provide all of the information I could use when it comes time to plan a balanced system.

I do think it might benefit from just a sentence or two to warn less savvy readers just passing through how to best interpret it when it comes to practical buying advice - maybe a standard link to a reusable article could serve well.

The issue those of us who are not pro-gamers focused on high FPS at low settings face is that without access to all the different cpu + gpu pieces to mix and match test for ourselves, it's not trivial to understand how to allocate a given budget across a CPU + GPU. This is where a reviewer with access to all the parts (and a whole benchmarking infrastructure) is really in a position to do a service that is not easy to replicate for ourselves.

For example, we know that at low settings, the CPUs will turn in markedly different performances. It's been stated by several in the comments that at high settings/4K, all the CPUs would turn in very similar performances. So we know about the extremes - but where are the breakpoints in the middle? It would be interesting to see cross CPU+GPU analysis done either by budget or by target graphics level.

Until then, my practical takeaway for non-pro gamers is that it probably makes sense to pick your CPU primarily based on your non-gaming needs and assume your GPU is still going to be the bottleneck whichever CPU that turns out to be (within reason of course.)
 
It doesn't matter how many times you repeat that bottleneck mantra. It still doesn't explain why the results which actually matter to most people are omitted.
The results that most people want to see in a CPU review are how fast the CPUs are. Nobody cares about how fast the 4090 is, which is what you'd be testing if you used 4K. How hard is that to grasp, my man? I really don't get it.

What useful information will a 4K CPU test give you? Seriously, just answer the question if you please.
 
Even a 5800X3D has 98% of the performance of the 9800X3D
[TPU chart: relative gaming performance, 3840x2160]


But on the minimum FPS average, the 7800X3D is 2-3 fps better than the stock 9800X3D.

[TPU chart: minimum fps, 3840x2160]


So if I have the money for a 4K display and GPU, the current 5700X3D should do the job just fine.

Now time to rob a bank...
 
Yep, at 4K the minimum-fps delta between the 5800X3D and the 9800X3D is around 3%, and the relative averages are similar. At 1080p the minimum framerate delta is about 20%, while the averages are slightly lower at 17%. I wonder how much a 5090 would improve the deltas; looking at the 720p performance, the difference in TPU's review is about 33% on the relative average. We might see a similar delta move up one resolution if the 5090 is at least a 50% improvement over the 4090.
 