Is the Ryzen 9800X3D Truly Faster for Real-World 4K Gaming?

I don't see a bunch of people complaining that the wrong CPU or brand was handed the crown. To me, this discussion is mostly about the lack of additional information answering "and how much of that potential difference could I expect to realize for my actual use case?"


I must be pretty effing stupid because I have no idea how to extrapolate from 1080p/4090/low results to 1440p/3080/high going just on paper. Is there a formula for that? I don't believe it's as simple as "none because you're entirely gpu limited" nor as dramatic as the single-case charts might suggest. Where exactly in-between is not an equation I can solve for myself.

Also, I feel important reviews like this get read not only by regulars who follow all this as a hobby, but also by the drop-in crowd who are here every few years when it's time to buy. They're not stupid, but they could use the context and background that regulars might take for granted.
You can't extrapolate case-specific data from a single data point. There needs to be at least a second data point to establish a difference and a trajectory, along which you can then estimate results for a variety of systems and use cases that weren't included in the base testing.

That's the point being made when people say 1080p benchmarks on their own are useless to people wondering if they should upgrade. To be able to extrapolate 1440p and 4K performance, there needs to be testing at those resolutions using the same hardware and quality settings as the low-res test, so that the performance drop-off can be seen. People can then compare the observed GPU-limited drop-off to their own situation and infer whether things would be better or worse for them (and it will almost always be much worse, unless someone has the same system the test data is from, or is using the data in the future after newer, more powerful GPUs are available).
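For what it's worth, the rough rule of thumb behind all of this can be written down. The sketch below (Python, with entirely made-up numbers) just encodes "your frame rate is roughly capped by whichever ceiling is lower": the CPU-limited figure from a low-resolution CPU review, or the GPU-limited figure for your own card at your resolution and settings from a GPU review. It ignores driver overhead, frame-time spikes and game-to-game variation, so treat it as a ballpark estimate rather than a formula any reviewer endorses.

```python
# Rough back-of-the-envelope model, not any reviewer's methodology.
# Assumption: average fps is capped by whichever component is slower,
# so expected fps ~= min(CPU-limited fps, GPU-limited fps).
# All numbers below are hypothetical placeholders, not measured data.

def estimate_fps(cpu_limited_fps: float, gpu_limited_fps: float) -> float:
    """Estimate real-world fps from two 'ceilings':
    - cpu_limited_fps: what the CPU manages when the GPU is taken out of
      the picture (e.g. the 1080p/4090 number from a CPU review)
    - gpu_limited_fps: what your own GPU manages at your resolution and
      settings (e.g. from a GPU review of that card)
    """
    return min(cpu_limited_fps, gpu_limited_fps)

# Hypothetical example: a CPU review shows 180 fps at 1080p with a 4090,
# while a GPU review shows a 3080 managing 95 fps at 1440p/high in the same game.
print(estimate_fps(cpu_limited_fps=180, gpu_limited_fps=95))  # ~95, GPU-bound
print(estimate_fps(cpu_limited_fps=110, gpu_limited_fps=95))  # still ~95
print(estimate_fps(cpu_limited_fps=70,  gpu_limited_fps=95))  # ~70, now CPU-bound
```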
 
Something I find weirdly normalized among recent reviews of top-end gaming CPUs is how many of those featuring only 1080p benchmarks say nothing at all to inform viewers that the displayed gaming performance advantages of the new CPU won't be present at higher, typical-use resolutions. Hyping up the CPU's 1080p performance without telling people that that performance is basically academic, and that the actual gaming performance difference will shrink to around 0% at 1440p even with the most powerful, obscenely expensive GPU, really is lying by omission. In effect, it's no different than taking a kickback from the manufacturer to upsell viewers on something they might not be willing to spend more money on if they were aware they're paying for gaming performance they won't experience.

The primary purpose of CPU release reviews is to inform and guide potential purchasers of the product. Leaving out the part that actually tells them what value and performance they'll get at normal use-case resolutions falls short of that job.

BTW, I think the 9800X3D is a really good CPU - but not because of a small gaming performance increase that won't actually be experienced by ~98% of purchasers for the next 6+ years. I think it's a good CPU because it offers significantly more productivity performance than the 7800X3D while delivering the same typical-use gaming performance (though while consuming significantly more power than the 7800X3D, which is disappointing). I also think it's poorly priced, given that it's not really offering much to its target market (gamers) over what the 7800X3D offered 1.6 years ago. And it's a bit disappointing considering that it comes 1 year and 7 months after the 7800X3D, and that it will be the only potential 7800X3D upgrade we get before its replacement arrives a rumoured 2 to 2.2 years from now. While it's a good CPU, the 9800X3D represents underwhelming progress for the span of time it's filling, leaving us with nothing exciting for a very long period - and Intel comparisons have no bearing on that point.
 
Am I the only one who finds the writing style in this article incredibly hard to read? All the preaching, the repetition, the patronizing tone—even at one point describing anyone who doesn’t buy the arguments presented as having some kind of personality disorder...

That kind of arrogance is usually reserved for Chuck Norris memes :)
 
"In contrast to GPU bottlenecks, which can almost always be alleviated by compromising on visuals, there's little you can do when you hit a CPU bottleneck – you've essentially reached a hard limit."

You are almost there, Steve, this is exactly what we want: we want to know where the bottlenecks are, so that we know when our CPU is no longer competitive and it's time to upgrade. It's not about seeing all the 4K results, it's about where the edge is. It doesn't even need to be an every-CPU-review thing; how about once per generation after the new product comes out? It would be nice and useful for a large population.
 
I must be pretty effing stupid because I have no idea how to extrapolate from 1080p/4090/low results to 1440p/3080/high going just on paper. Is there a formula for that? I don't believe it's as simple as "none because you're entirely gpu limited" nor as dramatic as the single-case charts might suggest. Where exactly in-between is not an equation I can solve for myself.

Yes, there's a formula for that: YouTube.

Exactly zero of the major sites review CPUs using every GPU available.
Exactly zero of the major sites review the exact games you play or will play in the future.

But someone on YouTube may have covered something close to your setup, if not the exact one.

Nobody mentioned anything about extrapolating from very different hardware because obviously that's impossible. However, it's reasonable for someone who's about to spend $600 or $800 or $1200 on a new PC to:

read CPU reviews and see how a 7600/13600 performs
read GPU reviews and see how a 3080 performs

and the like, and see if the fps in their games or similar ones are high enough in each. You well know there are no perfect answers, but there are decent ballpark answers, and that's where the 3060 buyer I mentioned has a very good chance of figuring out that a 7800X3D is a waste of money, whereas a 7600 or 13600 is not.
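A minimal sketch of that ballpark exercise, with hypothetical numbers standing in for what you'd read off the reviews (none of these figures are real measurements):

```python
# Hypothetical ballpark check for the "3060 buyer" scenario described above.
# CPU ceilings are the kind of numbers you'd read off a 1080p CPU review;
# the GPU ceiling is what a 3060 manages at the buyer's resolution/settings
# in a GPU review. All values here are invented for illustration.

gpu_ceiling_fps = 75  # hypothetical: RTX 3060 at 1440p/high in some game

cpu_ceilings_fps = {  # hypothetical 1080p CPU-review numbers
    "Ryzen 5 7600": 150,
    "Core i5-13600K": 160,
    "Ryzen 7 7800X3D": 210,
}

for cpu, cpu_ceiling in cpu_ceilings_fps.items():
    expected = min(cpu_ceiling, gpu_ceiling_fps)
    headroom = cpu_ceiling - gpu_ceiling_fps
    print(f"{cpu}: ~{expected} fps expected, {headroom} fps of unused CPU headroom")

# Every CPU in this made-up table already clears the 3060's ceiling, so the
# extra money for the X3D part buys headroom this GPU can't use yet.
```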

Also, I feel important reviews like this get read not only by regulars who follow all this as a hobby, but also by the drop-in crowd who are here every few years when it's time to buy. They're not stupid, but they could use the context and background that regulars might take for granted.

Reading more will help that along, but the suggestion here to get a more typical gamer's perspective is a good one. At the end of a 4090 review, it could be put in a system with an R5/i5 to show that it makes little difference in many games. And similarly, in a 9800X3D review, it could be put in a 3070/6700 XT system to show that it will also make little difference in many games.
 
The video/article is for those who said in the poll that they play 4K games upscaled, not the 9% who said they play native. Can't please everyone.

It isn't 4K upscaled, it's 2227x1253 upscaled.

Almost all games show a huge benefit in 1% lows on an X3D chip @ 4K. Seems like a winner to me.
Again, TechSpot didn't use 4K.
On another site, the difference in lows between a 5600X and a 9800X3D is 7.2 fps at native 4K.
That was using the average lows across 10 games.
 
Many people who want a CPU like this will pair it with their RTX 4090. You guys think we will play at 1080p on our 4090s? And we demand 4K tests so we can decide if we need to skip this generation's AMD CPU and stick with our 7950X3D/7800X3D.
As Steve mentioned, you may well use a 4090 at 1080p if you're into competitive gaming or some other specific usage. Not everyone is the same.
"We demand 4K tests…" says it all for me.

I game at 4K. I enjoy Steve's CPU and GPU reviews, and they tell me everything I need to know about if and when to upgrade.
If people still aren't satisfied with the 1080p testing explanations, maybe they should look elsewhere.

Or maybe you could DM Steve every time a new CPU is released and he can tell you if you should upgrade yet?
 
Doing the research at 1080p is fine. Publishing articles that leave non-sophisticated readers with the impression that purchasing a shiny new CPU will have a drastic impact on their gaming experience (when it won't) is not.

Most of the tech reviewers who do the 1080p testing will say all the appropriate disclaimers if you happen to catch the right place in the right article or video. Not all publications will ensure that message has much chance of getting through though. For example, failing to balance an article done mostly at 1080p with even one single chart depicting the more muted differences in other use cases could strike some as deceptive.
You have a reasonable point there.
Perhaps the compromise is just one graph showing 4K results, and one showing the 1080p results using the current market-leading GPU - is that currently the 3060?
 
It's still a win. The X3D chips are something special when it comes to gaming.

Yes, it has a small victory at 4K, but if your current system is a 5600X, is it worth spending $700 on a new system for that 7 fps at native 4K?
Just so you know, I am currently using a 5800x3d, and next year I plan to upgrade my system to a 9800x3d.
So I have nothing against it.

I just find that you get people who read reviews on one site and think, "Awesome, 30 fps extra in games,"
not taking into account that the tests are "only" done at 1080p.

Why can't 4 games at native 4K be included in the tests?
Say two heavily GPU-bound games and two lightly GPU-bound ones; that would give the reader more information to help make a more informed decision.
 
I think you have to reevaluate why gamers want 1440p and 4K benchmarks in relation to 1080p results. In an ideal world where money is no object, obviously everyone would go for the best CPU. But the fact is these CPUs are so darn expensive these days.

The fact is, consumers who buy CPUs purely to game want to see how low they can get away with. If the margin between the best and most expensive CPU and a mainstream CPU that is half the price is small enough at 1440p and 4K, the mainstream CPU is what these consumers will go for.

Where the best CPUs will shine is in productivity workloads.
 
I find this website highly deceptive. I'm commenting mainly to warn the newer people who might not see what you're doing. I've been in this game for close to 30 years.

This article was written with half-truths or outright lies.

"While some reviewers include 1440p and 4K benchmarks, they do not emphasize them." Most not some. This site is not part of the majority but a minority that hides 4k benchmarks. I don't go to all the sites, but the ones I go to, all test at all the 3 common resolutions. Another half-truth is that they don't emphasize them while leaving the fact that they don't emphasize any of these resolutions.

Here is an excerpt that one can truly appreciate as a reader.

"At the highly GPU-bound 4K scenario, even with an RTX 4090, the differences between the top processors are minimal, and they will all offer a fantastic gaming experience. If you don't have to own the best, and want to smartly manage your budget, it might make sense to opt for a more affordable CPU and spend the savings on a faster graphics card, where you are getting more FPS for your money. Interesting gaming CPUs are Ryzen 9600X ($250), 7700X ($280), 14600K ($260), 13700K ($280) these are all considerably more affordable than the $480 9800X3D, of course not nearly as sexy, no bragging rights, but they'll get the job done, and you can buy a one-tier faster GPU."

Regarding your headline: 4K tests? Please point me to the 4K tests in this article. Also, show me how I can duplicate your tests by turning on DLSS on my Radeon 6950 XT. This is somehow an even more pointless test than the original one with just 1080p results.

Just because someone wants to see the full picture, doesn't mean they are shilling for the other company. This isn't politics.

I got a Ryzen 5600 a year ago for $110 that's 7% slower than the "king of gaming" 3 generations later at 4K. I can easily afford to spend another $500 on a CPU, but why would I do that for a 7% gain with a 4090 that I don't have? The 4090 also offers such terrible value over a 4080 that I probably won't ever buy it either. Of course, I'd also have to get an entire AM5 platform, which will easily be over $1,000. For 7%? I can get a 4K OLED monitor or put that money towards a GPU. Why spend money on something that gives you nearly no gains?

Then you talk about the future. Let me warn everyone. Future proofing doesn't exist. When you're ready to upgrade to a 4090-level card 5 years from now to finally take advantage of your king of gaming, you will also replace your CPU with the newest king of gaming. You will again be stuck with untapped potential, waiting for that next-gen GPU to take advantage of it. It's going to be an endless cycle. All that extra potential performance will never get used. x86 CPUs have been stagnant. When AMD released something with more than 4 cores, that was a breakthrough for me and I got a 1600 right away. Since then, core counts are not increasing, frequencies are not increasing, and they're adding some cache to try to mask the stagnation. Don't buy into the hype.
 
Then you talk about the future. Let me warn everyone. Future proofing doesn't exist.

Don't be silly. Of course future proofing exists.

I still use a Haswell CPU (4790K) for gaming 10 years later, and yes, buying that specific CPU was a carefully calculated decision: I wanted it to last 10 years or more. It did.

There are many more people out there sitting on old hardware and old rigs, and of course future proofing was a major factor in their purchasing decisions.
 
I'd be really interested in an article that tries to find optimal CPU and GPU pairings from the last three or so generations. Start with a 3600X and a 12600K and figure out where they start to become a bottleneck. They clearly are one at 1080p with a 4090, but what about with a 3070? Maybe it would be easier to do it by GPU: if I have a 3070, what's the cheapest CPU that won't bottleneck it? If I put a 9800X3D in a PC with a 3070, I doubt I'm going to see significant gains even at 1080p.
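A rough sketch of what that per-GPU version of the exercise could look like, assuming you already have CPU-limited numbers from CPU reviews and a GPU-limited number from a GPU review (all figures below are invented placeholders, not test data):

```python
# For a given GPU ceiling, find the cheapest CPU whose own ceiling is at
# least that high (i.e., the CPU is no longer the bottleneck).
# Every number here is hypothetical, purely to show the shape of the exercise.

gpu_ceiling_fps = 100  # hypothetical: RTX 3070 at 1440p/high in some title

cpus = [  # (name, price in USD, hypothetical CPU-limited fps from a 1080p review)
    ("Ryzen 5 3600", 85, 90),
    ("Core i5-12600K", 160, 140),
    ("Ryzen 7 9800X3D", 480, 220),
]

suitable = [c for c in cpus if c[2] >= gpu_ceiling_fps]
cheapest = min(suitable, key=lambda c: c[1]) if suitable else None
print(cheapest)
# ('Core i5-12600K', 160, 140): in this made-up table the 3600 falls short,
# while the 9800X3D's extra headroom goes unused with this GPU.
```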
 
Without native 4K this test was truly pointless. If you want to highlight the importance of a CPU, use games like Cities: Skylines.
It's the same thing, though. The principle is easy: if you run 4K on a card that can keep the CPU busy, the CPU matters. Go low settings in an FPS game at 4K to get a high frame count and the CPU will matter. Run high settings at 4K, where your card will push 60-65 fps, and your CPU won't make much difference. It's about use cases - most people "will" use a form of upscaling, and the graphs show that the CPU helps quite a lot.
Just do whatever is right for you - tests and reviews are there to give you the ability to make educated decisions based on what you need.
 
Something I find weirdly normalized among recent reviews of top-end gaming CPUs is how many of those featuring only 1080p benchmarks say nothing at all to inform viewers that the displayed gaming performance advantages of the new CPU won't be present at higher, typical-use resolutions. Hyping up the CPU's 1080p performance without telling people that that performance is basically academic, and that the actual gaming performance difference will shrink to around 0% at 1440p even with the most powerful, obscenely expensive GPU, really is lying by omission. In effect, it's no different than taking a kickback from the manufacturer to upsell viewers on something they might not be willing to spend more money on if they were aware they're paying for gaming performance they won't experience.

The primary purpose of CPU release reviews is to inform and guide potential purchasers of the product. Leaving out the part that actually tells them what value and performance they'll get at normal use-case resolutions falls short of that job.

BTW, I think the 9800X3D is a really good CPU - but not because of a small gaming performance increase that won't actually be experienced by ~98% of purchasers for the next 6+ years. I think it's a good CPU because it offers significantly more productivity performance than the 7800X3D while delivering the same typical-use gaming performance (though while consuming significantly more power than the 7800X3D, which is disappointing). I also think it's poorly priced, given that it's not really offering much to its target market (gamers) over what the 7800X3D offered 1.6 years ago. And it's a bit disappointing considering that it comes 1 year and 7 months after the 7800X3D, and that it will be the only potential 7800X3D upgrade we get before its replacement arrives a rumoured 2 to 2.2 years from now. While it's a good CPU, the 9800X3D represents underwhelming progress for the span of time it's filling, leaving us with nothing exciting for a very long period - and Intel comparisons have no bearing on that point.
Don't really agree - the 9800X3D runs a lot cooler, gives much better performance for productivity, and due to the new placement of the X3D cache, they unlocked the CPU for overclocking. I've seen some solid tests where they were able to push the core clock to a stable 5.5-5.6 GHz with a quality AIO. At that point it's pushing way ahead of its previous iteration, which is hard-locked to a frequency.
 
I wonder if someone could make an API injector that tells the game code that a frame is done rendering instantly. That would totally take the GPU bottleneck out of the equation, aside from GPU driver overhead.
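As a thought experiment, here's a toy frame-time model (Python, illustrative numbers only, not an actual API hook) of what such an injector would expose: since a frame can't finish faster than the slower of the CPU's work and the GPU's work, forcing the "frame is done" signal to return instantly leaves only the CPU-limited figure, plus whatever driver overhead remains.

```python
# Toy model: a frame is paced by whichever side (CPU or GPU) takes longer.
# Shrinking the GPU's share toward zero is roughly what an "instant present"
# injector would do, exposing the pure CPU-limited frame rate.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    frame_ms = max(cpu_ms, gpu_ms)  # the slower side paces the frame
    return 1000.0 / frame_ms

print(fps(cpu_ms=5.0, gpu_ms=12.0))  # ~83 fps, GPU-bound as normally measured
print(fps(cpu_ms=5.0, gpu_ms=0.1))   # 200 fps, what an "instant present" would reveal
```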
 
Don't really agree - the 9800X3D runs a lot cooler, gives much better performance for productivity, and due to the new placement of the X3D cache, they unlocked the CPU for overclocking. I've seen some solid tests where they were able to push the core clock to a stable 5.5-5.6 GHz with a quality AIO. At that point it's pushing way ahead of its previous iteration, which is hard-locked to a frequency.
I wonder if you read the last paragraph of my post, where I said that I think the 9800X3D is a really good CPU because it offers a significant productivity performance increase over its predecessor.

The appreciable drop in temperature comes with an appreciable rise in power consumption, even while pushing the exact same FPS, which I think makes for a neutral shift in advantage between the 7800X3D and the 9800X3D.

But the overclocking ability is currently (and for the foreseeable future) irrelevant for gaming, when the CPU is already GPU limited for basically all purchasers of it, save the extreme, extreme few who'll buy an RTX 5090 when it releases. That's why I said the "small gaming performance increase that won't actually be experienced by ~98% of purchasers for the next 6+ years" isn't part of the reason why I think the 9800X3D is a really good CPU.
 
As I said in a previous post, with the coming next-gen GPUs, the 9800X3D's current lead of ~11% might grow to over 20%.
If it does, it will mean that TechSpot, HU, Steve, etc., have all been doing precisely what they've been talking down to others for wanting: giving us GPU-limited benchmarks by doing their CPU testing at too high a resolution. Then, by their own argument, they should actually have been providing only 720p-or-lower benchmarks in their reviews.

But the RTX 5000 series shouldn't offer any new 9800X3D gaming performance at 1440p or higher unless a person buys an RTX 5090. That's because, with an RTX 4090, the 9800X3D gets an average ~3% performance uplift over a 7800X3D at 1440p, and an RTX 5080 won't be as powerful as an RTX 4090. So, only RTX 5090 purchasers will see an advantage over gaming on a 7800X3D for at least the next couple of years. And who knows whether an RTX 6080 will be more powerful than an RTX 4090.

The Steam survey shows that less than 1% of users have an RTX 4090 two years after it released. So, expect that even 2 years from now, less than 1% of Steam users will see an appreciable advantage from a 9800X3D versus a 7800X3D. And if the rumoured price increase for the RTX 5090 to $2,000 - $2,500 USD is true, then there could be far fewer RTX 5090 owners in 2 years than there currently are RTX 4090 owners.

As I said in a previous post, going by the number of people who own which GPU, it'll likely be another 6+ years of future GPU releases before a significant number of gamers are able to experience an appreciable gaming performance uplift from a 9800X3D over a 7800X3D.
 
I actually look forward to the day that an Intel CPU edges out an AMD CPU in gaming.
I sense that all of a sudden 1080p will again become the end-all benchmark for a certain group.
 