Is the Ryzen 9800X3D Truly Faster for Real-World 4K Gaming?

How do you extrapolate from 1080p/4090/low results to 1440p/3080/high, going just on paper? Is there a formula for that? I don't believe it's as simple as "none, because you're entirely GPU limited," nor as dramatic as the single-case charts might suggest. Where exactly it falls in between is not an equation I can solve for myself.
You look at what the framerate is for 1080p/4090/low. This is the most fps you can get from this game with that CPU. You then find a review of your graphics card that includes the same game and look at what framerate your graphics card can achieve at 1440p/high. If this number is lower than the previous number, you will be GPU limited in that game; if not, you will be CPU limited.
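A rough sketch of that rule of thumb (all numbers below are hypothetical, just to show the shape of it): your expected frame rate is roughly the lower of the CPU-limited ceiling and the GPU-limited ceiling.

```python
# Sketch of the comparison described above, with made-up numbers.
# cpu_cap: fps from a 1080p/low CPU review with a 4090 (CPU-limited ceiling).
# gpu_cap: fps your GPU achieves in the same game at your target res/settings.

def expected_fps(cpu_cap: float, gpu_cap: float) -> tuple[float, str]:
    """Return the rough fps to expect and which component limits it."""
    if gpu_cap < cpu_cap:
        return gpu_cap, "GPU limited"
    return cpu_cap, "CPU limited"

# Hypothetical example: a CPU review shows 180 fps at 1080p/low with a 4090,
# while a GPU review shows your card managing 120 fps at 1440p/high.
fps, limiter = expected_fps(cpu_cap=180, gpu_cap=120)
print(f"~{fps} fps, {limiter}")  # ~120 fps, GPU limited
```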
 
As I said in a previous post, going by the number of people who own which GPU, it'll likely be another 6+ years of future GPU releases before a significant number of gamers are able to experience an appreciable gaming performance uplift from a 9800X3D over a 7800X3D.
Yes, this is true. The 7800X3D is a powerful gaming CPU, and I don't believe its owners are the target market for the 9800X3D. This has always been the case: a platform upgrade is not necessary until it is.
 
1440p would have been a better choice than 4K, since it's actually a useful resolution for gaming performance. Also 1% lows matter the most for a consistently fluid and responsive experience and that's where the X3D also shows dominance.
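For anyone unfamiliar with the metric, one common way to compute a "1% low" is to average the instantaneous fps of the slowest 1% of captured frames. A quick sketch (with invented frame times) of why the average can look fine while the 1% low exposes stutter:

```python
# One common way to compute "1% low" fps from captured frame times (in ms):
# average the instantaneous fps of the slowest 1% of frames.
def one_percent_low(frame_times_ms: list[float]) -> float:
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    count = max(1, len(worst) // 100)              # slowest 1% of samples
    return sum(1000.0 / t for t in worst[:count]) / count

# Hypothetical capture: ~120 fps frames with a handful of 25 ms hitches.
frames = [8.3] * 990 + [25.0] * 10
print(f"avg fps: {1000 * len(frames) / sum(frames):.0f}")  # ~118 fps
print(f"1% low : {one_percent_low(frames):.0f} fps")       # ~40 fps
```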
 
Unless you're a top global player (like an F1 driver in shooter competitions) this topic is irrelevant. Most players cannot perceive a difference beyond 60 FPS, as this does not align with the average "refresh rate" of the human eye. Only individuals with highly trained or exceptional vision can detect differences above this threshold. Consider LED lamps that flicker between 50 Hz and 100 Hz. If you can't see the flickering of LED lamps, you likely won't perceive any difference in games either. Don't fall for industry propaganda designed to push you into buying the latest, overpriced products.
 
Unless you're a top global player (like an F1 driver in shooter competitions) this topic is irrelevant. Most players cannot perceive a difference beyond 60 FPS, as this does not align with the average "refresh rate" of the human eye. Only individuals with highly trained or exceptional vision can detect differences above this threshold. Consider LED lamps that flicker between 50 Hz and 100 Hz. If you can't see the flickering of LED lamps, you likely won't perceive any difference in games either. Don't fall for industry propaganda designed to push you into buying the latest, overpriced products.
Even I instantly notice the difference between 60 and 120 fps - but it's much more in the fluid motion than the perceived refresh rate. The PC gamer world moved past 60 Hz monitors years ago for good reason. Anything above 120 fps, however, is hard to detect unless you spend time at each refresh rate.
 
I don't know how many times I need to post this image...

4K upscaled is not 4K. If you upscale, your render resolution is lower, and CPU bottlenecks are going to be a reality with a 4080, an XTX, or a 4090.

[Image: DLSS render resolutions by quality mode]
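For context, the commonly cited DLSS per-axis scale factors are roughly 0.667 (Quality), 0.58 (Balanced), 0.5 (Performance), and 0.333 (Ultra Performance). A quick sketch of what that means for a 4K output:

```python
# Approximate per-axis DLSS scale factors (commonly cited values).
DLSS_SCALES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the game actually renders at before upscaling."""
    scale = DLSS_SCALES[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in DLSS_SCALES:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K {mode}: renders at {w}x{h}")
# 4K Balanced lands around 2227x1253 - below native 1440p, which is why
# the CPU can become the limit even on a "4K" benchmark run.
```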
 
As a benchmark to compare CPU's, this is true. I appreciate and enjoy looking at these articles.

Most people use reviews and benchmarks to figure out what to buy. If you have a budget for a PC of, say, $1,500 or $2,000 and you need to decide how to spend your money to maximize your frame rate, the benchmarks you typically find online are not all that helpful. You need to know how to spread your money across the different components. People want to know the answer to questions like: should I buy a 9800X3D and a GeForce RTX 3050, or should I buy a 9600X and a 4070? CPU benchmarks might imply a 9800X3D with a 3050 is the better approach. It's nearly impossible to find data that answers that question, though, because when you buy a PC it's not about any one component - it's about maximizing your frame rate for the specific amount of money you have to spend.
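A sketch of the kind of comparison that would actually answer that question (every number below is invented, purely to show the shape of the decision): pair each build's CPU-limited ceiling with its GPU-limited ceiling and compare at the same total price.

```python
# Hypothetical per-game fps ceilings; real values would have to come from a
# CPU review (1080p/low cap) and a GPU review (your target res/settings).
builds = {
    "9800X3D + RTX 3050": {"cpu_cap": 180, "gpu_cap": 55},
    "9600X + RTX 4070":   {"cpu_cap": 140, "gpu_cap": 110},
}

for name, caps in builds.items():
    fps = min(caps["cpu_cap"], caps["gpu_cap"])  # the slower part sets the pace
    print(f"{name}: ~{fps} fps expected")
# With made-up numbers like these, the cheaper CPU plus the stronger GPU wins.
```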

As a tech fan, I love benchmarks. But I can see why people might be frustrated with benchmarks if they just want to figure out what to buy.
 
"Is the Ryzen 9800X3D Truly Faster for Real-World 4K Gaming?"

You need to change your title. You say 4K gaming, but you enable DLSS, so the game is being rendered at less than 1440p, NOT 4K.

That means there aren't any 4K results in your review.
Came to say the same thing: using any (DLSS/FSR) upscaling bs is pointless. It doesn't show real native 4K performance at all, and that's what we want to see.
 
Most players cannot perceive a difference beyond 60 FPS, as this does not align with the average "refresh rate" of the human eye
The conclusion you are reaching is way off. Every once in a while some update or reset will cause my display and/or computer to revert to its default 60 fps, and although I'm not at all looking for it, I'll notice immediately that something doesn't feel right. And that's just using the desktop.

To the extent this comes from some biological research (if there really is any) saying basic eye processing does not exceed 60 fps, the issue must be caused by some interaction between the eye's processing and the display technology.
 
What does it matter? They use their so-called custom benchmark runs, so even if you wanted to compare what you have to what they are getting, it's not possible. Also, he knows his 285K scores are jacked, but he still includes them so even more people can think the CPU is worse than it really is. I know this because I happen to have a 285K, and the numbers he is posting, the ones JayzTwoCents is posting, and the other sites' are not what I'm seeing. I have yet to run a benchmark that has been slower than a 14900K. I have done the whole 1080p medium, high, low, ultra mess with a 4090 just to see if they even remotely align with the crap that is getting posted - because in this day and age, if it's posted by someone on YouTube, TikTok, or any of the other social media, people think it's the gospel truth.
 
First off, thanks for the benchmarks at higher resolution. But I have to say I didn't appreciate the constant passive aggressive 'most with some knowledge will know this'.

Yes, I know this. But I also know there are genuine improvements to be had, especially over time. I'm still on my 5800X as I play at 4K. In most games I'm able to get 60; I'd like a bit more headroom in some of those games to not drop below it. I have seen that the 5800X3D gets an average 4% uplift, the 7800X3D another 4-ish percent. I'm not looking for huge numbers, but in a lot of cases the CPU does improve things, especially, as you said, when FSR etc. comes into play.

Not having that information available is infuriating, finding nothing but 'you're just stupid' is demeaning and unnecessary.

Yes, 4K and 1440p are important. Definitely more important than 1080p 5000fps on a CPU that costs as much as a console that BEATS 1080p in every game. There are absolutely more people looking for higher resolution results to make generational upgrades, but it's not covered well in the slightest
 
The only reason I personally would like to see higher resolution benchmarks is for VR.
I have an identical system to my friend: AM5 / 7900 XTX.
I have a 7700X, he has a 7800X3D. Playing ACC, the 7800X3D will run the same settings at 80 FPS, whereas my 7700X fluctuates between 65 and 75; the only way to gain any FPS is to reduce the number of opponents. Using FSR is a no-no due to the shimmer, and DLSS is the same, as your eyes are so close to the screens.
The only way we have of getting any idea of what specs we need is to look at 4k benchmark performance to get some idea of the bottlenecks.

Judging from comments about my stupidity, I should see no difference between 7700X and a 9800X3D when using VR in ACC as it is GPU bottlenecked, but in actual use this is patently not the case.
 
Unless you're a top global player (like an F1 driver in shooter competitions) this topic is irrelevant. Most players cannot perceive a difference beyond 60 FPS, as this does not align with the average "refresh rate" of the human eye. Only individuals with highly trained or exceptional vision can detect differences above this threshold. Consider LED lamps that flicker between 50 Hz and 100 Hz. If you can't see the flickering of LED lamps, you likely won't perceive any difference in games either. Don't fall for industry propaganda designed to push you into buying the latest, overpriced products.
Have you never played on a high hz monitor that's pushing 100+ FPS? I think that everyone who can see can see the difference above 60 FPS. 60 FPS is like a slideshow compared to even just 100 FPS. That's not industry propaganda, the difference is night-and-day.
 
But I have to say I didn't appreciate the constant passive aggressive 'most with some knowledge will know this'.
Yes, I completely agree 💯.
A couple of years ago, Steve posted something along the lines of, "If someone doesn’t understand how it works, just point them to my article." The arrogance in some of his articles is truly baffling. It’s reminiscent of preachers proclaiming the "truth," as though there’s only one possible way to approach CPU (Gaming) reviews. Reading this article the best thing (for me) was the Dr. Strangelove reminiscence - the rest was not that interesting. The sentence about diagnosing personality disorders in people who don't follow his argument was just plain Trump-style nonsense. Very poorly written.

The main issue I have with articles like this is the use of custom settings (that no one knows, and there are endless combinations possible in most modern games) and balanced upscaling. Custom settings can be manipulated to support a specific narrative and are far from being any kind of universal "industry standard." Any verdict derived from such practices - e.g. claiming that CPU X is 20% faster than CPU Y - is highly questionable. These tests lack standardization, and the particular settings that influence these chart bars are rarely disclosed (interestingly, amateur reviewers disclose them more often than professional writers). Additionally, the choice (or exclusion) of specific games can drastically alter the outcomes. This selective approach becomes a powerful tool to shape conclusions toward whatever point the reviewer wants to emphasize.

To produce compelling data, reviewers often lower settings on a 4090 to generate measurable differences. For reviewing CPUs, the 4090 was a godsend, and the 5090 will be even better. Try to do that with a vanilla 4070, or even something like an older 2080 Ti, and you will understand why nobody tests those setups. However, if you check TechPowerUp's 4K CPU results with a 4090, you get another narrative. They just dial in Ultra settings with everything cranked up. While they also say that the 9800X3D is the fastest gaming CPU, their aggregated results diverge significantly from this article's narrative:

TechPowerUp 4K CPU relative gaming performance
TechPowerUp 4K CPU game results

So what Techpowerup is saying is that the CPU doesn't matter at (native) 4k gaming - if you look closely at the charts. They don't say it out loud, and they don't call you stupid if you don't follow their reasoning, they just let the numbers (and the chosen axis scale) speak for themselves.

But: TechPowerUp and TechSpot both have a point - as long as you tie their 'truths' to their particular test setups. You need to check which review reflects your personal usage more closely.

In my opinion, most CPU testing is messy. I understand why some reviewers avoid it altogether. It’s challenging to create stable tests, you need the fastest GPU available (obviously) and even then you need to shape the settings to your narrative.

So, even if you use a 4090, you can get different results based on your setup. And to make things messier: for gamers with lesser GPUs than a 4090, the situation is completely different. You'd likely rely on other settings, and the findings of TechSpot don't apply to you. To be fair, the findings of TechPowerUp don't apply to you either.

These tests and both narratives (CPU does / does not matter for 4K gaming) are a great activator and trigger for lots of people trying to find reasons to stick with their outdated CPU or to combat buyer's remorse over their new pricey top CPU - like in this comment section. It's problematic from a scientific point of view, because you can find data for both narratives depending on the test setup and the shaping of settings in the reviews. If you re-run CPU performance tests with modern games after a few years, as Steve likes to do, you can provide a view on whether years of using a CPU (without changing the GPU) justify the expensive upgrade. As an enthusiast, though, you don't need that kind of advice (you'll be upgrading anyway). As a reasonable buyer, you probably won't be making that decision while running a 4090, so his findings do not fully apply to you. And to what degree they could apply with your particular games and settings on your Vega 64, 2080 Ti, 3060, 6800 XT, 4070, and so on, you will never know.

So, to conclude, there is no real truth to be found here. The individual gains for you can shift dramatically from any review, making blanket statements ("the 9800X3D is 20% faster on average than CPU XY at 4K") misleading at best. Combine that with arrogance and a know-it-all style and you get a poorly written article like this one. Sorry Steve, just my two cents. You can do better - and you proved it in the past.
 
Unless you're a top global player (like an F1 driver in shooter competitions) this topic is irrelevant. Most players cannot perceive a difference beyond 60 FPS, as this does not align with the average "refresh rate" of the human eye. Only individuals with highly trained or exceptional vision can detect differences above this threshold. Consider LED lamps that flicker between 50 Hz and 100 Hz. If you can't see the flickering of LED lamps, you likely won't perceive any difference in games either. Don't fall for industry propaganda designed to push you into buying the latest, overpriced products.

Going back to the days of CRTs, I needed at least an 85 Hz refresh rate, as with anything less I could see the refresh interval.

As a simple test, I made a test application that (at various refresh rates) would display an all-black background for X frames, but within each second randomly insert one grey frame (this is on an OLED, so black is actually black). I can detect that grey frame pretty much every time it gets inserted while running at 120 Hz, so I can confidently say the human eye can perceive at least at that rate.
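A minimal sketch of that kind of test, assuming pygame and a display actually refreshing at the requested rate (frame pacing and vsync behavior vary by setup, so treat it as illustrative only):

```python
import random
import pygame

# Sketch of the grey-frame test described above: render black frames and slip
# one grey frame into a random slot each second. If you can reliably spot the
# grey flash, you are resolving individual frames at this rate.
FPS = 120

pygame.init()
screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)
clock = pygame.time.Clock()

running = True
while running:
    grey_slot = random.randrange(FPS)  # which frame within this second is grey
    for frame in range(FPS):
        for event in pygame.event.get():
            if event.type in (pygame.QUIT, pygame.KEYDOWN):
                running = False
        if not running:
            break
        screen.fill((40, 40, 40) if frame == grey_slot else (0, 0, 0))
        pygame.display.flip()
        clock.tick(FPS)  # cap the loop at the target frame rate

pygame.quit()
```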
 
I genuinely still can't understand why people don't get the reason for testing like this.

Imagine a 16 lane drag strip with all the CPUs lined up for the race. 3. 2. 1. Go!

All CPUs are off and blitz down the track but there is a pace car out (game @ 4k) and it's preventing all the CPUs from passing because they're waiting on the pace car (GPU bound/bottleneck). The CPUs are jockeying back and forth but still held back by the pace car and essentially arrive at the finish line together.

Now, imagine the same race but without that pace car (game is now @ 1080p), meaning the CPUs are not held back waiting for the pace car but are free to rip down the track as fast as fluffing possible.

You've now efficiently tested the CPU and removed any constraints from it.

The end!
I think you just made the opposite point. The point that some people are trying to make is that at higher resolutions the differences in CPU performance are reduced - or rather, CPU performance isn't the issue, it's the pace car. So why spend hundreds on a CPU that won't get you any more FPS, because it's being hampered by the pace car?
 
I find this website highly deceptive. I'm commenting mainly to warn the newer people who might not see what you're doing. I've been in this game for close to 30 years.

This article was written with half-truths or outright lies.
The article and HU's videos on this topic really are full of those - so much so that I can't see where it would end if I tried to address them all. I'm not sure if it's ivory tower syndrome or Steve just trying to cover his rear, but the facts are that HU are wrong in their arguments against higher-res benchmarking, and they're using wordplay and other disingenuous tactics to corral perceptions away from looking at the topic clearly and into looking at only one part of the picture and thinking it discounts the rest. And it's already making people more tech-ignorant: I've seen people arguing in higher-res benchmark videos that the videos are wrong because anything other than 1080p testing is GPU-limited and pointless - people who clearly lack basic knowledge of the topic and aren't very open to correction because their chosen authority figure insisted otherwise. So Steve and HU are damaging naive people's understanding and creating difficulties in properly evaluating and discussing hardware.

People dogmatically taking HU's arguments (which are mostly nonsense) as truth occupy the role of pseudo-intellectuals, who think that because they've understood one part of a picture they can judge anyone saying something further to be wrong - despite those people saying what they say not because they don't also understand the low-res-testing part of the picture, but because they understand more than just that part. HU's videos on this topic are serving as litmus tests for whether a person has any critical thinking skills or just blindly and mindlessly parrots whatever their chosen authority figure says.

In other words, HU have created a mass of Dunning-Krugers who feel a false sense of superiority for essentially being unintelligently wrong. And if those at HU truly believe the arguments they're making (that higher-res benchmarks are useless, that they don't inform unless you have the exact-same system, that they don't fall under the category of a CPU review, and so many others), then they're a bunch of Dunning-Krugers, themselves.

The fact is that both academic (low-res, unbound) CPU testing and practical (typical-res, likely GPU-limited) testing are part of a proper CPU review. But people are being misled by HU pigeon-holing the topic into a false dichotomy of either low-res testing or high-res testing, and into making it about whether 1080p benchmarking makes sense - which it isn't about. Yet Steve is fixated on explaining 1080p benchmarking as though it's QED, arguing with himself over things that are beside the point, and painting his arguments' detractors with strawmen.

Just one example from Steve's article here:

We get why some readers prefer what's often referred to as "real-world" testing, but in the context of CPU benchmarks, it doesn't actually provide the insight you might think – unless you plan on using that exact hardware combo under those specific conditions.

All of that is false. The 4k Balanced benchmarks Steve used in his article don't reflect the alleged "real-world" testing, per the surveys he did which showed 4k Quality users represent more than double both 4k Balanced and 4k Performance users *combined*, and per the 1440p survey which shows that native 1440p is actually the most "real-world" demographic.

And saying that "real-world" benchmarks provide insight only if you're running the exact-same hardware and settings combo is as daft as saying that any benchmarks, including 1080p unbound CPU benchmarks, are only insightful if you're running an identical configuration. It's such a nonsensical claim that it's stunning to hear from a well-regarded tech reviewer.

Higher-res CPU benchmarks provide exactly the insight I think they do, and which I see many others thinking. And that insight is most definitely not contingent on users running the exact same hardware combo under those specific test conditions. If an RTX 4090 throttles a CPU's performance at native 1440p max settings so that there's only a ~3% average performance difference (using TechPowerUp's aggregate) compared to a 7800X3D, then I immediately know that anything less than an RTX 4090 will yield an even smaller difference - and I can infer how much smaller based on the performance spreads between GPUs. I can also reasonably infer how the result changes if I lower settings to take away some of the GPU's load. I can very well ballpark what performance to expect from my own system, or one that shares the CPU but is otherwise customized. And I know that enabling DLSS will only improve the result, and I can make an educated guess, or use DLSS comparison videos for specific games, to gauge how much it can improve the FPS.
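To put hypothetical numbers on that inference (a deliberately simplified min() model that ignores 1% lows and frame-time spikes): once the GPU's ceiling drops below the slower CPU's ceiling, the measured gap between the CPUs can only shrink.

```python
# Invented CPU-limited ceilings, purely to illustrate the direction of the effect.
fps_9800x3d = 165  # what the CPU could drive if the GPU were never the limit
fps_7800x3d = 150

for gpu_cap in (160, 130, 100):  # progressively weaker GPUs at 1440p max settings
    a = min(fps_9800x3d, gpu_cap)
    b = min(fps_7800x3d, gpu_cap)
    gap = (a - b) / b * 100
    print(f"GPU ceiling {gpu_cap} fps -> measured CPU gap: {gap:.1f}%")
# 160 -> 6.7%, 130 -> 0.0%, 100 -> 0.0%: a weaker GPU never widens the gap.
```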

In short, "real-world", higher-res benchmarks do actually provide a large amount of insight into what a tested CPU means to me, without my needing to have the same other specs as the benchmark. How is it possible for a tech reviewer to hold the ridiculously false and oblivious contrary belief?

So, either Steve is a Dunning-Kruger, or he's being wilfully dishonest and obtuse. Either way, he shouldn't be making condescending appeals to authority as he tells people they're wrong and he's right, when he isn't even playing in the ballpark.

As I said in my first post in this thread, I'm not set on seeing HU do higher-res testing. I'm bothered by the continued intellectual dishonesty from HU's corner on this matter, and by Steve's condescension while using fallacious, misrepresentative arguments as he talks down to people and tries to frame them as not getting it while he's right, when the reality is that many of them are actually right and he's wrong.

BTW, HU's latest video on this topic, the one that's 14 minutes long, is again rife with nonsensical assertions. One of the claims made this time is that CPU reviews are not upgrade guides. If they're not upgrade guides, then I wonder why HU includes price-to-performance charts in their CPU reviews. In the real world, CPU reviews are quite literally buyer's guides; that's the central reason they exist in the first place. They exist to communicate technical and other information about CPUs so that people watching can become informed and figure out whether it's in their interest to have one. And if data informing how a CPU performs at its target market's typical-use resolutions is omitted, then it's failing as a CPU review. Part of objectively measuring what a CPU can do is measuring what it can do when placed in its typical-use environments (such as resolutions) - that data shows what difference the CPU, as the isolated factor, can actually make in that environment, thus informing prospective buyers whether it's worth buying.

HU can just say that they only want to do 1080p testing, and that'll be a lot more valid than claiming CPU reviews are not buying guides as an excuse for not including other contextual information that's important to the buying decision. The latter route is both weak and false. What HU are doing by trying to dictate what "review" means is like somebody who doesn't know what a word means making assumptions about its meaning, and then lecturing everybody else that they're wrong if they use the word in a way that doesn't conform to that completely arbitrary, actually wrong personal assumption. And as with HU's claims about higher-res benchmarking, that argument is also spreading ignorance.
 
I actually look forward to the day that an Intel CPU edges out an AMD CPU in gaming.
I sense that all of a sudden 1080p will again become the end all benchmark for a certain group.
I have a 7800X3D and was thinking about upgrading to a 9800X3D until I got some practical, real-world applicable review data from other reviews. Who else were you meaning?
 
I have a 7800X3D and was thinking about upgrading to a 9800X3D until I got some practical, real-world applicable review data from other reviews. Who else were you meaning?
I'm not doubting the facts, I'm poking a zinger at the future hypocrisy if/when the performance tables turn.
All of a sudden, the big numbers of gamers still playing at 1080p will mean everything, and the narrowing frame rate gap at higher resolutions will suddenly mean little to nothing to them.

It's not like there is no history of fanboy tech hypocrisy.
 
Came to say the same thing that using any (DLSS/FSR) upscaling bs is pointless it doesn't show real 4K native performance at all and that's what we want to see
I mean - you know the reasoning for the test. You'll see little difference at 4K high settings; you know this and you know why. Less than 1% of gamers on Steam use 4K gaming monitors.
Your CPU won't have to work very hard at 4K as long as the system manages to provide enough data to your GPU - which usually won't run at much beyond 60 fps even with a 4090. So stress-testing a CPU in a scenario where it's not being stressed is pretty redundant.
There will be scenarios in the near future where this could become much more viable - if the 5090 is 30% faster than the 4090, the CPU will get more work to do even at 4K, which will push load onto the CPU - which in turn will produce different outcomes based on your CPU of choice.
If you game at 4K and your CPU shows 50% or less load, there's no need to upgrade, unless you're future-proofing for a 5090.
 
I'm not doubting the facts, I'm poking a zinger at the future hypocrisy if/when the performance tables turn.
All of a sudden, the big numbers of gamers still playing at 1080p will mean everything, and the narrowing frame rate gap at higher resolutions will suddenly mean little to nothing to them.

It's not like there is no history of fanboy tech hypocrisy.
Well said.

What I want to see is the 9900X3D/9950X3D benchmarks.
 
Unless you're a top global player (like an F1 driver in shooter competitions) this topic is irrelevant. Most players cannot perceive a difference beyond 60 FPS, as this does not align with the average "refresh rate" of the human eye. Only individuals with highly trained or exceptional vision can detect differences above this threshold. Consider LED lamps that flicker between 50 Hz and 100 Hz. If you can't see the flickering of LED lamps, you likely won't perceive any difference in games either. Don't fall for industry propaganda designed to push you into buying the latest, overpriced products.

Hunh ?
I have not gamed at 60 fps in 15+ years, because doing so sucks and is not worth it (how can you do a 360 with any precision?). I really do NOT care how fast my monitor is past 166 Hz OLED... but my in-game FPS needs to be over 200 FPS for my movement and shots and mantling and snap-shotting to be on. (Real gamers do NOT sync their monitors.)

For example, Call of Duty players want the most frames in game, so they have the latest info/update and the smoothest, most fluent movement. Hence, anyone playing COD at 60 fps will lose nearly every engagement to someone who is pushing 200+ frames.


Why even bother with a PC if you are happy with 60 Hz gameplay? (Have you even gamed at 166 Hz or higher?)
 
What these reviews have pretty much shown is the 285K is basically equal to a 7700X in gaming and the sad Intel crowd are the Democrats of this election.
 
> Is the Ryzen 9800X3D Truly Faster for Real-World 4K Gaming?

How nice of you to acknowledge that!

> Do you only require 60 fps or less? If so, CPU performance is unlikely to be a significant factor for you. [...] What isn't fine is imposing a single viewpoint on everyone. While it might seem that low-resolution benchmarking enforces a particular standard, it actually provides the full picture – the entire spectrum of gaming performance. [and then goes on testing at 1080p with a rtx 4090]

While true, this is also sad. More than 75% of Steam gamers play with a 4060 Ti / RX 6700 XT at best, so at 1080p they have two options: play at ~60 fps or play at low graphics settings. Hardly anyone I know plays at 1080p low. The only ones that do are COD, CS2, etc. die-hards...

So I guess you just admitted to making CPU reviews for the minority of high-fps gamers, because apparently the CPU doesn't matter to the rest.
 