AMD Ryzen 7 9800X3D vs. Ryzen 7 7800X3D: 45 Game Benchmark

Sad when your competition is yourself. The sad part is that the 9800X3D costs barely more than the i7-14700K; I don't know how Intel justified that price... The 9800X3D is sold out though, so prices may skyrocket soon.
 
What tells us that at 1080p there isn't also a bottleneck with the RTX 4090, one that will be lifted once we upgrade to an RTX 5090? Maybe the gap between these two CPUs will be even bigger than it is now...
That's one of the most important reasons for 1080p benchmarks, and something some people cannot grasp. Today's 4090 will probably be at the level of a 5070/Ti in the near future, and in two years' time that may be lower-mid-tier performance. Let's just say AMD's fastest gaming CPU can already feed the fastest next-gen GPU.
 

So buy this CPU 4y from now. Got it.
 
Sad when your competition is yourself. The sad part is that the 9800X3D costs barely more than the i7-14700K; I don't know how Intel justified that price... The 9800X3D is sold out though, so prices may skyrocket soon.

Dozens all around me in every Microcenter in the area. They're still selling the 5700X3D for $180. Now that's a steal, and if you're on AM4, get it.
 
This is missing the enormous elephant in the room that renders this sentiment pointless for nearly everybody. In order to experience a performance increase from a 9800X3D when buying a new GPU, your new GPU first has to clear the performance threshold at which the GPU stops being the bottleneck. For the 9800X3D at 1440p, that requires a minimum GPU performance of an RTX 4090. And even with a 4090, there's only a 3% average performance increase over a 7800X3D. So, to actually get an appreciable performance increase from a 9800X3D at 1440p, you'll need something faster than an RTX 4090.

BTW, around two years after the 4090 released, less than 1% of Steam users have one, and something like 2% have an RTX 4080. So, for just about everybody who isn't going to spend $2,000 on an RTX 5090, there won't be any lifting of the 9800X3D's GPU bottleneck for at least another 2.2 years - and that minimum 2.2-year timeframe is conditioned on the RTX 6080 lifting the bottleneck appreciably (because the RTX 5080 won't at all) and on a person buying an RTX 6080 or higher. That in turn depends on the RTX 6080 being faster than the RTX 4090, which might not be the case.

Realistically, the large majority of people with a 9800X3D won't see the GPU bottleneck at 1440p lifted at all for the next 6+ years. And by that time, there will be another three CPU generations released, and people might be upgrading their 9800X3D CPUs and encountering the next GPU bottleneck, or playing at a higher resolution than 1440p and so facing an even higher GPU bottleneck threshold.

There's no reason to buy a 9800X3D on the expectation that a future GPU purchase will remove the bottleneck, unless you game at 1440p and you're planning to buy an RTX 5090. Even the RTX 5090 probably won't do anything to lift the bottleneck at 4k.

Exactly. That's why they don't test at a higher resolution than 1080p. They should just stick to application benchmarks, but it's not as exciting to report that some application finished a few seconds faster. Other websites test at 1440p and 4k, and you can see near-zero gains even with a 4090.

5700X3D is the king of gaming at $180.
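
To put the "GPU has to clear the CPU's ceiling" argument above into a rough sketch (every fps figure below is a hypothetical placeholder, not a benchmark result - the point is only the shape of the curve): you only start seeing any gain once the GPU-limited frame rate at your resolution climbs past the slower CPU's ceiling, and you only see the full gain once it clears the faster CPU's ceiling.

```python
# Rough sketch of the "GPU has to clear the CPU's ceiling" argument.
# All fps figures are hypothetical placeholders, not benchmark results.

CPU_CEILING = {"7800X3D": 230, "9800X3D": 255}  # assumed CPU-limited fps

def observed_gain(gpu_fps):
    """Percent gain from swapping 7800X3D -> 9800X3D when the GPU can
    deliver gpu_fps at the chosen resolution and settings."""
    old = min(CPU_CEILING["7800X3D"], gpu_fps)
    new = min(CPU_CEILING["9800X3D"], gpu_fps)
    return (new / old - 1) * 100

for label, gpu_fps in [("mid-range GPU at 1440p", 150),
                       ("RTX 4090-class GPU at 1440p", 238),
                       ("hypothetical future GPU at 1440p", 300)]:
    print(f"{label}: {observed_gain(gpu_fps):.1f}% gain")

# mid-range GPU at 1440p: 0.0% gain            (GPU below both CPU ceilings)
# RTX 4090-class GPU at 1440p: 3.5% gain       (GPU just past the slower CPU)
# hypothetical future GPU at 1440p: 10.9% gain (both CPUs fully unleashed)
```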
 
What tells us that at 1080p there isn't also a bottleneck with the RTX 4090, one that will be lifted once we upgrade to an RTX 5090? Maybe the gap between these two CPUs will be even bigger than it is now...
Exactly my thought. It is probable that for some games the CPU has been more than fast enough to let the GPU, in this case the RTX 4090, show its true unhindered performance.

I would guess that top-end gains of roughly 20% will be the norm with the upcoming RTX 5090.
 
This is missing the enormous elephant in the room that renders this sentiment pointless for nearly everybody. In order to experience a performance increase from a 9800X3D when buying a new GPU, your new GPU first has to clear the performance threshold at which the GPU stops being the bottleneck. For the 9800X3D at 1440p, that requires a minimum GPU performance of an RTX 4090. And even with a 4090, there's only a 3% average performance increase over a 7800X3D. So, to actually get an appreciable performance increase from a 9800X3D at 1440p, you'll need something faster than an RTX 4090.

BTW, around two years after the 4090 released, less than 1% of Steam users have one, and something like 2% have an RTX 4080. So, for just about everybody who isn't going to spend $2,000 on an RTX 5090, there won't be any lifting of the 9800X3D's GPU bottleneck for at least another 2.2 years - and that minimum 2.2-year timeframe is conditioned on the RTX 6080 lifting the bottleneck appreciably (because the RTX 5080 won't at all) and on a person buying an RTX 6080 or higher. That in turn depends on the RTX 6080 being faster than the RTX 4090, which might not be the case.

Realistically, the large majority of people with a 9800X3D won't see the GPU bottleneck at 1440p lifted at all for the next 6+ years. And by that time, there will be another three CPU generations released, and people might be upgrading their 9800X3D CPUs and encountering the next GPU bottleneck, or playing at a higher resolution than 1440p and so facing an even higher GPU bottleneck threshold.

There's no reason to buy a 9800X3D on the expectation that a future GPU purchase will remove the bottleneck, unless you game at 1440p and you're planning to buy an RTX 5090. Even the RTX 5090 probably won't do anything to lift the bottleneck at 4k.
I see your point, but it is a lot of extra work for reviewers to test different CPUs at multiple resolutions just to prove what is already a given fact: you're only going to be as fast as the slowest component in the workload. If you crank up the resolution, or just have a weak GPU, then the GPU is the bottleneck, and obviously frame rates will be limited to a similar level across the board.

Should they also change other graphics settings and components that have little to no bearing on the CPU and test against every possible configuration when the point of the review is to focus on CPU performance? Should they test at different resolutions when they review an SSD, or a network card? If people want to know the performance difference at different resolutions, they should be reading a GPU review, not a CPU review.

It's not like Techspot is trying to fool anyone; sites have been testing CPUs at low resolutions for decades, and they even include a link explaining the reasoning for anyone who might not understand. It's only recently that I've seen so many people fail to grasp this concept.
 

We get it: there is zero real-world difference for 99% of setups. And even in the one setup tested here (a 1080p monitor paired with a 4090), there is an 8% gain, which is pretty unnoticeable too and definitely not worth buying an entire AM5 platform when you can drop a 5700X3D into your AM4 board and not notice a difference.
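
For a sense of scale (the 240 fps baseline below is a made-up number, just to show what an 8% uplift means in frame-time terms):

```python
# Frame-time view of an 8% fps uplift (the baseline fps is a made-up example).
baseline_fps = 240
uplift = 0.08

old_frame_ms = 1000 / baseline_fps                   # ~4.17 ms per frame
new_frame_ms = 1000 / (baseline_fps * (1 + uplift))  # ~3.86 ms per frame

print(f"{old_frame_ms:.2f} ms -> {new_frame_ms:.2f} ms "
      f"({old_frame_ms - new_frame_ms:.2f} ms saved per frame)")
# 4.17 ms -> 3.86 ms (0.31 ms saved per frame)
```

About a third of a millisecond per frame, which is why it's so hard to feel.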
 
Incorrect: 1440p and 4k combined are still a minority of Steam users; most are at 1080p or below.

Also, you do not run at high resolution for a CPU test. The purpose of CPU testing is to stress the CPU, not the GPU. Please learn this; every single time, someone has this question and it has to be explained to them.

Removing the GPU bottleneck reveals the absolute performance delta, but nevertheless it's fair to assume that the kind of people in the market for hardware at this price point are indeed 2k/4k users with much bigger war chests than the average Steam user. There is value in showing real-world metrics for this audience, even if it ends up being a nothing-burger.

The best reviews include these resolutions along with minimums, frame-time variation metrics, and more... these can reveal subtleties that are otherwise subjective.
 
The problem with testing a CPU in games at various resolutions has been explained, but I'll put it another way.

Comparing CPUs at 1080p will always show a difference. Any mid-range GPU can easily render that many pixels, so the GPU isn't stressed.

This is not an exaggeration. CPU A could be 30% faster than CPU B at 1080p, while the very same test, with the same settings and everything except that it's run at 4k, may show literally just a few percent difference even when using the fastest GPU available: CPU A ends up 3% (potentially 0%) faster than CPU B at 4k. If the reviewer were to use a mid-range GPU, the picture would change dramatically, even between 1080p and 1440p. Some readers would freak out!!

At 4k any GPU has to work really hard to render all those pixels. The CPU doesn't render pixels.

Basically, as soon as any GPU in a game starts to struggle, there is nothing the CPU can do about it.
They work together of course, but what they do is not the same; at that point it's the GPU that is holding back the FPS.
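
Here's that "slowest component wins" idea as a toy calculation (every number below is made up, purely to illustrate how a 30% CPU gap at 1080p can collapse to roughly nothing at 4k):

```python
# Toy model: the game runs at the rate of whichever component is slower.
# Every number here is made up, purely to illustrate the effect.

def effective_fps(cpu_fps, gpu_fps):
    # The CPU can only prepare so many frames per second, and the GPU can
    # only render so many; the frame rate you see is the lower of the two.
    return min(cpu_fps, gpu_fps)

cpu_a = 260  # hypothetical CPU-limited fps for the faster CPU
cpu_b = 200  # hypothetical CPU-limited fps for the slower CPU (~30% behind)

gpu_limits = {"1080p": 400, "4k": 105}  # hypothetical GPU-limited fps

for res, gpu_fps in gpu_limits.items():
    a = effective_fps(cpu_a, gpu_fps)
    b = effective_fps(cpu_b, gpu_fps)
    print(f"{res}: CPU A {a} fps vs CPU B {b} fps "
          f"({(a / b - 1) * 100:.0f}% apart)")

# 1080p: CPU A 260 fps vs CPU B 200 fps (30% apart)
# 4k: CPU A 105 fps vs CPU B 105 fps (0% apart)
```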

If CPUs were benched at both 1080p and 4k there would be even more confusion, especially if the fastest GPU available wasn't used.

People would say wtf! It's 30% faster at 1080p, and within the margin of error at 4k.

That is an example of what could, and does happen as soon as the GPU starts to struggle.

For gaming, especially if high FPS is one's main aim, the 4k test is essentially testing the GPU.

So when it comes to CPUs for gaming, reviewers will get flamed either way. Those who don't understand the concept of CPU metrics for gaming will complain, asking why they only test at such a low res. with a 4090.

Those same people would then say the review is nonsense because it's much better at 1080p, but gives little, or even no advantage at 4k.

Testing at low res. gives the PC builder the info to know if they should upgrade their CPU when upgrading the GPU. Often, it's totally unnecessary to upgrade an older, but decent, CPU. This is very much the case at 4k. To test the CPU properly, the GPU must not be pushed up to or beyond what it is capable of rendering.

There is a great thread (sorry, I don't have the link handy right now) written here which explains it in a lot of detail. I believe it was written by Steve. (Apologies if I'm wrong about the name, but he's a good Techspot reviewer.)

I doubt my post will be of much use. But it might?
 
And 4090 owners make up 0.91% of gamers as of October 2024. While these benchmarks do tell us which CPU is technically faster, they don't tell us anything about real-world performance or what product you should actually buy. People gaming at 1080p are overwhelmingly paired with 60-class cards, or anything sub-$500 really.

Something I find infuriating about these benchmarks is that, while I find it important to eliminate bottlenecks, they don't tell the target market anything about how this product actually performs in their setups.

The testing methodology that everyone is using is designed specifically to expose bottlenecks and exaggerate performance differences. It tells us nothing about real-world performance. The thing is, real-world performance numbers don't sell clicks or move products.

While eliminating bottlenecks is important, it is also important to measure performance when THERE ARE bottlenecks. I'm not saying stop testing this way altogether. The thing is, the way everyone is testing these chips leaves out valuable information that, if people saw it, might make them say, "Wow, I don't need this product. That money is better spent elsewhere."

I love building PCs as a hobby, I think this tech is cool, and I'm happy it's around. But I see this in a similar way to cars as a hobby: go buy a Hellcat if you want one, but it isn't going to get you to work any faster. Websites are marketing this thing for AMD, and it's going to end up hurting consumers in the long run. People with limited budgets are going to look at this and say, "Well, I have to get this $500 CPU or my gaming computer won't work right." The fact of the matter is that many gamers out there will never own a GPU where the 9800X3D makes any difference in performance. If you want more FPS, buy a 7800X3D and a 4080 instead of a 9800X3D and a 4070.
Your concerns are valid. But to get that info, there are sites and reviewers that sometimes give very good advice about CPU/GPU pairings and the best combos.

Thing is, there are so many potential combinations that it's out of scope for a CPU review. For that info, I strongly recommend googling which GPU is a sensible combo with which CPU - system building, in other words.

Make sure to check a site which specializes in that kind of thing and has a good reputation.

But a good 1080p CPU review over multiple games, as Steve does, really does give a good idea of whether it's worth upgrading just the CPU or just the GPU. The more experience one has, the easier it is to make the right choice, but it's never easy, especially if planning to keep a CPU for 5 years while upgrading the GPU every, say, 2 years.
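
As a crude rule of thumb (hypothetical numbers, and obviously game-dependent): take the CPU-limited fps from a 1080p CPU review and the GPU-limited fps from a GPU review at your own resolution, and whichever is lower is the component worth upgrading first. Something like:

```python
# Crude upgrade-planning heuristic - no substitute for reading actual reviews.
# cpu_fps: CPU-limited fps taken from a 1080p CPU review (hypothetical figure)
# gpu_fps: GPU-limited fps from a GPU review at *your* resolution (hypothetical)

def upgrade_first(cpu_fps, gpu_fps, margin=1.15):
    # If one side trails the other by more than ~15%, treat it as the
    # bottleneck worth spending on first; otherwise the build is balanced.
    if gpu_fps * margin < cpu_fps:
        return "GPU"
    if cpu_fps * margin < gpu_fps:
        return "CPU"
    return "balanced - spend the money elsewhere (RAM, storage, monitor)"

print(upgrade_first(cpu_fps=255, gpu_fps=110))  # GPU (typical 4k situation)
print(upgrade_first(cpu_fps=140, gpu_fps=240))  # CPU (old CPU, fast new GPU)
```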

I've built about 12 systems, and I try to figure out the best combo myself, then verify it by checking out a few trustworthy sites. (Over 5 years, sites come and go, so I won't recommend any.)

I haven't always got it right. My tendency has been to buy the best CPU available and the second-best GPU (like a 4080 Super instead of a 4090). As gaming is my main use for my PCs, I am sure I have overspent on CPUs several times.

Oh, and then there are memory, buses, cooling, and storage options. It's easy to build a desktop in a big case, but it takes serious thought to really balance all the components as much as possible.

If not gaming, the situation is different, but for gaming, the first step is to find the best and most cost-effective CPU/GPU combo. There are so many potential combos that a straight-up CPU review needs to remain focused on testing the CPU. It's not much use otherwise.
 