Who's to say that at 1080p there isn't also a bottleneck with the RTX 4090, one that will be lifted once we upgrade to an RTX 5090? Maybe the gap between these two CPUs will be even bigger than it is now...
That's one of the most important reasons for 1080p benchmarks, and what some people cannot grasp. Today's 4090 will probably be at the level of a 5070/5070 Ti in the near future, and in two years' time that may be lower-mid-tier performance. Put it this way: AMD's fastest gaming CPU can already feed the fastest next-gen GPU.
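To put toy numbers on it (everything below is hypothetical, just to show the shape of the argument): the slower component sets the frame rate, so a low-resolution test is really measuring the CPU's ceiling, and each GPU generation climbs toward it.

```python
# Minimal sketch, all numbers hypothetical: the slower component
# sets the frame rate, so a faster GPU exposes the CPU's ceiling.

def effective_fps(cpu_fps_cap: float, gpu_fps: float) -> float:
    """Whichever side is slower limits the frame rate."""
    return min(cpu_fps_cap, gpu_fps)

CPU_CAP = 220.0  # CPU-limited FPS, the number a 1080p test reveals

# Each GPU generation raises the GPU-limited FPS at the same settings.
for gpu, gpu_fps in [("today's flagship", 180.0),
                     ("next gen", 240.0),
                     ("the gen after", 320.0)]:
    print(f"{gpu}: {effective_fps(CPU_CAP, gpu_fps):.0f} FPS")

# Prints 180, 220, 220: once the GPU clears the CPU's ceiling, the
# ceiling (which only a low-resolution test can measure) is what you get.
```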
Sad when your competition is yourself. The sad part is that the 9800X3D is barely more expensive than the i7-14700K. Idk how Intel justified that price... The 9800X3D is sold out though, so yeah, prices may skyrocket soon.
This is missing the enormous elephant in the room that renders this sentiment pointless for nearly everybody. In order to experience a performance increase from a 9800X3D when buying a new GPU, your new GPU first has to clear the performance threshold at which it stops bottlenecking the CPU. For the 9800X3D at 1440p, that requires a minimum GPU performance of an RTX 4090. And even with a 4090, there's only a 3% average performance increase over a 7800X3D. So, to actually get an appreciable performance increase from a 9800X3D at 1440p, you'll need something faster than an RTX 4090.
BTW, around two years after the 4090 released, less than 1% of Steam users have one, and something like 2% have an RTX 4080. So, for just about everybody who isn't going to spend $2000 on an RTX 5090, the 9800X3D's GPU bottleneck won't be lifted for at least another 2.2 years - and that minimum timeframe assumes the RTX 6080 lifts the bottleneck appreciably (because the RTX 5080 won't at all) and that the person buys an RTX 6080 or higher. It also depends on the RTX 6080 being faster than the RTX 4090, which might not be the case.
Realistically, the large majority of people with a 9800X3D won't see the GPU bottleneck at 1440p lift at all for the next 6+ years. And by that time, another three CPU generations will have been released; people might be upgrading their 9800X3D CPUs and encountering the next GPU bottleneck, and might also be playing at a higher resolution than 1440p, which raises the GPU bottleneck threshold even further.
There's no reason to buy a 9800X3D on the expectation that a future GPU purchase will remove the bottleneck, unless you game at 1440p and you're planning to buy an RTX 5090. Even the RTX 5090 probably won't do anything to lift the bottleneck at 4K.
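A minimal sketch of that threshold effect, with made-up FPS ceilings (only the ~3% figure comes from the benchmarks):

```python
# Hypothetical CPU ceilings at 1440p; only the ~3% gap is from the review.
CAP_7800X3D = 200.0
CAP_9800X3D = CAP_7800X3D * 1.03  # ~3% higher CPU-limited ceiling

# A faster CPU only shows up once the GPU clears the slower CPU's ceiling.
for gpu, gpu_fps in [("RTX 4080-class", 170.0), ("RTX 4090-class", 210.0)]:
    fps_old = min(CAP_7800X3D, gpu_fps)
    fps_new = min(CAP_9800X3D, gpu_fps)
    print(f"{gpu}: {fps_old:.0f} -> {fps_new:.0f} FPS "
          f"({100 * (fps_new - fps_old) / fps_old:+.1f}%)")

# 4080-class: 170 -> 170 FPS (+0.0%) - both CPUs are GPU-bound.
# 4090-class: 200 -> 206 FPS (+3.0%) - the gain appears, capped at ~3%.
```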
"So buy this CPU 4 years from now. Got it."

Or buy this CPU now and you're good for many GPU generations? Some people may interpret it like that; it depends.
"Good luck finding a 7800X3D at $340. $476 is the cheapest I could find, and as for the 9800X3D, $700 buys you one today!"

I found many for 400 euro here, with the 20% tax, so not too bad. The new one is... 650 euro here... my god.
Exactly my thought. It is probable that for some games the CPU has been more than fast enough to let the GPU, in this case the RTX 4090, show its true unhindered performance.
I see your point, but it is a lot of extra work for reviewers to test different CPUs at multiple resolutions just to prove what is already a given fact: you're only going to be as fast as the slowest component in the workload. If you crank up the resolution, or just have a weak GPU, then the GPU is the bottleneck and frame rates will obviously be limited to a similar level across the board.
Should they also change other graphics settings and components that have little to no bearing on the CPU and test against every possible configuration when the point of the review is to focus on CPU performance? Should they test at different resolutions when they review an SSD, or a network card? If people want to know the performance difference at different resolutions, they should be reading a GPU review, not a CPU review.
It's not like TechSpot is trying to fool anyone; sites have been testing CPUs at low resolutions for decades, and they even include a link explaining the reasoning for anyone who might not understand. It's only recently that I've seen so many people fail to grasp the concept.
Incorrect: 1440p and 4K combined are still a minority of Steam users; most are at 1080p or below.
Also, you do not run at high resolution for a CPU test. The purpose of CPU testing is to stress the CPU, not the GPU. Please learn this - every single time someone has this question, it has to be explained to them.
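If a toy model helps (all numbers below are assumed, not measured): CPU work per frame is roughly resolution-independent, while GPU work scales with pixel count, so dropping the resolution is what exposes the CPU.

```python
# Toy model, hypothetical numbers: CPU time per frame barely changes
# with resolution, while GPU time grows with the number of pixels.

CPU_MS = 6.0           # assumed CPU time per frame, any resolution
GPU_MS_PER_MPIX = 2.5  # assumed GPU time per megapixel rendered

RESOLUTIONS = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}

for name, pixels in RESOLUTIONS.items():
    gpu_ms = GPU_MS_PER_MPIX * pixels / 1e6
    frame_ms = max(CPU_MS, gpu_ms)  # whichever side finishes last wins
    bound = "CPU-bound" if CPU_MS >= gpu_ms else "GPU-bound"
    print(f"{name}: {1000 / frame_ms:.0f} FPS ({bound})")

# 1080p: 167 FPS, CPU-bound - CPU differences are visible here.
# 4K:     48 FPS, GPU-bound - every CPU would post the same number.
```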
"Removing the GPU bottleneck reveals the absolute performance delta, but nevertheless it's fair to assume that the kind of people in the market for hardware at this price point are indeed 2K/4K users with much bigger war chests than the average Steam user. There is value in showing real-world metrics for this audience, even if it ends up being a nothing-burger."

The problem with testing a CPU in games at various resolutions has been explained, but I'll put it another way: the best reviews include these resolutions along with minimums, frame-time variation metrics and more - these can reveal subtleties that are otherwise subjective.
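For anyone unfamiliar with those metrics, here's a minimal sketch, using a made-up frame-time trace, of how a "1% low" exposes stutter that the average hides:

```python
# Made-up frame-time trace: mostly smooth frames plus a few stutters.
import statistics

frame_times_ms = [8.0] * 990 + [30.0] * 10

avg_fps = 1000 / statistics.mean(frame_times_ms)

# "1% low": average FPS over the slowest 1% of frames.
worst_1pct = sorted(frame_times_ms)[-len(frame_times_ms) // 100:]
low_1pct_fps = 1000 / statistics.mean(worst_1pct)

print(f"average: {avg_fps:.0f} FPS, 1% low: {low_1pct_fps:.0f} FPS")
# average: 122 FPS, 1% low: 33 FPS - the stutter the average hides.
```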
"And 4090 owners make up 0.91% of gamers as of October 2024. While these benchmarks do tell us which CPU is technically faster, they don't tell us anything about real-world performance or which product you should actually buy. People gaming at 1080p are overwhelmingly paired with 60-class cards, or really anything sub-$500."

Your concerns are valid. But to get that info there are sites and reviewers that sometimes give very good advice about CPU/GPU pairings and the best combos.
Something I find infuriating about these benchmarks is that, while I find it important to eliminate bottlenecks, they don't tell the target market anything about how this product actually performs in their setups.
The testing methodology that everyone is using is designed specifically to find bottlenecks and exaggerate performance differences. It tells us nothing about real-world performance. The thing is, real-world performance numbers don't sell clicks or move products.
While eliminating bottlenecks is important, it is also important to measure performance when there ARE bottlenecks. I'm not saying stop testing this way altogether. The thing is, the way everyone is testing these chips leaves out valuable information; if people saw it, they might say, "Wow, I don't need this product. That money is better spent elsewhere."
I love building PCs as a hobby; I think this tech is cool and I'm happy it's around. But I see this in a similar way to cars as a hobby: go buy a Hellcat if you want one, but it isn't going to get you to work any faster. Websites are marketing this thing for AMD, and it's going to end up hurting consumers in the long run. People with limited budgets are going to look at this and say, "Well, I have to get this $500 CPU or my gaming computer won't work right." The fact of the matter is that many gamers out there will never own a GPU where the 9800X3D makes any difference in performance. Want more FPS? Buy a 7800X and a 4080 instead of a 9800X3D and a 4070.