In this explainer, we use data to show why testing CPU performance with a GPU bottleneck is a bad idea, so you can fully understand our testing methods when we review CPUs.
https://www.techspot.com/article/2618-cpu-benchmarks-explained/
The sad thing is, no matter how many times you explain it, the knuckle-dragging m0r0ns will still reply with "why would you test low end CPUs with a 4090" and "who uses a 4090 at 1080p". Steve's got one hell of an uphill fight to make these people see the light.

Impeccable article, even going out of your way to test and show the useless data. Expect this article to be linked in most comment sections of hardware reviews.
Depends on if you change monitor/resolution as well.

I've been making the exact argument this article highlights for years. And I'm speaking as someone still rocking his 8700K paired with a 3080Ti (previously a 1070Ti, so close to exactly the case this article highlights).
You isolate the performance of the specific part you are trying to test, and for CPUs in gaming the best way to do that is low resolution with a powerful GPU, to see what the real difference is when the GPU is removed as a bottleneck. Not realistic for today, but gives you an idea how well parts will last tomorrow.
I'm also surprised users are doing three GPU upgrades per CPU; I guess most here are upgrading every GPU generation or so. I tend to do two-generation GPU upgrade cycles (e.g., 1070Ti to 3080Ti), then junk the build once the second CPU/GPU combo becomes functionally obsolete, usually 4-5 years into the build. I then build around the best CPU on the market and go again.
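A tiny sketch of the bottleneck point above, with made-up frame times (an illustration, not data from the article): a frame takes roughly as long as the slower of the CPU work and the GPU work, so shrinking the GPU's per-frame cost by dropping the resolution is what exposes the real CPU gap.

```python
# Illustrative only: hypothetical frame times, not measured data.
# Simple bottleneck model: each frame costs roughly max(CPU time, GPU time),
# so whichever component is slower sets the frame-rate ceiling.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Effective FPS when the slower component gates each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpus = {"fast CPU": 4.0, "slow CPU": 8.0}                  # ms of CPU work per frame (resolution-independent)
gpu_ms_by_res = {"1080p": 5.0, "1440p": 9.0, "4K": 16.0}   # ms of GPU work per frame (grows with resolution)

for res, gpu_ms in gpu_ms_by_res.items():
    line = ", ".join(f"{name}: {fps(cpu_ms, gpu_ms):.0f} fps" for name, cpu_ms in cpus.items())
    print(f"{res}: {line}")

# 1080p: fast CPU: 200 fps, slow CPU: 125 fps   <- CPU difference fully visible
# 1440p: fast CPU: 111 fps, slow CPU: 111 fps   <- GPU-bound, gap hidden
# 4K: fast CPU: 62 fps, slow CPU: 62 fps        <- GPU-bound, gap hidden
```

With these made-up numbers the two CPUs are 60% apart at 1080p and indistinguishable at 1440p and 4K, which is exactly the kind of "useless data" the article goes out of its way to show.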
It's a CPU, I repeat, CPU, I'll repeat again, CPU test. As per this article you're commenting on, which has data to prove it, increasing the resolution makes the GPU the bottleneck, not the CPU, so you're no longer benchmarking a CPU.
As a car reviewer, are you only going to review how well a Jeep does offroad? Only test how well sports cars go in a straight line?
You want to remove bottlenecks. Cool. Do it. 1080p. But....
I think what initiated this was 1080p ONLY testing results with "overkill" hardware. One comment in the first image in this article is complaining about exactly that, and that's what I had a big problem with. Those are all the people that have no use for 1080p ONLY results. Those are the keywords here: 1080p ONLY.
It's a review, right? Test the more unlikely configuration to show best performance, but also test the more likely scenario especially for that type of CPU and GPU combo by upping the resolution my man! There is no need for guesswork. Tim doesn't even touch 1080p monitors in his MUB reviews. We're getting excited about higher refresh rates almost weekly. OLED. HDR. 4K. 8K. 1440p+ is just where the industry is going. Omitting it just doesn't make sense.
Aside from that, I also had the idea of a tech site eventually throwing a popular mainstream build into the mix (included in the review or a separate video series), with say an i3/i5/R5 + 16GB 3200MHz CL16 / DDR5 6000MHz paired with new low to high end GPUs, and vice versa. Rigs closer to what the majority actually have. I feel it would help a lot of people with upgrading, in addition to standard reviews.
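Purely as an illustration of what that suggestion would entail, here is a toy enumeration of the proposed "mainstream build" matrix. The part names and resolutions are placeholders lifted from the comment above, not a test plan any reviewer has committed to.

```python
# Illustrative only: enumerate the hypothetical mainstream-build test matrix
# suggested above. Part names and resolutions are placeholders, not a real plan.
from itertools import product

mainstream_cpus = ["i3", "i5", "R5"]                       # paired with 16GB DDR4-3200 CL16 or DDR5-6000
gpus = ["low-end GPU", "midrange GPU", "high-end GPU"]     # stand-ins for a current lineup
resolutions = ["1080p", "1440p", "4K"]

configs = list(product(mainstream_cpus, gpus, resolutions))
print(f"{len(configs)} CPU/GPU/resolution combinations per game")
for cpu, gpu, res in configs[:3]:                          # show a few example rows
    print(f"  {cpu} + {gpu} @ {res}")
```

Even this toy version comes out to 27 benchmark passes per game before memory configurations are varied.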
As a car reviewer, do you go to offroad testing videos and complain that there is no sports car, or that they didn't do a highway test? After all, those are really important, and lord knows NOBODY has EVER done a highway test of a Jeep.
Or would that be a really ignorant thing to do?
Because that is what you are doing right now, asking "but why only 1080p" while drooling because you can't read the title of the video. If you want higher rez results, they are provided on every single GPU test. If you bothered to Read The ****ing Article you would know this. Reviewers won't include a midrange test bench because then people like you would bleat out "ok but why are you not including the high end CPU tests that would be able to show the best performance and be a more likely scenario", just like you are doing here.
Did you actually read the article? I feel like you didn't read it.

Everything I suggested was in addition to what is already being done and I stand by all of it.
Most reviewers were already doing 1440p in CPU tests, because it made sense. Why are people against having more data? Are you implying anything beyond 1080p in a CPU test is useless when most wouldn't be using flagship hardware at such a low res? WTF am I supposed to do with data that is the furthest from my own setup, which is closer to the majority? Praise X company for their tremendous achievement at a level I have no interest in? Reviewers can do what they want, but asking to add one more resolution shouldn't be treated as something a noob would say.
Well, a couple of things from my foxhole:
1. If you actually maxed the settings out completely, rather than High/Very High, the gap would narrow further.
2. The average PC gamer isn't chasing frames, and in the current economy probably cares more about bang for buck. Chasing frames is great for the minority who do so, but the mass of PC gamers aren't doing that.
3. A 2008 i7 920 can still deliver scaling at max settings @ 3440x1440 even with a 3080Ti, and can even hover around 60 FPS average with a 1080Ti. So someone still using an i7 920 can stick to a 2018 GPU and be fine.
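Claims like point 3 are easy to sanity-check on your own machine by benchmarking the same CPU/GPU combo at two resolutions: if the frame rate barely moves when you drop the resolution, the CPU (or the game engine) is the limiter rather than the GPU. A minimal sketch of that check follows; the 10% threshold and the numbers are made up for illustration.

```python
# Hypothetical check: classify which component limited a run by comparing
# average FPS for the same system at a low and a high resolution.
# The 10% threshold is an arbitrary illustrative choice, not an established rule.

def likely_bottleneck(fps_low_res: float, fps_high_res: float, threshold: float = 0.10) -> str:
    gain = (fps_low_res - fps_high_res) / fps_high_res
    if gain < threshold:
        return "CPU-bound: dropping the resolution barely changes FPS"
    return "GPU-bound at the higher resolution"

# Made-up numbers in the spirit of the i7 920 + 3080Ti anecdote above:
print(likely_bottleneck(fps_low_res=88, fps_high_res=85))    # CPU-bound
print(likely_bottleneck(fps_low_res=210, fps_high_res=120))  # GPU-bound at the higher resolution
```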
3 GPU upgrades per CPU upgrade seems a bit above normal. It is easier to add a new GPU whereas a CPU upgrade could entail new Mobo and memory in some cases. But, still. I upgrade a GPU about every 3-4 years, system upgrades around the 5-6 yr mark. And now with GPU prices being so ridiculous, I'll bet a lot of people are going to hold cards longer. So 2 upgrades, yeah I could see that. To be fair, I know some people upgrade parts every year but I think we know those are not the majority of users.