I find this website highly deceptive. I'm commenting mainly to warn the newer people who might not see what you're doing. I've been in this game for close to 30 years.
This article was written with half-truths and outright lies.
The article and HU's videos on this topic are so full of them that I can't see where it would end if I tried to address them all. I'm not sure if it's ivory-tower syndrome or Steve just trying to cover his rear, but the facts are that HU are wrong in their arguments against higher-res benchmarking, and they're using wordplay and other disingenuous tactics to steer perceptions away from looking at the topic clearly and towards looking at only one part of the picture and assuming it discounts the rest. It's already making people more tech-ignorant: I've seen people arguing in higher-res benchmark videos that the videos are wrong because anything other than 1080p testing is GPU-limited and pointless - people who clearly lack basic knowledge of the topic and aren't very open to correction, because their chosen authority figure insisted otherwise. So, Steve and HU are damaging naive people's understanding and making it harder to properly evaluate and discuss hardware.
People who dogmatically take HU's arguments (which are mostly nonsense) as truth end up playing the role of pseudo-intellectuals: because they've understood one part of a picture, they think they can judge anyone saying something further to be wrong - even though those people say what they say not because they don't also understand the low-res-testing part of the picture, but because they understand more than just that part. HU's videos on this topic are serving as litmus tests for whether a person has any critical thinking skills or just blindly and mindlessly parrots whatever their chosen authority figure says.
In other words, HU have created a mass of Dunning-Krugers who feel a false sense of superiority for essentially being confidently wrong. And if those at HU truly believe the arguments they're making (that higher-res benchmarks are useless, that they don't inform unless you have the exact same system, that they don't fall under the category of a CPU review, and so many others), then they're a bunch of Dunning-Krugers themselves.
The fact is that both academic (low-res, unbound) CPU testing and practical (typical-res, likely GPU-limited) testing are part of a proper CPU review. But people are being misled by HU pigeon-holing the topic into a false dichotomy of either low-res testing or high-res testing, and into treating it as a question of whether 1080p benchmarking makes sense - which isn't the question. Yet Steve is fixated on justifying 1080p benchmarking as though that alone settles the matter, arguing with himself over things that are beside the point, and dismissing his detractors with strawman arguments.
Just one example, from Steve's article here:
We get why some readers prefer what's often referred to as "real-world" testing, but in the context of CPU benchmarks, it doesn't actually provide the insight you might think – unless you plan on using that exact hardware combo under those specific conditions.
All of that is false. The 4k Balanced benchmarks Steve used in his article don't reflect the alleged "real-world" testing, per the surveys he did which showed 4k Quality users represent more than double both 4k Balanced and 4k Performance users *combined*, and per the 1440p survey which shows that native 1440p is actually the most "real-world" demographic.
And saying that "real-world" benchmarks provide insight only if you're running the exact-same hardware and settings combo is as daft as saying that any benchmarks, including 1080p unbound CPU benchmarks, are only insightful if you're running an identical configuration. It's such a nonsensical claim that it's stunning to hear from a well-regarded tech reviewer.
Higher-res CPU benchmarks provide exactly the insight I think they do, and which I see many others drawing from them. And that insight is most definitely not contingent on users running the exact same hardware combo under those specific test conditions. If an RTX 4090 throttles a CPU's performance at native 1440p max settings so that there's only a ~3% average performance difference (using TechPowerUp's aggregate) versus a 7800X3D, then I immediately know that anything less than an RTX 4090 will yield an even smaller difference - and I can infer how much smaller based on the performance spreads between GPUs. I can also reasonably infer how the result changes if I adjust settings to take some of the load off the GPU. I can ballpark quite well what performance to expect from my own system, or one that shares the CPU but is otherwise customized. And I know that enabling DLSS upscaling will only improve the result, and I can make an educated guess, or use DLSS comparison videos for specific games, to gauge how much it improves the FPS.
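To make that inference concrete, here's a back-of-the-envelope sketch (Python, with made-up numbers purely for illustration - not HU's or TechPowerUp's data) of the simple bottleneck model behind it: delivered FPS is roughly the lower of the CPU's frame-rate ceiling and the GPU's frame-rate ceiling, so a weaker GPU can only shrink the visible gap between CPUs.

    # Rough bottleneck model: the FPS you see is capped by whichever of the
    # CPU or GPU runs out of headroom first. All numbers are invented for
    # illustration - they are not anyone's benchmark results.

    def delivered_fps(cpu_ceiling, gpu_ceiling):
        # The delivered frame rate is roughly the lower of the two ceilings.
        return min(cpu_ceiling, gpu_ceiling)

    # Hypothetical CPU frame-rate ceilings (what each CPU could push with an
    # infinitely fast GPU), as you'd estimate from low-res "unbound" testing.
    cpu_a = 200  # faster CPU
    cpu_b = 170  # slower CPU

    # Hypothetical GPU frame-rate ceilings at native 1440p max settings.
    for gpu_name, gpu_ceiling in [("flagship GPU", 175), ("mid-range GPU", 120)]:
        fps_a = delivered_fps(cpu_a, gpu_ceiling)
        fps_b = delivered_fps(cpu_b, gpu_ceiling)
        gap = (fps_a - fps_b) / fps_b * 100
        print(f"{gpu_name}: CPU A {fps_a} fps, CPU B {fps_b} fps, gap {gap:.1f}%")

    # flagship GPU: CPU A 175 fps, CPU B 170 fps, gap 2.9%
    # mid-range GPU: CPU A 120 fps, CPU B 120 fps, gap 0.0%

The exact numbers don't matter; the point is that the higher-res result with the fastest GPU bounds what any weaker GPU can show, which is exactly the kind of inference I'm describing.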
In short, "real-world", higher-res benchmarks do actually provide a large amount of insight into what a tested CPU means to me, without me needing to have the same other specs in the benchmark. How is it possible for a tech reviewer to hold the ridiculously-false and oblivious contrary belief?
So, either Steve is a Dunning-Kruger, or he's being wilfully dishonest and obtuse. Either way, he shouldn't be making condescending appeals to authority as he tells people they're wrong and he's right, when he isn't even playing in the ballpark.
As I said in my first post in this thread, I'm not set on seeing HU do higher-res testing. What bothers me is the continued intellectual dishonesty from HU's corner on this matter, and Steve's condescension as he uses fallacious, misrepresentative arguments to talk down to people and frame them as not getting it while he's right, when in reality many of them are right and he's wrong.
BTW, HU's latest video on this topic, the one that's 14 minutes long, is again rife with nonsensical assertions. One of the claims made this time is that CPU reviews are not upgrade guides. If they're not upgrade guides, then I wonder why HU includes price-to-performance charts in their CPU reviews. In the real world, CPU reviews are quite literally buyer's guides; that's the central reason they exist in the first place. They exist to communicate technical and other information about CPUs so that the people watching can become informed and figure out whether it's in their interest to have one. And if data showing how a CPU performs at its target market's typical-use resolutions is omitted, then it's failing as a CPU review. Part of objectively measuring what a CPU can do is measuring what it can do when placed in its typical-use environments (such as resolutions) - that data shows what difference the CPU, as the isolated factor, can actually make in that environment, and thus informs prospective buyers whether it's worth buying.
HU could just say that they don't want to do higher-res testing, and that would be a lot more valid than claiming CPU reviews are not buying guides as an excuse for omitting contextual information that's important to the buying decision. The latter route is both weak and false. What HU are doing by trying to dictate what "Review" means is like somebody who doesn't know what a word means making assumptions about its meaning, and then lecturing everybody else that they're wrong if they use the word in a way that doesn't conform to that completely arbitrary, actually-wrong personal assumption. And as with HU's claims about higher-res benchmarking, that argument is also spreading ignorance.