Frankly, at this point the only people who still argue against 1080p CPU testing are Intel shills who will change their tune the moment Intel produces a better gaming CPU. It is the only explanation for the willful misunderstanding and mischaracterisation of the facts presented. It is getting into flat-earth territory; at some point you just have to walk away and let people keep their delusions, particularly when they aren't based in ignorance but in ideology.
Yet here I am, trying to find out whether I can play Star Citizen at higher frame rates with the new AMD versus my 14900KF. SC doesn't care much about the GPU unless you crank specific options, but it is CPU-bound at a level most games are not. And there is one thing my fellow reviewers never get: there are always games that are borderline unplayable at a given resolution, and while it doesn't really matter whether you play at 120 or 144 FPS, it matters a lot if you can gain 5 FPS when you are hovering around 60.
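To put a number on that: the same +5 FPS buys far more frame time near 60 FPS than near 144. A quick back-of-the-envelope in Python (the figures are illustrative, not measurements from any particular CPU):

```python
# Back-of-the-envelope: the same +5 FPS is worth far more frame time
# at 60 FPS than at 120-144 FPS, which is why small CPU gains matter
# most in borderline-playable, CPU-bound games.

def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

for base in (60, 120, 144):
    gain = frame_time_ms(base) - frame_time_ms(base + 5)
    print(f"{base} -> {base + 5} FPS saves {gain:.2f} ms per frame")

# Output:
# 60 -> 65 FPS saves 1.28 ms per frame
# 120 -> 125 FPS saves 0.33 ms per frame
# 144 -> 149 FPS saves 0.23 ms per frame
```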
Another issue with 1080p, one present even in the current article, is that it amplifies the differences well beyond what can reasonably be expected, and in RL this won't matter. (Sure, the "it will matter in 3-4 years" argument, blah, blah. Blah, because 3+ years from now these top CPUs will be obsolete tech, surpassed in games by mid-range CPUs.)
Say it's 30+% here, but in RL, at anything but the lowest resolution, you get half of that. Two years from now it would be the same, and even half of the benchmarks would be the same. Oh, you could apply some common sense and realize the difference is not that big and won't be, but do people really do that, or do they just remember that X was 30% faster than Y (more like 15% in reality, which they won't even know because the reviewers are too busy to care)?
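The mechanism is simple: observed FPS is capped by whichever component takes longer per frame. A toy model in Python makes the shrinking gap obvious; all frame times below are made-up round numbers, purely for illustration:

```python
# Toy model: effective FPS is limited by the slower of CPU and GPU
# per frame. A 30% CPU gap at 1080p shrinks as the GPU takes over
# at higher resolutions. Numbers are hypothetical, not benchmarks.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Effective frame rate when the slower component sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_FAST_MS = 7.7   # ~130 FPS if nothing else limits it
CPU_SLOW_MS = 10.0  # ~100 FPS if nothing else limits it

# Hypothetical GPU frame times, rising with resolution.
for res, gpu_ms in (("1080p", 5.0), ("1440p", 9.0), ("4K", 14.0)):
    fast, slow = fps(CPU_FAST_MS, gpu_ms), fps(CPU_SLOW_MS, gpu_ms)
    print(f"{res}: {fast:.0f} vs {slow:.0f} FPS (+{(fast / slow - 1) * 100:.0f}%)")

# Output:
# 1080p: 130 vs 100 FPS (+30%)
# 1440p: 111 vs 100 FPS (+11%)
# 4K: 71 vs 71 FPS (+0%)
```

That is exactly why the 1080p number is the right way to isolate the CPU, and also why quoting it without the higher-resolution context misleads buyers.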
I have 15 years as a reviewer at PCW, Computer Bild, hardwareBG, and various magazines, and I've been on the other side too (PR at a distributor). IMO it's laziness not to simply include the information that would render the whole argument moot.