What he said. Seriously. This is sooooo irritating.
I get it: there's (probably?) practically no difference between CPUs at this level, so it's boring to test. But because freaking EVERYBODY insists on running 1080p tests on high-end cards like the 2080 Ti, ignoring 4K and even 1440p, we just keep getting information that is irrelevant to practically anyone. The vast majority of people with truly high-end GPUs buy them to push truly demanding monitors: 16:9 1440p is pretty much the MINIMUM, and 21:9 1440p (or even 32:9, in my case!) or 4K is more realistic for the kind of person who buys that kind of card.
Actually, I wrote "irritating," which it is for those of us who have been following this stuff for decades. But continually seeing tests like these creates a misleading impression, particularly for folks who are a little newer to the scene. If I were 23 and ready to turn that first serious paycheck into a seriously nice rig, I'd be ill-served overpaying for an Intel CPU because I thought it would improve my FPS by 10%, when it actually might not improve my FPS at all and would definitely slow me down if I wanted to, say, stream or edit video.
By the way:
Sure, there ARE some articles out there that compare CPUs with high-end GPUs at high resolutions. But they are few and far between.
Of course there are competitive gamers out there who run at 1080p. But esports titles can generally hold 240 FPS on a 2060 Super. Unless you're a pro being paid to use top-tier gear, even a 2080 Super is silly for that use case.
Of course there are more powerful GPUs on their way. But it's not like games hold still in terms of their demands on GPUs; otherwise we'd all be playing something that looks like Half-Life 2 at 3000 FPS. You lust after the most powerful available GPUs because games keep getting more demanding. I'm running a 1080 Ti on a 32:9 monitor at 1440p, and I've decided to put off buying RDR2 until I can upgrade. Why? Because my old 1080 Ti is too weak to handle it at the kind of settings I'd like to enjoy.