I see a lot of text, a lot of speech... & I see inconsistent testing metrics. If you want to test CPU performance at 1080p, you don't use a GPU that's designed for 1080p resolutions (e.g. a GTX 1060)... you use a GPU that's designed to handle heavier resolutions like 1440p & 4K (e.g. a GTX 1080 Ti) to eliminate any potential limitations from the GPU.
I especially find those tests strange when they only show overclocked results for the chips. Even among gamers, the majority do not overclock their machines. We're talking performance comparisons here, after all. Comparing how a NOS-equipped Ford Mustang performs against a NOS-equipped Dodge Charger may be interesting, but without the baseline comparison (i.e. non-NOS-equipped) you have no idea if the NOS even added anything. Same with overclocking: a number of games benchmarked here on Techspot have shown the top-spot Core i7 seeing little to no performance gain even when OC'd well above its stock clock. So when I don't see any testing done at stock clocks (or even at modest overclocks), & I don't see any results showing how much effect the overclocking actually had, it's hard to take the numbers at face value.
Especially when the results themselves end up being so close. Sorry, but when the spread from "first place" to "last place" in your testing is only a 5-10% gap, & especially when that gap is ~5 FPS, it's really hard to claim that any one particular CPU is an "undisputed" winner. And considering that those Kaby Lake chips are 5 generations newer than the Sandy Bridge chips, but even when overclocked to 10% higher speeds can't manage a consistent 10% margin over them... exactly what kind of improvement is Kaby Lake providing over a 6-year-old product? Shaving 4W off the TDP? That's really impressive... & yes, you'd hear the sarcasm if we were speaking face-to-face.
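To put that margin math in perspective, here's a quick sketch (with made-up FPS numbers, not anyone's actual benchmark results) of how small a "winning" gap like that really is:

```python
# Minimal sketch of the margin math; the FPS values below are
# hypothetical, not taken from HardOCP's or Techspot's benchmarks.
def percent_gap(fast_fps: float, slow_fps: float) -> float:
    """Relative gap of the faster result over the slower one, in percent."""
    return (fast_fps - slow_fps) / slow_fps * 100

first_place = 105.0  # hypothetical average FPS of the "winning" CPU
last_place = 100.0   # hypothetical average FPS of the "losing" CPU

gap = percent_gap(first_place, last_place)
print(f"{gap:.1f}% gap, {first_place - last_place:.0f} FPS absolute")
# → 5.0% gap, 5 FPS absolute
```

A 5 FPS spread at ~100 FPS is well within what settings, drivers, or run-to-run variance can account for, which is the whole point.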
I also find the tests suspect, since they don't seem to match other benchmarks. And no, I'm not talking about Techspot's Ryzen benchmarks (or Tom's Hardware's, or any other site's)... I'm talking about the GPU benchmarks.
-- Take Battlefield 1, for example. The only place where their Intel Core i7 results seemed to match Techspot's benchmarks was the 4K test... except that HardOCP's GTX 1080 Ti was apparently running a lot slower, as its performance only matched Techspot's non-Ti model (https://www.techspot.com/review/1267-battlefield-1-benchmarks/page3.html); Techspot's GTX 1080 Ti, running on a slightly slower i7-7700K (4.9GHz vs. 5GHz), was 12% faster at 4K (https://www.techspot.com/review/1352-nvidia-geforce-gtx-1080-ti/).
-- Same thing at 1440p. HardOCP's GTX 1080 (paired with their OC'd 5GHz i7-7700K) ran much slower than Techspot's GTX 1080 (paired with an i7-6700K running at only 4.5GHz). About 17% slower, to be exact (https://www.techspot.com/review/1267-battlefield-1-benchmarks/page2.html). And HardOCP's GTX 1060 was also running significantly slower (about 13%), as was their RX 480 (about 16%).
-- It's even worse for DOOM. There's a 37% difference in performance at 1440p between HardOCP's & Techspot's benchmarks (& again, Techspot shows much better performance). Their GTX 1060 numbers were just as bad, running 40% slower than Techspot's.
Now, maybe something about the settings explains why HardOCP's performance numbers are so significantly lower than other benchmarks... but since they hardly gave any details about their benchmark system (unlike other sites, like Techspot...), it's harder to take their results at face value.