GreenNova343
Posts: 454 +336
The Intel i5-8400, paired IRL with a GTX 1060 3GB or RX 570 4GB, would not perform noticeably better in gaming. Using a 1080 Ti creates an unrealistic and misleading impression. The same could be said for the K versions of the higher-priced Intel CPUs. We know some games favor one brand or the other, but that difference becomes slimmer when configurations match what gamers could afford and would actually buy.
Of course there would be little to no performance difference...because then the GPU becomes the logjam in the data flow.
TechSpot, AnandTech, Guru3D, Tom's Hardware...most legitimate benchmarking sites will try to limit the potential variables in their testing as much as possible. They can't do much about Intel & AMD CPUs not being able to use the same motherboard, but they'll keep the rest of the hardware identical (or close to it). When it comes to CPU vs. GPU testing, however, they make sure to use the fastest available example of the component not being tested, to avoid any potential logjams, bottlenecks, or other slowdowns that could skew the results. Not to mention that they keep the testing at 1080p or lower resolutions, because at those resolutions a high-end GPU will not be the limiting factor.
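The logic above can be sketched as a toy min() model: a frame isn't finished until both the CPU and the GPU have done their work, so the slower side sets the frame rate. All the numbers below are made up purely for illustration, not real benchmark results:

```python
def effective_fps(cpu_fps, gpu_fps):
    # The slower component caps the frame rate: the CPU prepares frames
    # (simulation, draw calls), the GPU renders them.
    return min(cpu_fps, gpu_fps)

# Hypothetical CPU ceilings (frames each CPU could prepare per second)
cpus = {"R5 2600": 140, "i5-8400": 150}

# Hypothetical GPU ceilings at 1080p
gpus = {"GTX 1060 3GB": 90, "GTX 1080 Ti": 200}

for gpu_name, gpu_fps in gpus.items():
    for cpu_name, cpu_fps in cpus.items():
        print(f"{cpu_name} + {gpu_name}: {effective_fps(cpu_fps, gpu_fps)} fps")
```

With the mid-range GPU, both CPUs land on the same 90 fps (GPU-bound, so the CPU difference is invisible); with the 1080 Ti, the CPU ceilings show through, which is exactly why reviewers pair the fastest GPU with the CPUs under test.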
Hence...since they're comparing CPU performance (R5 2600 vs. other CPUs), they limited the testing to 1080p & used a GTX 1080 Ti. Seems like the billionth time that this has had to be explained by the millionth person...