^ Good article. It's stuff like this that makes me read Techspot, not the recent "political focus". The addition of HT to the Pentiums has essentially turned them into $65 i3s, which really does shake things up a bit. If Ryzen lives up to the hype (and if sales are good), and if Intel keeps churning out +1-2% improvements per year, sooner or later they'll have to do a whole tier reshuffle (i.e., something like Pentium = 2C/4T, i3 = 4C/4T, i5 = 4C/8T, i7 = 6C/12T, HEDT = 8C/16T+). Given a $300+ i7-6700 is essentially double a $70 G4560 plus a clock boost (but costs nowhere near double to make in terms of die size), that's one hell of a premium for what brings +10% performance in two thirds of the games reviewed here on the GPU tier it's most likely to be paired with (GTX 1060), and less than half that with the 1050 Ti.
Thumbs up for using common sense about what dGPU a "budget build" is actually likely to contain (testing a 1050 Ti vs a 1060 vs a 1080 on low-end chips is something sorely missing from other budget CPU reviews). My only suggestion for improvement when reviewing low-end hardware is to add a "Medium" preset chart for the GTX 1050 Ti in gaming, since that's exactly how most real budget gamers play. E.g., who's going to play Shadow of Mordor at 37fps on Ultra when 74fps on High costs a quality "loss" of... what, exactly, for double the fps? A 35-80% boost in Crysis 3 out of thin air is great too. And in many indoor scenes of The Witcher 3, the +80% fps boost on Medium amusingly looks better than blurry-as-hell Ultra, thanks to badly overdone DoF turning everything not in the centre of the screen to mush (the guy's hair & shirt on the right, the wooden windowsill on the left, etc)...
AFAIK, PCGamer.com is the only site that can be bothered to test like this when reviewing low-end hardware, and 42fps -> 69fps (a +64% boost) averaged over 15 games is a nice gain for often minimal visual quality loss. E.g., BF1 64-player on Medium runs nearer 75-80fps on the G4560 (roughly a 60% improvement over the 45-50fps you tested at on Ultra). Similarly, in The Witcher 3 a G4560 will comfortably hit 70fps+ on an RX 470 / GTX 1060 on Medium, so it's clearly the 1050 Ti, not the CPU, that's the bottleneck at the highest presets.
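If anyone wants to sanity-check those percentages, here's a quick sketch (the boost() helper is just mine for illustration; the fps figures are the ones quoted above, not new benchmarks):

```python
# Rough sanity check on the fps boosts quoted above. The figures are the
# ones cited in this comment (the BF1 numbers use the midpoints of the
# 45-50fps and 75-80fps ranges); boost() is just an illustrative helper.
def boost(before_fps, after_fps):
    """Percentage fps gain from dropping the graphics preset."""
    return (after_fps / before_fps - 1) * 100

print(f"Shadow of Mordor, Ultra -> High:  {boost(37, 74):.0f}%")     # 100% (double)
print(f"PCGamer 15-game average:          {boost(42, 69):.0f}%")     # ~64%
print(f"BF1 64-player, Ultra -> Medium:   {boost(47.5, 77.5):.0f}%") # ~63%
```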
These days, pure "Ultra" presets are fairly meaningless for low-end budget builds, given many of us turn a lot of the "Smear Your Monitor With Vaseline Simulator" cr*p off anyway simply out of preference (I don't find that "Camera Lens Defect Simulator" (Chromatic Aberration), "Glaucoma Simulator" (Vignetting), "Migraine Simulator" (Film Grain), "Cataract Simulator" (Motion Blur), "Myopia Simulator" (DoF), etc, "enhance" anything at all). Likewise, competitive gamers still turn settings down even on high-end cards to improve the "signal to noise ratio": it's easier to spot sudden enemy motion when there are fewer environmental effects constantly moving about.
Just some rambling thoughts from someone who owns both a "high" gaming rig and a "low" HTPC (i3-6100 + GTX 1050 Ti), and who is constantly surprised at what the latter can do once all the "you have the worst and most f**ked up eyesight on the planet" post-processing "enhancement" cr*p of 'modern' game design is removed, or at least toned halfway back to reality...