Pairing CPUs and GPUs: PC Upgrades and Bottlenecking

The series of unconventional tests like this one and the older GPU tests is what makes Techspot stand out from the crowd of tech sites. Very well done!

It's a shame you didn't throw an RX 580 and/or a Vega card in there to see if there is a difference in the way AMD and Nvidia cards react to CPUs, though.

Also, this seems like very good news for the upcoming 4-core 4GHz i3 (they should have called it i4, imo). And it just goes to show how much modern games rely on their GPU.
 
Nice article which yet again reiterates why the best CPU reviews include a mid-range 1060 / RX480 card in addition to top-end (plus 1% lows in addition to just avg) and why Techspot is increasingly the "go-to" for common sense reviews of mid-range hardware.

The only thing I would have added is perhaps a Medium/High test just for the lowest GPU/CPU combo (G4560 + 1060), just to see which Ultra settings impact the CPU as well as the GPU. Scaling GPUs downward alone normally caps all CPUs to an equal level, but dialling back settings a little as well (how budget gamers often play) often has an impact on both CPU and GPU. Even for those not starved of performance, this scaling information is useful: many of us turn off half of the usually wildly overdone "Pure Ultra" effects like chromatic aberration, motion blur, film grain, "lens dirt", vignetting, etc, anyway simply out of aesthetic preference, and often find slightly lower than expected CPU bottlenecks on the same hardware vs "Pure Ultra" preset benchmarks as a result.
 
For me, the glaringly obvious omission is the Ryzen 1600.

Scores at stock and overclocked would have been a great guide to whether this CPU is the price/performance winner.
 
Ryzen 5 1600 & GTX 1070 is as good a bang-for-your-buck combination as you can buy, end of story.
 
Ryzen 5 1600 & GTX 1070 is as good a bang-for-your-buck combination as you can buy, end of story.

You know this, I know this. Casual browsers might not know it, as it doesn't appear in the article. The graphs make it look like the locked i5-7400 is the next step up from the R5 1400, which is clearly nonsense.
 
Ryzen 5 1600 & GTX 1070 is as good a bang-for-your-buck combination as you can buy, end of story.

You know this, I know this. Casual browsers might not know it, as it doesn't appear in the article. The graphs make it look like the locked i5-7400 is the next step up from the R5 1400, which is clearly nonsense.

This isn't the point of the article. We've already provided in-depth content comparing the 7700K and R5 1600. We explained quite clearly what we aimed to show with this article.
 
I disagree with part of the future-proofing conclusion. Saving $100 on a CPU to match it to the GPU would have been foolish at any point in the last decade. A CPU needs a motherboard, and with Intel's frequent socket changes and lack of compatibility between generations, upgrading a CPU can be expensive. A video card, on the other hand, is purely plug and play, so it's better to have a high-end CPU and just upgrade the GPU when games need it.

For instance, how old is a 2500K? Early 2011? With a small OC it can still handle recent video cards. That's nearly 7 years on the same CPU. The potential $100 saved 7 years ago on a lower-spec CPU that wouldn't have been up to the task 3 years ago would have cost you a new motherboard and CPU 3 years ago. False economy.

Yes, my 7-year-old system needs an upgrade, but it's still mainly the GPU that is slow (the bottleneck), and that's been upgraded twice.
 
I disagree with part of the future-proofing conclusion. Saving $100 on a CPU to match it to the GPU would have been foolish at any point in the last decade. A CPU needs a motherboard, and with Intel's frequent socket changes and lack of compatibility between generations, upgrading a CPU can be expensive. A video card, on the other hand, is purely plug and play, so it's better to have a high-end CPU and just upgrade the GPU when games need it.

For instance, how old is a 2500K? Early 2011? With a small OC it can still handle recent video cards. That's nearly 7 years on the same CPU. The potential $100 saved 7 years ago on a lower-spec CPU that wouldn't have been up to the task 3 years ago would have cost you a new motherboard and CPU 3 years ago. False economy.

Yes, my 7-year-old system needs an upgrade, but it's still mainly the GPU that is slow (the bottleneck), and that's been upgraded twice.

Sandy Bridge is pretty much the only example we know of in PC history, so it's not exactly the norm. That said, the 2500K was a step down from the 2600K. You do miss out on a number of platform features with Sandy Bridge, though, so keep that in mind. Still, there's no denying the 2500K is a beast, but had the 8-core FX CPUs not been a complete disaster it might have been a different story today. Hell, we'd be talking about how future-proof the FX series was if that were the case. Imagine that (I can't).
 
Sandy Bridge is pretty much the only example we know of in PC history, so it's not exactly the norm. That said, the 2500K was a step down from the 2600K. You do miss out on a number of platform features with Sandy Bridge, though, so keep that in mind. Still, there's no denying the 2500K is a beast, but had the 8-core FX CPUs not been a complete disaster it might have been a different story today. Hell, we'd be talking about how future-proof the FX series was if that were the case. Imagine that (I can't).

Yes, I could really do with a new system/GPU. I believe you are giving some parts away, hint hint :)

I've been getting the best CPU I can afford and upgrading the GPU as a half-life upgrade for well over a decade. It's saved me heaps compared to upgrading every 2-3 years as I was doing. Sandy Bridge was so good it got 2 GPU upgrades.
 
I am continually amazed that my 7-year-old i7-2600K (now moved to a secondary PC) is still not only capable of playing modern games well but does so better than the newest generation of consoles. Until a console generation beefs up its CPU power considerably, the PC CPU market *for games* is likely to remain stagnant.

In some ways I almost regret moving from a 2600K to my current 5930K a couple of years ago. At the time I did it because it was helpful for the SLI setup I was running with Titan X cards, but now that I'm down to a single 1080 Ti at 4K I'm pretty sure there would be zero difference between it and my old 2600K.
 
Where are the Sandy Bridge results? The system is listed at the beginning but doesn't appear further in the article, except on the overclocking page...

I have a 2600K paired with a 1080 Ti on a 165Hz monitor. The majority of games the 1080 Ti can run at those refresh rates suffer from a good amount of CPU bottlenecking. Some games I'm supposed to hit 150fps in with a 7700K (according to online benchmarks) reach only 100-110fps with my 2600K.
 
Steve said:
Again, the Ryzen 5 1400 is $160, while a B350 motherboard will set you back $70, a total of $230 or almost half the price, and you'll get about the same performance out of the GTX 1070 using AMD's processor, leaving you with ~$220 for future upgrades.
I tried to pore over the Ryzen 1400 review you did but missed the temps page, so I have to extrapolate from the other Ryzen 5 review:

While the Intel CPU requires a new cooler, doesn't the 1400 run near 90 degrees when OC'ed? I would think both CPUs would want an aftermarket cooling solution for long-term stability and durability.
 
It would have been nice to see the i5-7600K instead of the i5-7400, but what I mainly take away from these tests are three results:

The Intel Pentium delivers 55+ FPS in every scenario (often more)
The OC'd i5-2500K still holds up and offers close to i5-7400 performance
Arguing over modern CPUs for gaming is like arguing over the performance difference of premium vs regular gas in a 110hp 4-cylinder Honda Civic
 
Steve, you really got me thinking with your advice: "better off looking at what hardware combos deliver the best value in the games you currently play."

Scott Wasson wrote a deeply insightful article (sadly now over 5 years old) in which he proposed a GPU 'jitter' index based on the time to render/display individual frames ("Heck, we could probably even create a "jitter index" by looking at the difference between the 50th and 99th percentile frame times.")
https://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking

I think a GPU jitter index would be really neat... possibly 'lower is better': 99th percentile frame time minus 50th percentile frame time, divided by the 99th percentile frame time. Has anyone seen something like this in the benchmark utilities?
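In case it helps picture it, here's a minimal sketch of that formula in Python (purely illustrative: the nearest-rank percentile method and the function names are my own assumptions, not taken from any existing benchmark utility):

```python
# Proposed "jitter index": (99th percentile - 50th percentile) / 99th percentile
# of per-frame times in milliseconds. Lower means more consistent frame delivery.

def percentile(values, pct):
    """Nearest-rank percentile of the data."""
    ordered = sorted(values)
    k = round(pct / 100 * (len(ordered) - 1))
    return ordered[max(0, min(len(ordered) - 1, k))]

def jitter_index(frame_times_ms):
    p50 = percentile(frame_times_ms, 50)
    p99 = percentile(frame_times_ms, 99)
    return (p99 - p50) / p99

# Toy example: a run that is mostly ~16.7 ms with a few 30 ms spikes.
frames = [16.7] * 95 + [30.0] * 5
print(f"jitter index: {jitter_index(frames):.2f}")  # ~0.44 for this toy data
```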

All of which led to another speculation... is it possible that capping the delivery of frames could result in a much smoother visual experience? Would a GTX 1050 Ti delivering frames paced to 35 milliseconds each provide a more enjoyable experience than a GTX 1060 swinging from 15 milliseconds to 30 milliseconds?
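For what it's worth, here's a minimal sketch of that capping idea (again illustrative Python, not how a real limiter such as RTSS or an in-engine cap is implemented; the 35 ms budget and function names are just assumptions for the example):

```python
import time

FRAME_BUDGET = 0.035  # 35 ms per frame, i.e. ~28.6 fps, evenly paced

def render_frame():
    # Stand-in for real render work that takes a variable amount of time.
    time.sleep(0.015)

def run(frames=10):
    deadline = time.perf_counter() + FRAME_BUDGET
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        # Sleep away whatever is left of the 35 ms budget so frames are
        # delivered at an even cadence, even when the GPU finishes early.
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        print(f"frame time: {(time.perf_counter() - start) * 1000:.1f} ms")
        deadline += FRAME_BUDGET

if __name__ == "__main__":
    run()
```

The even 35 ms cadence is exactly the trade-off being asked about: you give up the faster frames in exchange for avoiding the 15-30 ms swings.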

Of course, I may be so deeply in the past (result of over 50 years of computing) that I am missing something new (last decade or so).
 
I would like to see another CrossFireX/SLI comparison with various low-, mid- and high-end CPU/GPU combos pushing them: 1080p at 60Hz and 100Hz+, 2K/1440p/1600p at 60Hz and 100Hz+, and 4K at 60Hz.
 
"Overall, our advice is to determine your performance expectations for a given title or application and then find the best value hardware combination for your budget, which is easier said than done sometimes..." OUCH.

That's task specific optimization and can only be done AFTER the combo is installed - - catch-22.
A runner trains for a specific event; 440, mile, 10k, marathon while most of us just want a good 'general solution' that will perform as best possible in all cases, much more like the decathlon.
 
"Overall, our advice is to determine your performance expectations for a given title or application and then find the best value hardware combination for your budget, which is easier said than done sometimes..." OUCH.

That's task specific optimization and can only be done AFTER the combo is installed - - catch-22.
A runner trains for a specific event; 440, mile, 10k, marathon while most of us just want a good 'general solution' that will perform as best possible in all cases, much more like the decathlon.

Nah, you use this thing called the Internet, it's very cool, check it out sometime. Anyway, on the Internet you can research basically anything, actually, dare I say it, 'anything'.

If you're a content creator and you use Premiere and Audition, there are benchmarks which show you what those applications benefit from.

If you're a StarCraft 2 addict like me, you can find what that game requires and build your system accordingly. Or if you like games such as CS:GO, BF1, Overwatch and so on.
 
@Steve
There's a professional career called Capacity Planning, where the workloads are defined and the resources that will handle them are gathered (either as a single machine or a cluster) -- so sorry, I come from the commercial side of computing, not gaming.
 
@Steve
There's a professional career called Capacity Planning, where the workloads are defined and the resources that will handle them are gathered (either as a single machine or a cluster) -- so sorry, I come from the commercial side of computing, not gaming.

That changes nothing about your original comment, by the way, or mine. Having said that, I think the focus of this article (the one you are commenting on) was on gaming, the giveaway being that there were 9 tests and they were all based on games ;)
 
@Steve
There's a professional career called Capacity Planning, where the workloads are defined and the resources that will handle them are gathered (either as a single machine or a cluster) -- so sorry, I come from the commercial side of computing, not gaming.

What in the article made you think it was about commercial computing rather than gaming?
 