4 Years of Ryzen 5, CPU & GPU Scaling Benchmark

For your future articles regarding CPU/GPU scaling, could you add FPS/$ to the graphs?
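A quick sketch of what I mean, assuming you pair each chart's average FPS with a launch or current street price (the CPU names, prices, and FPS figures here are just placeholders, not numbers from the article):

```python
# Minimal sketch of an FPS-per-dollar metric for a CPU scaling chart.
# All CPU names, prices, and FPS values below are placeholder examples.

avg_fps = {
    "Ryzen 5 1600X": 88,   # hypothetical multi-game average
    "Ryzen 5 2600X": 104,
    "Ryzen 5 3600X": 121,
}
price_usd = {
    "Ryzen 5 1600X": 249,  # assumed launch MSRP; street price works too
    "Ryzen 5 2600X": 229,
    "Ryzen 5 3600X": 249,
}

for cpu, fps in avg_fps.items():
    value = fps / price_usd[cpu]  # frames per second per dollar
    print(f"{cpu}: {fps} fps avg, {value:.2f} fps/$")
```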

Thank you Steve!!
 
I mostly agree with @NightAntilli. Before trying to sound like understanding nice guys, we have to remember that Steve often, seemingly always, does unnecessary work. For example, in this test he could have dropped the 4K results altogether, or alternatively run them with only the RTX 3090. It's debatable whether the 1440p results even needed to be run on every GPU.

My point is that planning your testing carefully beforehand gives better results and often less work. Steve might be the king of benchmark runs, but not the king of the best possible data. I obviously appreciate his work, but I'd just like to see him be more clever and actually save himself some time.

I'm eagerly waiting for the Intel data, and again, you could either drop the i7-8700K or the i5-9600K (if that is the selected performance tier), since those CPUs are so close to each other in gaming performance, or even drop the i5-10600K and just wait for the i5-11600K instead. The point is that there is no sense in testing similarly performing products, so choose wisely.

Yes, most of us know that at 4K the GPU is the deciding factor for frame rates. The point is that these charts hammer it home for new readers thinking "oh, maybe another CPU will make a difference."
But the brick-wall charts will make an impression on even the most clouded mind. Plus, I think you already knew roughly what the results would be across the board.
 
"We know for example that the Ryzen 5 1600X has aged better than the Core i5-7600K"

They came out the same year. The 7600K was first. The 7600K destroyed the 1600X for a few years.
If you had the 1600X when it came out, you were slower. Much slower.

So by the time the 1600X was faster than the 7600K, the 7600K user would probably be upgrading anyway. Even if it was just to the 7700(K), that would be faster than the 1600X. Much faster.

If you bought the 1600X on launch day, you had inferior performance. Why would anyone do that, even IF they thought it would eventually be faster than a quad-core down the road? Makes no sense whatsoever.
 
"We know for example that the Ryzen 5 1600X has aged better than the Core i5-7600K"

They came out the same year. The 7600K was first. The 7600K destroyed the 1600X for a few years.
If you had the 1600X when it came out, you were slower. Much slower.

So by the time you were faster than the 7600K, the 7600K user would probably be upgrading by then. Even if it was just to the 7700(K), that would be faster than the 1600X. Much faster.

If you bought the 1600X on launch day, you had inferior performance. Why would anyone do that, even IF they thought eventually it would be faster than a quad core down the road? Makes no sense whatsoever.
Only for gaming, and not by much. Within a year the 1600X improved hugely thanks to multiple platform updates, Windows updates and game updates. And with the 7600K you were stuck on a dead, abandoned platform; the 7700K isn't much of an upgrade since it's still a 4-core CPU (Intel had to rush out the 8000 series). Upgrading to a 2600X or 2700X is much better, and if your motherboard got the BIOS update you could even go for the 3600X.

The 1600X essentially destroyed the 7600K in multithreading, by as much as 2-3x in some tests (it was faster than even the 7700K), and only lost by single-digit margins in gaming. And I'm talking about launch benchmarks, not the later ones.
 
Those are some pretty amazing differences, but it seems that if you don't have a high-refresh monitor, the 2600X is as good as it gets, with a minimum of 61 fps in Cyberpunk 2077.
 
This is by far the most useful article I've read recently. Probably uniquely informative in showing some "real world" GPU/CPU scaling, it has saved me from fruitlessly upgrading my CPU for gaming. By that I mean I'll be sticking with my Ryzen 5 2600 and Vega 56 setup until GPU prices return to something sensible.
Many thanks for this.
 