4 Years of Ryzen 5, CPU & GPU Scaling Benchmark

That is a lot of work to find out that if you game at 4K it doesn't matter which CPU you are using, as they all perform the same.
So if you are a gamer, the 5600X is certainly a bad choice over the 3600X, or even the 1600AF, which costs a third of the price.
 
That is a lot of work to find out that if you game at 4K it doesn't matter which CPU you are using, as they all perform the same.
So if you are a gamer, the 5600X is certainly a bad choice over the 3600X, or even the 1600AF, which costs a third of the price.
Depends on how long you wish to keep your processor.
 
I don't agree with your statement that more cores don't provide a level of future-proofing; 7600K owners can attest to that.
Tell that to Ryzen 1700X owners watching the 7700K and Ryzen 3300X still beat them. Game, set, match, you lost. :joy::joy:
 
Next time you get any sort of faulty product, remember not to complain or ask for warranty service; instead, tell yourself to make a better one.
The only way that is a good parallel is if you paid for this review.

Did you pay for this review?

Nope. You got it for free but still want to complain as if TechSpot were your employees - your tone goes far beyond what you'd use on someone who sold you a faulty product.
 
"if you’re currently using an RTX 2060 or 5600 XT (or perhaps something slower), and your current CPU is a Ryzen 1600X or 2600X, then upgrading to a Core i5-10600K or Ryzen 5 5600X, for example, won't net you much additional performance."
This conclusion just destroyed the universe as we know it.
My conclusion is that the 1600 was a shitty competitor, not worth swapping from Intel for. The 2600 was still not worth it, and the 3600 wasn't available - the clear market winner here is Intel again, though its customers are really pissed off by their practices.
The other conclusion is that the world does not need new CPUs with these architectures anymore, not even for gaming.
 
Tell that to Ryzen 1700X owners watching the 7700K and Ryzen 3300X still beat them. Game, set, match, you lost. :joy::joy:
The 1700X is a beast of a CPU, just not as good as the 7700K in gaming. Double the cores is no joke. Aside from some outlier titles like Far Cry 4, the 1700X can stand up to the 7700K in gaming (within a few FPS; AMD improved a lot with updates since the initial launch reviews).

On a side note, you are pretty much stuck with the 7700K so many years after you bought it, with no upgrade path for it.
 
The only way that is a good parallel is if you paid for this review.
Not at all. The principle is exactly the same. Besides, if someone gives you a free sandwich but it turns out to be filled with maggots, do you have the right to tell the one who gave it to you, or should you just shut up because it's free?

Did you pay for this review?
Irrelevant, as already explained, but even then, I'll humor you.
I did not pay with money, but it did cost me time to read it. That's basically the currency of the internet; it's why clicks make money. There is nothing wrong with wanting high-quality content, regardless of whether it's free or not.

Nope. You got it for free but still want to complain as if TechSpot were your employees - your tone goes far beyond what you'd use on someone who sold you a faulty product.
There is a difference between complaining and constructive criticism. Want to know the difference?
Constructive criticism has arguments and reasons as to how something can be improved.
Complaining is the verbalization of dissatisfaction without much reasoning.

Now figure out what you are doing with your replies.
 
I mostly agree with @NightAntilli. For those who want to come across as understanding nice guys, we have to remember that Steve so often, seemingly always, does unnecessary work. For example, in this test he could have dropped the 4K results altogether, or alternatively run them with only the RTX 3090. It's debatable whether even the 1440p results needed to be run with every GPU.

My point is that planning your testing carefully beforehand gives better results and often less work. Steve might be the king of benchmark runs, but not the king of the best possible data. I obviously appreciate his work too, but I just want to see him be more clever and actually save his time.

I'm eagerly awaiting the Intel data, and again, you could either drop the i7-8700K or the i5-9600K (if that is the selected performance tier), since these CPUs are so close to each other in gaming performance, or even drop the i5-10600K and just wait for the i5-11600K instead. The point is that there is no sense in testing similarly performing products, so choose wisely.
 
Looking at benchmarks, I would say a 5800X is closest to a 10900K, but there is no Radeon card equivalent to a 3090. Some might say the 6900XT, but it's slower, has no DLSS, and can't really do ray tracing despite being advertised as such.
DLSS is a software feature; that could be added during the lifecycle of the 6900XT. And ray tracing, in my experience anyway, doesn't add a lot to games. The 6900XT is absolutely on par with the 3090; all you have to do is look at the benchmark breakdowns. It beats the 3090 in roughly 40% of benchmarks.
 
Thanks for the testing, it's interesting. You've told me I need nothing more than a 1600X and a 5700 XT. I'd rather have the highest resolution than the highest frame rates.

Some other interesting testing would be to see how much the newest games cripple 4c/8t CPUs. I play old games and derive no income from my computing power, so 4 cores should easily satisfy me for years to come. I'll ride my 2400G and 3100 until AM5/DDR5 becomes mainstream.
 
Very happy I got my 2600X. Got it for 99 bucks from Micro Center on my latest PC rebuild a couple of years ago. Given today's inflated prices, I'm guessing that was the ultimate steal.

Considering I have a 1440p monitor and don't even mind 30 FPS, and also have an RX 570, and also don't really game on my PC, the 2600X is looking more than good! It's not a limiter at any setting or resolution I'd ever even begin to care about. Nobody games at 1080p anyway, and I don't even need 60 FPS, let alone 100+.

Since I'm farsighted (the opposite of nearsighted, I tend to push AWAY from the screen!), haven't gamed on a keyboard in years (it would feel really weird to me), and own a Series X + 65" TV + Atmos soundbar, yeah, I just don't see any reason I'm likely to return to PC gaming anytime soon. Kind of sad about that, LOL. But my Series X easily outspecs my PC, besides the mouse-and-keyboard thing, which is another barrier to PC.

I'm old; I remember when awesome games like Half-Life and Unreal Tournament were only available on PC! Back then there was a reason for a non-social gamer like me to game on PC. Now the only thing I can imagine the PC doing better is social kiddie weeaboo MOBA Twitch TV games like League of Legends, or basement-dweller Steam trash like Counter-Strike (is it 1997 again?), which I have zero interest in.
 
From the looks of things, I'm not really getting anything out of my R5-3600X that I wouldn't get with my old R7-1700 with my RX 5700 XT. The charts also tell me that I should upgrade my video card long before I do my next CPU upgrade.

It just serves to reinforce the fact that for gaming you should always go GPU-heavy and CPU-average, because the GPU will always bottleneck your system first unless you're a competitive potato-settings gamer (and let's face it, how many of those are there?).
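
To put that in rough terms, here's a toy frame-time model (made-up illustrative numbers, not anything from the article): whichever of the CPU or GPU takes longer per frame sets your FPS.

```python
# Toy bottleneck model: the slower of CPU and GPU per-frame time sets the frame rate.
# All numbers below are made-up illustrations, not benchmark results.

def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000 / max(cpu_ms_per_frame, gpu_ms_per_frame)

print(fps(8, 16))   # ~62 FPS, GPU-bound: a faster CPU changes nothing here
print(fps(5, 16))   # still ~62 FPS even with a much faster CPU
print(fps(8, 6))    # ~125 FPS with a much faster GPU: now the CPU is the limit
```

That's the whole "GPU-heavy, CPU-average" argument in one line: until GPU frame time drops below CPU frame time, extra CPU money buys you nothing.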
 
From the looks of things, I'm not really getting anything out of my R5-3600X that I wouldn't get with my old R7-1700 with my RX 5700 XT.

OT:

This certainly depends. Remember that HUB only tests a limited number of games. Even at 1440p, if you lower graphics settings, there will be a slight or pronounced CPU bottleneck, since the RX 5700 XT is still quite a beast. With the R7 1700 the best option is to max out graphics settings as long as you get satisfying FPS.

I for one will likely upgrade my R7 1700 soon enough, though I'm not willing to wait until AM5/LGA1700. I have no strong preferences, but I'd rather play on medium/high settings at high FPS than at 60 FPS with the highest possible settings.
For upgraders like me, Rocket Lake looks promising - certainly my option number one for now. The generational leaps have been big enough that you can "downgrade" to a six-core CPU while still having a clearly faster CPU, especially in games. Games are problematic for measuring CPU power, though, because they rarely utilize the CPU fully, which applies especially to the R7 1700.
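
A rough way to check which side is limiting you on your own system: run the same scene at your normal settings and again at a much lower resolution (or resolution scale) and compare FPS. The 10% threshold below is just a rule of thumb I'm assuming, not anything HUB uses.

```python
# Rule-of-thumb check for a CPU bottleneck: if dropping the resolution barely raises FPS,
# the GPU wasn't the limit in the first place. The 10% threshold is an assumed rule of thumb.

def likely_cpu_bound(fps_native, fps_low_res, threshold=0.10):
    return (fps_low_res - fps_native) / fps_native < threshold

print(likely_cpu_bound(112, 118))  # True  -> almost no gain at low res, CPU-bound
print(likely_cpu_bound(64, 103))   # False -> big gain at low res, GPU-bound
```

If lowering the resolution barely changes anything, a faster GPU won't help you; that's the same logic behind testing CPUs at low resolutions in the first place.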
 
Hands down one of the best, if not the best, tech sites for many years now. Good range of articles, and the website's color scheme and layout are great.

Thanks guys!
 
"If you're wondering why we didn’t use more Radeon GPUs like the RX 6900 XT and 6800 in place of the RTX 3090 and RTX 3070, there are two main reasons: most of you are interested in buying a GeForce 30 series GPU and most of you have, at least according to a recent poll we ran."

Behold the power of marketing!

"Oh, but my DLSS and RT!!!"

Again, few games support it RIGHT now, and by the time the rest of the industry has this in every single game, we will be a couple of generations past the current GPUs, which is when those techs will actually be usable.

So, all that said, I hope that Steve makes a similar video, but one that also includes a 6850 and a 6900.
 
What this shows is that NVIDIA is doing a lot more work in the drivers compared to AMD. Think about it: the performance of the AMD cards doesn't change all that much based on the CPU, 1-2 frames at most, while there is an obvious and significant difference caused by the CPU for NVIDIA cards.
 
What this shows is that NVIDIA is doing a lot more work in the drivers compared to AMD. Think about it: the performance of the AMD cards doesn't change all that much based on the CPU, 1-2 frames at most, while there is an obvious and significant difference caused by the CPU for NVIDIA cards.
Yes, and this is not necessarily a good thing. A GPU is expected to offload as much of the graphics rendering as possible from the CPU. The drivers can use the CPU in combination with the GPU to pull ahead in frame rate, but then they aren't offloading that much, and you can see this as a sort of cheating: heavily tasking the CPU to make up for GPU resources that do not exist.
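
To put rough numbers on that idea (all hypothetical, just to illustrate the mechanism): if the driver does part of its work on the CPU, that cost lands on top of the game's own CPU time, so it only shows up once the CPU side becomes the limit.

```python
# Toy model of driver overhead: the driver's CPU-side work adds to the game's own CPU time.
# All numbers are hypothetical, only meant to show why the effect appears with slower CPUs.

def fps(game_cpu_ms, driver_cpu_ms, gpu_ms):
    cpu_ms = game_cpu_ms + driver_cpu_ms
    return 1000 / max(cpu_ms, gpu_ms)

# Fast CPU: the extra driver work hides behind the GPU time, so no visible difference.
print(fps(6, 2, 12))   # ~83 FPS
print(fps(6, 0, 12))   # ~83 FPS as well

# Slow CPU: the same driver overhead now sets the frame time, and FPS drops with the CPU.
print(fps(12, 4, 12))  # ~62 FPS
print(fps(12, 0, 12))  # ~83 FPS
```

Which matches what the charts show: the gap between CPUs is much bigger on the cards whose driver leans on the CPU more.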
 
The Nvidia 3000 series cards are compared here to AMD cards one generation older - cards that were meant to compete with the 2000 series - so this looks like it was done on purpose to show AMD in a bad light, despite the written justification of "hey, we ran a poll and most users would buy Nvidia even without our article helping it".
 
Yes, and this is not necessarily a good thing. A GPU is expected to offload as much of the graphics rendering as possible from the CPU. The drivers can use the CPU in combination with the GPU to pull ahead in frame rate, but then they aren't offloading that much, and you can see this as a sort of cheating: heavily tasking the CPU to make up for GPU resources that do not exist.

I agree with your statement, but then again, games rarely max out CPUs, so some driver overhead is simply not an issue. In the end, what matters is the end result. It doesn't matter if, for example, Nvidia's driver is more efficient at extracting all the possible GPU power; what matters is what FPS and frametimes you get and whether you experience any issues. It's funny how many people try to steer the focus from the relevant to the irrelevant just to make their idolized brand look better. In the upcoming benchmarks from HUB, we will see whether the selected GPUs behave any differently with an Intel CPU or not.
 
Tell that to Ryzen 1700X owners watching the 7700K and Ryzen 3300X still beat them. Game, set, match, you lost. :joy::joy:
X370/B350 mobo owners and their first-gen Ryzens are laughing at 7700K/7600K owners who are stuck with one CPU generation and only 4c/8t, while we first-gen Ryzen owners plonk the remaining Zen 2 chips, which will soon be dirt cheap, into our motherboards :)
 