Ryzen 5 5600X vs. Ryzen 7 5800X vs. Ryzen 9 5900X vs. 5950X: GPU Scaling Benchmark

Cyberpunk should have a single overall max-quality/max-performance graphics preset named "CyberPsycho".

I got very good performance at max settings with my 3090 FTW3 on both a new i9 CPU and an older 5960X.
 
Great article and really thorough testing. Although it wasn't benchmarked, I think I'll hang onto my 4c/8t 7700K for a while longer, since all the data is reiterating just how GPU-dependent 4K is.
Even my old 3570K @ 4.6 GHz can still tango, maintaining a 60 FPS minimum. Sometimes barely, but it still works. Anyone who bought a newer CPU recently is good for a decade.
 
Although the results are valid, the value of this test is precisely zero.

Playing on PC differs from playing on a console - you can have 50 open tabs in your web browser, music playing through Spotify, Tidal, etc., the Discord app running, maybe something downloading to use later. E.g., if I die in a multiplayer match I may jump to Windows and browse the internet and move between multiple apps. How will it impact the smoothness of the system with all those things running simultaneously? Nobody knows.

The above is exactly the reason why somebody would consider a 5800X or 5900X over a 5600X "for gaming". Because who just turns on the PC, starts up the game, and does nothing else? Not because we know that an 8-core CPU will help, but because it will mitigate the potential problems tremendously.

 
Although the results are valid, the value of this test is precisely zero.

Playing on PC differs from playing on a console - you can have 50 open tabs in your web browser, music playing through Spotify, Tidal, etc., the Discord app running, maybe something downloading to use later. E.g., if I die in a multiplayer match I may jump to Windows and browse the internet and move between multiple apps. How will it impact the smoothness of the system with all those things running simultaneously? Nobody knows.

The above is exactly the reason why somebody would consider a 5800X or 5900X over a 5600X "for gaming". Because who just turns on the PC, starts up the game, and does nothing else? Not because we know that an 8-core CPU will help, but because it will mitigate the potential problems tremendously.
This is the same argument people used for needing more than 10GB of VRAM on a GPU: "well, there could be a situation that requires it", when none of the tests showed it being necessary.

Is having a download in the background or a Spotify playlist running REALLY going to make any impact when there is currently a 0% difference between the various chips, and all of them are well north of the average 60 Hz refresh rate of monitors?
 
Damn you Steven, stop disproving more cores = future-proofing with your facts! You will force fanboys onto alternative YouTube videos to reinforce the idea that the overall performance of a CPU is meaningless next to a slow multi-core monster CPU... in the future.
 
Zen 3 is impressive. Thank you for the very useful tests showing it's not worth spending an additional $300+ on higher-specced CPU variants for gaming. Only problem: RTX 30xx GPUs are 3x as expensive as they should be right now, so that solved the problem of upgrading by not doing it until prices normalize.
 
Great article and really thorough testing. Although it wasn't benchmarked, I think I'll hang onto my 4c/8t 7700K for a while longer, since all the data is reiterating just how GPU-dependent 4K is.
I had the same chip from launch until last May, but when I saw a $99 chip (the 3300) coming that supposedly would match mine, I took a look and saw I could still get over $300 for my chip. I wasn't going to wait and see its value plummet, so I sold it for 87% of what I paid for it and upgraded to a 10900K for about $250 out of pocket.

I'm glad I did, but if I had known the 3300 was basically just a paper chip I might have held out for the 11900K (which is what I ACTUALLY wanted, you know, with PCIe 4.0).

I'll still easily be able to sell this chip for most of what I paid for it, so no real loss.

I know it may not ultimately make much difference in gaming performance, but I'd rather sell my stuff while it's still worth a significant amount and upgrade that way, so I'm never in a position where I'm questioning whether performance issues are related to outdated hardware. I've worked the math retroactively over the years and found I don't spend much more than the guy who rides his hardware into bad performance and then either sells it for next to nothing, or doesn't sell at all and pays full price for his next part.


On average over the last decade I've spent about $300 a year on keeping my PC upgraded. To me it's been a very wise choice, as I've never suffered the headaches of aging hardware while also not breaking the bank.
 
Although the results are valid, the value of this test is precisely zero.

Playing on PC differs from playing on a console - you can have 50 open tabs in your web browser, music playing through Spotify, Tidal, etc., the Discord app running, maybe something downloading to use later. E.g., if I die in a multiplayer match I may jump to Windows and browse the internet and move between multiple apps. How will it impact the smoothness of the system with all those things running simultaneously? Nobody knows.

The above is exactly the reason why somebody would consider a 5800X or 5900X over a 5600X "for gaming". Because who just turns on the PC, starts up the game, and does nothing else? Not because we know that an 8-core CPU will help, but because it will mitigate the potential problems tremendously.

I mean... me.
I would be that person, and ultimately, outside of RAM, any of these CPUs could easily switch over to another basic program (like a web browser) while gaming. When I alt-tab, my game isn't usually asking for full resources while it's in the background.

Unless you're rendering or doing some other major stuff, playing some music or jumping over to look at a social media site isn't going to affect performance very much.

If you think it does, and that it requires someone to grab an 8- or 12-core chip, then you're exactly the type the article is designed to speak to.

My bud's running a 5600X with a 3080 and is one of those guys with 50 things going on in the background, from mod tools to music to social to Twitch, while also running big torrent downloads and more, yet he sees absolutely no big difference in performance vs. me.

I'm using a 10900K with a massive OC, and yet when we're both playing together his FPS next to mine is within margin of error.

Like I said, if you're trying to do real work while ALSO gaming then maybe you need to go big, but for the majority of gamers and the typical things they may want to do, they're going to be OK for the most part just focusing on a CPU that does gaming at its best.
 
Damn you Steven, stop disproving more cores = future-proofing with your facts! You will force fanboys onto alternative YouTube videos to reinforce the idea that the overall performance of a CPU is meaningless next to a slow multi-core monster CPU... in the future.

I'm glad he made this article, and I'm even more excited for Intel to be added. I've known for a long time that beyond a certain number of cores (currently 6) there's really just no impact on gaming, yet all these people go out of their way to spend multiple times the money on a 5950X when they'd be just as well off with the 5600X.

More importantly, I'm excited to see how much of an impact high single-core performance and higher frequency have on overall gaming performance. I hope he's able to include the 11900K in this testing in the future; it's the only chip I have any hope of actually bringing any kind of real gains.
 
Although the results are valid, the value of this test is precisely zero.

Playing on PC differs from playing on a console - you can have 50 open tabs in your web browser, music playing through Spotify, Tidal, etc., the Discord app running, maybe something downloading to use later. E.g., if I die in a multiplayer match I may jump to Windows and browse the internet and move between multiple apps. How will it impact the smoothness of the system with all those things running simultaneously? Nobody knows.

The above is exactly the reason why somebody would consider a 5800X or 5900X over a 5600X "for gaming". Because who just turns on the PC, starts up the game, and does nothing else? Not because we know that an 8-core CPU will help, but because it will mitigate the potential problems tremendously.

While you make some valid points, one could also argue that some might say, "Who games without closing excess browser tabs?"
 
This is the same argument people used for needing more than 10GB of VRAM on a GPU: "well, there could be a situation that requires it", when none of the tests showed it being necessary.

Is having a download in the background or a Spotify playlist running REALLY going to make any impact when there is currently a 0% difference between the various chips, and all of them are well north of the average 60 Hz refresh rate of monitors?

I beg to differ. I can clearly see the difference between an old 4/4 i5 (like the 7600K) and new 8/16-and-up CPUs, and I am just a consumer. There were times when I had to wait for a download to initialize, and it was not the fault of the particular webpage, because even after alt-tabbing the game still used 100% of the CPU. If I jumped back into the game, the next minute was still a little jerky.

But where does it start to differ only marginally? Is it 6/6? 8/8?

I would like to know that, but until then I can literally do whatever I want with a 12/24 CPU and every action happens immediately.
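One way to actually pin that down, rather than guessing, is to log per-core utilization while gaming with your usual background apps open and count how many cores are genuinely busy. A minimal sketch, assuming Python with psutil installed (the 50% "busy" threshold and sample interval are arbitrary choices, not from the article):

```python
# Minimal sketch: log per-core CPU utilization while gaming with the
# usual background apps open, to see how many cores are actually busy.
# Assumes `pip install psutil`; threshold and interval are arbitrary.
import psutil

BUSY = 50.0  # % utilization above which a core counts as "busy"

while True:
    per_core = psutil.cpu_percent(interval=2.0, percpu=True)  # blocks 2 s
    busy = sum(1 for pct in per_core if pct > BUSY)
    print(f"busy cores: {busy}/{len(per_core)} | " +
          " ".join(f"{pct:3.0f}%" for pct in per_core))
```

If the busy count rarely climbs past 6 even with everything running, the extra cores of a 5900X or 5950X are mostly sitting idle.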
 
Although the results are valid, the value of this test is precisely zero.

Playing on PC differs from playing on a console - you can have 50 open tabs in your web browser, music playing through Spotify, Tidal, etc., the Discord app running, maybe something downloading to use later. E.g., if I die in a multiplayer match I may jump to Windows and browse the internet and move between multiple apps. How will it impact the smoothness of the system with all those things running simultaneously? Nobody knows.

The above is exactly the reason why somebody would consider a 5800X or 5900X over a 5600X "for gaming". Because who just turns on the PC, starts up the game, and does nothing else? Not because we know that an 8-core CPU will help, but because it will mitigate the potential problems tremendously.

It's always frustrating reading these articles, because I get that it's tough to benchmark every permutation and system environment, so we have to use these as a baseline and go from there.

Plenty of people will say "I can game and do a million things with my 6c/12t and there's no reason to pay the premium for an 8c/16t and beyond" - which is objectively true, and folks will have every anecdote explaining why my specific use cases for an 8c/16t are fringe uses... but I personally stopped believing in "X is all you need" after listening to the internet, getting the i5-6600K, and being hamstrung after 2 years. Playing a CPU-heavy DX11 MMO pegged my 4c/4t processor at 70%, and I refused to compromise with my CPU again.

No one looks at CPU usage, only FPS. Cyberpunk gives the same FPS, but on a 6c/12t I wonder how much headroom there is to do much of anything - especially if you're streaming, don't have NVENC so are using CPU encoding, and have any manner of other programs running.

I personally don't stream at less than 12,000 Kbps (I prefer 15,000), and all the guides I've seen tell people "oh, you can use 8,000 because people watch the videos in 1080p or 720p on their phone's mobile connection anyway, and blah blah"... f$€k that noise lol - my friends watch streams on another monitor while they play games themselves, so that's the audience I cater to.

I'm playing the game at 3440x1440 (because I want to), so I'm letterboxing it into a 2560x1440 box, and my viewers can adjust the quality down accordingly.
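For the curious, the letterbox math is easy to check. A quick sketch using just the resolutions mentioned above (plain Python, nothing assumed beyond the numbers in this post):

```python
# Fit a 3440x1440 frame into a 2560x1440 canvas, preserving aspect ratio.
def letterbox(src_w, src_h, dst_w, dst_h):
    scale = min(dst_w / src_w, dst_h / src_h)   # shrink to fit both axes
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    return out_w, out_h, (dst_w - out_w) // 2, (dst_h - out_h) // 2

print(letterbox(3440, 1440, 2560, 1440))
# -> (2560, 1072, 0, 184): the image lands at 2560x1072 with 184 px
#    black bars top and bottom.
```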

MKBHD, for example, made a commitment to making high-quality videos on YT, and he and some fellow creators have been dubbed "Team Crispy" and have seen success, so I'm not going to have a shitty stream that compromises just because my hardware is inadequate.
 
This is fantastic work and clearly establishes the baseline by which other CPUs will be measured. Eagerly waiting for the current and older Intel CPUs and older AMD CPUs, as that's what our house is littered with: 4c/8t & 6c/6t Intel, 6c/12t AMDs.
 
Why the hell did you use old AMD GPUs? We could have gotten a lot more from this if the 6000 series was used.
 
This really reinforces my upgrade plans, leaning toward my "bang for buck" config choice. I already have an RTX 3090, currently running on the i7 7700K that was my machine when I bought it. As well as it runs everything gaming-wise that I throw at it, I can see an average loss of 10% or so compared to benchmarks like the ones above. So a Zen 3 upgrade was always planned from the time I got the 3090.

Up till now I've been super focused on the 5800X, but now I think it might be wasted money for my needs, and the 5600X is looking super sweet at this moment. Zen 3 is a must, as I want established and well-optimized Resizable BAR support ready for the Nvidia driver release that enables it on the 3090; PCIe 4.0 for full-speed 3090 and M.2 speeds is also a must.

Original target:
Ryzen 7 5800X - $563.00
Asus TUF Gaming X570-Plus - $170.00
Corsair Vengeance DDR4-3600 (4x8GB) - $215.00
Corsair Force MP600 1TB M.2 - $184.00
Thermal paste - $10.00
Total - $1,142.00

Alternate "bang-for-buck":
Ryzen 5 5600X - $350.00
Asus TUF Gaming X570-Plus - $170.00
Corsair Vengeance DDR4-3600 (4x8GB) - $215.00
Corsair Force MP600 500GB M.2 - $140.00
Thermal paste - $10.00
Total - $885.00

Things like the AIO, PSU, 6TB HDD, and even the case all carry over from the previous build, hence the thermal paste. The bang-for-buck upgrade path puts the whole thing, with tax and shipping, at around 1,000 bucks. And really no gaming performance is sacrificed at all, while significant overclocking potential opens up on the cooler-running 5600X, if desired (though possibly not really necessary).
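Quick arithmetic on the two part lists above, just to make the trade-off explicit (prices as listed, before tax and shipping):

```python
# Sanity-check the two build totals above and the saving between them.
target = [563, 170, 215, 184, 10]   # 5800X build
budget = [350, 170, 215, 140, 10]   # 5600X build
print(sum(target), sum(budget), sum(target) - sum(budget))
# -> 1142 885 257: about $257 saved by dropping to the 5600X + 500GB drive
```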

Great article; it cemented my choice of the second option, "bang for buck".
 
Why the hell did you use old AMD GPUs? We could have gotten a lot more from this if the 6000 series was used.

GPU scaling. Putting all the AMD 6xxx GPUs in there would have added no data, as the 3090 and 3070 are already there. This isn't about GPUs; it's about how CPUs perform with GPUs of different capabilities.

The 3090 is the top-performing GPU and the 3070 sits below the 6800, so adding in the AMD ones would add nothing useful. The 5700 XT could easily have been the 2080 and the 5600 XT the 1080, but they used AMD instead, which is fine.
 
Why the hell did you use old AMD GPUs? We could have gotten a lot more from this if the 6000 series was used.

It would have been meaningless to use the 6000 series. The 6000 series and Nvidia's 3000 series are too close to make any difference, while older AMD GPUs are still very common, and this shows how they do.
 
Although the results are valid, the value of this test is precisely zero. Playing on PC differs from playing on a console - you can have 50 open tabs in your web browser, music playing through Spotify, Tidal, etc., the Discord app running, maybe something downloading to use later. E.g., if I die in a multiplayer match I may jump to Windows and browse the internet and move between multiple apps. How will it impact the smoothness of the system with all those things running simultaneously? Nobody knows.
There could well be some fringe heavy background use that justifies it, but a lot of that stuff is often wildly over-exaggerated, usually by people who've bought themselves a shiny new 16C/32T CPU, find that most games still won't come close to using all the cores, and then start looking for ways to fill up the unused ones with cr*p to justify potentially over-spending for their needs.

E.g., it's very easy to measure how much CPU usage is required for MP3 playback or downloading a file: barely 1-2% (of a quad core). As for web browsers, it's not 2009 anymore, and any modern web browser worth its salt comes with features like suspending idle/inactive tabs, suspending video on inactive tabs, "lazy loading" of tabs when restoring previous sessions, and script/tracking/ad blockers, all of which not only significantly reduce background CPU usage, they also reduce RAM usage and page load times and increase browser security and page readability. The phrase "work smarter, not harder" comes to mind, i.e., if someone with "only" a 4-6 core CPU wants to game with a web browser open (perhaps on a walkthrough page) but doesn't want any CPU slowdown issues, they've already figured out how to without needing a $500 CPU.
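In the same spirit, the measurement itself takes only a few lines; a minimal sketch with Python's psutil (the process name here is just an example, adjust for your app and OS):

```python
# Minimal sketch: measure how much CPU a background app actually uses.
# Assumes `pip install psutil`.
import psutil

TARGET = "Spotify.exe"  # example process name; adjust for your app/OS

procs = [p for p in psutil.process_iter(["name"])
         if p.info["name"] == TARGET]
for p in procs:
    p.cpu_percent(None)  # first call primes the per-process counter

# Each call below samples over a 1 s window (percent of a single core).
total = sum(p.cpu_percent(interval=1.0) for p in procs)
cores = psutil.cpu_count()
print(f"{TARGET}: {total / cores:.1f}% of the whole {cores}-core CPU")
```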
 
Although the results are valid, the value of this test is precisely zero.

Playing on PC differs from playing on a console - you can have 50 open tabs in your web browser, music playing through Spotify, Tidal, etc., the Discord app running, maybe something downloading to use later. E.g., if I die in a multiplayer match I may jump to Windows and browse the internet and move between multiple apps. How will it impact the smoothness of the system with all those things running simultaneously? Nobody knows.

The above is exactly the reason why somebody would consider a 5800X or 5900X over a 5600X "for gaming". Because who just turns on the PC, starts up the game, and does nothing else? Not because we know that an 8-core CPU will help, but because it will mitigate the potential problems tremendously.
I wonder what a "proper" test (where the value is not "precisely zero") would have looked like, in your opinion?
I have a strong feeling that we all have different preferences (in how we use our gaming rigs), so I can only presume that the chance of covering it all in one article is, well, "precisely zero".
 
I remember when I bought my 4790K. All the reviewers told gamers to ignore it and buy the 4690K instead, as their tests showed no difference with the most powerful cards of the time, which at the time I believe was the 780 Ti.

However, as the years went by and graphics cards improved, a large difference began to open up. Gamers Nexus' recent-ish benchmarking of these chips now shows an enormous difference between them that did not exist before.

I imagine the same will happen here. Today, with a 3090, you won't get anything more from a higher core count CPU. However, in say 3 years' time, imagine you have a theoretical "RTX 5070" or something powerful enough to make the CPU the slowest part; then you might find a difference between CPUs.

Then there is ray tracing. For some odd reason TechSpot doesn't test ray tracing on the latest hardware. But other publications do, and they have shown that there can be quite a CPU overhead with ray tracing turned on. As ray tracing gets implemented in more and more games, we might see a difference open up between these CPUs.

TL;DR: this test doesn't mean you shouldn't buy more than 6 cores for gaming.
 