Ryzen 5 7600 vs. Ryzen 5 5600: CPU and GPU Scaling Benchmark

It's clear that the 5600 is a bottleneck for the 4090; however, it would be nice to see this at a higher resolution, since I think most people who would be interested in a 4090 would be playing at 1440p, and hopefully 4K, if they are going to part with that kind of money for a GPU. The 5600 is a huge bottleneck compared to the 7600 at 1080p, but I'm guessing it is significantly closer at 4K. In any case, I think you would want a bare minimum of a 5800X if you are going to go with a 4090.
 
I would think that a platform upgrade is the way to go. The performance increase and platform longevity make it a no-brainer, especially when you consider the price of graphics cards at the moment. The financial outlay is minuscule by comparison.
 
It's clear that the 5600 is a bottleneck for the 4090; however, it would be nice to see this at a higher resolution, since I think most people who would be interested in a 4090 would be playing at 1440p, and hopefully 4K
The whole point of this test was to explore CPU bottlenecks, not GPU bottlenecks. Somehow people miss that point every time one of these tests is done.
 
I understand these articles and the benchmarking take a long time, but it would have been better to have had the Ryzen 3600 in the list too, as I imagine it would be a better representative of a worthwhile upgrade to the 7600 with a faster GPU. My favourite article Steve did was the 4 Years of Ryzen one, which helped me with my decision to upgrade my R5 1600 to the 3600 as I was buying a 3060 Ti.
 
I feel like this test should also have included a low-end Nvidia card of similar performance or price to the 6650 XT, as a counterpoint to the red-vs-green comparisons at the high end.
 
Sure, but it's still nice to see actual use cases instead of a scenario that only like 1% of all PC gaming users, if even that, ever see.
You think it is more of an "actual use case" to show 4k results when testing the 5600X vs 7600X?
Are you aware of what this article is about?
 
The whole point of this test was to explore CPU bottlenecks, not GPU bottlenecks. Somehow people miss that point every time one of these tests is done.

What would the sweet spot be for a 5600X: an RX 6750 XT, an RX 6800 XT, or a 3070 Ti? I'm curious how a 5600X and a 3080 would do. I dunno about you folks, but I seem to know quite a few people with that pairing.
 
You think it is more of an "actual use case" to show 4k results when testing the 5600X vs 7600X?
Are you aware of what this article is about?
Yes? People who purchase a 4090 aren't gonna play on a 1080p monitor unless they're going for extremely high FPS in competitive games, and they will not be on a 7600 then, lol.

People who have a 5600 aren't gonna be interested in extremely high FPS, so they're gonna be fine with 60, and I'm sure lots are curious how it's gonna fare at higher resolutions, even if 1080p is still ubiquitous today.
 
What would the sweet spot be for a 5600X: an RX 6750 XT, an RX 6800 XT, or a 3070 Ti? I'm curious how a 5600X and a 3080 would do. I dunno about you folks, but I seem to know quite a few people with that pairing.

Yes? People who purchase a 4090 aren't gonna play on a 1080p monitor unless they're going for extremely high FPS in competitive games, and they will not be on a 7600 then, lol.

People who have a 5600 aren't gonna be interested in extremely high FPS, so they're gonna be fine with 60, and I'm sure lots are curious how it's gonna fare at higher resolutions, even if 1080p is still ubiquitous today.

LOL, the 5600 is good for way more than 60fps. It's fine for 1440p/144, depending on your GPU and graphical settings of course.

I'm using a 5600 and a 6800 XT and game at 1440p/144 Hz at High-Ultra graphical settings, and these make for a good combination. Even the trashpile that is Forspoken (the demo) runs at 65-90 fps with FSR off (it's on by default in all game presets) and settings one step down from Ultra. CPU use runs at 30-70%. For more normal games like CP2077 and SotTR, I'm not 100% GPU-limited, but I am the majority of the time at High-Ultra, with over 100 FPS in both.

Now, if you were to ask which I would upgrade to get more performance, I'd upgrade the 5600 next, as a meaningful CPU upgrade (5900X/5800X3D) will be like 1/4 the price of a meaningful GPU upgrade (4080), and the 6800 XT has a little more to give in some circumstances.

But for gaming, the 5600 and the 6800 XT are a high-end pairing made in heaven, as I got the 5600 for $160 and the 6800 XT for $560. Not their lowest prices, but pretty damn affordable for great performance.
 
"As we noted earlier, the upgrade to the 7600 will cost at least $560 for a basic B650 board and a 32GB kit of decent DDR5 memory."

This isn't even really the chip's fault, let's be honest: a quick surf on Amazon shows you can't get an AM5 motherboard for under $200, when the damn CPU itself only costs $230. I don't know what motherboard manufacturers are smoking right now; the pricing is absurd. I got my Asus X570 for about $160 in 2020, and I considered that pricey at the time, with boards available for $100 or less; I just didn't want to burn myself with another cheap ASRock.
 
I understand these articles and the benchmarking take a long time, but it would have been better to have had the Ryzen 3600 in the list too, as I imagine it would be a better representative of a worthwhile upgrade to the 7600 with a faster GPU. My favourite article Steve did was the 4 Years of Ryzen one, which helped me with my decision to upgrade my R5 1600 to the 3600 as I was buying a 3060 Ti.


+1

As the article mentions upgrading to the 5600 on AM4, it would be really beneficial for those who have that kind of setup to see a 1600 or 3600 included there.

Anyways, thanks for the testing Steve.
 
Over the last 20+ years, CPU vs. CPU articles have used all sorts of benchmarks to differentiate the compared products, from 7-Zip to Cinebench and from Adobe to Sandra.
Using a GPU to bottleneck the CPU is quite a different approach.
But without CPU/GPU/PCIe load graphs, it doesn't tell the full story to me.
Not using three cards from the same vendor (mainstream, enthusiast, and high-end) also makes no sense to me.

 
I'm using a 5600 and a 6800 XT and game at 1440p/144 Hz at High-Ultra graphical settings, and these make for a good combination. Even the trashpile that is Forspoken (the demo) runs at 65-90 fps with FSR off (it's on by default in all game presets) and settings one step down from Ultra. CPU use runs at 30-70%. For more normal games like CP2077 and SotTR, I'm not 100% GPU-limited, but I am the majority of the time at High-Ultra, with over 100 FPS in both.

Now, if you were to ask which I would upgrade to get more performance, I'd upgrade the 5600 next, as a meaningful CPU upgrade (5900X/5800X3D) will be like 1/4 the price of a meaningful GPU upgrade (4080), and the 6800 XT has a little more to give in some circumstances.

But for gaming, the 5600 and the 6800 XT are a high-end pairing made in heaven, as I got the 5600 for $160 and the 6800 XT for $560. Not their lowest prices, but pretty damn affordable for great performance.

I hear that. I have a 3800X, but my best bud gifted me his 5600X and I picked up an RX 6750 XT for $510 CAD, the best deal I could find in months, so I jumped on it. Seems like a good combo; I don't expect a giant leap over the 3800X, but I'm hoping to see less of a bottleneck, and my temps will certainly be lower, which is always a bonus in the quest for better 1% lows and such.

I can't remember for sure, but I think the Tomb Raider benchmark tells you how badly you're bottlenecked, so I'll compare before and after the swap.
 
It's clear that the 5600 is a bottleneck for the 4090; however, it would be nice to see this at a higher resolution, since I think most people who would be interested in a 4090 would be playing at 1440p, and hopefully 4K, if they are going to part with that kind of money for a GPU. The 5600 is a huge bottleneck compared to the 7600 at 1080p, but I'm guessing it is significantly closer at 4K. In any case, I think you would want a bare minimum of a 5800X if you are going to go with a 4090.

Totally agree. I do appreciate these tests and find them very interesting, but a 1080p benchmark is kinda pointless for a 4090.

Also, why not include an affordable high-end GPU for results that most gamers could relate to, like an RTX 3080? Nobody who can afford a 4090 is pairing it with a paltry 5600. :p
 
I hear that. I have a 3800X, but my best bud gifted me his 5600X and I picked up an RX 6750 XT for $510 CAD, the best deal I could find in months, so I jumped on it. Seems like a good combo; I don't expect a giant leap over the 3800X, but I'm hoping to see less of a bottleneck, and my temps will certainly be lower, which is always a bonus in the quest for better 1% lows and such.

I can't remember for sure, but I think the Tomb Raider benchmark tells you how badly you're bottlenecked, so I'll compare before and after the swap.

Yes, Shadow of the Tomb Raider's built-in benchmark will tell you how GPU bound you are. And it scales almost linearly in both CPU and GPU use so it's one of the best tests and game engines out there.

I realized I never ran it under typical conditions after installing the 6800XT.

R5 5600 w/PBO
RX 6800 XT +150 cores, +130 memory, +15% power (my stable settings)
16GB 3200 MHz CL16
1440p, Vsync off

HUB recommended settings: 177 FPS, mins in the 120s, 42% GPU bound, so 58% CPU bound
Highest settings, but Aniso up to 16x and Motion Blur off: 168FPS, mins in the 120s, 47% GPU bound, so 53% CPU bound.

A CPU upgrade could net more FPS in busy towns with lots of NPCs, but no need. 120fps minimum is already overkill for this title. So LOL@ the comment earlier today about "60fps".
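
For anyone who wants to redo that conversion from the benchmark readout, it's just the "GPU bound" percentage subtracted from 100. Here's a minimal sketch of that arithmetic in Python, using the figures quoted above (the function name is my own, not anything from the game):

# Rough conversion of SotTR's "GPU bound" readout into a CPU-bound share.
# Purely illustrative; the inputs are the figures quoted in this comment.
def cpu_bound_share(gpu_bound_pct):
    # Frames not limited by the GPU are, to a first approximation, CPU-limited.
    return 100.0 - gpu_bound_pct

for label, gpu_bound in [("HUB recommended settings", 42.0),
                         ("Highest settings", 47.0)]:
    print(f"{label}: {gpu_bound:.0f}% GPU bound -> ~{cpu_bound_share(gpu_bound):.0f}% CPU bound")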
 
LOL, the 5600 is good for way more than 60fps. It's fine for 1440p/144, depending on your GPU and graphical settings of course.

I'm using a 5600 and a 6800 XT and game at 1440p/144 Hz at High-Ultra graphical settings, and these make for a good combination. Even the trashpile that is Forspoken (the demo) runs at 65-90 fps with FSR off (it's on by default in all game presets) and settings one step down from Ultra. CPU use runs at 30-70%. For more normal games like CP2077 and SotTR, I'm not 100% GPU-limited, but I am the majority of the time at High-Ultra, with over 100 FPS in both.

Now, if you were to ask which I would upgrade to get more performance, I'd upgrade the 5600 next, as a meaningful CPU upgrade (5900X/5800X3D) will be like 1/4 the price of a meaningful GPU upgrade (4080), and the 6800 XT has a little more to give in some circumstances.

But for gaming, the 5600 and the 6800 XT are a high-end pairing made in heaven, as I got the 5600 for $160 and the 6800 XT for $560. Not their lowest prices, but pretty damn affordable for great performance.
I mean, yeah, I know, but I don't think any tech sites have done a specific benchmark on it. I get that it won't paint a pretty picture for all these newer CPUs, though, and that's probably something they want to avoid for their sponsors.
 
Totally agree. I do appreciate these tests and find them very interesting, but a 1080p benchmark is kinda pointless for a 4090.
I don't agree that dismissing the most common resolution, especially one that is widely used for e-sports, makes sense.

I buy a graphics card to drive frames at my actual resolution, 1080p, which to me sounds a hell of a lot better than buying a graphics card to be an upscaling machine. Buying a 1440p or 4K monitor for amazing clarity and pixel density just to have it look washed out with DLSS? Yuck. I'd rather buy one GPU to last me than buy 2-3 cards and create more future e-waste. It's like going back in time and telling people who bought 980s, 1080s, and 2080s that they were stupid, meanwhile they were the ones laughing in the long run.

At 1080p the RTX 4090 is doing a much better job than everything else, but it's certainly not dominating 1080p. If the 1% minimum were 240 FPS I would be far more impressed, rather than just the average hitting 240 FPS. Is the RTX 4090 one heck of a graphics card? Absolutely! But it's no Megalodon that conquers ultra settings at 1080p in everything.

I think you're giving the RTX 4090 too much credit just because a processor is getting bottlenecked. I think your statement should be "I do appreciate these tests and find them very interesting, but a 5600X is kinda pointless for a 4090"

[Average FPS charts at 1080p and 1440p, from TechSpot's review: https://www.techspot.com/review/2544-nvidia-geforce-rtx-4090/]
 
I think it is also worth taking into consideration that user requirements are different. It's all fine and dandy that my rig is capable of 97 FPS... but if my screen is 60 Hz, it doesn't matter all that much, does it? I do enjoy these articles (in fact, these are the articles I enjoy the most! Thanks a ton, Steve! :) ), but before we go to war over CPU/GPU bottlenecks and what is the one and only true road ahead, let's just consider the use(r's) case.

For instance, I'm happy with 60 FPS + Vsync (because I mostly play on my big a** TV :) ). I know. Gasp. And I bet there are many more out there with similar expectations. A high(er than 60) refresh rate is great, but it is not the only way. I'm still using a 3600 with a 3070, and for my needs it is a fine pair. I keep considering getting a (second-hand) 5700X/5800X/5800X3D/5900X, but then I ask myself: what will I notice in my gameplay? Not much: both current components are more than capable of a rock-steady 60 FPS, and a new CPU couldn't improve on that. So I just keep postponing the upgrade... until I reach the point where there is an actual benefit for me...

The point is, if the CPU/GPU of choice is comfortably(!!) delivering the required (or available) screen refresh rate (at your desired/available resolution), it doesn't matter how much extra/reserve performance we are not tapping into. If you want 144 Hz, sure, go for it and enjoy: then that's your baseline, regardless of platform (for both CPU and GPU, btw). If 60 is your target (like me), then that... if 240, then choose a CPU/GPU for that... simple :) (Just my 2 cents :) )
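
To make the "comfortably delivering the refresh rate" point concrete, here's a minimal sketch of the reasoning in Python (the numbers and function names are mine and purely hypothetical): the effective frame rate is roughly capped by whichever of the CPU or GPU limit is lower, and any headroom above the refresh rate is performance you never actually see.

# Toy model of the "match the hardware to the target refresh rate" argument.
# The frame-rate limits below are made-up placeholders, not benchmark results.
def effective_fps(cpu_limited_fps, gpu_limited_fps):
    # Whichever component is slower caps what actually reaches the screen.
    return min(cpu_limited_fps, gpu_limited_fps)

def meets_target(cpu_limited_fps, gpu_limited_fps, refresh_hz):
    return effective_fps(cpu_limited_fps, gpu_limited_fps) >= refresh_hz

# A CPU good for ~97 fps driving a 60 Hz display: a faster CPU changes nothing visible.
print(meets_target(97, 120, 60))   # True
# The same CPU chasing 144 Hz: now the CPU (and then the GPU) becomes the limit.
print(meets_target(97, 120, 144))  # False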
 
OK, just for fun, I ran all the in-game benchmarks on games I own from the last 5 years at 1440p Medium settings. I chose this because it's a reasonable level to play at for decent visuals (shadows are frequently off and textures horrible at Low) while targeting high refresh rates. Motion Blur and FSR off.

I was curious to see what FPS you might expect from an overkill GPU (6800XT no OC) paired with the R5 5600 w/PBO using reasonable but frequently CPU-limited gaming settings. All except FH4 are CPU-intensive at this reasonable setting level.

Cyberpunk 2077: 121 Avg, 56 Min
Horizon Zero Dawn: 182, 98
AC: Odyssey: 115, 43
Red Dead 2: 151, 91
Forza Horizon 4: 238, 180
Shadow otTR: 191, ~125

Watching the frame rates during the tests:
CP2077 never went below 95
HZD's graph and 99% indicate 125-130fps mins which correlates with what I saw
AC:O's graph shows minimums at about 100, with 2 spikes down to the 40s
Red Dead 2 never showed below 100

So instantaneous frame dips are probably represented in those discrepancies.

It seems like the R5 5600 can be expected to deliver 100fps minimum in the most demanding games and is capable of much higher in most circumstances. Though future games will slowly whittle away at that.

Incidentally, image quality was pretty good in all games with the exception of distracting LOD pop-in in HZD and shimmery shadow application in AC:O.
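
On the gap between the reported minimums and what the on-screen graphs showed: it usually comes down to how the lows are calculated, since a single short stutter drags a 1% low down even when the graph looks flat. Here's a minimal sketch of that calculation in Python (the frame-time data is synthetic, not from my runs, and the helper name is my own):

# Shows why a brief hitch drags the reported minimum down even when the
# frame-rate graph looks flat. Frame times are synthetic, in milliseconds.
def fps_stats(frame_times_ms):
    frames = sorted(frame_times_ms, reverse=True)   # slowest frames first
    avg_fps = 1000.0 * len(frames) / sum(frames)
    worst = frames[:max(1, len(frames) // 100)]     # slowest 1% of frames
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps

# 990 smooth frames at 8 ms plus 10 spikes at 25 ms (one short hitch).
sample = [8.0] * 990 + [25.0] * 10
print(fps_stats(sample))  # average stays around ~122 fps, but the 1% low drops to 40 fps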
 
Looking at these comments, I think there really need to be articles about which processors pair well with which GPUs in different scenarios. Simple benchmarks obviously need to avoid bottlenecks, but what people should pair up with what also needs consideration.

It gets especially confusing when you have different generations of graphics cards and processors. Even if it is just a simple ballpark estimation.
 
A 4090 and a 7600 seems like a laughable combo; who would run that, let alone a 5600 and a 4090? I get the 1080p testing, but I don't get using such a ridiculously expensive GPU with a 7600X. The 4090 is an aberration in the current GPU world (in a good way). Also, why no 7900 data points?

Owners of a 5600 would be much more likely to have a 6600/6700/3060/3070, so I would focus the scaling on that mid-tier, maybe up to a 3080/6900, given the good second-hand prices.
 
The whole point of this test was to explore CPU bottlenecks, not GPU bottlenecks. Somehow people miss that point every time one of these tests is done.
The 4090 is clearly bottlenecked at 1080p by both these processors, but if you are buying a 4090 to play at 4K, which is very reasonable, will the 5600X still slow you down enough to count? It's relevant because 40-series/RDNA 3 cards could still be bottlenecked even at higher resolutions, but likely by much lower percentages. If at 4K the 5600X is less than 5% slower, that might be relevant for someone not ready for a platform upgrade.
 
I am so done with these tests that have no real world application. Definitely get a 7600 if you plan to run your $1800 4090 at 1080p
 