Is Your Older Ryzen CPU Fast Enough for the RTX 4080?

I see a lot of people questioning why you would put a 5600 with a 4080, but when the 5600X came out most reviewers said it was the best gaming CPU, and that it was faster (by a very small margin) than the bigger 5700X and 5800X CPUs. So I think it is likely that some people would be interested to see if that was a good idea.
 
The 5600X was never touted as the best CPU for gaming. It was never better than the 5800X. Reviewers picked it a lot for its price-versus-performance. Even then, they warned of its six cores.

That said, most people run too powerful a CPU compared to their GPU. Unless you're an e-sports gamer needing 500 FPS, spending more on the GPU is the best move.
 
Impressive how X3D flexes its muscles, putting the 4070 ahead of the 4070 Ti. But you could have added the 5700X3D to the comparison, the best value for money on the AM4 platform.


I don't play Fortnite, but the difference is probably due to the DX11 API(?)

I agree; more pointedly, the conclusion says to only go with a 5800X3D if you already run an AM4 platform. Why? Is there any testing data to back up that claim? I don't see any, so IMHO it's more opinion than established fact. For example, with current discounts an AM4 motherboard with a 5800X3D is a much better value, even if it doesn't perform at the same level as the newer Ryzen. Still, overall it is a telling article. Better IPC will always show improved performance, who knew?
 
I don't know, speak for yourself. I have a 4K monitor and have been running 4K since 2016; it was always nice to know that my CPU was going to be viable when paired with newer and newer GPUs. This allowed me to keep my 5960X for a really long time, and before that my 920/960/X5650. Upgrading was more of a want than a need, to a certain extent. Hoping my 11900KF can last another 5-7 years too.
But you don't need a 4k CPU review for that. You can check a 1080p CPU review. If your CPU gets 100 fps at 1080p, it can still get 100 fps at 4k, 8k, 16k, 256k etc.
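The reply above is describing a simple bottleneck model: the CPU's frame-rate ceiling is roughly resolution-independent, while the GPU's ceiling falls as resolution rises, and you only ever see the lower of the two. A minimal sketch of that reasoning (the specific fps numbers here are illustrative assumptions, not from the article):

```python
# Simple bottleneck model: delivered fps is capped by whichever
# component is slower at your chosen settings.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """CPU limit is (roughly) the same at any resolution; GPU limit
    drops as resolution rises. You see the lower of the two."""
    return min(cpu_fps, gpu_fps)

cpu_limit = 100  # fps the CPU can feed, taken from a 1080p CPU review
gpu_limits = {"1080p": 220, "1440p": 150, "4K": 80}  # hypothetical GPU ceilings

for res, gpu_fps in gpu_limits.items():
    print(f"{res}: {effective_fps(cpu_limit, gpu_fps)} fps")
```

With these made-up numbers the system is CPU-bound at 1080p and 1440p but GPU-bound at 4K, which is exactly why a 1080p CPU review tells you the ceiling your CPU carries up to higher resolutions.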
 
Even the 5800X3D can limit a 4080 quite a bit at 1440p in certain games like Cyberpunk, Spider-Man, TLOU Part 1, Plague Tale, and Jedi Survivor (though that last game is really poorly optimized). I've got a 5800X and a 4070 Super and am thinking of switching to a 5700X3D; I should be able to sell the 5800X for at least 70% of the cost.
 
A lot of elitist bullshit, dogma, and "common sense" in the comment section here.

We test what we don't know to learn and we test what we know to confirm.

Good on HUB/Techspot for doing some off the rails testing, good use of GPU/CPU review time between major launches.

Honestly a bit of a surprise to me to see the workhorse 5600 CPU that was at the heart of so many 2020-2023 systems kneecap a 4080 as badly as it does.

Would have liked to have seen AMD cards in a test like this. For a long time AMD cards were the ones with bad driver overhead, but somewhere in the recent past that changed, and now Nvidia seemingly has more overhead than AMD.

Would be curious to know how the GPU stack shakes out on a more realistic mid-range CPU that the majority of folks are running, instead of the usual uber-CPU we get in most GPU reviews.
AMD was only really bad in DX11 and OpenGL (not sure how they fare there now since the last AMD card I had was an RX 570 over 3 years ago) but their hardware scheduler really helps them with lower overhead in DX12 and Vulkan versus Nvidia which does the scheduling in the driver. Also back in the day GCN was helped by its asynchronous compute capabilities versus Maxwell and Pascal.
 
It cracks me up that these articles have to be made to be honest.

There's a reason why every CPU review is done with the highest-end GPU available and at lower resolutions. In every article that's looked into past CPUs, if a CPU was slower in games than a competing product at time of review, it doesn't get better. It doesn't mature like a fine wine; the gap between products normally becomes more pronounced.

Still hurts my brain when people ask for 4K benchmarks in a CPU review...
4K benchmarks show how unimpressive new CPUs are if you game at high resolution with max settings to match your high-end card; in that scenario even the first i7 with SATA II, DDR3, and PCIe 2.0 still performs well. I know because I have i7 920, 930, 960, and 980X systems alongside newer i9 10920X, Ryzen 5 3600, i7 10700KF, and i7 13700KF systems. And those old CPUs still scale with my RTX 4080 Super, RX 7900 XT, RTX 3080 Ti, RTX 2080 Ti, Vega 64, GTX 1080 Ti, GTX 980 Ti, and R9 Fury cards.
 
I'm sorry, but the author's advice to spend $310 on upgrading a 5600 to a 5800X3D is appalling. That's such a small performance gain for that money. For $350 where I live you can get a 7600X, mobo, and RAM. And this same author already did a piece showing the 7600X to be 3% faster than a 5800X3D at gaming (and a lot faster at everything else). Plus you'll have faster RAM and a more modern motherboard that will get a few generations of CPU upgrades.

In-socket upgrades work well across CPU generations. But upgrading within the same generation is rarely worth it. I know the X3D is a little different; if it were $200 it might be a different story, but at $310 just go AM5 (or Intel, right now all the best value parts seem to be from Intel).
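The value argument above can be made concrete as cost per percent of fps gained. The $310/$350 prices and the ~3% 7600X-over-5800X3D figure are quoted from the comment; the 20% 5800X3D-over-5600 uplift is a placeholder assumption of mine, so treat this as a sketch of the arithmetic, not real benchmark data:

```python
# Back-of-the-envelope "cost per percent of fps gained" for the two
# upgrade paths discussed above, starting from a Ryzen 5600.
x3d_cost, am5_cost = 310, 350   # USD, as quoted in the comment

x3d_uplift = 1.20               # ASSUMED: 5800X3D ~20% faster than a 5600 in games
am5_uplift = x3d_uplift * 1.03  # 7600X quoted as ~3% faster than the 5800X3D

cost_per_pct_x3d = x3d_cost / ((x3d_uplift - 1) * 100)
cost_per_pct_am5 = am5_cost / ((am5_uplift - 1) * 100)

print(f"5800X3D drop-in: ${cost_per_pct_x3d:.2f} per % gained over the 5600")
print(f"AM5 platform:    ${cost_per_pct_am5:.2f} per % gained over the 5600")
```

Under this assumption the AM5 path comes out slightly cheaper per percent gained, before even counting the platform upgrade path, which is the commenter's point.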
4K benchmarks show how unimpressive new CPUs are if you game at high resolution with max settings to match your high-end card; in that scenario even the first i7 with SATA II, DDR3, and PCIe 2.0 still performs well. I know because I have i7 920, 930, 960, and 980X systems alongside newer i9 10920X, Ryzen 5 3600, i7 10700KF, and i7 13700KF systems. And those old CPUs still scale with my RTX 4080 Super, RX 7900 XT, RTX 3080 Ti, RTX 2080 Ti, Vega 64, GTX 1080 Ti, GTX 980 Ti, and R9 Fury cards.
There are current games where even at 4K you will see a massive difference if you compare your old CPU against something more recent; this review shows some examples of that.

If a Zen 3 CPU loses that much performance, imagine these older/weaker models.
 
I have a 5800X and a 4080, and after seeing these results... I play at higher resolutions, so the fact that the 5600X, and by association the 5800X, is actually holding a 4080 back by a significant number of frames at 4K in some games definitely gives me reason to consider upgrading.
 
Long gone are the days when you upgraded only your GPU; now even the 7800X3D/14900K can't fully drive a 4090.
Just look at the scaling across current-gen CPUs. If I'm right, the 4090 should see some performance bump when new-gen CPUs are released.
 
I have a 5800X and a 4080, and after seeing these results... I play at higher resolutions, so the fact that the 5600X, and by association the 5800X, is actually holding a 4080 back by a significant number of frames at 4K in some games definitely gives me reason to consider upgrading.

I think your 5800X is still fine for feeding frames to a 4080 at 4K (at least in most modern AAA games with the eye candy on). Check out the games you play before you make the switch.

There is a reason you can't find many 4K CPU benchmarks on the net: the results are boring for most modern games when comparing modern CPUs.

Take this article and the 4K differences between the 5600 and the 7800X3D (which seems a little unfair on paper, but anyway):

- Avatar, Starfield, The Last of Us, Star Wars Jedi: Survivor: minimal differences in averages and lows. Those games IMHO show the typical behaviour of modern AAA games when they are maxed out at 4K.

- Spider-Man: OK, here are some bigger differences; seems to be worth it if this is your game. But don't mistake the odd Spider-Man engine (Insomniac's engine) for a rule of thumb.

- Rest of the games (COD:W, Fortnite, CS2): medium settings or vintage engines were chosen to force those differences. If you don't play those games at those settings, the differences will be much smaller; e.g. max out the Fortnite engine with RT on at 4K.

I reckon your 5800X is pretty fine if you play AAA games and like the eye candy on (of course at 4K). But still, it's a good idea to check the upgrade value for the games you play.
 
There are current games where even at 4K you will see a massive difference if you compare your old CPU against something more recent; this review shows some examples of that.

If a Zen 3 CPU loses that much performance, imagine these older/weaker models.
Not really, unless it's DRM-related. Keep in mind I actually own these systems and benchmark them in many popular titles like RDR2, GTA5, Darktide, Vermintide 2, RoboCop, and BF2042; once you hit 4K max/ultra the performance difference drops off dramatically because the GPU is what matters more. The real exception is AVX-512 games, which will not run at all on older systems. Excepting that, the rest are very much playable. I'm not saying they perform as fast in every title, but certainly good enough. In BF2042 my i7 13700K @ XMP 32GB 6200MHz with an RTX 3080 Ti gets 82.95 FPS at 4K ultra, while my i7 980X @ 4.5GHz with 24GB DDR3 @ 1293MHz and the same RTX 3080 Ti gets 71.25.
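For what it's worth, the two BF2042 figures quoted in that comment (82.95 FPS on the 13700K vs 71.25 FPS on the 980X, same RTX 3080 Ti at 4K ultra) put a number on "drops off dramatically":

```python
# Relative gap between the two BF2042 results quoted above
# (same RTX 3080 Ti, 4K ultra; figures from the comment).
new_cpu_fps = 82.95   # i7 13700K @ XMP, DDR5 6200MHz
old_cpu_fps = 71.25   # i7 980X @ 4.5GHz, DDR3 1293MHz

gap_pct = (new_cpu_fps - old_cpu_fps) / old_cpu_fps * 100
print(f"13700K is {gap_pct:.1f}% faster at 4K ultra")
```

So roughly a 16% gap across thirteen years of CPUs in this GPU-bound scenario, which supports the "good enough, but not as fast" framing.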
 
I still game on a TR 3960X with a 3080 12GB at 1440p... I wonder how it compares with those results.
 
I agree.

I mean, if you are the type who will casually dump a grand-plus on a GPU such as the 4080, while knowing you are being scalped, there is pretty much zero chance you will own a lowly 5600.

You will throw in another grand and buy the best and latest CPU+MOBO+RAM.

The kind of guy who owns a 5600 would pair it with maybe a 3070 or 3060 Ti.
Sensible
 
If I was going to upgrade from a 5600 ($200 or less), where would I go? Maybe a 5700X3D or a 5800X3D? The 5700X3D is $244 at Newegg, and you could probably sell that 5600 for $125-$150. So the net cost would be around $100.
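The net-cost arithmetic in that comment, spelled out (prices and the resale range are the commenter's figures, not market data):

```python
# Net cost of the 5600 -> 5700X3D swap described above.
new_cpu_price = 244                  # 5700X3D at Newegg, as quoted
resale_low, resale_high = 125, 150   # estimated resale range for the used 5600

net_worst = new_cpu_price - resale_low   # if the 5600 sells at the low end
net_best = new_cpu_price - resale_high   # if it sells at the high end
print(f"Net upgrade cost: ${net_best}-${net_worst}")
```

That lands at $94-$119, consistent with the commenter's "around $100".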
 