Is Your Older Ryzen CPU Fast Enough for the RTX 4080?

I don't know why, but I feel like the Fortnite results are a tad lower than expected at 1080p medium settings for such a CPU. Not saying they're wrong, just weird (I've seen similar results in other sources).
 
Impressive how X3D flexes its muscles, putting the 4070 ahead of the 4070 Ti. But you could have added the 5700X3D to the comparison; it's the best value for money on the AM4 platform.

"I don't know why, but I feel like the Fortnite results are a tad lower than expected at 1080p medium settings for such a CPU. Not saying they're wrong, just weird (I've seen similar results in other sources)."
I don't play Fortnite, but the difference is probably due to the DX11 API(?)
 
It cracks me up that these articles have to be made to be honest.

There's a reason why every CPU review is done with the fastest GPU available and at lower resolutions. In every article that's looked into past CPUs, if a CPU was slower in games than a competing product at the time of review, it doesn't get better and it doesn't mature like a fine wine; the gap between products normally becomes more pronounced (a toy sketch of why is below).

Still hurts my brain when people ask for 4K benchmarks in a CPU review...
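A minimal sketch of that bottleneck logic, with made-up numbers purely for illustration (none of these figures come from the article): the effective frame rate is roughly capped by whichever of the CPU or GPU finishes its per-frame work last, which is why low resolutions expose CPU differences and 4K hides them.

```python
# Toy bottleneck model: effective FPS is roughly min(CPU limit, GPU limit).
# The CPU limit is largely resolution-independent, while the GPU limit
# falls as the pixel count rises.
def effective_fps(cpu_limit: float, gpu_limit: float) -> float:
    return min(cpu_limit, gpu_limit)

cpu_limit = 120  # hypothetical: this CPU can prepare ~120 frames/s at any resolution
for res, gpu_limit in [("1080p", 250), ("1440p", 160), ("4K", 75)]:
    print(f"{res}: {effective_fps(cpu_limit, gpu_limit)} fps")
# 1080p: 120 fps  -> CPU-bound: gaps between CPUs are fully visible
# 1440p: 120 fps  -> still CPU-bound
# 4K: 75 fps      -> GPU-bound: every CPU faster than 75 fps looks identical
```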
 
Without a 5700X3D or 5800X3D, this comparison is all a bit stupid, as the upgrade path to the 7000 series adds the considerable cost of a new motherboard + memory on top.

If you have money to burn on a high-tier graphics card, then obviously you can also spend less than $1k on a new CPU + memory + motherboard to get the most out of it.

If you still have a Ryzen 5 system, I'm not quite sure you're going to lash out on a 4080 in the first place.

A 4070 Ti or 4080 is also now well outgunned by the 4070 Ti Super or 4080 Super - where are those in this comparison? This article seems outdated already.
 
Disappointed not to see the 5700X here :( It's one of the most popular and most balanced CPUs on AM4.

This could have been a good reference for when I want to upgrade my 1080 Ti.
 
This is like putting bicycle wheels on a tank to see if it works. It may, but it's a waste of time nonetheless.
 
Great review, although it would have been even more interesting with an equivalent CPU... or with the addition of a 70-class GPU...
And it's common for 4080 owners to choose a better CPU than the Ryzen 5600, since in many games the 5600 can't extract the full performance of the 4080.
 
I upgrade in stages - I recently upgraded my GPU (to the 7800 XT), which is being bottlenecked by my CPU and RAM (i7-7820X @ 4.7 GHz and DDR4-2666). The next upgrade will be the motherboard/CPU/RAM, though probably not until next year, as it's an expense I can't justify yet, so I'll have to cope with an approximately 10% drop in GPU performance (according to Time Spy Extreme). As my monitor is a 60 Hz 1440p IPS, it just means I push up the detail and effects to hit the sweet spot.
 
Great article, Steve!

While having a 5800X3D and 3600/3800 in the same chart would have been a bonus for convenience, I am able to pull up a second article and extrapolate. (Sorry others cannot)

I find these "where are the thresholds for CPU/GPU combos" articles super interesting. Keep them coming!
 
This data backs my decision to upgrade my living room PC's 3900X to a 5800X3D.

While the 3900X was only holding back my RTX 3080 a bit, my planned upgrade to an RTX 5070/5080 would have been a waste without more CPU grunt, and it looks like the 5800X3D will at least mostly keep up with the next GPU (saving a rebuild for the upgrade after that).
 
"I am able to pull up a second article and extrapolate. (Sorry others cannot)"
Oh, that's great. I'm one of those others you are sorry for. I can't seem to find the 'second' article with the 5800X3D and the RTX 4080 and the Avatar scores. And I'm also unable to locate the 5800X3D article with the exact settings and game-mix average as in this one. Clearly it has to exist. Would you be so kind as to link 'the second article' for internet dorks like us? Many thanks :)
 
"Great review, although it would have been even more interesting with an equivalent CPU... or with the addition of a 70-class GPU... And it's common for 4080 owners to choose a better CPU than the Ryzen 5600, since in many games the 5600 can't extract the full performance of the 4080."

I agree.

I mean, if you're the type who will casually dump a grand-plus on a GPU such as the 4080, knowing full well you're being scalped, there's pretty much zero chance you own a lowly 5600.

You will throw in another grand and buy the best and latest CPU+MOBO+RAM.

The kind of guy who owns a 5600 would pair it with maybe a 3070 or 3060 Ti.
 
"Shocked Steve didn't include the FX 8350 to see if it was fast enough for the RTX 4080."
Yeah I was really surprised he didn’t include the 2600K as well.

Or the 9900K, or the old Phenom IIs - I'm sure it's all relevant...
 
"Still hurts my brain when people ask for 4K benchmarks in a CPU review..."
I don't know, speak for yourself. I have a 4K monitor and have been running 4K since 2016; it was always nice to know that my CPU would still be viable when paired with newer and newer GPUs. That let me keep my 5960X for a really long time (and before that my 920/960/X5650), and upgrading was more of a want than a need, to a certain extent. I'm hoping my 11900KF can last another 5-7 years too.

Hate to use the ol' car analogy here, but I look at 1080p benchmarks with "insert highest-end GPU here" like drag racing: fun to watch, lets you yell out "MURICA!" (or whatever your country's redneck equivalent would yell) and then get back into your slightly-above-average daily driver and comfortably drive home. It's important data to collect, don't get me wrong; it attracts views, comments, tears, all those good things. But ultimately it applies to a small margin of users - and I suppose, understandably, I'm one of them.

However, data is data, and the more you have the better - do the charts bother you that much with extra data? I honestly hope that in a few years I get to see a benchmark that includes the 11900K or KF in the same light as this one, so I can know just how much it's bottlenecking my future GPU upgrade.

Cheers,
 
I get why we test at 1080p, but in the real world no one is buying a 4080 to play at that res, and honestly I don't even look at 1080p results anymore. I haven't played at that res since I got a GTX 1070.
My second-tier PC is running a 3700X + 2080 Super and handles 1440p perfectly for 95% of the games I play, no DLSS, and I use my 5800X + 6800 XT PC for the more graphically intense games.

If you're building a PC with specs as mismatched as a 5600 + 4080, then you should stop.
 
A lot of elitist bullshit, dogma, and "common sense" in the comment section here.

We test what we don't know to learn and we test what we know to confirm.

Good on HUB/Techspot for doing some off-the-rails testing; a good use of GPU/CPU review time between major launches.

Honestly a bit of a surprise to me to see the workhorse 5600 CPU that was at the heart of so many 2020-2023 systems kneecap a 4080 as badly as it does.

Would have liked to see AMD cards in a test like this. For a long time AMD cards were the ones with bad driver overhead, but somewhere in the recent past that changed, and now Nvidia seemingly has more overhead than AMD.

Would be curious to know how the GPU stack shakes out on a more realistic mid-range CPU that the majority of folks are running, instead of the usual uber-CPU we get in most GPU reviews.
 
Can you perhaps put the current-gen flagship up against the previous-gen flagship? That would be the 5950X versus the 7950X (plus perhaps the 3D variants). Comparing a 5600 to a 7800 doesn't make sense to me.
 
WTF is this dogshit article, lmao. Who pairs a Ryzen 5600 with a 4080 (which is a crappy video card anyway - either stick to the xx70s or throw more money at a 4090)? And where are the other video cards and CPUs, e.g. the 5800X3D?
 
"I get why we test at 1080p, but in the real world no one is buying a 4080 to play at that res, and honestly I don't even look at 1080p results anymore."
The only thing I'd add here is that 1080p is the DLSS Performance internal resolution at 4K output, and something close to it (960p) is Quality mode at 1440p, so if your CPU is holding back lower-res performance, the symptom might be minimal performance gains when using upscaling (a quick arithmetic sketch is below).

I don't disagree with you - a 4080 is a stupid card to target 1080p output gaming with - but just offering up that perspective too.
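For reference, the internal-resolution arithmetic behind that point. The per-axis scale factors below are the commonly cited DLSS values, and the helper function is purely illustrative, not part of any real DLSS API:

```python
# Commonly cited per-axis render-scale factors for DLSS modes.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

# Illustrative helper (not a real API): the internal render resolution
# for a given output resolution and upscaler mode.
def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    scale = DLSS_SCALE[mode]
    return round(width * scale), round(height * scale)

print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080) - 4K Performance renders at 1080p
print(internal_resolution(2560, 1440, "Quality"))      # (1707, 960)  - 1440p Quality renders at ~960p
```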
 