Starfield GPU Benchmark: 32 GPUs Tested, FSR On and Off

I'm running on the new Samsung G9 57" dual-4K monitor with an i9-13900K (Z790 motherboard and DDR5-6000) and an RTX 4080, with HDR on.

The only time the i9 woke up was during shader compilation; it's asleep for most of the game. Not a very CPU-intensive game. I do wonder if upgrading from my i7-8700K (48x multiplier OC) was a waste of $$.

FSR does a good job of upscaling to 21:9 at 2160p, albeit with the annoying black bars, so I'm basically playing it on a larger version of my old 34" 21:9 1440p monitor. Mostly over 60 fps: 50s in dense environments and 70s elsewhere.
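For reference, a quick sanity check of the pillarbox math (assuming the game displays a straight 21:9 crop at 2160p on the 7680x2160 panel; the exact in-game aspect ratio may differ slightly):

```python
# Pillarbox geometry for a 21:9 image on a 32:9 dual-4K (7680x2160) panel.
panel_w, panel_h = 7680, 2160
render_w = round(panel_h * 21 / 9)   # 21:9 at 2160p -> 5040 px wide
bar_w = (panel_w - render_w) // 2    # black bar width on each side

print(f"Displayed area: {render_w}x{panel_h} ({render_w * panel_h / 1e6:.1f} MP)")
print(f"Black bars: {bar_w} px per side")
print(f"Panel width used: {render_w / panel_w:.0%}")
```

That works out to roughly 5040x2160 with 1320 px bars per side, i.e. about two-thirds of the panel, which lines up with it feeling like a bigger 21:9 display.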

On the bright side, I have heaps of screen space for displaying overlays and my Aida64 panel. Joking aside, I want proper 32:9 support without the edge distortion that slightly ruins RDR2.

I think GPU benchmarking and game testing should start including dual-4K scenarios. It's pretty cool.
 
The text below the first chart describes the performance difference between the two camps simply as "quite surprising, although it remains very playable."
 
The latest Nvidia drivers dramatically improved performance! On my Asus ROG Strix 3080 OC, I gained 20 to 30 FPS without changing anything, so this review is already outdated for Nvidia cards.
The game is super playable now with a 3080 at 1440p.
 
We don't plan to update this article to add the new Radeon GPUs unless there's a noteworthy driver or game update accompanying them.

Hopefully you'll do an addendum with the new Nvidia drivers and DLSS support when it becomes official.
 
The following is a recording of CPU utilization during a run around New Atlantis (the main city hub). Graphics settings were 4K, Ultra, no FSR or resolution scaling; the test system used a Core i7-9700K, 32GB of DDR4-3200, and a GeForce RTX 4070 Ti.

[Attachment 89226: per-core CPU utilization capture]

As you can see, all 8 cores on that CPU are being absolutely hammered. I've tried to examine things further, but Nvidia's Nsight Systems and Nsight Graphics both just crash with this game, so it'll have to wait until they're updated. Unfortunately, I don't have a Radeon GPU to try AMD's diagnostic tools.

Whatever the CPU and/or system is being tasked with, it doesn't seem to be related to data flows -- neither the GPU nor the storage drives were transferring much of anything.
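In case anyone wants to reproduce this kind of capture without Nsight, here's a minimal sketch using psutil to log per-core CPU load and disk throughput while the game runs (the one-second sample interval and print format are my own choices; GPU utilization needs a vendor tool such as nvidia-smi instead):

```python
import time
import psutil

INTERVAL_S = 1.0  # sample once per second; adjust as needed

# Prime the counters so the first delta-based reading is meaningful.
psutil.cpu_percent(percpu=True)
prev_disk = psutil.disk_io_counters()

while True:
    time.sleep(INTERVAL_S)
    per_core = psutil.cpu_percent(percpu=True)  # % load per logical core
    disk = psutil.disk_io_counters()
    read_mbs = (disk.read_bytes - prev_disk.read_bytes) / INTERVAL_S / 1e6
    write_mbs = (disk.write_bytes - prev_disk.write_bytes) / INTERVAL_S / 1e6
    prev_disk = disk
    cores = " ".join(f"{c:5.1f}" for c in per_core)
    print(f"cores[%]: {cores} | disk R/W: {read_mbs:6.1f}/{write_mbs:6.1f} MB/s")
```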

Edit: And here's PresentMon data for the same area:

[Attachment 89227: PresentMon capture for the same area]

It's pretty frustrating not being able to pick out exactly what's going on here, as the above data seemingly contradicts the earlier set -- a game that's heavily CPU-limited would show a notable gap between the frame time and the GPU busy time (the GPU sitting idle each frame while it waits for the CPU to issue work), but that's not the case here, and the GPU is clearly working hard.
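For anyone digging through their own PresentMon logs, here's a rough sketch of that frame-time vs. GPU-busy comparison. The column names match PresentMon 1.x CSV output (msBetweenPresents, msGPUActive); newer releases rename them, so check your file's header, and the filename and 10% threshold are placeholders:

```python
import csv
import statistics

LOG = "starfield_presentmon.csv"  # point this at your own capture

frame_times, gpu_busy = [], []
with open(LOG, newline="") as f:
    for row in csv.DictReader(f):
        frame_times.append(float(row["msBetweenPresents"]))
        gpu_busy.append(float(row["msGPUActive"]))

ft = statistics.mean(frame_times)
gb = statistics.mean(gpu_busy)
gap = ft - gb  # average time per frame the GPU sat idle waiting on the CPU/driver

print(f"avg frame time: {ft:.2f} ms ({1000 / ft:.0f} fps)")
print(f"avg GPU busy:   {gb:.2f} ms")
print(f"gap: {gap:.2f} ms -> {'CPU/driver-limited' if gap > 0.1 * ft else 'GPU-limited'}")
```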
Weird. I'll have to look into things on my end even further. I kept hitting CPU bottlenecks on my 5700X; it only seemed to use a few cores on my machine. I haven't really played it since that initial run because of that. I figured it was a bug.
 
The minimum specs call for a 6-core CPU. Has anyone tried with an older 4-core/8-thread CPU?

I have two gaming systems.
Older one: i7-4790K @ 4.6GHz, 32GB RAM, 2080 Super, 1440p 60Hz monitor.
Newer one: i5-10600K @ 5GHz, 32GB RAM, 3080 Ti, three 1920x1200 60Hz monitors in Nvidia Surround.

I use the older one for most games that don't support Surround and get 50-60 FPS in virtually anything I run on it, including BG3, but none of those games demand a 6-core CPU. I can run Starfield on the newer one using just the center monitor if I have to, but I'd prefer the 1440p display. Any feedback on 4-core performance?

With that question asked, I'm going to wait for the GotY or Definitive version before making the leap. Let the price come down and the inevitable bugs get squashed first. I'm too busy playing BG3 anyway.
Of course, it makes sense to wait for the Game of the Year edition and buy it cheaper. It's good that there's BG3 to play in the meantime, and soon there will also be an expansion for Cyberpunk 2077, so there will definitely be something to do.
 
There's no denying that Starfield is a ridiculously demanding game that does require further optimization work, but had mid-range current-gen GPUs offered an actual performance uplift, it would be easy to achieve well over 60 fps at 1440p using high quality settings.

That's a bit of victim shaming, isn't it? So it's Nvidia and AMD's fault that this game runs poorly? I don't think so.
 