Starfield GPU Benchmark: 32 GPUs Tested, FSR On and Off

I'd be interested to re-test NVIDIA with the DLSS mod.

But yeah, this isn't "optimizing" for AMD; this is just the most piss-poor performance I have seen since the original Crysis came out.
Yeah, this is awful for everyone, unless you're using a 7900 XT at 1440p.
 
Starfield is absolutely gorgeous and, yeah, a real humbling experience for our hardware.

My 3080 gaming laptop clocked out last week and is being repaired under warranty, so I'm back on my GT76 Titan I bought 4 years ago.
But it has a 9900K and a 210-watt 2080, so I can maintain 30 fps at Ultra with only shadows lowered, and I've locked it at that.
And like Steve found, FSR 2 is pretty much worthless, so I'm going to try the DLSS mod today.

But it has already proved itself to me as a great game and a blast to play.

This FSR implementation confuses me. All the other implementations use presets, while this one is just a render-scale slider. I think this game's issues aren't GPU related. I think it's just like all other Bethesda games: poorly optimized for CPU usage. Then again, I've seen some people say their CPU usage is maxed and others say it's low. Reminds me of the FO4 days.
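For anyone wondering what the slider actually does, here's a minimal sketch assuming it's a straight per-axis percentage of the output resolution; the preset percentages in it are the standard FSR 2 ratios, not anything confirmed for Starfield:

```python
# Minimal sketch: map a render-scale percentage to the internal resolution
# FSR 2 upscales from. The named presets below are the standard FSR 2 ratios,
# not Starfield-specific values.
FSR2_PRESETS = {                    # per-axis scale factors
    "Quality": 1 / 1.5,             # ~67%
    "Balanced": 1 / 1.7,            # ~59%
    "Performance": 1 / 2.0,         # 50%
    "Ultra Performance": 1 / 3.0,   # ~33%
}

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a given per-axis scale (0.0-1.0)."""
    return round(out_w * scale), round(out_h * scale)

# A 75% slider setting at 1440p renders internally at 1920x1080...
print(internal_resolution(2560, 1440, 0.75))   # (1920, 1080)
# ...while the usual presets at 1440p come out to roughly:
for name, s in FSR2_PRESETS.items():
    print(name, internal_resolution(2560, 1440, s))
```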
 
They've adopted the strategy of "if I release a very unoptimized game, then":

- I get an earlier release, so money saved/earned $$$

- AMD and Nvidia (and soon Intel) will fight ($$$) to win the "brand optimization"

- In the end, most customers think "the more demanding, the better the game", so $$$

In the end it's $$$ + $$$ + $$$, so why bother?
 
This FSR implementation confuses me. All the other implementations use presets, while this one is just a render-scale slider. I think this game's issues aren't GPU related. I think it's just like all other Bethesda games: poorly optimized for CPU usage. Then again, I've seen some people say their CPU usage is maxed and others say it's low. Reminds me of the FO4 days.
All the major sites test on a 7800X3D or 13900K and still see piss-poor performance. The 6800 can't get 60 fps at 1440p medium in Akila lol
 
I have a 4090/7950X/64GB RAM/Gen-4 NVMe, and with every setting maxed, no resolution scaling and no FSR 2, my frame rates consistently stay in the 70s to over 100 at 4K… the only time I've seen it dip to the high 50s is, very rarely, in the big cities.
 
Bethesda are trying to build hype for an average game, as with all their games. It won't get more than 7-7.5 out of 10.
 
Minimum specs call for a 6-core CPU. Anyone try with an older 4-core/8-thread CPU?

I have 2 gaming systems.
Older one: i7-4790K @ 4.6GHz, 32GB RAM, 2080S, 1440p 60Hz monitor.
Newer one: i5-10600K @ 5GHz, 32GB RAM, 3080Ti, three 1920x1200 60Hz monitors in Nvidia Surround.

I use the older one for most games that don't support Surround, and get 50-60FPS in virtually anything I run on it, including BG3. But none of those games demand a 6-core CPU. I can run SF on the newer one using just the center monitor if I have to, but I'd prefer the 1440p. Any feedback on 4-core performance?

With that question asked, I'm going to wait for the GotY or Definitive version before making the leap. Let the price come down and the inevitable bugs get squashed first. I'm too busy playing BG3 anyway.
 
Finally got it downloaded and configured for my 6800 (non-XT). It should be interesting to see how hard it hits this GPU, as my previous card was a 5600 XT.
 
Getting 120 fps in New Atlantis with everything maxed @ 1440p on my 4070, with the DLSS 3 mod using Frame Generation and DLSS on Balanced. Can't believe that Luke guy made the mod free. PureDark can go blow himself for trying to charge for that ****, greedy effer. He doesn't seem to realize that Luke will make more money off donations for his free mod than PD will make from his paid one. lmao
 
yup.
But that's just theoretical performance; the 30 series does better with DX12, so I wouldn't be surprised to see the 1080 Ti around the 2060 12GB.

Yeah, 2060 Super actually.


Anyway, grab some popcorn and get ready for the users to rip this game to shreds in 6 hours' time.


🍿🍿 🍿

So far it's 47 positive and 4 mixed reviews, so why do I feel the user score will be overwhelmingly negative.....
There are over 6K Very Positive reviews on Steam, so your theory does not seem to be holding true.
 
Don't think I've ever seen such poor performance from the 1080 Ti (I may even pass on purchasing this game until I've upgraded, so it may be some time yet). Bethesda's loss, as this would normally be an easy sell (just completed Outer Worlds Spacer's Choice edition, so I'm looking for my next game). Looks like I'll just hold out for the Cyberpunk DLC (which plays fine at 1440p on a 1080 Ti).

Nvidia probably doesn't bother spending as much time optimizing for the 10 series as for the 30 and 40 series, or maybe they don't optimize for it at all. The 1080 Ti should be superior to the 5700 XT, yet it performs far below it at every resolution. Nvidia wants 10-series users to upgrade.
 
There are over 6K Very Positive reviews on Steam, so your theory does not seem to be holding true.
And 2.5K negative ones. Read the positive ones too; most of them acknowledge the game is a mess. On some forums it only gets 4.7/10 from user reviews, against an 8.5-10 critic score.
At least we know where that sponsorship money went.
 
I think this game's issues aren't GPU related. I think it's just like all other Bethesda games: poorly optimized for CPU usage.
The following is a recording of CPU utilization during a run around New Atlantis (the main city hub). Graphics settings were 4K, Ultra, no FSR or resolution scaling; the system used a Core i7-9700K, 32GB DDR4-3200, and a GeForce RTX 4070 Ti.

starfield_cpu_usage.jpg

As you can see, all 8 cores on that CPU are being absolutely hammered. I've tried to examine things further, but Nvidia's Nsight Systems and Nsight Graphics both just crash with this game, so it'll have to wait until they're updated. Unfortunately, I don't have a Radeon GPU to try AMD's diagnostic tools.

Whatever the CPU and/or system is being tasked with, it doesn't seem to be related to data flows -- neither the GPU nor the storage drives were transferring much in the way of anything.
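For anyone who wants to check their own system while playing, here's a minimal sketch (not the tool used for the capture above) that logs per-core utilization once a second with psutil, so you can see whether all cores are loaded or the work is pinned to one or two threads:

```python
# Minimal sketch: log per-core CPU utilization to a CSV while the game runs.
import csv, time
import psutil

with open("cpu_per_core.csv", "w", newline="") as f:
    writer = csv.writer(f)
    n_cores = psutil.cpu_count(logical=True)
    writer.writerow(["time_s"] + [f"core_{i}" for i in range(n_cores)])
    start = time.time()
    try:
        while True:
            # cpu_percent(interval=1.0) blocks for one second and returns the
            # utilization of each logical core over that window.
            per_core = psutil.cpu_percent(interval=1.0, percpu=True)
            writer.writerow([round(time.time() - start, 1)] + per_core)
    except KeyboardInterrupt:
        pass  # stop logging with Ctrl+C
```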

Edit: And here's PresentMon data for the same area:

starfield_presentmon.jpg

It's pretty frustrating not being able to pick out exactly what's going on here, as the above data seemingly contradicts the earlier set -- a game that's normally very CPU-limited would have a notable discrepancy between the Frame Time and GPU Busy time, but that's not the case here (and the GPU is clearly working hard).
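If anyone wants to run the same check on their own capture, here's a rough sketch of that Frame Time vs GPU Busy comparison from a PresentMon CSV. The column names ("MsBetweenPresents", "MsGPUBusy") and the file name are assumptions and vary between PresentMon releases, so check them against the header of your own log:

```python
# Rough sketch: estimate how CPU-limited a capture is by comparing average
# frame time with average GPU busy time from a PresentMon CSV.
import csv
import statistics

FRAME_TIME_COL = "MsBetweenPresents"   # assumed column name
GPU_BUSY_COL = "MsGPUBusy"             # assumed column name

def cpu_bound_gap(path: str) -> None:
    frame_times, gpu_busy = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                frame_times.append(float(row[FRAME_TIME_COL]))
                gpu_busy.append(float(row[GPU_BUSY_COL]))
            except (KeyError, ValueError):
                continue   # skip rows missing either column
    if not frame_times:
        print("No usable rows -- check the column names against your CSV header.")
        return
    avg_ft = statistics.mean(frame_times)
    avg_gb = statistics.mean(gpu_busy)
    # GPU busy close to frame time = GPU-limited; a large gap
    # (frame time well above GPU busy) points at a CPU/engine bottleneck.
    print(f"avg frame time: {avg_ft:.2f} ms, avg GPU busy: {avg_gb:.2f} ms")
    print(f"avg CPU-side gap: {avg_ft - avg_gb:.2f} ms")

cpu_bound_gap("starfield_presentmon.csv")   # hypothetical file name
```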
 
I could test on both a 6800 and a 3080, but I'm not buying this for 70 EUR.
8 cores/8 threads might not be enough for an open-world game anymore, though.
 
Hmm, now I'm beginning to wonder if there might be a file streaming issue, amongst other things. Here's a 12 millisecond sample from the New Atlantis hub:

starfield_file_streaming.jpg

The long purple and cyan bars in the lower half of the image are the read times for specific textures. Many of them are taking 4 to 6 milliseconds to read from a PCIe Gen 3 SSD (Intel P1). Yes, it's all heavily parallelized, but the entire reading period is 12 ms -- that seems unusually long, though I've not checked it against other open-world games that stream assets.

Here's the SSD utilization and read bandwidth of that drive over the entire sample period (13 seconds):

starfield_ssd_usage02.jpg

It's mostly fine (the average is 21%), but there are some serious spikes -- there's a 475 ms period at 100% utilization, with a total read bandwidth of 1200 MB/s.
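If anyone wants to watch for the same behaviour on their own drive, here's a minimal sketch (again, not the capture tool used above) that samples read bandwidth with psutil. The device name is a placeholder you'd need to change; on Windows the perdisk keys look like "PhysicalDrive0":

```python
# Minimal sketch: sample a drive's read bandwidth every 250 ms and flag spikes.
import time
import psutil

DISK = "nvme0n1"     # placeholder device name -- use the drive the game is on
INTERVAL = 0.25      # seconds between samples

prev = psutil.disk_io_counters(perdisk=True)[DISK].read_bytes
while True:          # stop with Ctrl+C
    time.sleep(INTERVAL)
    cur = psutil.disk_io_counters(perdisk=True)[DISK].read_bytes
    mb_per_s = (cur - prev) / INTERVAL / 1e6
    prev = cur
    # Flag bursts approaching the ~1200 MB/s seen during the 100%-utilization spike.
    marker = "  <-- spike" if mb_per_s > 800 else ""
    print(f"{mb_per_s:8.1f} MB/s{marker}")
```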
 
And 2.5K negative ones. Read the positive ones too; most of them acknowledge the game is a mess. On some forums it only gets 4.7/10 from user reviews, against an 8.5-10 critic score.
At least we know where that sponsorship money went.

Knowing how gamers have agendas against Bethesda, that is an amazing score. So the game is more than living up to the hype. So far I'm loving it; it got me hooked after a couple of hours. The game started me on 4K Ultra and it's running buttery smooth.
 
Spent a couple of hours on the game today. I've uploaded a video showing how the 1080 Ti fares at 1440p.


All graphical presets, with the exception of Shadows which is set to Low, are set to High with FSR 2 engaged. The game mainly averages around 30 to 40 fps depending on scene complexity. Overall the 1080 Ti didn't do too shabby given how demanding this game is. The framerates are not ideal, but it's still playable.
 
Looks more like a "crippling the greens to make the reds look more competitive" thing.

(Not a fanboy, I have both).

[I'll try it out on my XB|SX with Game Pass though. I'll wait until next year or later to get the Game of the Year/Gold/Ultimate edition when it reaches an 80 or 85% sale. Dang, I haven't even downloaded it on my XB... no hurry, I guess, to spend full price as soon as something is released.]
 
The performance of this game is a joke. In no way does the quality of graphics on display justify the GPUs required to run it at a reasonable frame rate. As usual, gamers are being asked to buy optimization in the form of faster hardware, rather than devs actually doing their jobs and optimizing the game before release.
 
It just goes to show the importance of having a high-end GPU, even for gaming at 1080p. For a while now, the consensus among tech sites and Youtube channels has been "For 1080p gaming, you only need a $300 card like the 6700 XT. Anything above that is just overkill and unnecessary."

Glad I didn't listen to that advice and got a 4080 instead.
 
It just goes to show the importance of having a high-end GPU, even for gaming at 1080p. For a while now, the consensus among tech sites and Youtube channels has been "For 1080p gaming, you only need a $300 card like the 6700 XT. Anything above that is just overkill and unnecessary."

Glad I didn't listen to that advice and got a 4080 instead.
You base this conclusion off of one game....? Seems like an odd thing to do. In most recent games, cards in the 6700 XT range have been more than enough for 1080p gaming.

The games that have trouble hitting that are down to either a new engine (Unreal Engine 5), which was already known to be a bit of a resource hog from previews and tests, or to bad coding.

Here are the last 7 games run through performance testing on TPU; 2 of them use UE5 (Immortals of Aveum & Remnant II), which is more demanding than the other engines used.

performance-1920-1080.png (TPU 1080p performance charts, one per game)

As you can see, the 6700 XT works just fine for 1080p gaming.
 