Starfield GPU Benchmark: 32 GPUs Tested, FSR On and Off

Starfield is absolutely gorgeous and, yeah, a real humbling experience for our hardware.

My 3080 gaming laptop gave out last week and is being repaired under warranty, so I'm back on the GT76 Titan I bought four years ago.
But it has a 9900K and a 210 W 2080, so I can hold 30 fps at Ultra with only shadows lowered, and I've locked it at that.
And like Steve found, FSR2 is pretty much worthless, so I'm going to try the DLSS mod today.

But it has already proved itself to me as a great game and a blast to play.
 
Ouch. 5900X, 32 GB RAM and an overclocked 3090, all under water, and this is still going to hurt... guess I'll have to be sensible with settings.
 
I'm still trying to figure out why this game is so demanding on a GPU compared to a game like Cyberpunk. Admittedly Starfield looks pretty good at High and Ultra, but the textures and NPC animation are pure garbage next to Cyberpunk's. Getting less than 60 fps in Starfield at 1440p on High settings with a 6950XT, while having to look at lousy textures and daft NPC animations, is disappointing. Even worse, getting a better FSR implementation, the ability to use RT, better-looking NPCs and almost 90 fps in Cyberpunk at 1440p with the same 6950XT is a kick in the teeth. Makes me think this is another poor PC port....
I am enjoying the actual game though...
 
LOL, just took a look at one of Asmongold's YouTube videos that reinforces my point. If you want a good old laugh today, have a gander :)
 
The game atm doesn't have ultrawide monitor support... which sucks :| It's limited on my 49-inch monitor. The game's graphics are amazing, but I'm not sure why all the hype. I don't feel like this game is fully there yet. Kind of like what happened with Cyberpunk.
 
Ouch, Nvidia: the XTX is 30-35% faster than the 4080 here and beats the 4090 in most scenarios! I doubt drivers are going to close that gap. Very disappointing.

AMD having the console market is definitely paying off here. The XSX was likely the lead platform in this game's development, and RDNA2 and RDNA3 shine because of that. AI has taken Nvidia's focus away from gaming. This game will sell 7900 XTs and XTXs.

I see that TechSpot says it tested a very demanding area of the game, so I'm sure you'll see benchmarks out there that are much more favorable overall. That said, trying to find the most demanding part of the game is the right approach.
 
Make sure when checking for a CPU bottleneck that you use the 7900 XTX, to mitigate the GPU bottleneck 😅. Remember when I said that when something falls short of expectations, they'll blame it on AI? Some are saying Nvidia is moving driver-optimization personnel onto AI support instead. Although I do recall that about six months back they added AI optimization to driver support as well.
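The point about pairing the CPU test with a 7900 XTX can be sketched with a toy model: the delivered frame rate is capped by whichever stage is slower, so using the fastest GPU available keeps the GPU ceiling high and exposes the CPU ceiling. All figures below are illustrative, not measured.

```python
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Delivered frame rate is capped by the slower of the two stages."""
    return min(cpu_fps, gpu_fps)

# With a mid-range GPU, the GPU ceiling hides the CPU limit:
assert delivered_fps(cpu_fps=90, gpu_fps=55) == 55
# With a top-end GPU (think 7900 XTX), the CPU limit becomes visible:
assert delivered_fps(cpu_fps=90, gpu_fps=140) == 90
```

In other words, a CPU benchmark run on a mid-range GPU would report the 55 fps GPU ceiling for every CPU tested, telling you nothing about the CPUs themselves.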
 
I suppose the good news is that with more and more new titles bringing the hurt, gimping the next generational uplift won't really be an option if they want to sell any gaming GPUs.
 
Don't think I've ever seen such poor performance from the 1080 Ti (I may even pass on purchasing this game until I've upgraded, so it may be some time yet). Bethesda's loss, as this would normally be an easy sell (just completed The Outer Worlds: Spacer's Choice Edition, so I'm looking for my next game). Looks like I'll just hold out for the Cyberpunk DLC (which plays fine at 1440p on a 1080 Ti).
 
Now increasingly glad that I chopped in my 3070 at the end of last year for a Radeon 7900XTX.

I'm staggered that the 7900 XTX is faster here than the 4090, as the 4090 is generally significantly faster elsewhere. It would be good to understand exactly why.
 
I held off on installing mods for the first 10 hours, and I wish I hadn't. I was afraid of breaking the game or making it run worse. But with a couple of mods, one adding DLSS and one saving VRAM, my game is running and looking so much better. This really is a classic Bethesda game: tons of content and a platform to be the best game, but it still needs help from the community because of its scale.
This game will be another Skyrim, because the modding community won't let it be bad lol.
 
The current thing is the blame game: blame it on AI, blame it on consoles, blame it on developers, blame it on the crunch pushed by publishers, blame it on relying on mods to fix their incompetence, blame it on the upscaling and frame-generation crutch used as smoke and mirrors to hide their shortfalls. The biggest winners are always Nvidia, AMD and Intel in brute-forced computing.
I might hold off on a playthrough until there are more mods. Who knows, maybe this time next year we'll have better drivers, patches and mods (improved texture mods that can scale with VRAM and system RAM at minimal performance cost), plus path tracing, DLSS 3.5 Ray Reconstruction support, and FSR 3 or 4, fingers crossed.
 
Wow. Overly demanding and on top of it, I think it looks like crap. Almost like someone threw dust in my eyes - everything looks grainy in the images and in the game trailers I've seen. Yeah...no thanks.
 
Yeah, I've seen a ReShade mod that removes the green algae filter added to the game, and it actually looks pretty good.
 
My 5600X and 6950XT run this at 42 fps in New Atlantis at native 4K. To play, I run FSR at 67% resolution with everything else at Ultra. In most areas I get 60 fps, while in busy areas like New Atlantis or the other cities I get around 54 fps. I'm good with that for now, but I'm really hoping they improve performance by about 10-15% in updates, to keep me above 60 fps.
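For reference, the "67% resolution" scale above is applied per axis, so at a 4K output the internal render resolution works out to roughly 2573x1447, and the shaded pixel count is about 0.67² ≈ 45% of native. This is plain arithmetic, not engine internals:

```python
def render_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a per-axis upscaler scale factor."""
    return round(out_w * scale), round(out_h * scale)

w, h = render_resolution(3840, 2160, 0.67)  # → (2573, 1447)
pixel_fraction = (w * h) / (3840 * 2160)    # ≈ 0.449 of native pixels shaded
```

That's why a 67% scale at 4K costs noticeably less than native 1440p-class shading work per frame, even though the output stays 4K.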
 
It's not even about my 6800 and 3080 performing the same, but more about neither being able to do 1440p/60 at Ultra. Not even close, including HUB's optimized settings. And CPU performance scaling is even worse, with the 5800X3D losing to stock 10th gen.
What a farce to charge 70 dollars for this.

My RX 6800 in CP2077 at optimized settings (no RT) at 1440p got close to 100 fps. In Starfield, not even close to 60.

Oof, not even a 6800 XT with FSR2 can do 1440p/60. And even if it did, a 5800X would drag it down into the 40s anyway.

 
Looks like my poor old 1080 Ti will be hammered at 1440p.... Does anyone know which modern GPU it's roughly equivalent to?
 
The visuals do not warrant the heavy performance demands. Especially in city areas, which look incredibly flat with not particularly great textures, let alone anything like the RT lighting and reflections seen in other demanding titles.
 
Yeah, but make sure you compare it to the 12 GB model and not the extra-weak, crappier 8 GB model.
Yup.
But that's just theoretical performance; the 30 series does better with DX12, so I wouldn't be surprised to see the 1080 Ti around the 2060 12GB.

Yeah, the 2060 Super actually.


Anyway, grab some popcorn and get ready for users to rip this game to shreds in six hours' time.


🍿🍿 🍿

So far it's 47 positive and 4 mixed; why do I feel the user score will be overwhelmingly negative.....
 
I'd be interested to re-test NVIDIA with the DLSS mod.

But yeah, this isn't "optimizing" for AMD; this is just the most piss-poor performance I have seen since the original Crysis came out.
 