Starfield should be running better on Nvidia GPUs than it currently is

Not rendering the sun saves a lot of GPU work.

https://www.techspot.com/news/100099-amd-gpu-users-reportedly-cant-see-sun-any.html

Though, if the possible explanations are "neglect on Bethesda's part" or anything else, it's probably neglect on Bethesda's part.
It saves some, but who knows to what extent; the light is still there, just missing the actual source.
Still, in a game like SF I'd rather play with the sun, or just wait for a driver fix from AMD.

People who enabled ReBAR with Nvidia Profile Inspector for this game managed to get a ~20% boost, which would bring it close to AMD GPUs.
If Nvidia were to whitelist Starfield for ReBAR in their drivers, everyone would benefit from this performance increase.
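For anyone wanting to sanity-check whether ReBAR is actually in effect at the system level before forcing it per-game in Profile Inspector, here is a rough sketch of one way to do it (my own illustration, not from the thread: it assumes nvidia-smi is on the PATH and uses the rule of thumb that BAR1 is only ~256 MiB with ReBAR off but roughly matches the VRAM size with it on):

# Rough heuristic, not a definitive check: compare the BAR1 aperture that
# nvidia-smi reports against the total framebuffer (VRAM) size. With ReBAR
# off, BAR1 is typically only 256 MiB; with it on, it roughly equals VRAM.
import re
import subprocess

def rebar_looks_active() -> bool:
    report = subprocess.run(
        ["nvidia-smi", "-q", "-d", "MEMORY"],
        capture_output=True, text=True, check=True,
    ).stdout
    # First "Total" line is the FB (VRAM) total, the second is the BAR1 total.
    totals = [int(x) for x in re.findall(r"Total\s*:\s*(\d+)\s*MiB", report)]
    if len(totals) < 2:
        raise RuntimeError("Could not parse nvidia-smi memory report")
    fb_mib, bar1_mib = totals[0], totals[1]
    return bar1_mib >= 0.9 * fb_mib  # 0.9 threshold is an assumption

if __name__ == "__main__":
    state = "active" if rebar_looks_active() else "inactive (or not exposed)"
    print(f"Resizable BAR looks {state} on this GPU")

This only reports the platform-level state; the per-game override still has to be set in Profile Inspector, and any actual gain will vary by GPU and scene.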
thx for the info guru.
 
People who enabled ReBAR with Nvidia Profile Inspector for this game managed to get a ~20% boost, which would bring it close to AMD GPUs.
If Nvidia were to whitelist Starfield for ReBAR in their drivers, everyone would benefit from this performance increase.
I've watched some videos on this and it seems ReBAR gives about 3-5 extra FPS (when the average is around 60 fps) for most Nvidia GPUs, so between 5 and 10%, which is a more realistic gain.
 
I've watched some videos on this and it seems ReBAR gives about 3-5 extra FPS (when the average is around 60 fps) for most Nvidia GPUs, so between 5 and 10%, which is a more realistic gain.
If you're talking about that 5800X3D + 3060 video, that's what just the 5800X3D can do in SF.


Mid-60s average, with dips to 50. The game is absolute trash if a 12600K beats a 7950X3D and an 8700K beats a 5800X.

See how enabling Game Bar gives you 5% more performance, what the hell.

This game is just garbage for optimization overall.

 
AMD is just playing Nvidia and Intel at their own game. Nvidia has been known for over two decades for doing everything it can to slow down AMD cards; good examples are the tessellation in Crysis 2 (applied to geometry out of view) and ray tracing in Cyberpunk 2077, which made the game run at cinematic framerates on all cards, but mostly on AMD cards. Newer games are much better; some of them run with lower RT settings so that you can get good framerates without upscaling.

Nvidia has also been doing planned obsolescence for a while now, giving you less VRAM and pushing RT so that you have to upgrade a few years later.

So I really don't see the difference here.
"for over 2 decades for doing everything it can to slow amd cards, good examples are tessellation in crysis 2 (out of view) and ray tracing in cyberpunk 2077 that just made the game run at cinematic framerates on all cards but mostly on amd cards" How can NVIDIA slow down AMD video cards? Back in the crysis 2 days weren't RADEON their videos cards meh anyway? The 500 series was well ahead of the 6700 series from AMD. If anything NVIDIA spent their own money working with game developers to optimize their cards, but if you have solid objective evidence showing NVIDIA purposefully spent money ensuring games wouldn't play well on AMD cards, I'd love to see it, but I bet it's all speculation and rumor.
 
Digital Foundry has done paid content for Nvidia, so they are not a neutral reviewer. GN would never do that, because it's ethically questionable to take money from a company and then later do game performance reviews with that company's GPUs.
I feel the objective of DF's piece is questionable. There is nothing new here, in the sense that if you sponsor a title, you get an advantage when it comes to game optimization. It seems like Nvidia is never in the wrong: if a game performs badly, it's got to be AMD or bad developers.
 
AMD really harmed Starfield and its brand image big time with this sponsorship.
This game looks like a PS4 game, and a 4090 cannot get 100 fps at 1080p ultra in it.
Really absurd!!!

This game would have been the best choice for showcasing FSR3, which they did not do.
But why is AMD sponsorship bad, while Nvidia-sponsored titles are great? Assuming AMD had not sponsored this title, how sure are you it would have turned out any better? Not sure if this is a validated claim or just guesswork. By the way, if you think it is a bad title, the best thing you can do is not play it.
The other point is that the graphics are down to the game engine, nothing to do with AMD. You miss features like RT, but given the poor performance, switching RT on would make the game a slideshow for most.
 
But why is AMD sponsorship bad, while Nvidia-sponsored titles are great?
They're not, but there are a few factors going into this:

1) It's not just an AMD-sponsored game, it's an AMD-sponsored game that runs terribly even on AMD hardware, so it's already on the back foot. Think Cyberpunk 2077 (an Nvidia-sponsored title that ran terribly even on Nvidia hardware at launch).

2) AMD has been seen as the underdog and there's no getting around that: AMD will get far more dedicated enthusiasts defending them simply because they position themselves as the more open-source-friendly company and the better bang-for-the-buck brand. This isn't even consistently true, but well, let's go to the next point.

3) The reality of the situation is that Nvidia commands over 70% of the desktop GPU market if you don't count integrated solutions, so if you take a sponsorship from AMD, you're basically only going to hurt your sales no matter how much AMD tries to compensate, since you're optimizing for the clear minority of PC gamers no matter what. It's not fair, and it puts AMD in a position of never being able to recover market share, but well, they should have thought about that before spending the last 15 years falling way behind Nvidia.
 
Well, the game likely is heavily optimized for RDNA and/or makes use of a lot of 16-bit FP math. Reminds me of RDR2 performing very well on GCN/Polaris relative to Pascal, though the latter did improve eventually.
 
When you look at the difference between AMD GPUs and Nvidia's across the range, the fact that the 4090 still beats the 7900 XTX is a testament to its absolute brute-force performance in my eyes.
Yes, there are just a few outliers like COD, which may also be at least partially due to Nvidia's higher CPU overhead.
 
Well, the game likely is heavily optimized for RDNA and/or makes use of a lot of 16-bit FP math.
In the case of the former, it doesn't seem to be particularly well optimized even for RDNA architectures. At the moment, I can't get any GPU profilers to work properly with Starfield, so I'm not able to check the latter, but from what analysis I have done (using PIX and Nsight Systems), the engine is pretty messy compared to others I've looked at.

For example, walking around New Atlantis results in a huge number of asset reads, many of which take as long as 6 milliseconds to carry out. The vast majority of these are done in parallel, but it means the storage drive and I/O system get hammered quite frequently. The Last of Us Part 1, meanwhile, reads in entire levels in less than a millisecond. Sure, it's not an open-world game like Starfield, but even Cyberpunk doesn't batter the I/O system like this game does.

Edit: Just done a quick PIX log of Cyberpunk -- the image below shows the file reads from the SSD on which both it and Starfield are stored. The upper plot is CP2077, the lower one is Starfield:

[Image: cp2077_vs_starfield_asset_loading.png]

The Starfield capture is actually for a slightly longer sample period (roughly two seconds more than the CP2077 one), but one can see that this game is doing an insane amount of streaming. It does mean that it uses way less VRAM than CP2077, but without the use of DirectStorage to reduce the CPU overhead, it's ridiculously excessive.
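For anyone wanting to eyeball the same streaming behaviour without PIX or Nsight, a very rough sketch like the one below (my own, with the process name and sampling window as placeholders, and it needs the third-party psutil package) can compare the raw read volume of two games back to back; it counts bytes only, so it won't show the individual 6 ms read latencies mentioned above:

# Rough illustration only: sample a running game process's cumulative read
# counters over a fixed window and report the average streaming rate, e.g.
# while walking around a busy city area in each game.
import time
import psutil

def average_read_rate(process_name: str, window_s: float = 10.0) -> float:
    """Average read rate of the named process in MiB/s over the window."""
    proc = next(
        p for p in psutil.process_iter(["name"])
        if (p.info["name"] or "").lower() == process_name.lower()
    )
    start = proc.io_counters().read_bytes
    time.sleep(window_s)
    end = proc.io_counters().read_bytes
    return (end - start) / window_s / (1024 * 1024)

if __name__ == "__main__":
    # "Starfield.exe" is assumed to be the game's process name.
    print(f"{average_read_rate('Starfield.exe'):.1f} MiB/s average reads")

On Windows, psutil's io_counters includes all I/O the process does, not just reads from that particular SSD, so treat the numbers as a ballpark comparison rather than an exact match to the PIX traces.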
 
What about CPUs? Intel CPUs are faster than the Ryzen parts, including the 3D ones, so personally I don't think that AMD intentionally pushed the optimization toward its own products. Of course, that does not mean that any of these companies (Intel, Nvidia, AMD) is not guilty of bad tactics.
 
Two articles explaining the same thing:
"AMD GPUs won't show the sun on any planet"
"Nvidia GPUs have worse performance"

Not displaying a light source/object will certainly free up performance.

Is it dark in the game during the day? No. People would probably notice that, right?

Does that mean that the Sun as a light source is being rendered and applied in the game? Yes.

Does that mean that an AMD card with a white decal missing from the sky is rendering less than an Nvidia card? No. The load is identical, as decals take no extra rendering power compared to shading.
 
The decal that is the Sun is not rendered at all. Decals impose almost no GPU load. The light source that is the Sun (which does push an actual GPU load) is rendered properly, because people can see the shadows that the (missing) sun is casting. This stupid bug is highly unlikely to lead to any differential GPU load. But it does call into question whether other things are being rendered incorrectly on either brand of GPU, which could lead to real performance differences.
Oh OK. I just read that the sun wasn't rendering or being seen for AMD cards. Didn't hear about it being a decal.
 
AMD really harmed Starfield and its brand image big time with this sponsorship.
This game looks like a PS4 game, and a 4090 cannot get 100 fps at 1080p ultra in it.
Really absurd!!!

This game would have been the best choice for showcasing FSR3, which they did not do.

LOL!! Good one!! LOL!!
 
So, Starfield is to AMD what Cyberpunk is to Nvidia ...
You'd hope not; Cyberpunk is a way better-looking game. If you're trying to show off your GPU's capabilities, I don't think I'd want Starfield as my go-to demo 😅
 
Obviously they can ship bug fixes and patches, but you've got a game that...
*doesn't render the sun on AMD
*runs slower than it should on Nvidia and Intel.

OK, so what's left? Qualcomm Adreno? An M1/M2 port running on Apple GPU? I mean, it's not like the late 1990s/early 2000s when you had like a dozen vendors shipping fairly large quantities of GPUs.


 
You can blame Bethesda for that, not AMD. They are using an old engine that they keep patching up.
True.
But this game looks like a PS4 game, and PS4-visual-quality games run 200+ fps on a 4090 at 4K, while this one does not even hit 100 at 1080p.

AMD spent millions on this sponsorship but it backfired. Around 50% of PC GPU users now have an RTX GPU and 84% have an Nvidia GPU.

Bethesda should not have taken this sponsorship.

Did you see the recent news that a Starfield dev was supposed to implement RTX features in the game, with lots of features? It was on his LinkedIn profile.

What happened to those PC-exclusive features?

AMD should just stick to consoles, as they bring nothing of quality or innovation to PC or laptop users.
They only copy-paste Nvidia's old tech.

They even tried to copy Ampere's FP32 with RDNA3 and failed badly.
 
XTX beating the 4090 never looked right.

"Moreover, it makes sense for Bethesda to focus mainly on AMD chips to optimize performance in the Xbox version."

I don't buy it. Is Bethesda a AAA studio or not?

They are, in the sense that the hype and prices align with other AAA studios. The big problem as I see it is that they rely way, way too much on modders to finish their games for them.
I guess in the end it leads to lower costs and higher profits. And when you consider the last few titles they've released, that's their focus: profits over experience.
Which, yeah... considering their competitors, they certainly are a modern AAA studio/publisher, like EA or Ubisoft have proven to be.

Oh, I pine for the days when studios/publishers were made up of gamers doing what they love, making kick-*ss games they'd want to play, not your standard run-of-the-mill, screw-the-consumer-in-every-way-possible capitalists most of them seem to be.
 
True.
But this game looks like a PS4 game, and PS4-visual-quality games run 200+ fps on a 4090 at 4K, while this one does not even hit 100 at 1080p.

AMD spent millions on this sponsorship but it backfired. Around 50% of PC GPU users now have an RTX GPU and 84% have an Nvidia GPU.

Bethesda should not have taken this sponsorship.

Did you see the recent news that a Starfield dev was supposed to implement RTX features in the game, with lots of features? It was on his LinkedIn profile.

What happened to those PC-exclusive features?

AMD should just stick to consoles, as they bring nothing of quality or innovation to PC or laptop users.
They only copy-paste Nvidia's old tech.

They even tried to copy Ampere's FP32 with RDNA3 and failed badly.
Let's be honest: RT in Starfield is out of the question when performance is already lower than Cyberpunk with RT enabled. It wasn't AMD who removed RT, it was Bethesda, because of the piss-poor performance on all platforms.

And it's stupid to say that AMD should not make PC GPUs. They may not have top-tier performance, but from the upper mid-range down to the low end, AMD is better at almost all price points. They had bad launch MSRPs, but the market corrected itself fairly fast.
 