AMD denies blocking Bethesda from adding DLSS to Starfield

Daniel Sims

Staff
A hot potato: As AMD unveiled two new graphics cards to round out the RDNA 3 lineup, it also tried to clarify recent controversy over its agreements with game developers. Meanwhile, as the company announces a hardware bundle for Starfield, whether the game will support Nvidia DLSS remains unclear.

On Friday, AMD Chief Architect of Gaming Solutions Frank Azor said that if Bethesda wanted to implement rival Nvidia's DLSS upscaling technology in its upcoming RPG Starfield, AMD would not stand in the way.

"If they want to do DLSS, they have AMD's full support," Azor told The Verge, adding that nothing is blocking Bethesda from using it.

The recent comments address rising suspicions that Team Red prevents DLSS from appearing in most games for which it signs marketing agreements. Despite this, the game likely won't officially support the feature at launch.

Lists comparing game sponsorship deals with the two companies show that most major Nvidia-sponsored titles support AMD's FSR technology. In contrast, far fewer Team Red games include Team Green's upscaling solution.

AMD's prior hesitance to clarify the situation didn't help, and the company still couldn't reveal all the details of its contracts this week. Azor admitted that the agreements involve money in exchange for technical support and that AMD expects partners to prioritize its technology over competitors', but stressed that it doesn't forbid implementing DLSS.

A few past cases support his assertion. Sony's recent PC conversions include both upscaling tools regardless of sponsor, and Microsoft's Deathloop added DLSS in a post-launch update despite an AMD partnership. Unfortunately, current data mining shows no sign of DLSS or Intel's XeSS in Starfield, but the game will support FSR 2 at launch.

Azor suggested that Bethesda might have focused on FSR 2 for Starfield because Xbox and all recent PC graphics cards can access the technology, while DLSS requires an Nvidia GPU from the GeForce RTX 2000 series or later. Despite FSR's broader compatibility range, comparisons usually give DLSS the edge regarding image quality.

In any case, all three temporal upscalers consume similar per-frame inputs (motion vectors, depth, and jittered color), so implementing one makes including the other two relatively straightforward, especially for a developer with Bethesda's resources. The key evidence for this is how games only supporting DLSS or FSR 2 receive third-party mods adding the missing upscaling method. Many RTX GPU owners anticipate one such implementation from prolific modder PureDark, who has recently inserted DLSS into multiple older AAA titles, including Bethesda's Skyrim and Fallout 4.
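As an illustrative sketch of that shared-input point (all class and field names here are hypothetical, not taken from the real FSR, DLSS, or XeSS SDKs), a renderer that already produces these buffers for one upscaler can put any of the three behind one interface:

```python
from dataclasses import dataclass

# Hypothetical sketch: FSR 2, DLSS 2, and XeSS all consume roughly the same
# per-frame data, so once a renderer gathers it, swapping backends is plumbing.

@dataclass
class UpscalerInputs:
    color: object           # low-resolution color buffer
    depth: object           # depth buffer
    motion_vectors: object  # per-pixel motion vectors
    jitter: tuple           # sub-pixel camera jitter (x, y)
    target_res: tuple       # output resolution (w, h)

class TemporalUpscaler:
    """Common interface; a concrete backend would wrap a vendor SDK."""
    name = "base"
    def evaluate(self, inputs: UpscalerInputs) -> str:
        # A real backend would dispatch GPU work here; we just report.
        return f"{self.name} -> {inputs.target_res}"

class FSR2(TemporalUpscaler):  name = "FSR 2"
class DLSS2(TemporalUpscaler): name = "DLSS 2"
class XeSS(TemporalUpscaler):  name = "XeSS"

frame = UpscalerInputs(None, None, None, (0.25, -0.25), (3840, 2160))
for backend in (FSR2(), DLSS2(), XeSS()):
    print(backend.evaluate(frame))
```

This is only a shape-of-the-problem sketch; the real SDKs differ in setup and resource handling, but the buffers a game must supply overlap heavily, which is why mods can bolt a missing upscaler onto an existing implementation.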

However, users have criticized PureDark's policy of locking downloads behind $5 Patreon contributions. One Star Wars Jedi: Survivor player warned that while a single payment grants access to the latest version of that game's DLSS mod, downloading patches requires a continuous subscription. However, PureDark is not the only modder making DLSS patches. Other modders could just as easily bring DLSS 2 to Starfield for free.

In related news, AMD revealed the details of its Starfield hardware bundle. Purchasing any currently available Ryzen 9, 7, or 5 CPU from the 7000 or 6000 series guarantees a Steam copy of the game. Radeon RX 7000 and 6000 GPUs from the RX 6600 or better are also eligible, but the deal doesn't include the recently unveiled 7800 XT and 7700 XT, which share the same release date as Bethesda's upcoming game. High-end processors and graphics cards come with the premium edition, while mid-range chips include the standard edition.

Starfield launches on September 6 for PC, Xbox Series X, and Xbox Series S, but customers who pre-order the premium edition gain access starting September 1.


 
but the deal doesn't include the recently unveiled 7800 XT and 7700 XT <- false; during its Gamescom presentation AMD said Starfield comes with them too,
stating that all 6000 and 7000 series cards include Starfield for a limited time
 
So it's just an OK GPU with an almost reasonable price. I'ma gonna get one right now!
Like every other AMD GPU you've reviewed.
 
I think this might be very unpopular, and no disrespect to TechSpot and Hardware Unboxed for reporting on it, but this entire thing about AMD allegedly blocking games from getting DLSS feels like such a nothing burger to me.

Like, it might be strong-arm tactics by AMD, and I have zero doubt AMD would basically turn into Nvidia if given the same market position. However, at this point in time it was Nvidia that built the tradition of 'push game publishers to include stupid new tech that is clearly not ready, like ray tracing, so we can sell newer cards faster.'

So AMD is now doing it... except it cannot be confirmed that AMD bars them from reaching out to Nvidia. At face value, a simpler explanation makes more sense to me: once a publisher signs a game as an AMD title, even if not exclusive to AMD, Nvidia might just take weeks or months to return phone calls about help with DLSS implementation, whereas if they go to Nvidia first, engineers and ready-to-go code arrive within a business day.

Plus, the last thing I'd point out is that, again, the tactic is bad, but let's not forget that FSR implementations always work on Nvidia cards anyway: if a game gets FSR 1 or 2, and soon(ish) 3, most people with GPUs from the last eight years or so will be able to use the feature, even if it's not ideal. If you want DLSS, well, 3.0 is the best one, so go buy only Nvidia and only the 4000 series for a while. I still fail to see how games getting FSR and no DLSS is an overall negative if we ever want Nvidia's market share to decline, and trust me: we definitely want Nvidia to lose market share if you ever want better pricing, sufficient VRAM, etc.
 
I think this might be very unpopular, and no disrespect to TechSpot and Hardware Unboxed for reporting on it, but this entire thing about AMD allegedly blocking games from getting DLSS feels like such a nothing burger to me.

Like, it might be strong-arm tactics by AMD, and I have zero doubt AMD would basically turn into Nvidia if given the same market position. However, at this point in time it was Nvidia that built the tradition of 'push game publishers to include stupid new tech that is clearly not ready, like ray tracing, so we can sell newer cards faster.'

So AMD is now doing it... except it cannot be confirmed that AMD bars them from reaching out to Nvidia. At face value, a simpler explanation makes more sense to me: once a publisher signs a game as an AMD title, even if not exclusive to AMD, Nvidia might just take weeks or months to return phone calls about help with DLSS implementation, whereas if they go to Nvidia first, engineers and ready-to-go code arrive within a business day.

Plus, the last thing I'd point out is that, again, the tactic is bad, but let's not forget that FSR implementations always work on Nvidia cards anyway: if a game gets FSR 1 or 2, and soon(ish) 3, most people with GPUs from the last eight years or so will be able to use the feature, even if it's not ideal. If you want DLSS, well, 3.0 is the best one, so go buy only Nvidia and only the 4000 series for a while. I still fail to see how games getting FSR and no DLSS is an overall negative if we ever want Nvidia's market share to decline, and trust me: we definitely want Nvidia to lose market share if you ever want better pricing, sufficient VRAM, etc.

I agree.
Nowadays more tech sites chase sensationalism and clickbait while disregarding the facts, or "forgetting" the truth, doing a great disservice to readers and viewers. Regarding this madness, I can offer something worth investigating, and this is for TechSpot's editors too.
Check the relations and connections between BG3 (Baldur's Gate 3) and Nvidia. Is BG3 an Nvidia-sponsored title? I mean, just ask the same simple questions that are used to "demonize" AMD for the same or worse shenanigans.
Are BG3's devs, or Nvidia, actively or weaselly blocking an FSR implementation?
What you find may surprise you a lot.
 
They might've unblocked games from implementing DLSS, and now they're technically telling the truth.

Idk, I'm just throwing things out there.

Wouldn't Nvidia know something if developers were blocked from implementing its tech?
 
Plus the last thing I often remember is that again, the tactic is bad but let's not forget that FSR implementations always work on Nvidia cards anyway: if a game gets FSR 1 or 2 and soon(ish) 3 then most people with GPUs from the last 8 years or so will be able to use the feature even if not ideal. If you want DLSS well 3.0 is the best one so go buy only Nvidia and only the 4XXX series for a time. I still fail to see how games getting FSR and no DLSS is an overall negative if we ever want to see market share from Nvidia to decline and trust me: we definitively want Nvidia to lose market share if you ever want better pricing, sufficient VRAM, etc.

It's negative when FSR produces worse image quality. Why should RTX users have to suffer with inferior tech for no good reason when they could easily just use DLSS?

The gap between FSR and DLSS image quality can be big sometimes.
Look at the shimmering on the building in the FSR image here:
https://I.ibb.co/2N4WPF0/get.gif
 
It's negative when FSR produces worse image quality. Why should RTX users have to suffer with inferior tech for no good reason when they could easily just use DLSS?

There are a few assumptions here that I covered in my post, but I think they should be highlighted again since I expected this kind of response.

1) 'FSR looks worse' is the most common one, but it doesn't consider that FSR is usually several implementations behind the curve, so it's not really an apples-to-apples comparison when that case gets made. Limit the comparison to just FSR 2 vs DLSS 2 and you eliminate a very large chunk of the 'FSR is worse' argument, though not all of it, since they're a lot closer. I can still concede that DLSS 2 looks slightly better than FSR 2, sure.

2) The argument of 'why should Nvidia users suffer worse tech for no good reason' is really a terrible one to make, since it was Nvidia that first released the technology exclusively for its own products, and not just that, its latest products only. So even if AMD is technically in the wrong for (allegedly) using the same tactics, it was Nvidia that established this tactic, and for this very tech: there was no good reason some form of DLSS couldn't work on at least Pascal cards when it launched. So forget about Nvidia screwing over AMD customers; Nvidia is always the main company screwing over Nvidia customers.

3) Finally, the bigger point is whether a game was ever really going to miss out on DLSS, because, as I mentioned, I believe there's a good reason: Nvidia doesn't just hand you DLSS as a button to turn on. It needs to be implemented, and that's far easier with Nvidia's help, so if you're publishing a game, chances are you won't even think about these kinds of effects until either Nvidia or AMD actually calls and offers help to include them, along with probably a significant boost to your budget, a.k.a. cutting you a check to do so.

So if a game gets FSR/DLSS and gets featured by either company, it's extremely likely in either case because that company partially funded the game's development, beyond making the work easier by lending actual expertise in implementing the tech as part of a sponsorship deal. Almost no game out there isn't sponsored by one of them in at least some minor way. So asking why some RTX users might not have DLSS on day one is the wrong question; the right question is why most games do offer DLSS on day one, and the answer is that Nvidia paid a lot of money for them to do so. There's your reason: AMD is giving RTX users a taste of their own medicine.

And again, both companies are wrong for doing it, but no RTX user can in good conscience call out AMD for doing what Nvidia constantly does, often far worse, and started in the first place.
 
It's negative when FSR produces worse image quality. Why should RTX users have to suffer with inferior tech for no good reason when they could easily just use DLSS?

The gap between FSR and DLSS image quality can be big sometimes.
Look at the shimmering on the building in the FSR image here:
https://I.ibb.co/2N4WPF0/get.gif
Funny how questions can be spun and twisted depending on the narrative.
Better to ask the golden question: WHY do Nvidia GTX users have to suffer from Nvidia's DLSS dark-pattern artificial market segmentation? Because Nvidia blocked DLSS 1 and 2 on GTX 10-series cards. WHY?
Even more, the diamond question: WHY does the same Nvidia continue its DLSS 3 dark-pattern artificial market segmentation, restricting it to RTX 40-series cards only? WHY?

The questions you propose pale in comparison, and their answer is contained in these two essential questions for Nvidia customers. The answer is Nvidia's dark-pattern artificial segmentation of its own video cards. That's why Nvidia users are forced to, and will, use FSR 2 and FSR 3 on Nvidia video cards. Because Nvidia is constantly screwing them.

Any technology that works is better than one that Nvidia deliberately blocks.
And AMD made FSR 2 available for GTX cards and FSR 3 for all RTX cards.

On any GTX 10-series card, FSR 2 is better than DLSS 2; there, DLSS 2 is vastly, even infinitely, inferior to FSR 2, because it simply does not work (read: blocked by Nvidia). Do we get the irony here? Likewise, on any RTX 20- or 30-series card, FSR 3 is better than any DLSS 3 (or 3.1, soon 4 or 007), which again simply does not work.
Any "expert" reviewer can benchmark and prove this; they can even prove that FSR 2 and 3 are "universally superior" * in quality, performance, etc. to DLSS 2 on GTX cards and DLSS 3 on RTX 20/30-series cards, at 1K, 2K, 4K, and 8K (soon 16K). It is so easy: half the tests and charts are already filled with zero-value data, since DLSS 2 on a GTX 10-series card scores 0, and DLSS 3 on an RTX 20/30-series card scores 0. ZERO.
We can easily check and see the 0 fps in the same example: https://I.ibb.co/2N4WPF0/get.gif
So Nvidia is cashing huge and numerous ZEROES for its customers while charging hundreds of real dollars for them.

Thus, all Nvidia RTX 20/30-series users can be thankful to AMD's new FSR 3 for taking care of them. And all Nvidia gamers can now acknowledge how anti-consumer and anti-gamer the Nvidia corporation is, constantly screwing its own customers.

P.S. * "universally superior" is not my formulation; it's a catchphrase I've come across reading reviews on the internet, often used to lead readers into agreeing with the author's or manufacturer's narrative.
 
Because Nvidia blocked DLSS 1 and 2 on GTX 10-series cards. WHY?
It's a bit difficult to run CNNs quickly on GPUs that have no dedicated units for GEMM operations (i.e. tensor cores). It can be done, but with nothing like the same performance, so there's little point in expanding the algorithm's compatibility to a platform that isn't going to benefit from using it. Besides, Nvidia has made an upscaler for 10-series cards -- NIS. It's not as good as FSR and I don't recall seeing any game offering it, but it's there for anyone to implement into a renderer.
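As a rough CPU-side analogy for that GEMM point (a sketch only, nothing here comes from any GPU SDK, and `naive_gemm` is a made-up name): CNN inference reduces largely to matrix multiplies, and the gap between a general-purpose loop and a dedicated matrix path is the same kind of gap tensor cores exploit.

```python
import time
import numpy as np

def naive_gemm(a, b):
    """Textbook triple-loop multiply: stand-in for hardware with no matrix units."""
    n, k = a.shape
    k2, m = b.shape
    assert k == k2
    out = np.zeros((n, m))
    for i in range(n):
        for j in range(m):
            s = 0.0
            for p in range(k):
                s += a[i, p] * b[p, j]
            out[i, j] = s
    return out

rng = np.random.default_rng(0)
a = rng.standard_normal((64, 64))
b = rng.standard_normal((64, 64))

t0 = time.perf_counter()
slow = naive_gemm(a, b)
t_naive = time.perf_counter() - t0

t0 = time.perf_counter()
fast = a @ b  # BLAS-backed: stand-in for a dedicated matrix engine
t_blas = time.perf_counter() - t0

assert np.allclose(slow, fast)  # same math, wildly different speed
print(f"naive: {t_naive*1e3:.1f} ms, BLAS: {t_blas*1e3:.3f} ms")
```

The results are identical; only the throughput differs, which is the argument for not shipping a tensor-core-dependent algorithm to hardware that would have to run it the slow way.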

WHY does the same Nvidia continue its DLSS 3 dark-pattern artificial market segmentation, restricting it to RTX 40-series cards only? WHY?
The only aspect of DLSS 3 that's locked to the 40 series is Frame Generation -- the rest (SR, DLAA, Reflex) all work on older platforms. Nvidia's argument is that the older GPUs' units aren't fast enough but since nobody has actively tried to benchmark them against Nvidia's claims, one is left with no option but to take its word on the matter. It's a very binary point, though: either it's lying and FG is locked to the 40 series simply to help sell it, or it's not lying and the 40 series is the only GPU good enough to do it.
 
It negative when FSR produce worse image quality. Why RTX users have to suffer from using inferior tech for no good reason when they can easily just use DLSS

They gap between FSR and DLSS image quality can be big sometimes.
Look at shimmering on building on FSR image here
https://I.ibb.co/2N4WPF0/get.gif
No one is forcing you to use anything. Hardware makers know that people will pay exorbitant prices to have pretty lights flashed in their eyes, so instead of making appropriate hardware to keep up with the demands of software (and, by proxy, the unreasonable demands and unrealistic expectations of gamers), they lean on upscaling tricks. Then you guys argue, "but I can't play this game without my proprietary tech."

Then people become so dependent on it that companies start selling BS cards like the 4060, which barely deserves to be called a 4050 because it can't even play modern games WHILE USING upscaling.

Meanwhile, gamers keep buying games without actually giving a damn about gameplay. It's gotten to the point that when a real game comes out, AAA devs complain about how it's creating unrealistic expectations.

I've been gaming for over 30 years, and I have never seen such a toxic culture surrounding every aspect of it. For the last several years there has been maybe one game a year worth playing.

I frequently find myself playing older games, because developers now use graphics to attract attention to their titles while the experience is so shallow you barely get any time out of it.

I also don't think Bethesda has released a good game since Oblivion. Skyrim was a shallow experience, and Fallout 4 sucked. I say Skyrim sucked as someone whose first Elder Scrolls game was Daggerfall. I'm not confident Starfield will be the game it's hyped up to be. Rather than worry about whether it's going to be a good game, everyone is worried about the graphics.
 
Bit difficult to run CANNs quickly on GPUs that have no dedicated units for GEMM operations (I.e. tensor cores). It can be done, but nothing like the same performance, so there's little point in expanding the compatibility of the algorithm for a platform that isn't going to benefit from using it. Besides, Nvidia has made an upscaler for 10 series cards -- NIS. It's not as good as FSR and I don't recall seeing any game offering it, but it's there for anyone to implement into a renderer.


The only aspect of DLSS 3 that's locked to the 40 series is Frame Generation -- the rest (SR, DLAA, Reflex) all work on older platforms. Nvidia's argument is that the older GPUs' units aren't fast enough but since nobody has actively tried to benchmark them against Nvidia's claims, one is left with no option but to take its word on the matter. It's a very binary point, though: either it's lying and FG is locked to the 40 series simply to help sell it, or it's not lying and the 40 series is the only GPU good enough to do it.
Thank you for the info; some of it I already knew. My point is that AMD is proving such upscaling technologies, if not exactly the same ones, can be made available to a wide range of GPUs and generations, and it proves it with FSR 2 and now FSR 3.
It's hard for Nvidia to pretend now that it cannot do the same. Maybe there will be a difference, seemingly quite small, between DLSS 3 frame generation on RTX 20/30-series cards versus 40-series cards, but Nvidia should and could have made an option available to RTX 20/30-series owners. This is what I am pointing out: Nvidia's anti-consumer, anti-gamer approach to its own customers.
How can AMD do this and Nvidia not? The answer, it seems, is not a pleasant one for the Nvidia corporation regarding how it treats gamers and its own video card customers.
 
Thank you for the info; some of it I already knew. My point is that AMD is proving such upscaling technologies, if not exactly the same ones, can be made available to a wide range of GPUs and generations, and it proves it with FSR 2 and now FSR 3.
It's hard for Nvidia to pretend now that it cannot do the same. Maybe there will be a difference, seemingly quite small, between DLSS 3 frame generation on RTX 20/30-series cards versus 40-series cards, but Nvidia should and could have made an option available to RTX 20/30-series owners. This is what I am pointing out: Nvidia's anti-consumer, anti-gamer approach to its own customers.
How can AMD do this and Nvidia not? The answer, it seems, is not a pleasant one for the Nvidia corporation regarding how it treats gamers.
The answers behind this are, in my opinion, also very interesting, because they highlight both companies' priorities very clearly:

1) Nvidia didn't really want to "lift gaming up to the next generation" at all. What it really wanted was tensor cores for ML and AI workloads in the enterprise, but since it was already building that tech, it set out to find a problem for the tensor-core solution it was ready to push onto gamers.

And that first solution wasn't actually DLSS; it was ray tracing: that's what Nvidia wanted to showcase the new hardware with. However, it was so rushed that Nvidia realized rasterization performance wasn't up to par to support ray tracing, yet there was still enough tensor-core power left on the table, so why not cut the raster load by cutting the resolution? AI-based AA it is.

2) AMD has had different goals for the Radeon division for many years now. It will never admit it publicly, but it wants its tech to be a "good enough" option for its integrated graphics solutions, which is a huge business if you consider the console market. The consequence is that Radeon is very clearly playing catch-up to Nvidia. But since AMD is more interested in catching up in gaming, somewhat isolated from the compute world (its server business is relatively strong on the CPU side: Epyc chips are in high demand even if its enterprise GPUs are not as wanted as Nvidia's), its solution is to court the goodwill of the gaming market. So even if its consumer GPU features arrive late, it doesn't use them (according to AMD, anyway) as a wedge to drive sales of better, more expensive products.

This is why, when AMD wants an answer to a popular tech like DLSS, it approaches the problem not as "let's use up these ML cores we include anyway" but as "let's get feature parity." And holding the decisively, near-constantly smaller market share, AMD will not risk further segmentation, so it had to come up with upscaling that runs on nearly any GPU, and so it did.

So honestly, DLSS is not as good for the market as people think it is, and that's why I maintain you should try to live with "worse-quality FSR 2 and 3" for the sake of reversing the market forces that made these techs a de facto requirement in the first place, along with the way they came into being, which is fairly anti-consumer and especially anti-PC-gaming.
 
So honestly, DLSS is not as good for the market as people think it is, and that's why I maintain you should try to live with "worse-quality FSR 2 and 3" for the sake of reversing the market forces that made these techs a de facto requirement in the first place, along with the way they came into being, which is fairly anti-consumer and especially anti-PC-gaming.
Considering that both consoles run on AMD hardware and cannot make use of DLSS, it's very niche in the grand scheme of things. Nvidia has the market share on PC, but AMD has the most gaming market share overall. With all the games coming out, many of which require upscaling, developers are going to need widespread access to upscaling tech, and they will need to prioritize FSR over DLSS.
 
Considering that both consoles run on AMD hardware and cannot make use of DLSS, it's very niche in the grand scheme of things. Nvidia has the market share on PC, but AMD has the most gaming market share overall. With all the games coming out, many of which require upscaling, developers are going to need widespread access to upscaling tech, and they will need to prioritize FSR over DLSS.
That's the same thinking that leads to poor PC ports for any number of reasons. Yes, it's understandable why some publishers chase the lowest common denominator first. It's happened plenty of times before and it will keep happening. Just like some PC gamers will continue to choose not to buy those games because they'd prefer to play something that takes full advantage of their chosen platform.

All that said, I'll be shocked if a noticeable hit to looks or frames from lack of DLSS implementation is Starfield's major bug on initial release. Based on the publisher's prior track record I expect to enjoy this game a year or two later, after fan mods have fixed hundreds of bugs, added many important QoL features, etc etc.

 
I think this might be very unpopular, and no disrespect to TechSpot and Hardware Unboxed for reporting on it, but this entire thing about AMD allegedly blocking games from getting DLSS feels like such a nothing burger to me.

Like, it might be strong-arm tactics by AMD, and I have zero doubt AMD would basically turn into Nvidia if given the same market position. However, at this point in time it was Nvidia that built the tradition of 'push game publishers to include stupid new tech that is clearly not ready, like ray tracing, so we can sell newer cards faster.'

So AMD is now doing it... except it cannot be confirmed that AMD bars them from reaching out to Nvidia. At face value, a simpler explanation makes more sense to me: once a publisher signs a game as an AMD title, even if not exclusive to AMD, Nvidia might just take weeks or months to return phone calls about help with DLSS implementation, whereas if they go to Nvidia first, engineers and ready-to-go code arrive within a business day.

Plus, the last thing I'd point out is that, again, the tactic is bad, but let's not forget that FSR implementations always work on Nvidia cards anyway: if a game gets FSR 1 or 2, and soon(ish) 3, most people with GPUs from the last eight years or so will be able to use the feature, even if it's not ideal. If you want DLSS, well, 3.0 is the best one, so go buy only Nvidia and only the 4000 series for a while. I still fail to see how games getting FSR and no DLSS is an overall negative if we ever want Nvidia's market share to decline, and trust me: we definitely want Nvidia to lose market share if you ever want better pricing, sufficient VRAM, etc.
AMD literally shut down the speculation and said there was nothing in the contract forcing Bethesda NOT to implement DLSS. They confirmed it was false.

It was another spin for fanboys to surf on, and Tim literally screwed up here. Not to mention that similar sponsorship deals from MS, Sony, and Nvidia have existed for as long as the industry has.

Nvidia literally sponsored CDPR for Cyberpunk and The Witcher 3 at the expense of gamers on Radeon GPUs, who got gimped performance due to the GameWorks implementation.

I didn't see Tim losing his temper when I lost 40% performance in CrossFire while playing The Witcher 3, simply to patch HairWorks issues... that actually happened, by the way, and I was pissed off. I went from 60 FPS at 2160p to 40 FPS just because of the malware that was HairWorks.
 
That's the same thinking that leads to poor PC ports for any number of reasons. Yes, it's understandable why some publishers chase the lowest common denominator first. It's happened plenty of times before and it will keep happening. Just like some PC gamers will continue to choose not to buy those games because they'd prefer to play something that takes full advantage of their chosen platform.

All that said, I'll be shocked if a noticeable hit to looks or frames from lack of DLSS implementation is Starfield's major bug on initial release. Based on the publisher's prior track record I expect to enjoy this game a year or two later, after fan mods have fixed hundreds of bugs, added many important QoL features, etc etc.
It is actually the contrary. Since AMD hardware provides a smooth transition thanks to the shared ecosystem, PC ports should be occurring more often, which is the case, by the way. It is the dev's choice to implement FEATURES or not. If you are not happy, blame Bethesda, not AMD.
 
AMD literally shut down the speculation and said there was nothing in the contract forcing Bethesda NOT to implement DLSS. They confirmed it was false.

It was another spin for fanboys to surf on, and Tim literally screwed up here. Not to mention that similar sponsorship deals from MS, Sony, and Nvidia have existed for as long as the industry has.

Nvidia literally sponsored CDPR for Cyberpunk and The Witcher 3 at the expense of gamers on Radeon GPUs, who got gimped performance due to the GameWorks implementation.

I didn't see Tim losing his temper when I lost 40% performance in CrossFire while playing The Witcher 3, simply to patch HairWorks issues... that actually happened, by the way, and I was pissed off. I went from 60 FPS at 2160p to 40 FPS just because of the malware that was HairWorks.

CDPR and Nvidia sabotaged and gimped both AMD video cards and Ryzen processors in Cyberpunk 2077. Check below.
CDPR did not implement FSR for many months after CP2077's release, so Nvidia had DLSS exclusivity. After a few months CDPR implemented FSR 1, and gamers saw the irony. The Nexus Mods community made mods implementing FSR 2 many months before CDPR officially released it, practically forcing CDPR to implement FSR 2 officially, though they delayed as long as they could.
Cyberpunk 2077's devs also blatantly favor Nvidia and Intel hardware, even in 2023. CD Projekt Red has not patched or optimized the game to run properly on AMD processors, while one of the latest Nexus community mods brought up to 27% more FPS for Ryzen processors with 8 or more cores, especially chips like the 7800X3D.
"Cyberpunk 2077, the AMD files: Ryzen processors deliver too few fps - join the community benchmark [Update]"
PCGH is investigating CP2077's "wrong" Ryzen performance. Join the big community benchmark!
https://wccftech.com/amd-ryzen-cpus...ance-boost-cyberpunk-2077-unofficial-smt-fix/
But hey, CP2077's devs somehow still CANNOT and DID NOT patch their buggy game to properly optimize it for Ryzen processors.
Yet, "magically," those same devs COULD and DID ship plenty of mandatory updates exclusively for Nvidia cards.
CP2077's devs should make it a PRIORITY to OPTIMIZE for AMD hardware, considering that the PS4/PS5 and Xbox consoles and many PC users run on AMD hardware.
 
That's the same thinking that leads to poor PC ports for any number of reasons. Yes, it's understandable why some publishers chase the lowest common denominator first. It's happened plenty of times before and it will keep happening. Just like some PC gamers will continue to choose not to buy those games because they'd prefer to play something that takes full advantage of their chosen platform.

All that said, I'll be shocked if a noticeable hit to looks or frames from lack of DLSS implementation is Starfield's major bug on initial release. Based on the publisher's prior track record I expect to enjoy this game a year or two later, after fan mods have fixed hundreds of bugs, added many important QoL features, etc etc.
Well, previously, console hardware was drastically different from PC hardware; just look at the Cell processor in the PS3. Ports sucked for a long time because games had to be rebuilt for a completely different set of hardware. Now, consoles and PCs are very similar at both the hardware and software level.
 
Ha, LOL, people are getting pithy about AMD using FSR, a technology that works freely on any GPU made in the last 8 years, because it may be blocking a technology that only works on Nvidia GPUs from the last three generations.

Where was this outrage when Nvidia bought out PhysX and made it hardware-accelerated on Nvidia GPUs only? Too bad if you owned an AMD GPU; you only got a software version running on whatever CPU you had.
Or HairWorks, or any of the other shiz that's only hardware-accelerated if you're running an Nvidia GPU.
Nvidia is the king of proprietary software technology, but OH NO, AMD only wants to let you use its technology, which happily runs on any GPU, even those it didn't make.
 
I don't understand what the fuss is about... if AMD makes an agreement ($$$) under which the publisher holds off on DLSS support for X months, it makes full sense. The publisher can agree or not; Nvidia also makes huge agreements ($$$) so that many games are optimized for Nvidia GPUs and DLSS. How many games support DLSS only, and how many FSR only?

I think the industry should embrace AT LEAST an open standard (FSR), and additionally whatever else they want.
 
Nvidia has been playing this game for a very long time, and hardly anyone ever said anything about it. And how about that so-called GameWorks crap they always liked having game devs install into their engines? It barely worked as intended on Nvidia hardware, and it screwed up AMD cards' performance so badly it would take AMD a few months to iron Nvidia's traps out of the game engine and get decent performance.

People would praise Nvidia for GameWorks and say, "Oh, AMD has crappy drivers and crappy hardware; look how well Nvidia is doing in such-and-such GameWorks-sponsored game compared to AMD."

If Starfield does not have DLSS at launch, at least AMD gives everyone the ability to use FSR if their GPUs cannot render the game at full resolution. If this were an Nvidia-sponsored title, there would only be DLSS; only Nvidia cards would get the boost when needed, and AMD and Intel cards would be left hanging.
 
Everyone is trying to make a mountain out of AMD's software, which, unlike Nvidia's DLSS, is actually a software solution that works with any GPU. DLSS is a hardware-bound solution exclusive to Nvidia cards. AMD doesn't have to support it at all, because it's not their tech, and Nvidia sure as a unicorn farts doesn't give a damn about AMD's software.
 