AMD denies blocking Bethesda from adding DLSS to Starfield

It's a negative when FSR produces worse image quality. Why should RTX users have to suffer with inferior tech for no good reason when they could easily just use DLSS?

The gap between FSR and DLSS image quality can be big sometimes.
Look at the shimmering on the building in the FSR image here:
https://I.ibb.co/2N4WPF0/get.gif
I have to agree with this. As an RTX user, the image quality is substantially better in some cases with DLSS, especially in older "remastered" titles, e.g. The Witcher 3.

However, it doesn't excuse Nvidia from price-hiking their products, and with DLSS 3.5 coming, it's only gonna get worse.
 
And as an RTX user, how good were DLSS 2, 3 and RT? Because until a week ago, Nvidia and its army of YouTubers and reviewers were praising how good DLSS and RT are. Then a few days ago Nvidia came up with a "new" DLSS, number 3.5, in which they showed how good this new DLSS is compared with the old DLSS 2/3 and RT combined, which produce shimmering, ghosting, fuzzy images and wrong colours, and how well the new DLSS 3.5 resolves those huge issues.
Clearly Nvidia PR and its army of reviewers lack consistency, or like to contradict themselves when caught with their pants down.
From all of this, it is better to acknowledge that Nvidia's upscaling technologies are immature and need a lot of improvement, and more importantly, that Nvidia does not deliver what it claims, or that in some situations it delivers years later, too late for its customers. But in time, after many revisions and new versions, which will work only (exclusively?) on their new generations of cards, they will deliver what they claimed 3-4 years before. Because Nvidia loves the DLSS dark pattern of artificial market segmentation, milking a lot of money from customers who are uninformed, or rather maliciously manipulated by Nvidia PR and its army of reviewers and YouTubers.
P.S. I am an RTX 3080 user. So the RTX 3080 is stuck with only DLSS 2, and sometime in the near future DLSS 3.5 without FG, not DLSS 3.5 + FG. Here is another example of Nvidia misleading its customers: Nvidia claims DLSS 3.5 will be available for all RTX cards, just not DLSS 3.5 with FG, which is exclusive to RTX 4xxx cards.
 
Excellent point @DSirius.

Yes, the image comparison from DLSS 2/3 to 3.5 was a night-and-day situation. So even Nvidia admitted the images from DLSS 2/3 were dirty, stained and colour-inaccurate compared to DLSS 3.5.
But how do both compare to native rendering?

Cheers!
 

Thx man.
And check here how horrible DLSS 3 is in Cyberpunk 2077, the game most supported, endorsed and paid for by Nvidia to be an Nvidia-exclusive showcase (and the same goes for DLSS 2, RT, and the combination of DLSS 2 + RT).
Time 00:10 to 01:00, upper left of the video.

And if DLSS 3, and also DLSS 2 + RT, are so bad in Cyberpunk 2077, there is no point arguing how good or bad they are in the other games not paid for, endorsed or supported by Nvidia.
Nota bene: only CP2077 and maybe two or three other games are confirmed to get this new DLSS 3.5 treatment in the future (end of September, per Nvidia's claims); who knows when, or if, the rest of the games will get it. Until then, Nvidia enthusiast buyers can just turn a blind eye, as Nvidia's army of reviewers and YouTubers did in their reviews: disregard these issues, or claim they simply do not exist. Some deliberately, others through incompetence, chose to show their audience only the alleged DLSS goodies, even though some of those are exaggerated and others misleading. Meanwhile, Nvidia's army of reviewers and YouTubers showed with "scientific accuracy" how AMD FSR is 3 to 5% inferior to DLSS.
Until Nvidia says otherwise and contradicts them, as it just did with DLSS 3.5.
And this also shows how embarrassing those reviews about DLSS vs FSR superiority prove to be now; they used the "DLSS universally superior" catchphrase.
It is time for those reviewers, or at least the ones able to see their huge mistakes, to come back down to earth, acknowledge those mistakes, intentional or not, and correct their rushed and superficial reviews, which misled readers and buyers and did their audience a huge disservice by not showing or telling users about the huge issues present with RTX and DLSS 1, 2, 3, 3.14, etc.
It is also funny how Nvidia magically resolved all those issues with the help of the "new" DLSS 3.5 right when AMD announced that FSR 3 will be available for all RTX cards.
This proves what a great and beneficial influence AMD has on Nvidia's ability to resolve the huge issues it kept secret and hidden :)
Now Nvidia claims they have resolved those issues they hid and kept secret about DLSS 3 and DLSS 2 + RT combined. Until..., until next year, when Nvidia comes up with a new DLSS, number 007, which will resolve the numerous issues hidden and kept secret in DLSS 3.5. Spoiler! The new DLSS 007, which will be the real One DLSS, will be another story, because it will work only on their next-gen RTX 5xxx video cards.
 
If the title can benefit, I think AAA developers should implement all three upscaling techniques: DLSS 2.0, FSR 2, and XeSS. There doesn't seem to be a good reason to leave any of them out, considering that all three have essentially the same motion-vector requirements. As for the additional features in DLSS 3.5 and FSR 3 that might just cause headaches, I could see leaving those out if they are going to require additional development time.
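To illustrate the point about shared requirements, here is a rough sketch of what a common integration layer could look like. This is purely illustrative; every name below is made up, not a symbol from the actual DLSS, FSR 2 or XeSS SDKs, whose real calls are stubbed out as comments.

```cpp
// Hypothetical sketch of a shared upscaler abstraction. None of these names
// come from the real DLSS/FSR/XeSS SDKs; each vendor SDK has its own API.
#include <memory>

// The per-frame inputs all three temporal upscalers share.
struct UpscalerInputs {
    void* color;            // frame rendered at the internal (lower) resolution
    void* depth;            // scene depth buffer
    void* motionVectors;    // per-pixel motion vectors -- same data for all three
    float jitterX, jitterY; // sub-pixel camera jitter for this frame
    bool  resetHistory;     // true on camera cuts to avoid ghosting
};

struct IUpscaler {
    virtual ~IUpscaler() = default;
    virtual void evaluate(const UpscalerInputs& in, void* output) = 0;
};

struct DlssUpscaler : IUpscaler {
    void evaluate(const UpscalerInputs&, void*) override { /* call DLSS SDK here */ }
};
struct Fsr2Upscaler : IUpscaler {
    void evaluate(const UpscalerInputs&, void*) override { /* call FSR 2 SDK here */ }
};
struct XessUpscaler : IUpscaler {
    void evaluate(const UpscalerInputs&, void*) override { /* call XeSS SDK here */ }
};

enum class Backend { DLSS, FSR2, XeSS };

std::unique_ptr<IUpscaler> makeUpscaler(Backend b) {
    switch (b) {
        case Backend::DLSS: return std::make_unique<DlssUpscaler>();
        case Backend::FSR2: return std::make_unique<Fsr2Upscaler>();
        default:            return std::make_unique<XessUpscaler>();
    }
}
```

Since the engine already produces depth and motion vectors for one of the upscalers, feeding the other two is mostly plumbing; that's the whole argument for shipping all three.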
 
No, it's absolutely illogical, like most of his points.
3.5 is a denoiser only, not a new super-resolution revision. Native vs DLSS 2 was an entirely different comparison before the denoiser came.
Native without the new denoiser was compared to DLSS 2 without the new denoiser; that is why DLSS 2 was praised.
How are people not getting this? It is specified in the slides: 3.5 is still DLSS 2.x.x (DLSS SR); the only difference is that the RR denoiser is added.
Now DLSS will be better than native with RT on in most cases, because to get the new denoiser you'll need DLSS on.
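If it helps, the naming can be written down as a sketch. These are illustrative types only, not Nvidia SDK symbols; the point is that "3.5" labels a feature bundle, not a new SR model.

```cpp
// Illustrative only -- not Nvidia SDK types. "DLSS 3.5" names a feature
// bundle; the super-resolution part is still the 2.x.x model.
struct DlssFeatureSet {
    bool superResolution;   // DLSS SR, the 2.x.x upscaler (all RTX cards)
    bool frameGeneration;   // DLSS FG (RTX 40xx only)
    bool rayReconstruction; // DLSS RR, the denoiser added with "3.5"
};

// "DLSS 3.5" on an RTX 30 card: SR + RR, but no FG.
constexpr DlssFeatureSet rtx30_with_35 { true, false, true };
// "DLSS 3.5" on an RTX 40 card: all three features available.
constexpr DlssFeatureSet rtx40_with_35 { true, true, true };
```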
 
So to your understanding, DLSS 2 without the denoiser looks good, but now with the "denoiser" it looks better than native with RT in most cases? Your arguments will earn you a job on Nvidia's PR staff.
Too bad Nvidia's army of reviewers already "demonstrated" that the old DLSS with RT is better than native, so you saying that DLSS 3.5 will be better than native is redundant. So be careful not to contradict Nvidia PR and its army of reviewers. Unfortunately, those awful images with DLSS and RT provided by Nvidia as a comparison, before DLSS 3.5, tell a different story. But why contradict yourself when DLSS 3.5 is not even available, and CP2077 with the current DLSS and RT, shown by Nvidia itself, looks awful? If you like those fuzzy images, wrong colors and shimmering, enjoy the game with them. Many of us are realists and call a spade a spade when we see it, whether you agree or not.
So enjoy playing games now as Nvidia meant you to play them, until the new DLSS 3.5 comes out. But be careful: you may end up enjoying them only until a newer, greater DLSS, number 007, hiding in the near future, comes out and puts DLSS 3.5 to shame. But only on the new RTX 5xxx-gen video cards.

P.S. And if you find my arguments illogical, it means that:
1. You do not understand them.
2. It is hard for you to acknowledge Nvidia's wrongdoing and misleading DLSS advertising, so you disregard and insult those debating and exposing it.
3. My arguments are illogical.

For the vast majority of people with common sense, it is easy to spot which of these is true and which is not, simply by reading your messages and mine and using logical deduction.
 
DLSS 2 without the RT denoiser vs native without the denoiser, by HUB: only two games are significantly better at native, while six are significantly better with DLSS.
Also, RDR2 native is only better than DLSS because it ships with an old DLSS DLL by default (agreed, it's horrible); swapping to 2.5.1 turns it around, so make it 1 vs 7. GoW also uses an old DLL, but I don't own the game, so I can't say what 2.5.1 does there; probably the same thing as in the others, but since I never saw it, I'm leaving it as is: native wins slightly. FH5 still looks bad on power lines with 2.5.1, so there's a chance GoW won't get better with a newer DLL version either (not that TAA looks good in FH5 either; the power lines still break up, but at least they don't smear. To get them completely clean, you have to use 5K DSR with TAA).

[Image: HUB native vs DLSS image-quality comparison chart]

So HUB is doing Nvidia's PR now; said by you, not me. You can't have it both ways, sugar. Either HUB and I are both lying and Nvidia-sponsored, or we are both right. I've played with DLSS on for hundreds of hours, so I 1000% agree with Tim here.
 
Yes, Tim's article is too close to Nvidia PR, and I said exactly this, with arguments, in the original forum thread, where I pointed out that it is a low-quality article, and that he applies different standards when comparing DLSS vs FSR, like not saying a single word about DLSS weaknesses, which are painfully visible. His article should be updated in light of the new evidence, brought by Nvidia itself, of how badly DLSS 2 and RT can fail in games like CP2077, their most paid-for, endorsed and supported game.
He should also consider that, who knows, another expert, or even Gamers Nexus, might one day check his article the way he checked Linus. Joking.
And I have also played games for hundreds of hours with DLSS AND FSR, to form a better-informed opinion.
Did you personally play with FSR too, and if yes, which games and with what video cards?

And I am not one-sided; I found some valid points and arguments in your posts too, so I definitely do not see your posts or the HUB article as deliberately wrong, nor think that either of you is lying.
We have different opinions, we bring arguments to support them, and from all of them we can form a better-informed opinion. And I found that in some situations some of our opinions align.
 
IMO he should not update anything, if that's what he saw, just because it makes some of the AMD fanbase angry.
You're accusing him of biased journalism only because you can't stand that it's not AMD getting the praise.
In fact, he could have easily updated all games to the 2.5.1 DLL as all Nvidia users do; there's even an automated tool for this (DLSS Swapper, IIRC; dunno, I don't use it, the manual change is very easy even for a lazybones like me). So if anything, his testing still favors native/FSR.
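For anyone curious, the manual change really is just a file copy. A minimal sketch, with made-up paths (the real install directory varies per game and store):

```cpp
// Sketch of the manual DLSS DLL swap: back up the game's bundled
// nvngx_dlss.dll and drop a newer build (e.g. 2.5.1) in its place.
// Both paths below are hypothetical examples.
#include <filesystem>
namespace fs = std::filesystem;

int main() {
    fs::path gameDir = "C:/Games/SomeGame";           // hypothetical install path
    fs::path newDll  = "C:/Downloads/nvngx_dlss.dll"; // the newer DLL you downloaded

    // Keep a backup of the original first.
    fs::copy_file(gameDir / "nvngx_dlss.dll",
                  gameDir / "nvngx_dlss.dll.bak",
                  fs::copy_options::overwrite_existing);
    // Swap in the newer version.
    fs::copy_file(newDll, gameDir / "nvngx_dlss.dll",
                  fs::copy_options::overwrite_existing);
    return 0;
}
```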
And I have also played games for hundreds of hours with DLSS AND FSR, to form a better-informed opinion.
Did you personally play with FSR too?
Yup, had to when I had the 6800 installed (still have it on the shelf in case I need 16 GB or FSR 3 turns out amazing). It's only after reading everything TPU kept writing about DLSS 2.x.x vs FSR 2 that I got a used 3080; soon after, HUB's testing dropped to confirm what TPU wrote and what I saw. Get over it.
See all the DLSS vs FSR 2 vs XeSS comparisons. DLSS beats FSR 2 by a country mile, especially in real time, as screenshots can't show the terrible shimmer that both HUB and TPU point out.

Two of the most trusted sources in the whole industry are telling me the same; the opinion of one sour individual won't change anything.
 
Again, if you like those fuzzy images, wrong colors and shimmering that Nvidia itself pointed out regarding DLSS + RT in CP2077, enjoy the games with them. I suppose you either do not see them or get over them. Perfectly fine; the most important thing is that you are satisfied and enjoy games more with the 3080 card. I have a 3080 too, and I called out Nvidia's DLSS dark pattern of artificial market segmentation: no DLSS 3 FG part. That's why I am pleased that AMD is making FSR 3 available even for all RTX cards. Sad for Nvidia that they chose not to offer something similar to RTX 2xxx and 3xxx owners, just like they did not offer DLSS 1 and 2 to GTX 1xxx owners.
 
You know there are other games than CP2077? Why are you only referring to that one? And RT in CP2077 is the first to get the new RR denoiser. I played CP2077 with RT off; I preferred to crank up the resolution with DLDSR instead of using RT, too much ghosting indeed (native + TAA with RT was way worse, btw; it wasn't DLSS responsible for it, it was RT). But after the 3.5 denoiser update I'll certainly go with RT + normal DLSS instead of no RT + DLDSR. As far as performance goes, I have no problem running 2077 at 80+ fps with RTGI + RT reflections on (the only two worth the performance hit IMO; rasterized shadows are fine), even in the city.
The problem with DLSS 3 vs FSR 3 is far more complex than Nvidia just deliberately choosing not to include it. AMD mentioned FSR 3 runs on async compute, and games that already use it for something else (so most DX12/Vulkan games) might not produce the desired result. So the question pops up: what happens when an FSR 3 generated frame isn't 100% ready because there wasn't enough async headroom available? Is it skipped? If so, what happens to the frametimes? Or is it still displayed to retain the fluidity, but showing more artifacts than the other frames?
AMD never explained that; they only admitted that running FSR 3 on async has limitations. IMO they presented FSR 3 running in a game that already hit 60 fps, and they recommend FSR 3 be used at 60, because at high resolution in GPU-heavy scenes FSR 3 will just fall apart, with little to no resources to spare for generating frames. Nvidia knew AMD would have no dedicated FSR 3 hardware; that's why they showed it interpolating 25 fps, not 60. Guess we'll find out. At least RTX 40 owners do not have this problem, as FG has dedicated and updated hardware (a more powerful OFA to let it do the FG thing on its own). And I say good, let the poor souls not worry about it, having paid 1200 EUR for a 4080 lol 🤣. Frankly, I don't care about DLSS 3 (FG) or FSR 3; I play snail-paced games anyway, so IQ always comes first for me.
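To spell out that open question as a sketch (purely speculative pseudologic; nobody outside AMD knows FSR 3's actual behaviour here):

```cpp
// Speculative sketch of the frame-pacing dilemma above -- NOT AMD's actual
// FSR 3 logic. If async compute runs out of headroom and the interpolated
// frame isn't ready at present time, there are only two options, and both
// cost something.
enum class LatePolicy { Skip, PresentAnyway };

struct GeneratedFrame { bool ready; };

bool shouldPresent(const GeneratedFrame& f, LatePolicy policy) {
    if (f.ready) return true;  // normal case: present the generated frame
    // Skip          -> a frametime spike (stutter), but no extra artifacts.
    // PresentAnyway -> smooth pacing, but that frame shows more artifacts.
    return policy == LatePolicy::PresentAnyway;
}
```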
We'll pick this up tomorrow, it's midnight here. Have a nice day.
 
Just finished some testing of my own now; the only thing that for me looks the same as or better than native is DSR in some games. Tested at 1440p.
 
Simple. CP2077 is the game Nvidia refers to the most for DLSS, RT, PT and other closed proprietary gimmicks. And I am citing Nvidia, especially when they shoot themselves in the foot.
Yes, I play CP2077 the same, without RT, just DLSS or FSR. And I agree, there are more complex factors involved with DLSS 3, or 3.5, and FSR 3. It's just that I prefer AMD's more open approach to Nvidia's artificial market segmentation. DLSS 3 FG and FSR 3 FG both need more than 60 fps in games for tangible, quality results. However, any of these upscaling technologies has to reach a reasonable quality. If AMD's FSR 3 is not good enough, I definitely will not praise FSR 3 as "magical".
I think that at least RTX 3090 Ti owners have quite enough power for DLSS 3, etc. For example, when Nvidia promoted PT through CP2077, they indicated the RTX 4090, RTX 4080 and RTX 3090 Ti. I suppose that requirement was about having enough power and VRAM, ideally 24 GB, which the RTX 3090 Ti has.
Loool, "snail-paced games" is the best joke of the day. I suppose I prefer them too; I am more of an RPG gamer, and only occasionally play FPS.
When I decided to upgrade my video card, I was checking price/performance for the RTX 4080 (it was over 1200 EUR) and the RX 7900 XTX. And I found an ASUS TUF RX 7900 XTX discounted to 840 EUR. Instant choice for me.
 
I agree, there are more complex factors involved with DLSS 3, or 3.5, and FSR 3. It's just that I prefer AMD's more open approach to Nvidia's artificial market segmentation.
This isn't as artificial as many paint it, though. You know upscalers and frame generation will hit a wall without dedicated hardware. It's already apparent in FSR 2 vs DLSS 2, where DLSS 2 does image reconstruction and, if you look close, draws small detail further away than FSR 2. So will doing frame generation on compute, as AMD points out.

Loool, "snail-paced games" is the best joke of the day. I suppose I prefer them too; I am more of an RPG gamer, and only occasionally play FPS.
I'm just older. I play on a controller now, since I put convenience over speed. You know, 36 years old; gotta start rubbing yourself with dirt to get used to it. The last time I played an FPS with KB+M was to train my hand after the craniotomy + brain tumor removal last summer; it worked great for my left hand. Replaying RDR2 to get 100% of the exploration side tasks is more my thing now.
 
Oh, glad you recovered; from this point of view, that is one of the surprising benefits of playing games, among others.
I have RDR2 too and plan to play it, but after Starfield :)
 
Don't rush RDR2. It's not meant to be played for missions only; it's a proper open-world game if there ever was one. Start when you know you can commit at least 300 hrs to free-roam exploration alone.
I'm super tempted by Baldur's Gate 3 now, but I'd rather start it when I have the time to explore (winter holidays? I'll have two weeks off since I'm a teacher) and play Alan Wake 2 first. It's a game I've been waiting for forever, and now that it's getting proper RT, I could not be happier. IMO this is gonna be the next Control, a game that sets the quality standard for 3+ years.
 
Lol, good to know. I had an idea it would need proper time, like CP2077 and, I hope, Starfield, if it proves to be a good game.
 