VcoreLeone
AMD doesn't produce or develop games. DLSS is a hardware-based solution and is exclusive to Nvidia cards; AMD doesn't have to support it at all.
I have to agree with this. As an RTX user, the image quality is substantially better in some cases with DLSS, especially in older "remastered" titles, e.g. The Witcher 3. It's a negative when FSR produces worse image quality. Why should RTX users have to suffer with inferior tech for no good reason when they can easily just use DLSS?
The gap between FSR and DLSS image quality can sometimes be big.
Look at the shimmering on the building in the FSR image here:
https://I.ibb.co/2N4WPF0/get.gif
And as an RTX user, how good were DLSS 2, 3 and RT? Because until a week ago, Nvidia and its army of YouTubers and reviewers were praising how good DLSS and RT are. Then just a few days ago Nvidia came up with a "new" DLSS, number 3.5, in which they showed how good this new DLSS is compared with the old DLSS 2/3 and RT combined, which produce shimmering, ghosting, fuzzy images and wrong colours, and how well the new DLSS 3.5 resolves those huge issues.
However, it doesn't absolve Nvidia of price-hiking their crap, and with DLSS 3.5 coming, it's only gonna get worse.
Excellent point @DSirius .
Yes, the image comparison from DLSS 2/3 to 3.5 was a night-and-day situation. So even Nvidia admitted the images from DLSS 2/3 were dirty, stained and colour-inaccurate compared to DLSS 3.5.
But how are both compared to native rendering?
Cheers!
No, it's absolutely illogical, like most of his points.
So, to your understanding, DLSS 2 without the denoiser looked good, but now with the "denoiser" it looks better than native with RT in most cases? Your arguments would earn you a job on Nvidia's PR staff.
3.5 is a denoiser only, not a new super-resolution revision. Native vs DLSS 2 was an entirely different comparison before the denoiser came.
Native without the new denoiser was compared to DLSS 2 without the new denoiser; that is why DLSS 2 was praised.
How are people not getting this? It is specified in the slides: 3.5 is still DLSS 2.x.x (DLSS SR); the only difference is that the RR denoiser is added.
Now DLSS will be better than native with RT on in most cases, because to get the new denoiser you'll need DLSS on.
DLSS 2 without the RT denoiser vs native without the denoiser, by HUB: only two games are significantly better at native, while six are significantly better with DLSS.
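To picture the layering being described here, a toy sketch (my own simplification, not Nvidia's actual API; all names are made up) of how DLSS 3.5 stacks on DLSS SR: Ray Reconstruction is just an extra denoise stage that presupposes SR is enabled.

```python
# Toy model of the DLSS 3.5 feature stack described above.
# Names and structure are illustrative assumptions, not NVIDIA's API.
from dataclasses import dataclass

@dataclass
class RenderSettings:
    super_resolution: bool = False    # DLSS SR, i.e. the 2.x.x upscaler
    ray_reconstruction: bool = False  # the new 3.5 "RR" denoiser

    def effective_pipeline(self):
        """Stages that would run (order here is illustrative only)."""
        if self.ray_reconstruction and not self.super_resolution:
            # the point made above: no DLSS SR enabled, no RR denoiser
            raise ValueError("Ray Reconstruction requires DLSS SR to be on")
        stages = []
        if self.ray_reconstruction:
            stages.append("ray_reconstruction_denoise")
        if self.super_resolution:
            stages.append("dlss_sr_upscale")
        return stages
```

The point of the sketch is only the dependency: turning on the 3.5 denoiser without DLSS SR is not a valid configuration, which is why "native + RR" isn't one of the comparison options.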
Yes, Tim's article is too close to Nvidia PR, and I said exactly this in the original forum thread, with arguments, and pointed out that it is a low-quality article. He also applies different standards when talking about or comparing DLSS vs FSR, like not saying a single word about DLSS weaknesses, which are painfully visible. His article should be updated in light of the new evidence brought by Nvidia itself of how badly DLSS 2 and RT can fail in games like CP2077, their most paid, endorsed and supported game.
Also, RDR2 native is only better than DLSS because the game ships with an old DLSS DLL by default (agreed, it's horrible); swapping to 2.5.1 turns it around, so make it 1 vs 7. GoW also uses an old DLL, but I don't own the game, so I can't say what 2.5.1 does there; probably the same as in the others, but since I never saw it myself, I'm leaving it as is: native wins slightly. FH5 still looks bad on power lines with 2.5.1, so there's a chance GoW won't get better with a newer DLL version either (not that TAA looks good in FH5 either: power lines still break up, but at least they don't smear; to get them completely clean you have to use 5K DSR with TAA).
"HUB is doing Nvidia's PR now" - said by you, not me. You can't have it both ways, sugar. Either HUB and I are both lying and Nvidia-sponsored, or we are both right. I've played with DLSS on for hundreds of hours, so I 1000% agree with Tim here.
And I also played hundreds of hours with DLSS, AND FSR, to make a better-informed opinion. Did you personally play with FSR too?
Again, if you like those fuzzy images, wrong colours and shimmering pointed out by Nvidia regarding DLSS+RT in CP2077, enjoy the games with them. I suppose you either don't see them or get over them. Perfectly fine; what matters most is that you are satisfied and enjoy games more with the 3080 card. I have a 3080 card too, and I called out Nvidia's DLSS dark pattern of artificial market segmentation: no DLSS 3 / FG part. That's why I am pleased that AMD is making FSR 3 available even for all RTX cards. Sad for Nvidia that they chose not to offer something similar to RTX 2xxx and 3xxx owners, like they did with DLSS 1 and 2 for GTX 1xxx owners.
Imo he should not update anything if that's what he saw, just because it makes some of the AMD fanbase angry.
You're accusing him of biased journalism only because you can't stand that it's not AMD getting the praise.
In fact, he could have easily updated all games to the 2.5.1 DLL, as all Nvidia users do; there's even an automated tool for this (DLSS Swapper, iirc; dunno, I don't use it, the manual change is very easy even for a lazybones like me). So if anything, his testing still favours native/FSR.
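For anyone curious, the "manual change" mentioned here boils down to a backup-and-copy of one file. A hedged sketch (the folder layout is an assumption; games keep `nvngx_dlss.dll` in varying subfolders, so check your own install):

```python
# Sketch of a manual DLSS DLL swap: back up the game's bundled
# nvngx_dlss.dll and drop in a newer one. Paths are hypothetical.
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir: Path, new_dll: Path) -> Path:
    """Replace the game's nvngx_dlss.dll with new_dll, keeping a .bak copy."""
    target = game_dir / "nvngx_dlss.dll"
    backup = target.with_name(target.name + ".bak")
    if target.exists() and not backup.exists():
        shutil.copy2(target, backup)  # keep the original for rollback
    shutil.copy2(new_dll, target)
    return backup
```

Restoring the original is just copying the `.bak` file back, which is why the swap is considered low-risk.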
Yup, had to, back when I had the 6800 installed (still have it on the shelf in case I need 16 GB or FSR 3 turns out amazing). It's only after reading everything TPU kept writing about DLSS 2.x.x vs FSR 2 that I got a used 3080; soon after, HUB's testing dropped to confirm what TPU wrote and what I saw. Get over it.
See all the DLSS 2 vs FSR 2 vs XeSS comparisons. DLSS beats FSR 2 by a country mile, especially in real time, as screenshots can't show the terrible shimmer that both HUB and TPU point out.
www.techpowerup.com
Two of the most trusted sources in the whole industry are telling me the same thing; the opinion of one sour individual won't change anything.
You know there are other games than CP2077? Why are you only referring to that one? And RT in CP2077 is the first to get the new RR denoiser. I played CP2077 with RT off; I preferred to crank up the resolution with DLDSR instead of using RT, too much ghosting indeed (native + TAA with RT was way worse, btw; it wasn't DLSS responsible for it, it was RT). But after the 3.5 denoiser update I'll certainly go with RT + normal DLSS instead of no RT + DLDSR. As far as performance goes, I have no problem running 2077 at 80+ fps with RTGI + RT reflections on (the only two worth the performance hit imo; rasterized shadows are fine), even in the city.
Just finished some testing of my own now; the only thing that for me looks the same as or better than native is DSR in some games. Tested at 1440p.
Try DLDSR. It looks even crisper than DSR.
Simple. CP2077 is the game Nvidia refers to the most for DLSS, RT, PT and other closed proprietary gimmicks. And I am citing Nvidia, especially when they shoot themselves in the foot.
The problem with DLSS 3 vs FSR 3 is far more complex than Nvidia just deliberately choosing not to include it. AMD mentioned FSR 3 runs on async compute, and games that already use it for something else (so most DX12/Vulkan games) might not produce the desired result. So the question pops up: what happens when an FSR 3 generated frame isn't 100% ready because there wasn't enough async headroom available? Is it skipped? If so, what happens with frametimes? Or is it still displayed to retain the fluidity, but showing more artifacts than other frames?
AMD never explained that; they only admitted that running FSR 3 on async has limitations. IMO they presented FSR 3 running in a game that already got 60 fps, and said they recommend FSR 3 be used at 60, because at high resolutions in GPU-heavy scenes FSR 3 would just fall apart, having little to no resources to spare for generating frames. Nvidia knew AMD would have no dedicated FSR 3 hardware; that's why they showed it interpolating 25 fps, not 60. Guess we'll find out. At least RTX 40 owners don't have this problem, as FG has dedicated and updated hardware (a more powerful OFA to let it do the FG thing on its own). And I say good, let the poor souls not worry about it, having paid 1200 EUR for a 4080 lol 🤣. Frankly, I don't care about DLSS 3 (FG) or FSR 3; I play snail-paced games anyway, so IQ always comes first for me.
We'll pick this up tomorrow, it's midnight here. Have a nice day.
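The two pacing policies asked about above can be pictured with a toy model (purely my assumption of how a pacer could behave, not AMD's implementation): drop the late generated frame and eat a frametime spike, or present it late and accept uneven cadence and possible artifacts.

```python
# Toy frame-pacing model, not AMD's code. For each generated frame we
# know how late it finished relative to its display slot; the policy
# decides what the user-perceived frame-to-frame interval becomes.
def displayed_frametimes(lateness_ms, slot_ms, policy):
    """lateness_ms: per generated frame, how late it was (<=0 means on time).
    Returns the display intervals a user would perceive."""
    intervals = []
    for late in lateness_ms:
        if late <= 0:
            intervals.append(slot_ms)          # smooth: frame hits its slot
        elif policy == "drop":
            intervals.append(2 * slot_ms)      # frame skipped -> visible stutter
        else:  # "present late": cadence wobbles but no frame is lost
            intervals.append(slot_ms + late)
    return intervals
```

With one frame 5 ms late in an 8 ms (125 fps) cadence, "drop" yields intervals of 8, 16, 8 (a doubled frametime, i.e. a stutter), while "present late" yields 8, 13, 8 (smaller wobble, but the frame shown may be unfinished). That trade-off is exactly the open question about FSR 3 on async.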
This isn't as artificial as many paint it, tho. You know upscalers and frame generation will hit a wall without dedicated hardware. It's already apparent in FSR 2 vs DLSS 2, where DLSS 2 does image reconstruction and, if you look close, draws small detail further into the distance than FSR 2 does. So does doing frame generation on compute, as AMD points out.
I agree there are more complex factors involved with DLSS 3, or 3.5, and FSR 3. It's just that I prefer AMD's more open approach to Nvidia's artificial market segmentation.
Loool, "snail-paced games" is the best joke of the day. I suppose I prefer them too; I am more of an RPG gamer and only occasionally play FPS.
Oh, glad that you recovered; from this point of view, that is one of the surprising benefits, among others, of playing games.
I'm just older; I play on a controller now since I put convenience over speed. You know, 36 yo, gotta start rubbing yourself with dirt to get used to it. The last time I played an FPS with kb+m was to train my hand after the craniotomy + brain tumour removal last summer; it worked great for my left hand. Replaying RDR2 to get 100% of the exploration side tasks is more my thing now.
Don't rush RDR2; it's not meant to be played for missions only. It's a proper open-world game if there ever was one; start when you know you can commit at least 300 hrs to free-roaming exploration alone.
I have RDR2 too and plan to play it, but after Starfield.
Lol, good to know. I had an idea it would need proper time, like CP2077, and I hope Starfield will prove to be a good game.
I'm super tempted by Baldur's Gate 3 now, but I'd rather start it when I have the time to explore (winter holidays? I'll have two weeks off since I'm a teacher) and play Alan Wake 2 first. It's a game I've been waiting for forever, and now that it's getting proper RT, I could not be happier. Imo this is gonna be the next Control, a game that sets the standard in quality for 3+ years.
It's waaaay more time-consuming than CP to find all the stuff in the world.