How to Use Nvidia's DLDSR to Improve Image Quality on Older Games

Irata

Posts: 2,169   +3,747
Thanks for the review. What would have been interesting is a comparison to other methods like AMD Virtual Super Resolution.

It would be nice to see how the approaches differ wrt performance and quality.
 

Vulcanproject

Posts: 1,531   +2,785
I'm always looking for something that can deal with Unreal Engine 3 games. They were often really good-looking titles when you pump up the resolution from that Xbox 360/PS3 era, but they usually had primitive post-process anti-aliasing or basic MSAA that didn't deal well with their deferred rendering. There are obviously a tonne of them too; it's a big catalogue of quality games. I'll have to give this a shot.
 

Puiu

Posts: 5,631   +4,604
TechSpot Elite
Not to worry, the typical AMD 'me too' effort will surely not be much more than a year late and not that much worse.
It's not exactly that difficult to do.

What this does is essentially apply DLSS to the selected render resolution (the selected factor), taking it up to what DSR would use, and then downscale it.

DLDSR for 1080p screen --> 1440p render resolution --> 4k DLSS upscaled resolution --> 1080p downscaled resolution
DSR for 1080p screen --> 4k rendered resolution --> 1080p downscaled resolution

AMD can do this with FSR2.0 too. I can see this feature being announced together with RDNA3.
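The factor arithmetic above can be sketched as follows (a rough illustration: the 1.78x, 2.25x and 4.00x pixel-count factors are the ones Nvidia's driver exposes, and each axis scales by the square root of the factor):

```python
import math

# Sketch of the render-resolution math behind DSR/DLDSR factors.
# A factor is a pixel-count multiplier; each axis scales by its square root.
def render_resolution(width, height, factor):
    """Internal render resolution for a given DSR/DLDSR factor."""
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

native = (1920, 1080)  # 1080p display
for label, factor in [("DSR 4.00x", 4.00),
                      ("DLDSR 2.25x", 2.25),
                      ("DLDSR 1.78x", 16 / 9)]:  # "1.78x" is really 16/9
    w, h = render_resolution(*native, factor)
    print(f"{label}: render at {w}x{h}, then downscale to {native[0]}x{native[1]}")
```

On a 1080p screen this works out to 3840x2160 for DSR 4x, 2880x1620 for DLDSR 2.25x and 2560x1440 for DLDSR 1.78x, matching the resolutions in the flow above.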
 

Stingy McDuck

Posts: 224   +164
It's not exactly that difficult to do.

What this does is essentially apply DLSS to the selected render resolution (the selected factor), taking it up to what DSR would use, and then downscale it.

DLDSR for 1080p screen --> 1440p render resolution --> 4k DLSS upscaled resolution --> 1080p downscaled resolution
DSR for 1080p screen --> 4k rendered resolution --> 1080p downscaled resolution

AMD can do this with FSR2.0 too. I can see this feature being announced together with RDNA3.

There are no motion vectors involved in DLDSR, so it doesn't use DLSS and wouldn't work with FSR 2.0.
 

Stingy McDuck

Posts: 224   +164
1.78x DSR is blurry as hell on my 1080p screen. That's why I prefer to set up a custom resolution either with the Nvidia Control Panel or with CRU, because it's done with a less blurry filter. How does DLDSR compare to these solutions, or is it an improvement only over DSR?
 

Mr Majestyk

Posts: 1,265   +1,149
I think the author is blind. The DLDSR 1620p option in Prey is clearly superior to the 4K DSR option; it's not even close. The 4K DSR option just looks like a slightly sharper version of 1080p with no detail improvement at all. DLDSR clearly shows finer detail, i.e. higher resolution, than native 1080p or 4K DSR; just look at the map in the background, for example.
 

b3rdm4n

Posts: 84   +68
DLDSR is excellent, and so good when combined with DLSS. It looks like Nvidia is the one helping their cards age like fine wine here, given we got DLDSR mid-cycle and DLSS just keeps improving and being added to games, among other factors.

If anyone ever didn't believe that above-native IQ is possible (daft, I know, but these people exist), DLDSR and even DLSS+DLDSR is yet another nail in the coffin of that silly claim.
 

b3rdm4n

Posts: 84   +68
Not to worry, the typical AMD 'me too' effort will surely not be much more than a year late and not that much worse.
It is certainly the way of things. Patient gamers will get a 'decent' free alternative... at some point.
 

Adhmuz

Posts: 2,244   +1,064
This would be great on a lower-resolution monitor where you have the overhead in all titles. Looking at the option, though, it's only globally on or off and cannot be set per title, which would be the best way to use this. With a 4K panel it's only going to work with older or non-AAA games, and I can't be bothered to manually go in and turn it on or off every time I switch from newer to older games. Perhaps this can be implemented in future driver releases.
 

Adhmuz

Posts: 2,244   +1,064
Not to worry, the typical AMD 'me too' effort will surely not be much more than a year late and not that much worse.
Don't forget, it's a meaningless feature that no one needs. Oh wait, AMD has it too now? Suddenly it's important and has to be used in all comparisons and benchmarks.
 

Angga B

Posts: 169   +138
Thanks for the quality article. I understand the process described and definitely see the difference in the results. However, it escapes me why a game's rendering engine couldn't output such a high-quality render directly at its intended resolution, for example 1080p, so that there would be no need to render at a higher resolution and then downsample to 1080p. I mean, pixels are still pixels, right? Could someone enlighten me?
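One way to see why "pixels are still pixels" doesn't quite hold: at native resolution each pixel is a single point sample of the scene, while rendering higher and downscaling averages several samples per pixel. A toy sketch in Python (illustrative only, not how any real engine shades pixels):

```python
# Toy illustration of why "render high, then downscale" can beat rendering
# natively: a pixel's ideal value is the average of the scene over its area,
# but native rendering takes only one point sample per pixel.
def scene(x):
    """A thin bright stripe covering [0.30, 0.35); black elsewhere."""
    return 1.0 if 0.30 <= x < 0.35 else 0.0

def render(num_pixels, samples_per_pixel):
    """Point-sample the scene over [0, 1), averaging samples per pixel."""
    image = []
    for p in range(num_pixels):
        total = 0.0
        for s in range(samples_per_pixel):
            # evenly spaced sample positions inside pixel p
            x = (p + (s + 0.5) / samples_per_pixel) / num_pixels
            total += scene(x)
        image.append(total / samples_per_pixel)
    return image

native = render(10, 1)        # the lone sample per pixel misses the stripe
supersampled = render(10, 4)  # four averaged samples per pixel catch it
```

Here the native render comes out all black (the stripe falls between the point samples), while the supersampled render gives pixel 3 its correct 50% coverage. That's essentially why fine detail and edges survive DSR/DLDSR downscaling but shimmer or disappear at native resolution.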