How to Use Nvidia's DLDSR to Improve Image Quality on Older Games

Thanks for the review. What would have been interesting is a comparison to other methods like AMD Virtual Super Resolution.

It would be nice to see how the approaches differ with respect to performance and quality.
 
I'm always looking for something that can deal with Unreal Engine 3 games. They were often really good-looking titles from that Xbox 360/PS3 era when you pump up the resolution, but they usually had primitive post-process anti-aliasing or basic MSAA that didn't handle their deferred rendering very well. There are obviously a tonne of them too; it's a big catalogue of quality games. I'll have to give this a shot.
 
Not to worry, the typical AMD 'me too' effort will surely not be much more than a year late and not that much worse.
It's not exactly that difficult to do.

What this essentially does is apply DLSS to the selected render resolution (the chosen factor), upscaling it to what DSR would use, and then downscale the result:

DLDSR for a 1080p screen --> 1440p render resolution --> 4K DLSS-upscaled resolution --> 1080p downscaled output
DSR for a 1080p screen --> 4K render resolution --> 1080p downscaled output
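For concreteness, here is a minimal Python sketch of the factor math (my own illustration, not Nvidia's code): the DSR/DLDSR factor multiplies the total pixel count, so each axis scales by the square root of the factor.

```python
import math

def render_resolution(native_w: int, native_h: int, factor: float) -> tuple[int, int]:
    # A DSR/DLDSR factor multiplies total pixel count,
    # so each axis scales by sqrt(factor).
    axis_scale = math.sqrt(factor)
    return round(native_w * axis_scale), round(native_h * axis_scale)

# DSR 4.00x on a 1080p panel: brute-force 2x per axis.
print(render_resolution(1920, 1080, 4.0))    # (3840, 2160)

# DLDSR 2.25x: only 1.5x per axis, i.e. 2880x1620 ("1620p").
print(render_resolution(1920, 1080, 2.25))   # (2880, 1620)

# DLDSR 1.78x: ~1.33x per axis; the driver rounds to 2560x1440.
print(render_resolution(1920, 1080, 1.78))   # (2562, 1441)
```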

AMD can do this with FSR 2.0 too. I can see this feature being announced together with RDNA 3.
 
There are no motion vectors involved in DLDSR, so it doesn't use DLSS, and it wouldn't work with FSR 2.0 either.
 
1.78x DSR is blurry as hell on my 1080p screen. That's why I prefer to set up a custom resolution instead, either with the Nvidia Control Panel or with CRU, because that downscale is done with a less blurry filter. How does DLDSR compare to those solutions, or is it an improvement only over DSR?
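As a rough illustration of why the filter matters (my own sketch with hypothetical filenames, not Nvidia's actual kernels), here is the difference between a sharp downscale and a Gaussian-smoothed one, which is roughly what DSR's smoothness slider applies:

```python
# Requires Pillow: pip install Pillow
from PIL import Image, ImageFilter

src = Image.open("dsr_render_2560x1440.png")  # hypothetical captured 1.78x frame
target = (1920, 1080)

# Sharp downscale, similar in spirit to a plain custom-resolution scale:
sharp = src.resize(target, Image.Resampling.LANCZOS)

# Blur first, then resample -- roughly what a smoothness filter does:
soft = src.filter(ImageFilter.GaussianBlur(radius=0.8)).resize(
    target, Image.Resampling.BILINEAR)

sharp.save("downscale_sharp.png")
soft.save("downscale_soft.png")
```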
 
I think the author is blind. The DLDSR 1620p option in Prey is clearly superior to the 4K DSR option; it's not even close. The 4K DSR option just looks like a slightly sharper version of 1080p with no detail improvement at all. DLDSR clearly shows finer detail, i.e. higher effective resolution, than native 1080p or 4K DSR; just look at the map in the background, for example.
 
DLDSR is excellent, and even better when combined with DLSS. It looks like Nvidia is the one helping their cards age like fine wine here, given that we got DLDSR mid-cycle and DLSS keeps improving and keeps being added to games, among other factors.

If anyone ever didn't believe that above-native IQ is possible (daft, I know, but these people exist), DLDSR, and even DLSS+DLDSR, is yet another nail in the coffin of that silly claim.
 
Not to worry, the typical AMD 'me too' effort will surely not be much more than a year late and not that much worse.
It is certainly the way of things. Patient gamers will get a 'decent' free alternative... at some point.
 
This would be great on a lower-resolution monitor where you have the performance overhead in all titles. Looking at the option, though, it's only globally on or off and can't be selected per title, which would be the best way to use it. With a 4K panel it's only going to work with older or non-AAA games, and I can't be bothered to manually turn it on or off every time I switch between newer and older games. Perhaps per-title selection can be implemented in a future driver release.
 
Not to worry, the typical AMD 'me too' effort will surely not be much more than a year late and not that much worse.
Don't forget, it's a meaningless feature that no one needs. Oh wait, AMD has it too now? Now it's important and has to be used in all comparisons and benchmarks.
 
Thanks for the quality article. I understand the process described and definitely see the difference in the results. However, it escapes me how a game's rendering engine can't output such a high-quality image directly at its intended resolution, for example 1080p, so that there's no need to render at a higher resolution and then downsample back to 1080p. I mean, pixels are still pixels, right? Could someone enlighten me?
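The usual answer: at native resolution each pixel is shaded from roughly one sample of the scene, so any detail smaller than a pixel is hit-or-miss (aliasing). Rendering at a higher resolution and then downscaling averages several scene samples into each final pixel. A toy 1D sketch of the idea (illustration only, not any engine's actual code):

```python
def scene(x: float) -> float:
    # Toy 1D "scene": bright stripes 1.5 units wide, repeating every 3 units.
    return 1.0 if (x % 3.0) < 1.5 else 0.0

def render(width_px: int, samples_per_px: int) -> list[float]:
    pixels = []
    for px in range(width_px):
        subs = [scene(px + (s + 0.5) / samples_per_px)
                for s in range(samples_per_px)]
        pixels.append(sum(subs) / samples_per_px)  # downscale = average
    return pixels

# One sample per pixel ("native"): thin stripes are hit-or-miss.
print(render(6, 1))  # [1.0, 0.0, 0.0, 1.0, 0.0, 0.0]

# Four samples per pixel (render high, then downscale): fractional
# coverage appears and edges resolve correctly.
print(render(6, 4))  # [1.0, 0.5, 0.0, 1.0, 0.5, 0.0]
```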
 
DLDSR works just fine with G-Sync in full-screen mode; the article is incorrect here. Make sure that in the driver's scaling options, scaling is set to "aspect ratio" and performed by the GPU.
 
I'm using DLDSR right now on 2008's Dead Space with G-Sync in full-screen mode and it looks great! The title is old enough that with my 3080 12GB at 2.25x, I'm still getting a rock-solid 120 fps while rendering at 6880 x 2880! I limited it to 120 fps through RivaTuner because it's a first-person shooter and doesn't need more. Plus, I read that the game engine doesn't play well at super-high frame rates.

This, plus using WideScreenFixer to get a better FOV, has been great!
 