Is Upscaling Useful at Lower Resolutions? Nvidia DLSS vs Native at 1080p

TAA =/= Native. Most of the time it makes everything blurry, so in fact you are comparing two image blurring techniques.

The best solution is to run without TAA at the highest possible resolution, but there are still problems because the current rendering tech depends on this crap to work correctly. -_-

 
Good information. Yet I honestly could not see any differences until they were directly pointed out (I still don't see the "obvious taillight ghosting"), and all were minor enough that I would never notice while in the thick of the action.
 
Nice article, thanks for the info! My only issue is that the author only lists the native render resolution that Quality DLSS starts from at the end of the article. I was confused until I read the conclusion. It's kind of weird thinking of it as 1080p native vs 1080p DLSS, instead of 1080p native vs 720p upscaled to 1080p, lol.
 
In all the images provided (I can't watch the videos from work), the DLSS shots all have a lot of extra smearing/blurring that you can see when comparing the two images. Foreground stuff looks pretty much the same, but once you start getting into the midground and background you can really see the blurring effects more and more in the DLSS images.

I do not see a single reason why you would use DLSS. Even back in the 1.0 days it wasn't a good look, and while it has improved, it still makes things blurry. I'd rather lower settings here and there than make things blurry by turning DLSS on.

Granted, if you sat me down at a computer to play a game and didn't tell me the settings you were using and if DLSS was on or not, I wouldn't be able to tell without having a comparison side-by-side to go by. But if I'm playing a game and enable DLSS I can see the blurring effect that happens, especially in the background. I've tried DLSS 2.0 in a couple of games and I didn't like the blurring effect it gives, kind of reminds me of motion blur, but you get the effect as you're standing still looking at things. I hate motion blur.
 
I don't know about DLSS, but I can notice FSR's blurrier image even at 4K. So if I really need some extra performance I'll use upscaling, but I prefer native.
 
Upscaling at 1080p (in Performance mode) means rendering from just 540p...

Not to mention that upscaling under a CPU bottleneck provides limited returns.

So no, it is even more garbage Tim...
 
TAA =/= Native. Most of the time it makes everything blurry, so in fact you are comparing two image blurring techniques.

The best solution is to run without TAA at the highest possible resolution, but there are still problems because the current rendering tech depends on this crap to work correctly. -_-
Absolutely, it is even more noticeable at 2160p.
 
With 1080p, Nvidia should offer an Ultra Quality mode that only reduces the render resolution to 900p, or redo their entire scale so that Quality is 900p, Balanced 720p, Performance 630p and Ultra Performance 540p. I just don't think rendering at 360p to play at 1080p is worth it for anyone. If the average uplift is 36%, there is room for a 15-20% uplift tier that I think would come much closer to native.

Same thing for 1440p, there should be a 1200p render option, and for 4K a 1800p option.
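For reference, here's a quick sketch of the internal render resolutions behind each tier. The four standard DLSS per-axis scale factors (Quality ~0.667x, Balanced ~0.58x, Performance 0.5x, Ultra Performance ~0.333x) are real; the ~0.833x "Ultra Quality" factor is just the hypothetical tier proposed above.

```python
# Render resolutions implied by per-axis scale factors at 1080p output.
# "Ultra Quality" is the hypothetical ~0.833x tier suggested above
# (900p at 1080p); the other four are DLSS's standard modes.
SCALE_FACTORS = {
    "Ultra Quality (proposed)": 900 / 1080,  # ~0.833
    "Quality": 1 / 1.5,                      # ~0.667
    "Balanced": 1 / 1.724,                   # ~0.58
    "Performance": 1 / 2.0,                  # 0.5
    "Ultra Performance": 1 / 3.0,            # ~0.333
}

def render_resolution(out_w, out_h, factor):
    """Scale each axis by the factor and round to whole pixels."""
    return round(out_w * factor), round(out_h * factor)

for mode, f in SCALE_FACTORS.items():
    w, h = render_resolution(1920, 1080, f)
    print(f"{mode:>25}: {w}x{h}")
```

By this math the proposed tier would render 1600x900 internally, versus 1280x720 for today's Quality mode and 640x360 for Ultra Performance.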
 
TAA =/= Native. Most of the time it makes everything blurry, so in fact you are comparing two image blurring techniques.

The best solution is to run without TAA at the highest possible resolution, but there are still problems because the current rendering tech depends on this crap to work correctly. -_-
Yeah, in Vermintide 2 I actually replaced the TAA with DLAA at 4K native resolution, and after playing with it for a while I started noticing texture pop-in.
But then, is a 720p image upscaled to 1080p with DLSS Quality superior to native 1080p with TAA?
IMO: native resolution with DLAA > native resolution with TAA > DLSS Quality > DLSS Performance > DLSS Ultra Performance.
 
Native resolution is preferable for most games, if it gives stable FPS and enjoyable gameplay.

If FPS is not stable and/or gameplay is not smooth (laggy), then upscaling is a way to get more stable FPS and smoother gameplay.

I also haven't seen a noticeable difference between High and Ultra graphics settings in most games.

I suggest lowering graphics from Ultra to High and playing at native resolution. It's better than upscaling.
 
Native resolution is preferable for most games, if it gives stable FPS and enjoyable gameplay.

If FPS is not stable and/or gameplay is not smooth (laggy), then upscaling is a way to get more stable FPS and smoother gameplay.

I also haven't seen a noticeable difference between High and Ultra graphics settings in most games.

I suggest lowering graphics from Ultra to High and playing at native resolution. It's better than upscaling.
Exactly
 
Yeah, in Vermintide 2 I actually replaced the TAA with DLAA at 4K native resolution, and after playing with it for a while I started noticing texture pop-in.
But then, is a 720p image upscaled to 1080p with DLSS Quality superior to native 1080p with TAA?
IMO: native resolution with DLAA > native resolution with TAA > DLSS Quality > DLSS Performance > DLSS Ultra Performance.
DLAA looks categorically worse than native
 
DLAA looks categorically worse than native
While this is all subjective, don't take my word for it. Here is TPU's conclusion on the matter.

"Conclusion
DLSS is designed to improve performance with only a minimal loss in image quality, while the new DLAA algorithm does the opposite, significantly improving image quality at a slight performance hit. Compared to TAA and DLSS, DLAA is clearly producing the best image quality, especially at lower resolutions. In most games, DLSS was already doing a better job than TAA at reconstructing small objects (like wires or tree leaves, for example); however, as our comparison images show, DLAA does an even better job at reconstructing small objects."


Let me know if you have any questions.
 
I can report that FSR works in Ubuntu on Intel Xe (Tiger Lake GPU). (It's the Core i3-1115G4, so it has the lowest-end 11th-gen GPU.) I had one or two games where FPS was a tad low; FSR on lower settings increased FPS by something like 10-30%, and cranking it to higher settings I was getting a 50-100% FPS boost. It looked a tad poor, but in the games that needed it the FPS increase was worth it. If I want a game to look its best I run it on my desktop.
 
I feel the title of this article and the actual testing don't sync up. The title asks whether upscaling is useful at lower resolutions, while the bulk of the article discusses image quality between native and upscaled. Generally, upscaling will be useful at whatever resolution. The objective of upscaling is mostly to make an unplayable game (due to limited hardware) playable, and regardless of how it looks, that main objective is still achieved. And to be honest, if I want to play a game on dated hardware, it is not a matter of choosing between using and not using upscaling. I will have to use it, or just lower the resolution to, say, native 720p. So if we want to see how well an upscaling technology works, I think we should be testing upscaled 1080p (from 720p) vs native 720p, and see what the performance delta is, with image quality as a sideline.
 
Your GPU can’t create enough FPS at the resolution you want.

That is the only reason to run it.
Came here to post this. If you're running upscaling at low resolutions then your last concern is probably graphics fidelity.

Just look at the Steam Deck; it needs to run FSR at 800p just to get over 30 FPS in some games, but people are absolutely in love with these handhelds.

A slow PC is better than no PC!

People seem to forget that even during the GTX 980 era, you would often have to adjust some settings to hit 1080p60.
 
I think you missed the point completely. This is not a comparison between DLSS and FSR.
Just in case anyone wanted alternatives to DLSS: Epic made their own upscaler called TSR, which according to the Digital Foundry podcast is visually better than FSR 2. Another upscaler is XeSS, which I believe Tim mentioned is superior to FSR 2 as well.

 