Nvidia's DLSS Second Take: Metro Exodus Investigation

Scorpus

It's quite a performance boost: I can play on Extreme quality with DLSS + RT on my 2080 while staying mostly above 60-65 fps at 1440p, and that's on an i7-6700K, which I find excellent. I didn't see frame drops below 55 in any scenario; it ranges from 55 to 90, hovering mostly around 65 fps. All the negative hype regarding DLSS, I call BS.
 
It's quite a performance boost: I can play on Extreme quality with DLSS + RT on my 2080 while staying mostly above 60-65 fps at 1440p, and that's on an i7-6700K, which I find excellent. I didn't see frame drops below 55 in any scenario; it ranges from 55 to 90, hovering mostly around 65 fps. All the negative hype regarding DLSS, I call BS.

It's mind-blowing how often "techies" shoot down new tech when it doesn't "just work" out of the box.
 
@Scorpus Good article.
My take is that it is hard to really see a significant improvement. Whether due to the technology, these old eyes, or my oldish monitor, since it takes a magnifying glass to really 'see' it, it is unlikely to be needed for gaming. Reminds me of street-legal versions of F1 racers: a lot of capability, but limited use. (I drive an older Volvo.)

Sorry, Nvidia.
 
A few things I found:
- When using DLSS I can't use HDR. If I enable HDR the picture becomes completely washed out, and I can't compensate with the monitor controls or the game's gamma settings; the colors are also incorrect. That will possibly be fixed in upcoming patches or an Nvidia driver.
- I can get almost the same performance at full 4K (3840x2160), maybe 10-15 fps less overall, but on Ultra settings, which is also breathtaking. However, I chose to stay on Extreme at 2560x1440. In both cases there is NO BLURRINESS while using DLSS.
- Change to 2560x1600, though, and it's another story: while the FPS is the same, everything becomes blurry, the whole image, not just spots or parts.

Obviously Nvidia is training the Tensor Core AI at specific resolutions, so if you go with these custom in-between resolutions there will be side effects like blurriness, flashing, and who knows what. Just stick to the default resolutions. And don't tell me you people pay $1,500+ for a 2080 Ti and can't play 4K at playable frame rates using DLSS + RT, or that it looks like crap. No, it doesn't. I don't buy it, because I play it and actually own the thing (a 2080, non-Ti). If I can do it on a 6th-gen i7-6700K, then all you kids with your mighty i9s and Ryzens can do the same, if not better.
 
Any word on ultrawide support, or do we have to wait for the AI to do its thing? It seems like if you own an ultrawide monitor you have to wait with no official ETA, and if you want to use DLSS you have to use another monitor with a 16:9 aspect ratio.
 
Pretty impressive tech, good job Nvidia.
The zoomed-in shots clearly show how much better it is, thanks for posting.
 
I didn't buy my RTX 2080 for DLSS.

Never cared about it or anything "ray tracing" either; I just want faster frames in the games I'm playing (which don't use any of that crap). And coincidentally NOBODY plays Battlefield V (because of hackers), so no amount of Nvidia ray tracing or DLSS in that game matters to anyone. (Try again.)

DLSS is a gimmick; Jensen was so high on cash that he thought he could pull one over the fanbase's eyes.

"It just works..." <--- shows you how delusional that guy has become with greed.
 
lol^ (OMFG)

What are you talking about? It clearly shows how horrible and unnecessary DLSS actually is, when you can simply run the games at 1800p and get the same frames and much better clarity.
Like I mentioned before, obviously the Tensor Cores are not trained for 1800p. WTF is 1800p anyway? It's a non-standard vertical resolution. Who came up with this, and what good is it?
 
It is understandable, if you understand what DLSS is.
(& why it can NOT be used at every resolution)

Logical gamers came up with 1800p as their own argument against DLSS. Logically, people who play their games on their brand new 4K monitors want 4K output, not 1440p upscaled to 4K using DLSS. If THE PLAYERS wanted that effect (DLSS), they could easily just set their game to 1800p manually and get much better frames (than playing at 2160p). Which is exactly what DLSS claims to do, but 1800p is still sharper & clearer than DLSS, while providing nearly identical frames to DLSS as well (rough numbers in the sketch below).

So it is a marketing gimmick.
So, "It just works" refers to all these developers & programmers & game makers running around frantically to make DLSS (just) work..?

OR...

Have the 4K PLAYERS manually set their games to 1800p & gain free FPS. (AMAZING!)
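A quick back-of-the-envelope check of that claim (a minimal sketch; it assumes 1800p means 3200x1800 and that frame cost scales roughly with shaded pixel count, which is only approximately true):

```python
# Pixel counts for the resolutions in this argument
resolutions = {
    "2160p (3840x2160)": 3840 * 2160,  # native 4K: 8,294,400 px
    "1800p (3200x1800)": 3200 * 1800,  # 5,760,000 px
    "1440p (2560x1440)": 2560 * 1440,  # 3,686,400 px
}

native = resolutions["2160p (3840x2160)"]
for name, px in resolutions.items():
    print(f"{name}: {px:>9,} px ({px / native:.0%} of native 4K)")

# 1800p shades only ~69% of the pixels of native 4K, which is
# roughly where the "free FPS" in this comparison comes from.
```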
 
Logical? Some "gamers" come up with a new standard for resolution? There is nothing logical about it. Real 4K is 4096x2160, but the consumer 4K standard is 3840x2160, and this 1800p hybrid is completely illogical. Nvidia has to follow industry and consumer STANDARDS, and that's what they're doing, so you can't complain if the Tensor cores are not trained for this sort of "gamer" crap people come up with. Nvidia clearly stated how the DLSS process works and how the Tensor cores and AI have to be trained.
 
It clearly shows how horrible and unnecessary DLSS actually is, when you can simply run the games at 1800p and get the same frames and much better clarity.
I don't think you understand the article.
The zoomed-in shots clearly show DLSS is much clearer at 1440p than the normal shader version at 1440p.
And who has 1800p screens? I have a 1440p G-Sync 1 ms HP Omen, and with DLSS it will look better than 1800p without it; this article makes that quite clear.

Quote from the author:
we think a lot of people will be more satisfied playing the game with DLSS enabled at a higher level of performance, than running with slightly better sharpness at lower performance
Either you are having trouble with reading comprehension, or you're just trying to hate.
 
So is checkerboarding the best upscaling technique so far?
Is it possible to make a comparison between DLSS and checkerboard rendering?
 
It clearly shows how horrible and unnecessary DLSS actually is, when you can simply run the games at 1800p and get the same frames and much better clarity.
I don't think you understand the article.
The zoomed-in shots clearly show DLSS is much clearer at 1440p than the normal shader version at 1440p.
And who has 1800p screens? I have a 1440p G-Sync 1 ms HP Omen, and with DLSS it will look better than 1800p without it; this article makes that quite clear.

Quote from the author:
we think a lot of people will be more satisfied playing the game with DLSS enabled at a higher level of performance, than running with slightly better sharpness at lower performance
Either you are having trouble with reading comprehension, or you're just trying to hate.

Man, you guys fail at reading comprehension. The games they are using in the tests have their own resolution-upscaling slider bars. They are not setting the monitor to 1800p; they are setting that scaling slider to 1800p. The game then renders at 1800p and *upscales* that to 4K. They did this to create a performance baseline. DLSS performs the *same* as setting that slider to 1800p. So logically we would expect DLSS to look significantly better than simply letting the game engine upscale an 1800p image.
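(To illustrate what that slider does, here is a hypothetical sketch using Pillow's bilinear filter; the game's actual scaler is unknown to me and almost certainly fancier:)

```python
from PIL import Image

def render_scale_upscale(frame: Image.Image,
                         target=(3840, 2160)) -> Image.Image:
    """Mimic a resolution-scale slider: the engine renders a smaller
    frame, then stretches it to the output resolution for display."""
    return frame.resize(target, resample=Image.BILINEAR)

# A stand-in frame "rendered" at 1800p, upscaled to 4K
frame_1800p = Image.new("RGB", (3200, 1800))
print(render_scale_upscale(frame_1800p).size)  # (3840, 2160)
```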

DLSS *requires* a pile of "Tensor" cores on the GPU that are not being used for *anything else*, and yet they provide, at best, a tiny, hard-to-see improvement over simply doing it in software. That should be everyone's takeaway. I don't care if it's 5% faster than a software solution. It's a pile of die space being uselessly filled with cores that are worthless for gaming.

The entire Turing die was designed for enterprise and AI/hyperscale/cloud markets, *not* gaming. Nvidia just decided to slap some marketing terms on the GPUs and release very expensive enterprise products as the "new hotness". They spent a fortune developing what amounts to ASIC cores for a very specific workload, which is great; AI is amazing. What *isn't* great is them thinking they can just repackage it, drop it on gamers at a 40-50% premium over the previous gen, and have it all be peachy.

At least it seems they got their hands slapped this time (judging from their sales reports).
 
It clearly shows how horrible and unnecessary DLSS actually is, when you can simply run the games at 1800p and get the same frames and much better clarity.
I don't think you understand the article.
The zoomed-in shots clearly show DLSS is much clearer at 1440p than the normal shader version at 1440p.
And who has 1800p screens? I have a 1440p G-Sync 1 ms HP Omen, and with DLSS it will look better than 1800p without it; this article makes that quite clear.

I don't think you understand the article, actually. The only reason DLSS ever seems to (sometimes) have an advantage is the added "sharpening" of the DLSS image. There are already programs that let you apply sharpening to your games without DLSS, and the author hypothesizes that a sharpened, upscaled image will likely look as good as or better than the "DLSS-sharpened" image in Metro.

In other words, Nvidia said they would fix the blurry DLSS results, and their fix was apparently to apply a different technique on top of their garbage marketing gimmick. They didn't "fix DLSS"; they simply covered up how bad it is by also applying an additional technique. DLSS is thus never going to be anything but a marketing term.

Nvidia may call a graphics option "DLSS" in future games, but in reality it is just a lower-resolution image upscaled with sharpening applied, lol (see the sketch below).
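For what it's worth, the "sharpened upscale" being described can be approximated in a few lines (a toy sketch with Pillow's unsharp mask; the radius/percent/threshold values are arbitrary, and this is obviously not Nvidia's actual pipeline):

```python
from PIL import Image, ImageFilter

def sharpened_upscale(frame: Image.Image,
                      target=(3840, 2160)) -> Image.Image:
    """Upscale a lower-resolution frame, then apply an unsharp mask,
    the kind of post-process sharpening the author suggests could
    rival the "fixed" DLSS result."""
    upscaled = frame.resize(target, resample=Image.BILINEAR)
    return upscaled.filter(
        ImageFilter.UnsharpMask(radius=2, percent=80, threshold=2))
```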
 
And who has 1800p screens? I have a 1440p G-Sync 1 ms HP Omen, and with DLSS it will look better than 1800p without it; this article makes that quite clear.

Not to pick on you here, but this warranted a direct response. The 1440p comparison was between 1440p and 1290p, *not* 1800p. If you turn on supersampling in the game and push the slider to 1800p, it's going to look better than DLSS/1290p and slightly better than 1440p, but it's going to have a high framerate cost. DLSS *upscales*; it doesn't downscale (like supersampling does). DLSS will never look as good as native, and the article clearly shows that. (The sketch below contrasts the two directions.)
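(A minimal sketch of the distinction, assuming a 1440p display; the 2293x1290 figure is just 1290p at 16:9, rounded:)

```python
from PIL import Image

NATIVE = (2560, 1440)  # the display's native resolution

def supersample(native=NATIVE) -> Image.Image:
    """Supersampling: render ABOVE native, then downscale to fit."""
    hi_res = Image.new("RGB", (3200, 1800))  # rendered at 1800p
    return hi_res.resize(native, resample=Image.LANCZOS)

def upscale(native=NATIVE) -> Image.Image:
    """Upscaling (the direction DLSS works in): render BELOW native,
    then scale up to the display resolution."""
    lo_res = Image.new("RGB", (2293, 1290))  # rendered at ~1290p
    return lo_res.resize(native, resample=Image.BILINEAR)
```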

The question isn't "does DLSS look as good as native rendering?" The question is "does DLSS outperform software-only solutions at the same image quality?" And the answer is "meh, sorta, maybe, not really".
 
It clearly shows how horrible and unnecessary DLSS actually is, when you can simply run the games at 1800p and get the same frames and much better clarity.
I don't think you understand the article.
The zoomed-in shots clearly show DLSS is much clearer at 1440p than the normal shader version at 1440p.
And who has 1800p screens? I have a 1440p G-Sync 1 ms HP Omen, and with DLSS it will look better than 1800p without it; this article makes that quite clear.

Quote from the author:
we think a lot of people will be more satisfied playing the game with DLSS enabled at a higher level of performance, than running with slightly better sharpness at lower performance
Either you are having trouble with reading comprehension, or you're just trying to hate.

That's the trouble: to see any real improvement you need to zoom in 3x, so in actual gameplay, running around, you will not notice these subtle effects most of the time. And in other cases it's clear even without zooming in: the textures are now softer and you get other artifacts.

Not sure why people even worry about any sort of AA at 4K; it looks glorious without it. AA at 1080p and 1440p makes perfect sense.
 
It clearly shows how horrible and unnecessary DLSS actually is, when you can simply run the games at 1800p and get the same frames and much better clarity.
I don't think you understand the article.
The zoomed-in shots clearly show DLSS is much clearer at 1440p than the normal shader version at 1440p.
And who has 1800p screens? I have a 1440p G-Sync 1 ms HP Omen, and with DLSS it will look better than 1800p without it; this article makes that quite clear.

Quote from the author:
we think a lot of people will be more satisfied playing the game with DLSS enabled at a higher level of performance, than running with slightly better sharpness at lower performance
Either you are having trouble with reading comprehension, or you're just trying to hate.

That's the trouble: to see any real improvement you need to zoom in 3x, so in actual gameplay, running around, you will not notice these subtle effects most of the time. And in other cases it's clear even without zooming in: the textures are now softer and you get other artifacts.

Not sure why people even worry about any sort of AA at 4K; it looks glorious without it. AA at 1080p and 1440p makes perfect sense.

I see this nonsense spewed so often, and in 90%+ of cases it's people who don't even have a 4K display saying it.
No, 4K does not look 'glorious' without AA. The pixels are still visible and jarring. You still need some light AA: usually 2x MSAA is more than enough at 4K, or a post-process AA like SMAA or FXAA if you can't afford the performance hit and don't mind slight blurriness.

At native resolution you'll always have jaggies unless you go to ridiculously high DPI. I'm using a 4K 27" panel and there's a night-and-day difference when running with or without AA. You don't even need to zoom in; it's instantly visible unless you're going blind. (Rough pixel-density numbers below.)
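(For context, a quick pixel-density calculation; the point at which jaggies stop being visible is a rough rule of thumb that depends on viewing distance, not a measured figure:)

```python
import math

# Diagonal pixel count of a 3840x2160 panel
diag_px = math.hypot(3840, 2160)   # ~4406 px

# Pixel density of a 27-inch 4K panel
ppi = diag_px / 27                 # ~163 PPI
print(f"{ppi:.0f} PPI")

# At typical desktop viewing distances, aliasing generally remains
# visible well past 163 PPI, which is why even a 27" 4K panel
# still benefits from some form of AA.
```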
 
It clearly shows how horrible and unnecessary DLSS actually is, when you can simply run the games at 1800p and get the same frames and much better clarity.
I don't think you understand the article.
The zoomed-in shots clearly show DLSS is much clearer at 1440p than the normal shader version at 1440p.
And who has 1800p screens? I have a 1440p G-Sync 1 ms HP Omen, and with DLSS it will look better than 1800p without it; this article makes that quite clear.

Quote from the author:
we think a lot of people will be more satisfied playing the game with DLSS enabled at a higher level of performance, than running with slightly better sharpness at lower performance

Either you are having trouble with reading comprehension, or you're just trying to hate.

What article?
Gamers Nexus and all the other review sites..? Including this one here, which states exactly what I have said..?

Let's not mention that if someone buys an RTX 2080 Ti, they are not using it for 1440p, but for 4K. (Otherwise, see the RTX 2080's subpar results.) DLSS is for 4K and offers no practical use for 1440p gaming. (Hence the new 1660 Ti-esque series.)

And why? It's not like DLSS is any better than TAA, etc.

Lastly, DLSS doesn't "just work"; it takes deep learning. Nobody was looking for another form of anti-aliasing in their gaming. This is a gimmick.
 