Nvidia DLSS 3: Fake Frames or Big Gains?

That doesn't change the fact that humans can't see 500+ FPS.
But you have to be able to separate marketing from reality.

266Hz gaming is a reality and is perfect for FPS-style games; that kind of frequency allows for fluid movement and consistency. But many people start to feel "it" at about 120Hz and up.

IMO, beyond 266Hz it is much harder to differentiate or feel anything.

It may be hard to understand, but high-refresh-rate gaming has very little to do with seeing and far more to do with character movement and fluid panning. And yes, I think 500Hz panels are 100% marketing with little practicality.
 
To me this seems conceptually very similar to long-GOP lossy temporal video compression, where the delta frames between key frames are reconstructed by the codec. It is also reminiscent of that 'motion smoothing' frame interpolation feature that some televisions have.
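If it helps make the analogy concrete, here is a minimal toy sketch of the "reconstruct a frame between two real ones" idea, using OpenCV's Farneback optical flow as a stand-in. This is an illustration under my own assumptions, not how DLSS 3 actually works: Nvidia's version uses a hardware optical flow accelerator, game motion vectors and a neural network, and handles occlusions far better than a crude half-step warp.

```python
# Toy midpoint-frame interpolation, TV "motion smoothing" style.
# NOT DLSS 3's method; just the general "fake in-between frame" idea.
import cv2
import numpy as np

def naive_midpoint_frame(prev_bgr, next_bgr):
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_bgr, cv2.COLOR_BGR2GRAY)
    # Dense optical flow from the previous frame to the next (dx, dy per pixel).
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Crude backward warp of the previous frame half a step along the flow.
    # Real interpolators also handle occlusions, disocclusions and blending.
    map_x = (grid_x - 0.5 * flow[..., 0]).astype(np.float32)
    map_y = (grid_y - 0.5 * flow[..., 1]).astype(np.float32)
    return cv2.remap(prev_bgr, map_x, map_y, cv2.INTER_LINEAR)
```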
 
240Hz seems to be the sweet spot imo for first person shooter gaming.
 
The only value is motion fidelity: making it look like you are running the game at a higher refresh rate so that motion appears smoother. There is no other perceivable benefit, and as far as I'm concerned, at least right now, the negatives outweigh the positives.

Exactly. Very few people realise that this technology is aimed at ultra-high frame rates (120Hz+) and that the benefit is motion clarity (see the UFO tests for monitors). Nvidia is marketing this as a performance boost, which it kinda is, but the best results will come from boosting an already high frame rate, as argued by this article.
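For anyone who hasn't run the UFO test, the motion-clarity argument boils down to persistence: on a sample-and-hold panel, a moving object smears over roughly (panning speed x frame time) pixels. A simplified back-of-the-envelope model follows; the panning speed is just an assumed example figure, not a measurement.

```python
# Simplified sample-and-hold motion blur model, in the spirit of the
# Blur Busters / UFO test reasoning: smear width ~= panning speed * frame time.
panning_speed_px_per_s = 1920   # e.g. a full-screen-width pan in one second

for refresh_hz in (60, 120, 240, 500):
    frame_time_s = 1.0 / refresh_hz
    smear_px = panning_speed_px_per_s * frame_time_s
    print(f"{refresh_hz:>3} Hz: ~{smear_px:.0f} px of smear")
# 60 Hz -> ~32 px, 120 Hz -> ~16 px, 240 Hz -> ~8 px, 500 Hz -> ~4 px:
# each doubling of refresh rate halves persistence blur, which is the
# motion-clarity benefit being discussed, independent of input latency.
```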
 
It almost feels like Nvidia decided they need to create that distance from their competition and that adding frames will do the job. So you get a high-FPS scenario with likely not-as-great latency. Perhaps DLSS 3.0 frame generation will improve over time, but at this point it's almost shaping up to be a DLSS 1.0 (not from a visual standpoint, but from a success standpoint) for most users, unless you are gaming at very high FPS.
 
It kinda is a performance boost, yes, but the question is how many people will buy the expensive RTX 4090 and RTX 4080 16GB, which are the cards likely to offer very high FPS. I wonder how Nvidia will pitch this when the mid- and lower-tier cards arrive next year. Nonetheless, I think this is a nice-to-have technology.
 
If you need 60-100 FPS for it not to look bad, that kinda defeats the purpose of having DLSS to begin with, eh? DLSS was supposed to be for weaker cards that needed the speed boost, and there the increase in input latency would be a real headache. It would make even the best-written game controls feel slow and clunky.
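Rough numbers help show why the latency concern bites hardest exactly where weaker cards live. Because the generated frame is interpolated, the pipeline has to hold a finished frame until the next real one arrives, so you pay roughly one render interval of extra delay. The sketch below is a simplified estimate with assumed figures, not measured DLSS 3 behaviour.

```python
# Back-of-the-envelope model of frame generation latency (illustrative only;
# constants are assumptions, not measured DLSS 3 numbers).
def frame_generation_estimate(rendered_fps):
    frame_time_ms = 1000.0 / rendered_fps
    displayed_fps = rendered_fps * 2   # one generated frame per real frame
    # Interpolation must wait for the *next* real frame before it can insert
    # one in between, so roughly one extra render interval of delay.
    added_latency_ms = frame_time_ms
    return displayed_fps, added_latency_ms

for fps in (30, 60, 120):
    shown, extra = frame_generation_estimate(fps)
    print(f"{fps:>3} FPS rendered -> ~{shown} FPS shown, ~{extra:.1f} ms extra latency")
# At 30 FPS rendered the display shows ~60 FPS but controls pick up ~33 ms of
# extra delay; at 120 FPS the same doubling only costs ~8 ms, which is why
# a high base frame rate is the comfortable use case.
```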
 
You might be right about 500Hz panels being marketing, but it's worth checking out Blur Busters. These super-high refresh rates are probably more about motion clarity than responsiveness.
 
"The sweet spot where buyers will benefit the most will be around RTX 4070 levels of performance, where good frame rates are achievable and DLSS 3 can provide a boost into the 200 FPS range."

I always thought the purpose of DLSS was to make games playable that weren't before.
True, especially when RT is enabled. I guess this can be useful if it reduces blur at least on high-refresh monitors.
 
Yeah, Blur Busters is a way of checking how good a panel is, but nobody is actually looking at their targets in game; they just want to see the motion first so they can be the first to respond.
 
DLSS always looks noticeably inferior to full native resolution; you are better off dropping resolution with max settings than using DLSS to make the game look worse at a higher resolution.
 
This is one of those things where, if that is your opinion of DLSS, then that is your preference. That being said, many people do not see it that way. I turn on DLSS whenever it is available because it looks much better than 1440p on my 4K monitor, even in "Performance" mode. It provides better AA and does not look fuzzy like running at a non-native resolution. I'll typically run the highest quality setting I can while still maintaining a 90+ framerate. That's DLSS 2.0; I'm not interested in 3.0.
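For context on what "Performance" mode actually samples from at 4K, here are the commonly cited per-axis DLSS scale factors; treat the exact ratios as approximate assumptions rather than an official spec.

```python
# Approximate internal render resolutions behind the DLSS modes on a 4K output
# (scale factors are the commonly cited per-axis ratios, taken as assumptions).
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}
out_w, out_h = 3840, 2160

for name, scale in modes.items():
    print(f"{name:>17}: {round(out_w * scale)} x {round(out_h * scale)}")
# Performance mode at 4K reconstructs from roughly 1920x1080, which is why it
# can still look sharper than simply running the panel at 1440p.
```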
 
It can also beat native if you are using DLDSR to upscale to a higher resolution than native, especially when there is performance on the table to mitigate a CPU bottleneck. The sampled resolution, for example, can be native 4K while the upscaled image is 8K. If that can hit 60 FPS, and on top of it you generate frames up to 120 FPS, not only do you get a superior image but in theory a smoother experience, at a latency cost offset by Reflex, lol. Will anybody buy into the DLSS 3 marketing?
Upscaling to a higher resolution than your monitor's is where I see the future of upscaling techniques, IMO.
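To make the resolution chain in that example concrete: with a 4x supersampling target on a 4K monitor and DLSS Performance mode, the game samples at native 4K, DLSS reconstructs an 8K intermediate image, and the driver downsamples it back to the panel. The factors below are assumptions chosen to match the example; actual DLDSR presets are 1.78x and 2.25x, with classic DSR going up to 4x.

```python
# Illustrative resolution math for the DLDSR/DSR + DLSS combination described
# above (factors are assumptions picked to match the "sample at 4K, upscale
# to 8K" example).
native = (3840, 2160)          # 4K monitor
dsr_factor = 4                 # total pixel-count multiplier
dlss_axis_scale = 0.5          # DLSS Performance: 50% per axis

target = (int(native[0] * dsr_factor ** 0.5), int(native[1] * dsr_factor ** 0.5))
internal = (int(target[0] * dlss_axis_scale), int(target[1] * dlss_axis_scale))

print("internal render:", internal)   # (3840, 2160) -> sampled at native 4K
print("DLSS output    :", target)     # (7680, 4320) -> 8K intermediate image
print("displayed      :", native)     # downsampled back to the 4K panel
```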
 
I avoid DLSS like the plague; emperor's new clothes. Looks WAY worse than native. If I want that, I'll hop on my P3 with an Nvidia M64.
 
Why on earth would you upscale to an 8K image on a 4K monitor again...?
 
Here is a great article on it
I used DLDSR at 2x in A Plague Tale: Requiem at 4K with DLSS set to Quality; the performance and image quality were superior to native. You have to play around with it yourself because it's purely subjective.
 
So basically, you would NEVER do this unless you have a 4090 and are playing at 1080p for some reason... otherwise native would be faster and superior.
 
I used this a lot with a 3080 and older games on a 1440p 144Hz monitor. It looked fantastic compared to any AA available in those titles, especially since most older games don't have TAA. It's not really meant for newer titles. The nice thing about DLDSR is that you don't have to quadruple the resolution to get good downsampling AA: 4K on a 1440p panel looks just as good, or 1440p on 1080p.
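The pixel math behind "you don't have to quadruple the resolution" checks out, assuming DLDSR's published 2.25x preset:

```python
# Quick pixel-count check (illustrative arithmetic only): DLDSR's 2.25x preset
# on a 1440p panel lands exactly on 4K, so you get downsampling AA without the
# full 4x cost of classic DSR.
panel = 2560 * 1440             # 3,686,400 px
dldsr_225 = 3840 * 2160         # 8,294,400 px
classic_dsr_4x = 5120 * 2880    # 14,745,600 px

print(dldsr_225 / panel, classic_dsr_4x / panel)   # 2.25, 4.0
```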
 