Nvidia DLSS 3: Fake Frames or Big Gains?

Ask yourself this... why is a $1600 dGPU not just giving us actual frames, and instead trying to shove all these pseudo-frames down our throats?

Answer: because the Ada architecture isn't designed for video games but for enterprise, and Nvidia is just reusing that architecture and marketing for gaming... using all the badly binned dies and lower-tier silicon for the gaming cards.


Consoles use RDNA, and RDNA 3 for dGPUs will be announced on November 3rd.
 
Okay, I have been very critical of DLSS 3.0, but the RTX 4090 delivers on rasterization and RT performance. Whether it is worth $1600 is very debatable, but there is no question that it is by far the fastest GPU out there. I think the bigger concern is the deceptive marketing of DLSS 3.0. Calling it 2X to 4X the performance is deceptive because there are very few scenarios where the benefits of frame generation outweigh the negatives. It is going to vary game by game and even GPU by GPU, because you want a certain baseline of performance before even enabling it. And even then, the benefit seems greatly diminished, since you already need a very good frame rate and, of course, a very high refresh rate monitor.
 

Right, it makes no sense... until you look at the marketing angle for the 4070 and the 4060 Ti.
 
No, I'm not; I merely pointed out that the issue of added latency isn't likely to be addressed. That latency isn't necessarily a problem in itself; in some games it won't even be noticeable. The frame generation algorithm will certainly improve, and the number of visual issues should decrease in due course.
Then I'm more optimistic than most.
 
"This is a decent step towards making 500Hz and 1000Hz displays of the future viable. Rendering at those sorts of frame rates is probably never going to be achievable with the advancements in games, but there is a visual benefit from running at such a high refresh rate."

I call BS on this, Tim. I don't believe people can physically see a visual benefit from running at 500+ FPS. The placebo effect of a bigger number doesn't count.
 
I see this doing wonders for non-interactive cinematics. But for gameplay, GOOD QUALITY motion blur is enough to make 60 fps look as smooth as 120 fps.
 
Yes, great article, Tim. It makes sense of both the tech and the marketing. Your hard work in figuring out what to measure, how to do it, and how to present it (none of which is simple) is much appreciated.
 
In a world so fake, only fake seems real and real only has value to the few that understand the paradigm.
 
I have tried DLSS 2 in a few games; the resulting images looked dirty and color-distorted.
Maybe it's just me and the games I play. I haven't tried it since; I prefer to lower shadows and other settings to keep frame rates under control.
The latest version of DLSS 2.0 is really good. They have removed most of the ghosting, and it is increasingly hard to tell the difference between DLSS and native. That being said, some people may be a little more sensitive to it. I have been playing Deathloop on Game Pass since it released, and DLSS at 4K on a 3080 has been great: the game runs smoothly and looks like native 4K to me.
 
Great article, thanks a lot! I really like this kind of article: a hands-on deep dive into a specific topic.

Aside from its usefulness, which seems to be 'lacking', I've been wondering how this frame generation 'feels', and these quotes from the article make me wary of its effects:
".. enabling DLSS 3 can make a game feel more sluggish even if the frame rate has risen, a weird feeling for sure"
"It was honestly a weird experience playing a game where the smoothness is clearly high, but the game still feels slow, ..."

Some TVs have similar frame generation or motion smoothing, and it completely ruins the immersion and experience of watching a movie for me: it makes the movie feel weird and out of pace (the soap opera effect?).

Since DLSS 3 seems to generate one frame between every two rendered frames (rather than some strange ratio), it should produce more evenly paced output, but maybe it also/still has a similar strange pace/feel?
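For what it's worth, here is a minimal timing sketch (my own illustration, not Nvidia's actual pipeline) of why the one-between-two scheme paces evenly, assuming an idealized interpolator that places each generated frame exactly at the midpoint:

```python
# A rough sketch (illustration only, not Nvidia's actual pipeline) of
# how one interpolated frame between each rendered pair paces out.
render_fps = 60
interval_ms = 1000 / render_fps  # ~16.7 ms between rendered frames

rendered = [i * interval_ms for i in range(4)]
generated = [(a + b) / 2 for a, b in zip(rendered, rendered[1:])]

timeline = sorted([(t, "real") for t in rendered] +
                  [(t, "generated") for t in generated])
for t, kind in timeline:
    print(f"{t:6.1f} ms  {kind}")
# Frames alternate real/generated every ~8.3 ms, so pacing is even --
# but a generated frame can only exist once BOTH of its neighbors have
# been rendered, so every real frame is shown late in this model, which
# fits the "smooth but sluggish" feeling the article describes.
```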
 
Well, I will say, I take back all the times I said that Tim would never criticize anything Nvidia does.

Bravo and thanks for a very informative video.
 
"This is a decent step towards making 500Hz and 1000Hz displays of the future viable. Rendering at those sorts of frame rates is probably never going to be achievable with the advancements in games, but there is a visual benefit from running at such a high refresh rate."

I call BS on this Tim. I don't believe people can physically see a visual benefit from running 500+ FPS. The placebo effect of a bigger number doesn't count.
High refresh rates are about smoothness of gameplay, which is a big deal for people who play first-person shooters.
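There is also a concrete mechanism you can put numbers on: on a sample-and-hold display, each frame stays lit for the full refresh interval, so motion smears across roughly speed × frame time pixels. A minimal sketch, using a hypothetical pan speed:

```python
# Illustrative arithmetic for sample-and-hold motion blur: an object
# moving across the screen smears over (speed / refresh rate) pixels,
# because each frame is held for the whole refresh interval.

def persistence_blur_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Approximate smear width, in pixels, on a sample-and-hold display."""
    return speed_px_per_s / refresh_hz

speed = 3840  # hypothetical: object crosses a 4K screen in one second
for hz in (60, 144, 500, 1000):
    print(f"{hz:>4} Hz -> ~{persistence_blur_px(speed, hz):.1f} px of smear")
# 60 Hz -> ~64 px; 1000 Hz -> ~3.8 px. Whether a given viewer notices
# the difference past a few hundred Hz is a fair question, but the
# effect itself is measurable, not placebo.
```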
 
Two words: placebo effect. That's what DLSS 3.0 is! Also, Digital Foundry said you don't need DLSS 2.0 turned on to generate frames; you can do it at native resolution, FYI.
In conclusion: if you are already satisfied with your performance at a minimum of 60 fps, DLSS 3.0 only makes sense as an add-on to generate 120 fps, although the chart only shows generated frames when DLSS 2.0 is enabled as well, not at native resolution. I guess my question is: if you use native resolution and get 60 fps (at 60 ms of latency, roughly 3.5 frames' worth), would generating frames with upscaling techniques worsen the latency, and by how much?
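To hazard a back-of-envelope answer (my own estimate, not a figure from the article): interpolation has to buffer one rendered frame before it can generate the in-between, so a reasonable assumption is that latency grows by roughly half to one render interval, plus the cost of the generation pass itself:

```python
# Back-of-envelope latency estimate, assuming frame generation must
# hold back one rendered frame (an assumption, not a measured figure).
base_fps = 60
base_latency_ms = 60.0                 # the commenter's starting figure
render_interval_ms = 1000 / base_fps   # ~16.7 ms per rendered frame

low_est = base_latency_ms + 0.5 * render_interval_ms    # ~68 ms
high_est = base_latency_ms + 1.0 * render_interval_ms   # ~77 ms

print(f"displayed rate: {2 * base_fps} fps")
print(f"estimated latency: ~{low_est:.0f}-{high_est:.0f} ms "
      f"(up from {base_latency_ms:.0f} ms)")
```

So in this rough model the displayed frame rate doubles while input latency gets somewhat worse, which matches the "smooth but sluggish" observation quoted above.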
 
Ummm, so we don't get DisplayPort 2.1 to drive the monitors that would actually make use of this? Most people don't even have a high-refresh 4K display. If you do have a 144 Hz 4K monitor, you might as well use DLSS 2.0, because it will get you close to maxing out your refresh rate anyway, so why use fake frames at all? I bet all the fancy games built around RDNA for consoles will use minimal ray tracing and will max out your monitor's refresh rate with FSR Quality. Doesn't RDNA 3 have DisplayPort 2.1?
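The display-output point is easy to sanity-check with rough uncompressed-bandwidth math (my own figures, ignoring blanking intervals and DSC compression):

```python
# Rough check of why the display output matters for very high refresh
# rates. Payload figures are approximate; real links also carry
# blanking intervals, and DSC compression changes the picture.

def raw_gbps(w: int, h: int, hz: int, bits_per_px: int = 24) -> float:
    """Uncompressed video data rate in Gbps (no blanking, 8-bit RGB)."""
    return w * h * hz * bits_per_px / 1e9

need = raw_gbps(3840, 2160, 144)  # ~28.7 Gbps for 4K at 144 Hz
print(f"4K 144 Hz needs ~{need:.1f} Gbps uncompressed")

links = {
    "HDMI 2.0 (~14.4 Gbps payload)": 14.4,
    "DisplayPort 1.4 (~25.9 Gbps payload)": 25.92,
    "HDMI 2.1 (~42 Gbps payload)": 42.0,
}
for name, capacity in links.items():
    verdict = "fits" if capacity >= need else "needs DSC or subsampling"
    print(f"{name}: {verdict}")
```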
 