A Review of Nvidia's DLSS 4 Multi Frame Generation

The latency with frame generation should be greater than stated, because it is done with interpolation. So, in fact, FG adds at least one frame's worth of latency. At 60 fps that is about 16.7 ms plus the time it takes to generate the frames. But as shown, the real frame rate could drop by 25% because of frame generation, to 45 fps, and then the added latency becomes about 22 ms plus the time it takes to generate the frames. So we may be looking at +30 ms added to the real latency.
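A minimal sketch of that arithmetic, assuming interpolation buffers one real frame; the `generation_overhead_ms` value is a hypothetical placeholder, since the comment doesn't give an exact figure:

```python
# Sketch of the interpolation-latency argument above.
# Assumption: FG holds back one real frame so it can interpolate between it and
# the next, adding at least one real-frame time of input delay.
# generation_overhead_ms is a hypothetical placeholder, not a measured value.

def added_latency_ms(real_fps: float, generation_overhead_ms: float = 8.0) -> float:
    return 1000.0 / real_fps + generation_overhead_ms

print(added_latency_ms(60.0))  # ~24.7 ms if the real frame rate stays at 60 fps
print(added_latency_ms(45.0))  # ~30.2 ms if FG overhead drops the real rate to 45 fps
```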
 
I’m sorry TS community.
I can’t help myself.

The truth is heavy, and it’s hard to bear. Soon, everything we see will be shadows, everything we feel, a simulation.

The taste on our tongues, synthetic.
The air we breathe, perfumed lies.
Even the tools we hold, constructs of convenience, crafted to keep us blind.

The real slips through our fingers,
And all that’s left is the counterfeit.
 
Great article, well done lads.

Having recently gone back through my Steam catalogue and replayed a few older games (the latter Deus Ex entries), it struck me just how crisp everything was. Sharp edges, clear scenery. Having just finished the Black Ops 6 single player, as an example, it seems like every game released now has a thin coat of Vaseline smeared over the camera.

All these AI techniques just seem to be trying to get back to a quality point where we were 10 years ago.

Edit: as well as the rampant use of default-on post-processing effects that absolutely destroy image quality
 
As it is, there is a small window where I will use FG, usually 80-120 fps (native with DLSS) and not fast-paced games. I don't see the need for MFG. I do seem to be OK sliding back one notch at 4K when switching from DLSS 3 to DLSS 4, though. For instance, where I used 4K Balanced before, I'm fine with 4K Performance now. That makes it feel faster, and I don't need to replace the 4090. Plus, I got another 10% by fixing a system issue and putting the OC back on. I had hardware issues and disabled everything that was a potential cause, and didn't turn it all back on when I got the new motherboard.
 
I've been saying this since its inception, and often here:
Upscaling is a compromise/crutch for those who have older video cards, so that they can dump their 8-year-old 1080p monitors and jump to a new 1440p/4K (or ultrawide) OLED gaming monitor... WITHOUT having to buy a NEW GPU.

Instead of buying a $600 GPU, most gamers would be much better served buying a new 1440p OLED monitor instead. It will impact their gaming much more than a new GPU will. 1080p to 1440p is roughly 78% more pixels, and they are brighter/better/faster/bigger than their old monitor experience. Again, using the SAME GPU.
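A quick check of that pixel-count comparison, spelled out:

```python
# The 1080p -> 1440p pixel-count jump mentioned above.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_1440p = 2560 * 1440   # 3,686,400 pixels
print(pixels_1440p / pixels_1080p)  # ~1.78, i.e. roughly 78% more pixels
```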



Nobody who is buying a new GPU cares about upscaling, BECAUSE the reason you are buying a new GPU is to get away from it. Nvidia is trying to sell you upgraded fake frames and it's disgusting!

Prosumer tip:
Any GPU can upscale, so buy the one that offers the best raster performance at the lowest price (i.e. best price/performance).
 
If I'm getting 80-120 fps native at 1440p, I really don't feel a need for more fps - on a 60 Hz monitor. Seems the technology makes the most impact where it isn't needed (much like Intel's GPUs only being good value in new, up-to-date systems).
 
I've used single frame generation on a 4080 in a handful of games. I've noticed the latency issue is much more prominent when using a mouse to move the camera vs a controller. In Cyberpunk, for example, the "improved" framerate can feel very odd when the original framerate is ~80 fps or lower. In the Insomniac Spider-Man games I use a controller, and I can hardly tell a difference in input delay when frame generation is on vs off; in general it just looks smoother without much issue (though there are issues once in a while).

Definitely a feature that doesn't make sense to use all the time, and it definitely should not be used when the original frame rate is low, though Nvidia seems to disagree. Thanks for the review, Tim! It was a great read.
 
Lots of reasons for angst. Nvidia making hilarious, tainted performance claims. MFG not living up to the hype. 5090 users spending $2,000 to be beta testers. The list goes on. All of this is true.

However….

AI is designed to improve. While this is a reason to skip the 5090, the technology itself is still exciting. In a year or two, you may get MFG to work without artifacts, in conjunction with an updated Reflex model to help fight latency. The "250 frames" farce may become a reality sooner than many think. The future is exciting; just look at how good DLSS 4 upscaling has become.
 
Not my comment

But FG is yuck for twitch games.
FG is unnecessary for beautiful, slow-moving open-world games - where 30 fps will look just fine, or 60 fps upscaled from 1440p, etc.

AI upscalers that know what the 4K assets should look like, and where attention to detail is needed, are the way to go.

FG is like that joke: I can be a good guy for 30 years, but get caught drunk french-kissing a cow once...
If it gets anything wrong, it's not OK - it's jarring and breaks immersion.
 
It's a shame this is marketed as a performance-enhancing tool rather than an image quality tool, since it would more accurately be called motion smoothing.

It's marketed as something that will let your RTX 5070 game like a 4090, which may be true in pure FPS numbers, but in actual playability it will leave a lot to be desired.

If it were targeted more toward people with 5090s trying to get their 120 FPS game up to 480 FPS on an ultra-high-refresh-rate monitor for that cleaner-feeling motion, then I think we'd see a lot less pushback against the tech, which is frankly neat but misleading to the average consumer.
 
So if frame generation is "blatantly misleading" tech by Ngreedia, why then, for all these two years, have you and other reviewers from different teams and countries kept showing us the 4000 series demolishing the 3000, 2000, and Radeon, with that separate "DLSS+FG" bar?
 