A Review of Nvidia's DLSS 4 Multi Frame Generation

Thank you for the informative review...
As someone with a 3060 Ti on a 5120x1440, 144 Hz monitor, I doubt I'll be using this particular tech any time soon in my games. It's simply not needed!
However, I am keen to learn what Team Red has coming soon in the way of raw raster performance...
 
I've been saying this since its inception, and often here:
Upscaling is a compromise/crutch for those with older video cards, so they can dump their 8-year-old 1080p monitors and jump to a new 1440p/4K (or ultrawide) OLED gaming monitor... WITHOUT having to buy a new GPU.
A lot of modern games have upped their graphics quality to account for upscaling. I most certainly do use upscaling, in pretty much everything, at varying quality settings. DLSS is pretty good (especially the new model); I don't notice much difference (at 4K), and I'd rather play at 100+ fps than 40-60 (and I have a 4090).

FG is a whole other story. I do use it but in very limited circumstances.
 
Hey... looks like I'm niche now :p.
From what I can read in this article, MFG is best on the 5090, which can reach 80+ fps at 4K. That means MFG on anything below the 5090 will feel sluggish... which is ironic, as the 5070 was supposed to be the new 4090 - which tests have shown it's not.
The 4090 is probably the best card on the market right now, if you can find someone selling it cheap.
 
This frame generation is the reason Nvidia is one of the most valuable companies in the world! It is by far the most innovative piece of marketing in the last decade. Just look at the amount of technical writing required to explain, debunk, and analyze it, and at how much trouble it has caused its main competitor, who even held off launching new products at the last minute. Every company should learn from Nvidia. This is how you make money, lots of money, without actually delivering a practical performance gain on a product whose value depends heavily on actual performance.
 
The latency with frame generation should be greater than stated, because it works by interpolation: the GPU has to hold back one real frame before it can interpolate toward it, so FG adds at least one real-frame time. At 60 fps that is about 16.7 ms plus the time it takes to generate the frames. But as shown, the real frame rate can drop by 25% because of frame generation, down to 45 fps, and then the added latency is about 22 ms plus the time it takes to generate the frames. So we may be looking at +30 ms added to the real latency.
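To sanity-check that arithmetic, here is a minimal Python sketch of the one-frame-buffer model; the ~25% real-frame-rate cost is the estimate above, not a measured constant:

```python
# Minimal sketch of the one-frame-buffer latency model described above.
# Assumption: interpolation must hold back one real frame, and enabling
# FG costs roughly 25% of the real frame rate (the estimate from this post).

def fg_added_latency_ms(base_fps: float, fg_cost: float = 0.25) -> float:
    """Extra latency from buffering one real frame, in milliseconds."""
    real_fps = base_fps * (1.0 - fg_cost)  # real frame rate once FG is on
    return 1000.0 / real_fps               # one real-frame time of delay

for fps in (35, 60, 80):
    ms = fg_added_latency_ms(fps)
    print(f"{fps} fps base -> ~{ms:.0f} ms added, plus frame-generation time")
```

That gives roughly 38 ms at a 35 fps base, 22 ms at 60 fps, and 17 ms at 80 fps, before generation time is counted.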
TechPowerUp measured the latency using the 5080 in Cyberpunk 2077 with RT. For the given settings:
FG off: 35 fps, 55 ms
FG x2: 64 fps, 71 ms
FG x4: 120 fps, 74 ms
They also observed an increase in VRAM usage of up to 1-1.5 GB.
When using upscaling and no FG, the latency at 66 fps was 34 ms, compared to 71 ms at 64 fps with FG.
 
I don't really understand the negativity towards Nvidia for DLSS 4. It's a free feature that you don't have to use. But it looks like a solid improvement, and they have made it accessible to all their cards. I think this is a win for everyone except Intel & AMD tbh...
 
The article that dissected the RTX 5090 had already dashed our hopes by concluding it is more of an RTX 4090 Ti than a real step forward in raw performance, and now this analysis of DLSS 4 takes the cake!

Excellent article again; it makes it easy to see through Nvidia's marketing move of daring to compare DLSS 4's performance against the DLSS of 2020.

Too few people will really be able to take advantage of it, because not everyone has the card to run a single-player game on a 240 Hz or 360 Hz screen (the refresh rates needed to enjoy DLSS 4 without too much visual damage).

Apart, perhaps, from those who will be happy just to own the latest RTX 5xxx and who will accept artifacts and pitiful latency on their 144 Hz or 165 Hz screens.

Once again Nvidia muddies the waters for the consumer with a so-called revolutionary technology that is nothing of the sort.
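For what it's worth, the refresh-rate point is simple multiplication: MFG's output rate is the real frame rate times the MFG factor, and the monitor needs at least that refresh rate to display every frame. A minimal sketch, taking the 60-90 fps comfortable-base range as this thread's assumption rather than an official figure:

```python
# Output frame rate under multi frame generation is simply the real
# frame rate multiplied by the MFG factor; the monitor must refresh at
# least that fast to display every generated frame.

def mfg_output_fps(real_fps: float, factor: int) -> float:
    return real_fps * factor

for base in (60, 90):          # assumed "comfortable" real frame rates
    for factor in (2, 3, 4):   # FG x2, x3, x4
        print(f"{base} fps real, x{factor} -> {mfg_output_fps(base, factor):.0f} fps output")
```

At x4, a 60-90 fps base lands exactly in 240-360 Hz territory, which is the point above.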
 
Nvidia wants to sell it as a 15-to-60 fps technology… we should know better.

AMD should just market "real frames" technology, as if it were a new invention.

If AMD doesn't blunder with its marketing, it should sell the 9070 on real frames, i.e. raster performance, while of course showcasing the improvements in its ray tracing and FSR4 features. Hopefully they're paying attention to gamers' reactions to the 50-series reviews.
 
TechPowerUp measured the latency using the 5080 in Cyberpunk 2077 with RT. For the given settings:
FG off: 35 fps, 55 ms
FG x2: 64 fps, 71 ms
FG x4: 120 fps, 74 ms
They also observed an increase in VRAM usage of up to 1-1.5 GB.
When using upscaling and no FG, the latency at 66 fps was 34 ms, compared to 71 ms at 64 fps with FG.
FG does not perform miracles: if you max everything out and only get 35 fps, it will feel bad with FG as well. If you configure the settings to give you 60-80 fps first and then enable FG, you will get better latency.
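The arithmetic behind this advice follows from the one-frame-buffer model sketched earlier in the thread: the added delay is one real-frame time, so it shrinks as the base frame rate rises. A minimal sketch, again assuming FG costs roughly 25% of the real frame rate:

```python
# Same one-frame-buffer model as earlier in the thread: the added delay
# is one real-frame time, so it shrinks as the base frame rate rises.
# Assumption: enabling FG costs ~25% of the real frame rate.

def fg_added_latency_ms(base_fps: float, fg_cost: float = 0.25) -> float:
    real_fps = base_fps * (1.0 - fg_cost)
    return 1000.0 / real_fps

print(f"35 fps base -> ~{fg_added_latency_ms(35):.0f} ms extra")  # ~38 ms
print(f"70 fps base -> ~{fg_added_latency_ms(70):.0f} ms extra")  # ~19 ms
```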
 
Hey... looks like I'm niche now :p.
From what I can read in this article, MFG is best on the 5090, which can reach 80+ fps at 4K. That means MFG on anything below the 5090 will feel sluggish... which is ironic, as the 5070 was supposed to be the new 4090 - which tests have shown it's not.
The 4090 is probably the best card on the market right now, if you can find someone selling it cheap.
If you buy a cheaper card like a 5070, you probably don't have a 4K gaming monitor.
 