Nvidia DLSS 3 Revisit: We Try It Out in 9 Games

No, it is not acceptable for such an expensive GPU to need these tricks to deliver the expected performance.

It's not that these expensive GPUs aren't delivering the expected performance based on their specs; it's the bullcrap of Nvidia using DLSS 3 to pad the numbers these GPUs can put up and make them look better than they really are.

We've all seen those graphs Nvidia has pushed out claiming upwards of 3x the performance of a 3090 Ti (while the tiny print at the bottom tells you the Ada cards are using DLSS 3 and the Ampere cards are running nothing but basic rasterization). Marketing gimmicks at their finest.

DLSS 3 is a waste, if you ask me, but I'm certain there are hardcore Nvidia fans who will defend it with their lives. Much like how people go "RT!!! OMG! RT! Nvidia does it so much better than AMD." The problem is, I don't know who to feel worse for: AMD for trailing Nvidia in RT performance, or Nvidia for having dedicated RT cores and still struggling with it.
 
No, it is not acceptable for such an expensive GPU to need these tricks to deliver the expected performance.
You realize rasterization itself is a series of clever tricks? Heck, so is ray tracing: it isn't doing a million rays per pixel as that would be stupid!
Engineering is all about clever tricks.
 
The DLSS pushers are still hard at work.

Still not interested.

dr-evil-how-about-no.gif
 
You realize rasterization itself is a series of clever tricks? Heck, so is ray tracing: it isn't doing a million rays per pixel as that would be stupid!
Engineering is all about clever tricks.

In fact, our entire world is a set of clever tricks. Electrons aren't really circling their atoms the way they would in a "real" world; instead, their trajectory is replaced by a random number generator with a realistic spatial distribution. The same goes for most things at the lowest resolution level of our world. We call them "quantum effects" and "wave-particle duality," but it's nothing more than a very clever cheat to mask the loss of resolution: replacing complex calculations with RNG and approximations to save on compute resources. So it's no wonder game devs are doing the same.
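That "a few random samples instead of exhaustive computation" point is easy to show in code. A minimal sketch, not from the article, with a made-up expensive_shading function standing in for real shading work: averaging a handful of random samples gets you close to the brute-force answer at a tiny fraction of the cost, which is essentially what path tracers do with a few rays per pixel.

```python
import math
import random

def expensive_shading(x: float, y: float) -> float:
    # Stand-in for some expensive per-sample computation (hypothetical).
    return 0.5 + 0.5 * math.sin(40 * x) * math.cos(40 * y)

def pixel_exact(grid: int = 1000) -> float:
    # "Brute force": evaluate the function on a dense grid over the pixel footprint.
    total = sum(expensive_shading(i / grid, j / grid)
                for i in range(grid) for j in range(grid))
    return total / grid ** 2          # 1,000,000 evaluations

def pixel_monte_carlo(samples: int = 16) -> float:
    # The clever trick: a few random samples approximate the same average.
    # The result is noisy, but close -- in real renderers a denoiser cleans up the rest.
    total = sum(expensive_shading(random.random(), random.random())
                for _ in range(samples))
    return total / samples            # 16 evaluations

print(f"dense grid: {pixel_exact():.3f}   16 random samples: {pixel_monte_carlo():.3f}")
```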
 
“while we'd ideally like to test the tech on a mid-range option as that's where we feel it could be most useful, there are none of those yet on the market”

Yet you used a 4080 instead of a 4070 Ti? 🤔
 
FSR and DLSS are great for certain situations, but not needing them is even better. I don't want upscaling to become the standard, because it does look worse. Maybe once it becomes unnoticeable, but that's not happening for a while.
 
I don't know how you can call these techniques nice at all. I get awful flickering in sun glare in Cyberpunk 2077 even with DLSS 2 (3080 Ti), and overall picture quality is a bit worse than native on a 34" 1440p monitor; I don't know how it looks at native 4K. In the fine details it's about the same, but taken as a whole, native looks "crispier."

P.S. I guess you can call DLSS useful in games that already have poor rendering methods and engines, something like Dying Light 2 or numerous UE4 projects.
 
It's not that these expensive GPUs aren't delivering the expected performance based on their specs; it's the bullcrap of Nvidia using DLSS 3 to pad the numbers these GPUs can put up and make them look better than they really are.

We've all seen those graphs Nvidia has pushed out claiming upwards of 3x the performance of a 3090 Ti (while the tiny print at the bottom tells you the Ada cards are using DLSS 3 and the Ampere cards are running nothing but basic rasterization). Marketing gimmicks at their finest.

DLSS 3 is a waste, if you ask me, but I'm certain there are hardcore Nvidia fans who will defend it with their lives. Much like how people go "RT!!! OMG! RT! Nvidia does it so much better than AMD." The problem is, I don't know who to feel worse for: AMD for trailing Nvidia in RT performance, or Nvidia for having dedicated RT cores and still struggling with it.
I haven't seen too many people defending frame generation. Just an FYI, DLSS 3 includes 1) frame generation, 2) upscaling and 3) Reflex. I would say 2 out of 3 of those features are useful. Do gamers need frame generation? No, but Nvidia will use it anyway to market the cards and gain or maintain mindshare. From trying frame generation via DLSS 3 in Cyberpunk, Hogwarts Legacy, Darktide and A Plague Tale: Requiem, even with Reflex enabled it still feels laggy as f*** regardless of what the frames-per-second counter is reading. The only exception was Hogwarts Legacy, where it allows Reflex to be set to boost mode with RT on at native resolution with DLAA, but it's far from perfect. I believe Nvidia did itself a disservice by grouping everything together under DLSS 3, because most everyone can agree frame generation is useless, while some find the Reflex and upscaling features more useful in certain titles when set to 4K DLSS Quality with Reflex in boost mode.

Update, in case anyone was interested: RT and DLSS in Darktide are broken and the game is super unstable with them on. Without them I haven't experienced any crashes since.
 
DLSS 3 is still only viable if you've got a decent framerate to begin with. In a single-player game, I'm OK with 60 FPS and VRR to prevent tearing. Nvidia is still using smoke and mirrors to cover the fact that RT is not ready for prime time on the 3rd generation of RTX cards.
 
No, it is not acceptable for such an expensive GPU to need these tricks to deliver the expected performance.

I have no problem with DLSS 2, but having to resort to fake frames is a joke. You can see that going forward Nvidia is going to care less and less about pure raster and double down on AI trickery to market BS framerates. A $1,200 card needing this is, as you say, a joke.
 
Seems like the diehard DLSS fanboys are finally, slowly, giving up on "an upscaled image can look better than native".
DLSS 2.x Quality at 4K is still looking better than most TAA implementations, thanks for asking. Perhaps it's being said less because more and more people accept it as true.

For such a gimmick, it's funny that the rest of the industry is so intent on following.
 
DLSS 3 is still only viable if you've got a decent framerate to begin with. In a single-player game, I'm OK with 60 FPS and VRR to prevent tearing. Nvidia is still using smoke and mirrors to cover the fact that RT is not ready for prime time on the 3rd generation of RTX cards.
And that's the catch-22... the best benefit comes when the FPS is already high enough that it's not needed.
It's the latest way they are trying to cheat on benchmarks for marketing purposes: add more frames regardless of quality.
 
I'm not sure I like fake frames; maybe for demanding single-player games, definitely not for multiplayer ones.

However, it's a smart feature for people using outdated CPUs, because the CPU bottleneck effectively gets removed.

Looking forward to "FSR3" :)
 
DLSS 2.x Quality at 4K is still looking better than most TAA implementations, thanks for asking. Perhaps it's being said less because more and more people accept it as true.

For such a gimmick, it's funny that the rest of the industry is so intent on following.
So the edges are less jagged, great, but everything else is blurry and it introduces artifacts and clipping. Thanks for asking.

Developers are adopting it because they can be lazy and not optimize their games.
 
For example, in Hogwarts Legacy we used the tried and true method of replacing the DLL file with a more recent version, updating it from DLSS 3 v1.0.3 to v1.0.7. In this title the UI issues were largely resolved after applying this manual update...

Well, my experience with HL is that it has plenty of graphical anomalies. Nothing that makes the game unplayable, but I do see flickering objects and other oddities. And that's without DLSS 3 (running on a 2070 Super GPU).

As for the DLSS bashing here, I guess people have forgotten what it was like when we got our first graphics accelerators back in the days of Orchid and the 3dfx Voodoo. They all had teething pains, but they led us to a better gaming experience.

Will DLSS be the de facto standard for graphics acceleration? Maybe, maybe not, but hopefully it and FSR will bring us a better gaming experience once the technology matures. As games become more complex, graphically and computationally, it's going to take a lot more hardware to keep up. That presents a couple of problems: a big one is power, and the second is cost. You're not going to get 150+ FPS out of a $500 GPU. So something has to address that, and the cheapest way to do so is via software.
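As an aside, for anyone curious about the DLL swap described in the article excerpt quoted at the top of this post, it amounts to backing up one file in the game folder and overwriting it with a newer copy. A rough sketch only: the DLL name (nvngx_dlssg.dll) and the paths below are assumptions about a typical install, not taken from the article, so check your own game folder first.

```python
import shutil
from pathlib import Path

# Placeholder paths -- point these at your own install and downloaded DLL.
GAME_DIR = Path(r"C:\Games\Hogwarts Legacy\Phoenix\Binaries\Win64")
NEW_DLL = Path(r"C:\Downloads\nvngx_dlssg.dll")  # the newer version, e.g. 1.0.7

target = GAME_DIR / "nvngx_dlssg.dll"            # assumed frame generation DLL name
backup = target.with_name(target.name + ".bak")

if not backup.exists():
    shutil.copy2(target, backup)   # keep the original so the swap is reversible
shutil.copy2(NEW_DLL, target)      # drop in the newer DLL; the game picks it up on next launch
print(f"Replaced {target.name}; original backed up as {backup.name}")
```

To undo the swap, copy the .bak file back over the replaced DLL (game updates may also overwrite it).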
 