Nvidia says over 80% of GeForce RTX GPU owners use DLSS

I wouldn't use it if I didn't have to... give us proper cards and we won't use your fake frames and DLSS. I'm fine with native res. Your weak 2024-25 cards made me use your setting. I've had a 4K display for literally 10 years. It's time to give us some 4K-capable cards that don't BADLY need DLSS. Under 2000 dollars too, btw. 4K is old news; it's like the new 1080p. My phones and cameras have had 4K video recording for ages too.

Are we really that bad at making video cards? Is it that hard? Or do they just not deliver big improvements so they can save some for their next gen? Is it about $$? I don't even know. I know their profit margins are the biggest they've ever been, so that alone means we could have gotten cheaper cards, even if they aren't good enough for 4K 60 fps. I won't even comment on VRAM; that's like the quad-core CPUs. We were stuck with 4 cores for ages. Bottom line is, buying a video card is annoying, hard, and expensive, and it doesn't even make you happy once you've got it. The 1070 did, the 3080 did... 4080? No. 5080? No... even the 5070 is a no. After the 30 series, everything is a giant meh. Waiting years and years for a proper product sucks.

Phones keep pushing the edges, and it shows. Video cards with their 15% extra REAL performance... do not. A bit like Zen 5%. This whole last year plus this year has been horrible for new tech. Even the 9800X3D is 1000 euro here. A $470 CPU. My god. RIP. I'm out of here now :)

P.S. Nvidia still sucks; ignore that I also mentioned AMD.
 
In an age of subscriptions, it's probably not a good idea to have graphics fidelity dependent on software rather than hardware.

How long will it be before Nvidia is charging a monthly subscription for DLSS? The entry-level tier gets you gaming at 720p at 30 FPS, while Platinum Plus gets you 4K at 240 FPS.
 
This is an absolutely dumb remark by Nvidia. When games are so taxing, especially with fancy RT visuals (introduced by Nvidia), it is not like you have a choice not to enable DLSS. If they were saying that all games run at native 1440p or 4K at upwards of 120 FPS, then maybe this statement would be shocking. But in reality, even the RTX 4090 and the incoming 5090 likely run new AAA titles at horrible FPS if you remove all these software enhancements.
 
Here we go again. Am I missing something here? Isn’t DLSS pretty much highly recommended when you turn on ray tracing, especially with path tracing? Nobody’s forcing anyone to use it if not needed.

If native rendering works fine for you, then great, stick with it. But let’s be real, in a game like Starfield with no RT, only top-tier GPUs like the 7900XTX or 4090 can hit 60fps at 4K max settings. For anyone using lower-end GPUs, DLSS is needed for better performance.

Not all games are created equal, and I could go on forever debating optimizations, but at the end of the day, whether you turn DLSS on or off depends on how the game runs and what settings you're aiming for. With RT enabled, there is a performance hit, so DLSS is recommended.

On the flip side, if you’re already getting over 60fps or solid performance at native resolution, you probably don’t need it. Unless, of course, you’re chasing even higher frame rates, in which case, go for it. Nvidia Reflex is there to keep the lag minimal. That’s pretty much how I decide when to play with or without DLSS.
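To put that rule of thumb another way, here is a minimal sketch of the decision logic described above; the function name and the 60 fps target are my own illustration, not anything from Nvidia or a specific game:

```python
# Minimal sketch of the "when do I bother with DLSS" heuristic described above.
# The function name and the default 60 fps target are illustrative assumptions.

def should_enable_dlss(native_fps: float, rt_enabled: bool,
                       target_fps: float = 60.0) -> bool:
    """Return True if upscaling is probably worth turning on."""
    if rt_enabled:
        # RT (especially path tracing) carries a big performance hit,
        # so upscaling is almost always recommended alongside it.
        return True
    # At native res, only bother if you're falling short of your target.
    return native_fps < target_fps

# No RT, 45 fps at native 4K -> turn DLSS on.
print(should_enable_dlss(native_fps=45, rt_enabled=False))  # True
# Already holding 90 fps natively and not chasing higher rates -> leave it off.
print(should_enable_dlss(native_fps=90, rt_enabled=False))  # False
```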

DLSS 3 takes things even further. CP2077 RT Overdrive, no brainer to have DLSS 3 on. But then you have games like say, Still Wakes the Deep where frame generation or DLSS isn’t needed at all even though it is supported.

For games pushing visual fidelity to insane levels like the Zorah tech, DLSS 3 seems insufficient. That’s where DLSS 4 comes in to do some serious heavy lifting. With DLSS 4, CP2077 RT Overdrive can hit high triple-digit frame rates at 4K, which is something I thought we'd see in future-future GPUs.

Why use DLSS, which causes minor artifacts in a game and just lowers the res anyway, when a person can lower the res themselves without those minor artifacts?
From my quick test, 4K DLSS Quality had lower fps and used slightly more power than 1440p.

The extra frames from DLSS that you say are so important can be had by lowering the res yourself, and you can still add your fake frames on top of that with frame gen.

As I have said above, their display might be set to 4K, but since they used DLSS to hit that high number, they weren't gaming at a res of 4K. If I'm not mistaken, it was at a res lower than 1440p.
So no, you still haven't seen "high triple digit frame rates at 4K".
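For reference, you can work out roughly what the GPU actually renders from the commonly cited per-axis scale factors for each DLSS preset; a quick sketch, treating the exact factors as approximate since games and driver overrides can change them:

```python
# Commonly cited per-axis render-scale factors for DLSS presets.
# Treat these as approximate; games and driver overrides can change them.
DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(width, height, preset):
    """Approximate resolution rendered internally before upscaling to output."""
    s = DLSS_SCALE[preset]
    return round(width * s), round(height * s)

for preset in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, preset)
    print(f"4K output, DLSS {preset}: renders roughly {w}x{h}")
# Quality lands around 2560x1440; Performance and below are well under 1440p.
```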
 
For games pushing visual fidelity to insane levels like the Zorah tech, DLSS 3 seems insufficient. That’s where DLSS 4 comes in to do some serious heavy lifting. With DLSS 4, CP2077 RT Overdrive can hit high triple-digit frame rates at 4K, which is something I thought we'd see in future-future GPUs.
Correction: 2077 hits triple digits with DLSS 4 + frame gen. I doubt a 4090 could hit triple digits with RT Overdrive @ 480p.
 
Why use DLSS, which causes minor artifacts in a game and just lowers the res anyway, when a person can lower the res themselves without those minor artifacts?
From my quick test, 4K DLSS Quality had lower fps and used slightly more power than 1440p.

The extra frames from DLSS that you say are so important can be had by lowering the res yourself, and you can still add your fake frames on top of that with frame gen.

As I have said above, their display might be set to 4K, but since they used DLSS to hit that high number, they weren't gaming at a res of 4K. If I'm not mistaken, it was at a res lower than 1440p.
So no, you still haven't seen "high triple digit frame rates at 4K".
Because when you lower the resolution to a non-native resolution for your monitor, the upscaling is done by the monitor and can introduce LOTS of latency. Lots more latency than DLSS/FSR do. Also, the scalers in displays are all different. I'm a fan of upscaling tech because it allows lower latency on the display side, and it looks better than what's built into my monitor or TV.
 
Correction: 2077 hits triple digits with DLSS 4 + frame gen. I doubt a 4090 could hit triple digits with RT Overdrive @ 480p.
I'll give that a go now, actually, and see if it's possible :joy:

Edit: At 720p (480p doesn't seem to be an option) I'm getting over 100 fps the whole time, normally sitting around the 120 mark. I've recorded a ShadowPlay clip and a screenshot of the benchmark.

EDIT2: I could turn on DLSS to Balanced at 720p to semi-simulate 480p? Either way, it's definitely in the triple digits.
 
Because when you lower the resolution to a non-native resolution for your monitor, the upscaling is done by the monitor and can introduce LOTS of latency. Lots more latency than DLSS/FSR do. Also, the scalers in displays are all different. I'm a fan of upscaling tech because it allows lower latency on the display side, and it looks better than what's built into my monitor or TV.

Do you know of a way I can check these claims?
If I use Intel PresentMon with Display Latency showing, there isn't any difference between 4K native, 1440p native, and 4K DLSS Quality.
I checked on Cyberpunk and an indie game that also can't run 4K60 natively at max settings.
They each showed different latency from one another, but each kept the same latency across the different settings.
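If you want to compare runs with more than the on-screen overlay, PresentMon can log to CSV and you can average the relevant columns afterwards. A rough sketch, assuming the classic "MsBetweenPresents" and "MsUntilDisplayed" column names and made-up log file names; check the header row your PresentMon version actually writes:

```python
# Rough sketch for comparing PresentMon CSV logs from different runs.
# Column names are assumed from typical PresentMon output; check the header
# row of the CSV your version writes and adjust if they differ.
import csv
from statistics import mean

def summarize(path):
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    frame_ms = [float(r["MsBetweenPresents"]) for r in rows]
    display_ms = [float(r["MsUntilDisplayed"]) for r in rows]
    print(f"{path}: avg frame time {mean(frame_ms):.2f} ms "
          f"(~{1000 / mean(frame_ms):.0f} fps), "
          f"avg time until displayed {mean(display_ms):.2f} ms")

# Hypothetical file names for the three runs being compared.
for log in ("4k_native.csv", "1440p_native.csv", "4k_dlss_quality.csv"):
    summarize(log)
```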
 
I'll give that a go now, actually, and see if it's possible :joy:

Edit: At 720p (480p doesn't seem to be an option) I'm getting over 100 fps the whole time, normally sitting around the 120 mark. I've recorded a ShadowPlay clip and a screenshot of the benchmark.

EDIT2: I could turn on DLSS to Balanced at 720p to semi-simulate 480p? Either way, it's definitely in the triple digits.
I tried it as well.
720p with DLSS super extra performance and frame gen was sitting around the 200 to 230 mark in game around the city.
That was with the Overdrive settings.
 
Do you know of a way I can check these claims?
If I use Intel PresentMon with Display Latency showing, there isn't any difference between 4K native, 1440p native, and 4K DLSS Quality.
I checked on Cyberpunk and an indie game that also can't run 4K60 natively at max settings.
They each showed different latency from one another, but each kept the same latency across the different settings.
The only way I know of is using the expensive test equipment that reviewers use to measure input lag. I have seen some very bad displays where you didn't need to test for it; it was just really obvious, adding in what felt like 250-500 ms of input lag. This is becoming less of an issue on newer displays, but using the upscaler on your video card instead of your display is generally a good idea. This is why lots of "retro gamers" use a dedicated scaler instead of a typical analog-to-HDMI converter.
I'll give that a go now, actually, and see if it's possible :joy:

Edit: At 720p (480p doesn't seem to be an option) I'm getting over 100 fps the whole time, normally sitting around the 120 mark. I've recorded a ShadowPlay clip and a screenshot of the benchmark.

EDIT2: I could turn on DLSS to Balanced at 720p to semi-simulate 480p? Either way, it's definitely in the triple digits.
That's actually pretty interesting, but I doubt you bought a 4090 to play at 720p. I had just assumed that the processing cost of path tracing was so significant that it was fairly resolution-agnostic.
 
I have no problem with the idea of DLSS, and I'll use it when need be. But like many have said before me, it seems like developers are using DLSS as a crutch instead of optimizing their games. Back in the day they had to squeeze every last bit of performance from a game engine; now it seems they just say "screw it, close enough, upscaling will handle it".
 
The only way I know of is using the expensive test equipment that reviewers use to measure input lag. I have seen some very bad displays where you didn't need to test for it; it was just really obvious, adding in what felt like 250-500 ms of input lag. This is becoming less of an issue on newer displays, but using the upscaler on your video card instead of your display is generally a good idea. This is why lots of "retro gamers" use a dedicated scaler instead of a typical analog-to-HDMI converter.
Maybe my LG OLED TV isn't affected too badly by latency, I don't know.

When I play Cyberpunk at those different settings, the game feels the same to me.
The only difference for me is the extra ghosting while using DLSS (if I used it); it wouldn't be enough to ruin the gaming experience, but it is there.
 
I ran a test yesterday.
I ran 4K with DLSS Quality and then 1440p native. There was definitely less ghosting with 1440p native.
Some people are so stuck on the tech, for some reason, that lowering the res yourself is unheard of,
or the "gods" gave it and we must tell everyone to use it.
It's the same as people who say they can't tell the difference from 60Hz to 120Hz. There is no saving those gamers.
 
It's the same as people who say they can't tell the difference from 60Hz to 120Hz. There is no saving those gamers.
I can see the difference, but it isn't enough to warrant the extra power needed to run it.
Same as RT: there is a difference, but not enough to warrant the power usage, either.

Since I still enjoy games even at the poverty spec of 60Hz, I don't care if other people need a 750Hz monitor to enjoy the game. I fortunately don't.

I also don't care that defaultname365 wants to use DLSS with the fake frames in the games he plays; he can play how he wants.
What matters is that the person is trying to get people to use the tech in most of his posts, yet he doesn't realize he was watching a game being rendered at less than 4K.
I assume because his boss (Jensen) lies and tells him it's 4K high-refresh gaming.
 
Great video if you haven't already seen it. Covers all there is to the upcoming 5090.

Why use DLSS, which causes minor artifacts in a game and just lowers the res anyway, when a person can lower the res themselves without those minor artifacts?
From my quick test, 4K DLSS Quality had lower fps and used slightly more power than 1440p.

The extra frames from DLSS that you say are so important can be had by lowering the res yourself, and you can still add your fake frames on top of that with frame gen.

As I have said above, their display might be set to 4K, but since they used DLSS to hit that high number, they weren't gaming at a res of 4K. If I'm not mistaken, it was at a res lower than 1440p.
So no, you still haven't seen "high triple digit frame rates at 4K".

I literally covered all of this in my comment. Maybe try actually reading it (again).
 
How many of those 80% are just using default settings and might not even know they are using DLSS - and might not want to if asked?
Exactly. I have seen a few times in a new game install that FSR is turned on by default, and sometimes even FG is on. If I did not look, I would not know I was using fake res and not playing at native rendering.
With my 7900XTX card I have had no reason to even want to turn this on yet, so of course I turn these features off. But as you said, how many people play their games and do not even know they are using features such as FSR and DLSS because they are turned on by default? This would inflate the usage count by a lot. So I guess now, not only do we have to look out for fake frames and fake res rendering, we also have to look out for fake, inflated user-percentage numbers as well.
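A quick back-of-the-envelope sketch of how on-by-default settings could skew a figure like that 80%; every percentage here is made up purely for illustration:

```python
# Back-of-the-envelope sketch of how on-by-default upscaling could inflate a
# reported usage figure. All percentages are made-up illustration numbers,
# not anything Nvidia has published.

def reported_usage(deliberate_users, untouched_defaults, default_on):
    """Fraction of players who would show up as 'using DLSS' in telemetry."""
    # Players who opted in, plus (if the toggle ships enabled) everyone who
    # never opened the graphics settings at all.
    return deliberate_users + (untouched_defaults if default_on else 0.0)

# Say 35% deliberately enable it and 50% never touch graphics settings.
print(reported_usage(0.35, 0.50, default_on=False))  # 0.35 -> 35% reported
print(reported_usage(0.35, 0.50, default_on=True))   # 0.85 -> 85% reported
```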
 
Maybe my LG OLED TV isn't affected too badly by latency, I don't know.

When I play Cyberpunk at those different settings, the game feels the same to me.
The only difference for me is the extra ghosting while using DLSS (if I used it); it wouldn't be enough to ruin the gaming experience, but it is there.
Probably this. I have no reason to believe that LG would cheap out on the post-processing in their TVs. Considering the cost of the panel, the cost of a quality upscaler is probably negligible. I had a 1080p Vizio TV that was terrible and replaced it in 2018 with a 4K60 Samsung that was top of the line at the time. The panel got a one-pixel green line going down it in 2022, so I replaced that again with a 4K120 Samsung. Each was better than the last, but it's still far from perfect. I'm hoping not to replace it until there is an 8K120 display in the 100-120" range. I have had 3 "money is no object" builds in my 30 years of building computers. Once the display tech exists, I will build my fourth.

But anyway, I remember this all becoming an issue when LCD monitors started overtaking CRTs. I think people forget how bad LCDs really were up until about 2010-2012. I find, in my case, that using FSR gives me better performance and fidelity than using my TV's upscaler. I'm using what I think is a VA-based QLED, and I think the response times on an OLED would help considerably, especially if the scaler was built with the performance advantages of an OLED in mind.
 
Exactly. I have seen a few times in a new game install that FSR is turned on by default, and sometimes even FG is on. If I did not look, I would not know I was using fake res and not playing at native rendering.
With my 7900XTX card I have had no reason to even want to turn this on yet, so of course I turn these features off. But as you said, how many people play their games and do not even know they are using features such as FSR and DLSS because they are turned on by default? This would inflate the usage count by a lot. So I guess now, not only do we have to look out for fake frames and fake res rendering, we also have to look out for fake, inflated user-percentage numbers as well.

Brilliant! That speaks to how seamlessly DLSS integrates into games without negatively affecting the experience. If it were FSR, though... oh dear, we'd probably be hearing complaints about the noticeably worse image quality. Gotta love the mental gymnastics to justify why Nvidia is doing so well and is so widely embraced.

What's next? The 50-series selling well because gamers added it to their cart and purchased it accidentally?
 
The only people who should have complaints about DLSS, XeSS, and FSR are competitive gamers. You don't need sub-16 ms latency (roughly one frame at 60 Hz) playing a single-player game. You may want it, but it's not a "need."
 
So you link a video from a company that lies in 32 seconds.
I am merely sharing what they have posted. You can interpret it however you want.

Also, you can whine, moan, complain, and scream into the stars all you want; it doesn't change anything. I'm sharing what they're marketing; anyone can do that. Take it with a grain of salt, or in your case, quite a bit more.
 
I am merely sharing what they have posted. You can interpret it however you want.

Also, you can whine, moan, complain, and scream into the stars all you want; it doesn't change anything. I'm sharing what they're marketing; anyone can do that. Take it with a grain of salt, or in your case, quite a bit more.

You are the one who is also pushing their lies, since they aren't doing 4K gaming (which you have shown you don't understand).
 