Nvidia DLSS client libraries allow users to manually update DLSS safely

jsilva

TL;DR: Nvidia's DLSS has shipped in multiple versions, ranging from 1.0.0 to 2.2.10. Game developers can update the version a game uses, but that tends to take some time. However, DLSS client libraries have become available so users can update DLSS themselves, though only within the same major version (e.g. DLSS 2.1 can be upgraded to 2.2, but not 1.x to 2.x).

It seems that each new wave of games receiving DLSS brings a new version of Nvidia's upscaling technology. For example, Doom Eternal features DLSS 2.1.66, Rainbow Six Siege has version 2.2.6, and Unreal Engine 5 comes with DLSS 2.2.9, but all of them should be upgradeable to the latest DLSS version... presumably the 2.2.10 found in Rust.

Users can wait for the developer to upgrade DLSS in an upcoming game patch, but a manual upgrade is also possible. Users discovered this by replacing the DLLs corresponding to the DLSS libraries in a game's installation folder, potentially enabling newer features and image quality or performance improvements. To swap the files, you'll need to acquire the "nvngx_dlss.dll" for the version you want and replace the one in the game's installation folder. A quick Google search will lead to these .dll files, but it may be hard to pinpoint which ones are safe to download.
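In practice the swap is just a file copy, though it's wise to back up the original first. Here is a minimal Python sketch of that step; the paths are hypothetical and you'd adjust them to your own game and download folders (close the game before swapping, and only use DLLs verified against a trusted source):

# Minimal sketch: back up the game's original DLSS DLL, then drop in a newer one.
# Paths are hypothetical -- adjust to your own game and download locations.
import shutil
from pathlib import Path

game_dll = Path(r"C:\Games\SomeGame\nvngx_dlss.dll")        # hypothetical game path
new_dll = Path(r"C:\Downloads\dlss_2.2.10\nvngx_dlss.dll")  # hypothetical download

backup = game_dll.with_name(game_dll.name + ".bak")
if not backup.exists():
    shutil.copy2(game_dll, backup)  # keep the original so you can roll back
shutil.copy2(new_dll, game_dll)     # replace with the newer version
print(f"Replaced {game_dll} (backup at {backup})")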

To help with this task, TechPowerUp created a library of DLSS files so users can download the version they want with peace of mind. We trust TPU, which says all files have been "hand-verified to be the unmodified originals."

The library currently contains 23 files, of which 18 are DLSS 2.0 files and the other 5 are DLSS 1.0 files. The files range from DLSS 1.0.0 to 2.2.10. TechPowerUp expects to grow the library as new versions are launched.

Do note that DLSS 1.0 games can't be upgraded to DLSS 2.0 simply by replacing the file. Only the game developer can do a complete major-version upgrade, similar to what Remedy Entertainment did with Control.
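If you want to sanity-check that constraint before swapping, one option is to read the version stamped into each DLL and compare the major numbers. A sketch assuming Windows with the pywin32 package installed; the paths are again hypothetical:

# Sketch: read each DLL's embedded file version and refuse a cross-major swap.
# Assumes the pywin32 package (win32api) is installed.
import win32api

def dll_version(path: str) -> tuple[int, int, int, int]:
    info = win32api.GetFileVersionInfo(path, "\\")
    ms, ls = info["FileVersionMS"], info["FileVersionLS"]
    return (ms >> 16, ms & 0xFFFF, ls >> 16, ls & 0xFFFF)

old = dll_version(r"C:\Games\SomeGame\nvngx_dlss.dll")  # hypothetical path
new = dll_version(r"C:\Downloads\nvngx_dlss.dll")       # hypothetical path
if old[0] != new[0]:
    raise SystemExit(f"Major version mismatch {old} -> {new}: "
                     "a 1.x game can't simply be swapped to 2.x")
print(f"OK to swap within major version: {old} -> {new}")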

In theory, each new version should match or beat the previous one in performance and image quality, but some user reports suggest otherwise. Results vary depending on the game and DLSS version, so if you're into tweaking files and optimizations, you may have to try several before you find the one best suited to your game.
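Since that trial-and-error means swapping back and forth, the backup made earlier is your escape hatch; restoring it is one more copy. A rollback sketch under the same hypothetical paths:

# Sketch: restore the original DLL if a swapped version performs worse.
# Relies on the .bak backup created in the earlier snippet (hypothetical path).
import shutil
from pathlib import Path

game_dll = Path(r"C:\Games\SomeGame\nvngx_dlss.dll")
backup = game_dll.with_name(game_dll.name + ".bak")

if backup.exists():
    shutil.copy2(backup, game_dll)
    print(f"Restored original from {backup}")
else:
    print("No backup found; nothing to restore.")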


 
"but some users' reports suggest otherwise"... probably that is why it takes time for the developers. Since it is not a single file swap and there you go, but it could need extensive QA efforts and finetuning in the game engine. I'll just wait for devs to do the work properly.
I wonder though when the assassin's creed games will have DLSS. Odyssey and Valhalla are the ones that need some extra fps due to lack of optimization and the use of an old engine.
 
“DLSS”

[image: 5fov7b.jpg]
 
I specifically went with nvidia because of this feature. I reckon I won't have to upgrade my GPU for longer because of it. If you cannot tell the difference in image quality, it's like you bought a more expensive GPU... At the moment I limit COD to 80fps and set DLSS to quality mode - then the GPU runs nice and cool while I have enough frames for my 60hz monitor. I hope MS Flight Simulator will support this feature too.
 

At the moment I limit COD to 80fps and set DLSS to quality mode - then the GPU runs nice and cool while I have enough frames for my 60hz monitor. I hope MS Flight Simulator will support this feature too.
Why 80 fps when you have a 60hz screen? Makes no sense at all...
 
Why 80 fps when you have a 60hz screen? Makes no sense at all...

You see more up-to-date frames, even though it's still only 60 frames a second. This was raised in the Linus Tech Tips videos I just posted in a reply on this page.
 
I think it was on Linus Tech Tips where they proved this helps lower lag (which you would want in first-person shooters).
How much lower can it be from 60 to 80 fps? Is it really worth having screen tearing for that little improvement in latency?

60/80 fps is still at a major deficit vs 144/165/240 fps/Hz. So what gives?

Excuse me if I don't actually see the benefit of running 80 fps on a 60hz screen.
 
How much lower can it be from 60 to 80 fps? Is it really worth having screen tearing for that little improvement in latency?

60/80 fps is still at a major deficit vs 144/165/240 fps/Hz. So what gives?

Excuse me if I don't actually see the benefit of running 80 fps on a 60hz screen.
"if your PC is capable of pushing more than 60 frames, you might see some improvements to input lag and response time because the frames are more-up-to-date. Linus tech tips talks about this in" this video. But yes - your monitor still only displays 60... but if first person shooters is what you are playing, you might benefit a bit.

 
"if your PC is capable of pushing more than 60 frames, you might see some improvements to input lag and response time because the frames are more-up-to-date. Linus tech tips talks about this in" this video. But yes - your monitor still only displays 60... but if first person shooters is what you are playing, you might benefit a bit.

I understand that, but for a "bit" of better latency, you also get screen tearing as a negative side effect.

So for me that is not worth it at all.

If one wants that better latency, one should just get a better monitor. The 144hz ones are very cheap now, and if those are still a problem financially, well, there are 75hz monitors too, which is as close to playing at 80fps as possible without the negatives.
 
I specifically went with nvidia because of this feature. I reckon I won't have to upgrade my GPU for longer because of it. If you cannot tell the difference in image quality, it's like you bought a more expensive GPU... At the moment I limit COD to 80fps and set DLSS to quality mode - then the GPU runs nice and cool while I have enough frames for my 60hz monitor. I hope MS Flight Simulator will support this feature too.
Not meaning to insult you, but I don't fall for marketing hype and I am actually hostile to anything that will place limits on me.

I keep reading all these posts about wanting an nvidia card because of RT and DLSS, but you might be the only one that included even one game next to those words.

The truth is, RT so far is a gimmick, and DLSS is barely present in today's games and definitely nowhere near the vast majority of older games.

So think about how stupid it sounds to pay so much money for a feature that's barely there.

Then, as I mentioned, I am hostile towards actions that limit my options.
For example, I love the Arkham series, but if I want all the bells and whistles, I am forced to have an nvidia card. I refuse to be placed in that spot again.

That's why I support FSR, Vulkan and anything else that is an open standard.

In the end, all that won't matter to you and many others because "mah RT and DLSS rulez yo!" And the message will fall on deaf ears.

Edit: also, LTT is deep in nvidia's and intel's pockets and hasn't provided a fair review in ages, so do yourself a favor and get your info from more respectable sources.
 
Lower input lag
So you ignored all the posts above and gave me the same response as them? Amazing.

Since you're that ignorant, I'll repeat myself: the difference in input lag between 60 fps and 80 fps is not that much, and not worth the screen tearing it brings with it as a downside.
Just buy a 75hz monitor (if money is an issue) or a 144hz one, and then you can benefit from better input latency. 80fps on a 60hz display is just plain stupid.
Not meaning to insult you, but I don't fall for marketing hype and I am actually hostile to anything that will place limits on me.

I keep reading all these posts about wanting an nvidia card because of RT and DLSS, but you might be the only one that included even one game next to those words.

The truth is, RT so far is a gimmick, and DLSS is barely present in today's games and definitely nowhere near the vast majority of older games.

So think about how stupid it sounds to pay so much money for a feature that's barely there.

Then, as I mentioned, I am hostile towards actions that limit my options.
For example, I love the Arkham series, but if I want all the bells and whistles, I am forced to have an nvidia card. I refuse to be placed in that spot again.

That's why I support FSR, Vulkan and anything else that is an open standard.

In the end, all that won't matter to you and many others because "mah RT and DLSS rulez yo!" And the message will fall on deaf ears.

Edit: also, LTT is deep in nvidia's and intel's pockets and hasn't provided a fair review in ages, so do yourself a favor and get your info from more respectable sources.
Don't worry friend, there are still some of us who are not brainwashed by nvidia's lies and manipulation.

Also, FSR will see much, much better adoption than DLSS and will soon surpass it in the number of games it's available in.
Then even nvidia's smug elitists will use FSR with their RTX GPUs in those games that don't have DLSS.

A year from now I expect FSR to be in at least as many games as DLSS, if not more.
 
FSR with their RTX GPUs
That's assuming the d!cks at nvidia don't block it because of some "insert bs lie here" from their marketing dept, and of course the drones will follow blindly.

Hopefully the industry will do as you predicted and we will be in a better place.
 
So you ignored all the posts above and gave me the same response as them? Amazing.

Since you're that ignorant, I'll repeat myself: the difference in input lag between 60 fps and 80 fps is not that much, and not worth the screen tearing it brings with it as a downside.
Just buy a 75hz monitor (if money is an issue) or a 144hz one, and then you can benefit from better input latency. 80fps on a 60hz display is just plain stupid.

Don't worry friend, there are still some of us who are not brainwashed by nvidia's lies and manipulation.

Also, FSR will see much, much better adoption than DLSS and will soon surpass it in the number of games it's available in.
Then even nvidia's smug elitists will use FSR with their RTX GPUs in those games that don't have DLSS.

A year from now I expect FSR to be in at least as many games as DLSS, if not more.
You already have tearing even at 60hz and 60 fps, unless you use one of the sync technologies.
It is so funny how kids are complaining about tearing and everything else nowadays.
We used to play csgo on 60hz monitors, with horrible tearing and we never cried about it. But having more than 60fps on those monitors meant a huge difference, at least in csgo.
 
You already have tearing even at 60hz and 60 fps, unless you use one of the sync technologies.
It is so funny how kids are complaining about tearing and everything else nowadays.
We used to play csgo on 60hz monitors, with horrible tearing and we never cried about it. But having more than 60fps on those monitors meant a huge difference, at least in csgo.
I'm close to 40, but thanks for making me a kid. I will always be one at heart, at least as long as I play games, that is.

Of course I use freesync even at 60hz. A clean image is much more important than a small insignificant decrease in latency from 60fps to 80fps.

No competitive gamer plays at 80fps, no one that is serious about that. So this entire point is as moot as it can be.

You're no more competitive (to a degree that actually makes a real difference) at 80fps than you are at 60fps. Both situations are severely gimped in latency vs 144hz and more.

I cannot stress enough how aberrant and stupid this point of 80fps on a 60hz screen is. That being said, I have no desire to waste any more time on this topic, so let's agree to disagree or keep talking to yourself; I won't reply again.
 
I'm close to 40, but thanks for making me a kid. I will always be one at heart, at least as long as I play games, that is.

Of course I use freesync even at 60hz. A clean image is much more important than a small insignificant decrease in latency from 60fps to 80fps.

No competitive gamer plays at 80fps, no one that is serious about that. So this entire point is as moot as it can be.

You're no more competitive (to a degree that actually makes a real difference) at 80fps than you are at 60fps. Both situations are severely gimped in latency vs 144hz and more.

I cannot stress enough how aberrant and stupid this point of 80fps on a 60hz screen is. That being said, I have no desire to waste any more time on this topic, so let's agree to disagree or keep talking to yourself; I won't reply again.
ignorant, aberrant, stupid... just some of the words you use at the age of 40 about a comment on 80 fps on 60Hz....
You are way too harsh for the topic, don't you think? This is a real first world problem. :)
We are living in a great world as long as people like you freak out this much about such a meaningless comment. No need to reply. I am considering this exchange of thoughts finished as well.
Sigh....
 
ignorant, aberrant, stupid... just some of the words you use at the age of 40 about a comment on 80 fps on 60Hz....
You are way too harsh for the topic, don't you think? This is a real first world problem. :)
We are living in a great world as long as people like you freak out this much about such a meaningless comment. No need to reply. I am considering this exchange of thoughts finished as well.
Sigh....
You're right, I'm very abrasive when I have to repeat myself so many times or when the discussion drags into redundancy.
Yet despite my harsh words, the message itself is not invalidated by them.
 
I specifically went with nvidia because of this feature. I reckon I won't have to upgrade my GPU for longer because of it. If you cannot tell the difference in image quality, it's like you bought a more expensive GPU... At the moment I limit COD to 80fps and set DLSS to quality mode - then the GPU runs nice and cool while I have enough frames for my 60hz monitor. I hope MS Flight Simulator will support this feature too.
I have an RTX card, but I see no game worth playing with DLSS, and most of the time games get it long after launch (as a result, a lot of so-called reviewers still use Control).
I still believe that nVidia wasted silicon on this; they could have used it for CUDA cores and gotten a bump in performance in every game.
DLSS and FSR (at least for now, as FSR seems easier to implement) are useless as long as they require per-game implementation; this kind of upscaling should cover something like all DX12 games to be interesting.
 