Nvidia acknowledges DLSS shortcomings and is working to fix them

Greg S

Big quote: "The current experience at these resolutions is not where we want them." Nvidia is working hard to improve DLSS, especially at lower resolutions, after gamers shared blurry screenshots rendered by the deep learning neural networks.

Nvidia has acknowledged that its Deep Learning Super Sampling feature, introduced with the RTX series of graphics cards, is not yet perfected. In a Q&A post by Andrew Edelsten, Nvidia's technical director of deep learning, the company shared that DLSS at resolutions below 4K will receive significant attention.

Gamers taking advantage of the latest Battlefield V update that included DLSS support have reported seeing blurry frames on occasion. In response, Nvidia has said, "We have seen the screenshots and are listening to the community’s feedback about DLSS at lower resolutions, and are focusing on it as a top priority." Additional training of neural networks will help bring about higher quality visuals, but training the networks for 1080p gaming is going to take a little longer. As a side note, TechSpot's full take on Battlefield's DLSS update will go live tomorrow morning.
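
For a sense of what that training involves, here is a heavily simplified sketch of a super-resolution training step in PyTorch. The toy model, random stand-in data, and L1 loss are assumptions for illustration only; Nvidia has not published the actual DLSS network or training pipeline in this detail.

import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    # Toy 2x upscaler: two convolutions followed by a pixel shuffle.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * 4, 3, padding=1),  # 4 channels per color = 2x2 upscale factor
            nn.PixelShuffle(2),                  # rearrange channels into double the resolution
        )

    def forward(self, x):
        return self.net(x)

model = TinyUpscaler()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Stand-ins for training data: low resolution rendered frames and
# high quality (e.g. supersampled) reference frames of the same scene.
low_res = torch.rand(8, 3, 270, 480)
reference = torch.rand(8, 3, 540, 960)

prediction = model(low_res)
loss = nn.functional.l1_loss(prediction, reference)
loss.backward()
optimizer.step()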

When targeting 4K, Nvidia's DLSS algorithm has roughly 3.5 to 5.5 million pixels usable as input, compared with a maximum of about 1.5 million usable pixels at 1080p. It is far more difficult to render a final frame that looks good to human eyes with less source information available. Going forward, Nvidia will focus on improving DLSS for 1920x1080 displays and for ultrawide monitors running 3440x1440 or similar resolutions.
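
To put those figures in context, here is the rough arithmetic behind them. The exact resolutions DLSS renders at internally vary by game and quality setting, so the specific resolutions below are illustrative assumptions.

def megapixels(width, height):
    # Input pixel count, in millions, for a given internal render resolution.
    return width * height / 1e6

# Plausible internal render resolutions when the output target is 4K (3840x2160):
print(megapixels(2560, 1440))  # ~3.7 MP, near the low end of the quoted 3.5-5.5 million range
print(megapixels(2880, 1620))  # ~4.7 MP, toward the high end

# When the output target is 1080p (1920x1080), far fewer input pixels are available:
print(megapixels(1280, 720))   # ~0.9 MP
print(megapixels(1600, 900))   # ~1.4 MP, roughly the 1.5 million ceiling mentioned above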

There are instances where switching on TAA may look slightly better than DLSS for the time being. The trade-off, though, is that TAA makes use of multiple frames and can cause ghosting or flickering in fast-moving scenes. DLSS largely eliminates ghosting and flicker, but it is not objectively better for every game and combination of settings.
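
To illustrate why leaning on frame history can ghost, here is a minimal, generic sketch of temporal accumulation; it is a simplification for illustration and not any particular engine's TAA implementation.

import numpy as np

def taa_resolve(current_frame, history, blend=0.1):
    # Most of each output pixel comes from accumulated history; if the history is
    # not reprojected and clamped well for moving objects, stale samples linger.
    return blend * current_frame + (1.0 - blend) * history

# Toy example: a bright pixel moves one step, but its old position lingers.
history = np.zeros((4, 4))
history[1, 1] = 1.0           # object was here last frame
current = np.zeros((4, 4))
current[1, 2] = 1.0           # object has moved one pixel to the right

resolved = taa_resolve(current, history)
print(resolved[1, 1], resolved[1, 2])  # 0.9 vs 0.1: the bright trail left behind is the ghost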

Besides the incoming update to Battlefield V, Metro Exodus also has an update on the way that was not ready in time for the game's launch. Nvidia is training its neural network across an expanded portion of the game and is also addressing feedback about issues with HDR functionality.

In this case, Nvidia has earned a shout out for publicly acknowledging the problems in a timely manner and committing to fix them.


 
I might need a new pair of eyeglasses, but I can't see the difference, even on a 10-bit monitor.
*insert the picture of the grandma lifting up her eyeglasses*
Hint: never go full retard with maxed out graphics settings in FPS multiplayer games.
 
I might need a new pair of eyeglasses, but I can't see the difference, even on a 10-bit monitor.
*insert the picture of the grandma lifting up her eyeglasses*
Hint: never go full retard with maxed out graphics settings in FPS multiplayer games.

What's wrong with max settings?
 
I might need a new pair of eyeglasses, but I can't see the difference, even on a 10-bit monitor.
*insert the picture of the grandma lifting up her eyeglasses*
Hint: never go full retard with maxed out graphics settings in FPS multiplayer games.

What's wrong with max settings?
Nothing, if you have the horsepower. Higher frame rates are way better in FPS games. If you can achieve your monitor's max refresh rate at max settings, that's great, but you're better off turning down your settings to reach it than playing at a lower fps.
 
I might need a new pair of eyeglasses, but I can't see the difference, even on a 10-bit monitor.
*insert the picture of the grandma lifting up her eyeglasses*
Hint: never go full retard with maxed out graphics settings in FPS multiplayer games.
It depends, because if you only have a 60Hz monitor and can achieve that at max settings, there's no point in playing at lower settings. I have a 1080p 144Hz monitor and can play Rainbow Six Siege at 1080p 144fps at max settings. There's no point in playing at low settings to get 240fps when my monitor can't do 240Hz anyway, so I might as well not have my game look like Halo: Combat Evolved. (P.S. There is nothing wrong with Halo CE; in fact, it's one of my favorite games ever, and the graphics were amazing for its time, but that was over 10 years ago and quite bad by today's standards.)
 
Nvidia loves to charge a premium for hardware whose features it acknowledges fall short! They found a way to charge people for alpha testing. Nvidia expects enthusiasts to make up for its revenue shortcomings, and it doesn't matter if the feature isn't even half baked.
 
I might need a new pair of eyeglasses, but I can't see the difference, even on a 10-bit monitor.
*insert the picture of the grandma lifting up her eyeglasses*
Hint: never go full retard with maxed out graphics settings in FPS multiplayer games.
If you have the hardware to run it, why not? In some games, if I don't run at max settings, the fps goes above my 100Hz G-Sync range and then I get tearing, so there is that. FYI, my PC is 9 years old with mods and I'm gaming at 3440x1440. My opinion: do your research about the game and find a balance that works for you; whether you want better gameplay or better visuals, there is always a compromise (know your hardware, that's why it's called a personal computer).
 
DLSS is just another method of upscaling. Major upscaling to my eyes often looks blurry and ugly.

The best method I have seen is checkerboarding done well, but this depends a great deal on how well the developer implements it.

The next best is dynamic resolution, but only if the range is fairly narrow, the bottom end is not too low, and the drop to the lower resolution does not last beyond a split second. Maybe a couple dozen frames.
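
To make that concrete, here is a minimal sketch of a dynamic resolution controller along those lines; the frame time budget, scale bounds, and step size are made-up illustrative values, not taken from any particular engine.

TARGET_MS = 16.7                    # 60 fps frame time budget
MIN_SCALE, MAX_SCALE = 0.8, 1.0     # keep the range narrow: never drop below 80% resolution

def update_render_scale(scale, last_frame_ms):
    # Drop resolution a little when over budget, climb back quickly when under it.
    if last_frame_ms > TARGET_MS:
        scale -= 0.05
    else:
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Example: a brief spike pushes the scale down, then it recovers within a few frames.
scale = 1.0
for frame_ms in [15.0, 19.0, 18.0, 15.5, 15.0, 14.8]:
    scale = update_render_scale(scale, frame_ms)
    print(round(scale, 2))          # 1.0, 0.95, 0.9, 0.95, 1.0, 1.0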

The first game I played with excellent dynamic resolution was Wipeout HD on PS3, which for the most part was a crisp 1080p. The game ran so quickly and the engine scaled resolution so well that it was difficult to tell when it happened. It looked fantastic because it was very incremental. Like a clean PC game, at a time when the majority of console titles ran at 720p, or frequently below! This was magical stuff way back in 2008.

Checkerboarding this generation worked wonders in Watch Dogs 2 and Horizon Zero Dawn; the use of it was very impressive. Most other titles I saw that deployed it didn't do a great job.

In an ideal world, DLSS will eventually better both of these techniques. I see that it has really good potential and offers PC gamers yet more flexibility to find the perfect settings for their hardware. It is very early days for this technology. I expect it to improve over time and become an option people take a bit more seriously in the future.
 
I might need a new pair of eyeglasses, but I can't see the difference, even on a 10-bit monitor.
*insert the picture of the grandma lifting up her eyeglasses*
Hint: never go full retard with maxed out graphics settings in FPS multiplayer games.
If you have the hardware to run it, why not? In some games, if I don't run at max settings, the fps goes above my 100Hz G-Sync range and then I get tearing, so there is that. FYI, my PC is 9 years old with mods and I'm gaming at 3440x1440. My opinion: do your research about the game and find a balance that works for you; whether you want better gameplay or better visuals, there is always a compromise (know your hardware, that's why it's called a personal computer).
If you're interested in testing something, try turning vsync off in game, selecting Fast Sync as the vsync type in the Nvidia control panel, and uncapping your framerate. This will let you play with vsync off for better response times and still avoid screen tearing when jumping over your monitor's refresh rate :). Fast Sync... is about the best thing there is if you can't stop tearing/image corruption with vsync off. You must try it. Everyone must try it. G-Sync is supposed to be used with vsync on (don't quote me), so I have no idea how you're going above your monitor's refresh rate if it's set up properly o.O
 
" Additional training of neural networks will help bring about higher quality visuals"

In other words, they hope the machines will rise and save nGreedia's butt.
 
Who the hell needs any AA at 4K, at least on a desktop monitor? Maybe in a very slow paced game where you walk around looking at the gorgeous visuals, but for fast-paced shooters, for example, if you aren't happy with 4K visuals nothing will make you happy. Sure, in 2-3 generations when we can run 4K with 8x AA at 100fps sustained, yeah, I'll take it. Even at 1440p I rarely use more than 2x AA unless the frame rates are ridiculously good.
 
I can see Jensen playing 200 hours of each game, trying to get its AI to learn how to remove jaggies...

It just works!
 
It's all just a marketing ploy and a business tool, not an actual innovation or advancement. AI is currently a hyped buzzword, so Nvidia created something that utilizes some form of AI (well, machine learning) because it allows them to charge top dollar to clueless customers with a lot of money. But other than that, DLSS actually offers nothing to the end user.
 
If you have the hardware to run it, why not? In some games, if I don't run at max settings, the fps goes above my 100Hz G-Sync range and then I get tearing, so there is that. FYI, my PC is 9 years old with mods and I'm gaming at 3440x1440. My opinion: do your research about the game and find a balance that works for you; whether you want better gameplay or better visuals, there is always a compromise (know your hardware, that's why it's called a personal computer).
Don't get me wrong, but I think you misunderstood my sentence.
I'm unable to get (probably me being slow) why RTX cards are worth it for turning on DLSS, when in this article's thread and so many more on Reddit it looks to me like the image is sometimes a bit blurrier or sharper but mostly the same. Yes, my rig (8700K + 1080Ti) can run any single player (and multiplayer) game at 1440p ultra settings with vsync on and uncapped FPS (yes, there is a difference between capped and uncapped FPS while vsync is on, but some games won't tear with vsync on for some reason), with no tearing, just enjoying the pixels. But I'd rather have an advantage (low video settings, high FPS) in multiplayer games than enjoy the graphics.

It depends, because if you only have a 60Hz monitor and can achieve that at max settings, there's no point in playing at lower settings.
Yeah, except I like having an advantage in MP games (where I can) and I don't really care about graphics in these games; the more advantage I have, the better. Obviously not every game gives you an advantage in MP, but I tend to turn off particles and some other stuff to make sure that when an explosion goes off I can see "behind it", not just fireworks like IRL.
These things in MP just fall into the annoying category for me: they look nice and fancy, but when you turn them off you get an advantage, which is what MP games are all about, having the advantage over your enemy whatever it takes, as long as it doesn't violate the EULA.

It's probably just me, but when I see a bigger difference between DLSS turned on at Ultra and TAA on a normal GTX card (1080Ti), then I will invest in RTX; that will take like 2-3 generations.
My friend plays Metro with a 9900 + 2080Ti and I'm on a 1080Ti, and there were some cases where I was able to spot the difference (2-3 out of 30-40 screenshots), mostly near the water and when lens flare comes in (which is the first thing I disable, alongside motion blur, DoF and chromatic aberration, even if I play a single player game on ultra; these things are simply annoying).
And when I was able to spot the difference, it was in things I don't care about...
Currently, this is what I see when someone mentions RTX or DLSS:
RTX (sorry, DLSS) sucks at higher resolutions with maxed out video settings.
RTX cards sometimes cost twice as much as the normal GTX cards.
RTX/DLSS is in something like an open beta state.
RTX (oh come on, it's called DLSS) is very good at 1080p with semi-maxed out video settings, but best stay away from ultra settings.
And IMHO these are all cons; the only pro is the pure power of RTX cards, but the price/performance is nonsense.

And yes, I know what my PC is capable of: running pretty much anything at maxed out video settings, sometimes even with DSR on a 1440p monitor (unfortunately it's a U2715H; the AW3418DW is coming soon).
 
Nvidia loves to charge a premium for hardware whose features it acknowledges fall short! They found a way to charge people for alpha testing. Nvidia expects enthusiasts to make up for its revenue shortcomings, and it doesn't matter if the feature isn't even half baked.

Give this man a cookie.
 