An Ultra Quality mode is coming to Nvidia's DLSS 2.0

midian182

Something to look forward to: Nvidia's Deep Learning Super Sampling (DLSS) 2.0 delivers fantastic results for those who want to play games at high resolutions and high-quality settings without sacrificing a huge amount of performance, and it could soon make titles look even prettier with the addition of a new "Ultra Quality" mode.

DLSS currently has four presets: Ultra Performance, Performance, Balanced, and Quality, each one altering the balance between picture fidelity and a game's performance. But Redditor u/Reinhardovich has spotted what appears to be a new Ultra Quality preset.

The new option is listed as a placeholder in Unreal Engine 5's DLSS documentation PDF. The preset comes with a note that reads: "mode is a placeholder for feature updates. It should not be visible to end users," suggesting it's still in development.

Currently, the DLSS 2.0 preset that offers the best image is Quality, which renders a game at 66.6% of the native resolution and upscales it 1.5x. The new Ultra Quality setting could be Nvidia's response to AMD's FSR: Team Red's DLSS rival has an Ultra Quality preset that renders games at around 77% of the native resolution, about 1.3x upscaling, so Nvidia may follow suit with its own UQ option.
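For a rough sense of what those percentages mean at a 4K output, here's a minimal sketch in Python; the per-axis scale factors are simply the figures quoted above (1/1.5 for DLSS Quality, roughly 1/1.3 for FSR Ultra Quality), not values taken from either vendor's documentation.

```python
# Approximate internal render resolutions implied by the per-axis scale
# factors mentioned in the article (assumed values, not official figures).
PRESET_SCALE = {
    "DLSS Quality":      1 / 1.5,  # ~66.6% of native per axis
    "FSR Ultra Quality": 1 / 1.3,  # ~77% of native per axis
}

def internal_resolution(native_w: int, native_h: int, scale: float) -> tuple[int, int]:
    """Return the (width, height) the game actually renders before upscaling."""
    return round(native_w * scale), round(native_h * scale)

for preset, scale in PRESET_SCALE.items():
    w, h = internal_resolution(3840, 2160, scale)  # 4K output target
    print(f"{preset}: renders {w}x{h}, upscaled to 3840x2160")
```

At 4K output, that works out to roughly 2560x1440 for DLSS Quality versus about 2954x1662 for FSR Ultra Quality, which is why a DLSS Ultra Quality preset would be the natural counterpart.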

The latest version of DLSS is 2.2.6.6, which is supported by games including Rainbow Six Siege and Lego Builder's Journey. However, Alexander Battaglia from Digital Foundry notes that you can swap the DLSS library in Doom Eternal for the new version 2.2.9 found in the Unreal Engine 5 plugin. He shared a comparison on Twitter showing the differences between each iteration, though the Ultra Quality mode in 2.2.9 isn't available to users just yet.
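For anyone wondering how that kind of swap is typically done: DLSS titles ship the upscaler as a standalone library, and the trick is simply to back up the bundled file and drop a newer one in its place. The sketch below assumes the usual nvngx_dlss.dll file name and uses hypothetical paths; it is not a procedure from Nvidia or Digital Foundry.

```python
# Minimal sketch of a DLSS library swap. The file name nvngx_dlss.dll and
# both paths are assumptions for illustration, not official instructions.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\DOOMEternal")               # hypothetical install path
new_dll  = Path(r"C:\Downloads\nvngx_dlss_2.2.9.dll")  # hypothetical newer DLSS library

target = game_dir / "nvngx_dlss.dll"
backup = game_dir / "nvngx_dlss.dll.bak"

if not backup.exists():
    shutil.copy2(target, backup)   # keep the original so the swap can be undone
shutil.copy2(new_dll, target)      # replace the bundled library with the newer one
print(f"Replaced {target} (backup kept at {backup})")
```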

It'll certainly be interesting to see how FSR compares directly to DLSS 2.0, though we're still waiting for a game that supports both technologies. Check out Hardware Unboxed's analysis of AMD's FidelityFX Super Resolution in the video below.


 
What do you think, TS?
Apparently not even AMD has acknowledged FSR as a DLSS competitor, and Alex from DF has also given a very convincing explanation of the differences.

 
"FSR is a DLSS competitor, no it's not!"

It's just semantics, and I'm sick of stupid people making a big deal out of a technicality when it's clear that the goal is the same for both of these techs: give more fps without losing image quality (or even improving it, if possible) by lowering the resolution internally and then outputting the desired higher resolution at the end, with all the "magic" done in between.

Who gives a **** how you solve a problem, if you do? You can do it in 100 ways (exaggeration) and come to the same end result... and that's the only thing that matters, the end result, the end goal. And that's the same for both FSR and DLSS.

Also, I want to add one more thing: thanks to FSR, DLSS will now also get better much faster. So everyone can thank AMD twice.
 
I stumbled upon this months ago; I thought it was just ultra settings in Unreal Engine and that the maximum setting for DLSS was Quality. Technically you can force higher-quality DLSS with DSR currently: set the DSR resolution to above 4K with DLSS set to Quality, and the sampled resolution will be higher than 1440p.
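As a rough illustration of the DSR + DLSS Quality trick described above (a sketch only: the 2.25x DSR factor and the 1/1.5 Quality scale are assumed for the example, and the driver's own rounding may differ):

```python
# Sketch of forcing a higher DLSS sample resolution via DSR on a 4K display.
# Factors are assumptions for illustration: DSR 2.25x area (1.5x per axis)
# and DLSS Quality rendering at ~66.6% of the output resolution per axis.
NATIVE = (3840, 2160)     # the physical 4K display
DSR_PER_AXIS = 1.5        # DSR 2.25x area factor = 1.5x per axis
DLSS_QUALITY = 1 / 1.5    # DLSS Quality per-axis scale

dsr_w, dsr_h = round(NATIVE[0] * DSR_PER_AXIS), round(NATIVE[1] * DSR_PER_AXIS)
render_w, render_h = round(dsr_w * DLSS_QUALITY), round(dsr_h * DLSS_QUALITY)

print(f"DSR output resolution: {dsr_w}x{dsr_h}")        # 5760x3240
print(f"DLSS internal render:  {render_w}x{render_h}")  # 3840x2160, well above 1440p
```

With those assumed factors, DLSS ends up sampling at native 4K rather than 1440p, which is the effect the trick relies on.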
 
More proprietary nvidia cr@p......no thanks.

I know that publications don't need to express their opinions and can simply keep creating hype, but I feel it's a disservice not to call out sh!tty lock-in tech like what nvidia is always so fond of.
 
More proprietary nvidia cr@p......no thanks.

I know that publications don't need to express their opinions and can simply keep creating hype, but I feel it's a disservice not to call out sh!tty lock-in tech like what nvidia is always so fond of.
Let's hope that AMD leapfrogs ahead with FSR on RDNA 3 against Nvidia's actively fortifying position. Open standards need to keep proprietary ones in check!
 
Open standards need to keep proprietary ones in check!
Personally, I prefer to say that open standards should bury the proprietary ones.

Those serve only the manufacturer, not us consumers.
 
Personally, I prefer to say that open standards should bury the proprietary ones.

Those serve only the manufacturer, not us consumers.
True. Unfortunately, the industry works like this: Company A spends resources on R&D and wants a return on investment, creating a niche market through proprietary technology, while Company B spends resources on an open standard to open up the market for everyone. Each company is thinking about profits and using a different angle to corner the market. As consumers we need both, to keep competition producing a steady stream of innovation. Once the balance is broken, too much proprietary dominance plateaus into no innovation at all, like Intel stagnating on the same 4 cores / 8 threads for a decade with minimal improvement. It would be nice if open standards drove innovation better than proprietary ones, but I believe the R&D costs wouldn't allow it. Just my opinion.
Update: isn't it ironic that Intel is using open standards to compete against Nvidia in the GPU market?
 
Just another thing that nVidia never would have done if AMD hadn't done it first. To hell with them, I'd rather have an industry-wide standard than nVidia's proprietary crap. Buying an nVidia card these days is like buying a Dell.
 
Just another thing that nVidia never would have done if AMD hadn't done it first. To hell with them, I'd rather have an industry-wide standard than nVidia's proprietary crap. Buying an nVidia card these days is like buying a Dell.
What did AMD do first?
 
Just another thing that nVidia never would have done if AMD hadn't done it first. To hell with them, I'd rather have an industry-wide standard than nVidia's proprietary crap. Buying an nVidia card these days is like buying a Dell.
So when my friend bought a Ryzen Aurora Alienware build with a 5800X and 6800 XT, he actually bought an Nvidia computer 🤔?
The moral of the story is that everyone knows Nvidia is greedy; we just need AMD to keep Nvidia on its toes and sometimes make it sweat!
 
So when my friend bought a Ryzen Aurora Alienware build with a 5800X and 6800 XT, he actually bought an Nvidia computer 🤔?
The moral of the story is that everyone knows Nvidia is greedy; we just need AMD to keep Nvidia on its toes and sometimes make it sweat!
AMD can't do this job unless you support AMD doing this job by buying its products where it matters.
 
AMD can't do this job unless you support AMD doing this job by buying its products where it matters.
True, but wouldn't the roles be reversed? Eventually AMD will stop being the underdog. A decade ago, who would have predicted AMD would have consumer products close to $1,000 or more? Today we have some X570 motherboards, the 5950X, and the 6900 XT around that price. It seems like the winning formula is great products, not products consumers will regret purchasing out of brand loyalty every generation, imo.
 
I was just watching a video about Nvidia's new Ultra Performance mode, the one for 8K gameplay.
Of course, since memory on AMD's video cards tops out at 16 GB, they can't offer 8K gaming, and so they won't be copying that mode.
 
I was just watching a video about Nvidia's new Ultra Performance mode, the one for 8K gameplay.
Of course, since memory on AMD's video cards tops out at 16 GB, they can't offer 8K gaming, and so they won't be copying that mode.
How much VRAM was used to make such a claim? The article is talking about Ultra Quality, not Ultra Performance, though.
 
So when my friend bought a Ryzen Aurora Alienware build with a 5800X and 6800 XT, he actually bought an Nvidia computer 🤔?
The moral of the story is that everyone knows Nvidia is greedy; we just need AMD to keep Nvidia on its toes and sometimes make it sweat!
Actually, the moral of the story is "Don't buy nVidia until they smarten up."
 
True, but wouldn't the roles be reversed? Eventually AMD will stop being the underdog. A decade ago, who would have predicted AMD would have consumer products close to $1,000 or more? Today we have some X570 motherboards, the 5950X, and the 6900 XT around that price. It seems like the winning formula is great products, not products consumers will regret purchasing out of brand loyalty every generation, imo.
When AMD stops being the underdog and achieves parity, then the ideal goal for the consumer has been reached. Until then, supporting nVidia has only proven to embolden them more and more. Look at what has actually occurred before offering conjecture contrary to reality.

People who were brand-loyal to AMD or (like me) anti-Intel are the reason that AMD managed to survive and bring us Ryzen. Your own argument defeats itself.
 
When AMD stops being the underdog and achieves parity, then the ideal goal for the consumer has been reached. Until then, supporting nVidia has only proven to embolden them more and more. Look at what has actually occurred before offering conjecture contrary to reality.

People who were brand-loyal to AMD or (like me) anti-Intel are the reason that AMD managed to survive and bring us Ryzen. Your own argument defeats itself.
Do you want Intel to not compete at all? The same question goes for Nvidia.
 
True, but wouldn't the roles be reversed? Eventually AMD will stop being the underdog. A decade ago, who would have predicted AMD would have consumer products close to $1,000 or more? Today we have some X570 motherboards, the 5950X, and the 6900 XT around that price. It seems like the winning formula is great products, not products consumers will regret purchasing out of brand loyalty every generation, imo.
Let's talk about this when AMD actually stops being the underdog.

There's no point talking about this now.
 