Nvidia shows off DLSS 3.5 reconstruction, improves ray tracing denoising

Daniel Sims

Why it matters: Prior steps in Nvidia's DLSS toolchain have used AI to reconstruct pixels and generate new frames. Now, the company uses the same technology to approximate rays in ray-traced games, thus improving image quality and efficiency. The new functionality supports all RTX GPUs.

This week, Nvidia released DLSS 3.5, unveiling a new feature called Ray Reconstruction aimed at improving denoising, a crucial phase in the ray tracing process. The technique enhances lighting quality and brings either a minor performance gain or a minor performance cost, depending on the specific game.

Casting enough rays to fully sample every pixel is too computationally intensive for applications incorporating ray or path tracing, so they shoot only enough rays to approximate each frame. However, this shortcut often leaves a spotty or noisy appearance.
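In pseudocode terms, a path tracer estimates each pixel by averaging a handful of random ray samples. The minimal Python sketch below illustrates the idea; `scene.trace` is a hypothetical stand-in for the actual ray evaluation:

```python
import random

def shade_pixel(scene, x, y, samples_per_pixel=2):
    """Estimate one pixel's brightness from a few random ray samples.

    `scene.trace(x, y, u, v)` is a hypothetical helper returning the
    radiance carried by one jittered ray through pixel (x, y). With only
    a couple of samples, the averaged estimate has high variance, which
    is exactly the grainy "noise" that denoisers must clean up.
    """
    total = 0.0
    for _ in range(samples_per_pixel):
        u, v = random.random(), random.random()  # jitter inside the pixel
        total += scene.trace(x, y, u, v)
    return total / samples_per_pixel
```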

Numerous denoising techniques strive to fill the gaps, but they introduce new flaws of their own, such as ghosting, or omit specific effects like ambient occlusion. Furthermore, upscaling – usually necessary to lessen the extreme performance cost of ray tracing – can interfere with denoising.

Ray Reconstruction combines a game's disparate denoising techniques into a unified step, working with the upscaling process instead of against it to provide more comprehensive ray tracing. With the new technology, games with many ray tracing features, like Cyberpunk 2077, could see slightly higher frame rates. However, titles with comparatively light RT implementations may suffer a minor performance drop.
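Conceptually, the change looks something like the sketch below. This is only an illustration; every callable here is a hypothetical placeholder, since Nvidia has not publicly documented the internal stages:

```python
def traditional_pipeline(rt_signals, frame, denoisers, compose, upscale):
    """Classic approach: one hand-tuned denoiser per ray-traced effect
    (shadows, reflections, global illumination...), with upscaling
    bolted on afterwards and only seeing the already-denoised result."""
    cleaned = {name: denoise(rt_signals[name])
               for name, denoise in denoisers.items()}
    return upscale(compose(frame, cleaned))

def ray_reconstruction_pipeline(rt_signals, frame, motion_vectors, network):
    """DLSS 3.5 approach as the article describes it: one trained
    network denoises and upscales in a single pass, so the two stages
    cooperate instead of interfering."""
    return network(rt_signals, frame, motion_vectors)
```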

Nvidia's new feature supports the RTX 2000, 3000, and 4000 graphics card lines. It will debut later this year in Cyberpunk 2077, Alan Wake II, Portal with RTX, Chaos Vantage, and D5 Render. The company will unveil Alan Wake II's use of path tracing in a video demonstration of Ray Reconstruction on August 23. The technology will also feature in an in-development path-traced remaster of Half-Life 2.

Another product Nvidia introduced this week is NeMo SteerLM, a toolchain allowing developers to utilize the company's Avatar Cloud Engine (ACE) AI models. Nvidia demoed the ACE large language model at Computex by showing a virtual character having a dynamic conversation with a user.

NeMo SteerLM allows game NPCs to deliver appropriate responses based on individual attributes. They can also fluidly react to changes in the story and the world. There's no information yet on which developers or games might eventually utilize the toolchain, but mods have tried to apply the same fundamental concept to titles like Mount & Blade.
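As a rough illustration of the attribute-conditioned idea: a SteerLM-style model is steered at inference time by stating which attributes the response should exhibit. The attribute names, scales, and prompt format below are invented for the example and are not Nvidia's actual API:

```python
def build_npc_prompt(name, attributes, world_state, player_line):
    """Build an attribute-conditioned prompt in the spirit of SteerLM.

    Everything here (attribute names, 0-9 scales, tag format) is
    illustrative, not Nvidia's real interface.
    """
    attr_str = ",".join(f"{k}:{v}" for k, v in attributes.items())
    return (
        f"<attributes>{attr_str}</attributes>\n"
        f"World state: {world_state}\n"
        f"Player: {player_line}\n"
        f"{name}:"
    )

# Example: a gruff blacksmith reacting to a story event.
prompt = build_npc_prompt(
    name="Greta",
    attributes={"humor": 2, "aggression": 7, "verbosity": 3},
    world_state="The bandits burned the mill last night.",
    player_line="Any work for me today?",
)
```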


 
Seems promising to offset the brute-forced path tracing performance hit.
 
Making the RTX 2000 models compatible with this is good. But one wonders if they have the grunt to do it justice anyway.
 
Just in case anyone's wondering how all the various DLSS naming mess fits together:

[Image: nvidia_dlss_chart.png]

Seems promising to offset the brute-forced path tracing performance hit.
Path tracing isn't a brute force approach -- quite the opposite, really. A mass of additional math and shaders is required to reduce the number of rays that are actually cast during the process, hence the enormous processing hits. A brute force approach would be to fire out a mass of rays, be it sensible to do so or not.
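A toy Monte Carlo comparison shows the point. `f` is a hypothetical integrand over the hemisphere angle; the importance sampler spends the same ray budget where the integrand actually matters:

```python
import math
import random

def brute_force(f, n):
    # "Fire out a mass of rays, sensible or not": sample the angle
    # uniformly on [0, pi/2] and pay for many near-zero contributions.
    width = math.pi / 2
    return width * sum(f(random.uniform(0.0, width)) for _ in range(n)) / n

def importance(f, pdf, draw, n):
    # Spend the same n samples where f is large; dividing by the pdf
    # keeps the estimate of the integral unbiased.
    total = 0.0
    for _ in range(n):
        theta = draw()  # sample from the importance density
        total += f(theta) / pdf(theta)
    return total / n

# With f(t) = cos(t) and pdf(t) = cos(t) (a valid density on [0, pi/2],
# sampled via theta = asin(U)), every sample contributes exactly 1,
# so the variance drops to zero while brute_force stays noisy:
est = importance(math.cos, math.cos,
                 lambda: math.asin(random.random()), 100)
```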
 
I didn't realise the denoisers were causing such a massive loss in colour, look at the difference! Damn, did we get confirmation this is coming out with the Cyberpunk DLC?
 
I didn't realise the denoisers were causing such a massive loss in colour, look at the difference! Damn, did we get confirmation this is coming out with the Cyberpunk DLC?
Yep, it seems that will come with the CP2077 Phantom Liberty expansion (not DLC).

Of course you and many of us did not realize it, because Nvidia hid this from consumers while making false claims about the performance and quality of DLSS 1, 2, 3, 3.14, 4.33 and so on.
For this, Nvidia used its army of YouTube reviewers, who even now are ready to prove to us that DLSS 2 and 3 are "superior" and offer amazing "quality" even better than 4K native.
Until Nvidia comes out with a new DLSS, number 007, in which they will admit again that the previous DLSS versions were noisy and wrongly colored, but this "new" one is the real one.
Just buy Nvidia's new video cards, which are even "smarter" because they have the word AI in their names, but which in reality may prove to be just more expensive for something like a DLSS that truly works as supposed to, only in their next-gen implementation.
 
Still not sold on ray tracing; imo Control still has the best showing of what it can do, and that game is kinda old now.

It's neat to see if you have the hardware to push it, but there are so many games that don't have it and use great art direction instead to look just as good.
 
Me: "The image quality of dlss isn't as good as native"
Nvidia fan boys: "wtf are you stupid? The upscale image looks better than native!"
Nvidia: "here are all the things we're gonna fix because dlss image quality isn't as good as native"
Nvidia fanboys: "omg 😍❤️"
 
Yep, it seems that will come with the CP2077 Phantom Liberty expansion (not DLC).

Of course you and many of us did not realize it, because Nvidia hid this from consumers while making false claims about the performance and quality of DLSS 1, 2, 3, 3.14, 4.33 and so on.
For this, Nvidia used its army of YouTube reviewers, who even now are ready to prove to us that DLSS 2 and 3 are "superior" and offer amazing "quality".
Until Nvidia comes out with a new DLSS, number 007, in which they will admit again that the previous DLSS versions were noisy and wrongly colored, but this "new" one is the real one.
Just buy Nvidia's new video cards, which are even "smarter" because they have the word AI in their names, but which in reality may prove to be just more expensive.
Yeah yeah, you can complain all you want, the competition has done nothing to fix the colour issues in the denoisers either.

Literally everything you said, you could replace "Nvidia" with "AMD" and "DLSS" with "FSR" and the sentence would still make sense.

I really hope Intel is able to break out into the GPU market in a big way, we need more competition in this space.
 
Yeah yeah, you can complain all you want, the competition has done nothing to fix the colour issues in the denoisers either.

Literally everything you said, you could replace "Nvidia" with "AMD" and "DLSS" with "FSR" and the sentence would still make sense.

I really hope Intel is able to break out into the GPU market in a big way, we need more competition in this space.
At least the competition did not lie the way Nvidia did about fake frames, temporal scaling DLSS and so on. At least not at the level Nvidia is doing it.
And claiming that, in your imagination, I am complaining only shows that your approach to a debate is embarrassing. I am surgically exposing Nvidia's anti-consumer policy around DLSS.
And some Nvidia fanboys, unable to accept the facts, have a lame reaction toward the messenger instead of focusing on the topic and bringing valid arguments.
Because not all of Nvidia's DLSS technology is bad. Some of DLSS is good, just not at the level Nvidia's PR is misleading consumers into expecting.
And this article shows this crystal clear.

I hope you realize that exposing Nvidia's false DLSS PR in time, and especially Nvidia's DLSS dark pattern of artificial market segmentation, helps everybody, including Nvidia video card buyers, because they will make better-informed decisions. We consumers can also push Nvidia to stop their anti-consumer approach and to come up with better products and prices, or to make their "new" technology available to at least more generations of their video cards.
Like AMD is doing with their FSR technology, or with their AM4 and AM5 platform longevity.

You can thank those who exposed Nvidia's wrongdoings for Nvidia "offering" DLSS 3.5 to all RTX generations. Without the community backlash against Nvidia's DLSS artificial market segmentation, DLSS 3.5 would have been an "EXCLUSIVE" technology for 4090 owners only, or worse, for the next-gen RTX 5xxx video cards.
Making DLSS 3.5 available for all RTX cards is a good step in the right direction from Nvidia for their video card buyers. A small step in the right direction for Nvidia, but a giant win for Nvidia video card buyers. Thus, all RTX owners, including myself, can enjoy it.
To you, and to the other readers as well: do you agree?
 
Yeah yeah, you can complain all you want, the competition has done nothing to fix the colour issues in the denoisers either.

Literally everything you said, you could replace "Nvidia" with "AMD" and "DLSS" with "FSR" and the sentence would still make sense.

I really hope Intel is able to break out into the GPU market in a big way, we need more competition in this space.

And again, it is OK to have different opinions, to debate them, and to bring arguments. It does not mean that I am against you, or that others are against me, when our opinions differ. I have learned a lot from those who hold different opinions than mine and bring valid arguments, and I am thankful for that. All of this helps me form a better-informed opinion, and to adapt or change my opinions when they were wrong or when I was not informed enough.
 
For this, Nvidia used its army of YouTube reviewers, who even now are ready to prove to us that DLSS 2 and 3 are "superior" and offer amazing "quality" even better than 4K native.
How to tell us you've never used DLSS 2.x.x without telling us you've never used DLSS 2.x.x.

Hardware Unboxed is sooooo sold out to Nvidia, 'cause they're the ones who scored 1440p DLSS 17 points vs 11 points for native+TAA in image quality comparisons across 25 games.
/S
And that wasn't even with manually swapping the default DLL to 2.5.1.

Now with Ray Reconstruction it's not even a contest, since even console games use RT for max settings. All AMD will do is lock more games to FSR2.

Me: "The image quality of dlss isn't as good as native"
Nvidia fan boys: "wtf are you stupid? The upscale image looks better than native!"
Nvidia: "here are all the things we're gonna fix because dlss image quality isn't as good as native"
Nvidia fanboys: "omg 😍❤️"
You're terribly confused. See Nvidia's video on native vs 3.5 (SR+RR); it's supposed to improve only the ray tracing, even compared to native resolution. RR is effectively a better RT/PT denoiser that works only with DLSS, not a new DLSS Super Resolution revision. Educate yourself before you post this sort of flamebait content.

 
To you, and to the other readers as well: do you agree?
Honestly, it wasn't worth reading your whole comment. I'm not standing up for Nvidia in the slightest; I don't care for their PR or pricing or locked-in bullsh*t like DLSS.

The reason all your "points" ring hollow is that the competition never seems to have an answer that's not just as toxic as Nvidia's own strategies.

Personally, I think AMD actively blocking developers from using DLSS and forcing FSR is a dirtier trick than anything Nvidia's currently doing.
 
Why are people complaining? Ray tracing sucks big time. I can't believe this fake crap has persisted to this date when beam tracing is far superior. It contains all the physics, because beams are solutions of simplified Maxwell equations, and no, you do not need to solve those equations at all; your seed beams just have to be solutions of the Helmholtz equation. 1000 beams will produce noise-free images. Colour falls out naturally, refraction and diffraction too. The equations for reflection and refraction are just modified versions of Snell's law. You wouldn't need RT cores, you'd just have to make your GPU FP32 fast.
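For reference, the Helmholtz equation mentioned above is the time-independent form of the wave equation, with u the field amplitude and k the wavenumber:

```latex
\nabla^{2} u + k^{2} u = 0
```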
 
You wouldn't need RT cores, you'd just have to make your GPU FP32 fast.
How fast? How would it compare to doing ray casting + BVH + denoising on dedicated RT hardware?

Honestly, it wasn't worth reading your whole comment. I'm not standing up for Nvidia in the slightest; I don't care for their PR or pricing or locked-in bullsh*t like DLSS.

The reason all your "points" ring hollow is that the competition never seems to have an answer that's not just as toxic as Nvidia's own strategies.

Personally, I think AMD actively blocking developers from using DLSS and forcing FSR is a dirtier trick than anything Nvidia's currently doing.
Yeah, I really didn't like that strategy of locking out DLSS instead of improving FSR2.
After watching HUB's video on the topic, I packed my 6800 right back into its box to serve as a secondary in the future, maybe (or to sell during a shortage), and got a used 3080. Let's keep the standards even and clear for both parties. Just because one has a history of crap doesn't mean I need to support the other doing it presently.
Turned out to be a great decision; DLSS SR with that RR denoiser looks great.

 
You wouldn't need RT cores, you'd just have to make your GPU FP32 fast.
You'd still need ASICs to accelerate the plane-geometry intersection calculations and BVH traversals, though. Simply throwing more FP32 throughput at the chip isn't enough -- one only needs to see how well a Pascal GPU copes with ray tracing to see this, although its cache structure doesn't help matters.
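For a sense of the arithmetic involved, here's the standard ray-vs-bounding-box slab test in toy Python form. This is the kind of FP32 work a BVH traversal repeats millions of times per frame, and roughly what the dedicated hardware accelerates (illustrative, not how the hardware actually implements it):

```python
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    """Slab test: does a ray intersect an axis-aligned bounding box?

    `inv_dir` is the precomputed component-wise reciprocal of the ray
    direction. Each box visited during BVH traversal costs a handful of
    subtractions, multiplies, and min/max operations like these.
    """
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        t_near = max(t_near, min(t1, t2))  # latest entry across slabs
        t_far = min(t_far, max(t1, t2))    # earliest exit across slabs
    return t_near <= t_far
```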
 
I love those upscaler tests/videos from HUB; imo upscalers are the future. In my own testing, 1440p FSR/DLSS on the balanced presets looked FAR better than 1080p native while performing exactly the same. It's a similar situation running 4K FSR Balanced / DLSS Performance vs 1440p native, though on AMD you can't really use FSR Performance because of shimmer, so you've gotta use Balanced – you'll still end up only 10% slower than 1440p native for much clearer IQ.
A good upscaler is a must these days, so I'm all the happier seeing Nvidia improving DLSS and Intel making XeSS better and better all the time.
 