Nvidia DLSS support comes to four more titles, including Call of Duty: Black Ops Cold War

Polycount
What just happened? Nvidia's DLSS feature is better than ever these days: many implementations of the technology give players tremendous performance upticks at little to no visual cost. Though it's only available in a select handful of games so far, that number grew by four today, as DLSS arrives in several new and existing titles.

As Nvidia's Andrew Burnes announced over on the official GeForce blog, the company's AI-based super sampling tech (the game renders each frame at a lower resolution, then upscales it to a higher one) is now available in Call of Duty: Black Ops Cold War, War Thunder, Enlisted, and Ready or Not.
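For context, DLSS trades internal render resolution for performance across several quality modes. The per-axis scale factors below are the commonly cited ones for DLSS 2.0 and are an assumption on our part, not figures from Nvidia's announcement; here's a minimal sketch of what they imply at a 4K output:

```python
# Sketch: approximate internal render resolutions for DLSS 2.0's quality
# modes at a 4K output. The per-axis scale factors are the commonly
# cited values, used here as assumptions for illustration only.

OUTPUT_W, OUTPUT_H = 3840, 2160  # 4K target resolution

DLSS_MODES = {
    "Quality":           2 / 3,   # ~0.667x per axis (4K renders at ~1440p)
    "Balanced":          0.58,
    "Performance":       0.50,    # 4K renders at 1080p
    "Ultra Performance": 1 / 3,
}

for mode, scale in DLSS_MODES.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    print(f"{mode:>17}: renders {w}x{h}, upscales to {OUTPUT_W}x{OUTPUT_H}")
```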

We'd need to test each of these games extensively to say how good their DLSS implementations are, but for now, we'll share some of Nvidia's performance claims -- which you should take with a grain of salt, as we're not sure how the company runs its benchmarks.

For starters, Nvidia promises tremendous framerate improvements in Black Ops Cold War: up to 85 percent at 4K with ray tracing effects (ray-traced shadows and ambient occlusion) switched on. That takes the game from an uncomfortable 40 FPS on the RTX 2070 Super all the way up to a very playable 76 FPS with DLSS on.

Higher-end GPUs, such as the RTX 3090 or 3080, enjoy much higher DLSS-powered max framerates of up to 124 FPS (about 113 for the 3080). Combined arms warfare title War Thunder's DLSS gains are quite a bit smaller, but it's also worth noting that the game doesn't feature any RTX tech. Regardless, you can expect performance to be "accelerated" by up to 30 percent here.

Squad-based MMOFPS Enlisted (another combined arms title with infantry combat, tanks, and jets) is also an RT-free game, but it benefits from FPS gains of up to 55 percent at 4K. The game is already very playable across a wide range of GPUs, so DLSS is probably overkill: the 3090 bumps framerates from 163 to 233 FPS. Even the humble RTX 2060 can already push a solid 60 FPS, but with DLSS, it goes up to 98.

Finally, we've got Ready or Not, which benefits from tremendous performance improvements with DLSS. The game is completely unplayable on lower-end hardware with ray-traced reflections, shadows, and ambient occlusion enabled, but it becomes tolerable with DLSS. The 2060 can only manage a pathetic 15 FPS with RT features cranked up, but DLSS pushes that number up to 40. At the high end, we're seeing pre- and post-DLSS numbers of 46.7 and 95.2 (respectively) for the 3090, and 39 and 86 for the 3080.
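If you want to sanity-check Nvidia's headline percentages against the raw framerates quoted above, the math is straightforward: uplift = (DLSS FPS - native FPS) / native FPS. A quick sketch using the Ready or Not figures (the FPS pairs come from the article; the script itself is just for illustration):

```python
# Sketch: computing percentage uplifts from the Ready or Not figures
# quoted above (4K, ray tracing on). The FPS pairs are Nvidia's claims
# as reported in the article; the script is only an illustration.

ready_or_not = {
    "RTX 2060": (15.0, 40.0),   # (native FPS, DLSS FPS)
    "RTX 3080": (39.0, 86.0),
    "RTX 3090": (46.7, 95.2),
}

for gpu, (native, dlss) in ready_or_not.items():
    uplift = (dlss - native) / native * 100
    print(f"{gpu}: {native:g} -> {dlss:g} FPS (+{uplift:.0f}%)")
```

Running that gives uplifts of roughly +167, +121, and +104 percent, which is why the lower-end cards see the most dramatic relative gains.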

Overall, the results are a bit of a mixed bag: some games see massive performance improvements and others get more modest increases, but in every case, framerates improve materially with DLSS enabled. It's worth noting that all of Nvidia's testing took place at 4K, so your mileage may vary at lower resolutions like 1080p and 1440p.


 
I feel like this is the technology that really matters right now for gaming, much more so than ray tracing. They go hand in hand beautifully though.

The results are so hugely impressive, especially if you want to target high resolutions, because that is where the big gains are. It's also exciting because this still has scope to improve as AMD get in on the act and offer competition.
 
Why not just set the resolution lower? I mean if you want to compromise image quality you can do it yourself for free on any GPU.

The 1080ti is technically capable of 8k gaming too if you make it run at 1080p.
 
Those numbers are from DLSS Performance Mode, which gives a noticeable degradation in visuals vs. Quality Mode (which in turn doesn't deliver quite the performance uplift listed here). TS and others have mentioned that DLSS 2.0 Quality Mode at 4K is very good and should probably be used whenever available, but that the visual degradation is noticeable in any lower resolution or quality mode (1440p or 1080p Quality, or 4K Performance as listed here).

And in that video comparing 4K native to 4K DLSS Quality, the oversharpening halos in DLSS Quality Mode were distracting, though I'll assume they aren't quite so obvious at 4K on a 27-32" monitor. I see those halos in many example screencaps, so I guess they can't be turned off; they may be part of why DLSS doesn't look quite as good at 1440p, where those artifacts could be more noticeable.
 
Why not just set the resolution lower? I mean if you want to compromise image quality you can do it yourself for free on any GPU.

The 1080ti is technically capable of 8k gaming too if you make it run at 1080p.

Because DLSS is a better compromise than having your monitor scale up a lower-resolution image.

Because every test you look at with sample images and high-bitrate video shows DLSS working incredibly well.
 
Why not just set the resolution lower? I mean if you want to compromise image quality you can do it yourself for free on any GPU.

The 1080ti is technically capable of 8k gaming too if you make it run at 1080p.

Setting the resolution lower is not what DLSS is at all. DLSS now makes most games look better than native resolution in their “quality” mode whilst running at a higher frame rate.

It’s outstanding tech.
 

Close to native, not better than native. What you are likely perceiving as "better than native" is the light sharpening filter applied by DLSS, but you should know that 1) it adds sharpening artifacts, and 2) anyone can add a sharpening filter to any game. I personally detest sharpening filters.

Pray tell, how did you come to the conclusion that the DLSS version is better than native? Please explain by which process it is possible to get more detail than the native image through AI upscaling.

Mind you, the performance shown above is from Performance Mode, not Quality Mode.

DLSS is good stuff but you are just spreading misinformation.
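For what it's worth, basic post-process sharpening is trivial to reproduce. Here's a minimal unsharp-mask sketch (my own illustration, not anything from Nvidia or DLSS) showing how cranking the amount produces the over/undershoot halos around edges mentioned elsewhere in this thread:

```python
# Sketch: a basic unsharp-mask sharpening filter, purely illustrative.
# Too high an `amount` causes overshoot/undershoot at edges, which is
# what shows up visually as halos.
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image: np.ndarray, amount: float = 0.5, sigma: float = 1.0) -> np.ndarray:
    """Sharpen by adding back the difference between the image and a blurred copy."""
    blurred = gaussian_filter(image, sigma=sigma)
    sharpened = image + amount * (image - blurred)
    return np.clip(sharpened, 0.0, 1.0)  # keep values in display range

# Toy example: a hard edge (0.2 -> 0.8) in a grayscale strip
edge = np.concatenate([np.full(8, 0.2), np.full(8, 0.8)])
print(np.round(unsharp_mask(edge, amount=2.0), 2))  # note the over/undershoot near the edge
```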
 
It most likely stems from the Death Stranding implementation, which makes some text/textures sharper than the native resolution.

As a side note, Performance Mode will never look as good as native; this is a given.
 