DLSS finally arrives in CoD: Warzone; Nvidia claims up to 70% performance boost

midian182

What just happened? More than three months after announcing the feature was on its way, DLSS has finally come to Call of Duty: Warzone. According to Nvidia, turning on the company’s Deep Learning Super Sampling, assuming you have an RTX 20-series card or newer, will bring up to 70% faster performance in the free-to-play battle royale title.

At its CES keynote in January, Nvidia said Warzone would be following in CoD Black Ops Cold War’s footsteps by introducing support for DLSS. While the difference it makes varies from game to game, team green claims you can expect up to a 70 percent improvement in frames per second in Warzone, though that’s probably an optimistic figure.

According to the company’s charts, enabling DLSS while using an RTX 3060 (4K, max settings, i9-10900K) will boost Warzone's framerate from 47.9 fps to 82.1 fps. The difference is even greater if you’re lucky enough to own an RTX 3090; it moves from 102 fps to 150.9 fps.
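For reference, those figures work out to roughly a 71 percent uplift on the RTX 3060 and about 48 percent on the RTX 3090. Here is a quick illustrative sketch of the arithmetic; the uplift helper is just for the calculation, and the fps values are the ones from Nvidia’s chart above.

```python
# Frame-rate uplift implied by Nvidia's Warzone chart (4K, max settings, i9-10900K)
def uplift(before_fps: float, after_fps: float) -> float:
    """Return the percentage frame-rate increase going from before_fps to after_fps."""
    return (after_fps - before_fps) / before_fps * 100

print(f"RTX 3060: {uplift(47.9, 82.1):.1f}% faster with DLSS")    # ~71.4%
print(f"RTX 3090: {uplift(102.0, 150.9):.1f}% faster with DLSS")  # ~47.9%
```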

VideoCardz writes that some players who have already enabled DLSS in Warzone are reporting visible artifacts and ghosting, but stability should improve with future updates.

DLSS has seen vast improvements since its arrival and is becoming an option in more titles—check out the full list here. We heard earlier this month that the Unity game engine would be receiving support by the end of the year, and there’s an Unreal Engine 4 plugin that makes implementing DLSS in games a lot easier for devs. This method has already been utilized in the System Shock remake demo.

In addition to adding DLSS, the 25.2GB Warzone Season 3 update adds new content and a slew of bug fixes.


 
DLSS will ALWAYS have ghosting. I still think it's better than plain old TAA, though. Some games, like RDR2, pretty much need TAA to look normal.
 
The results are only impressive when you are using DLSS at 4K resolution. The earlier findings about the higher CPU usage needed to drive their software scheduler are clearly hurting the gains at lower resolutions, as seen from the limited improvement there. The positive is that it's a free and effortless performance gain for consumers.
 
AMD's software needs to improve immensely and immediately. Being faster isn't enough. A GPU isn't a one-trick pony anymore.
When you said software, I suppose you mean FidelityFX Super Resolution? I feel that, software-wise, AMD has improved quite a bit over the years. There are bugs here and there, which is a perpetual problem regardless of brand, but I feel stability is no longer as big an issue as it was at the initial launch of RDNA.
 
"Turning down the settings improves performance" is like, the least newsworthy thing to happen the past month.
 
When you said software, I suppose you mean FidelityFX Super Resolution? I feel that, software-wise, AMD has improved quite a bit over the years. There are bugs here and there, which is a perpetual problem regardless of brand, but I feel stability is no longer as big an issue as it was at the initial launch of RDNA.
I mean their entire software product stack, from streaming to gaming.
 
"Turning down the settings improves performance" is like, the least newsworthy thing to happen the past month.
This is true, but it is not simply a case of turning down the resolution, because DLSS 2.0 offers advantages beyond improved performance. There may be a noticeable degradation in image quality, i.e. blurriness, but it is nowhere near as bad as switching to a lower resolution without DLSS 2.0.
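For context, DLSS renders the game at a lower internal resolution and reconstructs the output, rather than simply lowering the display resolution. A minimal sketch of the commonly cited per-axis scale factors for the DLSS 2.x quality modes follows; the factors, the DLSS_SCALE table, and the internal_resolution helper are approximations for illustration, not an official Nvidia specification.

```python
# Rough sketch: DLSS 2.x internal render resolution per quality mode.
# Per-axis scale factors are the commonly cited approximations
# (Quality ~0.67, Balanced ~0.58, Performance 0.50, Ultra Performance ~0.33).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the game actually renders at before DLSS upscales to out_w x out_h."""
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in DLSS_SCALE:
    print(mode, internal_resolution(3840, 2160, mode))
# e.g. Quality -> (2560, 1440), Performance -> (1920, 1080)
```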
 
When DLSS was first announced I remember thinking "this is a game-changer", and then almost immediately being disappointed by the results.

Kudos to Nvidia for sticking with it and improving things massively with DLSS2. It literally is a game-changer now, just needs better game support. If only I could buy a 30-series card.

I'm surprised they don't release a Tensor-only card that implements DLSS for non-Nvidia cards. I'm sure their PR dept would have a field day with that one.
 
I think it's a given that the next Nintendo Switch will have Tensor cores on the die to upscale games to 4K.
 
Looks fantastic to my eyes, virtually zero difference from native, but now I can frame-cap at 140 fps @ 3440x1440 at all times.
 
I'm surprised they don't release a Tensor-only card that implements DLSS for non-Nvidia cards. I'm sure their PR dept would have a field day with that one.
I always thought an RT-only card would be cool, so you could actually turn Ultra RT on without halving your fps.
 
I'm surprised they don't release a Tensor-only card that implements DLSS for non-Nvidia cards. I'm sure their PR dept would have a field day with that one.
I always thought an RT-only card would be cool, so you could actually turn Ultra RT on without halving your fps.
Tensor and RT workloads consume a lot of data, and hit the GPU’s internal and external memory structures hard. No discrete add-in solution that’s remotely affordable would be able to offer the necessary bandwidth and latencies.
I think it's a given that the next Nintendo Switch will have Tensor cores on the die to upscale games to 4K.
I’d disagree with the remark that it’s a given, as Tensor cores are not strictly required for DLSS (preferred, yes, but not actually necessary). The other element to consider is that Nvidia’s Tensor-core-enabled Tegra models are massive compared to the X1 in the Switch (well, they’re big by any measure). Big chips don’t make for good yields of lots of cheap dies.

Naturally, Nvidia may well have designed a completely new SoC just for Nintendo, just as AMD did for Sony and Microsoft. But the latter was still based on existing architectures, and in the case of Nvidia’s, there’s no magic cure for shrinking the die area of Tensor cores (as they’re just a large array of, typically, FP16 ALUs).

Historically, the Nintendo crystal ball rumour mill has generally overestimated what would eventually come to pass, and I have the feeling that the same will occur here. Nintendo doesn’t need to release a 4K Switch Pro model when they’re still selling bucketloads of cheap-as-chips-to-make standard models; what we may just see is an improved docking station with an integrated 4K upscaler and nothing more than that.
 
Tensor and RT workloads consume a lot of data, and hit the GPU’s internal and external memory structures hard. No discrete add-in solution that’s remotely affordable would be able to offer the necessary bandwidth and latencies.

I’d disagree with the remark that it’s a given, as Tensor cores are not strictly required for DLSS (preferred, yes, but not actually necessary). The other element to consider is that Nvidia’s Tensor-core-enabled Tegra models are massive compared to the X1 in the Switch (well, they’re big by any measure). Big chips don’t make for good yields of lots of cheap dies.

Naturally, Nvidia may well have designed a completely new SoC just for Nintendo, just as AMD did for Sony and Microsoft. But the latter was still based on existing architectures, and in the case of Nvidia’s, there’s no magic cure for shrinking the die area of Tensor cores (as they’re just a large array of, typically, FP16 ALUs).

Historically, the Nintendo crystal ball rumour mill has generally overestimated what would eventually come to pass, and I have the feeling that the same will occur here. Nintendo doesn’t need to release a 4K Switch Pro model when they’re still selling bucketloads of cheap-as-chips-to-make standard models; what we may just see is an improved docking station with an integrated 4K upscaler and nothing more than that.
I actually agree that 4K may not make it to the Nintendo Switch. I think Nintendo has shown over the years that it isn't interested in catching up in terms of graphics, focusing instead on innovative ways of gaming. And that strategy has served them well when you look at the massive success of both the Wii and the Switch. I can't imagine DLSS on a Switch because the Tegra SoC is unlikely to be anywhere near 4K capable in the first place, with a very limited power budget so as not to kill the battery. If they try to upscale from 720p or lower to 4K, it's going to look like a blurry mess of a game.
 