DLSS 2.0 in Death Stranding will support 4K 60 FPS, even on an RTX 2060

Shawn Knight

In brief: When Kojima Productions’ PC port of Death Stranding arrives later this month, it’ll do so with support for Nvidia’s DLSS 2.0 technology in tow and, according to Nvidia, it should be quite the treat. PC gamers will also get to experience Death Stranding’s new photo mode and enjoy support for ultrawide monitors.

Tom’s Hardware recently got its hands on an early preview build and put it through its paces. With DLSS performance mode enabled, the team was able to hit 77 FPS at 4K on even the slowest of the supported cards, an RTX 2060. On the high-end RTX 2080 Ti, it saw a 34 percent improvement in frame rates at 4K (from 78 FPS up to 105 FPS) in DLSS quality mode.
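As a sanity check on the quoted uplift, here is the arithmetic from the figures above (78 FPS native versus 105 FPS with DLSS on the RTX 2080 Ti):

```python
# Frame-rate uplift from the RTX 2080 Ti numbers quoted above:
# 78 FPS at native 4K vs. 105 FPS with DLSS 2.0 in quality mode.
native_fps = 78
dlss_fps = 105
uplift = (dlss_fps - native_fps) / native_fps
print(f"{uplift:.1%}")  # 34.6%, consistent with the ~34 percent figure
```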

Of course, you can also run the game natively if your hardware is up to the task or doesn't support DLSS 2.0. A GTX 1060 or AMD Radeon RX 590 paired with a semi-modern CPU is all that's needed for 1080p at 60 FPS, according to Nvidia.

Must read: Nvidia DLSS in 2020: This Could Be a Game-Changer

Nvidia’s revised DLSS (short for Deep Learning Super Sampling) 2.0 was released in April 2020. If you aren’t fully up to speed on the tech, I’d highly recommend checking out Tim’s primer on the subject that we published earlier this year.

Death Stranding was originally scheduled for a June 2 launch on the PC but was delayed by six weeks due to complications from having to work from home during the Covid-19 pandemic.

Death Stranding is now slated to land on Windows PCs on July 14, 2020.


 
Hopefully the image quality isn't far off native 4K
9 out of 10 gamers wouldn’t be able to tell the difference in a blind test. DLSS was a bit pants when it initially launched but it’s dramatically improved now. I routinely use it and often have to go into the settings to see if it’s on or not. It’s still coming on in leaps and bounds too.

In fact it’s so good that as soon as all of Nvidia’s cards get tensor cores (although not all DLSS implementations require them), AMD’s equivalent parts will need more raw power just to stay even remotely competitive. I’m worried that DLSS will be so effective that courts will rule it anti-competitive.
 
The headline leaves out the fact that 4K 60 FPS only applies if you run DLSS on the low-quality (performance) preset. I also couldn't find anywhere in the article where they say what graphics quality settings the game is being run at.

They didn't even bother to put the info in graphs. That said, 31% at 4K is less than what DLSS 1.0 gained in BFV, which had a 40% advantage.

I guess the title "Game that runs on potato machines also runs on 2060 with low quality DLSS at 4K 60 FPS" isn't clickbait enough though.

In fact it’s so good that as soon as all of Nvidia’s cards get tensor cores (although not all DLSS implementations require them), AMD’s equivalent parts will need more raw power just to stay even remotely competitive. I’m worried that DLSS will be so effective that courts will rule it anti-competitive.

Yeah, no. Both Nvidia's Freestyle and AMD's CAS (as stated in the article) are just as good quality-wise as DLSS 2.0. They have some shimmering artifacts, but then again it's not a final build. Those techs don't require a per-game implementation either. It's been how many years and we have what? A handful of DLSS titles. That's what they call a gimmick, not even 0.00001% of the game market. Freestyle and CAS don't require ridiculous tensor and RT cores taking up die space, and they use a fraction of the resources to do their job. Just because the tensor/RT cores aren't being used for anything else doesn't mean resources aren't being consumed, and they still take up die space.

DLSS is like taking a square block and shoving it through a round hole. I would be much more pleased if Nvidia actually used those RT cores for something that can't be achieved via a sharpening filter.
 
Yes, DLSS really helps players to gain interest in the delivery services. I think Amazon should own this title as their official company game
 
The headline leaves out the fact that 4K 60 FPS only applies if you run DLSS on the low-quality (performance) preset. I also couldn't find anywhere in the article where they say what graphics quality settings the game is being run at.

They didn't even bother to put the info in graphs. That said, 31% at 4K is less than what DLSS 1.0 gained in BFV, which had a 40% advantage.

I guess the title "Game that runs on potato machines also runs on 2060 with low quality DLSS at 4K 60 FPS" isn't clickbait enough though.



Yeah, no. Both Nvidia's Freestyle and AMD's CAS (as stated in the article) are just as good quality-wise as DLSS 2.0. They have some shimmering artifacts, but then again it's not a final build. Those techs don't require a per-game implementation either. It's been how many years and we have what? A handful of DLSS titles. That's what they call a gimmick, not even 0.00001% of the game market. Freestyle and CAS don't require ridiculous tensor and RT cores taking up die space, and they use a fraction of the resources to do their job. Just because the tensor/RT cores aren't being used for anything else doesn't mean resources aren't being consumed, and they still take up die space.

DLSS is like taking a square block and shoving it through a round hole.

You can't seriously be comparing DLSS to Freestyle sharpening or AMD CAS; they have nothing in common. But I guess it's TechSpot's own fault, since they released a couple of articles claiming the same.
 
Wish we would get the Metal Gear series on PC instead of this.

I wonder what quality preset they are using, low?
 
You can't seriously be comparing DLSS to Freestyle sharpening or AMD CAS; they have nothing in common. But I guess it's TechSpot's own fault, since they released a couple of articles claiming the same.

Comparing output image quality. The technologies are certainly different, but the end goal is the same: increase performance by reducing the rendering resolution with little loss in visual quality.

If you can point out to me why comparing the two on that basis is wrong, feel free.
 
Sounds like DLSS 2.0 will roll out on day one in supported games; that is extremely good news!

Though I never liked any of Kojima's games, MGS series included... they are just too weird.
 
Sounds like DLSS 2.0 will roll out on day one in supported games; that is extremely good news!

Though I never liked any of Kojima's games, MGS series included... they are just too weird.

Yes, much better than DLSS 1.0.

Yeah, Death Stranding was definitely targeted at a certain kind of player. I personally enjoyed the game, but I wouldn't say it was amazing. I feel the same way about The Outer Worlds: happy there is more of that content being made by well-known game devs doing their own thing outside big-corp influence, but I do hope they step up their next games.
 
Comparing output image quality. The technologies are certainly different, but the end goal is the same: increase performance by reducing the rendering resolution with little loss in visual quality.

If you can point out to me why comparing the two on that basis is wrong, feel free.
DLSS 1.0 is comparable, maybe. But not 2.0. It’s still very new, but in April 2020 DLSS dramatically improved; I now believe most gamers wouldn’t tell the difference between native and DLSS 2.0. The performance is also apparently improving. Looking at how DLSS works with deep learning, it will continue to get better over time.

It also seems to be getting quite wide support among developers. It’s going to be a big advantage for upcoming Nvidia cards, in particular the budget and midrange parts, as it will allow them to punch well above their weight.
 
DLSS 1.0 is comparable, maybe. But not 2.0. It’s still very new, but in April 2020 DLSS dramatically improved; I now believe most gamers wouldn’t tell the difference between native and DLSS 2.0. The performance is also apparently improving. Looking at how DLSS works with deep learning, it will continue to get better over time.

DLSS tries to guess what the image "should look like". I don't care whether that is close to the original or not, because I don't like it when "what I'm supposed to see" is changed into "what the AI wants me to see".

It's essentially the same as an AI taking all the text on these forums and trying to make "better" versions of it.
 
DLSS 1.0 is comparable, maybe. But not 2.0. It’s still very new, but in April 2020 DLSS dramatically improved; I now believe most gamers wouldn’t tell the difference between native and DLSS 2.0. The performance is also apparently improving. Looking at how DLSS works with deep learning, it will continue to get better over time.

It also seems to be getting quite wide support among developers. It’s going to be a big advantage for upcoming Nvidia cards, in particular the budget and midrange parts, as it will allow them to punch well above their weight.

Look through Krizby's post history; he posted some links comparing DLSS 2.0 to FreeStyle to CAS to DLSS 2.0 + FreeStyle.

I went over those images at maximum magnification and at regular magnification to get both a granular and an overall view. DLSS 2.0 + FreeStyle was the worst (due to over-sharpening artifacts; do not combine technologies). Image quality wise, DLSS 2.0, FreeStyle, and CAS were on par with each other.

I wouldn't mind TechSpot doing a deep dive though, as that was just a small sample. Mind you, native 4K was still better than them all. DLSS 2.0 still has a loss of granular detail, while CAS and FreeStyle show shimmering in rare cases.
 
DLSS tries to guess what image "should look like". I don't care if that is close to original or not because I don't really like when "what I'm supposed to see" is changed into "what AI wants me to see".

This is exactly what JPEG, HEIF, MP3, AAC, MPEG, H.264 and H.265 do. Do you watch your videos, listen to music, and view your pictures in lossless formats? This is not an argument in favor of DLSS, just to show that the concept behind DLSS is already used in other media.
 
This is exactly what JPEG, HEIF, MP3, AAC, MPEG, H.264 and H.265 do. Do you watch your videos, listen to music, and view your pictures in lossless formats? This is not an argument in favor of DLSS, just to show that the concept behind DLSS is already used in other media.

Those compression formats are what they say they are: lossy. Lossy compression usually removes information, but it doesn't try to create content that an AI thinks should be there. That is what DLSS does. Very big difference.
 
TechSpot/Hardware Unboxed released a few articles and videos slamming DLSS when it initially released, but then didn’t really follow up with any coverage of the successful DLSS 2.0, so I’m not surprised that commenters here aren’t aware.

But watch this video from Digital Foundry; they show how in some cases DLSS can look better than native resolution in motion. They also note that to see any difference you have to zoom in. There are videos by several other tech YouTubers saying pretty much the same thing.

 
TechSpot/Hardware Unboxed released a few articles and videos slamming DLSS when it initially released, but then didn’t really follow up with any coverage of the successful DLSS 2.0, so I’m not surprised that commenters here aren’t aware.

But watch this video from Digital Foundry; they show how in some cases DLSS can look better than native resolution in motion. They also note that to see any difference you have to zoom in. There are videos by several other tech YouTubers saying pretty much the same thing.

Yeah, really. DLSS makes the picture different from what it should be and then it "looks better" :joy:

This pretty much sums up how much they understand about this. DLSS absolutely cannot make the image better, as it changes pixels (edit: perhaps "content" is a better word, as AA techniques also change pixels but not content) from the original. And the original is the best possible.
 
TechSpot/Hardware Unboxed released a few articles and videos slamming DLSS when it initially released, but then didn’t really follow up with any coverage of the successful DLSS 2.0, so I’m not surprised that commenters here aren’t aware.
 

I have seen that, and fair play to them. I’m not saying that this outlet is unaware, or even that it’s biased. DLSS certainly wasn’t very good initially. But if you search “dlss 2.0 hardware unboxed” the top five videos carry thumbnails reading:

DLSS in 2020 Success or Fail? (Released prior to the 2020 DLSS 2.0 update)

Is DLSS amazing .. more just basic upscaling?

DLSS but better (a video about Radeon image sharpening)

DLSS = Fail

Nvidia Scores Own goal - DLSS is defeated.

All these videos were released prior to DLSS 2.0. I’m just saying, I can understand why consumers of this outlet think poorly of DLSS!

It’s a tech that’s had a rough start, but writing it off that early, in its first iteration, was a bit unfair. Any new tech has a teething process. It’s certainly no fail anymore, and users who bought into the hardware originally are now receiving the benefit. I’d call it a fail if the people who bought into the RTX 20xx series had missed out. However, I think the benefit will be most felt when we get some budget parts with tensor cores. They will soon trickle their way down, potentially to something like a 3050. With 4x scaling available, a card just needs to do 1080p-level rendering to achieve a 4K DLSS output.
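The 4x scaling figure above is just pixel arithmetic; here is a quick sketch, assuming the standard 1080p and 4K UHD resolutions:

```python
# DLSS performance mode upscales 2x per axis, i.e. 4x by pixel count.
# Rendering internally at 1080p and outputting at 4K UHD:
internal_pixels = 1920 * 1080   # pixels actually shaded per frame
output_pixels = 3840 * 2160     # pixels presented on a 4K display
print(output_pixels / internal_pixels)  # 4.0 -> the "4x scaling" above
```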
 
I’m still undecided about this technology. I tried it with Shadow of the Tomb Raider for a while, but I didn’t like the image quality. Performance was great on a 2070 Super, but the images were a little washed out. I ended up turning it off.
I should try another game before passing judgment on DLSS.
 