Nvidia DLSS in 2020: This Could Be a Game-Changer

Some readers criticized our original features...

From how the tech was described, there was no reason at the time to believe it wouldn't improve. It didn't improve in BFV or Metro, but it eventually did, enough to become a "game changer."

I think the point was rather that the original DLSS implementation - which Techspot found severely lacking - would magically improve over time due to the AI learning.

That was not the case; instead it was a "fine wine" moment for nVidia, who realized their original implementation was not up to par, so they redid it. Kudos to them for doing that.

Still, it appears to me that Techspot were quite right with their original assessment.
 
Why do the last graphs highlight “performance improvements” instead of giving us the difference between the native resolutions?

Example: FPS at native 1440p vs. FPS at 1440p with DLSS upscaling to 4K vs. FPS at native 4K

What users want to know is how many FPS they are sacrificing when using DLSS.

Techspot must have this info as they keep saying DLSS is a “free” performance upgrade... show us the numbers!

You're not sacrificing FPS, my man. You are gaining FPS..."for free."
 
Because otherwise you can't sell RTX,
It's selling well.

that 5% FPS lead over AMD is the only thing anyone cares about ... right?
The only AMD chips that are within 5% are the ones that cost twice as much as Intel's chips, and they still lose. If you compare Intel's $350 9700K to AMD's 3800X in games, the 3800X gets ABSOLUTELY SLAUGHTERED by 10-20%. Intel's 6/12 8700K smashes AMD's 6/12 3600/3600X chips at various resolutions and with various GPUs, just like how the 9900K smashes the 8/16 3700X/3800X, from 720p to 1080p and 1440p, usually by 10-25 FPS across the board.

Right now all RTX does is trade quality from somewhere in the game, to try to spruce it up in another area.
RT looks amazing when used properly, and the technology will be included in both the new Xbox and the PS5.
 
If I understand correctly, DLSS uses the custom Tensor Cores to implement an AI-based image up-sampler. If you are not using DLSS, the tensor cores go unused for gaming. They can be used for other AI applications, but without DLSS, gamers are paying for additional silicon they don't need.
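Conceptually it's a neural network doing super-resolution on each frame. A toy sketch of that general idea (not Nvidia's actual network or API, and missing the motion-vector input DLSS 2.0 uses) would look something like this:

```python
# Toy illustration of an AI image up-sampler (NOT Nvidia's DLSS network,
# which also consumes motion vectors and runs on tensor cores via the driver).
import torch
import torch.nn as nn

class ToyUpscaler(nn.Module):
    """ESPCN-style super-resolution: a few convolutions plus a sub-pixel shuffle."""
    def __init__(self, scale: int = 2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv2d(64, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
            nn.PixelShuffle(scale),  # rearranges channels into a higher-resolution image
        )

    def forward(self, low_res_frame: torch.Tensor) -> torch.Tensor:
        return self.body(low_res_frame)

# Untrained network, purely to show the shapes: a 720p frame goes in, a 1440p frame comes out.
frame_720p = torch.rand(1, 3, 720, 1280)
frame_1440p = ToyUpscaler(scale=2)(frame_720p)
print(frame_1440p.shape)  # torch.Size([1, 3, 1440, 2560])
```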

What about integrated graphics?
 
I think the point was rather that the original DLSS implementation - which Techspot found severely lacking - would magically improve over time due to the AI learning.

There was no way to know until time had passed. They had no idea at the time. No one did. People merely made weak predictions from a sample of two games.
 
Why do the last graphs highlight “performance improvements” instead of giving us the difference between the native resolutions?

Example: FPS at native 1440p vs. FPS at 1440p with DLSS upscaling to 4K vs. FPS at native 4K

What users want to know is how many FPS they are sacrificing when using DLSS.

Techspot must have this info as they keep saying DLSS is a “free” performance upgrade... show us the numbers!

How would you be sacrificing FPS when using DLSS? It's supposed to be the opposite.
 
This better be the reason that Cyberpunk was delayed. It's the only way I'm going to be able to run it at 2K with a 2080. 144 Hz is out of the question, but maybe 90-100 Hz if I'm lucky.
 
How would you be sacrificing FPS when using DLSS? It's supposed to be the opposite.
I meant you are sacrificing FPS when enabling it from the same render resolution... example: playing at native 1440p vs. playing at 1440p with DLSS upscaling to 4K...

Techspot says the FPS will be virtually the same... show us!
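Back-of-the-envelope, here's roughly where I'd expect those numbers to land (the internal render resolution is the commonly cited figure for DLSS Quality mode at 4K; the upscaling overhead is a guess, which is exactly why the measured numbers should be shown):

```python
# Rough pixel-count arithmetic for "native 1440p vs DLSS-to-4K vs native 4K".
native_1440p_pixels = 2560 * 1440   # 3,686,400 pixels shaded per frame
native_4k_pixels    = 3840 * 2160   # 8,294,400 pixels shaded per frame
dlss_4k_internal    = 2560 * 1440   # DLSS Quality at 4K renders ~1440p, then upscales

print(f"native 4K shades {native_4k_pixels / native_1440p_pixels:.2f}x as many pixels as 1440p")   # 2.25x
print(f"DLSS-to-4K shades {dlss_4k_internal / native_1440p_pixels:.2f}x as many pixels as 1440p")  # 1.00x
# So FPS with DLSS-to-4K should sit a bit below native 1440p (the upscale pass isn't free)
# and well above native 4K -- which is the gap the graphs ought to be showing.
```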
 
Honestly I couldn't care less about DLSS or RTX.

Still quite happy with my good old overclocked GTX 1080 Ti which does the job just fine on my humble 1440p display.
LoL, I actually refused to buy even those; I'm staying with my simple 1070 and waiting to see what AMD delivers with Big Navi.

N(fail) failed to run 4K native as promised, then went back to rendering at 1440p and rescaling it to 1800p. Again, not really 4K.
That's, in my view, still a huge fail for those cards. One of my friends bought two of them and was expecting all 4K games to run smooth and fantastic.
But lol, noooo, it did not. Now N(fail) starts again, selling a new product which actually lets you play a 1440p game, with fake and flawed upscaling.

Again, it's selling products which do not run as promised, and using these graphics tricks does not make it better. Especially not at those ***** prices they charge for the cards. My mate was pretty angry about spending a small car's price on those two cards when the performance still does not make it worth the money spent.
 
Erm.......
Sorry, but no. Still not worth it. I'm looking at these screenshots fullscreen at 1440p, and I'm seeing Jack and 5h1t for improvements. The shadows are maybe slightly different, or the edges ever so slightly smoother, the type of improvement that is only noticeable if you take a screenshot in 4K and zoom in 400% while standing perfectly still. I frankly can't see ANY real difference between the different screenshots.

Wait, no, I did find something consistent: the DLSS images look consistently blurrier than the rest. The posters and tags with small illegible black writing become eraser smears.

And the gaming industry expects me to sacrifice half my FPS for this? This is like bragging that your 16x AF is TOTALLY a massive game changer over 8x AF. If you need perfect vision, an 8K monitor, zero movement, and 15 minutes to see any difference, that is the definition of a useless gimmick. Keeping the FPS consistently higher while pushing higher resolution and ultra details with lower latency per frame is worlds more noticeable.
 
Erm.......
Sorry, but no. Still not worth it. I'm looking at these screenshots fullscreen at 1440p, and I'm seeing Jack and 5h1t for improvements.
This is not something you're going to appreciate or truly notice by looking at comparative screenshots, as good as they are.
This is something you will have to fully experience yourself, and if you did, you would probably come away pretty impressed, just like the reviewer.
I trust Scorpus's judgement, he's a good guy and not easily swayed.
 
Why are we, in the year 2020, still focused on WWII, or even worse, sci-fi games that somehow still resolve to Nazis-vs-Allies gameplay? Isn't it time, with or without DLSS, to move away from the boring WWII scenery and into modern and/or futuristic gameplay, without constantly recycling the same five years of history? Gamers don't want that shitty period anymore. Let's move on.

Or does someone have a political and financial interest in constantly feeding the young generations false information and propaganda?
 
It's selling well.
By NVidia's own admission, uptake has been very slow. They wouldn't want to increase sales? Jensen just says, "It's selling well, people; stop all the marketing spend and improvement efforts, everyone, it's selling well enough"? Hmm... I have my doubts...

The only AMD chips that are within 5% are the ones that cost twice as much as Intel's chips, and they still lose. If you compare Intel's $350 9700K to AMD's 3800X in games, the 3800X gets ABSOLUTELY SLAUGHTERED by 10-20%. Intel's 6/12 8700K smashes AMD's 6/12 3600/3600X chips at various resolutions and with various GPUs, just like how the 9900K smashes the 8/16 3700X/3800X, from 720p to 1080p and 1440p, usually by 10-25 FPS across the board.
What, no underlined, oversized font and Steam survey "market share" data this time? You're getting soft. At least you got a few all caps in.

RT looks amazing when used properly, and the technology will be included in both the new Xbox and the PS5.
with DLSS off and 30 fps, yes it looks pretty decent on RTX ... it'll come around eventually.
 
This is not something you're going to appreciate or truly notice by looking at comparative screenshots, as good as they are.
This is something you will have to fully experience yourself, and if you did, you would probably come away pretty impressed, just like the reviewer.
I trust Scorpus's judgement, he's a good guy and not easily swayed.
I have seen DLSS in action, in uncompressed 1440p footage. It looks identical to non-DLSS footage, except as if viewed through dirty glasses. Blur central. The screenshots here back up that footage. DLSS is pointless, like PhysX turned out to be in 99% of the applications that used it, and will likely disappear just as quickly.
 
Why are we, in the year 2020, still focused on WWII, or even worse, sci-fi games that somehow still resolve to Nazis-vs-Allies gameplay? Isn't it time, with or without DLSS, to move away from the boring WWII scenery and into modern and/or futuristic gameplay, without constantly recycling the same five years of history? Gamers don't want that shitty period anymore. Let's move on.

Or does someone have a political and financial interest in constantly feeding the young generations false information and propaganda?

Off topic ... but poignant ... I'd say you might be correct ...
 
Why are we, in the year 2020, still focused on WWII, or even worse, sci-fi games that somehow still resolve to Nazis-vs-Allies gameplay? Isn't it time, with or without DLSS, to move away from the boring WWII scenery and into modern and/or futuristic gameplay, without constantly recycling the same five years of history? Gamers don't want that shitty period anymore. Let's move on.

Or does someone have a political and financial interest in constantly feeding the young generations false information and propaganda?
I guess you missed this really big game, it was called Call of Duty... Modern Warfare, I think? Came out a decade ago; it was a really big deal, massively popular, and it kicked off 8 years of nothing but modern military shooters about shooting terrorists or Russians in the Middle East while the WWII genre went MIA entirely, to the point that a Call of Duty or Battlefield set in WWII was a breath of fresh air.

Of course, for a non-cheeky answer, it's because everyone hates Nazis. Nazis can be gunned down with no consequence. But with any other time period or topic, you are bound to anger some sect of Twitter or ResetEra and send them REEEing to the likes of Kotaku and Reddit. Game developers are terrified of pissing off people with no lives outside of calling things racist/sexist/phobic/antisemitic/Nazi/bigot/etc., for fear of losing $10 from the all-important unemployed Twitter sperg market, and thus they target the lowest-common-denominator market that nobody will get upset at, so they don't have to grow a spine.

Or it could be that, despite your protests, gamers really do love that time period, as well as paying hundreds of dollars in microtransactions and DLC and pre-order bonuses and lootboxes and unimaginative slop. It's either WWII shooters, modern shooters set in a sandbox, or casual gacha games. Everything else doesn't make all the money in the world, and thus most developers won't ever bat an eye at it. Consumers cuddle so deep into the nostalgia blanket they are discovering new species of alien crab, and developers milk that rose-tinted vision for every penny it's worth, while genuine ideas and niche markets get ignored entirely, like Sony passing up the genuinely interesting Popeye short and instead going for The Emoji Movie.

Doesn't make a lick of sense to me, but then most of this industry doesn't anymore. The quality of games has slid so far down over the last 20 years, it's astonishing.
 
So, how many games are out there that use this amazing mumbo-jumbo technology? I don't care for the re-hashed Wolfenstein (it's getting painfully tedious to even read about their "new" games). So who else?

Essentially, in X number of years there will be "more" games that will use this technology. Or maybe not. As the article finally mentions: "It will be a genuine selling point for RTX GPUs moving forward, especially if they can get DLSS in a significant number of games". And if not?

Vaporware articles like this (which read like an nVidia PR department's brain fart) are not helpful. And this is from a person who has bought more cards from nVidia than from other companies!
 
There was no way to know until time had passed. They had no idea at the time. No one did. People merely made weak predictions from a sample of two games.
It appears they made the correct assumption and nVidia came to the same conclusion.
 
Too many comments are still trying to bash DLSS despite Techspot's proof that it has majorly improved (sometimes it's even better than native resolution, as shown in the screenshots).

The same people keep calling it "gimmick tech" or "blurry tech" all the time. A guy even attacked the author of the article, calling it a "vaporware article"...

The proof is here: DLSS in 2020 has improved and looks stunning. It's obvious what kind of people are continuously bashing DLSS all the time; their behaviour is getting predictable. Take your AMD fanaticism away from here!!!
 
Keyword: Assumed
They also assumed "DLSS is dead" back in September....
DLSS as it was back then is indeed dead.
While this is IMHO a bit of a "cheat" technique (nVidia have long experience with this), version 2.0 appears to have turned out well.

And to be honest, if I can get "sorta" 4K quality at 1440p frame rates, why not?
 
DLSS as it was back then is indeed dead.

That's a reach.
As I said back then, give it time to fail or not fail, rather than attack it on day one so you can have a chance to say "I told you so" a year later.

No one knew.
 