Some readers criticized our original features...
From how the tech was described, there was no reason at the time to believe it wouldn't get better. It didn't improve in BFV or Metro, but eventually it did, enough to become a "game changer."
Why do the last graphs highlight “performance improvements” instead of giving us the difference between the native resolutions?
Example: FPS at native 1440p compared to FPS at 1440p with DLSS set to 4K compared to FPS at native 4K
What users want to know is how many FPS they are sacrificing when using DLSS.
Techspot must have this info as they keep saying DLSS is a “free” performance upgrade... show us the numbers!
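To make the three-way comparison being requested concrete, here's a minimal sketch with entirely made-up FPS numbers — these are hypothetical values for illustration, not Techspot's benchmark data:

```python
# Hypothetical numbers only -- illustrating the requested comparison,
# not actual benchmark results.
fps_native_1440p = 90   # rendering and displaying at 1440p
fps_dlss_to_4k   = 70   # rendering below 4K, DLSS upscaling to 4K output
fps_native_4k    = 50   # rendering and displaying at 4K

# Cost of DLSS-to-4K versus staying at native 1440p:
cost_vs_1440p = (fps_native_1440p - fps_dlss_to_4k) / fps_native_1440p
# Gain of DLSS-to-4K versus rendering native 4K:
gain_vs_4k = (fps_dlss_to_4k - fps_native_4k) / fps_native_4k

print(f"FPS sacrificed vs native 1440p: {cost_vs_1440p:.0%}")  # 22%
print(f"FPS gained vs native 4K: {gain_vs_4k:.0%}")            # 40%
```

With numbers like these, both framings are true at once: DLSS is a "free" upgrade relative to native 4K, but a real sacrifice relative to staying at native 1440p — which is exactly why the commenter wants all three bars on one graph.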
It's selling well.
Because otherwise you can't sell RTX.
The only AMD chips that are within 5% are the ones that cost twice as much as Intel's chips, and they still lose. If you compare Intel's $350 9700K to AMD's 3800X in games, the 3800X gets ABSOLUTELY SLAUGHTERED by 10-20%. Intel's 6/12 8700K smashes AMD's 6/12 3600/3600X chips, at various resolutions and with various GPUs, just like the 9900K smashes the 8/16 3700X/3800X, from 720p to 1080p and 1440p, usually by 10-25 FPS across the board.
That 5% FPS lead over AMD is the only thing anyone cares about ... right?
RT looks amazing when used properly; the technology will be included in both the new Xbox and PS5.
Right now all RTX does is trade quality away from one part of the game to try to spruce it up in another.
If I understand correctly, DLSS uses the custom Tensor Cores to implement an AI-based image upsampler. If you are not using DLSS, the Tensor Cores go unused for gaming. They can be used for other AI applications, but without DLSS, gamers are paying for additional silicon they don't need.
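For intuition about what "AI-based image upsampler" means, here is a deliberately naive stand-in — a nearest-neighbor upscaler. This is purely illustrative: DLSS replaces this kind of fixed filter with a trained neural network (plus motion vectors in DLSS 2.0), and it is that network inference which runs on the Tensor Cores.

```python
# Illustrative only: a naive nearest-neighbor upscaler. DLSS swaps a
# fixed filter like this for a trained neural network accelerated by
# the Tensor Cores; this sketch just shows the input/output shape of
# the upscaling problem.

def upscale_nearest(image, factor):
    """Upscale a 2D list of pixel values by an integer factor."""
    out = []
    for row in image:
        # Repeat each pixel horizontally...
        wide = [px for px in row for _ in range(factor)]
        # ...then repeat the widened row vertically.
        out.extend([wide] * factor)
    return out

lowres = [[1, 2],
          [3, 4]]
print(upscale_nearest(lowres, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

The commenter's point stands either way: whether the upscaling is naive or learned, the Tensor Core silicon only earns its keep in games when DLSS is actually enabled.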
I dunno ... from what I'm constantly hearing, most gamers who use Intel CPUs turn all their settings down and game at low resolutions anyway, in hopes of bottlenecking their CPU so they can have a 5% advantage over AMD...
I think the point was rather that the original DLSS implementation - which Techspot found severely lacking - would magically improve over time due to the AI learning.
Why do the last graphs highlight “performance improvements” instead of giving us the difference between the native resolutions?
Example: FPS at native 1440p compared to FPS at 1440p with DLSS set to 4K compared to FPS at native 4K
What users want to know is how many FPS they are sacrificing when using DLSS.
Techspot must have this info as they keep saying DLSS is a “free” performance upgrade... show us the numbers!
I meant you are sacrificing FPS when implementing it at the same resolution... example: playing at 1440p vs playing at 1440p with DLSS set to 4K.
How would you be sacrificing FPS when using DLSS? It's supposed to be the opposite.
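Part of the confusion is what "DLSS set to 4K" actually renders. The output is 4K, but the GPU renders fewer pixels internally and DLSS upscales the result — the publicly documented DLSS 2.0 per-axis scale factors are roughly 0.667 (Quality), 0.58 (Balanced), and 0.5 (Performance). A quick sketch of the arithmetic:

```python
# Output resolution is 4K, but the GPU renders at a fraction of it and
# DLSS upscales. Scale factors below are the documented DLSS 2.0
# per-axis presets: Quality ~2/3, Performance 1/2.

def internal_resolution(out_w, out_h, scale):
    """Resolution the GPU actually renders before DLSS upscales."""
    return round(out_w * scale), round(out_h * scale)

# "DLSS set to 4K" in Quality mode renders roughly native-1440p's
# pixel count internally, then outputs 3840x2160.
print(internal_resolution(3840, 2160, 2 / 3))  # (2560, 1440)
print(internal_resolution(3840, 2160, 0.5))    # (1920, 1080)
```

So both commenters are right about different comparisons: versus native 1440p output you lose FPS (you're paying the upscale-to-4K cost), while versus native 4K output you gain FPS (you're rendering far fewer pixels).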
LoL, I actually refused to buy even those; I'm staying with my simple 1070 and waiting to see what AMD delivers with Big Navi.
Honestly, I couldn't care less about DLSS or RTX.
Still quite happy with my good old overclocked GTX 1080 Ti which does the job just fine on my humble 1440p display.
This is not something you're going to appreciate or truly notice by looking at comparative screenshots, as good as they are.
Erm.......
Sorry, but no. Still not worth it. I'm looking at these screenshots fullscreen at 1440p, and I'm seeing Jack and 5h1t for improvements.
By Nvidia's admission, the uptake has been very slow. They wouldn't want to increase sales? Jensen just says, "It's selling well, people, stop all the marketing spend and improvement efforts everyone, it's selling well enough"? Hmmm ... I have my doubts ...
It's selling well.
What, no underlined, oversized font and Steam survey "marketshare" data this time? You're getting soft. At least you got a few all caps.
The only AMD chips that are within 5% are the ones that cost twice as much as Intel's chips, and they still lose. If you compare Intel's $350 9700K to AMD's 3800X in games, the 3800X gets ABSOLUTELY SLAUGHTERED by 10-20%. Intel's 6/12 8700K smashes AMD's 6/12 3600/3600X chips, at various resolutions and with various GPUs, just like the 9900K smashes the 8/16 3700X/3800X, from 720p to 1080p and 1440p, usually by 10-25 FPS across the board.
With DLSS off and 30 fps, yes, it looks pretty decent on RTX ... it'll come around eventually.
RT looks amazing when used properly; the technology will be included in both the new Xbox and PS5.
I have seen DLSS in action, in uncompressed 1440p footage. It looks identical to non-DLSS footage except viewed through dirty glasses. Blur central. The screenshots here back up that footage. DLSS is pointless, like PhysX turned out to be in 99% of the applications that used it, and will likely disappear just as quickly.
This is not something you're going to appreciate or truly notice by looking at comparative screenshots, as good as they are.
This is something you will have to fully experience yourself, and if you did, you would probably come away pretty impressed, just like the reviewer.
I trust Scorpus's judgement; he's a good guy and not easily swayed.
You're right. Literally no one does this, but some people can't understand that. That was my pre-emptive, sarcastic-assed point to that guy in my post above.
This is news to me.
Why are we in the year 2020 still focused on WW-II, or even worse, sci-fi games that somehow still resolve to Nazi-vs-Allies gameplay? Isn't it time, with or without DLSS, to move away from the boring WW-II scenery and into modern and/or futuristic gameplay, without constantly recycling the same old five years of history? Gamers don't want that shitty period anymore. Let's move on.
Or is someone gaining political and financial interest in constantly feeding the young generations with false information and propaganda?
I guess you missed this really big game, it was called Call of Duty... Modern Warfare, I think? Came out a decade ago; it was a really big deal, massively popular, and it kicked off 8 years of nothing but modern military shooters about shooting terrorists or Russians in the Middle East while the WWII genre went MIA entirely, to the point that a Call of Duty or Battlefield set in WWII was a breath of fresh air.
Why are we in the year 2020 still focused on WW-II, or even worse, sci-fi games that somehow still resolve to Nazi-vs-Allies gameplay? Isn't it time, with or without DLSS, to move away from the boring WW-II scenery, and into the modern and/or futuristic gameplay, without constantly recycling the same old 5 years of the history? The gamers don't want that shitty period anymore. Let's move on.
Or is someone gaining political and financial interest in constantly feeding the young generations with false information and propaganda?
It appears they made the correct assumption and Nvidia came to the same conclusion.
There was no way to know until time had passed. They had no idea at the time. No one did. People merely made weak predictions from a sample of two games.
It appears they made the correct assumption and nVidia came to the same conclusion.
DLSS as it was back then is indeed dead.
Keyword: Assumed
They also assumed "DLSS is dead" back in September....
DLSS as it was back then is indeed dead.