Can DLSS Render "Better Than Native" Graphics?

This article is completely off track. None of the games tested have true 4K textures, so you are comparing how low-quality, upscaled textures look with DLSS. The test only shows that adding blur to an upscaled texture brings some improvement. Can you please test some games with real, verified 4K textures? Do not mix up native resolution and texture resolution. Look at the screenshots: most textures are not even 1080p. Some games have HD texture packs, like Far Cry 6, and even then I'd guess the textures are still not 4K.
 
Having upgraded to a high-end rig and a 4K monitor, I've been using upscaling techniques a lot recently. Indeed, even with the latest GPUs, I doubt 4K gaming would be viable without them. By and large they work very well in quality mode, with the odd visual hiccup. They provide excellent antialiasing and often hide traditional native issues such as TAA shimmering. Anything below quality mode is a noticeable downgrade from native, at least at high res. However, these modes are still incredibly useful on the likes of the Steam Deck, for example. I'd favour a hardware-agnostic implementation for obvious reasons; I hope it's only a matter of time before those catch up to Nvidia's implementation.

DLSS 3 frame generation, for me, is only beneficial when already pushing at least 50 fps; otherwise the added lag is definitely perceivable. I've tried Cyberpunk's new path tracing with DLSS 3 at 4K. It looks nice, but I'm not sure I'd be too keen to play it that way personally.

RT and path tracing are what I'm most sceptical about. In many cases they highlight just how well artists manage with traditional techniques, and the cost is just so high.
 
Did you know that even in Tarkov there are a lot of cheaters, something like 30%? And Tarkov is harder than other FPS games, because you lose everything when you are taken down.
Absolutely. Loads of cheaters. Really irritating.
 
I think what this article is really telling us is which games are beginning to approach a true native 4K implementation, with all the textures and other artwork having sufficient 4K detail to back it up, and which are merely offering a "4K resolution" marketing label that is mostly upscaling of individual assets mastered at lower resolutions.

It's not hard to appreciate how in the latter category DLSS might be a superior upscaling tech to whatever the game engine was doing before, which in some cases might be pretty minimal effort.

I don't think there's any danger of DLSS improving on a true 4K implementation, which I don't think is common yet and may not exist at all yet. Few devices can render it with sufficient performance, and it is more expensive to create as well.
 
I was expecting to see a bunch of assessments in the table I disagreed with, but... nope. I have played a whole bunch of these titles for a while and pretty much agree with what I have tried personally.

Particularly Death Stranding being way better with DLSS, which is absolutely true; it looks super clean. Also Forza Horizon 5 being quite a bit worse, which I also found to be spot on. As soon as I enabled it when it was added to that title, I disliked the softened image quality it introduced. A few other games also come out well; old stuff like Shadow of the Tomb Raider looks nice using DLSS.

Cyberpunk 2077 works great with DLSS 2 until you turn on the frame generation introduced in DLSS 3, and then it doesn't. There is way too much ghosting, probably because of all the high-contrast scenes. It makes the game more playable when path traced, and yes, it looks superb when slow or static, but I would rather just wait for more GPU performance for path tracing than have messy frame gen enabled.
 
I scrutinized them all and I can say that native is better; it looks more natural too.

Anybody who thinks otherwise must be someone who paid good money for a gimmick but is afraid to admit the blunder.

Oh, and the Nvidia shills too.
 
Forspoken is one of the games sponsored by AMD... and yet DLSS looks better than FSR in this game, according to Hardware Unboxed's DLSS vs FSR video.

There is absolutely no evidence that Nvidia sponsorship has any impact on DLSS quality in any of the sponsored games. You can always come up with assumptions and conspiracy theories, but you will never be able to prove them.

FSR 2 does not run better on AMD than on Nvidia in GPU-limited cases. This has been shown many times, if you look at the benchmarks.

If you watch the Hardware Unboxed video, you will notice that in some games the 4070 Ti has a bigger FPS gain than the 7900 XT when using FSR 2. In The Witcher 3 (at 1440p and 4K) and Forza Horizon, for example, the 4070 Ti gets bigger FPS gains with FSR than the 7900 XT.

A bigger FPS gain, but only because of worse quality.
FSR 2 is not superior to DLSS 2.
However, FSR 2 can be decent in some games. Overall though, DLSS 2 easily wins.
DLSS 1 and FSR 1 both sucked... and AMD has no answer for DLSS 3.

Stuff like DLSS 2, DLDSR, DLAA etc. is the true magic of RTX for me - Not "Ray Tracing" .. I don't care about RT at all.

DLSS 2 with the correct .dll is very impressive. Some games have a bad implementation, which is easily fixed with a better .dll.

FSR has never really amazed me. I will always pick DLSS over FSR if a game offers both; in no game has FSR looked better than DLSS to me.

DLDSR can transform older games (4K -> 1080p/1440p without the performance hit of native 4K) and does not require game support; you simply choose the virtual resolution and the Nvidia drivers and Tensor cores handle the rest. DLSS 2 can even be used on top. For example, I used DLDSR 4K -> 1440p in Elden Ring on my 3080, and the game looked way better while running a locked 60 fps (the FPS limit without modding -- I like the online features).
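
To put rough numbers on the DLDSR + DLSS combo described above, here is a minimal sketch of the resolution arithmetic. The 2.25x DLDSR factor and the per-mode DLSS scale factors below are just the commonly quoted values, so treat them as assumptions rather than official figures:

```python
# Rough sketch: virtual (DLDSR) resolution and internal (DLSS) render resolution
# when stacking DLSS on top of DLDSR. Scale factors are the commonly quoted ones,
# not values taken from Nvidia documentation.

NATIVE = (2560, 1440)                  # physical monitor resolution
DLDSR_FACTOR = 2.25                    # DLDSR 2.25x (total pixel-count multiplier)
DLSS_SCALE = {"Quality":     1 / 1.5,  # assumed per-axis render scale per DLSS mode
              "Balanced":    1 / 1.72,
              "Performance": 1 / 2.0}

# DLDSR multiplies the pixel count, so each axis scales by sqrt(factor)
axis = DLDSR_FACTOR ** 0.5
virtual = (round(NATIVE[0] * axis), round(NATIVE[1] * axis))
print(f"DLDSR {DLDSR_FACTOR}x virtual resolution: {virtual[0]}x{virtual[1]}")

for mode, s in DLSS_SCALE.items():
    internal = (round(virtual[0] * s), round(virtual[1] * s))
    print(f"  DLSS {mode}: renders {internal[0]}x{internal[1]}, reconstructs to "
          f"{virtual[0]}x{virtual[1]}, then downsamples to {NATIVE[0]}x{NATIVE[1]}")
```

Run those numbers and DLDSR 2.25x plus DLSS Quality at 1440p ends up rendering internally at roughly native 1440p, which is why the combination mostly costs you the reconstruction overhead rather than raw pixel count.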

I do like FidelityFX CAS (sharpening), but you can do the same on Nvidia cards if you want (in several ways, with and without GeForce Experience).

Generally, Nvidia wins easily in terms of features, and this has been true since the launch of the RTX 2000 series, meaning five years at this point... half a decade...
 
With Lossless Scaling (€4 on Steam) or Magpie (free on GitHub) you can have FSR in EVERY game...
 
This article is completely off track. None of the games tested have true 4K textures, so you are comparing how low-quality, upscaled textures look with DLSS. The test only shows that adding blur to an upscaled texture brings some improvement. Can you please test some games with real, verified 4K textures? Do not mix up native resolution and texture resolution. Look at the screenshots: most textures are not even 1080p. Some games have HD texture packs, like Far Cry 6, and even then I'd guess the textures are still not 4K.
1080p is a term specific to monitors, not textures. Games have been using 4K textures (and higher) for years, but there are some good reasons why games don't use 4096x4096 textures everywhere. Small meshes won't display the benefit of using multiple high-resolution maps unless the frame resolution is very high too, and since almost all of the games listed in this article were developed for consoles first, one can't expect a PC port to have every material asset reconfigured for a native 4K setup (as it's a relatively small portion of the PC userbase).

Not everything needs to be using 4K+ textures, either -- environment materials don't require such textures, simply because such meshes rarely fill the entire frame (unless one purposely goes out to make this happen), whereas weapons in a typical FPS game do. However, with appropriate modeling and lighting, they can still look really good with 2K or lower textures, even with a 4K frame resolution.

Endless, unnecessarily high-resolution textures cause problems with streaming and cache thrashing, and even the use of DirectStorage or the like won't help stave off these problems. The GPUs in consoles don't have the level of cache that the latest top-end desktop GPUs do, so games need to be appropriately optimized to account for this.
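
To illustrate the "small meshes won't display the benefit" point with a back-of-the-envelope calculation, here is a quick sketch. The screen-coverage percentages are invented examples, not measurements from any of the tested games:

```python
# Rough texel-density check: how many texels of a material can map 1:1 to screen
# pixels, given how much of a 4K frame the mesh covers. Coverage values are
# invented examples for illustration only.

FRAME_W, FRAME_H = 3840, 2160
FRAME_PIXELS = FRAME_W * FRAME_H

def visible_texel_budget(coverage_fraction):
    """Approximate square texture size beyond which extra texels can't be resolved."""
    covered_pixels = FRAME_PIXELS * coverage_fraction
    return covered_pixels ** 0.5           # assume a roughly square on-screen footprint

for name, coverage in [("first-person weapon", 0.25),
                       ("barrel prop",         0.02),
                       ("distant rock",        0.002)]:
    side = visible_texel_budget(coverage)
    print(f"{name}: ~{coverage:.1%} of the frame -> at most ~{side:.0f}x{side:.0f} texels on screen")
```

Even at a 4K output, only the largest on-screen assets come anywhere near exhausting a 4096x4096 map, which lines up with the point about weapons justifying high-resolution textures while most environment materials don't.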
 
1080p is a term specific to monitors, not textures. Games have been using 4K textures (and higher) for years, but there are some good reasons why games don't use 4096x4096 textures everywhere. Small meshes won't display the benefit of using multiple high-resolution maps unless the frame resolution is very high too, and since almost all of the games listed in this article were developed for consoles first, one can't expect a PC port to have every material asset reconfigured for a native 4K setup (as it's a relatively small portion of the PC userbase).

Not everything needs to be using 4K+ textures, either -- environment materials don't require such textures, simply because such meshes rarely fill the entire frame (unless one purposely goes out to make this happen), whereas weapons in a typical FPS game do. However, with appropriate modeling and lighting, they can still look really good with 2K or lower textures, even with a 4K frame resolution.

Endless, unnecessarily high-resolution textures cause problems with streaming and cache thrashing, and even the use of DirectStorage or the like won't help stave off these problems. The GPUs in consoles don't have the level of cache that the latest top-end desktop GPUs do, so games need to be appropriately optimized to account for this.
In the Nexus Mods community there is a modder, Halk Hogan, who made two amazing HD Reworked Project mods designed to improve the graphics quality of The Witcher 3 and CP2077: The Witcher 3 HD Reworked Project NextGen Edition and Cyberpunk 2077 HD Reworked Project.
He did it so successfully that CDPR hired him to include his work in the latest Witcher 3 Complete Edition, and perhaps in future CP2077 patches too. Both games look amazing, much better, with his improved textures, and he even offers the option to choose the appropriate mod textures for 2K or 4K in CP2077 (Cyberpunk 2077 HD Reworked Project).
I am not a technical expert in this field; are his projects good examples here?
 
1080p is a term specific to monitors, not textures. Games have been using 4K textures (and higher) for years, but there are some good reasons why games don't use 4096x4096 textures everywhere. Small meshes won't display the benefit of using multiple high-resolution maps unless the frame resolution is very high too, and since almost all of the games listed in this article were developed for consoles first, one can't expect a PC port to have every material asset reconfigured for a native 4K setup (as it's a relatively small portion of the PC userbase).

Not everything needs to be using 4K+ textures, either -- environment materials don't require such textures, simply because such meshes rarely fill the entire frame (unless one purposely goes out to make this happen), whereas weapons in a typical FPS game do. However, with appropriate modeling and lighting, they can still look really good with 2K or lower textures, even with a 4K frame resolution.

Endless, unnecessarily high-resolution textures cause problems with streaming and cache thrashing, and even the use of DirectStorage or the like won't help stave off these problems. The GPUs in consoles don't have the level of cache that the latest top-end desktop GPUs do, so games need to be appropriately optimized to account for this.
Look at the screenshots. I said that (most) textures in every screenshot are still not rendered at my screen resolution (1080p); they are upscaled and ugly. Most textures use 2x2 to 4x4 pixel blocks of my 1080p resolution, so they cannot be textures developed for 4K. Most textures are made for the distance at which they are meant to be viewed, and they look ugly up close. Check a game with an HD texture pack and you will see the difference, and even then not all textures are rendered for 4K.
 
In the Nexus Mods community there is a modder, Halk Hogan, who made two amazing HD Reworked Project mods designed to improve the graphics quality of The Witcher 3 and CP2077: The Witcher 3 HD Reworked Project NextGen Edition and Cyberpunk 2077 HD Reworked Project.
He did it so successfully that CDPR hired him to include his work in the latest Witcher 3 Complete Edition, and perhaps in future CP2077 patches too. Both games look amazing, much better, with his improved textures, and he even offers the option to choose the appropriate mod textures for 2K or 4K in CP2077 (Cyberpunk 2077 HD Reworked Project).
I am not a technical expert in this field; are his projects good examples here?
HD texture packs work well in older games primarily because the number and complexity of the meshes are relatively low, even in the likes of The Witcher 3. That particular title used quite low-resolution textures because the primary development platforms were the PS4 and Xbox One -- both of which only permitted around 4 to 5 GB of RAM to be used by a game, not to mention that the GPUs in them were limited by bandwidth. Had CD Projekt had more time or resources, it may well have created finer-looking assets for the PC version, but it was already running behind schedule. The same is also true of CP2077: that game's development was hindered by trying to make it fit as many platforms as possible, in a very tight timeframe.

Look at the screenshots. I said that (most) textures in every screenshot are still not rendered at my screen resolution (1080p); they are upscaled and ugly. Most textures use 2x2 to 4x4 pixel blocks of my 1080p resolution, so they cannot be textures developed for 4K.
The videos and screenshots in the article are typically zoomed in and/or displayed (or captured) post YouTube video compression, so they're not going to look the same as if one is sitting in front of an actual 4K monitor. Filling a webpage with lots of high-res, non-compressed images doesn't help loading speeds much!

I've uploaded some screenshots from The Last of Us to Google Drive -- click here. These were taken at 4K with all the graphics settings put to maximum, bar the texture qualities. There are two images, at native resolution, using Low and Ultra textures, then two more with DLSS Performance mode, for the same two texture settings. You can see for yourself how DLSS affects the impact high-resolution textures have on visual fidelity.

Most textures are made for the distance at which they are meant to be viewed, and they look ugly up close. Check a game with an HD texture pack and you will see the difference, and even then not all textures are rendered for 4K.
I'm not suggesting 4K textures aren't worth using -- I was pointing out that they are in use in today's games, but not everywhere because of the performance impact they have in consoles.
 
I, personally, have found that as long as a game has a decently sharp and detailed picture, I would rather enjoy the game and not become obsessed with minutiae. I only ever play a WW2 flight sim, which looks good enough for me, so those who spend heaps of time in their games may have a different opinion.
 
DLSS works great as AA in many games while improving framerate on top.

And yes, it can improve graphics too; https://www.rockpapershotgun.com/outriders-dlss-performance

Depends on the implementation and how good it is.

No, it cannot improve graphics beyond native rendering. Native is native: it is how the game was meant to be seen, how it was made. You cannot go beyond native as you're suggesting. Not possible, just factually. It'd be like repainting the Mona Lisa using AI and saying the result is better than the original! Extras that are displayed are exactly that: extra. It does allow underpowered machines to experience/output/display something closer to native when they cannot achieve it alone. That's why DLSS exists; it's the premise for its existence: to achieve native-like output when it cannot otherwise be achieved without a low frame rate, etc.
Rock Paper Shotgun are total baloney. They are so ridiculous it's unreal. Personal opinion cited as knowledge. I've only ever managed to find one person who posts articles there with half a brain. All the others chat total nonsense on a regular basis. Good luck to you if that's your source for any information!
I am, however, enthused to see some others here haven't fallen for the madness that is "DLSS better than native" LMAO. Nice1 peeps
 
No, it cannot improve graphics beyond native rendering. Native is native: it is how the game was meant to be seen, how it was made. You cannot go beyond native as you're suggesting. Not possible, just factually. It'd be like repainting the Mona Lisa using AI and saying the result is better than the original! Extras that are displayed are exactly that: extra. It does allow underpowered machines to experience/output/display something closer to native when they cannot achieve it alone. That's why DLSS exists; it's the premise for its existence: to achieve native-like output when it cannot otherwise be achieved without a low frame rate, etc.
Rock Paper Shotgun are total baloney. They are so ridiculous it's unreal. Personal opinion cited as knowledge. I've only ever managed to find one person who posts articles there with half a brain. All the others chat total nonsense on a regular basis. Good luck to you if that's your source for any information!
I am, however, enthused to see some others here haven't fallen for the madness that is "DLSS better than native" LMAO. Nice1 peeps

this ^

And let's not forget, he uses DLSS because it saves on energy...
 
I've only been saying DLSS can exceed native for ~18+ months now, and despite numerous reputable tech sites backing that up, and now this piece of content, you get replies like:

"I scrutinized them all and I can say that native is better; it looks more natural too." Some people see what they want to see.

It was also proven by this very site that upscaling (frame generation excepted) improves input latency rather than worsening it, provided there's an uptick in FPS, but you get replies like:

"These upscalers are absolutely horrendous on input lag"

Tim acquires the relevant hardware, sets out a clear testing methodology, tests over 100 configurations, painstakingly analyses them objectively, organises the results into a table and video/written content, and presents his findings, and then internet randoms reply:

"lol nope, upscaling bad"

Given the way this site and its forums swing, of course those takes get likes; hardly surprising, such is an echo chamber. Thank goodness these people only make up a very small yet vocal minority, and most don't arbitrarily choose what to like and dislike based on personal whims, and instead judge based on image quality and performance. As usual, the proof is in the pudding.
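
On the input-latency point specifically, the mechanism is simple frame-time arithmetic: a higher frame rate means shorter frame times, which shortens the rendering portion of the input-to-photon chain. A minimal sketch, with the FPS figures and the assumed pipeline depth chosen purely for illustration:

```python
# Minimal sketch of why higher FPS from upscaling reduces input latency.
# The FPS numbers and the assumed pipeline depth are illustrative, not measurements.

def render_latency_ms(fps, frames_in_flight=2):
    """Very rough render-side latency: frame time multiplied by pipeline depth."""
    frame_time_ms = 1000.0 / fps
    return frame_time_ms * frames_in_flight

native_fps = 45      # hypothetical native 4K frame rate
upscaled_fps = 70    # hypothetical frame rate with DLSS/FSR Quality

print(f"Native:   {render_latency_ms(native_fps):.1f} ms of render latency")
print(f"Upscaled: {render_latency_ms(upscaled_fps):.1f} ms of render latency")

# Frame generation is the exception: it inserts interpolated frames, so the
# displayed FPS rises while the underlying input-to-render latency does not improve.
```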
 
No, it cannot improve graphics beyond native rendering. Native is native: it is how the game was meant to be seen, how it was made. You cannot go beyond native as you're suggesting. Not possible, just factually. It'd be like repainting the Mona Lisa using AI and saying the result is better than the original! Extras that are displayed are exactly that: extra. It does allow underpowered machines to experience/output/display something closer to native when they cannot achieve it alone. That's why DLSS exists; it's the premise for its existence: to achieve native-like output when it cannot otherwise be achieved without a low frame rate, etc.
Rock Paper Shotgun are total baloney. They are so ridiculous it's unreal. Personal opinion cited as knowledge. I've only ever managed to find one person who posts articles there with half a brain. All the others chat total nonsense on a regular basis. Good luck to you if that's your source for any information!
I am, however, enthused to see some others here haven't fallen for the madness that is "DLSS better than native" LMAO. Nice1 peeps
Who cares, look at the link, you can't argue with screenshots that show facts.

It shows improvement. Improvement = Better. It's that simple. No-one cares if a game runs native or not, when image quality is better.

Filters like sharpening can easily improve on native image quality. So can mods.
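
For context on the sharpening point: post-process sharpeners are conceptually just an unsharp mask, i.e. adding back some of the difference between the image and a blurred copy of it. A toy sketch below, as a generic illustration rather than AMD's actual CAS or Nvidia's sharpening filter:

```python
# Toy unsharp-mask sharpen -- a generic illustration of post-process sharpening,
# not the real CAS or Nvidia sharpening algorithms.
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, radius=1.0, amount=0.5):
    """Sharpen an HxWx3 uint8 image by re-adding high-frequency detail."""
    img = image.astype(np.float32)
    blurred = gaussian_filter(img, sigma=(radius, radius, 0))  # blur spatially only
    sharpened = img + amount * (img - blurred)
    return np.clip(sharpened, 0, 255).astype(np.uint8)

# Usage on a captured frame (hypothetical variable name):
# frame = unsharp_mask(frame, radius=1.0, amount=0.5)
```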

That's not the point of DLSS or FSR though. The point is to make games PLAYABLE at settings that are UNPLAYABLE without them, and it works great in most games. It can also be used to breathe life into an otherwise outdated GPU.

You are simply in denial. There's a reason why many gamers think of DLSS as black magic. Only RTX owners, though (wonder why).

DLSS, DLDSR and DLAA are the true magic of RTX cards. Not "Ray Tracing", though that works too, unlike AMD's.

DLSS 1 was crap, just like FSR 1.
DLSS 2 and FSR 2 are way better, though DLSS 2 wins overall. In no game does FSR 2 beat DLSS 2; they are similar at best. DLSS 2 beats FSR 2 in 21 out of 26 games, with a draw in the rest, per Hardware Unboxed.

I always find it funny when (mostly) AMD people ramble on about Nvidia features being bad, while AMD is in full panic mode trying to match those features.

The only other people who hate RTX features are GTX owners, trying to tell themselves that their 900 or 1000 series GPUs are still somewhat decent after seven or nine years :joy:
 
This world is nuts, promoting fake resolution and fake frames as better than native.
It's like saying fake boobs are better than natural ones and only need a "repair shop" visit every 5-10 years.

If you think 10 years won’t impact natural boobs, there is some disappointment waiting in your future. 🤫
 
I've only been saying DLSS can exceed native for ~18+ months now, and despite numerous reputable tech sites backing that up, and now this piece of content, you get replies like:

"I scrutinized them all and I can say that native is better; it looks more natural too." Some people see what they want to see.

It was also proven by this very site that upscaling (frame generation excepted) improves input latency rather than worsening it, provided there's an uptick in FPS, but you get replies like:

"These upscalers are absolutely horrendous on input lag"

Tim acquires the relevant hardware, sets out a clear testing methodology, tests over 100 configurations, painstakingly analyses them objectively, organises the results into a table and video/written content, and presents his findings, and then internet randoms reply:

"lol nope, upscaling bad"

Given the way this site and its forums swing, of course those takes get likes; hardly surprising, such is an echo chamber. Thank goodness these people only make up a very small yet vocal minority, and most don't arbitrarily choose what to like and dislike based on personal whims, and instead judge based on image quality and performance. As usual, the proof is in the pudding.

NO...
You have been trying to insinuate that, but you keep falling flat. It is a shame that our last conversation got tapped.

Upscaling to 4K is not the same as upscaling to 1440p... or 1080p.

All of your arguments stem from that^ one fact... that upscaling works better the higher up you go in resolution, and that you need a $1600 card and an $800 4K monitor to make it better, so that you can see the "better" upscaled results... (Who is going to spend $2k on a new GPU and 4K monitor and look forward to using upscaling?)
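
That fact is easy to quantify: the internal resolution an upscaler's Quality mode starts from scales with the output resolution, so a 4K reconstruction has far more real pixels to work with than a 1080p one. A quick sketch, using the commonly quoted ~0.67 per-axis Quality scale factor as an assumption:

```python
# Internal render resolutions that DLSS 2 / FSR 2 Quality mode starts from at
# different output resolutions. The 2/3 per-axis scale is the commonly quoted
# Quality-mode factor; treat it as an assumption, not vendor documentation.

QUALITY_SCALE = 2 / 3

for name, (w, h) in [("1080p", (1920, 1080)),
                     ("1440p", (2560, 1440)),
                     ("4K",    (3840, 2160))]:
    rw, rh = round(w * QUALITY_SCALE), round(h * QUALITY_SCALE)
    print(f"{name} output: Quality mode renders {rw}x{rh} "
          f"({rw * rh / 1e6:.1f} MP of real pixels to reconstruct from)")
```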

You do not seem willing to admit this, and you avoid what grounds your facts, instead trying to use your opinion to fill in as fact.

Here is a FACT:
Upscaling is not meant for new GPUs to use; it is meant for OLD, outdated GPUs (i.e., to help low-end GPUs).

The most likely use-case scenario is the millions of people gaming on low-end (by today's standards) cards such as the GTX 1080 Ti... who have been gaming at a high level (~120 fps) at 1080p for the last seven years, but who now want one of those new low-latency 27" QLED 1440p 166Hz monitors... and don't want to have to buy a new GPU!

So they buy that awesome monitor and use FSR upscaling in all their games until, over the next few years of not being pressured, they choose to buy or save up for the right dGPU to push their new 1440p monster.

If you've been gaming at 1080p with an older card, you now have the freedom to go ahead and buy ANY new monitor and continue to play at the same frame rates as you did with your old monitor and resolution.

All those GTX and Vega owners are NOT going to buy an RTX 4070... and use it to upscale their 1440p games, when they can upscale with their old cards!

They want to buy a card that can naturally push their brand new monitor without using any gimmicks.
 
What troubles me is that FSR doesn't need specific hardware to run, yet manages to compete with DLSS. And also the fact that people use it at 1080p, which Steven W. from HUB now calls "low resolution" in CPU tests.
But you can't take a 0.5-megapixel image, stretch it to a 2-megapixel screen, and expect it to have the same detail as a native 2 MP image, not even counting artifacts or other visual glitches.
It's like taking a picture with a phone and expecting it to fit a six-story building banner and still look good.
Oh I know, master Jensen told us "turn on DLSS and bam, it just works". RTX on, MFkers.
 
I'm 47 this year and might know a thing or two about it. Married with children and all.
All depends on initial size, because gravity is constant.
Pretty much like GPU sagging :)
Respect, man; your jokes and experience are pure gold for this forum.
 