Can DLSS Render "Better Than Native" Graphics?

My post is addressed to Tim, the writer of this article, and to the forum as well.
Since your articles about DLSS have generated a lot of controversy, I think you should give us more basic information about what you are testing and how, such as:

1. How many of the games tested are sponsored, paid, or supported by Nvidia in any way, including unofficial, behind-the-scenes Nvidia support?
I want to know whether a game is sponsored, publicly or behind the scenes, by any corporation such as Nvidia, AMD, or Intel. Cyberpunk 2077 is the noisiest example of this: Nvidia is so deeply in bed with the CP2077 devs, without officially admitting it, that CDPR could be considered an Nvidia subsidiary. I want to know whether DLSS looks so "superior" in games precisely because they are "sponsored" or "supported" by Nvidia, as with Cyberpunk 2077 and Portal, etc. (as in Nvidia doing all the work "for free" for those game developers). Because, in reality, customers pay revoltingly higher prices for this "free" DLSS support from Nvidia.

2. What video cards did you use for this test? Why only one Nvidia card and no AMD? Did you find some games running FSR2 better on AMD cards than on Nvidia? Or did you limit yourself to testing FSR and DLSS only on this Nvidia card? Too many games are sponsored by Nvidia, and Nvidia video cards may run FSR worse than their AMD counterparts. I find it odd that you did not mention this or take it into consideration.

3. Citing "4K using Quality mode upscaling, 4K using Performance mode upscaling, and 1440p using Quality mode upscaling":
Can you specify exactly on which Nvidia video cards DLSS2 runs better than native?
Because many Nvidia video cards, like those with 10GB of VRAM or less, have a serious VRAM limitation and cannot run 2K or 4K at high or ultra settings without degraded image quality or stuttering, which is worse. 8GB vs 16GB VRAM: RTX 3070 vs Radeon 6800.
This narrows Nvidia's almost universal DLSS "superiority" claims to the 4080 and 4090, and for some games to the 4070 (Ti), 3080 12GB, and 3090 (Ti).
Hogwarts Legacy clearly showed degraded image quality with less than 10-12GB of VRAM, so even if DLSS claims to be so good, in reality the gameplay experience is poor.
Can you show us, on a 3070, how superior DLSS2 is vs AMD FSR2 vs native at 4K high and ultra settings running Hogwarts Legacy? The Last of Us could be another example worth checking and clarifying. And run the same games on the AMD video cards at the corresponding competing prices.
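For reference, the upscaling modes quoted above correspond to fixed render-scale ratios (as documented for FSR 2: Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x per axis; DLSS 2 uses essentially the same ratios). A minimal Python sketch of the internal resolutions those test configurations imply:

```python
# Internal render resolutions implied by each upscaling mode.
# Per-axis scale ratios as documented for FSR 2; DLSS 2 is essentially the same.
MODES = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually shades before the upscaler runs."""
    ratio = MODES[mode]
    return round(width / ratio), round(height / ratio)

for out_w, out_h in ((3840, 2160), (2560, 1440)):
    for mode in ("Quality", "Performance"):
        w, h = internal_resolution(out_w, out_h, mode)
        print(f"{out_w}x{out_h} {mode}: shades {w}x{h}")

# The article's three configs: 4K Quality -> 2560x1440,
# 4K Performance -> 1920x1080, 1440p Quality -> ~1707x960.
```

So a card that struggles at native 4K is really being asked to shade 1440p (Quality) or 1080p (Performance) internally, which is exactly why the VRAM question above still matters: the render targets shrink, but high-resolution texture pools largely do not.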

4. Can you mention, or at least give us an estimate of, the price cost of Nvidia DLSS2/DLSS3 versus AMD FSR2 (soon FSR3), which is available for both AMD and Nvidia cards, so we have a better and more objective picture of whether it is really worth the price difference? Because when you can benefit from something open, easy, and almost free to implement, like FSR2 (soon FSR3), versus having to pay extra for a proprietary, closed Nvidia DLSS, that matters a great deal.

5. The percentage of Nvidia 1xxx cards on the Steam hardware survey is high.
So, can you show us how superior DLSS2 is on a 1080 (Ti)? Or a 1660 Ti?
And show us how good FSR2 is on the same Nvidia 1xxx cards.
Citing your next statement:
"Over the past few weeks we've been looking at DLSS 2 and FSR 2 to determine which upscaling technology is superior in terms of performance but also in terms of visual quality using a large variety of games. On the latter, the conclusion was that DLSS is almost universally superior when it comes to image quality, extending its lead at lower render resolutions."
Thus, we want to see how "superior" DLSS2 is versus FSR on GTX 1xxx video cards. If it is not superior, it is mandatory that we know.
Can you show us a graph of how many FPS FSR2 brings and how many FPS DLSS2 brings? Or the "superior" quality? Even if DLSS brings 0 fps and zero superior quality, you have to mention it, not disregard it.
So we can form a better-informed opinion of how "DLSS is almost universally superior when it comes to image quality, extending its lead at lower render resolutions."
Because on Nvidia's own GTX 1xxx cards, AMD FSR2 is infinitely more "superior" in ANY game (any positive FSR2 number is more than Nvidia DLSS2's zero).
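To make the arithmetic in that parenthesis concrete, here is a hedged sketch; the fps numbers are hypothetical, purely for illustration, and `None` stands for "the upscaler does not run on this card at all":

```python
# Percentage fps uplift from an upscaler over native rendering.
# All numbers below are hypothetical, for illustration only; None models
# an upscaler that simply does not run on the card (DLSS 2 on GTX 1xxx).
def uplift_pct(native_fps: float, upscaled_fps: float | None) -> float | None:
    """Percent fps gain over native, or None if the upscaler is unavailable."""
    if upscaled_fps is None:
        return None
    return (upscaled_fps - native_fps) / native_fps * 100.0

native = 42.0   # hypothetical native fps on a GTX 1080
fsr2 = 63.0     # hypothetical FSR 2 Quality fps on the same card
dlss2 = None    # DLSS 2 is not available on the GTX 10-series at all

print(f"FSR 2 uplift:  {uplift_pct(native, fsr2):.0f}%")  # 50%
gain = uplift_pct(native, dlss2)
print(f"DLSS 2 uplift: {'n/a (unsupported)' if gain is None else f'{gain:.0f}%'}")
```

That is the whole point: on a GTX 1xxx card, the DLSS column of any comparison chart is not a small bar, it is no bar at all.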

6. Why did you choose to mention only the strong points of DLSS2 in your latest articles, but fail to mention even one of its major weak points?
Do you at least acknowledge the irony that FSR2 works on more Nvidia cards than Nvidia's own DLSS versions, which run only on some Nvidia video cards, mostly because Nvidia intentionally chose a market-fragmentation dark-pattern business model? Why do you choose to keep silent about this, instead of saying it loudly, and every time you mention DLSS superiority, also mention the weak points of that same DLSS?
Otherwise it looks like you limited yourself to writing a PR commercial for Nvidia DLSS2's "superior" quality.

I find these 5-6 issues to be really concerning points about this article, and that's why, especially because of point 6, I find this article to be of low quality, to say the least. Can we call an article consistent when it only shows the strong points of a product or technology but fails to mention the weak points? Is DLSS technology as superior, or as perfect, as Nvidia PR claims? I can tell you that you make it very hard to discern, when tons of articles, including this one, write almost exclusively about Nvidia's superior DLSS strong points, but at the same time are hit by Nvidia amnesia and do NOT acknowledge or mention Nvidia's weak points, which are often evident.

A piece of advice: it is better to present both the strengths and the weaknesses of a product, technology, etc., and do the same for the competing product, and invite the readers to form their own conclusions instead of leading them to your own narrative.
I did not read or hear you say anything about the strong points of FSR2, though you compare it against DLSS2 in this article and the previous one too. I want to know the strong and weak points of both DLSS2 and FSR2. That you failed to mention them for both, and focused only on DLSS2's "strong" points, shows a lack of consistency.
An image is worth a thousand words.
How do you think your praise of DLSS2 "superiority" looks on a graph showing a 0 (ZERO) fps increase from DLSS2 on GTX 1xxx cards versus 50-75% more than native FPS with AMD FSR?

Overall, I find AMD FSR very good, because it is available for all AMD, Intel, and Nvidia video cards, and on the last-gen consoles too. FSR is available even for the old-generation Nvidia cards on which Nvidia DLSS is not available.
And I suggest that Tim acknowledge the following arguments in his articles comparing Nvidia and AMD video cards.
I am citing Haiart, a user from another forum, who very clearly pointed out the weak points of DLSS as deliberately planned obsolescence and as an Nvidia dark-pattern business model:

"It is obvious that NVIDIA could have made DLSS3 work with Ampere and Turing, or DLSS 2 work on GTX 1xxx. NVIDIA didn't make it available because of market segmentation. If you truly believe that NVIDIA couldn't, this also means that AMD is more technically advanced for being able to make it work not only on AMD, but on NVIDIA and Intel hardware too across all their generations."

As I said in a previous post, I'll prepare an interesting post about the "triple standards" of many hardware reviewers, who praise certain strengths of the hardware components they get for free to review (others are paid, while others are blackmailed), but at the same time blatantly disregard the huge issues or limitations those same components have, only to suit their narrative, or better said, the manufacturer's narrative.
The tragedy is that more and more of these "reviewers" are hit by this Intel, AMD, and especially Nvidia amnesia, and forget to use the same skewed "standards" or methodology when reviewing or comparing hardware from the competition.
Thus, unfortunately, many hardware reviewers have gradually become second-class PR tools for the big tech corporations instead of formulating an HONEST, personal, and intelligently argued opinion. And some of them have become so delusional that they claim to like it instead of fighting against it.

P.S. About this DLSS "drama": I decided to take a stand because too much DLSS is shoved into readers' and users' eyes by Nvidia PR and by their most passionate reviewers and users, yet quite often some of their arguments lack consistency or do not apply the same standards or methodology to competing products.
 
This world is nuts, promoting fake resolution and fake frames as better than native.
It's like saying fake boobs are better than natural ones and only need a "repair shop" every 5-10 years.
Well, unfortunately, it's fair to say that natural ones need a "repair shop" too, more like an overhaul every 15-20 years.

So why not, if "natural" resolution is bad from the start, like in Dying Light 2.

But if native is crispier, I'm not buying Huang's fakes, yeah.
 
Good article. The conclusion of “it depends” makes sense, though I’m honestly surprised at the number of games where native rendering is (subjectively) worse than upscaled.

Just to add another data point, I have an old HTC Vive that I use with MS Flight Simulator along with DLSS. While sitting on a runway with the propeller spinning, I can see artifacts in the propeller and the runway that aren’t there with native rendering. Also, I found that the instrument cluster is harder to read, especially as you progress towards the performance side of the DLSS scale. So in this title, the visual quality is clearly worse than native rendering, but I’m willing to live with DLSS because of the framerate benefits.
 
This world is nuts, promoting fake resolution and fake frames as better than native.
It's like saying fake boobs are better than natural ones and only need a "repair shop" every 5-10 years.
Well, there are many examples where graphics are improved.

 
I think both upscaling techs are doing their jobs and are quite good; still, I'd choose native.
What I can't stand, though, is Nvidia's aggressive policy of pushing their own crap to devs, trying to tie them closely to their platform, even when it would be more beneficial to use open solutions or contribute to them; G-Sync instead of the much cheaper FreeSync, or the Linux drivers, are the most obvious examples.
 
You may be surprised to find out that the connoisseurs of the video game industry, like professional players, do not use DLSS3, for example, because the downsides outweigh the advantages. So no, DLSS technology is not the Holy Grail, even if Nvidia pushes hard, through PR and reviewers, to make you think so. In reality, DLSS is a controversial technology which could be good and promising if Nvidia had not sabotaged it with fake promises (it will run in 10,001 games, yet it did not run, or ran badly, for a year) and with market segmentation: DLSS2 runs only on RTX 2xxx and above, DLSS3 runs only on RTX 4xxx cards, DLSS4 will run only on RTX 5xxx-gen cards, DLSS n will run only on RTX (n+1)-gen video cards.
I have first hand experience with DLSS1, 2 and 3.

DLSS 1 sucked. Just like FSR 1. Blurry.

DLSS 2 is amazing in most games. Mostly, much better than FSR 2.

DLSS 3 is great for people who want to ramp up settings and max out RT. AMD has no response.

AMD actually needs FSR more than Nvidia needs DLSS, because AMD's RT performance is really years behind Nvidia's.

AMD also lacks an alternative to DLDSR, a highly underrated RTX feature that will transform many older games.

I don't need DLSS 3 myself, because I don't like RT. I prefer high frame rates. I have used DLSS 2 in tons of games by now, though. In 9 out of 10 games DLSS beats FSR, and in the last one they are similar. FSR beats DLSS in NO games, as Hardware Unboxed showed in their recent 26-game testing video.

AMD is simply WAY BEHIND Nvidia in terms of features and RT. This is why they should FOCUS ON RASTER PERFORMANCE ONLY and DROP PRICES, or AMD will be below 10% dGPU market share soon.
 
This world is nuts, promoting fake resolution and fake frames as better than native.
It's like saying fake boobs are better than natural ones and only need a "repair shop" every 5-10 years.
Also, Nvidia's fake "boobs" look "better" only on new future generations; fake booDLSS has already expired on the old generations :laughing:
 
Also, Nvidia's fake "boobs" look "better" only on new future generations; fake booDLSS has already expired on the old generations :laughing:
DLSS and fake boobs both look better from a distance; once you get real close you see the scars, and only when you touch it can you feel the difference.
 
I can see DLSS2/FSR helping some games visually, but if you have a GPU with enough horsepower, just go native.

DLSS3 and its frame trickery is just a bunch of marketing from Nvidia; ray tracing is just too costly for what it gives, and bragging about fake frames seems stupid. Normal TVs have been doing it for years, and all we did was turn it off and laugh.

As strong as these current GPUs are, people shouldn't be looking for new ones for a while. At this point we should turn our gaze onto the game devs and ask why titles that play fine on the current PS/Xbox need the nuclear option in hardware to run well on PC in the first place. I'll even toss the Switch and its measly amount of power into the argument too.

I think the new GPUs are some awesome hunks of tech, but most of their features look like they'll be used by developers to keep releasing janky games; all these features are just ways for them to be lazy and crank out shambling heaps of software that the GPUs can hopefully iron out.
 
I want to know whether DLSS looks so "superior" in games precisely because they are "sponsored" or "supported" by Nvidia, as with Cyberpunk 2077 and Portal, etc.

Forspoken is one of the games sponsored by AMD, and yet DLSS looks better than FSR in this game, according to the Hardware Unboxed video about DLSS vs FSR.

There is absolutely no evidence that Nvidia sponsorship has any impact on DLSS quality in any of the sponsored games. You can always come up with assumptions/conspiracy theories, but you will never be able to prove them.


2. What video cards did you use for this test? Only Nvidia, or AMD too? Did you find some games running FSR2 better on AMD cards than on Nvidia? Or did you limit yourself to testing FSR and DLSS only on Nvidia cards? Too many games are sponsored by Nvidia, and Nvidia video cards may run FSR worse than their AMD counterparts. I find it odd that you did not mention this or take it into consideration.

FSR2 does not run better on AMD than on Nvidia in GPU-limited cases. This has been proven many times, if you look at the benchmarks.

If you watch the Hardware Unboxed video, you will notice that in some games the 4070 Ti gets a bigger fps gain than the 7900 XT when using FSR2. In The Witcher 3 (at 1440p and 4K) and Forza Horizon, the 4070 Ti has bigger fps gains with FSR than the 7900 XT.
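A plausible mechanism for that, offered as an assumption rather than anything from the video: the upscaling pass costs a roughly fixed number of milliseconds per frame, so the card on which that pass is cheaper shows the bigger relative gain. A simplified Python model, with hypothetical numbers:

```python
# Simplified, assumption-laden model of why two cards can see different fps
# gains from the same upscaler: frame time = shading time (scales with the
# rendered pixel count) plus a roughly fixed per-frame upscaling-pass cost.
# Ignores CPU limits entirely; all numbers are hypothetical.
def upscaled_fps(native_frame_ms: float, scale_ratio: float,
                 pass_cost_ms: float) -> float:
    """fps after shading 1/scale_ratio^2 of the pixels plus the pass cost."""
    return 1000.0 / (native_frame_ms / scale_ratio ** 2 + pass_cost_ms)

NATIVE_MS = 16.7  # hypothetical: both cards hit ~60 fps at native 4K
for card, pass_ms in (("card A", 1.0), ("card B", 2.5)):  # hypothetical costs
    fps = upscaled_fps(NATIVE_MS, 2.0, pass_ms)            # Performance (2x)
    print(f"{card}: 60 -> {fps:.0f} fps ({fps / 60 - 1:.0%} gain)")
# Same native fps, very different uplifts: the cheaper the upscaling pass is
# on a given card, the bigger that card's relative gain.
```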

 
FSR2 does not run better on AMD than on Nvidia in GPU-limited cases. This has been proven many times, if you look at the benchmarks.

If you watch the Hardware Unboxed video, you will notice that in some games the 4070 Ti gets a bigger fps gain than the 7900 XT when using FSR2. In The Witcher 3 (at 1440p and 4K) and Forza Horizon, the 4070 Ti has bigger fps gains with FSR.

Better does not always mean more.
FSR clearly runs better than DLSS in many instances, even if you disregard or dodge the proof. As I said, for example, run FSR and DLSS on GTX 1xxx cards and come back.
Can you acknowledge this?
Also, it is important to clarify whether FSR quality is better on AMD cards than on Nvidia cards, and whether FSR is intentionally crippled on Nvidia cards to make DLSS look better. These kinds of tests were not done.
And again, DLSS superiority in a game endorsed, paid for, or supported by Nvidia is not "superiority" over FSR; it is just Nvidia discriminating in favor of DLSS, which, by the way, is also expensive and not worth the price versus FSR. Nvidia has been caught playing dirty since the old days, and too often.
Game engines which offer proper support to both Nvidia and AMD, like UE5, prove that FSR and ray tracing run at least as well, and sometimes even better, on the new AMD RX 7xxx video cards as on RTX 4xxx cards. Even the RTX 4090 loses in some games against the RX 7900 XTX when RT is enabled.
In older games, massively supported by Nvidia, guess what: DLSS is slightly better. But hey, most Nvidia video cards are bottlenecked by a lack of VRAM, or DLSS does not even run on Nvidia's own GTX 1xxx cards, DLSS3 does not run on RTX 2xxx and 3xxx cards, and so on.
I hope you will come to see how many deprivations and limitations Nvidia has inflicted on its own products, all of which lead to a worse consumer experience and work against consumers' interests and wallets, including yours.
 
Does DLSS remove motion blur, depth of field (the article mentions once that it sometimes breaks it), and other intended blurs?
 
There is absolutely no evidence that Nvidia sponsorship has any impact on DLSS quality in any of the sponsored games. You can always come up with assumptions/conspiracy theories, but you will never be able to prove them.
You haven't played Cyberpunk 2077? You should play it. A very good game, though heavily sponsored and supported by Nvidia, with a special dedicated team making it look better on Nvidia hardware. Support offered "for free" to CDPR by Nvidia, but far more expensive for the customers, who have to buy expensive Nvidia 4xxx cards to use it all. Customers who bought 3xxx or older cards were left behind by Nvidia, even though CP2077 was made and tested by CDPR running on RTX 2080 cards.

I suggest relying on common sense instead of bringing your own conspiracy theory, namely that others come up with conspiracy theories whenever they are not aligned with your opinions but instead hold their own argued opinions. Though this may be too subtle for you to rely on.
On the contrary, I support and encourage checking the facts and doing more tests to clarify the flaws which may skew or bias an objective test.
 
My post is addressed to Tim, the writer of this article, and to the forum as well.
Since your articles about DLSS have generated a lot of controversy, I think you should give us more basic information about what you are testing and how, such as:

That's a whole essay right there, however I wanted to point out that some of your own questions/comments are addressed in the article when necessary -- in others, your questions are beside the point. This article is meant to answer a single question first and foremost: DLSS vs. Native rendering, that's it.

The FSR stuff is mentioned as reference and addressed in the intro as it was already tested in two previous articles, so whatever your issues are with FSR vs. DLSS testing, this is not the article for that. Whatever Nvidia sponsors or not, or AMD does, it's interesting info but ultimately irrelevant here.

People also seem to confuse DLSS 2 (deep learning enhanced rendering tested here, somewhat equivalent to FSR 2 in nature) with DLSS 3, which is a bad name because it's something else entirely, also tested a while ago by TechSpot, where "fake frames" are introduced. This article has nothing to do with DLSS 3.
 
You haven't played Cyberpunk 2077? You should play it. A very good game, though heavily sponsored and supported by Nvidia, with a special dedicated team making it look better on Nvidia hardware. Support offered "for free" to CDPR by Nvidia, but far more expensive for the customers, who have to buy expensive Nvidia 4xxx cards to use it all. Customers who bought 3xxx or older cards were left behind by Nvidia, even though CP2077 was made and tested by CDPR running on RTX 2080 cards.

1- You did not prove anything. Where is your "proof" that DLSS in Cyberpunk was made to look better on Nvidia?? ZERO proof.
All Nvidia did was add extra effects like path tracing, etc. That is not evidence that "DLSS" was deliberately made to look better on Nvidia GPUs than FSR or whatever.

2- Forspoken is sponsored by AMD, yet DLSS looks better according to Hardware Unboxed. Explain that without coming up with a conspiracy theory.

The Last of Us is not sponsored by Nvidia, and DLSS looks better. Explain that??

Even if we ignore Cyberpunk, there are games that are not sponsored by Nvidia where DLSS still looks better than FSR.

I suggest relying on common sense instead of bringing your own conspiracy theory, namely that others come up with conspiracy theories whenever they are not aligned with your opinions but instead hold their own argued opinions. Though this may be too subtle for you to rely on.
On the contrary, I support and encourage checking the facts and doing more tests to clarify the flaws which may skew or bias an objective test.

What "conspiracy theory " did I say ?! LOL

Everything you said so far has ZERO proof. Everything you said based on assumptions.

There is no reason come up with conspiracy theory that nvidia is cheating in sponsored title, when FSR can't even match DLSS in non-nvidia sponsored titles.

GTX 1xxx does not able to run DLSS is not relevant to anything I said.... We all know that DLSS does not run on GTX 1xxx. What is your point here ?? We all know that DLSS runs on RTX GPU only and point of these comparison to see if DLSS has advantages over FSR (and it does)
 
That's a whole essay right there, however I wanted to point out that some of your own questions/comments are addressed in the article when necessary -- in others, your questions are beside the point. This article is meant to answer a single question first and foremost: DLSS vs. Native rendering, that's it.

The FSR stuff is mentioned as reference and addressed in the intro as it was already tested in two previous articles, so whatever your issues are with FSR vs. DLSS testing, this is not the article for that. Whatever Nvidia sponsors or not, or AMD does, it's interesting info but ultimately irrelevant here.

People also seem to confuse DLSS 2 (deep learning enhanced rendering tested here, somewhat equivalent to FSR 2 in nature) with DLSS 3, which is a bad name because it's something else entirely, also tested a while ago by TechSpot, where "fake frames" are introduced. This article has nothing to do with DLSS 3.
Some people may confuse DLSS2 with DLSS3, but that is rather Nvidia's wrongdoing; they could easily have named them more clearly. Even so, in my posts I do not confuse DLSS2 with DLSS3; on the contrary, when I mention them together, I do so to show an Nvidia dark-pattern business model.

Whatever Nvidia sponsors or not, or AMD does, it's interesting info but ultimately irrelevant here.
Why do you consider it irrelevant? Can you bring some valid arguments instead of an assertion? I think many of us may find it of the utmost relevance, as in proving objectivity. It is better to check whether most of the games which show a "superior" DLSS2 are sponsored or supported (even behind the scenes) by Nvidia compared to those which are not. If you choose to disregard this and not check it, it will be hard to claim afterwards that you are objective. Better to check it and let the facts speak.
And I can easily prove to you that it is relevant. Say a new game is in development or released, and the devs announce that it is not sponsored or supported by Nvidia, or that they support FSR 2 and 3, or that they are using UE5.
The UE5 game engine offers very good support for AMD cards too, including RT (not only for Nvidia).
So this is relevant, because customers will buy a competing video card, an AMD one for example, which offers better performance per price, or simply a better price, than Nvidia.
In this example and many others, I do not need an expensive Nvidia card to play this game, or those games equally supported by UE5.
Or, on the contrary, if I find out that the game or games I am interested in are better supported by Nvidia, that would make it easier for me to make the right choice and buy the new Nvidia card at the right price.
Though here is where Nvidia opened Pandora's box: planned obsolescence, market fragmentation, and crippling their products to the detriment of the user experience, just to squeeze more money out of users.
And that's why I pointed out the DLSS2 and DLSS3 weaknesses, and also the Nvidia VRAM weakness, when I proposed two Nvidia "conjectures":

1. The DLSS version number will increase with every "new" generation of Nvidia video cards, and the new features will "work" only on that latest generation.
Spoiler warning: DLSS4 will come, and its newly added features will "work" only on the next-gen Nvidia 5xxx (or 6xxx) video cards.

2. Nvidia will give proper VRAM and performance only to their top, most expensive video cards; the rest will not have enough VRAM, or enough performance, or both, to play games released barely two years later without bottlenecks (stutters) or reduced image and texture quality.

And believe me, I will be glad if Nvidia contradicts these "conjectures", because it would mean customers get better products instead of intentionally crippled ones.
 
You haven't played Cyberpunk 2077? You should play it. A very good game, though…

You have written a lot of words. I'm guessing you own a Radeon?

My opinion as a gamer of some 40 years:
I play most of my games with DLSS enabled if it's available. The only game where I don't is Tarkov, because I consider it a tiny bit less sharp to my eyes, and it's important to see every pixel, as it might be all you see before you receive a headshot. That's about it. It's interesting to see these comparisons. I appreciate the time and effort put into the article.
 
You have written a lot of words. I'm guessing you own a Radeon?

My opinion as a gamer of some 40 years:
I play most of my games with DLSS enabled if it's available. The only game where I don't is Tarkov, because I consider it a tiny bit less sharp to my eyes, and it's important to see every pixel, as it might be all you see before you receive a headshot. That's about it. It's interesting to see these comparisons. I appreciate the time and effort put into the article.
I do not mind responding.
I use both Radeon and Nvidia video cards.
Nowadays I have an RX 7900 XTX in my desktop and an Nvidia 3080 in my laptop. I have played many games on both systems to form an informed and intelligently argued opinion.
Also, I have owned (bought) and played on all the recent AMD generations and on the Nvidia 1xxx, 2xxx, and 3xxx generations. I decided that the 4xxx is not worth the money. Too many empty words and unfulfilled promises from Nvidia. And the promises which should have worked from the start only worked after a year, and only on Nvidia's next generation.
I can say that I enjoyed the experience on both, and did not see much of a difference between the Nvidia and AMD gameplay experience. I found that where one of the manufacturers, especially Nvidia, claims there is a big difference in gameplay experience, it is not so big, and definitely not worth the money.
I can also say that I had some issues with both Nvidia and AMD video cards.
As in, both Nvidia and AMD tried, and in small part managed, to screw me.
So I gathered most of my conclusions from experience, paying quite a lot for it.

I propose we change the narrative: better, share your experience with me, if you like.
What video cards are you playing on nowadays, or did you play on in the past? What is your experience with them?
From this point of view, I prefer to play games above 120 fps, ideally 144-165 fps at native resolution; if above 120 fps is not possible, then with FSR2 or DLSS2 enabled; and last comes RT.
For example, I played and tested CP2077 with RT enabled on both the 3080 and the 7900 XTX. Even though the fps is higher on the 7900 XTX, and it is playable on both video cards, the fps stays under 100, and I do not like that. I can easily spot the difference (ghosting and something like blur) when playing games under 60-70 fps.
Nowadays I am testing Returnal with RT and DLSS2 enabled, and also with FSR2 and RT enabled on the RX 7900 XTX, and comparing them. So far I am quite pleased by both implementations, but I need to do more tests. I found Returnal to be a game very well ported to PC.
With the RX 7900 XTX, FSR2, and RT enabled, the game runs at over 100 fps, 120 fps on average.

Did you know that even in Tarkov there are a lot of cheaters, like 30%? And Tarkov is harder than other FPS games, because you lose everything when you are taken down.

P.S. About this DLSS "drama": I decided to take a stand because too much DLSS is shoved into readers' and users' eyes by Nvidia PR and their most passionate users, yet quite often some of their arguments lack consistency or do not apply the same standards to competing products.
 
People also seem to confuse DLSS 2 (deep learning enhanced rendering tested here, somewhat equivalent to FSR 2 in nature) with DLSS 3, which is a bad name because it's something else entirely, also tested a while ago by TechSpot, where "fake frames" are introduced. This article has nothing to do with DLSS 3.

Very confusing. Why on earth would they name it DLSS 3 if it's an entirely different rendering technique from DLSS 2??
 
I remember people complaining about vsync input lag and making a huge deal about it. These upscalers are absolutely horrendous on input lag, and what's worse, the lag varies, rather than being the consistent vsync input lag. Also, that input lag dropped away at higher refresh rates regardless, whereas with DLSS/FSR you're inherently working with lower frame rates and adding extra processing.

I pretty much never see it mentioned, and I sometimes genuinely wonder if people even really play games. You see people talk about RTX and DLSS and other features, but key info is always missing, and it seems more like an excuse to argue with others and use tech as a mid-life-crisis **** measuring contest. "I own this product, look at me" rather than "hey, this game is fun, did you find this thing and how did you enjoy it?" like it used to be. There's no way someone would miss the response-time issues on modern hardware if they were as sensitive as they claim to be.

Also, what happened to FreeSync/G-Sync? That is an absolute game changer for stuttering and inputs anyway. When these tests are run, is it even on? And if so, which one? There are several different standards in play, each of them different, but the reviewers won't give you that info, and it's obviously going to have an effect. It took decades for the community to even admit frame-time spikes existed; when Nvidia and AMD were using multi-GPU solutions, we had the same denials over the microstutter. It was a terrible solution, but it was so expensive that people used it for that online tech social proof. The same thing is going on now with these dumb Nvidia techs.
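The frame-time arithmetic behind that lag complaint is easy to sketch. A hedged back-of-the-envelope in Python (my own simplification: it treats input lag as roughly one frame time, and models interpolated frame generation as holding back one real frame, which interpolation must do in principle; real pipelines add further delays that only measurement can capture):

```python
# Back-of-the-envelope latency arithmetic; a simplification, not measured data.
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on one frame at a given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 120, 165):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# Upscaling (DLSS 2 / FSR 2) raises the real frame rate, so per-frame latency
# falls. Interpolated frame generation doubles the *displayed* fps, but input
# is still sampled at the real rate, and roughly one real frame is buffered
# so the interpolated frame can be shown between two real ones:
real_fps = 60.0
print(f"~{frame_time_ms(real_fps):.0f} ms extra latency from buffering "
      f"one real frame at {real_fps:.0f} fps")
```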
 