Nvidia says resolution upscaling like DLSS (and not native resolution) is the future

If tensor cores and fake RT weren't being pushed on the market, GPUs would either be 30-40% stronger or correspondingly cheaper.

They are not the same; you'd only fail to notice if there's something wrong with your vision. Native is better, but sometimes a bad TAA implementation gives a blurred appearance, and that's why there's this illusion that upscaling is better. The second point is that games have lost so much in optimization that, even if I ignore the rendering problems, one thing cancels out the other.

The technology works exactly as I said: it fakes transition frames between the real rendered frames. Do you have any valid arguments other than telling me to read Nvidia's texts?
So, Hardware Unboxed/TechSpot lied about DLSS looking the same as or better than native?
Millions of people who use DLSS are lying?
All the AI image upscaler/generator companies are lying?
Adobe is lying?
Microsoft is lying?
Only you are telling the truth.

I'm not gonna continue. You just seem to be anti-innovation.
Games being unoptimized is not Nvidia's fault. They are made for AMD-based consoles and then ported to Nvidia PCs. 85% of PC users have an Nvidia GPU, and 94% of laptop GPU users have an Nvidia GPU.

Of course the games are gonna run badly at first. But as we've all seen, 1-2 months later, with patches, they run just fine.
 
Upscaling might be the future, but I'm pretty sure nVidia's market-segmented, multi-flavour proprietary DLSS isn't it. Whilst nVidia has a dominant market share in the PC space, it is pretty much non-existent in the console and mobile phone spaces. FSR might not be as good as DLSS (the jury is out, as it seems to be game- and implementation-specific), but it is open and can be used across many GPU architectures, so it might be prudent of nVidia to be careful what they wish for...
 
Frame generation is the big no-no for me: it's hard to trust that it'll get it right, and the potential for smearing and artifacts is higher than I'd like. I'd rather have quality over performance for the games I play, which don't rely on high frame rates to be enjoyable.

Upscaling becoming more prominent is a reasonable call. It's essentially no different from the interpolation technology that's been used in rasterisation for decades to reduce demand on hardware at the required resolution (remember VESA before 3D hardware locked us to only a few popular resolutions?). I'm not involved with modern game development but as long as the UI can be rendered at native resolution while the 'game world' gets the upscaling treatment, I doubt I'd even notice a mild boost if I needed to try and nudge my FPS upwards. It's the modern version of "can it run Crysis?" and there will always be those that can't play as the creators intended and need to make some sacrifices.
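For what it's worth, here's a minimal sketch of that idea, assuming a trivial compositor: numpy and nearest-neighbour scaling stand in for a real DLSS/FSR pass, and all buffer names and sizes are made up for illustration.

```python
import numpy as np

def upscale_nearest(img: np.ndarray, scale: int) -> np.ndarray:
    """Nearest-neighbour upscale, standing in for a real DLSS/FSR pass."""
    return img.repeat(scale, axis=0).repeat(scale, axis=1)

def compose_frame(scene_lowres: np.ndarray, ui_native: np.ndarray,
                  ui_alpha: np.ndarray, scale: int) -> np.ndarray:
    """Upscale the 3D 'game world', then blend the native-resolution UI on top."""
    scene_up = upscale_nearest(scene_lowres, scale)        # e.g. 1080p -> 2160p when scale=2
    # The HUD/UI stays crisp because it was rasterised at the output resolution.
    return ui_alpha[..., None] * ui_native + (1.0 - ui_alpha[..., None]) * scene_up

# Illustrative sizes (scaled down so the example runs instantly):
# internal render at 480x270, output and UI layer at 960x540.
scene = np.random.rand(270, 480, 3).astype(np.float32)    # low-res world render
ui    = np.zeros((540, 960, 3), dtype=np.float32)         # native-res UI layer
alpha = np.zeros((540, 960), dtype=np.float32)
ui[500:530, 20:300] = 1.0                                  # a fake white HUD bar
alpha[500:530, 20:300] = 1.0                               # opaque where the HUD is drawn
frame = compose_frame(scene, ui, alpha, scale=2)
print(frame.shape)                                         # (540, 960, 3)
```

Real engines do this on the GPU with proper filtering, of course; the point is just that text and HUD elements never go through the upscaler.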

However, I agree that such tech shouldn't be used as a crutch by lazy developers to churn out unoptimised junk. At least have the decency to try to make your games playable on high/ultra at 1080p on mainstream hardware, and leave the 4K@240Hz moonshots to those with DLSS/FSR or bottomless pockets. Spoil the high end, don't punish the low end.
 
Upscaling might be the future, but I'm pretty sure nVidia's market-segmented, multi-flavour proprietary DLSS isn't it. Whilst nVidia has a dominant market share in the PC space, it is pretty much non-existent in the console and mobile phone spaces. FSR might not be as good as DLSS (the jury is out, as it seems to be game- and implementation-specific), but it is open and can be used across many GPU architectures, so it might be prudent of nVidia to be careful what they wish for...
The Switch sold more than the Xbox Series X/S and PS5 combined.
The Switch 2 is coming with DLSS, RR, and DLAA next year.
So, no worries.
AMD did not take the AI route.
So, Nvidia will be the only one providing the innovation.
 
The Switch sold more than the Xbox Series X/S and PS5 combined.
The Switch 2 is coming with DLSS, RR, and DLAA next year.
So, no worries.
AMD did not take the AI route.
So, Nvidia will be the only one providing the innovation.
What is the use of RR on a Nintendo Switch?
DLSS is going to be super useful, though.
 
Frame gen sucks because it hinders the very things it would help with most: latency and bringing games up to 60 fps. If it could fill in those missing frames without making the game feel crappy and without screen tearing, it would be a game changer.

They always make cool stuff, but they either miss the real target problems or overshoot so far that the people who can afford it don't really need those features.
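To put rough numbers on that complaint, here's a simplified sketch of why interpolated frame generation adds input delay. It's illustrative only: it ignores render queues, Reflex, and the cost of generating the frame, and the function and figures are assumptions, not measurements.

```python
# A generated frame between real frames N and N+1 can only be built once N+1 exists,
# so real frames get held back: roughly half to one base frame interval in this model.

def frame_gen_estimate(base_fps: float) -> tuple[float, float, float]:
    base_frame_ms = 1000.0 / base_fps
    shown_fps = base_fps * 2                 # one generated frame per real frame
    extra_min = base_frame_ms / 2            # real frame delayed for even frame pacing
    extra_max = base_frame_ms                # pessimistic: a full base interval
    return shown_fps, extra_min, extra_max

for fps in (30, 45, 60):
    shown, lo, hi = frame_gen_estimate(fps)
    print(f"{fps:>3} fps base -> ~{shown:.0f} fps shown, ~{lo:.0f}-{hi:.0f} ms added latency")
```

The penalty is largest at a 30 fps base, which is exactly the case where you'd most want help reaching 60, hence the complaint above.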
 
What else can you say when a 4090 is barely able to run Cyberpunk with Path Tracing at 20 FPS at native 2160p...

DLSS will never overtake FSR as long as Sony and Microsoft are using AMD hardware in their consoles, which the industry uses as the development platform for its games.

With the industry pushing 8K TVs, this will be the ONLY way to keep an Xbox/PS# affordable.
 
What else can you say when a 4090 is barely able to run Cyberpunk with Path Tracing at 20 FPS at native 2160p...

DLSS will never overtake FSR as long as Sony and Microsoft are using AMD hardware in their consoles, which the industry uses as the development platform for its games.
Well, given that Nvidia has 78% share of the discrete GPU market, I'd say FSR has a long hill to climb.
 
You are juggling words to try to deceive the inattentive. The truth is that Nvidia creates false frames between the real rendered frames; people with already-damaged vision, or with brains taken over by fanaticism, will say that this is the future and everything is fine. Reality shows the opposite: games are falling in quality and optimization, to the point of requiring upscaling as the only way to play.
Video games have been doing this for years. Remember, back in the day, when Internet speeds weren't great and some people in-game had high latency? You would see them "rubber banding" all over the screen because the game was "predicting" the path of a moving object but would update that prediction with real position data when it became available.
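For what it's worth, here's a toy sketch of that kind of client-side prediction (dead reckoning). It's illustrative only; the class and numbers are made up and don't reflect any particular engine's netcode.

```python
from dataclasses import dataclass

@dataclass
class RemotePlayer:
    x: float = 0.0
    vx: float = 5.0   # last known velocity, units per second

    def predict(self, dt: float) -> None:
        self.x += self.vx * dt                        # extrapolate between packets

    def on_server_update(self, real_x: float, real_vx: float) -> None:
        print(f"snap: {self.x:.2f} -> {real_x:.2f}")  # the visible rubber-band correction
        self.x, self.vx = real_x, real_vx

p = RemotePlayer()
for tick in range(10):
    p.predict(dt=1 / 60)
    if tick == 5:                                     # a delayed packet finally arrives
        p.on_server_update(real_x=0.20, real_vx=-3.0)
print(f"final predicted x = {p.x:.2f}")
```

The bigger the gap between updates, the further the prediction drifts and the harder the "snap" back to reality, which is the rubber banding people remember.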

As games become more complex, graphically speaking, how do we support those games without requiring a GPU that pulls 1000W? That is the ultimate problem. You can get native performance, but it's going to cost you in terms of power requirements.
 
I don't have any problem with the emerging interpolation/upscaling technologies. PC gaming has a long history of pivoting to new methods when the established one hits a wall. I remember when PC gamers were upgrading their CPUs almost every 9-12 months just to get good frame rates in Quake and Wing Commander. And then this thing called the 3dfx Voodoo came out and made CPU rendering much less important. If we had just kept following the CPU graphics rendering timeline, then power consumption, die size, and cost would have gone off the rails too. I'm looking at GPU card sizes now and they are comically swollen.

Gaming graphics is just the creation of a visual illusion. Lots of different technologies intersect with each other to achieve the final result. Interpolation technologies are just another tool in the toolkit, with their own advantages and disadvantages. It's up to developers to choose how they implement them.
 
I really can't understand how these upscaling techniques can be seen as a bad thing. The idea that devs will use them instead of optimizing seems to be the biggest complaint. If you are worried about that, don't be: of course there will be developers that use the tech instead of optimizing, but there will be others that optimize their games to get the most out of it. So... nothing is really going to change.
 
To me this all comes down to: does the final image look more like native 4K video, or more like 1080p/1440p video upscaled to 4K?

I have no problem with more processing options, but I would have a problem with products marketed and priced for "4K" capability that do not deliver.

BTW, this is not really a new issue: few people see true uncompressed 4K video on their TVs, and even many streams labelled as 1080p are, in addition to being compressed, actually transmitted at sub-1080p resolutions.
 
As a reminder: the horsepower needed to drive higher resolutions doesn't scale linearly with the resolution label (pixel count roughly quadruples going from 1080p to 4K), and there's simply a point where it isn't efficient to try to push more pixels, especially as we're already at the point where it's hard to tell the difference even on large displays.
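A quick back-of-envelope calculation of that pixel growth (simplified, since real frame cost isn't purely pixel-bound):

```python
# Shading work grows with width x height: going from 1080p to 4K quadruples the
# pixel count, and 4K to 8K quadruples it again.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p (4K)": (3840, 2160),
    "4320p (8K)": (7680, 4320),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>11}: {px / 1e6:5.1f} MP ({px / base:4.1f}x the pixels of 1080p)")
```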

Then again, there are people who claim 8k120 isn't "real" for HDMI 2.1 because DSC has to be used, so...
 
A lot of people here seem to be missing the plot, going in pretty hard in defense of frame generation as an outright replacement for rasterization. A couple of things on that:

The most important thing to remember is that the raster APIs are freely available to everybody and, in the case of Vulkan for example, collaboratively developed by consensus rather than outright controlled by one company. Even DirectX, which is directly controlled by Microsoft, has had huge controversies in the past, with Microsoft intentionally creating market segmentation to force people onto versions of Windows they didn't want to pay for; that's basically why OpenGL stuck around for so long and why Vulkan is still around. Yes, there may be technical reasons why some prefer Vulkan, but simply not tying yourself to the whims of a Microsoft that might at any point decide to split your market up is quite important.

Now imagine depending on Nvidia doing the same thing Microsoft invented, except an order of magnitude more severe, since their hardware updates are 3x as frequent (in fact worse than 3x, because not every new version of Windows brings a new DirectX version, while so far Nvidia has aimed for a new DLSS feature exclusive to each new generation), far more reactive to 'inflation', and with only Nvidia's interests in mind, which seem to be to artificially eliminate the low-end market entirely and significantly shrink the midrange market as well.

As a game developer, why would you invest millions in a new game only to hear basically endless complaints because the only way it runs decently is if people buy specifically a 5070 to 5090? Not just any Nvidia card, and not just their latest gen, but only the top tier of their offerings will actually deliver the new tech and the necessary performance, because Nvidia keeps locking it down intentionally to sell more expensive cards.

Nvidia is literally poison to the PC AAA industry, and if left unchecked it will just kill the will of gamers. You guys defending DLSS as the main focus are never going to be a significant enough portion of the market, simply because there aren't enough people spending $1000 on a GPU every other year. So no, DLSS is not the future, not for any of you Nvidia enthusiasts either, when more and more developers are just going to move to console-only releases instead.
 
A lot of people here seem to be missing the plot, going in pretty hard in defense of frame generation as an outright replacement for rasterization. A couple of things on that:

The most important thing to remember is that the raster APIs are freely available to everybody and, in the case of Vulkan for example, collaboratively developed by consensus rather than outright controlled by one company. Even DirectX, which is directly controlled by Microsoft, has had huge controversies in the past, with Microsoft intentionally creating market segmentation to force people onto versions of Windows they didn't want to pay for; that's basically why OpenGL stuck around for so long and why Vulkan is still around. Yes, there may be technical reasons why some prefer Vulkan, but simply not tying yourself to the whims of a Microsoft that might at any point decide to split your market up is quite important.

Now imagine depending on Nvidia doing the same thing Microsoft invented, except an order of magnitude more severe, since their hardware updates are 3x as frequent (in fact worse than 3x, because not every new version of Windows brings a new DirectX version, while so far Nvidia has aimed for a new DLSS feature exclusive to each new generation), far more reactive to 'inflation', and with only Nvidia's interests in mind, which seem to be to artificially eliminate the low-end market entirely and significantly shrink the midrange market as well.

As a game developer, why would you invest millions in a new game only to hear basically endless complaints because the only way it runs decently is if people buy specifically a 5070 to 5090? Not just any Nvidia card, and not just their latest gen, but only the top tier of their offerings will actually deliver the new tech and the necessary performance, because Nvidia keeps locking it down intentionally to sell more expensive cards.

Nvidia is literally poison to the PC AAA industry, and if left unchecked it will just kill the will of gamers. You guys defending DLSS as the main focus are never going to be a significant enough portion of the market, simply because there aren't enough people spending $1000 on a GPU every other year. So no, DLSS is not the future, not for any of you Nvidia enthusiasts either, when more and more developers are just going to move to console-only releases instead.
In order to avoid spending the rest of the year responding to the rabid fans carrying the "Nvidia always tells the truth" flag, I wasn't going to respond to this topic anymore, but I just wanted to point out that you've touched on an important part of the reason the industry is bringing extremely low-quality games to market. They think they can save time and development effort, and that simply adding RT and DLSS/FSR plus fake frame generation to boost the framerate will make a game a success. The state of PC gaming is very dismal; barring exceptions and indies, I'd rather play Nintendo games than look at most AAA PC games.
 
Exactly what all gamers want to hear. Using trickery to decrease image quality and improve framerate is just what we all want.
 
A lot of people here seem to be missing the plot, going in pretty hard in defense of frame generation as an outright replacement for rasterization. A couple of things on that:

The most important thing to remember is that the raster APIs are freely available to everybody and, in the case of Vulkan for example, collaboratively developed by consensus rather than outright controlled by one company. Even DirectX, which is directly controlled by Microsoft, has had huge controversies in the past, with Microsoft intentionally creating market segmentation to force people onto versions of Windows they didn't want to pay for; that's basically why OpenGL stuck around for so long and why Vulkan is still around. Yes, there may be technical reasons why some prefer Vulkan, but simply not tying yourself to the whims of a Microsoft that might at any point decide to split your market up is quite important.

Now imagine depending on Nvidia doing the same thing Microsoft invented, except an order of magnitude more severe, since their hardware updates are 3x as frequent (in fact worse than 3x, because not every new version of Windows brings a new DirectX version, while so far Nvidia has aimed for a new DLSS feature exclusive to each new generation), far more reactive to 'inflation', and with only Nvidia's interests in mind, which seem to be to artificially eliminate the low-end market entirely and significantly shrink the midrange market as well.

As a game developer, why would you invest millions in a new game only to hear basically endless complaints because the only way it runs decently is if people buy specifically a 5070 to 5090? Not just any Nvidia card, and not just their latest gen, but only the top tier of their offerings will actually deliver the new tech and the necessary performance, because Nvidia keeps locking it down intentionally to sell more expensive cards.

Nvidia is literally poison to the PC AAA industry, and if left unchecked it will just kill the will of gamers. You guys defending DLSS as the main focus are never going to be a significant enough portion of the market, simply because there aren't enough people spending $1000 on a GPU every other year. So no, DLSS is not the future, not for any of you Nvidia enthusiasts either, when more and more developers are just going to move to console-only releases instead.

That's a binary, Machiavellian interpretation of the story. The thing is, nobody is forced to use AI-assisted rendering. That's the great thing about PC gaming: you choose your hardware, your software, your resolution, your target frame rate, etc. Does Nvidia develop its own games that can't run on any other GPU? No. Do they force gamers or developers to enable upscaling, frame generation, or ray reconstruction when enabling ray tracing? No. Every single part of this is a distinct setting. Did AMD bring any innovation, any new rendering technique, in the last few years that isn't inspired by Nvidia technologies? No. Did the fact that AMD GPUs run in two gens of PlayStation and three gens of Xbox consoles help innovation? No. And that's really sad.

Nvidia is bringing disruptive technologies to the market. As a private company, they chose not to go full open source. It's understandable that a lot of people criticize that choice, and I'm not trying to defend it. It's bad for consumers in general, but it's way too easy for AMD to play the victim when they don't bring anything new to the table. They're barely coping right now. There's nothing Nvidia does that prevents any competitor from implementing similar technologies. There are common APIs to enable ray tracing on AMD GPUs; it's just that it runs like a potato. So what's the problem with Nvidia trying to find optional solutions to make it playable for more and more people?

They're not locking frame generation to the 40x0 gen just for economic reasons; there are actual hardware limits on motion-vector/optical-flow estimation that would hurt the prediction enough to end in bad overall quality. That is all public information available on their website: not just statements, but actual figures and details.

You seem so sure that these new technologies will fail miserably, but you're already wrong: if that approach were so bad, AMD would have taken another route and wouldn't have created FSR 1, 2, and 3, or even added ray tracing support on consoles.
 