An Nvidia supercomputer has been improving DLSS non-stop for the past six years

Shawn Knight

The big picture: It's been more than six years since Nvidia introduced the world to its image enhancement and upscaling tech – deep learning super sampling, or DLSS for short. The latest implementation, DLSS 4, was announced earlier this month at CES and promises to be exponentially better than what we first saw with the GeForce 20 series, but have you ever stopped to ponder exactly how we got to this point? As it turns out, a massive supercomputer has been involved in the process since the very beginning.

While discussing the tech at the Consumer Electronics Show, Nvidia's VP of applied deep learning research, Bryan Catanzaro, said improving DLSS has been a continuous, six-year learning process. According to Catanzaro, a supercomputer at Nvidia loaded with thousands of the latest and greatest GPUs runs 24/7, 365 days a year – and its sole focus is on improving DLSS.

The training process largely involves analyzing failures, Catanzaro said. When a DLSS model fails, it looks like ghosting, flickering, or blurriness in a game. When such failures are detected, Nvidia tries to figure out what caused the model to make the wrong choice.

Analyzing errors helps Nvidia figure out how to improve its training data. The model is then retrained on the newer data and tested across hundreds of games. Rinse, repeat. "So, that's the process," Catanzaro concluded.
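To make that loop concrete, here is a rough Python sketch of the failure-driven cycle Catanzaro describes. It is only an illustration of the process as stated, not Nvidia's actual pipeline; every callable here is a hypothetical stand-in supplied by the caller.

    # Minimal sketch of a failure-driven retraining loop (illustrative only).
    # find_artifacts, analyze, augment and retrain are hypothetical stand-ins
    # for the steps Catanzaro describes, passed in by the caller.
    def dlss_improvement_loop(model, data, games, find_artifacts, analyze, augment, retrain, rounds=1):
        for _ in range(rounds):                                # in practice this runs 24/7 on the supercomputer
            failures = []
            for game in games:                                 # tested "across hundreds of games"
                for artifact in find_artifacts(model, game):   # ghosting, flickering, blurriness
                    failures.append(analyze(artifact))         # what made the model choose wrong?
            data = augment(data, failures)                     # fold the findings back into the training data
            model = retrain(model, data)                       # retrain, then rinse and repeat
        return model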

Nvidia introduced DLSS 4 at CES alongside its new RTX 50 Blackwell GPUs. The graphics specialist said its $549 RTX 5070 delivers 4090-like performance when using DLSS 4, a claim many are looking forward to putting to the test.

According to benchmarks shared by Nvidia, the RTX 5090 is roughly 30 percent faster than the 4090 without DLSS. Furthermore, the 5080 is said to be 15 percent faster than the 4080, and the 5070 is up to 20 percent faster than the 4070. Again, these are Nvidia's own benchmark numbers, so we will have to wait until closer to launch for unbiased, real-world figures.


 
So DLSS 4 can run on older GPUs then, and nVidia is just gatekeeping to price gouge customers? I guess they knew people weren't going to pay 30% more for a 15% increase in performance, so gatekeeping DLSS is the only way to sell product. And if Google, Meta and MS don't want Blackwell, it makes you wonder how much gamers are going to want it once it's finally released.

I'm sure gamers would much rather take advantage of higher yields on older nodes for decreased prices instead of cutting edge performance, and I'm using that phrase pretty liberally.
 
I suppose AI is kind of a best guess

A lot of this stuff is being automated by other AI models
eg AI meta-analysis
AI feedback trainers etc
AI black-box analysis of what is happening under the hood

More interesting is that we are using human-designed games for AI to predict, but how far away are we from games being made to facilitate stuff like DLSS - ie the game engine is 100% optimised for AI right from the start?

Ie will games be created in the language of AI models - as it seems wasteful to have old, antiquated matrices, meshes, shaders etc, or whatever programmers actually use now - so resource intensive - when instead a 1000-word descriptive model could be used, with cumulative changes added or subtracted as the game progresses.

Maybe some human input to create unique stuff first

 
So DLSS 4 can run on older GPUs then, and nVidia is just gatekeeping to price gouge customers? I guess they knew people weren't going to pay 30% more for a 15% increase in performance, so gatekeeping DLSS is the only way to sell product. And if Google, Meta and MS don't want Blackwell, it makes you wonder how much gamers are going to want it once it's finally released.

I'm sure gamers would much rather take advantage of higher yields on older nodes for decreased prices instead of cutting edge performance, and I'm using that phrase pretty liberally.
That sounds like you just want AMD or Intel to compete.

Cards older than the 40 series don't get FG at all. Just a renewed model for the upscaling aspect of DLSS. I think it's actually great. Idk what you're complaining about, really. Older cards getting improvements is what gamers want. It's what I was really hoping for as a 30 series owner. Not more of the same. FSR4 is only for the newest Radeons anyway.
NVidia isn't a budget option anymore; I'd look more at Intel than AMD at this point too if I was a budget gamer.
 
That sounds like you just want AMD or Intel to compete.

Cards older than the 40 series don't get FG at all. Just a renewed model for the upscaling aspect of DLSS. I think it's actually great. Idk what you're complaining about, really. Older cards getting improvements is what gamers want. It's what I was really hoping for as a 30 series owner. Not more of the same. FSR4 is only for the newest Radeons anyway.
NVidia isn't a budget option anymore; I'd look more at Intel than AMD at this point too if I was a budget gamer.
I'm cheap and I hate feeling like I'm getting ripped off. I have watched nVidia rip off every generation of gamers since they introduced the 20 series, and the 50 series is no different, I'm sure the 60 series will do this again. I'm not willing to pay $1000+ for a card that will last for 2 years if I'm lucky. I will (and have) just stopped playing new games. Gaming used to be a cheap hobby, but I can throw several eBay servers in my rack for the cost of a 4090.

And considering that many nVidia features just don't work on Linux, the software they tout is useless to me. There was even a recent report that the general release of SteamOS is being delayed because of nVidia driver issues and will instead be a "Beta".

Every generation they're giving us less and making us pay more. It's finally getting absurd enough that I've basically checked out of the hobby. The fact that most games are microtransaction-filled slot machines with a political agenda has only made that easier.

The games aren't worth playing, the hardware is too expensive, and if I want to use any of their software features, I have to switch back to Windows.

Something has to give at this point. Either games need to get better, Linux support needs to improve or the price needs to drop. I'm not asking for all 3, but the sad part is that we'll probably get none.
 
I pretty much expected that the 5000 series would heavily benefit from AI.
I'd heard that integrated graphics would be able to scale up graphics to 4K and use DLSS to increase framerates while other AI improves sound quality. Isn't this what the AI revolution was supposed to be about?

Less expensive, more powerful, more portable computers that have advanced capabilities without thirsty CPU and GPU?

It boggles my mind when people start arguing about "fake frames".

If the image is good, it's good. What's the problem?

Theoretically, AI could make one game look like an entirely different game on the fly.
 
We need lawsuits, everywhere. This form of AI is a scam. It's not helpful to society like other forms of AI. We are being sold something we are literally told is fake with nothing but downsides yet we accept it? What are you people doing?
 
So DLSS 4 can run on older GPUs then, and nVidia is just gatekeeping to price gouge customers? I guess they knew people weren't going to pay 30% more for a 15% increase in performance, so gatekeeping DLSS is the only way to sell product. And if Google, Meta and MS don't want Blackwell, it makes you wonder how much gamers are going to want it once it's finally released.

I'm sure gamers would much rather take advantage of higher yields on older nodes for decreased prices instead of cutting edge performance, and I'm using that phrase pretty liberally.
I'm cheap and I hate feeling like I'm getting ripped off. I have watched nVidia rip off every generation of gamers since they introduced the 20 series, and the 50 series is no different, I'm sure the 60 series will do this again. I'm not willing to pay $1000+ for a card that will last for 2 years if I'm lucky. I will (and have) just stopped playing new games. Gaming used to be a cheap hobby, but I can throw several eBay servers in my rack for the cost of a 4090.

And considering that many nVidia features just don't work on Linux, the software they tout is useless to me. There was even a recent report that the general release of SteamOS is being delayed because of nVidia driver issues and will instead be a "Beta".

Every generation they're giving us less and making us pay more. It's finally getting absurd enough that I've basically checked out of the hobby. The fact that most games are microtransaction-filled slot machines with a political agenda has only made that easier.

The games aren't worth playing, the hardware is too expensive, and if I want to use any of their software features, I have to switch back to Windows.

Something has to give at this point. Either games need to get better, Linux support needs to improve or the price needs to drop. I'm not asking for all 3, but the sad part is that we'll probably get none.
This.
I'm going AMD out of spite at this point.
And the only "modern" games I'm playing are Cyberpunk 2077 and Warhammer Space Marine II.
Gaming is not worth it anymore.
This is my last rig, probably.
 
I pretty much expected that the 5000 series would heavily benefit from AI.
I'd heard that integrated graphics would be able to scale up graphics to 4K and use DLSS to increase framerates while other AI improves sound quality. Isn't this what the AI revolution was supposed to be about?

Less expensive, more powerful, more portable computers that have advanced capabilities without thirsty CPU and GPU?

It boggles my mind when people start arguing about "fake frames".

If the image is good, it's good. What's the problem?

Theoretically, AI could make one game look like an entirely different game on the fly.
The issue arises when those fake frames add input lag, because they don't respond to human input at all; they need to make sure games still feel responsive.
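As a rough illustration of that point, here is a toy latency model. It is a back-of-the-envelope sketch with assumed numbers, not Nvidia's actual pipeline (which pairs frame generation with Reflex to claw some latency back):

    # Toy model of 2x frame interpolation (all numbers assumed).
    base_fps = 60                              # frames the engine actually renders per second
    render_interval_ms = 1000 / base_fps       # ~16.7 ms between real frames

    displayed_fps = base_fps * 2               # one generated frame per rendered frame
    generation_cost_ms = 3                     # assumed cost of producing the extra frame
    # Interpolation has to hold the newest real frame back until the in-between
    # frame has been shown, so input-to-photon latency grows even as the fps
    # counter doubles.
    added_latency_ms = render_interval_ms / 2 + generation_cost_ms

    print(f"Looks like {displayed_fps} fps, but adds ~{added_latency_ms:.1f} ms of latency")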
 
I'm wondering how GPU reviewers are going to handle all this. The methodology of presenting only FPS metrics as if everything else is the same across all cards is not going to hold up. DLSS, "neural rendering" and other AI techniques mean different cards may actually produce different individual frames, so human judgment as to the visual fidelity and/or gaming "usefulness" of those frames will become a factor, which is tough because it moves the review from the objective to the subjective. And as inflated frame rates become more common, latency may become a more important factor in how good the game feels and how well the gamer can compete. Again, that will require a new set of analysis and judgment.
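One way reviewers could put at least a partial number on per-frame fidelity is to compare an upscaled capture against a native-resolution capture of the same frame with a perceptual metric. A minimal sketch, assuming scikit-image and imageio are installed and matching captures exist (the file names are hypothetical); a single metric is no substitute for human judgment:

    import imageio.v3 as iio
    from skimage.metrics import structural_similarity

    # Both captures must be the same resolution and show the same frame.
    native = iio.imread("native_4k_frame.png")      # hypothetical reference capture
    upscaled = iio.imread("dlss_4k_frame.png")      # hypothetical DLSS capture

    score = structural_similarity(native, upscaled, channel_axis=-1)
    print(f"SSIM vs native render: {score:.3f}")    # 1.0 means identical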
 
I'm cheap and I hate feeling like I'm getting ripped off. I have watched nVidia rip off every generation of gamers since they introduced the 20 series, and the 50 series is no different, I'm sure the 60 series will do this again.
I have a friend who's had a 2060 Super since launch (got it on some mad deal so paid considerably less for it than normal) and I just asked him exactly that, "do you feel ripped off with your 2060 Super".

His answer was a clear and concise "no". So I asked why: when he bought the GPU, he was never promised frame generation, so he doesn't expect frame generation.

He's happy they added stuff like Ray-Reconstruction and kept improving DLSS, his GPU is 6 years old now and they're fully supporting the latest DLSS 4 on it.

My question to you is, what is your definition of "ripped off"? Because it currently looks like you consider not adding future technology to your older GPU a rip-off?
 
I have a friend who's had a 2060 Super since launch (got it on some mad deal so paid considerably less for it than normal) and I just asked him exactly that, "do you feel ripped off with your 2060 Super".

His answer was a clear and concise "no". So I asked why: when he bought the GPU, he was never promised frame generation, so he doesn't expect frame generation.

He's happy they added stuff like Ray-Reconstruction and kept improving DLSS, his GPU is 6 years old now and they're fully supporting the latest DLSS 4 on it.

My question to you is, what is your definition of "ripped off"? Because it currently looks like you consider not adding future technology to your older GPU a rip-off?
Great source, one guy with a 2060.

The problem is that lots of people bought into the 10 series thinking, "graphics performance has finally peaked and this will probably last me a very long time". NVidia blindsided the industry with ray tracing. It didn't really matter because no games supported it and it wasn't even really usable on ANY of the cards in the 20 series. People were mad about the wasted silicon.

Things like DLSS were cool but, again, mostly gimmicks, though you could force it on in the drivers if you wanted.

Then came the 30 series and prices were still reasonable from an MSRP perspective, scalpers ruined that, but their upscaling tech, which was necessary if you wanted to play with RT, wasn't really available on the 20 series. Then the 40 series came, we got price increases and new software locked tech. Then the 50 series came, and we got the same thing again. Then the 60 series will come out with DLSS 5 and the 50 series won't get it.
 
I hope Nvidia paid for this advertisement.

Would the word "adverticle" be a good way to describe an advert that's pretending to be an article?
 
Great source, one guy with a 2060.
I mean, I was a 1080Ti owner. If it really helps, I could ask everyone on my Discord who has an Nvidia card, ask them if they feel ripped off? I feel like you know the answer from most people, though.
The problem is that lots of people bought into the 10 series thinking, "graphics performance has finally peaked and this will probably last me a very long time".
Did they? You think lots of people bought into the 10 series thinking they were end-game GPUs and nothing better would ever come along?
NVidia blindsided the industry with ray tracing. It didn't really matter because no games supported it and it wasn't even really usable on ANY of the cards in the 20 series. People were mad about the wasted silicon.
If people were mad, why did they buy them? Weirdly, they're getting more use today than at any other time; more games are coming out requiring RT cores. Indiana Jones literally will not run on a 1080Ti, but happily runs on a 2060, as an example.
Their upscaling tech, which was necessary if you wanted to play with RT, wasn't really available on the 20 series.
I'm getting more confused. DLSS has been a thing since the 20 series released, and the 20 series has had its DLSS version updated with every generational release since. Or do you mean Ray-Reconstruction? Which was added to the 20 series when it released? What upscaling tech wasn't available on the 20 series?
Then the 40 series came, we got price increases and new software locked tech. Then the 50 series came, and we got the same thing again. Then the 60 series will come out with DLSS 5 and the 50 series won't get it.
Right, this is where I think value, marketing and facts are confusing everything here.

20/30 series - never sold with frame generation, no marketing, nothing; they aren't capable of Nvidia's DLSS Frame Gen, which didn't exist when they launched.
40 series - marketed and sold with frame gen abilities in mind; all other DLSS features were made available for 20/30 series owners (Ray-Reconstruction).
50 series - same as above, except it's now multi-frame gen.

Value should be based on price and what the card can do at time of purchase (or launch), never on a promise. Nvidia never promised to add any future features; they claim it's entirely based on how much better the tensor cores get, and newer-generation tensor cores can handle more.

You seem to be basing value (and therefore feeling ripped off) on Nvidia NOT adding future features to older GPUs?
 
I mean, I was a 1080Ti owner. If it really helps, I could ask everyone on my Discord who has an Nvidia card, ask them if they feel ripped off? I feel like you know the answer from most people, though.

Did they? You think lots of people bought into the 10 series thinking they were end-game GPUs and nothing better would ever come along?

If people were mad, why did they buy them? Weirdly, they're getting more use today than at any other time; more games are coming out requiring RT cores. Indiana Jones literally will not run on a 1080Ti, but happily runs on a 2060, as an example.

I'm getting more confused. DLSS has been a thing since the 20 series released, and the 20 series has had its DLSS version updated with every generational release since. Or do you mean Ray-Reconstruction? Which was added to the 20 series when it released? What upscaling tech wasn't available on the 20 series?

Right, this is where I think value, marketing and facts are confusing everything here.

20/30 series - never sold with frame generation, no marketing, nothing; they aren't capable of Nvidia's DLSS Frame Gen, which didn't exist when they launched.
40 series - marketed and sold with frame gen abilities in mind; all other DLSS features were made available for 20/30 series owners (Ray-Reconstruction).
50 series - same as above, except it's now multi-frame gen.

Value should be based on price and what the card can do at time of purchase (or launch), never on a promise. Nvidia never promised to add any future features; they claim it's entirely based on how much better the tensor cores get, and newer-generation tensor cores can handle more.

You seem to be basing value (and therefore feeling ripped off) on Nvidia NOT adding future features to older GPUs?

DLSS's new features are softlocked. That's really the icing on the cake. I've been feeling ripped off for several years now. The fact of the matter is that there is no reason at all why nVidia can't bring all this stuff to all their graphics cards. I'd like to see how the 4090 performs with DLSS 4. They aren't enabling it on the 40 series because the claim that "the 5070 is equivalent to a 4090" would be a lie.

There are layers and layers to what exactly made me hate nVidia, but it basically comes down to "it doesn't have to be this way". After owning nearly every generation of card from nVidia since the geforce 2 to the 10 series I cashed out. I had a 1080ti and it died during covid so I had to settle for a 6700xt, I wasn't interested in the 20 series when it came out, I was happy with the 1080ti. The 30 series was unavailable, which was frustrating but I wasn't paying scalper pricing. nVidia was holding back on vRAM at the time and I just thought it wasn't worth it. Then the 40 series got released at scalper pricing and many cards were gimped on vram. Again, it didn't have to be that way, but I automatically saw that as artificially limiting the life of the cards. The 50 series got released and vRAM is going to be a major issue on the 5070.

And I'm calling this now, nVidia will only officially support the 50 series on SteamOS.
 

DLSS's new features are softlocked. That's really the icing on the cake. I've been feeling ripped off for several years now. The fact of the matter is that there is no reason at all why nVidia can't bring all this stuff to all their graphics cards. I'd like to see how the 4090 performs with DLSS 4. They aren't enabling it on the 40 series because the claim that "the 5070 is equivalent to a 4090" would be a lie.

There are layers and layers to what exactly made me hate nVidia, but it basically comes down to "it doesn't have to be this way". After owning nearly every generation of card from nVidia since the geforce 2 to the 10 series I cashed out. I had a 1080ti and it died during covid so I had to settle for a 6700xt, I wasn't interested in the 20 series when it came out, I was happy with the 1080ti. The 30 series was unavailable, which was frustrating but I wasn't paying scalper pricing. nVidia was holding back on vRAM at the time and I just thought it wasn't worth it. Then the 40 series got released at scalper pricing and many cards were gimped on vram. Again, it didn't have to be that way, but I automatically saw that as artificially limiting the life of the cards. The 50 series got released and vRAM is going to be a major issue on the 5070.
Fair enough, I suppose. I just don't know why you'd even want Frame Gen anyway; it feels horrible. I have a 40 series, tried it twice, turned it off and never looked back. Why you want that so desperately in order to feel less ripped off is confusing to me.

Let it be clear though, EVERYTHING else in DLSS 4 works on older cards, just not frame gen, which really isn't that much of a great loss...
And I'm calling this now, nVidia will only officially support the 50 series on SteamOS.
Honestly? If they do that, they're leaving me no choice but to go to AMD. I'd love to take an older machine with an Nvidia card in it, put it in a small case, whack SteamOS on it and use it for the TV on older games. I don't plan on getting any 50 series GPUs for myself; the performance increases aren't that much over the 40 series.
 
Isn't this what the AI revolution was supposed to be about?


It boggles my mind when people start arguing about "fake frames".

If the image is good, it's good. What's the problem?

One thing is to improve raw performance as usual and use fake frames to push further in situations like 8K and path tracing. Another thing is to barely improve raw performance, lean on fake frames, and sell us bad meat as good meat.

If they give us at least a 30% raw performance increase across the board for the same or less money, with DLSS 4 as an extra, that might be acceptable. Giving us a 30% performance increase for more money, and only fake-frame gains at the same price, is cheating.
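A back-of-the-envelope illustration of that distinction, with assumed numbers (a 100 fps baseline, the 4x figure standing in for DLSS 4's multi frame generation):

    old_raw_fps = 100
    new_raw_fps = old_raw_fps * 1.30        # a 30% raw, non-DLSS uplift
    new_marketed_fps = new_raw_fps * 4      # multi frame generation layered on top

    # Only the first number reflects frames the game engine actually simulated;
    # the second is what a marketing bar chart would show.
    print(new_raw_fps, new_marketed_fps)    # 130.0 vs 520.0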
 
I pretty much expected that the 5000 series would heavily benefit from AI.
I'd heard that integrated graphics would be able to scale up graphics to 4K and use DLSS to increase framerates while other AI improves sound quality. Isn't this what the AI revolution was supposed to be about?

Less expensive, more powerful, more portable computers that have advanced capabilities without thirsty CPU and GPU?

It boggles my mind when people start arguing about "fake frames".

If the image is good, it's good. What's the problem?

Theoretically, AI could make one game look like an entirely different game on the fly.

This, 100%. The noise we’re hearing is simply from those who refuse to accept the path GPU technology is advancing on. It’s no longer feasible to have a 1000W GPU with 5 slots. AI is the key to advancing how gaming graphics are computed.

Nvidia is the king of AI, no denying that, and what they’re doing with DLSS makes perfect sense. I won’t even bother with the repeated stereotypical arguments against DLSS anymore. No, I’m not saying everyone should run out and buy an Nvidia GPU, but the point is that Nvidia has the edge here, and they’ve had it for the past decade. There’s no stopping them.

They lead, and others follow. End of story (for now).
 
I'm cheap and I hate feeling like I'm getting ripped off. I have watched nVidia rip off every generation of gamers since they introduced the 20 series, and the 50 series is no different, I'm sure the 60 series will do this again. I'm not willing to pay $1000+ for a card that will last for 2 years if I'm lucky. I will (and have) just stopped playing new games. Gaming used to be a cheap hobby, but I can throw several eBay servers in my rack for the cost of a 4090.

And considering that many nVidia features just don't work on Linux, the software they tout is useless to me. There was even a recent report that the general release of SteamOS is being delayed because of nVidia driver issues and will instead be a "Beta".

Every generation they're giving us less and making us pay more. It's finally getting absurd enough that I've basically checked out of the hobby. The fact that most games are microtransaction-filled slot machines with a political agenda has only made that easier.

The games aren't worth playing, the hardware is too expensive, and if I want to use any of their software features, I have to switch back to Windows.

Something has to give at this point. Either games need to get better, Linux support needs to improve or the price needs to drop. I'm not asking for all 3, but the sad part is that we'll probably get none.
The first two points are just preference statements. Linux has been the way it is since forever; isn't that what Proton is for? The third one, about price, is the state of competition. We're on DLSS 4 and AMD doesn't even have an ML upscaler. AMD Radeon's motto was "NEVER SETTLE"; they're clearly settling and following. This is why prices are insane now: no competition. I'm really hoping Intel can show up, take #2 and bring prices back to earth.

I update my rigs around R* games. They put the most effort in. There's plenty of modern games worth playing, just not worth $70.
 
The first two points are just preference statements. Linux has been the way it is since forever; isn't that what Proton is for?
If you don't know what Proton is for then you have no business talking about the way "linux has been"

Proton is essentially a very streamlined and optimized version of WINE that helps GAMES communicate with the OPERATING SYSTEM. The OPERATING SYSTEM then communicates with the HARDWARE through the DRIVERS to render the image. Drivers need to run natively in any operating system; there is no compatibility layer or emulation for drivers.
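A small sketch of that layering in practice: the NVIDIA kernel driver has to be present natively on Linux, because Proton/WINE sits above the OS and cannot emulate it. This just checks the proprietary driver's standard /proc entry; it's an illustration, not an official tool:

    from pathlib import Path

    # The proprietary NVIDIA driver exposes its version here when loaded.
    version_file = Path("/proc/driver/nvidia/version")

    if version_file.exists():
        print("Native NVIDIA kernel driver loaded:")
        print(version_file.read_text().splitlines()[0])
    else:
        print("No native NVIDIA driver found - Proton/WINE cannot provide one.")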

 