Nvidia DLSS 4 Ray Reconstruction Analysis: Fixing Ugly Ray Tracing Noise

So for the specific game and application that I play, 8K interests me. That game is EvE. The scale of those battles blown up onto a large screen creates a sort of 3D effect that has to be experienced to be understood. That said, I sit very close to a 65" 4K120 TV that I use as a "monitor" for this purpose, and at that viewing distance the pixel density is VERY low. Frankly, modern hardware could probably play modern EvE at 8K120. There are 2 MAJOR issues stopping me from buying an 8K display for this purpose.

1) and BY FAR THE LARGEST: input lag on 8K displays is REALLY bad. The short of it is that the more pixels there are, the more processing the display's internal electronics have to do to update each frame. Going from 1080p to 4K on the same display will increase latency, but nothing most people will really notice. I have NOT used a single 8K display, TV or otherwise, that has an acceptable level of input lag.

2) 8K120 just isn't a thing right now. I will not throw down money for an 8K display if it is limited to 60Hz. Whether the bottleneck is the display or the GPU output, I will not be upgrading until 8K120 is a real thing. This is even more infuriating because the organization that controls the HDMI standard won't allow DisplayPort on TVs, and it also won't allow proper HDMI support for anything above 4K60 on Linux (some rough bandwidth numbers below). There are workarounds for this, and jank is just something you have to accept if you want to be a Linux user. The thing is, the less jank you have to put up with, the better.
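
Some rough, back-of-the-envelope numbers behind both points, sketched in Python (my own simplification: it ignores blanking intervals, link-encoding overhead, and DSC compression, and the HDMI ceilings are approximate):

# Pixel throughput and raw uncompressed signal bandwidth per mode.
# Rough figures only -- real links add blanking and encoding overhead,
# and 8K120 in practice leans on DSC compression.
modes = {
    "4K60":  (3840, 2160, 60),
    "4K120": (3840, 2160, 120),
    "8K60":  (7680, 4320, 60),
    "8K120": (7680, 4320, 120),
}
BITS_PER_PIXEL = 30  # 10-bit RGB

for name, (w, h, hz) in modes.items():
    pixels_per_second = w * h * hz
    gbps = pixels_per_second * BITS_PER_PIXEL / 1e9
    print(f"{name}: {pixels_per_second / 1e9:.2f} Gpixel/s, ~{gbps:.0f} Gbit/s raw")

# 8K pushes 4x the pixels of 4K at the same refresh rate, which is roughly
# 4x the work for the TV's processing. HDMI 2.0 tops out around 18 Gbit/s
# and HDMI 2.1 FRL around 48 Gbit/s, so 8K120 (~119 Gbit/s raw at 10-bit)
# needs compression, and a driver limited to HDMI 2.0-era signalling is
# effectively stuck near 4K60.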

So even if I wasn't a stubborn Linux user who plays too much EvE, I still think we are VERY far off from 8K being relevant. With modern hardware basically requiring upscaling to play modern games, which is causing a regression in performance (and graphics) rather than progress, it's going to be a while before 8K comes close to being relevant. I could see something like a "stop gap" 6K standard (3240p?) being created first. 6K panels can also be made from cuts of defective 8K panels, so it is theoretically a way to make more money per mm^2 than cutting a defective 8K panel into three 4K panels.
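
To put the panel-cut idea into numbers, here's a tiny sketch assuming "6K" means 5760x3240 (three quarters of 8K on each axis, which is an assumption on my part since there is no settled 6K consumer standard):

# Hypothetical yield math for cutting up a defective 8K mother panel.
# "6K" = 5760x3240 is an assumption; treat these as illustrative only.
pixels_8k = 7680 * 4320          # full 8K panel
pixels_6k = 5760 * 3240          # one 6K cut
pixels_4k = 3840 * 2160          # one 4K quadrant

print(pixels_6k / pixels_8k)      # 0.5625 -> a 6K cut uses 9/16 of the glass
print(3 * pixels_4k / pixels_8k)  # 0.75   -> three 4K quadrants use 12/16

# A single 6K cut actually uses less of the mother glass than three 4K
# quadrants, so the "more money per mm^2" argument hinges on a 6K panel
# selling at enough of a premium over 4K panels to offset the extra scrap.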

But anyway. As someone who hopes to own a 100" ~8K panel for gaming some day, I feel we're still a good bit off from that. I had a Best Buy let me hook my laptop up to a Samsung QN800D that I almost bought, but the deal breaker was the input lag. At 4K it was perfectly acceptable, you couldn't feel it. If you tried to game at 8K, the lag instantly jumped to probably somewhere around 100ms, and you could definitely feel that. Even basic desktop use and viewing files felt like my mouse cursor was moving through heavy oil or something. It was a very odd experience. Again, 4K was fine, but 8K is completely off the table until input lag gets addressed. Also, if I were to use this as a desktop monitor, I would expect 8K120. I'm not a high-refresh elitist, I don't think I can really see or feel anything past 120, but I can 100% feel the difference when web browsing or looking over Excel spreadsheets on a 60Hz vs a 120Hz monitor. It's absolutely tolerable, but I'm not paying 8K TV money for a "tolerable" experience. My only requirement for spending exceptional amounts of money is an exceptional experience, which just isn't there right now. I'd rather replace my current display with a 4K OLED TV that accepts a 240Hz input than go 8K60. And I actually think TVs can only accept a maximum of 4K165 right now for some arbitrary reason set by the HDMI Forum, but double check those numbers and the reason behind them because I'm not saying that with 100% certainty.

So, for now, I'll have to deal with sitting arm's length away from my 65" 4K screen for my EvE gaming needs and being able to see individual pixels because the pixel density is low. Part of me was considering a 4K120 projector for a while, because the way the light is, well, projected naturally creates a blending effect that is somehow both sharp and blurry at the same time (think CRT). But the logistics of that whole idea are, well, absurd, to put it politely.
What about a 5K monitor or a VR headset? 5K actually has a lot of utility in the creative/development space, since you can natively render a scene/image intended to be viewed at 4K while still having room for sidebars or other software controls on screen. So while 5K may not have the effect of 8K, it would be better than 4K, and there should be plenty of options on the market. I only mention VR since those displays are high fidelity and VR video is often produced in 5K, 6K, 7K, and even 8K, so that might work for your case (I dunno if it will or not).
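
The arithmetic behind the 5K point, assuming the common 5120x2880 desktop "5K" resolution:

# Workspace left over when a pixel-perfect 4K viewport sits inside a 5K desktop.
# Assumes 5K = 5120x2880, the usual desktop 5K resolution.
desk_w, desk_h = 5120, 2880
view_w, view_h = 3840, 2160
print(desk_w - view_w, "px of horizontal room")  # 1280 px for tool panels
print(desk_h - view_h, "px of vertical room")    # 720 px for timelines/toolbars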
 
Update: Maybe something like this one? Edit: nvmd, that one only has 1440 vertical.
 
I find the comments on here disregarding ray tracing very amusing. Nvidia are not the gatekeepers or the inventors of RT. It's just the natural progression of 3D technology. Ray-traced lighting is something CGI movies have had for decades, for example.

Nvidia are clearly working very hard to deliver playable RT experiences to their customers. DLSS is one of the components of that. I find it difficult to be mad at Nvidia for this, especially as we are starting to see games that require RT-capable cards.

It won't be very long until you can't turn off RT. When that happens, the lowest common denominators in the tech community will no doubt attempt to blame Nvidia for the natural progression of technology. But personally, I will be thankful that Nvidia have spent so much money and effort trying to deliver playable RT experiences to their users.

To those claiming they don't want or need RT: do you even play 3D games? Sounds like you should be OK with 2D sprites. Heck, they don't even need to be in color, right?
 
It's finally happened: nVidia has convinced people that making games look worse is the future!

RT isn't the problem, DLSS isn't the problem. It's the fact that using RT and needing DLSS to make games playable makes them look worse than they did 10 years ago. These "improvements" have caused what's called a "regression". People don't have a problem with RT; they have a problem with the regression from the quality and performance we had 10 years ago that it is causing.

Now people are mad at nVidia for pushing the industry towards it when, almost 8 years later, it still can't even provide a quality native experience to the 98% of gamers that don't buy a 90-class card. We still haven't hit true, full ray tracing. Fully ray-traced/path-traced settings cause 90-class cards to get sub-30FPS.

Now everyone is looking at nVidia saying "you had nearly 8 years to bring us an affordable ray traced experience, instead you're using AI to fake ray tracing and making us pay MORE for it."

People would rather have a quality raster game than pay $1000 for the privilege of unoptimized RT games that don't even have real ray tracing.

Finally, frame gen is useless if you're getting sub 60FPS native.
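
Rough arithmetic on that last point (my own simplification: it ignores Reflex, render queuing, and the extra buffered frame that interpolation-based frame generation typically adds):

# Frame generation raises the *displayed* framerate, but input is only
# sampled on the rendered frames, so the latency floor tracks the base rate.
def frame_gen(base_fps, gen_factor=2):
    displayed_fps = base_fps * gen_factor
    latency_floor_ms = 1000 / base_fps  # best case: one base frame time
    return displayed_fps, latency_floor_ms

for base in (30, 45, 60, 90):
    shown, floor = frame_gen(base)
    print(f"{base} fps base -> {shown} fps shown, >= {floor:.0f} ms per input sample")

# 30 fps base "looks like" 60 fps but still carries 33+ ms per sampled input,
# which is why sub-60 native is a poor starting point for frame gen.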
 
This very article is showing how Nvidia's DLSS is improving. And from my experience, RT and DLSS on looks better than both turned off. This is a subjective opinion; you can't tell someone they are wrong or right about that. And based on the demand and sales of Nvidia's parts over its competitors', I feel the market largely likes RT and DLSS too.

I don't understand how you can be mad at Nvidia for trying. They have delivered by far the best RT experience of any company. And it's not like 3D games weren't going to get path-traced lighting models eventually. Sure, an optimised raster game will run smoother, but it's not like RT games are suddenly going to go away so that everything runs smoothly. We are in a transition right now. We are already beginning to see games requiring RT hardware to run. In a few years they all will, and the best experiences will be on Nvidia, as they currently (according to like every reviewer) offer by far the best solution.

And believe me, games were always going to go down that route. What makes me laugh is when I see comments claiming RT is a gimmick invented by Nvidia to sell more cards. That is the lowest-IQ take in the 3D gaming industry at this point.

But right now I don't understand why you harbour any resentment at Nvidia. If you prefer raster only, you can still turn off RT and AMD will sell you a better-value card for that. Why have such a bee in your bonnet over Nvidia pushing something you don't have to buy or use?
 
Is anyone curious about the next AI feature the RTX 6000 series will get?
MegaMultiFrame generation?
I would personally just take the old connector.
Also, I feel like Nvidia will stop supporting older DLSS features along with old GPUs.
Imagine not being able to rely on AI frames in new games anymore because your video card is not new enough...
 
But right now I don't understand why you harbour any resentment at Nvidia. If you prefer raster only, you can still turn off RT and AMD will sell you a better-value card for that. Why have such a bee in your bonnet over Nvidia pushing something you don't have to buy or use?
I can answer (and already answered) your question by replying to this part of your post:

nVidia pushed RT on games with the 20 series. DLSS came with it, but DLSS was originally a cool "nice to have" tech. nVidia's push of RT on the industry has made the gaming industry dependent on upscaling. Even raster games have become poorly optimized because of this. Like I said before, it isn't RT or DLSS or FrameGen that is the problem. The problem is that these technologies have become requirements. Further, RT+DLSS+FrameGen looks WORSE and performs WORSE than games did 10 years ago on hardware that cost half as much.

It has made mid-range graphics cards basically pointless because you end up upscaling 360p to 1080p to get anything close to 60FPS.
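
For reference, the internal render resolutions behind that "360p" figure, using the per-axis scale factors Nvidia documents for the standard DLSS modes (roughly 0.667x Quality, 0.58x Balanced, 0.5x Performance, 0.333x Ultra Performance):

# Internal render resolution for each DLSS upscaling mode at a given output.
DLSS_SCALE = {
    "Quality":           2 / 3,   # ~0.667x per axis
    "Balanced":          0.58,    # ~0.58x per axis
    "Performance":       0.5,     # 0.5x per axis
    "Ultra Performance": 1 / 3,   # ~0.333x per axis
}

def internal_res(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    print(mode, internal_res(1920, 1080, mode))

# Ultra Performance at a 1080p output renders 640x360 internally -- the
# "360p" figure above; Quality mode at 1080p renders 1280x720.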

Think of it in terms of DirectX versions. The first generation of cards that supports a new DX version typically only runs it well at the high end, but second- and third-gen cards can basically run all the features effortlessly. Nvidia has us on generation 4 of their ray tracing cards and even their top-end, money-is-no-object cards can't really give a native RT experience.

Instead, half the silicon is now taken up by "AI" stuff. nVidia has repurposed the AI hardware into things like better DLSS and FrameGen. The result is artifacts and graphics fidelity issues. Ray Reconstruction using AI wouldn't be necessary if they just made a card powerful enough to run things at native resolution without DLSS.
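
For context on what Ray Reconstruction actually replaces: real-time RT only traces a few rays per pixel even at native resolution, so the raw output is noisy and a denoiser has to blend samples over time. A minimal conceptual sketch of that kind of hand-tuned temporal accumulation (not Nvidia's actual algorithm; Ray Reconstruction swaps this sort of heuristic for a trained network):

import numpy as np

def temporal_accumulate(noisy_frame, history, invalid_mask, alpha=0.1):
    # Blend the current noisy ray-traced frame into the accumulated history.
    # Pixels whose history was invalidated (disocclusion, fast motion) restart
    # from the noisy sample -- the classic source of ghosting/boiling artifacts
    # that hand-tuned denoisers trade off against blur.
    blended = alpha * noisy_frame + (1.0 - alpha) * history
    return np.where(invalid_mask[..., None], noisy_frame, blended)

# Accumulating roughly 1/alpha frames converges toward a clean image at the
# cost of temporal lag; an AI denoiser instead learns per pixel how much
# history and which neighbours to trust.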

We are now stuck in a feedback loop where more silicon on the card needs to be dedicated to AI to make up for the lack of native horsepower. It's making graphics quality GO DOWN instead of UP. Further, it's getting more expensive to run these features that shouldn't be necessary in the first place.

They created a problem, didn't solve it properly, and are now selling people a "solution" that doesn't actually fix the problem. Developers are looking at spending money on game optimization and saying, "well, it's cheaper to just put DLSS in".
 
How is enabling upscaling that looks worse than native = improving graphical fidelity, lol? Who cares if your new version of upscaling looks better than your old one by 5% if it still looks worse than native?

It's generally faster, which matters for people not running the latest xx90-class card, as it allows us to turn on other, more impactful graphical features.
 
I don't know anyone that uses RT, or wants it. So this is in the "couldn't care less" area of games.

Ray tracing is an inherently stupid approach to start with, and it's ridiculous we still use it rather than beam tracing.
 
Your comment hasn't answered my question at all. Why are you upset at Nvidia over a free feature that you don't have to use?

And also you are incorrect. DLSS absolutely does fix problems with RT in games. The reviewer here is saying good things about DLSS. In fact most reviewers do. Are they all wrong?
 
I've explained myself several times. I need you to explain something to me:

>>> What makes it acceptable for developers to rely on DLSS to increase performance in unoptimized games that look like **** and run worse? <<<
 
You haven't explained yourself at all. Right now all I can glean from your posts is "Nvidia DLSS = bad", but there's no logical reasoning as to why.

DLSS is universally praised by reviewers. And I can confirm that I certainly prefer to turn RT and DLSS on; I find the image quality is better with both turned on. I'm sorry if you don't like it, but you are clearly in the minority.

But this is completely optional. And it costs zero dollars.

What's the beef? Why is Nvidia worthy of criticism for providing these options to its consumers?
 