The Elder Scrolls Online is getting DLSS and its evil twin, DLAA

mongeese

In brief: Nvidia DLSS is great at improving performance in modern titles at a small cost to image quality. But what about older titles that don’t need to run faster? Apparently, it can do the exact opposite for them: improve image quality at a negligible performance cost.

In a recent livestream, The Elder Scrolls Online's developers announced that the game would be updated with DLSS in the fall. It's a welcome change for players with older hardware, but it won't excite owners of newer GPUs: any RTX 3000-series card can easily churn out frames at 4K with max settings.

To give them something too, ESO's developers talked to Nvidia about feeding native-resolution frames into DLSS, skipping the upscaling step but keeping its deep-learning anti-aliasing. The result of those talks is what they ended up calling DLAA: Deep Learning Anti-Aliasing.

DLAA does the same things as traditional anti-aliasing techniques, but it’s infused with magic DLSS sauce.

DLAA is exceptionally complicated as far as anti-aliasing techniques go. It will benefit from the mature foundation of DLSS but could be hindered by vestigial upscaling code. It could also have a small negative impact on performance if it adds more overhead than other anti-aliasing techniques. Presumably, it’ll be restricted to RTX GPUs like DLSS is.

Although it may or may not be the best anti-aliasing technique out there, it’s certainly the most novel. Hopefully, Nvidia will offer it to other developers for experimentation.


 
I guess this is a good feature to add to ESO. I play at 4K on my 1070 Ti and don't often have issues dropping below 60 fps. It's a very well-optimized game, but I'm curious whether this implies they intend to add ray tracing.
 
The naming for DLSS was already confusing to begin with because, as far as I know, “Super Sampling” refers to the method of rendering at higher than native resolution in order to get more detail and effectively replace anti-aliasing, similar to Nvidia's DSR. I assume they called it DLSS because the training data is at higher than native resolution, but in practice the output is equal to the native resolution and the render resolution is lower to give an FPS boost. In my opinion, what they are calling DLAA has more in common with super sampling than DLSS does, and DLSS should have a better name that describes its use case, such as DLUS: Deep Learning Up-Scaling.
 
As much as I don't like nvidia and DON'T CARE at all about DLSS, this DLAA is quite a different thing and I actually am interested in it.

From how much it's praised it seems to be the best AA tech there is atm. I'm waiting for screenshots and 800% zoom comparisons of course :)

P.S. AMD and Intel should take notice (for FSR 2.0's evolution and its derivatives, and for XeSS and its tech).
 
I agree. I think if the companies want to sell top-end hardware with more performance than is required for 4K 60, they should be doing image-quality improvement for people without 144 Hz monitors.
 
Better AA is for everyone at any tier and resolution that can afford some extra fps tax on their GPU. It's game dependent, but a welcome addition nonetheless.

Hopefully AMD and Intel come up with their own solutions too, so it won't be restricted to the Nvidia RTX tax again.
 
The naming for DLSS was already confusing to begin with because, as far as I know, “Super Sampling” refers to the method of rendering at higher than native resolution in order to get more detail and effectively replace anti-aliasing, similar to Nvidia's DSR. I assume they called it DLSS because the training data is at higher than native resolution, but in practice the output is equal to the native resolution and the render resolution is lower to give an FPS boost. In my opinion, what they are calling DLAA has more in common with super sampling than DLSS does, and DLSS should have a better name that describes its use case, such as DLUS: Deep Learning Up-Scaling.
I couldn't agree more. DLAA is *literally* using a deep learning-based super-sampling method, whereas DLSS does the opposite of super sampling (upscaling a low-res image to a high-res one). But I think DLAA is a good name because it's easy to understand for average consumers.
Evil? Techspot showing their AMD bias again? /jk
Almost slapped you.
 
The naming for DLSS was already confusing to begin with because, as far as I know, “Super Sampling” refers to the method of rendering at higher than native resolution in order to get more detail and effectively replace anti-aliasing, similar to Nvidia's DSR. I assume they called it DLSS because the training data is at higher than native resolution, but in practice the output is equal to the native resolution and the render resolution is lower to give an FPS boost. In my opinion, what they are calling DLAA has more in common with super sampling than DLSS does, and DLSS should have a better name that describes its use case, such as DLUS: Deep Learning Up-Scaling.
"Supersampling in simple terms is running the application at a higher resolution than your monitor supports."

You assumed and you were right, so I don't see where any confusion could come from. I came to the same conclusion myself.

"It could also have a small negative impact on performance if it adds more overhead than other anti-aliasing techniques." @mongeese

Think of it like an Ultra Ultra Quality preset using massive amounts of AA.
It reminds me of when NVIDIA had 32x CSAA as an option when SLI was still young.
 

If we want to keep things simple, neither DLSS nor DLAA uses a render resolution that is higher than your monitor's, so it's not super sampling. Just a little confusing, that's all 🤷
 
DLAA is responsible for just AA. DLSS uses that and more to upscale. You can't upscale an entire image just going after jaggies.
It would be misleading to call it anything else but DLAA.
As far as DLSS goes, it's been out long enough that everyone already knows what it is and does - and they aren't complaining. If they don't know what DLSS is or what Super-sampling is (it's not just for AA), it's a short google search away!

AA and DSR are not similar, and neither can replace the other.
DSR forces your GPU to render entire frames at the higher resolution before displaying them, like using the Resolution Scale option beyond 100% in games. DSR is far more taxing than AA alone, which is why almost no one used it; it was good for making grass and fences look better. SSAA was the most taxing AA method on a GPU early on, which is why other methods like MSAA, MFAA, TXAA and TAA came into play.
 
At 4K, anti-aliasing is not so important. Going to higher resolutions in the future, it will become completely irrelevant.
 
The best thing they've done recently is they've added multithreaded rendering so the GPU is now properly utilized: https://forums.elderscrollsonline.com/en/discussion/584299/

You have to go and check the box as it's not the default setting now, but after I checked it my 5600 XT is now pumping out 165 FPS (max refresh of my monitor) in most areas at 1440p, even where there's lots of other people. Before, crowded areas would tank the framerate, now it's smooth as butter.
 
At 4K, anti-aliasing is not so important. Going to higher resolutions in the future, it will become completely irrelevant.
It depends on your screen resolution; AA can sometimes fix rendering issues and stabilise the image (less pixel crawl and flickering). I don't think we'll move to 8K anytime soon.
 
Looks like many people don't actually know what super sampling is... It doesn't necessarily mean you have to upscale; that's just the **old way** of doing it. Super sampling is actually color samples taken at several instances **inside the pixel** (not just at the center as normal), with an average color value calculated from them. The easiest way **used to be** upscaling, or turning 1 pixel into 4, sampling each one and then using that information to build back down to a single pixel, but with AI and Tensor Cores this is no longer necessary and they can divide a pixel into 4 or more pieces without upscaling.

As long as you are taking more than one sample per pixel, it is still super sampling. Super sampling is used for many other D/A, A/D, and modeling applications than just graphics.
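The "several samples inside the pixel, then average" idea above can be sketched in a few lines of Python. This is a toy illustration with a made-up hard-edged scene (a circle), not Nvidia's or anyone's actual implementation: with one centre sample per pixel, edges come out as hard 0/1 steps; with a grid of samples per pixel, edge pixels average to intermediate values.

```python
# Minimal supersampling sketch (illustrative only, hypothetical scene).

def coverage(x, y):
    # Toy "scene": a hard-edged circle of radius 0.5 centred at (0.5, 0.5)
    # in a unit square. Returns the colour (0 or 1) at an exact point.
    return 1.0 if (x - 0.5) ** 2 + (y - 0.5) ** 2 <= 0.25 else 0.0

def render(width, height, samples_per_axis=1):
    # samples_per_axis=1 -> one centre sample per pixel (aliased);
    # samples_per_axis=2 -> 4 samples per pixel, 4 -> 16, etc.
    n = samples_per_axis
    img = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for sy in range(n):
                for sx in range(n):
                    # Evenly spaced sample positions inside the pixel,
                    # mapped into [0, 1) scene space.
                    x = (px + (sx + 0.5) / n) / width
                    y = (py + (sy + 0.5) / n) / height
                    total += coverage(x, y)
            # Average the samples taken inside this pixel.
            row.append(total / (n * n))
        img.append(row)
    return img

aliased = render(8, 8, samples_per_axis=1)  # values are only 0.0 or 1.0
smooth = render(8, 8, samples_per_axis=4)   # edge pixels land in between
```

The averaged image never renders or displays a frame larger than the target resolution, which matches the point that multiple samples per pixel, not upscaling, is what makes it super sampling.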
 
This might benefit sub-4K resolutions more, which is honestly a really good thing. If your monitor is <4K and you have frames to spare, you are not going to enable DLSS, especially if you already exceed your FreeSync/G-Sync range, but why not benefit from the excellent AA that DLSS has to offer? Of course, until we see it in action we won't know whether the AA is better or has less overhead than traditional AA methods, but these developers clearly see a benefit to it, and it's not even Nvidia that is pushing this, so that's a good sign.
 
The Creative Director is pushing it, because he's really happy with it:

"This is debuting their new tech, which is called NVIDIA DLAA. It's the same kind of concept (DLSS), you won't get a performance boost out of this but what you get is absolutely incredible anti-aliasing. It's unbelievable, it's crazy how good it is."
 