Testing AMD's new Radeon Image Sharpening: Is It Better than Nvidia's DLSS?

AMD recognizes that DX11 is the next step and they’ll consider adding that in a future software update if there’s “demand from the community for that feature.”

Wait, WHAT?! smh....
They know it's the next step, but will only consider it if asked to? Wow.
Why does AMD struggle so hard with software?

I won't post the link, but apparently the sharpening aspect of this software has already been modded to run on NVIDIA hardware.
 
DLSS is DOA; it looks bad and has a performance hit.

Self-driving cars exist because of Deep Learning, and despite being responsible for human fatalities, it has progressed to transport trucks and now airplanes. But you expect immediate results when it's used to improve gaming? That's not how it works, sorry.

You're just a gamer, so it's cool if all this goes over your head.
 
AMD needs to add a little noise reduction to its sharpening algorithm, as that's the major side effect. You definitely get a slightly grainier image, and don't forget that can also increase the perceived sharpness.

Also, the way sharpening works, it creates increased contrast around edges, causing a white halo on one side of the edge and a black halo on the other. For very fine subjects like foliage this can definitely increase the overall brightness. The same thing occurs when sharpening photographs. In more advanced sharpening tools you can control the white and black halos, and if I have a lot of fine foliage I will decrease the white halo considerably and boost the black halo to minimise the change in brightness. So this is not an AMD issue per se, it's just how sharpening works. Ultimately, AI sharpening will create actual detail and increase sharpness in a more advanced way, avoiding halos.
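For anyone curious what that halo control looks like in practice, here is a minimal sketch of a classic unsharp mask with separate gains for the white (positive) and black (negative) halos. This is only an illustration of the general technique, not AMD's or any photo tool's actual code, and the function name and gain values are made up.

import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, radius=1.5, white_gain=0.2, black_gain=0.8):
    # The unsharp mask is the difference between the image and a blurred copy.
    # Positive values brighten edges (white halo), negative values darken them
    # (black halo); weighting them separately lets you tame the white halo on
    # fine detail like foliage.
    blurred = gaussian_filter(image, sigma=radius)
    mask = image - blurred
    halos = np.where(mask > 0, mask * white_gain, mask * black_gain)
    return np.clip(image + halos, 0.0, 1.0)

# Example: a soft gradient edge, sharpened with a weak white halo and a
# stronger black halo to keep the overall brightness roughly unchanged.
img = np.tile(np.linspace(0.2, 0.8, 64), (64, 1))
sharpened = unsharp_mask(img)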
 
DLSS is DOA; it looks bad and has a performance hit.

Self-driving cars exist because of Deep Learning, and despite being responsible for human fatalities, it has progressed to transport trucks and now airplanes. But you expect immediate results when it's used to improve gaming? That's not how it works, sorry.

You're just a gamer, so it's cool if all this goes over your head.
Is that why Tesla dropped Nvidia?
Pure brilliance, thanks.
 
I'm not impressed by either team. Can't we focus on something more important? Maybe it's because I'm getting older, but I see no difference.
 
Here's the truth about Radeon Sharpening and their built-in color controls for games: they are miles behind Nvidia's Game Filter. Game Filter allows sharpening, clarity, HDR and a ton of other image modifications. AMD has a very long way to go.
Made me giggle how people can be so clueless.
Pretty sure I have an AMD card and an Nvidia card to compare, buddy. The level of sharpening seen in this article can be achieved with ReShade as well. You saying I'm clueless is a joke. It's image sharpening, nothing more.
 
I think NVIDIA will eventually drop the Tensor cores from gaming cards again and improve their RT technology only.
It's a bold first gen; they have to take some risks and go into the unknown when innovating, and not everything works out great immediately.

You clearly don't know what Deep Learning even is, let alone how it works....

Deep learning is a machine learning technique that teaches computers to do what comes naturally to humans: learn by example. Deep learning is a key technology behind driverless cars, enabling them to recognize a stop sign, or to distinguish a pedestrian from a lamppost. It is the key to voice control in consumer devices like phones, tablets, TVs, and hands-free speakers. Deep learning is getting lots of attention lately and for good reason. It’s achieving results that were not possible before.

In deep learning, a computer model learns to perform classification tasks directly from images, text, or sound. Deep learning models can achieve state-of-the-art accuracy, sometimes exceeding human-level performance. Models are trained by using a large set of labeled data and neural network architectures that contain many layers.
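To make the "learn by example" part concrete, here is a minimal sketch of a small multi-layer network being trained to classify images from labeled data. The data below is just random tensors standing in for a real labeled dataset, so it won't learn anything meaningful; it only illustrates the shape of the training loop.

import torch
import torch.nn as nn

# A tiny multi-layer ("deep") classifier: 28x28 grayscale images -> 10 classes.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 128), nn.ReLU(),
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 10),
)

# Stand-in "labeled data": random images paired with random class labels.
images = torch.randn(256, 1, 28, 28)
labels = torch.randint(0, 10, (256,))

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for step in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)  # compare predictions with labels
    loss.backward()                        # compute gradients
    optimizer.step()                       # adjust the weights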
Deep Learning sucks for games. Period. DLSS was impressive in tech demos because it's the same video over and over again, and the AI can learn and adapt to the pattern. But gaming is dynamic; no two sessions are the same, and variables are constantly changing. In that implementation, DLSS performs poorly, and the video and screenshots show it.
DLSS is really cool tech, and I don't doubt it has a lot of great implementations ahead of it, but when it comes to gaming, it's DOA.
 
Deep Learning sucks for games. Period. DLSS was impressive in tech demos because it's the same video over and over again, and the AI can learn and adapt to the pattern. But gaming is dynamic; no two sessions are the same, and variables are constantly changing. In that implementation, DLSS performs poorly, and the video and screenshots show it.
DLSS is really cool tech, and I don't doubt it has a lot of great implementations ahead of it, but when it comes to gaming, it's DOA.

You have no idea what you're talking about. Zero.

"Deep learning requires massive amounts of training data to work properly, incorporating nearly every scenario the algorithm will encounter."
 
"As more gamers get their hands on Navi GPUs, it will be interesting to see how broadly Radeon Image Sharpening is adopted and recommended based on good results."

The question is, are PC gamers going to buy Navi GPUs? Last time I checked the Steam Survey, AMD had something like 25% of cards while Nvidia had 73%, with the remainder being Intel integrated solutions. AMD is way behind in the PC market. If this tech becomes available next gen, it sure will be popular. If it stays locked to PC, my gut tells me people will continue to buy Nvidia.

Yes they will, but I don't think for long, once NVIDIA finally feels forced to drop 7nm Ampere and obliterate Navi. Sadly, even with 7nm and "RDNA", Navi still can't compete with Turing on 12nm (which is basically refined 16nm, like Intel's 14nm++++, ffs how many +). The optical shrink alone would give Ampere a 40% performance uplift, more than what you can squeeze out of refining Navi for next gen.
 
"As more gamers get their hands on Navi GPUs, it will be interesting to see how broadly Radeon Image Sharpening is adopted and recommended based on good results."

The question is, are PC gamers going to buy Navi GPUs? Last time I checked the Steam Survey, AMD had something like 25% of cards while Nvidia had 73%, with the remainder being Intel integrated solutions. AMD is way behind in the PC market. If this tech becomes available next gen, it sure will be popular. If it stays locked to PC, my gut tells me people will continue to buy Nvidia.

Yes they will, but I don't think for long, once NVIDIA finally feels forced to drop 7nm Ampere and obliterate Navi. Sadly, even with 7nm and "RDNA", Navi still can't compete with Turing on 12nm (which is basically refined 16nm, like Intel's 14nm++++, ffs how many +). The optical shrink alone would give Ampere a 40% performance uplift, more than what you can squeeze out of refining Navi for next gen.

I wouldn't count on Nvidia going 7nm anytime soon. They just released the Super cards, and Nvidia is historically slow when moving to a new node. That will likely go double now, given their massive die sizes. The only reason AMD can do 7nm with Navi is that the die size is much smaller. I wouldn't be surprised if AMD has 5nm cards out by the time Nvidia does its 7nm.
 
NVIDIA being slow to move to a new node has less to do with die size and more to do with less-than-ideal competition from AMD. They could easily cut the RT cores to make a smaller die, but those are mostly the reason to sell their cards, since Turing isn't much of a generational jump over Pascal. It's been a yawn-worthy wait since 2017, when the GTX 1080 Ti dropped.

And even if AMD drops 5nm vs. NVIDIA's 7nm, they likely still won't take the performance crown from NVIDIA. With the Radeon VII retired and NVIDIA releasing the 2080 SUPER, AMD is now lagging two tiers behind NVIDIA. At least with Vega they were only lagging one tier behind, and that was the GTX 1080 Ti.
 
This article is inaccurate, and I've seen a lot of inaccurate reporting on this; I think it comes from dishonest marketing from AMD.
No, it's not. TechSpot already took a deep look into DLSS a while ago, then they looked again and reported exactly what it does and where it fails:
https://www.techspot.com/article/1712-nvidia-dlss/
So your proof that this TechSpot article is accurate is... another TechSpot article? lol
DLSS is an anti-aliasing algorithm
No, it's not that either. Read TechSpot's article... "it’s more of an image reconstruction technique that renders a game at a sub-native resolution, then uses AI inferencing to upscale and improve the image"
Now, RIS may be completely different in what it does (as explained in the article) and so are the results, hence the explanation and comparison, since both try to achieve better performance with little image degradation.
Yes, it is. It is absolutely an anti-aliasing algorithm. That is what supersampling is (https://en.wikipedia.org/wiki/Supersampling), and Nvidia calls it that. That it works a bit differently than other AA algorithms is... the point. Why do you think they upscale (aka supersample) the image? It's to reduce aliasing. People get confused because you set a high resolution and then turn DLSS on, which is equivalent to setting a lower resolution and turning on SSAA (a rough sketch of that idea is below).
It makes no sense to compare RIS to DLSS directly, and it doesn't matter how many TechSpot articles you link to.
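For reference, here is a minimal sketch of what plain supersampling (SSAA) does: render at a higher resolution, then average blocks of samples down to the target resolution, which smooths jagged edges. This shows only the generic idea, not how DLSS works internally.

import numpy as np

def ssaa_resolve(hi_res, factor=2):
    # hi_res holds samples rendered at factor times the target resolution.
    # Each output pixel is the average of a factor x factor block of samples,
    # which is what removes the "jaggies" along edges.
    h, w = hi_res.shape[0] // factor, hi_res.shape[1] // factor
    blocks = hi_res[:h * factor, :w * factor].reshape(h, factor, w, factor)
    return blocks.mean(axis=(1, 3))

# Example: an aliased diagonal edge rendered at 2x, resolved down to 1x.
yy, xx = np.mgrid[0:128, 0:128]
hi_res_frame = (yy > xx).astype(float)
frame = ssaa_resolve(hi_res_frame, factor=2)  # 64x64 frame with a smoothed edge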
 
When RIS is used with GPU upscaling, it's very much comparable to DLSS, as they are two very different approaches to the same thing: rendering at a lower resolution but displaying a frame of near-similar visual quality to the higher resolution. On that basis alone, the article and its headline are valid.
The article literally compares DLSS and RIS at the same "base" resolution (which DLSS will then upscale) and then concludes that RIS is simply a better alternative because there is no performance loss. There is nothing there about using RIS with the same level of upscaling.
And there is no comparison on anti-aliasing at all, which is weird because DLSS is also doing an equivalent of TAA. One of the big downsides to downsampling is aliasing, and this article doesn't even touch on that, which is dumb because image sharpening increases that problem.
So yes, statements that this article makes like "It’s also clear that Radeon Image Sharpening is a superior equivalent to Nvidia’s DLSS, often by a considerable margin." are absolutely 100% asinine.
And then there are the contradictions:
"Radeon Image Sharpening isn’t simply a sharpening filter, it uses a contrast adaptive sharpening or CAS algorithm that AMD recently introduced in their FidelityFX suite."
and then this at the end:
"In our previous DLSS analysis we had stated a simple image sharpening filter would probably work better than DLSS with a lower performance cost, and it turns out AMD has proven us right by implementing exactly that."
(Nvidia also has their own simple image sharpening filter in Freestyle, but TechSpot chose to ignore that, of course.)
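For context on the "contrast adaptive" part quoted above, the general idea is to scale the sharpening strength down where local contrast is already high, so strong edges don't get the blown-out halos a fixed-strength filter would produce. The sketch below is a simplified illustration of that idea, not AMD's actual CAS code.

import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter, uniform_filter

def adaptive_sharpen(image, strength=0.5):
    # Local contrast = range of the 3x3 neighborhood (0 = flat, 1 = full range).
    local_contrast = maximum_filter(image, size=3) - minimum_filter(image, size=3)
    # Back off the sharpening amount where contrast is already high.
    amount = strength * (1.0 - local_contrast)
    blurred = uniform_filter(image, size=3)
    return np.clip(image + amount * (image - blurred), 0.0, 1.0)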
 
The article literally compares DLSS and RIS at the same "base" resolution (which DLSS will then upscale) and then concludes that RIS is simply a better alternative because there is no performance loss. There is nothing there about using RIS with the same level of upscaling.
There is some, in the form of the 4K 0.7x shader scaling in Metro Exodus, although it is a shame that 4K 0.7 ss wasn't tested with DLSS.

And there is no comparison on anti-aliasing at all, which is weird because DLSS is also doing an equivalent of TAA. One of the big downsides to downsampling is aliasing, and this article doesn't even touch on that, which is dumb because image sharpening increases that problem.
Time constraints possibly prevented that from happening? Try asking if the team could do a 'Part 2' version of the article, which could look at matters such as edge aliasing.

And then there are the contradictions:
"Radeon Image Sharpening isn’t simply a sharpening filter, it uses a contrast adaptive sharpening or CAS algorithm that AMD recently introduced in their FidelityFX suite."
and then this at the end:
"In our previous DLSS analysis we had stated a simple image sharpening filter would probably work better than DLSS with a lower performance cost, and it turns out AMD has proven us right by implementing exactly that."
(Nvidia also has their own simple image sharpening filter in Freestyle, but TechSpot chose to ignore that, of course.)
I would say that it's simply not the best way to phrase it, rather than an outright contradiction, but again, how about just asking the team to review its phrasing or maybe edit it slightly?
 
Awesome review!
With a future AIB card there should be a fairly good chance of a +5% overclock without too much effort. Add that to the 27% performance boost gained with 80% GPU scaling & RIS and you have a total performance rating of 166% (compared to a GTX 1080), whilst the 2080 Ti comes in at 169%! (Rough arithmetic below.)

This means that 80% GPU scaling with no significant quality loss, plus a decent overclock, gives you a 10k card for just around 4,5k!

Request: I would love to see a direct comparison of a 5700 XT with 80% GPU scaling + RIS against a 2080 Ti in like 10 games.
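Rough arithmetic behind those numbers; the stock index below is only an assumption, chosen so the multiplication lines up with the figures quoted above.

# Relative performance indices, GTX 1080 = 100%.
stock_5700xt = 124.0      # assumed stock 5700 XT index vs. a GTX 1080
scaling_and_ris = 1.27    # +27% from 80% GPU scaling + RIS (per the comment)
overclock = 1.05          # hoped-for +5% from an AIB card overclock

total = stock_5700xt * scaling_and_ris * overclock
print(f"{total:.0f}%")    # ~165%, in the ballpark of the quoted 166% vs. the 2080 Ti's 169%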
 
I'm confused by how the scaling works. The pics look nice and I could see myself using the sharpening plus a lower res for more fps. But how does 0.7 of 2160p equal 1800p? Isn't 0.7 of 4K much closer to 1440p?
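One likely explanation (assuming the 0.7 factor applies to the total pixel count rather than to each axis): scaling the pixel count by 0.7 means each axis scales by the square root of 0.7, which lands near 1800p. If 0.7 were applied to each axis directly, you would get about 1512p, which is indeed much closer to 1440p.

import math

native = 2160                                 # vertical resolution of 4K
from_pixel_count = native * math.sqrt(0.7)    # ~1807 lines, i.e. roughly 1800p
per_axis = native * 0.7                       # 1512 lines, closer to 1440p
print(round(from_pixel_count), round(per_axis))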
 
Because AMD is #2, they decided that they couldn't expect developers to go to the trouble of supporting Radeon Image Sharpening in their games. It's too bad about DirectX 11, though. But since there are plenty of 4K TV sets that upscale 1080p TV programs - obviously without any per-game support - when I first heard of DLSS I definitely felt that requiring support in games shouldn't be necessary. Ideally, there would be both options: a mode that works independently of support, and an option for games to support the resolution enhancement with better results.
 