AMD FSR 2.0 vs. DLSS: 8 Generations of GeForce and Radeon GPUs Benchmarked

So basically, if your card doesn't have DLSS, FSR 2.0 is really good...

This makes it a no-brainer for anyone with an older card or an AMD one...

What it isn't, is a replacement for DLSS - which will be in all Nvidia cards 2000 series and newer.

As the point of this, for AMD, is to make you purchase a new AMD card instead of a new Nvidia one... that makes it a failure.

Great for consumers - not so great for AMD... kind of wondering why they did it - but thanks!
As with G-Sync, AMD doesn't want Nvidia to have a proprietary solution and dominate the market with it. That's why they have to offer something less closed. And it works: most Linux users prefer an AMD card, though it seems Nvidia is going to follow suit and release open-source drivers. It also seems that FreeSync is winning over G-Sync. And I feel that RTX will soon be irrelevant... hopefully.
 
Adding to Avro's incredible post, it's really sad that people have forgotten the number one rule of being a customer:

You need to demand more for your money!

If a company supports or creates open standards and releases products that help the industry, and by extension you, then you give them your money.

Instead, we have corporate worshippers who worry more about the company making more and more money than about their own pockets, like the nvdrones.
 
Thanks for the very thorough review.

Looking at the results makes me wonder if memory has something to do with the differences in FSR 2 performance.

It would be really interesting to compare a 4GB 5500 XT to the 8GB model, as this should clearly show the effect memory has (or doesn't have).

Sure.
Being a temporal process, it needs access to a greater amount of data (the current frame plus previously stored frames), and some of that data is floating point. Say each pixel is made up of RGB color (24 or 32 bits: 8 per channel plus alignment or some extra data), motion vectors (2x32 or 2x16 bits) and depth (32 bits); that's 88 bits minimum. Then there is the spatial pass (FSR 2 is spatio-temporal), which reads at least 5 pixels for each one being processed, so we get to about 440 bits per processed pixel (maybe only 216 if most of the information comes from the central pixel). The algorithm also needs at least two frames, so multiply by 2 for 880 bits (or perhaps only 432).
Accessing different buffers or textures has a latency, which can be reduced if the data is interleaved (a single read fetches several of a pixel's attributes at once), but I don't know of any engine that produces its render data that way, unless the developer decides to do it specifically to use FSR more optimally.
I'd have to look at the code on GitHub.
But with all this, we can see that a better data layout to reduce latencies, higher memory speed, a larger cache and native FP16 processing are what make the difference.
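A rough back-of-envelope sketch of that arithmetic. The per-pixel sizes are my assumptions (24-bit color, FP16 motion vectors, 32-bit depth, a 5-tap spatial neighbourhood, one history frame), not values read out of the FSR 2.0 source:

```c
/* Back-of-envelope bandwidth estimate only; sizes are assumptions,
 * not taken from the FSR 2.0 source code. */
#include <stdio.h>

int main(void) {
    const int color_bits  = 24;      /* 8 bits per RGB channel        */
    const int motion_bits = 2 * 16;  /* FP16 motion vector (x, y)     */
    const int depth_bits  = 32;      /* depth buffer sample           */
    const int per_pixel   = color_bits + motion_bits + depth_bits;

    const int taps   = 5;            /* spatial neighbourhood reads   */
    const int frames = 2;            /* current frame + history       */

    printf("bits per input pixel      : %d\n", per_pixel);                /* 88  */
    printf("bits per upscaled pixel   : %d\n", per_pixel * taps);         /* 440 */
    printf("with history (two frames) : %d\n", per_pixel * taps * frames);/* 880 */
    return 0;
}
```

At 4K that is a lot of traffic per output pixel, which would fit the point above about data layout, memory speed and cache size making the difference.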
 
At one point last year AMD said RDNA3 might see FSR hardware acceleration added, and if true we could see huge gains. Already FSR 2 works better with the newer and faster cards, but hardware acceleration would be an enormous benefit. If that were the case it would be great to be able to choose your rendering resolution rather than, say, only having 1440p for 4K when choosing Quality. A hardware-accelerated 1800p would easily beat out a software-accelerated 1440p and offer higher fidelity.

I hope this feature is coming. I want to avoid Nvidia if I can this gen. I have no reason to doubt RDNA3 will be far more power efficient, and if they can massively improve the RT engines and get close to Lovelace, I'm in. They will easily be as good if not better at rasterisation, and we are expecting an ~80-100% uplift there from both camps.
Maybe some of the vector/matrix operations that FSR does will run 3 or 4 times faster on RDNA3, if it finally includes array accelerators. Convolutions, for example, which are also fundamental in neural networks, so those would be executed faster as well (training and inference).
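For anyone wondering what kind of operation that is, here is a tiny, purely illustrative 2D convolution in plain C (a generic 3x3 box blur, not FSR code); this multiply-accumulate pattern is exactly what matrix/array accelerators batch up:

```c
/* Illustrative 3x3 box-blur convolution over a 4x4 image; not FSR code. */
#include <stdio.h>

#define W 4
#define H 4

int main(void) {
    float img[H][W] = {
        {1, 2, 3, 4},
        {5, 6, 7, 8},
        {9, 8, 7, 6},
        {5, 4, 3, 2},
    };
    /* Each kernel weight is 1/9: every output pixel becomes the average
     * of its 3x3 neighbourhood. */
    float k[3][3] = {
        {1/9.f, 1/9.f, 1/9.f},
        {1/9.f, 1/9.f, 1/9.f},
        {1/9.f, 1/9.f, 1/9.f},
    };
    float out[H][W] = {0};

    for (int y = 1; y < H - 1; ++y) {
        for (int x = 1; x < W - 1; ++x) {
            for (int ky = -1; ky <= 1; ++ky)
                for (int kx = -1; kx <= 1; ++kx)
                    out[y][x] += img[y + ky][x + kx] * k[ky + 1][kx + 1];
        }
    }

    printf("out[1][1] = %.2f\n", out[1][1]); /* average of the top-left 3x3 block */
    return 0;
}
```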
 
Except it looks worse, considerably so, especially with the distracting shimmer and flicker.

I think this is affected by the rather high default sharpening setting. Lower sharpening could lessen the problem; we'll see whether that's true as more games support FSR 2 and more tests come in.
 
A ~20% boost on an old or low-end GPU is not worth the image quality loss. If I play at 1080p I'd rather lower some unimportant settings to gain that ~20% than use FSR 2.0.
 
A ~20% boost on an old or low-end GPU is not worth the image quality loss. If I play at 1080p I'd rather lower some unimportant settings to gain that ~20% than use FSR 2.0.
If it gives enough FPS to make the game playable without making it look like it was made in the '80s, then you will enable it. This could be huge for low-end laptops that struggle to reach or hold 60 FPS.

We just need more games to see how well it fares in different engines and situations.
 
Ehh, who the heck wants to play Deathloop anyway? PC gaming is so bad; that's why most Steam gamers play CS:GO, which is from like 1992. Also, most Steam gamers are 42 years old and live in mom's basement. Look it up, it's an actual verified fact.
 
You need to demand more for your money!
As it turns out, competition is the biggest driver for more value and open source alternatives. I do demand more, and I'm getting it right now, thanks to all parties that compete.

It just so happens that buying a card that was roughly in line with the entire market's price-to-performance ratio, and that came with extra features, pushed the market to develop alternatives, and now I get to enjoy them all. How awesome is that!
Instead, we have corporate worshippers who worry more about the company making more and more money
You're not wrong; the AMD worshippers have been the worst I've ever seen them lately.
 
Wooosh....
At least you tried your best, the ignore button strikes again. I don't have the time or inclination to tolerate your toxicity.
And you're not wrong, but who can blame them?
They spent years listening to the Nvidia/Intel weenies talking like their Mommies had just bought a better car than everyone else's Mommies.
And they're certainly handling it maturely.
 
while AMD has always been in favour of open standards (CrossfireX, Havok, OpenCL & FSR). This is why Linux users use almost exclusively Radeon cards.

Well, not quite. It wasn't until AMD released their AMDGPU driver that most sane Linux users started to flock over to Team Red, the reason mostly being that nvidia drivers are a nightmare to maintain and develop for while AMD's (and Intel's) are not.

Before the open source driver, nvidia was the only realistic way to go on Linux.
 
FSR 2.0 is great marketing for AMD and - like DLSS - raises the profile of upscaling on the PC where it has traditionally been overlooked.
 
Well, not quite. It wasn't until AMD released their AMDGPU driver that most sane Linux users started to flock over to Team Red, the reason mostly being that nvidia drivers are a nightmare to maintain and develop for while AMD's (and Intel's) are not.

Before the open source driver, nvidia was the only realistic way to go on Linux.
Yes, but that was a long time ago. In fact it was so long ago that it has long been irrelevant. The question is, how do you know that it's a nightmare to maintain nVidia drivers? I'm not saying it's not true, it's just that I've never heard of it, and I've been building PCs since 1988, so this is a chance for me to learn something new for once. What makes nVidia drivers a nightmare to maintain compared to ATi and Intel?
 
Yes, but that was a long time ago. In fact it was so long ago that it has long been irrelevant. The question is, how do you know that it's a nightmare to maintain nVidia drivers? I'm not saying it's not true, it's just that I've never heard of it, and I've been building PCs since 1988, so this is a chance for me to learn something new for once. What makes nVidia drivers a nightmare to maintain compared to ATi and Intel?
This isn't exactly my area of expertise, but to add to your comment: from what I have read so far, it looks like Radeons were the Linux user's go-to because of their support, even back when they were ATI. True?
 
"The question is, how do you know that it's a nightmare to maintain nVidia drivers?"

For a Linux VFX artist like me, Nvidia is really the only option atm. CUDA/OptiX is implemented everywhere and is a must these days, unfortunately or not. Their closed official drivers have always been stable, tbh.
 
Yes, but that was a long time ago. In fact it was so long ago that it has long been irrelevant. The question is, how do you know that it's a nightmare to maintain nVidia drivers? I'm not saying it's not true, it's just that I've never heard of it, and I've been building PCs since 1988, so this is a chance for me to learn something new for once. What makes nVidia drivers a nightmare to maintain compared to ATi and Intel?
AMDGPU is like 5 years old. Not that long ago.

As for the nvidia driver, it is a nightmare to work with because they usually don't follow community standards, while Intel and AMD do. So if you are writing a piece of software that has to support an nvidia card, you will probably end up needing to write extra or special code to support their hardware.

Examples? Wayland usually has issues specific to nvidia that the other vendors do not have, because nvidia decided not to use the GBM buffer API that everybody else was using and went with EGLStreams instead, forcing every DE and compositor developer to add EGLStreams support to their software if they wanted it to run on nvidia hardware.
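For context, this is roughly what the common GBM path looks like, a minimal sketch and not code from any actual compositor (device path and surface size are just illustrative): allocate buffers through libgbm on a DRM node and hand them to EGL. Mesa's AMD and Intel drivers expose this path; the proprietary nvidia driver historically required a separate EGLStreams path instead.

```c
/* Minimal sketch of the GBM + EGL path compositors use on Mesa drivers.
 * Illustrative only; error handling and rendering setup are omitted. */
#include <fcntl.h>
#include <gbm.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>

int main(void) {
    int fd = open("/dev/dri/card0", O_RDWR);          /* DRM node (illustrative path) */
    struct gbm_device *gbm = gbm_create_device(fd);   /* GBM buffer allocator         */

    /* A surface the compositor can both render to and scan out. */
    struct gbm_surface *surf = gbm_surface_create(
        gbm, 1920, 1080, GBM_FORMAT_XRGB8888,
        GBM_BO_USE_SCANOUT | GBM_BO_USE_RENDERING);

    /* Hand the GBM device to EGL; from here on it is ordinary EGL/GLES. */
    EGLDisplay dpy = eglGetPlatformDisplay(EGL_PLATFORM_GBM_KHR, gbm, NULL);
    eglInitialize(dpy, NULL, NULL);

    /* ... create an EGLConfig/EGLSurface from `surf` and render ... */

    eglTerminate(dpy);
    gbm_surface_destroy(surf);
    gbm_device_destroy(gbm);
    return fd >= 0 ? 0 : 1;
}
```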

Sway, I hear, doesn't even support nvidia hardware running on the official driver for the same reasons explained above. You can use the nouveau driver, which is simply not up to par most of the time.

Also, many distros can't even distribute the official driver in their main repos because of licensing, which is not only a hassle for the user but also a security hole.

And finally, in my own experience over the last 3-4 years with an nvidia laptop, I can't count how many times my external monitor has stopped working after an update, or how many hacks here and there I have had to apply to fix issues that popped up from time to time. None of these issues applied on my previous (Intel) laptop, and I hear from AMD users that theirs is also a very smooth experience.

In my case, it is clear that as long as I have to work under Linux I won't be buying any more hardware from them. Not even after they open-source their drivers :D
 
AMDGPU is like 5 years old. Not that long ago.

AMDGPU was released "officially" in April 2015. We've also had some form of it much earlier (beta versions?). For example (2014 article):

It's been a while since then.
 
This isn't exactly my area of expertise, but to add to your comment: from what I have read so far, it looks like Radeons were the Linux user's go-to because of their support, even back when they were ATI. True?
I'd love to say yes but I really don't know because I've never been a Linux user. The only operating systems that I've used have been MS-DOS, OS/2 and Windows (if you exclude Android that is).
 