Make no mistake, FidelityFX Super Resolution (or FSR, for short) is AMD’s direct competitor to Nvidia’s DLSS, a feature that has increasingly become a key selling point for GeForce graphics cards over the last 18 months. It’s taken AMD a long time to prepare its own upscaling feature, but starting today AMD is ready to compete, and the next step is bringing FSR support to more games.
This article is going to be a comprehensive overview of FSR. We’ve got a stack of quality comparisons at different resolutions, performance benchmarks across several GPUs, and comparisons to other upscaling technologies. There is a lot of ground to cover.
What is FSR?
FSR is an upscaling technology that's designed to improve the performance of games at minimal loss to visual quality. Like DLSS and checkerboard rendering, the concept is that you can reduce the game’s rendering resolution, and use an algorithm to upscale the image to your target resolution, typically the native resolution of your monitor. Reducing the rendering resolution improves performance substantially, and if the upscaling technique is good enough, this can be achieved while preserving most of the image detail.
The most prominent form of this today is Nvidia's DLSS, the company’s proprietary AI-based temporal upscaling solution that runs on GeForce RTX GPUs. Temporal upscaling means data is accumulated from multiple frames and combined into the final image, with the AI component running on Nvidia’s Tensor cores to assist with this reconstruction. DLSS has gone through several iterations, and now at version 2.0 it's a major improvement over the initial release, having also gathered decent game support after much work from Nvidia.
FSR takes a different approach. Instead of using temporal upscaling, FSR relies exclusively on spatial upscaling. AMD tells us that AI is not used at any stage of the FSR process (so FSR is not the technology described in that patent that's been floating around). This greatly simplifies the algorithm – spatial upscaling does not rely on data from multiple frames, or motion vectors, which makes it easier to integrate into games as there are fewer data inputs. However, with less data to work with, spatial upscaling algorithms need to be really good at figuring out how to reconstruct the image, and traditionally this is where they've fallen short.
AMD hasn't gone into great detail on how their algorithm works, but they tell us this is not a simple rehash of bilinear upscaling, which is the ‘standard’ method for spatial upscaling. AMD calls their technique an "advanced edge reconstruction algorithm," which is combined with a sharpening pass to create the final image. There is only one input to the algorithm, which is the lower resolution frame.
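AMD hasn't published the details of its edge reconstruction, so as a rough illustration only, here's the general two-pass structure the paragraph describes: one spatial upscale pass followed by one sharpening pass, operating on a single input frame. This sketch uses plain bilinear sampling and an unsharp mask, which is explicitly not AMD's algorithm, just the simplest stand-in for each stage:

```python
import numpy as np

def bilinear_upscale(frame: np.ndarray, scale: float) -> np.ndarray:
    """Upscale an H x W x C frame by `scale` with bilinear sampling."""
    h, w, _ = frame.shape
    out_h, out_w = round(h * scale), round(w * scale)
    # Map each output pixel centre back to a source coordinate.
    ys = np.clip((np.arange(out_h) + 0.5) / scale - 0.5, 0, h - 1)
    xs = np.clip((np.arange(out_w) + 0.5) / scale - 0.5, 0, w - 1)
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    fy = (ys - y0)[:, None, None]   # vertical interpolation weights
    fx = (xs - x0)[None, :, None]   # horizontal interpolation weights
    tl = frame[y0][:, x0]           # four nearest source texels
    tr = frame[y0][:, x0 + 1]
    bl = frame[y0 + 1][:, x0]
    br = frame[y0 + 1][:, x0 + 1]
    return (tl * (1 - fy) * (1 - fx) + tr * (1 - fy) * fx
            + bl * fy * (1 - fx) + br * fy * fx)

def sharpen(frame: np.ndarray, amount: float = 0.5) -> np.ndarray:
    """Unsharp mask: boost each pixel's difference from a local average."""
    blur = frame.copy()
    blur[1:-1, 1:-1] = (frame[:-2, 1:-1] + frame[2:, 1:-1]
                        + frame[1:-1, :-2] + frame[1:-1, 2:]) / 4
    return np.clip(frame + amount * (frame - blur), 0.0, 1.0)

def spatial_upscale_and_sharpen(frame: np.ndarray, scale: float = 1.5) -> np.ndarray:
    """Single-frame input, upscale pass, then sharpen pass."""
    return sharpen(bilinear_upscale(frame, scale))
```

The point of the sketch is the data flow: no motion vectors, no frame history, just one lower-resolution frame in and one higher-resolution frame out, which is exactly why this class of technique is so much easier to integrate than a temporal upscaler.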
Despite being a simpler technique, FSR still requires per-game integration. This is because FSR needs to be run before the final effects stage in the pipeline, so before the HUD is rendered, and before things like film grain are implemented. If FSR was applied to the final frame output from a game, many elements (like the HUD) would be upscaled, likely with artifacts and other visual problems. By optimizing each game, you can make sure only the intended portion of the game is run at a reduced render resolution.
AMD has taken the approach of spatial upscaling for two reasons. The first is ease of integration. FSR is a single compute shader with simple data inputs, and it's open source technology. The second is that by not using AI, AMD hopes to achieve broader support: from the get-go FSR supports AMD GPUs from the Radeon RX 480 era (2016) and newer, as well as Nvidia GPUs from the GeForce 10 series and newer. That means popular mainstream GPUs like the GTX 1060 and GTX 1660 can take advantage of FSR.
FSR vs. DLSS
Some people will compare the way FSR works to the way DLSS works and might conclude that FSR is not an actual competitor to DLSS because they work in fundamentally different ways. But in my opinion that's not true. In practical terms, both technologies have been designed and created to achieve the same goal: improve gaming performance by rendering at a lower resolution, but at minimal loss of visual quality through upscaling.
It’s only fair to directly compare them, and we’ll do that later in this article.
FSR offers four quality modes: Ultra Quality, Quality, Balanced and Performance.
Each has a distinct scale factor: 1.3x for Ultra Quality, 1.5x for Quality, 1.7x for Balanced and 2.0x for Performance. What this means is that at a target resolution of 4K, the Quality mode runs at 1440p, and the Performance mode at 1080p, with the other modes somewhere in-between.
FSR supports any resolution, it’s not limited to 4K and 1440p, and the scale factors will apply in the same way. FSR also works with dynamic resolution scaling and theoretically allows for any scale factor, but for now AMD is keeping it simple with four fixed modes.
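The mode-to-resolution mapping is just division by the per-axis scale factor. A quick sketch (the mode names and factors are AMD's published figures; the rounding of fractional results is an assumption for illustration):

```python
# FSR quality modes and their per-axis scale factors.
MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(target_w: int, target_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given target (output) resolution."""
    scale = MODES[mode]
    return round(target_w / scale), round(target_h / scale)

for mode in MODES:
    print(mode, render_resolution(3840, 2160, mode))
```

At a 4K target this gives 2560x1440 for Quality and 1920x1080 for Performance, matching the figures above.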
AMD has promised support in other titles coming soon, including Far Cry 6, Resident Evil Village, Dota 2 and Baldur’s Gate 3 as some of the more recognizable names. Necromunda: Hired Gun will also be an interesting one as that game already supports DLSS, so it might become our first true DLSS vs FSR comparison.
But at the end of the day, this support list is just a promise. Nvidia did the same when they launched DLSS, but many of those initial titles never came to support DLSS, and it's taken significant effort (and time) to bring DLSS 2.0 to a decent variety of titles.
Image Quality Comparison
For the upcoming section on quality comparison, we captured images and video on a Radeon RX 6800 XT using a new driver that adds FSR support. Our test system ran a Ryzen 7 5800X along with 16GB of DDR4-3200 memory.
This comparison is much better suited to live video footage and commentary. You can watch my Hardware Unboxed video below with all the details. Or you can read my conclusions below the video, followed by FSR performance benchmarks.
FSR holds up quite well when using the Ultra Quality or Quality modes at 4K. Without zooming in, these modes look similar to native rendering in Godfall, which is the kind of result you want to achieve. As you move down to Quality there is some softness introduced but it holds up well, even for elements like foliage. However, the Balanced and Performance modes are a clear step down in quality, with the latter introducing noticeable blurriness; it's not the image quality you’d expect from native 4K.
I was impressed with FSR’s ability to preserve fine detail with the Ultra Quality and Quality modes. Image quality also holds up well in a game like Anno 1800, which is another title that has a lot of fine detail in its native presentation, nice and sharp overall.
If you are at all concerned with image quality (which you probably are if you're gaming at 4K), there is no way you’d play using the Performance mode. I’d turn down other image quality settings to improve frame rate before using FSR Performance, as it’s basically a blur filter.
Moving on to 1440p image quality, FSR isn’t quite as good at this lower resolution, and there’s a narrower range of quality modes which are usable in my opinion. There are only small differences between the 1440p native image, and 1440p using FSR in its Ultra Quality mode, but as we move down to the Quality mode, the image becomes softer. As expected, when we use the Balanced or Performance modes, image quality falls away substantially and is quite blurry, especially in the Performance mode. FSR is simply not very good at upscaling a 720p image to 1440p and retaining the clarity of the native presentation.
Then at 1080p, FSR is not amazing. Even using the Ultra Quality mode, there's a noticeable loss to detail compared to the native presentation.
Benchmarks: FSR with Radeon GPUs
Now it’s time to talk performance. We had a few choices on how to test FSR performance based on the amount of time we had to play with it, and we decided to stick with testing one game across eight GPUs using three resolutions and all four FSR modes.
We could have tested more than just Godfall, but the other games AMD provided were a lot more difficult to benchmark, and if we had tested them we wouldn’t have been able to do as comprehensive a visual analysis. Godfall was tested using Epic settings, no additional sharpening, no ray tracing, and no motion blur unless otherwise specified. We don’t think the ray tracing implementation in Godfall is worth enabling, so we didn’t.
Using the Radeon RX 6800 XT, the performance uplift provided at 4K is impressive. Moving from native to Ultra Quality FSR took the average frame rate to 100 FPS, a 44 percent improvement. The Quality mode is also usable at this resolution, and saw a 65 percent uplift over native. The lower modes improve on this again, but don’t deliver 4K image quality.
At 1440p, we're more likely to be CPU limited on such a powerful GPU, especially as we lower the render resolution. However despite this, moving from 1440p native to Ultra Quality FSR still provided a 25 percent performance improvement at a minimal impact to visual quality. The Quality mode approaches a 40% increase, but I wouldn’t recommend it.
Interestingly we also get to see the overhead that FSR requires, given the 4K Quality mode renders at the same resolution as 1440p native. With this RDNA2 GPU we see an 8% reduction in frame rate compared to native rendering, which is not nothing, but in the grand scheme of things is small for something run on the same shader units as the game itself.
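The overhead estimate above comes from comparing frame rates at the same internal render resolution. A quick sketch of that arithmetic with illustrative numbers (not our measured data):

```python
def fsr_overhead_percent(native_fps: float, fsr_fps: float) -> float:
    """Percent frame-rate cost of the FSR pass versus rendering the same
    internal resolution natively (same GPU, same scene)."""
    return (1 - fsr_fps / native_fps) * 100

# Hypothetical example: native 1440p at 130 FPS vs. 4K FSR Quality
# (which also renders 1440p internally) at 119.6 FPS -> ~8% overhead.
print(f"{fsr_overhead_percent(130, 119.6):.0f}%")
```

This isolates the cost of running the upscale and sharpen passes on the shader units, since everything else about the rendering workload is held constant.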
The Radeon RX 5700 XT benefits in a similar way to the 6800 XT. At 4K I achieved a 41% uplift over native using the Ultra Quality mode, and 66% improvement using the Quality mode.
The Quality mode basically makes the 5700 XT a 4K capable GPU in this game, as performance rises from a hard to play 35 FPS to nearly 60 FPS. The 5700 XT also benefits more at 1440p than the 6800 XT, with a 29% uplift going from native rendering to FSR Ultra Quality.
Vega 64 is the next AMD GPU we have for testing, using a prior generation architecture. At 4K we see a 39% performance improvement for the Ultra Quality mode, and 60% for Quality. Then at 1440p, Vega 64 benefited to a lesser extent than prior GPUs with just a 20% uplift for the Ultra Quality mode. This appears to be down to a higher performance overhead on the older architecture, a 10% reduction in frame rate comparing 4K Quality to 1440p native, higher than the 8% on RDNA2.
With the RX 570 4GB, Godfall doesn’t play well at all unless we use Medium settings, as the game requires well over 4GB of VRAM at higher settings.
FSR doesn’t solve VRAM limitations to that degree. But using Medium settings, I saw a substantial benefit from switching from 1440p native to 1440p Ultra Quality, a 47 percent uplift, which takes this GPU out of the unplayable zone at this resolution.
Benchmarks: FSR with GeForce GPUs
FidelityFX Super Resolution is also supported on Nvidia GPUs, so let’s test that out.
The RTX 3080 is more prone to being CPU limited as we lower the resolution, and isn’t as powerful as the RX 6800 XT at lower resolutions. This is why we only saw a minimal 14% performance uplift at 1440p: we’re basically rendering at 1080p here, where the CPU becomes the limit. At 4K, the performance uplift was more in line with the 6800 XT, but to a lesser extent...
The Ultra Quality mode provided a 34% increase to frame rate, compared to 44% on the 6800 XT. Quality provided a 53% uplift, compared to 65% on AMD’s equivalent GPU. These numbers are nothing to scoff at though, for 4K gamers after a performance boost at minimal image quality impact.
The RTX 2070 plays very nicely with FSR. At 4K, we saw a 46 and 74 percent performance increase for the Ultra Quality and Quality modes, respectively. These are some of the highest results yet, and it seems that FSR is able to increase your frame rate more the lower the native frame rate is.
At 1440p we saw a 30 percent performance boost using FSR Ultra Quality, which is similar to the 5700 XT at this resolution. The performance overhead on this Nvidia GPU is in line with the overhead on AMD’s RDNA2 GPUs, so it doesn’t look like this feature has been written in a specific way to hurt performance on Nvidia GPUs.
With the GTX 1660 Super you can expect huge performance uplifts from FSR at certain resolutions given the card isn’t as powerful as some of the others we’ve been testing. At 4K I saw a 59% performance increase moving from 4K native to Ultra Quality, and nearly double the performance using the Quality mode. Meanwhile at 1440p, the 1660 Super also benefited from 33% higher performance going from native to Ultra Quality FSR.
Then with the Pascal-based GTX 1070 Ti, the performance story is very similar to the previous seven GPUs tested. A 52% and 82% performance increase for Ultra Quality and Quality at 4K respectively, with a 29% increase at 1440p Ultra Quality.
This makes the 1070 Ti a much more playable GPU at both resolutions, and while I probably wouldn’t use this card at 4K in this game, the gains at 1440p are very decent for improving the smoothness of gameplay.
The State of FSR: What We Learned
Time to give my thoughts after this first look at AMD’s FidelityFX Super Resolution. I have to say that I'm surprised with what AMD's been able to achieve with FSR. At the time of the announcement when AMD explained how FSR would work, I was expecting it to be decent but not overly impressive, ultimately falling somewhere between DLSS 1.0 and 2.0 in quality. But the reality is that FSR is pretty decent in some circumstances, and is competitive with DLSS 2.0 at times.
Based on my testing so far, my recommendations are straightforward: at 4K both the Ultra Quality and Quality modes get pretty close to native rendering, while delivering a ~40% performance uplift with the Ultra Quality mode, and ~65% with Quality, across a range of GPUs from both AMD and Nvidia. I think most gamers will gladly take this sort of performance uplift for the relatively minor impact to visual quality that FSR has in most circumstances.
FSR is also usable at 1440p but using the Ultra Quality mode only. Using this setting, image quality is pretty good and gets close to native. Provided that you are not CPU limited when lowering the render resolution, I was able to typically achieve a ~30% performance improvement in Godfall.
The other modes are not particularly usable. The Balanced and Performance modes suffer from a noticeable loss of quality and introduce artifacts like shimmering, regardless of the resolution. Also at 1080p, FSR doesn’t really cut it even with the Ultra Quality mode enabled. FSR is better than traditional upscaling, though, so perhaps these lower modes will come in handy for integrated graphics, where anything is better than nothing.
Now, when comparing FSR and DLSS 2.0, it's more complicated. In the best cases, FSR is pretty competitive with DLSS 2.0, although we aren’t able to compare both in the same game just yet. However based on my extensive testing and re-testing of these techniques, in the higher quality modes, the image quality FSR offers is only very marginally behind DLSS while providing a similar performance uplift. It also doesn’t suffer from ghosting in motion, as FSR is not a temporal solution, and the performance overhead of using FSR appears lower than DLSS for a given render resolution.
However, DLSS 2.0 is clearly better at upscaling from lower render resolutions, such as transforming 1080p into 4K. A spatial upscaler is simply not going to be as good as a temporal upscaler that can gather information from more sources in its quest to reconstruct the image. What this means in practice is that DLSS 2.0 is much better in its Balanced and Performance modes compared to FSR; where FSR can be blurry, DLSS preserves more detail. Both techniques can suffer from shimmering or similar artifacts depending on the situation.
While DLSS 2.0 may be technically superior in some situations, AMD’s counter for that is broader support and ease of integration. FSR works on lower-end products, even those from Nvidia, which is pretty neat. A $230 GPU (in normal times) like the GTX 1660 Super is not fast enough to run Godfall at 1440p using Epic settings, but with a 33% performance boost from FSR in the Ultra Quality mode that resolution becomes more of a reality without turning down any other settings. Nvidia isn’t catering for this customer with DLSS, which puts us in this weird situation where AMD is doing a better job of supporting Nvidia’s mainstream GPU buyers than Nvidia.
It will also cause all sorts of discussions among developers. With limited resources, do they integrate FSR, which caters to a wider audience but isn’t quite as good as DLSS 2.0, or go for the better quality option in DLSS that’s limited to RTX GPU owners? Obviously, the preferred option is to include both, however that may not be feasible at most studios, and even some big developers may choose the easy route here. There’s a lot to play out in the next few years, and this appears to be AMD’s strategy.
Supporting more GPUs is nice, but it means very little if FSR isn’t implemented in more games. Nvidia's had a multiple-year head start to develop their DLSS ecosystem, and that’s paying off now with DLSS 2.0 supported in many big titles.
It’s all well and good to show seven games at launch, but honestly the list of FSR games you can play right now doesn’t get me excited. DLSS 2.0, on the other hand, is supported in Fortnite, Call of Duty Warzone, Cyberpunk 2077, Metro Exodus Enhanced Edition, Death Stranding, and so on. Those are big games, including titles that remain popular today.
Now of course, AMD has to start somewhere, and launching in more than one game with several more promised and with decent technology is better than I was expecting. Not every game has DLSS support either, but the point is that FSR isn’t a selling point for AMD right now, just as DLSS wasn’t a selling point for Nvidia when it launched. It will take time for FSR to be supported in an adequate amount of games, whereas DLSS is miles in front and today it's something that might sway you to Nvidia's GPUs.