Mod adds AMD FSR 3 frame generation to titles that don't officially support it

Daniel Sims

The big picture: A week after AMD published the FSR 3 source code on GPUOpen, modders have predictably started unofficially integrating the company's frame generation technology into various games. Additionally, it is possible to combine this feature with Nvidia's DLSS upscaling. Meanwhile, Intel plans to respond to frame generation using a fundamentally different method.

A mod is now available that adds AMD's FSR 3 frame generation to titles that don't officially support it. The mod requires an Nvidia RTX graphics card and is compatible with games that include DLSS 3 frame generation, as it replaces that feature in the settings menu.

To install the mod, first, ensure hardware-accelerated GPU scheduling is enabled and HDR is disabled. Then, extract the files "dlssg_to_fsr3_amd_is_better.dll" and "dbghelp.dll" into the directory that contains a game's executable. A notification of successful installation should appear upon booting the game. In the settings menu, turn on frame generation and choose between DLSS and FSR upscaling.
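The file-copy portion of those steps can be sketched as a small script. This is a hypothetical helper, not something the mod ships with; the DLL filenames are the ones named above, but any paths you pass in are your own.

```python
# Hypothetical install helper for the steps above. The caller supplies
# the folder the mod was extracted to and the folder holding the game's
# executable; the script only performs the file copy.
import shutil
from pathlib import Path

MOD_FILES = ["dlssg_to_fsr3_amd_is_better.dll", "dbghelp.dll"]

def install_mod(mod_dir: Path, game_dir: Path) -> list[Path]:
    """Copy the mod's two DLLs next to the game executable."""
    copied = []
    for name in MOD_FILES:
        dest = game_dir / name
        shutil.copy2(mod_dir / name, dest)
        copied.append(dest)
    return copied
```

Note that enabling hardware-accelerated GPU scheduling and disabling HDR still has to be done in Windows settings beforehand; a script like this only handles the file placement.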

The mod's author has only tested it with Cyberpunk 2077 and The Witcher 3.

Team Red's answer to DLSS 3 debuted in October in Forspoken and Immortals of Aveum. Earlier this month, it came to Avatar: Frontiers of Pandora and Like a Dragon Gaiden: The Man Who Erased His Name. Additionally, official FSR 3 support is coming to titles like Cyberpunk 2077, Squad, Like a Dragon: Infinite Wealth, Starfield, and Black Myth: Wukong.

The primary advantage of FSR 3 is that it supports all GPUs since Radeon RX 5000 and GeForce RTX 2000, whereas DLSS 3 is locked to RTX 4000. However, TechSpot's analysis showed that FSR 3's image quality is generally less stable than DLSS 3.

One reason is that official FSR 3 implementations require FSR upscaling, which often produces more artifacts than DLSS. However, the FSR 3 mod allows users with RTX 2000 and 3000 GPUs to combine AMD's technology with DLSS upscaling and ray reconstruction for potentially better results.

Meanwhile, Intel recently released a paper discussing its take on frame generation, called ExtraSS. Like FSR 3 and DLSS 3, ExtraSS uses AI to create new frames and artificially increase perceived framerates. However, that's where the similarities end.

While FSR 3 and DLSS 3 use interpolation – generating frames based on past and future output frames – ExtraSS opts for extrapolation, only using prior frames. While extrapolation might result in inferior image quality or lower stability compared to interpolation, it doesn't add latency like FSR 3 and DLSS 3, resulting in more responsive controls.
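The latency difference can be illustrated with a toy model that treats each frame as a single scalar instead of an image. Nothing below comes from Intel's paper or AMD's code; it is just the textbook distinction between the two strategies, under the simplifying assumption of linear motion.

```python
# Toy model: each "frame" is one scalar value instead of an image.
# Illustrative only -- real frame generation uses motion vectors and AI.

def interpolated_frame(past: float, future: float) -> float:
    """FSR 3 / DLSS 3 style: blend a past and a FUTURE frame.
    The future frame must be rendered before the generated one can be
    shown, so real frames are held back -- that is the added latency."""
    return (past + future) / 2

def extrapolated_frame(older: float, newer: float) -> float:
    """ExtraSS style: predict forward from prior frames only.
    Nothing is held back, so no latency is added, but fast or erratic
    motion can be mispredicted, hurting image stability."""
    return newer + (newer - older)
```

With frames arriving as 10, 20, 30, interpolating between 20 and 30 yields 25 but requires waiting for 30 to render; extrapolating from 10 and 20 predicts 30 immediately, which is exactly why it cannot add input lag but can guess wrong.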

Intel hasn't yet indicated when it will debut ExtraSS or what games will support it.


 
The fact that these two technologies have a possibility to work together, even with problems, is simply amazing.
What if incompatibility is not as common as we think? What if it is more often just a tool to improve sales?
 
Good to see that now it's out in the open, modders are acting fast; combining DLSS (upscaling) with AMD FG could be a fun little combo.

I don't know why people still cling to the notion that rendering at a given screen's native resolution = unmatched quality, a peak that cannot be surpassed. If it's the best, why does super sampling exist?

Especially when underpinned by super meh TAA, it's no wonder upscaling can provide better results than that, even starting from a lower resolution.
 
DLSS terminology is really sketchy. Nvidia frame generation is 40 series only. DLSS 3.5 can benefit 30 series cards now, so when it says "only available to cards that support DLSS 3", do we mean cards already offered frame generation (40 series) or cards that can run 3.5 (20 series, so all RTX)? It's annoyingly complex and referred to by people at different stages of "history"/understanding etc. A year ago, articles about DLSS 3 wouldn't imply 20/30 series compatibility at all.

Also. Side note. As a 3060ti owner @1440p, I am finding the option of an AMD card as my next card (a 7800xt or higher) much more attractive than a 4070 super with 12GB and a 12 pin power connector. As all the new 40 series super cards will have. And I thought DLSS was great but the ram limitations in nvidia cards are taking the biscuit and the bus limitations so cards are slower at 4K too. I think I’m switching teams next upgrade.
 
Unfortunately you will be forced to use it...
Yeah, with the amount of AI being used for different aspects, it isn't as simple as someone designing a level using a design tool and then those triangles being stuck on a screen. Bump mapping, different forms of FSAA, different graphics cards using different algorithms. There's been an element of image reconstruction going on for a long time.
 
Almost all modern 3D games use TAA, you are never seeing “100% resolution quality” and it’s been like that for 10+ years now…
In Vermintide 2 I subjectively can't tell the difference between DLAA (native DLSS) and native anti-aliasing (TAA) at 4K max settings. Initially the DLAA did have distracting artifacts, but they were ironed out. Also, at 120 Hz the DLAA option uses significantly more power than TAA in this title: 350 watts vs 275 watts respectively.
 
Almost all modern 3D games use TAA, you are never seeing “100% resolution quality” and it’s been like that for 10+ years now…
I don't use TAA. If anything, I use forced MSAA. But DLSS/FSR are far more aggressive than TAA in reducing image quality, which is more germane to what I'm talking about.
 
In Vermintide 2 I subjectively can't tell the difference between DLAA (native DLSS) and native anti-aliasing (TAA) at 4K max settings. Initially the DLAA did have distracting artifacts, but they were ironed out. Also, at 120 Hz the DLAA option uses significantly more power than TAA in this title: 350 watts vs 275 watts respectively.
Okay, my point wasn't about the differences in image quality; my point was to dispel the notion that you get, and I quote, "100% resolution quality" in modern games. You don't, and haven't for quite some time now.
I don't use TAA.
You can't turn it off, which is why your comment makes no sense. Let's use Cyberpunk 2077 as an example, since it's popular for this subject and supports all of the above.

If you turn all the settings off in the menus (DLSS, FSR, etc.) so it's "native", it's actually not; you need to mod the game to turn off TAA. Here's an example mod:

If you do this, you'll lose things like screen space reflections; lots of parts of a scene in modern games are blended together using TAA, so it's on by default and cannot be turned off.

This has been going on for years, by the way; your "100% resolution quality" comment is simply false in modern games. Not all of them: I'm talking specifically about highly graphical modern games, such as all games being made in UE5.

Edit: Found a YouTube video of TAA being turned on and off in Cyberpunk:
As you can see, the game looks TERRIBLE without TAA, if you needed any more evidence that you are definitely playing with TAA enabled.
 
I don't use TAA. If anything I use forced MSAA. But DLSS/FSR are far more aggressive than TAA in reducing image quality; which is more germane to what I'm talking about.

Forced MSAA is something that hasn't worked for a VERY LONG time; if anything, you could use downsampling as a form of SSAA. But even at 4K, without TAA you're gonna get noticeable pixel crawling and shimmering.
 
Tried in Cp2077, works great on rtx3080 10G, animations and camera are visibly smoother, does not feel laggy at all.
Base fps around 60 (PT w. 1440p DLSS balanced), with fsr3fg it's around 100.
Nvidia need to make dlssFG available for 2000/3000 at the very least now.
Though to be completely honest, the main thing that keeps me interested in their cards is path tracing+ray reconstruction and dlssSR, so I'm staying on team green whether they do that or not.

I mean, RT psycho to PT is literally like low to ultra to me

I use forced MSAA.
So you're using an outdated (almost ancient in this industry), shimmering mess that halves the performance and still doesn't get rid of most jaggies, and you're bragging about it. Great.
 
Tried in Cp2077, works great on rtx3080 10G, animations and camera are visibly smoother, does not feel laggy at all.
Base fps around 60 (PT w. 1440p DLSS balanced), with fsr3fg it's around 100.
Nvidia need to make dlssFG available for 2000/3000 at the very least now.
Though to be completely honest, the main thing that keeps me interested in their cards is path tracing+ray reconstruction and dlssSR, so I'm staying on team green whether they do that or not.

I mean, RT psycho to PT is literally like low to ultra to me

As a 3080 10gb user as well, can confirm.
I had been running PT mode 1440 performance with a few culled settings before anyway, nice frame boost at both that and balanced now in PT mode.
There's a little visual instability here and there with it, but overall it's stable and you're not going to be hunting for occasional reflection in a puddle fizzle while in combat or such.

I have a sibling with a 3090 that confirms it working in Starfield just fine.
 
I keep saying the same thing. Thank AMD for what Nvidia should be doing but they can't see past their wallets.
 