Recently two separate Reddit threads raised an interesting possibility: if you hook your FreeSync monitor up to an AMD GPU's display output while your primary Nvidia GPU is still in your PC, you may be able to render games on the Nvidia graphics card and still take advantage of FreeSync.
One of the threads centered on doing it with two discrete GPUs in one PC – one from each vendor – while the other, more widely circulated thread suggested all you need is an Nvidia GPU and an AMD APU like the Ryzen 3 2200G. If true, this is a big deal because Nvidia GPUs don’t natively support FreeSync. Instead, Nvidia forces gamers who want adaptive sync into buying a G-Sync monitor, which is usually around $200 more expensive than its FreeSync alternative.
Getting FreeSync working on Nvidia GPUs would mean gamers could buy cheaper displays and get the exact same experience, while those who already own FreeSync displays and Nvidia GPUs could finally unlock the benefit of adaptive sync.
So let’s talk about the APU method first, because this one seems to have been vigorously tested and proven to work at this point. Basically, if your system has an AMD Raven Ridge APU like the Ryzen 3 2200G, and an Nvidia graphics card, getting FreeSync to work is quite simple. Just unplug your FreeSync monitor from the Nvidia GPU, then plug it into your motherboard instead. This makes the integrated Vega GPU the primary display output.
Then all you have to do is make a few software-side tweaks: enable the integrated graphics in your motherboard’s BIOS if it’s disabled by default, then head into the Nvidia Control Panel and set the games you plan on playing to use the Nvidia GPU. Alternatively, you can use Windows 10’s new graphics settings feature to set games to use the Nvidia GPU.
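For the curious, on recent Windows 10 builds (1803 and later) the per-app choice made in that graphics settings screen is stored in the registry. A sketch of the value involved is below; note the game executable path shown is a hypothetical example, not a real install:

```
Key:   HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences
Name:  C:\Games\MyGame\game.exe    (full path to the game's executable)
Type:  REG_SZ
Data:  GpuPreference=2;            (2 = high performance, 1 = power saving, 0 = let Windows decide)
```

Editing the registry directly isn’t necessary – the Settings app writes this value for you – but it shows what the "high performance GPU" toggle actually does under the hood.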
With this configuration, games are rendered on the Nvidia GPU, and the finished frames are simply passed to the integrated Vega GPU, which sends them to the display. Because Ryzen APUs support FreeSync, adaptive sync data is packed into the display stream even though the Nvidia GPU is actually rendering the game. It’s a simple, easy way to enable FreeSync while still harnessing the power of your Nvidia GPU.
Of course, not everyone has a system with a Ryzen APU and not everyone wants to build one; after all, there are plenty of CPUs much faster than the Ryzen 5 2400G that gamers would rather use. So this APU method for getting FreeSync support is a bit limited.
But it does open up the question: could current Nvidia GPU owners simply add in a discrete AMD GPU to their system, rather than an APU, plug their display into the AMD card, and get FreeSync support while still using the Nvidia GPU to render games?
This could be a pretty neat workaround for those with expensive FreeSync monitors and powerful Nvidia GPUs, as you could add in something like a $100 Radeon RX 550, giving you FreeSync support for less than the typical $200 cost of the G-Sync module in competing monitors.
Even if only for the sake of experimentation, this is definitely something worth looking into, so I decided to give it a try. I whipped out my Intel Core i7-8700K test system, and set about installing both an Nvidia and AMD GPU.
On the Nvidia side we’re using the Gigabyte GeForce GTX 1070 Ti Windforce, which will be the primary rendering GPU. The idea would be to pair it with the RX 550, because it’s the cheapest AMD GPU you can buy right now that supports FreeSync. I don’t have an RX 550 on hand, so I used another FreeSync-capable graphics card instead: the Radeon RX Vega 64.
The setup process is pretty easy. Both GPUs are installed in the system, my FreeSync display is hooked up to the AMD GPU’s display outputs via DisplayPort, and both Nvidia and AMD drivers are simultaneously installed. And of course, FreeSync is enabled in Radeon Settings.
After doing all of this, I immediately noticed a few issues. Unlike with the APU method, there is no way to set in software which GPU is used by default. You can’t launch the Nvidia Control Panel when your display is connected to the AMD GPU, and when the display is plugged into the Nvidia GPU, the Nvidia Control Panel doesn’t give any option to set the Nvidia GPU as default. Windows 10 also doesn’t distinguish between the two GPUs in its graphics settings screen; it will only let you use the GPU the display is hooked up to.
The reason for this difference compared to the APU method is simple: the APU’s integrated graphics is listed as a ‘power saving’ option, so both the Nvidia Control Panel and Windows 10 settings allow you to prioritize a ‘high performance’ GPU instead. But when you have any two discrete GPUs in your system, both are classed as ‘high performance’, so the option to choose a GPU disappears.
And this makes sense: most users don’t have two different graphics cards in their system, and if they did and wanted to game, they’d simply plug their monitor into the highest performing one. Outside of niche hacks like this, there’s no real reason for a GPU selection option to exist.
But this doesn’t mean this FreeSync on Nvidia GPUs hack is dead in the water. Some games have a built-in selection option allowing you to choose which GPU is used for rendering. And it’s with this option that you can get it working in some circumstances.
So first, let’s show the baseline with the Nvidia GPU hooked up directly to the FreeSync monitor. The game I’m showing is Middle-earth: Shadow of War, because it’s one of the titles that has a GPU selector, right in the basic benchmark tool. In the top left we have the utilization of each GPU: on top is the Vega 64’s utilization, and on the bottom is the 1070 Ti’s. Keep that in mind for later.
Then in the upper right corner, I’ve enabled a feature of this FreeSync monitor that shows the current refresh rate. Note this isn’t the frame rate, but the actual refresh rate of the panel, which is also why I’m filming the monitor rather than using a capture card. When FreeSync is active, the refresh rate fluctuates and you’ll see no screen tearing. When FreeSync is not active, the refresh rate stays static at 144 Hz and you’ll see some tearing.
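The relationship between the render rate and the panel's refresh rate is what the on-screen counter makes visible. A minimal sketch of that behavior, assuming a typical (hypothetical for this monitor) 48–144 Hz variable refresh window:

```python
def panel_refresh_hz(frame_rate, vrr_min=48, vrr_max=144, freesync_active=True):
    """Approximate what refresh rate the panel runs at for a given render rate.

    vrr_min/vrr_max describe an assumed 48-144 Hz VRR window, not the
    verified range of the monitor used in the article.
    """
    if not freesync_active:
        # Fixed refresh: the panel always runs at its maximum rate, so
        # frames delivered below 144 FPS can tear.
        return vrr_max
    # With FreeSync active, the refresh tracks the render rate, clamped to
    # the panel's VRR window. (Real monitors dropping below the floor may
    # engage LFC frame doubling instead; that's ignored in this sketch.)
    return max(vrr_min, min(frame_rate, vrr_max))

print(panel_refresh_hz(100))                        # tracks the render rate
print(panel_refresh_hz(100, freesync_active=False)) # pinned at the panel max
```

This is why a fluctuating number on the monitor's readout is good evidence that adaptive sync is actually engaged, while a number pinned at 144 means it isn't.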
In this first instance of the Nvidia GPU hooked up directly to the FreeSync monitor, you’ll notice only the Nvidia GPU is being utilized, but crucially the monitor’s refresh rate is fixed at 144 Hz and there’s some screen tearing as the frame rate output is below 144 FPS. This shows, as expected, that Nvidia GPUs don’t support FreeSync.
Then we have the same benchmark being run on the AMD GPU, with the AMD GPU hooked up to the monitor. You’ll see in the top left corner that the Nvidia GPU isn’t being used, and the refresh rate is fluctuating, so FreeSync is working.
Lastly, we get to the good stuff. This is the AMD GPU hooked up to the FreeSync monitor, but the game has been set to use the Nvidia GPU for rendering. You’ll see in the top left corner that the Nvidia GPU has the highest utilization, and the AMD card is being utilized a little bit as well. But the magic is really happening with the refresh rate number.
It’s fluctuating in line with the render rate, indicating FreeSync is working. There’s no tearing either. But all the rendering is being done on the Nvidia GPU.
So it definitely works. In fact, it works quite well. The Nvidia GPU handles the rendering, and the AMD GPU handles FreeSync. Now of course in my situation using a 1070 Ti for rendering and Vega 64 for FreeSync makes little sense because Vega 64 could just do both, but this exact same process should work with the much cheaper and slower RX 550. So owners of something like the GTX 1080 could add in an RX 550 for a small cost and get FreeSync support.
You’re probably wondering: is there a performance impact from sending data from the Nvidia GPU to the display through an additional AMD GPU? The answer is yes. This chart shows the difference in Hitman’s benchmark using DirectX 12 Ultra settings. Both average framerates and 1% lows take a hit of about 4 percent. This was also the margin in average framerates reported by Shadow of War’s built-in benchmark.
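To put that overhead in concrete terms, here's a trivial sketch of the calculation; the baseline framerate below is an illustrative number, only the roughly 4 percent margin comes from the testing above:

```python
def overhead_pct(direct_fps, passthrough_fps):
    """Percent of performance lost when frames are routed to the display
    through the second (AMD) GPU instead of output directly."""
    return (direct_fps - passthrough_fps) / direct_fps * 100.0

# Hypothetical example: a game running 100 FPS plugged straight into the
# Nvidia GPU, dropping to 96 FPS via AMD GPU passthrough -> a 4% hit.
print(overhead_pct(100.0, 96.0))
```

A few percent is a small price for working adaptive sync, but it does slightly erode the value of buying a faster Nvidia card in the first place.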
But there are a number of problems with this Nvidia FreeSync hack. In fact, there are so many problems I can’t see anyone actually using this trick in practice.
For starters, without the ability to select the primary rendering GPU in Windows or in the Nvidia Control Panel – which is possible with the APU method, but not with two GPUs – you are limited to games that have built-in GPU selectors.
Out of the collection of games we regularly benchmark with, just four have that feature: Shadow of War, Hitman, Far Cry 5 and Watch Dogs 2. Plenty of other popular games, including Fortnite, Battlefield 1, GTA 5 and so on, don’t have a GPU selector, so this FreeSync trick won’t work in those games.
And then on top of that, the GPU selector didn’t work in Far Cry 5 and Watch Dogs 2. Attempting to switch to the Nvidia GPU and restarting the game just left me with a blank screen on launch. This was with both the latest Nvidia drivers, and drivers from well before people uncovered this workaround, so I think it’s just a bug rather than Nvidia blocking the workaround for those games.
So far I haven’t found a way to globally select the Nvidia GPU the way you can with the APU method. If you do have a neat trick to enable GPU selection, let us know, but otherwise this Nvidia FreeSync hack is stuck working in a very limited selection of games. If it worked across the board, the way it does with the APU method, the trick would be somewhat worthwhile, but not as it stands.
Then there’s the issue of Nvidia potentially blocking this. I tested with the latest 399.07 drivers, but I’m confident Nvidia will look to patch out this FreeSync workaround if it gains any traction. Surely they want to keep their graphics cards compatible only with G-Sync monitors and lock people into their ecosystem. So I wouldn’t rush out to buy a cheap Radeon for this FreeSync hack just yet.
The APU method is more functional, but again, you run the risk of Nvidia blocking the workaround, and you are also forced into using an AMD APU, so it’s not practical for hardcore gamers. For now let's call this a proof of concept more than anything else, but it's a cool hack that certainly got people talking.