FreeSync on Nvidia GPUs Workaround: Impractical, But It Works

Cool.
I once tried adding a cheap GeForce to my Radeon rig for PhysX processing, but it too only worked in a very limited number of apps. I don't remember well, but I think both cards had to be connected to the monitor to be active.
It was fun, but of no practical use either ;)
Oh, but the combined GFLOPS number in OpenCL benchmarks looked nice.
 

NVIDIA "fixed" the PhysX workaround in drivers so it no longer worked.
 
There's almost certainly a registry setting *somewhere* that can be finagled to make this work; I find it hard to believe there isn't a setting deep within Windows that controls GPU rendering under the hood.
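For what it's worth, Windows 10 (1803 and later) does expose one documented knob: the per-app GPU preference under HKCU\Software\Microsoft\DirectX\UserGpuPreferences, which is what the Settings > System > Display > Graphics settings page writes. A rough Python sketch follows; the game path is a placeholder, and whether "high performance" actually lands on the Nvidia card in a mixed Radeon/GeForce rig like this is untested.

Code:
# Sets the per-app GPU preference that Windows 10's "Graphics settings"
# page writes to the registry. "GpuPreference=2;" means high performance,
# "GpuPreference=1;" means power saving.
import winreg

APP_PATH = r"C:\Games\SomeGame\game.exe"  # placeholder: your game's exe

key = winreg.CreateKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\DirectX\UserGpuPreferences",
)
winreg.SetValueEx(key, APP_PATH, 0, winreg.REG_SZ, "GpuPreference=2;")
winreg.CloseKey(key)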
 
What's the performance impact with the APU method? As a 2200G owner, this sounds interesting.
 


In some games, like Alice and Borderlands, enabling PhysX was a huge improvement in visuals. I totally enjoyed Alice back then, and it was more than practical in those games. It was great.

And, no, only the AMD card needed to be connected to the monitor.

Nvidia finally locked it out and the patch stopped getting updates. Of course, there was also that 256? beta driver that was published withOUT Nvidia's lock, proving that all that BS from Nvidia about how PhysX could not work with an AMD graphics card as primary was... well... BS.
 
Would you guys like a YouTube video demonstration? I have a channel here. I didn't know this wasn't common knowledge until I read this article.

Are you sure you aren't mistaking the article's intent? This is a discussion about getting the adaptive sync capabilities of a Freesync monitor to work with an Nvidia card, not about simply using a Freesync monitor with an Nvidia card (that obviously works, but the adaptive sync will not be active). If you are seeing locked fps, you likely have v-sync on. If what you are saying is true, to prove it you would need to see a framerate counter in one corner of the screen (e.g. from the Nvidia HUD or FRAPS), and in a separate corner the built-in on-screen refresh rate counter for the monitor itself. Those two numbers should match almost exactly AND fluctuate in tandem (not be locked) due to the normal fps variations in games.
 
I was reading about it, and I think I just had adaptive V-sync on; that's why it was at 75 fps. My bad.
 

I ran a Radeon + NV GPU for PhysX.

There was no need for both to be connected to the monitor for it to work.

And while it was patched in later NV drivers, if you kept your drivers below that version it worked fine; I was only concerned with keeping the primary GPU's drivers up to date.

Games that I enjoyed it in were Borderlands 2 and Batman: Arkham City.
 
Time for AMD to update their drivers to allow this kind of mod. They'd sell lots of $100 graphics cards, and Nvidia would be dead in the water with G-Sync monitor licensing.
 
No mention of the Hades Canyon NUC.
Sounds like this might be a nice upgrade for those with the parts, if it works:
a GTX 1080 (Ti) in an eGPU for rendering, with the Vega graphics driving FreeSync on a 1440p or 4K monitor, combined with an Intel i7.

The biggest pity, which I see overlooked in all the debates about AMD versus Nvidia (especially now that the new RTX cards are being released), is that if AMD graphics cards keep slipping further and further behind Nvidia, then we will be stuck with just G-Sync monitors for gaming.
 
"But there are a number of problems with this Nvidia FreeSync hack. In fact, there are so many problems I can’t see anyone actually using this trick in practice."

The bigger argument is: how about Microsoft stop breaking FreeSync/G-Sync functionality in non-(real-)fullscreen games? (DX12 doesn't have exclusive fullscreen, only borderless windowed, as I've seen cited.) Currently I need to have G-Sync enabled (even though it's not really being used, since I cap the fps) and have to reset DWM at bootup to get G-Sync to work in "simulated" borderless windows.
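If anyone wants to script that DWM reset instead of doing it by hand, here's a rough sketch. Killing dwm.exe needs an elevated prompt; Windows relaunches it automatically, but exact behavior can vary by Windows build.

Code:
# Force-restarts the Desktop Window Manager. Windows relaunches dwm.exe
# on its own after it is killed. Must be run from an elevated prompt.
import subprocess

subprocess.run(["taskkill", "/f", "/im", "dwm.exe"], check=False)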

And G-Sync is actually better, marginally. I would rather see AMD take on the Ti class of cards; I'd pay a little less for close to the same performance. Granted, it would help if the card also weren't released 7+ months after the Ti.

It may be a silly question, but how is G-Sync not a monopoly for Nvidia? In a bad way. Didn't Microsoft get in similar trouble at one point for only packaging IE with Windows? They weren't forced to use IE, just like you're not forced to use G-Sync. I'm not saying do away with G-Sync, but why not support FreeSync as well? The only real argument I see is to lock G-Sync monitor users in with Nvidia.

I have the G-Sync duo, but I'd go FreeSync in a future build if AMD would release a competitive card on time.

I believe that, ultimately, frame syncing should be a standard fully supported by all, or OS-based.
 
@TommyGun
It's easy to tell; things will be noticeably smoother.
I thought the G-Sync monitors were a gimmick until I saw them in action. I then picked up a 27" HP Omen G-Sync 1 ms monitor for $400 NIB when Amazon wanted $719 for the SAME EXACT THING, model # and all. Best purchase/technology since an SSD.
How much is from G-Sync, and how much is from the 144 Hz refresh, though?
 
There are OTHER ways to force or tell Windows 10 which GPU to use.

The easy one is to set your Main Display to the one connected to the Nvidia GPU. This puts the taskbar on that display and should automatically set that GPU as primary for any game launched from the main Start Menu/taskbar. Simply move the game window to the AMD display, and it should continue to render using the Nvidia GPU.

Some experimentation with this method might be worth someone's time, as I think launching from the taskbar on the other display will lock the game to that GPU as well, if I remember right.

There are also a couple of more technical ways whose steps I don't fully remember off the top of my head; they would be worth the research.

Windows' use of GPUs is unique, and as newer 'games' stop locking themselves to a single GPU, more multi-GPU capability can be utilized by the OS. Already, the OS will use all GPUs depending on priority/need/load across processes and threads when it can.
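If anyone experiments with this, a quick way to sanity-check which GPUs Windows actually sees is to query the Win32_VideoController WMI class. A rough Python sketch, assuming PowerShell is on the PATH:

Code:
# Lists the display adapters Windows currently exposes, via the built-in
# Win32_VideoController WMI class (queried through PowerShell).
import subprocess

result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-CimInstance Win32_VideoController | "
     "Select-Object -ExpandProperty Name"],
    capture_output=True, text=True, check=False,
)
print(result.stdout)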
 