FreeSync and G-Sync: What You Need to Know

Going from 60Hz to 144Hz is a revelation for FPS games. My frag count doubled with the smoothness FreeSync offers.
 
Good data to know in a brief bit of reading. I recently bought a FreeSync monitor and have been wanting to pair it with a 6800 XT, but that looks like it probably won't happen for a year now, so I'll be sticking with my GTX 1080.

I just wish that, since HDMI 2.1 cables are capable of 4K HDR @ 120 Hz, variable refresh support could find its way to HDMI.
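Rough bandwidth arithmetic suggests 4K HDR @ 120 Hz does fit in the link. This sketch assumes HDMI 2.1's 48 Gbps FRL signaling with 16b/18b encoding and ignores blanking-interval overhead, so treat it as a ballpark, not a spec calculation:

```python
# Back-of-the-envelope: does uncompressed 4K 120 Hz 10-bit (HDR) video
# fit in HDMI 2.1's effective bandwidth? Ignores blanking overhead.

def video_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    """Raw pixel data rate in gigabits per second."""
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

# 48 Gbps FRL minus 16b/18b line coding -> ~42.67 Gbps usable
hdmi_2_1_effective = 48 * 16 / 18

rate = video_gbps(3840, 2160, 120, 10)  # ~29.86 Gbps
print(f"{rate:.2f} Gbps needed vs {hdmi_2_1_effective:.2f} Gbps available")
```

Even with real-world blanking overhead on top, 4K 120Hz 10-bit fits comfortably, which is why HDMI 2.1 TVs can advertise it.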
 
What you need to know?

One is an industry open standard and the other belongs to a company that loves to lock you in with their proprietary tech (sometimes stolen from the open standards, then locked behind a paywall).

For the fanbois the option is clear; for the rest of us, it's safe to say that we prefer the open standard.
 
I'm positive you can use FreeSync monitors on Nvidia cards. I think you can't use G-Sync monitors on AMD cards, though; gotta love greedy Nvidia.
 
True, but why would anyone want to purchase a G-Sync monitor in the first place? Especially at the premium cost of G-Sync.
While Nvidia is greedy, I will still thank them for making FreeSync compatible with Nvidia graphics cards.
Love my 48-inch OLED 4K 10-bit 120Hz with an XC3 Ultra 3090 at 2GHz!
 

I don't think they had much of a choice, considering the market adopted FreeSync as the standard.
 
I do wish Nvidia would enable G-Sync over HDMI, because of an older FreeSync monitor I use in my bedroom. But as it is, their decision to support FreeSync really made it easier for me to justify getting a better monitor, because I wouldn't be locked into buying just one brand. Now I have a FreeSync Premium, G-Sync Compatible 144Hz 1440p monitor and I love it.
 
They had a choice, but they decided to use FreeSync to their advantage: from what I recall, FreeSync monitors became labeled G-Sync Compatible at a cost to the vendor. Hence the greedy part. Hardware Unboxed had a video on this exact topic.
Update: it seems like they wanted to ride the wave and milk G-Sync as long as possible, and now they are lowering the G-Sync Ultimate standard to compete with FreeSync Premium Pro.
 
There is another part to this which is quite interesting: the difference in performance between G-Sync Compatible and FreeSync across different panel types.

Quick set of assumptions: VA panels are chosen for their improved contrast; they are the best option for a dark room, as long as QA has given you a good panel. In comparison, IPS has a faster response time but suffers from IPS glow. TN panels have even faster response times, but contrast is poor; blacks can look very grey in darkened rooms.

Now, VA panels suffer from slower response times, and this affects how they respond to LFC (Low Framerate Compensation). I have a 165Hz VA panel (Dell S3220DGF) and the FreeSync range is 48-165Hz. This means that when the framerate drops below that range, to 47fps for example, LFC will double each frame so the monitor runs at 94Hz, smoothing out the judder. There is a big issue with this: VA panels are brighter at higher refresh rates, which ends up causing a flicker that only occurs at the lower framerates (it also happens on some IPS panels, but it's less noticeable there because of the faster response times of IPS).
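The frame-repeat logic works out like this (a rough Python sketch, a simplification of what the driver actually does; the 48-165Hz numbers are my monitor's range):

```python
import math

def lfc_refresh(fps, vrr_min=48, vrr_max=165):
    """Refresh rate the panel runs at under Low Framerate Compensation:
    below the VRR floor, each frame is repeated enough times to bring
    the effective refresh rate back inside the VRR window."""
    if fps >= vrr_min:
        return fps                       # inside the VRR range: 1:1
    multiplier = math.ceil(vrr_min / fps)
    return fps * multiplier              # frame doubling, tripling, etc.

print(lfc_refresh(47))   # 47 fps -> frames doubled, panel at 94 Hz
print(lfc_refresh(20))   # 20 fps -> frames tripled, panel at 60 Hz
```

Those refresh-rate jumps at the LFC boundary are exactly where the brightness shift, and therefore the flicker, shows up on VA panels.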

There doesn't appear to be an exact science to avoiding it. Some people use CRU to change the FreeSync range on their monitor so that the upper limit is less than 2.5x the lower limit. Setting your range to 58-143Hz would mean that LFC never kicks in, but when frames drop below 58fps you would be relying on VSync, with some response lag as a result. Meanwhile, within the FreeSync range you can enjoy FreeSync's unlocked framerate matched to your monitor's refresh rate without any tearing. In tandem with this, cap your framerate at 143fps: you don't want it going above the 143Hz top of the FreeSync range, as that would activate VSync.
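Assuming that 2.5x rule of thumb, checking whether a given range leaves LFC active is just a ratio comparison (an illustrative sketch with made-up helper names; CRU itself only edits the EDID range, it doesn't run anything like this):

```python
def lfc_active(vrr_min, vrr_max, ratio=2.5):
    """LFC is said to engage only when the top of the VRR range is at
    least `ratio` times the bottom (2.5x is the rule of thumb here)."""
    return vrr_max >= ratio * vrr_min

print(lfc_active(48, 165))   # stock range: LFC (and the flicker) kicks in
print(lfc_active(58, 143))   # tweaked range: 143 < 2.5 * 58, so LFC stays off
```

That's why 58-143Hz works as an override: 143/58 is about 2.47, just under the threshold.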

The point is, it doesn't take much for this to become a lot more complicated than just turning on G-Sync Compatible in the Nvidia Control Panel, and it might warrant some further in-depth investigation. I read that no VA panels have been certified as G-Sync Compatible because of the flickering issues, even though the model number of the Dell S3220DGF suggests G for G-Sync and F for FreeSync; they were named before panels were certified for G-Sync.
 
You're missing OLED, which has response times competitive even with the best TN panels, as well as the best contrast ratio and color gamut, at the risk of burn-in. FYI, Linus compared the CX OLED at 120Hz to a 240Hz monitor and they were comparable in response.
 
OLED is fantastic, though I wouldn't buy one for games where a big part of the UI (digits and so on) is always on screen. In a lot of games (most of them) you will play for hours, and the burn-in will start very soon. LG has some measures against it, but there is no escape; it's inherent to OLED technology. So OLED is fantastic for movies, Netflix, etc. Everything else (TV channels with logos, games with lots of fixed UI elements, advertising with fixed elements) will destroy the screen fast.
 
+1. The VA panel and LFC flicker discussion above is exactly the stuff I'd like to see more in-depth technical articles on here, rather than more vapourware CPU/GPU reviews.
 
The burn-in risk is true, but for me, who games a few hours a week if I'm lucky and uses all the anti-burn-in features, the risk is minimal. You definitely need to know your risk.
 
LG B6 owner here. OLED is fine as long as you take a few basic steps:

1: Turn the OLED light for non-HDR use down to about 30.
2: 10 minute screensaver (and for the love of god, not a static one)

One other thing: My B6 was basically ruined by WFH. It held up very well for about four years until then, but it's clear nearly a year of constant WFH (which is obviously the OLED worst case) has seriously degraded the pixels.

It's also worth noting that starting with the B7 lineup LG re-did the pixel structure to be far more resistant to burn in; I can't speak for those displays.

My recommendation is that OLED is perfectly fine for gaming, but I'd stay away if you do any sort of WFH with static screen elements.
 
FreeSync may be an open standard, but as of DP 2.0 adaptive sync is still optional, leaving it as a feature that can be added only to "premium" displays... with a premium price.

HDMI VRR is going to win by default, since the HDMI Forum added VRR to the baseline specification.
 
I have a Dell S3220DGF 165Hz monitor with FreeSync. When it's turned on you can enable G-Sync through the Nvidia Control Panel, but if it's enabled the screen flickers, so that is not an option. I have an RTX 3070, and from what I read, driver R417.71 and later are compatible with adaptive sync tech. So I turned on FreeSync on the monitor. In the Nvidia panel, under Manage 3D Settings, I set vertical sync to "Adaptive", set low latency mode to Ultra, set the max frame rate to a high fps (I have mine at 650), and set my refresh rate to "Highest Available". With that done, I don't see any blurring or stuttering when I put my games on ultra-high settings. I'll also be the first to admit that my knowledge of computer components is very limited, but these settings do seem to improve the performance of my gaming PC.
 
I have the same monitor, and you're right that there does not seem to be an exact science to fixing the problem. I've looked around the internet for a solution and there does not seem to be a silver bullet for this issue. I've given up on G-Sync with it and just try to tune adaptive sync / FreeSync the best I can. At this point I've gotten rid of the stuttering and blurring, but some games may need further fine-tuning.
 
I found another TechSpot article relating to VRR, adaptive sync, and G-Sync that is quite interesting.

 
It was a lot of guesswork on my part; I was more experimenting than anything. But after I changed the latency mode setting to Ultra, the vertical sync setting offers two choices, G-Sync or Adaptive, so knowing that G-Sync doesn't work for me, I set it to Adaptive. There's a setting that says "preferred refresh rate" with my monitor name right there, so I put it on the highest available. Having a monitor with a 165Hz refresh rate, it was a no-brainer. The real guesswork was setting the background FPS and the max frame rate; with both of them I turned them on and set the sliding bar halfway. I don't know if these are the optimum settings, but I did obtain much smoother performance in my games without sacrificing quality. I have an RTX 3070 and so far I've been able to use the highest settings.
 