FreeSync vs. G-Sync: 2017 Update, What You Need to Know

Julio Franco

Price and quality notwithstanding, the only real reason to choose one over the other is the video card you are using... Until Vega, AMD has no high-end cards that compete with Nvidia's 1080, 1080 Ti, Titan X, etc. So if you want to game with a pricey adaptive sync monitor, your only real choice is G-Sync... No AMD card can run high-end games at high-end resolutions past 40 FPS...

Let's hope Vega changes this and gives FreeSync a reason to exist.
 
A clear and well written article. Thanks!

I've been using an ASUS PG279Q (27" with G-Sync) for about 15 months and I'm very pleased with its performance. Mind you, it wasn't cheap (£700+) but the build quality and the picture are excellent. I'll be interested to see what the v2 adaptive sync technologies bring to the table.
 
I like the way FreeSync 2 is handling things - I haven't had a FreeSync monitor, but I've heard a lot lately about certain models with flickering issues and the like. It'll be good to know that buying a FreeSync 2 monitor also means you're getting good specs everywhere else too.
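One concrete spec that FreeSync 2 certification nails down is Low Framerate Compensation (LFC), which roughly requires the panel's maximum refresh rate to be about twice its minimum so frames can be doubled when FPS falls below the variable refresh range. Here is a minimal sketch of that rule-of-thumb check (the 2.0 ratio is a commonly cited approximation, not an exact AMD spec):

```python
# Minimal sketch: does a monitor's variable-refresh range leave room for
# Low Framerate Compensation (LFC), which FreeSync 2 mandates?
# The 2x max/min ratio used here is a rule of thumb, not an AMD spec quote.

def supports_lfc(min_hz: float, max_hz: float, ratio: float = 2.0) -> bool:
    """LFC needs headroom to insert duplicate frames when FPS drops below
    the panel's minimum refresh; max/min >= ~2 is the usual test."""
    return max_hz >= ratio * min_hz

# A 48-95 Hz range (like the ultrawide mentioned later in this thread)
# falls just short: below 48 FPS, adaptive sync would drop out.
print(supports_lfc(48, 95))   # False
print(supports_lfc(30, 144))  # True - frames can be doubled into range
```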
 
Where are the G-Sync 5K monitors running over DisplayPort 1.4?!
We've been waiting a very long time now.
 
Price and quality notwithstanding, the only real reason to choose one over the other is the video card you are using... Until Vega, AMD has no high-end cards that compete with Nvidia's 1080, 1080 Ti, Titan X, etc. So if you want to game with a pricey adaptive sync monitor, your only real choice is G-Sync... No AMD card can run high-end games at high-end resolutions past 40 FPS...

Let's hope Vega changes this and gives FreeSync a reason to exist.

That may change on the 14th, when the embargo is lifted. That said, Nvidia, sure to have recouped its R&D by now, could drop prices.

It's odd that, if FreeSync tech is essentially open to all, Nvidia doesn't put both on its cards. It would give them more opportunity to pull people from other setups. I like G-Sync over FreeSync, but FreeSync wasn't even an option when G-Sync came out...
 
No AMD card can run high-end games at high-end resolutions past 40 FPS...

Needs more details. By high resolution do you mean 1080p, 2K, or 4K? What is a "high-end game"? I run two water-cooled R9 290Xs (a 3-4 year-old card) and play all of my games (Overwatch mostly, with some Diablo 3) at >120 FPS on a 1080p 144Hz monitor.
 
I never liked the execution of FreeSync. It was a quickly pushed-out tech, built on top of an existing standard in response to G-Sync, and too much of the weight falls on the monitor manufacturers to improve it, not on AMD.

Although more expensive, G-Sync is still the superior option. Personally, I'm fine with just a high refresh rate monitor.

By the looks of this article, FreeSync is STILL a mess.
 
Price and quality notwithstanding, the only real reason to choose one over the other is the video card you are using... Until Vega, AMD has no high-end cards that compete with Nvidia's 1080, 1080 Ti, Titan X, etc. So if you want to game with a pricey adaptive sync monitor, your only real choice is G-Sync... No AMD card can run high-end games at high-end resolutions past 40 FPS...

Let's hope Vega changes this and gives FreeSync a reason to exist.

That may change on the 14th, when the embargo is lifted. That said, Nvidia, sure to have recouped its R&D by now, could drop prices.

It's odd that, if FreeSync tech is essentially open to all, Nvidia doesn't put both on its cards. It would give them more opportunity to pull people from other setups. I like G-Sync over FreeSync, but FreeSync wasn't even an option when G-Sync came out...

Nvidia makes money off of the G-Sync monitors. If they allowed FreeSync, they'd lose more in monitor licensing profits than they'd gain in increased market share.

It's a crude move by Nvidia to maximize profit, but at the cost of market adoption and the consumers. If AMD were to be consistently on parity with Nvidia (not just for one generation of products), then G-Sync would quickly become obsolete. That's a huge feat, though.
 
I was going to buy a 1080 + 27" 1440p IPS G-Sync monitor about 3 months ago. It was very tempting.

My wife (a huge Nvidia fanboy) convinced me that I should just wait for Vega to come out. Even if it's just 1080-level performance, the cost of ownership of a Vega card and a FreeSync monitor will be significantly less than a 1080 + G-Sync. Plus, if Vega flops, the GeForces will be a little cheaper (this was before the cryptocurrency craze - and fortunately now it seems markets are stabilizing again).
 
I have a 27" 2560x1440 iiyama gaming monitor with Freesync, coupled with a RX480 and I recently bought Battlefield 1. Boy oh boy not only does the 480 shine using DX12 but low frame rates and stutter are history thanks to FreeSync!
 
It's a crude move by Nvidia to maximize profit, but at the cost of market adoption and the consumers. If AMD were to be consistently on parity with Nvidia (not just for one generation of products), then G-Sync would quickly become obsolete.
I like both and would be happy with either at this point, but for history's sake: G-Sync released first and was available months ahead of basic FreeSync support.
 
I was thinking about a ViewSonic FreeSync monitor, but I'd need another GPU, so I bought an AOC non-FreeSync monitor instead, and it's well under 60 fps :V
 
No AMD card can run high-end games at high-end resolutions past 40 FPS...

Needs more details. By high resolution do you mean 1080p, 2K, or 4K? What is a "high-end game"? I run two water-cooled R9 290Xs (a 3-4 year-old card) and play all of my games (Overwatch mostly, with some Diablo 3) at >120 FPS on a 1080p 144Hz monitor.
LOL, Overwatch and Diablo 3 run on rubbish game engines. Those aren't testing AAA game engines at all.

A high-end game is something like Battlefield 1, Rise of the Tomb Raider, Gears of War 4, Deus Ex: Mankind Divided, and so on. And 1080p is not a high-end resolution either - if you are gaming, it is the bare minimum. Anyone playing on a desktop below that resolution doesn't really have gaming hardware.
 
LOL, Overwatch and Diablo 3 run on rubbish game engines. Those aren't testing AAA game engines at all.

A high-end game is something like Battlefield 1, Rise of the Tomb Raider, Gears of War 4, Deus Ex: Mankind Divided, and so on. And 1080p is not a high-end resolution either - if you are gaming, it is the bare minimum. Anyone playing on a desktop below that resolution doesn't really have gaming hardware.

OK, here are BF1 benchmarks. A single 290X runs at 68/95 FPS. My point stands that AMD (even older cards) can still perform very well on modern titles.
https://www.techspot.com/review/1267-battlefield-1-benchmarks/page2.html
 
While a G-Sync monitor would be nice, I have no complaints about Fast Sync's performance on my recent playthrough of Deus Ex: MD. I have a 1080 Ti and a 1440p 120Hz monitor and saw 100+ fps average, no tearing and no microstutter. Smooth as butter. I guess I can wait for prices to come down.
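For anyone curious why Fast Sync avoids tearing without capping the render rate, here is a small simulation of the general "latest completed frame wins" idea behind it (a simplified conceptual model, not Nvidia's actual implementation): the game renders uncapped into spare buffers, and each display refresh scans out whichever frame finished most recently, so only whole frames ever reach the screen.

```python
# Conceptual sketch of the "latest completed frame wins" model behind
# Fast Sync (a simplification, not Nvidia's implementation): rendering
# runs uncapped; each refresh shows the newest finished frame, so the
# display only scans out whole frames -> no tearing, no render FPS cap.

render_fps, refresh_hz = 200.0, 120.0
frame_times = [i / render_fps for i in range(40)]  # when each frame finishes

shown = []
for tick in range(20):                             # 20 display refreshes
    now = tick / refresh_hz
    done = [i for i, t in enumerate(frame_times) if t <= now]
    shown.append(done[-1] if done else None)       # newest complete frame

print(shown)  # gaps in the indices are frames rendered but never shown
```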
 
1080p is both a minimum AND a maximum - there is little reason to go beyond it, as frame rates drop and costs rise.
THAT is why 1080p became a standard in the first place. Is 4K better for giant screens? Well, duh, yes. If you have a higher than well-above-average income and that is what pleases you, then you should spend all your money on tech baubles (when you're buying items that so few can justify, YOU are driving tech forward, good on yer).
(1080p will probably be the last standard named for the correct, lower number - the scanline count. A pile of pixels sounds/markets better than a number of scanlines. Not that anyone would ever do this, but you could have a "4K" screen with 480 scanlines.)
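For a rough sense of the "frame rates drop as resolution rises" point above, here is a back-of-the-envelope sketch of relative pixel workloads (an illustration only; real performance doesn't scale perfectly linearly with pixel count):

```python
# Back-of-the-envelope sketch: GPU pixel workload scales roughly with
# the number of pixels pushed per frame.

resolutions = {
    "480p  (854x480)":   854 * 480,
    "1080p (1920x1080)": 1920 * 1080,
    "1440p (2560x1440)": 2560 * 1440,
    "4K    (3840x2160)": 3840 * 2160,
}

base = resolutions["1080p (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:>9,} pixels = {pixels / base:.2f}x the 1080p load")
# 4K pushes 4x the pixels of 1080p, so to first order expect roughly a
# quarter of the frame rate from the same card.
```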
 
My CrossOver Ultrawide 3440x1440 AH-IPS FreeSync 48Hz-95Hz gaming monitor cost only ~$527 USD (~$800 CAD with import tax).

If I had gone with an Nvidia GTX 1080 (~$748 CAD), I would have had to get the equivalent Acer Predator X34 Ultrawide 3440x1440 IPS G-Sync 60Hz-100Hz gaming monitor, and that's about $1,080 USD, or ~$1,628 CAD.

I believe I have saved at least $550 USD in monitor cost alone, which is the price of a brand new RX Vega 64.
The CrossOver monitor is on sale on fleabay at the moment. Anyone interested can go ahead and get it; grab an RX Vega while you're at it too.

Oh, and BTW, my CrossOver has PERFECT pixels (without even specifically ordering a pixel-perfect unit). No dead pixels, no stuck pixels. NONE!
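A quick sanity check of that savings claim, using the USD prices quoted above (a sketch; street prices obviously vary):

```python
# Quick check of the savings claim, using the USD prices quoted above.
crossover_freesync = 527    # CrossOver 3440x1440 FreeSync, ~$527 USD
predator_x34_gsync = 1080   # Acer Predator X34 G-Sync, ~$1,080 USD

savings = predator_x34_gsync - crossover_freesync
print(f"Monitor savings: ~${savings} USD")  # ~$553, matching "at least $550"
```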
 
Thanks for the very informative article!

I decided to keep my 1080p LED at a constant 60 fps, even though my graphics card supports G-Sync, mainly because I'm satisfied with it.
 
1080p is both a minimum AND a maximum - there is little reason to go beyond it, as frame rates drop and costs rise.
THAT is why 1080p became a standard in the first place. Is 4K better for giant screens? Well, duh, yes. If you have a higher than well-above-average income and that is what pleases you, then you should spend all your money on tech baubles (when you're buying items that so few can justify, YOU are driving tech forward, good on yer).
(1080p will probably be the last standard named for the correct, lower number - the scanline count. A pile of pixels sounds/markets better than a number of scanlines. Not that anyone would ever do this, but you could have a "4K" screen with 480 scanlines.)
1080p became the standard because, for a long time, it was the highest affordable resolution - not because someone arbitrarily decided that anything higher or lower is a waste of money. You're clearly making a generalized statement based on your own income/comfort level: "If I can't comfortably afford it, then you don't need it either."

There's plenty of reason to go beyond 1080p. 1440p and 4K look much sharper... and you get a lot more screen real estate. We're not talking about a Fabergé egg here. For $1,200 one can get an awesome GPU/monitor setup that will bring them joy day in and day out for years. Try taking up some other common hobbies like golf, skiing, or motorcycling - none of which require a "higher than well-above-average income" - and see how far you get with $1,200 over several years.
 
No AMD card can run high-end games at high-end resolutions past 40 FPS...

Needs more details. By high resolution do you mean 1080p, 2K, or 4K? What is a "high-end game"? I run two water-cooled R9 290Xs (a 3-4 year-old card) and play all of my games (Overwatch mostly, with some Diablo 3) at >120 FPS on a 1080p 144Hz monitor.
LOL, Overwatch and Diablo 3 run on rubbish game engines. Those aren't testing AAA game engines at all.

A high-end game is something like Battlefield 1, Rise of the Tomb Raider, Gears of War 4, Deus Ex: Mankind Divided, and so on. And 1080p is not a high-end resolution either - if you are gaming, it is the bare minimum. Anyone playing on a desktop below that resolution DOESN'T REALLY HAVE GAMING HARDWARE.
The Sith Lord has hurt the feelings of a Padawan gamer... ;) (emphasis mine)
 