FreeSync vs. G-Sync: 2017 Update, What You Need to Know

No AMD card can run high-end games at high-end resolutions past 40FPS....

Needs more detail. By high resolution do you mean 1080p, 2K, or 4K? And what is a "high-end game"? I run two water-cooled R9 290Xs (a 3-4 year old card) and play all of my games (Overwatch mostly, with some Diablo 3) at >120 FPS on a 1080p 144Hz monitor.
LOL, Overwatch and Diablo 3 run on rubbish game engines. They aren't a test of AAA game engines at all.

A high-end game is something like Battlefield 1, Rise of the Tomb Raider, Gears of War 4, Deus Ex: Mankind Divided and so on. And 1080p is not a high-end resolution either - if you are gaming, it's the bare minimum. Anyone playing on a desktop below that resolution doesn't really have gaming hardware.
Actually, while you may want to justify spending more money on your hardware, that's not necessarily what makes something gaming hardware. The best gamer may very well be playing on cheaper hardware at a surprisingly low resolution for the sake of reaction time. Play at whatever resolution you want, just keep your frame rate above 60fps. You'll be a better gamer than the guy you just killed who thinks his 4K 144Hz multi-GPU rig is better because he can see the leaves move on a tree two miles away. Resolution is nice if your hardware can support it, but when you start losing frames just to increase your resolution, you are not a gamer, you're a tourist.
 
Is there any perceivable difference in the way good FreeSync and G-Sync monitors operate? I have a G-Sync monitor and am curious to know if I am paying for nothing.
 
Price and quality notwithstanding, the only real reason to choose one over the other is the video card you are using.... Until Vega, AMD has no high-end cards that compete with Nvidia's 1080, 1080 Ti, Titan X, etc... So if you want to game with a pricey adaptive sync monitor, your only real choice is G-Sync.... No AMD card can run high-end games at high-end resolutions past 40FPS....

Let's hope Vega changes this and gives Freesync a reason to exist.
Actually, G-Sync and FreeSync are more useful for people who have mid- to low-range GPUs, where you can't get 60+ FPS even with lowered settings and you'll see a lot of tearing and input lag. High-end cards (GTX 1080 or better) do just fine even without adaptive sync, with the possible exception of 4K gaming at ultra settings.
This is why the lower price of FreeSync is so important: it caters to the market that actually sees the most benefit from this technology ($100-250 GPUs).
 
I never liked the execution of FreeSync. It was quickly pushed-out tech built on top of an existing standard in response to G-Sync, and too much of the weight falls on the monitor manufacturers to improve it rather than on AMD.

Although more expensive, G-Sync is still the superior option. Personally I'm fine with just a high refresh monitor.

By the looks of this article FreeSync is STILL a mess.
Since when does having options = a mess?
FreeSync goes from mid-range monitors to high-end; G-Sync is just for high-end monitors. Outside of limited sales, G-Sync monitors are all above $400, which puts them well outside the price range that the majority of gamers are willing to pay for a monitor. A 1080p 144Hz G-Sync monitor costs $400-500, for crying out loud. It's stupid expensive.
FreeSync does indeed require you to look at monitor specs more closely, but it's not that hard to do, and those who don't either don't care about/need adaptive sync or don't know what it is.
 
Actually, G-Sync and FreeSync are more useful for people who have mid- to low-range GPUs, where you can't get 60+ FPS even with lowered settings and you'll see a lot of tearing and input lag. High-end cards (GTX 1080 or better) do just fine even without adaptive sync, with the possible exception of 4K gaming at ultra settings.
This is why the lower price of FreeSync is so important: it caters to the market that actually sees the most benefit from this technology ($100-250 GPUs).
For the average FreeSync monitor, yes... but if you are a HIGH-END gamer... and paying for the most expensive monitors (huge 4K and 1440p ones), then FreeSync is useless, as you can't even get to 40FPS at highest settings with any AMD card... You need a high-end Nvidia card to drive those pixels - so G-Sync is the only way to go... for now...
 
Since when does having options = a mess?
FreeSync goes from mid-range monitors to high-end; G-Sync is just for high-end monitors. Outside of limited sales, G-Sync monitors are all above $400, which puts them well outside the price range that the majority of gamers are willing to pay for a monitor. A 1080p 144Hz G-Sync monitor costs $400-500, for crying out loud. It's stupid expensive.
FreeSync does indeed require you to look at monitor specs more closely, but it's not that hard to do, and those who don't either don't care about/need adaptive sync or don't know what it is.

You're forgetting even gamers are lazy and will just look for FreeSync on the box.

G-Sync is more expensive (Nvidia approves the panels and keeps the module updated), but you are guaranteed the best experience. FreeSync is a bit hit and miss.
 
I never liked the execution of FreeSync. It was quickly pushed-out tech built on top of an existing standard in response to G-Sync, and too much of the weight falls on the monitor manufacturers to improve it rather than on AMD.

Although more expensive, G-Sync is still the superior option. Personally I'm fine with just a high refresh monitor.

By the looks of this article FreeSync is STILL a mess.
Since when does having options = a mess?
FreeSync goes from mid-range monitors to high-end; G-Sync is just for high-end monitors. Outside of limited sales, G-Sync monitors are all above $400, which puts them well outside the price range that the majority of gamers are willing to pay for a monitor. A 1080p 144Hz G-Sync monitor costs $400-500, for crying out loud. It's stupid expensive.
FreeSync does indeed require you to look at monitor specs more closely, but it's not that hard to do, and those who don't either don't care about/need adaptive sync or don't know what it is.

But what if you have an AMD video card and you want the features of a G-Sync monitor?

Seems like everything Nvidia is stupid expensive. Crazy but they sound like Intel.
 
But what if you have an AMD video card and you want the features of a G-Sync monitor?

Seems like everything Nvidia is stupid expensive. Crazy but they sound like Intel.
Besides the proprietary G-Sync tech, most monitors have both G-Sync and FreeSync equivalents, and there are also some features that FreeSync has but G-Sync doesn't.
In the end your monitor purchase is dictated by your GPU (unless you already have the monitor, in which case you'll have to pick a GPU :D ).
 
Well "2K" has colloquially been taken to mean "1440p" wheres 1080 pixels is nowhere near 2000.
By who?!? As the previous user stated, 2K is 2000x1000 rounded (1920x1080)... and almost no one uses 2K as a term anyway - we call it 1080p...

The number in front of the "K" has always been the larger dimension (the horizontal pixel count) rounded... hence 4K, 8K, etc...
 
By who?!? As the previous user stated, 2K is 2000x1000 rounded (1920x1080)... and almost no one uses 2K as a term anyway - we call it 1080p...

The number in front of the "K" has always been the larger dimension (the horizontal pixel count) rounded... hence 4K, 8K, etc...
Well, the larger dimension in my case is 2560, which clips down to 2K also...

But that said I always refer to my res as being "1440p" or "QHD"
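
Since the two posts above are arguing past each other (rounding the horizontal pixel count versus truncating it), here's a tiny illustrative Python sketch of how the common labels fall out under each rule:

```python
# Illustrative only: derive a "K" label from the horizontal resolution,
# using the two conventions argued above.
RESOLUTIONS = {              # width x height
    "1080p / FHD": (1920, 1080),
    "1440p / QHD": (2560, 1440),
    "4K UHD":      (3840, 2160),
    "8K UHD":      (7680, 4320),
}

for name, (w, h) in RESOLUTIONS.items():
    rounded = round(w / 1000)    # nearest thousand: 1920 -> 2, 2560 -> 3
    clipped = w // 1000          # "clips down":     1920 -> 1, 2560 -> 2
    print(f"{name:12} {w}x{h}: rounded -> {rounded}K, clipped -> {clipped}K")
```

Rounding gives the familiar 2K/4K/8K for 1920/3840/7680 but puts 2560 at "3K", while clipping puts 2560 at "2K" but calls 1920 "1K" - neither rule reproduces every common label, which is why the naming stays muddy.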
 
I thought I'd surprise my brother by saying I was now at 1440p. He said he was on 4K. And I said "Great but I can GAME on mine at Ultra". He didn't respond.
 
Given the price difference, wouldn't it make more sense to spend the money on a higher-spec graphics card and pair it with a standard monitor? If it can pump out >60 FPS it obviates the need for adaptive sync anyhow, and is likely a more future-proof option...
 
Given the price difference, wouldn't it make more sense to spend the money on a higher-spec graphics card and pair it with a standard monitor? If it can pump out >60 FPS it obviates the need for adaptive sync anyhow, and is likely a more future-proof option...
Let's not forget that adaptive sync helps even at 60+ fps (it stops that annoying screen tearing), and the higher the res of your monitor, the easier it is to drop below 60 FPS in the more demanding titles. But as you start upgrading your GPU, it makes more sense to have a monitor that supports both adaptive sync and high refresh rates.
The idea is that with FreeSync you don't pay a lot extra (certainly not enough to have upgraded your GPU instead), while with G-Sync you do have to pay a lot.
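
To put a number on why this matters most in the 40-60 fps zone, here's a deliberately simplified sketch (plain Python; the 40-144Hz range and the frame times are just assumed examples, not any particular monitor) of when a finished frame actually reaches the screen with double-buffered v-sync on a fixed 60Hz panel versus a variable-refresh panel:

```python
import math

FIXED_HZ = 60                     # conventional fixed-refresh monitor
VRR_MIN_HZ, VRR_MAX_HZ = 40, 144  # assumed adaptive-sync range; varies per monitor

def shown_interval_vsync(frame_ms):
    """Double-buffered v-sync: a finished frame waits for the next fixed refresh tick."""
    tick = 1000.0 / FIXED_HZ
    return math.ceil(frame_ms / tick) * tick

def shown_interval_adaptive(frame_ms):
    """Adaptive sync: the panel refreshes as soon as the frame is ready, as long as
    the rate stays within its range (below VRR_MIN_HZ real panels repeat frames
    via LFC or fall back to fixed refresh -- not modelled here)."""
    return max(frame_ms, 1000.0 / VRR_MAX_HZ)

for frame_ms in (12.0, 18.0, 22.0):   # roughly 83, 56 and 45 fps render rates
    v = shown_interval_vsync(frame_ms)
    a = shown_interval_adaptive(frame_ms)
    print(f"{frame_ms:4.1f} ms/frame: v-sync shows {1000 / v:5.1f} fps, "
          f"adaptive sync shows {1000 / a:5.1f} fps")
```

With an 18-22 ms frame (a 45-55 fps render rate), the fixed 60Hz panel with v-sync effectively shows 30 fps (or tears with v-sync off), while the adaptive-sync panel simply tracks the render rate - exactly the zone low- and mid-range cards live in.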
 
Please add this monitor to the list of good FreeSync Monitors

Crossover UW3535 TIO HDR Turbo Boost Clock 120Hz 35" Curved Gaming Monitor NEW Sept 2017 model!!

Edit: if anyone is grabbing a CrossOver monitor, make sure you buy a certified DisplayPort 1.2 cable; the cable that comes with it isn't up to the task.
 
Price and quality notwithstanding, the only real reason to choose one over the other is the video card you are using.... Until Vega, AMD has no high-end cards that compete with Nvidia's 1080, 1080 Ti, Titan X, etc... So if you want to game with a pricey adaptive sync monitor, your only real choice is G-Sync.... No AMD card can run high-end games at high-end resolutions past 40FPS....

Let's hope Vega changes this and gives Freesync a reason to exist.

I know this article is about half a year old, and by now Vega's out and a known quantity, but I just couldn't stop myself from replying to point out just how much ridiculous, BS-stuffed FUD is in this post (even before Vega was available). To the poster: nearly everything you said is pure and utter bull****.

First off, unless we're talking maxed-out AAA gaming @ 4K/60Hz (which not even a GTX 1080 Ti can do), AMD already had plenty of fast-enough cards available PRIOR TO VEGA to properly take advantage of a 1440p/144Hz or 4K/60Hz monitor with FreeSync. The R9 Fury & Fury X (AMD Fiji-based GPUs) are both killer 1440p cards that'll push FAR above 40fps in most titles, with the latter generally falling right behind the "fast enough for adaptive sync because it says 'Nvidia' on it" GTX 1070, and the regular Fury isn't too far behind (both of which were DRAMATICALLY cheaper, at nearly half the price of the 1070, in 2016 & pre-crypto-boom 2017). Hell, even the comparatively ancient (launched in 2013) "Hawaii"-based cards (290/X, 390/X) kick surprising amounts of arse at 1440p (and absolutely burn any Kepler [GTX 600/700] cards to the freaking ground), and despite being roughly four years old will more often than not exceed your "40fps" mark in the latest titles.

And if you crank the resolution up even further to 4K, the Fury X's HBM advantage erases the small gap, bringing it up to a near dead heat with the GTX 1070, and in games where it isn't crazy VRAM-limited it's often even a touch faster (being HBM with its 512GB/s of memory bandwidth, it performs far better in that regard than any regular 4GB card, but still can't compete with having 8GB in those few VRAM-killing corner cases).
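
For anyone wanting to sanity-check that 512GB/s figure, peak memory bandwidth falls straight out of bus width times per-pin data rate; a quick sketch (the GTX 1070 numbers below are its stock 256-bit, 8 Gbps GDDR5 configuration):

```python
def mem_bandwidth_gb_s(bus_width_bits, data_rate_gbps_per_pin):
    """Peak memory bandwidth in GB/s = bus width (bits) x per-pin rate (Gbit/s) / 8."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

print(mem_bandwidth_gb_s(4096, 1.0))  # Fury X: 4096-bit HBM1 at 1 Gbps/pin -> 512.0
print(mem_bandwidth_gb_s(256, 8.0))   # GTX 1070: 256-bit GDDR5 at 8 Gbps   -> 256.0
```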

Lastly, adaptive sync is most beneficial at the LOW END of GPUs, not the high end, which is plainly obvious if you've ever tested both or understand how the tech works. Adaptive sync lets a low/mid-range card that's not quite fast enough for a locked 60fps deliver an equally smooth experience at 40-60fps, without having to start knocking settings down like you would otherwise. Pretty sure this is even discussed in the article. (And it is a MAJOR benefit for AMD/FreeSync, with it being a near-free "value add" in today's monitors, and not another $200 expense on top of the ≈$200 they already paid for their card.) Next time try doing some actual research before you just start pulling random crap outta your arse.
 
I know this article is about half a year old, and by now Vega's out and a known quantity, but I just couldn't stop myself from replying to point out just how much ridiculous, BS-stuffed FUD is in this post (even before Vega was available). To the poster: nearly everything you said is pure and utter bull****.

First off, unless we're talking maxed-out AAA gaming @ 4K/60Hz (which not even a GTX 1080 Ti can do), AMD already had plenty of fast-enough cards available PRIOR TO VEGA to properly take advantage of a 1440p/144Hz or 4K/60Hz monitor with FreeSync. The R9 Fury & Fury X (AMD Fiji-based GPUs) are both killer 1440p cards that'll push FAR above 40fps in most titles, with the latter generally falling right behind the "fast enough for adaptive sync because it says 'Nvidia' on it" GTX 1070, and the regular Fury isn't too far behind (both of which were DRAMATICALLY cheaper, at nearly half the price of the 1070, in 2016 & pre-crypto-boom 2017). Hell, even the comparatively ancient (launched in 2013) "Hawaii"-based cards (290/X, 390/X) kick surprising amounts of arse at 1440p (and absolutely burn any Kepler [GTX 600/700] cards to the freaking ground), and despite being roughly four years old will more often than not exceed your "40fps" mark in the latest titles.

And if you crank the resolution up even further to 4K, the Fury X's HBM advantage erases the small gap, bringing it up to a near dead heat with the GTX 1070, and in games where it isn't crazy VRAM-limited it's often even a touch faster (being HBM with its 512GB/s of memory bandwidth, it performs far better in that regard than any regular 4GB card, but still can't compete with having 8GB in those few VRAM-killing corner cases).

Lastly, adaptive sync is most beneficial at the LOW END of GPUs, not the high end, which is plainly obvious if you've ever tested both or understand how the tech works. Adaptive sync lets a low/mid-range card that's not quite fast enough for a locked 60fps deliver an equally smooth experience at 40-60fps, without having to start knocking settings down like you would otherwise. Pretty sure this is even discussed in the article. (And it is a MAJOR benefit for AMD/FreeSync, with it being a near-free "value add" in today's monitors, and not another $200 expense on top of the ≈$200 they already paid for their card.) Next time try doing some actual research before you just start pulling random crap outta your arse.

Well, this IS an old thread... I was talking about HIGH-END gaming - 4K resolution with 40+ FPS on AAA games... which, at that time, only Nvidia cards could do... Your BS about Fury cards notwithstanding, I'd love to see some benchmarks FROM LAST YEAR showing them running 4k AAA games at well over 40FPS (let alone 60FPS).

We had already established that FreeSync was excellent at the low end... no one was arguing that... My argument, which pretty much everyone agreed with, was that at the ridiculously overpriced high end of adaptive sync tech, it only really made sense to purchase G-Sync.

Unfortunately, it turned out that Vega wasn't really the Nvidia-killer that we all hoped for, so I can't in all honesty recommend a Freesync monitor at the uber-high-end price even now.

That said, I wouldn't really recommend an Nvidia G-Sync monitor at the high end either... but if you just MUST have an adaptive sync monitor for your uber setup, it really only makes sense, STILL, to go Nvidia.
 
Well, this IS an old thread... I was talking about HIGH-END gaming - 4K resolution with 40+ FPS on AAA games... which, at that time, only Nvidia cards could do... Your BS about Fury cards notwithstanding, I'd love to see some benchmarks FROM LAST YEAR showing them running 4k AAA games at well over 40FPS (let alone 60FPS).

We had already established that FreeSync was excellent at the low end... no one was arguing that... My argument, which pretty much everyone agreed with, was that at the ridiculously overpriced high end of adaptive sync tech, it only really made sense to purchase G-Sync.

Unfortunately, it turned out that Vega wasn't really the Nvidia-killer that we all hoped for, so I can't in all honesty recommend a Freesync monitor at the uber-high-end price even now.

That said, I wouldn't really recommend an Nvidia G-Sync monitor at the high end either... but if you just MUST have an adaptive sync monitor for your uber setup, it really only makes sense, STILL, to go Nvidia.
Lol, sure you did *rolls eyes*. You do realize what you are saying, right? That adaptive sync is only worth it if you have a GTX 1080 Ti (and was completely pointless prior to its existence) & a 4K/60Hz monitor. You know, because 4K is just sooooo popular *facepalm*, and so many gamers prefer the extra resolution on a 20-40" monitor to a faster refresh rate. Yup, let's just forget that 1440p/144Hz and 1080p/240Hz and all that good stuff exist. And let's ALSO forget that the Fury X & GTX 1070 put out ≈30-35fps at 4K in the latest AAA titles and will easily remain within the 30-60Hz & 40-60Hz adaptive sync ranges of available 4K panels with only minor tweaks to graphics settings.

You're right, buying a 4K/60Hz adaptive sync monitor generally doesn't make sense unless you have a GTX 1080 Ti or are willing to fiddle with settings, but to say there's no point in using adaptive sync on cheaper cards with 1080p & 1440p monitors instead is just ****ing ridiculous, rofl. 1080p & 1440p monitors come with plenty big enough adaptive sync ranges for even the slowest modern cards to stay above the minimum FPS, and thus have a DRASTICALLY improved gameplay experience (and it is drastic, which you'd know if you'd ever actually used one instead of just pulling stuff outta your ***; and much more so with low-end cards that have trouble hitting 60fps+ than the other way around).

And, let's be real, 4K is largely a waste on a monitor, even more so for gaming first & foremost, given the lack of adequately powerful GPUs. I have both a 4K/60Hz screen for work & a 1440p/144Hz (26-144Hz FreeSync range with LFC) gaming monitor, and aside from the difference in usable screen real estate, I prefer using & looking at the latter FAR more than the former.
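
The range numbers being thrown around here are really the whole story: adaptive sync only does its thing while the frame rate sits inside the panel's window, and LFC (frame repetition) covers dips below it when the window is wide enough. A small illustrative sketch, using the two ranges mentioned in this thread and approximating the LFC requirement as a max-to-min ratio of roughly 2 (AMD's actual rule is a bit stricter in practice):

```python
def vrr_status(fps, vrr_min, vrr_max, lfc_ratio=2.0):
    """Rough check of how a panel with range [vrr_min, vrr_max] Hz handles a given fps.
    LFC availability is approximated here as max/min >= lfc_ratio."""
    has_lfc = vrr_max / vrr_min >= lfc_ratio
    if vrr_min <= fps <= vrr_max:
        return "inside range: adaptive sync active"
    if fps < vrr_min:
        return "below range: LFC repeats frames" if has_lfc else "below range: tearing/stutter returns"
    return "above range: capped or normal v-sync behaviour"

# The two ranges mentioned in this thread: a 40-60Hz 4K panel and a 26-144Hz 1440p panel
for fps in (35, 50, 90):
    print(f"{fps:>3} fps | 40-60Hz panel:  {vrr_status(fps, 40, 60)}")
    print(f"{fps:>3} fps | 26-144Hz panel: {vrr_status(fps, 26, 144)}")
```

On that reading, the narrow 40-60Hz 4K panels only help while you can hold 40+ fps, while the wide 26-144Hz panel with LFC stays smooth more or less regardless - which is the crux of both sides of this argument.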
 
Lol, sure you did *rolls eyes*. You do realize what you are saying, right? That adaptive sync is only worth it if you have a GTX 1080 Ti (and was completely pointless prior to its existence) & a 4K/60Hz monitor. You know, because 4K is just sooooo popular *facepalm*, and so many gamers prefer the extra resolution on a 20-40" monitor to a faster refresh rate. Yup, let's just forget that 1440p/144Hz and 1080p/240Hz and all that good stuff exist. And let's ALSO forget that the Fury X & GTX 1070 put out ≈30-35fps at 4K in the latest AAA titles and will easily remain within the 30-60Hz & 40-60Hz adaptive sync ranges of available 4K panels with only minor tweaks to graphics settings.

You're right, buying a 4K/60Hz adaptive sync monitor generally doesn't make sense unless you have a GTX 1080 Ti or are willing to fiddle with settings, but to say there's no point in using adaptive sync on cheaper cards with 1080p & 1440p monitors instead is just ****ing ridiculous, rofl. 1080p & 1440p monitors come with plenty big enough adaptive sync ranges for even the slowest modern cards to stay above the minimum FPS, and thus have a DRASTICALLY improved gameplay experience (and it is drastic, which you'd know if you'd ever actually used one instead of just pulling stuff outta your ***; and much more so with low-end cards that have trouble hitting 60fps+ than the other way around).

And, let's be real, 4K is largely a waste on a monitor, even more so for gaming first & foremost, given the lack of adequately powerful GPUs. I have both a 4K/60Hz screen for work & a 1440p/144Hz (26-144Hz FreeSync range with LFC) gaming monitor, and aside from the difference in usable screen real estate, I prefer using & looking at the latter FAR more than the former.
So I'm going to conclude that you didn't actually READ the entire thread... because we are pretty much in agreement here... FreeSync monitors make sense in the low-mid range, not at 4K 40+FPS... While G-Sync doesn't make much sense at the high end either, if you MUST have adaptive sync, then it's the only option at that end...

You necro'd a 6-month-old thread to AGREE with me, yet pretend I'm full of crap??
 