AMD FreeSync Review: Laying the groundwork for the ideal adaptive sync standard

You speak as if technology never comes down in cost.

No, I speak as if Nvidia charges a fixed fee, which they do. G-Sync is a brand to Nvidia and I highly doubt they will ever reduce the cost so that cheaper monitors can have it. What really nails it home is that most of the people on these cheaper systems are most likely using integrated graphics solutions (Intel or AMD), both of which will be using FreeSync, since Nvidia's G-Sync only works with Nvidia video cards.
 
Nice writeup - that was actually really informative.
One question though:

How does (or could) something like Freesync affect video playback outside of gaming?
Let me give you an example.

I watch NBA games a lot. Often, these are 30fps, occasionally 60fps. Neither are particularly smooth. Assuming I'm rendering on a Freesync capable monitor and GPU, and watching videos using hardware accelerated playback (e.g. DXVA), will I see any benefit?
 
G-Sync is a brand to Nvidia and I highly doubt they will ever reduce the cost so that cheaper monitors can have it.
The only thing that gives that comment credibility is the fact that Adaptive-Sync is becoming a standard. But you are speaking in absolutes, as if nVidia would never have lowered the price regardless. G-Sync will evolve and become mainstream. Sadly, though, people will call them copycats for evolving instead of industry leaders for pioneering the move to adaptive sync.
 
All this bickering aside, as long as they both work, everyone can have a choice of which slice of the pie they want, and eat it too. Right? Riiiiight? I honestly don't care who "wins" as long as display technology advances so that we can all enjoy a better visual experience. :cool:
 
I watch NBA games a lot. Often, these are 30fps, occasionally 60fps. Neither are particularly smooth. Assuming I'm rendering on a Freesync capable monitor and GPU, and watching videos using hardware accelerated playback (e.g. DXVA), will I see any benefit?

You won't see any benefit with 30 FPS content, as most monitors already show it with even frame repeats (each frame displayed twice at 60 Hz), even without FreeSync.

You might see an improvement with 25p or 24p content, which can be shown at 50 Hz or 48 Hz respectively (each frame repeated exactly twice); that is smoother than the uneven frame repeats needed to fit them onto a fixed 60 Hz or 30 Hz refresh.
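
If you want to see the arithmetic behind that, here's a minimal Python sketch (my own illustration, not from the article): it just checks whether the refresh rate is a whole multiple of the content frame rate, which is what decides whether frames repeat evenly or judder.

# Rough sketch: even vs. uneven frame-repeat cadence for common video frame rates.
def cadence(content_fps: int, refresh_hz: int) -> str:
    """Describe how many refresh cycles each content frame occupies."""
    if refresh_hz % content_fps == 0:
        return f"even: every frame shown {refresh_hz // content_fps}x"
    low = refresh_hz // content_fps   # some frames get this many refreshes
    high = low + 1                    # others get one extra, which is the judder
    return f"uneven: frames alternate between {low}x and {high}x"

for fps, hz in [(24, 60), (24, 48), (25, 60), (25, 50), (30, 60)]:
    print(f"{fps} fps on a {hz} Hz display -> {cadence(fps, hz)}")

Running that shows 24 fps and 25 fps landing on an uneven 3:2-style cadence at 60 Hz, but repeating cleanly at 48 Hz and 50 Hz, while 30 fps is already even at 60 Hz.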
 
You might see an improvement with 25p or 24p content, which can be shown at 50 Hz or 48 Hz respectively (each frame repeated exactly twice); that is smoother than the uneven frame repeats needed to fit them onto a fixed 60 Hz or 30 Hz refresh.
which is why I wish 120hz monitors would come back in full force. At 120hz, 24fps, 30fps and 60fps all natively (read:evenly) scale to the native rate without any effort. When did this 144hz nonsense start and why? I never really thought about it until now.
 
which is why I wish 120hz monitors would come back in full force. At 120hz, 24fps, 30fps and 60fps all natively (read:evenly) scale to the native rate without any effort. When did this 144hz nonsense start and why? I never really thought about it until now.

Most 144 Hz monitors can be driven at 30, 60 and 120 Hz
 
Is that what they call smart shoppers nowadays? Enjoy your inferior 48 and 56Hz minimums.

MSI GeForce FX 5200 128MB
XFX GeForce 6600 LE 256MB*
BFG GeForce 6600 GT 256MB
MSI GeForce 7600 LE 256MB*
BFG GeForce 7900 GS 512MB (x2 SLi)
Zotac GeForce 8800 GT 512MB
Sapphire Radeon 4830 512MB
Diamond Radeon 4850 1GB*
Sapphire Radeon Vapor-X 5770 1GB
Sapphire Radeon 6950 2GB (x2 Crossfire)
Sapphire 7870 Gigahertz Edition 2GB
MSI GTX 970 4GB

What part of that list screams nVIDIA fanboy to you? Hmm?
*Cards I bought used.

I never even said I was buying a FreeSync monitor, so don't tell me what I will or will not enjoy.

Secondly, they clearly said the minimums will change, but at launch this is what it is for now.

I don't care if you owned 100 Radeons and 1 GeForce in the past. What you wrote clearly shows your bias: you currently are an Nvidia fanboy, which everyone reading this can see. Just accept it and move on; there is no need to be in denial.
 
which is why I wish 120hz monitors would come back in full force. At 120hz, 24fps, 30fps and 60fps all natively (read:evenly) scale to the native rate without any effort. When did this 144hz nonsense start and why? I never really thought about it until now.

Most 144 Hz monitors can be driven at 30, 60 and 120 Hz
Are these newer displays simply lowering their refresh rate to whatever the output says it should be? Or are they repeating frames (or, on really fancy displays, using interpolation) to upconvert to the native refresh rate? I ask because I don't know, and my admittedly old understanding was that if it's the latter and you are not outputting an even factor of the refresh rate, stuttering can occur.
 
I never even said I was buying a FreeSync monitor, so don't tell me what I will or will not enjoy.

Secondly, they clearly said the minimums will change, but at launch this is what it is for now.

I don't care if you owned 100 Radeons and 1 GeForce in the past. What you wrote clearly shows your bias: you currently are an Nvidia fanboy, which everyone reading this can see. Just accept it and move on; there is no need to be in denial.

I've been ignoring the bickering going on here, but screaming "fanboy" when someone doesn't agree with you is just about the dumbest thing going today...

I personally want to see head-to-head comparisons of G-Sync vs Async, to see the real pros and cons side by side. So far I don't have that information... it's all speculation. I myself used to work for AMD, so I was partial to them for a very long time. Now I've been running Nvidia since the 600 series, when the 7970 wouldn't do what I needed it to. I've seen G-Sync; I haven't seen Async. But I'm all about cheaper options for users too... Plus, competition drives prices down regardless. G-Sync might even end up being an otherwise free module instead of the $200 option for manufacturers, etc. Time will tell.

On paper, I see pros for both sides. Cons also. I want to see the shootouts! Right now I agree that the available 48 Hz minimums are a bad thing for Async setups. But that's not in the spec; that's the manufacturers' choice. I believe the spec goes way down to 9 Hz or something... Again, time will tell how that works out.

I'll be doing a giant upgrade (multiple monitors, SLI or CF, etc.) hopefully around the holidays this year. I hope they have that sorted out by then. Also, WHY can't they put both on the same monitor? Even if they have two separate 'video input' ports, one for each, it is still possible... Especially if the G-Sync price drops, I see that happening in the future too, unless one concedes to the other.
 
I've been ignoring the bickering going on here, but screaming "fanboy" when someone doesn't agree with you is just about the dumbest thing going today...

I personally want to see head-to-head comparisons of G-Sync vs Async, to see the real pros and cons side by side. So far I don't have that information... it's all speculation. I myself used to work for AMD, so I was partial to them for a very long time. Now I've been running Nvidia since the 600 series, when the 7970 wouldn't do what I needed it to. I've seen G-Sync; I haven't seen Async. But I'm all about cheaper options for users too... Plus, competition drives prices down regardless. G-Sync might even end up being an otherwise free module instead of the $200 option for manufacturers, etc. Time will tell.

On paper, I see pros for both sides. Cons also. I want to see the shootouts! Right now I agree that the available 48 Hz minimums are a bad thing for Async setups. But that's not in the spec; that's the manufacturers' choice. I believe the spec goes way down to 9 Hz or something... Again, time will tell how that works out.

I'll be doing a giant upgrade (multiple monitors, SLI or CF, etc.) hopefully around the holidays this year. I hope they have that sorted out by then. Also, WHY can't they put both on the same monitor? Even if they have two separate 'video input' ports, one for each, it is still possible... Especially if the G-Sync price drops, I see that happening in the future too, unless one concedes to the other.

I didn't say that because he didn't agree with me; that is your presumption.
I said it because you can clearly see the obvious bias in his post. I'm not the only person who spotted it.

He also called himself a fanboy in his own post.
 
Excellent write-up @Scorpus, glad to see your view on FreeSync right now, and I am glad to hear it works well.

I will say, I think both of these techs matter more at refresh rates well above 60 FPS, in the 75-144 range, where maintaining those frame rates is very difficult even with the highest-end machine. I do agree that we need the monitors to at least support down to 40 FPS for stability though, as 48 seems a bit too high for a monitor like that.

I will be waiting to see what the prices are like on the 4K monitors though, personally. I would love to game at 144 Hz or something, but in this case I am going to see what happens with the higher-resolution variants, as they may be enough for me if I feel like finally making the investment. Though I won't even consider it until I see the CrossFire driver working.
 
I'll be doing a giant upgrade (multiple monitors, SLI or CF, etc.) hopefully around the holidays this year. I hope they have that sorted out by then. Also, WHY can't they put both on the same monitor? Even if they have two separate 'video input' ports, one for each, it is still possible... Especially if the G-Sync price drops, I see that happening in the future too, unless one concedes to the other.
I'll be building a new computer from scratch around that time as well, which is why I sincerely hope AMD's driver support for FreeSync matures quickly, and that display manufacturers get their butts in gear. I want to see a more level playing field and head-to-head comparisons, like SirGCal does.

As for combining the two, I wonder how hard it would be to simply make it an option in the display's OSD to turn G-Sync on, or turn it off to "fail over" to a standard DisplayPort controller (which would have Adaptive-Sync, naturally). You could have a separate connector if they cannot share a single connector. However, that would probably needlessly increase the price of the monitor, since you would effectively have a pair of scaler and controller boards...
 
Are these newer displays simply lowering their refresh rate to whatever the output says it should be? Or are they repeating frames (or, on really fancy displays, using interpolation) to upconvert to the native refresh rate?
Interesting. When you think about it, all we need is adaptive sync over the 73 to 144 Hz range. All lower frame rates can be repeated into the 98 to 144 Hz range.

With an adaptive-sync range of 98 to 144 Hz:
  • 17 - 24 FPS (6x frame repeat)
  • 20 - 28 FPS (5x frame repeat)
  • 25 - 36 FPS (4x frame repeat)
  • 33 - 48 FPS (3x frame repeat)
  • 49 - 72 FPS (2x frame repeat)
That only covers up to 72 FPS, so adaptive sync would need to start at 73 FPS. Extending the window further down from 98 Hz toward 73 Hz only lowers the number of frame repeats needed. (A quick sanity check of these bands is sketched below.)
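
Here's that sanity check, a minimal Python sketch of my own (the 98-144 Hz window is the assumption from the list above): for each whole-number repeat count, it prints the band of frame rates whose repeated refresh still lands inside the window.

import math

WINDOW_MIN, WINDOW_MAX = 98, 144  # assumed adaptive-sync window in Hz

for repeats in range(6, 1, -1):
    # Lowest/highest whole-number FPS whose repeated refresh stays in the window.
    fps_min = math.ceil(WINDOW_MIN / repeats)
    fps_max = WINDOW_MAX // repeats
    print(f"{fps_min}-{fps_max} FPS with {repeats}x frame repeat "
          f"-> panel runs at {fps_min * repeats}-{fps_max * repeats} Hz")

That reproduces the bands above (17-24, 20-28, 25-36, 33-48, 49-72).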
 
The only thing that gives that comment credibility is the fact that Adaptive-Sync is becoming a standard. But you are speaking in absolutes, as if nVidia would never have lowered the price regardless. G-Sync will evolve and become mainstream. Sadly, though, people will call them copycats for evolving instead of industry leaders for pioneering the move to adaptive sync.

Nvidia definitely isn't a copycat, although every company is guilty of taking a note or two from the competition.

I base my assumption above on a few things. The first is that Nvidia is known to lock up proprietary tech and not budge on its use. An example of this is PhysX. They purchased this tech a while ago and have used it exclusively on their cards. The big problem is that they have not optimized it or put much effort into actually making it a worthwhile physics engine. I can't ever remember a news article about awesome improvements Nvidia is making to PhysX. All I ever see is how poorly it runs on other brands of video cards, and that it is mostly a gimmick.

The second is that Nvidia didn't create an open standard. If Nvidia really wanted their tech to reach mass adoption, they would have come up with a method that worked for everyone. Instead, what we get from Nvidia is a tightly controlled G-Sync market in which the only one who stands to make the most profit is Nvidia.

I'm giving Nvidia the benefit of the doubt here, but they give me, the customer, no reason to believe they are working to make G-Sync a mass-adoptable tech. Nvidia will do what it has always done: expand and market a premium brand.
 
I never even said I was buying a FreeSync monitor, so don't tell me what I will or will not enjoy.

Secondly, they clearly said the minimums will change, but at launch this is what it is for now.

I don't care if you owned 100 Radeons and 1 GeForce in the past. What you wrote clearly shows your bias: you currently are an Nvidia fanboy, which everyone reading this can see. Just accept it and move on; there is no need to be in denial.

Think of it like this: G-Sync is Mantle and FreeSync is DX11. The module is giving you to-the-metal access, and FreeSync is the software solution to the same problem. nVIDIA has direct control of its module and drivers, while AMD has drivers and the hope that monitor manufacturers choose the best hardware available to them at a good price, and that's not a lot of control.

FreeSync is a modified version of 1.2a that adopted the 1.2a+ name, and monitor manufacturers don't even have to support it. The whole point of G-Sync was to get rid of tearing and input lag across the entire range of the monitor's refresh rate AND make sub-60 FPS gaming smooth, and they've done that... and FreeSync hasn't. Because you can play at such low frame rates (even below 30 FPS) with G-Sync, you can keep your GPU longer, and you don't get that with FreeSync. FreeSync currently still requires a minimum of ~50 FPS to see its benefits, which is no different from what you want with a standard 60 Hz monitor. As a result, you're paying $700 just for adaptive sync above 60 FPS. That's its biggest problem, and it's a huge deal breaker.

I'd rather pay more for G-Sync and get the full experience AND keep my GPU longer, versus getting half of it while still having to honor the ~50 FPS minimum at all times to avoid hitching and stutters.

You can't say FreeSync is the same [as G-Sync], because it isn't there yet. Like I said in my original comment, FreeSync is a good START, but it has a ways to go.
 
The only thing that gives that comment credibility is the fact that Adaptive-Sync is becoming a standard. But you are speaking in absolutes, as if nVidia would never have lowered the price regardless. G-Sync will evolve and become mainstream. Sadly, though, people will call them copycats for evolving instead of industry leaders for pioneering the move to adaptive sync.

Nvidia definitely isn't a copycat, although every company is guilty of taking a note or two from the competition.

I base my assumption above on a few things. The first is that Nvidia is known to lock up proprietary tech and not budge on its use. An example of this is PhysX. They purchased this tech a while ago and have used it exclusively on their cards. The big problem is that they have not optimized it or put much effort into actually making it a worthwhile physics engine. I can't ever remember a news article about awesome improvements Nvidia is making to PhysX. All I ever see is how poorly it runs on other brands of video cards, and that it is mostly a gimmick.

Cool PhysX stuff:

PhysX open to UE4 devs:
http://blogs.nvidia.com/blog/2015/03/04/nvidia-opens-physx-code-to-ue4-developers/
 

I hope you know that your post only furthers my point. Nvidia is "allowing" UE4 developers to use PhysX so it can be in more games. They aren't opening up PhysX to further improvements, nor allowing more video cards to take advantage of it.

The only thing that will come of this is possibly more PhysX games and more crippled AMD hardware. I'll take Havok over PhysX any day.
 
So I've seen this article linked a few times now, and I feel it's necessary to correct a gross misunderstanding by the author of how G-Sync works below a panel's normal refresh range.

The core difference between FreeSync and G-Sync is how they handle the sub-40 Hz range. FreeSync goes into 'dumb panel' mode outside its panel range (say 40-144 Hz), while G-Sync uses its technology to drive the panel at perfect ratios of the FPS inside the panel range. Now, the VESA spec for Adaptive-Sync does allow for a 9-144 Hz range, but the reality is that due to LCD technology anything below 40 Hz is mainly theoretical because of ghosting and flickering. This is why nVidia abandoned their initial sync technology and opted for the module: they knew that technology like FreeSync couldn't handle these sub-40 rates.

In this low range, FreeSync operates just like any other non-sync panel, so you have to deal with stuttering or tearing depending on whether you have V-sync on or off. The G-Sync solution is why the module is needed. It takes a frame rate like 29 FPS and, instead of just turning sync off, drives the panel at 58 Hz, a perfect ratio, which eliminates both the stutter and tearing that are inherent to FreeSync here. Let's say the frame rate drops again, to 14 FPS: now G-Sync drives the panel at 42 Hz, another perfect ratio.
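
To make that concrete, here's a rough Python sketch of the behaviour as described above and in the PCPer piece (my own illustration of the idea, not vendor code; the 40-144 Hz panel range is an assumption): when the frame rate falls below the panel's minimum, repeat each frame enough times that the refresh rate lands back inside the range.

def gsync_style_refresh(fps: float, panel_min: float = 40.0, panel_max: float = 144.0):
    """Return (repeat_count, effective_refresh_hz) for a given frame rate."""
    repeats = 1
    while fps * repeats < panel_min:
        repeats += 1                          # double, triple, ... the frame until it fits
    refresh = min(fps * repeats, panel_max)   # clamp, just in case
    return repeats, refresh

for fps in (29, 14, 45):
    repeats, hz = gsync_style_refresh(fps)
    print(f"{fps} fps -> panel driven at {hz:.0f} Hz ({repeats}x frame repeat)")

That reproduces the examples above: 29 FPS drives the panel at 58 Hz and 14 FPS at 42 Hz, while anything already inside the range passes through as-is.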

But don't take my word, for justice and science!
http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

TL;DR: In the sub-40 range, G-Sync is technologically superior to FreeSync. In other words, G-Sync actually WORKS in this range.
 