TechSpot

AMD FreeSync Review: Laying the groundwork for the ideal adaptive sync standard

By Scorpus
Mar 19, 2015
  1. Scorpus

    While it may be entering the market second with its version of the technology, AMD has laid the groundwork with FreeSync for the ideal adaptive sync standard going forward. The company has delivered on its promise to create a cheaper, more flexible, open standard for variable refresh.

    The core experience delivered by adaptive sync is inherently the same regardless of whether you choose FreeSync or G-Sync. Wherever possible, the display refreshes itself at the instant a frame from the GPU has finished rendering, removing the stuttering, tearing and general jank present in fixed-refresh solutions. The result is smoother, more responsive gameplay, which makes playing at 40 FPS feel just as good as 60 FPS.
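    The timing behavior described above can be sketched in a few lines of Python. The numbers below are purely illustrative (a GPU finishing a frame every 25 ms against a 60 Hz fixed refresh), not either vendor's actual implementation:

```python
import math

# Illustrative sketch only -- not AMD's or Nvidia's actual implementation.
# Compare when frames reach the screen under a fixed 60 Hz refresh (vsync)
# versus an adaptive refresh that fires the moment a frame is ready.

def fixed_refresh_display_times(frame_ready_ms, refresh_hz=60):
    """With a fixed refresh, each frame waits for the next refresh tick."""
    interval = 1000.0 / refresh_hz  # ms between refreshes (~16.7 ms at 60 Hz)
    return [math.ceil(t / interval) * interval for t in frame_ready_ms]

def adaptive_display_times(frame_ready_ms):
    """With adaptive sync, the display refreshes as soon as a frame is done."""
    return list(frame_ready_ms)

# A GPU rendering at a steady 40 FPS finishes a frame every 25 ms.
ready = [10.0, 35.0, 60.0, 85.0]

fixed = fixed_refresh_display_times(ready)
adaptive = adaptive_display_times(ready)

# Frame-to-frame gaps as the viewer sees them:
fixed_gaps = [round(b - a, 1) for a, b in zip(fixed, fixed[1:])]
adaptive_gaps = [round(b - a, 1) for a, b in zip(adaptive, adaptive[1:])]
print(fixed_gaps)     # gaps alternate between ~16.7 and ~33.3 ms -> judder
print(adaptive_gaps)  # steady 25 ms gaps -> smooth 40 FPS
```

    The uneven gaps under fixed refresh are the stutter the review describes; adaptive sync delivers every frame on a steady cadence instead.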

    In this article we'll go through some of the main differences between AMD's and Nvidia's implementations, why monitor OEMs will play a crucial role in FreeSync reaching its full potential, and whether AMD's adaptive sync technology is the better choice for gamers compared to Nvidia's closed G-Sync.

    Read the complete review.

     
  2. madboyv1

    madboyv1 TechSpot Paladin Posts: 1,333   +267

    I'm glad to see that FreeSync seems to (in the author's eyes) live up to the hype and offer a similar experience to G-Sync, and if anything his points of contention were aimed at the display rather than the technology. Adaptive sync is something that should have been implemented years ago, and I eagerly await seeing what the market, especially the display market, does in the near future. I will also keep an eye out for reviews that include Crossfire performance, since I would imagine most people pushing more than 1440p are likely using SLI/Crossfire.
     
    Puiu, Evernessince and Julio Franco like this.
  3. amstech

    amstech TechSpot Enthusiast Posts: 1,455   +606

    Great write-up, graphs and opinions on the technology. I'm pretty impressed with both but I like how AMD doesn't seem to have as many strings attached.
    Fanboys will argue to Mars and back over which one is better but as of right now it looks like they will both be well received.
    The real question for most gamers now is: 1440p @ 120Hz/144Hz or 4K @ 60Hz? Personally, I'm waiting for the top dog GPUs to not suck like they do now, and hopefully the future cards being released have performance that actually merits their price tags.
     
  4. hahahanoobs

    hahahanoobs TS Evangelist Posts: 1,630   +431

    Call me an nVIDIA fanboy (I'm not, but some will call me one after reading this), but even though FreeSync is the "cheaper" option, the execution of the technology as a whole is abysmal.

    -Offering DVI and HDMI alongside DP will cause confusion. There is a reason G-Sync monitors are DP ONLY.
    -Unacceptable minimum refresh rates that you have to hunt for, determined by monitor manufacturers rather than a fixed, or at least reasonable, range implemented across ALL FreeSync-supported monitors.
    -Limited number of supported AMD GPUs.
    -Ghosting.
    -Price. Yea I said it!
    -Drivers...

    I have more issues with the tech, but I'll stop here. The only positive thing I have to say about FreeSync is that it's a good start, but nowhere near where it should be at this stage if G-Sync is its competition.

    And it's not so "cheap" in Canada (yea yea our dollar is weak right now, but GOT DAMN!). Especially for a minimum 56Hz refresh!!! That's just insulting.

    BenQ XL2730Z
    http://www.newegg.ca/Product/Produc...cm_re=BenQ_XL2730Z-_-9SIA7BB2ND6057-_-Product
     
    Last edited: Mar 19, 2015
  5. Uncle Al

    Uncle Al TS Evangelist Posts: 1,660   +767

    You know, if you make that thing curved and replace the Google self-driving car's windshield with it, you could have a much more fun commute every morning and night!
     
  6. madboyv1

    madboyv1 TechSpot Paladin Posts: 1,333   +267

    Disclaimer: I'm not an ATI/AMD fanboy, the last card of theirs I've purchased was the Radeon 7000. lol

    @hahahanoobs

    The point that AMD would like to get across is that FreeSync can be supported in any monitor as long as it supports the Adaptive Sync standard. In other words, it is an option consumers can take advantage of if they want to when purchasing what AMD hopes will be a basic monitor in the future. I doubt including other interconnects will be a point of confusion for anyone getting a monitor specifically to use FreeSync.

    The minimum refresh rate currently is pretty bad, without question. But that again is a problem with the manufacturers, and we're only shortly after "launch." G-Sync has had some time to mature, as well as having the advantage of being marketed specifically for gaming to a much stronger extent than FreeSync (in my opinion). You cannot fault FreeSync specifically for what a manufacturer does, since other than needing to support DP Adaptive Sync they can do whatever they want, but I do hope said manufacturers will figure out that they are severely limiting a feature they will try to market the hell out of.

    Incidentally, the ROG Swift has a minimum vertical refresh rate of 50Hz unless I am reading it wrong, so that's not actually that much better. The Acer XB280HK tops out at 60Hz at its native resolution. So really, I think manufacturers are still trying to figure out how to make these displays right in general. I missed the part about ghosting; where was that in the review?

    The limited number of supported AMD GPUs will be self correcting over time, that is practically a moot point. "Out of box" driver support is lacking for sure, but like all drivers will mature over time. Eventually Crossfire support will be there, and I for one would be interested in how it does.

    I'm sorry the Canadian dollar is weak. That's a painful price, practically a $200 difference compared to the US listing. You can blame the global economy for that, not the tech. However, it does raise the question of why that monitor is expensive to begin with... probably a money grab by BenQ, since it's something new they can market.
     
    GhostRyder, Puiu and Evernessince like this.
  7. Evernessince

    Evernessince TS Evangelist Posts: 1,168   +575

    Literally all the things you listed are on a monitor-to-monitor basis except for drivers.

    I highly doubt you've even had a chance to try free-sync out so you're just shooting up a storm for no reason.
     
    Steve and Puiu like this.
  8. Puiu

    Puiu TS Evangelist Posts: 1,900   +528

    1. offering more than one cable option doesn't cause confusion
    2. refresh rates vary from monitor to monitor. Just wait for a good monitor that suits your needs to come out. (You can get monitors that support 40Hz now.)
    3. as it is with any new technology, it's best to wait until at least the second generation to buy it.
    Right now neither the drivers nor the monitors are ready for prime time.
    4. in just a year from now you will see dozens of FreeSync-compatible monitors, and AMD will also release multiple video cards that support it.

    All reviews say the same thing:
    1. FreeSync works as intended.
    2. It's cheaper. (It's not AMD's fault that Canada has some problems; prices will always vary in every country.)
    3. AMD delivered, even if the technology still needs time to mature.
    4. Just buy G-Sync if you want a more mature tech now.
     
    Steve likes this.
  9. kingmustard

    kingmustard TS Member Posts: 18

    I'll wait for a 24" 1920x1200 IPS monitor before considering FreeSync.
     
  10. Traciatim

    Traciatim TS Rookie

    Does either of these variable sync techs work when the application is not run in full-screen mode? I've heard that G-Sync does not, and I'd like to hear some confirmation.
     
  11. madboyv1

    madboyv1 TechSpot Paladin Posts: 1,333   +267

  12. kingmustard

    kingmustard TS Member Posts: 18

  13. hahahanoobs

    hahahanoobs TS Evangelist Posts: 1,630   +431

    G-Sync does 30Hz minimum across the board. When you buy G-Sync you get the same experience every other G-Sync user gets, and that's how it's done.

    AMD just had to raise their hand and say "Me too" in response to G-Sync, and then effed it all up with their launch. You can put all your blame on monitor manufacturers all you want, but that argument is weak at best. AMD has a history of innovating, and then getting outdone by their competition. This is another example of that.

    G-Sync was a hit with their demo, their DIY kit and their final product (module). Everything about G-Sync has been positive. No one can deny that. The only detractor is the price, but I'd rather pay extra for something I know works over something half-assed.
     
    Last edited: Mar 19, 2015
  14. Puiu

    Puiu TS Evangelist Posts: 1,900   +528

    In a year or two I'll look back at your posts and laugh so much. So much stubbornness over a gimmick.

    here's how it should be done (nvidia does it wrong):
    1. all monitors should be able to work with all GPU makers
    2. make it an industry standard and ensure that nobody has to pay royalty fees. this means that you will find both high end and low end monitors with freesync support
    3. attract the biggest OEMs (Samsung, LG, BenQ, etc.) and start a long-term production plan, ensuring you will have a good variety of monitors (prices, performance, panels, etc.) that aren't just focused on gaming
     
    GhostRyder likes this.
  15. madboyv1

    madboyv1 TechSpot Paladin Posts: 1,333   +267

    Gotcha. Let me know if you catch wind of one, FreeSync or G-Sync.
     
  16. Lionvibez

    Lionvibez TS Evangelist Posts: 1,101   +345

    I would choose 1200p, 1440p, or 1600p at 60Hz with a FreeSync monitor over the above.

    Both are still way too taxing on single GPUs, and I'm not much into SLI or Xfire.

    I don't want to spend $500+ on GPUs to push 4K or 120/144Hz.

    When single GPUs get powerful enough to handle 4K at 60Hz, then I would reconsider.
     
    Last edited: Mar 20, 2015
  17. Lionvibez

    Lionvibez TS Evangelist Posts: 1,101   +345

    You are an Nv fanboy after reading those two posts.

    Nothing more really needs to be said.
     
    GhostRyder and Steve like this.
  18. CaptainTom

    CaptainTom TS Booster Posts: 157   +61

    Waiting for a 76Hz 5k or a 120Hz 4k display...
     
  19. CaptainTom

    CaptainTom TS Booster Posts: 157   +61

    He was right about one thing: we now think he is an nVIDIA fanboy!
     
    Evernessince likes this.
  20. hahahanoobs

    hahahanoobs TS Evangelist Posts: 1,630   +431

    Is that what they call smart shoppers nowadays? Enjoy your inferior 48 and 56Hz minimums.

    MSI GeForce FX 5200 128MB
    XFX GeForce 6600 LE 256MB*
    BFG GeForce 6600 GT 256MB
    MSI GeForce 7600 LE 256MB*
    BFG GeForce 7900 GS 512MB (x2 SLi)
    Zotac GeForce 8800 GT 512MB
    Sapphire Radeon 4830 512MB
    Diamond Radeon 4850 1GB*
    Sapphire Radeon Vapor-X 5770 1GB
    Sapphire Radeon 6950 2GB (x2 Crossfire)
    Sapphire 7870 Gigahertz Edition 2GB
    MSI GTX 970 4GB

    What part of that list screams nVIDIA fanboy to you? Hmm?
    *Cards I bought used.
     
    Last edited: Mar 19, 2015
  21. Scorpus

    Scorpus TechSpot Staff Topic Starter Posts: 1,827   +188

    Honestly, I didn't test this; however, I believe it doesn't work outside of full-screen mode in either implementation.
     
  22. Scorpus

    Scorpus TechSpot Staff Topic Starter Posts: 1,827   +188

    No it hasn't. G-Sync adds a significant price to a monitor, which FreeSync doesn't. It's also less flexible than FreeSync, which I described in the review that you probably didn't read.

    There is nothing "half-assed" about FreeSync. It works. But some monitor manufacturers haven't implemented it ideally, as in the case of LG. Grab a different monitor, like those from BenQ or Acer, and it's the same experience as G-Sync.
     
    Puiu, GhostRyder and Evernessince like this.
  23. hahahanoobs

    hahahanoobs TS Evangelist Posts: 1,630   +431

    "Everything about G-Sync has been positive. No one can deny that. The only detractor is the price, but I'd rather pay extra for something I know works over something half-assed."

    I covered this.
     
  24. Evernessince

    Evernessince TS Evangelist Posts: 1,168   +575

    So right now your sticking point is that the current monitor line-up doesn't provide a uniform experience? Somehow this equates to half-assed to you?

    If anything, the fact that monitor manufacturers can modify the monitor's FreeSync range to their hearts' content is a positive for both the customer and the company. If companies don't want to spend as much on a tech that's gaming-only, they won't have to. On the other hand, you will never see G-Sync on value monitors because of its massive cost. AMD's tech is the only one suitable for mass adoption because of this.

    What you are doing is trying to take the few early samples that we have and make the whole FreeSync brand look bad in turn. In reality, the tech has been released, it works as intended, and it has more features than G-Sync. The only point you have against it is the small sync range in which current monitors operate. Well hell, if that's the only issue right now, I sure as heck will be looking for your reaction when cheap monitors come out supporting frame rates well below 30 FPS. Then you can tell me how half-baked the tech is.
     
    GhostRyder likes this.
  25. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 8,548   +2,894

    You speak as if technology never comes down in cost. Remind me again, what was the cost of SSDs during their first year on the market?

    I'm happy Adaptive-sync is becoming a standard. But without the introduction of G-sync, I wonder where we would be with this standard. Would it have even been a thought?

    Ohh, and I wish people would stop referring to FreeSync as free. Hardware changes, no matter how small, still have a cost. It's just that the DP revision isn't costing AMD anything; AMD played the waiting game for a new DP revision. If it wasn't for nVidia, we would still be waiting for a half-baked solution. At least now we have a solution that is not half-baked, thanks to both nVidia and AMD for opening the door and bringing demand to the table.
     
    madboyv1 likes this.
