
FreeSync monitors now available, and they're cheaper than G-Sync equivalents

By Scorpus · 59 replies
Mar 6, 2015
  1. ChuckyDhaBeast

    ChuckyDhaBeast TS Enthusiast Posts: 50   +29

    No multi-GPU drivers yet?
    They have been talking about this for so long, and yet there's still no CrossFire support.
    That is disappointing.
     
  2. Scorpus

    Scorpus TechSpot Staff Topic Starter Posts: 1,971   +231

    The FreeSync spec can do a much wider refresh rate range than G-Sync; from memory it's 9 to 240 Hz or something like that. It's up to monitor manufacturers to build displays that can actually go down that low or up that high.

    If you see a monitor that doesn't go down to 30 Hz (a reasonable rate), blame the manufacturer.
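    To make that concrete, here's a minimal sketch (the panel numbers are hypothetical) of the gap between what the spec permits and what a given panel actually supports:

        SPEC_MIN_HZ, SPEC_MAX_HZ = 9, 240   # roughly what the Adaptive-Sync spec permits

        def in_vrr_window(fps, panel_min_hz, panel_max_hz):
            # True if this panel can match the GPU's frame rate 1:1
            return panel_min_hz <= fps <= panel_max_hz

        # Two hypothetical "FreeSync" panels with very different usable ranges:
        print(in_vrr_window(35, panel_min_hz=30, panel_max_hz=144))  # True
        print(in_vrr_window(35, panel_min_hz=48, panel_max_hz=75))   # False: falls back to fixed refresh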
     
    Evernessince likes this.
  3. Awww man, but I like sh*tting all over AMD without waiting for review sites to do comparative reviews. Now I have to think about the monitor manufacturers?!? This is all getting too much for me D:
     
  4. hahahanoobs

    hahahanoobs TS Evangelist Posts: 2,455   +867

    G-Sync Mobile will not have the same features as monitors with the module do (or will).
     
  5. hahahanoobs

    hahahanoobs TS Evangelist Posts: 2,455   +867

    I understand that, but it's still listed as a FreeSync monitor, and AMD could still get the blame in some form since it's their name behind it. Users may just go G-Sync instead, or wait for VRR with DP 1.3 if it can get down to ~30Hz minimums as standard on all VRR monitors.

    Techies in the know may verify the minimum refresh rate before purchase, but that's not standard practice for common folk. When you buy a G-Sync monitor you get the full experience without doing homework on the technical jargon, and for those people the premium could be justified just on knowing the tech will work.
     
    Last edited: Mar 9, 2015
  6. Evernessince

    Evernessince TS Evangelist Posts: 3,913   +3,364

    Yeah, cuz Nvidia is so much more likely to tell the truth. Tell me this: why would AMD release a tech that isn't ready for everyday use, just to have it blow up in their face?

    Thinking, it helps.
     
  7. Evernessince

    Evernessince TS Evangelist Posts: 3,913   +3,364

    Both of them do the same thing (remove screen tearing). The only difference is that FreeSync doesn't really cost any money, while G-Sync adds $200 onto the cost. You could buy a FreeSync monitor for around the same price as a non-FreeSync one, so the barrier to entry is pretty much nil.
     
  8. Evernessince

    Evernessince TS Evangelist Posts: 3,913   +3,364

    Screen tearing only occurs when your GPU renders more frames than your monitor can display (sketched below).
    Thanks for the tip. Once you guys get your hands on a FreeSync monitor, a FreeSync information round-up should accompany the review, sort of like a "What we knew about FreeSync and what we got".
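    As a rough illustration of that mismatch (illustrative numbers, not real driver code): if the GPU flips to a new frame while a fixed-rate panel is mid-scan, the top and bottom of the screen come from different frames.

        REFRESH_HZ = 60
        SCANOUT_MS = 1000 / REFRESH_HZ      # one fixed refresh cycle = ~16.7 ms

        def flips_during_scanout(frame_times_ms, scan_start_ms):
            # Frame flips landing inside one scanout window -> one tear line each
            scan_end = scan_start_ms + SCANOUT_MS
            return [t for t in frame_times_ms if scan_start_ms < t < scan_end]

        # GPU rendering at ~120 fps (a new frame every ~8.3 ms), no sync:
        frame_times = [i * 8.3 for i in range(1, 30)]
        print(len(flips_during_scanout(frame_times, scan_start_ms=100.0)), "tear(s) this refresh")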
     
    Puiu likes this.
  9. Burty117

    Burty117 TechSpot Chancellor Posts: 3,487   +1,288

    Well for starters, AMD's current CPU lineup is basically a joke, so I wouldn't put it past them to release a display tech under the "FreeSync" banner and have it not work out for them.

    And the name itself is a bit of an oxymoron: "FreeSync", yet you have to buy a pretty expensive screen to gain access. Your argument that it's "$150 cheaper" is pretty moot when the cheapest screen is $600. What is $150 to people already spending that kind of money in the first place?

    Besides, not many people have tried FreeSync and actually come away talking much about it; G-Sync, on the other hand, Nvidia actively shows off. There's a reason for this: no way does Nvidia's chip in the screen just sync frames and refresh rates. It has dedicated memory on board for it. I bet once people get to actually play games on "FreeSync" we'll find it actually isn't as good as G-Sync, maybe due to frame timing, or due to the fact that current "FreeSync" monitors bottom out at 54Hz. Only time will tell.
     
  10. Puiu

    Puiu TS Evangelist Posts: 3,373   +1,823

    According to this article, it seems that every monitor will have its own minimum refresh rate:
    " Meanwhile AMD also took a moment to briefly address the issue of minimum framerates and pixel decay over time, stating that the minimum refresh rate each monitor supports will be on a per-monitor basis, and that it will depend on how quickly pixels decay. The most common outcome is that some displays will have a minimum refresh rate of 30Hz (33.3ms) and others with pixels quicker to decay will have a 40Hz (25ms) minimum."
     
  11. Scorpus

    Scorpus TechSpot Staff Topic Starter Posts: 1,971   +231

    This will be sooner than you think ;)

    From what I've seen at trade shows of both G-Sync and FreeSync, both seem pretty damn similar
     
    Evernessince likes this.
  12. Evernessince

    Evernessince TS Evangelist Posts: 3,913   +3,364

    I think you need to read some of the articles you linked. Your Google link of screen tearing at 30 fps proves my point that screen tearing occurs when the monitor cannot display the frames being handed to it. When the GPU is rendering less than 60 fps, sometimes it isn't able to render everything at once, and information from one frame and another gets mixed.
     
  13. Evernessince

    Evernessince TS Evangelist Posts: 3,913   +3,364

    Coming into a thread about FreeSync and bringing up something totally irrelevant like AMD's CPUs is always a good way to start off your comment. By your logic, any entity with a less-than-stellar product in their portfolio is now inherently evil.

    "your argument that it's "$150 cheaper" is pretty moot when the cheapest screen is $600."

    Oh hey, I have a $600 monitor and can say that $150 for such a minor improvement isn't worth it. For some reason you seem to speak against saving money even though you're commenting on something you aren't in a position to buy anyway.

    "What is $150 to people thinking so spending such money in the first place?"

    It's pretty sad that you're referencing other people and not yourself here. Your opinion isn't even worth dirt if you're just assuming what someone like me (ya know, people who spend big bucks on monitors) wants.

    Your last paragraph is the funniest though.

    "I bet once people get to actually play games on "freesync" we'll find it actually isn't as good as G-Sync, maybe due to frame timing"

    Yes, it would be a pretty big issue if frame timing was off. It's kinda hard to mess that up, though, since it's the sole function of FreeSync. It's also already been accepted as a DisplayPort standard, so I HIGHLY doubt they are going to put anything flawed in there. It's VESA's word vs. Nvidia's.

    "or due to the fact current "freesync" monitors support the lowest of 54Hz"

    If you took the time to read any of the comments here, the author stated that FreeSync supports a much wider range than G-Sync. So much for that premium Nvidia euphoria. That's not gonna soften the blow for those who've already spent.

    I've thrashed every one of your points; only God knows why you decided to write such a one-sided piece.
     
  14. Evernessince

    Evernessince TS Evangelist Posts: 3,913   +3,364

    Take time to comprehend my comment; that's what I am saying. At lower FPS your monitor is handed extra bits of information from the last frame, so you can see content from multiple frames at once.

    Aside from the fact that you avoid any form of actual substance, you debase yourself by playing the fanboy card.

    If I remember correctly, Mantle has more than achieved its goal: it's gotten Microsoft to adopt a low-level API, it gives major performance boosts, and it's in quite a few games. 'Nuff said.

    ohh wait ....

    Oh dang, someone fails again. You're on a roll too.

    That must be your second account. Why else would you act like those words are your own and reply directly to it? I guess someone has to agree with you, and it might as well be yourself.
     
  15. Puiu

    Puiu TS Evangelist Posts: 3,373   +1,823

    I'm not a fanboy; I just don't like people hating on things because they're misinformed or have preconceived notions. I've defended Nvidia in many cases and I've also defended Intel.
    I've proved that monitors with FreeSync will support 30Hz (I think the Acer one already does).
    Why the hell are you so bent on hating something that is hundreds of dollars cheaper, does almost the same thing, and is an industry standard that anyone can adopt?
    Both FreeSync and G-Sync exist, and the competition between the two will improve monitors for all of us (at least until one of them wins the majority of adaptive-sync monitor sales).
     
    Last edited: Mar 7, 2015
    Evernessince likes this.
  16. DancingDirty7

    DancingDirty7 TS Rookie

    Also adding that it's just much more noticeable above the monitor's refresh rate because there are multiple misaligned frames on screen at a given moment of tearing; below the monitor's refresh rate it's just one misaligned frame on top of another, and it happens less often, which is why people think screen tearing only occurs above the monitor's refresh rate. (See the sketch below.)
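    A back-of-the-envelope sketch of that point: the average number of frame flips (and thus tear lines) landing inside one refresh is roughly fps / Hz, so above the panel's rate you get several per refresh, while below it you get at most one, and not on every refresh.

        def avg_flips_per_refresh(fps, hz):
            # Mean number of frame boundaries occurring during one scanout
            return fps / hz

        for fps in (144, 60, 45):
            print(f"{fps} fps on a 60 Hz panel: ~{avg_flips_per_refresh(fps, 60):.2f} flips per refresh")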
     
  17. Evernessince

    Evernessince TS Evangelist Posts: 3,913   +3,364

    Thanks for adding that. I know my original post wasn't entirely clear on screen tearing, and I got trolled by hahahanoobs because of it. Screen tearing is best described as misaligned frames resulting from a difference in timing between the GPU and the monitor; synchronizing the refresh rate with the frame rate solves the issue.
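    A minimal sketch of that fix (the 40-144 Hz window is hypothetical): a variable-refresh panel starts its next scanout when the frame is ready, clamped to the panel's supported range.

        PANEL_MIN_HZ, PANEL_MAX_HZ = 40, 144
        MIN_INTERVAL_MS = 1000 / PANEL_MAX_HZ   # can't refresh faster than the max rate
        MAX_INTERVAL_MS = 1000 / PANEL_MIN_HZ   # can't hold a frame longer than the min rate allows

        def next_refresh_interval(frame_render_ms):
            # Match the refresh to the frame time, within the panel's VRR window
            return min(max(frame_render_ms, MIN_INTERVAL_MS), MAX_INTERVAL_MS)

        for render_ms in (5.0, 12.0, 30.0):     # ~200 fps, ~83 fps, ~33 fps
            print(f"frame took {render_ms} ms -> refresh after {next_refresh_interval(render_ms):.1f} ms")
        # Below the panel's minimum rate the interval is capped, which is why
        # the per-monitor minimums discussed in this thread matter.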
     
    Puiu likes this.
  18. Puiu

    Puiu TS Evangelist Posts: 3,373   +1,823

    Meh, like a kid who doesn't like losing an argument. Just accept that people here gave you logical arguments for everything you said, and don't troll. We're not simpletons here; we all love technology and always make logical decisions based on cold hard facts, not emotions.
     
  19. Puiu

    Puiu TS Evangelist Posts: 3,373   +1,823

    You keep saying "concerns" but you actually have no real arguments.
    What exactly are your so-called "concerns"? Write them here already:
    1. Minimum refresh rate? I already showed you that it's set per monitor.
    2. Driver support? You will get driver support soon, with multi-GPU coming later.
    3. Game support? Games that don't have it in their options menu will be able to use it if you force it from the driver.
    4. Price? It's cheaper.
    5. Performance? There shouldn't be a big difference; we'll have benchmarks right here on TechSpot soon.
    6. Adoption rate? The biggest monitor manufacturers (Samsung, LG, BenQ, Acer, etc.) have or will soon have FreeSync-capable monitors, and the technology just came out.

    What VALID concerns? Your only concern is that AMD is pushing this technology. It's the fact that you don't trust AMD, even though it's an industry standard. I call this trolling.

    I read that article and I also read the comments. The only problem people have raised is that older video cards without DP 1.2a won't support Adaptive-Sync and that the driver is a bit late. There are more comments in favor of FreeSync.

    And this is my first post, since you mentioned that I said something wrong in it:
    ----
    What red flags?
    As far as I know, the only difference is that G-Sync offloads the processing to a second chip in the module (this is why it doesn't need a new scaler) while FreeSync uses the GPU and the new scaler. This should give G-Sync a small boost in max FPS (we'll have to see benchmarks to find out if it's just 1-2 fps or more).
    Didn't Nvidia announce adaptive sync for laptops that doesn't use their module? (aka it uses something similar to FreeSync - people are saying it's actually FreeSync) link
    ----
    Yes, I was so wrong! ^_^

    I don't care if my posts get deleted, but I stand by what I said. You can't give a valid argument for your "concerns". It's all "if it's AMD then they will screw something up in the future" type concerns.

    PS: running away...
     
    Last edited: Mar 8, 2015
  20. hahahanoobs

    hahahanoobs TS Evangelist Posts: 2,455   +867

    According to AMD's own FAQ on FreeSync requirements, only the 295X2, 290X, 290 (TechReport reports the 285 will also work), 260X and 260 are compatible with FreeSync for gaming. Nvidia, on the other hand, has three entire generations' worth of cards that support their tech.

    As far as monitors having different minimum refresh rates go, I'm willing to bet that sub-56Hz FreeSync monitors will cost more. How much more, I don't know, but it could be a turnoff for some people. Big companies rarely release their best version of a product overseas first, and 56Hz might be them testing the market without going all in. So now we wait...

    http://support.amd.com/en-us/search/faq/219

    http://techreport.com/news/27000/amd-only-certain-new-radeons-will-work-with-freesync-displays
     
    Last edited: Mar 9, 2015
  21. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 11,286   +4,941

    I read somewhere that the only reason the module is needed on the monitor is because of communication limitations in the cable connection specifications - much like what you say is needed in DP 1.2a but can't be implemented in DVI or HDMI - and that laptops don't have this cable limitation, therefore the module is not needed.
     
    Puiu likes this.
  22. Puiu

    Puiu TS Evangelist Posts: 3,373   +1,823

    Indeed, the cable limitations are gone; laptops use Embedded DisplayPort (eDP), which supports variable refresh rates. I think the only limitations are the display's min/max refresh rate limits and the scalers.
     
  23. hahahanoobs

    hahahanoobs TS Evangelist Posts: 2,455   +867

    *sigh*

    "At a technical level, the G-Sync module works by manipulating a display's VBLANK (vertical blanking interval), which is the time between the display drawing the last line of the current frame and drawing the first line of the next frame. During this VBLANK, the display holds the current frame before beginning to draw a new one." AKA, it replaces the traditional scaler.


    Mobile G-Sync still requires additional hardware in the form of a T-Con chip to help with syncing frames. It is NOT a software-only solution. And again, mobile G-Sync is not a copy-and-paste of what is in monitors. You will NOT get the same performance or features as you would WITH a module.
     
  24. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 11,286   +4,941

    Nor is FreeSync, so what is your point?

    My point was that the chip is not needed on the monitor side of the interface and can be integrated into the GPU. In a desktop, the GPU cannot control VBLANK through the cable specifications, requiring the module to be on the monitor side. This lack of integration on the desktop will likely keep what could be integrated on the laptop as a separate component.
     
  25. hahahanoobs

    hahahanoobs TS Evangelist Posts: 2,455   +867

    The point is it's not a bandwidth limitation. VBLANK is done via a hardware SCALER. Google it.
    For someone who "read it somewhere" you seem pretty sure of yourself. Sorry to disappoint you.

    While you're there, Google what a T-Con chip is.

    Edit: spelling (you're)
     
    Last edited: Mar 9, 2015
