FreeSync monitors now available, and they're cheaper than G-Sync equivalents

No multi-GPU drivers yet?
They have been talking about this for so long, and yet there's no CrossFire support.
That is disappointing.
 
The FreeSync spec can do a much wider refresh rate range than G-Sync; from memory it's 9 to 240 Hz or something like that. It's up to monitor manufacturers to build displays that can actually go down that low or up that high.

If you see a monitor that doesn't go down to 30 Hz (a reasonable rate), blame the manufacturer.
 
The FreeSync spec can do a much wider refresh rate range than G-Sync; from memory it's 9 to 240 Hz or something like that. It's up to monitor manufacturers to build displays that can actually go down that low or up that high.

If you see a monitor that doesn't go down to 30 Hz (a reasonable rate), blame the manufacturer.

Awww man, but I like sh*tting all over AMD without waiting for review sites to do comparative reviews. Now I have to think about the monitor manufacturers?!? This is all getting too much for me D:
 
LOL, no red flags at Nvidia? Didn't you see that they are bringing G-Sync to laptops, and that the laptops will not require a "G-Sync module"? This essentially proves that the module is nothing more than DRM and that Nvidia is milking its customers as usual.

G-Sync Mobile will not have the same features as monitors with the module do (or will).
 
The FreeSync spec can do a much wider refresh rate range than G-Sync; from memory it's 9 to 240 Hz or something like that. It's up to monitor manufacturers to build displays that can actually go down that low or up that high.

If you see a monitor that doesn't go down to 30 Hz (a reasonable rate), blame the manufacturer.

I understand that, but it's still listed as a FreeSync monitor, and AMD could still get the blame in some form since it's their name behind it. Users may just go G-Sync instead, or wait for VRR with DP 1.3 if it can get down to ~30 Hz minimums as standard on all VRR monitors.

Techies in the know may verify the minimum refresh rate before purchase, but that's not standard practice for common folk. When you buy a G-Sync monitor you get the full experience without doing homework on the technical jargon, and for those people the premium could be justified just by knowing the tech will work.
 
Can we get a better comparison pic with two very similar monitors that have the same screens? As in both glossy or both matte?
We shouldn't trust the AMD demonstration anyway. They would choose a scene and settings that make their tech look really great when it may not look that way in everyday use. Much like how the big box stores used to always show slow-moving animated films (Finding Nemo was always on, it seemed) to demonstrate their HD TVs. Animated films don't show color problems and detail like live action can. This isn't dishonest, it just might not be realistic for your own use.

I would imagine as soon as this tech is mainstream there will be plenty of videos and reviews comparing the two. Especially since one technology is significantly cheaper. We all want to know if the performance is similar between the two or if Nvidia will continue to be able to justify their price.

Yeah, cuz Nvidia is so much more likely to tell the truth. Tell me this: why would AMD release a tech that isn't going to hold up in everyday use, just to have it blow up in their face?

Thinking, it helps.
 
I don't know about you lot, but G-Sync and FreeSync are technologies that haven't caught my attention one little bit, though to be honest I haven't seen either in action live.

Both of them do the same thing (remove screen tearing). The only difference is that FreeSync doesn't really cost any extra money, while G-Sync adds $200 onto the cost. You could buy a FreeSync monitor for around the same price as an equivalent non-FreeSync one, so the barrier to entry is pretty much nil.
 
I am not talking about FreeSync in general, but about that monitor. FreeSync can go as low as 24 Hz, I think.

Yeah, I just looked. 24-144 Hz vertical and horizontal if using VGA/HDMI on that BenQ.
http://gaming.benq.com/gaming-monitor/xl2730z/specification/#skip
Just to confirm, FreeSync doesn't work at all over VGA/HDMI, only DisplayPort.

Edit: Just had a look at your link; it does state 56-144 Hz on DisplayPort. This is troublesome, as it looks like FreeSync only works above 56 fps in games.

I've seen G-Sync work between 30-60 fps (my brother has the ROG Swift) and it works wonders at those frame rates. If this is the case, that's a mighty shame :/

Screen tearing only occurs when your GPU has more frames than your monitor can display.
The FreeSync spec can do a much wider refresh rate range than G-Sync; from memory it's 9 to 240 Hz or something like that. It's up to monitor manufacturers to build displays that can actually go down that low or up that high.

If you see a monitor that doesn't go down to 30 Hz (a reasonable rate), blame the manufacturer.

Thanks for the tip. Once you guys get your hands on a FreeSync monitor, a FreeSync information round-up should accompany the review. Sort of like a "What we knew about FreeSync and what we got".
 
Tell me this: why would AMD release a tech that isn't going to hold up in everyday use, just to have it blow up in their face?

Thinking, it helps.
Well, for starters, AMD's current CPU line-up is basically a joke, so I wouldn't put it past them to release a display tech under the banner of "FreeSync" and have it not work out for them.

And the name itself is a bit of an oxymoron: "FreeSync", yet you have to buy a pretty expensive screen to gain access. Your argument that it's "$150 cheaper" is pretty moot when the cheapest screen is $600. What is $150 to people already thinking of spending that kind of money in the first place?

Besides, not many people have tried FreeSync and actually come away talking much about it; G-Sync, on the other hand, Nvidia actively shows off. There is a reason for this: no way does Nvidia's chip in the screen just sync frames and refresh rates. It has dedicated memory on board for it. I bet once people actually get to play games on "FreeSync" we'll find it isn't as good as G-Sync, maybe due to frame timing or due to the fact that current "FreeSync" monitors bottom out at 54 Hz. Only time will tell.
 
Oh snap! So it is worthless at a 56 Hz minimum, at least for the BenQ named in this article. Wow AMD... just wow. I see why the driver is coming after the monitors go on sale now.

#redflag
According to this article, it seems that every monitor will have its own minimum refresh rate:
" Meanwhile AMD also took a moment to briefly address the issue of minimum framerates and pixel decay over time, stating that the minimum refresh rate each monitor supports will be on a per-monitor basis, and that it will depend on how quickly pixels decay. The most common outcome is that some displays will have a minimum refresh rate of 30Hz (33.3ms) and others with pixels quicker to decay will have a 40Hz (25ms) minimum."
 
Thanks for the tip. Once you guys get your hands on a FreeSync monitor, a FreeSync information round-up should accompany the review. Sort of like a "What we knew about FreeSync and what we got".

This will be sooner than you think ;)

From what I've seen of both G-Sync and FreeSync at trade shows, both seem pretty damn similar.
 

I think you need to read some of the articles you linked. Your Google link of screen tearing at 30 fps proves my point that screen tearing occurs when the monitor cannot display the frames being handed to it. When the GPU is rendering at less than 60 fps, sometimes the GPU isn't able to render everything at once and information from one frame and the next gets mixed.
 
Well, for starters, AMD's current CPU line-up is basically a joke, so I wouldn't put it past them to release a display tech under the banner of "FreeSync" and have it not work out for them.

And the name itself is a bit of an oxymoron: "FreeSync", yet you have to buy a pretty expensive screen to gain access. Your argument that it's "$150 cheaper" is pretty moot when the cheapest screen is $600. What is $150 to people already thinking of spending that kind of money in the first place?

Besides, not many people have tried FreeSync and actually come away talking much about it; G-Sync, on the other hand, Nvidia actively shows off. There is a reason for this: no way does Nvidia's chip in the screen just sync frames and refresh rates. It has dedicated memory on board for it. I bet once people actually get to play games on "FreeSync" we'll find it isn't as good as G-Sync, maybe due to frame timing or due to the fact that current "FreeSync" monitors bottom out at 54 Hz. Only time will tell.

Coming into a thread about FreeSync and bringing up something totally irrelevant like AMD's CPUs is always a good way to start off your comment. By your logic, any entity with a less-than-stellar product in their portfolio is now inherently evil.

"your argument that it's "$150 cheaper" is pretty moot when the cheapest screen is $600."

Oh hey, I have a $600 monitor and can say that $150 for such a minor improvement isn't worth it. For some reason you seem to speak against saving money even though you are commenting on something you aren't in a position to buy anyway.

"What is $150 to people thinking so spending such money in the first place?"

It's pretty sad that you're referencing other people and not yourself here. Your opinion isn't even worth dirt if you're just assuming what someone like me (ya know, people who spend big bucks on monitors) wants.

Your last paragraph is the funniest though.

"I bet once people get to actually play games on "freesync" we'll find it actually isn't as good as G-Sync, maybe due to frame timing"

Yes, it would be a pretty big issue if frame timing were off. It's kinda hard to mess that up, though, since it's the sole function of FreeSync. It's kinda already been accepted as a DisplayPort standard too, so I HIGHLY doubt they are going to put anything flawed in there. It's VESA's word vs. Nvidia's.

"or due to the fact current "freesync" monitors support the lowest of 54Hz"

If you had taken the time to read any of the comments here, you'd know the author stated that FreeSync supports a much wider range than G-Sync. So much for that premium Nvidia euphoria. That's not gonna soften the blow for those who already paid.

I've thrashed every one of your points; only God knows why you decided to write such a one-sided piece.
 
"Screen tearing only occurs when your GPU has more frames that your monitor can display."

You're dead wrong. You can get tearing below AND above your monitor's refresh rate.

Take the time to comprehend my comment. That's what I am saying: at lower FPS your monitor is handed extra bits of information from the last frame, so you can see multiple frames of content at once.

Do what you want man. I said all I had to say. I'm not gonna fight with every AMD fanboy on this one.

Aside from the fact that you avoid any form of actual substance, you debase yourself by playing the fanboy card.

Mantle. Nuff said.

If I remember correctly, Mantle has more than achieved its goal. It's gotten Microsoft to adopt a low-level API and it gives major performance boosts. It's also in quite a few games. Nuff said.

And if you took the time to read the comments here, you'd know that no one said that.

ohh wait ....

I bet once people actually get to play games on "FreeSync" we'll find it isn't as good as G-Sync, maybe due to frame timing or due to the fact that current "FreeSync" monitors bottom out at 54 Hz. Only time will tell.

Oh dang, someone fails again. You're on a roll too.

That must be your second account. Why else would you act like those words are your own and reply directly to them? I guess someone has to agree with you, and it might as well be yourself.
 
Do what you want man. I said all I had to say. I'm not gonna fight with every AMD fanboy on this one.
I'm not a fanboy, but I just don't like people hating on things just because they are misinformed or have preconceived notions. I've defended Nvidia in many cases and I've also defended Intel.
I've shown that monitors with FreeSync will support 30 Hz (I think the Acer one already does).
Why the hell are you so bent on hating something that is hundreds of dollars cheaper, does almost the same thing and is an industry standard that anyone can adopt?
Both FreeSync and G-Sync exist, and the competition between the two will improve monitors for all of us (at least until one of them wins the majority of the market share in adaptive-sync-capable monitor sales).
 
"Screen tearing only occurs when your GPU has more frames that your monitor can display."

You're dead wrong. You can get tearing below AND above your monitor's refresh rate.
Also adding that it's just much more noticeable above the monitor's refresh rate, because there are multiple misaligned frames on the screen in a given moment of screen tearing, while below the monitor's refresh rate it's just one misaligned frame on top of another and it happens less frequently, so people think screen tearing only occurs above the monitor's refresh rate.
 
Also adding that it's just much more noticeable above the monitor's refresh rate, because there are multiple misaligned frames on the screen in a given moment of screen tearing, while below the monitor's refresh rate it's just one misaligned frame on top of another and it happens less frequently, so people think screen tearing only occurs above the monitor's refresh rate.

Thanks for adding that. I know my original post wasn't entirely clear on screen tearing and I got trolled by hahahanoobs because of that. Screen tearing should just be described as misaligned frames resulting from a difference in timing between the GPU and the monitor. Synchronizing the refresh rate with the frame rate solves the issue.
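For anyone who wants to see the timing argument instead of arguing about it, here's a toy simulation I threw together (my own sketch with made-up frame times, not anything from AMD or Nvidia). A tear is just a buffer flip landing in the middle of a scanout, and that can happen whether the GPU runs faster or slower than the fixed refresh.

    # Toy model: the monitor scans out every 16.7 ms (60 Hz); the GPU flips
    # the front buffer whenever a frame finishes rendering. A flip that lands
    # mid-scanout means parts of two frames end up on screen: a tear.
    REFRESH_MS = 1000.0 / 60.0

    def count_torn_refreshes(frame_times_ms, scanout_ms=REFRESH_MS):
        flips, t = [], 0.0
        for ft in frame_times_ms:          # absolute timestamps of buffer flips
            t += ft
            flips.append(t)
        torn = 0
        n_scans = int(flips[-1] // scanout_ms) + 1
        for s in range(n_scans):
            start, stop = s * scanout_ms, (s + 1) * scanout_ms
            # any flip strictly inside the scanout window splits that refresh
            if any(start < f < stop for f in flips):
                torn += 1
        return torn

    slow_gpu = [25.0] * 40    # ~40 fps, below 60 Hz: still produces tears
    fast_gpu = [10.0] * 100   # ~100 fps, above 60 Hz: tears on nearly every refresh
    print(count_torn_refreshes(slow_gpu), count_torn_refreshes(fast_gpu))

With adaptive sync the scanout interval follows the frame time instead of a fixed 16.7 ms, so every flip lands on a refresh boundary and the torn count drops to zero (within the panel's supported range, anyway).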
 
I'll just let my comments speak for themselves. I have nothing to correct in anything I've said here.
I was having a nice talk with four other gentlemen before you rudely interrupted. You didn't get trolled, you got schooled. ;)

Meh, like a kid who doesn't like losing an argument. Just accept that people here gave you logical arguments for everything you said, and don't troll. We're not simpletons here; we all love technology and make logical decisions based on cold hard facts, not emotions.
 
So when does tearing happen again...?

If you want someone to argue with, create an account and go bother these people that share my VALID concerns with FreeSync:
http://www.guru3d.com/news-story/single-gpu-amd-freesync-driver-march-19th.html
You keep saying concerns, but you actually have no real arguments.
What exactly are your so-called "concerns"? Write them here already.
1. Minimum refresh rate? I already showed you that it's monitor based.
2. Driver support? You will get driver support soon, with multi-GPU coming later.
3. Game support? Games that don't have it in their options menu will be able to use it if you force it from the driver.
4. Price? It's cheaper.
5. Performance? There should not be a big difference. We'll have benchmarks right here on TechSpot soon.
6. Adoption rate? The biggest monitor manufacturers (Samsung, LG, BenQ, Acer, etc.) have or will soon have FreeSync-capable monitors. The technology just came out.

What VALID concerns? Your only concern is that AMD is pushing this technology. It's the fact that you don't trust AMD, even though it's an industry standard. I call this trolling.

I read that article and I also read the comments. The only problem people have seen is that older video cards that don't have DP 1.2a won't support adaptive sync, and that the driver is a bit late. There are more comments in favor of FreeSync.

And this is my first post, since you mentioned that I said something wrong in my first post:
----
What red flags?
As far as I know, the only difference is that G-Sync offloads the processing to a second chip in the module (this is why it doesn't need a new scaler) while FreeSync uses the GPU and the new scaler. This should give G-Sync a small boost in max FPS (we'll have to see benchmarks to see whether it's just 1-2 fps or more).
Didn't Nvidia announce adaptive sync for laptops that doesn't use their module? (aka it uses something similar to FreeSync; people are saying it's actually FreeSync) link
---
Yes, I was so wrong! ^_^

I don't care if my posts get deleted, but I stand by what I said. You can't give a valid argument for your "concerns". It's all "if it's AMD then they will screw something up in the future" type of concerns.

PS: running away...
 
That is a red flag, but I'm more inclined to blame BenQ. What is the point of designing a monitor where the Adaptive-Sync DisplayPort signal only goes down to 56 Hz?!

I almost think something is wrong; that stat is so bad it's borderline useless. Someone should contact BenQ to clarify. We know other monitors are supposed to have lower refresh floors over DisplayPort, so unless this is some sort of error, no one should buy this monitor for FreeSync use.

According to AMD's own FAQ on FreeSync requirements, only the 295X2, 290X, 290 (TR reports the 285 will also work), 260X and 260 are compatible with FreeSync for gaming. Nvidia, on the other hand, has three entire generations' worth of cards that support their tech.

As far as monitors having different minimum refresh rates go, I'm willing to bet that sub-56 Hz FreeSync monitors will cost more. How much more, I don't know, but it could be a turnoff for some people. Big companies rarely release their best version of a product overseas first, and 56 Hz might be them testing the market without going all in. So now we wait...

http://support.amd.com/en-us/search/faq/219

http://techreport.com/news/27000/amd-only-certain-new-radeons-will-work-with-freesync-displays
 
Didn't Nvidia announce adaptive sync for laptops that doesn't use their module? (aka it uses something similar to FreeSync; people are saying it's actually FreeSync)
I read somewhere that the only reason the module is needed on the monitor side is because of communication limitations in the cable connection specifications, much like what you say is needed in DP 1.2a but can't be implemented over DVI or HDMI. Laptops don't have this cable limitation, therefore the module is not needed.
 
I read somewhere that the only reason the module is needed on the monitor side is because of communication limitations in the cable connection specifications, much like what you say is needed in DP 1.2a but can't be implemented over DVI or HDMI. Laptops don't have this cable limitation, therefore the module is not needed.
Indeed, the cable limitations are gone; laptops use Embedded DisplayPort, which supports variable refresh rates. I think the only limitations are the display's min/max refresh rate limits and the scalers.
 
I read somewhere that the only reason the module is needed on the monitor side is because of communication limitations in the cable connection specifications, much like what you say is needed in DP 1.2a but can't be implemented over DVI or HDMI. Laptops don't have this cable limitation, therefore the module is not needed.

*sigh*

"At a technical level, the G-Sync module works by manipulating a display's VBLANK (vertical blanking interval), which is the time between the display drawing the last line of the current frame and drawing the first line of the next frame. During this VBLANK, the display holds the current frame before beginning to draw a new one." AKA, it replaces the traditional scaler.

Mobile G-Sync still requires additional hardware in the form of a T-Con chip to help with syncing frames. It is NOT a software-only solution. And again, mobile G-Sync is not a copy-and-paste of what is in monitors. You will NOT get the same performance or features as you would WITH a module.
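Setting aside whose silicon does it, the control loop that quote describes is the same idea in both camps: after a refresh, the display sits in VBLANK until either the GPU has a new frame ready or the panel's maximum hold time (its minimum refresh rate) runs out. Here's a rough sketch of that logic in Python; the 30-144 Hz range and the function are just illustrative assumptions, not numbers from the G-Sync or Adaptive-Sync specs.

    # Toy variable-refresh controller: hold the current frame in VBLANK until
    # a new frame arrives, but never hold longer than the panel allows
    # (pixel decay sets the minimum refresh rate).
    MIN_HZ, MAX_HZ = 30.0, 144.0          # example panel range
    MAX_HOLD_MS = 1000.0 / MIN_HZ         # ~33.3 ms before a forced redraw
    MIN_FRAME_MS = 1000.0 / MAX_HZ        # ~6.9 ms: panel can't refresh faster

    def next_refresh_ms(frame_ready_in_ms):
        """Time from the last refresh until the panel draws again."""
        if frame_ready_in_ms < MIN_FRAME_MS:
            return MIN_FRAME_MS           # frame finished too fast: wait for the panel
        if frame_ready_in_ms <= MAX_HOLD_MS:
            return frame_ready_in_ms      # ideal case: refresh exactly when the frame is ready
        return MAX_HOLD_MS                # frame is late: redraw the old frame to avoid pixel decay

    for ready in (5.0, 12.0, 33.0, 50.0):
        print(f"frame ready in {ready} ms -> refresh after {next_refresh_ms(ready):.1f} ms")

Whether that loop runs on a module with its own memory, on a T-Con, or in the GPU's display engine is exactly the hardware split being argued about here.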
 
It is NOT a software-only solution.
Nor is FreeSync, so what is your point?

My point was that the chip is not needed on the monitor side of the interface and can be integrated into the GPU. In a desktop the GPU cannot control the VBLANK through the cable specifications, which requires the module to be on the monitor side. That lack of integration on the desktop will likely keep what could be integrated on the laptop as a separate component.
 
Nor is FreeSync, so what is your point?

My point was that the chip is not needed on the monitor side of the interface and can be integrated into the GPU. In a desktop the GPU cannot control the VBLANK through the cable specifications, which requires the module to be on the monitor side. That lack of integration on the desktop will likely keep what could be integrated on the laptop as a separate component.

The point is that it's not a bandwidth limitation. VBLANK is handled by a hardware SCALER. Google it.
For someone who "read it somewhere" you seem pretty sure of yourself. Sorry to disappoint you.

While you're there, Google what a T-Con chip is.

Edit: spelling (you're)
 