Nvidia G-Sync technology is coming to laptops, no custom hardware required

Shawn Knight


Nvidia’s G-Sync technology is currently available on a handful of displays, but the cost of entry isn’t cheap. That’s because the current crop of monitors requires custom hardware to work its gaming magic – a requirement that will soon be unnecessary for mobile gamers.

PC Perspective recently got their hands on a leaked alpha driver for the Asus G751 line of notebooks that claimed to enable Nvidia G-Sync. Coincidentally, they happened to have a G751 in for review at the time. A bit skeptical, they installed the driver and sure enough, it appeared to enable G-Sync – all without any special hardware.


After disassembling the laptop in search of custom hardware and finding none, they realized this was a true software solution.

The team then reached out to Nvidia, who confirmed that G-Sync notebooks are indeed coming to market. The driver on hand wasn’t hacked to enable G-Sync; rather, it was provided to OEMs building G-Sync systems for testing purposes. Somehow or other, an Asus Nordic Support rep accidentally leaked the driver, which is how it got out into the wild.

Nvidia also stated that there will be some differences between G-Sync experiences on the desktop versus mobile. Unfortunately, they elected to hold off on discussing the differences until the mobile release of G-Sync. Depending on what those differences are, gamers will likely question the necessity of G-Sync hardware in desktop monitors.


 
LOL the G-Sync module was only ever used to verify 100% that someone was using a Nvidia GPU. What a bunch of crooks...
 
^Looks like someone hasn't heard of sales records or seen a Steam Hardware Survey.

Curious to see the differences between the two solutions.
 
Would be funny if Nvidia G-Sync didn't really need special gear to use it. That would be twice in one month for Nvidia.
 
LOL the G-Sync module was only ever used to verify 100% that someone was using a Nvidia GPU. What a bunch of crooks...
Maybe you should check out some further reading.
The whole reason behind Nvidia pushing G-Sync modules was that VESA's DP 1.2a spec - initially ratified nearly three years ago - didn't have adaptive sync capability and was going to be too slow to implement in any case. In fact, it was only after G-Sync arrived that AMD pushed VESA into including adaptive sync.

Now, as many people are aware, open source is great, but it brings its own inbuilt inertia.
DisplayPort 1.2a was initially ratified April 2012. AMD got adaptive sync added to the spec as a feature in May 2014.
How many FreeSync/Adaptive Sync monitors were available in February 2015, versus how many G-Sync monitors?
It is like pretty much any open source standard versus a proprietary one: one is free and slow to implement, the other expensive and quick to market.

As for G-Sync laptops without a module, that should be a given. Mobile computing uses eDP, not DP 1.2a, and the laptop isn't reliant upon DP in/out, cabling, or desktop monitor scalers.
So basically they are doing freesync ^_^
Basically. But they'll end up calling it Adaptive Sync. FreeSync requires AMD input to say which panel does, or does not, meet the criteria to be allowed to use the FreeSync title. Adaptive Sync just needs the hardware to meet the VESA specification. Asus are marketing an Adaptive Sync monitor which could be FreeSync validated, but they chose to go with the VESA nomenclature instead.
 

Open source does not always see slower adoption than proprietary tech. Android is a testament to that. I would not say that G-Sync has a high adoption rate either. I don't know many people who want to spend $200 extra and be locked into an Nvidia card at the same time. The huge roadblock for Nvidia is that even if they could put the tech in every monitor, the added value is outweighed by the cost and the Nvidia graphics card requirement. Considering laptops, Nvidia controls much less than 50% of the graphics market.

FreeSync doesn't have any of these issues and doesn't add any cost. Nvidia will be forced to drop G-Sync once FreeSync matures in a couple of years.
 
Open source does not always see slower adoption than proprietary tech. Android is a testament to that.
Really? The OS market share numbers paint a completely different picture.
[Image: worldwide smartphone OS market share]

I would not say that G-Sync has a high adoption rate either.
And who said it did? The point being made is that G-Sync ramped faster because the decision making to get it to market came from a single company's executives, and not from a diverse group of members passing specifications through an endless procession of committees to make sure every member was satisfied with the end product.
I don't know many people who want to spend $200 extra and be locked into an Nvidia card at the same time.
Me either, but every G-Sync monitor sold so far has put cash into Nvidia's pockets and, more importantly, provided marketing PR at what amounts to little cost. Lest you forget, the PCB and FPGA chip likely cost little more than a couple of dollars to manufacture per unit.
The huge roadblock for Nvidia is that even if they could put the tech in every monitor...
Doubtful that they would. G-Sync has likely served its purpose to a degree. It provided a solution where none existed before. AMD are (once again) seen as late to the party - FreeSync is seldom mentioned without referencing/comparing it to the solution that existed first.
Considering laptops, Nvidia controls much less than 50% of the graphics market.
In this context it matters not at all, since no extra hardware is required to enable any kind of adaptive sync. If that is taken out of the equation, the attach rates for mobile discrete graphics come down to the same parameters they presently have - which is that Nvidia has a clear playing field, thanks to AMD not being able to field anything of note to compare with Kepler/Maxwell's performance-per-watt SKUs. If you hadn't noticed, AMD's lineup is underpinned by three-generations-old tech (Cape Verde and Pitcairn), so Nvidia makes its cash on MXM modules while AMD is in a race to the bottom (laptop feature set wise), speccing out APUs in order to combat Intel's chips.
FreeSync doesn't have any of these issues and doesn't add any cost. Nvidia will be forced to drop G-Sync once FreeSync matures in a couple of years.
Very likely. Then Nvidia will just jump onto Adaptive Sync (if they don't do it sooner), and have 3+ years of G-Sync sales in the account books. That is pretty hard to fault from a business perspective.
 
So tired of these flat screen issues after years of shoddy performance. Bring back the CRTs.

Up to 240 Hz at 1080p+, and they only used to cost 20 quid used.
Talk about a flying leap backwards.
 
FreeSync is seldom mentioned without referencing/comparing it to the solution that existed first.
If you think being first to market guarantees success, you're not familiar with the success of Apple. They have never been first to market, yet they lead the music player, tablet and high-end smartphone markets. You don't have to do it first, you have to do it better for the public.
And a $100 solution that works for half of the discrete GPU market is by no means better than a free solution that can work for everyone.
 
Nvidia's GPU products, software and engineering have been second to none for several years now. I would imagine their screen refresh technology follows the same lines. FreeSync is AMD's kneejerk reaction and I doubt it will be as good, but we will see.
Flamevest on.
 
Nvidia's GPU products, software and engineering have been second to none for several years now. I would imagine their screen refresh technology follows the same lines. FreeSync is AMD's kneejerk reaction and I doubt it will be as good, but we will see.
Flamevest on.
It's the same thing named differently, and AMD's kneejerk reaction is what's going to win in the end. It's already clear from the huge number of FreeSync-capable monitors announced even though the standard was only recently adopted.
Nobody likes a blind fanboy :D
 
Nvidia are a bit of a joke tbh. Their nForce motherboards used to die quickly if you overclocked them at all.
Their top of the line GPU, which was touted as the ultimate, was revealed to have been gimped all along, and they upped its power to leech more money from enthusiasts.
They make G-Sync, charge a chit ton of money for it, and then get a kick in the nuts from AMD when they make a free version available.
The GTX 970 does not make true use of its 4 GB of memory, and games that use over 3.5 GB jump and stutter like crazy.
gg Nvidia!
 
Nvidia are a bit of a joke tbh. Their nForce motherboards used to die quickly if you overclocked them at all.
I'll go along with most of what you said, and allow you your opinion. But if you overclock, any failure is on you, not the manufacturer.
 
Nvidia are a bit of a joke tbh. Their nForce motherboards used to die quickly if you overclocked them at all.
I can see why you prefer to remain anonymous with comments like that. Some of the best AMD boards of all time sport nForce chipsets, including some truly iconic nF2 boards (DFI LanParty NFII Ultra B and NF7-2, and the Asus A7N8X/-E/Deluxe).
Their top of the line gpu which was touted as the ultimate was revealed to have been gimped all along
Lame trolling attempt or lack of research? Nvidia's top of the line GPU is the GM204-400, which powers the GTX 980 and is a fully functional piece of silicon.
They make G-sync, charge a chit ton of money for it and then get a kick in the nuts from amd when they make available a free version.
So Nvidia make money on G-Sync, while AMD get nothing for FreeSync. You might want to brush up on the meaning of the phrase "kick in the nuts", because what it means isn't what you think it means.
 
It's the same thing named differently, and AMD's kneejerk reaction is what's going to win in the end. It's already clear from the huge number of FreeSync-capable monitors announced even though the standard was only recently adopted.

Nobody likes a blind fanboy

Yes, huuge number... err... the number is 8 monitors and TVs combined.
http://support.amd.com/en-us/search/faq/284

sooo, you mentioned something about fanboys? ;)
 
Yes, huuge number... err... the number is 8 monitors and TVs combined.
http://support.amd.com/en-us/search/faq/284

sooo, you mentioned something about fanboys? ;)
Those are just the ones you can buy now; there are dozens of monitors coming in 2015.
And they have Samsung, LG and BenQ as early adopters... if that's not huge, then what is? Of those three, only BenQ has G-Sync monitors. The price difference on "equivalent" monitors is also around $100. (Corrected some of the grammar due to a nazi pointing it out for me.)
Yep, fanboys :D
 
I'm not one to bust balls for typos, but 'Echivalent'?
Hahaha!
That's just sad... I feel sorry for your sense of humor.
The way we write it in our language is similar to the way it's written in English; I just didn't notice the typo.
equivalent = echivalent

"Rule #1: If you are losing an argument, correct their grammar."
 
Samsung and LG are the major players backing the open standard.
BenQ and ASUS are the major players backing the proprietary one.
Free isn't always better.

As a gamer, the choice is obvious IMO.
 
Samsung and LG are the major players backing the open standard.
BenQ and ASUS are the major players backing the proprietary one.
Free isn't always better.

As a gamer, the choice is obvious IMO.
BenQ is making both types. I think Acer and Asus are the ones that have only G-Sync for now. I expect this to change sometime this year.
 