Nvidia bids adieu to V-Sync limitations with G-Sync monitor technology

Shawn Knight

Staff member

Nvidia recently announced a new monitor technology that promises to eliminate screen tearing, stutter and lag. G-Sync is a module said to replace the scaler in LCD monitors and will first arrive in gaming-grade displays from the likes of Asus, BenQ, Philips and ViewSonic.

As you likely know, deciding whether or not to enable V-Sync will typically affect a game one way or another. With it disabled, the game's frame rate isn't limited to your monitor's refresh rate. This means higher FPS overall but it also leads to screen tearing – when portions of multiple frames are drawn within a single refresh during quick on-screen movement – and well, nobody likes that.

Enabling V-Sync caps your GPU at 60 frames per second, putting it in sync with a typical monitor's 60 Hz refresh rate. It eliminates screen tearing but at the same time, you're limited to just 60 FPS. Many of today's higher-end cards and multi-card configurations are capable of pushing frames at a much faster rate. Higher frame rates mean smoother motion and more responsive input, but until now, gamers have had to choose one or the other.
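To make the trade-off concrete, here's a rough back-of-the-envelope sketch in Python (our own illustration, not anything from Nvidia; the per-frame render times are made up) of how V-Sync quantizes frame delivery to refresh boundaries. A frame that misses the ~16.7 ms window waits for the next refresh, so the effective frame rate steps down to 30 or 20 FPS rather than degrading smoothly:

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ  # ~16.7 ms between refreshes on a 60 Hz panel

def vsync_display_ms(render_ms):
    """With V-Sync on, a finished frame waits for the next refresh boundary."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

# Hypothetical per-frame render times, in milliseconds
for render_ms in (10, 17, 25, 40):
    shown = vsync_display_ms(render_ms)
    print(f"rendered in {render_ms} ms -> displayed at {shown:.1f} ms "
          f"(~{1000 / shown:.0f} FPS effective)")
```

Note that a 17 ms frame, just 0.4 ms too slow for 60 Hz, is held back a full refresh and displays at the same moment a 25 ms frame would, which is exactly the stutter V-Sync gets blamed for.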

The G-Sync module synchronizes the monitor's refresh rate to the GPU's render rate, which means images are displayed the moment they are rendered. This supposedly translates into scenes appearing instantly, with sharper objects and smoother gameplay.
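As a rough comparison (again a toy model under our own assumptions, not Nvidia's actual implementation), a variable-refresh panel can simply refresh whenever a frame is ready instead of waiting for a fixed tick:

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz refresh interval

def fixed_refresh_ms(render_ms):
    """Fixed 60 Hz panel with V-Sync: the frame waits for the next tick."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def variable_refresh_ms(render_ms):
    """G-Sync-style variable refresh: the panel refreshes when the GPU is done."""
    return render_ms

for render_ms in (12, 18, 22, 30):  # made-up frame times
    print(f"{render_ms} ms frame: fixed panel shows it at "
          f"{fixed_refresh_ms(render_ms):.1f} ms, "
          f"variable panel at {variable_refresh_ms(render_ms):.1f} ms")
```

In practice a panel still has minimum and maximum refresh rates, so the "refresh whenever ready" behaviour only holds within that supported range.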

We're told that G-Sync monitors will be available starting early next year. No word yet on pricing for new monitors, though a G-Sync add-on for Asus' VG248QE 144Hz monitor will cost $175, hinting that G-Sync-enabled monitors will carry a $100-$150 premium.

Meanwhile, game developers who were exposed to G-Sync all seem to agree it's a welcome improvement to the whole PC gaming experience:

"The huge gains in GPU rendering power over the past decade have enabled developers and artists to create increasingly complex 3D scenes and worlds. But even on the highest end PC, the illusion of reality is hampered by tearing and stutter. NVIDIA G-SYNC elegantly solves this longstanding problem. Images on a G-SYNC display are stunningly stable and lifelike. G-SYNC literally makes everything look better."
-- Tim Sweeney, founder, Epic Games

"NVIDIA's G-SYNC technology is a truly innovative solution to an ancient legacy restriction with computer graphics, and it enables one to finally see perfect tear-free pictures with the absolute lowest latency possible. The resulting output really allows your mind to interpret and see it as a true continuous moving picture which looks and feels fantastic. It's something that has to be seen to be believed!"
-- Johan Andersson, technical director, DICE

"With G-SYNC, you can finally have your cake and eat it too -- make every bit of the GPU power at your disposal contribute to a significantly better visual experience without the drawbacks of tear and stutter."
-- John Carmack, co-founder, id Software


 
This is really cool. I've been researching and reviewing 120Hz monitors lately, but I'll wait until these G-Unit monitors come out first.
 
Think I'll have to get one of these monitors when they hit the shelves!
By the sounds of it, it will work with ANY GPU.
 
Remember kids, stuttering and lag aren't the fault of shoddy drivers or poorly optimized games. It's the monitor's fault.
 
Edited with more specific pricing and quotes from game developers who were shown G-Sync at Nvidia's event. Looks like a promising addition to gaming monitors, hopefully at a lower premium than that Asus-specific mod.
 
Remember kids, stuttering and lag aren't the fault of shoddy drivers or poorly optimized games. It's the monitor's fault.

I truly hope you're joking. This stops tearing when V-Sync is off and stops stuttering when V-Sync is on, plus it lets you use as much GPU power as you want and lowers the lag between pressing a button and seeing the result on screen.
It works, and from everything I've read, every person who's seen it can't believe how well it works. I guess you pre-ordered an R9 290X?

Think I'll have to get one of these monitors when they hit the shelves!
By the sounds of it, it will work with ANY GPU.

Nope, this only works with a GTX 650 Ti Boost and up, and it's Nvidia-only I'm afraid (or at least, as far as I'm aware; it appears to use the AUX channel on DisplayPort to sync the refresh rate to the frames, so it may be addable to AMD cards if Nvidia allows it).

So to wrap things up: AMD push out a power-hungry R9 290X that is barely faster than a standard 780 but uses 80 watts more power, plus Mantle, which is currently supported by a single game, and even that support is being patched in later this year.

Meanwhile, Nvidia put their tech into major game engines like Unreal Engine, they still have the fastest card on the market (the 780 Ti notwithstanding, of course; it may or may not be faster than the Titan), and now G-Sync, which improves every game ever made, and every game for the foreseeable future, by fixing an ancient issue.

I just wish more details about the screens would be released! I would love a 1440p screen (27 inch) which had this tech and was capable of 96-144Hz, but god knows what the price on that would be!
 
Nvidia, Intel and AMD are going to "proprietary" the gaming world to death. It reminds me of the early days of gaming with the sound card and GFX driver "only supports XYZ" crap.
 
I don't want to sound like an NVIDIA fanboy. I watched AMD's three-hour presentation [after a while it became torturous], and something that bothered me was how much they compare everything to the competition. I told a friend and he said "all companies do that". But I've read GeForce.com articles and seen both companies' presentations and videos, and it seems like in NVIDIA's world they are the only ones (in a good sense). They talk about their achievements and proposals without even mentioning the competition, and their tech demos actually look intended for developers, with a serious engineering purpose. NVIDIA compares their products to their own products, plus the usual top-vs-top competition graphs; AMD compares every single enhancement they make [TrueAudio, TressFX, full DX 11.2 support, Eyefinity, etc.] with one of the competition's [PhysX, 3D Vision, Surround, etc.] and then minimizes the competition's.
When I watch an NVIDIA presentation it's like "let's forget about the competition and do something to improve the gaming experience". I feel more comfortable not being reminded every 10 minutes how they have killer features that crush the competition. You can see for yourself by reading GeForce.com articles and you'll understand my point. And don't get me wrong: I read both sides' articles, watch both sides' presentations and own products from both sides at home.
 
their tech demos actually look intended for developers, with a serious engineering purpose. NVIDIA compares their products to their own products, plus the usual top-vs-top competition graphs; AMD compares every single enhancement they make [TrueAudio, TressFX, full DX 11.2 support, Eyefinity, etc.] with one of the competition's [PhysX, 3D Vision, Surround, etc.] and then minimizes the competition's.
When I watch an NVIDIA presentation it's like "let's forget about the competition and do something to improve the gaming experience". I feel more comfortable not being reminded every 10 minutes how they have killer features that crush the competition. You can see for yourself by reading GeForce.com articles and you'll understand my point. And don't get me wrong: I read both sides' articles, watch both sides' presentations and own products from both sides at home.

Agreed completely. The tech demo with the pendulum was awesome; if you can find still images of it you can really start to see the benefit of G-Sync, but everyone there has said you've got to see it in motion to truly believe it. I also don't want to sound like a fanboy but I gotta hand it to Nvidia, this is awesome.
 
I hope this tech becomes open so both AMD & Intel can implement it as well.
This is something that should have been solved 10 years ago, but better late than never...
I would just hate it if this becomes some proprietary tech that only sees very limited success, because it really is a game changer!
But if I'm going to invest in this, it needs to come in an IPS panel, no TN crap
 
But if I'm going to invest in this, it needs to come in an IPS panel, no TN crap
Sure, TN panels have improved a lot in the last 5 years, but IPS and similar panel technologies are so much better and worth the premium. Oh, and that premium has decreased over the last 5 years too (unless you're trying to get one or more of the 30" monsters). IPS for life~
 
Wow, I hope G-Sync will further drive monitor prices down in the Philippines, and I'm likewise hoping that more companies will join in after Asus, BenQ, Philips, and ViewSonic.
 
I don't want to sound like an NVIDIA fanboy. I watched AMD's three-hour presentation [after a while it became torturous], and something that bothered me was how much they compare everything to the competition. I told a friend and he said "all companies do that". But I've read GeForce.com articles and seen both companies' presentations and videos, and it seems like in NVIDIA's world they are the only ones (in a good sense).
That has been a deliberate marketing strategy for both Nvidia and Intel since day one. When was the last time you saw an Intel presentation, slide, or graphic that even acknowledged AMD's existence?
The inference is that the company is the leader (whether true or not), and leaders need not concern themselves with the companies in their wake. You also don't dilute the brand by sharing the spotlight with a competitor if at all possible. Confidence and arrogance are qualities associated with winners, regardless of whether you sympathise with or despise those qualities themselves.

AMD never really had a chance to be a front runner. Establishing a brand takes time and a string of successes, a luxury they would never be afforded, since Intel was founded on the back of established engineering prowess whilst AMD's foundation lay in sales and marketing (primarily of scraps from Intel's banquet table). Intel already had a string of successes before AMD even got out of the starting blocks.
The engineering foundations of both Intel and Nvidia lend themselves to long-term, unwavering goal setting. A sales and marketing foundation generally means goals are short term, with plans and products in a constant state of flux as management attempts to second-guess a future market rather than plan a product (both hardware and software ecosystem) in order to dictate a future market.
 
Man, this move is such a load of bull... forcing us to spend a lot on an unneeded (in my case) GPU upgrade (650 Ti Boost and up) and a $175 premium for the display upgrade kit, and limiting us to the Green (Dark) side with this proprietary technology. I wouldn't encourage that.

We've all been fooled into silently accepting DirectX and look how that's turned out. OpenGL is being marginalized for no reason, even though it's at least as good, with a better rendering pipeline. All those Androids bear testimony to that. The same will happen here if we mindlessly allow Nvidia to push this platform-dependent crap. Just my 2 cents...
 
Man, this move is such a load of bull... forcing us to spend a lot on an unneeded (in my case) GPU upgrade (650 Ti Boost and up) and a $175 premium for the display upgrade kit.
You seem to be brainwashed by consumerism. Nobody is forcing you to upgrade. It's part of the myth that people must have the next new thing that arrives on the scene.
And limiting us to the Green (Dark) side, with this proprietary technology.
The monitors work as standard monitors if a compatible GPU isn't involved. More expensive than the base model screen? Yes, certainly, but if you don't think the added feature is cost effective (or you play with Vsync enabled, or screen tearing isn't an issue for you) then no one is forcing your hand.
... if we mindlessly allow Nvidia to push this platform-dependent crap. Just my 2 cents...
Get used to it. AMD just released details of two proprietary APIs aimed exclusively at gaming (TrueAudio and Mantle). With fabrication foundry costs for smaller process nodes likely to increase greatly at 20nm, 16/20nm and 10nm FinFET, margins on card sales and return on investment are going to get very tight as IGPs/APUs eat into the lower-mainstream volume market. With less leeway in pricing (which is largely immaterial anyway, as Nvidia and AMD seem to have reached a mutually beneficial understanding), the primary differentiators become marketing bullet points and supposed features (PhysX, TressFX, TrueAudio, game bundles etc.)
 
You seem to be brainwashed by consumerism. Nobody is forcing you to upgrade. It's part of the myth that people must have the next new thing that arrives on the scene.

If I were "brainwashed," I would be running the latest and greatest just because it exists, which I'm not. My good old GTX 460 has served me fine to this day, since I'm not exactly the quintessential gamer. But I'd hate to have to upgrade just because a game in a series I enjoy starts working on that technology alone.

On the other hand, what often starts as innocent progress on one side of the competition can quickly turn into an exclusive "feature" later that severely limits your future freedom of choice. We've seen it before, and not just once...
And with competition in the GPU market limited to two players, a lot can go wrong.

The monitors work as standard monitors if a compatible GPU isn't involved. More expensive than the base model screen? Yes, certainly, but if you don't think the added feature is cost effective (or you play with Vsync enabled, or screen tearing isn't an issue for you) then no one is forcing your hand.

It remains to be seen how exactly they will integrate this new gimmick with display makers and how we'll be affected.

As for the last paragraph... You're right, but I would rather have companies investing more in technology and less in marketing campaigns. It's sad to see a bundle of games differentiate the rebadges of the major players...
 
If I were "brainwashed," I would be running the latest and greatest just because it exists, which I'm not. My good old GTX 460 has served me fine to this day, since I'm not exactly the quintessential gamer. But I'd hate to have to upgrade just because a game in a series I enjoy starts working on that technology alone.

You clearly have no idea what this article is about. It stops tearing and stuttering and improves the screen's responsiveness, nothing more, nothing less. Games don't "depend" on this tech; it simply improves image quality. I don't know if you've played any 3D games, but you'll notice that your frame rate is never constant; this syncs your screen's refresh to the frames per second being delivered to stop tearing and the like. Games don't depend on anything from G-Sync. It does mean you could play games with even higher visuals, though, as it makes even 30 frames per second feel extremely smooth.

On the other hand, what often starts as innocent progress on one side of the competition can quickly turn into an exclusive "feature" later that severely limits your future freedom of choice. We've seen it before, and not just once...

And with competition in the GPU market limited to two players, a lot can go wrong.

What's wrong with that? If Nvidia stops AMD cards from using this tech, then AMD can bring their own stuff to the table and make it cheaper, driving competition. Maybe they do a cross-licensing deal: Nvidia gets access to the audio APIs AMD recently released in return for certain AMD cards getting access to G-Sync. You never know...

It remains to be seen how exactly they will integrate this new gimmick with display makers and how we'll be affected.

I don't usually say this, but if it really works as proposed (and everyone who has seen it agrees it does), this is NOT a gimmick. I don't know what your beef is with fixing an ancient issue PC gamers have had to put up with for years, but this is pretty sweet. Display makers will probably make a "G-Sync compatible" version of already existing displays and raise the price to cover the extra costs. Why would we be affected? It's a new tech that costs more to produce than what is currently available; this is how the world works.

As for the last paragraph... You're right, but I would rather have companies investing more in technology and less in marketing campaigns. It's sad to see a bundle of games differentiate the rebadges of the major players...

If you'd rather a company invest in technology than marketing campaigns, you must really hate AMD right about now...
 
That has been a deliberate marketing strategy for both Nvidia and Intel since day one. When was the last time you saw an Intel presentation, slide, or graphic that even acknowledged AMD's existence?
The inference is that the company is the leader (whether true or not), and leaders need not concern themselves with the companies in their wake. You also don't dilute the brand by sharing the spotlight with a competitor if at all possible. Confidence and arrogance are qualities associated with winners, regardless of whether you sympathise with or despise those qualities themselves...

Very good analysis as usual, dbz. I slightly suspected that, but it feels less aggressive and more comfortable to watch when they don't compare themselves to others every chance they get. I perceive it as something personal, like "projecting their trauma", and to my eyes it looks like they're degrading themselves or being sort of childish; eventually they project aggression when they do that (the Fixer ads are a clear example of this).
 
The thing is that monitor vendors needed something to increase sales, and this comes in quite handy for them. The problem, the real and main problem, is the lack of standardization; this does not seem so difficult to implement. Having each GPU manufacturer implement its own solution to this "extremely important" issue (as many want users to believe, yet one somehow completely ignored by the industry for decades) will make things worse for consumers. What will monitor manufacturers do when Intel and AMD come up with their own solutions? Let's suppose a $100 premium for this benefit times three: will the monitor then cost $300 more to support all of them? Seriously?! Come on! If users don't plead for standardization, who will?
 
Let's suppose a $100 premium for this benefit times three: will the monitor then cost $300 more to support all of them? Seriously?! Come on! If users don't plead for standardization, who will?
Can you imagine monitors with Nvidia or AMD ready logos? Why not? We are already buying other things such as motherboards with SLI/CrossFire ready logos. However, I do see your point about standardization; I personally despise proprietary devices. But seriously, the monitor would still function without G-Sync.

If you ask me, I'd say make the G-Sync device removable so the other competitors can connect their devices if needed. Make the G-Sync device a monitor add-in card.
 
Very good analysis as usual, dbz. I slightly suspected that, but it feels less aggressive and more comfortable to watch when they don't compare themselves to others every chance they get. I perceive it as something personal, like "projecting their trauma", and to my eyes it looks like they're degrading themselves or being sort of childish; eventually they project aggression when they do that (the Fixer ads are a clear example of this).
I think a lot of that comes down to the company structure at AMD. Hardware releases seldom seem to be in sync with the software/driver division (so widespread that it's now a given that launches are appended with "you can expect better as the drivers mature", plus long-running sagas with Enduro, frame pacing, VCE, HDMI sound etc.), and the company's PR/marketing is seldom on the same page as either hardware or software. The hardware is generally first rate, and the software is a relatively new area that AMD is (belatedly) getting into... but the PR? Pure W. Jerry Sanders III channelling.

It's as though you have the archetypal self-effacing, introspective engineers in one sealed area, software writers in solitary confinement being passed pre-release hardware days before launch through a slot in the wall and told to "get on with it"... while the marketing guys, hired for their passive-aggressive qualities, run amok in their distant cubicle farm yelling "Make it so" after devising strategies made up entirely of throwing fridge word magnets at the office vending machine.

With the exception of Jen-Hsun Huang himself, both Intel and Nvidia seem to have a more organised and restrained approach to the whole business.
Can you imagine monitors with Nvidia or AMD ready logos? Why not? We are already buying other things such as motherboards with SLI/CrossFire ready logos.
Technically, things in that arena have been proprietary since Intel shook AMD, Cyrix, IDT etc. off its coattails; pin-compatible CPUs ended with Sockets 4, 5, and 7.
 
"Why not, we are already buying other things such as motherboards with SLI/XFire ready logos?"
Indeed, how ever this may change if crossfire or sli is implemented using the pcie 3.0, someone says this is the way it will be implemented in AMD 290X card. So why don't plead for standards instead implementing proprietary solutions? And I find unlikely to invest +100usd just for gsync feature while I could get a better video card, cpu or an ssd with it. Maybe standardization could lower that costs by a lot. I must say that while gaming I may barely seen the issues gsync solves, since I'm paying attention to the game overall and not to minor glitches, so investing in a more powerful gpu/cpu would be more wise at least for me.
 