Nvidia bids adieu to V-Sync limitations with G-Sync monitor technology

October 19, 2013, 11:00 AM
nvidia, monitor, g-sync, monitor lag, screen tearing, v-sync, vertical synchronization

Nvidia recently announced a new monitor technology designed to eliminate screen tearing, stutter and lag. G-Sync is a module said to replace the scaler in LCD monitors, and it will first arrive in gaming-grade displays from the likes of Asus, BenQ, Philips and ViewSonic.

As you likely know, deciding whether or not to enable V-Sync typically involves a trade-off. With it disabled, the game's frame rate isn't limited to your monitor's refresh rate. This means higher FPS overall, but it also leads to screen tearing – when parts of two or more frames are displayed within a single refresh during quick on-screen movements – and, well, nobody likes that.

Enabling V-Sync caps your GPU's output at 60 frames per second, putting it in sync with a typical monitor's 60 Hz refresh rate. It eliminates screen tearing, but at the same time you're limited to just 60 FPS. Many of today's higher-end cards and multi-card configurations can push frames at a much faster rate, and higher frame rates mean smoother motion and more responsive input. Until now, gamers have had to choose one or the other.
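
To put rough numbers on that trade-off, here is a minimal Python sketch of the two choices. The 11 ms frame time and the way tears are counted are simplified assumptions for illustration only, not a model of any actual driver or display:

# Toy model: a fixed 60 Hz display fed by a GPU that renders a frame every 11 ms (~90 fps).
REFRESH_MS = 1000.0 / 60   # time between fixed display refreshes
RENDER_MS = 11.0           # assumed GPU frame time

def simulate(vsync_on, duration_ms=1000.0):
    t, frames, tears = 0.0, 0, 0
    while t < duration_ms:
        t += RENDER_MS                              # the GPU finishes the next frame
        if vsync_on:
            t = (t // REFRESH_MS + 1) * REFRESH_MS  # hold the frame until the next refresh tick
        elif t % REFRESH_MS > 0.0:
            tears += 1                              # presented mid-refresh: counted as a visible tear
        frames += 1
    return frames, tears

for mode in (False, True):
    frames, tears = simulate(mode)
    print("V-Sync", "on:" if mode else "off:", frames, "frames/s,", tears, "torn")

In this toy model, V-Sync off delivers roughly 90 frames per second with nearly all of them torn, while V-Sync on pins the output at 60.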

The G-Sync module synchronizes the monitor's refresh rate to the GPU's render rate, which means images are displayed the moment they are rendered. This supposedly translates into scenes appearing instantly, with sharper objects and smoother gameplay.
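
Put another way, the refresh is driven by the frame rather than a fixed timer. The sketch below is conceptual only (not Nvidia's actual protocol); it extends the toy model above to compare how long a finished frame waits before reaching the screen:

# Same assumed 11 ms frame time; compare the wait between "frame finished" and "frame displayed".
REFRESH_MS = 1000.0 / 60   # fixed 60 Hz refresh interval
RENDER_MS = 11.0           # assumed GPU frame time

def avg_wait_fixed_refresh(n_frames=1000):
    """Fixed 60 Hz refresh with V-Sync: each finished frame waits for the next tick."""
    t, total_wait = 0.0, 0.0
    for _ in range(n_frames):
        t += RENDER_MS
        next_tick = (t // REFRESH_MS + 1) * REFRESH_MS
        total_wait += next_tick - t
        t = next_tick
    return total_wait / n_frames

def avg_wait_variable_refresh(n_frames=1000):
    """Refresh driven by the GPU: a finished frame is scanned out at once.
    Idealized - the panel's own minimum and maximum refresh intervals are ignored."""
    return 0.0

print("fixed 60 Hz + V-Sync:   %.1f ms average wait" % avg_wait_fixed_refresh())
print("refresh on frame-ready: %.1f ms average wait" % avg_wait_variable_refresh())

Zero added wait is the idealized case; in practice the panel's supported refresh range still bounds how early or late a frame can be scanned out.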

We’re told that G-Sync monitors will be available starting early next year. No word yet on pricing for new monitors, though a G-Sync add-on kit for Asus' VG248QE 144Hz monitor will cost $175, hinting that G-Sync-enabled monitors will carry a $100-150 premium.

Meanwhile, game developers who were shown G-Sync all seem to agree it's a welcome improvement to the whole PC gaming experience:

"The huge gains in GPU rendering power over the past decade have enabled developers and artists to create increasingly complex 3D scenes and worlds. But even on the highest end PC, the illusion of reality is hampered by tearing and stutter. NVIDIA G-SYNC elegantly solves this longstanding problem. Images on a G-SYNC display are stunningly stable and lifelike. G-SYNC literally makes everything look better."
-- Tim Sweeney, founder, Epic Games

"NVIDIA's G-SYNC technology is a truly innovative solution to an ancient legacy restriction with computer graphics, and it enables one to finally see perfect tear-free pictures with the absolute lowest latency possible. The resulting output really allows your mind to interpret and see it as a true continuous moving picture which looks and feels fantastic. It's something that has to be seen to be believed!"
-- Johan Andersson, technical director, DICE

"With G-SYNC, you can finally have your cake and eat it too -- make every bit of the GPU power at your disposal contribute to a significantly better visual experience without the drawbacks of tear and stutter."
-- John Carmack, co-founder, id Software




User Comments: 60

hahahanoobs hahahanoobs said:

This is really cool. I've been researching and reviewing 120Hz monitors lately, but I'll wait until these G-Unit monitors come out first.

killeriii said:

Think I'll have to get one of these monitors when they hit the shelves!

By the sounds of it, it will work with ANY gpu.

Lurker101 said:

Remember kids, stuttering and lag aren't the fault of shoddy drivers or poorly optimized games. It's the monitor's fault.

2 people like this |
Staff
Julio Franco Julio Franco, TechSpot Editor, said:

Edited with more specific pricing and quotes from game developers who were shown G-Sync at Nvidia's event. Looks like a promising addition to gaming monitors, hopefully for a lesser premium than that Asus-specific mod.

1 person liked this | Burty117 Burty117, TechSpot Chancellor, said:

Remember kids, stuttering and lag aren't the fault of shoddy drivers or poorly optimized games. It's the monitor's fault.

I truly hope you're joking. This stops tearing when V-Sync is off and stops stuttering when V-Sync is on, plus it allows you to use as much GPU power as you want and lowers the lag between pressing a button and what happens on screen.

It works and from everything I've read, every person who's seen it cannot believe how well it works. I guess you pre-ordered an R9 290x?

Think I'll have to get one of these monitors when they hit the shelves!

By the sounds of it, it will work with ANY gpu.

Nope, this only works with a 650 Ti Boost and up and is also Nvidia-only I'm afraid (or at least, as far as I'm aware; it appears to use the AUX channel on DisplayPort to sync the frames to the refresh rate, so it may be addable to AMD cards if Nvidia allows it).

So to wrap things up, AMD push out a power-hungry R9 290X that is barely faster than a standard 780 but uses 80 watts more power, and Mantle, which is currently supported in a single game that is being patched later this year.

Meanwhile Nvidia put their tech into major game engines like Unreal Engine, they still have the fastest card on the market (the 780 Ti notwithstanding, of course; it may or may not be faster than the Titan) and G-Sync, which improves every game ever made, and every game for the foreseeable future, by fixing an ancient issue.

I just wish more details about the screens would be released! I would love a 1440p screen (27 inch) which had this tech and was capable of 96-144Hz, but god knows what the price on that would be!

hahahanoobs hahahanoobs said:

Remember kids, stuttering and lag aren't the fault of shoddy drivers or poorly optimized games. It's the monitor's fault.

Meanwhile, in Lurkers' perfect world...

1 person liked this | Ranger1st Ranger1st said:

Nvidia, Intel and AMD are going to proprietary the gaming world to death, reminds me of the early days of gaming with the sound cards and GFX drivers 'only supports XYZ' crap.

4 people like this | EEatGDL said:

I don't want to sound like an NVIDIA fanboy. I watched AMD's three-hour-long presentation [after a while it became torturous], and something that bothered me was how much they compare everything to the competition. When I told a friend, he said "all companies do that". But I've read GeForce.com articles and seen both companies' presentations and videos, and it seems like in NVIDIA's world they are the only ones - in a good sense. They talk about their achievements and proposals without even mentioning the competition; their tech demos actually look intended for developers and seem to have a serious engineering purpose. I mean, NVIDIA compares their products vs their products and shows the usual top-vs-top competition comparison graphics; but AMD compares every single enhancement [TrueAudio, TressFX, DX 11.2 full support, Eyefinity, etc.] with one of the competition's [PhysX, 3D Vision, Surround, etc.] and then minimizes the competition's.

When I watch an NVIDIA presentation it's like "let's forget about the competition and do something to improve the gaming experience". I feel more comfortable not being reminded every 10 minutes how they have killer features that crush the competition. You can see for yourself by reading GeForce.com articles and you'll understand my point, and don't get me wrong: I read both sides' articles, watch both sides' presentations and own products from both sides at home.

2 people like this | Burty117 Burty117, TechSpot Chancellor, said:

their tech demos actually look intended for developers and seem to have a serious engineering purpose. I mean, NVIDIA compares their products vs their products and shows the usual top-vs-top competition comparison graphics; but AMD compares every single enhancement [TrueAudio, TressFX, DX 11.2 full support, Eyefinity, etc.] with one of the competition's [PhysX, 3D Vision, Surround, etc.] and then minimizes the competition's.

When I watch an NVIDIA presentation it's like "let's forget about the competition and do something to improve the gaming experience". I feel more comfortable not being reminded every 10 minutes how they have killer features that crush the competition. You can see for yourself by reading GeForce.com articles and you'll understand my point, and don't get me wrong: I read both sides' articles, watch both sides' presentations and own products from both sides at home.

Agreed completely. The tech demo with the pendulum was awesome; if you can find still images of it you can really start to see the benefit of G-Sync, but everyone there has said you've got to see it to truly believe it. I also don't want to sound like a fanboy, but I gotta hand it to Nvidia, this is awesome.

JC713 JC713 said:

Now this is great tech.

4 people like this |
Staff
Per Hansson Per Hansson, TS Server Guru, said:

I hope this tech becomes open so both AMD & Intel can implement it as well.

This is something that should have been solved 10 years ago, but better late than never...

I would just hate it if this becomes some proprietary tech that only sees very limited success, because it really is a game changer!

But if I'm going to invest in this it needs to come in an IPS panel, no TN crap

madboyv1, TechSpot Paladin, said:

But if I'm going to invest in this it needs to come in an IPS panel, no TN crap

Sure TN panels have improved a lot in the last 5 years, but IPS and similar panel technologies are so much better and worth the premium. Oh, and that premium has decreased over the last 5 years too (unless you're trying to get one or more of the 30" monsters). IPS for life~

misor misor said:

Wow, I hope this G-Sync will further drive monitor prices down in the Philippines, and I'm likewise hoping that more companies will join in after Asus, BenQ, Philips, and ViewSonic.

5 people like this | dividebyzero dividebyzero, trainee n00b, said:

I don't want to sound like an NVIDIA fanboy. I watched AMD's three-hour-long presentation [after a while it became torturous], and something that bothered me was how much they compare everything to the competition. When I told a friend, he said "all companies do that". But I've read GeForce.com articles and seen both companies' presentations and videos, and it seems like in NVIDIA's world they are the only ones - in a good sense.

That has been a deliberate marketing strategy from both Nvidia and Intel since Day 1 as a general rule. When was the last time you saw an Intel presentation, slide, or graphic that even acknowledges AMD's existence?

The inference is that the company is the leader (whether true or not), and leaders need not concern themselves with the companies in their wake. You also don't dilute the brand by sharing the spotlight with a competitor if at all possible. Confidence and arrogance are qualities associated with winners regardless whether you sympathise or despise the qualities themselves.

AMD never really had a chance to be a front runner. Establishing a brand takes time and a string of successes - a luxury that they would never be afforded, since Intel was founded on the back of established engineering prowess whilst AMD's foundation lay in sales and marketing (primarily of scraps from Intel's banquet table). Intel already had a string of successes before AMD even got out of the starting blocks.

The engineering foundation of both Intel and Nvidia lends itself to long-term, unwavering goal setting. A sales and marketing foundation generally means goals are short term, with plans and products in a constant state of flux as management attempt to second-guess a future market rather than plan a product (both hardware and software ecosystem) in order to dictate a future market.

tomkaten tomkaten said:

Man, this move is such a load of bull... Forcing us to spend a lot on an unneeded (in my case) GPU upgrade (650 Ti Boost and up) and a $175 premium for the display upgrade kit. And limiting us to the Green (Dark) side, with this proprietary technology. I wouldn't encourage that.

We've all been fooled into silently accepting DirectX, and look how that's turned out. OpenGL is being marginalized for no reason, even though it's at least as good, with a better rendering pipeline. All those Android devices bear testimony to that. The same will happen here, if we mindlessly allow Nvidia to push this platform-dependent crap. Just my 2 cents...

Obzoleet Obzoleet said:

Now this is interesting indeed!

dividebyzero dividebyzero, trainee n00b, said:

Man, this move is such a load of bull... Forcing us to spend a lot on an unneeded (in my case) GPU upgrade (650 Ti Boost and up) and a $175 premium for the display upgrade kit.

You seem to be brainwashed by consumerism. Nobody is forcing you to upgrade. It's part of the myth that people must have the next new thing that arrives on the scene

And limiting us to the Green (Dark) side, with this proprietary technology.

The monitors work as standard monitors if a compatible GPU isn't involved. More expensive than the base model screen? Yes, certainly, but if you don't think the added feature is cost effective (or you play with Vsync enabled, or screen tearing isn't an issue for you) then no one is forcing your hand.

... if we mindlessly allow Nvidia to push this platform-dependent crap. Just my 2 cents...

Get used to it. AMD just released details of two proprietary API's aimed exclusively at gaming (TrueAudio and Mantle). With fabrication foundry costs for smaller process nodes likely to increase greatly at 20nm, 16/20nm and 10nm FinFET, margins on card sales and return on investment are going to get very tight as IGP/APUs eat into the volume lower mainstream market. With less leeway in pricing (which is largely immaterial anyway as Nvidia and AMD seem to have reached a mutually beneficial understanding) the primary differentiators become marketing bullet points and supposed features (PhysX, TressFX, TrueAudio, game bundles etc.)

tomkaten tomkaten said:

You seem to be brainwashed by consumerism. Nobody is forcing you to upgrade. It's part of the myth that people must have the next new thing that arrives on the scene

If I was "brainwashed", I would be running the latest and greatest just because it exists, which I'm not. My good old GTX 460 served me fine to this day, since I'm not exactly the quintessential gamer. But I'd hate to have to upgrade just because a game of a series I enjoy starts working on that technology alone.

On the other hand, what often starts as innocent progress on one side of the competition can quickly turn into an exclusive "feature" later, that severely limits your future freedom of choice. We've seen it before and not just once...

And with competition on the GPU market limited to two players, a lot can go wrong.

The monitors work as standard monitors if a compatible GPU isn't involved. More expensive than the base model screen? Yes, certainly, but if you don't think the added feature is cost effective (or you play with Vsync enabled, or screen tearing isn't an issue for you) then no one is forcing your hand.

It remains to be seen how exactly they will integrate this new gimmick with display makers and how we'll be affected.

As for the last paragraph... You're right, but I would rather have companies investing more in technology and less in marketing campaigns. It's sad to see a bundle of games differentiate the rebadges of the major players...

1 person liked this | Burty117 Burty117, TechSpot Chancellor, said:

If I was "brainwashed", I would be running the latest and greatest just because it exists, which I'm not. My good old GTX 460 served me fine to this day, since I'm not exactly the quintessential gamer. But I'd hate to have to upgrade just because a game of a series I enjoy starts working on that technology alone.

You clearly have no idea what this article is about. It stops tearing and stuttering and improves response times on the screen, nothing more, nothing less. Games don't "depend" on this tech; it simply improves image quality. I don't know if you've played any 3D games, but you'll notice that your framerate is never at a constant level; this syncs your screen's refresh to the given frames per second to stop tearing and the like. Games don't depend on anything from G-Sync; however, it does mean you could play games with even higher visuals, as it makes 30 frames per second feel extremely smooth still.

On the other hand, what often starts as innocent progress on one side of the competition can quickly turn into an exclusive "feature" later, that severely limits your future freedom of choice. We've seen it before and not just once...

And with competition on the GPU market limited to two players, a lot can go wrong.

What's wrong with that? If Nvidia stops AMD cards from using this tech then AMD can bring their own stuff to the table and make it cheaper, driving competition. Maybe they do a cross-licensing deal: Nvidia gets access to the audio APIs AMD have recently released in return for certain AMD cards getting access to G-Sync? You never know...

It remains to be seen how exactly they will integrate this new gimmick with display makers and how we'll be affected.

I don't usually say this, but if it really works as proposed (and everyone who has seen it agrees it does) this is NOT a gimmick. I don't know what your beef is with fixing an ancient issue PC gamers have had to put up with for years, but this is pretty sweet. And display makers will probably make a "G-Sync compatible" version of already existing displays and raise the price to cover the extra costs. Why would we be affected? It's a new tech that costs more to produce than what is currently available; this is how the world works.

As for the last paragraph... You're right, but I would rather have companies investing more in technology and less in marketing campaigns. It's sad to see a bundle of games differentiate the rebadges of the major players...

If you'd rather a company invest in technology than marketing campaigns, you must really hate AMD right about now...

EEatGDL said:

That has been a deliberate marketing strategy from both Nvidia and Intel since Day 1 as a general rule. When was the last time you saw an Intel presentation, slide, or graphic that even acknowledges AMD's existence?

The inference is that the company is the leader (whether true or not), and leaders need not concern themselves with the companies in their wake. You also don't dilute the brand by sharing the spotlight with a competitor if at all possible. Confidence and arrogance are qualities associated with winners regardless whether you sympathise or despise the qualities themselves...

Very good analysis dbz, as usual. I slightly suspected that, but it feels less aggressive and more comfortable to watch when they don't compare themselves to others every chance they get. I perceive it as something personal, or like "projecting their trauma", and to my eyes that looks like they're degrading themselves or being sort of childish; eventually they project aggression when they do that - the Fixer ads are a clear example of this.

Guest said:

The thing is that monitor vendors needed something to increase sales. This comes in quite handy for them. The problem, the real and main problem, is the lack of standardization; it does not seem to be so difficult to implement. Having each GPU manufacturer implement its own solution to this "extremely important" issue (as many want users to believe, but somehow completely ignored by the industry for decades) will make things worse for consumers. What will monitor manufacturers do when Intel and AMD come up with their own solutions? Let's suppose a $100 premium for this benefit x 3 - then the monitor will cost $300 more to support all of them? Seriously?! Come on! If users don't plead for standardization, who will?

1 person liked this | cliffordcooley cliffordcooley, TechSpot Paladin, said:

Let's suppose a $100 premium for this benefit x 3 - then the monitor will cost $300 more to support all of them? Seriously?! Come on! If users don't plead for standardization, who will?
Can you imagine monitors with nVidia or AMD ready logos? Why not, we are already buying other things such as motherboards with SLI/XFire ready logos? However I do see your point about standardization, I personally despise proprietary devices. But seriously the monitor would still function without G-Sync.

If you ask me, I'd say make the G-Sync device removable so the other competitors can connect their devices if needed. Make the G-Sync device a monitor add-in card.

dividebyzero dividebyzero, trainee n00b, said:

Very good analysis dbz, as usual. I slightly suspected that, but it feels less aggressive and more comfortable to watch when they don't compare themselves to others every chance they get. I perceive it as something personal, or like "projecting their trauma", and to my eyes that looks like they're degrading themselves or being sort of childish; eventually they project aggression when they do that - the Fixer ads are a clear example of this.

I think a lot of that comes down to the company structure at AMD. Hardware releases seldom seem to be in sync with the software/driver division (so widespread that it's now a given that launches are appended with "You can expect better as the drivers mature" - long-running sagas with Enduro, frame pacing, VCE, HDMI sound etc.), and the company's PR/Marketing is seldom on the same page as either hardware or software. The hardware is generally first rate, the software is a relatively new area that AMD is (belatedly) getting into...but the PR? Pure W. Jerry Sanders III channelling.

It's as though you have the archetypal self-effacing and introspective engineers in one sealed area, software writers in solitary confinement being passed pre-release hardware days before launch through a slot in the wall and told "to get on with it"...while the marketing guys, hired for their passive-aggressive qualities, run amok in their distant cubicle farm yelling "Make it so" after devising strategies made up entirely of throwing fridge word magnets at the office vending machine.

With the exception of Jen-Hsun Huang himself, both Intel and Nvidia seem to have a more organised and restrained approach to the whole business.

Can you imagine monitors with nVidia or AMD ready logos? Why not, we are already buying other things such as motherboards with SLI/XFire ready logos?

Technically, things in that arena have been proprietary since Intel removed AMD, Cyrix, IDT etc from their coattails and producing pin compatible CPUs ended at Socket 4, 5, and 7

Guest said:

"Why not, we are already buying other things such as motherboards with SLI/XFire ready logos?"

Indeed, however this may change if CrossFire or SLI is implemented over PCIe 3.0; some say this is the way it will be implemented on AMD's 290X card. So why not plead for standards instead of implementing proprietary solutions? And I find it unlikely I'd invest $100+ just for the G-Sync feature when I could get a better video card, CPU or an SSD with it. Maybe standardization could lower those costs by a lot. I must say that while gaming I barely see the issues G-Sync solves, since I'm paying attention to the game overall and not to minor glitches, so investing in a more powerful GPU/CPU would be wiser, at least for me.

bugejakurt said:

So if I am understanding correctly the number of screen Hz is the number of frames rendered on screen per second?

So if you have a monitor with 120Hz and the GPU outputs a maximum of 100 FPS in a particular game then you won't have any benefit with G-Sync? Or maybe a higher number of Hz than FPS results in more blurred images, etc?

cliffordcooley cliffordcooley, TechSpot Paladin, said:

Technically, things in that arena have been proprietary since Intel removed AMD, Cyrix, IDT etc from their coattails and producing pin compatible CPUs ended at Socket 4, 5, and 7
Yeah I know, and I was furious when they did it. Furious to the point I made a decision to stick with Intel (little did I know they were the ones I should have been furious with), at that very moment.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

So if you have a monitor with 120Hz and the GPU outputs a maximum of 100 FPS in a particular game then you won't have any benefit with G-Sync? Or maybe a higher number of Hz than FPS results in more blurred images, etc?
In order for the monitor to sync with the GPU, it must first be able to refresh at a faster rate than the GPU. If the monitor can not refresh as quickly as the GPU, I'm not sure if G-Sync would function properly. So I doubt they will be putting G-Sync on a monitor that is not capable of at least 120Hz.

bugejakurt said:

In order for the monitor to sync with the GPU, it must first be able to refresh at a faster rate than the GPU. If the monitor can not refresh as quickly as the GPU, I'm not sure if G-Sync would function properly. So I doubt they will be putting G-Sync on a monitor that is not capable of at least 120Hz.

Yes, exactly, but if the monitor is refreshing at 120Hz and the GPU is capable of maxing out at 100 FPS in a particular game, then you don't need G-Sync? On the other hand I think G-Sync is limited by the maximum refresh rate of the monitor (it cannot go beyond it).

1 person liked this | cliffordcooley cliffordcooley, TechSpot Paladin, said:

Yes, exactly, but if the monitor is refreshing at 120Hz and the GPU is capable of maxing out at 100 FPS in a particular game
Under that scenario there will be 20 frames every second posted twice, causing a slight stutter. Posting the same frame twice on a 120Hz monitor would be the same as dropping the refresh to 60Hz. The ideal solution would be to match the synchronization, so that every frame is complete and updated.
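
The arithmetic behind that point, under the simplest possible model (a hypothetical, perfectly steady 100 fps feeding a fixed 120 Hz display):

# Rough model: fixed 120 Hz display, perfectly steady 100 fps stream (assumed numbers).
refresh_hz = 120   # images the display shows per second
render_fps = 100   # new frames arriving per second

repeats = refresh_hz - render_fps   # refreshes with no new frame, so the previous one is shown again
print(repeats, "of every", refresh_hz, "refreshes repeat the previous frame")   # -> 20

Real frame delivery is never that even, so in practice the repeated refreshes cluster rather than spreading out uniformly.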

Burty117 Burty117, TechSpot Chancellor, said:

Yes, exactly, but if the monitor is refreshing at 120Hz and the GPU is capable of maxing out at 100 FPS in a particular game, then you don't need G-Sync? On the other hand I think G-Sync is limited by the maximum refresh rate of the monitor (it cannot go beyond it).

If your monitor was at 120Hz and your graphics card was outputting 100fps you would get tearing. If you put V-Sync on you'd lose a lot of frames per second. G-Sync solves exactly this by making the monitor sync at 100Hz when your graphics card can only output 100fps; that is why in all their demos you can barely see any difference at all from 60fps to 30fps.

hahahanoobs hahahanoobs said:

I would just hate it if this becomes some proprietary tech that only sees very limited success, because it really is a game changer!

LMAO. That has to be the dumbest thing anyone working in the tech industry could say. Little success because the largest GPU maker on the planet has access to it? nVIDIA is the juggernaut here, not AMD. G-Sync is a BIG deal to millions of gamers around the world.

Billions are made off of proprietary products and services. It's how you stop the other guy from copying your recipe after you did all the work. You're just mad because you're an AMD fanboy, and if AMD isn't included then you think it will fail. HA! I run AMD GPUs and I would hope it comes to all GPU makers too, but I also know how businesses run, and I'm not going to bash nVIDIA for wanting to have the better product. Think about it, why would you need 3 GPU makers if you want all their good technologies to be open? You wouldn't, and there would be no reason for price drops. Isn't that what all the AMD fanboys say they want? Competition? Well then I guess it's AMD's turn to step up and do something better. Oh right, you want everything given to you. You're living in a fantasy world.

Will G-Sync succeed being exclusive to nVIDIA? As long as gamers are experiencing stutter, lag and tearing, it will.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

Think about it, why would you need 3 GPU makers if you want all their good technologies to be open?
Now who is making a dumb comment? Until the monitors have been reworked to support G-Sync, you don't know if it will or not. The monitor manufacturers may not agree unless it is standardized. Keeping it proprietary may not be a choice for nVidia.

dividebyzero dividebyzero, trainee n00b, said:

Might be more a case of an option rather than a standard. Doesn't seem greatly removed from some screen manufacturers providing scaler and non-scaler options. The scaler gives you the option to change the resolution but introduces input lag, whilst non-scaler keeps the response time intact at the cost of having to change the native resolution (if required) via the graphics control panel of the attached computer.

Having said that, VESA just love making up standards for just about everything, so I could see them jumping on it, producing a slew of white papers and working groups, and ratifying a final specification sometime around 2019.

GhostRyder GhostRyder said:

If your monitor was at 120Hz and your graphics card was outputting 100fps you would get tearing. If you put V-Sync on you'd lose a lot of frames per second. G-Sync solves exactly this by making the monitor sync at 100Hz when your graphics card can only output 100fps; that is why in all their demos you can barely see any difference at all from 60fps to 30fps.

We already technically have Dynamic V-Sync, Burty; I mean, if you think about it, the tech exists to do something like this as it is. This is cool, but I will wait to see if and how well it works in the long run. Anything beyond 60Hz hasn't seemed to give much benefit yet, and even so, most cards can't handle games at that level without dropping the graphics settings. This sounds like a good concept, but we need to wait to see how well it works and when the tech catches up to make it easily obtainable, at least on the high end.

Yes though, I understand this is built into the monitors and maybe it will be a huge improvement, but until I see it I can't say much about it.

Whether you're talking about Mantle, G-Sync, TrueAudio, or whatever, at the moment it's all talk until we see some results from whichever side is producing these new things.

LMAO. That has to be the dumbest thing anyone working in the tech industry could say. Little success because the largest GPU maker on the planet has access to it? nVIDIA is the juggernaut here, not AMD. G-Sync is a BIG deal to millions of gamers around the world.

Billions are made off of proprietary products and services. It's how you stop the other guy from copying your recipe after you did all the work. You're just mad because you're an AMD fanboy, and if AMD isn't included then you think it will fail. HA! I run AMD GPUs and I would hope it comes to all GPU makers too, but I also know how businesses run, and I'm not going to bash nVIDIA for wanting to have the better product. Think about it, why would you need 3 GPU makers if you want all their good technologies to be open? You wouldn't, and there would be no reason for price drops. Isn't that what all the AMD fanboys say they want? Competition? Well then I guess it's AMD's turn to step up and do something better. Oh right, you want everything given to you. You're living in a fantasy world.

Will G-Sync succeed being exclusive to nVIDIA? As long as gamers are experiencing stutter, lag and tearing, it will.

Agreed with Clifford on your comment; who's making the dumb comment here? You're saying this fictional, unreleased, all-talk product is going to change the world??? This is all talk at the moment; it's going to be cool if it works and when we see it for ourselves, but until then, it's just speculation.

I hope this tech becomes open so both AMD & Intel can implement it as well.

This is something that should have been solved 10 years ago, but better late than never...

I would just hate it if this becomes some proprietary tech that only sees very limited success, because it really is a game changer!

But if I'm going to invest in this it needs to come in an IPS panel, no TN crap

I'm with this guy; honestly, I wish both manufacturers would stop all this, personally. It hurts the gamers more than anything and makes buying cards extremely difficult for the average gamer. It comes down to things like "Oh I can play Batman, but now if I play Tomb Raider with TressFX on it's going to have issues" and vice versa. Either way, it's annoying to anyone and everyone.

I look forward to seeing this tech in the near future, but I also don't think it's going to change the world.

Guest said:

Well, burty, you have been infected by nvidia marketing strategy. you have zero idea how the gimmick physically works. oh my god. I wish the era of 3dfx never ended, people used to be smart back then. today, everybody gets infected by bombastic marketing bullshit talk. dont forget to suck it up. was it tom speaking? I swear I heard that voice on pcper. and he never presented any scientific information. just the old marketing talk.. how do we get out of this cr*p? only when a huge asteroid hits the poor earth, only then will the evil marketing finally end.

hahahanoobs hahahanoobs said:

Now who is making a dumb comment? Until the monitors have been reworked to support G-Sync, you don't know if it will or not. The monitor manufacturers may not agree unless it is standardized. Keeping it proprietary may not be a choice for nVidia.

It was a rhetorical question, not a comment. And there are already four monitor-making heavyweights supporting G-Sync: ASUS, ViewSonic, Philips, and BenQ.

hahahanoobs hahahanoobs said:

We already technically have Dynamic V-Sync, Burty; I mean, if you think about it, the tech exists to do something like this as it is. This is cool, but I will wait to see if and how well it works in the long run. Anything beyond 60Hz hasn't seemed to give much benefit yet, and even so, most cards can't handle games at that level without dropping the graphics settings. This sounds like a good concept, but we need to wait to see how well it works and when the tech catches up to make it easily obtainable, at least on the high end.

Yes though, I understand this is built into the monitors and maybe it will be a huge improvement, but until I see it I can't say much about it.

Whether you're talking about Mantle, G-Sync, TrueAudio, or whatever, at the moment it's all talk until we see some results from whichever side is producing these new things.

Agreed with Clifford on your comment; who's making the dumb comment here? You're saying this fictional, unreleased, all-talk product is going to change the world??? This is all talk at the moment; it's going to be cool if it works and when we see it for ourselves, but until then, it's just speculation.

No benefit going higher than 60Hz? LOL, clearly you have NO idea what you're talking about, yet you want to argue with someone who actually does. That's smart.

BenQ, ASUS, Philips and Viewsonic are on board because it does work. The fix comes from the GPU driver in addition to the G-Sync module. Stop speculating and read/watch the proof already online.

The buddy I replied to said it was huge, so where do you get off singling me out? Oh right, because I called him out on everything but that part.

I'm with this guy; honestly, I wish both manufacturers would stop all this, personally. It hurts the gamers more than anything and makes buying cards extremely difficult for the average gamer. It comes down to things like "Oh I can play Batman, but now if I play Tomb Raider with TressFX on it's going to have issues" and vice versa. Either way, it's annoying to anyone and everyone.

I look forward to seeing this tech in the near future, but I also don't think it's going to change the world.

^So eliminating stutter, lag and tearing is no big deal? Gotcha. Meanwhile, nVIDIA spent money on coming up with a fix for something that has plagued 3D gaming since the beginning, and they should just give that tech out to anyone that wants it, so they can sell more GPUs? Okay, let me know when you start a business, so I can watch it fail in the first 6 months because you gave away what you worked so hard on to your competitors.

*sheesh*

hahahanoobs hahahanoobs said:

If your monitor was at 120Hz and your graphics card was outputting 100fps you would get tearing. If you put V-Sync on you'd lose a lot of frames per second. G-Sync solves exactly this by making the monitor sync at 100Hz when your graphics card can only output 100fps; that is why in all their demos you can barely see any difference at all from 60fps to 30fps.

No, that is not what G-Sync is. The module (and Kepler driver) forces the monitor to refresh only when the frame from the GPU buffer is ready. How it is now, the monitor is on a timer. Every 16ms it refreshes no matter what. Hence the stutter, lag and tearing we hate so much.

Also, turning Vsync on eliminates tearing, but you get stuttering and lag. Turning Vsync off you get tearing. G-Sync doesn't make you choose one over the other.

Watch and learn, instead of going by speculation. This is the internet, and not having the answer is not an excuse to make stuff up.

GhostRyder GhostRyder said:

Ok first of all, nice post boosting once again, you do this on every thread you post on.

Second, are you kidding? You have no idea what you're speaking of. I've played on a 120Hz setup; the benefits are very limited and most games cause the card's framerate to drop below 120fps. Except on a crazy tri/quad SLI/CFX setup on a 1080p display that refreshes at 120Hz, it is simply not feasible, and the benefits are limited because the game is just going to be skipping around unless you sacrifice the quality, which in turn ruins the point of a high-end setup. Stop pretending to be an expert and post boosting constantly while calling everyone ******. I have a 3D monitor that refreshes at 120Hz and I can set up a 120Hz mode; it was hard to run and the benefits in games were extremely limited, as going for a higher resolution setup was a better idea, and most people tend to agree with quality over quantity in the respect of higher resolution vs a 120Hz setup.

Well, burty, you have been infected by nvidia marketing strategy. you have zero idea how the gimmick physically works. oh my god. I wish the era of 3dfx never ended, people used to be smart back then. today, everybody gets infected by bombastic marketing bullshit talk. dont forget to suck it up. was it tom speaking? I swear I heard that voice on pcper. and he never presented any scientific information. just the old marketing talk.. how do we get out of this cr*p? only when a huge asteroid hits the poor earth, only then will the evil marketing finally end.

Umm, first of all guest, don't insult Burty because he likes Nvidia; it's his choice, get over it. If this is a marketing ploy, then woopty freekin do, life moves on as usual. This idea is cool and could be an awesome thing; however, until it's seen in the real world by normal users, it's just talk.

@Burty117, I do, however, disagree with your statement about the R9 290X.

Burty117 Burty117, TechSpot Chancellor, said:

No, that is not what G-Sync is. The module (and Kepler driver) forces the monitor to refresh only when the frame from the GPU buffer is ready. How it is now, the monitor is on a timer. Every 16ms it refreshes no matter what. Hence the stutter, lag and tearing we hate so much.

Also, turning Vsync on eliminates tearing, but you get stuttering and lag. Turning Vsync off you get tearing. G-Sync doesn't make you choose one over the other.

Watch and learn, instead of going by speculation. This is the internet, and not having the answer is not an excuse to make stuff up.

Yeah, I kinda watched the whole presentation days ago, and I know exactly how it works, thank you very much. They said, right there in the presentation, that they sync the refresh rate of the screen to the frames being output from your graphics card, dynamically in real time, to stop tearing, stuttering and lag. I understand the timings behind it etc...

So yes, that is exactly what G-Sync is.

@Burty117, I do, however, disagree with your statement about the R9 290X.

You have to admit though, it is a bit disappointing so far. I was genuinely ready for a "Titan killer" as that was the figure AMD were throwing around. I guess if Mantle takes off and is supported in more than one game (and it actually makes a considerable difference) it may change; still can't get over the fact it eats an extra 80 watts though!

4 people like this |
Staff
Per Hansson Per Hansson, TS Server Guru, said:

I would just hate it if this becomes some proprietary tech that only sees very limited success, because it really is a game changer!

LMAO. That has to be the dumbest thing anyone working in the tech industry could say. Little success because the largest GPU maker on the planet has access to it? nVIDIA is the juggernaut here, not AMD. G-Sync is a BIG deal to millions of gamers around the world.

Billions are made off of proprietary products and services. It's how you stop the other guy from copying your recipe after you did all the work. You're just mad because you're an AMD fanboy, and if AMD isn't included then you think it will fail. HA! I run AMD GPUs and I would hope it comes to all GPU makers too, but I also know how businesses run, and I'm not going to bash nVIDIA for wanting to have the better product. Think about it, why would you need 3 GPU makers if you want all their good technologies to be open? You wouldn't, and there would be no reason for price drops. Isn't that what all the AMD fanboys say they want? Competition? Well then I guess it's AMD's turn to step up and do something better. Oh right, you want everything given to you. You're living in a fantasy world.

Will G-Sync succeed being exclusive to nVIDIA? As long as gamers are experiencing stutter, lag and tearing, it will.

Before you make a bigger fool of yourself, perhaps you should look in my profile to see who the manufacturer of my graphics card is.

I prefer open PC standards; if it were not for them we would not have the ability to upgrade our PCs as we do.

Back in the days before the PCI & ISA standards existed, you would have several proprietary & incompatible standards.

This made life for the user wanting to upgrade to the latest & greatest much more expensive & difficult.

Then again, I would not expect someone who replies the way you did to be able to comprehend this.

hahahanoobs hahahanoobs said:

Before you make a bigger fool of yourself, perhaps you should look in my profile to see who the manufacturer of my graphics card is.

I prefer open PC standards; if it were not for them we would not have the ability to upgrade our PCs as we do.

Back in the days before the PCI & ISA standards existed, you would have several proprietary & incompatible standards.

This made life for the user wanting to upgrade to the latest & greatest much more expensive & difficult.

Then again, I would not expect someone who replies the way you did to be able to comprehend this.

And you're still living in the past.

hahahanoobs hahahanoobs said:

Yeah, I kinda watched the whole presentation days ago, and I know exactly how it works, thank you very much. They said, right there in the presentation, that they sync the refresh rate of the screen to the frames being output from your graphics card, dynamically in real time, to stop tearing, stuttering and lag. I understand the timings behind it etc...

So yes, that is exactly what G-Sync is.

If you knew what it was you wouldn't have said using a 120Hz monitor is the same thing.

2 people like this | cliffordcooley cliffordcooley, TechSpot Paladin, said:

And you're still living in the past.

And you are not learning from past mistakes!

1 person liked this | Burty117 Burty117, TechSpot Chancellor, said:

If you knew what it was you wouldn't have said using a 120Hz monitor is the same thing.

I didn't put anything of the sort in my post, which is quoted below for reference:

"If your monitor was at 120Hz and your graphics card was outputting 100fps you would get tearing. If you put V-Sync on you'd lose a lot of frames per second. G-Sync solves exactly this by making the monitor sync at 100Hz when your graphics card can only output 100fps; that is why in all their demos you can barely see any difference at all from 60fps to 30fps."

GhostRyder GhostRyder said:

You have to admit though, it is a bit disappointing so far. I was genuinely ready for a "Titan killer" as that was the figure AMD were throwing around. I guess if Mantle takes off and is supported in more than one game (and it actually makes a considerable difference) it may change; still can't get over the fact it eats an extra 80 watts though!

Maybe; the extra watts make it a bit more hungry in the power consumption area, but I don't really see this as a big issue so long as it cools itself. In the benchmarks from that source, it seems close to the Titan (like 30 points difference in Fire Strike), so it's pretty good considering it's going to be significantly cheaper than a Titan and it's faster than a 780 (even if the margin is not that huge). Plus I'm also keeping in mind this card is not officially released and that fully revised drivers aren't ready yet, so this of course is subject to change. I'm going to make my full judgement once people start getting their hands on them and seeing how they truly perform. But that's just me...

Anyway, also @Burty117, is it just me or does this sound like a more automated Dynamic V-Sync? Sounds cool if it is, because to me that sounds like something that could benefit us more in the long run once we start getting further into the beyond-60Hz range. Having this on a hardware level along with software could provide some interesting results in the near future; I really want to see this in action for myself.

dividebyzero dividebyzero, trainee n00b, said:

I didn't put anything of the sort in my post, which is quoted below for reference:

"If your monitor was at 120Hz and your graphics card was outputting 100fps you would get tearing. If you put V-Sync on you'd lose a lot of frames per second. G-Sync solves exactly this by making the monitor sync at 100Hz when your graphics card can only output 100fps; that is why in all their demos you can barely see any difference at all from 60fps to 30fps."

Yes. [link] The GPU dictates the framerate; G-Sync overrides the monitor's native refresh rate to achieve parity with the rendered frame output.

Nvidia is also releasing the G-Sync module as a standalone unit, so potentially any current DP optioned monitor the user has can be modded to use G-Sync. As for setup cost, I'm reminded of the initial Eyefinity investments that required a (then) rather sizeable expenditure of cash for suitable active DP adapters if the card purchased did not have the accessories bundled.

With accessories/peripherals, wider uptake equals manufacturing costs amortized over a larger number of units and wider range of second-source vendors- HDMI cables/peripherals being the classic example of what happens even when royalty fees are applicable.

hahahanoobs hahahanoobs said:

I didn't put anything of the sort in my post, which is quoted below for reference:

"If your monitor was at 120Hz and your graphics card was outputting 100fps you would get tearing. If you put V-Sync on you'd lose a lot of frames per second. G-Sync solves exactly this by making the monitor sync at 100Hz when your graphics card can only output 100fps; that is why in all their demos you can barely see any difference at all from 60fps to 30fps."

lol. Not even close. G-Sync is not static at ANY refresh rate.

edit: think of how adaptive v-sync works. It's up and down depending on the framerate. G-Sync is similar to that, but better, because it doesn't make you choose between Vsync being on or off, because there are pros and cons to both. G-Sync eliminates the cons.

3 people like this | dividebyzero dividebyzero, trainee n00b, said:

lol. Not even close. G-Sync is not static at ANY refresh rate.

Are you being deliberately obtuse?

Burty was obviously using the figures quoted as an example...unless you are of the opinion that the graphics card renders at an unwavering 100 fps.

:smh:

hahahanoobs hahahanoobs said:

Are you being deliberately obtuse?

Burty was obviously using the figures quoted as an example...unless you are of the opinion that the graphics card renders at an unwavering 100 fps.

:smh:

"Burty was obviously using the figures quoted as an example..."

Probably the most non technical example used in a technical discussion.

"unless you are of the opinion that the graphics card renders at an unwavering 100 fps."

Read: G-Sync is not static at ANY refresh rate

AKA, the monitor displays the frames as they are ready from the graphics card's frame buffer, using the G-Sync module which replaces the traditional scaler.
