Nvidia bids adieu to V-Sync limitations with G-Sync monitor technology

So, if I am understanding correctly, the number of Hz of a screen is the number of frames shown on screen per second?

So if you have a 120Hz monitor and the GPU outputs a maximum of 100 FPS in a particular game, then you won't get any benefit from G-Sync? Or maybe a higher number of Hz than FPS results in more blurred images, etc.?
 
Technically, things in that arena have been proprietary since Intel shook AMD, Cyrix, IDT, etc. off their coattails and the production of pin-compatible CPUs ended at Sockets 4, 5, and 7.
Yeah, I know, and I was furious when they did it. Furious to the point that, at that very moment, I made a decision to stick with Intel (little did I know they were the ones I should have been furious with).
 
So if you have a 120Hz monitor and the GPU outputs a maximum of 100 FPS in a particular game, then you won't get any benefit from G-Sync? Or maybe a higher number of Hz than FPS results in more blurred images, etc.?
In order for the monitor to sync with the GPU, it must first be able to refresh at a faster rate than the GPU. If the monitor can not refresh as quickly as the GPU, I'm not sure if G-Sync would function properly. So I doubt they will be putting G-Sync on a monitor that is not capable of at least 120Hz.
 
In order for the monitor to sync with the GPU, it must first be able to refresh at a faster rate than the GPU. If the monitor can not refresh as quickly as the GPU, I'm not sure if G-Sync would function properly. So I doubt they will be putting G-Sync on a monitor that is not capable of at least 120Hz.

Yes, exactly, but if the monitor is refreshing at 120Hz and the GPU is capable of maxing out at 100 FPS in a particular game, then you don't need G-Sync? On the other hand, I think G-Sync is limited by the maximum refresh rate of the monitor (it cannot go beyond it).
 
Yes, exactly, but if the monitor is refreshing at 120Hz and the GPU is capable of maxing out at 100 FPS in a particular game
Under that scenario there will be 20 frames every second posted twice, causing a slight stutter. Posting the same frame twice in a row on a 120Hz monitor is, for that moment, the same as dropping the refresh to 60Hz. The ideal solution would be to match the synchronization, so that every frame is complete and updated.
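To make that arithmetic concrete, here is a rough Python sketch (purely illustrative, using the example figures above; this is not how a display controller is actually implemented):

```python
# Rough sketch (illustrative only): map 100 rendered frames per second onto
# 120 fixed refresh slots per second. Each refresh shows the most recently
# completed frame, so some frames end up being shown for two refreshes.
REFRESH_HZ = 120
RENDER_FPS = 100

shown = [int(slot * RENDER_FPS / REFRESH_HZ) for slot in range(REFRESH_HZ)]
repeats = sum(1 for a, b in zip(shown, shown[1:]) if a == b)

print(f"{repeats} of {REFRESH_HZ} refreshes repeat the previous frame")
# -> 20 of 120 refreshes repeat the previous frame
```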
 
Yes, exactly, but if the monitor is refreshing at 120Hz and the GPU is capable of maxing out at 100 FPS in a particular game, then you don't need G-Sync? On the other hand, I think G-Sync is limited by the maximum refresh rate of the monitor (it cannot go beyond it).

If your monitor was at 120Hz and your graphics card was outputting 100fps, you would get tearing. If you put V-Sync on, you'd lose a lot of frames per second. G-Sync solves exactly this by making the monitor sync at 100Hz when your graphics card can only output 100fps, which is why in all their demos you can barely see any difference at all between 60fps and 30fps.
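For anyone wondering why V-Sync "loses frames", here is a back-of-the-envelope sketch of classic double-buffered V-Sync on a 60Hz panel (toy numbers only; triple buffering and driver tricks change the details):

```python
import math

# Toy model of double-buffered V-Sync: a frame can only be shown on a refresh
# boundary, so a frame that takes even slightly longer than one refresh
# interval has to wait for the next boundary.
REFRESH_HZ = 60
interval_ms = 1000.0 / REFRESH_HZ                # ~16.7 ms per refresh

for render_ms in (15.0, 18.0, 25.0, 35.0):
    slots = math.ceil(render_ms / interval_ms)   # refreshes the frame occupies
    print(f"{render_ms:4.0f} ms frame -> shown for {slots} refresh(es), "
          f"~{REFRESH_HZ / slots:.0f} fps effective")
# 15 ms -> 60 fps, 18 ms -> 30 fps, 25 ms -> 30 fps, 35 ms -> 20 fps
```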
 
I would just hate it if this becomes some proprietary tech that only sees very limited success, because it really is a game changer!

LMAO. That has to be the dumbest thing anyone working in the tech industry could say. Little success because the largest GPU maker on the planet has access to it? nVIDIA is the juggernaut here, not AMD. G-Sync is a BIG deal to millions of gamers around the world.

Billions are made off of proprietary products and services. It's how you stop the other guy from copying your recipe after you did all the work. You're just mad because you're an AMD fanboy, and if AMD isn't included then you think it will fail. HA! I run AMD GPUs and I would hope it comes to all GPU makers too, but I also know how businesses run, and I'm not going to bash nVIDIA for wanting to have the better product. Think about it, why would you need 3 GPU makers if you want all their good technologies to be open? You wouldn't, and there would be no reason for price drops. Isn't that what all the AMD fanboys say they want? Competition? Well then I guess it's AMD's turn to step up and do something better. Oh right, you want everything given to you. You're living in a fantasy world.

Will G-Sync succeed while being exclusive to nVIDIA? As long as gamers are experiencing stutter, lag and tearing, it will.
 
Think about it, why would you need 3 GPU makers if you want all their good technologies to be open?
Now who is making a dumb comment? Until the monitors have been reworked to support G-Sync, you don't know whether it will succeed or not. The monitor manufacturers may not agree unless it is standardized. Keeping it proprietary may not be a choice for nVidia.
 
Might be more a case of an option rather than a standard. Doesn't seem greatly removed from some screen manufacturers providing scaler and non-scaler options. A scaler gives you the option to change the resolution but introduces input lag, whilst a non-scaler model keeps the response time intact at the cost of having to change the resolution (if required) via the graphics control panel of the attached computer.

Having said that, VESA just loves making up standards for just about everything, so I could see them jumping on it, producing a slew of white papers and working groups, and a final specification ratification sometime around 2019.
 
If your monitor was at 120Hz and your graphics card was outputting 100fps, you would get tearing. If you put V-Sync on, you'd lose a lot of frames per second. G-Sync solves exactly this by making the monitor sync at 100Hz when your graphics card can only output 100fps, which is why in all their demos you can barely see any difference at all between 60fps and 30fps.
We already technically have Dynamic V-Sync, Burty; I mean, if you think about it, the tech to do something like this already exists. This is cool, but I will wait to see if and how well it works in the long run. Anything beyond 60Hz hasn't seemed to give much benefit yet, and even so, most cards can't handle games at that level without dropping the graphics settings. This sounds like a good concept, but we need to wait to see how well it works and when the tech catches up to make it easily obtainable, at least on the high end.

That said, I understand this is built into the monitors and maybe it will be a huge improvement, but until I see it I can't say much about it.

Whether you're talking about Mantle, G-Sync, TrueAudio, or whatever, at the moment it's all talk until we see some results from whichever side is producing these new things.

LMAO. That has to be the dumbest thing anyone working in the tech industry could say. Little success because the largest GPU maker on the planet has access to it? nVIDIA is the juggernaut here, not AMD. G-Sync is a BIG deal to millions of gamers around the world.

Billions are made off of proprietary products and services. It's how you stop the other guy from copying your recipe after you did all the work. You're just mad because you're an AMD fanboy, and if AMD isn't included then you think it will fail. HA! I run AMD GPUs and I would hope it comes to all GPU makers too, but I also know how businesses run, and I'm not going to bash nVIDIA for wanting to have the better product. Think about it, why would you need 3 GPU makers if you want all their good technologies to be open? You wouldn't, and there would be no reason for price drops. Isn't that what all the AMD fanboys say they want? Competition? Well then I guess it's AMD's turn to step up and do something better. Oh right, you want everything given to you. You're living in a fantasy world.

Will G-Sync succeed while being exclusive to nVIDIA? As long as gamers are experiencing stutter, lag and tearing, it will.

Agreed with Clifford on your comment: who's making the dumb comment here? You're saying this fictional, unreleased, all-talk product is going to change the world??? It's all talk at the moment; it's going to be cool if it works and when we see it for ourselves, but until then, it's just speculation.

I hope this tech becomes open so both AMD & Intel can implement it as well.
This is something that should have been solved 10 years ago, but better now than never...
I would just hate it if this becomes some proprietary tech that only sees very limited success, because it really is a game changer!
But if I'm going to invest in this, it needs to come in an IPS panel, no TN crap.

I'm with this guy; honestly, I wish both manufacturers would stop all this. It hurts gamers more than anything and makes buying cards extremely difficult for the average gamer. It comes down to things like "Oh, I can play Batman, but now if I play Tomb Raider with TressFX on it's going to have issues" and vice versa. Either way, it's annoying to anyone and everyone.

I look forward to seeing this tech in the near future, but I also don't think it's going to change the world.
 
Well, Burty, you have been infected by Nvidia's marketing strategy. You have zero idea how the gimmick physically works. Oh my god. I wish the era of 3dfx had never ended; people used to be smart back then. Today, everybody gets infected by bombastic marketing bullshit talk. Don't forget to suck it up. Was it Tom speaking? I swear I heard that voice on PCPer, and he never presented any scientific information, just the old marketing talk. How do we get out of this cr*p? Only when a huge asteroid hits the poor Earth, only then will the evil marketing finally end.
 
Now who is making a dumb comment? Until the monitors have been reworked to support G-Sync, you don't know whether it will succeed or not. The monitor manufacturers may not agree unless it is standardized. Keeping it proprietary may not be a choice for nVidia.


It was a rhetorical question, not a comment. And there are already four monitor-making heavyweights supporting G-Sync: ASUS, Viewsonic, Philips, and BenQ.
 
We already technically have Dynamic V-Sync, Burty; I mean, if you think about it, the tech to do something like this already exists. This is cool, but I will wait to see if and how well it works in the long run. Anything beyond 60Hz hasn't seemed to give much benefit yet, and even so, most cards can't handle games at that level without dropping the graphics settings. This sounds like a good concept, but we need to wait to see how well it works and when the tech catches up to make it easily obtainable, at least on the high end.

That said, I understand this is built into the monitors and maybe it will be a huge improvement, but until I see it I can't say much about it.

Whether you're talking about Mantle, G-Sync, TrueAudio, or whatever, at the moment it's all talk until we see some results from whichever side is producing these new things.



Agreed with Clifford on your comment: who's making the dumb comment here? You're saying this fictional, unreleased, all-talk product is going to change the world??? It's all talk at the moment; it's going to be cool if it works and when we see it for ourselves, but until then, it's just speculation.

No benefit going higher than 60Hz? LOL, clearly you have NO idea what you're talking about, yet you want to argue with someone who actually does. That's smart.

BenQ, ASUS, Philips and Viewsonic are on board because it does work. The fix comes from the GPU driver in addition to the G-Sync module. Stop speculating and read/watch the proof already online.

The buddy I replied to said it was huge, so where do you get off singling me out? Oh right, because I called him out on everything but that part.



I'm with this guy; honestly, I wish both manufacturers would stop all this. It hurts gamers more than anything and makes buying cards extremely difficult for the average gamer. It comes down to things like "Oh, I can play Batman, but now if I play Tomb Raider with TressFX on it's going to have issues" and vice versa. Either way, it's annoying to anyone and everyone.

I look forward to seeing this tech in the near future, but I also don't think it's going to change the world.

^So eliminating stutter, lag and tearing is no big deal? Gotcha. Meanwhile, nVIDIA spent money coming up with a fix for a problem that has plagued 3D gaming since the beginning, and they should just give that tech out to anyone that wants it, so they can sell more GPUs? Okay, let me know when you start a business, so I can watch it fail in the first 6 months because you gave away what you worked so hard on to your competitors.

*sheesh*
 
If your monitor was at 120Hz and your graphics card was outputting 100fps, you would get tearing. If you put V-Sync on, you'd lose a lot of frames per second. G-Sync solves exactly this by making the monitor sync at 100Hz when your graphics card can only output 100fps, which is why in all their demos you can barely see any difference at all between 60fps and 30fps.


No, that is not what G-Sync is. The module (and Kepler driver) forces the monitor to refresh only when the frame from the GPU buffer is ready. As it is now, the monitor is on a timer: every 16ms (at 60Hz) it refreshes no matter what. Hence the stutter, lag and tearing we hate so much.

Also, turning V-Sync on eliminates tearing, but you get stuttering and lag. Turning V-Sync off, you get tearing. G-Sync doesn't make you choose one over the other.

Watch and learn, instead of going by speculation. This is the internet, and not having the answer is not an excuse to make stuff up.
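As a rough illustration of the difference between the fixed timer described above and refresh-on-frame-ready, here is a toy Python timeline (an assumption-laden sketch, not Nvidia's actual implementation):

```python
# Toy timeline (in milliseconds) for a GPU that finishes a frame every 10 ms.
frame_ready = [10 * (n + 1) for n in range(12)]          # 10, 20, ..., 120 ms

# Fixed-rate display: scan-out fires on a ~16.7 ms timer regardless of the GPU.
# Each scan-out shows the newest frame completed before it; frames finishing
# mid-scan-out are what produce tearing, and some frames get skipped entirely.
for n in range(1, 8):
    t = n * 1000.0 / 60.0
    newest = max(f for f in frame_ready if f <= t)
    print(f"{t:6.1f} ms: fixed 60Hz refresh shows the frame that finished at {newest} ms")

# G-Sync-style variable refresh (toy model): scan-out is triggered by frame
# completion, so every rendered frame is shown once, as soon as it is done.
for f in frame_ready[:7]:
    print(f"{f:6.1f} ms: variable refresh shows the frame that finished at {f} ms")
```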
 
OK, first of all, nice post boosting once again; you do this on every thread you post on.

Second, are you kidding? You have no idea what you're speaking of. I've played on a 120Hz setup; the benefits are very limited, and most games cause the card's framerate to drop below 120fps. Except on a crazy tri/quad SLI/CFX setup driving a 1080p display that refreshes at 120Hz, it is simply not feasible, and the benefits are limited because the game is just going to be skipping around unless you sacrifice the quality, which in turn ruins the point of a high-end setup. Stop pretending to be an expert and post boosting constantly while calling everyone *****s. I have a 3D monitor that refreshes at 120Hz and I can set up a 120Hz mode; it was hard to run, and the benefits in games were extremely limited, as going for a higher-resolution setup was a better idea. Most people tend to agree with quality over quantity when it comes to higher resolution vs a 120Hz setup.

Well, Burty, you have been infected by Nvidia's marketing strategy. You have zero idea how the gimmick physically works. Oh my god. I wish the era of 3dfx had never ended; people used to be smart back then. Today, everybody gets infected by bombastic marketing bullshit talk. Don't forget to suck it up. Was it Tom speaking? I swear I heard that voice on PCPer, and he never presented any scientific information, just the old marketing talk. How do we get out of this cr*p? Only when a huge asteroid hits the poor Earth, only then will the evil marketing finally end.
Umm, first of all, guest, don't insult Burty because he likes NVidia; it's his choice, get over it. If this is a marketing ploy, then whoop-de-freakin'-do, life moves on as usual. This idea is cool and could be an awesome thing; however, until it's seen in the real world by normal users, it's just talk.

Burty117, I do, however, disagree with your statement towards the R290X.
 
No, that is not what G-Sync is. The module (and Kepler driver) forces the monitor to refresh only when the frame from the GPU buffer is ready. As it is now, the monitor is on a timer: every 16ms (at 60Hz) it refreshes no matter what. Hence the stutter, lag and tearing we hate so much.



Also, turning V-Sync on eliminates tearing, but you get stuttering and lag. Turning V-Sync off, you get tearing. G-Sync doesn't make you choose one over the other.



Watch and learn, instead of going by speculation. This is the internet, and not having the answer is not an excuse to make stuff up.


Yeah, I kinda watched the whole presentation days ago, and I know exactly how it works, thank you very much. They said, right there in the presentation, that they sync the refresh rate of the screen to the frames being output from your graphics card, dynamically and in real time, to stop tearing, stuttering and lag. I understand the timings behind it, etc...

So yes, that is exactly what G-Sync is.

Burty117, I do, however, disagree with your statement towards the R290X.


You have to admit, though, it is a bit disappointing so far. I was genuinely ready for a "Titan killer", as that was the figure AMD were throwing around :( I guess if Mantle takes off and is supported in more than one game (and it actually makes a considerable difference) that may change; still can't get over the fact it eats an extra 80 watts, though!
 
I would just hate it if this becomes some proprietary tech that only sees very limited success, because it really is a game changer!

LMAO. That has to be the dumbest thing anyone working in the tech industry could say. Little success because the largest GPU maker on the planet has access to it? nVIDIA is the juggernaut here, not AMD. G-Sync is a BIG deal to millions of gamers around the world.

Billions are made off of proprietary products and services. It's how you stop the other guy from copying your recipe after you did all the work. You're just mad because you're an AMD fanboy, and if AMD isn't included then you think it will fail. HA! I run AMD GPUs and I would hope it comes to all GPU makers too, but I also know how businesses run, and I'm not going to bash nVIDIA for wanting to have the better product. Think about it, why would you need 3 GPU makers if you want all their good technologies to be open? You wouldn't, and there would be no reason for price drops. Isn't that what all the AMD fanboys say they want? Competition? Well then I guess it's AMD's turn to step up and do something better. Oh right, you want everything given to you. You're living in a fantasy world.

Will G-Sync succeed while being exclusive to nVIDIA? As long as gamers are experiencing stutter, lag and tearing, it will.
Before you make a bigger fool of yourself, perhaps you should check my profile to see who the manufacturer of my graphics card is.
I prefer open PC standards; if it were not for them, we would not have the ability to upgrade our PCs as we do.
Back in the days before the PCI & ISA standards existed, you had several proprietary & incompatible standards.
This made life much more expensive & difficult for the user wanting to upgrade to the latest & greatest.
Then again, I would not expect someone who replies the way you did to be able to comprehend this.
 
Before you make a bigger fool of yourself, perhaps you should check my profile to see who the manufacturer of my graphics card is.

I prefer open PC standards; if it were not for them, we would not have the ability to upgrade our PCs as we do.

Back in the days before the PCI & ISA standards existed, you had several proprietary & incompatible standards.

This made life much more expensive & difficult for the user wanting to upgrade to the latest & greatest.

Then again, I would not expect someone who replies the way you did to be able to comprehend this.

And you're still living in the past.
 
Yeah, I kinda watched the whole presentation days ago, and I know exactly how it works, thank you very much. They said, right there in the presentation, that they sync the refresh rate of the screen to the frames being output from your graphics card, dynamically and in real time, to stop tearing, stuttering and lag. I understand the timings behind it, etc...



So yes, that is exactly what G-Sync is.

If you knew what it was you wouldn't have said using a 120Hz monitor is the same thing.
 
If you knew what it was you wouldn't have said using a 120Hz monitor is the same thing.

I didn't put anything of the sort in my post, which is quoted below for reference:

"If your monitor was at 120Hz and your graphics card was outputting 100fps you would get tearing, If you put V-Sync on you'd lose a lot of frame per seconds, G-Sync solves exactly this by making the monitor sync at 100Hz when your graphics card can only output 100fps, that is why in all their demos you can barely see any difference at all from 60fps to 30fps."
 
You have to admit, though, it is a bit disappointing so far. I was genuinely ready for a "Titan killer", as that was the figure AMD were throwing around :( I guess if Mantle takes off and is supported in more than one game (and it actually makes a considerable difference) that may change; still can't get over the fact it eats an extra 80 watts, though!

Maybe; the extra watts make it a bit more hungry in the power consumption area, but I don't really see this as a big issue so long as it cools itself. In the benchmarks from that source, it seems close to the Titan (like a 30-point difference in Fire Strike), so it's pretty good considering it's going to be significantly cheaper than a Titan and it's faster than a 780 (even if the margin is not that huge). Plus, I'm also keeping in mind that this card is not officially released and that fully revised drivers are still being readied, so this of course is subject to change. I'm going to make my full judgements once people start getting their hands on them and seeing how they truly perform. But that's just me...

Anyway, Burty117, is it just me or does this sound like a more automated Dynamic V-Sync? Sounds cool if it is, because to me that sounds like something that could benefit us more in the long run once we start getting further into the beyond-60Hz range. Having this on a hardware level along with software could provide some interesting results in the near future. I really want to see this in action for myself.
 
I didn't put anything of the sort in my post, which is quoted below for reference:
"If your monitor was at 120Hz and your graphics card was outputting 100fps you would get tearing, If you put V-Sync on you'd lose a lot of frame per seconds, G-Sync solves exactly this by making the monitor sync at 100Hz when your graphics card can only output 100fps, that is why in all their demos you can barely see any difference at all from 60fps to 30fps."
Yes. The G-Sync unit works in concert with the system GPU: the GPU dictates the framerate, and G-Sync overrides the monitor's native refresh rate to achieve parity with the rendered frame output.

Nvidia is also releasing the G-Sync module as a standalone unit, so potentially any current DP-equipped monitor the user has can be modded to use G-Sync. As for setup cost, I'm reminded of the initial Eyefinity investments that required a (then) rather sizeable expenditure of cash for suitable active DP adapters if the card purchased did not have the accessories bundled.
With accessories/peripherals, wider uptake means manufacturing costs amortized over a larger number of units and a wider range of second-source vendors; HDMI cables/peripherals are the classic example of what happens even when royalty fees are applicable.
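As a toy model of that "parity" idea, the effective scan-out rate simply tracks the GPU frame rate within whatever range the panel supports (the 30-144 Hz window below is illustrative, not a G-Sync specification):

```python
def effective_refresh_hz(gpu_fps: float,
                         panel_min_hz: float = 30.0,
                         panel_max_hz: float = 144.0) -> float:
    """Toy model: with a variable-refresh display the scan-out rate tracks the
    GPU frame rate, but it cannot leave the panel's own supported range.
    The 30-144 Hz window here is an illustrative assumption, not a spec."""
    return max(panel_min_hz, min(gpu_fps, panel_max_hz))

print(effective_refresh_hz(100))   # 100.0 -> the panel refreshes at ~100 Hz
print(effective_refresh_hz(160))   # 144.0 -> capped at the panel's maximum
print(effective_refresh_hz(20))    # 30.0  -> cannot go below the panel's minimum
```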
 
I didn't put anything of the sort in my post, which is quoted below for reference:



"If your monitor was at 120Hz and your graphics card was outputting 100fps you would get tearing, If you put V-Sync on you'd lose a lot of frame per seconds, G-Sync solves exactly this by making the monitor sync at 100Hz when your graphics card can only output 100fps, that is why in all their demos you can barely see any difference at all from 60fps to 30fps."

lol. Not even close. G-Sync is not static at ANY refresh rate.

edit: think of how Adaptive V-Sync works. It's up and down depending on the framerate. G-Sync is similar to that, but better, because it doesn't make you choose between V-Sync being on or off; there are pros and cons to both. G-Sync eliminates the cons.
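A rough sketch of the policy difference being described (my own simplification, not driver code):

```python
# Adaptive V-Sync toggles ordinary V-Sync around the panel rate; G-Sync
# instead refreshes the panel whenever a frame is ready (toy model only).
def adaptive_vsync(gpu_fps: float, refresh_hz: float) -> str:
    if gpu_fps >= refresh_hz:
        return "V-Sync on: no tearing, but added latency"
    return "V-Sync off: no added latency, but tearing comes back"

def g_sync(gpu_fps: float, panel_max_hz: float) -> str:
    rate = min(gpu_fps, panel_max_hz)
    return f"panel refreshes at ~{rate:.0f} Hz: no tearing, no added latency"

print(adaptive_vsync(45, 60))   # V-Sync off: no added latency, but tearing comes back
print(g_sync(45, 144))          # panel refreshes at ~45 Hz: no tearing, no added latency
```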
 