TechSpot

Nvidia bids adieu to V-Sync limitations with G-Sync monitor technology

By Shawn Knight
Oct 19, 2013
  1. Nvidia recently announced a new monitor technology that will completely eliminate screen tearing, stutter and lag. G-Sync is a module said to replace the scaler in LCD monitors and will first be arriving in gaming-grade displays from the likes of...

     
    Burty117 and cliffordcooley like this.
  2. hahahanoobs

    hahahanoobs TS Maniac Posts: 1,009   +103

    This is really cool. I've been researching and reviewing 120Hz monitors lately, but I'll wait until these G-Unit monitors come out first.
     
  3. killeriii

    killeriii TS Enthusiast Posts: 213   +14

    Think I'll have to get one of these monitors when they hit the shelves!
    By the sounds of it, it will work with ANY GPU.
     
  4. Lurker101

    Lurker101 TS Addict Posts: 639   +127

    Remember kids, stuttering and lag aren't the fault of shoddy drivers or poorly optimized games. It's the monitor's fault.
     
  5. Julio Franco

    Julio Franco TechSpot Editor Posts: 6,602   +358

    Edited with more specific pricing and quotes from game developers who were shown G-Sync at Nvidia's event. Looks like a promising addition to gaming monitors, hopefully for a lesser premium than that Asus-specific mod.
     
    cliffordcooley and misor like this.
  6. Burty117

    Burty117 TechSpot Chancellor Posts: 2,537   +330

    I truly hope you're joking. This stops tearing when V-Sync is off and stops stuttering when V-Sync is on, plus it lets you use as much GPU power as you want and lowers the lag between pressing a button and seeing the result on screen.
    It works, and from everything I've read, every person who's seen it cannot believe how well it works. I guess you pre-ordered an R9 290X?

    Nope, this only works with a 650 Ti Boost and up and is Nvidia-only I'm afraid (or at least, as far as I'm aware; it appears to use the AUX channel on DisplayPort to sync the frames to the refresh rate, so it may be possible to add support for AMD cards if Nvidia allows it).
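
    As a rough illustration of the timing difference being described here, below is a minimal toy sketch (made-up frame completion times and a deliberately simplified scheduler, not Nvidia's actual implementation) comparing when a finished frame reaches the screen on a fixed 60 Hz V-Sync'd panel versus a panel whose refresh follows the GPU:

        # Toy timing sketch (hypothetical numbers, not Nvidia's implementation):
        # with V-Sync on a fixed 60 Hz panel, a finished frame waits for the next
        # refresh tick; with a variable refresh, the panel scans out as soon as
        # the GPU is done. Buffering details are deliberately ignored.

        REFRESH_MS = 1000 / 60                 # fixed panel ticks every ~16.7 ms
        frame_ready_ms = [10, 30, 55, 75, 95]  # hypothetical GPU completion times

        def vsync_shown_at(ready_times, tick=REFRESH_MS):
            """Each frame is displayed on the first refresh tick after it finishes."""
            return [(int(t // tick) + 1) * tick for t in ready_times]

        def variable_refresh_shown_at(ready_times):
            """The panel refreshes the moment the frame is ready."""
            return list(ready_times)

        for ready, vs, gs in zip(frame_ready_ms,
                                 vsync_shown_at(frame_ready_ms),
                                 variable_refresh_shown_at(frame_ready_ms)):
            print(f"frame ready at {ready:5.1f} ms -> "
                  f"V-Sync shows it at {vs:5.1f} ms, variable refresh at {gs:5.1f} ms")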

    So to wrap things up, AMD pushes out a power-hungry R9 290X that is barely faster than a standard GTX 780 while using 80 watts more power, plus Mantle, which is currently supported by a single game that is being patched later this year.

    Meanwhile Nvidia puts its tech into major game engines like Unreal Engine, still has the fastest card on the market (the 780 Ti notwithstanding, of course; it may or may not be faster than the Titan) and now G-Sync, which improves every game ever made, and every game for the foreseeable future, by fixing an ancient issue.

    I just wish more details about the screens would be released! I would love a 27-inch 1440p screen with this tech that was capable of 96-144Hz, but god knows what the price on that would be!
     
    misor likes this.
  7. hahahanoobs

    hahahanoobs TS Maniac Posts: 1,009   +103

    Meanwhile, in Lurker's perfect world...
     
  8. Ranger1st

    Ranger1st TS Enthusiast Posts: 275   +77

    Nvidia, Intel and AMD are going to proprietary the gaming world to death. Reminds me of the early days of gaming, with the sound card and GFX driver "only supports XYZ" crap.
     
    freythman likes this.
  9. EEatGDL

    EEatGDL TS Booster Posts: 305   +51

    I don't want to sound like an NVIDIA fanboy. I watched AMD's three-hour-long presentation [after a while it became torturous], and something that bothered me was how much they compare everything to the competition. I told a friend and he said "all companies do that". But I've read GeForce.com articles and seen both companies' presentations and videos, and it seems like in NVIDIA's world they are the only ones - in a good sense. They talk about their achievements and proposals without even mentioning the competition, and their tech demos actually look intended for developers and seem to have a serious engineering purpose. I mean, NVIDIA compares their products against their own products, plus the usual top-vs-top competition comparison graphs; but AMD compares every single enhancement they make [TrueAudio, TressFX, full DX 11.2 support, Eyefinity, etc.] with one of the competition's [PhysX, 3D Vision, Surround, etc.] and then minimizes the competition's.
    When I watch an NVIDIA presentation it's like "let's forget about the competition and do something to improve the gaming experience". I feel more comfortable not being reminded every 10 minutes how they have killer features that crush the competition. You can see for yourself reading GeForce.com articles and you'll understand my point, and don't get me wrong: I read both sides' articles, watch both sides' presentations and own products from both sides at home.
     
  10. Burty117

    Burty117 TechSpot Chancellor Posts: 2,537   +330

    Agreed completely, the tech demo with the pendulum was awesome. If you can find still images of it you can really start to see the benefit of G-Sync, but everyone there has said you've got to see it to truly believe it. I also don't want to sound like a fanboy, but I gotta hand it to Nvidia, this is awesome.
     
    hahahanoobs and MrAnderson like this.
  11. JC713

    JC713 TS Evangelist Posts: 7,082   +920

    Now this is great tech.
     
     
  12. Per Hansson

    Per Hansson TS Server Guru Posts: 1,933   +126 Staff Member

    I hope this tech becomes open so both AMD & Intel can implement it as well.
    This is something that should have been solved 10 years ago, but now is much better than never...
    I would just hate it if this becomes some proprietary tech that only sees very limited success, because it really is a game changer!
    But if I'm going to invest in this it needs to come in an IPS panel, no TN crap.
     
  13. madboyv1

    madboyv1 TechSpot Paladin Posts: 974   +54

    Sure TN panels have improved a lot in the last 5 years, but IPS and similar panel technologies are so much better and worth the premium. Oh, and that premium has decreased over the last 5 years too (unless you're trying to get one or more of the 30" monsters). IPS for life~
     
  14. misor

    misor TS Maniac Posts: 1,026   +154

    Wow, I hope this G-Sync will further drive monitor prices down in the Philippines, and I'm likewise hoping that more companies will join in after Asus, BenQ, Philips, and ViewSonic.
     
  15. dividebyzero

    dividebyzero trainee n00b Posts: 4,968   +738

    That has been a deliberate marketing strategy from both Nvidia and Intel since Day 1, as a general rule. When was the last time you saw an Intel presentation, slide, or graphic that even acknowledges AMD's existence?
    The inference is that the company is the leader (whether true or not), and leaders need not concern themselves with the companies in their wake. You also don't dilute the brand by sharing the spotlight with a competitor if at all possible. Confidence and arrogance are qualities associated with winners, regardless of whether you sympathise with or despise the qualities themselves.

    AMD never really had a chance to be a front runner. Establishing a brand takes time and a string of successes, a luxury they would never be afforded, since Intel was founded on the back of established engineering prowess whilst AMD's foundation lay in sales and marketing (primarily of scraps from Intel's banquet table). Intel already had a string of successes before AMD even got out of the starting blocks.
    The engineering foundation of both Intel and Nvidia lends itself to long-term, unwavering goal setting. A sales and marketing foundation generally means goals are short term, with plans and products in a constant state of flux as management attempts to second-guess a future market rather than plan a product (both hardware and software ecosystem) in order to dictate a future market.
     
    EEatGDL, Burty117, tomkaten and 2 others like this.
  16. tomkaten

    tomkaten TS Booster Posts: 101   +33

    Man, this move is such a load of bull... Forcing us to spend a lot on an unneeded (in my case) GPU upgrade (650 Ti Boost and up) and a $175 premium for the display upgrade kit. And limiting us to the Green (Dark) side with this proprietary technology. I wouldn't encourage that.

    We've all been fooled into silently accepting DirectX, and look how that's turned out. OpenGL is being marginalized for no reason, even though it's at least as good, with a better rendering pipeline. All those Androids bear testimony to that. The same will happen here if we mindlessly allow Nvidia to push this platform-dependent crap. Just my 2 cents...
     
  17. Obzoleet

    Obzoleet TS Enthusiast Posts: 171   +7

    Now this is interesting indeed! :)
     
  18. dividebyzero

    dividebyzero trainee n00b Posts: 4,968   +738

    You seem to be brainwashed by consumerism. Nobody is forcing you to upgrade. It's part of the myth that people must have the next new thing that arrives on the scene.
    The monitors work as standard monitors if a compatible GPU isn't involved. More expensive than the base model screen? Yes, certainly, but if you don't think the added feature is cost effective (or you play with V-Sync enabled, or screen tearing isn't an issue for you) then no one is forcing your hand.
    Get used to it. AMD just released details of two proprietary APIs aimed exclusively at gaming (TrueAudio and Mantle). With fabrication foundry costs for smaller process nodes likely to increase greatly at 20nm, 16/20nm and 10nm FinFET, margins on card sales and return on investment are going to get very tight as IGPs/APUs eat into the volume lower-mainstream market. With less leeway in pricing (which is largely immaterial anyway, as Nvidia and AMD seem to have reached a mutually beneficial understanding), the primary differentiators become marketing bullet points and supposed features (PhysX, TressFX, TrueAudio, game bundles etc.)
     
  19. tomkaten

    tomkaten TS Booster Posts: 101   +33

    If I were "brainwashed", I would be running the latest and greatest just because it exists, which I'm not. My good old GTX 460 has served me fine to this day, since I'm not exactly the quintessential gamer. But I'd hate to have to upgrade just because a game in a series I enjoy starts working on that technology alone.

    On the other hand, what often starts as innocent progress on one side of the competition can quickly turn into an exclusive "feature" that severely limits your future freedom of choice. We've seen it before, and not just once...
    And with competition on the GPU market limited to two players, a lot can go wrong.

    It remains to be seen how exactly they will integrate this new gimmick with display makers and how we'll be affected.

    As for the last paragraph... You're right, but I would rather have companies investing more in technology and less in marketing campaigns. It's sad to see a bundle of games differentiate the rebadges of the major players...
     
  20. Burty117

    Burty117 TechSpot Chancellor Posts: 2,537   +330

    You clearly have no idea what this article is about. It stops tearing and stuttering and reduces display lag, nothing more, nothing less. Games don't "depend" on this tech, it simply improves image quality. I don't know if you've played any 3D games, but you'll notice that your framerate is never at a constant level; this syncs your screen's refresh to the frames per second being delivered, to stop tearing and the like. Games don't depend on anything from G-Sync. It does mean you could play games with even higher visuals, as it makes even 30 frames per second feel extremely smooth.
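
    A small toy example of the frame-pacing point being made here (hypothetical render times and a deliberately simplified model, not Nvidia's actual implementation): why a sub-60 fps stream judders on a fixed 60 Hz V-Sync'd panel but paces evenly when the refresh follows the GPU.

        # Toy frame-pacing sketch (hypothetical numbers, simplified model):
        # with V-Sync on a fixed 60 Hz panel, each frame is held until the next
        # refresh tick, so on-screen durations snap to multiples of ~16.7 ms and
        # alternate when the GPU runs at ~35-45 fps; with a variable refresh, the
        # panel flips when the frame arrives, so pacing follows the render times.
        # Buffering and GPU stalls are deliberately ignored.

        REFRESH_MS = 1000 / 60            # fixed refresh tick every ~16.7 ms
        render_ms = [24, 26, 30, 22, 28]  # hypothetical per-frame GPU cost

        def vsync_frame_durations(render_times, tick=REFRESH_MS):
            """On-screen duration of each frame when flips only happen on ticks."""
            t, shown_at = 0.0, []
            for cost in render_times:
                t += cost                                     # frame finishes rendering here
                shown_at.append((int(t // tick) + 1) * tick)  # wait for the next refresh tick
            return [b - a for a, b in zip(shown_at, shown_at[1:])]

        def variable_refresh_durations(render_times):
            """Each frame stays up roughly as long as the next one takes to render."""
            return render_times[1:]

        print("V-Sync on 60 Hz :", [round(d, 1) for d in vsync_frame_durations(render_ms)])
        print("Variable refresh:", [round(d, 1) for d in variable_refresh_durations(render_ms)])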

    What's wrong with that? If Nvidia stops AMD cards from using this tech, then AMD can bring their own stuff to the table and make it cheaper, driving competition. Maybe they do a cross-licensing deal: Nvidia gets access to the audio APIs AMD have recently released in return for certain AMD cards having access to G-Sync? You never know...

    I don't usually say this, but if it really works as proposed (and everyone who has seen it agrees it does) this is NOT a gimmick. I don't know what your beef is with fixing an ancient issue PC gamers have had to put up with for years, but this is pretty sweet. And display makers will probably make a "G-Sync compatible" version of already existing displays and raise the price to cover the extra costs. Why would we be affected? It's a new tech that does cost more to produce than what is currently available; this is how the world works.

    If you'd rather a company invest in technology than in marketing campaigns, you must really hate AMD right about now...
     
    cliffordcooley likes this.
  21. EEatGDL

    EEatGDL TS Booster Posts: 305   +51

    Very good analysis as usual, dbz; I slightly suspected that, but it feels less aggressive and more comfortable to watch when they don't compare themselves to others every chance they get. I perceive it as something personal, like "projecting their trauma", and to my eyes that looks like they're degrading themselves or being sort of childish; eventually they project aggression when they do that - the Fixer ads are a clear example of this.
     
  22. The thing is that monitor vendors needed something to increase sales, and this comes in quite handy for them. The problem, the real and main problem, is the lack of standardization; it does not seem to be so difficult to implement. Having each GPU manufacturer implement its own solution to this "extremely important" issue (as many want users to believe, yet somehow completely ignored by the industry for decades) will make things worse for consumers. What will monitor manufacturers do when Intel and AMD come up with their own solutions? Let's suppose a $100 premium for this benefit, times three: then the monitor will cost $300 more to support all of them? Seriously?! Come on! If users don't plead for standardization, who will?
     
  23. cliffordcooley

    cliffordcooley TechSpot Paladin Posts: 6,406   +1,593

    Can you imagine monitors with Nvidia-ready or AMD-ready logos? Why not, we are already buying other things such as motherboards with SLI/XFire-ready logos. However, I do see your point about standardization; I personally despise proprietary devices. But seriously, the monitor would still function without G-Sync.

    If you ask me, I'd say make the G-Sync device removable so the other competitors can connect their devices if needed. Make the G-Sync device a monitor add-in card.
     
    freythman likes this.
  24. dividebyzero

    dividebyzero trainee n00b Posts: 4,968   +738

    I think a lot of that comes down to the company structure at AMD. Hardware releases are seldom in sync with the software/driver division (so widespread that it's now a given that launches are appended with "You can expect better as the drivers mature", plus long-running sagas with Enduro, frame pacing, VCE, HDMI sound etc.), and the company's PR/marketing is seldom on the same page as either hardware or software. The hardware is generally first rate, the software is a relatively new area that AMD is (belatedly) getting into... but the PR? Pure W. Jerry Sanders III channelling.

    It's as though you have the archetypal self-effacing and introspective engineers in one sealed area, software writers in solitary confinement being passed pre-release hardware days before launch through a slot in the wall and told to "get on with it"... while the marketing guys, hired for their passive-aggressive qualities, run amok in their distant cubicle farm yelling "Make it so" after devising strategies made up entirely of throwing fridge word magnets at the office vending machine.

    With the exception of Jen-Hsun Huang himself, both Intel and Nvidia seem to have a more organised and restrained approach to the whole business.
    Technically, things in that arena have been proprietary since Intel shook AMD, Cyrix, IDT etc. off their coattails and the production of pin-compatible CPUs ended at Socket 4, 5, and 7.
     
  25. "Why not, we are already buying other things such as motherboards with SLI/XFire ready logos?"
    Indeed, however this may change if CrossFire or SLI is implemented over PCIe 3.0; some say this is the way it will be implemented on the AMD 290X card. So why not plead for standards instead of implementing proprietary solutions? And I find it unlikely that I'd invest $100+ just for the G-Sync feature when I could get a better video card, CPU or an SSD with it. Maybe standardization could lower that cost by a lot. I must say that while gaming I've barely seen the issues G-Sync solves, since I'm paying attention to the game overall and not to minor glitches, so investing in a more powerful GPU/CPU would be wiser, at least for me.
     

