Asus monitor with Nvidia G-Sync now up for pre-order

Well, I'm lucky enough that I don't have to pay quite so much for the monitor, so I'm happy to get one. The 780 Ti will boost performance, sure, but not to night-and-day levels; playing BF4 on a 1440p monitor versus a 1080p monitor is (almost) a night-and-day difference on a non-overclocked 780. Hell, I probably won't bother with the 780 Ti if this screen does come in, as the 780 is almost overkill at 1080p.

In games I can notice the difference between 60 Hz and 75 Hz relatively clearly. That's because my 75 Hz monitor is only 1280x1024, so I can hold 75 fps in pretty much anything I throw at it. I use that screen more often than I should because it's just so smooth, and that's why I want this G-Sync monitor: I want that smoothness at a higher resolution.
Well, whatever floats your boat, though getting above 120 fps on a 780 might be a challenge even at 1080p in BF4, mostly because those games are so demanding once you push beyond 75 Hz. My monitors refresh at 75 Hz as well, so I have Dynamic V-Sync capped there, and for me BF4 stays smooth as silk. In the long run I prefer gaming at higher resolution, because I see minimal difference from higher frame rates even in a game like BF3/4. I have a friend with Nvidia 3D and a 120 Hz 3D monitor, and even playing BF3 on it, the difference from my 75 Hz setup was so minuscule that I personally saw no point.

I think G-Sync can be nice, but the fact that I would have to replace all three monitors and pay at least 1500 bucks to recreate the same setup would about kill me, let alone having to buy at least two 780 Tis to keep up with that type of setup (since HD 6990s can't use G-Sync :p). If Mantle comes out and gives me a 20% boost in BF4 performance like it's claiming, then there won't be a second thought and I'll grab some 290X cards. Otherwise, I'm grabbing those dual 8-pin 780 Ti cards (when they come out) and running Nvidia Surround, though I doubt even then I'll grab a G-Sync monitor until I can get one for 300.
 
To compare higher DPI versus higher refresh rate you need the type of content that can take proper advantage of both. StarCraft 2, which I used as an example, has the level of detail that looks so much better at higher DPI, while a higher refresh rate would make zero difference for such a game. Surely there is plenty of content for which the opposite is true, like FPS games. That's not to downplay the importance of higher-DPI screens, but rather to say there isn't enough content good enough for them. Then again, that's only as far as computer games go; for everything else, higher DPI represents much higher value than a higher refresh rate (above 60 Hz).

OK, I see what you're trying to say. I have StarCraft 2, and between 1080p and 1440p the image looks better, agreed, though the game doesn't look any more realistic. Agreed on refresh rates and DPI too: games like StarCraft 2 (actually most RTS games) don't really benefit from refresh rates or frame rates above 60, it just becomes silky smooth, but third-person/first-person shooters, racing games, etc. do benefit above 60, and it matters so much that people like me are quite happy, without a second thought, to change from 1440p @ 60 Hz to 1080p @ 144 Hz with G-Sync. I'm looking forward to it just so I can get rid of the damn tearing in Battlefield xD
 
...but no, no matter how "crisp" your cartoon image is, it is still a cartoon image. Point accepted about anti-aliasing being wasteful of resources, but on that same note, why do I not see aliasing when watching a Blu-ray movie at 1080p? If game image quality could even approach the realism of Blu-ray playback at 1080p, I'd be more than happy and wouldn't want any higher resolution.
To answer your question:
In gaming, each frame is individually rendered for the screen space - basically a snapshot at an instant in game time. A movie frame is an aggregate of everything that happens on camera in a 24th/25th/30th, or 48th (The Hobbit), of a second depending on the frame rate standard used, and as such the image is less defined. The blurriness adds to the illusion of smooth motion.
Pause a game and note the clarity and definition of the image, then pause a movie and note that the image isn't as crisp. Post-process, blur-based anti-aliasing algorithms in PC gaming (MLAA, SMAA, TXAA, FXAA), as well as 3dfx's older T-Buffer tech, attempt to mimic the film experience.

As for the subject at hand, I think I'd prefer to add the G-Sync module kit to the LG UM95 IPS when it arrives. A little more screen real estate without the 4K price tag, and a little more versatility in non-gaming workloads.
 
I have all of this and still want G-Sync added to my monitor.
Then you should wait for that external module that's supposed to be coming. If that works just as well, I might even go ahead and jump back with a couple of those on my monitors (depending on how many are needed for 3 monitors; I would guess three, logically, unless somehow one module can support all three).
 
Then you should wait for that external module that's supposed to be coming. If that works just as well, I might even go ahead and jump back with a couple of those on my monitors (depending on how many are needed for 3 monitors; I would guess three, logically, unless somehow one module can support all three).

As far as I'm aware those external modules are not currently possible and are just speculation; @LNCPapa may be waiting a very long time xD
 
Or you can just get a video card that can pump out way higher frame rates than 60 fps and turn off V-Sync.
I never EVER notice screen tearing on my monitor at 1920x1200 running games at 90+ fps.

Instead of that $500 money sink you can get a decent video card, or SLI/Crossfire whatever you already have, for a much better use of the money.
 
To answer your question:
In gaming, each frame is individually rendered for the screen space - basically a snapshot at an instant in game time. A movie frame is an aggregate of everything that happens on camera in a 24th/25th/30th, or 48th (The Hobbit), of a second depending on the frame rate standard used, and as such the image is less defined. The blurriness adds to the illusion of smooth motion.
Pause a game and note the clarity and definition of the image, then pause a movie and note that the image isn't as crisp. Post-process, blur-based anti-aliasing algorithms in PC gaming (MLAA, SMAA, TXAA, FXAA), as well as 3dfx's older T-Buffer tech, attempt to mimic the film experience.

Thanks for the explanation; it kind of highlights what I'm saying. I'd much rather see GPU processing power used to make things more realistic, perhaps by mathematically merging the last two frames in the buffer to make a more blurry composite and so mimic the situation in movies. I'd prefer to see the GPU power budget spent like that rather than just on more pixels. A rough sketch of what I mean is below.
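Just to illustrate the blending idea (my own rough sketch, assuming the two most recent frames are available as NumPy arrays; no engine actually exposes them like this, and real motion blur is done per-pixel with velocity buffers):

```python
import numpy as np

def blend_frames(prev_frame: np.ndarray, curr_frame: np.ndarray, weight: float = 0.5) -> np.ndarray:
    """Weighted average of the last two frames to fake camera-style motion blur.

    prev_frame, curr_frame: HxWx3 uint8 RGB images of the same size.
    weight: how much of the previous frame bleeds into the output (0 = no blur).
    """
    prev = prev_frame.astype(np.float32)
    curr = curr_frame.astype(np.float32)
    blended = weight * prev + (1.0 - weight) * curr
    return blended.clip(0, 255).astype(np.uint8)

# Example call with stand-in "frames" just to show usage:
frame_a = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
frame_b = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
composite = blend_frames(frame_a, frame_b, weight=0.4)
```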

Getting back to the G-Sync debate: 1080p for me is not an issue, and as a monitor is a longer-term investment, neither is the price. For me the issues are the reduced game responsiveness that V-Sync produces (I really notice it) and the screen tearing.

But I'm not sure a standard 144 Hz monitor wouldn't solve those problems well enough, without the added cost and possible game incompatibilities of G-Sync.
 
As far as I'm aware those external modules are not currently possible and are just speculation; @LNCPapa may be waiting a very long time xD
Yeah, it's just speculation, but it would be a very sad missed opportunity if they don't, because there are more than just four brands of monitors out there. Plus, forcing people to buy new monitors at the same resolution as their previous ones is a hard justification. There are exceptions of course, but most gamers at this point are already on either 1920x1080/1200 or 2560x1440/1600.
 
Or you can just get a video card that can pump out way higher frame rates than 60 fps and turn off V-Sync.
I never EVER notice screen tearing on my monitor at 1920x1200 running games at 90+ fps.

Instead of that $500 money sink you can get a decent video card, or SLI/Crossfire whatever you already have, for a much better use of the money.

Some people don't want to deal with the extra heat, electricity bills, and SLI/Crossfire profiles; just accept it.

Yeah, it's just speculation, but it would be a very sad missed opportunity if they don't, because there are more than just four brands of monitors out there. Plus, forcing people to buy new monitors at the same resolution as their previous ones is a hard justification. There are exceptions of course, but most gamers at this point are already on either 1920x1080/1200 or 2560x1440/1600.

They can't bring out an external module; the G-Sync board sits inside the monitor and drives the LCD panel directly, so it would be impossible to implement on current screens without opening them up.

Anyway, I think G-Sync will be one of those things you consider when it's time to upgrade your monitor again and prices are more reasonable. I've seen the demos and read what people have to say about it: it stays crazy smooth even when the fps is all over the place between 30 and 60. That adds longevity to your GPU, since you wouldn't need to upgrade it as often if you're after all the eye candy, or at least G-Sync gives that illusion.
 
As far as I'm aware those external modules are not currently possible and are just speculation; @LNCPapa may be waiting a very long time xD
I doubt they'll be available as external modules, at least in the short term, since it may well require SIG ratification.
A couple of notes:
1. The price for the kit is high because (as Nvidia has already stated) the present (proof-of-concept) G-Sync unit uses an over-engineered FPGA for processing. Nvidia have stated that once the tech gains traction, the G-Sync module will use a cheap, purpose-built ASIC similar to the scaler chips found in current monitors.
2. The actual add-in board shouldn't be that far away. It isn't any different from changing the timing control boards on present monitors, and just as with T-Con modding, it's less about the hardware installation than about doing your homework on compatibility. My guess is that it needs to be made apparent that the monitor must support DisplayPort, and that end-user fitting will void the panel vendor's warranty (bearing in mind that disassembling and reassembling some panels can be a tricky operation, especially with off-brand vendors).
@LNCPapa
Unless your monitor has DisplayPort input you'll be out of luck.
 
Oh yeah... I know I'm out of luck, DBZ. I was just pointing out that even though I have all those things he said he would rather have, there is still a need for this type of technology. I've always wondered why we couldn't just adjust the refresh rate of the display device; certain applications like media players already have the capability to do something similar if the device supports it, such as switching to 24 Hz for media.
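(For what it's worth, on Windows an application can already request a different fixed refresh rate through the display-settings API. Here's a rough sketch using pywin32, assuming the monitor actually advertises a 24 Hz mode; it only swaps one static rate for another, nothing like G-Sync's per-frame adjustment.)

```python
# Sketch: ask Windows for a 24 Hz mode on the primary display (requires pywin32).
import win32api
import win32con

# Grab the current display mode for the primary monitor.
devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)

# Request only a refresh-rate change; resolution and colour depth stay as-is.
devmode.DisplayFrequency = 24
devmode.Fields = win32con.DM_DISPLAYFREQUENCY

# CDS_TEST asks the driver whether the mode is valid without actually switching.
if win32api.ChangeDisplaySettings(devmode, win32con.CDS_TEST) == win32con.DISP_CHANGE_SUCCESSFUL:
    win32api.ChangeDisplaySettings(devmode, 0)  # apply for this session only
else:
    print("Display does not report a 24 Hz mode")
```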
 
I think everyone thinking of buying one of these should wait and watch Linus's latest video. It makes some good points as to why you should wait.
 
I've always wondered why we couldn't just adjust the refresh rate of the display device; certain applications like media players already have the capability to do something similar if the device supports it, such as switching to 24 Hz for media.
WHAT! You mean the vendors spending a couple of extra bucks per monitor control board, so that instead of fifty different models distinguished one feature increment at a time, every model has built-in colour calibration, multiple I/O options, and multi-channel LVDS... sounds like crazy talk.
 
There was recent news that a new way of mass-producing OLED via some kind of printing process has been developed, and that LG will be releasing screens based on that technology in 2014. If this does come to pass soon, I believe I'm right in saying OLED has better colour reproduction and very high refresh rates (which would probably make G-Sync unnecessary). Even though the production method is meant to make them more affordable, I guess early adopters will pay a premium as ever, but it may be worth waiting on that, as a monitor is a longer-term purchase than a graphics card.
 
Anyway, I think G-Sync will be one of those things you consider when it's time to upgrade your monitor again and prices are more reasonable. I've seen the demos and read what people have to say about it: it stays crazy smooth even when the fps is all over the place between 30 and 60. That adds longevity to your GPU, since you wouldn't need to upgrade it as often if you're after all the eye candy, or at least G-Sync gives that illusion.

Agreed, I just don't think people should jump the gun on something like this, expecting everything to be as amazing as described. In the past it's always been the same, no matter who is presenting the product and who is manufacturing it: there's always some exaggeration, or only best-case scenarios shown in a demo, which makes the product sound like the greatest thing since sliced bread. When we see these out in the real world, actually in use by everyday users (or at least reviewed by Linus or the like), then we can see whether it's all hype or eyeball-meltingly beautiful.

In all honesty, the other thing is that this seems to require DisplayPort from what I'm reading (could be mistaken, but I keep hearing it mentioned), which is an oddity to begin with. It means you'll need a DisplayPort cable, which isn't exactly something you see sitting on every shelf, on top of the fact that most Nvidia GPUs (I'll reference the 780 Ti for this) have only one DisplayPort output. So if you only want to game on one monitor you're fine, though I feel a 780 Ti would be a bit overkill at 1080p. But if you want to game on multiple displays (for instance 3-way Surround), you will need at least three cards to do it (which, for me, I'm already considering :p). That's just my view right now; hopefully all our questions will be answered in due time.
 
Agreed, I just don't think people should jump the gun on something like this, expecting everything to be as amazing as described. In the past it's always been the same, no matter who is presenting the product and who is manufacturing it: there's always some exaggeration, or only best-case scenarios shown in a demo, which makes the product sound like the greatest thing since sliced bread. When we see these out in the real world, actually in use by everyday users (or at least reviewed by Linus or the like), then we can see whether it's all hype or eyeball-meltingly beautiful.

In all honesty, the other thing is that this seems to require DisplayPort from what I'm reading (could be mistaken, but I keep hearing it mentioned), which is an oddity to begin with. It means you'll need a DisplayPort cable, which isn't exactly something you see sitting on every shelf, on top of the fact that most Nvidia GPUs (I'll reference the 780 Ti for this) have only one DisplayPort output. So if you only want to game on one monitor you're fine, though I feel a 780 Ti would be a bit overkill at 1080p. But if you want to game on multiple displays (for instance 3-way Surround), you will need at least three cards to do it (which, for me, I'm already considering :p). That's just my view right now; hopefully all our questions will be answered in due time.

Yeah, Linus has had a G-Sync screen for like two weeks now and he does say it's really good, but he can't really review it properly because he can't show us, due to the way the technology works; it's something we need to see for ourselves. He's in contact with Nvidia about some kind of "G-Sync day", but he did say that could be a while off. He also mentioned the price is a big thumbs down, as it is currently overly expensive. The only reason I'm considering one is that I can get one cheaper, and I really hate screen tearing, like really hate it; it pretty much ruins the experience for me and I get it a lot on a 60 Hz 1440p monitor. The fact that this fixes my biggest bugbear means quite a lot to me; it's the reason I'm all for it xD

Good shout on the Nvidia cards only having one DisplayPort, though. I do wonder how they plan on doing Surround with a single port :/
 
That's how they pull it off. I wonder how G-Sync works in this config, since I thought G-Sync used some kind of AUX signal that only DisplayPort has to sync the fps and refresh rate? Only time will tell :)
I'm only guessing, but that would mean the signal is broadcast, and if a monitor recognizes it, it will take advantage and refresh accordingly. Monitors that don't support G-Sync would just continue to refresh at their predefined rates.

Looking at that link, it looks as if DisplayPort 1.2 will only support one 4K monitor per output; it's a bandwidth limit on total pixels.
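Rough numbers behind that limit (my own back-of-the-envelope figures, counting active pixels only and ignoring blanking): DP 1.2 gives roughly 17.28 Gbit/s of usable video bandwidth, a 3840x2160 @ 60 Hz, 24-bit stream needs about 12 Gbit/s, so only one fits per output, while four 1920x1080 @ 60 Hz streams (about 3 Gbit/s each) squeeze in.

```python
# Back-of-the-envelope DP 1.2 bandwidth check (active pixels only, blanking ignored).
DP12_EFFECTIVE_GBPS = 17.28  # 4 lanes x 5.4 Gbit/s, minus 8b/10b encoding overhead

def stream_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    """Approximate video bandwidth of one uncompressed stream in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

uhd = stream_gbps(3840, 2160, 60)  # ~11.9 Gbit/s -> only one fits on the link
fhd = stream_gbps(1920, 1080, 60)  # ~3.0 Gbit/s  -> four fit (~11.9 total)

print(f"4K60 needs ~{uhd:.1f} Gbit/s, 1080p60 needs ~{fhd:.1f} Gbit/s "
      f"of the ~{DP12_EFFECTIVE_GBPS} Gbit/s available")
```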
 
Aah! So you are right, my good sir:
http://www.displayport.org/embedded...le-displays-from-a-single-displayport-output/

That's how they pull it off. I wonder how G-Sync works in this config, since I thought G-Sync used some kind of AUX signal that only DisplayPort has to sync the fps and refresh rate? Only time will tell :)
Oh *$%#, he's right, I totally forgot you can do that. I don't mess with DisplayPort much, except with the mini-DP-to-HDMI adapters on my HD 6990s. Since one port can handle four 1080p displays, now we just have to wait for the price to go down and for confirmation on how many displays one DisplayPort output will drive with G-Sync involved (which, based on my reading, I would still assume is four).

Now my biggest question will be who is first to release one of their top-tier cards with dual 8-pin connectors and better power delivery for higher overclocking. At this point, whoever wins that race will get a check from me for two GPUs and waterblocks. Between G-Sync and Mantle, my head is spinning in circles over decisions. I watched the Linus video on it a few days ago and they did sound impressed, but I really want to see it with my own eyes before I judge anything, and of course we have to wait on driver maturity, as that will be a defining factor in all of this. I still don't see much point in gaming at frame rates beyond the 60-75 Hz threshold yet, but this may convince me otherwise.
 
Yeah, Linus has had a G-Sync screen for like two weeks now and he does say it's really good, but he can't really review it properly because he can't show us, due to the way the technology works; it's something we need to see for ourselves. He's in contact with Nvidia about some kind of "G-Sync day", but he did say that could be a while off.
There is a downloadable demo available if you haven't already seen it. Check the link in this PC Per article.
 