Asus monitor with Nvidia G-Sync now up for pre-order

December 17, 2013, 8:30 AM

Back in October, Nvidia announced G-Sync, a new technology designed to eliminate screen tearing and stuttering in games. Through the inclusion of a dedicated chip inside the monitor itself, the monitor's refresh rate can be synchronized to the GPU's render rate, allowing frames to be displayed the instant they are rendered.
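In practice, that means the display waits on the GPU rather than the other way around. As a rough illustration, here is a minimal, hypothetical timing sketch in Python (not Nvidia's implementation) contrasting when frames reach the screen under fixed-refresh V-Sync versus a G-Sync-style variable-refresh display:

```python
# Hypothetical sketch (not Nvidia's code): when frames appear on screen under
# fixed-refresh V-Sync versus a G-Sync-style variable-refresh display.

def vsync_display_times(render_done, refresh_hz=60):
    """Each frame waits for the next fixed refresh tick after it finishes rendering."""
    period = 1.0 / refresh_hz
    shown, next_tick = [], 0.0
    for t in render_done:
        while next_tick < t:            # first refresh boundary after the frame is ready
            next_tick += period
        shown.append(next_tick)
        next_tick += period             # that tick is now used up
    return shown

def variable_refresh_display_times(render_done, max_hz=144):
    """The display refreshes the moment a frame is ready, capped at the panel's max rate."""
    min_gap = 1.0 / max_hz
    shown, last = [], float("-inf")
    for t in render_done:
        now = max(t, last + min_gap)
        shown.append(now)
        last = now
    return shown

# Uneven frame times (seconds) typical of a demanding game:
frames = [0.000, 0.021, 0.037, 0.060, 0.095, 0.110]
print(vsync_display_times(frames))              # snaps to 16.7 ms boundaries -> judder
print(variable_refresh_display_times(frames))   # tracks the GPU's render rate directly
```

Among other things, the real module also has to cope with frame rates that fall below the panel's minimum refresh rate, which this sketch ignores.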

Until now, G-Sync hardware wasn't available to the public; that changes today, with Asus' 24-inch monitor fitted with the G-Sync chip going up for pre-order. The VG248QE with G-Sync will be available for $499, which is around $220 more than the standard monitor; if you already have the VG248QE, you can get the G-Sync chip installed for $299.

Previously, Nvidia had said that the G-Sync add-on chip would cost $175, so these early prices are higher than first anticipated. That said, the current solution has retailers installing the chip into the standard, non-G-Sync version of the monitor, which could explain the mark-up.

The panel itself is Asus' 24-inch 1920 x 1080 TN TFT LCD (LED-backlit) with a maximum refresh rate of 144 Hz (3D-capable), and it comes with HDMI, DVI-D and DisplayPort connectors on the back. To get G-Sync working you'll need an Nvidia GeForce GTX 650 Ti or higher graphics card running driver version 331.58.

More displays with G-Sync integrated, from other manufacturers including BenQ, Philips and ViewSonic, are expected to go on sale in Q1 2014.




User Comments: 45

VitalyT said:

This came out a little too late, imo.

Today DELL released their long-awaited UP2414Q, which sets a new benchmark for professional monitors in terms of both quality and pricing.

When nVidia releases G-sync for a similar 4K product, I will be interested, but for now I'd rather go for a higher DPI than just higher render rate.

LukeDJ said:

This came out a little too late, imo.

Today DELL released their long-awaited UP2414Q, which sets a new benchmark for professional monitors in terms of both quality and pricing.

When nVidia releases G-sync for a similar 4K product, I will be interested, but for now I'd rather go for a higher DPI than just higher render rate.

I dunno man, I'd rather the G-sync monitor at this point, for gaming anyway. It provides more benefits than extra pixels do in my opinion.

VitalyT said:

I dunno man, I'd rather the G-sync monitor at this point, for gaming anyway. It provides more benefits than extra pixels do in my opinion.

Either way, it is one hell of a compromise when it comes to choosing one over the other. You'd definitely want the higher DPI one for everything but the gaming, and the higher-rate one for gaming. Having both is too awkward for most users, and we are not likely to see a product that combines both earlier than the end of next year...

But then again, too many gamers today have upgraded to 1440p and 1600p monitors, making this new 1080p higher-rate product a big step back. Myself, I've been using a DELL U3014 since April this year, and a 4K display would be the next logical step; there is no way I'd swap it for a 1080p monitor, I don't care if it got a 1000GHz update rate.

1 person liked this | LukeDJ said:

Either way, it is one hell of a compromise when it comes to choosing one over the other. You'd definitely want the higher DPI one for everything but the gaming, and the higher-rate one for gaming. Having both is too awkward for most users, and we are not likely to see a product that combines both earlier than the end of next year...

But then again, too many gamers today have upgraded to 1440p and 1600p monitors, making this new 1080p higher-rate product a big step back. Myself, I've been using a DELL U3014 since April this year, and a 4K display would be the next logical step; there is no way I'd swap it for a 1080p monitor, I don't care if it got a 1000GHz update rate.

Very true. Anyway, what do I care, I can't afford either :p

1 person liked this | Skidmarksdeluxe said:

$500 for some lousy 24" TN monitor?... No thanks, I most definitely pass. You've gotta be some hardcore bloody minded gamer to even consider this. If it was an IPS monitor, it would've made the price a bit easier to swallow but even that's too much. I can see this G-Sync thing being a very niche product for the time being. Hopefully nVidia can get the price of this $175 chip down to twenty bucks because that's all it's worth to me as far as I'm concerned. Who's running the show at nVidia nowadays? Apple execs?

cliffordcooley, TechSpot Paladin, said:

$500 for some lousy 24" TN monitor?... No thanks, I most definitely pass.
I was thinking the same thing.

1 person liked this | Burty117, TechSpot Chancellor, said:

I'm getting this monitor early next week. I've currently got a 1440p monitor and I'm quite happy to be going "back" to 1080p. 1440p (even with a GTX 780) is bloody hard to run the latest games with all the candy turned up AND keep at 60fps; Battlefield 4 is an example: with V-Sync enabled you get serious lag, without it you get some of the worst tearing I've ever seen. I'm quite happy to drop down to 1080p and get 144Hz PLUS G-Sync doing its thing; to me, that's a serious upgrade.

If I was a photographer it would be a downgrade, but as a gamer it's an upgrade.

Vrmithrax, TechSpot Paladin, said:

I'm sure the monitor's performance will be great... But for those of us running AMD graphics, it represents a potentially massive investment to change over. Yay for proprietary hardware locked to a single manufacturer! Bleh... (sad face)

spencer said:

$300 for a little chip? I'd rather Crossfire and go with V-Sync, and I would still get better performance.

cliffordcooley, TechSpot Paladin, said:

But for those of us running AMD graphics, it represents a potentially massive investment to change over. Yay for proprietary hardware locked to a single manufacturer! Bleh... (sad face)
Speaking about that, how is AMD's Mantle project coming along? Has anyone noticed any benefits with Mantle?

1 person liked this | LNCPapa said:

Some of you guys making snap judgments need to do a bit more research on G-Sync. It's not only about refresh rate but more about frame presentation. I will agree though that the price is a bit hard to swallow based on the monitor's other limitations.

EEatGDL said:

I'm buying one of these. I was about to replace my old 7-year-old 17" monitor with a gaming monitor in January, and this is within my budget, so I would gladly do so with G-SYNC.

3 people like this | technogiant said:

Personally speaking as a gamer I think the race to 4K is the wrong way to go and think it will hold back the development of photo realism.

If game developers have to ensure that current hardware can run their games at 60fps on 4K monitors then they are going to have to turn down the effects they use.

I'd much rather see 1080p hang around for a while and have game developers work on techniques to increase the realism at that resolution rather than using increasing gpu power just to push more pixels....more pixels does not equate to more realism guys.

Guest said:

What technogiant said.

VitalyT said:

....more pixels does not equate to more realism guys.

More pixels replace the need for spatial antialiasing, which is there because individual pixels are too visible. Modern video cards spend a lot of power to implement it (supersampling). With high-DPI displays we can switch spatial antialiasing off without any sacrifice in quality, and your video card can spend more resources to render higher resolution natively.

This will look even better than supersampling, because the latter is an approximation.

I have two systems here to compare:

I play StarCraft 2 on two systems: one on a DELL U3014 (2560x1600, with anti-aliasing set to maximum) and one on the new MacBook Pro 15" (2880x1800) with antialiasing switched off. And the latter looks better, crisper, and altogether more eye-pleasing.

So, yeah, more pixels does equate to more realism.
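The trade-off described above can be shown in a few lines: supersampling renders more pixels than the panel has and averages them down, while a high-DPI panel simply displays them. Below is a rough sketch, assuming a 2x ordered-grid supersample with a box-filter downsample (illustrative only, not how any particular driver implements it), in Python with NumPy:

```python
import numpy as np

def supersample_2x(render_fn, width, height):
    """Render at twice the target resolution, then average each 2x2 block down.
    This downsample is the 'approximation' step; on a panel with twice the pixels
    you would simply display hi_res directly instead."""
    hi_res = render_fn(width * 2, height * 2)                 # shape (2*height, 2*width)
    return hi_res.reshape(height, 2, width, 2).mean(axis=(1, 3))

def render_scene(w, h):
    """Toy 'renderer': a hard diagonal edge, the classic aliasing case."""
    y, x = np.mgrid[0:h, 0:w]
    return (x > y * (w / h)).astype(float)

native = render_scene(640, 360)                    # jagged edge, cheap to render
smoothed = supersample_2x(render_scene, 640, 360)  # softer edge, ~4x the pixel work
print(native.shape, smoothed.shape)                # both (360, 640)
```

Both outputs are the same resolution; the difference is whether the extra pixels are averaged away or actually shown.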

GhostRyder said:

I'm getting this monitor early next week. I've currently got a 1440p monitor and I'm quite happy to be going "back" to 1080p. 1440p (even with a GTX 780) is bloody hard to run the latest games with all the candy turned up AND keep at 60fps; Battlefield 4 is an example: with V-Sync enabled you get serious lag, without it you get some of the worst tearing I've ever seen. I'm quite happy to drop down to 1080p and get 144Hz PLUS G-Sync doing its thing; to me, that's a serious upgrade.

If I was a photographer it would be a downgrade, but as a gamer it's an upgrade.

I thought you were going to invest in a 780ti anyways? Wouldn't that bump your performance up just fine? I think buying that monitor might feel like more of an upgrade than it really is. Its price is set too high for what it really gives you; if I were you I would go with a 780ti and just overclock it to keep the 1440p eye candy going.

I dunno guys, I like the idea of G-SYNC, but at this price, without actually seeing some kind of real-world performance (Linus hasn't been able to even post a preview yet and there's really nothing outside of Nvidia's controlled environments showcasing this), it's a very hard sell. I'll gladly be sticking with my 3-way Eyefinity (or, depending on whether the 290X stays at its current price, Nvidia Surround) over getting one G-SYNC monitor.

cliffordcooley, TechSpot Paladin, said:

Realism as in a 720p movie compared to a 4K animation. If it still looks like an animation, it doesn't matter how many pixels you give it. The 720p movie will still look more realistic. Make the games realistic and then add more pixels.

And in the meantime, we can all benefit from G-Sync.

Burty117, TechSpot Chancellor, said:

....more pixels does not equate to more realism guys.

More pixels replace the need for spatial antialiasing, which is there because individual pixels are too visible. Modern video cards spend a lot of power to implement it (supersampling). With high-DPI displays we can switch spatial antialiasing off without any sacrifice in quality, and your video card can spend more resources to render higher resolution natively.

This will look even better than supersampling, because the latter is an approximation.

I have two systems here to compare:

I play StarCraft 2 on two systems: one on a DELL U3014 (2560x1600, with anti-aliasing set to maximum) and one on the new MacBook Pro 15" (2880x1800) with antialiasing switched off. And the latter looks better, crisper, and altogether more eye-pleasing.

So, yeah, more pixels does equate to more realism.

I have a 1440p monitor and a 1080p monitor, and I can tell you for a fact I can run Battlefield 4 with higher framerates at 1080p with anti-aliasing turned right up than on the 1440p monitor with no anti-aliasing.

Also, more pixels do not equate to more realism. I would rather they concentrated on more realistic lighting, higher polygon counts on characters, better facial animation and better textures; that is more realism. For example, take an old PS1 game, the original Gran Turismo: if we could play that today at 4K, guess what? It wouldn't be any more realistic. If they gave it better textures, lighting etc., it would look more realistic.

I can see the difference between 1080p and 1440p, I really can, but it is a much smaller difference than 480p vs 1080p. Yes, the image looks better, but the game's realism isn't any better, it just looks crisper, and considering the performance trade-off, I'd rather have a slightly less crisp image and much better framerates/graphics. This is coming from someone who "upgraded" to 1440p for gaming, man do I regret it xD

But for everything else, videos, web browsing and music/video editing suites, it is much better. For gaming though, I wish I'd passed.

technogiant said:

So, yeah, more pixels does equate to more realism.

...but no....no matter how "crisp" your cartoon image....it is still a cartoon image....but point accepted about anti-aliasing being wasteful of resources......but on that same note, why do I not see aliasing watching a Blu-ray movie at 1080p?.....if game image quality could even approach the realism of Blu-ray playback at 1080p I'd be more than happy and not want for any higher resolution.

Burty117, TechSpot Chancellor, said:

I thought you were going to invest in a 780ti anyways? Wouldn't that bump your performance up just fine? I think buying that monitor might feel like more of an upgrade than it really is. Its price is set too high for what it really gives you; if I were you I would go with a 780ti and just overclock it to keep the 1440p eye candy going.

Well I'm lucky enough that I don't have to pay quite so much for the monitor, so I'm happy to get one. The 780Ti will boost performance, sure, but not to night-and-day levels; playing BF4 on a 1440p monitor versus a 1080p monitor is (almost) a night-and-day difference on a non-overclocked 780. Hell, I probably won't bother with the 780Ti if this screen does come in, as the 780 is almost overkill at 1080p.

In games I can notice the difference between 60Hz and 75Hz relatively clearly. This is because my 75Hz monitor is only 1280x1024, so I can keep 75fps in pretty much anything I throw at it, and I use this screen more often than I should because it is just so smooth. This is why I want this G-Sync monitor: I want that smoothness at a higher resolution.

howzz1854 said:

Honestly the screen tear with V-Sync off doesn't bother me at all. I turn V-Sync off for every game that I play, and I mostly play FPS games. I'd much rather have an S-IPS panel with a decent refresh rate and great color representation at 2560x1440 than a lousy TN with this G-Sync stuff. I'd rather use the money and put it into another graphics card upgrade.

hahahanoobs said:

I can't wait until more manufacturers are selling these. I definitely want one.

VitalyT said:

To be able to compare higher DPI versus higher refresh rate you need the type of content that can take proper advantage of both. StarCraft 2, which I used as an example, has a level of detail that looks so much better at higher DPI, while a higher refresh rate for such a game would make zero difference. Surely there is plenty of content for which it would be the opposite, like FPS games. That's not to downplay the importance of higher-DPI screens, rather to say there isn't enough content good enough for them. But then again, this is only as far as computer games go, while for everything else higher DPI represents much higher value than a higher refresh rate (when above 60Hz).

Guest said:

IPS ..or GTFO...

This panel is worth $99.99 max... I personally won't even pay that for this.

My DELL 30" laughs at this monitor (paid $450 for it). I would rather live with tearing than with a washed-out picture on a tiny screen.

GhostRyder said:

Well I'm lucky enough that I don't have to pay quite so much for the monitor, so I'm happy to get one. The 780Ti will boost performance, sure, but not to night-and-day levels; playing BF4 on a 1440p monitor versus a 1080p monitor is (almost) a night-and-day difference on a non-overclocked 780. Hell, I probably won't bother with the 780Ti if this screen does come in, as the 780 is almost overkill at 1080p.

In games I can notice the difference between 60Hz and 75Hz relatively clearly. This is because my 75Hz monitor is only 1280x1024, so I can keep 75fps in pretty much anything I throw at it, and I use this screen more often than I should because it is just so smooth. This is why I want this G-Sync monitor: I want that smoothness at a higher resolution.

Well, whatever floats your boat, though getting above 120fps on a 780 might be a challenge even at 1080p in BF4, mostly because those games are so demanding up beyond 75Hz. My monitors refresh at 75Hz as well, so I have my Dynamic V-Sync capped there and for me it stays smooth as silk in BF4. I prefer gaming at higher resolution in the long run because I see minimal difference from higher FPS even in a game like BF3/4. I have a friend with Nvidia 3D and a 3D monitor that runs at 120Hz, and even playing BF3 on it, the difference from my 75Hz setup was so minuscule that I personally saw no point.

I think G-SYNC can be nice, but the fact that I would have to replace all three monitors and pay at least 1500 bucks to recreate the same setup would about kill me, let alone having to buy at least two 780tis to keep up with that type of setup (since HD 6990s can't use G-SYNC :P). If Mantle comes out and gives me a 20% boost in performance in BF4 like it's claiming, then there's not going to be a second thought and I'll grab some 290X cards. Otherwise, I'm grabbing those 780ti dual 8-pin cards (when they come out) and running Nvidia Surround, though I doubt even then I'll grab a G-Sync monitor until I can get one for 300.

Burty117, TechSpot Chancellor, said:

To be able to compare higher DPI versus higher refresh rate you need the type of content that can take proper advantage of both. StarCraft 2, which I used as an example, has a level of detail that looks so much better at higher DPI, while a higher refresh rate for such a game would make zero difference. Surely there is plenty of content for which it would be the opposite, like FPS games. That's not to downplay the importance of higher-DPI screens, rather to say there isn't enough content good enough for them. But then again, this is only as far as computer games go, while for everything else higher DPI represents much higher value than a higher refresh rate (when above 60Hz).

Ok, I see what you're trying to say. I have StarCraft 2, and between 1080p and 1440p the image looks better, agreed; the game doesn't look any more realistic though. Again, agreed on refresh rates and DPI: games like StarCraft 2 (actually most RTS games) don't really benefit from refresh rates / frame rates above 60, it just becomes silky smooth, but third-person / first-person shooters, racing games etc. do benefit from a frame rate / refresh rate above 60, and it matters so much that people such as myself are quite happy, without a second thought, to change from 1440p @ 60Hz to 1080p @ 144Hz inc/G-Sync. I'm looking forward to it just so I can get rid of the damn tearing in Battlefield xD

dividebyzero, trainee n00b, said:

...but no....no matter how "crisp" your cartoon image....it is still a cartoon image....but point accepted about anti-aliasing being wasteful of resources......but on that same note, why do I not see aliasing watching a Blu-ray movie at 1080p?.....if game image quality could even approach the realism of Blu-ray playback at 1080p I'd be more than happy and not want for any higher resolution.

To answer your question:

In gaming each frame is individually rendered for the screen space - basically a snapshot at an instant in game time. A movie frame is an aggregated total of what happens on camera in a 24th/25th/30th, or 48th (The Hobbit) of a second depending on the frame rate standard used, and as such the image is less defined. The blurriness adds to the illusion of smooth motion.

Pause a game and note the clarity/definition of the image, and then pause a movie and note that the image isn't as crisply defined. Post-process, Gaussian-blur-based antialiasing algorithms in PC gaming (MLAA, SMAA, TXAA, FXAA), as well as 3dfx's older T-Buffer tech, attempt to mimic the film experience.

As for the subject at hand, I think I'd prefer to add the G-Sync module kit to the LG UM95 IPS when it arrives. A little more screen real estate without the 4K pricetag, and a little more versatility in non-gaming workloads.
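For reference, the post-process family mentioned above shares one basic idea: find high-contrast edges in the already-rendered frame and soften only those pixels. The snippet below is a deliberately crude sketch of that shared idea in Python/NumPy; it is not MLAA, SMAA, TXAA or FXAA themselves, whose edge detection and blending are far more sophisticated:

```python
import numpy as np

def crude_postprocess_aa(frame, threshold=0.25):
    """Oversimplified post-process AA: locate high-contrast pixels in a finished
    frame, then replace just those pixels with a local average (a 3x3 box blur)."""
    gy, gx = np.gradient(frame)                     # luminance gradient as edge measure
    edges = np.sqrt(gx ** 2 + gy ** 2) > threshold

    padded = np.pad(frame, 1, mode="edge")          # 3x3 box blur via shifted copies
    blurred = sum(padded[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
                  for dy in range(3) for dx in range(3)) / 9.0

    out = frame.copy()
    out[edges] = blurred[edges]                     # soften only the detected edges
    return out

# Toy frame: a hard diagonal edge (the jaggies this kind of pass tries to hide).
y, x = np.mgrid[0:90, 0:160]
frame = (x > y * (160 / 90)).astype(float)
print(crude_postprocess_aa(frame)[40:43, 69:75].round(2))
```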

LNCPapa said:

I'd much rather have an S-IPS panel with a decent refresh rate and great color representation at 2560x1440

I have all of this and still want G-Sync added to my monitor.

GhostRyder said:

I have all of this and still want G-Sync added to my monitor.

Then you should wait on that external module that's supposed to be coming. If that works just as well, I might even go ahead and jump back with a couple of those on my monitors (depending on how many are needed for 3 monitors; I would guess three, logically, unless somehow one will support all three).

Burty117, TechSpot Chancellor, said:

Then you should wait on that external module that's supposed to be coming. If that works just as well, I might even go ahead and jump back with a couple of those on my monitors (depending on how many are needed for 3 monitors; I would guess three, logically, unless somehow one will support all three).

As far as I'm aware those external modules are not currently possible and are just speculation; @LNCPapa may be waiting a very long time xD

lipe123 said:

Or you can just get a video card that can pump out way higher framerates than 60fps and turn off V-Sync.

I never EVER notice screen tearing on my monitor at 1920x1200 running games at 90+ fps.

Instead of that $500 money sink you can get a decent video card, or SLI/Crossfire whatever you already have, for a much better use of the money.

technogiant said:

To answer your question:

In gaming each frame is individually rendered for the screen space - basically a snapshot at an instant in game time. A movie frame is an aggregated total of what happens on camera in a 24th/25th/30th, or 48th (The Hobbit) of a second depending on the frame rate standard used, and as such the image is less defined. The blurriness adds to the illusion of smooth motion.

Pause a game and note the clarity/definition of the image, and then pause a movie and note that the image isn't as crisply defined. Post-process, Gaussian-blur-based antialiasing algorithms in PC gaming (MLAA, SMAA, TXAA, FXAA), as well as 3dfx's older T-Buffer tech, attempt to mimic the film experience.

Thanks for the explanation....kind of highlights what I'm saying.....I'd much rather see GPU processing power used to make things more realistic....perhaps by mathematically merging the last two frames in the buffer to make a more blurry composite and so mimic the situation in movies.....I would prefer to see the GPU power budget spent like this rather than just on pushing more pixels.

Getting back to the G-Sync debate....1080p for me is not an issue, and as a monitor is a longer-term investment, neither is the price....for me the issues are the reduced game responsiveness that V-Sync produces...I really notice it.....and the screen tearing.

But I'm not sure that a standard 144Hz monitor wouldn't sufficiently solve those problems without the need for G-Sync, with its added cost and possible game incompatibilities.
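The "merge the last two frames" idea technogiant floats above amounts to a one-tap temporal blur. A minimal sketch (hypothetical 50/50 weighting, Python/NumPy) of what compositing the two most recent frames would produce:

```python
import numpy as np

def blend_last_two(prev_frame, curr_frame, weight=0.5):
    """Composite the previous and current frames, loosely mimicking how a film
    camera integrates motion over its exposure time. The weighting is an
    assumed tunable, not taken from any real engine."""
    return weight * curr_frame + (1.0 - weight) * prev_frame

# Toy frames: a bright square that moves a few pixels to the right between frames.
h, w = 90, 160
prev_frame = np.zeros((h, w)); prev_frame[40:50, 60:70] = 1.0
curr_frame = np.zeros((h, w)); curr_frame[40:50, 64:74] = 1.0

blurred = blend_last_two(prev_frame, curr_frame)
# The overlap stays bright, while leading/trailing edges drop to 0.5 -> a motion trail.
print(blurred[45, 62], blurred[45, 66], blurred[45, 72])   # 0.5 1.0 0.5
```

Doing this on presented frames trades sharpness (and a little latency) for smoother-looking motion, which is essentially the same trade-off film makes.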

GhostRyder said:

As far as I'm aware those external modules are not currently possible and are just speculation; @LNCPapa may be waiting a very long time xD

Yeah, it's just speculation, but it would be a very sad missed opportunity if they don't, because there are more than just 4 brands of monitors out there. Plus, forcing people to buy new monitors at the same resolution as their previous one is a hard justification. I mean there are exceptions of course, but most gamers at this point are already on either 1920x1080/1200 or 2560x1440/1600.

Burty117, TechSpot Chancellor, said:

Or you can just get a video card that can pump out way higher framerates than 60fps and turn off V-Sync.

I never EVER notice screen tearing on my monitor at 1920x1200 running games at 90+ fps.

Instead of that $500 money sink you can get a decent video card, or SLI/Crossfire whatever you already have, for a much better use of the money.

Some people don't want to deal with extra heat, electric bills and SLI/Crossfire Profiles, just accept it.

Yeah, it's just speculation, but it would be a very sad missed opportunity if they don't, because there are more than just 4 brands of monitors out there. Plus, forcing people to buy new monitors at the same resolution as their previous one is a hard justification. I mean there are exceptions of course, but most gamers at this point are already on either 1920x1080/1200 or 2560x1440/1600.

They can't bring out a module; G-Sync works by essentially outputting directly to the LCD panel, so it would be impossible to implement on current screens.

Anyway, I think G-Sync will be one of those things you consider when it's time to upgrade your monitor again and prices are more reasonable. I've seen the demos and read what people have to say about it: it stays crazy smooth even when the fps are all over the place between 30-60fps. That adds longevity to your GPU, since you wouldn't need to upgrade it as often if you're after all the eye candy, or at least G-Sync gives that illusion.

dividebyzero, trainee n00b, said:

As far as I'm aware those external modules are not currently possible and are just speculation; @LNCPapa may be waiting a very long time xD

I doubt they'll be available as external modules at least in the short term since it may well require a SIG ratification.

A couple of notes:

1. The price for the kit is high because (as Nvidia has already stated) the present (proof of concept) G-Sync unit uses an over-engineered FPGA for processing. Nvidia have stated that once the tech has gained traction, the G-Sync module will feature a cheap purpose-built ASIC similar to the scaler chips found in current monitors.

2. [link] It isn't any different from changing timing control boards on present monitors, and just as with T-Con modding it's less about the hardware installation than it is about doing your homework on compatibility. My guess is that it needs to be made apparent that the monitor needs to support DisplayPort, and that end-user fitting will void the panel vendor's warranty (bearing in mind that dis/reassembling some panels can be a tricky operation, especially with off-brand vendors).

@LNCPapa

Unless your monitor has DisplayPort input you'll be out of luck.

LNCPapa said:

Oh yeah... I know I'm out of luck, DBZ. I was just pointing out that even though I have all those things he said he would rather have, there is still a need for this type of technology. I always wondered why we couldn't just adjust the refresh rate of display devices, as certain applications like media players have the capability to do similar things if the device supports it, such as 24Hz for media.

JC713 said:

I think everyone thinking of buying one of these should wait and watch Linus's latest video. It makes some good points as to why you should wait.

dividebyzero, trainee n00b, said:

I always wondered why we couldn't just adjust the refresh rate of display devices, as certain applications like media players have the capability to do similar things if the device supports it, such as 24Hz for media.

WHAT! You mean the vendors spending a couple of extra bucks per monitor control board so that, instead of fifty different models distinguished one feature increment at a time, every model has inbuilt colour calibration, multi-I/O options, and multi-channel LVDS....sounds like crazy talk.

technogiant said:

There was recent news that a new way of mass-producing OLED via some kind of printer has been developed and that LG will be releasing screens based on said technology in 2014.....if this does come to pass soon, I believe I'm right in saying OLED has better colour reproduction and very high refresh rates (so it would probably make G-Sync unnecessary).....even though the production method is meant to make them more affordable, I guess early adopters will pay a premium as ever.....but it may be worth waiting on that, as a monitor is a longer-term purchase than a graphics card.

GhostRyder said:

Anyway, I think G-Sync will be one of those things you consider when it's time to upgrade your monitor again and prices are more reasonable. I've seen the demos and read what people have to say about it: it stays crazy smooth even when the fps are all over the place between 30-60fps. That adds longevity to your GPU, since you wouldn't need to upgrade it as often if you're after all the eye candy, or at least G-Sync gives that illusion.

Agreed, I just don't think people should jump the gun on something like this expecting everything to be as amazing as described. In the past, it's always been the same way no matter who is stating/showing the product and who is manufacturing it. They always have some sort of exaggeration or show only best-case scenarios in a demo, which makes the product sound like the greatest thing since sliced bread. When we see these out in the real world and actually in use by an everyday user (or at least reviewed by Linus or the like), then we can see if it's all hype or eyeball-meltingly beautiful.

In all honesty, the other thing is that this seems to require DisplayPort from what I'm reading/understanding (could be mistaken, but I keep hearing mention of it requiring DisplayPort), which is an oddity to begin with, meaning you will need a DisplayPort cord, which isn't exactly something you see sitting on every shelf, on top of the fact that most Nvidia GPUs (I'll reference the 780ti for this) have only one DisplayPort. So if you only want to game on one monitor you're fine, though I feel at 1080p a 780ti would be a bit overkill. But if you want to game on multiple displays (for instance 3-way Surround) you will need at least 3 cards to do this (which, for me, I'm already considering doing :P). That's just my view on this right now; hopefully all our questions will be answered in due time.

Burty117, TechSpot Chancellor, said:

Agreed, I just don't think people should jump the gun on something like this expecting everything to be as amazing as described. In the past, it's always been the same way no matter who is stating/showing the product and who is manufacturing it. They always have some sort of exaggeration or show only best-case scenarios in a demo, which makes the product sound like the greatest thing since sliced bread. When we see these out in the real world and actually in use by an everyday user (or at least reviewed by Linus or the like), then we can see if it's all hype or eyeball-meltingly beautiful.

In all honesty, the other thing is that this seems to require DisplayPort from what I'm reading/understanding (could be mistaken, but I keep hearing mention of it requiring DisplayPort), which is an oddity to begin with, meaning you will need a DisplayPort cord, which isn't exactly something you see sitting on every shelf, on top of the fact that most Nvidia GPUs (I'll reference the 780ti for this) have only one DisplayPort. So if you only want to game on one monitor you're fine, though I feel at 1080p a 780ti would be a bit overkill. But if you want to game on multiple displays (for instance 3-way Surround) you will need at least 3 cards to do this (which, for me, I'm already considering doing :P). That's just my view on this right now; hopefully all our questions will be answered in due time.

Yeah, Linus has had a G-Sync screen for like 2 weeks now and he does say it's really good, but he can't really review it properly as he can't show us due to the way the technology works; it's something we need to see ourselves, and he is in contact with Nvidia to have some kind of "G-Sync day", but he did say that could be a while off. He also mentioned the price is a big thumbs down as it is currently overly expensive. The only reason I'm considering one is that I can get one cheaper and I really hate screen tear, like really hate it, it pretty much ruins the experience for me and I get it a lot on a 60Hz 1440p monitor. The fact this fixes my biggest bugbear means quite a lot to me; it's the reason I'm all for it xD

Good shout on the Nvidia cards only having one DisplayPort though, I do wonder how they plan on doing Surround with a single port :/

1 person liked this | technogiant said:

Good shout on the Nvidia cards only having one DisplayPort though, I do wonder how they plan on doing Surround with a single port :/

Doesn't DisplayPort somehow allow you to run more than one monitor from it?...I'm sure I read that somewhere.

Burty117, TechSpot Chancellor, said:

Doesn't DisplayPort somehow allow you to run more than one monitor from it?...I'm sure I read that somewhere.

Aah! So you are right, my good sir:

[link]

That's how they pull it off. I wonder how G-Sync works in this config, since I thought G-Sync used some kind of AUX signal that only DisplayPort has to sync the fps and refresh rate? Only time will tell.

1 person liked this | cliffordcooley, TechSpot Paladin, said:

That's how they pull it off. I wonder how G-Sync works in this config, since I thought G-Sync used some kind of AUX signal that only DisplayPort has to sync the fps and refresh rate? Only time will tell.
I'm only guessing, which would mean the signal is broadcast. And if the monitor recognizes this signal, it will take advantage and refresh accordingly. As for the monitors that don't support GSync, they will continue to refresh at their predefined rates.

Looking at the link above, it looks as if DisplayPort 1.2 will only support one 4K monitor. It's a bandwidth limit on the number of pixels.
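The one-4K-display-per-port limit follows from rough bandwidth arithmetic. The figures below are the commonly quoted DisplayPort 1.2 numbers (4 lanes at 5.4 Gbit/s with 8b/10b encoding) for uncompressed 24-bit colour, ignoring blanking overhead:

```python
# Back-of-the-envelope DisplayPort 1.2 bandwidth check (assumed figures:
# 4 lanes x 5.4 Gbit/s HBR2, with 8b/10b encoding leaving ~80% for pixel data).
raw_gbps = 4 * 5.4
effective_gbps = raw_gbps * 8 / 10                # ~17.28 Gbit/s usable

def stream_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel-data rate for one video stream, ignoring blanking."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

uhd60 = stream_gbps(3840, 2160, 60)               # ~11.9 Gbit/s
fhd144 = stream_gbps(1920, 1080, 144)             # ~7.2 Gbit/s (this Asus panel)

print(f"Effective link rate:  {effective_gbps:.2f} Gbit/s")
print(f"One 4K@60 stream:     {uhd60:.1f} Gbit/s (two would need {2 * uhd60:.1f})")
print(f"One 1080p@144 stream: {fhd144:.1f} Gbit/s")
```

So a single DisplayPort 1.2 link can carry one 4K@60 stream, or several lower-resolution streams via MST, but not two 4K@60 streams.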

GhostRyder said:

Aah! So you are right, my good sir:

[link]

That's how they pull it off. I wonder how G-Sync works in this config, since I thought G-Sync used some kind of AUX signal that only DisplayPort has to sync the fps and refresh rate? Only time will tell.

Oh *$%#, he's right, I totally forgot you can do that. I don't mess with DisplayPort as much, except with the mini-to-HDMI adapters on my HD 6990s. Since one port can handle four 1080p displays, now we just have to wait on the price to go down and a confirmation of how many displays a DisplayPort will drive with G-SYNC involved (which I would still assume, based on my reading, would be 4).

Now my biggest question is who will be first to release one of their top-tier cards with dual 8-pin power and better power delivery for higher overclocking. At this point whoever wins that race will get a check for 2 GPUs and waterblocks. Between G-SYNC and Mantle, my head is spinning in circles on decisions. I watched the Linus video on it a few days ago and they did sound impressed, but I really want to see it with my own eyes before I judge anything, and of course we have to wait on driver maturity, as that will be a defining factor in all of this. I still don't see much point in gaming at FPS beyond the 60-75Hz threshold yet, but this may convince me otherwise.

1 person liked this | dividebyzero, trainee n00b, said:

Yeah, Linus has had a G-Sync screen for like 2 weeks now and he does say it's really good, but he can't really review it properly as he can't show us due to the way the technology works; it's something we need to see ourselves, and he is in contact with Nvidia to have some kind of "G-Sync day", but he did say that could be a while off.

There is a downloadable demo available if you haven't already seen it. Check the link in this PC Per article
