Asus monitor with Nvidia G-Sync now up for pre-order

Scorpus

TechSpot Staff
Staff member
Back in October, Nvidia announced G-Sync, a new technology designed to eliminate screen tearing and stuttering in games. Through the inclusion of a dedicated chip inside the monitor itself, the monitor's refresh rate can be synchronized to the GPU's render...

[newwindow="https://www.techspot.com/news/55037-asus-monitor-with-nvidia-g-sync-now-up-for-pre-order.html"]Read more[/newwindow]
 

VitalyT

Russ-Puss
TechSpot Elite
This came out a little too late, imo.

Today DELL released their long-awaited UP2414Q, which sets a new benchmark for professional monitors in terms of both quality and pricing.

When Nvidia releases G-Sync for a similar 4K product, I will be interested, but for now I'd rather have higher DPI than just a higher refresh rate.
 

LukeDJ

TS Maniac
This came out a little too late, imo.

Today DELL released their long-awaited UP2414Q, which sets a new benchmark for professional monitors in terms of both quality and pricing.

When Nvidia releases G-Sync for a similar 4K product, I will be interested, but for now I'd rather have higher DPI than just a higher refresh rate.
I dunno man, I'd rather have the G-Sync monitor at this point, for gaming anyway. It provides more benefits than extra pixels do, in my opinion.
 

VitalyT

Russ-Puss
TechSpot Elite
I dunno man, I'd rather have the G-Sync monitor at this point, for gaming anyway. It provides more benefits than extra pixels do, in my opinion.
Either way, it's one hell of a compromise when it comes to choosing one over the other. You'd definitely want the higher-DPI one for everything but gaming, and the higher-refresh one for gaming. Having both is too awkward for most users, and we're not likely to see a product that combines the two before the end of next year...

But then again, many gamers today have already upgraded to 1440p and 1600p monitors, which makes this new higher-refresh 1080p product a big step back. Myself, I've been using a DELL U3014 since April this year, and a 4K model would be the next logical step; there's no way I'd swap it for a 1080p monitor, I don't care if it had a 1,000GHz refresh rate.
 

LukeDJ

TS Maniac
Either way, it's one hell of a compromise when it comes to choosing one over the other. You'd definitely want the higher-DPI one for everything but gaming, and the higher-refresh one for gaming. Having both is too awkward for most users, and we're not likely to see a product that combines the two before the end of next year...

But then again, many gamers today have already upgraded to 1440p and 1600p monitors, which makes this new higher-refresh 1080p product a big step back. Myself, I've been using a DELL U3014 since April this year, and a 4K model would be the next logical step; there's no way I'd swap it for a 1080p monitor, I don't care if it had a 1,000GHz refresh rate.
Very true. Anyway, what do I care, I can't afford either :p
 
Reactions: cliffordcooley

Skidmarksdeluxe

TS Evangelist
$500 for some lousy 24" TN monitor?... No thanks, I'll most definitely pass. You've gotta be some hardcore, bloody-minded gamer to even consider this. If it was an IPS monitor the price would've been a bit easier to swallow, but even that's too much. I can see this G-Sync thing being a very niche product for the time being. Hopefully Nvidia can get the price of this $175 chip down to twenty bucks, because that's all it's worth as far as I'm concerned. Who's running the show at Nvidia nowadays? Apple execs?
 
Reactions: cliffordcooley

Burty117

TechSpot Chancellor
I'm getting this monitor early next week. I've currently got a 1440p monitor and I'm quite happy to be going "back" to 1080p. At 1440p (even with a GTX 780) it's bloody hard to run the latest games with all the candy turned up AND stay at 60fps. Battlefield 4 is an example: with V-Sync enabled you get serious lag, and without it you get some of the worst tearing I've ever seen. I'm quite happy to drop down to 1080p and get 144Hz PLUS G-Sync doing its thing; to me, that's a serious upgrade.

If I were a photographer it would be a downgrade, but as a gamer it's an upgrade.
 
Reactions: LukeDJ

Vrmithrax

TechSpot Paladin
I'm sure the monitor's performance will be great... But for those of us running AMD graphics, it represents a potentially massive investment to change over. Yay for proprietary hardware locked to a single manufacturer! Bleh... (sad face)
 

spencer

TS Addict
$300 for a little chip? I'd rather CrossFire and go with V-Sync; I'd still get better performance.
 

LNCPapa

TS Special Forces
Some of you guys making snap judgments need to do a bit more research on G-Sync. It's not only about refresh rate but more about frame presentation. I will agree though that the price is a bit hard to swallow based on the monitor's other limitations.
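
To make the frame-presentation point concrete, here is a minimal simulation (Python, with made-up frame times; the quantize-to-the-next-tick behaviour is a simplified V-Sync model, and real V-Sync can also stall the GPU):

[code]
import random

REFRESH_HZ = 60
TICK_MS = 1000.0 / REFRESH_HZ  # fixed refresh interval, ~16.7 ms

def present_times(frame_times_ms, adaptive):
    """Map GPU frame completion times to on-screen presentation times."""
    done, out = 0.0, []
    for ft in frame_times_ms:
        done += ft
        if adaptive:
            out.append(done)  # G-Sync style: refresh the moment the frame is ready
        else:
            # V-Sync style: the frame waits for the next fixed refresh tick
            out.append((int(done // TICK_MS) + 1) * TICK_MS)
    return out

random.seed(1)
frames = [random.uniform(17.0, 23.0) for _ in range(8)]  # ~50 fps load, below 60 Hz

for adaptive in (False, True):
    t = present_times(frames, adaptive)
    gaps = [round(b - a, 1) for a, b in zip(t, t[1:])]
    print("G-Sync" if adaptive else "V-Sync", gaps)

# V-Sync gaps jump between 16.7 and 33.3 ms (visible judder), while the
# adaptive gaps track the actual ~20 ms frame times -- same average rate,
# very different frame presentation.
[/code]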
 
Reactions: LukeDJ

EEatGDL

TS Evangelist
I'm buying one of these. I was going to replace my seven-year-old 17" monitor with a gaming monitor in January anyway, and this is within my budget, so I'll gladly do so with G-Sync.
 

technogiant

TS Member
Personally, speaking as a gamer, I think the race to 4K is the wrong way to go, and that it will hold back the development of photorealism.
If game developers have to ensure that current hardware can run their games at 60fps on 4K monitors, then they are going to have to turn down the effects they use (the quick sums below make the trade-off concrete).
I'd much rather see 1080p hang around for a while and have developers work on techniques that increase realism at that resolution, rather than spend ever more GPU power just pushing pixels... more pixels do not equate to more realism, guys.
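
A rough version of those sums (a back-of-the-envelope Python sketch; the nanosecond figures are plain arithmetic from pixel counts and the 60fps target above, not a real GPU model):

[code]
# Per-pixel time budget at a fixed 60 fps target: 4K has exactly 4x the
# pixels of 1080p, so the time a GPU can spend shading each pixel drops 4x.
FPS = 60
for name, w, h in (("1080p", 1920, 1080), ("4K", 3840, 2160)):
    pixels = w * h
    budget_ns = 1e9 / (pixels * FPS)  # nanoseconds available per pixel
    print(f"{name}: {pixels:,} px -> {budget_ns:.1f} ns/pixel at {FPS} fps")

# 1080p: 2,073,600 px -> 8.0 ns/pixel at 60 fps
# 4K:    8,294,400 px -> 2.0 ns/pixel at 60 fps
[/code]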
 

VitalyT

Russ-Puss
TechSpot Elite
...more pixels do not equate to more realism, guys.
More pixels replace the need for spatial antialiasing, which exists because individual pixels are too visible. Modern video cards spend a lot of power implementing it (supersampling). With high-DPI displays we can switch spatial antialiasing off without any sacrifice in quality, and the video card can spend those resources rendering the higher resolution natively.

This will look even better than supersampling, because the latter is an approximation.
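
A quick way to see why (a back-of-the-envelope Python sketch; it assumes 4x supersampling, i.e. four shaded samples per output pixel, and compares sample counts only, not exact GPU cost):

[code]
# 4x SSAA at 1080p shades as many samples as a native 4K render, so the
# shading cost is in the same ballpark -- but the native render keeps every
# sample on screen instead of averaging groups of four down to one pixel.
w, h = 1920, 1080
ssaa_samples = w * h * 4        # 4x supersampling at 1080p
native_4k = (2 * w) * (2 * h)   # native 3840x2160
print(ssaa_samples, native_4k, ssaa_samples == native_4k)
# -> 8294400 8294400 True
[/code]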

I have two systems here to compare:

I play StarCraft 2 on both: on a DELL U3014 (2560x1600) with anti-aliasing set to maximum, and on the new MacBook Pro 15" (2880x1800) with antialiasing switched off. The latter looks better, crisper, and altogether more eye-pleasing.

So, yeah, more pixels do equate to more realism.
 

GhostRyder

TS Evangelist
I'm getting this monitor early next week. I've currently got a 1440p monitor and I'm quite happy to be going "back" to 1080p. At 1440p (even with a GTX 780) it's bloody hard to run the latest games with all the candy turned up AND stay at 60fps. Battlefield 4 is an example: with V-Sync enabled you get serious lag, and without it you get some of the worst tearing I've ever seen. I'm quite happy to drop down to 1080p and get 144Hz PLUS G-Sync doing its thing; to me, that's a serious upgrade.

If I were a photographer it would be a downgrade, but as a gamer it's an upgrade.
I thought you were going to invest in a 780 Ti anyway? Wouldn't that bump your performance up just fine? I think buying that monitor might feel like more of an upgrade than it really is; its price is set too high for what it actually gives you. If I were you, I'd go with a 780 Ti and just overclock it to keep the 1440p eye candy going.

I dunno guys, I like the idea of G-Sync, but at this price, without actually seeing some kind of performance data (Linus hasn't even been able to post a preview yet, and there's really nothing outside Nvidia's controlled environments showcasing this), it's a very hard sell. I'll gladly stick with my 3-way Eyefinity (or, depending on whether the 290X stays at its current price, Nvidia Surround) over getting one G-Sync monitor.
 

cliffordcooley

TS Redneck
Realism as in a 720p movie compared to a 4K animation: if it still looks like an animation, it doesn't matter how many pixels you give it, the 720p movie will still look more realistic. Make the games realistic first, then add more pixels.

And in the meantime, we can all benefit from G-Sync.
 

Burty117

TechSpot Chancellor
...more pixels do not equate to more realism, guys.
More pixels replace the need for spatial antialiasing, which exists because individual pixels are too visible. Modern video cards spend a lot of power implementing it (supersampling). With high-DPI displays we can switch spatial antialiasing off without any sacrifice in quality, and the video card can spend those resources rendering the higher resolution natively.

This will look even better than supersampling, because the latter is an approximation.

I have two systems here to compare:

I play StarCraft 2 on both: on a DELL U3014 (2560x1600) with anti-aliasing set to maximum, and on the new MacBook Pro 15" (2880x1800) with antialiasing switched off. The latter looks better, crisper, and altogether more eye-pleasing.

So, yeah, more pixels do equate to more realism.
I have a 1440p monitor and a 1080p monitor, and I can tell you for a fact that I get higher framerates in Battlefield 4 at 1080p with anti-aliasing turned right up than at 1440p with no anti-aliasing.

Also, more pixels do not equate to more realism. I'd rather they concentrated on more realistic lighting, higher polygon counts on characters, facial animation, and better textures; that is more realism. For example, take an old PS1 game, the original Gran Turismo: if we could play that today at 4K, guess what? It wouldn't be any more realistic. Give it better textures, lighting, etc., and it would look more realistic.

I can see the difference between 1080p and 1440p, I really can, but it's a much smaller difference than 480p vs 1080p. Yes, the image looks better, but the game's realism isn't any better; it just looks crisper. And considering the performance trade-off, I'd rather have a slightly less crisp image and much better framerates/graphics than a slightly crisper image. This is coming from someone who "upgraded" to 1440p for gaming, and man, do I regret it xD

But for everything else (videos, web browsing, music/video editing suites) it's much better. For gaming, though, I wish I'd passed.
 

technogiant

TS Member
So, yeah, more pixels do equate to more realism.
...but no... no matter how "crisp" your cartoon image, it's still a cartoon image. Point accepted about anti-aliasing being wasteful of resources, though. But on that same note, why don't I see aliasing when watching a Blu-ray movie at 1080p? If game image quality could even approach the realism of Blu-ray playback at 1080p, I'd be more than happy and wouldn't want any higher resolution.
 

Burty117

TechSpot Chancellor
I thought you were going to invest in a 780 Ti anyway? Wouldn't that bump your performance up just fine? I think buying that monitor might feel like more of an upgrade than it really is; its price is set too high for what it actually gives you. If I were you, I'd go with a 780 Ti and just overclock it to keep the 1440p eye candy going.
Well, I'm lucky enough that I don't have to pay quite so much for the monitor, so I'm happy to get one. And sure, the 780 Ti will boost performance, but not to night-and-day levels, whereas playing BF4 on a 1440p monitor versus a 1080p monitor is an (almost) night-and-day difference on a non-overclocked 780. Hell, I probably won't bother with the 780 Ti if this screen does come in, as the 780 is almost overkill at 1080p.

In games I notice the difference between 60Hz and 75Hz relatively clearly. That's because my 75Hz monitor is only 1280x1024, so I can hold 75fps in pretty much anything I throw at it, and I use that screen more often than I should because it is just so smooth. That's why I want this G-Sync monitor: I want that smoothness at a higher resolution.
 

howzz1854

TS Evangelist
Honestly, the screen tearing with V-Sync off doesn't bother me at all. I turn V-Sync off for every game I play, and I mostly play FPS games. I'd much rather have an S-IPS panel with a decent refresh rate and great color reproduction at 2560x1440 than a lousy TN with this G-Sync stuff. I'd rather put the money toward another graphics card upgrade.
 

VitalyT

Russ-Puss
TechSpot Elite
To compare higher DPI against higher refresh rate, you need the kind of content that can take proper advantage of each. StarCraft 2, which I used as an example, has a level of detail that looks so much better at higher DPI, while a higher refresh rate would make no difference for such a game. Surely there is plenty of content for which it's the opposite, like FPS games. That's not to downplay the importance of higher-DPI screens, but rather to say there isn't yet enough content good enough for them. Then again, that's only as far as computer games go; for everything else, higher DPI represents much higher value than a higher refresh rate (above 60Hz).
 

Guest

IPS... or GTFO.

This panel is worth $99.99 max... I personally won't even pay that for it.

My DELL 30" laughs at this monitor. (paid $450 for it) I would rather live with tearing then with a washed out picture on a tiny screen.