Philips announces 27-inch monitor equipped with Nvidia G-Sync technology

Shawn Knight


Nvidia promised to do away with V-Sync limitations late last year with the introduction of G-Sync, a monitor technology that, in layman’s terms, will get rid of screen tearing, stutter and lag. We’re now seeing the fruits of Nvidia’s labor at this year’s Consumer Electronics Show as Philips looks to be the first manufacturer on the block (if you don’t count Asus’ modded VG248QE monitor) with G-Sync technology.

The Philips 272G5DYEB is a 27-incher with a resolution of 1,920 x 1,080. Personally, I believe a 27-inch display is best equipped with a higher resolution panel, but I digress. Moving on, the monitor can display up to 16.7 million colors and has a brightness rating of 300 cd/m² and a contrast ratio of 1000:1.

Viewing angles are locked in at 170° on the horizontal and 160° on the vertical axis, but of course it's the G-Sync technology that will propel gamers to spend dough on this display.

For those not already familiar, G-Sync is a dedicated chip built into the monitor that synchronizes the display's refresh rate to the GPU's render rate, which means images are displayed the moment they are rendered. This supposedly translates into scenes appearing instantly with sharper objects and smoother gameplay. What's more, users will be able to experience the full potential of their graphics card(s) without the frame rate cap imposed by V-Sync.
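The timing difference is easy to sketch. The following is a deliberately simplified model of the idea (my own illustration, not Nvidia's actual implementation): with a fixed-refresh V-Sync display, a finished frame waits for the next scheduled refresh tick before it appears; with a variable-refresh G-Sync display, the monitor refreshes the instant the frame is done.

```python
import math

def fixed_refresh_display_times(render_times, refresh_hz=60):
    """Fixed-refresh (V-Sync) model: a finished frame waits for the
    next scheduled refresh boundary before it appears on screen."""
    period = 1000.0 / refresh_hz  # refresh interval in milliseconds
    return [math.ceil(t / period) * period for t in render_times]

def variable_refresh_display_times(render_times):
    """Variable-refresh (G-Sync-style) model: the monitor refreshes
    the moment the GPU finishes rendering each frame."""
    return list(render_times)

# GPU finishes frames at uneven times (ms), e.g. a demanding scene
renders = [10.0, 30.0, 55.0, 70.0]
vsync = fixed_refresh_display_times(renders)    # ~[16.7, 33.3, 66.7, 83.3]
gsync = variable_refresh_display_times(renders) # [10.0, 30.0, 55.0, 70.0]
```

Note how every V-Sync'd frame is delayed to the next 60Hz boundary, and frames that just miss a boundary wait nearly a full refresh; that added, uneven delay is the stutter and lag G-Sync is designed to eliminate.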

The Philips 272G5DYEB won’t be available until this spring and when it does launch, expect to pay $649 for the opportunity. We’ll keep an eye out for similar announcements this week from other G-Sync partners including Asus, BenQ and ViewSonic.


 
*rubs hands together in anticipation*
How well BF4 performs with Mantle will be the deciding factor on whether I jump ship or not.
 
o.O

...

the price ...THE PRICE! .. WTF!?

Welcome to the overpriced world of proprietary hardware that is only interesting to a small percentage of the total population (and, if you think about it, only half of that percentage, since the other half uses AMD and won't be affected by G-Sync)... Ouch!
 
At 1920x1080 I wouldn't pay anything for it... WAY WAY too low a resolution for a 27" monitor... c'mon Philips... go big or go home!!!
 
This is only the start. There will be plenty of other hi-res monitors that will come after this. Give me a 30" 2560x1440 monitor and I'll buy it. I want something that is at or below 96 DPI, though I can settle for a little over. I want this to go with my 680 SLI.
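For reference, pixel density is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch of the math (illustration only) shows why the Philips panel disappoints and why the wished-for 30" 1440p screen lands just over the 96 PPI target:

```python
import math

def ppi(diagonal_inches, width_px, height_px):
    """Pixels per inch: diagonal pixel count over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# The Philips panel: 27" at 1920x1080
print(round(ppi(27, 1920, 1080), 1))  # ~81.6 PPI

# The wished-for 30" 2560x1440 -- a little over 96 PPI
print(round(ppi(30, 2560, 1440), 1))  # ~97.9 PPI
```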
 
Also, Asus has stated the reasons behind using TN:

"Why is the display TN rather than IPS/PVA/MVA, etc?

Not all TNs are made the same: the premium panel used in the PG278Q is of very high quality. IPS panels (and their derivatives like PVA/MVA etc) are not suitable for a multitude of reasons: 1) the response rate is simply not fast enough to react to the active change in refresh rate and 2) they cannot reliably achieve >60Hz without significantly affecting the quality of the image. IGZO technology (and LTPS – low temperature polysilicon – likewise) yields hundreds of times faster electron mobility versus standard amorphous silicon panels – and thus can provide a response rate comparable to TN (up to 60Hz currently) – but, however desirable this technology is, it is still currently cost prohibitive for many PC gaming enthusiasts in 2014, which is why ROG has used a better price:performance, high quality TN panel."
 
What about AMD FreeSync? It only requires a screen that supports the VESA-standard VBLANK plus driver support. Why spend money on proprietary things?
 
What about AMD FreeSync? It only requires a screen that supports the VESA-standard VBLANK plus driver support. Why spend money on proprietary things?

Well, after reading up on it, it doesn't do everything G-Sync does. I'm sure there's a reason Nvidia is physically putting hardware with 1.5GB of RAM into monitors; I don't think it's there for fun. I'm sure this VBLANK control will help but won't be as smooth. We'll see.
 
other 30% .. at best.
If you are talking all PC systems (netbooks, notebooks, desktops etc), it's ~60% Intel and ~20% AMD; Nvidia and others make up the rest, so from that perspective, most of the world won't have access to G-Sync.

If you are talking discrete graphics, it is (just between AMD and NV) about 60/40 NV/AMD. Those figures are based on Q1 2013, so it may have changed a bit since then, but I wouldn't think significantly.
 
Wow, everybody talks bad about 27" 1080p. Everybody but me must be running dual TITANs to push 4K.
 
Wow, everybody talks bad about 27" 1080p. Everybody but me must be running dual TITANs to push 4K.

Umm, no one said anything about a 4K 27-inch monitor.

I will take one that does 2560x1440; you can push that with a single high-end GPU.

When I was upgrading my monitor in 2010, the first thing I looked at was my budget for a screen and GPU. So I picked up a 24" 1920x1200 IPS because I knew down the road I didn't want to go SLI or Crossfire to push a 30" 2560x1600 screen. I picked up an unlocked 6950 at the time and now I'm on a 7970 GHz, and because I'm only pushing 1200p I can max out most games on this single GPU.

These are the things a person needs to consider, in my opinion, when looking at GPUs and monitors.

The people that don't plan properly buy monitors they cannot afford to push with the gpu they have....
 
I will take one that does 2560x1440; you can push that with a single high-end GPU.

When I was upgrading my monitor in 2010, the first thing I looked at was my budget for a screen and GPU. So I picked up a 24" 1920x1200 IPS because I knew down the road I didn't want to go SLI or Crossfire to push a 30" 2560x1600 screen. I picked up an unlocked 6950 at the time and now I'm on a 7970 GHz, and because I'm only pushing 1200p I can max out most games on this single GPU.
You are referencing an HD 7970 GHz Edition at 1200p and citing that you can max most games on that single GPU. There isn't a whole lot better in single-GPU land, and 2560x1440 is about 60% MORE pixels. It is very hard for any single-GPU graphics card to handle 1440p in many modern games.
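A quick sanity check on that 60% figure (illustration only):

```python
# Pixel counts behind the "about 60% more pixels" claim
px_1200p = 1920 * 1200  # 2,304,000 pixels
px_1440p = 2560 * 1440  # 3,686,400 pixels

increase = px_1440p / px_1200p - 1
print(f"{increase:.0%}")  # 60%
```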
 
You are referencing an HD 7970 GHz Edition at 1200p and citing that you can max most games on that single GPU. There isn't a whole lot better in single-GPU land, and 2560x1440 is about 60% MORE pixels. It is very hard for any single-GPU graphics card to handle 1440p in many modern games.

Doesn't a 290X or 780 Ti give playable fps at 1440p?

Hmm, looking at some of the reviews, it seems those two can manage it: some games with AA on, some with it off, and vice versa. So it seems we may still be a generation away from 1440p and 1600p with MSAA on most to all of the time on a single GPU.

Also to be clear I did say max but I'm not one of those people that has to play at 8xAA.

I am happy to lower AA and drop from ultra to very high or high to get playable fps.
 
Doesn't a 290X or 780 Ti give playable fps at 1440p?
You get minimum framerates of around 45 fps with a 290X (admittedly on Ultra preset settings), so for gaming in the heat of battle that is running pretty low. I call anything below 40 "unplayable". Synthetic benchmarks are also different from a 64-player server load. Not sure what it would look like there (better or worse?).
 