This editorial is an open response to AnandTech’s Desperately Seeking Quality LCDs article published last June 17.
For the last 2+ years there have been two developments in the LCD market that I know I’m not alone in disliking:
(1) Glossy panels: you either love them or hate them – I’m in the latter group.
(2) So-called LCD “post processing”, used on many high-end displays.
Furthermore, the response time race, also known as the “ms race”, has had a very negative effect on LCD quality. This somewhat resembles the megapixel race in point-and-shoot digital cameras, where marketing went crazy for higher megapixel counts at the cost of reduced low-light performance.
It is commonly held that 60Hz is a refresh rate at which most people find an LCD pleasing to look at, and this is also close to what our eyes are capable of processing. 60Hz is also what 99% of LCDs sold today operate at, with very few exceptions.
One second = 1000ms, so at 60Hz each frame lasts 1000ms / 60 ≈ 16.7ms.
What this means is that at 60Hz the screen is redrawn once every ~16.7ms. So why do we see LCD displays continuing to push response times below 16ms when the screen is only redrawn every 16.7ms at 60Hz? The answer is simple: marketing.
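The arithmetic above can be sketched in a few lines (a hypothetical helper, not from the original article):

```python
def frame_interval_ms(refresh_hz: float) -> float:
    """Time between screen redraws, in milliseconds."""
    return 1000.0 / refresh_hz

print(round(frame_interval_ms(60), 1))  # 16.7 ms per frame at 60 Hz
print(round(frame_interval_ms(75), 1))  # 13.3 ms per frame at 75 Hz
```

Any response time quoted below the frame interval cannot translate into more visible updates at that refresh rate.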
You may recall that when LCD panels first came to market there were serious problems with ghosting. This wasn’t as big an issue with more static output like the Windows desktop or application windows, but it became noticeable when watching movies or gaming. The ghosting issue was progressively fixed over the years, but it was something that everyone using those early displays witnessed and disliked. And thus it became easy to market lower response times, because back then they really made a difference.
In today’s mainstream to high-end market this long ago stopped being a problem, and yet the marketing around it still lingers on. So in order to get “grey to grey” response time values (the time it takes to change one pixel from one shade of grey to another shade of grey) down to marketable specs like 2ms, other things have had to take a back seat.
I became aware of this with my last LCD purchase about a year ago, a 27” Dell 2709W monitor that cost about $800. It had great color output and was very pleasing to look at after reducing the brightness from 100 to 0. I really liked the monitor, but eventually it became evident there was some type of lag when compared to my previous LCD display, a Eizo S2000 20”. And when I say lag, I’m not referring to ghosting, but the kind of lag where the screen takes some time to reflect the change of an action performed by the user, measured in milliseconds.
After reading some reviews on my newly bought LCD (yes I know, great timing, but I hope you don’t end up in the same boat as me) it became evident that the display had a so-called “input lag” of approximately 50-60ms. If you think about the low millisecond numbers you might be a bit shocked, and well, so was I.
To reinforce my previous point about the kind of lag I was experiencing: if you watched a movie this wouldn’t be a problem, you would simply see the movie 60ms later than your computer renders it. When playing a game you won’t necessarily notice the lag visually either. The problem occurs when you, for example, fire a railgun perfectly aimed at an enemy moving in a straight line across your screen. Even though you are aiming right at his head, your shot will be 60ms behind him, since what you see is 60ms-old information. This becomes a problem in fast-paced games like Quake 3 or Counter-Strike.
So in a nutshell, you won’t be able to hit a moving target even when you aim right at it – sounds fun, right? As it turns out, this is exactly how I found out about “input lag”.
So what causes input lag?
In order to lower grey to grey response times to (marketing friendly) values like 2ms, the display has to store the image in a buffer so it can do advanced post-processing on it, determining which pixels on the LCD need to be boosted (overdriven) to reach those low 2ms response times.
But that means that if the display is buffering 2 frames, you are effectively ~33ms behind the action before the transistors have even been told what to draw (from before: at 60Hz, 1 frame ≈ 16.7ms, so 2 frames ≈ 33ms).
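A minimal sketch of this buffering cost, assuming the display holds whole frames before drawing (function name is mine, for illustration):

```python
FRAME_MS = 1000.0 / 60  # ~16.7 ms per frame at 60 Hz

def buffer_lag_ms(buffered_frames: int) -> float:
    """Lag added before the panel even starts drawing,
    if the display holds this many frames for post-processing."""
    return buffered_frames * FRAME_MS

print(round(buffer_lag_ms(2), 1))  # ~33.3 ms for a 2-frame buffer
```

Note that this is only the buffering component; panel response time and any scaler delay come on top of it.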
Sounds absurd? Absolutely, but so is cramming 15MP onto a compact point-and-shoot camera with a sensor the size of a small grain. Here’s a good example: in November 2005, the renowned photography site dpreview.com wrote this:
“As if noise and detail levels weren’t bad enough from the latest batch of digital cameras based around the 8 megapixel Sharp CCD, they’ve today announced one that crams even more pixels into a tiny package. The RJ21W3BA0ET is a ten megapixel 1/1.7″ CCD with 3766 horizontal and 2801 vertical pixels (total) and a pixel pitch of just 2.05 µm. We always kind of hope that the next compact sensor announcement will have some real innovation like higher sensitivity and lower noise, but it appears as though market forces just want ‘more megapixels’.”
Know what was introduced late last year? The Canon Digital IXUS 980 IS, it has the exact same sensor size but offers 14.7MP. Want to make a bet on how improved these cameras are at taking pictures in difficult conditions?
Input lag is never mentioned in the technical specifications for LCDs. Do you think anyone would buy a new $800 monitor with “60ms input lag” printed in big bold letters on the box? So with all that whining out of the way, what is the point of this article, you may ask? Jarred Walton over at AnandTech had a very similar issue and complaint with the current LCD market, i.e. that it sucks.
There are a few types of display technologies used in monitor LCDs today. The most common are TN (twisted nematic) based displays. These usually suffer from mediocre viewing angles and bad color uniformity, and are generally built to be really cheap, but they tend to have great response times.
Next up the food chain are MVA and PVA panels. A few years ago these had rather poor grey to grey pixel response times, so to combat this a post-processing buffer was introduced, and otherwise slow displays could be marketed as fast. The $800 Dell 2709W I bought was a PVA panel, while my two-year-older Eizo S2000 is MVA but lacks post-processing, so it has no input lag to speak of and its grey to grey time is “only” 16ms.
And on the top of the food chain is the IPS panel which as you can guess comes with a hefty price premium. It is not uncommon for IPS displays to cost $1000 and upwards (that much money getting you a base model, like the Eizo ColorEdge CG222W).
IPS panels carry the main advantage of TN panels – very low grey to grey response times (lower than they really need to be) – and usually have no input lag. They also have great viewing angles, and color uniformity is generally excellent.
So after reading Jarred Walton’s article I felt I should write a small article to cheer him up, as there happens to be a great IPS display based on a new derivative of the IPS technology (there are several) known as E-IPS.
E-IPS is now the “entry level” in the IPS range; the acronym stands for Enhanced In-Plane Switching. The actual monitor I’m talking about is the Dell 2209WA, and it does not cost $1000+. Dell went a bit crazy and released this display with an MSRP of $400. But it doesn’t end there. Since it was released this display has been on rebate for $290, and most recently as low as $209. A friend of mine bought one last week, so naturally I had to check it out.
And wow, what can I say?
I used a Canon 40D DSLR with an EF 50mm f/1.4 lens to take some pictures with the display running in clone mode alongside his old Eizo CRT display (a CRT monitor has none of the issues I have talked about so far in this article).
All test images were shot at the LCD display’s native resolution of 1680×1050 at 60Hz or 75Hz; the CRT was rendering the same resolution and refresh rate, but since it is a 4:3 display and not 16:10 widescreen, its image was stretched. This doesn’t affect the CRT as it can render any supported resolution and refresh rate without issues. However, note that if you run an LCD at anything except its native resolution you can incur even more lag, due to a built-in scaler changing the aspect ratio of the rendered image. I didn’t test the Dell 2209WA at anything but its native resolution.
With a shutter time of 1ms I took several hundred pictures of the two displays rendering a timer counting down with 1ms accuracy. Many images had to be discarded because the CRT draws the image from top to bottom, so only a small band of the CRT screen is actually bright at any given millisecond (while your eyes perceive a complete picture on screen).
Next I calculated the difference between the times shown on the two displays. In most cases the venerable old CRT was about 20ms ahead, but to my surprise the LCD was tied in some pictures, and in 3 of them it was, amazingly enough, ahead. You can download the pictures I used here.
I added the differences together and then calculated the average input lag of the Dell monitor relative to the CRT. To my amazement it was only 14.27ms. This for a monitor that challenges several TN panels on price!
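The averaging step is trivial, but for clarity, here is a sketch of it with made-up per-photo readings (the numbers below are hypothetical, not my actual measurements):

```python
# Hypothetical per-photo differences in ms (CRT time minus LCD time);
# negative values mean the LCD was ahead, as happened in a few shots.
diffs_ms = [20, 18, 15, -2, 0, 22, 19, 13, -1, 17, 21, 14]

avg_lag = sum(diffs_ms) / len(diffs_ms)
print(f"average input lag: {avg_lag:.2f} ms")
```

Averaging over many photos is what makes the result meaningful, since any single frame can catch either display mid-redraw.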
I also tested both displays while rendering 3DMark06 in clone mode. Suffice it to say that the differences I could find were so small it was kind of ridiculous. Here is just one example to prove the point, and this was the image with the biggest difference I could find out of a hundred pictures.
If you click on the image to see the full version, notice that the FRAME count is off by one, and there is a tiny difference in the lantern light right next to the box showing the FPS, TIME and FRAME. To say the least, this input lag is impossible to notice in a real-life scenario, so if you were aiming for a headshot you would still hit dead centre.
But it does not end there. This display has great color accuracy. I calibrated it with a ColorVision Spyder 2 Pro, and the RGB curves only needed adjustment down to 98/100/99 respectively to reach a 6500K white point.
After calibration the difference was noticeable, but it was the smallest difference I have seen on the many monitors I have calibrated, including my Eizo S2000. Here is a shot of the three RGB lines: if they form a perfect 45° angle, no correction is needed, and as you can see they are not far off.
For a much more in-depth look at the monitor read this review of it at prad.de.
Now, they do reach a different conclusion than I do on the issue of input lag, but take that with a grain of salt, since many people have independently confirmed an input lag of no more than 20ms in this very lengthy thread at HardOCP.
This thread on Digital Photography Review also praises its color accuracy for usage in photography work (for the price, of course).
One last thing that is very cool about this display: with some tweaking it’s able to do 75Hz over DVI. Many current LCD displays claim to offer this, but it is only a cheat – the panel does not actually render at 75Hz, it simply discards the extra frames sent by the graphics card.
Not so with the Dell 2209WA. Below are two pictures; the first was taken at a refresh rate of 60Hz with my camera’s shutter speed at 1/15. As you can see there are 5 mouse pointers in a trail, one of which is weak (it is fading, so we ignore it).
A shutter time of 1/15 equals 66.7ms (1000ms / 15 = 66.7ms). Since the mouse was moving while the picture was taken and the monitor redraws the image every 16.7ms, we see 4 full mouse pointers (16.7ms × 4 = 66.8ms, which matches the exposure time).
Next I installed the monitor drivers and followed the advice by ToastyX to set the timings manually like this.
After this we get the result in the picture below – notice that there is now one extra sharp pointer. This is because 1000ms / 75Hz = 13.3ms, so in the exposure window of 66.7ms we can now fit 5 pointers (13.3ms × 5 = 66.7ms).
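The pointer-counting argument above generalizes to any shutter speed and refresh rate; a small sketch (function name is mine, for illustration):

```python
def pointers_in_exposure(shutter_s: float, refresh_hz: float) -> float:
    """How many distinct redraws (mouse pointers) fit in one camera exposure."""
    exposure_ms = 1000.0 * shutter_s
    frame_ms = 1000.0 / refresh_hz
    return exposure_ms / frame_ms

print(round(pointers_in_exposure(1 / 15, 60), 1))  # 4.0 redraws at 60 Hz
print(round(pointers_in_exposure(1 / 15, 75), 1))  # 5.0 redraws at 75 Hz
```

Seeing the predicted extra pointer in the photo is what confirms the panel really redraws at 75Hz instead of silently dropping frames.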
This makes absolutely no difference on a 2D Windows desktop, but when gaming you can raise the frame rate with VSync on (preferably with triple buffering), and for very fast-paced games like Quake 3 this results in more fluid motion. It does require your computer to render more than 60fps, so forget about this helping in Crysis.
You may be thinking that I am contradicting myself, because I said at the beginning of this article that the human eye is only able to see 60Hz. Well, I did not say exactly that. I said:
“and this is also close to what our eyes are capable of processing”. You have to read the fine print.
The issue with refresh rates is not the same as on CRTs, where a low refresh rate of 60Hz would give you a headache. That is due to display flicker: the image is drawn from top to bottom by the electron gun, and only a small portion of the screen is lit at any given point in time (which can be seen in the pictures I took with my camera). This very fact proves that our eyes can process more than 60Hz. I personally never found a CRT pleasing to look at below 75Hz. Above that there were diminishing returns, though I remember running my Eizo CRT at 100Hz a few years back.
I should also add that I tested how much input lag the Dell monitor had at 75Hz, hoping it might be even lower, but I came to the same conclusion: 14.91ms, which is well within the margin of error (at 60Hz I measured 14.27ms).
So what’s not to like?
Well, in my opinion nothing really. I would buy the Dell 2209WA straight away if it wasn’t for the fact that I am very happy with my Eizo S2000 LCD, though the Dell 2209WA does beat it.
What would seal the deal for me would be this very same display upgraded to 24” and 1920×1080 at a similarly low price – then I would buy it in a heartbeat. But if you’re looking for a 22” screen that does 1680×1050, I cannot recommend the Dell 2209WA enough, and I don’t even get paid to tell you that.