For gamers, display lag is a very real concern, and display processing lag is a specification that is nebulously reported, if reported at all, for just about all LCD displays. Ultimately, what matters isn't GTG or full-on/full-off pixel response times, or what's reported on the spec sheet, but the holistic latency of the monitor compared to something we can all agree is lag-free. In the past we've used the HP LP3065, which is an excellent baseline with almost no processing lag due to the absence of a hardware scaler, but we're tweaking things a bit (and yours truly doesn't have said HP LCD), so we're going to do something different.

One of the things we've seen requested pretty frequently is a comparison between the bona fide refresh rate of a good old CRT and the LCD panels under test. I hopped into my time machine, took a trip to October 1999, and grabbed me a 17” Princeton EO700 CRT. This bad boy supports 1024x768 at a blistering 85 Hz. Oh, it also weighs a million pounds and makes weird sounds when turning on and off.

My ancient CRT - it's a beast

To do these tests, we connect the CRT to a DVI-to-VGA adapter on our test computer's ATI Radeon HD 5870, and the LCD panel under test to DVI using an HDMI-to-DVI cable. I debated for some time the merits of running both displays off the same VGA signal; however, what really matters here is how the two displays compare in the configuration you, our readers, are most likely to use. In addition, using the VGA input on any LCD is bound to add lag of its own, since going from analog to digital signaling is definitely a hardware scaler operation, compared to the entirely digital DVI datapath. We run the CRT at 1024x768 and 85 Hz, its highest refresh rate, and clone the display to the LCD panel.

We run the same 3DMark03 Wings of Fury benchmark on a constant loop and take a series of photos with a fast camera (in this case, a Canon 7D with a 28-70mm F/2.8L) with the aperture wide open for fast shutter speeds, here 1/800 of a second. Any difference on the demo clock will be our processing lag, and we'll still get a good feel for how much pixel response lag there is on the LCD.

The only downside is that this means our old data is no longer a valid reference.

To compute the processing lag, I do two things. First, I watch for differences in the clock between the CRT and LCD, noting these whenever they are visible; I did this for 10 captures of the same sequence. Second, one can compute the processing difference by taking into account the FPS and the frame number difference:

processing lag = frame number difference / FPS
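
As a quick worked example of that formula (a minimal sketch; the framerate and frame numbers below are made up for illustration, not values from our captures), a few lines of Python do the conversion:

    # Hypothetical numbers: convert the frame-number difference between the
    # CRT photo and the LCD photo into processing lag, in milliseconds.
    fps = 450.0          # benchmark framerate around the moment of capture (assumed)
    crt_frame = 1234     # frame number visible on the CRT (assumed)
    lcd_frame = 1230     # frame number visible on the LCD (assumed)

    lag_ms = (crt_frame - lcd_frame) / fps * 1000.0
    print(f"Processing lag: {lag_ms:.2f} ms")  # about 8.9 ms with these numbers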

Of course, not every one of those frames is actually drawn on the display, but we can still glean the time difference between the respective frames with much more precision than by averaging the demo clock readings, which only resolve down to hundredths of a second. An example shot of what this difference looks like is the following:

The G2410H is a pretty decent starting point for our brave new test method, considering that at its core is a relatively normal S-TN panel. This is largely in line with what we expected to see: the processing lag is very small at around 9 ms. We'll get a better feel for where it stands as we add more monitors, but on the whole this is very interesting.

Dell G2410H - Processing Lag
Averaging Time Difference: 9.0 ms
FPS Computation Time Difference: 8.59 ms

When it comes to actual pixel response lag, we see with the G2410H what we usually see on virtually all LCDs: one ghost image before and after the dominant frame, even at shutter speeds fast enough that we aren't accidentally sampling the next refresh. This still corresponds to roughly 16 ms. It's interesting that the G2410H and other LCDs exhibit this ghosting.
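
That figure lines up with roughly one refresh interval, assuming the LCD side of the clone is running at its usual 60 Hz (an assumption on our part; the CRT side is at 85 Hz):

    # One refresh interval at an assumed 60 Hz LCD refresh rate, in milliseconds
    refresh_hz = 60.0
    frame_time_ms = 1000.0 / refresh_hz
    print(f"{frame_time_ms:.1f} ms per refresh")  # ~16.7 ms, close to the ~16 ms of ghosting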

At the end of the day, LCD performance still isn't quite at parity with CRTs. But at the same time, I doubt anybody is going to want to borrow my time machine to buy one and replace their LCD. You're getting a heck of a lot more screen real estate in a smaller, lighter package, for less electrical power, and you don't have to look crazy doing it. Sure, there are a handful of hardcore gamers out there who swear by their CRT's faster refresh rate, but could a single one of them really discern individual pulses of a 9 ms flashing strobe?

Comments

  • TechnicalWord - Friday, May 7, 2010 - link

    With AMD video cards you can get rid of black borders all around as follows: bring up the latest CCC, go to Desktops & Displays, RIGHT-CLICK ON THE DISPLAY ICON AT THE BOTTOM under "Please select a display" and choose Configure, then select Scaling Options and set Underscan-Overscan to 0%.
  • strikeback03 - Wednesday, May 12, 2010 - link

    Thank you very much! That has been bothering me since I built my HTPC in January. Wonder why AMD set it that way by default.
  • Stokestack - Friday, May 7, 2010 - link

    Bring back common sense. Glossy screens are asinine. GJ on that at least, Dell.
  • quiksilvr - Friday, May 7, 2010 - link

    Glossy screens are asinine on a laptop you take outdoors. Monitors are usually indoors, and it makes sense for them to have some "glatte" or "moss" finish (half gloss, half matte, as seen on LCD TVs). Full gloss really depends on the lighting of where it is.
  • chromatix - Friday, May 7, 2010 - link

    Time was a "green screen" meant just that - a text terminal with green phosphor on the front and nothing else. Nowadays you only see them attached to obsolete mainframes.
  • jonyah - Friday, May 7, 2010 - link

    I have two of the G2410s (the non-H version) that I got with a discount code from dell.com for only $200 each after shipping. I guess I got really lucky, because I watched the price and discounts daily hoping to get more and it never got down that low again. A $140 price hike for an adjustable stand just isn't worth it, though. Get these down below $250 and it would be worth it. The screen is really nice, though I wish they'd do a 27" LED monitor and/or up the res to 1920x1200 or higher. I do miss that extra 180 pixels in height on this screen.
  • casteve - Friday, May 7, 2010 - link

    I have two of the non-H as well. I missed the $200 sale, but got them for ~$250 last fall. Definitely NOT the monitor you want for professional graphics design, but great for mixed-use productivity, casual streaming/movies, and gaming (especially at $250 or less). The auto mode was a little psychotic (brightness would vary in constant room lighting), so I moved to standard mode and manual settings. 15-17W consumption.
  • BernardP - Friday, May 7, 2010 - link

    Nvidia Geforce video drivers have the "Create Custom Resolution" and "Use NVidia Scaling" options that allow (with digital output) creating and scaling any custom or missing standard resolution at the correct aspect ratio. The trick is that scaling is done in the videocard, while a native-resolution signal is sent to the monitor. In essence, the monitor is displaying at its native resolution and doesn't "know" it is showing a lower custom resolution.

    For example, I find it more comfortable to use a custom 1536x960 resolution on a 24-inch (16:10) monitor.

    For my parents' setup, I have created a custom 1080x864 resolution that is comfortably bigger for their old eyes while respecting the 5:4 ratio of their 19 inch LCD.

    It's too bad ATI is not offering these options.
  • Guspaz - Friday, May 7, 2010 - link

    I did try that; I have a G2410 (hey, I bought it because it was on sale for dirt cheap at the time) and it's useless for 4:3 games.

    The G2410 has the annoying tendency to stretch *ANY* 4:3 resolution that I've tried out to widescreen. WarCraft 3 doesn't look so hot.

    I did mess about with the nVidia drivers to try to create a custom resolution that would let me run 4:3 games at actual 4:3, but didn't get anywhere.
  • aftlizard - Friday, May 7, 2010 - link

    I think it is unacceptable for a new monitor, even if it is just an update of an existing model, not to have HDMI. I use monitors at home with my laptop, and my laptop, like many others, does not have DVI out but rather VGA and HDMI. I cannot use VGA if I am going to watch HDCP movies or use anything else that requires HDCP. Sure, I could purchase an HDMI-to-DVI adapter, but why not just add that extra HDMI port to give everybody the chance to use their connections straight out of the box, without having to purchase an adapter or look for a software solution?
