For gamers, display lag is a very real concern, and display processing lag is a specification that's nebulously reported, if reported at all, for just about every LCD on the market. Ultimately, what matters isn't GTG or full-on/full-off pixel response times from the spec sheet, but the holistic latency of the monitor compared to something we can all agree is lag-free. In the past we've used the HP LP3065, which is an excellent baseline with almost no processing lag due to the absence of a hardware scaler, but we're tweaking things a bit (and yours truly doesn't have said HP LCD), so we're going to do something different.

One of the things we've seen requested pretty frequently is a comparison between the bona fide refresh rate of a good old CRT and the LCD panels under test. I hopped into my time machine, took a trip to October 1999, and grabbed me a 17” Princeton EO700 CRT. This bad boy supports 1024x768 at a blistering 85 Hz. Oh, it also weighs a million pounds and makes weird sounds when turning on and off.

My ancient CRT - it's a beast

To do these tests, we connect the CRT to a DVI to VGA adapter on our test computer's ATI Radeon HD 5870, and connect the LCD panel under test over DVI using an HDMI to DVI cable. I debated for some time the merits of feeding both displays the same VGA signal; however, what really matters here is how the two displays compare in the configuration you, the reader, are most likely to use. In addition, using the VGA input on any LCD is bound to add lag, since going from analog to digital signaling is definitely a hardware scaler operation, compared to the entirely digital DVI datapath. We run the CRT at 1024x768 and 85 Hz, its highest refresh rate, and clone the display to the LCD panel.

We run the same 3DMark03 Wings of Fury benchmark on constant loop and take a bunch of photos with a fast camera (in this case, a Canon 7D with a 28-70mm F/2.8L) with the aperture wide open for fast shutter speeds, here 1/800 of a second. Any difference in the demo clock between the two displays is our processing lag, and we still get a good feel for how much pixel response lag there is on the LCD.
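For reference, a 1/800 s exposure holds the shutter open for just 1.25 ms, roughly a tenth of the 11.76 ms the CRT spends on each refresh at 85 Hz, so each capture freezes well under a single frame.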

The only downside is that this means our old data is no longer a valid reference.

To compute the processing lag, I do two things. First, I watch for differences in the clock between the CRT and LCD, noting these whenever they are visible; I did this for 10 captures of the same sequence. Second, one can compute the processing difference from the benchmark's FPS and the difference in frame number between the two displays:

processing lag = (frame number difference) / FPS
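To make the arithmetic concrete, here's a minimal sketch in Python; the FPS figure and frame deltas are hypothetical stand-ins rather than numbers from our captures:

```python
# Minimal sketch: convert a frame-number difference between the CRT
# and LCD captures into processing lag, given the benchmark's FPS.
# All values below are hypothetical placeholders.

def processing_lag_ms(frame_delta, fps):
    """Processing lag in ms = frame number difference / frames per second."""
    return frame_delta / fps * 1000.0

# E.g., the LCD trails the CRT by 2 frames while the benchmark
# averages 233 FPS:
print(f"{processing_lag_ms(2, 233):.2f} ms")   # -> 8.58 ms

# Averaging several captures smooths out single-shot noise; the demo
# clock alone only reads out to hundredths of a second.
deltas = [2, 2, 3, 2, 2]
avg = sum(processing_lag_ms(d, 233) for d in deltas) / len(deltas)
print(f"{avg:.2f} ms")                         # -> 9.44 ms
```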

Of course, not every one of those frames is written to the display, but we can still glean how much time separates the respective frames with much more precision than by averaging the clock readings, since the demo clock only reads out to hundredths of a second. An example shot of what this difference looks like is the following:

The G2410H is a pretty decent starting benchmark for our brave new test method, considering that at its core is a relatively normal TN panel. The result is largely in line with what we expected to see, and the processing lag is very small at around 9 ms. We'll get a better feel for where this stands as we add more monitors, but it's a promising start.

Dell G2410H - Processing Lag
Averaging Time Difference: 9.0 ms
FPS Computation Time Difference: 8.59 ms

When it comes to actual pixel lag, the G2410H shows what we usually see on virtually all LCDs: one ghost image before and one after the dominant frame, even at shutter speeds fast enough that we aren't accidentally sampling the next frame. This ghosting still corresponds to roughly 16 ms.
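That number lines up with the refresh interval: assuming the LCD is refreshing at the typical 60 Hz, one frame lasts 1/60 ≈ 16.7 ms, so a single ghost frame on either side of the dominant one works out to roughly the 16 ms we observe.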

At the end of the day, LCD performance still isn't quite at parity with CRTs. But at the same time, I doubt anybody is going to want to borrow my time machine to buy one and replace their LCD. You're getting a heck of a lot more screen real estate in a smaller, lighter package, for less electrical power, and you don't have to look like a lunatic doing it. Sure, there are a handful of hardcore gamers out there who swear by their CRT's faster refresh rate, but could a single one of them really discern individual pulses of a 9 ms flashing strobe?

Comments

  • strikeback03 - Monday, May 10, 2010

    Using the native software packages for both I liked the i1D2 better than the Spyder3. But when my i1D2 dies after 8 months and xRite/Pantone wouldn't do anything for me I wasn't about to buy another.
  • strikeback03 - Monday, May 10, 2010

    should be died, not dies
  • Brian Klug - Monday, May 10, 2010

    Oh man, it died? That sucks. I hope ours doesn't give out. Thus far I agree with you - the i1D2 is producing better looking/more consistent results than the Spyder 3 subjectively, but I haven't been really good about testing, just initial messing around.

    Hmmz, this could be a review of its own... ;)

    -Brian
  • strikeback03 - Monday, May 10, 2010

    I don't care overly much about the actual power consumption, but would like a display that draws less power just to keep the heat down in the summer. My Gateway FPD2485W draws about 100W regardless of brightness setting and is quite toasty.
  • ctsrutland - Monday, May 10, 2010

    If you really want to tell us how green it is, you also need to tell us how much CO2 was generated during its manufacture in comparison to other screens. How much oil was used in its making? Does it have a thinner plastic shell to reduce oil use? You also need to tell us whether the finished article is particularly easy to recycle in some way. Does it have a longer warranty than normal - if I can keep it in use for more years before replacing it, then it would be greener. Does it have components that are easier to fix? It's much greener to fix faulty things than to chuck them out and buy a new one. Has Dell undertaken to make spares available for more years to help with this? Don't suppose so...
  • dragunover - Monday, May 10, 2010

    Mine supports like 150 or so hertz at that resolution...
    NEC MultiSync 90
  • ReaM - Tuesday, May 11, 2010

    THANK YOU SO MUCH FOR TESTING THIS ONE!!!

    It's been sitting in my shopping cart at an online shop for a month now. Very low power consumption!
  • strikeback03 - Wednesday, May 12, 2010

    Yeah, after about 8 months it wanted to calibrate everything to a very green hue and would flat out refuse to even run the calibration on my laptop at settings I had previously used (and had a screenshot of). It was stored in a drawer with the clip-on diffuser cover on when not in use. I know hardware goes bad, the lack of support is what disturbed me.
  • AllenP - Sunday, May 16, 2010

    Concerning the processing and input lag:

    TOTALLY AWESOME that you are adapting this new CRT comparison technique. :) This is by far one of the most important elements of a display to me, so I'm happy to see a solid site like AnandTech giving some solid data.

    But, two comments that struck my eye about this test:

    1) It seems a little strange that you're using an HDMI to DVI adapter instead of just going straight from a DVI port off the graphics card, but it shouldn't make a difference anyway.

    2) WOW. 9ms total latency is VERY low for an LCD. I usually find around 30+ms when looking around (which is totally unacceptable to me). This is a nice monitor -- I'm looking forward to seeing some more data to verify that your test methods are solid. Man, that's low latency. :)

    The human eye can see approx. 85 Hz out of the rods (greyscale) and 60 Hz out of the cones (colour). So this means that the center of your vision sees at around 60 Hz and the edges at around 85 Hz (due to there being a lot more cones in the center).

    I assume that this would mean we can therefore discern between 16.66 ms and 11.76 ms of lag. (Please correct me if my assumption is wrong... I'm sure I'm a bit off on that.) 9 ms is nicely below that threshold, which is quite impressive for an LCD.
