Display Lag and Response Time

For gamers, display lag is a very real concern, and display processing lag is a nebulously reported specification (if it’s reported at all) for just about all LCD displays. We’ve been over this before, but ultimately what matters isn’t GTG or full on/full off pixel response times, or what’s reported on the spec sheet, but the holistic latency of the monitor compared to something we can all agree is lag-free. To that end, we use a baseline CRT as our benchmark of zero display lag. Previously that benchmark was a 17” Princeton CRT - and some of you were a bit underwhelmed by that monitor.

I spent some time visiting (I kid you not) almost every thrift store in town, and found myself a veritable cornucopia of uh... lovingly cared for CRTs to choose from. I settled on a much more modern looking Sony G520 20” CRT supporting a higher resolution and refresh rate. It’s still not what I’m ultimately looking for, but it’s better. Oh, and it cost a whopping $9. ;)

I had to take another trip back in time to get this CRT... Well, almost.
 
To do these tests, we connect the CRT to a DVI to VGA adapter on our test computer’s ATI Radeon HD5870, and the LCD panel under test to DVI using an HDMI to DVI cable. I debated for some time the merits of feeding both displays the same VGA signal, but what really matters here is how the two displays compare when connected the way you, the readers, are most likely to connect them. In addition, using the VGA input on any LCD is bound to add lag, since going from analog to digital signaling requires a hardware scaler pass, whereas the DVI datapath is entirely digital. The best resolution common to both the LCD and the CRT was 1280x800.
 
We use the same 3DMark03 Wings of Fury benchmark on constant loop and take a number of photos with a fast camera (in this case, a Nikon D80 with a 17-50mm F/2.8 lens) with the aperture wide open for fast shutter speeds, in this case up to 1/800 of a second. Any difference on the demo clock between the two displays is our processing lag, and we’ll still get a good feel for how much pixel response lag there is on the LCD.

As I mentioned earlier, the only downside is that this means our old data is no longer a valid reference.

To compute the processing lag, I do two things. First, I watch for differences in the demo clock between the CRT and LCD, noting them whenever they are visible; I did this for 10 captures of the same sequence. Second, the processing difference can be computed from the benchmark’s FPS and the frame-number difference between the two displays.
 
 
Of course, not every one of those frames is actually written to the display, but we can still glean the time difference between the respective frame numbers with much more precision than by averaging the demo clock readings, which only resolve down to hundredths of a second.
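To make the arithmetic concrete, here is a minimal sketch of the frame-difference calculation. The frame numbers and FPS figure below are hypothetical examples for illustration only, not measurements from this review.

```python
# Minimal sketch of the frame-difference method described above.
# The captures and FPS value are hypothetical, not data from this review.

def processing_lag_ms(crt_frame: int, lcd_frame: int, fps: float) -> float:
    """Convert the frame-number gap between CRT and LCD into milliseconds."""
    return (crt_frame - lcd_frame) / fps * 1000.0

# Each tuple is (frame number on the CRT, frame number on the LCD),
# read off a single photo that captures both displays at once.
captures = [(4512, 4510), (5890, 5889), (7234, 7232)]
benchmark_fps = 300.0  # average FPS reported by the benchmark

lags = [processing_lag_ms(crt, lcd, benchmark_fps) for crt, lcd in captures]
print(f"average processing lag: {sum(lags) / len(lags):.2f} ms")
```

At 300 FPS each frame lasts about 3.3 ms, so a one- or two-frame gap works out to only a few milliseconds of lag, which is finer resolution than the demo clock itself can show.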

Traditionally, IPS panels are a bit slower (for whatever reason) than cheaper TN panels when it comes to pixel response and processing lag. In this case, the ZR30w is a bit slower, but only by a few milliseconds, not the tens of milliseconds of perceptible lag that we’ve seen in the past. This is intriguing; it’s entirely possible that HP’s omission of an OSD IC does make a difference.

We’re still evolving what we think is the best way to measure processing lag, and even using a CRT isn’t foolproof. In this case, I set the LCD and CRT refresh rates to 60 Hz so that, in theory, both grab the same frame from the GPU’s framebuffer. In practice, it’s likely that they simply aren’t, which would explain the difference. As we test more LCDs we’ll be able to tell, but the processing lag we’ve measured from all three monitors thus far is totally acceptable.

I played a number of FPS titles and RTS games on the display, and never noticed any display processing lag or ghosting to speak of. If you’re going to use a 30” panel for gaming, the ZR30w seems to be just as good as any.
 
One trailing frame visible
 
LCD response and latency performance still isn’t technically at parity with CRTs, but you’d be hard pressed to tell the difference.

In the ghosting images I snapped, I usually saw only two frames: the dominant frame and the preceding frame. This is very surprising, since we’re used to seeing three, yet throughout all of the captures only two frames are ever visible. This is very impressive panel response.
 
Comments

  • boe - Wednesday, June 2, 2010 - link

    I've been waiting to buy some monitors for years, as the 3008WFP had its share of issues and Apple hasn't released a new LED-backlit 30" yet.

    I'm looking forward to getting a couple of new monitors, but not until some higher-end models come out with a clear improvement over my 3007WFPs.
  • xismo - Wednesday, June 2, 2010 - link

    I find it a bit disappointing that you don't list the configuration this was tested on. For example, I don't know which GPU the monitor was running on, and therefore I wouldn't know whether the GPU had 10-bit support. It would be nice if you could add a test of 10-bit support as well - how it performs with smooth gradients and so on. As you probably know, not too many graphics cards support 10-bit, but all of the workstation-class cards do, which I think is appropriate for testing with high-end monitors.
  • xismo - Wednesday, June 2, 2010 - link

    After reading the comments I see there was a discussion about 10-bit support; sorry for not seeing that earlier. At least I'm not the only one concerned about this :) BTW, the 5870 does not have 10-bit support, like almost all other gaming video cards. And using a Mini DisplayPort on a MacBook Pro will not make the GeForce 330M have 10-bit support either. DisplayPort and dual-link DVI are the only connection types that can carry 10-bit color, but you still need a matching video card. Any workstation card like a Quadro or FirePro should be just fine. But yeah, including how each monitor displays gradients would be a huge advantage for me, as this is one of the things I'm looking for in a new monitor.
  • Brian Klug - Wednesday, June 2, 2010 - link

    Hey there! Thanks for the comment - yeah, I'm working on getting us either a workstation GPU or some other way (a whole workstation) to really test the 10-bit aspect. It'll happen this week or next, and then I'll update. I realized after posting that I forgot to make sure it was working over 10-bit. And you're right about the 5870 - it doesn't have 10-bit support. Guess that's one of those arguments for a more expensive workstation version of the card!

    Cheers!
    Brian Klug
  • awaken688 - Wednesday, June 2, 2010 - link

    I'm surprised no one has really complained about the brightness limitation. For professional photographers and graphic artists (the target audience), it would be rough to have no way to get below 150 nits as a real means of checking for print accuracy. For some of our print shops, 100 nits is accurate to print.

    Brian,

    Did you try adjusting the brightness via the video card as well? I have a monitor that on 0 brightness is still too bright, but I can then go into nVidia's control panel and lower the brightness using that to achieve the correct brightness or lower the RGB manually (which I understand isn't an option for this monitor). Just wondering. Nonetheless, this should at least be able to hit 120 nits for imaging professionals. Good article though. I like the monitor reviews for sure.
  • Soldier1969 - Wednesday, June 2, 2010 - link

    Thanks for the review - not a bad price for a new 30-incher compared to other brands. Those people here who want 24" reviews: those are a dime a dozen and for the poor. I had a 24" 1920 x 1200 monitor long before most did, since 2007 when they cost a fortune. So glad I jumped to 2560 x 1600 - gaming on them owns everything else out there! Blu-ray looks fantastic! If you've never experienced computing on a 30-incher, you're missing out!
  • kasakka - Thursday, June 3, 2010 - link

    The lack of an OSD is what kills this for me. I find that without it, getting accurate colors is more difficult, and if you also want to use the display for gaming, some games may totally ignore calibrated color profiles or software adjustments. The lack of a scaler is a bit annoying too, but since graphics cards can do that, at least in Windows, it's not really a problem.

    Regarding inputs, I currently have a Dell 3008WFP and its gazillion inputs are a huge minus for me. I only need one DVI and one DisplayPort, so having to cycle through all the useless inputs is annoying. More annoying is the 3008WFP's (and the 27" Dell U2711's) circa 5-second delay when switching resolutions or inputs. But I guess I'll stick with the Dell until someone comes out with something better.
  • pmeinl - Thursday, June 3, 2010 - link

    Does the AG coating of the ZR30 have the annoying sparkle effect of other current IPS panels (e.g. the U2410)?

    Working on my two U2410s (programming, text processing) causes me eye strain.
  • pmeinl - Friday, June 4, 2010 - link

    As some people do not see the sparkle problem, here is a thread with a picture of it:
    http://hardforum.com/showthread.php?t=1466914&...
  • B3an - Thursday, June 3, 2010 - link

    So just to be clear... does this display do more colors than the Dell 3008?
