Display Lag and Response Time

For gamers, display lag is a very real concern, and display processing is a specification that is nebulously reported, if at all, for just about all LCD displays. We’ve been over this before: ultimately, what matters isn’t GTG or full on/full off pixel response times, or what’s reported on the spec sheet, but the holistic latency of the monitor compared to something we can all agree is lag-free. We previously used a baseline CRT as our benchmark for zero display lag - a 17” Princeton CRT that left some of you a bit underwhelmed.

I spent some time visiting (I kid you not) almost every thrift store in town, and found myself a veritable cornucopia of uh... lovingly cared for CRTs to choose from. I settled on a much more modern looking Sony G520 20” CRT supporting a higher resolution and refresh rate. It’s still not what I’m ultimately looking for, but it’s better. Oh, and it cost a whopping $9. ;)

I had to take another trip back in time to get this CRT... Well, almost.
 
To do these tests, we connect the CRT to our test computer’s ATI Radeon HD 5870 through a DVI to VGA adapter, and the LCD panel under test to DVI using an HDMI to DVI cable. I debated for some time the merits of feeding both displays the same VGA signal, but what really matters here is how the two displays compare in the configuration you, readers, are most likely to use. In addition, using the VGA input on any LCD is bound to add lag, since converting from analog to digital signaling requires a hardware scaler pass that the entirely digital DVI datapath avoids. The optimal resolution common to the LCD and CRT was 1280x800.
 
We run the same 3DMark03 Wings of Fury benchmark on a constant loop and take a series of photos with a fast camera - in this case, a Nikon D80 with a 17-50mm F/2.8 lens - with the aperture wide open for fast shutter speeds, up to 1/800 of a second. Any difference on the demo clock is our processing lag, and we still get a good feel for how much pixel response lag there is on the LCD.
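As a quick sanity check on those camera settings, a back-of-the-envelope sketch (the 60 Hz figure matches the refresh rate used in these tests; the rest is simple arithmetic) of why a 1/800 s shutter effectively freezes a single frame:

```python
# At 1/800 s, the exposure covers only a small fraction of one
# 60 Hz frame interval, so the demo clock reads cleanly in each shot.
shutter_s = 1 / 800              # fastest shutter speed used
frame_ms = 1000 / 60             # one frame interval at 60 Hz refresh
exposure_ms = shutter_s * 1000   # exposure time in milliseconds
fraction = exposure_ms / frame_ms
print(f"{exposure_ms:.2f} ms exposure = {fraction:.1%} of one frame")
```

With these numbers, each photo captures only about 7.5% of a frame interval, so the on-screen clock is sharp rather than blended across successive frames.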

As I mentioned earlier, the only downside is that this means our old data is no longer a valid reference.

To compute the processing lag, I do two things. First, I watch for differences in the clock between the CRT and LCD, noting these whenever they are visible. I did this for 10 captures of the same sequence. Second, one can compute the processing difference by taking into account the FPS and the frame number difference.
 
 
Of course, not every one of those frames is written to the display, but we can still glean the time difference between the respective frames with much more precision than from the clock readings alone, which only resolve down to 1/100th of a second.
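The frame-number method boils down to simple arithmetic: divide the frame-count difference by the benchmark’s frame rate. A minimal sketch (the frame numbers and fps below are made-up illustration values, not measurements from this review):

```python
def processing_lag_ms(frame_crt: int, frame_lcd: int, fps: float) -> float:
    """Estimate LCD processing lag from the frame counters photographed
    on each display and the benchmark's frame rate at capture time."""
    return (frame_crt - frame_lcd) / fps * 1000.0

# e.g. the CRT shows frame 1210 while the LCD shows frame 1208 at 240 fps:
# two frames behind at ~4.17 ms per frame is roughly 8.3 ms of lag.
print(round(processing_lag_ms(1210, 1208, 240.0), 1))
```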

Traditionally, IPS panels are a bit slower (for whatever reason) than cheaper TN panels when it comes to refresh rate and processing lag. In this case, the ZR30w is a bit slower, but only by a few milliseconds - not the tens of milliseconds of perceptible lag that we’ve seen in the past. This is intriguing; it’s entirely possible that HP’s omission of an OSD IC does make a difference.

We’re still refining what we think is the best way to measure processing lag, and even using a CRT isn’t foolproof. In this case, I set the LCD and CRT refresh rates to 60 Hz so that both, in theory, grab the same frame from the GPU’s framebuffer. In practice, it’s likely that they simply aren’t, which would explain the difference. As we test more LCDs, we’ll be able to tell, but the processing lag we’ve measured from all three monitors thus far is totally acceptable.

I played a number of FPS titles and RTS games on the display, and never noticed any display processing lag or ghosting to speak of. If you’re going to use a 30” panel for gaming, the ZR30w seems to be just as good as any.
 
One trailing frame visible
 
LCD response and latency performance still isn’t technically at parity with CRTs, but you’d be hard pressed to tell the difference.

In the ghosting images I snapped, I usually saw only two frames: the dominant frame and the preceding one. This is very surprising, since we’re used to seeing three, yet throughout all the images only two frames are ever visible. This is very impressive panel response.
 

  • mcklevin - Monday, June 28, 2010 - link

I have now had this monitor for a week, and it has performed quite well; it is a very solid, professional-looking build. The anti-glare screen does not sparkle like the LG screen, and the black levels are better as well. Text looks much better than on the LG too.

Right now I do not have a calibrator. The color accuracy is pretty good, though it does get hot in the high end like many of the wide-gamut monitors. Dropping the digital vibrancy to 44 in the NVIDIA control panel has helped a lot with the saturation. The viewing angles are nice: horizontal I would give a 9, vertical a 7. I didn't notice input lag in Mass Effect 2. I use this monitor for Cinema 4D and After Effects.
  • AlphaJarmel - Wednesday, June 23, 2010 - link

    So this monitor is pretty much useless for gaming as Windows will ignore the calibration.
  • SoCalRich - Thursday, July 1, 2010 - link

    Brian,

    Thanks for your review!!!

I have a new 17" MacBook Pro i7. I was curious whether you were able to hook up your MBP to this monitor using the DisplayPort cable?

    I think this would be a great monitor for my Photoshop & Lightroom editing. I've been looking at the 30" ACD. This looks like a better monitor.

I'm still a little confused about how you make any adjustments without an OSD?
  • SoCalRich - Thursday, July 1, 2010 - link

    My new 17" MacBook Pro i7 uses the Intel HD Graphics for regular web surfing etc. It then switches to the NVIDIA GeForce GT 330M when I open Photoshop and Lightroom.

I'm hoping these cards will support this monitor using the DisplayPort cable.
  • tsittard - Saturday, July 3, 2010 - link

    I picked this monitor up for a 3rd editing monitor to be used with Final Cut Pro, AJA Kona 3 card, and a blackmagic conversion box that goes from HD-SDI to dual link dvi-d.

    Here's the link:
    http://www.blackmagic-design.com/products/hdlink/t...

    I'm not getting any picture....
I'm currently using the zr30w as a second computer monitor, which is working fine at 2560x1600 coming off my Mac Pro, but this is not what I intended...

    Is it not possible to send this monitor a 1920x1080 signal via dvi-d and have it upscale to fill it?

    I'm also using 2 hp zr24w's for my computer monitors, and I've hooked up the blackmagic box to one of those and everything works great...

    My settings inside of FCP are for a 1080 23.98psf 10bit signal, again it seems to work fine going into the ZR24w

    I sense that there should be a simple answer here but I'm not finding it....

    the blackmagic converter box also comes in a display port version, will that allow me to send the HD resolution to the zr30w?

    Mr. Klug, any suggestions?
  • B3an - Tuesday, August 10, 2010 - link

Brian - you mentioned you used the i1 Display 2 to calibrate this monitor. I have both this monitor and the i1D2; can you tell me what settings you used?

    I find if a default white point is not selected then the colours and especially greys have a red tint to 'em. Did you try it with the Eye-One Match 3 software that comes with the i1D2? Thanks.
  • humba - Sunday, August 15, 2010 - link

    Has anyone who actually got the monitor noticed a buzzing/humming noise coming from the screen as soon as there's a video input signal?
    The noise is similar to the feedback noise you get on an audio system if there's an issue with the grounding, and it only goes away if the brightness is increased to 80% or higher.

I've already RMA'ed my device once (they sent out a replacement only hours after receiving the case, which makes me think it's a known defect), but the replacement exhibits the exact same behavior. I've gone through a bunch of different DP cables, tried various DVI cables too, on 4 different computers (HP EliteBook 8540p, Acer Timeline 3810TZ, Mac Mini 2009, and a self-assembled box), and went around the house trying different electrical phases, but to no avail.

And I seem not to be the only one, since I found this test that mentions the same issue: http://www.productwiki.com/hp-zr30w/. And, strangely enough, HP's site also has a support document (albeit a very old one): http://h20000.www2.hp.com/bizsupport/TechSupport/D... which doesn't really relate to the display in question other than the 80% brightness mentioned, which seems to have been the fix since 2003.
  • martinz - Tuesday, November 1, 2011 - link

    Yes. Mine did it, got it replaced, same issue.

    HP in the US confirmed they all do it below 80% brightness, but tried to claim all/most large panels do it.

    I think it is completely unacceptable for a monitor at this price - and I am shocked that HP don't agree. Otherwise, I like it a lot: great picture and build quality.

    In a busy office it would not be noticeable, and if you can live with the (for me) eyeball melting full brightness, or are not sensitive to ambient noise, it's not an issue either.

    For me it's a show stopper. Mine is going back, but not sure what to replace it with.
  • BikeDude - Friday, August 27, 2010 - link

    Do all displays support HDCP these days, or will I still need AnyDVDHD in the future if I have to replace my current 30" Apple Cinema display?
  • fontajos - Monday, September 13, 2010 - link

    Hi Brian,

thanks for your extensive test of the HP ZR30w. I just bought the ZR30w and was very disappointed by the strong blue cast at 9000 Kelvin. I installed the HP ICM profile but nothing changed. I want a 6500K screen (like all the other monitors here). Could you send me your ICM file, or can you suggest a way to get my white to real white rather than blueish white? If the only solution is hardware calibration, can you suggest a product?
