Display Lag and Response Time

For gamers, display lag is a very real concern, and display processing is a nebulously reported specification (if it’s reported at all) for just about every LCD. We’ve been over this before, but ultimately what matters isn’t GTG or full-on/full-off pixel response times, or what’s printed on the spec sheet, but the holistic latency of the monitor compared to something we can all agree is lag-free. That lag-free benchmark is a baseline CRT running alongside the LCD under test. Previously we were using a 17” Princeton CRT - some of you were a bit underwhelmed by that monitor.

I spent some time visiting (I kid you not) almost every thrift store in town, and found myself a veritable cornucopia of uh... lovingly cared for CRTs to choose from. I settled on a much more modern looking Sony G520 20” CRT supporting a higher resolution and refresh rate. It’s still not what I’m ultimately looking for, but it’s better. Oh, and it cost a whopping $9. ;)

I had to take another trip back in time to get this CRT... Well, almost.
 
To do these tests, we connect the CRT to a DVI to VGA adapter on our test computer’s ATI Radeon HD5870, and the LCD panel under test to DVI using an HDMI to DVI cable. I debated for some time the merits of feeding both displays the same VGA signal; however, what really matters here is how the two displays perform the way you, our readers, are most likely to set things up. In addition, using the VGA input on any LCD is bound to add lag, since going from analog to digital signaling requires a hardware scaler operation, compared to the entirely digital DVI datapath. The optimal resolution common to both the LCD and CRT was 1280x800.
 
We use the same 3DMark03 Wings of Fury benchmark on a constant loop and take a bunch of photos with a fast camera (a Nikon D80 with a 17-50mm F/2.8 lens) with the aperture wide open for fast shutter speeds - up to 1/800 of a second here. Any difference on the demo clock is our processing lag, and we still get a good feel for how much pixel response lag there is on the LCD.

As I mentioned earlier, the only downside is that this means our old data is no longer a valid reference.

To compute the processing lag, I do two things. First, I watch for differences in the clock between the CRT and LCD, noting them whenever they are visible; I did this for 10 captures of the same sequence. Second, the processing difference can be computed from the benchmark’s FPS and the frame number difference between the two photos.
 
 
Of course, not every one of those frames is written to the display, but we can still glean the time difference between the respective frames with much more precision than by averaging the clock readings, which only resolve down to 1/100ths of a second.
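To make that arithmetic concrete, here’s a minimal sketch of the frame-counter math; the frame numbers and frame rate below are made-up examples, not measured values:

    # Convert the frame-number difference between the CRT and LCD photos
    # into a processing lag in milliseconds. All numbers are hypothetical.
    def processing_lag_ms(crt_frame, lcd_frame, fps):
        # Each rendered frame lasts 1/fps seconds, so a difference of N frames
        # corresponds to N/fps seconds of lag.
        return (crt_frame - lcd_frame) / fps * 1000.0

    # e.g. the CRT photo shows frame 1481 while the LCD photo shows frame 1480,
    # with the benchmark running at 220 FPS:
    print(processing_lag_ms(1481, 1480, 220.0))  # ~4.5 ms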

Traditionally, IPS panels are a bit slower (for whatever reason) than cheaper TN panels when it comes to refresh rate and processing lag. In this case, the ZR30w is a bit slower, but only by a few milliseconds, not the tens of milliseconds or perceptible lag that we’ve seen in the past. This is intriguing; it’s entirely possible that HP’s omission of an OSD IC does make a difference.

We’re still evolving what we think is the best way to measure processing lag, and even using a CRT isn’t foolproof. In this case, I set the LCD and CRT refresh rates to 60 Hz so that, in theory, both grab the same frame from the GPU’s framebuffer. In practice, it’s likely that they simply aren’t, which would explain the difference. As we test more LCDs, we’ll be able to tell, but the processing lag we’ve measured from all three monitors thus far is totally acceptable.
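For perspective, one refresh at 60 Hz lasts roughly 16.7 ms, so a few milliseconds of measured lag is well under a single frame:

    # Length of one refresh period at 60 Hz, in milliseconds.
    refresh_hz = 60.0
    print(1000.0 / refresh_hz)  # ~16.7 ms; a few ms of lag is a fraction of one refresh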

I played a number of FPS titles and RTS games on the display, and never noticed any display processing lag or ghosting to speak of. If you’re going to use a 30” panel for gaming, the ZR30w seems to be just as good as any.
 
One trailing frame visible
 
LCD response and latency performance still isn’t technically at parity with CRTs, but you’d be hard pressed to tell the difference.

In the ghosting images I snapped, I usually saw only two frames: the dominant frame and the preceding frame. That’s surprising, since we’re used to seeing three, yet throughout all the images I captured, only two frames were ever visible. This is very impressive panel response.
 
Comments

  • Mumrik - Wednesday, June 2, 2010 - link

    They basically never have. It's really a shame though - to me, the ability to put the monitor into portrait mode with little to no hassle is one of the major advantages of LCD monitors.
  • softdrinkviking - Tuesday, June 1, 2010 - link

    brian, i think it's important to remember that

    1. it is unlikely that people can perceive the difference between 24 bit and 30 bit color.

    2. to display 30 bit color, or 10 bit color depth, you also need an application that is 10 bit aware, like maya or AutoCad, in which case the user would most likely opt for a workstation card anyway.

    i am unsure, but i don't think windows 7 or any other normally used program is written to take advantage of 10 bit color depth.

    from what i understand, 10 bit color and "banding" only really has an impact when you edit and reedit image files over and over, in which case, you are probably using medical equipment or blowing up photography to poster sizes in a professional manner.

    here is a neat little AMD pdf on their 30 bit implementation
    http://ati.amd.com/products/pdf/10-Bit.pdf
  • zsero - Wednesday, June 2, 2010 - link

    I think the only case where you need a 10-bit _source_ is when viewing results from medical imaging devices. Doctors say that they can see a difference between 256 and 1024 gray values.
  • MacGyver85 - Thursday, June 3, 2010 - link

    Actually, Windows 7 does support 10 bit encoding; it even supports more than that: 16 bit encoding!
    http://en.wikipedia.org/wiki/ScRGB_color_space
  • softdrinkviking - Saturday, June 5, 2010 - link

    all that means is that certain components of windows 7 "support" 16 bit color.

    it does not mean that 16 bit color is displayed at all times.

    scRGB is a color profile management specification that allows for a wider amount of color information than sRGB, but it does not automatically enable 16 bit color, or even 10 bit deep color.

    you still need to be running a program that is 10 bit aware, or using a program that is running in a 10 bit aware windows component. (like D3D).

    things like aero (which uses directx) could potentially take advantage of an scRGB color profile with 10 bit deep encoding, but why would it?
    it would suck performance for no perceivable benefit.

    the only programs that really use 10 bit color are professional imaging programs for medical and design uses.
    it is unlikely that will change, because it is more expensive to optimize software for 10 bit color, and the benefit is only perceivable in a handful of situations.
  • CharonPDX - Wednesday, June 2, 2010 - link

    I have an idea for how to improve your latency measurement.

    Get a Matrox DualHead2Go Digital Edition. This outputs DVI-I over both outputs, so each can do either analog or digital. Test it with two identical displays over both DVI and VGA to make sure that the DualHead2Go doesn't itself introduce any lag. Then compare two identical displays, one over DVI and one over VGA, to see if either the display or the DualHead2Go introduces lag on one interface versus the other. (I'd recommend trying multiple pairs of identical displays to verify.)

    This would rip out any video card mirroring lag (most GPUs do treat the outputs separately, and those outputs may introduce lag) and leave you solely at the mercy of any lag inherent to the DH2Go's DAC.

    Next, get a high quality CRT, preferably one with BNC inputs. Set the output to 85 Hz for max physical framerate. (If you go with direct-drive instead of DualHead2Go, set the resolution to something really low, like 1024x768, and set the refresh rate as high as the display will go. The higher, the better. I have a nice-quality old 22" CRT that can go up to 200 Hz at 640x480 and 150 Hz at 1024x768.)

    Then, you want to get a good test. Your 3dMark is pretty good, especially with its frame counter. But there is an excellent web-based one at http://www.lagom.nl/lcd-test/response_time.php (part of a wonderful series of LCD tests). This one goes down to thousandths of a second. (Obviously, you need a pretty high refresh rate to actually display all of those digits, but if you can reach it, it's great.)

    Finally, take your pictures with a high-sensitivity camera at 1/1000 sec exposure. This will "freeze" even the fastest frame rate.
  • zsero - Wednesday, June 2, 2010 - link

    In operating systems, 32-bit is the funniest of all: it is the same as 24-bit, except I think they count the alpha channel too, so RGB would be 24-bit and RGBA would be 32-bit. But as far as I know, on an operating system level it doesn't mean anything useful; it just looks good in the control panel. True Color would be a better name for it. In any operating system, if you take a screenshot, the result will be 24-bit RGB color data.

    From wikipedia:
    32-bit color

    "32-bit color" is generally a misnomer in regard to display color depth. While actual 32-bit color at ten to eleven bits per channel produces over 4.2 billion distinct colors, the term “32-bit color” is most often a misuse referring to 24-bit color images with an additional eight bits of non-color data (I.E.: alpha, Z or bump data), or sometimes even to plain 24-bit data.
  • velis - Wednesday, June 2, 2010 - link

    However:
    1. Reduce size to somewhere between 22 and 24"
    2. Add RGB LED instead of CCFL (not edge lit either)
    3. Add 120Hz for 3D (using multiple ports when necessary)
    4. Ditch the 30 bits - only good for a few apps

    THEN I'm all over this monitor.

    As it is, it's just another 30 incher, great and high quality, but still just another 30 incher...

    I SO want to replace my old Samsung 215TW, but there's nothing out there to replace it with :(
  • zsero - Wednesday, June 2, 2010 - link

    Gamut will not change between 24-bit and 30-bit color, as it is a physical property of the panel in question (plus its backlight).

    So the picture will look visually the same; nothing will change, except that if you are looking at a very fine gradient, it will not have any point where you notice a sharp line.

    Think about it in Photoshop. You make an 8-bit grayscale image (256 possible gray values for each pixel) and apply a black-to-white gradient across its whole width. Now look at the histogram: you see a continuous distribution of values from 0 to 255.

    Now make some huge color correction, like a big change in gamma. Now the histogram is not a continuous curve but something full of spikes, because due to rounding errors a correction from 256 possible values to 256 possible values skips certain values.

    Now apply a levels correction, and make the darkest black into, for example, 50 and the brightest white into, for example, 200. What happens now is that you are compressing the whole dynamic range into a much smaller interval, but as your scale is fixed, you are now using only about 150 values for the original range. That's exactly what happens, for example, when you use calibration software to calibrate a wide-gamut (close to 100% AdobeRGB) monitor for sRGB use, because you need to use it with non-color-aware programs (a very, very common situation).

    For an actual real-world test, I would simply suggest using the calibration software to calibrate your monitor to sRGB and having a look at fine gradients. For example, check it with the fullscreen tools from http://lcdresource.com/tools.php
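zsero's Photoshop experiment can be approximated in a few lines of NumPy; this is a rough sketch (the gamma value is arbitrary), not a substitute for the calibration test described above:

    import numpy as np

    # An 8-bit grayscale gradient: every level 0..255 appears exactly once.
    gradient = np.arange(256, dtype=np.uint8)

    # Apply a gamma correction at 8-bit precision (rounding back to integers).
    gamma = 1.8
    corrected = np.round(255.0 * (gradient / 255.0) ** (1.0 / gamma)).astype(np.uint8)

    # Count how many of the 256 levels survive; the missing levels are the
    # histogram gaps/spikes that show up as banding in a fine gradient.
    print(len(np.unique(gradient)))   # 256
    print(len(np.unique(corrected)))  # fewer than 256 after rounding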
  • Rick83 - Wednesday, June 2, 2010 - link

    I feel that Eizo has been ahead in this area for a while, and it appears that it will stay there.
    The screen manager software reduces the need for on-screen buttons, but still gives you direct access to gamma, color temperature, color values and even power on/off timer functions, as well as integrated profiles - automatically raising brightness when opening a photo or video app, for example.
    Taking all controls away is a bit naive :-/
