Brightness and Contrast

For brightness, black level, and contrast measurements, we use the same colorimeter setup described earlier. Specifically, we use an X-Rite i1D2 with ColorEyes Display Pro, and measure white and black targets at the display's maximum and minimum brightness settings. Dynamic contrast is turned off. We also let the panels settle for half an hour at the respective settings before taking any measurements.

The ZR30w uses a CCFL backlight, which makes that warm-up time even more critical. In practice, the display reached the target brightness relatively quickly, settling to within a few nits of its final value after about 15 minutes.

Black level is an important metric, since it directly reflects the extinction ratio of the crossed polarizers in each pixel. Remember, when a pixel is supposed to be black, the liquid crystal in the cell leaves the linearly polarized light oriented 90 degrees from the front polarizer's axis, so it is blocked. Higher extinction ratios (and thus better crystals and materials) result in lower (better) black levels.

We recommend running monitors at around 200 nits of luminance, partly because this is often where color tracking is at its best, and partly because it's a good balance that keeps your pupils from constantly readjusting as you look around the room, which causes eye strain. I mentioned earlier that the controls on the ZR30w give you between 150 nits and just over 400. It's obvious that HP wants you to run this thing on the brighter side, and I tend to agree.

The charts show the dynamic range in brightness, and the respective black levels at each brightness setting. White level exceeds the rated 370 nits, topping out at 403 nits.

What we're really interested in, however, is the contrast ratio. At the extremes, we fall short of 1000:1, measuring 733:1 and 738:1 at the two ends of the display's brightness range. That's still good performance, but I had expected a bit more. Keep in mind when comparing black levels on that graph that the other panels may indeed go darker, but at substantially lower white levels.
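Contrast ratio is simply the measured white luminance divided by the measured black luminance. A minimal sketch of the arithmetic (the black level below is inferred from the 403 nit white and the ~733:1 result, not a separate measurement):

    def contrast_ratio(white_nits, black_nits):
        # Contrast ratio is white luminance divided by black luminance.
        return white_nits / black_nits

    white = 403.0          # measured maximum white level, in nits
    black = white / 733.0  # ~0.55 nits, implied by the ~733:1 result
    print(f"{contrast_ratio(white, black):.0f}:1")  # -> 733:1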
 

95 Comments

  • Mumrik - Wednesday, June 2, 2010 - link

    They basically never have. It's really a shame though - to me, the ability to put the monitor into portrait mode with little to no hassle is one of the major advantages of LCD monitors.
  • softdrinkviking - Tuesday, June 1, 2010 - link

    brian, i think it's important to remember that

    1. it is unlikely that people can perceive the difference between 24 bit and 30 bit color.

    2. to display 30 bit color, or 10 bit color depth, you also need an application that is 10 bit aware, like maya or AutoCad, in which case the user would most likely opt for a workstation card anyway.

    i am unsure, but i don't think windows 7 or any other normally used program is written to take advantage of 10 bit color depth.

    from what i understand, 10 bit color and "banding" only really has an impact when you edit and reedit image files over and over, in which case, you are probably using medical equipment or blowing up photography to poster sizes in a professional manner.

    here is a neat little AMD pdf on their 30 bit implementation
    http://ati.amd.com/products/pdf/10-Bit.pdf
  • zsero - Wednesday, June 2, 2010 - link

    I think the only point where you need to watch a 10-bit _source_ is when looking at results from medical imaging devices. Doctors say that they can see a difference between 256 and 1024 gray values.
  • MacGyver85 - Thursday, June 3, 2010 - link

    Actually Windows 7 does support 10 bit encoding; it even supports more than that: 16 bit encoding!
    http://en.wikipedia.org/wiki/ScRGB_color_space
  • softdrinkviking - Saturday, June 5, 2010 - link

    all that means is that certain components of windows 7 "support" 16 bit color.

    it does not mean that 16 bit color is displayed at all times.

    scRGB is a color profile management specification that allows for a wider range of color information than sRGB, but it does not automatically enable 16 bit color, or even 10 bit deep color.

    you still need to be running a program that is 10 bit aware, or using a program that is running in a 10 bit aware windows component. (like D3D).

    things like aero (which uses directx) could potentially take advantage of an scRGB color profile with 10 bit deep encoding, but why would it?
    it would suck performance for no perceivable benefit.

    the only programs that really use 10 bit color are professional imaging programs for medical and design uses.
    it is unlikely that will change, because it is more expensive to optimize software for 10 bit color, and the benefit is only perceivable in a handful of situations.
  • CharonPDX - Wednesday, June 2, 2010 - link

    I have an idea for how to improve your latency measurement.

    Get a Matrox DualHead2Go Digital Edition. This outputs DVI-I over both outputs, so each can do either analog or digital. Test it with two identical displays over both DVI and VGA to make sure that the DualHead2Go doesn't directly introduce any lag. Compare with two identical displays, one over DVI and one over VGA, to see if either the display or the DualHead2Go introduces lag over one interface over the other. (I'd recommend trying multiple pairs of identical displays to verify.)

    This would take any video card mirroring lag out of the equation (most GPUs do treat the outputs separately, and those outputs may produce lag) and leave you solely at the mercy of any lag inherent to the DH2Go's DAC.

    Next, get a high quality CRT, preferably one with BNC inputs. Set the output to 85 Hz for max physical framerate. (If you go with direct-drive instead of DualHead2Go, set the resolution to something really low, like 1024x768, and set the refresh rate as high as the display will go. The higher, the better. I have a nice-quality old 22" CRT that can go up to 200 Hz at 640x480 and 150 Hz at 1024x768.)

    Then, you want to get a good test. Your 3DMark test is pretty good, especially with its frame counter. But there is an excellent web-based one at http://www.lagom.nl/lcd-test/response_time.php (part of a wonderful series of LCD tests). This one goes to the thousandths of a second. (Obviously, you need a pretty high refresh rate to actually display all those, but if you can reach it, it's great.)

    Finally, take your pictures with a high-sensitivity camera at 1/1000 sec exposure. This will "freeze" even the fastest frame rate.
  • zsero - Wednesday, June 2, 2010 - link

    In operating systems, 32-bit is the funniest of all: it is the same as 24-bit, except they count the alpha channel too, so RGB would be 24-bit and RGBA would be 32-bit. But as far as I know, at the operating system level it doesn't mean anything useful, it just looks good in the control panel. True Color would be a better name for it. In any operating system, if you take a screenshot, the result will be 24 bit RGB color data.

    From wikipedia:
    32-bit color

    "32-bit color" is generally a misnomer in regard to display color depth. While actual 32-bit color at ten to eleven bits per channel produces over 4.2 billion distinct colors, the term “32-bit color” is most often a misuse referring to 24-bit color images with an additional eight bits of non-color data (I.E.: alpha, Z or bump data), or sometimes even to plain 24-bit data.
  • velis - Wednesday, June 2, 2010 - link

    However:
    1. Reduce size to somewhere between 22 and 24"
    2. Add RGB LED instead of CCFL (not edge lit either)
    3. Add 120Hz for 3D (using multiple ports when necessary)
    4. Ditch the 30 bits - only good for a few apps

    THEN I'm all over this monitor.

    As it is, it's just another 30 incher, great and high quality, but still just another 30 incher...

    I SO want to replace my old Samsung 215TW, but there's nothing out there to replace it with :(
  • zsero - Wednesday, June 2, 2010 - link

    Gamut will not change between 24-bit and 30-bit color, as it is a physical property of the panel in question (plus the backlight).

    So the picture will look visually the same; nothing will change, except that if you are looking at a very fine gradient, it will no longer have any point where you notice a sharp step.

    Think about it in Photoshop. You make an 8-bit grayscale image (256 possible gray values for each pixel) and apply a black-to-white gradient across its whole width. Now look at the histogram: you see a continuous distribution of values from 0 to 255.

    Now make some huge color correction, like a big change in gamma. Now the histogram is not a continuous curve but something full of spikes, because due to rounding errors a correction that maps 256 possible values onto 256 possible values skips certain values.

    Now apply a levels correction, and make the darkest black into, for example, 50 and the brightest white into, for example, 200. What happens now is that you are compressing the whole dynamic range into a much smaller interval, but as your scale is fixed, you are now using only about 150 values for the original range. That's exactly what happens, for example, when you use calibration software to calibrate a wide-gamut (close to 100% AdobeRGB) monitor for sRGB use, because you need to use it in non-color-aware programs (a very, very common situation).

    For an actual real-world test, I would simply suggest using the calibration software to calibrate your monitor to sRGB and then having a look at fine gradients. For example, check it with the fullscreen tools from http://lcdresource.com/tools.php
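    A minimal sketch of that effect, assuming NumPy (hypothetical example with arbitrary correction values):

        import numpy as np

        # 8-bit gradient: every value 0..255 appears exactly once.
        grad = np.arange(256, dtype=np.uint8)

        # Big gamma correction, rounded back to 8 bits.
        gamma = np.round(255 * (grad / 255.0) ** 2.2).astype(np.uint8)

        # Levels correction: compress the full range into 50..200.
        levels = np.round(50 + (grad / 255.0) * 150).astype(np.uint8)

        print(len(np.unique(grad)))    # 256 distinct values
        print(len(np.unique(gamma)))   # fewer values -> spikes/gaps in the histogram
        print(len(np.unique(levels)))  # ~151 values -> visible banding in a fine gradient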
  • Rick83 - Wednesday, June 2, 2010 - link

    I feel that Eizo has been ahead in this area for a while, and it appears that it will stay there.
    The screen manager software reduces the need for on-screen buttons, but still gives you direct access to gamma, color temperature, color values, and even power on/off timer functions, as well as integrated profiles - automatically raising brightness when opening a photo or video app, for example.
    Taking all controls away is a bit naive :-/
