Pining for an OSD
 
As I mentioned before, there's no OSD to speak of. The only options exposed to users are input selection, brightness, and dynamic contrast. A blue LED at the right flashes three times when you've reached the upper or lower brightness limit, four times when dynamic contrast is turned on, and once when it's turned off. The LED is off while the display is in operation and amber in standby. The rest is up to you.

Spartan Scaling Selections

I also already touched on scaling, which is pretty barebones for the reasons discussed earlier. You can pixel double by running 1280x800, go native at 2560x1600, or pick another resolution and have everything upscaled with some smudging. There's no 1:1 option, or any other options whatsoever. Honestly, I can't think of a reason to drive an LCD at anything other than its native resolution. The dot pitch of the ZR30w is a fabulous 0.250 mm; if you're paying for those pixels, use them, and reach for the operating system's DPI controls if text and UI elements are too small. The ZR30w did properly scale the other aspect ratios I tested, but don't expect too much. It still makes no sense to drive a display with this high a resolution at anything but native.
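
For the curious, here's a minimal sketch of what pixel doubling amounts to, in Python (assuming NumPy; the monitor of course does this in silicon):

    import numpy as np

    def pixel_double(frame):
        # Nearest-neighbor 2x upscale: each source pixel becomes a 2x2 block
        return frame.repeat(2, axis=0).repeat(2, axis=1)

    frame = np.zeros((800, 1280, 3), dtype=np.uint8)   # a 1280x800 source frame
    print(pixel_double(frame).shape)   # (1600, 2560, 3): fills the native grid exactly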

Final Thoughts

The 30” segment of the LCD market demands the best of the best. It's indisputably the ring in which every manufacturer wages battle with its flagship monitor. Solid execution here usually translates to solid performance for the smaller panels in the same lineup. Suffice it to say that HP's 30 inch ZR series successor to the LP3065 doesn't disappoint.
 
[Photo: the banding and blown-out icons visible in my shot are artifacts of the camera and my overexposure, not the display.]
 
We would've liked to see just a bit more color accuracy, but the tradeoff for a dramatically bigger gamut is a good one to make. The ZR30w blew past the advertised 99% of AdobeRGB, coming in at just over 111% of that volume. To be fair, HP likely means 99% coverage overlap, which the ZR30w does meet; but having a gamut volume larger than AdobeRGB itself is the more notable result. In person, the ZR30w is impressive all around. It's bright, contrasty, and has colors that put my daily use monitors, and others I've got lying around, to shame; not an easy thing to do.

At an MSRP of $1,299, the ZR30w is priced aggressively, below both its predecessor and the competition. If you don't mind going without an extra DVI port, the increase in gamut volume is a notable difference. The lack of an OSD, as well as of accessory VGA, HDMI, component, or composite inputs, is likely related to the absence of a compatible control IC. Instead there's likely a simple scaler, so we see minimal processing lag for an IPS panel, but unfortunately also minimal user control. Carrying this design into a second generation is a bold move on HP's part, but it likely keeps cost and input processing lag low.
 
Interestingly enough, this is the way things are going for display controls, with a growing number of displays offering DDC control as the only option for tweaking settings. In practice, what HP offers with the front controls is almost all you need, so long as you can calibrate to your liking in software. We might have gotten under a Delta E of 1.0 with some RGB sliders, but who's to say?
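
As an illustration of what DDC control looks like from software, here's a minimal sketch using the third-party Python package monitorcontrol (my choice of tool, not something HP ships; any DDC/CI utility exposes the same VCP brightness control):

    # pip install monitorcontrol
    from monitorcontrol import get_monitors

    for monitor in get_monitors():
        with monitor:                       # opens the DDC/CI channel
            print(monitor.get_luminance())  # VCP code 0x10, i.e. brightness
            monitor.set_luminance(75)       # 0-100, the same control as the front buttons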
 
 
The ZR30w is a serious contender among 30” displays. It's an attractive package that wows with above average color tracking, an impressive gamut, and a competitive price, though it lacks some of the extra "accessory" inputs consumers are starting to demand. If you're looking for a 30” display with an unbeatable color gamut, the ZR30w is a solid option.
Comments

  • Mumrik - Wednesday, June 2, 2010

    They basically never have. It's really a shame, though; to me, the ability to put the monitor into portrait mode with little to no hassle is one of the major advantages of LCD monitors.
  • softdrinkviking - Tuesday, June 1, 2010

    Brian, I think it's important to remember that

    1. It is unlikely that people can perceive the difference between 24 bit and 30 bit color.

    2. To display 30 bit color, i.e. 10 bit color depth, you also need an application that is 10 bit aware, like Maya or AutoCAD, in which case the user would most likely opt for a workstation card anyway.

    I am unsure, but I don't think Windows 7 or any other commonly used program is written to take advantage of 10 bit color depth.

    From what I understand, 10 bit color and "banding" only really matter when you edit and re-edit image files over and over, in which case you are probably using medical equipment or blowing up photography to poster sizes professionally.
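
    A quick way to see that effect (a rough numpy sketch of repeated edit/save cycles, my own toy example, not any editor's exact math):

        import numpy as np

        # One pixel of every 8-bit gray value: 256 distinct levels to start with
        img = np.arange(256, dtype=np.float64) / 255.0

        # Apply a gamma edit, then undo it, re-quantizing to 8 bits both times,
        # the way repeated edit/save cycles on an 8-bit file would
        edited = np.round((img ** 2.2) * 255) / 255
        undone = np.round((edited ** (1 / 2.2)) * 255) / 255

        final = np.round(undone * 255).astype(np.uint8)
        print(256, "->", len(np.unique(final)))   # roughly 180: lost levels = banding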

    Here is a neat little AMD PDF on their 30 bit implementation:
    http://ati.amd.com/products/pdf/10-Bit.pdf
  • zsero - Wednesday, June 2, 2010

    I think the only case where you need to watch a 10-bit _source_ is when viewing results from medical imaging devices. Doctors say they can see a difference between 256 and 1024 gray values.
  • MacGyver85 - Thursday, June 3, 2010

    Actually, Windows 7 does support 10 bit encoding; it even supports more than that: 16 bit encoding!
    http://en.wikipedia.org/wiki/ScRGB_color_space
  • softdrinkviking - Saturday, June 5, 2010

    All that means is that certain components of Windows 7 "support" 16 bit color.

    It does not mean that 16 bit color is displayed at all times.

    scRGB is a color profile management specification that allows for a wider range of color information than sRGB, but it does not automatically enable 16 bit color, or even 10 bit deep color.

    You still need to be running a program that is 10 bit aware, or using a program that runs in a 10 bit aware Windows component (like D3D).

    Things like Aero (which uses DirectX) could potentially take advantage of an scRGB color profile with 10 bit deep encoding, but why would it? It would suck performance for no perceivable benefit.

    The only programs that really use 10 bit color are professional imaging programs for medical and design use. That is unlikely to change, because it is more expensive to optimize software for 10 bit color and the benefit is only perceivable in a handful of situations.
  • CharonPDX - Wednesday, June 2, 2010

    I have an idea for how to improve your latency measurement.

    Get a Matrox DualHead2Go Digital Edition. It outputs DVI-I on both outputs, so each can do either analog or digital. Test it with two identical displays over both DVI and VGA to make sure the DualHead2Go doesn't itself introduce any lag. Then compare two identical displays, one over DVI and one over VGA, to see whether the display or the DualHead2Go introduces lag on one interface versus the other. (I'd recommend trying multiple pairs of identical displays to verify.)

    This would rip out any video card mirroring lag (most GPUs treat the outputs separately, and those outputs may produce lag) and leave you solely at the mercy of any lag inherent to the DH2Go's DAC.

    Next, get a high quality CRT, preferably one with BNC inputs. Set the output to 85 Hz for max physical framerate. (If you go with direct-drive instead of DualHead2Go, set the resolution to something really low, like 1024x768, and set the refresh rate as high as the display will go. The higher, the better. I have a nice-quality old 22" CRT that can go up to 200 Hz at 640x480 and 150 Hz at 1024x768.)

    Then you want a good test. Your 3DMark approach is pretty good, especially with its frame counter. But there is an excellent web-based one at http://www.lagom.nl/lcd-test/response_time.php (part of a wonderful series of LCD tests). That one counts in thousandths of a second. (Obviously you need a pretty high refresh rate to actually display all those digits, but if you can reach it, it's great.)

    Finally, take your pictures with a high-sensitivity camera at 1/1000 sec exposure. This will "freeze" even the fastest frame rate.
  • zsero - Wednesday, June 2, 2010

    In operating systems, "32-bit" is the funniest of all: it is the same as 24-bit, except that the alpha channel is counted too, so RGB would be 24-bit and RGBA would be 32-bit. But as far as I know, at the operating system level it doesn't mean anything useful; it just looks good in the control panel. True Color would be a better name for it. In any operating system, if you take a screenshot, the result will be 24 bit RGB color data.

    From wikipedia:
    32-bit color

    "32-bit color" is generally a misnomer in regard to display color depth. While actual 32-bit color at ten to eleven bits per channel produces over 4.2 billion distinct colors, the term “32-bit color” is most often a misuse referring to 24-bit color images with an additional eight bits of non-color data (I.E.: alpha, Z or bump data), or sometimes even to plain 24-bit data.
  • velis - Wednesday, June 2, 2010

    However:
    1. Reduce size to somewhere between 22 and 24"
    2. Add an RGB LED backlight instead of CCFL (and not edge lit, either)
    3. Add 120Hz for 3D (using multiple ports when necessary)
    4. Ditch the 30 bits - only good for a few apps

    THEN I'm all over this monitor.

    As it is, it's just another 30 incher, great and high quality, but still just another 30 incher...

    I SO want to replace my old Samsung 215TW, but there's nothing out there to replace it with :(
  • zsero - Wednesday, June 2, 2010

    Gamut will not change between 24-bit and 30-bit color; it is a physical property of the panel in question (plus the backlight).

    So the picture will look the same; nothing changes, except that a very fine gradient will no longer have any point where you notice a sharp line.

    Think about it in Photoshop. You make an 8-bit grayscale image (256 possible gray values for each pixel) and apply a black-to-white gradient across its whole width. Now look at the histogram: you see a continuous distribution of values from 0 to 255.

    Now make some huge color correction, like a big change in gamma. Now the histogram is no longer a continuous curve but something full of spikes, because rounding errors in a correction from 256 possible values to 256 possible values skip certain values.

    Now apply a levels correction, making the darkest black into, say, 50 and the brightest white into, say, 200. You are compressing the whole dynamic range into a much smaller interval, but as your scale is fixed, you are now using only about 150 values for the original range. That's exactly what happens when you use calibration software to calibrate a wide-gamut (close to 100% AdobeRGB) monitor for sRGB use because you need it in non color aware programs (a very, very common situation).
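
    If you want to see this without Photoshop, here is a rough numpy version of the same experiment (a toy sketch, not Photoshop's exact math):

        import numpy as np

        ramp = np.arange(256, dtype=np.float64)   # the 8-bit black-to-white gradient

        # A big gamma change, re-quantized to 8 bits: the histogram grows gaps
        gamma = np.round(((ramp / 255.0) ** (1 / 1.8)) * 255)
        print(len(np.unique(gamma)))    # about 200 of the 256 values survive

        # Levels: map black to 50 and white to 200; only ~150 values remain
        levels = np.round(50 + (ramp / 255.0) * 150)
        print(len(np.unique(levels)))   # 151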

    For an actual real-world test, I would simply suggest using your calibration software to calibrate the monitor to sRGB and then looking at fine gradients, for example with the fullscreen tools from http://lcdresource.com/tools.php
  • Rick83 - Wednesday, June 2, 2010

    I feel that Eizo has been ahead in this area for a while, and it appears it will stay there.
    Its screen manager software reduces the need for on-screen buttons but still gives you direct access to gamma, color temperature, color values, and even power on/off timer functions, as well as integrated profiles: automatically raising brightness when you open a photo or video app, for example.
    Taking all the controls away is a bit naive :-/
