Right out of the box, HP’s newest 30” display is huge - but you expected that and prepared by already clearing off your desk, right? ;)

Getting the beast of a monitor out of the packaging was exceedingly easy: remove one piece of styrofoam, and out comes the stand. One more large piece and the panel is right there for the taking.

The ZR30w display stand packs virtually all the bells and whistles a 30” stand should. There’s 4” of vertical travel, and movement in every axis except pivot (meaning you can’t rotate and use the monitor in portrait mode unless you roll your own VESA stand). HP’s quick-release lock system is surprisingly useful: the monitor has a slide-in rack which mates up with the display stand; slide the monitor in, move a lever into the locked position, and you’re done. This is again the same mechanism used in the LP3065. I was very impressed with how solid and simple this configuration is - there’s no flexing or creaking, and no screws or assembly. It’s always a nice touch when out-of-box setup is painless - it’s downright critical when you’re juggling a 30” display. In addition, at the base of the monitor is a snap-on cable management cover for routing cables.

 
The aesthetics are serious and businesslike. There’s a small HP logo up top and center, the model number sits meekly in the bottom left, and in the bottom right are the display controls. There’s a classy aluminum strip which runs along the entire outside of the bezel - a nice industrial design motif for a 30 incher. HP advertises that the chassis uses at least 25% post-consumer recycled plastic resin.
 
That classy aluminum strip runs all the way around

Around back is a much larger HP logo, cooling vents, and the display inputs. There’s also a partial handhold which is great for guiding the monitor into the latch mechanism. Other than that, there’s not much else to speak of except the two USB 2.0 ports on the left side of the display. What’s good about the ZR30w’s aesthetics is that they aren’t loud, garish, or overwhelmed with branding.

Two USB 2.0 side ports

I noted in previous display reviews that sometimes at the lowest height setting the display connectors can hit the stand or otherwise be obstructed. Here, HP gives almost two full inches of clearance for cables. This is the way it should be done - no problems connecting DVI cables, especially since dual-link cables are notably beefier.

We always like to use a monitor out of the box without calibration for some time, just to get a feel for it. While it’s easy to make the case that if you’re shopping for a 30” LCD you’ve probably got the means to calibrate, that’s a harder case to make for smaller displays. That said, I was immediately impressed with the ZR30w. Right away, the greens and reds were notably richer than on the two BenQ FP241W displays I use daily.

HP ships its manual on an enclosed CD-ROM, and the same installer includes a color calibration .icm profile. As a rule, I’m going to start using manufacturer-supplied color profiles for my subjective uncalibrated testing and “uncalibrated” results, since they’re closest to what average users without colorimeters will actually end up with. Even with this ICM profile, however, the panel seemed a bit cool in color temperature to me (I later measured and found the same), but everything else seemed quite good.
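
If you want to poke at the bundled profile yourself, Pillow’s ImageCms module (the littleCMS bindings) can read out its description and convert images through it. The sketch below is purely illustrative; the profile and image filenames are placeholders, not HP’s actual paths.

    # Inspect HP's supplied .icm profile and soft-proof an image through it.
    # Paths are placeholders; point them at the profile from the driver CD.
    from PIL import Image, ImageCms

    profile_path = "HP_ZR30w.icm"        # hypothetical filename
    monitor = ImageCms.getOpenProfile(profile_path)

    print(ImageCms.getProfileDescription(monitor))   # human-readable description
    print(ImageCms.getProfileInfo(monitor))          # full info block from the profile

    # Convert an sRGB image into the monitor's space to preview how colors map:
    srgb = ImageCms.createProfile("sRGB")
    img = Image.open("test_image.png").convert("RGB")
    transform = ImageCms.buildTransformFromOpenProfiles(srgb, monitor, "RGB", "RGB")
    ImageCms.applyTransform(img, transform).save("test_image_monitor_space.png")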

Comments

  • Mumrik - Wednesday, June 2, 2010 - link

    They basically never have. It's really a shame though - to me, the ability to put the monitor into portrait mode with little to no hassle is one of the major advantages of LCD monitors.
  • softdrinkviking - Tuesday, June 1, 2010 - link

    brian, i think it's important to remember that

    1. it is unlikely that people can perceive the difference between 24-bit and 30-bit color.

    2. to display 30-bit color (10 bits per channel), you also need an application that is 10-bit aware, like Maya or AutoCAD, in which case the user would most likely opt for a workstation card anyway.

    i am unsure, but i don't think Windows 7 or any other commonly used program is written to take advantage of 10-bit color depth.

    from what i understand, 10-bit color and "banding" only really have an impact when you edit and re-edit image files over and over, in which case you are probably using medical equipment or blowing up photography to poster sizes professionally. (there's a rough numeric illustration of this just after this comment.)

    here is a neat little AMD pdf on their 30 bit implementation
    http://ati.amd.com/products/pdf/10-Bit.pdf
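
A quick way to see the banding softdrinkviking describes is to run an 8-bit gradient through an edit and its inverse and count the surviving levels. This is just an illustrative NumPy sketch (not from the comment or the review); it also spells out where the 16.7 million and 1.07 billion color figures come from.

    import numpy as np

    # The color counts behind "24-bit" and "30-bit":
    print(f"24-bit: {2**24:,} colors   30-bit: {2**30:,} colors")

    ramp = np.arange(256, dtype=np.uint8)   # 8-bit gradient: every level 0..255 once

    def gamma(values, g):
        """Apply a gamma curve to 8-bit values, rounding back to 8 bits."""
        return np.clip(np.round(255.0 * (values / 255.0) ** g), 0, 255).astype(np.uint8)

    # Edit, then "undo" with the inverse gamma. In floating point the round trip is
    # nearly lossless; at 8 bits per channel the double rounding drops levels for good,
    # which is what shows up as banding after repeated edits.
    edited = gamma(gamma(ramp, 2.2), 1 / 2.2)
    print("distinct gray levels remaining:", np.unique(edited).size)   # fewer than 256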
  • zsero - Wednesday, June 2, 2010 - link

    I think the only case where you need to watch a 10-bit _source_ is when viewing results from medical imaging devices. Doctors say that they can see a difference between 256 and 1024 gray values.
  • MacGyver85 - Thursday, June 3, 2010 - link

    Actually, Windows 7 does support 10-bit encoding; it even supports more than that: 16-bit encoding!
    http://en.wikipedia.org/wiki/ScRGB_color_space
  • softdrinkviking - Saturday, June 5, 2010 - link

    all that means is that certain components of Windows 7 "support" 16-bit color.

    it does not mean that 16-bit color is displayed at all times.

    scRGB is a color profile management specification that allows for a wider range of color information than sRGB, but it does not automatically enable 16-bit color, or even 10-bit deep color.

    you still need to be running a program that is 10-bit aware, or using a program that runs in a 10-bit-aware Windows component (like D3D).

    things like Aero (which uses DirectX) could potentially take advantage of an scRGB color profile with 10-bit deep encoding, but why would it?
    it would suck performance for no perceivable benefit.

    the only programs that really use 10-bit color are professional imaging programs for medical and design uses.
    it is unlikely that this will change, because it is more expensive to optimize software for 10-bit color, and the benefit is only perceivable in a handful of situations.
  • CharonPDX - Wednesday, June 2, 2010 - link

    I have an idea for how to improve your latency measurement.

    Get a Matrox DualHead2Go Digital Edition. This outputs DVI-I over both outputs, so each can do either analog or digital. Test it with two identical displays over both DVI and VGA to make sure that the DualHead2Go doesn't directly introduce any lag. Compare with two identical displays, one over DVI and one over VGA, to see if either the display or the DualHead2Go introduces lag over one interface over the other. (I'd recommend trying multiple pairs of identical displays to verify.)

    This would rip out any video card mirroring lag (most GPUs do treat the outputs separately, and those outputs may produce lag) and leave you solely at the mercy of any lag inherent to the DH2Go's DAC.

    Next, get a high quality CRT, preferably one with BNC inputs. Set the output to 85 Hz for max physical framerate. (If you go with direct-drive instead of DualHead2Go, set the resolution to something really low, like 1024x768, and set the refresh rate as high as the display will go. The higher, the better. I have a nice-quality old 22" CRT that can go up to 200 Hz at 640x480 and 150 Hz at 1024x768.)

    Then, you want to get a good test. Your 3DMark is pretty good, especially with its frame counter. But there is an excellent web-based one at http://www.lagom.nl/lcd-test/response_time.php (part of a wonderful series of LCD tests). This one goes to the thousandths of a second. (Obviously, you need a pretty high refresh rate to actually display all of those, but if you can reach it, it's great.)

    Finally, take your pictures with a high-sensitivity camera at 1/1000 sec exposure. This will "freeze" even the fastest frame rate.
  • zsero - Wednesday, June 2, 2010 - link

    In operating systems, 32-bit is the funniest of all: it is the same as 24-bit, except I think they count the alpha channel too, so RGB would be 24-bit and RGBA would be 32-bit. But as far as I know, at the operating system level it doesn't mean anything useful; it just looks good in the control panel. True Color would be a better name for it. In any operating system, if you take a screenshot the result will be 24-bit RGB color data. (There's a small packing example just after the quote below.)

    From wikipedia:
    32-bit color

    "32-bit color" is generally a misnomer in regard to display color depth. While actual 32-bit color at ten to eleven bits per channel produces over 4.2 billion distinct colors, the term “32-bit color” is most often a misuse referring to 24-bit color images with an additional eight bits of non-color data (I.E.: alpha, Z or bump data), or sometimes even to plain 24-bit data.
  • velis - Wednesday, June 2, 2010 - link

    However:
    1. Reduce size to somewhere between 22 and 24"
    2. Add RGB LED instead of CCFL (not edge lit either)
    3. Add 120Hz for 3D (using multiple ports when necessary)
    4. Ditch the 30 bits - only good for a few apps

    THEN I'm all over this monitor.

    As it is, it's just another 30 incher, great and high quality, but still just another 30 incher...

    I SO want to replace my old Samsung 215TW, but there's nothing out there to replace it with :(
  • zsero - Wednesday, June 2, 2010 - link

    Gamut will not change between 24-bit and 30-bit color, as it is a physical property of the panel in question (plus the backlight).

    So the picture will be visually the same; nothing will change, except that if you are looking at a very fine gradient, there will not be any point where you notice a sharp line.

    Think about it in Photoshop. You make an 8-bit grayscale image (256 possible grey values for each pixel) and apply a black-to-white gradient across its whole width. Now look at the histogram: you see a continuous distribution of values from 0 to 255.

    Now make some huge color correction, like a big change in gamma. The histogram is no longer a continuous curve but something full of spikes, because with rounding errors a correction from 256 possible values to 256 possible values skips certain values.

    Now apply a levels correction and map the darkest black to, for example, 50 and the brightest white to, for example, 200. What happens now is that you are compressing the whole dynamic range into a much smaller interval, but as your scale is fixed, you are now using only about 150 values for the original range. That's exactly what happens, for example, when you use calibration software to calibrate a wide-gamut (close to 100% AdobeRGB) monitor for sRGB use, because you need to use it in non-color-aware programs (a very, very common situation). (There's a short sketch of this just after this comment.)

    For an actual real-world test, I would simply suggest using the calibration software to calibrate your monitor to sRGB and having a look at fine gradients. For example, check it with the fullscreen tools from http://lcdresource.com/tools.php
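
zsero's walkthrough above can be reproduced in a few lines; this is just an illustrative NumPy sketch (not part of the review's test suite): build the 8-bit gradient, apply a gamma change and a levels squeeze, and count how many of the 256 values remain in use.

    import numpy as np

    ramp = np.arange(256, dtype=np.uint8)    # 8-bit grayscale gradient, values 0..255

    # Big gamma change: rounding back into 8 bits skips values, so the histogram turns spiky.
    gamma_corrected = np.round(255.0 * (ramp / 255.0) ** 1.8).astype(np.uint8)
    print("values in use after gamma change:", np.unique(gamma_corrected).size)   # fewer than 256

    # Levels correction mapping black to 50 and white to 200, as described above:
    # the whole range is squeezed into roughly 150 output codes.
    levels = np.round(50 + (ramp / 255.0) * 150).astype(np.uint8)
    print("values in use after levels:", np.unique(levels).size)                  # about 151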
  • Rick83 - Wednesday, June 2, 2010 - link

    I feel that Eizo has been ahead in this area for a while, and it appears that it will stay there.
    The screen manager software reduces the need for on-screen buttons, but still gives you direct access to gamma, color temperature, color values and even power on/off timer functions, as well as integrated profiles - automatically raising brightness when opening a photo or video app, for example.
    Taking all controls away is a bit naive :-/
