A New 30" Contender: HP ZR30w Review
by Brian Klug on June 1, 2010 6:30 PM EST
Display Lag and Response Time
For gamers, display lag is a very real concern, and display processing latency is a specification that is nebulously reported, if reported at all, for just about every LCD. We’ve been over this before, but ultimately what matters isn’t GTG, full-on/full-off pixel response times, or what’s on the spec sheet - it’s the holistic latency of the monitor compared to something we can all agree is lag-free. Previously we compared against a 17” Princeton CRT as our lag-free baseline - and some of you were a bit underwhelmed by that monitor.
I spent some time visiting (I kid you not) almost every thrift store in town, and found myself a veritable cornucopia of, uh... lovingly cared-for CRTs to choose from. I settled on a much more modern-looking Sony G520 20” CRT supporting a higher resolution and refresh rate. It’s still not what I’m ultimately looking for, but it’s better. Oh, and it cost a whopping $9. ;)
As I mentioned earlier, the only downside is that this means our old data is no longer a valid reference.
To compute the processing lag, I do two things. First, I watch for differences in the clock between the CRT and LCD, noting these whenever they are visible. I did this for 10 captures of the same sequence. Second, the processing difference can be computed from the frame rate and the frame-number difference between the two displays.
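As a rough sketch of that second method (a hypothetical calculation, not the exact tooling used for this review): if both displays are photographed showing an in-game frame counter running at a known frame rate, the lag falls out as the frame-number difference divided by the FPS.

```python
def processing_lag_ms(crt_frame, lcd_frame, fps):
    """Estimate LCD processing lag (ms) relative to a lag-free CRT.

    crt_frame / lcd_frame: frame-counter values captured in the same photo.
    fps: the frame rate the on-screen counter is running at.
    """
    frame_delta = crt_frame - lcd_frame  # frames the LCD trails behind
    return (frame_delta / fps) * 1000.0

# If the LCD trails the CRT by half a frame at 60 FPS,
# the processing lag works out to roughly 8.3 ms.
lag = processing_lag_ms(1000.0, 999.5, 60)
print(round(lag, 1))
```

The function names and counter values here are illustrative; the point is simply that at 60 Hz each whole frame of difference corresponds to about 16.7 ms of lag.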
We’re still evolving what we think is the best way to measure processing lag, and even using a CRT isn’t foolproof. In this case, I set the LCD and CRT refresh rates to 60 Hz so that both, in theory, grab the same frame from the GPU’s framebuffer. In practice, it’s likely that they simply aren’t grabbing the same frame, which would explain the difference. As we process more LCDs we’ll be able to tell, but the processing lag we’ve measured from all three monitors thus far is entirely acceptable.
I played a number of FPS titles and RTS games on the display, and never noticed any display processing lag or ghosting to speak of. If you’re going to use a 30” panel for gaming, the ZR30w seems to be just as good as any.
In the ghosting images I snapped, I usually saw only two frames: the dominant frame and the preceding frame. This is surprising, since we’re used to seeing three, but throughout all the images I captured, only two frames are visible. That’s very impressive panel response.
95 Comments
boe - Wednesday, June 2, 2010 - link
I've been waiting to buy some monitors for years, as the 3008WFP had its share of issues and Apple hasn't released a new LED-backlit 30" yet. I'm looking forward to getting a couple of new monitors, but not until some higher-end models come out with a clear improvement over my 3007WFPs.
xismo - Wednesday, June 2, 2010 - link
I find it a bit disappointing that you don't list the configuration this was tested on. For example, I don't know which GPU the monitor was running on, and therefore whether that GPU had 10-bit support. It would be nice if you could add a test of 10-bit support as well - how the display performs with smooth gradients and so on. As you probably know, not too many graphics cards support 10-bit, but all of the workstation-class cards do, which I think is appropriate for testing with high-end monitors.

xismo - Wednesday, June 2, 2010 - link
After reading the comments I see there was already a discussion about 10-bit support - sorry for not seeing that earlier. At least I'm not the only one concerned about this :) BTW, the 5870 does not have 10-bit support, like almost all other gaming video cards. And using a Mini DisplayPort on a MacBook Pro will not make the GeForce 330M support 10-bit either. DisplayPort and dual-link DVI are the only types of connection able to carry 10-bit color, but you still need a matching video card. Any workstation card like a Quadro or FirePro should be fine. But yeah, including how each monitor displays gradients would be a huge advantage for me, as this is one of the things I'm looking for in a new monitor.

Brian Klug - Wednesday, June 2, 2010 - link
Hey there! Thanks for the comment - yeah, I'm working on getting us either a workstation GPU or some other way (a whole workstation) to really test the 10-bit aspect. It'll happen this week or next, and then I'll update. I realized after posting that I forgot to make sure it was working over 10-bit. And you're right about the 5870 - it doesn't have 10-bit support. Guess that's one of those arguments for the more expensive workstation version of the card!

Cheers!
Brian Klug
awaken688 - Wednesday, June 2, 2010 - link
I'm surprised no one has really complained about the brightness limitation. For professional photographers and graphic artists (the target audience), having no way to get below 150 nits would make it rough to check for print accuracy. For some of our print shops, 100 nits is accurate to print.

Brian,
Did you try adjusting the brightness via the video card as well? I have a monitor that is still too bright at 0 brightness, but I can go into NVIDIA's control panel and lower the brightness there to achieve the correct level, or lower the RGB channels manually (which I understand isn't an option for this monitor). Just wondering. Nonetheless, this display should at least be able to hit 120 nits for imaging professionals. Good article, though - I like the monitor reviews for sure.
Soldier1969 - Wednesday, June 2, 2010 - link
Thanks for the review - not a bad price for a new 30-incher compared to other brands. For those people here who want 24" reviews: those are a dime a dozen and for the poor. I had a 24" 1920 x 1200 monitor long before most did, since 2007 when they cost a fortune. So glad I jumped to 2560 x 1600 - gaming on it owns everything else out there! Blu-ray looks fantastic! If you've never experienced computing on a 30-incher, you're missing out!

kasakka - Thursday, June 3, 2010 - link
The lack of an OSD is what kills this for me. Without one, getting accurate colors is more difficult, and if you also want to use the display for gaming, some games may totally ignore calibrated color profiles or software adjustments. The lack of a scaler is a bit annoying too, but since graphics cards can do the scaling, at least in Windows, it's not really a problem.

Regarding inputs, I currently have a Dell 3008WFP, and its gazillion inputs are a huge minus for me. I only need one DVI and one DisplayPort, so having to cycle through all the useless inputs is annoying. More annoying is the 3008WFP's (and the 27" Dell U2711's) roughly five-second delay when switching resolutions or inputs. But I guess I'll stick with the Dell until someone comes out with something better.
pmeinl - Thursday, June 3, 2010 - link
Does the AG coating of the ZR30w have the annoying sparkle effect of other current IPS panels (e.g. the U2410)? Working on my two U2410s (programming, text processing) causes me eye strain.
pmeinl - Friday, June 4, 2010 - link
As some people do not see the sparkle problem, here is a thread with a picture of it: http://hardforum.com/showthread.php?t=1466914&...
B3an - Thursday, June 3, 2010 - link
So just to be clear... does this display do more colors than the Dell 3008?