Using the updated metric for power usage, where I factor in maximum brightness and screen area rather than raw power draw alone, the ASUS PQ321Q falls right in the middle of the pack. For both maximum and minimum brightness it does OK, but not incredibly well, compared to other displays. With the higher transmittance of IGZO I had expected better numbers from the ASUS, but I imagine power usage was very low on the list of concerns relative to other performance targets.
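
As a rough illustration of what that efficiency metric captures, here is a quick sketch of one way to compute candelas per watt: luminance times panel area, divided by wall power. The exact formula and the sample numbers below are illustrative assumptions, not the figures behind the chart.

```python
# Illustrative sketch of a brightness- and area-normalized efficiency
# metric: total light output (luminance x panel area) per watt of
# wall power. Formula and sample numbers are assumptions, not the
# exact figures used for the chart.

import math

def candelas_per_watt(luminance_cd_m2, diagonal_in, power_w, aspect=(16, 9)):
    """Luminance (cd/m^2) times panel area (m^2), divided by power draw (W)."""
    w, h = aspect
    diag_m = diagonal_in * 0.0254
    width_m = diag_m * w / math.hypot(w, h)
    height_m = diag_m * h / math.hypot(w, h)
    return luminance_cd_m2 * (width_m * height_m) / power_w

# Hypothetical numbers for a 31.5" panel at maximum backlight:
print(round(candelas_per_watt(350, 31.5, 75), 2), "cd/W")
```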

Overall the ASUS PQ321Q draws a relatively average amount of power compared to other displays.

Candelas per Watt

LCD Power Draw (Kill-A-Watt)

For testing input lag, I'm again reduced to using the Leo Bodnar lag tester over HDMI. This also means the ASUS has to scale the 1080p signal up to 2160p to fill the screen. Unlike before, I think this might actually be the more realistic test, as many people will not be gaming at 2160p yet. Looking at the gaming numbers our own Ian Cutress found with a 4K display, you might want to run at 1080p for a while, at least until setting up a 4x Titan rig becomes more affordable. Then again, if you can afford the ASUS PQ321Q, you might be buying a 4x Titan setup as well.

Back to the actual data: the ASUS comes in at 28.93 ms of lag on average across the three measurement locations. That is better than the Dell U3014 manages, but slower than the BenQ XL2720T, which is a native 1080p display. Given the scaling involved, I think this is a pretty decent result.
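
To put that number in context, here is a quick conversion of the 28.93 ms average into frames of lag at a few common refresh rates; it is purely arithmetic on the figure above.

```python
# Convert the measured 28.93 ms average input lag (mean of the three
# Leo Bodnar measurement points) into frames of lag at common refresh
# rates. This is just arithmetic on the number quoted above.

avg_lag_ms = 28.93

for hz in (30, 60, 120):
    frame_time_ms = 1000.0 / hz
    print(f"{hz:>3} Hz: {avg_lag_ms / frame_time_ms:.2f} frames of lag "
          f"({frame_time_ms:.2f} ms per frame)")
```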

Processing Lag Comparison (By FPS)

Despite my GPU being only a GTX 660 Ti, I did try out a bit of gaming on the ASUS. One question debated in Ian's round-up was the necessity of MSAA at 4K resolutions. Measuring just now, I sit exactly 2' away from the ASUS PQ321Q, with my eyes roughly dead center on the display. Firing up Half-Life 2 (look, I'm not much of a gamer!), I can easily see the difference between no MSAA, 2x, and 4x MSAA. The pixel density would need to be even higher, or I'd need to sit further away, for MSAA to stop making a difference.
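
For a rough sense of why aliasing is still visible at that distance, here is a back-of-the-envelope pixels-per-degree calculation for a 31.5" 3840x2160 panel viewed from 2 feet. The 120 px/deg comparison point is a common rule-of-thumb acuity figure (60 cycles per degree, two pixels per cycle), not something I measured.

```python
# Back-of-the-envelope check: how many pixels per degree does a 31.5",
# 3840x2160 panel subtend from a 2-foot viewing distance? The 120 px/deg
# comparison point is the common rule-of-thumb acuity limit, not a
# measured value.

import math

diag_in, h_px, aspect_w, aspect_h = 31.5, 3840, 16, 9
view_dist_in = 24.0  # 2 feet

width_in = diag_in * aspect_w / math.hypot(aspect_w, aspect_h)
ppi = h_px / width_in
deg_per_pixel = math.degrees(math.atan((1.0 / ppi) / view_dist_in))
px_per_degree = 1.0 / deg_per_pixel

print(f"{ppi:.0f} ppi, {px_per_degree:.0f} pixels per degree at 2 feet")
# Roughly 140 ppi and ~59 px/deg, well short of ~120 px/deg, so edge
# aliasing remains visible and MSAA still helps.
```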

Without MSAA things still looked very sharp overall, but jagged lines are easy to spot if I look for them. You may be able to get away with 2x or 4x instead of 8x MSAA, but you'll want it enabled. Beyond that, the PQ321Q worked well for my casual gaming. Nothing recognized the display correctly at first, perhaps because of MST, but once in the game I could select the native 3840x2160 resolution without issue.

At the request of a commenter I'm adding some PixPerAn photos, trying to show best and worst case results. I've not used PixPerAn at all before, so feedback would be great. If I've done something wrong with it, I'll try to correct it ASAP.

Looking at the gamut, we see a value that indicates full sRGB gamut coverage. From the earlier CIE diagram, however, we know the display does not fully cover red, blue, and magenta. It seems the extra green/yellow/orange region is large enough that the measured volume equals that of the sRGB space, but some of that volume lies outside sRGB. In other words, the gamut volume is close to sRGB's, but true sRGB coverage falls just short.
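
To make the volume-versus-coverage distinction concrete, here is a small sketch using shapely with made-up primaries; the coordinates below are hypothetical and are not the PQ321Q's measured values.

```python
# Sketch of gamut "volume" vs. "coverage" in CIE xy space using two
# triangles. The display primaries below are made up for illustration
# and are NOT the PQ321Q's measured primaries. Requires shapely.

from shapely.geometry import Polygon

srgb = Polygon([(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)])

# Hypothetical panel: slightly short on red and blue, oversaturated green.
panel = Polygon([(0.630, 0.335), (0.290, 0.640), (0.155, 0.070)])

volume = panel.area / srgb.area                        # can exceed 100%
coverage = panel.intersection(srgb).area / srgb.area   # capped at 100%

print(f"gamut volume:   {volume:.1%}")
print(f"gamut coverage: {coverage:.1%}")
# The extra green area counts toward volume but lies outside sRGB, so
# volume can read 100% or more while actual coverage falls short.
```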

Comments

  • ninjaburger - Tuesday, July 23, 2013

    I feel like this is an easy position to take with very few 4k TVs in the wild, very little content delivered in 4k, and, maybe most importantly, *even less* content being finished at 4k (as opposed to upscaled 2.5k or 3k).

    When you see native 4k content distributed at 4k on a good 4k display, you can see the difference at normal viewing distances.

    I don't think it's worth adopting until 2015 or 2016, but it will make a difference.
  • Hrel - Tuesday, July 23, 2013

    I see no reason to upgrade until 1080p becomes 10800p. Kept the CRT for 30 years, inherited the damn thing. That's how long I intend to keep my 1080p TV, whether TV makers like it or not.
  • Sivar - Tuesday, July 23, 2013

    30 years? I hope you don't have a Samsung TV.
  • althaz - Tuesday, July 23, 2013

    It doesn't matter what brand the TV is; good TVs last up to about 7 years. Cheaper TVs last even less time.
  • DanNeely - Tuesday, July 23, 2013

    My parents' no-name 19" CRT TV lasted from the early '80s to ~2000; the no-name ~30" CRT TV they replaced it with was still working fine ~3 years ago, when they got a used ~35-40" 720p LCD for free from someone else. I'm not quite sure how old that TV is; IIRC it was from shortly after prices in that size dropped enough to make them mass market.

    Maybe you just abuse your idiotboxes.
  • bigboxes - Wednesday, July 24, 2013

    You must be trolling. My top-of-the-line Mitsubishi CRT started having issues in 2006, its seventh year. I replaced it with an NEC LCD panel that I'm still using today. It could go at any time and I'd update to the latest technology. I'm picky about image quality and couldn't care less about screen thinness, but there are always options if you are looking for quality. I'm sure your 1080p TV won't make it 30 years. Of course, I don't believe your CRT made it 30 years without degradation issues. It's just not possible. Maybe you are just a cheap ass. At least man up about it. I want my 1080p TV to last at least ten years. Technology will have long passed it by at that time.
  • bigboxes - Wednesday, July 24, 2013

    Of course, this is coming from a man who replaced his bedroom CRT TV after almost 25 years. Even so, the tube was much dimmer before the "green" stopped working, not to mention the tuner had long since given up the ghost. This TV had migrated from the living room, where it was the primary set, to the bedroom until it finally died. I miss it, but I'm not going to kid myself into believing that 1988 tech is the same as 2012. It's night and day.
  • cheinonen - Tuesday, July 23, 2013

    I've expanded upon his chart, built a calculator, and written up some more about it for other situations, like a desktop LCD, here:

    http://referencehometheater.com/2013/commentary/im...

    Basically, your living room TV is the main place where you don't see a benefit from 4K. And I've seen all the 4K demos with actual 4K content in person. I did like how, at CES this year, companies got creative and arranged their booths so you could sometimes only be 5-6' away from a 4K set, as if most people ever watch from that distance.
  • psuedonymous - Tuesday, July 23, 2013

    That site sadly perpetuates (by inference) the old myth of 1 arcminute/pixel being the limit of human acuity. This is totally false. Capability of the Human Visual System (http://www.itcexperts.net/library/Capability%20of%...) is a report from the AFRL that nicely summarises how we are nowhere even CLOSE to what could actually be called a 'retina' display.
  • patrickjchase - Tuesday, July 23, 2013

    A lot of people seem to confuse cycles/deg with pixels/deg. The commonly accepted value for the practical limit of human visual acuity is 60 cycles/deg, or 1 cycle/min. The paper you posted supports this limit by the way: Section 3.1 states "When tested, the highest sinusoidal grating that can be resolved in normal viewing lies between 50 and 60 cy/deg..."

    To represent a 60 cycle/deg modulation you need an absolute minimum of 120 pixels/deg (Nyquist's sampling theorem). Assuming unassisted viewing and normal vision (not near-sighted) this leads to an overall resolution limit of 500 dpi or so.

    With that said, the limit of acuity is actually not as relevant to our subjective perception of "sharpness" as many believe. There have been several studies arguing that our subjective perception is largely driven by modulations on the order of 20 pairs/degree (i.e. we consider a scene to be "sharp" if it has strong modulations in that range). Grinding through the math again we get an optimal resolution of 200-300 dpi, which is right about where the current crop of retina displays is clustered.
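
For what it's worth, here is a quick sketch of the dpi arithmetic in that last comment; the viewing distances are assumed values, since the comment does not specify them.

```python
# Quick check of the dpi arithmetic above: convert a pixels-per-degree
# target into dots per inch at an assumed viewing distance. The 12"
# distance is an assumption; the comment doesn't state one.

import math

def dpi_for(px_per_degree, viewing_distance_in):
    """DPI needed so one pixel subtends 1/px_per_degree of a degree."""
    pixel_pitch_in = viewing_distance_in * math.tan(math.radians(1.0 / px_per_degree))
    return 1.0 / pixel_pitch_in

print(f"{dpi_for(120, 12):.0f} dpi")  # 60 cyc/deg -> 120 px/deg: ~570 dpi ("500 dpi or so")
print(f"{dpi_for(40, 12):.0f} dpi")   # 20 cyc/deg -> 40 px/deg: ~190 dpi (near the 200-300 dpi range)
```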
