Using the updated power metric, which factors in maximum brightness and screen area rather than raw power draw alone, the ASUS PQ321Q falls right in the middle of the pack. At both maximum and minimum brightness it does OK but nothing incredible compared to other displays. Given the higher transmittance of IGZO I thought I might see better numbers from the ASUS, but I imagine power usage was very low on the list of concerns relative to other performance targets.

Overall the ASUS PQ321Q draws a relatively average amount of power compared to other displays.
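As a rough illustration of how a brightness- and area-normalized efficiency number like this can be derived, here is a minimal Python sketch. The exact formula behind the chart below isn't spelled out, so the assumption here is total light output (luminance times screen area) divided by measured power draw; all the example numbers are made up.

```python
# Hypothetical sketch of a candelas-per-watt style efficiency metric.
# The exact formula used in the review isn't given, so this assumes:
#   total output (cd) = luminance (cd/m^2) * screen area (m^2)
#   efficiency        = total output / measured power draw (W)

import math

def screen_area_m2(diagonal_in: float, aspect_w: int = 16, aspect_h: int = 9) -> float:
    """Screen area in square meters from diagonal (inches) and aspect ratio."""
    diag_m = diagonal_in * 0.0254
    width = diag_m * aspect_w / math.hypot(aspect_w, aspect_h)
    height = diag_m * aspect_h / math.hypot(aspect_w, aspect_h)
    return width * height

def candelas_per_watt(luminance_cd_m2: float, diagonal_in: float, power_w: float) -> float:
    """Brightness- and area-normalized efficiency (assumed formulation)."""
    return luminance_cd_m2 * screen_area_m2(diagonal_in) / power_w

# Example with made-up numbers: a 31.5" panel at 200 cd/m^2 drawing 75 W.
print(round(candelas_per_watt(200, 31.5, 75), 2))
```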

[Charts: Candelas per Watt; LCD Power Draw (Kill-A-Watt)]
For testing input lag, I'm again reduced to using the Leo Bodnar lag tester over HDMI. This means the ASUS has to scale the 1080p signal up to 2160p to fill the screen. Unlike before, I think this may actually be the more relevant test, as many people won't be gaming at 2160p yet. Looking at the gaming numbers Ian Cutress found with a 4K display, you may want to run at 1080p for a while, until setting up a 4x Titan rig becomes more affordable. Then again, if you can afford the ASUS PQ321Q, you might be buying a 4x Titan setup as well.

Back to the actual data: the ASUS comes in at 28.93 ms of lag on average across the three measurement locations. That is better than the Dell U3014, but slower than the BenQ XL2720T, which is a native 1080p display. Given the scaling going on here, I think this is a pretty decent result.
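For readers who think of lag in frames rather than milliseconds (as in the FPS-based chart below), here is a minimal sketch of the arithmetic, assuming a 60 Hz refresh. Only the 28.93 ms average comes from the measurements; the three per-location readings are hypothetical stand-ins for the Leo Bodnar tester's top/middle/bottom values.

```python
# Sketch: averaging Leo Bodnar lag readings and expressing them in frames.
# The three per-location values below are hypothetical; only the ~28.93 ms
# average is from the review. At 60 Hz, one frame lasts 1000/60 ≈ 16.67 ms.

top_ms, middle_ms, bottom_ms = 24.0, 29.0, 33.8   # hypothetical readings
average_ms = (top_ms + middle_ms + bottom_ms) / 3

frame_time_ms = 1000 / 60
lag_in_frames = average_ms / frame_time_ms

print(f"Average lag: {average_ms:.2f} ms ({lag_in_frames:.2f} frames at 60 Hz)")
```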

[Chart: Processing Lag Comparison (By FPS)]

Despite my GPU being only a GTX 660 Ti, I did try out a bit of gaming on the ASUS. One question debated in Ian's round-up was the necessity of MSAA at 4K resolutions. Measuring just now, I sit exactly 2' away from the ASUS PQ321Q, with my eyes roughly dead center on the display. Firing up Half Life 2 (look, I'm not much of a gamer!), I can easily see the difference between no MSAA, 2x, and 4x MSAA. The pixel density would need to be even higher, or I'd need to sit farther away, for MSAA to stop making a difference.

Without MSAA things still look very sharp overall, but jagged edges are easy to spot if I look for them. You may be able to get away with 2x or 4x instead of 8x MSAA, but you'll want it enabled. Beyond that, the PQ321Q worked well for my casual gaming. Nothing recognized the display correctly at first, perhaps because of MST, but once in a game you can properly select the 3840x2160 resolution.
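To put the viewing-distance argument in numbers, here is a small sketch estimating how large one pixel of a 31.5" 3840x2160 panel appears from 2 feet away, compared against the common ~1 arcminute acuity rule of thumb. The acuity threshold is a general assumption, not something measured in this review.

```python
# Sketch: angular size of one pixel on a 31.5" 3840x2160 panel viewed at 2 feet,
# compared against the common ~1 arcminute visual-acuity rule of thumb.
# The acuity threshold is a general assumption, not a figure from the review.

import math

DIAGONAL_IN = 31.5
H_PIXELS, V_PIXELS = 3840, 2160
VIEW_DISTANCE_IN = 24.0  # 2 feet

ppi = math.hypot(H_PIXELS, V_PIXELS) / DIAGONAL_IN            # ~140 PPI
pixel_pitch_in = 1 / ppi
pixel_arcmin = math.degrees(math.atan(pixel_pitch_in / VIEW_DISTANCE_IN)) * 60

print(f"{ppi:.0f} PPI, one pixel subtends ~{pixel_arcmin:.2f} arcminutes at 2 feet")
# A result at or above 1 arcminute suggests individual pixels (and thus
# aliasing) can still be resolved, which matches the MSAA observation above.
```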

At the request of a commenter I'm adding some PixPerAn photos, trying to show best and worst case results. I've not used PixPerAn at all before, so feedback would be great. If I've done something wrong with it, I'll try to correct it ASAP.

Looking at the gamut, we see a value that indicates full sRGB gamut coverage. From the earlier CIE diagram, though, we know we don't have full coverage of red, blue, and magenta. It seems the extra green/yellow/orange region is large enough that the measured volume equals the sRGB space, but some of that volume lies outside sRGB. It is close to full sRGB coverage, but not quite.
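A quick sketch of the distinction being drawn here: a gamut volume number compares the overall size of the display's gamut to sRGB, while coverage only counts the portion that overlaps sRGB. The areas in the example below are hypothetical, chosen just to show how the two figures can diverge.

```python
# Sketch distinguishing relative gamut volume from sRGB coverage, using
# hypothetical (xy chromaticity) areas. A display can report ~100% relative
# volume while still missing part of sRGB, if some of its gamut lies outside.

srgb_area = 0.1120      # hypothetical area of the sRGB triangle
display_area = 0.1120   # hypothetical display gamut area (same size as sRGB)
overlap_area = 0.1065   # hypothetical intersection of the two gamuts

relative_volume = display_area / srgb_area   # "gamut volume" style number
coverage = overlap_area / srgb_area          # true sRGB coverage

print(f"Relative volume: {relative_volume:.0%}, sRGB coverage: {coverage:.0%}")
# Here relative volume reads 100% even though coverage is only ~95%,
# because the extra green/yellow/orange area sits outside sRGB.
```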


166 Comments


  • cheinonen - Tuesday, July 23, 2013 - link

    And then after that you're going to sell far fewer, so your profit margins are going to have to change to adapt for that as well, and it really winds up making them far more expensive. It really is the best looking display I've used and the one I most want to keep around after the review period. Companies should be rewarded for taking the risk in releasing niche products that help push the market forward, and really are a breakthrough.
  • Sivar - Tuesday, July 23, 2013 - link

    Ideally they can cut 3 good 15" displays from the failed 30" material.
    Whether the process actually works this way, I don't know.
  • madmilk - Tuesday, July 23, 2013 - link

    It doesn't work that way. That's like saying Intel can cut a quad core CPU into two dual core CPUs.
  • sunflowerfly - Wednesday, July 24, 2013 - link

    Where do you think Intel gets lower core count CPUs? They actually do disable cores and sell them as lower-spec parts.
  • DanNeely - Thursday, July 25, 2013 - link

    They've done so in the past, and IIRC still do bin GPU levels that way; but in all their recent generations the dual and quad core CPUs that make up 99% of their sales have been separate dies.

    Your analogy breaks down even for the handful of exceptions (single-core Celeron, quad-core LGA2011), since the LCD equivalent would be selling you a 15" screen in a 30" case with a huge asymmetric bezel covering 3/4 of the panel area.
  • Calista - Thursday, July 25, 2013 - link

    It's not just the parts getting more expensive to manufacture; it's also that the manufacturer knows it's a high-margin product. The difference in price between an APS-C and an FF sensor is an order of magnitude smaller than the difference in price between the complete cameras, i.e. $500 vs. $2500, even if the FF camera obviously also includes faster processing, a higher quality body, etc.
  • YazX_ - Tuesday, July 23, 2013 - link

    Companies would like to milk users since it's being brought to the desktop marketed as NEW TECH; this is the only reason it's so pricey. And don't forget that in the coming months other companies will bring their products into competition, which will help greatly in reducing prices.
  • Fleeb - Tuesday, July 23, 2013 - link

    This reply is better than yours: http://www.anandtech.com/comments/7157/asus-pq321q...
  • madmilk - Tuesday, July 23, 2013 - link

    No worries, there's a 4K 39" TV on Amazon for $700. Since that TV has the same number of pixels and isn't a whole lot bigger, I think we will soon be seeing these 32" displays fall into that sub-$1000 range as well.
  • peterfares - Wednesday, July 24, 2013 - link

    That screen is lower quality and doesn't have an input capable of driving it at 4K at 60Hz.
