Coming into this review, I wasn't totally sure what to expect from the ASUS PQ321Q, or from any monitor with this high a resolution. I love the screens on my iPhone 5 and my Retina iPad, but I hold those quite close to my face. Since I sit a couple of feet away from a monitor, was I really going to notice the difference? Yes, yes I did.

Even coming into the office straight from a standard 30”, 2560x1600 display, the difference is huge. You get a larger desktop, a far crisper screen, or possibly both. It isn't a small difference, but one I notice easily, every single time I sit down at my desk. It's also apparent that many application vendors need to hurry up with their support for DPI scaling, because when it isn't handled correctly, things get really ugly.

The ASUS PQ321Q does have its share of problems. The color gamut isn't perfect, leading to a good number of errors in the reds, oranges, and yellows of the spectrum. Yellows were the only errors I could easily notice when looking at photos, but I measured red and orange issues as well. The dual HDMI 1.4a inputs are nice, but they limit 4K to 30 Hz, and with HDMI 2.0 possibly coming later this year that limitation will only become more apparent. The OSD could also be improved: it works, but it lacks any position or size adjustments and takes up almost half the screen when active.

In the end, my feelings about the ASUS PQ321Q are very simple. Of the dozens of displays I've reviewed for AnandTech so far, this is the one I most want to hold onto. The razor-sharp screen is simply addictive to use, and you quickly realize this is the future of displays. I'm sure that over the next few years performance will improve, prices will come down, and features will increase, and that helps everyone. But I want this now, and I don't want it to leave my house.

The ASUS PQ321Q is pricey, and I can't say that getting three or four 30” 2560x1600 panels isn't a better deal, but that's not the same as having one display that looks like this. In the end, I give the ASUS PQ321Q a Silver Award, the highest award I've personally given to any display. It's not perfect, but no other display that's come across my desk has left me in constant awe of how incredible it is to use on a day-to-day basis the way the ASUS has. It has also effectively killed any thought I've had of buying a laptop like a MacBook Air instead of a Retina MacBook Pro, because I can't imagine going back to a regular display. The next few years of high-resolution displays can't come fast enough now.

Comments
  • ninjaburger - Tuesday, July 23, 2013 - link

    I feel like this is an easy position to take with very few 4k TVs in the wild, very little content delivered in 4k, and, maybe most importantly, *even less* content being finished at 4k (as opposed to upscaled 2.5k or 3k).

    When you see native 4k content distributed at 4k on a good 4k display, you can see the difference at normal viewing distances.

    I don't think it's worth adopting until 2015 or 2016, but it will make a difference.
  • Hrel - Tuesday, July 23, 2013 - link

    I see no reason to upgrade until 1080p becomes 10800p. Kept the CRT for 30 years; inherited the damn thing. That's how long I intend to keep my 1080p TV, whether TV makers like it or not.
  • Sivar - Tuesday, July 23, 2013 - link

    30 years? I hope you don't have a Samsung TV.
  • althaz - Tuesday, July 23, 2013 - link

    It doesn't matter what brand the TV is, good TVs last up to about 7 years. Cheaper TVs last even less time.
  • DanNeely - Tuesday, July 23, 2013 - link

    My parents' no-name 19" CRT TV lasted from the early '80s to ~2000; the no-name ~30" CRT TV they replaced it with was still working fine ~3 years ago, when they got a used ~35-40" 720p LCD for free from someone else. I'm not quite sure how old that TV is; IIRC it was from shortly after prices in that size dropped enough to make them mass market.

    Maybe you just abuse your idiotboxes.
  • bigboxes - Wednesday, July 24, 2013 - link

    You must be trolling. My top-of-the-line Mitsubishi CRT started having issues in 2006, in year seven. I replaced it with an NEC LCD panel that I'm still using today. It could go at any time and I'd update to the latest technology. I'm picky about image quality and couldn't care less about screen thinness, but there are always options if you are looking for quality. I'm sure your 1080p TV won't make it 30 years. Of course, I don't believe your CRT made it 30 years without degradation issues; it's just not possible. Maybe you are just a cheap ass. At least man up about it. I want my 1080p TV to last at least ten years. Technology will have long passed it by at that point.
  • bigboxes - Wednesday, July 24, 2013 - link

    Of course, this is coming from a man who replaced his bedroom CRT TV after almost 25 years. Even so, the tube was much dimmer before the "green" stopped working, not to mention the tuner had long since died. This TV had migrated from the living room, where it was the primary set, to the bedroom, until it finally gave up the ghost. I miss it, but I'm not going to kid myself into believing that 1988 tech is the same as 2012. It's night and day.
  • cheinonen - Tuesday, July 23, 2013 - link

    I've expanded upon his chart, built a calculator, and written up some more about how it applies in other situations, like a desktop LCD, here:

    http://referencehometheater.com/2013/commentary/im...

    Basically, your living room TV is the main place where you don't see a benefit from 4K. And I've seen all the 4K demos with actual 4K content in person. I did like how at CES this year companies got creative and arranged their booths so you could sometimes only be 5-6' away from a 4K set, as if most people ever watched from that distance.
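
    For a rough illustration, the arithmetic behind such a calculator is simple. This is a minimal sketch, not the linked calculator's code; the 55" panel size and 9 ft couch distance are assumed examples:

    ```python
    import math

    def pixels_per_degree(diag_in, res_w, res_h, distance_in):
        """Pixels subtended per degree of visual angle at a given viewing distance."""
        ppi = math.hypot(res_w, res_h) / diag_in        # pixels per inch
        pixel_pitch = 1.0 / ppi                         # inches per pixel
        deg_per_px = math.degrees(2 * math.atan(pixel_pitch / (2 * distance_in)))
        return 1.0 / deg_per_px

    # Example: 55" 1080p vs. 55" 4K panels viewed from a 9 ft couch.
    for name, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160)]:
        ppd = pixels_per_degree(55, w, h, 9 * 12)
        print(f'55" {name} at 9 ft: {ppd:.0f} px/deg')
    ```

    This prints roughly 76 px/deg for 1080p and 151 px/deg for 4K. Under the common 60 px/deg (1 arcminute) rule of thumb, which the next comment disputes, even 1080p already exceeds the limit at that distance, which is why these calculators show little benefit from 4K in the living room.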
  • psuedonymous - Tuesday, July 23, 2013 - link

    That site sadly perpetuates (by inference) the old myth of 1 arcminute/pixel being the limit of human acuity. This is totally false. Capability of the Human Visual System (http://www.itcexperts.net/library/Capability%20of%...) is a report from the AFRL that nicely summarises how we are nowhere even CLOSE to what could actually be called a 'retina' display.
  • patrickjchase - Tuesday, July 23, 2013 - link

    A lot of people seem to confuse cycles/deg with pixels/deg. The commonly accepted value for the practical limit of human visual acuity is 60 cycles/deg, or 1 cycle/min. The paper you posted supports this limit by the way: Section 3.1 states "When tested, the highest sinusoidal grating that can be resolved in normal viewing lies between 50 and 60 cy/deg..."

    To represent a 60 cycle/deg modulation you need an absolute minimum of 120 pixels/deg (Nyquist's sampling theorem). Assuming unassisted viewing and normal vision (not near-sighted) this leads to an overall resolution limit of 500 dpi or so.

    With that said, the limit of acuity is actually not as relevant to our subjective perception of "sharpness" as many believe. Several studies argue that our subjective perception is largely driven by modulations on the order of 20 cycles/deg (i.e. we consider a scene to be "sharp" if it has strong modulations in that range). Grinding through the math again, we get an optimal resolution of 200-300 dpi, which is right about where the current crop of retina displays is clustered.
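
    For concreteness, here is that arithmetic as a minimal Python sketch. The viewing distances (~14 in and ~11 in) are assumptions chosen to reproduce the figures above, since the comment doesn't state them:

    ```python
    import math

    def min_dpi(cycles_per_deg, distance_in):
        """Minimum pixel density needed to represent a grating of the given
        angular frequency at the given distance (Nyquist: 2 samples per cycle)."""
        px_per_deg = 2 * cycles_per_deg
        inches_per_deg = 2 * distance_in * math.tan(math.radians(0.5))  # span of 1 deg
        return px_per_deg / inches_per_deg

    print(f"{min_dpi(60, 14):.0f} dpi")  # acuity limit: ~491 dpi, i.e. "500 dpi or so"
    print(f"{min_dpi(20, 11):.0f} dpi")  # sharpness band: ~208 dpi, in the 200-300 range
    ```

    Since the required density scales inversely with the assumed viewing distance, handheld screens need a far higher dpi than desktop monitors or living room TVs to look equally sharp.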
