Many consider me to be a 4K hater. The past few trade shows I’ve attended have pushed it on consumers as a reason to replace their TVs, but I see little value there. A computer display, however, is a different game. Unlike a 50” TV, we sit close to our monitors, even when they are 30” in size. We also have no worries about a lack of native content, since everything is rendered on the fly at native resolution. Nor is the lack of HDMI 2.0 an issue, as DisplayPort 1.2 can drive a 3840x2160 screen at 60 Hz.

When it comes to 4K on the desktop, my main question is: how much difference will I see? ASUS is one of the first out with a HiDPI display in the PQ321Q. While not true 4K (which is 4096 pixels wide), it is a 3840x2160 LCD that accepts an Ultra High Definition (UHD) signal over HDMI and DisplayPort. It also clocks in at a wallet-stretching $3,500 right now. So are we seeing the future of displays here, or a niche product?

What does 4K/UHD/HiDPI bring to the desktop? We’ve seen it for a few years now in smartphones and tablets, where it makes small screens far more usable for reading and general work. My initial thought was more desktop space, as that is what higher resolution has always meant before. But with a 32” monitor at this pixel density, running without any DPI scaling leaves a desktop where reading text is a huge pain. Instead, I believe most users will opt for DPI scaling so that elements are larger and easier to read. The result is something similar to the Retina screen on the iPhone: no more desktop space than a 2560x1440 monitor offers, but a picture that is razor sharp and easier to look at.
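To put numbers on that, pixel density is easy to compute from resolution and diagonal size. A quick sketch (the 27” 2560x1440 panel is my assumption of a typical high-end desktop monitor for comparison):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 31.5)))  # PQ321Q: ~140 ppi
print(round(ppi(2560, 1440, 27.0)))  # typical 27" WQHD panel: ~109 ppi
```

At 100% scaling, everything on the PQ321Q renders noticeably smaller than on a 27” WQHD screen, which is why DPI scaling becomes the sensible default.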

To get to this pixel density, ASUS has relied upon a panel from Sharp that uses IGZO technology. IGZO (indium gallium zinc oxide) is a material that replaces amorphous silicon for the active layer of an LCD. Its main benefit is higher electron mobility, which allows for faster-switching, smaller pixels. We have seen non-IGZO panels in smartphones with higher pixel densities, but no current desktop LCD offers a higher pixel density than this ASUS display. IGZO also allows for a wide viewing angle.

ASUS has packed this LCD into an LED edge-lit display that is only 35mm thick at its deepest point. Getting that thin requires an external power brick instead of an internal power supply, a trade-off I’d rather not see. The 35mm depth is very nice, but unlike a TV, most people don’t wall-mount a desktop LCD, so I’d accept the extra bulk to avoid the heavy brick. It does lead to a cooler display: even after two consecutive days on, the PQ321Q remains relatively cool to the touch, while the power brick itself is quite warm.

Unlike most ASUS displays, which click into their stand, the PQ321Q is attached with four small screws. This seems to be another measure to cut down on thickness, as a quick-release mechanism takes up space, but I do miss the convenience one offers. Inputs are a single DisplayPort and a pair of HDMI 1.4a ports. In a nice touch, these are side-mounted instead of bottom-mounted, making it easy to access them.

Be aware that HDMI 1.4a was not designed around UHD/4K resolutions, so the maximum refresh rate at 3840x2160 is 30 Hz. If you’re watching a 24p film that won’t matter, but there is no real source for those right now anyway. HDMI 2.0 is supposed to resolve this; it was promised at CES this year, and I think we’ll be lucky to see it at CEDIA in September.
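The limitation comes down to bandwidth arithmetic. A rough link-budget sketch, assuming the standard CTA-861 UHD timing (4400x2250 total including blanking) and 24-bit color:

```python
def pixel_clock_mhz(total_h, total_v, refresh_hz):
    """Pixel clock implied by a video timing, in MHz."""
    return total_h * total_v * refresh_hz / 1e6

uhd60 = pixel_clock_mhz(4400, 2250, 60)  # 594 MHz
uhd30 = pixel_clock_mhz(4400, 2250, 30)  # 297 MHz

# DisplayPort 1.2: 4 lanes x 5.4 Gbps (HBR2), 8b/10b coding -> 17.28 Gbps payload
dp12_payload_gbps = 4 * 5.4 * 8 / 10
uhd60_gbps = uhd60 * 24 / 1000           # ~14.26 Gbps at 24 bits per pixel

# HDMI 1.4 tops out at a 340 MHz TMDS clock: enough for UHD at 30 Hz, not 60 Hz
hdmi14_max_tmds_mhz = 340.0

print(uhd60_gbps < dp12_payload_gbps)        # DP 1.2 can carry UHD at 60 Hz
print(uhd30 <= hdmi14_max_tmds_mhz < uhd60)  # HDMI 1.4 cannot
```

The margins are slim enough on DisplayPort that 4 lanes of HBR2 are required; there is no headroom for deeper color at 60 Hz.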

One area where the ASUS falls a bit short is the On Screen Display (OSD). While clear and fairly easy to work in, it takes up most of the screen and cannot be resized or repositioned. Moving to 4K may have required a new OSD that simply isn’t fully refined yet. It isn’t awful, and it offers a user mode with a two-point white balance, but it needs some work to be at the top of the game.

The full specs for the ASUS are listed below. Once this beast is unboxed, let’s set it up.

ASUS PQ321Q
Video Inputs: 2x HDMI 1.4a, 1x DisplayPort 1.2 with MST
Panel Type: IGZO LCD
Pixel Pitch: 0.182mm
Colors: 1.07 billion
Brightness: 350 cd/m²
Contrast Ratio: 800:1
Response Time: 8ms GTG
Viewable Size: 31.5"
Resolution: 3840x2160
Viewing Angle (H/V): 176° / 176°
Backlight: LED
Power Consumption (operation): 93W
Power Consumption (standby): <1W
Screen Treatment: non-glare
Height-Adjustable: Yes, 150mm
Tilt: Yes, -25 to 5 degrees
Pivot: No
Swivel: Yes, -45 to 45 degrees
VESA Wall Mounting: Yes, 200mm
Dimensions w/ Base (WxHxD): 29.5" x 19.3" x 10.1"
Weight: 28.7 lbs.
Additional Features: 3.5mm input and output, 2x 2W speakers
Limited Warranty: 3 years
Accessories: DisplayPort cable, USB to RS232 adapter cable
Price: $3,499

 

Comments

  • ninjaburger - Tuesday, July 23, 2013 - link

    I feel like this is an easy position to take with very few 4k TVs in the wild, very little content delivered in 4k, and, maybe most importantly, *even less* content being finished at 4k (as opposed to upscaled 2.5k or 3k).

    When you see native 4k content distributed at 4k on a good 4k display, you can see the difference at normal viewing distances.

    I don't think it's worth adopting until 2015 or 2016, but it will make a difference.
  • Hrel - Tuesday, July 23, 2013 - link

    I see no reason to upgrade until 1080p becomes 10800p. Kept the CRT for 30 years, inherited the damn thing. That's how long I intend to keep my 1080p TV; whether tv makers like it or not.
  • Sivar - Tuesday, July 23, 2013 - link

    30 years? I hope you don't have a Samsung TV.
  • althaz - Tuesday, July 23, 2013 - link

    It doesn't matter what brand the TV is, good TVs last up to about 7 years. Cheaper TVs last even less time.
  • DanNeely - Tuesday, July 23, 2013 - link

    My parents no-name 19" CRT tv lasted from the early '80's to ~2000; the no-name ~30" CRT tv they replaced it with was still working fine ~3 years ago when they got a used ~35-40" 720p LCD for free from someone else. I'm not quite sure how old that TV is; IIRC it was from shortly after prices in that size dropped enough to make them mass market.

    Maybe you just abuse your idiotboxes.
  • bigboxes - Wednesday, July 24, 2013 - link

    You must be trolling. My top of the line Mitsubishi CRT started having issues in 2006 in year seven. I replaced it with an NEC LCD panel that I'm still using today. It could go at any time and I'd update to the latest technology. I'm picky about image quality and couldn't care less about screen thinness, but there are always options if you are looking for quality. I'm sure your 1080p tv won't make it 30 years. Of course, I don't believe your CRT made it 30 years without degradation issues. It's just not possible. Maybe you are just a cheap ass. At least man up about it. I want my 1080p tv to last at least ten years. Technology will have long passed it by at that time.
  • bigboxes - Wednesday, July 24, 2013 - link

    Of course, this coming from a man who replaced his bedroom CRT tv after almost 25 years. Even so, the tube was much dimmer before the "green" stopped working. Not to mention the tuner had long given up the ghost. Of course, this tv had migrated from the living room as the primary set to bedroom until it finally gave up the ghost. I miss it, but I'm not going to kid myself into believing that 1988 tech is the same as 2012. It's night and day.
  • cheinonen - Tuesday, July 23, 2013 - link

    I've expanded upon his chart and built a calculator and written up some more about it in other situations, like a desktop LCD here:

    http://referencehometheater.com/2013/commentary/im...

    Basically, your living room TV is the main area that you don't see a benefit from 4K. And I've seen all the 4K demos with actual 4K content in person. I did like at CES this year when companies got creative and arranged their booths so you could sometimes only be 5-6' away from a 4K set, as if most people ever watched it from that distance.
  • psuedonymous - Tuesday, July 23, 2013 - link

    That site sadly perpetuates (by inference) the old myth of 1 arcminute/pixel being the limit of human acuity. This is totally false. Capability of the Human Visual System (http://www.itcexperts.net/library/Capability%20of%... is a report from the AFRL that nicely summarises how we are nowhere even CLOSE to what could actually be called a 'retina' display.
  • patrickjchase - Tuesday, July 23, 2013 - link

    A lot of people seem to confuse cycles/deg with pixels/deg. The commonly accepted value for the practical limit of human visual acuity is 60 cycles/deg, or 1 cycle/min. The paper you posted supports this limit by the way: Section 3.1 states "When tested, the highest sinusoidal grating that can be resolved in normal viewing lies between 50 and 60 cy/deg..."

    To represent a 60 cycle/deg modulation you need an absolute minimum of 120 pixels/deg (Nyquist's sampling theorem). Assuming unassisted viewing and normal vision (not near-sighted) this leads to an overall resolution limit of 500 dpi or so.

    With that said, the limit of acuity is actually not as relevant to our subjective perception of "sharpness" as many believe. There have been several studies arguing that our subjective perception is largely driven by modulations on the order of 20 pairs/degree (i.e. we consider a scene to be "sharp" if it has strong modulations in that range). Grinding through the math again we get an optimal resolution of 200-300 dpi, which is right about where the current crop of retina displays is clustered.
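The conversion described in the comment above is straightforward to check. A small sketch, assuming the Nyquist minimum of 2 pixels per cycle and a ~14" unassisted near viewing distance (my assumption for "unassisted viewing"):

```python
import math

def required_ppi(cycles_per_deg, viewing_distance_in, px_per_cycle=2.0):
    """PPI needed to reproduce a spatial frequency at a viewing distance."""
    px_per_deg = cycles_per_deg * px_per_cycle      # Nyquist: >= 2 px per cycle
    inches_per_deg = viewing_distance_in * math.tan(math.radians(1))
    return px_per_deg / inches_per_deg

print(round(required_ppi(60, 14)))  # ~491: the "500 dpi or so" figure
```

Move the display farther away and the required density drops proportionally, which is why a 140 ppi desktop monitor at arm's length can still look sharp.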
