If you are going to use the ASUS PQ321Q, you’re going to want DisplayPort 1.2 support. HDMI will work, but it will be choppy at its 30 Hz refresh rate. If you have a video card with dual HDMI 1.4 outputs, you can use both of them to drive the display at 60 Hz, provided your video driver supports it. DisplayPort 1.2 adds Multi-Stream Transport (MST), which lets you drive two displays over a single DP cable. Why does that matter if the ASUS is your only monitor? Because to get the full 60 Hz refresh rate out of it, DisplayPort needs to see it as a pair of 1920x2160 monitors that each get their own signal.
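The back-of-envelope bandwidth math makes the problem clear. Here is a rough sketch (uncompressed 24-bit color, ignoring blanking intervals, which push the real requirements higher still):

```python
# Rough uncompressed video bandwidth, ignoring blanking overhead.
def gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"4K @ 30 Hz: {gbps(3840, 2160, 30):.1f} Gbps")  # ~6.0  - fits HDMI 1.4's ~8.2 Gbps effective limit
print(f"4K @ 60 Hz: {gbps(3840, 2160, 60):.1f} Gbps")  # ~11.9 - beyond HDMI 1.4, within DP 1.2's ~17.3 Gbps
print(f"MST tile  : {gbps(1920, 2160, 60):.1f} Gbps")  # ~6.0  - each 1920x2160 @ 60 Hz half of the panel
```

The two tiles together need the same ~12 Gbps as a single 4K stream; the point of MST is that it lets the panel’s electronics treat each half as an ordinary, independently driven monitor.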

The ASUS ships with MST mode disabled by default. With my NVIDIA GTX 660 Ti I had to enable it manually in the monitor’s menu before it would turn on. I’ve been told that with ATI or Intel GPUs over DisplayPort 1.2 this happens automatically, but I don’t have those to test with. Once enabled, the display quickly went from 30 Hz to 60 Hz while staying at 3840x2160.

Since I run multiple displays, like many people, this seemed an ideal time to test Windows 8.1 and its ability to set DPI scaling per monitor. For this test I used the ASUS PQ321Q, connected over DisplayPort, and a Nixeus VUE 30 (review forthcoming) connected over DVI running at 2560x1600. With a single universal setting, Windows 8.1 has you pick a scaling percentage; with individual control, you get a slider more like the one on a Retina MacBook Pro, and the underlying percentage is hidden, which I dislike. I don’t understand why the scaling level is selected differently with two monitors than with one. Perhaps it is a beta issue, but the two methods should be uniform.

Moving beyond that, when I attempted to scale the PQ321Q, the image was still fuzzy instead of sharp. Thankfully a driver update (4K MST panels being new) quickly fixed this issue. Even then, the independent display scaling in Windows 8.1 didn’t work the way I wanted it to. The choices are unclear, including which monitor you are adjusting, and I could never get it set up exactly how I wanted. I wound up setting both displays to 150% and living with my 27” running larger icons than I prefer.

Now I have an effective 2560x1440 desktop (3840x2160 at 150% scaling), only everything is sharp. Amazingly sharp. It is like moving from my iPhone 3G to the iPhone 4 and its Retina screen. The text as I write this in Word is crisp and clear, and editing a gigantic spreadsheet in Excel is much easier when the cells are so easy to read. Unfortunately, not every application in Windows plays well with DPI scaling.

Chrome scales to 150% as Windows asks, but the result is hazy and blurry. Disabling DPI scaling for the application and then zooming to 150% inside Chrome produces crisp, clear text. Firefox also didn’t scale automatically, but it has a preference (layout.css.devPixelsPerPx in about:config, where a value of -1 follows the OS setting) that makes it obey the Windows DPI scaling rules. Once set, Firefox looks very nice and crisp. For most people that preference should already default to following DPI scaling.

Finding a chat client that works well is a challenge: neither Pidgin nor Trillian does DPI scaling, so both are fuzzy by default. Another app with issues is Steam; right-clicking its System Tray icon brought up the menu in the middle of the screen, where it would sit without DPI scaling. The reality is that some apps support DPI scaling well and some need work, just as when the Retina MacBook Pro was released. Evernote looks great, but Acrobat is a fuzzy mess. These are growing pains, but I find myself disabling DPI scaling on applications that don’t support it, because I prefer tiny and sharp to fuzzy and large.
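For the curious, the underlying mechanism is that an application has to tell Windows it is DPI-aware; if it doesn’t, Windows renders it at 96 DPI and bitmap-stretches the result, which is exactly the fuzziness described above. A minimal sketch of the opt-in, using Python’s ctypes purely for illustration:

```python
# Minimal sketch: opting a process in to DPI awareness on Windows 8.1+.
# SetProcessDpiAwareness lives in shcore.dll (new in Windows 8.1); the
# constants come from the PROCESS_DPI_AWARENESS enumeration.
import ctypes

PROCESS_DPI_UNAWARE = 0            # Windows bitmap-stretches the app (the fuzzy look)
PROCESS_SYSTEM_DPI_AWARE = 1       # app handles one system-wide DPI value
PROCESS_PER_MONITOR_DPI_AWARE = 2  # app re-renders as it moves between monitors

# Must be called early, before any windows are created.
ctypes.windll.shcore.SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE)
```

Per-monitor-aware applications are also expected to handle the new WM_DPICHANGED message and redraw themselves when dragged between screens; that extra work is a big part of why so many apps still get it wrong.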

Because 2560x1440 is the effective resolution I’m used to on my usual 27” monitor, I found no real difference in how I used the ASUS. I typically split items to different sides of the screen, with Word on the right and Evernote on the left as I type this. The area that benefitted most for me was image editing. Being able to fit more on the screen, or zoom in further, made working with images on the ASUS better than on a 27” panel of the same effective resolution. I don’t do much image editing, but for the work I have done it has been wonderful.

You’ll also quickly find out how much work remains to fix up programs and websites to use images and text separately. Text baked into an image scales very poorly, but doing that is often easier than properly laying out two separate elements. I feel a bit bad for all the developers who need to go back and make everything work with high-DPI screens, but that time has come.

The only way to sum up daily use of the ASUS PQ321Q is “awesome”. It’s not perfect, but much of that is the fault of Windows or other programs and websites. When you have something that can scale and look correct, it is amazing how much the extra pixel density and sharpness helps. Yes, this is the future for displays, and we are entering the transition period to get there.

Comments

  • ninjaburger - Tuesday, July 23, 2013

    I feel like this is an easy position to take with very few 4k TVs in the wild, very little content delivered in 4k, and, maybe most importantly, *even less* content being finished at 4k (as opposed to upscaled 2.5k or 3k).

    When you see native 4k content distributed at 4k on a good 4k display, you can see the difference at normal viewing distances.

    I don't think it's worth adopting until 2015 or 2016, but it will make a difference.
  • Hrel - Tuesday, July 23, 2013

    I see no reason to upgrade until 1080p becomes 10800p. Kept the CRT for 30 years, inherited the damn thing. That's how long I intend to keep my 1080p TV, whether TV makers like it or not.
  • Sivar - Tuesday, July 23, 2013

    30 years? I hope you don't have a Samsung TV.
  • althaz - Tuesday, July 23, 2013

    It doesn't matter what brand the TV is, good TVs last up to about 7 years. Cheaper TVs last even less time.
  • DanNeely - Tuesday, July 23, 2013

    My parents' no-name 19" CRT TV lasted from the early '80s to ~2000; the no-name ~30" CRT TV they replaced it with was still working fine ~3 years ago when they got a used ~35-40" 720p LCD for free from someone else. I'm not quite sure how old that TV is; IIRC it was from shortly after prices in that size dropped enough to make them mass market.

    Maybe you just abuse your idiotboxes.
  • bigboxes - Wednesday, July 24, 2013

    You must be trolling. My top-of-the-line Mitsubishi CRT started having issues in 2006, in year seven. I replaced it with an NEC LCD panel that I'm still using today. It could go at any time and I'd update to the latest technology. I'm picky about image quality and couldn't care less about screen thinness, but there are always options if you are looking for quality. I'm sure your 1080p TV won't make it 30 years. Of course, I don't believe your CRT made it 30 years without degradation issues. It's just not possible. Maybe you are just a cheap ass. At least man up about it. I want my 1080p TV to last at least ten years. Technology will have long passed it by at that time.
  • bigboxes - Wednesday, July 24, 2013

    Of course, this is coming from a man who replaced his bedroom CRT TV after almost 25 years. Even so, the tube was much dimmer before the "green" stopped working, not to mention the tuner had long given up the ghost. This TV had migrated from the living room, where it was the primary set, to the bedroom, until it finally died. I miss it, but I'm not going to kid myself into believing that 1988 tech is the same as 2012. It's night and day.
  • cheinonen - Tuesday, July 23, 2013

    I've expanded upon his chart and built a calculator and written up some more about it in other situations, like a desktop LCD here:

    http://referencehometheater.com/2013/commentary/im...

    Basically, your living room TV is the main place you won't see a benefit from 4K. And I've seen all the 4K demos with actual 4K content in person. I did like how at CES this year companies got creative and arranged their booths so you could sometimes only be 5-6' away from a 4K set, as if most people ever watched from that distance.
  • psuedonymous - Tuesday, July 23, 2013

    That site sadly perpetuates (by inference) the old myth of 1 arcminute/pixel being the limit of human acuity. This is totally false. "Capability of the Human Visual System" (http://www.itcexperts.net/library/Capability%20of%...) is a report from the AFRL that nicely summarises how we are nowhere even CLOSE to what could actually be called a 'retina' display.
  • patrickjchase - Tuesday, July 23, 2013

    A lot of people seem to confuse cycles/deg with pixels/deg. The commonly accepted value for the practical limit of human visual acuity is 60 cycles/deg, or 1 cycle/min. The paper you posted supports this limit by the way: Section 3.1 states "When tested, the highest sinusoidal grating that can be resolved in normal viewing lies between 50 and 60 cy/deg..."

    To represent a 60 cycle/deg modulation you need an absolute minimum of 120 pixels/deg (Nyquist's sampling theorem). Assuming unassisted viewing and normal vision (not near-sighted) this leads to an overall resolution limit of 500 dpi or so.

    With that said, the limit of acuity is actually not as relevant to our subjective perception of "sharpness" as many believe. There have been several studies arguing that our subjective perception is largely driven by modulations on the order of 20 pairs/degree (i.e. we consider a scene to be "sharp" if it has strong modulations in that range). Grinding through the math again we get an optimal resolution of 200-300 dpi, which is right about where the current crop of retina displays is clustered.
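The arithmetic in that last comment is easy to sanity-check. A quick sketch (the viewing distances are my assumptions: roughly close-focus unassisted viewing, and a typical desktop distance):

```python
import math

def dpi_limit(pixels_per_deg, distance_inches):
    # Width of one degree of visual angle at this viewing distance, in inches.
    one_degree = 2 * distance_inches * math.tan(math.radians(0.5))
    return pixels_per_deg / one_degree

# 60 cycles/deg acuity limit -> 120 px/deg minimum (Nyquist):
print(dpi_limit(120, 14))  # ~491 dpi at ~14" close focus: the "500 dpi or so" above
print(dpi_limit(120, 24))  # ~286 dpi at a 24" desktop distance: retina-display territory
```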
