Given the lofty price tag, there is a good chance the ASUS PQ321Q is targeting graphics and print professionals, so meeting the sRGB standard's 80 cd/m2 light output target and its custom gamma curve will be important.
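
As a point of reference, the sRGB "gamma" is not a plain 2.2 power curve but a piecewise function with a short linear segment near black, scaled to the standard's 80 cd/m2 reference white. The short Python sketch below is purely illustrative (it is not part of the review's CalMAN workflow) and just shows that transfer function.

# Minimal sketch of the sRGB transfer function (EOTF): a linear toe plus a
# 2.4-power segment, which together approximate an overall 2.2 gamma.
SRGB_WHITE_CDM2 = 80.0  # sRGB reference display luminance

def srgb_to_luminance(v):
    """Map an encoded sRGB value in [0, 1] to luminance in cd/m2."""
    if v <= 0.04045:
        linear = v / 12.92
    else:
        linear = ((v + 0.055) / 1.055) ** 2.4
    return SRGB_WHITE_CDM2 * linear

# Example: a 50% stimulus lands at roughly 17 cd/m2 on an 80 cd/m2 display.
print(round(srgb_to_luminance(0.5), 1))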

Looking at the grayscale first, the sRGB calibration is just as good as our 200 cd/m2 target was. The gamma is virtually perfect, and there is no color shift at all. The contrast ratio falls to 667:1, which I expected, as the lower light output leaves less room for adjustments. Graded on grayscale and gamma alone, the PQ321Q would be perfect.
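
For readers who want to see how those grayscale numbers relate to the raw readings: the review uses CalMAN, but the effective gamma at each step and the contrast ratio fall out of the measured luminances directly. The readings in the sketch below are hypothetical placeholders, not the review's data; only the black level is back-calculated from the 667:1 figure quoted above.

# Sketch: effective gamma per grayscale step and contrast ratio from
# measured luminance. The readings are hypothetical, not CalMAN data;
# the black level is chosen so that 80 / 0.12 ~= 667:1 as in the text.
import math

black_cdm2 = 0.12
white_cdm2 = 80.0
readings = {0.25: 3.9, 0.50: 17.2, 0.75: 42.6}  # stimulus -> cd/m2

for stimulus, y in readings.items():
    # Simple log-ratio gamma: the exponent mapping stimulus to relative luminance.
    gamma = math.log(y / white_cdm2) / math.log(stimulus)
    print(f"{int(stimulus * 100)}% stimulus: effective gamma {gamma:.2f}")

print(f"contrast ratio: {white_cdm2 / black_cdm2:.0f}:1")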

As soon as we get to the gamut, we see the same issues I expected. The gamut is just a little off, which gives us noticeable dE2000 errors at 100% saturation for all colors.

With the color checker charts, we again see a large difference between the Gretag Macbeth results and the 96-sample results. The error rises from 1.62 to 2.05 as the larger set samples more orange/yellow shades that fall outside of the gamut. Nothing here is really different from the last calibration, so the same issues apply.
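
The chart averages themselves come from CalMAN; purely as an illustration of how a color checker average is built up, here is how a per-patch dE2000 and a mean could be computed in Python with the open-source colour-science package. The Lab triplets are made-up placeholders, not measurements from this display.

# Sketch: mean dE2000 over a small set of color checker patches using the
# colour-science package (pip install colour-science). Lab values are
# hypothetical placeholders, not measurements from the PQ321Q.
import numpy as np
import colour

# (reference Lab, measured Lab) pairs
patches = [
    (np.array([50.0, 60.0, 30.0]), np.array([51.2, 57.5, 28.9])),    # red-ish
    (np.array([70.0, -50.0, 40.0]), np.array([69.5, -48.0, 41.3])),  # green-ish
    (np.array([80.0, 10.0, 75.0]), np.array([78.8, 12.4, 68.0])),    # orange/yellow near the gamut edge
]

errors = [float(colour.delta_E(ref, meas, method="CIE 2000")) for ref, meas in patches]
print("per-patch dE2000:", [round(e, 2) for e in errors])
print("average dE2000:", round(sum(errors) / len(errors), 2))

Adding more patches that sit near or beyond the native gamut, as the 96-sample chart does, pulls the average up; that is exactly the 1.62 to 2.05 jump described above.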

The saturation sweeps are also virtually identical. They start out with small errors, but by the end every color except cyan shows a noticeable error at 100%.

For 200 cd/m2 and a gamma of 2.2, or for 80 cd/m2 and the sRGB gamma, the ASUS PQ321Q performs almost equally well. The grayscale and gamma are perfect, but the gamut has some issues. Once we start to see more displays that use this same panel but different electronics, and possibly different backlights, we can determine what is causing this shift in the gamut. With the initial target for the ASUS likely being professional designers, these errors seem a bit out of place.

166 Comments

  • psuedonymous - Wednesday, July 24, 2013 - link

    The paper also mentions that cycles/degree is only ONE of the ways the eyes perceive 'detail'. When it comes to line misalignment (i.e. aliasing), we can see right down to the arcsecond level. If you want a display that does not exhibit edge aliasing, you're looking at several tens of thousands of DPI.
  • twtech - Tuesday, July 23, 2013 - link

    Even if you can't see the individual pixels, you'll still notice a difference in the clarity of the display.
  • EnzoFX - Tuesday, July 23, 2013 - link

    I cannot believe people are saying 4K is a waste on TVs; this is asinine. 1080p on a large TV is terrible; the pixels are clearly visible.
  • 1Angelreloaded - Wednesday, July 24, 2013 - link

    Well, let's be honest, it's only useful to us if the PPI is high enough to throw AA out the window, or at least down to 2x of any iteration. I can see some uses in productivity or workstation applications. As for the TV market, they aren't even fully at a standard 1080p in content, and they invested a lot into upgrading content as Hollywood started moving to higher-resolution cameras, so I don't see the industry jumping on a bandwagon to keep upgrading.
  • SodaAnt - Tuesday, July 23, 2013 - link

    720p is about as good as you need if you have a 50" TV and you sit 10 feet away from it. If you have a 30" display that you sit 18 inches from, it makes a huge difference.
  • smartthanyou - Tuesday, July 23, 2013 - link

    No person has ever made such a blanket statement. It has always been in the context of what was being viewed and the distance to the display.

    In the future, consider your posts more carefully before you put in writing that you are an idiot.
  • NCM - Tuesday, July 23, 2013 - link

    So evidently you didn't make it even to the end of the article's first paragraph?
  • CalaverasGrande - Thursday, December 26, 2013 - link

    I suppose since I work in broadcast I am special, but the differences between 4K, HD, and 720p are all apparent when you have a decently sharp display, even from several feet away.
  • karasaj - Tuesday, July 23, 2013 - link

    I just had an argument with my friend over why laptops around 15" are getting 3200x1800 displays while we still have < 100 PPI on desktop displays.
    We both agreed that it would be nice to have high-DPI desktop monitors, but I insisted that they're too expensive and more niche than laptops and tablets. It's crazy to see the first 4K monitor ever get such a nice reward. What do you think is keeping the cost from coming down?
  • bryanlarsen - Tuesday, July 23, 2013 - link

    Displays, like ICs, get exponentially more expensive as the size increases, especially for newer technologies. It's mostly due to the defect rate. A 30" screen is 4 times as large as a 15" one, but it's way more than 4x as expensive. Suppose there's a single fatal defect: the 30" screen would have to be discarded, but 3/4 of the 15" panels would be fine.
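
Picking up on bryanlarsen's yield argument: with a uniform density of fatal defects, the probability that a panel comes out clean falls off roughly exponentially with its area (a standard Poisson yield model). Below is a small illustrative sketch with an entirely made-up defect density.

# Sketch of a Poisson yield model: the chance a panel has zero fatal
# defects drops exponentially with area, so a panel with ~4x the area has
# a much lower yield and costs well over 4x as much per good unit.
import math

DEFECTS_PER_SQIN = 0.001  # hypothetical fatal-defect density

def yield_fraction(width_in, height_in):
    area = width_in * height_in
    return math.exp(-DEFECTS_PER_SQIN * area)

small = yield_fraction(13.1, 7.4)    # roughly a 15"-class 16:9 panel
large = yield_fraction(26.2, 14.7)   # roughly a 30"-class panel, ~4x the area

print(f"15\"-class yield: {small:.1%}")   # ~90.8%
print(f"30\"-class yield: {large:.1%}")   # ~68.0%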
