AdobeRGB has a much larger gamut than sRGB. Even if we can't control the gamut on the Nixeus VUE 30, moving to a larger gamut target should result in smaller errors overall. If this improves things, it might work well for those doing color work, since they may want the larger gamut anyway. For normal use like gaming or web browsing, very few applications use AdobeRGB, so those won't see any improvement.

                         Post-Calibration, 200 cd/m2   Post-Calibration, 80 cd/m2
White Level (cd/m2)      199.7718                      81.959
Black Level (cd/m2)      0.3455                        0.1473
Contrast Ratio           578:1                         566:1
Gamma (Average)          2.1975                        2.351
Color Temperature        (missing)                     6521K
Grayscale dE2000         0.8217                        0.8328
Color Checker dE2000     1.3821                        1.5443
Saturations dE2000       1.5282                        1.6211
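For reference, the contrast ratios in the table come straight from the luminance measurements; a minimal sketch using the 200 cd/m2 column:

```python
# Contrast ratio is the white luminance divided by the black luminance,
# rounded to a whole number. Measured values from the table above:
white, black = 199.7718, 0.3455  # cd/m2

print(f"{white / black:.0f}:1")  # -> 578:1
```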

Besides the gamut, I left every target the same as with our sRGB calibration. As we can see, the color results are far better than before. The grayscale performance at the 80 cd/m2 target has also improved considerably. That shouldn't have been affected by the gamut change, but it could simply be a better calibration run, as the software sometimes does better than other times. The visible difference between an average grayscale dE2000 of 1.33 and 0.83 is pretty minimal and hardly noticeable in real life.

The big change is the colors. While Red still falls outside of the AdobeRGB gamut, Green, Cyan and Yellow all line up nearly perfectly now. Magenta is still affected by the Red, but even those two colors are much closer to accurate than before. A quick look at the saturations table shows that the dE2000 stays below 3, or the visible error level, for every color except for highly saturated Red and Magenta. The 96-point Color Checker chart shows the same results, with those highly saturated red shades providing the only errors that really fall into the unacceptable realm.
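The dE2000 formula itself is fairly involved, but the intuition behind these error numbers can be sketched with its much simpler predecessor, CIE76, which is just Euclidean distance in L*a*b* space. The Lab values below are made up for illustration, not measurements from this review:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space.

    dE2000 (used in the review) adds perceptual weighting on top of this
    basic idea, but the reading is similar: values near 1 are essentially
    imperceptible, while ~3 is the commonly cited visible-error threshold.
    """
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

# Hypothetical example: a slightly oversaturated red vs. its target.
target = (54.0, 80.0, 67.0)
measured = (54.5, 82.5, 68.0)
print(round(delta_e_76(target, measured), 2))  # -> 2.74, just under visible
```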

One key chart, which I'll pull out here separately from the gallery, is the Delta Color Error on the Color Checker chart. As you can see, the Red shades are highly affected by an over-abundance of color here. If I were to pull out the other charts that break down the individual color errors, Delta Luminance and Delta Hue, you would see that those errors are virtually non-existent. The issue is that red has too much saturation; the light level and the tint on it are correct.

Moving to the AdobeRGB target really improved the performance of the Nixeus VUE 30, but that isn't without a caveat or two. Most people don't use AdobeRGB, and most applications don't support the larger gamut. In those applications you are still going to see overly saturated colors on a regular basis, and this calibration won't correct them. For people who can work in AdobeRGB, however, color accuracy is likely more important than it is for those who can't.

If you are only gaming or doing general office productivity on this display, you might not care about the over-saturated gamut. If you are going to be doing photo work, you certainly will, and this AdobeRGB target might solve your issues. If you want accurate colors on the Nixeus, this is really the only way to get there, and you'll likely know whether it will work for you.


  • blackoctagon - Tuesday, August 20, 2013 - link

    Glad to hear you're so insensitive to input lag. However, what you experience is by no means the cream of the crop. One can maintain the pleasant colours of IPS and still have good motion clarity by getting one of the overclockable 27-inch 1440p screens. Their input lag is much less, and further mitigated by the (approx.) 120Hz refresh rate. Orders of magnitude better for FPS gaming than what a 30-inch IPS screen can deliver.
  • DanNeely - Tuesday, August 20, 2013 - link

    The 2408 was infamously bad. Unlike previous laggy Dell panels that only bothered some gamers, the 2408 was slow enough that it annoyed a lot of people who were just working at the desktop. While continuing to insist nothing was wrong and it was working as designed, Dell et al. pulled back (and eventually started listing lag on their spec sheets), and the display industry generally settled on nothing slower than ~2 frames (32ms), which is good enough that no one other than some action gamers complains. I occasionally notice what might be the 30ms on my NEC 3090 when playing PoE (an aRPG), but it's intermittent enough that I'm not sure if it's actually panel lag or just me hitting the limits of my reaction time.
  • ZeDestructor - Tuesday, August 20, 2013 - link

    >overclockable 27-inch 1440p screens

    RPS or bit-tech (can't remember which) tested that when the Titan came out. They only achieved ~72Hz before the panel itself just started dropping frames because it couldn't keep up.

    Besides, as I said up there, image processing and DP->LVDS conversion takes time. Constant time, but time nonetheless. If you had a TN panel at 2560x1600@60Hz, you'd see at least 12ms of processing lag plus some more for the panel itself. If you can rip out the on-board processing entirely, you're reducing the lag quite a bit, which is exactly what game modes do: pipe the signal straight to LVDS conversion with no post-processing. On the U2410, that drops the latency from ~30ms to ~14ms.
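    The frame-time arithmetic behind these numbers is simple enough to sketch; the only assumption below is the 60Hz refresh rate:

```python
# One frame at 60Hz lasts 1000/60 ≈ 16.7 ms, so ~30ms of processing lag
# is close to two full frames, while ~14ms (game mode) is under one.
def frames_of_lag(lag_ms, refresh_hz):
    frame_ms = 1000.0 / refresh_hz
    return lag_ms / frame_ms

print(round(frames_of_lag(30, 60), 2))  # -> 1.8 frames
print(round(frames_of_lag(14, 60), 2))  # -> 0.84 frames
```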

    In any case, you missed the point of my comment, where I mentioned it being in the same range as most other wide-gamut, professional-use panels and perfectly fine for single-player gaming, where you can learn to compensate for it. Hell, my LoL-playing friends used to pull off skillshots by timing them just right with a 300ms ping to US servers. If you think 30ms is bad...
  • blackoctagon - Tuesday, August 27, 2013 - link

    I would like to see this "RPS or bit-tech" review if you can find it. There are plenty of 2560x1440 monitors out there that overclock SLIGHTLY, but VERY few that support refresh rates up to approx. 120Hz. Unless the reviewers looked at one of the latter monitors (which would surprise me) then I'm not surprised that they started seeing dropped frames.
  • davsp - Tuesday, August 20, 2013 - link

    Viewable Size = 20" I'm guessing typo on spec sheet. :)
  • ingwe - Tuesday, August 20, 2013 - link

    Nah, didn't you see the HUGE bezel?
  • ZeDestructor - Tuesday, August 20, 2013 - link

    > Note also that lag might be lower running at the native 2560x1600, but I can't directly compare that with most other displays as they lack support for that resolution.

    Please don't do that. People who buy/want these big, 30" 16:10 panels are paying the hefty premium for the full resolution, not to run something lower through the scaler. As such, I (and others, probably) would appreciate native resolution response times rather than scaled. 2560x1600 is uncommon because of the hefty price (1.5k per screen so far!), not because we don't want 2560x1600.
  • JarredWalton - Tuesday, August 20, 2013 - link

    I believe Chris is using a Leo Bodnar device now (http://bit.ly/WXV7Vv), where formerly he used a CRT as a reference display sending the same content to both. To do native 2560x1600 lag tests, you'd need a device (CRT or Leo Bodnar or similar) that supports WQXGA...which doesn't exist. Chris can correct me if I'm wrong, though.
  • saratoga3 - Tuesday, August 20, 2013 - link

    Apparently that device can't do > 1080p. Unfortunately this means using the scaler, which I think is a really bad idea. Resizing a 4 MP image can easily take an entire frame worth of latency. It's entirely possible that the actual input lag at native resolution is much lower.
  • mdrejhon - Wednesday, August 21, 2013 - link

    The Blur Busters Input Lag Tester supports 4K, 120Hz, WQXGA, and any other resolution. It will be released before the end of the year. There is also a second input lag tester (SMTT style) that I have invented, which is undergoing tests. Keep tuned at the Blur Busters website.
