For our pre-calibration measurements we target 200 cd/m2 of light output, the sRGB gamut, and a gamma of 2.2. The VUE 30 offers color temperature presets for the grayscale, and the warm setting was found to produce the most accurate image.

I’m also going to approach this review differently than before. The charts for all of these measurements will be available in individual galleries, and there is a table at the top of the page summarizing the pre- and post-calibration measurements so you can see at a glance how well the monitor does before and after calibration. This should make the results easier to read and allow me to focus commentary on the areas of the monitor's performance that need it.

                         Pre-Calibration    Post-Calibration,    Post-Calibration,
                                            200 cd/m2            80 cd/m2
White Level (cd/m2)      201.78             195.562              77.6183
Black Level (cd/m2)      0.3214             0.3197               0.1388
Contrast Ratio           628:1              612:1                559:1
Gamma (Average)          2.2552             2.2406               2.5132
Color Temperature        6657K              6593K                6452K
Grayscale dE2000         4.0657             0.7705               1.3304
Color Checker dE2000     5.7431             4.0627               4.3305
Saturations dE2000       4.6853             3.7814               4.1323
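
For reference, the contrast ratio rows above are simply the white level divided by the black level. A quick Python sketch reproducing those figures from the table (the labels are my own):

    # Contrast ratio is white luminance divided by black luminance.
    # The values come straight from the table above (cd/m2).
    measurements = {
        "Pre-Calibration": (201.78, 0.3214),
        "Post-Calibration, 200 cd/m2": (195.562, 0.3197),
        "Post-Calibration, 80 cd/m2": (77.6183, 0.1388),
    }

    for label, (white, black) in measurements.items():
        print(f"{label}: {white / black:.0f}:1")
    # Pre-Calibration: 628:1
    # Post-Calibration, 200 cd/m2: 612:1
    # Post-Calibration, 80 cd/m2: 559:1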

The major improvement we see is in the grayscale and gamma. With our 200 cd/m2 target calibration, both come out nearly perfect. There is a small gamma spike at 95%, but nothing really objectionable, and the overall grayscale dE2000 is low enough to be invisible in practice. When targeting 80 cd/m2 and the sRGB gamma curve, the Nixeus doesn’t perform quite as well: the gamma has a little more variation and the dE2000 is somewhat higher, though still very low. The loss of contrast ratio is the larger issue here.
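
For anyone unsure how the two targets differ: a pure 2.2 power curve and the sRGB tone curve are close but not identical, since sRGB uses a short linear segment near black before a 2.4 power section. A minimal sketch of the two curves (my own illustration, not part of the review's measurement workflow):

    # Compare a pure 2.2 power curve with the piecewise sRGB EOTF.
    # Input is a normalized signal level in [0, 1]; output is relative luminance.

    def gamma_22(v):
        return v ** 2.2

    def srgb_eotf(v):
        # sRGB has a linear toe below ~0.04045, then a 2.4 power segment.
        if v <= 0.04045:
            return v / 12.92
        return ((v + 0.055) / 1.055) ** 2.4

    for level in (0.02, 0.10, 0.25, 0.50, 0.75, 1.00):
        print(f"{level:.2f}  gamma 2.2: {gamma_22(level):.4f}  sRGB: {srgb_eotf(level):.4f}")

The difference is largest in the shadows, which is also where grayscale calibration errors tend to be most visible.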

Both calibrated grayscale results greatly improve upon the original, which is slightly warm and shows a very large error as you approach peak white. The problem with the Nixeus VUE 30 lies in color reproduction. The errors for both the 96-point color checker and the saturations measurements improve, but not by a huge degree, and most of that improvement can be tied back to the better grayscale, since grayscale patches make up a large part of these later tests. The default 6-point gamut chart is dropped here, as the saturations chart covers the same ground and the gamut chart's dE2000 average is too heavily influenced by the grayscale data.

What we see is a wildly oversaturated gamut where green, cyan, red, yellow, and magenta all fall far outside of the sRGB gamut boundary. With green, even the 60% saturation value is outside the sRGB gamut, which leads to very oversaturated colors. Even post-calibration, green dE2000 errors are above 5 from 40% saturation onward and approach 10 at 100%. Aside from a few select colors in the Color Checker pattern, and the grayscale, almost all the colors show a large, visible error.
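
To make "outside the sRGB gamut boundary" concrete: in CIE xy chromaticity space, sRGB is the triangle spanned by its red, green, and blue primaries, and a measured color is out of gamut when its chromaticity falls outside that triangle. A minimal Python sketch of that test (the first sample point is a made-up illustration, not one of the review's measurements):

    # sRGB primaries in CIE 1931 xy chromaticity coordinates.
    SRGB_TRIANGLE = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]  # R, G, B

    def inside_srgb(x, y, tri=SRGB_TRIANGLE):
        """Return True if chromaticity (x, y) lies inside the sRGB triangle."""
        def cross(o, a, b):
            return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
        signs = [cross(tri[i], tri[(i + 1) % 3], (x, y)) for i in range(3)]
        return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

    # Hypothetical green patch pushed past the sRGB green primary:
    print(inside_srgb(0.26, 0.65))      # False -- out of gamut
    print(inside_srgb(0.3127, 0.3290))  # True  -- D65 white point, inside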

The Nixeus lacks an internal LUT to fix this, and only so much can be done through the video card. A large gamut is nice, but just like with an OLED smartphone, we don’t want that gamut to be wildly oversaturated, pushing colors far outside their boundaries. For any sort of color-critical work, or even just browsing photographs, the wild gamut of the VUE 30 makes it a poor choice for anyone after accurate colors. If you like a big, punchy image, you’ll probably like it.
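
To illustrate why the video card can only do so much: the correction the GPU applies is essentially a per-channel 1D LUT (a gamma ramp), which can reshape each channel's tone response but cannot mix channels, and mixing channels is exactly what is needed to pull an oversaturated primary back toward sRGB. A rough sketch of the difference (the matrix values are made up for illustration; a real gamut correction would be derived from the measured primaries and applied in linear light):

    import numpy as np

    # A per-channel 1D LUT (what a video card gamma ramp gives you):
    # each output channel depends only on the same input channel.
    def apply_1d_lut(rgb, lut):
        return np.array([lut(c) for c in rgb])

    # A 3x3 matrix (the kind of correction an internal LUT/CMS could apply):
    # output channels are mixtures of the input channels, which is what
    # actually desaturates a too-wide primary.
    def apply_matrix(rgb, m):
        return m @ rgb

    pure_green = np.array([0.0, 1.0, 0.0])

    # No per-channel curve can turn pure green into anything but pure green:
    print(apply_1d_lut(pure_green, lambda c: c ** 1.1))  # still [0, 1, 0]

    # A made-up desaturation matrix mixes some red and blue back in:
    desat = np.array([[0.90, 0.08, 0.02],
                      [0.05, 0.92, 0.03],
                      [0.02, 0.06, 0.92]])
    print(apply_matrix(pure_green, desat))  # [0.08, 0.92, 0.06] -- less saturated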

Since we can’t control this gamut, perhaps using AdobeRGB as a target will lead to a better result? I decided to give it a try to see whether that improves things at all, or whether the oversaturation is still an issue.

Comments

  • blackoctagon - Tuesday, August 20, 2013 - link

    Glad to hear you're so insensitive to input lag. However, what you experience is by no means the cream of the crop. One can maintain the pleasant colours of IPS and still have good motion clarity by getting one of the overclockable 27-inch 1440p screens. Their input lag is much less, and further mitigated by the (approx.) 120Hz refresh rate. Orders of magnitude better for FPS gaming than what a 30-inch IPS screen can deliver.
  • DanNeely - Tuesday, August 20, 2013 - link

    The 2408 was infamously bad. Unlike previous laggy Dell panels that only bothered some gamers, the 2408 was slow enough that it annoyed a lot of people who were just working at the desktop. While continuing to insist nothing was wrong and it was working as designed, Dell/etc pulled back (and eventually started listing lag on their spec sheets) and the display industry generally settled on nothing slower than ~2 frames (32ms), which is good enough that no one other than some action gamers complains. I occasionally notice what might be the 30ms on my NEC 3090 when playing POE (an aRPG), but it's intermittent enough I'm not sure if it's actually panel lag or just me hitting the limits of my reaction time.
  • ZeDestructor - Tuesday, August 20, 2013 - link

    >overclockable 27-inch 1440p screens

    RPS or bit-tech (can't remember which) tested that when the Titan came out. They only achieved ~72Hz before the panel itself just started dropping frames because it couldn't keep up.

    Besides, as I said up there, image processing and DP->LVDS conversion take time: constant time, but time nonetheless. If you had a TN panel at 2560x1600@60Hz, you'd see at least 12ms of processing lag + some more for the panel itself. If you can rip out the on-board processing entirely, you're reducing the lag quite a bit, which is exactly what game modes do: pipe the signal straight to LVDS conversion with no post-processing. On the U2410, that drops the latency from ~30ms to ~14ms.

    In any case, you missed the point of my comment, where I mentioned it being in the same range as most other wide-gamut, professional-use panels and perfectly fine for single-player gaming, where you can learn to compensate for it. Hell, my LoL-playing friends used to pull off skillshots by timing it just right with a 300ms ping time to US servers. If you think 30ms is bad...
  • blackoctagon - Tuesday, August 27, 2013 - link

    I would like to see this "RPS or bit-tech" review if you can find it. There are plenty of 2560x1440 monitors out there that overclock SLIGHTLY, but VERY few that support refresh rates up to approx. 120Hz. Unless the reviewers looked at one of the latter monitors (which would surprise me) then I'm not surprised that they started seeing dropped frames.
  • davsp - Tuesday, August 20, 2013 - link

    Viewable Size = 20" I'm guessing typo on spec sheet. :)
  • ingwe - Tuesday, August 20, 2013 - link

    Nah, didn't you see the HUGE bezel?
  • ZeDestructor - Tuesday, August 20, 2013 - link

    > Note also that lag might be lower running at the native 2560x1600, but I can't directly compare that with most other displays as they lack support for that resolution.

    Please don't do that. People who buy/want these big, 30" 16:10 panels are paying the hefty premium for the full resolution, not to run something lower through the scaler. As such, I (and others, probably) would appreciate native resolution response times rather than scaled. 2560x1600 is uncommon because of the hefty price (1.5k per screen so far!), not because we don't want 2560x1600.
  • JarredWalton - Tuesday, August 20, 2013 - link

    I believe Chris is using a Leo Bodnar device now (http://bit.ly/WXV7Vv), where formerly he used a CRT as a reference display sending the same content to both. To do native 2560x1600 lag tests, you'd need a device (CRT or Leo Bodnar or similar) that supports WQXGA...which doesn't exist. Chris can correct me if I'm wrong, though.
  • saratoga3 - Tuesday, August 20, 2013 - link

    Apparently that device can't do > 1080p. Unfortunately this means using the scaler, which I think is a really bad idea. Resizing a 4 MP image can easily take an entire frame's worth of latency. It's entirely possible that the actual input lag at native resolution is much lower.
  • mdrejhon - Wednesday, August 21, 2013 - link

    The Blur Busters Input Lag Tester supports 4K, 120Hz, WQXGA, and any other resolution. It will be released before the end of the year. There is also a second input lag tester (SMTT style) that I have invented, which is undergoing tests. Stay tuned to the Blur Busters website.
