All calibration measurements are done using SpectraCal's CalMAN 5.1.2 software with a custom workflow. Measurements are made with a C6 colorimeter that is first profiled against an i1Pro spectrometer to ensure accurate results. We use two sets of targets. Pre-calibration and our first calibration aim for 200 cd/m^2 with the sRGB gamut and a gamma of 2.2, a common real-world setting for a display. The final target lowers the light level to 80 cd/m^2 and changes the gamma curve to the tougher sRGB standard.
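To illustrate the difference between the two gamma targets, here is a small sketch (my own, not part of the review's workflow; the function names are mine) comparing the sRGB transfer function with a plain 2.2 power curve:

```python
# A quick sketch of why the sRGB curve is a tougher target than a plain 2.2
# power curve: sRGB has a linear segment near black instead of following
# the power law all the way down.

def srgb_eotf(v: float) -> float:
    """sRGB electro-optical transfer function: signal (0..1) -> linear light."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def power_gamma(v: float, gamma: float = 2.2) -> float:
    """Plain power-law curve, the 2.2 target of the first calibration."""
    return v ** gamma

# The two curves agree closely in the midtones but diverge near black:
for level in (0.02, 0.10, 0.50, 1.00):
    print(f"signal {level:.2f}: sRGB {srgb_eotf(level):.5f}, gamma 2.2 {power_gamma(level):.5f}")
```

Near black the sRGB curve sits noticeably above the pure power curve, which is why hitting it accurately in the shadows is the harder calibration target.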

 

                        Pre-Calibration   Post-Calibration,   Post-Calibration,
                                          200 cd/m^2          80 cd/m^2
White Level (cd/m^2)    201.4             199.1               79.8
Black Level (cd/m^2)    0.2477            0.2502              0.103
Contrast Ratio          813:1             796:1               775:1
Gamma (Average)         2.26              2.22                2.41
Color Temperature       6434K             6508K               6553K
Grayscale dE2000        2.97              0.64                0.63
Color Checker dE2000    1.69              0.52                0.50
Saturations dE2000      1.91              0.41                0.45
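As a quick check of the table above, the contrast ratio is simply the white level divided by the black level. This short snippet (mine, not part of the review) reproduces the measured figures:

```python
# Contrast ratio = white level / black level, using the (white, black)
# measurements in cd/m^2 from the table above.
measurements = {
    "Pre-Calibration": (201.4, 0.2477),
    "Post-Calibration, 200 cd/m^2": (199.1, 0.2502),
    "Post-Calibration, 80 cd/m^2": (79.8, 0.103),
}

for name, (white, black) in measurements.items():
    print(f"{name}: {round(white / black)}:1")
```

Note how the 80 cd/m^2 target keeps roughly the same contrast ratio: the black level drops nearly in proportion to the white level.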

Pre-calibration measurements are made using the sRGB preset, as most people will likely select it. Set for 200 cd/m^2 of output, there is a red tint to the grayscale. The gamma is only off at 5%, but the color tint pushes the grayscale dE2000 into the visible range. Color errors are much better, and aside from a few samples the result is very acceptable. If the grayscale didn't have this tint, the pre-calibration numbers would be great.

Post-calibration this grayscale issue goes away and the colors improve as well. For both the 200 cd/m^2 and the 80 cd/m^2 calibrations the numbers improve by a large amount and the image is virtually perfect. There really isn't anything to complain about: if you can calibrate the UP3214Q, you will wind up with a virtually perfect image.

84 Comments

  • willis936 - Tuesday, April 1, 2014 - link

    I'm not sure this is right. Companies usually are making and testing IP while a standard is in the works. In some cases they're out before the standard is done.
  • cheinonen - Tuesday, April 1, 2014 - link

    This is correct. There is currently no full HDMI 2.0 silicon out there that I'm aware of, and since the Dell started shipping last fall it certainly didn't have access to it then. There are currently devices shipping that claim "HDMI 2.0" support in the AV world, but that isn't full HDMI 2.0. It is support for 4:2:0 chroma subsampling, which is part of the HDMI 2.0 spec and enables UltraHD resolution at 60 Hz. Since computers don't use chroma subsampling, this isn't relevant, and there is no HDMI 2.0 silicon right now.
  • Penti - Tuesday, April 1, 2014 - link

    Not even Maxwell can output it, so what sources are you supposed to use?
  • BMNify - Tuesday, April 1, 2014 - link

    Where did you get that idea from? It's false; you need a GeForce 600 "Kepler" graphics card or newer to drive a display up to 4096x2160.

    Hell, even the ChromeOS guys have merged this Linux UHD patch into their tree now... so Intel Haswell/Iris Graphics work at "UHD-1" 3840x2160p if you are not gaming: http://lists.x.org/archives/xorg-devel/2014-Januar...
  • cheinonen - Tuesday, April 1, 2014 - link

    You can do that resolution at 24 Hz, or 3840x2160 at 30 Hz, but you can't do it at 60 Hz without MST right now. HDMI 2.0 allows it at 60 Hz but that isn't available yet on a product.
  • Penti - Tuesday, April 1, 2014 - link

    I was speaking about 600MHz HDMI, not ~300MHz. 300MHz HDMI has been around since GCN 1.0 and Kepler. It's also available in Haswell and works fine in Windows, OS X, or GNU/Linux at that resolution, but that limits it to 30Hz for 3840x2160. That's not HDMI 2.0 spec. You can't use anything other than DisplayPort for 60Hz 4K/UHD, and DisplayPort receivers only do that with MST. You would need two 300MHz HDMI ports to do UHD @ 60Hz. So gaming in UHD over HDMI is out regardless of GPU/source.

    Maxwell doesn't do H.265/HEVC for that matter either. You only need ~300MHz HDMI 1.4 to do 4096x2160 @ 24Hz, not HDMI 2.0, which can do it @ 60Hz.
  • zanon - Tuesday, April 1, 2014 - link

    As far as things that still aren't there, I'd throw in color space (both gamut and bit depth) as well. Official UHDTV (see Rec. 2020), beyond bumping the resolution standards to 4K or 8K, also at last features a significantly larger color space and the bit depth necessary to go with it (either 10-bit or 12-bit). That's another marquee feature of HDMI 2.0: 12-bit 4:2:2 4K@60fps. Without the increased depth, a wider gamut isn't a straight upgrade since the delta between colors increases too; 8-bit AdobeRGB, say, isn't a clear superset of 8-bit sRGB. It's exciting that as well as HiDPI we'll finally see an industry-wide shift to a color space that will be a strict improvement and is large enough to basically be "done" as far as human vision is concerned.

    There's still a lot more pieces needed on the PC side though, including both hardware (video cards, interconnect) and OS/applications. High DPI is slowly improving, but even Apple has slipped a bit in terms of color management and support. That said, given the economies of scale that'll come with the general UHDTV push the market pressure should be there at least.
  • peterfares - Tuesday, April 1, 2014 - link

    Did you test it on a Windows computer other than the one you pictured? Because that one is 8.0, not 8.1 which added multi-DPI support.
  • datobin1 - Tuesday, April 1, 2014 - link

    Correct, 8.0 has static scaling across all displays. 8.1 introduced different scaling for each display.
    This works very well for Surface Pros that are docked. It will scale the Surface Pro display at 150% and the extra monitors at 100%. If you move a window between the displays, the screen with the majority of the window decides the scaling for that window. As you pull it from one screen to the next you will see the window change its scaling factor.
  • cheinonen - Tuesday, April 1, 2014 - link

    Yes, I tested with both Windows 8.0 and 8.1. I just happened to have rebooted into Windows 8.0 when I took the photos but I tested both.
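To make the HDMI bandwidth argument in the thread above concrete, here is a rough sketch of the pixel-clock math (my own numbers, assuming the standard CTA-861 UHD timing with a 4400 x 2250 total raster including blanking):

```python
# Pixel clock = total horizontal pixels * total vertical lines * refresh rate.
# 3840x2160 active video uses a 4400x2250 total raster in the standard timing.
H_TOTAL, V_TOTAL = 4400, 2250

def pixel_clock_mhz(refresh_hz: int) -> float:
    """Required pixel clock in MHz for the UHD raster at a given refresh rate."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

print(pixel_clock_mhz(30))  # 297.0 -> fits within a ~300 MHz HDMI 1.4 link
print(pixel_clock_mhz(60))  # 594.0 -> needs the 600 MHz clock of HDMI 2.0
```

This is why 3840x2160 tops out at 30 Hz over HDMI 1.4, while 60 Hz needs either HDMI 2.0 silicon or DisplayPort (via MST on current receivers).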
