LG 34UM67 sRGB Data and Bench Tests

For color accuracy, we test before and after calibration. For calibration, we use SpectraCal CalMAN with our own custom workflow. We target 200 cd/m2 of light output with a gamma of 2.2 and the sRGB color gamut, which corresponds to a general real-world use case. We use an i1 Pro provided by X-Rite. All measurements use APL 50% patterns except for uniformity testing, which uses full field.
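To put the gamma target in concrete terms, a 2.2 gamma simply means luminance should track the gray level raised to the 2.2 power, scaled to the 200 cd/m2 white level. The short Python sketch below illustrates that target curve; it’s just the power-law math the measurements are graded against, not a reproduction of the CalMAN workflow.

    # Illustrative target curve for our calibration: a pure 2.2 power-law gamma
    # scaled to a 200 cd/m2 white level. Not CalMAN's actual workflow, just the
    # math behind the target that the measured results are compared against.
    WHITE_LEVEL = 200.0  # cd/m2
    GAMMA = 2.2

    def target_luminance(level, bits=8):
        """Ideal luminance for an n-bit gray level under a pure 2.2 gamma."""
        normalized = level / (2 ** bits - 1)
        return WHITE_LEVEL * normalized ** GAMMA

    for level in (32, 64, 128, 192, 255):
        print(f"gray {level:3d} -> {target_luminance(level):6.1f} cd/m2")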

LG 34UM67 Pre/Post Calibration
                        Pre-Calibration    Post-Calibration   Post-Calibration
                        200 cd/m2          200 cd/m2          80 cd/m2
White Level (cd/m2)     201                198.7              79.3
Black Level (cd/m2)     0.2056             0.2153             0.0977
Contrast Ratio          978:1              923:1              811:1
Gamma (Average)         2.18               2.21               2.21
Color Temperature       6558K              6548K              6482K
Grayscale dE2000        2.94               0.38               0.99
Color Checker dE2000    2.49               1.24               1.39
Saturations dE2000      2.14               1.07               1.17
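As a quick sanity check on the table, the contrast ratio is just the measured white level divided by the measured black level; the snippet below recomputes it from the published numbers (any last-digit disagreement is down to rounding in the table itself).

    # Recompute contrast ratio from the white/black levels in the table above.
    # Last-digit differences versus the published ratios are rounding artifacts.
    measurements = {
        "Pre-calibration, 200 cd/m2":  (201.0, 0.2056),
        "Post-calibration, 200 cd/m2": (198.7, 0.2153),
        "Post-calibration, 80 cd/m2":  (79.3, 0.0977),
    }

    for name, (white, black) in measurements.items():
        print(f"{name}: {white / black:.0f}:1")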

Before calibration, the LG 34UM67 has a slight blue tint to the grayscale, but nothing too noticeable – especially for gaming purposes. Tweaking the OSD RGB settings to 53/50/47 gives a result reasonably close to the ideal 6504K color target. The grayscale errors are all under 4.0 dE2000, which is potentially visible but not overly so, with an average error of 2.9 dE2000. The gamma curve isn’t great, starting high and ending low, though the 2.18 average is close to our 2.2 target, so there’s definitely room for improvement. Moving to colors, there are a few larger errors of nearly 5.0, mostly in the yellows and oranges. Some of these are due to the gamut extending slightly beyond sRGB, leading to some oversaturation of green and red.
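For reference, the color temperature figures in the table are correlated color temperatures (CCT) derived from the measured white point. As a rough illustration of where the 6504K target comes from, McCamy’s well-known approximation maps a CIE 1931 xy chromaticity to a CCT; the sketch below feeds it the standard D65 white point (not a measurement from this monitor) and lands essentially on the 6504K figure.

    # McCamy's approximation: correlated color temperature from CIE 1931 xy.
    # The xy values are the standard D65 white point, not readings from the
    # LG 34UM67; the review's CCT figures come from CalMAN.
    def mccamy_cct(x, y):
        n = (x - 0.3320) / (0.1858 - y)
        return 449.0 * n ** 3 + 3525.0 * n ** 2 + 6823.3 * n + 5520.33

    print(f"D65 (x=0.3127, y=0.3290) -> {mccamy_cct(0.3127, 0.3290):.0f} K")  # ~6505 K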

Post-calibration, the gamma and RGB balance are almost perfect. The average grayscale dE2000 falls to well below 1.0, which is invisible to the naked eye. ColorChecker and saturation accuracy improve as well, though there are still colors in the 4.0 range. Again, it’s mostly shades of yellow, orange, and some greens that cause problems, which unfortunately tend to be the worst colors to get wrong for imaging professionals. Overall it’s a good monitor, and the target audience clearly isn’t imaging professionals, so with or without calibration it will do well for gaming, movie watching, and other general tasks.
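For readers unfamiliar with the metric, dE2000 is a perceptual distance between the color a test pattern should produce and the color the panel actually produces, with values under roughly 1.0 generally considered invisible. Below is a minimal sketch of how such a number is computed, assuming the open-source colour-science Python package; the Lab values are made up for illustration rather than taken from this panel.

    # Sketch of a dE2000 calculation, assuming the colour-science package
    # (pip install colour-science). The Lab values are illustrative only,
    # not measurements from the LG 34UM67.
    import numpy as np
    import colour

    reference = np.array([50.0, 20.0, -10.0])   # the CIE Lab value we asked for
    measured  = np.array([50.5, 21.0, -11.5])   # the CIE Lab value the panel produced

    dE = colour.delta_E(reference, measured, method='CIE 2000')
    print(f"dE2000 = {dE:.2f}")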

Changing to 80 cd/m2, the calibration results remain pretty consistent. The dE2000 numbers are slightly higher, but anyone concerned by such a small change in accuracy would likely have passed on this display already. Only the most finicky of regular consumers might find something to complain about.

It’s also worth quickly discussing some of the other color modes, if only because certain ones are so far off that it’s a wonder anyone would consider using them. LG offers four picture modes (Photo, Cinema, Reader 1, and Reader 2). Photo has a strong blue tint with an average grayscale dE2000 of 6.4 and many values nearing 10.0, though colors aren’t quite as bad, averaging closer to 5.0. Cinema is pretty close to the Custom setting: it’s still tinted blue, but the grayscale dE2000 is 2.3 and colors average close to 4.0, with skin tones often falling into the 6.0+ range. Reader 1 and 2 are supposed to be more print-like, but the results are heavily red-biased with limited blue, and minimum black levels are much higher (2.5 cd/m2). The resulting grayscale dE2000 values of 10.8/8.7 and average color errors of 7.5/6.0 are not particularly useful.

And that sums up why NVIDIA didn’t bother supporting specialized color modes on their G-SYNC module: doing one color mode properly is generally more useful than supporting multiple incorrect ones. While some people might appreciate the ability to quickly switch between various color modes, most just set up a display for everyday use and leave it be. Most named presets other than “Standard” or “Custom” end up being bullet points on a spec sheet more than anything useful.

Comments

  • willis936 - Wednesday, April 1, 2015

    It's worth mentioning that this wouldn't be good test methodology. You'd be at the mercy of how Windows is feeling that day. To test monitor input lag you need to know how long it takes from when a pixel is sent across DisplayPort (or whatever) to when it is updated on the display. It can be done without "fancy hardware" with a CRT and a high speed camera. Outside of that you'll need to be handling gigabit signals.
  • willis936 - Wednesday, April 1, 2015

    Actually it can still be done with inexpensive hardware. I don't have a lot of experience with how low-level you can get on the display drivers. You would need to find one that has the video transmission specs you want, and you could dig into the driver to give out debug times when a frame starts being sent (I could be making this unnecessarily complicated in my head; there may be easier ways to do it). Then you could do a black and white test pattern with a photodiode to get the response time + input lag, then some other test patterns to try to work out each of the two components (you'd need to know something about pixel decay and other things I'm not an expert on).

    All of the embedded systems I know of are VGA or HDMI though...
  • Murloc - Wednesday, April 1, 2015

    I saw some time ago that some company sold an affordable FPGA development board with video output.
    Maybe that would work.
  • Soulwager - Wednesday, April 1, 2015

    You can still calibrate with a CRT, but you can get thousands of times more samples than with a high speed camera (with the same amount of effort). USB polling variance is very easy to account for with this much data, so you can pretty easily get ~100 microsecond resolution.
  • willis936 - Wednesday, April 1, 2015

    100 microsecond resolution is definitely good enough for monitor input lag testing. I won't believe you can get that by putting mouse input into a black box until I see it. It's not just Windows; there's a whole lot of things between the mouse and the screen. AnandTech did a decent article on it a few years back.

    http://www.anandtech.com/show/2803/7
  • Soulwager - Thursday, April 2, 2015

    Games are complicated, but you can make a test program as simple as you want; all you really need to do is go from dark to light when you give an input (a minimal sketch of such a program follows the comments). And the microcontroller is measuring the timestamps at both ends of the chain, so if there's an inconsistency you haven't accounted for, you'll notice it.
  • AnnonymousCoward - Friday, April 3, 2015

    If Windows adds unpredictable delays, all you need to do is take enough samples and trials and compare averages. That's a cool thing about probability.
  • Ryan Smith - Wednesday, April 1, 2015

    CRTs aren't a real option here unfortunately. You can't mirror a 4K LCD to a CRT, and any additional processing will throw off the calculations.
  • invinciblegod - Tuesday, March 31, 2015

    Having proprietary standards in PC gaming accessories is extremely frustrating. I switch between AMD and NVIDIA every other generation or so, and I would hate for my monitor to be "downgraded" because I bought the wrong graphics card. I guess the only solution here is to pray for NVIDIA to support Adaptive-Sync so that we can all focus on one standard.
  • invinciblegod - Tuesday, March 31, 2015

    I assume you didn't encounter the supposedly horrible backlight bleed that people seem to complain about on forums. That (and the currently proprietary nature of FreeSync until Intel or NVIDIA supports it) is preventing me from buying this monitor.
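To make the input lag approach discussed in these comments a little more concrete, here is a minimal sketch, assuming pygame, of the kind of dark-to-light test stimulus Soulwager describes: the window flips from black to white on a key press and logs a host-side timestamp. The photodiode and microcontroller that would timestamp the resulting light change on the panel are external hardware and not shown.

    # Minimal dark-to-light test stimulus, assuming pygame (pip install pygame).
    # On any key press the window flips from black to white and a host-side
    # timestamp is logged; an external photodiode + microcontroller (not shown)
    # would timestamp when the panel actually changes.
    import time
    import pygame

    pygame.init()
    screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)
    screen.fill((0, 0, 0))      # start fully black
    pygame.display.flip()

    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
            elif event.type == pygame.KEYDOWN:
                if event.key == pygame.K_ESCAPE:
                    running = False
                else:
                    t_input = time.perf_counter()  # when the host saw the input
                    screen.fill((255, 255, 255))
                    pygame.display.flip()          # present the white frame
                    print(f"white frame requested at {t_input:.6f} s")

    pygame.quit()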
