LG 34UM67 Power Use, Gamut, and Input Lag

With a full white screen and the brightness set to maximum, the LG 34UM67 draws 48W of power at the outlet. Setting the backlight to its minimum reduces this to 18W, while targeting 200 cd/m² results in a draw of 35W. These results are quite good for a display of this size.

[Chart: LCD Power Draw (Kill-A-Watt)]

[Chart: Candelas per Watt]
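
For context, a candelas-per-watt figure like the one charted above is essentially a luminous-efficiency metric: luminance times the emitting area of the panel, divided by power at the wall. The sketch below is a rough illustration of that arithmetic; the panel dimensions are approximations for a 34-inch 21:9 screen, and the metric definition is my assumption rather than the review's stated methodology.

```python
# Rough luminous-efficiency estimate for a 34" 21:9 panel.
# Panel dimensions (~0.80 m x 0.33 m) are approximate; the cd/W
# definition used here (luminance * area / power) is an assumption.

panel_width_m = 0.798    # approximate active width of a 34" 21:9 panel
panel_height_m = 0.335   # approximate active height
area_m2 = panel_width_m * panel_height_m

luminance_cd_m2 = 200.0  # calibration target from the review
power_w = 35.0           # measured draw at 200 cd/m^2

candelas = luminance_cd_m2 * area_m2
print(f"~{candelas:.1f} cd at {power_w:.0f} W -> {candelas / power_w:.2f} cd/W")
```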

The 34UM67 reproduces 75.4% of the AdobeRGB color space and 110% of sRGB (though some colors fall short of the sRGB spec while others are well beyond it). That's exactly what a standard-gamut panel sets out to do, and it's acceptable for a consumer-focused gaming display.

[Chart: LCD Color Gamut]
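
A figure like "110% of sRGB" alongside colors that still fall short of the spec usually points to an area-ratio comparison on a chromaticity diagram rather than strict coverage of the gamut. The sketch below shows how such a ratio can be computed with the shoelace formula; the "panel" primaries are made-up illustrative values, not the 34UM67's actual measurements.

```python
# Compare gamut triangle areas in CIE 1931 xy space (shoelace formula).
# The "panel" primaries below are hypothetical, for illustration only.

def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

srgb = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # sRGB primaries
panel = [(0.655, 0.335), (0.310, 0.620), (0.150, 0.055)]  # hypothetical panel primaries

ratio = triangle_area(panel) / triangle_area(srgb)
print(f"Relative gamut area: {ratio * 100:.1f}% of sRGB")
```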

Input Lag?

As we lack the hardware to properly test for input lag, the only thing I can comment on is the experience. I’m not the most sensitive person when it comes to input lag, though anything above 30ms or so definitely makes me notice. Having used several G-SYNC displays as well as many laptop displays over the years, I didn’t notice any issues with the LG display – if anything it felt slightly more responsive than other (non-G-SYNC) displays I’ve used, possibly thanks to the DAS (Dynamic Action Sync) feature. As far as input lag goes, there were no problems in my experience, and I’ve seen reports of ~10ms online, which would agree with my subjective assessment. Other displays may show less input lag, but below 20ms it becomes very difficult to notice.


96 Comments


  • willis936 - Wednesday, April 1, 2015 - link

    It's worth mentioning that this wouldn't be good test methodology. You'd be at the mercy of how Windows is feeling that day. To test monitor input lag you need to know how long it takes from when a pixel is sent across DisplayPort (or whatever) to when it is updated on the display. It can be done without "fancy hardware" with a CRT and a high-speed camera. Outside of that you'll need to be handling gigabit signals.
  • willis936 - Wednesday, April 1, 2015 - link

    Actually it can still be done with inexpensive hardware. I don't have a lot of experience with how low level you can get on the display drivers. You would need to find one that has the video transmission specs you want, and you could dig into the driver to give out debug times when a frame starts being sent (I could be making this unnecessarily complicated in my head; there may be easier ways to do it). Then you could do a black-and-white test pattern with a photodiode to get the response time + input lag, then some other test patterns to try to work out each of the two components (you'd need to know something about pixel decay and things I'm not an expert on).

    All of the embedded systems I know of are VGA or HDMI though...
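
To make the approach willis936 describes concrete: once you have one log of frame-send timestamps (from an instrumented driver) and one log of photodiode trigger timestamps, the lag extraction itself is trivial. The sketch below assumes two plain-text files of matching timestamps in seconds; the file names and format are hypothetical.

```python
# Pair up frame-send timestamps (hypothetical driver instrumentation log)
# with photodiode trigger timestamps and summarize the measured delay.
# File names and their one-timestamp-per-line format are assumptions.

import statistics

def load_timestamps(path):
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

frame_sent = load_timestamps("frame_sent.log")       # when each test frame left the GPU
photodiode = load_timestamps("photodiode_trig.log")  # when the panel actually changed

# Each delay is input lag + pixel response time; separating the two
# needs extra test patterns, as noted in the comment above.
delays_ms = [(p - f) * 1000.0 for f, p in zip(frame_sent, photodiode)]
print(f"samples: {len(delays_ms)}")
print(f"mean: {statistics.mean(delays_ms):.2f} ms, "
      f"stdev: {statistics.pstdev(delays_ms):.2f} ms")
```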
  • Murloc - Wednesday, April 1, 2015 - link

    I saw some time ago that some company sold an affordable FPGA development board with video output.
    Maybe that would work.
  • Soulwager - Wednesday, April 1, 2015 - link

    You can still calibrate with a CRT, but you can get thousands of times more samples than with a high-speed camera (with the same amount of effort). USB polling variance is very easy to account for with this much data, so you can pretty easily get ~100 microsecond resolution.
  • willis936 - Wednesday, April 1, 2015 - link

    100 microsecond resolution is definitely good enough for monitor input lag testing. I won't believe you can get that by putting mouse input into a black box until I see it. It's not just Windows. There's a whole lot of things between the mouse and the screen. AnandTech did a decent article on it a few years back.

    http://www.anandtech.com/show/2803/7
  • Soulwager - Thursday, April 2, 2015 - link

    Games are complicated, but you can make a test program as simple as you want; all you really need to do is go from dark to light when you give an input. And the microcontroller is measuring the timestamps at both ends of the chain, so if there's an inconsistency you haven't accounted for, you'll notice it.
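
As an illustration of how simple such a test program can be, here is a minimal sketch using pygame (my choice of library, not anything specified in the thread): it shows a black screen, flips to white on a mouse click, and logs the flip timestamp so it can be compared against whatever the microcontroller records.

```python
# Minimal dark-to-light latency test stimulus (pygame is an assumed choice).
# On each mouse click the window turns white and the flip time is printed,
# ready to be lined up with timestamps captured by external hardware.

import time
import pygame

pygame.init()
screen = pygame.display.set_mode((800, 600))
screen.fill((0, 0, 0))
pygame.display.flip()

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.MOUSEBUTTONDOWN:
            screen.fill((255, 255, 255))
            pygame.display.flip()                      # frame handed to the display
            print(f"white at {time.perf_counter():.6f} s")
        elif event.type == pygame.KEYDOWN:
            screen.fill((0, 0, 0))                     # any key resets to black
            pygame.display.flip()
pygame.quit()
```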
  • AnnonymousCoward - Friday, April 3, 2015 - link

    If Windows adds unpredictable delays, all you need to do is take enough samples and trials and compare averages. That's a cool thing about probability.
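
That intuition is just the standard error of the mean shrinking with sample count. A quick sketch, with made-up jitter numbers, shows how averaging narrows the uncertainty even when each individual sample is noisy.

```python
# Simulate noisy latency samples (values are made up for illustration)
# and show how the uncertainty of the average shrinks as ~sigma/sqrt(N).

import random
import statistics

TRUE_LAG_MS = 12.0     # hypothetical "real" display lag
JITTER_SD_MS = 4.0     # hypothetical USB/OS timing noise

for n in (10, 100, 1000, 10000):
    samples = [random.gauss(TRUE_LAG_MS, JITTER_SD_MS) for _ in range(n)]
    mean = statistics.mean(samples)
    sem = statistics.stdev(samples) / n ** 0.5   # standard error of the mean
    print(f"N={n:>5}: mean = {mean:5.2f} ms, +/- {sem:.3f} ms (1 SE)")
```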
  • Ryan Smith - Wednesday, April 1, 2015 - link

    CRTs aren't a real option here unfortunately. You can't mirror a 4K LCD to a CRT, and any additional processing will throw off the calculations.
  • invinciblegod - Tuesday, March 31, 2015 - link

    Having proprietary standards in PC gaming accessories is extremely frustrating. I switch between AMD and nVidia every other generation or so and I would hate for my monitor to be "downgraded" because I bought the wrong graphics card. I guess the only solution here is to pray for nVidia to support Adaptive-Sync so that we can all focus on one standard.
  • invinciblegod - Tuesday, March 31, 2015 - link

    I assume you didn't encounter the supposedly horrible backlight bleed that people seem to complain about on forums. That (and the currently proprietary nature of FreeSync until Intel or NVIDIA supports it) is preventing me from buying this monitor.
