LG 34UM67 Conclusions

There are a couple of aspects to this review that we want to address in the conclusion. First is how the 34UM67 fares as a monitor in general. Here things are generally similar to what we said about one of its precursors, the LG 29EA93. The 21:9 aspect ratio is at the very least an interesting alternative to other display options. If you watch a lot of anamorphic widescreen movies, it can be awesome; for playing games, the wider field of view is again very interesting, at least when the game properly supports the aspect ratio. In some ways it’s like having a couple of 1280x1080 displays sitting next to each other, except with zero bezel gap between them. While there are plenty of people who prefer taller aspect ratios (e.g. 16:10 vs. 16:9), there is also a market for even wider aspect ratios like 21:9. This may be more of a niche market than other options, but it’s definitely a viable niche.

Getting into the monitor characteristics, the 34UM67 is a very large display compared to what most people use. Being a 34” UltraWide display, it’s actually much wider than my old 30” WQXGA display that I used for most of the past ten years. My 30” display measures just over 27” wide and is 19-23” tall (with height adjustment); in contrast the 34UM67 is just under 33” wide, but it’s only 18.5” tall. If you’re height-constrained but have the ability to support multiple displays, something like this 34” UltraWide format might be an interesting alternative; on the other hand, on a typical office desk the horizontal footprint can be absolutely massive.
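The width and height figures above follow directly from each display’s diagonal and aspect ratio. A quick sketch of the math (panel area only; the measurements quoted in the text include bezels and stand, so they come out a bit larger):

```python
import math

def panel_dimensions(diagonal_in, aspect_w, aspect_h):
    """Return (width, height) in inches for a given diagonal and aspect ratio."""
    ratio = aspect_w / aspect_h
    # diagonal^2 = width^2 + height^2, with width = ratio * height
    height = diagonal_in / math.sqrt(1 + ratio ** 2)
    return ratio * height, height

# 34" nominal 21:9 UltraWide vs. a 30" 16:10 WQXGA panel
uw_w, uw_h = panel_dimensions(34, 21, 9)    # ~31.3" x ~13.4"
wq_w, wq_h = panel_dimensions(30, 16, 10)   # ~25.4" x ~15.9"
print(f'34" 21:9:  {uw_w:.1f} x {uw_h:.1f} in')
print(f'30" 16:10: {wq_w:.1f} x {wq_h:.1f} in')
```

The UltraWide panel is nearly six inches wider yet about two and a half inches shorter, which is exactly the trade-off described above.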

Out of the box, the general image quality is good if not exceptional. Colors are reasonably accurate, contrast is a decent ~1000:1, and at least subjectively the pixel response times are acceptable for any purpose including intense gaming. Calibrating the display further improves the color accuracy, though there are some colors that still aren’t “perfect”. Uniformity overall is also merely acceptable – I never really noticed the problems in daily use, but there are areas that are off compared to the center. The use of an IPS panel is still a plus compared with the numerous TN displays, but for professional imaging use there are definitely better options out there, and the price of $649 MSRP means it’s not a great bargain either.

The second aspect to consider is how the display works as a gaming monitor, and in particular how well FreeSync functions. Here’s where things get a bit dicey, depending on your hardware. Running within the supported variable refresh rate range of 48-75 Hz, the 34UM67 is very smooth, delivering all of the benefits previously enjoyed by NVIDIA G-SYNC users, just with an AMD GPU. The problem is what happens when you fall outside that range. Go above it with VSYNC off and tearing is still visible at the maximum 75Hz, though you can opt for VSYNC on instead, and a 75 FPS cap is at least a bit better than the usual 60 FPS. Falling below the minimum supported refresh rate, on the other hand, is a much worse experience.

With VSYNC off, tearing is extremely visible. It’s perhaps no worse than a normal 60Hz fixed refresh rate (well, it’s slightly worse: updating 48 times per second means each torn frame is visible longer than the usual 1/60s), but it’s definitely not better. Turn VSYNC on and you eliminate tearing but introduce judder. While it’s tempting to make comparisons between G-SYNC and FreeSync, it’s also important to remember that no G-SYNC display uses an IPS 21:9 panel, possibly because the 48-75Hz dynamic refresh rate range is just too narrow.
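The parenthetical math above is worth spelling out: the lower the refresh rate, the longer each refresh (torn or not) stays on screen. A quick sketch:

```python
def frame_time_ms(refresh_hz):
    """How long each refresh stays on screen, in milliseconds."""
    return 1000.0 / refresh_hz

# At the 34UM67's 48Hz floor a torn frame persists noticeably longer than
# at a fixed 60Hz refresh, which is why tearing is more visible there.
for hz in (48, 60, 75):
    print(f"{hz}Hz -> {frame_time_ms(hz):.1f} ms per refresh")
# 48Hz -> 20.8 ms, 60Hz -> 16.7 ms, 75Hz -> 13.3 ms
```

Roughly 4ms of extra persistence per refresh doesn't sound like much, but a visible tear line lingering 25% longer is noticeable in practice.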

That’s ultimately the Achilles’ Heel of the LG 34UM67: as one of the very first FreeSync displays, arriving at the same time as the first 40-144Hz G-SYNC and FreeSync displays, it can feel limited. Paired with a Radeon R9 290X, the vast majority of games can easily run at 48+ FPS, and if that’s the hardware you have it’s still a good experience. But for a lower price you can find 27” 2560x1440 AHVA displays that can be overclocked to 120Hz, and 30” 2560x1600 IPS displays that support overclocked refresh rates of up to 120Hz cost only a bit more. Given the choice between an IPS/AHVA display running at 120Hz and a FreeSync display running at 48-75Hz, I’d generally go for the former.

This isn’t an indictment of FreeSync in general, however. The option to support lower minimum refresh rates exists, and I’d say 30Hz is really all you need – if you fall below 35-40 FPS, the smoothness already starts to go away, and dropping to 20 FPS for a few frames will create a hiccup with or without dynamic refresh rates. But limiting the refresh rate range to just 48-75Hz negates much of the purpose of using FreeSync in the first place. We’ll have to see how other FreeSync/DisplayPort Adaptive-Sync displays compare before we can come to any real conclusions, and there’s definitely potential; the LG display simply isn’t the best showcase of the technology.
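The behavior described over the last few paragraphs can be summarized as a simple decision table. This is just a sketch using the 34UM67’s 48-75Hz window, with labels paraphrasing the observations above:

```python
def sync_behavior(fps, vrr_min=48, vrr_max=75, vsync=True):
    """Rough summary of what you see at a given framerate on a
    48-75Hz FreeSync display, per the observations in this review."""
    if vrr_min <= fps <= vrr_max:
        return "smooth (variable refresh active)"
    if fps > vrr_max:
        # Above the window you either cap the framerate or tear.
        return "capped at 75 FPS" if vsync else "tearing at 75Hz"
    # Below the window you trade tearing for judder, or vice versa.
    return "judder" if vsync else "tearing (worse than 60Hz fixed)"

print(sync_behavior(60))               # smooth (variable refresh active)
print(sync_behavior(40, vsync=False))  # tearing (worse than 60Hz fixed)
print(sync_behavior(40, vsync=True))   # judder
```

Widening `vrr_min` down to 30Hz would move the common 35-48 FPS dips into the smooth branch, which is exactly why the 48Hz floor is the sticking point.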

Finally, looking at the entire display and graphics ecosystem, as far as pricing goes AMD currently offers a clear advantage. An R9 290X is generally competitive with the GTX 970 at worst and 15-20% faster at best, which means it can often go up against NVIDIA’s GTX 980 while saving the consumer over $200. FreeSync displays likewise look to have a pricing advantage of $100 or more compared with G-SYNC displays, though the comparisons are a lot less direct in that case. While paper specs can look similar (e.g. a TN panel with 40-144Hz dynamic refresh rates), things like color quality, features, and gaming performance (i.e. ghosting) all matter. Just as a GTX 980 costs more than an R9 290X but generally delivers a superior experience, we may see a similar situation in the display arena.

If you’re after dynamic refresh rates, you’re inherently locked into one GPU vendor or the other right now. NVIDIA could potentially offer support for DisplayPort Adaptive-Sync displays in the future, but so far they’re not committing to the standard. AMD on the other hand can’t ever support G-SYNC displays (at least not the dynamic refresh rate aspect), so FreeSync is the only option. High static refresh rate displays on the other hand work with both vendors equally well and cost less as a bonus, so if you need a display right now they’re the safest bet. Otherwise, given the long working lives of monitors, continuing to wait and see how the market develops isn’t a bad idea.

96 Comments

  • dragonsqrrl - Wednesday, April 1, 2015 - link

    "FreeSync actually has a far wider range than G-Sync so when a monitor comes out that can take advantage of it it will probably be awesome."

    That's completely false. Neither G-Sync nor the Adaptive-Sync spec has an inherent frequency range limitation. Frequency ranges are imposed due to panel-specific limitations, which vary from one panel to another.
  • bizude - Thursday, April 2, 2015 - link

    Price Premium?! It's 50$ cheaper than its predecessor, the 34UM65, for crying out loud, and has a higher refresh rate as well.
  • AnnonymousCoward - Friday, April 3, 2015 - link

    The $ goes on the left of the number.
  • gatygun - Tuesday, June 30, 2015 - link

    1) The 27Hz range isn't an issue; you just have to make sure your game runs at 48+ fps at all times, which means dropping settings until you hit 60+ on average in less action-packed games, and a 75 average in fast-paced action games, which have a wider gap between average and minimum fps.

    The 75Hz upper limit isn't an issue, as you can simply use MSI Afterburner to lock the framerate to 75 fps.

    The 48Hz floor should really have been 35 or 30; it would make things easier on the 290/290X for sure, and you could push better visuals. But the screen is a 75Hz screen and that's what you should be aiming for.

    This screen will work perfectly in games like Diablo 3 / Path of Exile / MMOs, which are undemanding on the GPU and will push 75 fps without an issue.

    For newer games like The Witcher 3, yes, you need to trade off a lot of settings to get that 48 fps minimum, but at the same time you can just enable v-sync and deal with the additional controlled lag from those few drops you get in stressful situations. You can see them as your GPU not being up to par; CrossFire support will happen at some point.

    2) Extra features will cost extra money, as they have to write additional specifications, additional software functions, etc. It's never free; it's just that AMD GPUs handle the hardware side of things instead of vendors having to buy licenses and hardware modules and build them into the screens. So technically, especially in comparison to NVIDIA, it can be seen as free.

    On top of that, the 29UM67 is currently the cheapest FreeSync monitor; it's the little brother of this screen, and for the price and what it brings it's very sharply priced for sure.

    I'm also wondering why nobody has reviewed that screen, though; the 34-inch isn't great PPI-wise while the 29-inch is perfect for that resolution. But oh well.

    3) In my opinion the 34 isn't worth it; the 29UM67 is what people should be looking at. With a price tag of 330 at the moment, it's basically 2x if not 3x cheaper than the Swift. There is no competition.

    I agree that low input lag is really needed for gaming monitors, and it's a shame they didn't pay much attention to it.

    All in all, the 29UM67 is a solid screen for what you get; the 48Hz minimum is indeed not practical, but if you like your games hitting high framerates before anything else, this will surely work.
  • twtech - Wednesday, April 1, 2015 - link

    It seems like the critical difference between FreeSync and GSync is that FreeSync will likely be available on a wide range of monitors at varying price points, whereas GSync is limited to very high-end monitors with high max refresh rates, and they even limit the monitors to a single input for the sake of minimizing pixel lag.

    I like AMD's approach here, because most people realistically aren't going to want to spend what it costs for a GSync-capable monitor, and even if the FreeSync experience isn't perfect with the relatively narrow refresh rate range that most ordinary monitors will support, it's better than nothing.

    If somebody who currently has an nVidia card buys a monitor like this one just because they want a 34" ultrawide, maybe they will be tempted to go AMD for their next graphics upgrade, because it supports adaptive refresh rate with the display that they already have.

    I think ultimately that's why nVidia will have to give in and support FreeSync. If they don't, they risk effectively losing adaptive sync as a feature to AMD for all but the extreme high end users.
  • Ubercake - Thursday, April 2, 2015 - link

    Right now you can get a G-sync monitor anywhere between $400 and $800.

    AMD originally claimed adding freesync tech to a monitor wouldn't add to the cost, but somehow it seems to.
  • Ubercake - Thursday, April 2, 2015 - link

    Additionally, it's obvious by the frequency range limitation of this monitor that the initial implementation of the freesync monitors is not quite up to par. If this technology is so capable, why limit it out of the gate?
  • Black Obsidian - Thursday, April 2, 2015 - link

    LG appears to have taken the existing 34UM65, updated the scaler (maybe a new module, maybe just a firmware update), figured out what refresh rates the existing panel would tolerate, and kicked the 34UM67 out the door at the same initial MSRP as its predecessor.

    And that's not necessarily a BAD approach, per se, just one that doesn't fit everybody's needs. If they'd done the same thing with the 34UM95 as the basis (3440x1440), I'd have cheerfully bought one.
  • bizude - Thursday, April 2, 2015 - link

    Actually the MSRP is $50 cheaper than the UM65
  • gatygun - Tuesday, June 30, 2015 - link

    Good luck getting 48 fps minimums at 3440x1440 on a single 290X, as CrossFire isn't working with FreeSync.
