Display

The LG G3’s display has been the subject of immense controversy. While the LG G3 is the first international phone to ship with a QHD (2560 x 1440) display, those following the industry saw the trend as inevitable once Android OEMs made the jump from 720p to 1080p at the 4.7-5” display size. It’s now obvious that going from 720p to 1080p brought a significant increase in perceived resolution; the question is whether going from 1080p to 1440p does the same.

Answering this question requires an understanding of both human vision and the tradeoffs that come with increased pixel densities. The short explanation is that while Apple was right to say that around 300 PPI is the pixel density needed to no longer perceive individual pixels at 12 inches away, the issue is more complex than that. There are edge cases such as Vernier acuity that require pixel density of up to 1800 pixels per degree (PPD) in one’s field of view. What this means is that once that pixel density is exceeded, it’s possible to make two lines appear to be aligned even if they aren’t. Of course, this is extremely difficult with current technology, although there are displays in existence that approach the 2000-3000 PPI needed to reach those levels.

There are more edge cases, though. While I’m not going to go into great depth, the eye is effectively capable of sampling detail at .8 to 1 arcminute for the most part. This ignores exceptional cases such as Vernier acuity, where interpolation in the mind effectively achieves much higher resolution. While this means that 300 PPI at 12 inches is “enough” to match that sampling rate, the Nyquist-Shannon sampling theorem means that preventing aliasing requires twice that resolution. In other words, 600 PPI is the realistic upper bound for most displays. This also ignores cases where the display is held much closer for detailed examination. For those interested in learning more, I would refer back to our article on display resolution and human vision.
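
To make the arithmetic concrete, here’s a quick sketch of how the 300 PPI and 600 PPI figures fall out of a ~1 arcminute sampling limit and the Nyquist criterion:

```python
import math

def required_ppi(distance_in, arcmin):
    """Pixels per inch needed to match an angular resolution of
    `arcmin` arcminutes at a viewing distance of `distance_in` inches."""
    pixel_pitch = distance_in * math.tan(math.radians(arcmin / 60))  # inches/pixel
    return 1 / pixel_pitch

# ~1 arcminute at a 12-inch viewing distance -> roughly Apple's 300 PPI figure
base = required_ppi(12, 1.0)   # ≈ 286 PPI
nyquist = 2 * base             # ≈ 573 PPI, i.e. the ~600 PPI upper bound
print(round(base), round(nyquist))
```

The Nyquist doubling is simply because resolving detail at a given spatial frequency without aliasing requires sampling at twice that frequency.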

At any rate, this is the first time that we’ve actually used a 1440p display in a smartphone. In practice, it is possible to see more detail on the LG G3’s display, but it’s hard to tell in most cases. Examining the display closely brings out the differences much more, but it’s not quite the jump that going from 720p to 1080p was. Unfortunately, this doesn’t change the cost of increasing pixel density. As I explained in previous articles, increasing pixel density comes with a greater power cost: smaller transistors lower the active area of each pixel, so a stronger backlight is needed to reach the same luminance. It’s clear that LG has had issues with this, with some rather drastic measures taken.
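
As a toy model of that power cost (the aperture ratios here are made up for illustration, not LG’s actual panel figures):

```python
def backlight_scale(aperture_old, aperture_new):
    """Toy model: relative backlight drive needed to hold luminance constant
    when the aperture ratio (light-transmitting fraction of each pixel)
    drops. Assumes power scales inversely with aperture ratio."""
    return aperture_old / aperture_new

# Hypothetical numbers: if moving to 1440p cut the aperture ratio from
# 55% to 45%, the backlight would need ~22% more drive for the same nits.
print(round(backlight_scale(0.55, 0.45), 2))  # ≈ 1.22
```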

LG specifically called out three different mechanisms used to save power on the display. LG states that panel self-refresh is still present in the LG G3, but dumping information from SurfaceFlinger reveals that it's a MIPI video panel, not a command panel. This means that either LG has implemented panel self-refresh in another manner, or that it's no longer present.

Issues with panel self-refresh aside, LG specifically calls out dynamic display clocking as one aspect of their system to save battery life. Inspecting the system files shows that the refresh rate for the display is set by a software governor, which has interesting implications for the custom ROM community and the effect that OTA updates can have on battery life. The system files suggest that the dynamic clocking mechanism isn't quite as broad as one might expect, as the only two frequencies that seem to be exposed are 50 and 60 Hz. I suspect that the nature of an LTPS panel means that it's not quite possible to realize a 0 Hz refresh rate for still images, but it may be that this is effectively a replacement for panel self-refresh. I'll save the other mechanism that I've found for the battery life section, as it goes beyond normal power-saving measures.
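
A governor limited to those two exposed rates might look something like this hypothetical sketch; the function name and threshold are illustrative, not LG's implementation:

```python
# Hypothetical sketch of a dynamic refresh-rate governor restricted to the
# two rates exposed in the G3's system files (50 and 60 Hz).
AVAILABLE_RATES_HZ = (50, 60)

def pick_refresh_rate(frames_updated_last_second):
    """Drop to 50 Hz when screen content is mostly static; illustrative
    threshold, not LG's actual logic."""
    return 60 if frames_updated_last_second > 30 else 50

assert pick_refresh_rate(60) == 60  # video/scrolling stays at 60 Hz
assert pick_refresh_rate(0) == 50   # a still image falls back to 50 Hz
```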

Outside of power-saving measures, I've also noticed artificial sharpening. The effect is obvious enough that you will notice it immediately: halos appear all over the display in certain situations, and in general I hope LG adds an option to turn this off.
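
To illustrate where those halos come from (a generic unsharp mask, not LG's actual algorithm): sharpening adds back the difference between a signal and its blurred copy, which overshoots past the original black and white levels at edges.

```python
def unsharp(signal, amount=1.0):
    """Unsharp mask on a 1D signal: 3-tap box blur, then add back the
    high-frequency difference, scaled by `amount`."""
    blur = [
        (signal[max(i - 1, 0)] + signal[i] + signal[min(i + 1, len(signal) - 1)]) / 3
        for i in range(len(signal))
    ]
    return [s + amount * (s - b) for s, b in zip(signal, blur)]

edge = [0, 0, 0, 255, 255, 255]  # a clean black-to-white edge
sharpened = unsharp(edge)
# Overshoot above white and below black on either side of the edge is
# exactly the halo artifact visible on the G3's display.
print(max(sharpened) > 255, min(sharpened) < 0)  # True True
```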

I’ll also touch briefly upon some of the things that I’ve found regarding the touch panel. For one, this is a Synaptics solution, the S3528A, the same solution found in the One (M8). Unfortunately, there’s no other information that I could find online. Fortunately, digging through the phone reveals more. It appears that the QuickCover window is actually defined by ranges in x and y coordinates, and I assume that the same is true for the LCD itself; however, all of the information is presented in a circular format.
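
A range-based touch window of that kind might be checked like this; the coordinates below are invented for illustration and are not the values from the G3's configuration:

```python
# Hypothetical QuickCover active region, defined as x and y ranges the way
# the G3's configuration appears to define it. Coordinates are made up.
QUICKCOVER_WINDOW = {"x": (200, 880), "y": (120, 800)}

def in_quickcover_window(x, y, win=QUICKCOVER_WINDOW):
    """Return True if a touch coordinate falls inside the cover window."""
    return win["x"][0] <= x <= win["x"][1] and win["y"][0] <= y <= win["y"][1]

assert in_quickcover_window(540, 400)      # center of the window
assert not in_quickcover_window(0, 0)      # corner of the panel, outside
```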

When it comes to how the rest of the display performs, we turn to Spectracal’s CalMAN 5, with a custom workflow to test our displays and quantify performance. To start things off, we’ll take a look at maximum brightness and contrast. We’ll then move onto grayscale accuracy, then saturation accuracy, and finally the Gretag MacBeth ColorChecker test.

Display - Max Brightness

Display - Black Levels

Display - Contrast Ratio

In the maximum brightness department, LG noticeably struggles. While the G2 had around 410 nits peak luminance, the G3 regresses to around 390 nits maximum, and I didn’t find any outdoor brightness boost function in this case either. This means that outdoors, the display will be worse than 1080p devices like the One (M8) and Galaxy S5. The other issue is contrast, which is around 900:1. This isn’t actually as bad as most have made it out to be; the big issue is how contrast degrades with viewing angles. A black test image rapidly washes out towards white when the display is viewed at oblique angles.
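
For reference, the black level implied by those numbers follows directly from the definition of contrast ratio:

```python
def contrast_ratio(white_nits, black_nits):
    """Static contrast ratio: peak white luminance over black luminance."""
    return white_nits / black_nits

# ~390 nits peak at a ~900:1 ratio implies a black level near 0.43 nits.
black = 390 / 900
print(round(black, 2), round(contrast_ratio(390, black)))  # 0.43 900
```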

Display - White Point

Display - Grayscale Accuracy

In grayscale, the LG G3 does quite well. Most OEMs continue to target around 7000K instead of 6500K, and the result is that there’s a lower bound on the average dE2000 scores. I’d still like to see OEMs include a mode that allows selection of a 6500K target, but LG does acceptably well here. As always, it's important to emphasize that the grayscale measurements will produce inaccurate contrast values due to the nature of the i1Pro.
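
For those curious how a measured white point maps to those temperature targets, McCamy's approximation converts CIE 1931 chromaticity coordinates to a correlated color temperature:

```python
def mccamy_cct(x, y):
    """McCamy's approximation for correlated color temperature (Kelvin)
    from CIE 1931 (x, y) chromaticity coordinates."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449 * n**3 + 3525 * n**2 + 6823.3 * n + 5520.33

# The D65 white point (x=0.3127, y=0.3290) comes out near 6500K,
# the target I'd like to see OEMs offer.
print(round(mccamy_cct(0.3127, 0.3290)))  # ≈ 6505
```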

Display - Saturation Accuracy

Saturations are where LG has gone a bridge too far. While some may enjoy “vivid” color, the saturation compression is extreme here. In many cases, 80% and 100% saturations are effectively identical, as can be seen on the red and green sweeps, and 60% saturation is often closer to the 80% saturation target. LG really needs to either stop doing this or give an option to disable it. This is immensely detrimental to the viewing experience, especially in any situation where color accuracy actually matters. Editing photos is effectively impossible on this display because the results will look completely different on most other displays that come closer to following the sRGB standard.
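
A simple gain-and-clip curve (purely hypothetical; LG hasn't documented its algorithm) reproduces the behavior seen in the sweeps:

```python
def compress(saturation, gain=1.25):
    """Hypothetical saturation boost: apply a fixed gain, clip at 100%.
    Not LG's actual algorithm, just an illustration of the measured effect."""
    return min(1.0, saturation * gain)

# 80% and 100% inputs collapse to the same output...
print(compress(0.8), compress(1.0))  # both 1.0 -- indistinguishable
# ...while a 60% input lands near the 80% target.
print(compress(0.6))                 # ≈ 0.75
```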

Display - GMB Accuracy

In the Gretag MacBeth ColorChecker, the G3 manages to do well, but it’s likely that its strong grayscale performance is lowering the dE2000 average. Overall, while this isn’t a terrible display, it’s disappointing that LG has decided to go for showroom appeal over great color calibration out of the box. While HTC’s saturation compression algorithms can be disabled with an init.d script, I haven’t found any evidence that the same is true for the LG G3. The low peak brightness is concerning as well, and likely a mitigation for the higher pixel density.


  • piroroadkill - Friday, July 04, 2014 - link

    Basically, the screen is stupid, costs more, wastes battery, slows down performance, heats it up so it throttles more, and isn't actually noticeably different compared to 1920×1080 at viewing distance.

    Yeah, so predictable. LG is doing the worst kind of spec-sheet oneupmanship.
  • Homeles - Friday, July 04, 2014 - link

    I picked up a G2 last year, and was a bit frustrated that the G3 came out so quickly. Looks like I'm not missing out.
  • piroroadkill - Friday, July 04, 2014 - link

    Yeah, the G2 is a fine device, and I'd choose the better battery life.
  • mahalie - Friday, July 04, 2014 - link

    The G3 has significantly better battery life than the G2. You can complain that the screen is unnecessary all you want, but the phone performs well and has great battery life, so what's the issue?
  • fokka - Friday, July 04, 2014 - link

    the g3 beat the g2 in one test here, what numbers are you referring to?

    the issue, as i see it, is that the 1440p display adds cost, decreases performance, battery life and screen brightness, not to mention overall screen quality, compared to using a good 1080p display, all while adding very little in regards to usability and visual advantages.

    many people, me included, think that LG should have gone with a "good ol'" 1080p display in this generation and improved upon the great battery life that the g2 offered, instead of using a 1440p screen with borderline useful benefits mainly for bragging rights.

    of course not everybody agrees with this stance, but it seems to be the one main complaint about this otherwise mighty fine piece of technology.

    and you are right, the g3 (still) performs well and (still) has great battery life, but with a more reasonable 1080p display those points could have improved even more. that's all i (!) am saying.
  • retrospooty - Friday, July 04, 2014 - link

    You may be right about the screen. Looks like some trade-offs were made, but it's still a good phone that stacks up well against its competitors. It's still a good 5.5 inch phone that is basically the same size as an S5 or One M8. That and I can't remember the last time I had any phone on 100% brightness. I have a G2 now and keep it at 66%. In rare cases I move it up to 70%. 75% is simply too bright to look at.
  • flatrock - Tuesday, July 08, 2014 - link

    I just checked the brightness on my G2. It's at 41% and is plenty bright for indoor viewing at that level. I might put it up to 75% while using it outdoors, such as at my son's soccer game. Unless the sun is shining directly on the screen, 100% is overkill. In a dark room I set the brightness somewhere in the teens.
  • upatnite2 - Friday, July 04, 2014 - link

    Same thoughts! I can deal with the battery life, but brightness and contrast are major issues. We're seeing at least a few reviews that mention "dim", which isn't a good sign, and I'm starting to wonder about performance after it heats.
    If they put in the same display as last year, LG would sell tons more, and myself included.
    Now, I have to wait to see if the S805 vsn comes to the states and if it's any better, wait for the N6, or get a used G2..
  • HotInEER - Monday, July 07, 2014 - link

    I agree. 3 things are keeping me from trading in my HTC One M8 for this: the brightness, contrast, and lack of built-in wireless charging. I don't want a flip case for that feature. Can't stand them, and sure in the heck are not paying $60 for a stupid case for that. I'd consider buying an additional back for wireless charging, however I've read on numerous sites that the US models do not have the pins.
  • akdj - Sunday, July 13, 2014 - link

    "...and sure in the heck are not paying $60 for a stupid case for that."
    That made me laugh. You're unwilling to throw down $60 on a case while 'considering' dropping/dumping or upgrading from one of today's 'Flagship' phones! Thanks though for the Sunday chuckle but WOW. Color me silly, but that's a helluva 'first world' challenge you've got going there! Anand posted an(other) excellent, well written review. Another flagship handset. Innovative display and yep...as so many others before you have mentioned... All at the expense of a bit of brightness (typically a 'non issue' as we use our phones most often indoors, and outdoors the measurements are still just 'fine' for usage & snapshots), wireless charging? Really, you're basing your purchase decision on an unproven, completely niche and rare...without common specification technology? I own a Note 3 & an iPhone 5s. The former, solely for my business. The latter, my personal phone. I love them both and honestly feel like we've hit that 'plateau' in performance. Almost a year in, I'm wondering if I'll even take advantage of the NEXT deal at AT&T. They're both still fast as hell, no way I'd notice any differences between those and today's offerings (iPhone TBD, obviously) from Android. While Sammy has improved its AMOLED technology in the S5 even more than the S4-->Note 3, I've used both and honestly, even as an almost three decade professional audio and video production company business owner and operator, the differences to me were hardly distinguishable. They've come a LONG way with AMOLED in comparison to the long time king of displays, the LCD, to the extent in many (possibly more than) measurements, it's taken over as the 'better' technology. Longevity? We're yet to see, but without Samsung innovating their technology, and listening to the detractors...or paying attention to reviews, numbers and measurements each generation, they'd have 'pulled out'. You've got a damn sweet phone NOW. 
There's not a single phone available today that's going to 'better' your M8 if you bought it knowing its strengths, and more importantly it's limitations. If you're looking for the all around 'best' camera, you made the wrong choice. If you take few photos or primarily shoot in low light situations ...you made the right choice. Hard to bitch about HTC's UI. It's excellent! While I'm one of the very few that actually 'like' T/W, you're probably best taking my GUI opinion with a grain of salt but other than your issue with contrast (valid complaint, IMHO as a 'visual geek'), I know I took a long time to get to the point, so ....
    Tl/Dr, that's silly. Don't even THINK about replacing your M8 today. Unless you're A) still within the 'return period' and/or
    B) not happy with the phone for some reason (why did you buy it? It's a bad ass hand set! Don't 'chase' specs, the genesis of technologies or 'absolutes' when it comes to a 'smartphone' --- in the end, you'll be underwhelmed, disappointed and you'll lose money EVERY time you do something so 'silly')
    Ultimately today's smartphone market is awesome. With over two million apps between iOS and Android, Windows making their own moves (& certainly, while late to the game with the SP3, ousting of Balmer, and it's iOS MS Office suite 1.0 release...Win 365 subscription family package @ $10/month for five tabs and five computers and five users, EACH with a TB of their own storage accessible via OneNote from anywhere, anytime, GRAND SLAM! Go to Best Buy and save $40 for the year, about every other week they've got the bundle on sale...& it's completely 'cross platform' with an Android full release imminent, I'm an OSx user primarily but also own a pair of Windows machines. As well, being a Note 3 user, I'm very excited to see where MS is going...)
    Yep. LG took a leap at this resolution. But they're one of very few display manufacturers. Most OEMs use Samsung, LG, or Sharp displays. Makes sense to me at least they'd be the ones aiming for the 'ultimate' resolution for human visual acuity. Is THIS the version to buy? If you're an original LG 'g1' owner, maybe switching out of an S3 contract, or not exactly crazy any longer with iOS and considering upgrading a 4s or iPhone 5 that you purchased almost two years ago, ABSOLUTELY! If you're an owner of a 2013 flagship, either iOS or Android, from those measurements (other than PPI/display technology and size preference), it's very obvious 2014 to this point had been 'iterative' with refinements to UIs (Samsung has definitely worked in TouchWiz instead of adding 'more' they've refined existing features, ala S-Pen/features and it's recognition when the pen is out ... Better overall ability to 'control' the OEM's pre-installed software (carrier bloat, different story but ubiquitous regardless of Android choice) -- point being they've succeeded with UI improvements and that's a BIG end user 'upgrade' IMHO. While the UI needs a refresh, a launcher of choice is a simple and cheap addition. The A7 hit Qualcomm like a ton of bricks. While indeed Apple is still a ways away from utilizing its full power due to RAM, the A8 instruction set, new memory management and the 7.1 'update' were HUGE. Obviously still keeping computational pace with significantly 'faster' clock speeds on half the cores with half the memory. As well - the dated IT graphics solution Apple used on the A7 will certainly be updated on the A8. Qualcomm will be ready with 64bit SOCs next year, or late this year. I guess my question would be 'why' HTC, Samsung, and LG aren't using the faster 805/420s today? Sorry to ramble but ultimately, I'm extremely excited to see this type of evolutionary improvement from one of today's largest display manufacturers. 
I'm glad to see the transition from '3D' to HiDPI from the 2011/12/13 CES shows to the '4K' and HiDPI displays shown off this past January in Vegas, and actually 'affordable' 1.0 releases @ Best Buy half way through '14. IMHO, display resolution rules the roost. With resolution, comes ALL the primary factors that make ANY resolution 'good' vs 'great". Attributes that come FAR before resolution updates, you're correct ...kind of. Brightness while important, especially in a cell phone is pretty important. Though as you can plainly see, it's still 'good' and easily visible outdoors. Contrast = Huge. Color and gamut/calibration as well as grey scale, gamma, display technology used, viewing angles, saturation and 'response time'. If you're gaming, you're looking for fast refresh rates, you'd hate my new Eizo. Soooo many factors that go into a good display and they're each (OEMs) beginning to 'get it'. Pre sale calibration. Options for, albeit limited post purchase calibration (TouchWiz and the display adaption option). Pushing barriers is good. My dream is to have 4k displays and a delivery system for the content in place by 2020. HiDPI displays are ubiquitous on computing displays. (Hard to explain my passion and 're' invigoration for using my computer, my laptop ...daily, since purchasing my first rMBP in 2012. We've now got seven, five for the business, two personal 15" 2013 rockets! PCIe TB storage that is faster than anything I've used in my life, a display that blows my mind EVERY time I turn it 'on' ...its I/O options, TB2 has become a GodSend for us, as have the new docks and the ability to literally turn it into a 'desktop' workstation (- the Xeon procs and enterprise RAM, you'd never know. It's truly THAT fast!)). I could go on and on but I guess I'm blown away by the 'reasons' some folks come up with to not replace their five month old bad ass pocket computer. 
Nor do I understand the backlash from our 'geek' community against LG for pushing the barriers AND using scientific reasoning to explain human visual acuity, how much density is 'truly' necessary to be indistinguishable from the sharpest photo, the finest print (especially for many traditional Asian writing/text/characters and alphabets)... Or and while I so hate the cliche itself; "Like looking through a window" ...IOW, his explanation was pertaining to the ability to 'see' in a display the 'same' visually your eye would see in a 'real world' scenario.
    Again, all my opinion. Yours is different, I respect that. But I don't respect spec chasing, bullshit reasons to upgrade every three months to chase specs and bragging rights. That's dumb. It's not what you make, it's what you save. From your comment, I'm assuming your young (ya know what they say about assumption though). If I'm wrong, I'm sorry. If I'm correct, the difference in you saving that extra three to six hundred a year you're dropping on cell phones when you're 30, 40, 50...65 and ready to retire, is HUGE! Say you're 25. Without interest, that's $12-$24,000. With that cash each year and another $150/month stashed away starting at 20 instead of thirty in a fund that nets you an 8% average yearly increase (over 40 years) is 3.9 million @20 vs 2.2 million at 30 years old. At 40, you'll be lucky to hit a million. Figure out today what tomorrows 'Apple' stock will be and throw all that out the window. Buy all ya can and sell in a decade. You and the next six generations of your family won't have to work ;)
