Camera Architecture

For those who are familiar with the LG G2’s camera, much of this will seem like old news. After all, OIS and a 13 megapixel camera have both been done before, but LG did focus on adding new elements to the camera system that are well worth investigating. The key new features this time around are laser autofocus and OIS+. Nothing significant appears to have changed between the G2 and G3 in terms of the optical stack. As far as I can tell, the LG G3 does appear to use a new front-facing camera, as the Sony IMX208 is a sensor I’ve never encountered before. There’s not a lot of public information on this sensor, but we do know that it has a 1.4 micron pixel size and a 2.1 megapixel resolution. I’ve put the details of the G3’s camera system in the table below. Outside of the camera itself, LG has also added a dual-tone LED flash much like the systems on the iPhone 5s and One (M8), which improves color rendering when the flash is on. This means that the flash can complement the lighting of a scene rather than fighting it.

Camera Architecture: LG G3
Front Camera: 2.1MP
Front Camera - Sensor: Sony IMX208 (1.4µm pixels, 1/5.8" format)
Front Camera - Focal Length: 1.8mm
Front Camera - Max Aperture: F/2.0
Rear Camera: 13MP
Rear Camera - Sensor: Sony IMX135 (1.12µm pixels, 1/3.06" format)
Rear Camera - Focal Length: 3.97mm (29mm effective)
Rear Camera - Max Aperture: F/2.4
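
As a quick sanity check on the 29mm effective focal length listed above, the sketch below works backward from the sensor format. The ~5.87mm diagonal assumed for a 1/3.06" type sensor is my approximation, not a figure published by LG or Sony.

```python
# Rough sanity check of the 29mm-equivalent focal length listed above.
# Assumption: a 1/3.06" type sensor has a diagonal of roughly 5.87mm;
# this is an approximation, not an official LG/Sony figure.
import math

full_frame_diagonal = math.hypot(36.0, 24.0)  # ~43.27mm for a 35mm frame
sensor_diagonal = 5.87                        # mm, approximate for a 1/3.06" type sensor
focal_length = 3.97                           # mm, from the table above

crop_factor = full_frame_diagonal / sensor_diagonal
equivalent = focal_length * crop_factor
print(f"crop factor ~{crop_factor:.2f}, equivalent focal length ~{equivalent:.0f}mm")
# -> roughly 29mm, matching the listed effective focal length
```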

The first “new” feature isn’t actually particularly new, although we’ve learned more about it since it was first announced. The LG G Pro 2 introduced OIS+, which was described as OIS combined with EIS to improve stabilization. The LG G3 uses the same OIS+ system, and we now know that the plus at the end indicates that the camera is also stabilized along the z-axis. In practice the effect is rather subtle, although it’s clearly there. Overall, the image stabilization locks onto the target better than before. LG leverages this to achieve a maximum integration time of 1/9 of a second. In low light the camera will also push ISO (sensor gain) as high as 2900.
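
To put those two numbers in context, here is a minimal sketch of the exposure trade-off that better stabilization enables, assuming the usual reciprocity between integration time and gain for a fixed aperture and scene. Only the 1/9 second shutter ceiling and the ISO 2900 ceiling come from the article; the other figures are illustrative, not LG’s actual tuning.

```python
# Illustrative exposure trade-off: for a fixed aperture and scene, total
# exposure scales with (integration time x ISO). If OIS lets the camera
# hold the shutter open longer without blur, less gain (and noise) is needed.
# Only the 1/9s and ISO 2900 ceilings come from the article; the 1/30s
# no-OIS limit and ISO 870 figure are assumptions for illustration.

def iso_needed(reference_time, reference_iso, integration_time):
    """ISO required to match the exposure of (reference_time, reference_iso)."""
    return reference_iso * (reference_time / integration_time)

# A phone without OIS might be limited to ~1/30s before handshake blurs the frame;
# the G3's OIS+ allows integration times as long as 1/9s.
print(iso_needed(1/9, 870, 1/30))   # ~2900: the same exposure at 1/30s needs ~3.3x the gain
print(iso_needed(1/30, 2900, 1/9))  # ~870: with a 1/9s shutter, ISO can drop to about a third
```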

The actual new feature is the laser autofocus. While I talked about it back in the launch article, I’ve gained a more nuanced understanding of the system since then. The laser appears red to my eyes, but a camera with a poor IR filter sees the laser as purple, which suggests a spread of spectrum rather than a single wavelength. This system is likely a much more refined version of a proximity sensor. While it will take closer testing to characterize focus latency, subjective testing shows that the G3 is very fast to focus on low contrast targets, and is much more consistent in its low light focus performance than contrast-detection based systems. I haven’t found any evidence of this subsystem in the kernel, so I suspect that it’s integrated into the camera module rather than exposed as a discrete device.

While some initial investigation suggested that the G3 might actually use the IMX214 sensor, after some more digging it’s clear that the phone is using the Sony IMX135. I suspect that LG may have considered the IMX214 at some point but changed the spec without changing the software. Despite this, it's worth going over what the advantages of the IMX214 are. These advantages effectively come down to reduced z-height requirements, better sensitivity, less color crosstalk, and true video HDR. Reduced z-height requirements are a function of better light collection even at off-center incidence angles. Better sensitivity was also achieved by reducing the distance between the microlenses and the photodiodes on the sensor. This same change reduces color crosstalk, which means that red pixels detect less stray blue or green light, and so on. Finally, the upgraded sensor can take two different exposures simultaneously for video HDR integration. Sony states that this new sensor can do this HDR combination at up to 13 megapixels at 30 fps, 2160p30, or 1080p60. The IMX135 is still capable of doing the same at 1080p30, so I suspect that there wasn't enough improvement from the IMX135 to the IMX214 to justify a more expensive sensor.
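
To make the dual-exposure HDR idea concrete, here is a minimal sketch of fusing a long and a short exposure of the same scene into one frame. The weighting scheme and the 4:1 exposure ratio are generic illustrations of the technique, not Sony's actual on-sensor pipeline.

```python
# Minimal sketch of dual-exposure HDR fusion: a long exposure captures shadow
# detail but clips highlights, while a short exposure preserves highlights.
# The weighting below is a generic illustration, not Sony's on-sensor pipeline.
import numpy as np

def fuse_hdr(long_exp, short_exp, exposure_ratio, clip=0.95):
    """Blend two exposures (float arrays scaled 0..1) into one linear HDR frame."""
    # Trust the long exposure except where it approaches clipping.
    weight_long = np.clip((clip - long_exp) / clip, 0.0, 1.0)
    # Bring the short exposure onto the same brightness scale before blending.
    short_scaled = short_exp * exposure_ratio
    return weight_long * long_exp + (1.0 - weight_long) * short_scaled

# Example with an assumed 4:1 exposure ratio between the two interleaved captures.
long_exp = np.array([0.10, 0.50, 1.00])    # shadows, midtones, blown-out highlight
short_exp = np.array([0.025, 0.125, 0.60]) # the same scene at 1/4 the exposure
print(fuse_hdr(long_exp, short_exp, exposure_ratio=4.0))
# -> [0.1  0.5  2.4]; the clipped highlight is recovered from the short exposure
```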

To see how well this new system really performs, we have to turn to our array of camera tests. This will also serve as a good benchmark for how LG has improved image processing in general, as the OEM has a significant impact on final image quality. Post-processing done poorly, such as excessive noise reduction, artificial sharpening, or a failure to correct for various types of aberration, can severely degrade the final image.
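
As one example of how post-processing can go wrong, the sketch below applies unsharp masking, a common form of artificial sharpening, to a soft edge. The 1D pixel row and box blur are deliberately simplified stand-ins for a real pipeline; with a heavy-handed amount, the overshoot around the edge is exactly the halo artifact that hurts final image quality.

```python
# Minimal sketch of unsharp masking, the kind of artificial sharpening that
# produces halos when overdone. A 1D row of pixels and a box blur keep it
# simple; real camera pipelines are far more sophisticated than this.
import numpy as np

def box_blur(signal, radius=1):
    """Simple moving-average blur (zero-padded at the ends)."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(signal, kernel, mode="same")

def unsharp_mask(signal, amount):
    """Sharpen by adding back the difference between the signal and its blur."""
    return signal + amount * (signal - box_blur(signal))

edge = np.array([0.2, 0.2, 0.2, 0.2, 0.8, 0.8, 0.8, 0.8])  # a soft edge in an image row
print(unsharp_mask(edge, amount=0.5))  # mild sharpening: small over/undershoot at the edge
print(unsharp_mask(edge, amount=3.0))  # overdone: large overshoot, i.e. visible halos
# (the values at the very ends of each output are artifacts of the zero padding)
```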

Comments

  • ZeDestructor - Friday, July 4, 2014 - link

    I can't see individual pixels on my 24" 1920x1200 screen (~97ppi), but I can EASILY tell the difference between 1920x1080 on a 5.0" phone compared to 1280x720 on a 4.7" phone at 30cm view distance.

    Hell, when the iPhone 4 came out with 326ppi, I could see the grid at around 15cm view distance, probably more - some of us have better eyes than others.

    Not seeing the pixel grid doesn't mean it's past ocular limits.
  • SleepyFE - Friday, July 4, 2014 - link

    If it looks like a perfect circle it can't look any more like a perfect circle. Can it?
  • ZeDestructor - Saturday, July 5, 2014 - link

    The eye is very good at spotting aliasing. It doesn't jump out at you, but you get the inherent feeling that it's just not right, and with someone like me, that breaks down to peering closer, and closer, and closer, then suddenly microscope D:
  • jeffkibuule - Friday, July 4, 2014 - link

    We must stop this silliness that "not seeing pixels" is the only goal of a display when there are several other metrics at play. You'd still be able to tell the difference between aliased and non-aliased fonts at 12 inches because our brain does a lot of "massaging" of the raw data our eyes capture before we interpret it in our visual cortex. Or more simply put, "the eye is not the be-all end-all of human vision".
  • SleepyFE - Friday, July 4, 2014 - link

    I didn't say not to alias fonts. That has nothing to do with resolution, PPI or PPD. The point is that when you can't tell the difference anymore, you can't tell the difference anymore. Aliasing and proper color reproduction and so on are different problems.
  • mkozakewich - Saturday, July 5, 2014 - link

    Just because you can't see them doesn't mean other people can't. I could see the tiny spaces *between* pixels on my desktop monitor, and hairlines were still really thick. On my 1080p 10.5" screen right now, I can still make out two parallel lines from two feet away, and can see the jaggedness of an aliased 1px line drawn diagonally. At least the white background of this page doesn't look like a big mosquito net at this density.

    In short, we can see a *lot* of detail, and I know it's not enough for me as certainly as you know it's enough for you.

    We really shouldn't need any kind of antialiasing. Until our screens are of high enough resolution, though, they make good stopgaps.
  • phoenix_rizzen - Friday, July 11, 2014 - link

    And PPD stands for ... ? And it compares to PPI how ... ?
  • kaelynthedove78 - Friday, July 4, 2014 - link

    "The laser appears red to my eyes, but a camera with a poor IR filter sees the laser as purple, which suggests a spread of spectrum rather than a single wavelength."

    Lasers are single wavelength sources, so what are they actually using? Does the phone come with the mandatory laser safety class certificate/sticker that lists the power and wavelength?
  • soccerballtux - Friday, July 4, 2014 - link

    did you get the placemat you used for the photograph background at Target? ;)
