Camera

The G2 joins an exclusive group of smartphones that include optical image stabilization (OIS): the Lumia 920, 925, 928, 1020, and HTC One. OIS works by physically moving the optical stack around inside the camera module to counteract hand shake and movement during image or video capture, using orientation data from a nearby gyroscope. The goal is to eliminate shake during video capture and also to enable longer exposures in low light scenarios.
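
To make that mechanism a bit more concrete, here's a rough sketch of what a gyro-driven OIS control loop does. The sample rate, lens travel, and loop structure here are illustrative assumptions, not LG's implementation.

```python
# Hypothetical sketch of a gyro-driven OIS control loop (sample rate, travel,
# and control structure are illustrative assumptions, not LG's values).
import math

GYRO_RATE_HZ = 1000           # assumed gyro sample rate
FOCAL_LENGTH_MM = 4.0         # G2 lens focal length (from the review)
MAX_SHIFT_MM = 0.15           # assumed mechanical travel of the lens barrel

def lens_shift_for_tilt(tilt_rad: float) -> float:
    """Lens shift (mm) that keeps the image stationary for a small camera tilt.

    For small angles, a tilt of theta moves the image by roughly
    focal_length * tan(theta); shifting the optical stack by the same amount
    in the opposite direction cancels it, up to the mechanical limit.
    """
    shift = FOCAL_LENGTH_MM * math.tan(tilt_rad)
    return max(-MAX_SHIFT_MM, min(MAX_SHIFT_MM, shift))

def stabilize(gyro_samples_rad_s):
    """Integrate gyro angular rate into a tilt estimate and emit shift commands."""
    dt = 1.0 / GYRO_RATE_HZ
    tilt = 0.0
    commands = []
    for omega in gyro_samples_rad_s:
        tilt += omega * dt          # simple integration of angular rate
        commands.append(lens_shift_for_tilt(tilt))
    return commands

# Example: one second of a 5 Hz, 0.3-degree hand tremor (angular rate samples)
tremor = [math.radians(0.3) * 2 * math.pi * 5 * math.cos(2 * math.pi * 5 * t / GYRO_RATE_HZ)
          for t in range(GYRO_RATE_HZ)]
print(max(abs(c) for c in stabilize(tremor)))  # peak lens shift in mm (~0.02 mm)
```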

The G2 includes a 13 MP Sony IMX135 Exmor RS CMOS sensor with a 1/3.06-inch optical format and 1.12µm pixels. We've seen this CMOS in a lot of other devices; what's different is the optical system (in this case F/2.4 with a 4.0mm focal length, around 29mm in 35mm equivalent terms) and of course the new module which includes OIS.
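
The roughly 29mm equivalent figure falls straight out of the sensor geometry; here's a quick sanity check, assuming the IMX135's nominal 4208 x 3120 active array for the 13 MP output.

```python
# Sanity check of the ~29mm equivalent focal length from sensor geometry.
import math

pixel_pitch_mm = 1.12e-3              # 1.12 micron pixels (from the review)
width_px, height_px = 4208, 3120      # assumed 13 MP active array for the IMX135

sensor_w = width_px * pixel_pitch_mm          # ~4.71 mm
sensor_h = height_px * pixel_pitch_mm         # ~3.49 mm
sensor_diag = math.hypot(sensor_w, sensor_h)  # ~5.87 mm, i.e. a 1/3.06" type sensor

full_frame_diag = math.hypot(36.0, 24.0)      # 43.27 mm
crop_factor = full_frame_diag / sensor_diag   # ~7.4x

focal_mm = 4.0
print(round(focal_mm * crop_factor, 1))       # ~29.5 mm equivalent
```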

I'm still working on a big analysis of the G2's camera performance, but so far I'm very impressed with the resolution this affords and the G2's ability to still produce decent results indoors where light isn't so good and outside at night. I've only been able to use the G2 as a daily driver and take pictures with it for a short time, but including OIS is definitely a step in the right direction if the industry wants to adopt 1.1µm-class pixel pitches.

I had a chance to take photos with the G2 at our camera bench locations (of which locations 3, 4, 5, and 7 remain), inside the lightbox with the lights on and off, and of our test patterns. I also took a shot in low light, replicating the low light lightbox tests I've done before.

The G2 seems to have a low light mode it kicks into automatically, regardless of whether you're in normal mode or night mode from the scenes menu; when it's in this mode it doesn't record shutter time or ISO in EXIF, just like the Galaxy S4, so I can only assume that LG is also combining multiple exposures. This makes it a little hard to figure out just how far you can push OIS on the G2, but the result does look very good.

LG G2: ?, ISO ?
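
Since LG doesn't document what its low light mode is actually doing, here's only a generic sketch of frame averaging, the simplest form of multi-exposure combination, showing why stacking N short exposures cuts random sensor noise by roughly the square root of N. This is illustrative, not LG's pipeline.

```python
# Generic frame-stacking sketch (not LG's pipeline): averaging N aligned short
# exposures reduces random noise by roughly sqrt(N).
import random

def noisy_frame(scene, noise_sigma=8.0):
    """One short exposure: the true scene plus per-pixel Gaussian noise."""
    return [p + random.gauss(0.0, noise_sigma) for p in scene]

def stack(frames):
    """Average aligned frames pixel-by-pixel (alignment itself is the hard part)."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

scene = [128.0] * 10000                       # flat gray patch on an 8-bit scale
single = noisy_frame(scene)
stacked = stack([noisy_frame(scene) for _ in range(4)])

def rms_error(frame):
    return (sum((a - b) ** 2 for a, b in zip(frame, scene)) / len(scene)) ** 0.5

print(rms_error(single), rms_error(stacked))  # stacked error ~half of single (sqrt(4))
```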
     

In addition, the G2 can record 1080p60 video, something I've been waiting to see a mobile device do for a long time. The video encode block onboard the 8974 can do up to 120fps 1080p video or 30fps 4K video (an analogous load, since a 4K frame is just four 1080p frames); LG chose to enable the 1080p60 route since the sensor can handle it. This 60 FPS video is encoded at 30 Mbps H.264 High Profile instead of the 20 Mbps used for 30 FPS.
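
The pixel-rate equivalence and the per-frame bit budget are easy arithmetic to check:

```python
# Encode load: 1080p120 and 4K30 push the same number of pixels per second.
def pixel_rate(w, h, fps):
    return w * h * fps

print(pixel_rate(1920, 1080, 120))   # 248,832,000 px/s
print(pixel_rate(3840, 2160, 30))    # 248,832,000 px/s -- identical throughput

# Per-frame bit budget at the two bitrates the review quotes
print(30e6 / 60)   # 500,000 bits per frame at 1080p60
print(20e6 / 30)   # ~666,667 bits per frame at 1080p30
```

Worth noting from the numbers: each 60 FPS frame actually gets a slightly smaller bit budget than a 30 FPS frame, even though the overall bitrate is 50 percent higher.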

Because YouTube can't play back 60p content quite yet (nor can any other online video host I'm aware of; the sample above is at 30 FPS), you'll have to download the two video samples and look at them side by side to gauge the difference. The change in temporal resolution is dramatic; I've been spoiled by 1080p60 from the GoPro Hero 3 Black for some time, and getting this from a smartphone is a killer feature for the G2.

OIS on the G2 is noticeable, but it isn't as dramatic as it is on some other smartphone platforms. I've been trying to understand the differences in maximum deviation / accommodation angle and cutoff frequencies for the various OIS systems that LG, HTC, and Nokia have devised, and there's a fair amount of difference in performance. 
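
To illustrate why both parameters matter, here's a toy model with made-up numbers; none of these correspond to measured values for the G2, the One, or the Lumias. A shake component is only removed if the loop can track its frequency and the required correction stays inside the mechanical range.

```python
# Toy OIS model (made-up parameters, not measurements of any real phone):
# attenuation follows a first-order roll-off at the loop bandwidth, and the
# corrected amplitude is clipped at the mechanical correction limit.
import math

def residual_shake(amp_deg, freq_hz, max_correction_deg, loop_bandwidth_hz):
    tracking = 1.0 / math.sqrt(1.0 + (freq_hz / loop_bandwidth_hz) ** 2)
    corrected = min(amp_deg * tracking, max_correction_deg)
    return amp_deg - corrected          # shake left over after correction

# Two hypothetical systems: wide range / low bandwidth vs narrow range / high bandwidth
for freq in (2, 5, 10, 20):             # Hz, roughly the hand-shake band
    a = residual_shake(0.5, freq, max_correction_deg=1.0, loop_bandwidth_hz=8)
    b = residual_shake(0.5, freq, max_correction_deg=0.3, loop_bandwidth_hz=20)
    print(freq, round(a, 3), round(b, 3))
```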

To help me gauge some of the differences, I went out with my dual device mount and shot video on a few OIS platforms and current devices with EIS for comparison purposes. Because I'm simultaneously working on the Lumia 1020 review, I used that as the reference point. I walked a small circuit around the place where I normally take bench photos and recorded video, and shook the devices at the end of the walk each time.

The video really shows the difference in how much vibration each system can damp out. What's crazy to me is how well the Lumia 925 does compared to everything else; the original goal was to compare the different OIS systems Nokia was using, but we can also gauge OIS performance across the spectrum here. The G2 can't quite damp out all the big jerky movements, but it absolutely helps when trying to do something like record video while standing still; walking around continues to be a very challenging test case.

Comments

  • Krysto - Sunday, September 8, 2013

    Cortex A9 was great efficiency wise, and better perf/Watt than what Qualcomm had available at the time (S3 Scorpion), but Nvidia still blew it with Tegra 3. So no, that's not the only reason. Nvidia could do certain things to increase efficiency and performance/Watt, like moving to a smaller node or keeping GPU clock speeds low but adding more GPU cores, and so on. But they aren't doing any of that.
  • UpSpin - Sunday, September 8, 2013

    You mean they could and should have released more iterations of Tegra 3, adding more and more GPU cores to improve at least the graphics performance, rather than waiting for A15 and Tegra 4.

    I never designed a SoC myself :-D so I don't know how hard it is, but I did lots of PCBs, which is practically the same except on a much larger scale :-D If you add some parts you have to increase the die size, thus move other parts on the die around, reroute the stuff, etc. So it's still a lot of work. The main bottleneck of Tegra 3 is memory bandwidth, so adding more GPU cores without addressing the memory bandwidth most probably would not have made any sense.

    They probably expected to ship Tegra 4 SoCs sooner, so they saw no need to release a further improved Tegra 3 and focused on Tegra 4 instead.

    And if you compare Tegra 4 to Tegra 3, then they did exactly what you wanted, moving to a smaller node, increasing the number of GPU cores, moving to A15 while maintaining the power efficient companion core, increasing bandwidth, ...
  • ESC2000 - Sunday, September 8, 2013

    I wonder whether it is more expensive to pay to license ARM's A9, A15, etc. (I thought they were doing an A12 as well?) or to develop it yourself like Qualcomm does. Obviously QCOM isn't starting from scratch every time, but R&D adds up fast.

    This isn't a perfect analogy at all but it makes me think of the difference between being a pharmaceutical company that develops your own products and one that makes generic versions of products someone else has already developed once the patent expires. Of course now in the US many companies that technically make their own products from scratch really just take a compound already invented and tweak it a little bit (isolate the one useful isomer, make the chiral version, etc), knowing that it is likely their modified version will be safe and effective just as the existing drug hopefully is. They still get their patent, which they can extend through various manipulations like testing in new populations right before the patent expires, but the R&D costs are much lower. Consumers therefore get many similar versions of drugs that rely on one mechanism of action (see all the SSRIs) and few other choices if that mechanism does not work for them. Not sure how I got off into that but it is something I care about and now maybe some Anandtech readers will know haha.
  • krumme - Sunday, September 8, 2013

    Great story mate :), I like it.
  • balraj - Saturday, September 7, 2013

    My first comment on Anandtech
    The review was cool... I'm impressed by the G2 battery life and camera...
    Wish Anandtech could have a UI section.
    Also, can you people confirm if LG will support the G2 with at least 2 years of software updates?
    That's gonna be the deciding factor in choosing between the G2 or Nexus 5 for most of us!
  • Impulses - Saturday, September 7, 2013

    Absolutely nobody can guarantee that; even if an LG exec came out and said so, there's no guarantee they wouldn't change their mind or that a carrier wouldn't delay/block an update... If updates are that important to you, then get a Nexus, end of story.
  • adityasingh - Saturday, September 7, 2013

    @Brian could you verify whether the LG G2 uses Snapdragon 800 MSM8974 or MSM8974AB?

    The "AB" version clocks the CPU at 2.3GHz, while the standard version tops out at 2.2GHz. However, you noted in your review that the GPU is clocked at 450MHz. If I recall correctly, the "AB" version runs the GPU at 550MHz, while the standard is 450MHz.

    So in this case the CPU points to one bin, but the GPU points to another. Can you please confirm?
    Nice "Mini Review" otherwise. I'm looking forward to the full review soon. Please include a throttling analysis like the one from the Moto X. It would be nice to see how long the clocks stay at 2.3GHz :)
  • Krysto - Sunday, September 8, 2013

    He did mention it's the first one, not the latter.
  • neoraiden - Saturday, September 7, 2013

    Brian, could you comment on how the Lumia 1020 compares to a cheap ($150-200) camera? I was impressed by the difference in colour in the video comparison, even if the OIS wasn't the best.

    I currently have a Note 2, but the camera quality in low light conditions is just too bad, and the inability to move apps to my memory card has been annoying. I have an upgrade coming up in January I think, but I might try to change phones before then. I was wondering whether you could comment on whether the Lumia 1020 is worth the jump from Android due to picture quality, or will an HTC One or Nexus 5 (if similar to the G2) suffice? I was considering the Note 3 as I like everything else, but it still doesn't have OIS. Or would the Note 3 plus a cheap compact be better, even given the inconvenience of having to bring a camera?

    The main day to day use of my phone is news apps, Internet, and email, some of it threaded (which I hear is a problem for Windows Phone).
  • abrahavt - Sunday, September 8, 2013

    I would wait to see what camera the Nexus 5 will have. An alternative is to get the Sony QX100, and you would get great pictures irrespective of the phone.
