With the announcement of the LG G Flex 2 at LG's press conference, we finally saw the launch of a device with Qualcomm's Snapdragon 810 SoC. While the SoC is one notable upgrade over the previous G Flex, the improvements extend to almost every area; the most immediately obvious is the new OLED display. The rest can be seen in the spec sheet below.

  LG G Flex 2
SoC           Snapdragon 810 (MSM8994): 4x Cortex-A57 @ 2.0 GHz + 4x Cortex-A53 @ 1.5 GHz
RAM/NAND      2/3 GB LPDDR4, 16/32 GB NAND + microSD
Display       5.5” 1080p LG P-OLED
Network       2G / 3G / 4G LTE (Qualcomm UE Category 9 LTE)
Dimensions    149.1 x 75.3 x 7.1-9.4 mm, 152 grams
Camera        13MP rear camera, 1.12 µm pixels, 1/3.06" CMOS, F/2.4; 2.1MP F/2.0 FFC
Battery       3000 mAh (11.4 Wh)
OS            Android 5.0 with LG UI
Connectivity  802.11a/b/g/n/ac + BT 4.1, USB 2.0, GPS/GNSS, SlimPort, NFC
SIM Size      MicroSIM

As one can see, while a lot of elements are shared with the LG G3, there are a number of notable improvements that differentiate the G Flex 2 from the G3. The size of the phone has also decreased, as LG says customers found the sheer size of the previous G Flex off-putting.

LG has also changed the design of the G Flex 2 to be more similar to the LG G3's brushed metal finish, although the self-healing polymer dictated a glossy coating with a brushed design beneath it. The self-healing polymer itself has been notably improved, healing in seconds rather than the minutes the original LG G Flex required. LG has also introduced Dura-Guard glass, which is said to improve drop resistance compared to Corning Gorilla Glass.

In practice, this combines to make the LG G Flex 2 a rather interesting phone. Unfortunately, all of the phones available for demonstration were running non-final software, so we couldn't properly benchmark the device, and there seemed to be more lag in the UI than on the LG G3 or Nexus 5. In addition, the demo units were in poor condition for benchmarking, as maximum brightness was constantly being reduced due to thermal throttling.

Despite these issues, the G Flex 2 was still an interesting device to try. While I haven't used the original LG G Flex extensively, I noticed that the 1080p display on the G Flex 2 behaved differently from the 1080p display on the Galaxy S5. Although the G Flex 2 doesn't have the odd ghosting effects present on the Galaxy S5, it does show noticeable mura, a sort of texture to the display. Given the demo conditions, it's difficult to discern whether the display uses an RGB stripe or some form of PenTile layout.

Outside of the display, this was also our first experience with LG's UI layered on top of Android 5.0. Unfortunately, it seems quite similar to the G3's UI on Android 4.4, although there are noticeable changes in areas like the notification drawer and multitasking menu. We may be left waiting until the LG G4 for a redesign of the UI to fit the new design guidelines.

At any rate, the camera remains identical to the LG G3's, with the same Sony IMX135 sensor, optics, laser AF, and OIS+. The camera UI is largely unchanged as well, and there remains a noticeable amount of shutter lag, similar to the LG G3. On the bright side, the G Flex 2's camera is quick to focus, and the OIS is incredibly stable compared to most solutions I've tried. Overall, despite some issues, the LG G Flex 2 seems to be a promising device.

24 Comments

  • ws3 - Friday, January 9, 2015 - link

    Since these are ARM cores, not custom cores, I'm curious what the target frequency of the design was. In the past, ARM cores have been designed for much lower frequencies than phone makers have wanted to sell, so they either overclocked the cores or used custom cores designed for higher frequencies.

    Maybe the problem is just lack of headroom here, so that the old overclocking tricks don't work. If that's the case, then a custom core is the only solution, and the only company that has a custom core is Apple.
    Reply
  • metayoshi - Friday, January 9, 2015 - link

    I don't think you understand what you are saying. First off, you talk about Apple having the only custom core design, but Qualcomm has had Krait for a while, along with Nvidia's Denver cores in the Tegra K1 that went into the Nexus 9. Secondly, you talk about using custom cores designed for higher frequencies, but Apple's main benefit is their really high IPC, so their clock speeds are significantly lower than the typical ARM CPU's. Thirdly, name a phone manufacturer whose SoC clock speeds run higher than they are spec'd at, otherwise known as overclocking. I can't think of any, at least any major ones, but if you can prove your own statement, maybe I would believe you. Reply
  • ws3 - Saturday, January 10, 2015 - link

    I was only thinking about ARMv8 cores, therefore Krait was not a consideration.
    Yes I did forget about the Nvidia Denver core, which is an ARMv8 core. So, Apple and Nvidia have ARMv8 custom cores. However everyone else seems to be going with A57 for the time being, and it has been reported that the A57 is having trouble running at the frequencies phone manufacturers desire while remaining within their thermal envelope.

    That is what led to my statements about overclocking. That comes from this AnandTech article: http://www.anandtech.com/show/7335/the-iphone-5s-r... which states:

    "Then there’s the frequency discussion. Brian and I have long been hinting at the sort of ridiculous frequency/voltage combinations mobile SoC vendors have been shipping at for nothing more than marketing purposes. I remember ARM telling me the ideal target for a Cortex A15 core in a smartphone was 1.2GHz. Samsung’s Exynos 5410 stuck four Cortex A15s in a phone with a max clock of 1.6GHz. The 5420 increases that to 1.7GHz. The problem with frequency scaling alone is that it typically comes at the price of higher voltage. There’s a quadratic relationship between voltage and power consumption, so it’s quite possibly one of the worst ways to get more performance. Brian even tweeted an image showing the frequency/voltage curve for a high-end mobile SoC. Note the huge increase in voltage required to deliver what amounts to another 100MHz in frequency."
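    The quadratic relationship referenced in that quote is the standard dynamic CMOS power model, P ≈ C·V²·f. The sketch below illustrates why the last 100 MHz is so expensive; the effective capacitance and the frequency/voltage pairs are made-up illustrative numbers, not measurements of any real SoC.

    ```python
    # Illustrative sketch of dynamic switching power: P = C_eff * V^2 * f.
    # C_EFF and the operating points below are hypothetical example values,
    # not measurements from any shipping SoC.

    def dynamic_power(c_eff, voltage, freq_hz):
        """Dynamic switching power in watts: P = C_eff * V^2 * f."""
        return c_eff * voltage ** 2 * freq_hz

    C_EFF = 1.0e-9  # effective switched capacitance in farads (illustrative)

    # Hypothetical (GHz, volts) operating points; voltage rises with frequency.
    points = [(1.2, 0.90), (1.6, 1.05), (1.7, 1.15)]

    for ghz, volts in points:
        watts = dynamic_power(C_EFF, volts, ghz * 1e9)
        print(f"{ghz:.1f} GHz @ {volts:.2f} V -> {watts:.2f} W")
    ```

    With these example numbers, the step from 1.6 GHz to 1.7 GHz is only about 6% more frequency but costs roughly 27% more power, because the voltage bump is squared.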
    Reply
  • kmmatney - Friday, January 9, 2015 - link

    Is 8 cores necessary? Seems like if they could drop it down to 6 cores and run the CPU at a higher frequency, it would be better overall. Apple is still on 2 cores. It seems like the meaningless quest to pack in as many "cores" as possible is backfiring - at least for use in phones. Reply
  • Solandri - Friday, January 9, 2015 - link

    All the octa-core designs I've seen have been 4 high-power cores coupled with 4 low-power cores, with no real provision to use all 8 cores simultaneously - just one set of 4 at a time. It's really a solution tailored for always-on devices which spend most of their time idling or doing occasional background tasks, but have to do (relatively) very heavy computational work at random intervals. e.g. Smartphones.

    As for 4 cores vs 2, I suspect the optimal number of cores is actually e. That is, 2.718. That turns out to be the optimal number for a lot of things, including word encoding. i.e. If you're trying to create a written language, at one extreme is Chinese which has a different character for every word. At the other end is computer binary which has just 2 characters (0 and 1) which you combine into a long string to create words. What's the number of characters which best balances word complexity and word length to minimize memory consumption? Turns out to be e. So a trinary encoding (0, 1, 2) is most space-efficient.
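    The "optimal base is e" argument above is the classic radix-economy result: the cost of writing a number N in base b can be modeled as b times the number of digits, i.e. roughly b·log_b(N), which is minimized at b = e, so base 3 narrowly beats base 2 among integers. A quick sketch of that calculation (the cost model itself is the assumption here; it says nothing about CPU cores directly):

    ```python
    # Radix economy: model the cost of representing n in a given base as
    # base * (number of digits). b * log_b(n) is minimized at b = e ~ 2.718,
    # so base 3 narrowly beats base 2 for large n.

    def num_digits(n, base):
        """Exact digit count of n (n >= 1) in the given base."""
        digits = 0
        while n:
            n //= base
            digits += 1
        return digits

    def radix_economy(base, n):
        """Cost of representing n: base * digit count."""
        return base * num_digits(n, base)

    N = 10 ** 6
    for base in (2, 3, 4, 10):
        print(f"base {base:2d}: cost {radix_economy(base, N)}")
    ```

    For N = 1,000,000 this gives costs of 40, 39, 40, and 70 for bases 2, 3, 4, and 10 respectively, so ternary wins, but only barely, which is part of why binary hardware won out in practice.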

    In terms of cores, that would mean 3 cores is most efficient. But because computers are designed around powers of 2, it's easier to make 4 cores than 3. My real-world experience seems to back this up too. Dual core systems occasionally lag and stutter, especially if there's one very intensive processing task going on. Quad core systems rarely do.
    Reply
  • phoenix_rizzen - Friday, January 9, 2015 - link

    Basically only the first Exynos SoCs used cluster migration (either the high-power or the low-power clusters could be online; system only saw 4 CPUs).

    Everything since then has used CPU migration (for lack of a better term) where the OS sees all 8 CPUs and can online/offline individual cores as needed.

    With the exception of the nVidia Tegra X1 (of course) which uses their own custom interconnect and power management system and a variant of cluster migration.
    Reply
  • dave1231 - Saturday, January 10, 2015 - link

    The nvidia Tegra 4 works much better on 4 cores than 2. I think it's one of the few devices in which you can set to two or four cores and it's definitely better on 4.

    As for why 8 cores, I have seen a video showing less power consumption using the big.LITTLE combination so, given the vast performance of CPUs these days, who am I to disagree with the hypothesis.
    Reply
  • tuxRoller - Friday, January 9, 2015 - link

    Given these cores are all on their own power plane they can run one or two big cores flat out, and some few number of little cores to handle timer events and the like. Yeah, gts allows for this. My guess is that lg hasn't found the right balance of parameters to feed to the scheduler yet. Reply
  • djc208 - Friday, January 9, 2015 - link

    I really like my Optimus G, but until I see a more consistent software support pattern from LG I'll never own another one. Considering it's basically the same hardware as the Nexus 4 it's pathetic that they essentially never updated the software beyond a few desperately needed bug fixes. There were rumors of a 4.4.4 update but that was about it. Guess I'll have to root it, none of the newest phones really make me want to drop another $$$ on another phone right now. Reply
  • kmmatney - Friday, January 9, 2015 - link

    I have an LG Optimus G Pro, and the hardware is great, but the lack of software support has been frustrating. There were some huge flaws that affected the normal email app that weren't fixed until an update 1 year after I bought the phone. 1 year! I tried a few different ROMs, but they just introduced different bugs, so went back to stock. So yeah - I don't see myself getting another LG phone. Reply