Battery Life

There’s been a lot of speculation about whether dual-core phones would be battery hogs. It turns out that voltage scaling wins: dynamic power scales with the square of voltage (P ∝ CV²f), so two cores running at a lower voltage and clock can match one faster core without a power penalty. The 2X delivers middle-of-the-road 3G and WiFi web browsing battery life numbers, and above-average 3G talk time numbers.
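
To make that concrete, here's a quick back-of-the-envelope sketch. The voltage and frequency pairs below are hypothetical illustration values, not Tegra 2's actual operating points:

```python
# Back-of-the-envelope illustration of why voltage scaling favors two
# slower cores over one fast one. Dynamic CMOS power goes as P ~ C*V^2*f.
# The voltage/frequency pairs are hypothetical, not Tegra 2's real ones.

def relative_power(cores: int, voltage: float, frequency: float) -> float:
    """Dynamic power relative to an arbitrary baseline (capacitance C folded in)."""
    return cores * voltage ** 2 * frequency

# One core at 1.0 GHz needs a (hypothetical) 1.1 V to hit that clock...
single = relative_power(cores=1, voltage=1.1, frequency=1.0)

# ...while two cores at 500 MHz each can get by at a lower 0.9 V.
dual = relative_power(cores=2, voltage=0.9, frequency=0.5)

print(f"single-core: {single:.2f}  dual-core: {dual:.2f}")
print(f"dual-core uses {100 * (1 - dual / single):.0f}% less power "
      f"for the same aggregate throughput (idealized)")
# single-core: 1.21  dual-core: 0.81 -> roughly a third less power
```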

3G Web Browsing Battery Life

WiFi Web Browsing Battery Life

3G Talk Time Battery Life

We’ve also got another new test. Gaming battery life under constant load is a usage scenario we haven’t been able to measure in the past, but now can. Our BaseMark GUI benchmark includes a battery test which runs the composition feature test endlessly, simultaneously taxing the CPU and GPU. It’s an aggressive test that comes close to emulating constant 3D gaming. For this test we leave the WiFi and cellular interfaces enabled, Bluetooth off, and display brightness at 50%.
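
For the curious, here's a minimal sketch of how a rundown test like this can be logged from a host machine over adb. It isn't our actual harness (BaseMark GUI generates the load on the phone itself); it just samples the battery level until the device dies, and it assumes the Android SDK's adb tool is on PATH:

```python
# Sample the battery level of an attached Android device once a minute
# until adb loses the device (i.e. the battery gives out).
import re
import subprocess
import time

def battery_level():
    """Return the battery percentage reported by dumpsys, or None on failure."""
    result = subprocess.run(["adb", "shell", "dumpsys", "battery"],
                            capture_output=True, text=True)
    match = re.search(r"level:\s*(\d+)", result.stdout)
    return int(match.group(1)) if match else None

start = time.time()
while True:
    level = battery_level()
    if level is None:  # adb lost the device, i.e. the battery is dead
        break
    hours = (time.time() - start) / 3600
    print(f"{hours:5.2f} h  battery at {level}%")
    time.sleep(60)  # sample once a minute

print(f"Total runtime: {(time.time() - start) / 3600:.2f} hours")
```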

BaseMark GUI Battery Life Test

I’m a bit disappointed we don’t have more numbers to compare to, but the 2X does come out on top in this category. Anand and I both tested the Galaxy S devices we have kicking around (an Epic 4G and a Fascinate), but both phones continually locked up, crashed, or displayed graphical corruption while running the test. Our constant 3D gaming test looks like a winner for sifting out platform instability.

Conclusion

The 2X is somewhat of a dichotomy. On one side, you've got reasonably attractive hardware, class-leading performance from Tegra 2 that doesn't come at the expense of battery life, and a bunch of notable and useful extras like HDMI mirroring. On the other, you've got some serious experience-killing instability issues (which need to be fixed by launch), a relatively mundane baseband launching at a time when we're on the cusp of 4G, and perhaps most notably a host of even better-specced Tegra 2 based smartphones with more RAM, better screens, and 4G slated to arrive very soon.

It's really frustrating for me to have to make all those qualifications before talking about how much I like the 2X, because the 2X is without a doubt the best Android phone I've used to date. Android is finally fast enough that for a lot of the tasks I care about (especially web browsing) it's appreciably faster than the iPhone 4. At the same time, battery life doesn't take a gigantic hit, and the IPS display is awesome. The software instability issues (which are admittedly pre-launch bugs) are the only thing holding me back from using it 24/7. How the 2X fares when Gingerbread gets ported to it will also make a huge difference, one we're going to cover when that time comes.

The other part of the story is Tegra 2.

Google clearly chose NVIDIA’s Tegra 2 as the Honeycomb platform of choice for a reason. It is a well executed piece of hardware that beat both Qualcomm’s and TI’s dual-core solutions to market. The original Tegra was severely underpowered in the CPU department, a shortcoming NVIDIA promptly fixed with Tegra 2. The pair of Cortex A9s in the AP20H makes it the fastest general-purpose SoC in an Android phone today.

NVIDIA’s GeForce ULV performance also looks pretty good. In GLBenchmark 2.0 NVIDIA manages to hold a 20% performance advantage over the PowerVR SGX 540, our previous king of the hill.

Power efficiency also appears to be competitive in both our GPU and general-use battery life tests. Our initial concern about Tegra 2 battery life was unfounded.

It’s the rest of the Tegra 2 SoC that we’re not completely sure about. Video encode quality on the LG Optimus 2X isn’t very good, and despite NVIDIA’s beefy ISP we’re only able to capture stills at 6 fps with the camera set to 2MP resolution. NVIDIA tells us that the Tegra 2 SoC is fully capable of a faster capture rate for stills and that LG simply chose 2MP as its burst mode resolution. For comparison, other phones with burst modes capture at either 1MP or VGA. Unfortunately for NVIDIA, a significant technological advantage is almost meaningless if no one takes advantage of it. It'll be interesting to see whether the other Tegra 2 phones on the way enable full resolution burst capture.

Then there’s the forthcoming competition. TI’s OMAP 4 will add the missing MPE (NEON) to the Cortex A9s and feed them via a wider memory bus. Qualcomm’s QSD8660 will retain its NEON performance advantages and perhaps make up for its architectural deficits with a higher clock speed, at least initially. Let’s not forget that the QSD8660 will bring a new GPU core to the table as well (Adreno 220).

Tegra 2 is a great first step for NVIDIA; the competition, however, is both experienced and well equipped. It will be months before we can truly crown an overall winner, and then another year before we do this all over again with Qualcomm’s MSM8960 and TI's OMAP 5. How well NVIDIA executes Tegra 3 and 4 will determine how strong a competitor it will be in the SoC space.

Between the performance we’re seeing and the design wins (both announced and rumored), NVIDIA is off to a great start. I will say that I’m pleasantly surprised.

Comments

  • djgandy - Monday, February 7, 2011 - link

    It'll be interesting to see how all the other SoCs perform with DDR2.
  • DanNeely - Monday, February 7, 2011 - link

    Where can I find more information on this?
  • Anand Lal Shimpi - Monday, February 7, 2011 - link

    Here's a link to the immediate mode vs. tbdr discussion in our old Kyro II review:

    http://www.anandtech.com/show/735/2

    Take care,
    Anand
  • silverblue - Monday, February 7, 2011 - link

    I'm not sure I agree with the wording in this article about TBDR. PowerVR didn't need to slap DDR RAM onto the Kyro II cards because they simply didn't need it, thanks to the reduction in memory traffic that comes from deferred rendering. The unknown element at the time was hardware T&L, because it simply wasn't available and was thought to be impossible; however, as this is yet again being performed on-die, wouldn't that also result in a marked reduction in traffic? Might need some clarification on this one.

    I've never seen it confirmed that the Adreno GPU performs TBDR; some clarification would be appreciated on this one as well! :)

    Onto the option for changing fonts... my vendor-agnostic Galaxy S has such a feature called "Font style" under the Display settings, allowing you to choose the "Default font", "Choco cooky", "Cool jazz" and "Rosemary", with the option of getting more online.
  • Exophase - Monday, February 7, 2011 - link

    Qualcomm bought out AMD's mobile GPUs and hence the Adreno 200 was a rebrand of AMD z430. Here's a little more background on the tiling nature of z430:

    (Since apparently I can't post a link without being flagged as spam, just google for this: gdc2008_ribble_maurice_TileBasedGpus.pdf - it's the first hit.)
  • silverblue - Monday, February 7, 2011 - link

    Interesting... I'd like to see the differences between their approach and that of Imagination Technologies.
  • silverblue - Monday, February 7, 2011 - link

    Thanks by the way... answered a lot of questions. :)
  • AndroidFan - Monday, February 7, 2011 - link

    Should be 300MHz (= 600/2).
  • Zaitsev - Monday, February 7, 2011 - link

    Is the camera really too thick to fit the width of the phone? I've heard of many Evo users who have cracked the glass covering. While I haven't had this problem, it makes me wonder if it's really necessary in the first place.

    Thanks
  • MeSh1 - Monday, February 7, 2011 - link

    I can't wait until you can wirelessly shoot your phone's display to your TV, à la Intel WiDi. This HDMI out is cool, but the cable kind of kills it. With wireless display your phone becomes a game controller :) or a remote when shooting movies to your TV. Plus, how cool would it be if your phone could fetch movies from your home network and you shot the playback to your TV? The Sony NGP should have implemented this. Ah well.
