Battery Life

There’s been a lot of speculation about whether dual-core phones would be battery hogs. It turns out that voltage scaling wins, and P=V^2/R does indeed apply here: two cores running at a lower voltage and clock can be more efficient than a single core pushed harder. The 2X delivers middle-of-the-road 3G and WiFi web browsing battery life numbers, and above-average 3G talk time numbers.
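For the curious, here’s a quick back-of-the-envelope sketch of why that works. The operating points below are hypothetical, chosen purely for illustration, and it uses the standard CMOS dynamic power approximation P ~ C*V^2*f (C is switched capacitance, f is frequency) rather than anything measured on the 2X:

    # Back-of-the-envelope dynamic power comparison. All numbers below are
    # hypothetical operating points for illustration only, not measurements.

    def dynamic_power(voltage, frequency, capacitance=1.0):
        """CMOS switching power approximation: P ~ C * V^2 * f (arbitrary units)."""
        return capacitance * voltage ** 2 * frequency

    # One core pushed hard vs. two cores at a lower voltage/frequency point
    # delivering the same aggregate throughput (2 x 0.5 GHz = 1.0 GHz).
    single_core = dynamic_power(voltage=1.1, frequency=1.0)
    dual_core = 2 * dynamic_power(voltage=0.8, frequency=0.5)

    print(f"one core  @ 1.0 GHz, 1.1 V: {single_core:.2f}")  # 1.21
    print(f"two cores @ 0.5 GHz, 0.8 V: {dual_core:.2f}")    # 0.64, roughly half

Because voltage enters as a square, even a modest voltage reduction buys back more power than the second core costs.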

3G Web Browsing Battery Life

WiFi Web Browsing Battery Life

3G Talk Time Battery Life

We’ve also got another new test: gaming battery life under constant load, a usage scenario we haven’t been able to measure in the past but can now. The BaseMark GUI benchmark includes a battery test which runs the composition feature test endlessly, simultaneously taxing the CPU and GPU. It’s an aggressive test that closely approximates constant 3D gaming. For this test we leave the WiFi and cellular interfaces enabled, Bluetooth off, and display brightness at 50%.

BaseMark GUI Battery Life Test

I’m a bit disappointed we don’t have more numbers to compare against, but the 2X does come out on top in this category. Anand and I both tested the Galaxy S devices we have kicking around (an Epic 4G and a Fascinate), but both phones continually locked up, crashed, or displayed graphical corruption while running the test. Our constant 3D gaming test looks like a winner for sifting out platform instability.
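If you want to replicate this sort of measurement yourself, it doesn’t take exotic tooling. Below is a minimal sketch (our own illustration, not the harness we used) that polls an Android device’s battery level over adb once a minute while a sustained workload like the BaseMark GUI battery test runs. It assumes adb is on your PATH and exactly one device is attached:

    # Minimal battery drain logger: polls the battery level of an attached
    # Android device via adb while a sustained workload runs on it.
    # Assumes adb is on the PATH and exactly one device is connected.
    import re
    import subprocess
    import time

    def battery_level():
        """Return the current battery percentage via `adb shell dumpsys battery`."""
        out = subprocess.run(
            ["adb", "shell", "dumpsys", "battery"],
            capture_output=True, text=True, check=True,
        ).stdout
        match = re.search(r"level:\s*(\d+)", out)
        return int(match.group(1)) if match else None

    if __name__ == "__main__":
        start = time.time()
        while True:
            level = battery_level()
            print(f"{time.time() - start:7.0f}s  battery = {level}%")
            if level is not None and level <= 5:
                break  # bail out before the device powers itself off
            time.sleep(60)  # sample once a minute

Logging over USB does trickle-charge some devices, so for serious numbers you’d run the device untethered and read the log afterwards.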

Conclusion

The 2X is somewhat of a dichotomy. On one side, you've got reasonably attractive hardware, class-leading performance from Tegra 2 that doesn't come at the expense of battery life, and a bunch of notable and useful extras like HDMI mirroring. On the other, you've got some serious experience-killing instability issues (which need to be fixed by launch), a relatively mundane baseband launching at a time when we're on the cusp of 4G, and perhaps most notably a host of even better-specced Tegra 2 based smartphones with more RAM, better screens, and 4G slated to arrive very soon.

It's really frustrating to have to make all those qualifications before talking about how much I like the 2X, because the 2X is without a doubt the best Android phone I've used to date. Android is finally fast enough that for a lot of the tasks I care about (especially web browsing) it's appreciably faster than the iPhone 4. At the same time, battery life doesn't take a gigantic hit, and the IPS display is awesome. The software instability issues (which are admittedly pre-launch bugs) are the only thing holding me back from using it 24/7. How the 2X fares when Gingerbread gets ported to it will also make a huge difference, one we're going to cover when that time comes.

The other part of the story is Tegra 2.

Google clearly chose NVIDIA’s Tegra 2 as the Honeycomb launch platform for a reason. It is a well-executed piece of hardware that beat both Qualcomm’s and TI’s dual-core solutions to market. The original Tegra was severely underpowered in the CPU department, which NVIDIA promptly fixed with Tegra 2. The pair of Cortex A9s in the AP20H makes it the fastest general-purpose SoC in an Android phone today.

NVIDIA’s GeForce ULV performance also looks pretty good. In GLBenchmark 2.0 NVIDIA manages to hold a 20% performance advantage over the PowerVR SGX 540, our previous king of the hill.

Power efficiency also appears to be competitive in both our GPU and general use battery life tests. Our initial concern about Tegra 2 battery life turned out to be unfounded.

It’s the rest of the Tegra 2 SoC that we’re not completely sure about. Video encode quality on the LG Optimus 2X isn’t very good, and despite NVIDIA’s beefy ISP we’re only able to capture stills at 6 fps with the camera set to a 2MP resolution. NVIDIA tells us that the Tegra 2 SoC is fully capable of a faster capture rate for stills and that LG simply chose 2MP as its burst mode resolution. For comparison, other phones with burst modes capture at either 1MP or VGA. That said, unfortunately for NVIDIA, a significant technological advantage is almost meaningless if no one takes advantage of it. It'll be interesting to see whether the other Tegra 2 phones on the way will enable full-resolution burst capture.

Then there’s the forthcoming competition. TI’s OMAP 4 will add the MPE (ARM’s Media Processing Engine, aka NEON) missing from Tegra 2’s Cortex A9s and feed the cores via a wider memory bus. Qualcomm’s QSD8660 will retain its NEON performance advantages and perhaps make up for its architectural deficits with a higher clock speed, at least initially. Let’s not forget that the QSD8660 will bring a new GPU core to the table as well (Adreno 220).

Tegra 2 is a great first step for NVIDIA; however, the competition is not only experienced but also well equipped. It will be months before we can truly crown an overall winner, and then another year before we get to do this all over again with Qualcomm’s MSM8960 and TI's OMAP 5. How well NVIDIA executes Tegra 3 and Tegra 4 will determine how strong a competitor it will be in the SoC space.

Between the performance we’re seeing and the design wins (both announced and rumored), NVIDIA is off to a great start. I will say that I’m pleasantly surprised.

Comments

  • Exophase - Monday, February 7, 2011

    Thanks Anand.

    I'm surprised to hear that shot was from IMG, given that it was an IMG employee who originally commented that Tegra's 16-bit banding was evident in that screenshot. Whoops. I do wonder what could be causing this, then.

    Nonetheless, while that definitely makes my 16-bit color claim invalid, the depth buffer one should still hold. We might need to wait and see how much of a difference this actually makes, or rather how effective NVIDIA's 16-bit depth space is.

    I'm glad to hear that you're as concerned about benchmarks on Android as I am. It's especially frustrating when I see people using them to try to show that Atom is substantially better clock-for-clock than Cortex-A9.
  • Exophase - Monday, February 7, 2011

    Managed to miss this:

    "The test ramps from around 3k vertices to 15k vertices per frame, and 190k to 250k triangles per frame"

    That line doesn't make any sense. How would you have so many more triangles than vertices? You must have meant something else.
  • sid1712 - Monday, February 7, 2011

    Great review as usual, but I'm disappointed about the lack of detail on the sound quality of the phone. A comparison of the sound quality (via the headphone jack) alongside the iPhone 4 and the Galaxy S (preferably with the Voodoo kernel) would give a good idea of the SQ of the phone.
  • ScentedKandle - Monday, February 7, 2011

    Related to this, the audio codec lists "lossless" but doesn't mention what format. Can the audio chip natively decode FLAC?
  • teldar - Monday, February 7, 2011

    The order of the buttons is the same as on my Droid X.
  • Pjotr - Monday, February 7, 2011

    Does it really record 1920x1088? Does this unorthodox resolution play well on TVs, if you put it on a USB stick, for example?
  • Brian Klug - Monday, February 7, 2011

    It plays back from the phone properly, and most playback software just does a crop. A ton of devices actually produce 1088 without making note of it; it should play back fine.

    -Brian
  • unmesh - Monday, February 7, 2011

    For active aka switching transistor power consumption, C*V^2*f (C is capacitance and f is frequency) is a better proxy than V^2/R.

    The conclusion that operating voltage has a huge effect remains the same.
  • Kevin098 - Monday, February 7, 2011

    Hey, can you make a video comparison between the iPhone 4's Retina Display and the Optimus 2X?
  • StormyParis - Monday, February 7, 2011

    Pages and pages of (apparently not very accurate, at that) perf data, and not even one line on sound quality, which is one of my key buying points for a phone.

    No info on whether I'll be able to stream PC-resolution videos off my server to my bed over wifi.

    Overall, not a very useful review. More like a dick size contest.
