Wireless Web Browsing Battery Life Test

For our final test I wanted to provide a snippet of our 2013 web browsing battery life test to show what its power profile looked like. Remember, the point of this test was to simulate periods of increased CPU and network activity that could correspond not just to browsing the web, but to interacting with your device in general.

[Chart: Acer W510 vs. Surface RT power consumption during the web browsing battery life test]

Those bursts of power consumption are the direct result of our battery life test doing its job. The tasks should take roughly the same time to complete on both devices, which makes this a good battery life test: it doesn't penalize a faster SoC with more work.
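To illustrate that design, here's a minimal Python sketch of a fixed-interval workload loop; the helper names (load_page, run_cpu_burst), page list, and timing constants are hypothetical stand-ins rather than our actual test harness. Each task gets a fixed wall-clock slot, so a faster device simply idles for the remainder of its slot instead of being handed extra work.

```python
import time

# Hypothetical workload: the real test drives a browser over WiFi.
# These stand-ins just mark where page loads and CPU bursts would go.
def load_page(url):
    pass  # fetch and render the page

def run_cpu_burst():
    pass  # short spike of CPU/network activity

PAGES = ["page1.html", "page2.html", "page3.html"]
TASK_INTERVAL_S = 20.0   # fixed wall-clock budget per task
CPU_BURST_EVERY = 3      # inject a heavier burst every N pages

def run_battery_test(duration_s):
    start = time.monotonic()
    task_idx = 0
    while time.monotonic() - start < duration_s:
        task_start = time.monotonic()
        load_page(PAGES[task_idx % len(PAGES)])
        if task_idx % CPU_BURST_EVERY == 0:
            run_cpu_burst()
        task_idx += 1
        # Sleep out the remainder of the interval: a faster SoC finishes
        # sooner and idles longer instead of doing more work, so both
        # devices complete the same amount of work per hour.
        elapsed = time.monotonic() - task_start
        time.sleep(max(0.0, TASK_INTERVAL_S - elapsed))

if __name__ == "__main__":
    run_battery_test(duration_s=60.0)  # short run, just for illustration
```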

Note that the W510's curve ends up lagging a bit behind the Surface RT's curve by the end of the chart. This is purely because of the W510's garbage WiFi implementation. I understand that a fix from Acer is on the way, but it's neat to see something as simple as poorly implemented WiFi showing up in these power consumption graphs.

I always think about GPU power consumption while playing a game, but going through this experiment gave me a newfound appreciation for non-gaming GPU power efficiency. Simply changing what's displayed on screen burns an appreciable amount of power.

Comments

  • felixyang - Friday, December 28, 2012 - link

    You get the key point.
  • st.bone - Tuesday, December 25, 2012 - link

    Now that's just an outright ignorant comment, that's if you even took the time to read the article
  • jeffkibuule - Monday, December 24, 2012 - link

    This kind of makes me shake my head as to why a year-old SoC was used when Samsung was shipping the Exynos 5250 and Qualcomm had the APQ8064; heck, NVIDIA has Tegra 4 just waiting in the wings for a likely CES announcement (I know why, bad timing).

    My only hope is that in 2013, ARM and Atom SoCs can support 1080p displays; I don't think I can use anything less without wanting to poke my eyes out.
  • kyuu - Monday, December 24, 2012 - link

    Agreed. This article mostly highlights what an out-of-date SoC Tegra 3 is, and how bad it is without its companion core. Which is why it's so perplexing that Microsoft went with the Tegra 3 for the Surface. I can only guess that Nvidia must be practically giving away Tegra 3 nowadays, otherwise I have no idea why anyone would use it.

    I'm not sure it's a huge win for Intel that Clover Trail beats a mostly obsolete ARM SoC in power use with such an incredibly mediocre GPU.

    A more interesting comparison would be between a current-gen Qualcomm chipset and/or Samsung's current-gen Exynos.
  • kyuu - Monday, December 24, 2012 - link

    To clarify the second "paragraph", Clover Trail is the one with the mediocre GPU.
  • lmcd - Tuesday, December 25, 2012 - link

    They're both mediocre GPUs at this point. Didn't Adreno 225 keep pace with the T3 GPU?
  • jeffkibuule - Tuesday, December 25, 2012 - link

    The reasoning for going with Tegra 3 was pretty obvious. They needed a more finalized chip to develop Windows RT against, and Tegra 3 was the only thing available. Relying on a faster class of SoCs like the Tegra 4, Snapdragon S4 Pro, or Exynos 5250 would have meant delaying the Windows RT version of tablets by several months at least, since I doubt development would have been done in time.
  • lmcd - Tuesday, December 25, 2012 - link

    A standard S4 would more likely have done better. Clover Trail demoed how viable two cores are for W8, let alone W8 RT.
  • wsw1982 - Tuesday, December 25, 2012 - link

    There is already a comparison between Medfield, Krait, Swift, and the Exynos 5250 on the Android platform, isn't there? I think Microsoft is the one to blame that you can only compare Clover Trail and Tegra 3 on the Windows platform, LOL
  • Krysto - Monday, December 24, 2012 - link

    Intel will not have anything that even approaches next year's ARM SoCs' GPUs anytime soon. And they can't match the Cortex A15 in performance anytime soon either.
