Seven years ago, the browser wars seemed all but over. AOL bought out Netscape, Microsoft's Internet Explorer dominated the market, and the era of browser-based exploits began. In 2003, Microsoft's stranglehold on the browser market didn't loosen much, but the Mozilla group began work on an open-source alternative. It still took almost two years before we finally saw the birth of Firefox, the first serious contender to the browser throne since the passing of Netscape… okay, so Netscape was more on life support, but let's not argue semantics.

During the past four years, things have changed to the point where browser market share is far less lopsided. Each browser has its proponents and opponents, and we've seen plenty of benchmarks demonstrating which browser is the fastest, which has the best JavaScript support, and which best complies with web programming standards. With the launch of Internet Explorer 8, Opera 10, Firefox 3.5, Safari 4, and Chrome during the past year, the market is more competitive than it has been in years. So which browser reigns supreme?

Truth be told, the answer to that question is very subjective. If you have a reasonably fast system, you're unlikely to notice a difference between any of the major browsers when loading typical webpages. Stress tests that focus on JavaScript performance might matter if you visit sites that use lots of JavaScript, and security, standards support, and the availability of plug-ins/add-ons are also potentially important. On average it's probably a wash as to which you'll like "best". If you're trying to figure out which browser is right for you, we suggest looking at the Browser Wars series of articles over at DailyTech.

What we are going to look at today is the impact of your choice of browser on battery life, plain and simple. Except coming up with a benchmark is neither plain nor simple. We have used several different methods for testing battery life on laptops, and depending on the type of content you're viewing, battery life ranges from nearly what you can expect at idle down to roughly what you would get when watching high-definition videos. Like it or not, Adobe's Flash is used on many websites, so we picked three websites that we frequently visit and used those for our testing. As a point of reference, here's the sort of battery life difference you're looking at when viewing "simple" webpages versus the three websites we selected, taken from our article comparing AMD and Intel battery life.

Browser Battery Life

Obviously, that's a huge difference in battery life. You get roughly 50% more battery life with simple Internet surfing compared to surfing sites that use lots of Flash content (along with frames, numerous tables, etc.). Last we checked, your average website is nowhere near what would qualify as "simple", and Flash content is ubiquitous. For better or for worse, we're going to focus on battery life when viewing three websites. One of the websites is AnandTech.com, and the other two shall remain nameless. Suffice it to say, all three sites have approaches to web design that we see replicated all over the Internet.

For testing, we load the three sites into tabs on our test web browser, wait 60 seconds, and then reload all three tabs. We are using three recently tested laptops that offered decent battery life. Two of these are the Gateway NV52 and NV58 that represent the current state of entry-level AMD and Intel laptops. The third is a netbook, the ASUS Eee PC 1005HA. None of these laptops would qualify as high-end solutions, mostly because we don't think users interested in battery life are going to be looking at high-end laptops. These three laptops provide a reasonable view of the current mobile market. If there is interest, we may look at extending this testing to other laptops in the future, but first let's see what sort of results we get from the test candidates.
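Because the three tabs are reloaded on a fixed one-minute cadence, every browser renders the same number of pages over a given runtime; only how long the battery lasts varies. A minimal sketch of that accounting (the function and parameter names here are illustrative, not part of our actual test harness):

```python
# Sketch of the page-load accounting behind the test: three tabs are
# loaded once, then all three are reloaded every minute until the
# battery runs out. Names are hypothetical, for illustration only.

def pages_rendered(runtime_minutes: int, tabs: int = 3,
                   interval_minutes: int = 1) -> int:
    """Total page loads during a battery rundown: the initial load of
    each tab plus one reload of every tab per elapsed interval."""
    reload_cycles = runtime_minutes // interval_minutes
    return tabs * (1 + reload_cycles)  # initial loads + periodic reloads

# Two browsers that both last four hours render identical page counts:
print(pages_rendered(240))  # 3 * (1 + 240) = 723 page loads
```

This is why the test measures "user speed" browsing rather than maximum rendering throughput: the workload per hour is constant, so any battery life difference comes from how efficiently each browser idles between reloads.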

AMD Browser Battery Life
76 Comments

  • JarredWalton - Monday, September 14, 2009 - link

    The test is set to load the three test pages every minute -- it's constant "user speed" simulation rather than maximum web page rendering rate. So in this instance, if two browsers ran for four hours, they rendered the same number of pages in that time.

    If I wanted to test page views per Watt, I'd need to come up with a different test, and the results wouldn't be a realistic demonstration of how people use web browsers. You have to pause and read a page to know what's there; if you're just constantly reloading pages, you can make a battery life stress test but it is no longer anything approaching a realistic view of browser usage. IMO anyway.
  • erple2 - Monday, September 14, 2009 - link

    I suppose the only corollary to that is that the test assumes that every page takes (far) less than 1 minute to load. Otherwise the GP might have a point.

    If it takes longer than 1 minute to reach "steady state", then it might become more of an issue as to which browser is more productive.

    Perhaps to "truly" measure this, you would have the timer start at 1 minute after steady state is achieved, then count how many 1 minute steady state events exist. That would make for an oddly discrete number of trials, however. However, that might only really measure how much power each browser draws while rendering a page as a function of the total amount of time spent on a "trial". However, in keeping with the other benchmarks you've provided, it appears that the time of rendering is somewhat irrelevant.

    Whew! Glad I don't have to design and run meaningful benchmarks! That's hard!
  • postler - Sunday, September 13, 2009 - link

    It would be interesting to compare Opera using all of its built-in features (BitTorrent client, chat, email client) against other browsers with separate programs for those tasks. I suppose Opera is more power efficient than another browser plus a standalone email client and BitTorrent client all running.
  • JarredWalton - Monday, September 14, 2009 - link

    I don't think the majority of people are going to run a bittorrent client on battery power. However, this is not a review of browsers in general; it's just a look at battery life using the same surfing test on each one. Personally, I'm a Firefox guy and this isn't enough to convince me to look elsewhere.
  • yourwhiteshadow - Sunday, September 13, 2009 - link

    What about battery life on the new MacBook Pros with various browsers? Also, I went to test out a 13.3" MBP at the Apple store and ran Peacekeeper on it. The $1499 version with 4GB RAM, 2.53GHz processor, and 250GB HDD consistently got a lower score than the $1199 version with 2GB RAM, 2.26GHz processor, and a 160GB hard drive. Both ran the same video chipset. Any ideas as to what is going on? I ended up buying the $1199 one, obviously, but I'm still a little curious.
  • Voo - Sunday, September 13, 2009 - link

    I'm astonished every time I read an article with several pages of big colorful benchmarks in it and then see people asking for evidence to back up the "claims" made at the end of the article..

    Interesting article, though I knew before reading it that I wouldn't replace my FF. I'm just too accustomed to it - yep, I know I'm ignorant, but after all the differences between the browsers are relatively small, so I think I'll survive it ;)

    But yes, just out of curiosity I'd be interested in the same benchmarks run under OS X. It would be interesting to see how IE would fare there (I think Raymond Chen once said that IE is more or less a GUI wrapped around several core DLLs, which shouldn't work on OS X).
  • coachingjoy - Sunday, September 13, 2009 - link

    Well done.

    Appears IE8 is good for something.

  • araczynski - Saturday, September 12, 2009 - link

    that was refreshingly, um, refreshing, and quite surprising, didn't think there'd be that much noticeable difference between the browsers in terms of battery life of all things.

    still hate that each of them has their own stupid rendering quirks that they seem to believe themselves above eliminating.
  • IntelUser2000 - Saturday, September 12, 2009 - link

    "None of these laptops would qualify as high-end solutions, mostly because we don't think users interested in battery life are going to be looking at high-end laptops."

    That quote and this:

    "Please note that unlike our normal battery life tests, we set the laptop on the Vista "Power Saver" profile instead of "Balanced", with the hard drive set to power down after 3 minutes and the maximum CPU performance set at 50%."

    This is ridiculous. One of the most stupid and contradictory reviews I've ever seen on AnandTech. Sorry, but please benchmark settings that people will actually use. Nobody is going to put "Power Saver" on a Core 2/Turion notebook and enable "High" on an Atom netbook. What's the point here? I mean, you said the battery life difference was 6% for Power Saver vs. Balanced, which isn't significant at all for the performance sacrifice. Does it artificially inflate the score differences between browsers? Have you tested to see if the positions change with different settings?

    Again, what's the point of this review? Normally I'm very positive toward AnandTech; this article isn't one that would show why. You could have at least included both sets of results.
  • JarredWalton - Saturday, September 12, 2009 - link

    I'm not sure what your beef with the first quote is supposed to be. Do you think people buying high-end laptops with discrete graphics care about battery life? I don't. If you have a Core 2 Quad laptop and GTX 260M graphics, your battery life will stink regardless of browser, so I decided to look at several laptops that offer reasonable battery life and see if the choice of browser mattered. Moving on....

    The "High" setting on the ASUS was not for the OS. That was set on "Portable/Laptop". The "High" setting is specifically for ASUS' Super Hybrid Engine, which underclocks the CPU and FSB if you leave it on "Auto". The CPU can still use SpeedStep, but it will stay on a 166MHz bus (667FSB) instead of SHE dropping it to 147MHz (588FSB). Mostly I did it to reduce the amount of time required to run the tests; at over seven hours per browser, running each twice, it already took more than a week of testing time. I don't have the 1005HA anymore (ASUS wanted it back), so I can't retest anything on it.

    As I also mentioned, using "Balanced" instead of "Power Saver" gives you an extra 6% battery life - since that's what we're looking at for using different browsers, I figured it was a useful bonus. Also, I was running some of those tests for use in a future article, so I didn't want to repeat testing any more than necessary. Regardless, using Power Saver is hardly "ridiculous", and the comparison wasn't between AMD and Intel and Atom. That's why those results are on separate pages.

    If you want apples-to-apples on browsers on each laptop, that's what I provided here. If you want to compare identical settings between different laptops, that's what I've always done in the standard notebook/netbook reviews. As far as I can tell, the power saving setting does not influence the individual browser results, though to be sure I would need to run every single test again with different settings. That's not something I really feel is necessary.

    And FWIW, the "50%" setting in the advanced power options doesn't mean the CPU runs at half the maximum clock speed. CPUZ tells me otherwise. I've seen some laptops set that to "20%" or even "0%", and yet the systems still run. All I know for sure is that I used the same power settings on both Gateway systems. Since the 1005HA runs XP instead of Vista, there's already an inherent difference. I'll be looking at XP, Vista, and Win7 shortly, though, so stay tuned.
