Battery Life - Technically, No Better

Now it's time to talk about battery life. Let me run down the tests real quick. This is a combination of the tests Jarred runs in our standard notebook/netbook reviews and the tests I run in my Mac reviews. WiFi was always enabled and connected to an access point 20 feet away. The screen brightness was set to 100 nits and Windows 7 was configured to use its Power Saver battery profile.
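
If you want to reproduce this kind of setup, the power profile at least can be pinned from a script. Here's a minimal sketch using Windows' built-in powercfg utility (on Windows 7 the SCHEME_MAX alias maps to Power Saver; everything else here is just illustration, not our actual harness):

```python
import subprocess

# Switch to the built-in Power Saver plan. Windows 7's powercfg ships
# with scheme aliases: SCHEME_MAX = Power saver, SCHEME_MIN = High
# performance, SCHEME_BALANCED = Balanced.
subprocess.run(["powercfg", "/setactive", "SCHEME_MAX"], check=True)

# Print the active scheme so the test log records what was used.
subprocess.run(["powercfg", "/getactivescheme"], check=True)
```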

The idle test is exactly what you think it is. The notebook just sits at the Windows 7 desktop with no screensaver active until it runs out of battery power. This is a good indication of the best battery life you'll get out of the notebook (e.g. just typing in a text document).
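
The timing itself is simple to automate. Here's a rough sketch of a rundown logger, assuming the cross-platform psutil library (this is our illustration, not the actual harness):

```python
import time
import psutil

# Poll the battery once a minute until it's nearly empty. In practice
# you'd append each line to a file on disk so the log survives the
# final shutdown when the battery actually dies.
start = time.time()
while True:
    batt = psutil.sensors_battery()
    if batt is None or batt.power_plugged:
        raise RuntimeError("run this unplugged, on battery power")
    elapsed = (time.time() - start) / 60
    print(f"{elapsed:6.1f} min elapsed, {batt.percent:3.0f}% remaining")
    if batt.percent <= 3:  # stop just before the OS forces a shutdown
        break
    time.sleep(60)

print(f"Idle runtime: {(time.time() - start) / 60:.0f} minutes")
```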

The light web browsing test comes from our Mac reviews and cycles through a series of web pages, pausing on each one for 20 seconds before moving on to the next. There are no Flash ads on these pages, making this the lightest load you'd see while browsing the web. A playlist of MP3s loops in the background.

The average web browsing test also comes from our Mac reviews and cycles through a series of web pages, pausing on each one for 20 seconds. Each page has between one and four Flash ads on it, and there are three concurrent IE8 windows open, each doing the same thing. A playlist of MP3s loops in the background.

The heavy web browsing test opens four tabs in IE8, each heavily loaded with Flash ads. The tabs stay open for a short period of time before the cache is cleared and the browser is closed. The system sits at the desktop for a short duration before launching IE8 once more and opening the same four tabs. The test repeats until the battery is drained. This should be close to the worst case battery life while browsing the web.
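
All three browsing tests share the same skeleton: load a page, wait 20 seconds, move on, repeat until the machine dies. A simplified sketch of that loop (the URLs are placeholders, and our actual harness drives IE8 directly rather than going through Python's webbrowser module):

```python
import time
import webbrowser

# Placeholder page list -- the real tests use fixed sets of pages,
# with or without Flash ads depending on the test.
PAGES = [
    "http://example.com/page1",
    "http://example.com/page2",
    "http://example.com/page3",
]

DWELL_SECONDS = 20  # pause on each page before moving on

while True:  # runs until the battery gives out
    for url in PAGES:
        webbrowser.open(url, new=0)  # new=0 reuses the existing window
        time.sleep(DWELL_SECONDS)
```

The heavy test adds one more step per cycle: clear the cache, close the browser, idle briefly, then relaunch.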

Our video playback test loops a 720p x264 movie in Media Player Classic Home Cinema x64 until the battery dies. The player uses any GPU acceleration present in the system.

Finally, the heavy downloading/multitasking test mixes a bunch of these tests together. The average web browsing test runs while a 480p XviD movie plays and while a download script executes and downloads files at a constant 500KB/s from a server.
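
The constant-rate download is the easiest piece to sketch. Something like the following holds a transfer to roughly 500KB/s by sleeping whenever it gets ahead of schedule (the URL and chunk size are placeholders):

```python
import time
import urllib.request

TARGET_BPS = 500 * 1024  # 500KB/s, per the test description
CHUNK = 64 * 1024        # read in 64KB chunks (illustrative)

def throttled_download(url, dest):
    """Download url to dest, sleeping as needed to hold ~500KB/s."""
    start = time.time()
    received = 0
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        while True:
            data = resp.read(CHUNK)
            if not data:
                break
            out.write(data)
            received += len(data)
            # If we're ahead of the target rate, sleep off the difference.
            expected = received / TARGET_BPS
            elapsed = time.time() - start
            if expected > elapsed:
                time.sleep(expected - elapsed)

throttled_download("http://example.com/bigfile.bin", "bigfile.bin")
```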

I kept as many variables constant as possible between the two systems. Both are configured with the same amount of memory and the same HDD, and both are set to the same brightness. The runtimes are normalized to the same battery capacity to produce an apples-to-apples comparison of battery life.
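
The normalization is simple arithmetic: runtime scales linearly with battery capacity, so a measured result can be rescaled to a common watt-hour figure. As a sketch (the capacities below are placeholders, not the actual packs in either notebook):

```python
def normalize_runtime(measured_minutes, actual_wh, reference_wh):
    """Rescale a runtime to what a reference-capacity battery would deliver."""
    return measured_minutes * reference_wh / actual_wh

# e.g. 210 minutes measured on a 62Wh pack, expressed as a 58Wh result:
print(normalize_runtime(210, actual_wh=62.0, reference_wh=58.0))  # ~196 minutes
```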

And now, the results:

| Battery Test | Core 2 Duo P8700 (2.53GHz) | Core i5-540M (2.53GHz) | Arrandale Advantage |
|---|---|---|---|
| Idle | 216 minutes | 215 minutes | None |
| Light Web Browsing | 177 minutes | 188 minutes | +6% |
| Average Web Browsing | 177 minutes | 186 minutes | +5% |
| Heavy Web Browsing | 174 minutes | 176 minutes | None |
| Video Playback (x264) | 132 minutes | 134 minutes | None |
| Heavy Downloading/Multitasking | 144 minutes | 147 minutes | None |

For the most part there's no improvement in battery life due to Arrandale. There are a couple of instances where we see a 5-6% increase in staying power, but these two platforms are basically equal. That's great when you consider how much faster Arrandale is than its predecessor, but it's disappointing when you remember that we're talking about a fully power-gated 32nm processor here.

If we look at our desktop Clarkdale results, we see that idle power for Intel's 32nm part isn't very good; it's actually worse than the 45nm Lynnfield platform from earlier this year. Intel confirmed that there's still a lot of optimization to be done on Arrandale. Some silicon-level tweaks are on the roadmap, but we won't see them until the middle of 2010. That means that while the first Arrandale notebooks won't offer any more battery life than their predecessors, the second wave of Arrandale should fix that.

There's also one more thing to worry about. All of our battery life tests are carefully constructed to execute the same amount of work on all systems; twenty seconds takes the same amount of time regardless of how fast your CPU is. As we've already seen, Arrandale is nearly 20% faster than the current mobile Core 2 Duo at the same clock speed, so it's entirely possible to get much worse battery life out of Arrandale simply by doing a lot more work. Intel estimates that if we were to loop Cinebench over and over again, we'd see about 30% worse battery life on Arrandale vs. the previous generation mobile Core 2. The reason is that Arrandale would be much faster but draw more power, doing more work per second over the course of the test.
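
The arithmetic behind that estimate is worth spelling out. With a fixed battery, runtime is just capacity divided by average power, so the two ratios Intel quotes pin down the rest (the ratios come from the paragraph above; the derivation is ours):

```python
runtime_ratio = 0.70  # Arrandale vs. mobile Core 2 in a Cinebench loop (Intel's estimate)
perf_ratio = 1.20     # Arrandale's performance advantage at the same clock

# runtime = capacity / average power, so a 30% shorter runtime on the
# same battery implies roughly 1/0.7 = 1.43x the average power draw.
power_ratio = 1 / runtime_ratio

# Energy burned per Cinebench pass = power draw x time per pass.
energy_per_pass = power_ratio / perf_ratio

print(f"average draw: {power_ratio:.2f}x")                    # ~1.43x
print(f"energy per Cinebench pass: {energy_per_pass:.2f}x")   # ~1.19x
```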

For an end user, all this means is that you can do things like encode videos faster on Arrandale than you could before. You can either do the same amount of encoding, faster, without hurting battery life, or do more encoding in the same amount of time at the cost of battery life. Just something to be aware of: Arrandale notebooks can be just as power efficient as existing notebooks, but they can easily be more power hungry if you let them.

Comments

  • bsoft16384 - Monday, January 4, 2010 - link

    The biggest problem with Intel graphics isn't performance - it's image quality. Intel's GPUs don't have AA and their AF implementation is basically useless.

    Add in the fact that Intel recently added a 'texture blurring' feature to their drivers to improve performance (which is, I believe, on by default) and you end up with quite a different experience compared with a Radeon 4200 or GeForce 9400M based solution, even if the performance is nominally similar.

    Also, I've noticed that Intel graphics do considerably better in benchmarks than they do in the real world. The Intel GMA X4500MHD in my CULV-based Acer 1410 does around 650 in 3DMark06, which is about 50% "faster" than my friend's 3-year-old GeForce 6150-based AMD Turion notebook. But get in-game, with some particle effects going, and the Intel pisses all over the floor (~3-4fps) while the GeForce 6150 still manages to chug along at 15fps or so.
  • bobsmith1492 - Monday, January 4, 2010 - link

    That is, Intel's integrated graphics are so slow that even if they offered AA/AF they are too slow to actually be able to use them. The same goes for low-end Nvidia integrated graphics as well.
  • bsoft16384 - Tuesday, January 5, 2010 - link

    Not true for NV/AMD. WoW, for example, runs fine with AA/AF on GeForce 9400. It runs decently with AF on the Radeon 3200 too.

    Remember that 20fps is actually pretty playable in WoW with hardware cursor (so the cursor is always 20fps).
  • bobsmith1492 - Monday, January 4, 2010 - link

    Do you really think you can actually use AA/AF on an integrated Intel video processor? I don't believe your point is relevant.
  • MonkeyPaw - Monday, January 4, 2010 - link

    Yes, since AA and AF can really help the appearance of older titles. Some of us don't expect an IGP to run Crysis.
  • JarredWalton - Monday, January 4, 2010 - link

    The problem is that AA is really memory intensive, even on older titles. Basically, it can double the bandwidth requirements and since you're already sharing bandwidth with the CPU it's a severe bottleneck. I've never seen an IGP run 2xAA at a reasonable frame rate.
  • bsoft16384 - Tuesday, January 5, 2010 - link

    Newer AMD/NV GPUs have a lot of bandwidth saving features, so AA is pretty reasonable in many less demanding titles (e.g. CS:S or WoW) on the HD4200 or GeForce 9400.
  • bsoft16384 - Tuesday, January 5, 2010 - link

    And, FYI, yes, I've tried both. I had a MacBook Pro (13") briefly, and while I ultimately decided that the graphics performance wasn't quite good enough (compared with, say, my old T61 with a Quadro NVS140m), it was still night and day compared with the GMA X4500.

    The bottom line in my experience is that the GMA has worse quality output (particularly texture filtering) and that it absolutely dies with particle effects or lots of geometry.

    WoW is not at all a shader-heavy game, but it can be surprisingly geometry and texture heavy for low-end cards in dense scenes. The Radeon 4200 is "only" about 2x as fast as the GMA X4500 in most benchmarks, but if you go load up demanding environments in WoW you'll notice that the GMA is 4 or 5 times slower. Worse, the GMA X4500 doesn't really get any faster when you lower the resolution or quality settings.

    Maybe the new generation GMA solves these issues, but my general suspicion is that it's still not up-to-par with the GeForce 9400 or Radeon 4200 in worst-case performance or image quality, which is what I really care about.
  • JarredWalton - Tuesday, January 5, 2010 - link

    Well, that's the rub, isn't it: GMA 4500MHD is not the same as the X4500 in the new Arrandale CPUs. We don't know everything that has changed, but performance alone shows a huge difference. We went from 10 shader units to 12 and performance at times more than doubled. Is it driver optimizations, or is it better hardware? I'm inclined to think it's probably some of both, and when I get some Arrandale laptops to test I'll be sure to run more games on them. :-)
  • dagamer34 - Monday, January 4, 2010 - link

    Sounds like while performance increased, battery life was just "meh". However, does the increased performance factor in the Turbo Boost that Arrandale can perform or was the clock speed locked at the same rate as the Core 2 Duo?

    And what about how battery life is affected by boosting performance with Turbo Boost? I guess we'll have to wait for production models for more definitive answers (I'm basically waiting for the next-gen 13.3" MacBook Pro to replace my late-2006 MacBook Pro).
