Dell XPS 15 Battery Life

Dell uses a 9-cell, 65Wh integrated battery in the XPS 15 that’s not user replaceable, which is similar to what we’ve seen in the previous generation XPS 15z as well as the MacBook Pro 15. Apple uses a higher capacity battery, and under OS X the MBP15 will generally offer superior battery life, but unless something has changed since our last look we would expect greatly decreased battery life under Windows via Boot Camp. Since the XPS 15 is designed to run Windows from the ground up, there isn’t a problem with lack of power optimizations, and the result is competitive battery life. The LCD was set to 50% brightness (109 nits) for our battery testing—or nine steps down from maximum if you’re using the keyboard shortcut.

[Charts: Battery Life (Idle, Internet, H.264 Playback) and Battery Life Normalized (Idle, Internet, H.264)]

Idle battery life is just over seven hours, with a normalized result putting the XPS 15 ahead of everything except Ultrabooks and AMD’s Trinity. Put a more typical load on the laptops, like our Internet test, and we’re at five hours of useful battery life. If you’re doing lighter web surfing with a mix of office applications, you can expect somewhere around six hours of battery life. As for video playback, Dell manages just over four hours of 720p H.264 decoding; 1080p H.264 decoding drops that slightly to around 3.5 hours. That’s actually one of the lower results for battery life considering the battery capacity, though it’s worth noting that playback on higher resolution displays ends up being more taxing as there are more pixels to push. Overall, then, battery life is good and will be sufficient for most people to get through a day’s work without plugging in, particularly if they let the LCD turn off and/or the laptop go to sleep during periods of inactivity.
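To make the "normalized" numbers concrete (runtime in minutes per Wh of battery capacity), here's a quick sketch; the runtimes are approximations taken from the text above, not the exact charted values:

```python
# Rough battery-life normalization: minutes of runtime per Wh of capacity.
# Runtimes below are approximations from the text, not exact charted values.
BATTERY_WH = 65  # XPS 15 integrated battery capacity

runtimes_hours = {
    "idle": 7.1,        # "just over seven hours"
    "internet": 5.0,    # "five hours of useful battery life"
    "h264_720p": 4.1,   # "just over four hours"
    "h264_1080p": 3.5,  # "around 3.5 hours"
}

for test, hours in runtimes_hours.items():
    minutes_per_wh = hours * 60 / BATTERY_WH
    print(f"{test}: {minutes_per_wh:.2f} min/Wh")
```

This is why a laptop with a smaller battery can still "win" the normalized charts: the metric rewards efficiency, not capacity.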

Comments

  • JarredWalton - Wednesday, July 25, 2012 - link

    I'll look into this when working on the "final" review -- e.g. when the next A05 BIOS is officially released. For gaming in general, I don't think it will matter too much, as most don't tax all four cores. Still, stranger things have happened.
  • yyrkoon - Wednesday, July 25, 2012 - link

    Well, the reason I say this, Jarred, is because of how I understand these CPUs throttle. If they do operate the way I understand it, they should be able to clock higher with only two cores being fully used. And a lot of games only really need 1-2 cores, though not all.

    I myself have tried this on a game that I know is CPU dependent. It did not increase performance in that game directly, but it does help with heat. Performance-wise it did help indirectly, because I was able to overclock the processor and still remain inside the same heat envelope.

    However, my system is based on an AMD A6-3400.
  • JarredWalton - Thursday, July 26, 2012 - link

    So I did a quick test just now. Setting Batman: AC affinity to cores 0-3 (or cores 0 and 2) resulted in throttling within the first 60 seconds or so of running the Batman benchmark. So I turned to ThrottleStop again and decided to go for broke: I set the multiplier for "Turbo" (maximum) and disabled CPU PROC HOT. I reached temperatures of 100C on the first two cores after running the benchmark three times, and while the laptop didn't crash, I wouldn't be comfortable running at those temps.

    Next, I dropped the ThrottleStop multiplier to 26X and retested. Cores one and two still hit 98C after a few loops, and performance wasn't any better or worse (89-90 FPS for our "Value" 768p Medium settings). Then I tried ThrottleStop with the multiplier set to 23X but without any affinity setting. Performance went up slightly (91-92 FPS), and all four CPU cores topped out at around 91C, so overall performance was slightly up and temps were slightly down by just restricting the multiplier more rather than using CPU affinity.

    Obviously, results for affinity will vary depending on game. Some games will benefit from additional cores (albeit slightly) and others really don't use more than two. If you're really hoping to control temperatures, though, setting a 23X multiplier as well as affinity should be a bit better than just TS alone.
  • yyrkoon - Friday, July 27, 2012 - link

    Jarred, thanks for taking the time to look into it.

    It is a shame that what I was thinking did not pan out. It was a shot in the dark to begin with, based on personal experiences of my own. So what that confirms, in my mind anyhow, is that Dell needs to work on a much better cooling design for this series of laptops. Maybe just putting in a higher RPM fan would work too, like I think you had suggested.

    Personally, I would not care if the case design were a bit thicker to allow for better cooling. Nor would I care if the laptop were a bit heavier too. But as I stated in another post, I am most likely not the norm in my laptop usage.
  • alfling - Wednesday, July 25, 2012 - link

    1) Please don't start here another "Apple fanboys vs Apple haters" battle like in most other reviews :)

    2) To the reviewer: many people have experienced significant drops in download speed (upload stays constant) when out of line of sight from the router, while other laptops (even older ones) keep performing well. Could you please try walking away from the router and check for us?

    3) To the reviewer (again): I heard some people complaining that on white or very light screens (like the Google homepage) they can clearly see the pixel grid of the display, but nothing official has come from Dell yet. Could you please tell us if you experience the same issue?

    Thank you in advance!
  • JarredWalton - Wednesday, July 25, 2012 - link

    WiFi connection speed over longer distances is a bit of a crapshoot, but I did read somewhere that Dell is working on tuning the WiFi performance as well. There are so many variables at play (just the type of router and the testing environment introduce all sorts of factors) that without doing a massive amount of work I couldn't say whether the XPS 15 wireless is underperforming or not. I'll try to look into this a bit more for the final (next BIOS) review.

    Regarding the LCD, I don't see the grid when looking at static content, but as I noted in the review, moving windows around really shows some "fuzziness" on high contrast edges. I see similar behavior on most TN panels, and it's caused by the 6-bit to 8-bit dithering/interpolation AFAIK. Trying to capture this in a picture or video would unfortunately require a better camera/lens than I have. Anyway, the LCD is better than a lot of displays, but the ASUS N56VM/VZ 1080p panel is better IMO, and so is the old XPS L501x LCD (which had better colors and gamut as well). Will most people notice? Nope, but enthusiasts and screen connoisseurs might. The "dithering effect" doesn't bother me, but the bluish cast of the LCD is definitely noticeable.
  • alfling - Wednesday, July 25, 2012 - link

    Thank you very much for your prompt reply
  • rnmisrahi - Wednesday, November 14, 2012 - link

    Indeed, there are many problems with the wireless card. Unless you're very near your router, the speed slows down to 2Mbps, while other, older machines give me 30Mbps downstream.
    Look at this YouTube video: http://www.youtube.com/all_comments?v=x-KFW7_UxJM
  • dragosmp - Wednesday, July 25, 2012 - link

    So what's the point of a quad-core Core i7 and a discrete GPU if the chassis can't cool them? You do have 4 cores that can potentially go to 2.8GHz, but if you try to actually use them they'll get throttled to 1.2GHz, or 1.8GHz if that's as much as the chassis can take (and by the way, thanks Jarred for doing this bit of investigative journalism). Unless they accelerate the fan further and/or modify the cooling/chassis, with all the BIOS engineers in the world they won't be able to pull more than ~1.8GHz.

    At this point I'm wondering: isn't the i7 a check-box feature? From an engineering standpoint, if the overall dissipation power of the chassis is x W, you can take advantage of the thermal capacity and go over x W for a certain period of time without passing the temperature threshold. Dell took this further: they put in a slim chassis with probably half the thermal capacitance of the old XPS 15, made it slimmer and thus reduced the dissipation power, and kept the same TDP CPU (which is itself surpassed while Turbo-ing). I wonder whether a 25W dual-core Core i5 would be faster than the 35W i7 in most apps, even heavily threaded apps, simply due to it keeping higher clocks per core.

    As an engineer I see no point in this, but if I were a seller I sure wouldn't want to be the only one that doesn't support the fastest CPU, as pointless as that may be.
  • JarredWalton - Wednesday, July 25, 2012 - link

    I think the issue isn't the quad-core CPU so much as the total amount of power the cooling system needs to dissipate. If the GT 640M GDDR5 can use 40-45W of power (which seems about right) and the CPU uses up to 35W, then the cooling needs to be able to handle at least 75-80W of heat in order to avoid problems. Given what we're seeing with throttling, it looks like the cooling is probably only able to handle 60-65W, so something has to give.

    As far as the quad-core being useless, keep in mind that I never saw any throttling when running just CPU-intensive workloads. It's only when the CPU and GPU are both loaded that we run into issues. Games do that, and professional CAD/CAM type programs would as well, but I don't think a lot of other tasks are really going to be a problem. Even video editing probably doesn't put enough of a strain on the GPU to trigger throttling -- though I'll have to look into that later.
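A footnote on the affinity experiments discussed in the comments above: on Windows, a CPU affinity setting is just a bitmask of logical processors, which is what Task Manager (or ThrottleStop's affinity option) sets under the hood. A minimal sketch of how that mask is built, as my own illustration rather than Jarred's exact procedure:

```python
# Build a Windows-style CPU affinity bitmask from logical CPU indices.
# Illustrative only; this mirrors what Task Manager or "start /affinity" does.
def affinity_mask(cpus):
    mask = 0
    for cpu in cpus:
        mask |= 1 << cpu  # set one bit per logical processor
    return mask

# Logical CPUs 0-3:
print(hex(affinity_mask([0, 1, 2, 3])))  # 0xf
# Logical CPUs 0 and 2 only (e.g. skipping Hyper-Threading siblings):
print(hex(affinity_mask([0, 2])))        # 0x5
```

From the command line, `start /affinity 5 game.exe` takes that hex mask directly; on Linux, `os.sched_setaffinity(0, {0, 2})` takes the set of CPU indices instead of a mask.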
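Jarred's power-budget arithmetic above can be made explicit with a quick sketch; note the GPU draw and cooling-capacity figures are his estimates from the comment, not Dell specifications:

```python
# Rough power-budget check: combined CPU+GPU draw vs. estimated cooling capacity.
# GPU draw and cooling figures are estimates from the discussion, not Dell specs.
CPU_TDP_W = 35            # quad-core i7 TDP
GPU_DRAW_W = (40, 45)     # estimated GT 640M GDDR5 power draw range
COOLING_EST_W = (60, 65)  # estimated chassis dissipation, inferred from throttling

combined_min = CPU_TDP_W + GPU_DRAW_W[0]  # 75W
combined_max = CPU_TDP_W + GPU_DRAW_W[1]  # 80W
shortfall = (combined_min - COOLING_EST_W[1], combined_max - COOLING_EST_W[0])

print(f"Combined load: {combined_min}-{combined_max}W")
print(f"Cooling shortfall: {shortfall[0]}-{shortfall[1]}W")  # roughly 10-20W to shed
```

A 10-20W gap is consistent with the observed behavior: the CPU alone (35W) never throttles, but adding a fully loaded GPU pushes total heat past what the chassis can dissipate.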
