ASUS N53JF Battery Life: Not Bad for 48Wh, but Please Give Us 63Wh!

We’ve complained about the use of older and smaller 48Wh batteries in midrange notebooks quite a few times, but the N53JF continues that trend. That puts it at a definite disadvantage relative to Dell XPS’ 56Wh battery, as well as Compal’s 58Wh and Clevo’s 62Wh offerings. But higher capacity isn’t the only game in town; making better use of that capacity is still possible with BIOS and hardware optimizations, and ASUS does very well in this regard. Combined with NVIDIA’s Optimus Technology, we end up with roughly four hours of useful battery life, or just under 2.5 hours of H.264 playback.

ASUS also has its “Super Hybrid Engine” available, which underclocks the CPU and locks the maximum multiplier when enabled. We normally test with modified Power Saver settings, which already lock the CPU multiplier to the minimum, but SHE is still able to wring an extra few minutes out of the battery. We left it enabled for the battery life testing, so the results represent a best-case scenario. We also disable any unnecessary utilities and software, and set the LCD to 100 nits (45% brightness in this case).

[Charts: Battery Life - Idle; Battery Life - Internet; Battery Life - H.264 Playback; Relative Battery Life]

Despite having a smaller battery, ASUS beats the Clevo notebook in two of the three tests. The Compal result, on the other hand, isn’t so surprising given its always-on GPU. We commented in the past that the Dell XPS turned in a substandard Internet result (despite multiple test runs), and that shows up again here as the one area where the Dell inexplicably trails the ASUS. In general, though, battery life is decent and enough for typical users. You’ll still want the power brick handy if you play any games or want to watch a Blu-ray from an actual disc, however: we measured just 90 minutes playing a 35Mbps AVC Blu-ray video (Jumper).
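
To put those runtimes in perspective, here's a quick back-of-the-envelope sketch converting the 48Wh pack and the runtimes above into an implied average system power draw. The capacity and runtimes come from our testing; the 63Wh extrapolation is purely hypothetical.

```python
# Implied average system power draw from the 48Wh pack and measured runtimes.
# Capacity and runtimes are from the review; everything else is illustrative.

CAPACITY_WH = 48  # rated capacity of the N53JF's battery pack

runtimes_hours = {
    "Internet (~4 hours)": 4.0,
    "H.264 playback (just under 2.5 hours)": 2.4,
    "Blu-ray from disc (90 minutes)": 1.5,
}

for scenario, hours in runtimes_hours.items():
    avg_draw_w = CAPACITY_WH / hours
    print(f"{scenario}: ~{avg_draw_w:.0f}W average draw")

# What a hypothetical 63Wh pack would buy at the same draw
# (runtime scales linearly with capacity):
internet_draw_w = CAPACITY_WH / 4.0
print(f"63Wh pack, Internet workload: ~{63 / internet_draw_w:.2f} hours")
```

The spread is telling: roughly 12W for light Internet use versus about 32W with the optical drive spinning and the GPU decoding a 35Mbps stream, which is why the Blu-ray runtime drops so sharply.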

Comments

  • DanNeely - Wednesday, December 29, 2010 - link

    I wouldn't hold my breath. The theaters originally went widescreen (1.85:1 in the US, 1.66:1 in the EU) to differentiate themselves from the 1.33:1 aspect ratio of TV and to offer something more than a giant screen to compensate for the extremely expensive food and the obnoxious idiots you had to share the theater with.

    1.85:1 isn't much more than 1.77:1, and with 3D poised to invade the living room as well, it won't serve well as a differentiator. Unless the studios decide to throw the theaters under a bus, I expect something wider to go mainstream, even if they stop short of 2.39:1.
  • DanNeely - Tuesday, January 4, 2011 - link

    The end has begun: Vizio just launched a pair of 2.37:1 TVs at 2560x1080 (quick ratio math below). I hope everyone is looking forward to their 2013 laptop running at 1400x600. It won't be deep enough to have a touchpad, so your lousy low-contrast ultra-superglare LCD will be covered in fingerprints from the touchscreen layer.

    http://ces.cnet.com/8301-32254_1-20027127-283.html
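
    A trivial sanity check on those ratios; the reference points in the comment are the usual cinema/TV standards, and the 1400x600 panel is the commenter's hypothetical, not a real product:

    ```python
    # Aspect ratios of the resolutions mentioned above, for comparison
    # against common standards: 1.33 (4:3 TV), 1.78 (16:9 HDTV),
    # 1.85 (US theatrical), 2.39 (anamorphic scope).

    resolutions = {
        "HDTV 1920x1080": (1920, 1080),
        "Vizio 2560x1080": (2560, 1080),
        "Hypothetical 1400x600 laptop": (1400, 600),
    }

    for name, (width, height) in resolutions.items():
        print(f"{name}: {width / height:.2f}:1")
    ```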
  • therealnickdanger - Tuesday, December 28, 2010 - link

    The 1080-line resolution was a standard HD resolution in the '80s and '90s, long before flat-screen, fixed-pixel displays were even being sold.

    You may argue that 1080p is a step backward in resolution from the 1600x1200 CRTs of yesteryear, but not even my beloved (and perfectly calibrated) Sony FW900 24" CRT can hold a candle to the clarity of my 1080p LCDs. Not to mention the LCDs are thinner, lighter, and much cheaper. Plus, having true 1:1 pixel mapping for HD content is so much better. My wife is a professional video effects editor and can attest to the benefits of 1080p displays for her own reasons as well.

    That's progress.

    The only regression I can think of with modern displays is the loss of refresh rates over 60Hz. That's the only reason I keep the FW900: for gaming w/VSYNC @85Hz and up. Analog FTW in that case. More and more 120Hz and 240Hz LCDs are coming out, but without proper mainstream connectivity, what's the point? Meh to that.
  • ET - Wednesday, December 29, 2010 - link

    I agree that in some respects current displays are better than what we had ten years ago, but some things took a step back, and even if everything else were equal, it wouldn't be such significant progress. If I want a monitor that's better than 1920x1200, I need to pay a lot more than I did for the 1600x1200 19" monitor I bought 8 years ago, and it'd be a lot larger.

    One would have thought that by now it'd be possible to display high quality text and images on a PC monitor, but somehow we've degenerated into believing video is the only application that matters.

    I agree that for standard users, who just surf the web and consume content, current monitors are a step up from what they had in past years (1024x768, 1280x1024), but anyone more demanding could, ten years ago, get something that was a step up yet took about the same space and didn't cost five times as much.
  • chemist1 - Tuesday, December 28, 2010 - link

    Yup, what DanNeely said is right. Even with Blu-ray, which represents the highest data rate currently available for consumer 1080p video (roughly twice what you get with terrestrial HD broadcasts, which in turn have higher data rates than cable, satellite, Hulu, and Netflix), the signal has to be compressed an amazing ~100:1 vs. a raw video feed (there's a quick sanity check on that ratio below)! Only the cleverness of the compression algorithms, combined with the fact that large parts of a typical picture don't change much from frame to frame, allows this compression to still look good, though it is still perceptually lossy on a high-end system. (I understand Joe Kane did some studies to determine what data rate you would need to avoid all perceivable compression losses, but the results were for a private client and thus weren't published.)

    Plus, don't forget that the current bandwidth limitations force compromises not just in spatial resolution, but also in chromatic and temporal resolution. Blu-ray movies today have 8-bit color (allowing for only 2^8 = 256 gradations per channel). The standard does allow for higher color depth (up to 16-bit), but that means more data, and with the current bandwidth limit, that in turn would necessitate more compression. Likewise, at 60fps we'd get more temporal resolution than we do at 24 or 30fps, which would result in less blurring during fast action scenes. But if you go to 60fps, you've got to give something else up.

    I.e., with the current bandwidth limitations, we're at about the limit of how much spatial resolution the system can offer, unless we want to increase compression artifacts or give up further on the already-compromised chromatic or temporal resolution.

    Don't get me wrong -- I have a 100" screen (JVC RS1 projector), and would love to see a consumer 4K format. But I'd also like to see at least a 12-bit 4:4:4 color space and fewer compression artifacts, which is not going to happen until they can offer bandwidth about an order of magnitude higher than what Blu-ray currently offers.

    And unfortunately, a lot of video seems to be moving in the same direction as music -- less resolution for more convenience. So I think it may be a while before we see market pressure for a higher-resolution video format.
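
    A minimal sketch of that bandwidth math. The ~15 Mbps "typical average" Blu-ray video bitrate is an assumption for illustration; 40 Mbps is the spec's video maximum:

    ```python
    # Uncompressed 1080p bandwidth vs. a Blu-ray stream, and how bit depth
    # and frame rate multiply the requirement.

    def raw_mbps(width, height, bits_per_channel, channels, fps):
        """Uncompressed video bandwidth in megabits per second."""
        return width * height * bits_per_channel * channels * fps / 1e6

    base = raw_mbps(1920, 1080, 8, 3, 30)   # 8-bit 4:4:4 RGB at 30fps
    print(f"Raw 1080p30, 8-bit 4:4:4: {base:.0f} Mbps")
    print(f"vs. ~15 Mbps average stream: ~{base / 15:.0f}:1 compression")
    print(f"vs. 40 Mbps video maximum:   ~{base / 40:.0f}:1 compression")

    # The 12-bit 4:4:4 at 60fps case mentioned above:
    deeper = raw_mbps(1920, 1080, 12, 3, 60)
    print(f"12-bit 4:4:4 at 60fps needs {deeper / base:.0f}x the raw bandwidth")
    ```

    Raw 8-bit 1080p30 works out to roughly 1.5 Gbps, which against a ~15 Mbps average stream is indeed in the neighborhood of 100:1, and going to 12-bit 60fps triples the raw requirement before compression even starts.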
  • DanNeely - Tuesday, December 28, 2010 - link

    We also appear to be reaching the limits of what compression can offer. Over the summer I read that the team working on the H.265 algorithm was concerned that they'd only be able to reduce bitrates to 70% of current levels while maintaining quality, versus the 50% target they'd set when beginning the design process.
  • torgal - Tuesday, December 28, 2010 - link

    Well, and now the Dell XPS 15 no longer has the 1080p upgrade (http://www.dell.com/us/p/xps-15/fs). Or have I got the wrong XPS 15?
  • jigglywiggly - Tuesday, December 28, 2010 - link

    hai guise my name is asus we make a good laptop and then ruin it by putting a POS LCD on it.
  • Kaboose - Tuesday, December 28, 2010 - link

    I think with Sandy Bridge on the horizon, the majority of the people this laptop seems to be targeted at would be better off waiting a month or so for something more substantial for their ~$1000.
  • jabber - Tuesday, December 28, 2010 - link

    Surely it doesn't take more than a minute to wipe a product down before taking pics of it?

    Just makes it seem a little more pro.
