ASUS N53JF Battery Life: Not Bad for 48Wh, but Please Give Us 63Wh!

We’ve complained quite a few times about the use of older, smaller 48Wh batteries in midrange notebooks, and the N53JF continues that trend. That puts it at a definite disadvantage relative to the Dell XPS’ 56Wh battery, as well as Compal’s 58Wh and Clevo’s 62Wh offerings. But higher capacity isn’t the only game in town; making better use of that capacity is still possible with BIOS and hardware optimizations, and ASUS does very well in this regard. Combined with NVIDIA’s Optimus Technology, we end up with roughly four hours of useful battery life, or just under 2.5 hours of H.264 playback.
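As a back-of-the-envelope check (our own arithmetic from the measured runtimes, not anything ASUS publishes, and ignoring discharge losses and battery wear), runtime is simply capacity divided by average draw, so the measured results imply roughly the following system power:

```python
# Rough average system power implied by battery capacity and measured
# runtime. Ignores discharge inefficiency and capacity fade, so treat
# these as ballpark figures only.

CAPACITY_WH = 48  # the N53JF's battery

def avg_draw_watts(capacity_wh: float, runtime_hours: float) -> float:
    return capacity_wh / runtime_hours

for label, hours in [("Internet surfing", 4.0),   # ~4 hours measured
                     ("H.264 playback", 2.5),     # just under 2.5 hours
                     ("Blu-ray playback", 1.5)]:  # 90 minutes
    print(f"{label}: ~{avg_draw_watts(CAPACITY_WH, hours):.0f}W average")
```

Even the worst case, Blu-ray playback, works out to only about 32W on average, but against a 48Wh pack that still means just an hour and a half.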

ASUS also has their “Super Hybrid Engine” available, which underclocks the CPU and locks the maximum multiplier when enabled. We normally test with modified Power Saver settings, which already lock the CPU multiplier to the minimum, but SHE is still able to wring a few extra minutes out of the battery. We left it enabled for the battery life testing, so the results represent a best-case scenario. We also disable any unnecessary utilities and software, and set the LCD to 100 nits (the 45% brightness setting in this case).

[Battery life charts: Idle, Internet, H.264 Playback, and Relative Battery Life]

Despite having a smaller battery, ASUS beats the Clevo notebook in two of the three tests. The Compal result, on the other hand, isn’t so surprising given its always-on GPU. We commented in the past that the Dell XPS had a substandard Internet result (despite multiple test runs), and that shows up again as the one inexplicable loss to the ASUS. In general, though, battery life is decent and enough for typical users. You’ll still want the power brick handy if you play any games or want to watch a Blu-ray from an actual disc, however; we measured just 90 minutes playing a 35Mbps AVC Blu-ray video (Jumper).


  • JarredWalton - Tuesday, December 28, 2010 - link

    I do wipe off fingerprints, but those glossy bezels pick up every little touch and the flash photography tends to bring them out more than usual. You're not seriously going to complain about one photo (out of a couple dozen) where a few fingerprints are somewhat visible, are you?
  • therealnickdanger - Tuesday, December 28, 2010 - link

    I dunno, I took time out of my busy day at work to read an article about a laptop I didn't know existed 10 minutes ago and probably will never buy anyway because the perfect laptop that I want doesn't exist/costs too much. It really bothers me that you didn't take more time to be professional and do it perfect. Now I'm going to be tormented for the rest of the day about that photo and my overall productivity is going to suffer. Thanks a lot. BTW, Merry Christmas and Happy New Year, jerks.

    <hopefully obvious sarcasm>
  • DanNeely - Tuesday, December 28, 2010 - link

    I disagree. Years of simply saying glossy sucks when it's somewhere it'll pick up fingerprints hasn't hammered the point home to the PHBs who write the laptop design specs. Perhaps if reviewers all start showing pictures of how disgusting it ends up looking after a week or two of use, the point will finally get through.
  • hybrid2d4x4 - Wednesday, December 29, 2010 - link

    That's actually not a bad idea, but very ballsy/risky. I could see the manufacturers getting pissed at the 1st site that did that and cutting off their review units, and then no other site would do it out of fear of getting the cold shoulder. Then again, they don't seem to care about reviewers ranting about these issues in text, so maybe I'm worried over nothing. More likely, though, mfg's don't actually bother to read reviews of their own products...
  • KZ0 - Tuesday, December 28, 2010 - link

    "Mafia 2 manages 35FPS at 769p and 21.5FPS at 1080p"
    Guessing you meant 768p.

    Thanks for another good review.
  • radium69 - Tuesday, December 28, 2010 - link

    When are you going to contact MSI to review their G series? Especially the older GX740.
    Can't beat the value and the performance ;)

    It's a shame you guys seem totally ASUS-minded the last couple of months...
  • cgeorgescu - Tuesday, December 28, 2010 - link

    People... full HD on a regular 22" makes for 100ppi, which is pretty comfortable, but on 15.6" it means 141ppi, and that's a lot of pixels per inch. Don't tell me about the font scaling in Win7, because FullHD@125% displays exactly like 1600x900@100%; if all screen elements are bigger, I don't get any extra screen real estate. Plus, the scaling doesn't work with all apps; there are plenty that don't scale at all.

    I'm very used to 1400x1050@15" (116ppi), but I wouldn't stand 141ppi all day long. Am I having problems with my eyes, or is everybody else comfortable with full HD at 15.6" (usage of 12h/day)?
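
For anyone wanting to verify those density figures, pixels per inch is simply the diagonal pixel count divided by the diagonal screen size in inches; a minimal sketch reproducing the numbers in the comment above:

```python
import math

def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution over diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

print(f'1920x1080 @ 22.0": {ppi(1920, 1080, 22.0):.0f} ppi')  # ~100
print(f'1920x1080 @ 15.6": {ppi(1920, 1080, 15.6):.0f} ppi')  # ~141
print(f'1400x1050 @ 15.0": {ppi(1400, 1050, 15.0):.0f} ppi')  # ~117
```
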
  • DanNeely - Tuesday, December 28, 2010 - link

    I'm not. 1600x900 seems to be a lot rarer on 14/15" laptops than 1680x1050 was a few years ago. For that matter, has anyone reviewed the current crop of 1600x900's to see if they're good panels like most of the 1920x1080's or garbage like the 1366x768s?
  • JarredWalton - Tuesday, December 28, 2010 - link

    The two 1600x900 displays I've seen in the last year are both junk. I also think 1080p on 15.6" will be a stretch for the over-40 crowd, but I'm okay with it. Those who suggest we need 4K screens on laptops, though... I have problems with a 30" LCD at 2560x1600; what would it be like to have that resolution in 1/4 the area!?
  • DanNeely - Tuesday, December 28, 2010 - link

    Enough DPI that AA won't be needed much. GPUs capable of pushing that many pixels are some years down the pipeline, though. According to the Eyefinity lead at ATI, three 25-megapixel monitors placed to completely fill your field of view would have a high enough DPI that you'd be unable to resolve individual pixels with your eyes. At typical laptop distances, an 8MP screen would probably be approaching that level.
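
To put rough numbers on that last point (an editorial sketch; the commonly cited 20/20 acuity limit of about one arcminute and the ~20" viewing distance are our assumptions, not from the comment):

```python
import math

def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
    return math.hypot(h_px, v_px) / diagonal_in

def arcmin_per_pixel(density_ppi: float, distance_in: float) -> float:
    """Visual angle subtended by one pixel, in arcminutes."""
    return math.degrees(math.atan((1 / density_ppi) / distance_in)) * 60

DISTANCE_IN = 20.0  # assumed laptop viewing distance

for label, h, v in [("1920x1080 (2MP)", 1920, 1080),
                    ("3840x2160 (8.3MP)", 3840, 2160)]:
    angle = arcmin_per_pixel(ppi(h, v, 15.6), DISTANCE_IN)
    print(f'{label} on 15.6": {angle:.2f} arcmin/pixel')

# 20/20 vision resolves roughly 1 arcminute; once a pixel subtends less
# than that, individual pixels blur together. The ~0.6 arcmin result for
# the 8.3MP panel is consistent with it sitting near the acuity limit.
```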
