ASUS UL50Vf Battery Life and Power

[Charts: Battery Life - Idle, Internet, DivX 720p, x264 720p; Relative Battery Life]

As with the UL80Vt (and all CULV laptops), battery life is a very strong selling point. Putting CULV into a 15.6" chassis and adding a discrete GPU wouldn't be the first choice of most users, however, and here we see the UL50Vf falling behind the UL80Vt. As far as we can tell, the major difference comes down to the LCD, and the result is that the UL80Vt is able to deliver anywhere from 15 (x264) to 215 (idle) minutes more battery life. The Internet test is probably the best overall indication of battery life in common usage scenarios, and even there the 14" UL80Vt delivers 11% more battery life.

This is not to say that the UL80Vt is the better laptop, of course; if the choice is UL50Vf with Optimus or UL80Vt with the second generation switchable graphics, we'd definitely recommend the UL50Vf. However, it does raise the question of why NVIDIA/ASUS would launch the 15.6" model first. Smaller models should follow soon, along with faster, more powerful laptops like the ASUS N61.

NVIDIA's presentation suggests that Optimus allows you to get the best of both worlds: performance as well as battery life. As the test results so far have shown, all of that is possible. However, do keep in mind that you still can't get performance at the same time as long battery life. If you fire up a game on virtually any laptop, even with IGP, battery drain increases substantially. We performed a test of exactly that sort of scenario and the UL50Vf delivered 178 minutes of run time—impressive compared to some larger, faster offerings, sure, but not something you're going to be able to use on a long plane ride or at a LAN party without plugging in.


  • Hrel - Tuesday, February 9, 2010 - link

    Now that I've calmed down a little I should add that I'm not buying ANY gpu that doesn't support DX11 EVER again. We've moved past that; DX11 is necessary; no exceptions.
  • JarredWalton - Tuesday, February 9, 2010 - link

    I'm hoping NVIDIA calls me in for a sooper seekrit meeting some time in the next month or two, but right now they're not talking. They're definitely due for a new architecture, but the real question is going to be what they put together. Will the next gen be DX11? (It really has to be at this point.) Will it be a tweaked version of Fermi (i.e. cut Fermi down to a reasonable number of SPs), or will they tack DX11 functionality onto current designs?

    On a different note, I still wish we could get upgradeable notebook graphics, but that's probably a pipe dream. Consider: NVIDIA makes a new mGPU that they can sell to an OEM for $150 or something. OEM can turn that into an MXM module, do some testing and validation on "old laptops", and sell it to a customer for $300 (maybe even more--I swear the markup on mobile GPUs is HUGE!). Or, the OEM could just tell the customer, "Time for an upgrade" and sell them a new $1500 gaming laptop. Do we even need to guess which route they choose? Grrr....
  • Hrel - Tuesday, February 9, 2010 - link

    It STILL doesn't have a screen with a resolution of AT LEAST 1600x900!!! Seriously!? What do I need to do? Get up on roof tops and scream from the top of my lungs? Cause I'm almost to that point. GIVE ME USEABLE SCREENS!!!!!!!
  • MrSpadge - Wednesday, February 10, 2010 - link

    Not everyone's eyes are as good as yours. When I asked some 40+ people whether I had the location right and showed it to them via Google Maps on my HTC Touch Diamond, they refused to even think about it without their glasses.
  • strikeback03 - Thursday, February 11, 2010 - link

    I've never had people complain about using Google Maps on my Diamond. Reading text messages and such, yes, and for a lot of people you can forget about using the internet since they have to zoom the browser so far in, but the maps work fine.
  • GoodRevrnd - Tuesday, February 9, 2010 - link

    Any chance you could add the Macbook / Pro to the LCD quality graphs when you do these comparisons?
  • JarredWalton - Tuesday, February 9, 2010 - link

    Tell Anand to send me a MacBook for testing. :-) (I think he may have the necessary tools now to run the tests, but so far I haven't seen any results from his end.)
  • MrSpadge - Tuesday, February 9, 2010 - link

    Consider this: Fermi and following high end chips are going to beasts, but they might accelerate scientific / engineering apps tremendously. But if I put one into my workstation it's going to suck power even when not in use. It's generating noise, it's heating the room and making the air stuffy. This could easily be avoided with Optimus! It's just that someone had to ditch the old concept of "desktops don't need power saving" even more. 20 W for an idle GPU is not OK.

    And there's more: if I run GP-GPU the screen refresh often becomes sluggish (see BOINC etc.) or the app doesn't run at full potential. With Optimus I could have a high performance card crunch along, either at work or BOINC or whatever, and still get a responsive desktop from an IGP!
  • Drizzt321 - Tuesday, February 9, 2010 - link

    Is there a way to set this to specifically only use the IGP? So turn off the discrete graphics entirely? Like if I'm willing to suffer lower performance but need the extra battery life. I imagine if I could, the UL50Vf could equal the UL80Vt pretty easily in terms of battery life. I'm definitely all for the default being Optimus turned on... but let's say the IGP is more efficient at decoding that 720p or 1080p, yet NVIDIA's profile says it's gotta fire up the discrete GPU. There goes quite a bit of battery life!
  • kpxgq - Wednesday, February 10, 2010 - link

    Depending on the scenario, the discrete GPU may use less power than the IGP... i.e., say a discrete GPU working at 10% vs. an IGP working at 90%.

    Kind of like how using a lower gear on inclines uses less fuel than a higher gear at the same speed, since the engine works less hard. The software should automatically do the math.
