ASUS UL50Vf Battery Life and Power

[Charts: Battery Life - Idle, Battery Life - Internet, Battery Life - DivX 720p, Battery Life - x264 720p, Relative Battery Life]

As with the UL80Vt (and all CULV laptops), battery life is a very strong selling point. Putting CULV into a 15.6" chassis and adding a discrete GPU wouldn't be the first choice of most users, however, and here we see the UL50Vf falling behind the UL80Vt. As far as we can tell, the major difference comes down to the LCD, and the result is that the UL80Vt is able to deliver anywhere from 15 (x264) to 215 (idle) minutes more battery life. The Internet test is probably the best overall indication of battery life in common usage scenarios, and even there the 14" UL80Vt delivers 11% more battery life.

This is not to say that the UL80Vt is the better laptop, of course; if the choice is the UL50Vf with Optimus or the UL80Vt with second-generation switchable graphics, we'd definitely recommend the UL50Vf. However, it does raise the question of why NVIDIA/ASUS would launch the 15.6" model first. Smaller models should follow soon, along with faster, more powerful laptops like the ASUS N61.

NVIDIA's presentation suggests that Optimus allows you to get the best of both worlds: performance as well as battery life. As the test results so far have shown, all of that is possible. However, do keep in mind that you still can't get performance at the same time as long battery life. If you fire up a game on virtually any laptop, even with IGP, battery drain increases substantially. We performed a test of exactly that sort of scenario and the UL50Vf delivered 178 minutes of run time—impressive compared to some larger, faster offerings, sure, but not something you're going to be able to use on a long plane ride or at a LAN party without plugging in.
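
For a rough sense of how average power draw translates into the run times above, here's a minimal back-of-the-envelope sketch (Python). The 84Wh battery capacity and the wattage figures in it are illustrative assumptions rather than measured values from our testing.

    # Rough runtime estimate: runtime (hours) = battery capacity (Wh) / average draw (W).
    # The capacity and draw figures below are illustrative assumptions, not measurements.
    BATTERY_WH = 84.0  # assumed 8-cell pack capacity

    def runtime_minutes(avg_draw_watts, capacity_wh=BATTERY_WH):
        """Estimate run time in minutes for a given average system power draw."""
        return capacity_wh / avg_draw_watts * 60

    for label, watts in [("idle (~8W)", 8.0), ("Internet (~12W)", 12.0), ("gaming (~28W)", 28.0)]:
        print(f"{label:>16}: {runtime_minutes(watts):4.0f} minutes")

Plugging in a gaming-level draw of roughly 28W, for instance, yields about three hours, which is in the same ballpark as the 178 minutes measured above.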

Comments

  • MasterTactician - Wednesday, February 10, 2010 - link

    But doesn't this solve the problem of an external graphics solution not being able to use the laptop's display? If the external GPU can pass the rendered frames back to the IGP's buffer via PCI-E, then the problem is solved, isn't it? So the real question is: will nVidia capitalize on this?
  • Pessimism - Tuesday, February 9, 2010 - link

    Did Nvidia dip into the clearance bin again for chip packaging materials? Will the laptop survive its warranty period unscathed? What of the day after that?
  • HighTech4US - Tuesday, February 9, 2010 - link

    You char-lie droid ditto-heads are really something.

    You live in the past and swallow everything char-lie spews.

    Now go back to your church's web site, semi-inaccurate, and wait for your next gospel from char-lie.
  • Pessimism - Tuesday, February 9, 2010 - link

    So I suppose the $300+ million in charges Nvidia took were merely a good-faith gesture for the tech community, with no basis in fact regarding an actual defect.
  • HighTech4US - Tuesday, February 9, 2010 - link

    The $300+ million was in fact a good-faith measure to the vendors who bought the products that had the defect. nVidia backed extended warranties and picked up the total cost of repairs.

    The defect was fixed long ago.

    So your char-lie comment, written as if it still exists, deserves to be called what it is: a char-lie.
  • Pessimism - Tuesday, February 9, 2010 - link

    So you admit that a defect existed. That's more than can be said for several large OEM manufacturers.


  • Visual - Tuesday, February 9, 2010 - link

    Don't get me wrong - I really like the ability to have a long battery life when not doing anything and also have great performance when desired. And if switchable graphics is the way to achieve this, I'm all for it.

    But it seems counter-productive in some ways. If the external GPU was properly designed in the first place, able to shut down power to the unused parts of the processor, supporting low-power profiles, then we'd never have needed switching between two distinct GPUs. Why did that never happen?

    Now that Intel, and eventually AMD too, are integrating a low-power GPU inside the CPU itself, I guess there is no escaping from switchable graphics any more. But I just fail to see why NVidia or ATI couldn't have done it the proper way before.
  • AmdInside - Tuesday, February 9, 2010 - link

    Because it's hard to compete with a company that is giving away integrated graphics for free (Intel) in order to move higher-priced CPUs. In a way, AMD is doing the same with ATI, giving us great motherboards with ATI graphics at cheap, cheap prices (which in many ways are much better than Intel's much higher-priced offerings) in order to get you to buy AMD CPUs.
  • JarredWalton - Tuesday, February 9, 2010 - link

    Don't forget that no matter how many pieces of a GPU go into a deep sleep state (i.e. via power gate transistors), you would still have some extra stuff receiving power. VRAM for example, plus any transistors/resistors. At idle the CULV laptops are down around 8 to 10W; even the WiFi card can suck up 0.5 to 1W and make a pretty substantial difference in battery life. I'd say the best you're likely to see from a discrete GPU is idle power draw that's around 3W over and above what an IGP might need, so a savings of 3W could be a 30% power use reduction.
  • maler23 - Tuesday, February 9, 2010 - link

    I've been waiting for this article ever since it was hinted at in the last CULV roundup. The ASUS laptop is a little disappointing, especially the graphics card situation (the Alienware M11X kind of sucked up a lot of excitement there). Frankly, I'd just take a discounted UL-30VT and deal with manual graphics switching.

    Couple of questions:

    -Any chance for a review of the aforementioned Alienware M11X soon?

    -I've seen a couple of reviews with display quality comparisons, including this one. How do the MacBook and MacBook Pros fit into the rankings?

    cheers!

    -J
