NVIDIA Optimus Demonstration

So how well does Optimus actually work in practice? Outside of a few edge cases, which we'll mention in a moment, the experience was awesome. Some technophiles might still prefer manual control, but the vast majority of users will be extremely happy with the Optimus solution. You no longer need to worry about which video mode you're currently using, as the Optimus driver can switch dynamically. Even when we ran several applications that can benefit from discrete graphics at the same time, we didn't encounter any anomalies. Load up a Flash video and the GPU turns on; load a CUDA application and the GPU stays on, even if you then close the Flash video. It's seamless, and it takes the guesswork out of GPU power management.
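
To make that behavior concrete, here's a minimal sketch of the policy described above: the discrete GPU powers up when the first application that benefits from it launches, and powers down only after the last one exits. This is our own illustration, not NVIDIA's actual driver code, and the class and method names are hypothetical.

```python
class OptimusPolicySketch:
    """Illustrative reference-counting model of the switching behavior
    described in the text -- not actual NVIDIA driver code."""

    def __init__(self):
        self.active_gpu_apps = set()  # apps currently using the dGPU
        self.gpu_powered = False

    def app_launched(self, app_name, benefits_from_dgpu):
        if benefits_from_dgpu:
            self.active_gpu_apps.add(app_name)
            if not self.gpu_powered:
                self.gpu_powered = True  # first dGPU app: power the GPU on
                print(f"dGPU ON (triggered by {app_name})")

    def app_closed(self, app_name):
        self.active_gpu_apps.discard(app_name)
        if not self.active_gpu_apps and self.gpu_powered:
            self.gpu_powered = False  # last dGPU app gone: power down completely
            print("dGPU OFF")


# The scenario from the text: Flash video, then a CUDA app, then close Flash.
policy = OptimusPolicySketch()
policy.app_launched("flash_video", benefits_from_dgpu=True)  # dGPU ON
policy.app_launched("cuda_app", benefits_from_dgpu=True)     # stays on
policy.app_closed("flash_video")                             # still on (CUDA app running)
policy.app_closed("cuda_app")                                # dGPU OFF
```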


NVIDIA provided a demonstration video showing second-generation switchable graphics compared to Optimus. We've uploaded the video to our server for your enjoyment. (Please note that QuickTime is required, and the sample video uses the H.264 codec, so you'll need a reasonable CPU and/or GPU to view it properly.) At the Optimus Deep Dive, NVIDIA gave two other demonstrations on engineering platforms to show how well Optimus works. Sadly, we couldn't take pictures or record videos, but we can talk about the demonstrations.

The first demonstration was an open testbed notebook motherboard using engineering sample hardware. This definitely wasn't the sort of system you'd run at home: a notebook LCD connected to the motherboard via a standard notebook power/video cable, exposed hardware, and so on. The main purpose was to demonstrate how quickly the GPU turns on and off, as well as the fact that the GPU is really OFF. NVIDIA started by booting up Win7, at which point the mobile GPU was off. A small LED on the GPU board would light up when the GPU was on, and the fans would also spin. After Windows finished loading, NVIDIA fired up a simple app on the IGP and nothing changed. Next they started a 3D app and the GPU LED/fan powered up as the application launched; when they shut down the application, the LED/fan powered back off. At one point, with the GPU powered off, NVIDIA removed the GPU module from the system and disconnected its fan; they again loaded a simple application to demonstrate that the system was still fully functional and running off the IGP. (Had they chosen to launch a 3D application at this point, the system would obviously have crashed.) So yes, the GPU in an Optimus laptop really is powered down completely when it's not needed. Very cool!

The second demonstration wasn't quite as impressive, since no one removed a GPU from a running system. This time, Lenovo provided a technology demonstration for NVIDIA showing power draw while running various tasks. The test system was an engineering sample 17" notebook, and we weren't given any details other than that it had an Arrandale CPU and some form of Optimus GPU. The Lenovo notebook ran a custom application showing laptop power draw, updating roughly once per second. After loading Windows 7, idle power sat at 17W. NVIDIA launched a 3D app on the IGP and power draw increased to 32W, but rendering performance was quite slow. Then they launched the same 3D app on the dGPU and power use hit 39W, but with much better 3D performance. After closing the application, power draw dropped right back to 17W within a matter of seconds. At present there is no word on if/when this Arrandale-based laptop will ship, but it's a safe bet that if Lenovo can provide engineering sample hardware, they're likely to ship Optimus laptops in the future.
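
For a rough sense of what those numbers mean for battery life, here's a quick back-of-the-envelope calculation using the measured power figures. The 56 Wh battery capacity is purely our own assumption for illustration; Lenovo didn't provide battery specs for the engineering sample.

```python
# Measured system power draw from the Lenovo demo (watts)
power_draw = {
    "idle (dGPU off)": 17,
    "3D app on IGP": 32,
    "3D app on dGPU": 39,
}

BATTERY_WH = 56  # hypothetical battery capacity; not a figure from the demo

for state, watts in power_draw.items():
    hours = BATTERY_WH / watts
    print(f"{state}: {watts} W -> roughly {hours:.1f} hours on a {BATTERY_WH} Wh battery")

# Roughly 3.3 hours at idle, 1.8 hours on the IGP, and 1.4 hours on the dGPU --
# the key point being that the dGPU costs nothing once the 3D app is closed.
```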

The final "demonstration" is more in line with what we like to see. Not only did NVIDIA show us several running Optimus notebooks, but they also provided each of the attendees with an ASUS UL50Vf sample for review. The UL50Vf should be available for purchase today, and it sounds like the only reason NVIDIA delayed the Optimus launch until now was so that they could have hardware available for end-user purchase. The final part of our Optimus overview will be a review of the ASUS UL50Vf.

49 Comments

  • MasterTactician - Wednesday, February 10, 2010 - link

    But doesn't this solve the problem of an external graphics solution not being able to use the laptop's display? If the external GPU can pass the rendered frames back to the IGP's buffer via PCI-E, then it's problem solved, isn't it? So the real question is: will nVidia capitalize on this?
  • Pessimism - Tuesday, February 9, 2010 - link

    Did Nvidia dip into the clearance bin again for chip packaging materials? Will the laptop survive its warranty period unscathed? What of the day after that?
  • HighTech4US - Tuesday, February 9, 2010 - link

    You char-lie droid ditto-heads are really something.

    You live in the past and swallow everything char-lie spews.

    Now go back to your church's web site semi-inaccurate and wait for your next gospel from char-lie.
  • Pessimism - Tuesday, February 9, 2010 - link

    So I suppose the $300+ million in charges Nvidia took were merely a good faith gesture for the tech community, with no basis in fact regarding an actual defect.
  • HighTech4US - Tuesday, February 9, 2010 - link

    The $300+ million was in fact a good faith measure to the vendors who bought the products that had the defect. nVidia backed extended warranties and picked up the total cost of repairs.

    The defect has been fixed long ago.

    So your char-lie comment as if it still exists deserves to be called what it is. A char-lie.
  • Pessimism - Tuesday, February 9, 2010 - link

    So you admit that a defect existed. That's more than can be said for several large OEM manufacturers.


  • Visual - Tuesday, February 9, 2010 - link

    Don't get me wrong - I really like the ability to have a long battery life when not doing anything and also have great performance when desired. And if switchable graphics is the way to achieve this, I'm all for it.

    But it seems counter-productive in some ways. If the external GPU was properly designed in the first place, able to shut down power to the unused parts of the processor, supporting low-power profiles, then we'd never have needed switching between two distinct GPUs. Why did that never happen?

    Now that Intel, and eventually AMD too, are integrating a low-power GPU inside the CPU itself, I guess there is no escaping from switchable graphics any more. But I just fail to see why NVidia or ATI couldn't have done it the proper way before.
  • AmdInside - Tuesday, February 9, 2010 - link

    Because it's hard to compete with a company that is giving away integrated graphics for free (Intel) in order to move higher priced CPUs. In a way, AMD is doing the same with ATI. Giving us great motherboards with ATI graphics and cheap cheap prices (which in many ways are much better than Intel's much higher priced offerings) in order to get you to buy AMD CPUs.
  • JarredWalton - Tuesday, February 9, 2010 - link

    Don't forget that no matter how many pieces of a GPU go into a deep sleep state (i.e. via power gate transistors), you would still have some extra stuff receiving power: VRAM for example, plus any transistors/resistors. At idle the CULV laptops are down around 8 to 10W; even the WiFi card can suck up 0.5 to 1W and make a pretty substantial difference in battery life. I'd say the best you're likely to see from a discrete GPU is idle power draw that's around 3W over and above what an IGP might need, so a savings of 3W could be a 30% power use reduction.
  • maler23 - Tuesday, February 9, 2010 - link

    I've been waiting for this article ever since it was hinted at in the last CULV roundup. The ASUS laptop is a little disappointing, especially the graphics card situation (the Alienware M11X kind of sucked up a lot of excitement there). Frankly, I'd just take a discounted UL-30VT and deal with manual graphics switching.

    Couple of questions:

    -Any chance for a review of the aforementioned Alienware M11X soon?

    -I've seen a couple of reviews with display quality comparisons, including this one. How do the MacBook and MacBook Pros fit into the rankings?

    cheers!

    -J
