Application Performance and Battery Life

Just to be completely safe, we ran three additional tests to see how the NVIDIA GeForce GTX 480M might affect system performance outside of gaming. The usual suspects are here: Futuremark's Peacekeeper, Cinebench R10, and our x264 video encoding test.

Internet Performance

3D Rendering—CINEBENCH R10

Video Encoding—x264

In each situation, the W880CU with its Intel Core i7-820QM falls in line with the previously tested W860CU systems.

We recently had a discussion in our forums that ultimately degenerated a bit into the old NVIDIA vs. ATI war: is NVIDIA hardware a superior option if you'll be using Adobe software? Adobe and NVIDIA both proudly tout increased GPU reliance in Creative Suite 5, culminating in what Adobe calls its "Mercury Playback Engine" in Premiere Pro CS5, a playback system supposedly accelerated by CUDA.

I personally use Premiere Pro and After Effects CS5 for video work on my own desktop, equipped with an ATI Radeon HD 5870, and in none of the CS5 applications have I ever felt like I was missing any secret sauce. It's important to note that features aren't going to be disabled if you aren't running NVIDIA kit, but we figured we'd give the GTX 480M a chance to prove itself in Premiere Pro CS5.

That didn't happen. Presently the 480M isn't supported in CS5; in fact the only NVIDIA hardware supported by the Mercury Playback Engine are the GeForce GTX 285 and several of NVIDIA's expensive workstation-class cards. NVIDIA informs us that other GPUs like the 480M are not supported at launch but Adobe is planning on increasing the number of supported GPUs in the near future. How long that will take is difficult to say (Flash 10.1 took over six months to go from Beta to final release), but at some point in the future Adobe should patch in support for additional NVIDIA hardware. For now, that means we can't make a convincing case for the GTX 480M against the competition if you're going to be using Adobe CS5 software.
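
For the curious, the Mercury Playback Engine's gate is less about raw capability than about a whitelist: Premiere Pro CS5 ships a plain text file, cuda_supported_cards.txt, in its install folder, and CUDA acceleration only switches on when the name of the detected GPU appears in that list. The Python sketch below is ours rather than Adobe's code, and the install path is an assumption (the default 64-bit Windows location); it simply models how blunt the check is.

    from pathlib import Path

    # Default 64-bit Windows install location for Premiere Pro CS5 (an assumption;
    # adjust to wherever CS5 lives on your system).
    CS5_DIR = Path(r"C:\Program Files\Adobe\Adobe Premiere Pro CS5")
    WHITELIST = CS5_DIR / "cuda_supported_cards.txt"

    def mercury_cuda_allowed(detected_gpu):
        """Rough model of the gate: the detected GPU name must exactly match
        an entry in cuda_supported_cards.txt."""
        if not WHITELIST.exists():
            return False
        cards = {line.strip() for line in WHITELIST.read_text().splitlines() if line.strip()}
        return detected_gpu in cards

    # The GTX 480M isn't on the launch list, so this comes back False.
    print(mercury_cuda_allowed("GeForce GTX 480M"))

That blunt name check is also why the workaround readers describe in the comments below, hand-editing that text file, works at all.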

If you were looking for a healthy benefit to the 480M, though, you can check out how well it sips power at idle.

Battery Life—Idle

Battery Life—Internet

Battery Life—x264 720p

Relative Battery Life

It's true the 3-cell battery in most of these Clevo notebooks is essentially a UPS system, but the ability of NVIDIA's chips to power down so much at idle (ignoring the clear benefits of Optimus in other notebooks) is nonetheless appreciated. The more power a chip draws, the more heat it's liable to produce, and thus the harder the cooling system of the notebook is going to have to work. While you would almost never take this monster off its leash, at least the GTX 480M is pulling its weight by not chugging power from the anemic battery. It essentially matches the ASUS G73Jh in power requirements when unplugged, albeit with a battery that's half the capacity.

Do note that, as with other Clevo notebooks, the W880CU kicks the GPU into "limp mode" on battery power regardless of settings, so you're not going to be playing 3D games at high detail (for the 20 minutes the battery would last) even if you want to.

Comments

  • james.jwb - Thursday, July 8, 2010 - link

    Forgot that I use magnification for this site. It's definitely the main cause of the huge performance hit, ouch! (dual-core, pretty fast machine really).

    I think it would be a lot easier if the space now used for the carousel became something static along the lines of Engadget's chunk for "top stories". It's nice to have something there to point out important reviews/news -- I wouldn't want to see the idea completely gone, it's just that a carousel is so December 2009 :-)
  • Spoelie - Thursday, July 8, 2010 - link

    While it seems generally true that power keeps increasing from generation to generation (3870, 4870, 5870), wasn't the big drop from the HD2900 series conveniently left out to make that statement stick?

    It's not really that power always increases; there's a ceiling that was reached a few generations ago, and the only thing you can say is that the latest generations are generally closer to that ceiling than most of the ones before them. What the desktop GTX 480 pulls is about the most we will ever see in a desktop, barring some serious cooling/housing/power redesigns.
  • bennyg - Thursday, July 8, 2010 - link

    The 2900 was the P4 of the gfx card world regarding power/performance. It was only released because ATi had to have something, anything, in the marketplace. If ATi had as much cash in the bank as Intel did, they would have cancelled the 2900 like Intel did Larrabee.

    Thankfully the 2900 went on from its prematurity to underpin radeons 3, 4 and 5. Whereas Prescott was just brute force attempting to beat thermodynamics. Ask Tejas what won :)
  • JarredWalton - Thursday, July 8, 2010 - link

    That's why I said "generally trending up". When the HD 2900 came out, I'm pretty sure most people had never even considered such a thing as a 1200W PSU. My system from that era has a (very large for its time) 700W PSU, for example. The point of the paragraph is that while desktops have a lot of room for power expansion, there's a pretty set wall on notebooks right now. Not that I really want a 350W power brick.... :)
  • 7Enigma - Thursday, July 8, 2010 - link

    Thank you for the article as many of us (from an interest standpoint and not necessarily from a buyer's standpoint) were waiting for the 480M in the wild.

    My major complaint with the article is that this is essentially a GPU review. Sure, it's in a laptop since this is a notebook review, but the only thing discussed here was the difference between GPUs.

    With that being the case, why are there no POWER CONSUMPTION numbers when gaming? It's been stated for almost every AVADirect laptop that these are glorified portable desktop computers with batteries that are essentially used only for moving from one outlet to the next.

    I think the biggest potential pitfall for the new 480M, with performance only marginally better than the 5870 (disgusts me to even write that name due to the neutered design), is how much more power it draws from the wall during these gaming scenarios.

    Going along with power usage would be fan noise, of which I see nothing mentioned in the review. Needing that much more juice under load should surely increase fan noise compared to the 5870... right?

    These are two very quick measurements that could be done to beef up the substance of an otherwise good review.
  • 7Enigma - Friday, July 9, 2010 - link

    Really no one else agrees? Guess it's just me then.....
  • JarredWalton - Friday, July 9, 2010 - link

    We're working to get Dustin a power meter. Noise testing requires a bit more hardware, so we probably won't have that for the time being, unfortunately. I brought this up with Anand, though, and when he gets his meter Dustin can respond (and/or update the article text).
  • 7Enigma - Monday, July 12, 2010 - link

    Thanks Jarred!

    For all the other laptop types I don't think it matters, but for these glorified UPS systems it would be an important factor when purchasing.

    Thanks again for taking the time to respond.
  • therealnickdanger - Thursday, July 8, 2010 - link

    "Presently the 480M isn't supported in CS5; in fact the only NVIDIA hardware supported by the Mercury Playback Engine are the GeForce GTX 285 and several of NVIDIA's expensive workstation-class cards."

    I did the following with my 1GB 9800GT and it's an incredible boost. Multiple HD streams with effects without pausing.

    http://forums.adobe.com/thread/632143

    I figured out how to activate CUDA acceleration without a GTX 285 or Quadro... I'm pretty sure it should work with other 200-series GPUs. Note that I'm using two monitors and there's an extra tweak to get CUDA working seamlessly with two monitors.
    Here are the steps:
    Step 1. Go to the Premiere CS5 installation folder.
    Step 2. Find the file "GPUSniffer.exe" and run it in a command prompt (cmd.exe). You should see something like this:
    ----------------------------------------------------
    Device: 00000000001D4208 has video RAM(MB): 896
    Device: 00000000001D4208 has video RAM(MB): 896
    Vendor string: NVIDIA Corporation
    Renderer string: GeForce GTX 295/PCI/SSE2
    Version string: 3.0.0
    OpenGL version as determined by Extensionator...
    OpenGL Version 2.0
    Supports shaders!
    Supports BGRA -> BGRA Shader
    Supports VUYA Shader -> BGRA
    Supports UYVY/YUYV ->BGRA Shader
    Supports YUV 4:2:0 -> BGRA Shader
    Testing for CUDA support...
    Found 2 devices supporting CUDA.
    CUDA Device # 0 properties -
    CUDA device details:
    Name: GeForce GTX 295 Compute capability: 1.3
    Total Video Memory: 877MB
    CUDA Device # 1 properties -
    CUDA device details:
    Name: GeForce GTX 295 Compute capability: 1.3
    Total Video Memory: 877MB
    CUDA Device # 0 not choosen because it did not match the named list of cards
    Completed shader test!
    Internal return value: 7
    ------------------------------------------------------------
    If you look at the last line, it says the CUDA device wasn't chosen because it's not in the named list of cards. That's fine. Let's add it.

    Step 3. Find the file "cuda_supported_cards.txt" and edit it to add your card (take the name from the line "Name: GeForce GTX 295 Compute capability: 1.3").
    So in my case the name to add is: GeForce GTX 295

    Step 4. Save that file and we're almost ready.

    Step 5. Go to your NVIDIA driver control panel (I'm using the latest 197.45), and under "Manage 3D Settings", click "Add", browse to your Premiere CS5 install directory, and select the executable file "Adobe Premiere Pro.exe".

    Step 6. In the field "multi-display/mixed-GPU acceleration", switch from "multiple display performance mode" to "compatibility performance mode".

    Step 7. That's it. Boot Premiere, go to your project settings / general, and activate CUDA.
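
    A minimal Python sketch of steps 2 through 4 (not from the comment above; it assumes the default 64-bit CS5 install path, needs to run with administrator rights since the whitelist lives under Program Files, and you should back up cuda_supported_cards.txt before trying it):

    import re
    import subprocess
    from pathlib import Path

    # Default CS5 location on 64-bit Windows; adjust to your install (assumption).
    CS5_DIR = Path(r"C:\Program Files\Adobe\Adobe Premiere Pro CS5")

    def detected_cuda_names():
        """Run Adobe's GPUSniffer and pull every 'Name: ... Compute capability:' entry."""
        out = subprocess.run([str(CS5_DIR / "GPUSniffer.exe")],
                             capture_output=True, text=True).stdout
        return set(re.findall(r"Name:\s*(.+?)\s*Compute capability:", out))

    def add_to_whitelist(card):
        """Append the card to cuda_supported_cards.txt if it isn't already listed."""
        wl = CS5_DIR / "cuda_supported_cards.txt"
        cards = [line.strip() for line in wl.read_text().splitlines() if line.strip()]
        if card not in cards:
            wl.write_text("\n".join(cards + [card]) + "\n")

    for card in detected_cuda_names():
        add_to_whitelist(card)
        print("Whitelisted:", card)

    Steps 5 and 6 (the driver control panel changes for multi-monitor setups) still have to be done by hand.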
  • therealnickdanger - Thursday, July 8, 2010 - link

    Sorry, I should have said for ANY CUDA card with 786MB RAM or more. It's quite remarkable.
