AVADirect was kind enough to provide us with our testing unit, a specially equipped Clevo W880CU, and a refresher of the notebook's configuration is below:

AVADirect Clevo W880CU Specifications
Processor: Intel Core i7-820QM (4x1.73GHz, 45nm, 8MB L3, Turbo to 3GHz, 45W)
Chipset: Intel PM55
Memory: 2x2GB DDR3-1333 (Max 2x4GB)
Graphics: NVIDIA GeForce GTX 480M 2GB GDDR5 (352 CUDA Cores, 425MHz/800MHz/2.4GHz Core/Shader/RAM clocks)
Display: 17.3" LED Glossy 16:9 1080p (1920x1080)
Hard Drive(s): Seagate Momentus XT 500GB 7200 RPM Hybrid Drive (additional empty bay with RAID 0/1 capability)
Optical Drive: Blu-ray Writer
Networking: Gigabit Ethernet, Intel Centrino Ultimate-N 6300 (a/b/g/n), Clevo Bluetooth, V.92 56K Modem
Audio: Realtek ALC888/1200 HD Audio, 4.1 speakers with line-in, mic, optical, and headphone jacks (capable of 5.1)
Battery: 3-Cell, 12V, 48Wh
Operating System: Windows 7 Home Premium 64-bit
Pricing: $2936.80 as configured from AVADirect

We ran the W880CU through our usual lineup of Futuremark synthetic benchmarks, bouncing between four different versions of 3DMark and two different PCMarks. The matchup you'll want to watch is how the W880CU compares against the W860CUs; these three units are all equipped with an Intel Core i7-820QM processor and 4GB of DDR3, making them fairly ideal comparisons. The only difference that may affect scores is the use of the Corsair Nova SSDs in the W860s.

[Benchmark charts: Futuremark 3DMark Vantage (two charts), 3DMark06, 3DMark05, 3DMark03, PCMark Vantage, and PCMark05]

The first thing to notice is that the GeForce GTX 480M takes the W880CU to the top of the class in almost every 3DMark benchmark; in fact, the newer the 3DMark gets, the wider the 480M's lead. The only exceptions are units equipped with dual-GPU solutions. PCMark is much less favorable, but the reduced scores are very likely attributable to the SSDs used in the higher scoring test systems.

So how does the GeForce GTX 480M fare in actual gaming scenarios?

Comments

  • 7Enigma - Thursday, July 8, 2010 - link

    That's really cool. Thanks for the post. My cousin does Adobe work and I believe he has the GTS 250 with either 512MB or 1GB of memory. I'll have to try this out the next time I'm over at his place.
  • therealnickdanger - Thursday, July 8, 2010 - link

    The best part is that Adobe is aware of this tweak and has no plans to "turn it off". While using this method is not officially supported, it appears to be unofficially encouraged.

    :)
  • B3an - Thursday, July 8, 2010 - link

    Surely NV will be supporting CS5 with at least the 4xx series? Why only have the GTX 285 support it for non-workstation cards?
  • therealnickdanger - Friday, July 9, 2010 - link

    NVIDIA doesn't have much say in the matter. It's Adobe's software, Adobe's engine.

    The 4xx series works exceptionally well with the tweak, so it's a non-issue anyway.
  • Gunbuster - Thursday, July 8, 2010 - link

    Can we get a benchmark with a CrossfireX HD 5870 Laptop?
  • frozentundra123456 - Thursday, July 8, 2010 - link

    I like the idea of a 1000-1500 dollar gaming notebook for moderate gaming, but I don't think this notebook is anywhere near worth the price. For 3000 dollars, one could buy a mid-level notebook for moderate gaming and buy/build a 1500 dollar desktop that would have excellent performance.
  • angelkiller - Thursday, July 8, 2010 - link

    I'm still not satisfied with their naming scheme. I do think this is a step in the right direction though. This time at least the name refers to the correct architecture. But the GTX 480M isn't a mobile version of a GTX 480. It's more like a GTX 465M. And this isn't just a Nvidia problem. The Mobility 5870 isn't a mobile version of a 5870.

    I think the idea of naming laptop cards after desktop cards is flawed to begin with. Instead, laptop cards should have their own series name. Then the name would never be misleading. Then the ATI Mobility <Series Name> could be based off the desktop Juniper chip and nobody would care. The name wouldn't refer to something that the card isn't. Hopefully that made sense.

    I also wanted to say that I've really been digging the articles AT has been putting out lately. Very thorough and informative analysis. Keep it up!
  • anactoraaron - Thursday, July 8, 2010 - link

    I completely agree. The 480M isn't "properly named". It should be named 465M.

    Also, I could care less who (nVidia or ATI) has the 'fastest' card as long as it's practical... MSI has a new laptop (reviewed here on AT) that still gets 2~3 hrs of battery life with a mobility 5870. In my mind, the superior product is the one that can actually be used not plugged in all of the time. And I don't need to re-hash all of the impractical reasons to get the desktop fermi... I still can't get the "epic fail" taste out of my mouth from this series of graphics cards from nVidia.
  • Dustin Sklavos - Friday, July 9, 2010 - link

    The thing is, at least the 480M is the same freaking silicon as the desktop 480. It may be crippled, but it's the same chip. The same can't be said about...well...pretty much anything else in Nvidia's mobile lineup. ATI was doing well in the 4 series, but their 5 series is nearly as bad. 5700s = desktop 5600s, 5800s = desktop 5700s.
  • therealnickdanger - Friday, July 9, 2010 - link

    That doesn't make sense. The desktop 470 and 465 are also "crippled" versions of the 480, but at least they are appropriately named. That's the point.

    "NVIDIA's GTX 480M uses the same cut-down core found in desktop GeForce GTX 465 cards."

    480M:
    352 CUDA Cores
    256-bit
    GDDR5

    GTX465 Desktop:
    352 CUDA Cores
    256-bit
    GDDR5

    GTX480 Desktop:
    480 CUDA Cores
    384-bit
    GDDR5

    So logically, if the 480M is the SAME as the desktop 465... then it should be called the 465M, not the 480M. Technically speaking, NVIDIA does NOT make a mobile GTX 480. It's misleading and just plain nonsense.

    ATI is no better.
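For readers curious about the CS5 "tweak" discussed near the top of the comments, the commonly described method is to add your GPU's name to Adobe's CUDA whitelist so the Mercury Playback Engine will accelerate on a non-Quadro card. Below is a minimal Python sketch of that edit; the install path, the cuda_supported_cards.txt file name, and the "GeForce GTX 480M" string are assumptions to verify on your own system (the GPUSniffer.exe utility in the same folder can report the exact name the driver uses).

# Minimal sketch: append a GPU to Adobe CS5's CUDA whitelist.
# Path, file name, and card string are assumptions; verify them first,
# and run with administrator rights since the file lives under Program Files.
from pathlib import Path

WHITELIST = Path(r"C:\Program Files\Adobe\Adobe Premiere Pro CS5\cuda_supported_cards.txt")
CARD_NAME = "GeForce GTX 480M"  # must match the name the driver reports

def add_card(whitelist: Path, card: str) -> None:
    """Append the card to the whitelist if it isn't already listed."""
    lines = whitelist.read_text().splitlines() if whitelist.exists() else []
    if card in (line.strip() for line in lines):
        print(f"'{card}' is already whitelisted.")
        return
    with whitelist.open("a") as f:
        f.write(card + "\n")
    print(f"Added '{card}' to {whitelist}")

if __name__ == "__main__":
    add_card(WHITELIST, CARD_NAME)

The same change can of course be made by hand in any text editor; the script just guards against duplicate entries.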
