NVIDIA GeForce 500M Mobile Graphics Introduction

The difference between NVIDIA and AMD on the desktop can seem a bit blurry in places, but in notebooks it's night and day. That's not a measure of quality so much as a measure of a radical difference in both features and performance.

While NVIDIA leverages major benefits like a better mobile driver program and Optimus graphics switching technology, plus corner cases with PhysX, CUDA, and 3D Vision (note that 3D notebooks using AMD hardware are also available), the mobile space is also where NVIDIA's most creative marketing is liable to surface. Once you realize which desktop GPUs are powering which notebook models, you begin to appreciate both just how dire the entry level on the desktop still is and how nutty NVIDIA's mobile branding has been.

Even though NVIDIA does have advantages (particularly Optimus across the entire 500M line), their lineup can seem downright bizarre compared to AMD's bloated one, and the specs for the GT 555M are honestly something profane. NVIDIA's also still leveraging roughly the same chips that were introduced with the entire Fermi line, though their progress at least isn't anywhere near as sluggish as the 9000M/100M/200M/300M era.

Another important difference is that while AMD and Intel's graphics hardware employ a single clock domain for the chip itself, NVIDIA's chips have had separate core and shader domains for some time now. As a result, there's a "core clock" that will refer to roughly everything on the GPU that isn't a shader or "CUDA core," and a "shader clock" that refers to the clocks of the "CUDA cores."
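As a quick illustration of the split clock domains, here's a minimal sketch using core clocks pulled from the spec tables below; across the 500M line the shader ("CUDA core") domain runs at exactly twice the core clock:

```python
# Core clocks (MHz) for a few 500M parts, taken from the spec tables in this article.
core_clocks_mhz = {"GT 520M": 740, "GT 540M": 672, "GTX 560M": 775}

# Everything on the GPU that isn't a CUDA core runs at the core clock; the CUDA
# cores themselves sit in a separate domain at double that rate.
shader_clocks_mhz = {part: core * 2 for part, core in core_clocks_mhz.items()}

for part in core_clocks_mhz:
    print(f"{part}: core {core_clocks_mhz[part]}MHz, shader {shader_clocks_mhz[part]}MHz")
```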

NVIDIA GeForce GT 520M/520MX
48 CUDA Cores, 8 TMUs, 4 ROPs, Core Clocks: 740MHz/900MHz (520M/520MX), Shader Clocks: 1480MHz/1800MHz (520M/520MX)
64-bit Memory Bus, DDR3, Effective Memory Clocks: 1.6GHz/1.8GHz (520M/520MX)
Desktop Counterpart: GeForce GT 520 (GF119)

One of the major ways NVIDIA beefed up their previous mobile line was the general elimination of 64-bit memory buses: there wasn't a single one of those performance bottlenecks in the primary 400M line, but look everybody, it's back! The GT 520M and 520MX occupy the same space as the Mobility Radeon HD 6300 and 6400 series, as a dedicated chip for corner cases. It's also slower than the GT 420M it replaces, which had both double the CUDA cores and double the memory bus width. Basically inadequate for any kind of gaming, the 520s don't offer anything over the Sandy Bridge IGP that you don't get just by virtue of having NVIDIA hardware. (No review available.)
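To put a number on why that 64-bit bus is such a bottleneck, here's a back-of-the-envelope bandwidth calculation comparing the GT 520M against the 128-bit DDR3 parts in the next section (peak bandwidth is bus width in bytes times the effective transfer rate):

```python
def memory_bandwidth_gbps(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s: bus width (bytes) x effective transfer rate."""
    return (bus_width_bits / 8) * effective_clock_mhz / 1000

# GT 520M: 64-bit DDR3 at 1.6GHz effective
gt520m = memory_bandwidth_gbps(64, 1600)   # 12.8 GB/s
# GT 540M: 128-bit DDR3 at 1.8GHz effective
gt540m = memory_bandwidth_gbps(128, 1800)  # 28.8 GB/s

print(f"GT 520M: {gt520m} GB/s vs GT 540M: {gt540m} GB/s")
```

Less than half the bandwidth of even the DDR3-equipped midrange parts, before GDDR5 enters the picture.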

NVIDIA GeForce GT 525M/540M/550M
96 CUDA Cores, 16 TMUs, 4 ROPs, Core Clocks: 600MHz/672MHz/740MHz (525M/540M/550M), Shader Clocks: 1200MHz/1344MHz/1480MHz (525M/540M/550M)
128-bit Memory Bus, DDR3, Effective Memory Clock: 1.8GHz
Desktop Counterpart: GeForce GT 430 (GF108)

While the GF108 that powers these three largely indistinguishable chips is slightly slower than the already anemic desktop GeForce GT 240 with the same number of shaders, it's a healthy boost for the low-to-mid end. The differences between these three are strictly clock speeds, and anecdotal experience with overclocking mobile NVIDIA chips has generally been very positive, so the enterprising end user with the skill for it can probably get the 525M to gain about 25 model points. At about the 540M mark, though, 1600x900 gaming starts becoming a real possibility. The chip is still hampered by the memory bus (and NVIDIA has had a harder time taming GDDR5 than AMD has), but it's an effective midrange solution. (The GeForce GT 425M in the Clevo B5130M review will be slightly slower than a GT 525M; the Dell XPS 15 L502x review has a GeForce GT 540M.)
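Since the only difference between these three is clock speed, it's easy to quantify how far a 525M would have to be pushed to impersonate its bigger siblings, using the core clocks from the spec line above:

```python
# Core clocks (MHz) from the spec line above; the three parts are otherwise identical.
gt525m_core, gt540m_core, gt550m_core = 600, 672, 740

# Percentage overclock needed for a GT 525M to hit stock 540M and 550M clocks.
to_540m_pct = (gt540m_core / gt525m_core - 1) * 100
to_550m_pct = (gt550m_core / gt525m_core - 1) * 100

print(f"525M -> 540M: +{to_540m_pct:.1f}% core; 525M -> 550M: +{to_550m_pct:.1f}% core")
```

A roughly 12% bump reaches 540M territory, and about 23% reaches the 550M, which is within the realm of plausibility given how well these mobile chips have overclocked anecdotally.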

NVIDIA GeForce GT 555M "A"
96 CUDA Cores, 16 TMUs, 4 ROPs, Core Clock: 753MHz, Shader Clock: 1506MHz
128-bit Memory Bus, GDDR5, Effective Memory Clock: 3138MHz
Desktop Counterpart: GeForce GT 440 GDDR5 (GF108)

And this is where NVIDIA's mobile lineup completely loses its mind. The GeForce GT 555M is actually two completely different chips and configurations; the "A" and "B" are our designation. Our "A" configuration is essentially just a souped-up version of the GT 525M/540M/550M, with a higher core clock and the benefit of GDDR5. While NVIDIA lists both versions on their site (though lacking an explanation as to why this split was made), a glance at NewEgg suggests this "A" version is the more common of the two, powering MSI and Lenovo laptops while the "B" version resides almost exclusively in an Alienware. You can recognize the "A" version by the use of GDDR5, but since it and the "B" version are so bizarrely matched we can't really tell definitively which one would be the faster of the two. (No review available.)

NVIDIA GeForce GT 555M "B"
144 CUDA Cores, 24 TMUs, 24 ROPs, Core Clock: 590MHz, Shader Clock: 1180MHz
192-bit Memory Bus, DDR3, Effective Memory Clocks: 1.8GHz
Desktop Counterpart: None (GF106)

The other configuration of the GT 555M is a substantially beefier chip with six times the ROPs, but it operates at lower clocks and lower memory bandwidth due to the use of DDR3 instead of GDDR5. It's essentially a die-harvested version of GF106, and is identifiable by both the use of DDR3 and memory configurations of either 1.5GB or 3GB. It remains inexplicable why NVIDIA decided to use two completely different chips for the GT 555M, but hopefully this makes it a little easier to tell which is which. Raw calculations of pixel and texture fillrate suggest this "B" configuration to be the faster of the two, and as such it's probably the one to look for. Thus far we've only seen it in the Alienware M14x. (No review available.)
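Those raw fillrate calculations can be reproduced directly from the two spec lines above (theoretical pixel fillrate is ROPs times core clock, texture fillrate is TMUs times core clock):

```python
def fillrates(rops, tmus, core_mhz):
    """Theoretical pixel and texture fillrates in Gpixels/s and Gtexels/s."""
    return rops * core_mhz / 1000, tmus * core_mhz / 1000

# GT 555M "A": 4 ROPs, 16 TMUs at 753MHz (from the spec line above)
a_pix, a_tex = fillrates(rops=4, tmus=16, core_mhz=753)
# GT 555M "B": 24 ROPs, 24 TMUs at 590MHz
b_pix, b_tex = fillrates(rops=24, tmus=24, core_mhz=590)

print(f'"A": {a_pix:.1f} Gpix/s, {a_tex:.1f} Gtex/s')
print(f'"B": {b_pix:.1f} Gpix/s, {b_tex:.1f} Gtex/s')
```

The "B" configuration wins on both counts, and by nearly a factor of five on pixel fillrate, which is why it's probably the one to look for despite the slower DDR3.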

NVIDIA GeForce GTX 560M
192 CUDA Cores, 32 TMUs, 24 ROPs, Core Clock: 775MHz, Shader Clock: 1550MHz
192-bit Memory Bus, GDDR5, Effective Memory Clock: 2.5GHz
Desktop Counterpart: GeForce GTX 550 Ti (GF116)

Admittedly with these clock speeds the GTX 560M probably performs roughly on par with the closely related GeForce GTS 450 (with the only major deficit being memory bandwidth) as opposed to the faster GTX 550 Ti, but it's still a force to be reckoned with in the mobile arena. The GTS 450 slotted in roughly between the desktop HD 5750 and 5770, while the GTX 460M traded blows with the Mobility Radeon HD 5850 and 5870. The extra 100MHz on the core in the 560M is bound to go a long way, and while we hope to get review hardware in soon, it's reasonable to assume it's at least competitive with the 6850M and 6870M if not outright faster. NVIDIA has scored several design wins with the GTX 560M, and it should really be the entry level for the serious mobile gamer, offering a strong balance between thermals and gaming performance. (Will be faster than the previous generation GeForce GTX 460M in our 460M-centric gaming notebook face-off.)

NVIDIA GeForce GTX 570M
336 CUDA Cores, 56 TMUs, 32 ROPs, Core Clock: 535MHz, Shader Clock: 1070MHz
192-bit Memory Bus, GDDR5, Effective Memory Clock: 3GHz
Desktop Counterpart: GeForce GTX 560 (GF114)

While the GTX 570M uses the same chip as the desktop GTX 560, it suffers a nearly 300MHz deficit on the core clock. It's still a strong upgrade from the GTX 560M, but second-fastest mobile GPUs seem to see much less success than their desktop counterparts and tend to be less common than the fastest model available. The 470M it replaces was extremely rare, but the 570M looks far better on paper and has at least one design win from MSI under its belt (as opposed to the largely Clevo-only 470M). (No review available.)

NVIDIA GeForce GTX 580M
384 CUDA Cores, 64 TMUs, 32 ROPs, Core Clock: 620MHz, Shader Clock: 1240MHz
256-bit Memory Bus, GDDR5, Effective Memory Clock: 3GHz
Desktop Counterpart: GeForce GTX 560 Ti (GF114)

Again operating at a substantially reduced clock compared to its desktop counterpart, the GTX 580M is nonetheless the fastest mobile GPU available. The GTX 485M it replaces was generally about 10% faster on average than the Radeon HD 6970M, and the GTX 580M is a largely incremental update offering a minor increase in core clocks to go along with Optimus support (yes, your eleven pound gaming laptop can now live off of the battery). But fastest is fastest, and if you want the best this is it...provided your pockets are deep enough. (Will be slightly faster than the previous generation GeForce GTX 485M, reviewed in a Clevo P170HM.)

Comments

  • Iketh - Tuesday, July 05, 2011 - link

    GT555M "B" is an option in the Dell XPS line
  • anotherfakeaccount - Wednesday, July 06, 2011 - link

    This is true ^^
  • anotherfakeaccount - Wednesday, July 06, 2011 - link

    The Dell XPS 17 ONLY btw
  • zackyy - Wednesday, July 06, 2011 - link

    Only on the 17inch brick
  • barmalej - Wednesday, July 06, 2011 - link

    There is a GT 555M "B" with a 128-bit bus (Clevo W150HR), and also the GTX 570M uses a 192-bit memory bus
  • Dustin Sklavos - Wednesday, July 06, 2011 - link

    Ack, thank you, fixed it.
  • marc1000 - Wednesday, July 06, 2011 - link


    network terminology now? lol
  • Meaker10 - Thursday, July 07, 2011 - link

    I don't see it fixed. Also the 144 shader part with a 128-bit mem bus has 16 ROPs rather than 24, is far more common than the GDDR5 part, and has been around in Clevo and (more importantly) Acer machines (who are the largest notebook maker after all) for far longer.
  • Althernai - Wednesday, July 06, 2011 - link

    Just a word of warning about AMD GPUs in the latest Sandy Bridge laptops: AMD has moved from their manual GPU switching to a muxless, automatic switchable graphics scheme similar to Optimus, except that it doesn't work nearly as well. In particular, OpenGL applications (MineCraft, much of Adobe's content creation suite, etc.) will always run on the integrated GPU, regardless of what the user tries to force them to do.

    They tried to pull this trick without telling anyone and now there are a lot of angry people who got a laptop with a graphics card that refuses to work for their purposes:

    It's really a pity too because the combination of the 6770M and Sandy Bridge with switchable graphics is the best out there if you need a decent CPU, good battery life and a powerful GPU, but the latter only works for DirectX.
  • Wolfpup - Wednesday, July 06, 2011 - link

    Besides that, they can't use normal drivers on Intel CPUs either.

    I *HATE* all this switchable graphics stuff. As though it weren't a minor miracle this stuff worked at all, we're going to add all sorts of complexity to it?!?
