NVIDIA GeForce 500M Mobile Graphics Introduction

The difference between NVIDIA and AMD on the desktop can seem a bit blurry in places, but in notebooks it's night and day. That's not a measure of quality so much as a measure of a radical difference in both features and performance.

While NVIDIA leverages major benefits like a better mobile driver program and Optimus graphics switching technology, as well as corner cases with PhysX, CUDA, and 3D Vision (note that 3D notebooks using AMD hardware are also available), it's also where the most creative marketing is liable to surface. Once you realize what desktop GPUs are powering what notebook models, you begin to appreciate both just how dire the entry level on the desktop still is and how nutty NVIDIA's mobile branding has been.

Even though NVIDIA does have advantages (particularly Optimus across the entire 500M line), their lineup can seem downright bizarre compared to AMD's bloated one, and the specs for the GT 555M are honestly something profane. NVIDIA's also still leveraging roughly the same chips that were introduced with the entire Fermi line, though their progress at least isn't anywhere near as sluggish as the 9000M/100M/200M/300M era.

Another important difference is that while AMD and Intel's graphics hardware employ a single clock domain for the chip itself, NVIDIA's chips have had separate core and shader domains for some time now. As a result, there's a "core clock" that will refer to roughly everything on the GPU that isn't a shader or "CUDA core," and a "shader clock" that refers to the clocks of the "CUDA cores."
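The relationship between the two domains is fixed on these Fermi-based parts: the shader domain runs at exactly twice the core clock, so spec sheets list both numbers but one implies the other. A quick sketch using the core clocks from the tables below:

```python
# On NVIDIA's Fermi-based mobile parts, the shader ("CUDA core") domain
# runs at exactly twice the core clock.
core_clocks_mhz = {
    "GT 520M": 740, "GT 520MX": 900, "GT 540M": 672,
    "GTX 560M": 775, "GTX 570M": 535, "GTX 580M": 620,
}
for part, core in core_clocks_mhz.items():
    print(f"{part}: core {core}MHz, shader {core * 2}MHz")
```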

NVIDIA GeForce GT 520M/520MX
48 CUDA Cores, 8 TMUs, 4 ROPs, Core Clocks: 740MHz/900MHz (520M/520MX), Shader Clocks: 1480MHz/1800MHz (520M/520MX)
64-bit Memory Bus, DDR3, Effective Memory Clocks: 1.6GHz/1.8GHz (520M/520MX)
Desktop Counterpart: GeForce GT 520 (GF119)

One of the major ways NVIDIA beefed up their mobile line was by generally eschewing 64-bit memory buses: there wasn't a single one of those performance bottlenecks in the primary 400M line, but look everybody, it's back! The GT 520M and 520MX occupy the same space as the Mobility Radeon HD 6300 and 6400 series, as a dedicated chip for corner cases. They're also slower than the GT 420M they replace, which had both double the CUDA cores and double the memory bus width. Basically inadequate for any kind of gaming, the 520s don't offer anything over the Sandy Bridge IGP beyond what you get just by virtue of having NVIDIA hardware. (No review available.)
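To put that 64-bit bus in perspective, peak memory bandwidth is just bus width in bytes times the effective memory clock. A rough sketch (the GT 420M's 1.6GHz effective DDR3 clock is our assumption; these are theoretical peaks, not measured throughput):

```python
def peak_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    """Theoretical peak memory bandwidth in GB/s: bytes per transfer x transfer rate."""
    return (bus_width_bits / 8) * effective_clock_mhz / 1000

print(peak_bandwidth_gbs(64, 1600))   # GT 520M:  12.8 GB/s
print(peak_bandwidth_gbs(64, 1800))   # GT 520MX: 14.4 GB/s
print(peak_bandwidth_gbs(128, 1600))  # GT 420M (assumed 1.6GHz DDR3): 25.6 GB/s
```

Halving the bus width halves the bandwidth outright, which is why the 520s fall behind the part they nominally replace.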

NVIDIA GeForce GT 525M/540M/550M
96 CUDA Cores, 16 TMUs, 4 ROPs, Core Clocks: 600MHz/672MHz/740MHz (525M/540M/550M), Shader Clocks: 1200MHz/1344MHz/1480MHz (525M/540M/550M)
128-bit Memory Bus, DDR3, Effective Memory Clock: 1.8GHz
Desktop Counterpart: GeForce GT 430 (GF108)

While the GF108 that powers these three largely indistinguishable chips is slightly slower than the already anemic desktop GeForce GT 240 with the same number of shaders, it's a healthy boost for the low-to-mid end. The differences between these three are strictly clock speeds, and anecdotal experience with overclocking mobile NVIDIA chips has generally been very positive, so odds are decent the enterprising end user with the skill for it can push a 525M up to 540M or even 550M speeds. At about the 540M mark, though, 1600x900 gaming starts becoming a real possibility. The chip is still hampered by the memory bus (and NVIDIA has had a harder time taming GDDR5 than AMD has), but it's an effective midrange solution. (The GeForce GT 425M in the Clevo B5130M review will be slightly slower than a GT 525M; the Dell XPS 15 L502x review has a GeForce GT 540M.)

NVIDIA GeForce GT 555M "A"
96 CUDA Cores, 16 TMUs, 4 ROPs, Core Clock: 753MHz, Shader Clock: 1506MHz
128-bit Memory Bus, GDDR5, Effective Memory Clock: 3138MHz
Desktop Counterpart: GeForce GT 440 GDDR5 (GF108)

And this is where NVIDIA's mobile lineup completely loses its mind. The GeForce GT 555M is actually two completely different chips and configurations; the "A" and "B" are our designations. Our "A" configuration is essentially just a souped-up version of the GT 525M/540M/550M, with a higher core clock and the benefit of GDDR5. While NVIDIA lists both versions on their site (though lacking an explanation as to why this split was made), a glance at NewEgg suggests this "A" version is the more common of the two (powering MSI and Lenovo laptops while the "B" version resides almost exclusively in an Alienware). You can recognize the "A" version by its use of GDDR5, but since it and the "B" version are so bizarrely matched we can't definitively say which of the two is faster. (No review available.)

NVIDIA GeForce GT 555M "B"
144 CUDA Cores, 24 TMUs, 24 ROPs, Core Clock: 590MHz, Shader Clock: 1180MHz
192-bit Memory Bus, DDR3, Effective Memory Clock: 1.8GHz
Desktop Counterpart: None (GF106)

The other configuration of the GT 555M is a substantially beefier chip with six times the ROPs, but it operates at lower clocks and lower memory bandwidth due to the use of DDR3 instead of GDDR5. It's essentially a die-harvested version of GF106, and is identifiable by both the use of DDR3 and memory configurations of either 1.5GB or 3GB. It remains inexplicable why NVIDIA decided to use two completely different chips for the GT 555M, but hopefully this makes it a little easier to tell which is which. Raw calculations of pixel and texture fillrate suggest this "B" configuration is the faster of the two, and as such it's probably the one to look for. Thus far we've only seen it in the Alienware M14x. (No review available.)
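Those raw calculations are simple: theoretical pixel fillrate is ROPs times core clock, and theoretical texture fillrate is TMUs times core clock. A sketch of the comparison (theoretical peaks only, not measured performance):

```python
def fillrates(rops, tmus, core_mhz):
    # Theoretical peaks: GPixels/s = ROPs x core clock, GTexels/s = TMUs x core clock
    return rops * core_mhz / 1000, tmus * core_mhz / 1000

pixel_a, texel_a = fillrates(4, 16, 753)    # GT 555M "A" (GF108, GDDR5)
pixel_b, texel_b = fillrates(24, 24, 590)   # GT 555M "B" (GF106, DDR3)
print(f'"A": {pixel_a:.2f} GP/s, {texel_a:.2f} GT/s')  # 3.01 GP/s, 12.05 GT/s
print(f'"B": {pixel_b:.2f} GP/s, {texel_b:.2f} GT/s')  # 14.16 GP/s, 14.16 GT/s
```

The "A" configuration does claw some of that back in memory bandwidth (50.2GB/s of GDDR5 against the "B" version's 43.2GB/s of DDR3), which is part of why the two are so hard to rank definitively.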

NVIDIA GeForce GTX 560M
192 CUDA Cores, 32 TMUs, 24 ROPs, Core Clock: 775MHz, Shader Clock: 1550MHz
192-bit Memory Bus, GDDR5, Effective Memory Clock: 2.5GHz
Desktop Counterpart: GeForce GTX 550 Ti (GF116)

Admittedly with these clock speeds the GTX 560M probably performs roughly on par with the closely related GeForce GTS 450 (with the only major deficit being memory bandwidth) as opposed to the faster GTX 550 Ti, but it's still a force to be reckoned with in the mobile arena. The GTS 450 slotted in roughly between the desktop HD 5750 and 5770, while the GTX 460M traded blows with the Mobility Radeon HD 5850 and 5870. The extra 100MHz on the core in the 560M is bound to go a long way, and while we hope to get review hardware in soon, it's reasonable to assume it's at least competitive with the 6850M and 6870M if not outright faster. NVIDIA has scored several design wins with the GTX 560M, and it should really be the entry level for the serious mobile gamer, offering a strong balance between thermals and gaming performance. (Will be faster than the previous generation GeForce GTX 460M in our 460M-centric gaming notebook face-off.)

NVIDIA GeForce GTX 570M
336 CUDA Cores, 56 TMUs, 24 ROPs, Core Clock: 535MHz, Shader Clock: 1070MHz
192-bit Memory Bus, GDDR5, Effective Memory Clock: 3GHz
Desktop Counterpart: GeForce GTX 560 (GF114)

While the GTX 570M uses the same chip as the desktop GTX 560, it suffers a nearly 300MHz deficit on the core clock. It's still a strong upgrade from the GTX 560M, but second-fastest mobile GPUs seem to see much less success than their desktop counterparts and tend to be less common than the fastest model available. The 470M it replaces was extremely rare, but the 570M looks far better on paper and has at least one design win from MSI under its belt (as opposed to the largely Clevo-only 470M). (No review available.)

NVIDIA GeForce GTX 580M
384 CUDA Cores, 64 TMUs, 32 ROPs, Core Clock: 620MHz, Shader Clock: 1240MHz
256-bit Memory Bus, GDDR5, Effective Memory Clock: 3GHz
Desktop Counterpart: GeForce GTX 560 Ti (GF114)

Again operating at substantially reduced clocks compared to its desktop counterpart, the GTX 580M is nonetheless the fastest mobile GPU available. The GTX 485M it replaces was generally about 10% faster than the Radeon HD 6970M, and the GTX 580M is a largely incremental update offering a minor increase in core clocks along with Optimus support (yes, your eleven-pound gaming laptop can now live off the battery). But fastest is fastest, and if you want the best this is it...provided your pockets are deep enough. (Will be slightly faster than the previous generation GeForce GTX 485M, reviewed in a Clevo P170HM.)


85 Comments


  • ppeterka - Thursday, July 7, 2011 - link

    For most people, portability is more than the distance from your couch to your kitchen. Try lugging that 10 pound beast with you on the underground, try to fix up some slides in PowerPoint, or try to fit it into your hand luggage when flying to a meeting.

    It might be new to you, and I risk ruining your optimistic world, but laptops are work equipment too. For quite some people... And as gaming notebooks are over-over-overpriced and then some, I find them useless unless someone is a traveling game hero... But there's a price in that case too, and not just the pricetag: several other crippling compromises must be made when going that route.

    For the price, you could get a decent Brazos based netbook to lug around, AND a fully fledged SLI/CF desktop. You're much better off with this, as I assume
    * you don't play interactive, 3d intensive games while cooking (which however Brazos would even support to a degree)
    * you won't plan on getting your gaming fix while underway

    Do you disagree with this?
  • rubbahbandman - Friday, July 8, 2011 - link

    I think you'd be surprised how affordable a good "gaming" laptop/desktop replacement is. I picked up the HP dv7-6143cl from Costco for only $875 along with a 2 yr warranty and it has some solid specs.

    2630qm, 8gb ram, 6770m, and you'd think with a 17.3" screen it would be heavy, but it weighs only 7lbs, less than a gallon of milk and that's in spite of the ridiculous 9 cell battery it has. (supposedly it can manage a 9.5 hr battery life).

    The native res is 1600x900 which isn't that special, but it works great for demanding games like Crysis 2. With the dx11 patch and high-res textures pack I can manage a solid 45-60fps, which is perfectly playable, and that pretty much sets the bar for challenging my system, so other than Crysis 2, I can crank up the res to my heart's content.
  • Mediamon - Sunday, July 10, 2011 - link

    Almost had me sold. Costco's HP configurator shows a $1200 current price for a rig with those same specs. A $325 difference. You must know someone at Costco or be playing a joke.
  • chinedooo - Monday, July 11, 2011 - link

    I would suggest HP's dv6t6100 straight from their website: $1025 with tax for an i7-2630, 15.6 in 1080p screen, 9-cell battery, and an HD 6770. This is after getting $450 off with a coupon, which is pretty much always available. The thing weighs like 5.5 lbs. It really is a great laptop.
  • scook9 - Thursday, July 7, 2011 - link

    I got the 2920xm because it can overclock (at all) and has considerably better turbo options.

    We know you think it is idiotic; most people do. Because they either a) can't appreciate mobility AND power or b) can't afford it.

    I never tried to argue that my M18x was a top value proposition ;) simply that bang for the buck is not there for the GTX 580m's vs the 6970m's

    A 12 pound laptop is about as powerful as a 50 pound desktop. Additionally, it already has the UPS, screen, mouse, keyboard built in (adding to value mind you). If you cannot handle moving a 12 pound laptop, you are just pathetic. End of story. The thing does not have to be a damned frisbee, but it is plenty portable. I have traveled all over the country with high end (large) laptops, it is perfectly doable.

    And as for your remark about being inferior to a desktop, I can share some benches if you still feel that way.

    Here is one: it can play Crysis MAXED out with all settings very high and max AA at 60 FPS at the native resolution. Don't spout off shit you have zero experience with; it makes you look like the child you are.

    SO, at the end of your rant the only real complaint I can come up with is price - yes, I could have spent that $4000 on a desktop but I did not want to. Because I like being able to take my entire system with me wherever I go without having to think twice about it. My desktop is about 65 pounds by itself - THAT is not portable; a laptop (even if it weighs 20 pounds) is always portable.
  • jensend - Wednesday, July 6, 2011 - link

    You say "it's even a little difficult to recommend spending up for a notebook with anything less than a GeForce GT 540M or Radeon HD 6500M/6600M/6700M unless you really need the faster CPU on top of it" - but considering the pricing, the power consumption disadvantage, and Llano's strong performance, I don't see why you'd go with a discrete AMD chip less powerful than Turks+gddr5. Why would you go for an (equal shader count) 6500M? Sure, there's more memory bandwidth, but you're sacrificing a good bit of wattage for not a heck of a lot of performance.
  • khimera2000 - Wednesday, July 6, 2011 - link

    My issue with this article is the touting of Optimus, when the program isn't even supported that well. My notebook hasn't seen a driver update in the last 6 months. AMD might not have dual graphics out all over, but you can bet it will be better supported once all the bugs are knocked out.

    As it stands, having Intel and NVIDIA play nice is really starting to chap my ass, and is fast becoming a reason to dump the Intel/NVIDIA headache and go for a pure AMD build (once the drivers are mature enough, of course).

    Intel-based Optimus is broken; I wouldn't highlight the feature so much, it's misleading.

    agree with the rest though :)
  • RussianSensation - Wednesday, July 6, 2011 - link

    It would have been even more helpful if you guys had included some benchmarks with the GPUs segregated into Mainstream and Performance categories. I have a feeling the 6970M is the "best bang for the buck" on the high-end for mobile GPUs. The fact that the 6970M also lives in the slim iMac chassis likely suggests it also runs a lot cooler and is more energy efficient than the 570M/580M chips.

    I feel that current stagnation at 40nm process has pretty much leveled GPU progress in both the mobile and desktop space. I foresee a major performance increase, especially on the mobile side in 12 months from now when we begin to see 28nm GPUs enter the marketplace.
  • Imnotrichey - Wednesday, July 6, 2011 - link

    I have never owned a laptop, but I have always wondered how these things would do if you have a home base set up (external monitor, external keyboard, mouse, etc.) for hardcore gaming but still want the portability of a laptop for work/school use.

    If plugged in, will these things be able to handle playing games on a 24 inch monitor at 1920x1200? I am guessing not the latest graphically intense games (Crysis 2 for example), but what about TF2, WoW, L4D, and slightly older games like those?

    How much would you need to spend to handle gaming on an external monitor of that size? Sorry if this is a noob question, but that's always been my goal with a laptop and I have never pulled the trigger. Might have to with grad school coming up soon.
  • randomusername3242 - Wednesday, July 6, 2011 - link

    For games such as WoW, TF2, L4D it is definitely possible. 1920 x 1080 at max settings is something a mid-tier mobile card could realistically do.

    For Crysis etc. you *can* make it work but it makes no sense. Like I posted above, you will overpay by $500-$1000 at least and the laptop will not even be portable in the end. It will be as portable as a concrete brick that weighs 10 lbs.
