The NVIDIA GeForce 600M Lineup

As mentioned previously, NVIDIA's GeForce 600M series consists largely of rebadges, a die shrink, and the Kepler-based GK107. NVIDIA splits their mobile graphics into two categories (three if you count the anemic GeForce 610M): Performance and Enthusiast. Note that NVIDIA lists almost every spec as "up to," so expect at least some wiggle room on core and memory clocks. Technically, the memory bus and memory type should be consistent across implementations (with the exception of the GT 640M LE). These are their Enthusiast-class GTX GPUs:

                   GeForce GTX 675M   GeForce GTX 670M   GeForce GTX 660M
GPU and Process    40nm GF114         40nm GF114         28nm GK107
CUDA Cores         384                336                Up to 384
GPU Clock          620MHz             598MHz             835MHz
Shader Clock       1240MHz            1196MHz            -
Memory Eff. Clock  3GHz               3GHz               4GHz
Memory Bus         256-bit            192-bit            128-bit
Memory Bandwidth   96GB/s             72GB/s             64GB/s
Memory             Up to 2GB GDDR5    Up to 3GB GDDR5    Up to 2GB GDDR5
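The bandwidth figures in the table follow directly from the effective memory clock and bus width; a quick sketch of the arithmetic (function name is ours, for illustration):

```python
def mem_bandwidth_gbps(eff_clock_ghz, bus_bits):
    """Peak theoretical memory bandwidth in GB/s:
    effective clock (GHz) * bus width (bits) / 8 bits-per-byte."""
    return eff_clock_ghz * bus_bits / 8

print(mem_bandwidth_gbps(3.0, 256))  # GTX 675M -> 96.0
print(mem_bandwidth_gbps(3.0, 192))  # GTX 670M -> 72.0
print(mem_bandwidth_gbps(4.0, 128))  # GTX 660M -> 64.0
```

This is why the GTX 660M's 4GHz GDDR5 can't fully compensate for its narrower 128-bit bus: it still ends up a third short of the 675M's bandwidth.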

What we're looking at, essentially, are the GeForce GTX 580M and 570M being rebadged as the 675M and 670M; the 670M sees a minor clock bump from the 570M's 575MHz to 598MHz, but these are basically the same top-end parts that users have been enjoying for a while now. That's not necessarily a bad thing, as the 580M and 570M are capable performers, but it certainly leaves room for a new top-end mobile GPU (GTX 680M, anyone?) at some point in the future.

Meanwhile, GK107 is pushed about as hard as it can be with the GTX 660M. We're still not certain of the actual core count, as NVIDIA is being so liberal with their use of "up to" clauses. If we assume the 660M will be the highest clocked mobile variant at launch, it will likely use all of the available cores, while the lower end models will potentially trim down the number of active cores. According to NVIDIA, however, there's also some flexibility with the core counts and clock speeds, with the end goal being to deliver performance within a relatively tight range; more on this in a moment.

Astute observers will note that NVIDIA actually already has a couple of 600M series GPUs in the wild; these are rebadges of existing 40nm GF108-based GPUs, and you'll see them in the next chart which represents the top half of NVIDIA's Performance (GT) line. Note also that NVIDIA isn't providing spec memory clocks for any of these chips; they're all "up to" the values shown below.

                   GeForce GT 650M   GeForce GT 640M   GT 640M LE (28nm)   GT 640M LE (40nm)
GPU and Process    28nm GK107        28nm GK107        28nm GK107          40nm GF108
CUDA Cores         Up to 384         Up to 384         Up to 384           96
GPU Clock          850MHz            625MHz            500MHz              762MHz
Shader Clock       -                 -                 -                   1524MHz
Memory Bus         128-bit           128-bit           128-bit             128-bit
Memory Bandwidth   Up to 64GB/s      Up to 64GB/s      Up to 28.8GB/s      Up to 50.2GB/s
Memory             Up to 2GB DDR3    Up to 2GB DDR3    Up to 2GB DDR3      Up to 2GB DDR3
                   or GDDR5          or GDDR5                              or GDDR5

Starting with the GT 650M, what's odd is that on paper the GT 650M is capable of being a faster chip than the GTX 660M. We'd guess it will either have far fewer than 384 CUDA cores or will run at clocks lower than 850MHz. We've also seen the Acer M3 with the GT 640M, which did have 384 cores clocked at 625MHz. (It also used DDR3 and was paired with a ULV CPU, so it doesn't represent the maximum performance we're likely to see from the GT 640M.) Note that all of the announced 28nm Kepler parts currently use the GK107 core, but NVIDIA has not provided details on the exact core counts yet. In fact, let's get right into the crux of the problem.

At present, NVIDIA is not disclosing the exact configuration of the various GK107 parts, which means we don't know what the granularity for disabling/harvesting die will be. If GK107 uses the same 192 core SMX/GPC as GK104, we'd likely see 192 core or 384 core variants, and the charts right now suggest everything will be 384 cores with just differing clocks. With the smaller die size there's also a possibility that the chips will consist of either four 96 core GPC/SMX units or eight 48 core GPC/SMX units, and those would be the smallest functional block that can be disabled. Considering NVIDIA lists GTX 660M as "up to 835MHz" and GT 650M as "up to 850MHz", with both being "up to 384 cores", that suggests that perhaps there's more granularity available than 192 core blocks. GT 650M could have 336 cores at 850MHz or 384 cores at 740MHz and both would provide approximately the same performance. However, until we can get more information (or the parts are actually found in the wild), we can't say for sure what clocks or core counts the GK107 GPUs will use. This leads us into the next topic for these parts.
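The cores-versus-clocks tradeoff above is easy to verify with back-of-the-envelope math: Kepler's theoretical shader throughput scales with cores times clock (at 2 FLOPS per core per clock for FMA), so the two hypothetical GT 650M configurations we mention land within a fraction of a percent of each other:

```python
def shader_throughput_gflops(cores, clock_mhz):
    # Kepler: 2 FLOPS (one FMA) per CUDA core per clock; result in GFLOPS
    return cores * clock_mhz * 2 / 1000

# Two hypothetical GT 650M configurations from the discussion above
a = shader_throughput_gflops(336, 850)  # 571.2 GFLOPS
b = shader_throughput_gflops(384, 740)  # 568.3 GFLOPS
print(abs(a - b) / a)  # well under 1% apart
```

Either configuration would be a legitimate "GT 650M" under NVIDIA's stated goal of keeping same-name products within a tight performance band.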

Yes, NVIDIA is up to their old tricks again with the GeForce GT 640M LE (and given some of the above, we might see even more variations on the other parts as well). I thought we were past this after the marketing nightmare that was the GeForce GT 555M. That said, if history has taught us anything, it's that any chip that supports both DDR3 and GDDR5 is almost always going to be running DDR3 once you get into this performance bracket. I'm honestly not sure how we're going to tell the two GT 640M LEs apart in the marketplace, though, outside of waiting for reviews to surface, and that bothers me. Our best advice is to research what you're getting if you want faster GPU performance.

                   GeForce GT 635M        GeForce GT 630M          GeForce GT 620M
GPU and Process    40nm GF116             28nm GF117/40nm GF108    28nm GF117
CUDA Cores         96/144                 96                       96
GPU Clock          675MHz                 800MHz                   625MHz
Shader Clock       1350MHz                1600MHz                  1250MHz
Memory Bus         192-bit                128-bit                  128-bit
Memory             Up to 2GB DDR3/GDDR5   Up to 2GB DDR3           Up to 1GB DDR3

Speaking of the GeForce GT 555M, it's basically been rebadged as the GeForce GT 635M. Note that while NVIDIA's spec sheet lists it as only supporting GDDR5, models with DDR3 are already out in the wild. Either way, the 635M is basically a holdover from the last generation and at the risk of speculating, I wouldn't expect to see it in any great volume. NVIDIA has more profitable chips to sell, and those more profitable chips are also liable to be better citizens in terms of performance-per-watt.

Unfortunately, the GT 630M is another problem child. The 28nm variant is likely to be much more compelling than its 40nm counterpart, as NVIDIA estimates that the shrink roughly halves the chip's power consumption while matching (actually exceeding) the performance of last generation's very popular GeForce GT 540M. Unfortunately, just like the two GT 640M LEs, there's no way to tell which version you're getting. Ultimately, we expect the 40nm parts to disappear and be replaced by 28nm variants, but we'll have to wait and see how that plays out.

By the way, that 28nm replacement of GF108 may not initially seem very compelling, but it should actually be a great and inexpensive option for getting decent graphics performance without requiring much in the way of cooling (let alone power consumption). The GT 540M has been a perfectly adequate performer this generation, and having that now become the baseline for mobile graphics performance at half the power draw is a good thing. The codename appears to be GF117 and NVIDIA is keeping many of the details close to their chest, but architecturally it's not simply a die shrink of GF108 and should include some additional enhancements that take advantage of the move to 28nm. Of course, die shrinks are never "simple", so just what has been enhanced remains to be seen.

Update: NVIDIA has now posted their spec pages for the above GPUs. I've gone ahead and linked them in the above table and updated a few items. Worth noting is that the GT 650M now lists clocks of 850MHz with DDR3 and 735MHz with GDDR5. It looks like both versions will have 384 cores, so OEMs will choose between more computational power and less bandwidth (DDR3) or less computational power and more bandwidth (GDDR5). NVIDIA suggested that their goal is to keep products with the same name within ~10% performance range, and the tradeoffs listed should accomplish that goal. I'm also inclined to think GK107 consists of two 192 core blocks now, as every product page using that core only states 384 cores, with the exception of the GT 640M LE, but we know 640M LE will have both 40nm and 28nm variants. In general, we'd suggest going with the 28nm GDDR5 configurations when possible, as 128-bit DDR3 has been a bit of a bottleneck for even 96 core GF108, never mind the improved Kepler chips.


25 Comments


  • Hubb1e - Thursday, March 22, 2012 - link

    I have Optimus and a 330M in my laptop and I can't get driver updates for it. So yeah, Optimus is cool for 6 months, but when the laptop gets retired, so do your driver updates. At least with Gateway....
  • JarredWalton - Thursday, March 22, 2012 - link

    Which laptop is this? All Optimus laptops should be supported by the current release, see: http://www.geforce.com/drivers/results/42588

    If you're trying to update your drivers via Gateway, that's why you're not getting updates.
  • Grydelåg - Wednesday, March 28, 2012 - link

    With all respect, Jarred:

    1.
    You should stick with the factory approved drivers.
    You are on your own with the drivers from the GeForce site.
    If you use non-factory-approved drivers you could also get cooling issues with the notebook.
    2.
    My Optimus Acer 5755G is really the most error-filled NVIDIA implementation I have ever owned (I own / have owned/used 20+ NVIDIA-based cards over many years).
    3.
    There are real issues with Optimus; just look at the official NVIDIA forum, where apart from the non-professionals there are some really skilled people who write there.

    So I don't really see Optimus as a competitive argument for NVIDIA versus ATI as you state in the article.
    Best Rgds.
  • SInC26 - Tuesday, April 10, 2012 - link

    "Factory approved" drivers are just outdated Nvidia drivers that OEM's haven't bothered to update. You really should update your own graphics drivers for bug fixes (possibly with Optimus) and better performance. Driver updates also should not affect cooling performance. Reply
  • setzer - Thursday, March 22, 2012 - link

    I agree. Having an Optimus-enabled laptop with an Intel HD 3000 and a GeForce 520M, the only way to get decent battery life is to disable the GeForce entirely, because seriously, what is the point of the GeForce 520M?
    There's better gaming compatibility than the HD 3000, but either way they are both crappy graphics cards for games, and as I work mostly under Linux, where Optimus isn't supported by NVIDIA, the Optimus point is moot.
    Optimus makes sense if you are pairing a decent graphics card, say anything at or above an x40M, with the integrated Intel GPU; pairing low-end cards is just pointless.
    What would be good is Intel getting serious with GPU performance and offering at least 90% of the graphics performance and compatibility of the integrated AMD APUs at lower power usage. That way we could get rid of the lower tiers of mobile and desktop cards altogether :)
  • JarredWalton - Thursday, March 22, 2012 - link

    If you buy a laptop with Optimus and a low-end GPU and then:

    1) Expect it to run games well
    2) Expect it to work with Linux

    ...then you didn't do your homework. But given you "mostly work under Linux" you're already in a vanishingly small minority. Optimus isn't for every single use case, but really Linux is the only area where Optimus in my opinion is seriously flawed.

    As for Wolfpup, I know you get on here every time we mention Optimus and you spread the same old FUD. You have never once responded to my questions about specifically where Optimus has caused you problems. The reason, apparently, is that you've either never actually used Optimus, or you've never actually experienced a problem directly related to Optimus, and so you just wave your hands and say it's a horrible technology. Is Optimus perfect? Well, no, but then neither is any other technology. More to the point, given the alternatives I'll take Optimus every time. Which alternatives are those?

    1) Discrete only GPUs in laptops, where battery life gets killed. The lowest power dGPUs idle at around 3-4W, which is 50% more than where a modern Sandy Bridge laptop idles.
    2) IGP only laptops, which means forget about gaming on Intel. For AMD, Llano is actually decent, but it's still only good for our "Value 1366x768" settings.
    3) Manually switchable graphics, with driver updates dictated by your laptop vendor. Go ahead and ask how many driver updates people with manually switchable laptops have received over the lifetime of their laptop--usually it's one update if you're lucky, two at most.
    4) Manually switchable graphics with muxes also do the lovely flash and blink on the screen for anywhere from five to ten seconds every time you switch.

    Optimus gives you the battery life of the IGP when you don't need the dGPU, and the performance of the dGPU when you do need it. It gives you a switch that takes place in a fraction of a second with no screen flicker. You get driver updates from NVIDIA and Intel whenever they provide them and no need to wait for the laptop vendors to release an updated driver. The "problems" with Optimus are that you don't get alternative OS support, and you tend to lose some performance when running in high frame rate situations (e.g. >120 FPS). Since there's not much point running higher than 120FPS anyway, given the 60Hz refresh rates of most displays (120Hz on 3D displays), Linux support is really the only valid complaint. That's less than 5% of users on laptops/desktops, and NVIDIA has decided that they're willing to lose that business on laptops.
  • orionismud - Tuesday, March 27, 2012 - link

    Hopefully people interested in this get to your reply. Anandtech needs a comment system where you can like / dislike.
  • lurker22 - Thursday, March 22, 2012 - link

    This is why I haven't upgraded my 9800GT. I don't have the patience to spend a couple hours trying to figure out which is a good card anymore. When their numbering system isn't even in order for what is a "better card" it's just plain obnoxious.
  • MrSpadge - Thursday, March 22, 2012 - link

    They want you to "just get the bigger GBs"...
  • Warren21 - Thursday, March 22, 2012 - link

    I was helping a friend build a new budget desktop gaming PC and ran into this same age-old scenario.

    He came up with his own "killer!" build to compare to mine and had put a $75 GT 520 (2GB DDR3) in there as opposed to a $130 6850 (1GB GDDR5) in my original plan. I had to explain that the GT 520 is garbage in comparison...

    Ended up going for a $220 eVGA GTX 480 1.5GB, BNIB with 3 years of warranty. Awesome sale.
