Introducing the NVIDIA GeForce 600M Series

While the desktop-bound GeForce GTX 680 is undoubtedly the most exciting release from NVIDIA today and the true flagbearer for their new Kepler microarchitecture, NVIDIA actually has a whole host of releases ready to go on the notebook front. We've already had a chance to check out the GeForce GT 640M in action, but it's far from the only member of the old/new GeForce 600M series. Today we have details on their complete 600M series from top to bottom; some of it is exciting and new, and some of it is just the GPU industry up to its same old marketing tricks.

What we're essentially looking at within the GeForce 600M series are a couple of rebrands, one brand new piece of silicon, and one die shrunk last-generation GPU. NVIDIA has assembled a motley crew with a notable absence at the top of the heap.

At this point you should already be very familiar with the Fermi architecture, so the rebadges and die shrink aren't going to be that new to you; if you're not familiar, you can refresh yourself by checking out Ryan's launch article for the GeForce GTX 480 on the desktop. What you're probably really here for are details on Kepler's mobile variant. We couldn't share any details about it when we reviewed the Acer Aspire Timeline Ultra M3, and instead had to pussyfoot around what we knew of Kepler. This was made doubly difficult by the fact that NVIDIA's own Control Panel and GPU-Z didn't correctly recognize Kepler chips.

For a more detailed analysis of the Kepler architecture you'll want to check out Ryan's review of the GeForce GTX 680, but there are some essential differences between Kepler and Fermi that bear repeating here. The only Kepler chip in NVIDIA's current mobile lineup is the GK107, and it sports 384 of NVIDIA's CUDA cores; architecturally it's basically a quarter of the GK104 that powers the GTX 680. That's one quarter the cores and one half the memory bus. Unfortunately it's also difficult to determine how many ROPs or TMUs are powering it, but the most recent version of GPU-Z pegs it at 16 ROPs, which would theoretically work out to 64 TMUs. That seems to fall in line with our experience. Kepler also ditches the separate shader domain that's been a part of NVIDIA's GPUs for a long time; the GK107's CUDA cores all run at the same clock as the rest of the GPU, so despite having as many cores as the GTX 580M, performance won't be quite that high.
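The "quarter of GK104" relationship above can be sketched as back-of-the-envelope arithmetic. Note that the core clock below is a hypothetical placeholder for illustration, not an official GK107 specification:

```python
# Rough sketch of how GK107's headline numbers fall out of the
# "quarter of GK104" relationship described above. The core clock is an
# assumed placeholder, not an official specification.

GK104_CORES = 1536       # GTX 680 (GK104) CUDA core count
GK104_BUS_BITS = 256     # GK104 memory bus width

gk107_cores = GK104_CORES // 4        # one quarter the cores
gk107_bus_bits = GK104_BUS_BITS // 2  # one half the memory bus

assumed_clock_ghz = 0.625  # hypothetical mobile core clock

# With no separate shader domain, the CUDA cores run at the core clock.
# Peak single-precision throughput: 2 FLOPs (one FMA) per core per cycle.
peak_gflops = 2 * gk107_cores * assumed_clock_ghz

print(f"GK107: {gk107_cores} cores, {gk107_bus_bits}-bit bus, "
      f"~{peak_gflops:.0f} GFLOPS at {assumed_clock_ghz} GHz")
```

At the same clock, a quarter of the cores means roughly a quarter of the shader throughput; the hot-clocked Fermi parts like the GTX 580M make core-count comparisons alone misleading.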

What GK107 also brings to the table is NVIDIA Control Panel support for FXAA and TXAA, as well as NVIDIA's dedicated video encoding hardware NVENC. There's also DirectX 11.1 support and a notebook analog to the desktop's GPU Boost: the GK107 is able to dynamically increase its clock speed depending on the current thermal load within the notebook. We're not sure exactly how this is done, but it's essentially a GPU version of the Turbo Boost and Turbo Core technologies we've seen from Intel and AMD CPUs for some time. Finally, NVIDIA's Optimus graphics-switching technology makes its return.
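Since NVIDIA hasn't disclosed how the dynamic clock adjustment actually works, the following is purely a conceptual sketch of thermally driven clock boosting in the spirit of Turbo Boost/Turbo Core; every number here (base and boost clocks, thermal limit, step size) is a made-up placeholder:

```python
# Conceptual sketch of thermally driven clock boosting, in the spirit of
# the GPU Boost behavior described above. NVIDIA has not disclosed its
# actual algorithm; all constants here are illustrative placeholders.

BASE_MHZ = 500        # hypothetical base clock
MAX_BOOST_MHZ = 625   # hypothetical boost ceiling
THERMAL_LIMIT_C = 87  # hypothetical thermal limit
STEP_MHZ = 13         # hypothetical clock step

def next_clock(current_mhz: int, gpu_temp_c: float) -> int:
    """Raise the clock while thermal headroom remains; back off otherwise."""
    if gpu_temp_c < THERMAL_LIMIT_C and current_mhz < MAX_BOOST_MHZ:
        return min(current_mhz + STEP_MHZ, MAX_BOOST_MHZ)
    if gpu_temp_c >= THERMAL_LIMIT_C and current_mhz > BASE_MHZ:
        return max(current_mhz - STEP_MHZ, BASE_MHZ)
    return current_mhz

# In a cool chassis, the clock ratchets up toward the boost ceiling.
clock = BASE_MHZ
for _ in range(20):
    clock = next_clock(clock, gpu_temp_c=70.0)
print(clock)  # settles at MAX_BOOST_MHZ
```

The key idea is simply that the clock is a function of thermal headroom rather than a fixed value, which is why the same GPU can run faster in a notebook with better cooling.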

NVIDIA was happy to announce a series of design wins with Ivy Bridge, but while we're privy to them we unfortunately can't share any details with you right now. We'll have to wait until Intel's Ivy Bridge embargo lifts for that information. Suffice it to say, expect to see a LOT of GeForce 600M GPUs on the market once Ivy Bridge is launched.

With that massive info dump out of the way, let's take a look at NVIDIA's 600M series proper.

The NVIDIA GeForce 600M Lineup
Comments
  • Hubb1e - Thursday, March 22, 2012 - link

    I have Optimus and a 330M in my laptop and I can't get driver updates for it. So yeah, Optimus is cool for six months, but when the laptop gets retired, so do your driver updates. At least with Gateway....
  • JarredWalton - Thursday, March 22, 2012 - link

    Which laptop is this? All Optimus laptops should be supported by the current release, see:

    If you're trying to update your drivers via Gateway, that's why you're not getting updates.
  • Grydelåg - Wednesday, March 28, 2012 - link

    With all respect, Jarred.

    You should stick with the factory-approved drivers.
    You are on your own with the drivers from the GeForce site.
    If you use non-factory-approved drivers you could also run into cooling issues with the notebook.
    My Optimus Acer 5755G is really the most error-filled NVIDIA implementation I have ever owned (I own or have owned/used 20+ NVIDIA-based cards over many years).
    There are real issues with Optimus; just look at the official NVIDIA forum, where, apart from the non-professionals, some really skilled people post.

    So I don't really see Optimus as a competitive argument for NVIDIA versus ATI, as you state in the article.
    Best regards.
  • SInC26 - Tuesday, April 10, 2012 - link

    "Factory approved" drivers are just outdated NVIDIA drivers that OEMs haven't bothered to update. You really should update your own graphics drivers for bug fixes (possibly including Optimus fixes) and better performance. Driver updates also should not affect cooling performance.
  • setzer - Thursday, March 22, 2012 - link

    I agree. Having an Optimus-enabled laptop with an Intel HD 3000 and a GeForce 520M, the only way to get decent battery life is to disable the GeForce entirely, because seriously, what is the point of the GeForce 520M?
    It offers better gaming compatibility than the HD 3000, but either way they're both poor graphics cards for games, and since I work mostly under Linux, where Optimus isn't supported by NVIDIA, the Optimus point is moot.
    Optimus makes sense if you pair a decent graphics card, say anything at or above an x40M, with the integrated Intel GPU; pairing low-end cards is just pointless.
    What would be good is Intel getting serious about GPU performance and offering at least 90% of the graphics performance and compatibility of the integrated AMD APUs at lower power usage. That way we could get rid of the lower tiers of mobile and desktop cards altogether :)
  • JarredWalton - Thursday, March 22, 2012 - link

    If you buy a laptop with Optimus and a low-end GPU and then:

    1) Expect it to run games well
    2) Expect it to work with Linux

    ...then you didn't do your homework. But given that you "mostly work under Linux," you're already in a vanishingly small minority. Optimus isn't for every single use case, but really Linux is the only area where Optimus, in my opinion, is seriously flawed.

    As for Wolfpup, I know you get on here every time we mention Optimus and you spread the same old FUD. You have never once responded to my questions about specifically where Optimus has caused you problems. The reason, apparently, is that you've either never actually used Optimus, or you've never actually experienced a problem directly related to Optimus, and so you just wave your hands and say it's a horrible technology. Is Optimus perfect? Well, no, but then neither is any other technology. More to the point, given the alternatives I'll take Optimus every time. Which alternatives are those?

    1) Discrete-only GPUs in laptops, where battery life gets killed. The lowest-power dGPUs idle at around 3-4W, which is 50% more than where a modern Sandy Bridge laptop idles.
    2) IGP-only laptops, which means forget about gaming on Intel. For AMD, Llano is actually decent, but it's still only good for our "Value 1366x768" settings.
    3) Manually switchable graphics, with driver updates dictated by your laptop vendor. Go ahead and ask how many driver updates people with manually switchable laptops have received over the lifetime of their laptop--usually it's one update if you're lucky, two at most.
    4) Manually switchable graphics with muxes also do the lovely flash and blink on the screen for anywhere from five to ten seconds every time you switch.

    Optimus gives you the battery life of the IGP when you don't need the dGPU, and the performance of the dGPU when you do need it. It gives you a switch that takes place in a fraction of a second with no screen flicker. You get driver updates from NVIDIA and Intel whenever they provide them and no need to wait for the laptop vendors to release an updated driver. The "problems" with Optimus are that you don't get alternative OS support, and you tend to lose some performance when running in high frame rate situations (e.g. >120 FPS). Since there's not much point running higher than 120 FPS anyway, given the 60Hz refresh rates of most displays (120Hz on 3D displays), Linux support is really the only valid complaint. That's less than 5% of users on laptops/desktops, and NVIDIA has decided that they're willing to lose that business on laptops.
  • orionismud - Tuesday, March 27, 2012 - link

    Hopefully people interested in this get to your reply. Anandtech needs a comment system where you can like / dislike.
  • lurker22 - Thursday, March 22, 2012 - link

    This is why I haven't upgraded my 9800GT. I don't have the patience to spend a couple hours trying to figure out which is a good card anymore. When their numbering system isn't even in order for what is a "better card" it's just plain obnoxious.
  • MrSpadge - Thursday, March 22, 2012 - link

    They want you to "just get the bigger GBs"...
  • Warren21 - Thursday, March 22, 2012 - link

    I was helping a friend build a new budget desktop gaming PC and ran into this same age-old scenario.

    He came up with his own "killer!" build to compare to mine and had put a $75 GT 520 (2GB DDR3) in there as opposed to a $130 6850 (1GB GDDR5) in my original plan. I had to explain that the GT 520 is garbage in comparison...

    Ended up going for a $220 eVGA GTX 480 1.5GB, BNIB with 3 years of warranty. Awesome sale.
