Conclusion: Bring On the GeForce 600Ms

While there's a decent amount of the kind of branding chicanery we've come to really dislike in the 600M series, we have a feeling most of those rebranded chips are going to wind up being brushed aside. They're not liable to be as profitable as the 28nm GPUs once yields improve, making them less compelling for NVIDIA to sell, and their holdover thermal requirements make them less compelling for OEMs as well. In fact, without giving too much away, the list of OEM wins in our reviewer's guide that are under embargo pretty much confirms it: the bulk of the systems on the list are using the GT 640M on up.

Of course, what's really telling is what's missing from the list: a GeForce GTX 680M. It's tough to complain too much about the GeForce GTX 580M getting a second wind as the GTX 675M (naming shenanigans notwithstanding); the top end of mobile graphics has actually been pretty healthy since the GeForce GTX 485M and AMD Radeon HD 6950M launched. But given that the Kepler-based GK107 powering a good chunk of the 600M series possesses only a quarter of the shader power of its big brother, we expect another Kepler GPU will fill in the gap.

At the same time, it wouldn't be unreasonable to expect a cut-down GK104 to materialize as the GTX 680M; the desktop GTX 680 only has a TDP of 195 watts, and some careful binning and pruning of clocks (keep in mind that the desktop card is running the GPU at 1GHz and the power-hungry GDDR5 at a staggering 6GHz) could theoretically produce a competitive top-end notebook GPU. It wouldn't be unheard of; NVIDIA has crammed cut-down GF100/GF110 Fermi chips into notebooks with a 100W TDP, and the GTX 680 is already close to that level. Give NVIDIA some time to make a bunch of money selling all the GTX 680 cards they can to early adopters and then we're likely to start seeing trickle-down parts, including our presumed GTX 680M.
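To put those clocks in perspective, here's a back-of-the-envelope bandwidth calculation. The 256-bit memory bus is the desktop GTX 680's; the lower memory clocks in the loop are hypothetical notebook bins, not announced parts:

```python
# Rough peak memory bandwidth for the desktop GTX 680.
# The 6 GHz figure is the effective GDDR5 data rate per pin.
effective_data_rate_gbps = 6.0   # effective GT/s per pin
bus_width_bits = 256             # desktop GTX 680 memory interface

bandwidth_gb_s = effective_data_rate_gbps * bus_width_bits / 8
print(f"Desktop GTX 680: {bandwidth_gb_s:.0f} GB/s")

# Pruning the memory clock for a notebook part cuts bandwidth linearly
# (these lower clocks are purely illustrative):
for clock in (5.0, 4.0):
    print(f"At {clock} GHz effective: {clock * bus_width_bits / 8:.0f} GB/s")
```

Even shaving the memory down to 4GHz effective would leave a hypothetical GTX 680M with 128GB/s, still comfortably ahead of anything in the mobile lineup below it.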

Regardless, we do have a pair of very compelling products on the table right now: the GK107 powering the GT 640M, 650M, and GTX 660M, and the 28nm replacement for GF108 at the bottom of the list. (Again, note that this isn't a straight die shrink as there are other changes.) We've already seen that the GeForce GT 640M can produce the kind of gaming experience NVIDIA claims in our own testing, and it stands to reason there's a decent amount of performance waiting to be unlocked by a jump to GDDR5 in higher-end parts, not to mention pairing the GPU with a faster Ivy Bridge (non-ULV) processor. Meanwhile, the 28nm Fermi part provides a substantial jump in performance for the bottom end of the list, allowing for a halfway decent replacement for the terminally awful GF119 (GT 520M/520MX) that's taken up residence in a few popular notebooks.

All that remains to be seen is how AMD is going to respond. With the low idle power draw of the Southern Islands chips, AMD at least has some of the pieces in place, but they really need something that competes directly with Optimus—not just on the switching technology, but on reference driver updates as well. Meanwhile, Turks is already getting long in the tooth and would likely need a die shrink to stay competitive with the 600M series. That's before we even talk about the abnormally popular 6400M series, which will hopefully just be obsoleted entirely by both Ivy Bridge's IGP and the GeForce GT 620M. But Cape Verde and Pitcairn both bode well for the mobile market; the 7750's 55W TDP makes it an excellent candidate for mobile deployment, while Pitcairn can have its clocks shaved just enough to make a formidable top-end notebook GPU. Either way, with the entirety of the current Radeon HD 7000M series just being rebrands of the 6000M (all the way up to the 7690M), AMD will need to step their game up. Hopefully as we get closer to the Ivy Bridge launch we'll see what they have in store.

The NVIDIA GeForce 600M Lineup

  • Hubb1e - Thursday, March 22, 2012 - link

    I have Optimus and a 330M in my laptop and I can't get driver updates for it. So yeah, Optimus is cool for 6 months, but when the laptop gets retired so do your driver updates. At least with Gateway....
  • JarredWalton - Thursday, March 22, 2012 - link

    Which laptop is this? All Optimus laptops should be supported by the current release, see: http://www.geforce.com/drivers/results/42588

    If you're trying to update your drivers via Gateway, that's why you're not getting updates.
  • Grydelåg - Wednesday, March 28, 2012 - link

    With all respect Jarred.

    1.
    You should stick with the factory-approved drivers.
    You are on your own with drivers from the GeForce site.
    If you use non-factory-approved drivers you could also get cooling issues with the notebook.
    2.
    My Optimus Acer 5755G is really the most error-filled NVIDIA implementation I have ever owned (I own / have owned/used 20+ NVIDIA-based cards over many years).
    3.
    There are real issues with Optimus; just look at the official NVIDIA forum, where, apart from the non-professionals, some really skilled people write.

    So I don't really see Optimus as a competitive argument for NVIDIA versus ATI, as you state in the article.
    Best Rgds.
  • SInC26 - Tuesday, April 10, 2012 - link

    "Factory approved" drivers are just outdated NVIDIA drivers that OEMs haven't bothered to update. You really should update your own graphics drivers for bug fixes (possibly with Optimus) and better performance. Driver updates also should not affect cooling performance.
  • setzer - Thursday, March 22, 2012 - link

    I agree. Having an Optimus-enabled laptop with an Intel HD 3000 and a GT 520M, the only way to get decent battery life is to disable the GeForce entirely, because seriously, what is the point of the GeForce 520M?
    It has better gaming compatibility than the HD 3000, but both are crappy graphics cards for games, and as I work mostly under Linux, where Optimus isn't supported by NVIDIA, the Optimus point is moot.
    Optimus makes sense if you are pairing a decent graphics card, say anything at or above an x40M, with the integrated Intel GPU; pairing low-end cards is just pointless.
    What would be good is Intel getting serious with GPU performance and offering at least 90% of the graphics performance and compatibility of the integrated AMD APUs with lower power usage. That way we could get rid of the lower tiers of mobile and desktop cards altogether :)
  • JarredWalton - Thursday, March 22, 2012 - link

    If you buy a laptop with Optimus and a low-end GPU and then:

    1) Expect it to run games well
    2) Expect it to work with Linux

    ...then you didn't do your homework. But given you "mostly work under Linux" you're already in a vanishingly small minority. Optimus isn't for every single use case, but really Linux is the only area where Optimus in my opinion is seriously flawed.

    As for Wolfpup, I know you get on here every time we mention Optimus and you spread the same old FUD. You have never once responded to my questions about specifically where Optimus has caused you problems. The reason, apparently, is that you've either never actually used Optimus, or you've never actually experienced a problem directly related to Optimus, and so you just wave your hands and say it's a horrible technology. Is Optimus perfect? Well, no, but then neither is any other technology. More to the point, given the alternatives I'll take Optimus every time. Which alternatives are those?

    1) Discrete only GPUs in laptops, where battery life gets killed. The lowest power dGPUs idle at around 3-4W, which is 50% more than where a modern Sandy Bridge laptop idles.
    2) IGP only laptops, which means forget about gaming on Intel. For AMD, Llano is actually decent, but it's still only good for our "Value 1366x768" settings.
    3) Manually switchable graphics, with driver updates dictated by your laptop vendor. Go ahead and ask how many driver updates people with manually switchable laptops have received over the lifetime of their laptop--usually it's one update if you're lucky, two at most.
    4) Manually switchable graphics with muxes also do the lovely flash and blink on the screen for anywhere from five to ten seconds every time you switch.

    Optimus gives you the battery life of the IGP when you don't need the dGPU, and the performance of the dGPU when you do need it. It gives you a switch that takes place in a fraction of a second with no screen flicker. You get driver updates from NVIDIA and Intel whenever they provide them and no need to wait for the laptop vendors to release an updated driver. The "problems" with Optimus are that you don't get alternative OS support, and you tend to lose some performance when running in high frame rate situations (e.g. >120 FPS). Since there's not much point running higher than 120FPS anyway, given the 60Hz refresh rates of most displays (120Hz on 3D displays), Linux support is really the only valid complaint. That's less than 5% of users on laptops/desktops, and NVIDIA has decided that they're willing to lose that business on laptops.
  • orionismud - Tuesday, March 27, 2012 - link

    Hopefully people interested in this get to your reply. AnandTech needs a comment system where you can like / dislike.
  • lurker22 - Thursday, March 22, 2012 - link

    This is why I haven't upgraded my 9800GT. I don't have the patience to spend a couple hours trying to figure out which is a good card anymore. When their numbering system isn't even in order for what is a "better card" it's just plain obnoxious.
  • MrSpadge - Thursday, March 22, 2012 - link

    They want you to "just get the bigger GBs"...
  • Warren21 - Thursday, March 22, 2012 - link

    I was helping a friend build a new budget desktop gaming PC and ran into this same age-old scenario.

    He came up with his own "killer!" build to compare to mine and had put a $75 GT 520 (2GB DDR3) in there as opposed to a $130 6850 (1GB GDDR5) in my original plan. I had to explain that the GT 520 is garbage in comparison...

    Ended up going for a $220 eVGA GTX 480 1.5GB, BNIB with 3 years of warranty. Awesome sale.
