Introducing the NVIDIA GeForce 700M Family

With spring now well under way and the pending launch of Intel’s Haswell chips, OEMs always like to have “new” parts across the board, and so once more we’re getting a new series of chips from NVIDIA: the 700M parts. We’ve already seen a few laptops shipping with the 710M and GT 730M; today NVIDIA is filling out the rest of the 700M family. Last year saw NVIDIA’s very successful launch of mobile Kepler; since that time, the number of laptops shipping with NVIDIA dGPUs compared to AMD dGPUs appears to have shifted even more in NVIDIA’s favor.

Not surprisingly, with TSMC still on 28nm NVIDIA isn’t launching a new architecture, but they’ll be tweaking Kepler to keep it going through 2013. Today's launch of the various 700M GPUs is thus very similar to what we saw with the 500M launch: everything in general gets a bit faster than the previous generation. To improve Kepler, NVIDIA is taking the existing architecture and making a few moderate tweaks, improving their drivers (which will also apply to existing GPUs), and as usual they’re continuing to proclaim the advantages of Optimus Technology.

Starting on the software side of things, we don’t really have anything new to add on the Optimus front, other than to say that in our experience it continues to work well on Windows platforms (Linux users may feel otherwise, naturally). On the bright side, things like the Bumblebee Project appear to be helping the situation, so it's now at least possible to utilize the dGPU and iGPU under Linux. As far as OEMs go, Optimus has matured to the point where I can't immediately come up with any new laptop that has an NVIDIA GPU and doesn't support Optimus; an NVIDIA-equipped laptop now inherently implies Optimus support.

The second software aspect is NVIDIA’s GeForce Experience software, which allows for automatic game configuration based on your hardware. You can see the full slide deck in the gallery at the end with a few additional details, but GeForce Experience is a new software tool that’s designed to automatically adjust supported games to the “best quality for your hardware” setting. This may not seem like a big deal for enthusiasts, but for your average Joe who doesn’t know what all the technical names mean (e.g. antialiasing, anisotropic filtering, specular highlighting, etc.) it’s a step towards making PCs more gamer-friendly: more like a console experience, only with faster hardware. ;-) GeForce Experience is already in open beta, with over 1.5 million downloads and counting, so it’s definitely something people are using.
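
Conceptually, what a tool like this does is map a hardware configuration and a game to a tested settings profile. The sketch below is purely illustrative of that lookup idea; the GPU names, resolutions, and presets are invented examples, and this is not how NVIDIA actually implements GeForce Experience.

```python
# Illustrative sketch of a "hardware + resolution -> settings profile" lookup.
# All GPU names, resolutions, and presets are invented examples, not real
# GeForce Experience data.

PROFILES = {
    ("GT 750M", "1920x1080"): {"preset": "High", "antialiasing": "FXAA"},
    ("GT 740M", "1366x768"):  {"preset": "Medium", "antialiasing": "Off"},
}

def recommend(gpu: str, resolution: str) -> dict:
    """Return a tuned profile if one exists, otherwise a conservative default."""
    return PROFILES.get((gpu, resolution), {"preset": "Low", "antialiasing": "Off"})

print(recommend("GT 750M", "1920x1080"))   # known config -> tuned settings
print(recommend("GTX 680M", "2560x1600"))  # unknown config -> safe fallback
```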

Finally, NVIDIA has added GPU Boost 2.0 to the 700M family. This is basically the same as what’s found in GeForce Titan, though with some tuning specific to mobile platforms as opposed to desktops. We’re told GPU Boost 2.0 is the same core hardware as GPU Boost 1.0, with software refinements allowing for more fine-grained control of the clocks. Ryan has already covered GPU Boost 2.0 extensively, so we won’t spend much more time on it other than to say that over a range of titles, NVIDIA is getting a 10-15% performance improvement relative to GPU Boost 1.0.
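
If you're curious what "more fine-grained control of the clocks" looks like in principle, here's a minimal sketch of a temperature-target boost loop. Every number in it (clock bins, step size, temperature target) is an assumption for illustration only; this is not NVIDIA's actual GPU Boost 2.0 algorithm.

```python
# Toy model of a temperature-target boost loop: nudge the clock up one bin while
# the GPU runs below its temperature target, back it off when it runs above.
# All values are hypothetical, not NVIDIA's actual GPU Boost 2.0 parameters.

BASE_CLOCK_MHZ = 900    # assumed base clock
MAX_BOOST_MHZ = 1050    # assumed top boost bin
STEP_MHZ = 13           # assumed size of one boost bin
TEMP_TARGET_C = 80      # assumed temperature target

def next_clock(clock_mhz: int, gpu_temp_c: int) -> int:
    """Move one bin up or down based on the current temperature reading."""
    if gpu_temp_c < TEMP_TARGET_C:
        return min(clock_mhz + STEP_MHZ, MAX_BOOST_MHZ)
    return max(clock_mhz - STEP_MHZ, BASE_CLOCK_MHZ)

clock = BASE_CLOCK_MHZ
for temp in (65, 70, 74, 79, 83, 85):   # simulated temperature readings
    clock = next_clock(clock, temp)
    print(f"temp={temp}C -> clock={clock} MHz")
```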

Moving to the hardware elements, the real change only applies to one of the chips. GK104 will continue as the highest-performing option in the GTX 675MX and GTX 680M (as well as the GTX 680MX in the 27-inch iMac), and GK106 will likewise continue in the GTX 670MX (though it appears some 670MX chips also use GK104). In fact, for now NVIDIA isn’t announcing any new high-end mobile GPUs, so the GTX 600M parts will continue to fill that niche. The changes come for everything in the GT family, with some of the chips apparently continuing to use GK107 while a couple of options will utilize a new GK208 part.

While NVIDIA won’t confirm which parts use GK208, the latest drivers do refer to that part number, so we know it exists. GK208 looks to be largely the same as GK107, and we’re not sure if there are any real differences other than the fact that GK208 will be available with a 64-bit memory interface. Given the similarity in appearance, it may serve as a 128-bit part as well. Basically, GK107 was never available in a 64-bit configuration, and GK208 remedies that (which actually positions it as a lower-end chip relative to GK107).
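
For context on why the 64-bit versus 128-bit distinction matters, peak memory bandwidth scales directly with bus width: bandwidth equals bus width in bytes times effective data rate. The data rates below are assumed round numbers for illustration, not official specs for any particular 700M part.

```python
# Peak memory bandwidth = (bus width in bytes) * (effective data rate).
# The data rates below are assumed round numbers, not official 700M specs.

def bandwidth_gb_s(bus_width_bits: int, data_rate_mt_s: int) -> float:
    """Peak bandwidth in GB/s for a given bus width (bits) and data rate (MT/s)."""
    return bus_width_bits / 8 * data_rate_mt_s / 1000

print(bandwidth_gb_s(64, 1800))    # 64-bit DDR3-1800      -> 14.4 GB/s
print(bandwidth_gb_s(128, 1800))   # 128-bit DDR3-1800     -> 28.8 GB/s
print(bandwidth_gb_s(128, 4000))   # 128-bit GDDR5, 4 GT/s -> 64.0 GB/s
```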

GeForce 700M Models and Specifications
Comments

  • HisDivineOrder - Monday, April 1, 2013 - link

    Yeah. I kinda figured. Still, if it's the same, then it'd be foolish not to ask.

    I knew when I heard about Boost 2.0 in Titan that all that time spent discussing it was going to mean it would show up in Kepler refresh products relatively soon afterward. I wouldn't be surprised to see nVidia refresh even the desktop high end with something like that. Minor changes, including slightly higher clocks and a newer Boost.

    Even a "minor change" would probably be enough to ruin AMD's next six months.
  • Guspaz - Monday, April 1, 2013 - link

    I've been using GeForce Experience, and have some comments. It's definitely a timesaver, and it's nice to be able to "just click a button" and not have to worry about tweaking the detailed settings (although it's nice to still be able to if I want to override something). I find that the settings it picks generally do run at a good framerate on my rig. It also makes driver updates easier, since it presents you with notification of new drivers (even betas), gives you a nice list of changes in each version, and makes install a one-click affair (it downloads/installs inside the app).

    Downsides? First, it doesn't support very many games. This is understandable since supporting a game means they need to have setting profiles for every one of their cards, but also a whole lot of other configurations such as different CPUs and different monitor resolutions. Unless there is some sort of dynamic algorithm involved, that would be an enormous number of potential configs per game. Still, the limited game support is unfortunate. Second, the app will continually notify you that new optimized settings are available, even when the new settings it downloaded are not for any game you have installed. So it keeps telling me there are new settings, but when I go into the app to check, there are no updates for any of my games.
  • Wolfpup - Monday, April 1, 2013 - link

    I hadn't heard of this program, and have to say it's kind of a cool idea. Heck, *I* don't always like messing around with sometimes vaguely named settings in games. I think for the average user this could be really cool, and it does indeed help make things more console-like.
  • HisDivineOrder - Monday, April 1, 2013 - link

    I like that they went in and started supporting prioritizing resolution. So instead of just abstractly telling me to change my 2560x1600 to 1920x1200/1080, they leave it at 2560x1600 now. That's good.

    Plus, their latest release notes also said they were adding SLI support, which is great.

    The thing that I think this program lacks is the option to set certain settings that you want to be true regardless and then have the program adjust to recommend specs around certain "givens" that you won't ever change.

    For example, I'm not a big fan of AA unless ABSOLUTELY every other performance-related setting can already be turned all the way up. I can imagine some people might want AA at all costs because jaggies just bug them.

    I think we should both have the option to prioritize for the setting we want. I'd also love it if we had a program like Geforce Experience that let us alter the settings for a game before we entered it and also served as a launcher (much as Geforce Experience does), but I think instead of just doing the defaults, we should have the option to select the settings, choose the "Optimal" as determined by nVidia, and also the option to do the tweaking from right inside the Geforce Experience interface.

    And if I'm adding in wish list items, I'd love it if nVidia would integrate SMAA and FXAA into the program. Hell, I think I'd really prefer it if Geforce Experience would serve a similar function to SweetFX except in an official setting kinda way. So we could tweak the game from Geforce Experience in addition to just it serving as a simple optimizer.

    It could come with an "Advanced" mode. I think a quick launch and/or settings for the control panel might be nice to help us move between different UIs, from adjusting the game to adjusting the profiles to adjusting the settings of the video card. Maybe it should be the same interface with different tabs to interact with each element.

    And... nah. That's it.
  • cjb110 - Tuesday, April 2, 2013 - link

    mmm, I don't like it; it seems to push the pretty too much and hurt performance no end.
    Now it might be that it's looking at the GPU only, in which case... duh, pointless.

    Nice idea, but needs to be a bit more balanced in its options!
  • tviceman - Monday, April 1, 2013 - link

    Jarred, can you confirm whether these parts have the same (or very similar) power envelope as their like-named 600 series parts that are being replaced?
  • JarredWalton - Monday, April 1, 2013 - link

    My understanding is that they do have the same approximate power envelopes. However, keep in mind that NVIDIA doesn't disclose notebook power TDPs -- they simply say that they work with each OEM to provide what the OEM desires. Thus, two laptops with GT 750M could potentially have TDPs as much as 5-10W apart (or not -- I don't know how much of a difference we're likely to see).
  • Einy0 - Monday, April 1, 2013 - link

    I hate this "up to" crap for specs. It leaves way too much wiggle room for OEMs to underclock the chips to fit a certain cooling profile, and that messes with performance way too much. There should be clearly defined specifications for each GPU model. The typical consumer doesn't understand that a bigger number doesn't necessarily mean faster. It doesn't make sense to pay more for a higher-end part only to have it nerfed down to fit the OEM's cooling solution, etc...
  • JarredWalton - Monday, April 1, 2013 - link

    NVIDIA's policy is that a GPU has to score within 10% of the "stock" GPU in order to have the same model name, so a GT 650M with DDR3 can't be more than 10% off the performance of a GT 650M with GDDR5. Of course, there's a catch: the 10% margin is measured with 3DMark Vantage "Performance" defaults, which aren't nearly as meaningful as using a suite of games for the testing. So basically, I'm with you: it sucks.
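
    To put that 10% margin in numbers, here's a quick sketch; every score below is made up just to show how the cutoff works, since the real comparison uses 3DMark Vantage.

    ```python
    # Check whether a variant qualifies for the same model name, using made-up scores.

    def within_naming_margin(stock_score: float, variant_score: float, margin: float = 0.10) -> bool:
        """True if the variant is no more than `margin` slower than the stock part."""
        return variant_score >= stock_score * (1 - margin)

    stock_gddr5 = 10000   # hypothetical GT 650M GDDR5 score
    variant_ddr3 = 9100   # hypothetical GT 650M DDR3 score
    print(within_naming_margin(stock_gddr5, variant_ddr3))  # True: 9% slower still passes
    ```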
  • random2 - Monday, April 1, 2013 - link

    ... but again considering Windows 8 almost requires a touchscreen to really be useful that’s expected ...

    Really? Once this OS is set to boot to the desktop, it's a great little OS for those of us who don't run tablets or touch panels.
