NVIDIA’s GeForce 700M Family: Full Details and Specs
by Jarred Walton on April 1, 2013 9:00 AM EST
Introducing the NVIDIA GeForce 700M Family
With spring now well under way and Intel's Haswell launch pending, OEMs like to have "new" parts across the board, and so once more we're getting a new series of chips from NVIDIA: the 700M family. We've already seen a few laptops shipping with the 710M and GT 730M; today NVIDIA is filling out the rest of the 700M family. Last year saw NVIDIA's very successful launch of mobile Kepler, and since that time the balance of laptops shipping with NVIDIA dGPUs versus AMD dGPUs appears to have shifted even further in NVIDIA's favor.
Not surprisingly, with TSMC still on 28nm, NVIDIA isn't launching a new architecture; instead they'll be tweaking Kepler to keep it going through 2013. Today's launch of the various 700M GPUs is thus very similar to what we saw with the 500M launch: everything generally gets a bit faster than the previous generation. NVIDIA is making a few moderate tweaks to the existing Kepler architecture, improving their drivers (improvements that will also apply to existing GPUs), and, as usual, continuing to tout the advantages of Optimus Technology.
Starting on the software side of things, we don't really have anything new to add on the Optimus front, other than to say that in our experience it continues to work well on Windows platforms—Linux users may feel otherwise, naturally. On the bright side, projects like Bumblebee appear to be helping the situation, so it's now at least possible to utilize both the dGPU and iGPU under Linux. As far as OEMs go, Optimus has matured to the point where I can't immediately think of any new laptop with an NVIDIA GPU that doesn't support Optimus; an NVIDIA-equipped laptop now essentially implies Optimus support.
The second software aspect is NVIDIA's GeForce Experience, which allows for automatic game configuration based on your hardware. You can see the full slide deck in the gallery at the end with a few additional details, but in short, GeForce Experience is a new software tool designed to automatically adjust supported games to the "best quality for your hardware" settings. This may not seem like a big deal for enthusiasts, but for the average Joe who doesn't know what all the technical names mean (e.g. antialiasing, anisotropic filtering, specular highlighting, etc.), it's a step towards making PCs more gamer-friendly—more like a console experience, only with faster hardware. ;-) GeForce Experience is already in open beta, with over 1.5 million downloads and counting, so it's definitely something people are using.
Finally, NVIDIA has added GPU Boost 2.0 to the 700M family. This is basically the same as what’s found in GeForce Titan, though with some tuning specific to mobile platforms as opposed to desktops. We’re told GPU Boost 2.0 is the same core hardware as GPU Boost 1.0, with software refinements allowing for more fine-grained control of the clocks. Ryan has already covered GPU Boost 2.0 extensively, so we won’t spend much more time on it other than to say that over a range of titles, NVIDIA is getting a 10-15% performance improvement relative to GPU Boost 1.0.
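NVIDIA doesn't detail the exact algorithm, but the general idea behind Boost (opportunistically stepping the core clock up in small bins while temperature and power remain under their limits, and backing off when they don't) can be sketched roughly as follows. This is purely an illustrative Python model; the clock, bin size, and limit values are made-up placeholders, not NVIDIA's actual numbers.

```python
# Illustrative model of a Boost-style clock governor (not NVIDIA's actual
# algorithm). All numbers below are hypothetical placeholders.

BASE_CLOCK = 967      # MHz, base clock (hypothetical)
MAX_BOOST = 1085      # MHz, highest boost bin (hypothetical)
BIN_SIZE = 13         # MHz per boost bin (hypothetical)

TEMP_LIMIT_C = 80     # temperature limit (hypothetical)
POWER_LIMIT_W = 45    # power limit (hypothetical)

def next_clock(current_mhz, temp_c, power_w):
    """Step the core clock up or down one bin based on available headroom."""
    if temp_c >= TEMP_LIMIT_C or power_w >= POWER_LIMIT_W:
        # No headroom: back off one bin, but never below the base clock.
        return max(BASE_CLOCK, current_mhz - BIN_SIZE)
    # Headroom available: step up one bin, capped at the top boost bin.
    return min(MAX_BOOST, current_mhz + BIN_SIZE)

# Example: the clock climbs while there is headroom, then drops back at 81C.
clock = BASE_CLOCK
for temp, power in [(62, 30), (65, 33), (70, 38), (81, 44)]:
    clock = next_clock(clock, temp, power)
    print(clock)   # 980, 993, 1006, then back down to 993
```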
Moving to the hardware, there's really only one new chip in the mix. GK104 will continue as the highest performing option in the GTX 675MX and GTX 680M (as well as the GTX 680MX in the 27-inch iMac), and GK106 will likewise continue in the GTX 670MX (though it appears some 670MX chips also use GK104). In fact, for now NVIDIA isn't announcing any new high-end mobile GPUs, so the GTX 600M parts will continue to fill that niche. The changes come in the GT family, with some of the chips apparently continuing to use GK107 while a couple of options will utilize a new GK208 part.
While NVIDIA won't confirm which parts use GK208, the latest drivers do refer to that part number, so we know it exists. GK208 looks to be largely the same as GK107, and we're not sure if there are any real differences other than the fact that GK208 will be available in a 64-bit configuration. Given the similarity, it may serve as a 128-bit part as well. Basically, GK107 was never available in a 64-bit configuration, and GK208 remedies that, which positions it as a lower-end chip relative to GK107.
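The practical impact of a 64-bit versus 128-bit memory interface is peak bandwidth, which is simply bus width times the effective memory data rate. A quick back-of-the-envelope calculation in Python (using example clocks, not the specs of any particular card) shows why a 64-bit GK208 configuration would end up with half the bandwidth of an otherwise similar 128-bit GK107 part:

```python
def memory_bandwidth_gbps(bus_width_bits, effective_rate_mtps):
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (transfers per second)."""
    return (bus_width_bits / 8) * effective_rate_mtps / 1000

# Example figures only: 2000MHz GDDR5 transfers data at 4000 MT/s effective.
print(memory_bandwidth_gbps(128, 4000))  # 64.0 GB/s -- 128-bit GK107-style card
print(memory_bandwidth_gbps(64, 4000))   # 32.0 GB/s -- 64-bit GK208-style card
print(memory_bandwidth_gbps(64, 1800))   # 14.4 GB/s -- 64-bit DDR3 at 1800 MT/s
```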
77 Comments
whyso - Monday, April 01, 2013 - link
It depends, really. As long as you don't touch the voltage, the temperature doesn't rise much. I have a 660M and it reaches 1085/2500 without any problems (ASIC rating of 69%). Overclocked vs. non-overclocked is basically a 2 degree difference (72 vs. 74 degrees), better than a stock desktop 650. Also, considering virtually every 660M I have seen boosts up to 950/2500 from 835/2000, I don't think the 750M is going to be any upgrade. Many 650M cards have a boost of 835 on the core, so there really is no upgrade there either (maybe 5-10%). GK107 is fine with 64 GB/sec of bandwidth.
whyso - Monday, April 01, 2013 - link
Whoops, sorry, didn't see the 987 clocks, nice jump there.
JarredWalton - Monday, April 01, 2013 - link
Funny thing is that in reading comments on some of the modded VBIOS stuff for the Sony VAIO S, the modder says, "The Boost clock doesn't appear to be working properly so I just set it to the same value..." Um, think please, Mr. Modder. The Boost clock is what the GPU is able to hit when certain temperature and power thresholds are not exceeded; if you overclock, you've likely inherently gone beyond what Boost is designed to do.
Anyway, a 2C difference for a 660M isn't a big deal, but you're also looking at a card with a default 900MHz clock, so you went up in clocks by 20% and had a 3% temperature increase (and no word on fan speed). Going from 500MHz to 950MHz is likely going to be more strenuous on the system and components.
damianrobertjones - Monday, April 01, 2013 - link
"and their primary competition in the iGPU market is going to be HD 4000 running on a ULV chip!"
Wouldn't that be the HD 4600? Also, it's a shame that no one really tests the HD 4000 with something like Vengeance RAM, which improves performance.
HisDivineOrder - Monday, April 01, 2013 - link
So if the "core hardware" is the same between Boost 1.0 and 2.0, then nVidia should go ahead and make Boost 2.0 something we can all enable in the driver.
Or... are they trying to get me to upgrade to new hardware to activate a feature my card is already fully capable of supporting? Haha, nVidia, you so crazy.
JarredWalton - Monday, April 01, 2013 - link
There may be some minor difference in the core hardware (some extra temperature or power sensors?), but I'd be shocked if NVIDIA offered an upgrade to Boost 1.0 users via drivers -- after all, it looks like half of the performance increase from 700M is going to come from Boost 2.0!
HisDivineOrder - Monday, April 01, 2013 - link
Yeah, I kinda figured. Still, if it's the same, then it'd be foolish not to ask.
I knew when I heard about Boost 2.0 in Titan that all the time spent discussing it meant it would show up in Kepler refresh products relatively soon afterward. I wouldn't be surprised to see nVidia refresh even the desktop high end with something like that: minor changes, including slightly higher clocks and a newer Boost.
Even a "minor change" would probably be enough to ruin AMD's next six months.
Guspaz - Monday, April 01, 2013 - link
I've been using GeForce Experience, and have some comments. It's definitely a timesaver, and it's nice to be able to "just click a button" and not have to worry about tweaking the detailed settings (although it's nice to still be able to if I want to override something). I find that the settings it picks generally do run at a good framerate on my rig. It also makes driver updates easier, since it presents you with notification of new drivers (even betas), gives you a nice list of changes in each version, and makes install a one-click affair (it downloads/installs inside the app).
Downsides? First, it doesn't support very many games. This is understandable, since supporting a game means they need to have setting profiles for every one of their cards, but also a whole lot of other configurations such as different CPUs and different monitor resolutions. Unless there is some sort of dynamic algorithm involved, that would be an enormous number of potential configs per game. Still, the limited game support is unfortunate. Second, the app will continually notify you that new optimized settings are available, even when the new settings it downloaded are not for any game you have installed. So it keeps telling me there are new settings, but when I go into the app to check, there are no updates for any of my games.
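As a rough illustration of Guspaz's point about the number of potential configurations, even a modest matrix of GPU models, CPU tiers, and display resolutions multiplies out quickly. The counts in this Python snippet are arbitrary examples, not NVIDIA's actual support matrix:

```python
# Rough illustration of why per-game settings profiles multiply quickly.
# All counts are arbitrary examples, not NVIDIA's actual support matrix.
gpus = 30          # distinct GeForce models worth profiling
cpu_tiers = 5      # broad CPU performance buckets
resolutions = 6    # common display resolutions (1366x768 ... 2560x1600)
games = 50         # supported titles

profiles_per_game = gpus * cpu_tiers * resolutions
print(profiles_per_game)          # 900 combinations per game
print(profiles_per_game * games)  # 45000 profiles across the whole library
```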
Wolfpup - Monday, April 01, 2013 - link
I hadn't heard of this program, and have to say it's kind of a cool idea. Heck, *I* don't always like messing around with sometimes vague settings in games. I think for the average user this could be really cool, and it does indeed help make the experience more console-like.
HisDivineOrder - Monday, April 01, 2013 - link
I like that they've started prioritizing resolution. So instead of just abstractly telling me to change my 2560x1600 to 1920x1200/1080, they leave it at 2560x1600 now. That's good.
Plus, their latest release notes also said they were adding SLI support, which is great.
The thing that I think this program lacks is the option to set certain settings that you want to be true regardless and then have the program adjust to recommend specs around certain "givens" that you won't ever change.
For example, I'm not a big fan of AA unless absolutely every other performance setting can already be turned all the way up. I can imagine some people might want AA at all costs because jaggies just bug them.
I think we should both have the option to prioritize for the setting we want. I'd also love it if we had a program like Geforce Experience that let us alter the settings for a game before we entered it and also served as a launcher (much as Geforce Experience does), but I think instead of just doing the defaults, we should have the option to select the settings, choose the "Optimal" as determined by nVidia, and also the option to do the tweaking from right inside the Geforce Experience interface.
And if I'm adding wish-list items, I'd love it if nVidia would integrate SMAA and FXAA into the program. Hell, I think I'd really prefer it if Geforce Experience served a similar function to SweetFX, except in an official, supported kind of way, so we could tweak the game from Geforce Experience in addition to it just serving as a simple optimizer.
It could come with an "Advanced" mode. I think a quick launch for the control panel and/or its settings might be nice to help us move between the different UIs, from adjusting the game to adjusting the profiles to adjusting the settings of the video card. Maybe it should be the same interface with different tabs to interact with each element.
And... nah. That's it.