GeForce 700M Models and Specifications

With that brief introduction out of the way, here are the specs of the newly announced 700M family. If I had to guess, I'd say we'll see revised high-end 700M parts sometime later this year based on tweaked GK106 and GK104 chips—perhaps a GTX 780M with the performance of the GTX 680MX in the power envelope of the GTX 680M—but we'll have to wait and see what happens.

                    GeForce GT 750M            GeForce GT 745M            GeForce GT 740M
GPU and Process     28nm GK107 or GK106        28nm GK107                 28nm GK107
CUDA Cores          384                        384                        384
GPU Clock           Up to 967MHz plus Boost    Up to 837MHz plus Boost    Up to 980MHz plus Boost
Memory Eff. Clock   Up to 5.0GHz               Up to 5.0GHz               Up to 5.0GHz
Memory Bus          Up to 128-bit              Up to 128-bit              Up to 128-bit
Memory Bandwidth    Up to 80GB/s               Up to 80GB/s               Up to 80GB/s
Memory              Up to 2GB GDDR5 or DDR3    Up to 2GB GDDR5 or DDR3    Up to 2GB GDDR5 or DDR3

Compared to the previous generation GTX 660M, GT 650M, GT 645M, and GT 640M (not to mention the GT 640M LE), the new chips all have the same core feature set but now come with GPU Boost 2.0 and higher memory clocks. I wish NVIDIA would just drop support for DDR3 on their higher-end chips, and likewise the "up to" clauses aren't particularly helpful, but both are necessary evils of working with OEMs that sometimes have slightly different requirements. Overall, performance of these new 700M parts should be up 15-25% relative to the previous models, thanks to higher GPU and memory clock speeds.
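As a sanity check, those "up to" bandwidth numbers fall straight out of the effective memory clock and bus width. Here's a minimal Python sketch of the arithmetic, using configurations taken from the spec tables in this article:

```python
def peak_bandwidth_gbps(effective_clock_ghz, bus_width_bits):
    """Peak memory bandwidth in GB/s: transfers per second times bytes per transfer."""
    return effective_clock_ghz * bus_width_bits / 8

# Maxed-out GT 750M/745M/740M: 5.0GHz effective GDDR5 on a 128-bit bus
print(peak_bandwidth_gbps(5.0, 128))  # 80.0 GB/s, matching "Up to 80GB/s"

# The 64-bit DDR3 parts further down the page: 2.0GHz effective clock
print(peak_bandwidth_gbps(2.0, 64))   # 16.0 GB/s
```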

You'll note that the core clocks appear to be a little crazy, but this is based largely on how the OEMs choose to configure a specific laptop. With both GDDR5 and DDR3 variants available, NVIDIA wants to keep performance of chips sharing the same name within 10% of each other. Thus, we could see a GT 740M with 2.5GHz GDDR5 and a moderate core clock, another GT 740M with 2.0GHz GDDR5 and a slightly higher core clock, and a third variant with 1800MHz DDR3 matched to a 980MHz core clock. Presumably, most (all?) currently planned GT 750M and GT 745M laptops use GDDR5 memory, which is why we don't see the higher core clocks there. As for the Boost clocks, in practice Boost can increase the GPU core speed 15% or more over the base value, with most games realizing a 10-15% performance increase as a result.
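To put rough numbers on that tradeoff, here's the same bandwidth arithmetic applied to the three hypothetical GT 740M configurations just described. A 128-bit bus is assumed throughout, and the core clocks for the first two variants are illustrative guesses, since NVIDIA only specifies the 980MHz ceiling:

```python
# Hypothetical GT 740M variants from the paragraph above:
# (effective memory clock in GHz, core clock in MHz). 128-bit bus assumed.
variants = {
    "GDDR5, moderate core":  (2.5, 900),   # core clock illustrative
    "GDDR5, higher core":    (2.0, 940),   # core clock illustrative
    "DDR3, max core":        (1.8, 980),   # 980MHz is the spec table's ceiling
}

for name, (mem_ghz, core_mhz) in variants.items():
    bandwidth = mem_ghz * 128 / 8  # GB/s on a 128-bit bus
    print(f"{name}: {bandwidth:.1f} GB/s at {core_mhz}MHz core")

# The DDR3 variant gives up memory bandwidth (28.8 vs. 40.0 GB/s), so it gets
# the highest core clock to stay within ~10% of its GDDR5 siblings overall.
```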

One final item of interest: while the GT 750M appears to have the same configuration as the other GPUs—384 cores, 128-bit memory interface—the chip shots NVIDIA provided show the GT 750M using a different GPU core. Based on appearance, the GT 750M uses GK106, only as what would be called a "floor sweeper" model: any GK106 chip with too many defective cores to be used elsewhere can end up configured basically the same as GK107. Presumably there will also be variants that use GK107 (or potentially GK208, just like the other parts), but NVIDIA wouldn't confirm or deny this.

                    GeForce GT 735M            GeForce GT 730M            GeForce GT 720M            GeForce 710M
GPU and Process     28nm GK208                 28nm GK208                 28nm Fermi                 28nm Fermi
CUDA Cores          384                        384                        96                         96
GPU Clock           Up to 889MHz plus Boost    Up to 719MHz plus Boost    Up to 938MHz with Boost    Up to 800MHz with Boost
Memory Eff. Clock   Up to 2.0GHz               Up to 2.0GHz               Up to 2.0GHz               Up to 1.8GHz
Memory Bus          Up to 64-bit               Up to 64-bit               Up to 64-bit               Up to 64-bit
Memory Bandwidth    Up to 16GB/s               Up to 16GB/s               Up to 16GB/s               Up to 14.4GB/s
Memory              Up to 2GB DDR3             Up to 2GB DDR3             Up to 2GB DDR3             Up to 2GB DDR3

Moving on to the lower end of the 700M range, we have the GT 730M and 710M, which have already shown up in a few laptops. Joining them are the GT 735M and GT 720M, similar chips with higher clocks. All of these chips have 64-bit memory interfaces, which will obviously curtail performance a bit, but NVIDIA is targeting Ultrabooks and other thin form factors here, so performance and thermals need to be kept in balance; more on this in a moment.

The GT 735M and 730M at least are "new" parts that we haven't seen previously in the Kepler family. The word is that some OEMs were after even more economical alternatives than the GT 640M LE, and the option of a 64-bit interface opens up some new markets. It's basically penny-pinching on the part of the OEMs, but we've complained about BoM cost-saving measures plenty, so we won't get into that here. NVIDIA did mention that they've spent some additional time tuning these chips' drivers for performance over a 64-bit bus, and their primary competition on the iGPU front is going to be HD 4000 running on a ULV chip—and in the near future, HD 4600 with Haswell. They'll also compete with AMD APUs and dGPUs, obviously, but NVIDIA is more interested in showing laptop vendors and users what they gain by adding an NVIDIA dGPU to an Intel platform.

Comments

  • Kevin G - Monday, April 1, 2013 - link

    What I'd like to see is an ExpressCard version of the low-end parts. I've been working with numerous business-class laptops with this expansion slot, and I've run into situations where I could use an additional display. I've used USB adapters, but they've been less than ideal. I figure a low-clocked GK208 chip and a 64-bit memory bus could be squeezed into an ExpressCard form factor. I'd expect it to perform around the level of Intel HD 4000, but that'd still be far superior to USB solutions.
  • arthur449 - Monday, April 1, 2013 - link

    While some ExpressCard slots give access to the PCI-E bus, the problem is that the laptop's BIOS/UEFI has to include the device in its whitelist. In almost every case where people have modded their laptops to attach external GPUs, they had to flash a custom ROM to remove restrictions put in place to limit the amount of compatibility testing the vendor had to conduct.
  • rhx123 - Monday, April 1, 2013 - link

    A surprisingly small number of laptops need modification to remove the whitelist on the ExpressCard slot, and where whitelisting is present it's possible to work around it in software, pre-Windows.
    My Lenovo X220T didn't require any whitelist modification.
  • JarredWalton - Monday, April 1, 2013 - link

    Cooling would require the majority of the GPU to exist outside of the slot if you go this route. I don't think you could properly route heat-pipes through the relatively thin slot opening with a radiator/fan on the outside. Once you go external, the number of people really interested in the product drops quite a bit, and you'd still need to power the device, so on most laptops without a dGPU I expect the external ExpressCard option would also require external power. At that point, the only real value is that you could have an external GPU hooked up to a display and connect your laptop to it for a semi-portable workstation.
  • Kevin G - Monday, April 1, 2013 - link

    It would be crazy to put any of these chips into an ExpressCard form factor without reducing power consumption. I was thinking of dropping the clock down to 400MHz and cutting power consumption further with a corresponding drop in voltage. It wouldn't have to break any performance records, just provide full acceleration and drive an external display.

    In hindsight, the GK208 may be too power-hungry. The 28nm Fermi parts (GF117?) should be able to hit the power and thermal allocations for ExpressCard without resorting to an external chassis.
  • Wolfpup - Tuesday, April 2, 2013 - link

    I like the IDEA of a connection to an external dock that allows ANY video card to be used (heck, why not go for SLI?), but notebooks would have to be able to support it; sounds like lots don't, plus tons of notebooks don't have ExpressCard slots anymore (and I'm not sure whether the bandwidth would become a bottleneck). Or obviously Thunderbolt could theoretically pull this off too... IF you could just boot with any GPU installed and have the external GPU active by the time Windows boots, at least.
  • rhx123 - Monday, April 1, 2013 - link

    You can make an external graphics card if you want, I have a 650Ti desktop card attached through ExpressCard.

    It's powered by an XBox PSU.

    http://imgur.com/239skMP
  • rhx123 - Monday, April 1, 2013 - link

    It can drive the internal laptop display through Optimus.
  • Flunk - Monday, April 1, 2013 - link

    Disappointing, this is a really small bump. Mostly a re-labelling of existing parts. Although I suppose it is to be expected seeing as almost all Geforce GT 640m LE-650ms can be clocked up to 1100Ghz with a little bit of bios hacking.
  • JarredWalton - Monday, April 1, 2013 - link

    Besides the fact that nothing runs at 1100GHz (or Ghz, whatever those are), I dare say you've exaggerated quite a bit. Many laptops with even moderate dGPUs run quite warm, and that's with the dGPUs hitting a max clock of around 900MHz (GT 650M with DDR3 and a higher-clocked core, as opposed to GDDR5 with a lower-clocked core). If you manage to hack the VBIOS to run what is supposed to be a 500MHz part at 1GHz or more, you're going to overload the cooling system on virtually every laptop I've encountered.

    In fact, I'll go a step further and say that with very few exceptions, overclocking of laptops in general is just asking for trouble, even when the CPU supports it. I tested old Dell XPS laptops with Core 2 Extreme CPUs that could be overclocked, and the fans would almost always be at 100% under any sort of load as soon as you started overclocking. Long-term, that sort of thing is going to cause component failures far more quickly, and on laptops that cost well over $2000 I think most would be quite angry if it failed after a couple years.

    If you understand the risks and don't really care about ruining a laptop, by all means have at it. But the number of laptops I've seen running stock that have heat dissipation issues urges extreme caution.
