GeForce 700M Models and Specifications

With that brief introduction out of the way, here are the specs of the newly announced 700M family. If I had to guess, I'd expect we'll see revised high-end 700M parts sometime later this year based on tweaked GK106 and GK104 chips (perhaps a GTX 780M with the performance of the GTX 680MX in the power envelope of the GTX 680M), but we'll have to wait and see what happens.

                    GeForce GT 750M          GeForce GT 745M          GeForce GT 740M
GPU and Process     28nm GK107 or GK106      28nm GK107               28nm GK107
CUDA Cores          384                      384                      384
GPU Clock           Up to 967MHz plus Boost  Up to 837MHz plus Boost  Up to 980MHz plus Boost
Memory Eff. Clock   Up to 5.0GHz             Up to 5.0GHz             Up to 5.0GHz
Memory Bus          Up to 128-bit            Up to 128-bit            Up to 128-bit
Memory Bandwidth    Up to 80GB/s             Up to 80GB/s             Up to 80GB/s
Memory              Up to 2GB GDDR5 or DDR3  Up to 2GB GDDR5 or DDR3  Up to 2GB GDDR5 or DDR3

Compared to the previous generation GTX 660M, GT 650M, GT 645M, and GT 640M (not to mention the GT 640M LE), the new chips all share the same core feature set but now add GPU Boost 2.0 and higher memory clocks. I wish NVIDIA would just drop support for DDR3 on their higher-end chips, and likewise the “up to” clauses aren’t really helpful, but both are necessary evils that come from working with OEMs that sometimes have slightly different requirements. Overall, performance of these new 700M parts should be up 15-25% relative to the previous models, thanks to higher GPU and memory clock speeds.

You’ll note that the core clocks appear to be a bit all over the place, but this is based largely on how the OEMs choose to configure a specific laptop. With both GDDR5 and DDR3 variants available, NVIDIA wants to keep the performance of chips sharing the same name within 10% of each other. Thus, we could see a GT 740M with 2.5GHz GDDR5 and a moderate core clock, another GT 740M with 2.0GHz GDDR5 and a slightly higher core clock, and a third variant with 1800MHz DDR3 matched to a 980MHz core clock. Presumably, most (all?) currently planned GT 750M and GT 745M laptops are using GDDR5 memory, and thus we don’t see the higher core clocks there. As for the Boost clocks, in practice Boost can raise the GPU core speed 15% or more over the base value, with most games realizing a 10-15% performance improvement as a result.
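
To put some numbers on that balancing act, here is a minimal Python sketch of the bandwidth arithmetic behind those memory options. It assumes the GDDR5 figures quoted above are half the effective data rate (so “2.5GHz GDDR5” transfers at 5.0GT/s, matching the 80GB/s table maximum) while the DDR3 figure is already an effective rate; that interpretation and the example below are illustrative, not official NVIDIA numbers.

    # Memory bandwidth in GB/s from bus width (bits) and effective data rate (GT/s).
    def bandwidth_gbps(bus_bits: int, effective_gtps: float) -> float:
        return (bus_bits / 8) * effective_gtps

    # GT 740M memory options from the text. The GDDR5 figures quoted above are assumed
    # to be half the effective data rate; the 1800MHz DDR3 figure is already effective.
    options = {
        "2.5GHz GDDR5 (5.0GT/s effective)": bandwidth_gbps(128, 5.0),  # 80.0 GB/s, the table maximum
        "2.0GHz GDDR5 (4.0GT/s effective)": bandwidth_gbps(128, 4.0),  # 64.0 GB/s
        "1800MHz DDR3 (1.8GT/s effective)": bandwidth_gbps(128, 1.8),  # 28.8 GB/s
    }
    for name, bw in options.items():
        print(f"{name}: {bw:.1f} GB/s")

    # Boost behavior described above: roughly 15% over the base core clock, with most
    # games seeing a 10-15% improvement in practice.
    base_clock_mhz = 980
    print(f"Boosted core clock: ~{base_clock_mhz * 1.15:.0f}MHz")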

One final item of interest is that while the GT 750M appears to have the same basic configuration as the other GPUs (384 cores and a 128-bit memory interface), at least in the chip shots provided it uses a different GPU core. Based on those images, the GT 750M uses GK106, only in what would be called a “floor sweeper” configuration: any GK106 chip with too many defective cores to be used elsewhere can end up configured basically the same as GK107. Presumably, there will also be variants that use GK107 (or potentially GK208, like the other parts), but NVIDIA wouldn’t confirm or deny this.

                    GeForce GT 735M          GeForce GT 730M          GeForce GT 720M          GeForce 710M
GPU and Process     28nm GK208               28nm GK208               28nm Fermi               28nm Fermi
CUDA Cores          384                      384                      96                       96
GPU Clock           Up to 889MHz plus Boost  Up to 719MHz plus Boost  Up to 938MHz with Boost  Up to 800MHz with Boost
Memory Eff. Clock   Up to 2.0GHz             Up to 2.0GHz             Up to 2.0GHz             Up to 1.8GHz
Memory Bus          Up to 64-bit             Up to 64-bit             Up to 64-bit             Up to 64-bit
Memory Bandwidth    Up to 16GB/s             Up to 16GB/s             Up to 16GB/s             Up to 14.4GB/s
Memory              Up to 2GB DDR3           Up to 2GB DDR3           Up to 2GB DDR3           Up to 2GB DDR3

Moving on to the lower end of the 700M range, we have the GT 730M and 710M, which have already shown up in a few laptops. Joining them are the GT 735M and GT 720M, which are similar chips with higher clocks. All of these chips have 64-bit memory interfaces, which will obviously curtail performance a bit, but NVIDIA is targeting Ultrabooks and other thin form factors here, so performance and thermals need to be kept in balance; more on this in a moment.
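
For a rough sense of what the narrower bus gives up, here is a small Python sketch of the same bandwidth arithmetic, using figures taken straight from the two tables above; the comparison is illustrative only.

    # Memory bandwidth in GB/s: bytes per transfer times effective transfer rate (GT/s).
    def bandwidth_gbps(bus_bits: int, effective_gtps: float) -> float:
        return (bus_bits / 8) * effective_gtps

    low_end = bandwidth_gbps(64, 2.0)   # GT 735M/730M/720M: 64-bit DDR3 at 2.0GT/s -> 16 GB/s
    gt_750m = bandwidth_gbps(128, 5.0)  # GT 750M: 128-bit GDDR5 at 5.0GT/s -> 80 GB/s

    print(f"64-bit DDR3:   {low_end:.0f} GB/s")
    print(f"128-bit GDDR5: {gt_750m:.0f} GB/s ({gt_750m / low_end:.0f}x the low-end parts)")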

The GT 735M and 730M at least are “new” parts that we haven’t seen previously in the Kepler family. The word is that some OEMs were after even more economical alternatives than the GT 640M LE, and the option to go with a 64-bit interface opens up some new markets. It’s basically penny-pinching on the part of the OEMs, but we’ve complained about BoM cost-saving measures plenty of times, so we won’t get into it here. NVIDIA did mention that they’ve spent some additional time tuning their drivers for performance over a 64-bit bus on these chips, and their primary competition in the iGPU market is going to be HD 4000 running on a ULV chip (and, in the near future, HD 4600 with Haswell). They’ll also compete with AMD APUs and dGPUs, obviously, but NVIDIA is more interested in trying to show laptop vendors and users what they gain by adding an NVIDIA dGPU to an Intel platform.

Comments

  • crypticsaga - Monday, April 1, 2013 - link

    What possible reason would Intel have for pushing a product like that? In fact, if some sources are correct, they're trying to do the exact opposite by bottlenecking even internal dGPUs by limiting the available PCIe lanes in Broadwell.
  • shompa - Monday, April 1, 2013 - link

    That problem has been solved for over a year with Thunderbolt. Use a Thunderbolt PCIe enclosure with a graphics card.
  • fokka - Monday, April 1, 2013 - link

    afaik this is not entirely correct since even thunderbolt is too slow to properly utilize a modern graphics card.
    this is not surprising, since thunderbolt is based on 4x pci-e 2.0 (2GB/s) and current desktop class graphics are using 16x pci-e 3.0 (~16GB/s) which is about eight times as fast.

    so i wouldn't say the problem is completely solved throughput-wise, but thunderbolt sure was an important step in the right direction.
  • MojaMonkey - Monday, April 1, 2013 - link

    No, shompa is correct, it has been solved with Thunderbolt and I'm personally using a GTX 680 connected externally. Works great.
  • Wolfpup - Monday, April 1, 2013 - link

    Ugh. You're either ignorant or reaaaaally generous with the hyperbole. "20+ lbs notebooks"? Really?

    In real life, mid-range notebooks/GPUs do fine for gaming, and high end notebooks/GPUs do...REALLY fine. When you can max out today's games at 1080p, that isn't "performing poorly", and is orders of magnitude better than Intel's video.

    If YOU guys don't want high end notebooks, fine, but I don't see how they're hurting you.
  • lmcd - Tuesday, April 2, 2013 - link

    My cheap A8m (Trinity) can play Rage at high texture res at panel res (1366x768), just for starters. And that's $400 level I think right now.
  • superjim - Wednesday, April 10, 2013 - link

    I can confirm this. An A8-4500M does really well for $400 or below at 1366x768. Now if the A10 would come down to $400, we'd really be in good shape.
  • xenol - Monday, April 1, 2013 - link

    I had a laptop with discrete graphics that lasted for over 9 hours on battery, while surfing the web. It was a laptop with an early form of Optimus (you had to manually switch), but still, you can have graphical performance xor battery life if you don't need the performance. But asking for both? Now that's silly.

    As for your issue with marketing the 680M as it is when it can't outperform a midrange desktop card... You do realize that this is a different market segment? Also, you should say "shame on you" to all the display companies who mislead customers into thinking they're buying a panel that can do 16 million colors (which, last I checked, 18 bits is not) or that has a 1000000:1 contrast ratio (which you need to be in a pitch black room looking at a black/white checkerboard pattern to see).
  • Wolfpup - Monday, April 1, 2013 - link

    "Modest performance increase"? I wouldn't call my GTX 680m a "modest performance increase" over Intel video lol

    Are you KIDDING?!? Notebook hardware is ALWAYS worse than desktop. This applies obviously to CPUs too, which you're inexplicably not complaining about. You always pay more to get the same performance. That doesn't mean it's "dishonest" or the like.

    And quite obviously integrated video can never catch up with a discrete part so long as they make high-end discrete parts, so the time is "never", not "near".

    ****
    Regarding the article...Optimus...eh, Nvidia's driver team is impressive as always, but literally the first program I ran that I wanted to run on the GPU wouldn't run on the GPU...thankfully my notebook lets you turn off Optimus.
  • JarredWalton - Monday, April 1, 2013 - link

    Which program did you run that you wanted on the GPU? Please be very specific.
