GeForce 700M Models and Specifications

With that brief introduction out of the way, here are the specs of the newly announced 700M family. If I had to guess, I'd expect we'll see revised high-end 700M parts sometime later this year based on tweaked GK106 and GK104 chips—like maybe a GTX 780M that has the performance of the GTX 680MX but in the power envelope of the GTX 680M—but we'll have to wait and see what happens.

                   GeForce GT 750M            GeForce GT 745M            GeForce GT 740M
GPU and Process    28nm GK107 or GK106        28nm GK107                 28nm GK107
CUDA Cores         384                        384                        384
GPU Clock          Up to 967MHz plus Boost    Up to 837MHz plus Boost    Up to 980MHz plus Boost
Memory Eff. Clock  Up to 5.0GHz               Up to 5.0GHz               Up to 5.0GHz
Memory Bus         Up to 128-bit              Up to 128-bit              Up to 128-bit
Memory Bandwidth   Up to 80GB/s               Up to 80GB/s               Up to 80GB/s
Memory             Up to 2GB GDDR5 or DDR3    Up to 2GB GDDR5 or DDR3    Up to 2GB GDDR5 or DDR3

Compared to the previous generation GTX 660M, GT 650M, GT 645M, and GT 640M (not to mention the GT 640M LE), the new chips all have the same core set of features but now come with GPU Boost 2.0 and higher memory clocks. I wish NVIDIA would just drop support for DDR3 on their higher-end chips, and likewise the “up to” clauses aren’t really helpful, but they’re both necessary evils thanks to working with OEMs that sometimes have slightly different requirements. Overall, performance of these new 700M parts should be up 15-25% relative to the previous models, thanks to higher GPU and memory clock speeds.

You’ll note that the core clocks appear to be a little crazy, but this is based largely on how the OEMs choose to configure a specific laptop. With both GDDR5 and DDR3 variants available, NVIDIA wants to keep performance of chips with the same name within 10% of each other. Thus, we could see a GT 740M with 2.5GHz GDDR5 and a moderate core clock, another GT 740M with 2.0GHz GDDR5 and a slightly higher core clock, and a third variant with 1800MHz DDR3 but matched to a 980MHz core clock. Presumably, most (all?) currently planned GT 750M and GT 745M laptops are using GDDR5 memory, and thus we don’t see the higher core clocks. As for the Boost clocks, in practice Boost can increase the GPU core speed by 15% or more over the base value, with most games realizing a 10-15% performance increase as a result.
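To make that balancing act concrete, here's a minimal back-of-the-envelope sketch in Python (nothing official from NVIDIA) of how peak memory bandwidth falls out of the effective memory clock and bus width for the three hypothetical GT 740M configurations described above. The core clocks for the two GDDR5 variants are placeholder assumptions (only the 980MHz/DDR3 pairing comes from the table), and it assumes the quoted "2.5GHz GDDR5" corresponds to the table's 5.0GHz effective data rate.

# Back-of-the-envelope sketch: peak memory bandwidth is simply the
# effective memory clock times the bus width in bytes.
def peak_bandwidth_gb_per_s(effective_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s: transfers per second times bytes per transfer."""
    return effective_clock_mhz * (bus_width_bits / 8) / 1000.0

# (label, core clock MHz, effective memory clock MHz, bus width in bits)
# Core clocks for the GDDR5 variants are illustrative assumptions, not official specs.
gt740m_variants = [
    ("GT 740M, 2.5GHz GDDR5 (5.0GHz effective)", 900, 5000, 128),
    ("GT 740M, 2.0GHz GDDR5 (4.0GHz effective)", 940, 4000, 128),
    ("GT 740M, 1800MHz DDR3",                    980, 1800, 128),
]

for label, core_mhz, mem_mhz, bus_bits in gt740m_variants:
    bw = peak_bandwidth_gb_per_s(mem_mhz, bus_bits)
    print(f"{label}: {core_mhz}MHz core, {bw:.1f}GB/s peak memory bandwidth")

The DDR3 configuration ends up with roughly a third of the peak bandwidth of the fastest GDDR5 option (28.8GB/s versus 80GB/s), which is exactly why NVIDIA pairs it with the highest core clock in an attempt to keep real-world performance within that 10% window.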

One final item of interest is that while the GT 750M appears to have a similar configuration to the other GPUs—384 cores, 128-bit memory interface—at least in the chip shots provided, the GT 750M uses a different GPU core. Based on the appearance in the above images, the GT 750M uses GK106, only it’s what would be called a “floor sweeper” model: any GK106 chip with too many defective cores to be used elsewhere can end up configured basically the same as GK107. Presumably, there will also be variants that use GK107 (or potentially GK208, just like the other parts), but NVIDIA wouldn’t confirm or deny this.

                   GeForce GT 735M            GeForce GT 730M            GeForce GT 720M            GeForce 710M
GPU and Process    28nm GK208                 28nm GK208                 28nm Fermi                 28nm Fermi
CUDA Cores         384                        384                        96                         96
GPU Clock          Up to 889MHz plus Boost    Up to 719MHz plus Boost    Up to 938MHz with Boost    Up to 800MHz with Boost
Memory Eff. Clock  Up to 2.0GHz               Up to 2.0GHz               Up to 2.0GHz               Up to 1.8GHz
Memory Bus         Up to 64-bit               Up to 64-bit               Up to 64-bit               Up to 64-bit
Memory Bandwidth   Up to 16GB/s               Up to 16GB/s               Up to 16GB/s               Up to 14.4GB/s
Memory             Up to 2GB DDR3             Up to 2GB DDR3             Up to 2GB DDR3             Up to 2GB DDR3

Moving on to the lower end of the 700M range, we have the GT 730M and 710M, which have already shown up in a few laptops. Joining them are the GT 735M and GT 720M, which are similar chips with higher clocks. All of these chips have 64-bit memory interfaces, and that will obviously curtail performance a bit, but NVIDIA is targeting Ultrabooks and other thin form factors here, so performance and thermals need to be kept in balance; more on this in a moment.

The GT 735M and 730M at least are “new” parts that we haven’t seen previously in the Kepler family. The word is that some OEMs were after more economical alternatives than even the GT 640M LE, and the option to go with a 64-bit interface opens up some new markets. It’s basically penny pinching on the part of the OEMs, but we’ve complained about BoM cost-saving measures plenty of times before, so we won’t get into it here. NVIDIA did mention that they’ve spent some additional time tuning their drivers for performance over a 64-bit bus on these chips, and their primary competition in the iGPU market is going to be HD 4000 running on a ULV chip—and in the near future, HD 4600 with Haswell. They'll also compete with AMD APUs and dGPUs, obviously, but NVIDIA is more interested in trying to show laptop vendors and users what they gain by adding an NVIDIA dGPU to an Intel platform.

91 Comments

  • JarredWalton - Wednesday, April 3, 2013 - link

    They're so low end that I hardly worry about them. Anyone buying a 710M or GT 720M ought to know what they're getting, and at least they're 28nm.
  • Notmyusualid - Wednesday, April 3, 2013 - link

    I used to shake my head when I read comments about supposed 20+ lbs laptops, noise, heat, cost, etc.

    They are not as noisy, nor as heavy as you think. Sure, if you run wPrime all day, you are going to hear it...if you OpenCL compute, you'll hear the GPUs too.

    But nobody is forcing you to buy it, and there is clearly a market out there.

    Now I smile at the ignorant comments. Like another poster said, making these big laptops is not hurting you, so why hate?

    My M18x R2 gets 5+hrs on discrete graphics (nice for typing reports in airports, when you've no lounge access and are unsure of the battery's current capacity).

    On its dual graphics cards I push 33,500 in 3DMark06 and 11,250+ in 3DMark11, and under Linux, Pyrit crunches 130,000 PMK/s - all with NO GPU OVERCLOCKING.

    Can YOUR desktop do that? I know of many that can't.

    What is a valid complaint here is the confusing GPU-naming game that both AMD & Nvidia play, forcing many uninformed / incorrect purchases the world over.

    If the average consumer knew what they'd get with GDDR3 and a 128-bit bus width, they might run a mile. Let alone what architecture might reside beneath... I'd welcome a more consistent naming approach, like BMW's, for example. (You can be sure your 550i is gonna smoke a 316i.) And I'm not saying that is a perfect system either.

    Anyway, like they say on Youtube, "Haters are gonna hate".
  • Rishi. - Monday, April 22, 2013 - link

    Ummmmm... yes, it's the confusing naming schemes they follow, coupled with a somewhat more confusing spec sheet!
  • nerd1 - Saturday, April 13, 2013 - link

    Samsung released the Chronos 7 with the 8870M, which lasts 10+ hrs, is 20mm thick, is more powerful than the 670M, and weighs only slightly more than the rMBP 15.

    Mobile gaming is really getting awesome.
  • Rishi. - Monday, April 22, 2013 - link

    Yeah, it certainly does!
    I just wish I had a portable beast with a GTX 675, or around that!
    Desktop users gonna hate, though!
  • karbom - Sunday, April 14, 2013 - link

    Hi Jarred. I have confirmation that the Acer V3-571G laptop has a 730M with a 128-bit memory interface. What do you think about other OEMs like Dell: will they implement the same 128-bit interface or a 64-bit one? Specifications tend to differ among OEMs, and Notebookcheck.net lists the interface as 128/64-bit.
  • Mr. Bub - Wednesday, April 17, 2013 - link

    And here begins the obsolescence of my 6-month-old laptop with a GT 640M.
  • Rishi. - Monday, April 22, 2013 - link

    The only thing I am pissed off about from Nvidia is the technical specifications for their mobile dGPUs. I have a hard time finding the differences b/w some of their cards by looking at the spec sheet. They all appear to be almost the same (unless you consider the clock variations)!!

    Overclocking is safe, but only as long as you don't mess with the voltages and step up the clock speeds slowly. And it's not at all a good idea to push the dGPU around its upper limits.
    I don't know much about the newer 700M series, but I used to have two models from the Kepler 600M series. I overclocked the GT 640M to values around a DDR3 GT 650M, using a modded vBIOS. The GPU temperature never exceeded 76C with the help of a powerful cooler at 99% load for several minutes.
    Performance was on par with the DDR3 GT 650M! :)

    However, for me the major issue was the blazing hot cores of the 3610QM. Under 60-75% load for 30 minutes, it reaches 90C, just like that! It's probably poor thermal paste.

    And to those who think laptops with dGPUs perform poorly and are overpriced: "are you new to this world, baby!?" Because here on Earth, I have never heard of a notebook which performs better than a desktop at the same price point. You have to pay the price for mobility!

    "Haters gonna Hate.!"
  • sdubyas - Wednesday, April 24, 2013 - link

    You see, for many consumers like myself, we appreciate your efforts to interpret what is going on with the specs of these chips. However, even with this information I am wary of trying to decipher what and when I should actually purchase a machine that is not just a rebrand or a fail chip. I got burned on a Toshiba Satellite with SLI 8600Ms and it was a piece, one that failed in just over two years. Anyway, that was 2007 and I said I'd never buy another "gaming laptop", but it's time to try again. Would any of you recommend a laptop that is already available or will be soon? I am really looking at a 680M as it should be relevant for a couple of years. However, I really don't have more than 1.5k to burn, especially on another fail laptop. I also looked at the MacBook, but they are running only a 650M and I'm not sure when their next gen is forthcoming.

    Here is something I am also considering: http://www.villageinstruments.com/tiki-index.php?p... Has anyone looked into the ViDock?
  • Menetlaus - Monday, May 6, 2013 - link

    I realize this likely won't be read or replied to, but it would be GREATLY appreciated to have a few laptops reviewed with the 700-series NVIDIA GPUs prior to the Haswell launch.

    In my case it would be much easier to compare an Ivy Bridge/660M to an IB/750M and finally to a Haswell/750M in a midrange gaming system, rather than skipping the IB/750M step and wondering how much of the change is CPU vs. GPU based.

    Lenovo had/has some IB/750M gaming laptops for sale (replacing an IB/660M offering) and will likely have a Haswell/750M available shortly after the Haswell launch. MSI also has lines with the IB/660M and is likely to be at the Haswell mobile launch party.
