Introducing the GeForce 400M Family

Back in May, NVIDIA surprised us by announcing their first mobile DX11 GPU, the GTX 480M. What was surprising is that they used the big GF100 chip, albeit harvested and downclocked relative to the desktop GPUs. In fact, GTX 465M would have been a more accurate name, as the 480M shipped with the same number of cores as the desktop GTX 465. Power requirements were understandably quite high (100W), but there's no arguing that the 480M is now the fastest mobile GPU on the block. Whether it's worth the price of admission is another story, of course, which segues nicely into today's announcement.

NVIDIA is filling out the rest of their mobile lineup with a slew of new chips. What they're not telling us is precisely which core each part uses, so there will potentially be some overlap with harvesting going on (the 445M in particular looks like it will use two different chips). NVIDIA also didn't give us any figures for power requirements, though Optimus Technology means that when paired with an IGP-enabled CPU they can "idle" at 0W. Anyway, here's what we do know, starting with the high-end offerings. (We've split the other parts out onto the next page to keep our tables manageable.)

NVIDIA High-End 400M Specifications
                               GeForce GTX 480M   GeForce GTX 470M   GeForce GTX 460M
Codename                       GF100              GF104              GF106
CUDA Cores                     352                288                192
Graphics Clock (MHz)           425                535                675
Processor Clock (MHz)          850                1070               1350
Memory Clock (MHz)             1200               1250               1250
Standard Memory Configuration  GDDR5              GDDR5              GDDR5
Memory Interface Width         256-bit            192-bit            192-bit
Memory Bandwidth (GB/sec)      76.8               60                 60
SLI Ready                      Yes                Yes                Yes
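As a sanity check on the table, the bandwidth figures fall straight out of the memory clock and bus width; here's a minimal sketch, assuming (as NVIDIA's numbers imply) two transfers per listed memory clock:

```python
def bandwidth_gb_s(mem_clock_mhz, bus_width_bits):
    # GB/s = clock (MHz) * 2 transfers per clock * bus width in bytes / 1000
    return mem_clock_mhz * 2 * (bus_width_bits // 8) / 1000

print(bandwidth_gb_s(1200, 256))  # GTX 480M: 76.8
print(bandwidth_gb_s(1250, 192))  # GTX 470M/460M: 60.0
```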

We eliminated several rows of supported features, which we'll summarize here: all of the 400M GPUs, from the lowly 415M up to the top 480M, include support for DX11, OpenGL 4.0, PhysX, Optimus, CUDA, DirectCompute, OpenCL, H.264/VC-1/MPEG-2 1080p video decoding, and full spec Blu-ray decode. They also support the HDMI 1.4a spec, so hopefully that means all the new cards will include 1.4a ports; now we just need 1.4a HDMI displays to go along with the GPUs.

The more interesting spec is the number of CUDA cores in the various models, which lets us guess at the base chip. (Update: NVIDIA also included images of the chips, though it looks like they used the same image for many chips and just changed the logo via Photoshop, so we have a pretty good idea of what's going on. We have updated the tables after looking at the images, as one reader suggested we do.) We already know the 480M uses a harvested GF100. The GF104 was introduced on the desktop with the GTX 460, and it contains up to 384 CUDA cores, which means the 480M could potentially switch to GF104 as well. For now, the 470M will use GF104, and perhaps a future revision of the 480M will make the same switch. In the past, NVIDIA has chopped off about half of their halo product for the next level of GPUs, then half of that again for the lower midrange parts, and finally a third or a quarter of that for the entry-level parts. Thus, GT200 had up to 240 cores, GT215 had 96, GT216 48, and GT218 came with a lowly 16 cores. Right now, it looks like we don't have that final cut yet, so perhaps we'll see a G 410M at some point in the future.

The good news is that with 400M, we get roughly twice as many cores at every level compared to the previous generation 200M/300M parts, but typically slightly lower clocks. Theoretical computational power is nearly double, but the catch is that our testing of the desktop GTX 480 suggests that clock-for-clock, GF100 cores aren't as potent as GT200 cores. So looking at clocks and core counts, GTX 480 has 90% more computational power available relative to GTX 285, but in actual games it's more like 50% faster—though memory bandwidth and other areas also come into play. Even with that said, here's how things break down in the various performance segments.
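For the curious, the theoretical comparison here is just cores multiplied by shader clock; a minimal sketch, assuming the stock desktop shader clocks (1401MHz for the GTX 480, 1476MHz for the GTX 285):

```python
def relative_throughput(cores_new, clock_new, cores_old, clock_old):
    # Ratio of cores * shader clock; ignores per-core (clock-for-clock)
    # differences, which is exactly why GF100 underdelivers vs. this number.
    return (cores_new * clock_new) / (cores_old * clock_old)

# GTX 480 vs. GTX 285: ~1.9x on paper, closer to 1.5x in actual games
print(relative_throughput(480, 1401, 240, 1476))
```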

At the very top, we've gone from the 285M with 128 cores at 1500MHz to the 480M with 352 cores at 850MHz. That represents a computational power increase of about 55%, but memory bandwidth is much closer: only 18% higher. In our testing the 480M beat the 285M by around 20%, so raw computational power likely isn't the bottleneck; memory bandwidth appears to be playing the major role. What we'd like to see is a shift to the smaller (and presumably less power hungry) GF104 while keeping the same specs, but perhaps that's not possible. Either way, the 480M is the mobile performance champion, but with a 100W TDP it's also very hot and will only be found in larger notebooks.

The next step down gives us the 470M, which replaces the GTX 260M. The 260M had a TDP of around 55W (75W max, but that figure was more for the 285M), so presumably the 470M will target a similar power envelope. Core count goes from up to 112 at 1375MHz to 288 at 1070MHz, an increase of 100%. As we saw with the 285M and 480M, however, memory bandwidth may be the bigger factor; here the 260M and 470M are essentially equal (60.8GB/s vs. 60GB/s), so it will be interesting to see how performance plays out. It's also quite possible that future games will stress shaders more than memory bandwidth and thus show greater performance improvements.

The 460M replaces the GTS 360M and GTS 350M, neither of which saw much use in notebooks. (We'll actually look at our first GTS 350M notebook in the near future, just in time for its replacement to arrive.) The GTS 360M has 96 cores at 1325MHz with 57.6GB/s of bandwidth; the GTS 350M has slightly lower shader and RAM clocks. The new 460M checks in with 192 cores at 1350MHz and slightly more memory bandwidth. Again, computationally we're looking at roughly double the performance potential. If TDP is similar to the outgoing parts, we're also looking at around 40W for the 460M.
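All three generational comparisons boil down to the same cores × clock arithmetic; a quick script using the shader clocks quoted above reproduces the approximate gains:

```python
# (cores, shader clock in MHz) for each new/old pairing from the text
pairs = {
    "GTX 480M vs. GTX 285M": ((352, 850), (128, 1500)),
    "GTX 470M vs. GTX 260M": ((288, 1070), (112, 1375)),
    "GTX 460M vs. GTS 360M": ((192, 1350), (96, 1325)),
}

for name, ((c_new, f_new), (c_old, f_old)) in pairs.items():
    gain = (c_new * f_new) / (c_old * f_old) - 1
    print(f"{name}: +{gain:.0%}")
```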

Performance and Mainstream 400M

39 Comments


  • sotoa - Friday, September 03, 2010 - link

    Great summary of mobile GPUs. I needed this.
    I'm on the edge of my seat waiting for more details on the GF104. Hopefully they push these out quickly for the Xmas holidays. Too late for the school season.
  • therealnickdanger - Friday, September 03, 2010 - link

    3-D glasses on a laptop seem pointless. Laptops are PERFECT candidates for autostereoscopic displays. Single viewer/user environment with easily adjustable display angles. Once this shutter glasses fad is over, let me know.
  • synaesthetic - Friday, September 03, 2010 - link

    Once this 3D fad in general ends, let me know.

    It's just giving laptop makers even *more* excuses to put 1366x768 resolution displays on expensive, high-powered laptops.

    It is very, very sad.

    All 15" laptops should have 1920x1080 LCD options. I don't care if I have to pay more for it. I just want the option, damn it.
  • manno - Thursday, September 23, 2010 - link

    "Once this 3D fad in general ends, let me know.

    It's just giving laptop makers even *more* excuses to put 1366x768 resolution displays on expensive, high-powered laptops.

    It is very, very sad.

    All 15" laptops should have 1920x1080 LCD options. I don't care if I have to pay more for it. I just want the option, damn it. "

    Elect this man president! I keep seeing deals on awesomely specced laptops that all have 1366x768 displays... Why pair that with a 5650M? It's so annoying. I just want a:
    15" 1920x1080
    Core i5 520M or a dual-core AMD @ 2.8GHz+ (Virtualization)
    DX11 ATI 5650 or NVIDIA GT 435M
    SWITCHABLE GRAPHICS

    Questions to laptop designers:
    Why do all the high-specced AMD laptops come with quad cores only?
    Why do you hate full HD screens?
    What use is a 16"+ screen @ 1366x768?
    Why don't you offer full HD screens?
    Why ship a laptop with a Core i5/i3 and discrete graphics and not make them switchable? The PM55 chipset is a frustration.
    Why don't you offer full HD screens?
    Why don't you offer full HD screens?
    Why don't you offer full HD screens?
    Why don't you offer full HD screens?
    Why don't you offer full HD screens?
    Why don't you offer full HD screens?
    Why don't you offer full HD screens?
    Why don't you offer full HD screens?

    WhY wHy whY WHY!?

    :)

    Peace!
    -manno
  • mczak - Friday, September 03, 2010 - link

    it's correct in the table but wrong in the text (along with the calculated percentage increase over the GTX 285M).
    Interestingly, as far as peak compute power is concerned, this actually makes the GTX 470M faster than the GTX 480M.
    Also, a GF104-based GTX 480M wouldn't be possible, at least not with 352 cores; it would have to be either 336 or 384 (though possibly the GTX 470M is already close in performance anyway - see above).
    The GTX 460M is likely based on GF106, not GF104.
  • mczak - Friday, September 03, 2010 - link

    Oh, and the GT 445M should be based solely on GF106. All parts below that should be GF108, if rumours are to be believed.
  • JarredWalton - Friday, September 03, 2010 - link

    I've updated the tables based on images NVIDIA sent showing the chips. However, these "images" appear to be the same three chips used multiple times. Given that the GT 445M lists two wildly different specs -- 128-bit 25.6GB/s and 192-bit 60GB/s -- I'm guessing they just disable 64 bits of the memory controller. Still stinks though.
  • Infomastr - Friday, September 03, 2010 - link

    Wonder how long it will take before these really start replacing 300M cards in current machines?
  • JarredWalton - Friday, September 03, 2010 - link

    Some models with 400M should appear in a couple of weeks. Honestly, I expect an initial price premium, but maybe I'll be pleasantly surprised. It's very difficult to recommend anything with a 300M now, though, unless it comes at a reduced price.
  • Zorblack1 - Friday, September 03, 2010 - link

    Since you guys raved about the Asus U30JC I went and bought one. Now I'm thinking I want to update my laptop. How about something where you stick the GT 415M into the U30JC?
