Getting Technical with Next Generation ION

Obviously, there are some changes to the core ION design: the previous version was a chipset with integrated graphics, while the new version only has to worry about GPU duties. The manufacturing process has also shifted from 55nm for the first generation to 40nm, which shrinks the package from 35x35mm down to 23x23mm. Clock speeds aside, what we're looking at is essentially a netbook/nettop version of the GeForce G 210M (with half the SPs on certain models; see below), with full DirectX 10.1 support. System memory will continue to be DDR2, accessed through the Intel NM10 chipset.

Since NG-ION is now a discrete GPU, it also comes with up to 512MB of dedicated DDR3 memory. This alone should provide better performance than the old version, as there's now more bandwidth to go around. NVIDIA isn't disclosing clock speeds yet, but given the process shrink we would expect moderately higher clocks. That probably won't matter much for complex games, as the Atom CPU will continue to be a bottleneck, but it will help with some games and CUDA applications. NVIDIA says that in general NG-ION will be 50 to 100% faster than the original ION.

Like the previous iteration, ION will come in two versions. Nettop IONs will come with 16 SPs (aka CUDA Cores), while netbooks will come with either 16 SPs or 8 SPs. The 8 SP version is designed specifically to fit within the thermal constraints of a 10.1" chassis, so we'll actually see 10" netbooks with ION this time around. Fewer CUDA Cores may impact gaming performance (depending on the title) and CUDA applications, but NVIDIA says Blu-ray playback will continue to work, so by extension HD video decoding won't be a problem.

The one aspect of the technology that NVIDIA wouldn't discuss extensively is how Optimus works within the bandwidth constraints imposed by the NM10 chipset. To clarify, NM10 provides just four PCI Express 1.1 lanes, and while a manufacturer could choose to use all of those for the ION GPU, NVIDIA says most will use a single link, leaving the other lanes open for additional devices. PCIe 1.1 provides 250MB/s of bandwidth in each direction per lane, so Optimus will have to work within that constraint. Optimus copies content from the GPU memory directly to the system memory, to avoid any issues with flickering displays, but that means NG-ION will need to transmit a lot of data over a relatively narrow link.

Driving a low resolution 1024x600 LCD at 60FPS isn't a problem, but even at 1366x768 we are at the limits of x1 bandwidth (such a display would require 251.78MB/s to be exact). NVIDIA says that NG-ION won't have any trouble driving up to 1080p displays, so naturally we were curious how they manage to get 1920x1080x32-bit @ 60FPS (497.66MB/s) over a 250MB/s link. For now, unfortunately, they aren't saying, other than that there's some "intelligent work" going on in the drivers. Real-time compression is one option, and if you transmit only the meaningful 24 bits of color you save 25% of the total bandwidth. All we know is that NVIDIA says it works, that the content isn't compressed (presumably that means no lossy compression, as clearly there has to be some form of data reduction happening), and that they're not overclocking the PCIe bus. In short, Optimus has some "special sauce" that NVIDIA doesn't want to disclose (yet?).
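The arithmetic above is easy to sanity check. This is just the raw uncompressed framebuffer math (the helper name is ours, for illustration; nothing here reflects NVIDIA's actual implementation):

```python
# Bytes/s needed to push uncompressed frames over a PCIe 1.1 x1 link,
# which offers roughly 250 MB/s in each direction.

PCIE_1_1_X1_MBPS = 250.0  # MB/s, one direction

def display_bandwidth_mb(width, height, bits_per_pixel, fps):
    """Raw framebuffer traffic in MB/s (1 MB = 10**6 bytes)."""
    return width * height * (bits_per_pixel / 8) * fps / 1e6

for w, h, bpp, fps in [(1024, 600, 32, 60),
                       (1366, 768, 32, 60),
                       (1366, 768, 24, 60),   # drop the unused alpha byte
                       (1920, 1080, 32, 60)]:
    mb = display_bandwidth_mb(w, h, bpp, fps)
    fits = "fits" if mb <= PCIE_1_1_X1_MBPS else "exceeds"
    print(f"{w}x{h}x{bpp} @ {fps}FPS: {mb:.2f} MB/s ({fits} x1 link)")
```

This reproduces the figures quoted above: 1366x768x32 @ 60FPS comes to 251.78MB/s (just over the limit), 1080p to 497.66MB/s, while the 24-bit variant of 1366x768 @ 60FPS drops to about 188.84MB/s and fits.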

Update: The answer was staring me in the face, but I missed it. NVIDIA was kind enough to provide the remaining information. If you read below, HDMI requires the GPU, since the Atom IGP doesn't support HDMI output. That solves any question of 1080p support, since the only thing going over the x1 link at that point is the compressed video stream. For content running on the laptop LCD, the maximum resolution supported by the Atom LVDS controller is 1366x768, so NVIDIA can do two things. First, any content at a higher resolution (e.g. 1080p Blu-ray) can be scaled down to 1366x768 before going over the x1 PCIe bus. Second, most videos are at 30p or 24p, so 1366x768x32-bit @ 30FPS is well within the bandwidth constraints of an x1 link (125.89MB/s). For 1366x768 @ 60FPS, rendering internally at 32-bit (with the Alpha/Z/Bump channel) and then transmitting the final 24 bits of color data still seems like a good solution, but NVIDIA didn't clarify whether they're doing anything special for that case or if they're just saturating the x1 link.
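A quick check of the update's two figures (again, the helper name is illustrative, not anything from NVIDIA):

```python
# Back-of-the-envelope check of the update's numbers: raw framebuffer
# traffic for the two LVDS scenarios discussed above.

def framebuffer_mb_per_s(width, height, bits_per_pixel, fps):
    """Uncompressed frame traffic in MB/s (1 MB = 10**6 bytes)."""
    return width * height * (bits_per_pixel / 8) * fps / 1e6

# 24p/30p video scaled to the panel's 1366x768 maximum:
# ~125.89 MB/s, comfortably inside a 250MB/s x1 link.
print(framebuffer_mb_per_s(1366, 768, 32, 30))

# 60FPS at 1366x768 sending only the final 24 bits of color:
# ~188.84 MB/s, which also fits.
print(framebuffer_mb_per_s(1366, 768, 24, 60))
```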

Speaking of Optimus, that also means that NG-ION netbooks are limited to Win7 systems. This won't be a concern for most users, but know in advance that installing Linux on such a netbook would mean the loss of your GPU functionality (short of some enterprising individuals figuring out how to do Optimus on their own, which I wouldn't put past the Linux community given time). NVIDIA informs us that there are currently no plans for Optimus on desktops or on other OSes.

One item that created a bit of confusion for us was the Windows 7 Starter requirement. NVIDIA is able to handle the Aero UI with ION, obviously, but what does that mean for Optimus? If GMA 3150 couldn't run Aero on its own, Optimus would have to constantly power up the GPU to handle Aero effects, which would be counterproductive. As far as we can tell, the real issue with Windows 7 Starter comes down to manufacturers wanting to save money by going with Starter instead of installing Home Premium. GMA 3150 can handle Aero (albeit barely), so Optimus doesn't need to worry about using the GPU for standard Windows applications with Pineview CPUs like the N450. Whether all ION netbooks will require Home Premium isn't clear, but we see no reason to pair such a netbook with Win7 Starter.

Finally, because of the IGP restrictions on the GMA 3150, NG-ION will require the GPU to be active any time an HDMI connection is used. GMA 3150 doesn't support HDMI at all, so the HDMI connection will run solely off the ION GPU. Considering HDMI means you're tethered to an external display, this shouldn't be a problem, as you can easily plug in the netbook. Additionally, GMA 3150/NM10 has a resolution limit of 1366x768 for LVDS and 1400x1050 for VGA, so higher resolutions will need ION, though again they will only be available with an external display. The VGA connection is supposed to come from the IGP, so the resolution limit for VGA will remain in effect.

34 Comments


  • teohhanhui - Tuesday, March 2, 2010 - link

    So they have a good tech and they're saying desktop users can't get it? :(
  • beginner99 - Tuesday, March 2, 2010 - link

    Yep, strange. One would assume that if it's only in the drivers there is no additional cost to also make it available for desktops.
    Which means this Optimus thing must have some kind of downside (additional hardware, slower performance, worse image quality?). Nothing is for free...

    Optimus would also be nice for desktops with the new i3/i5 dual-cores. You could have decent gaming performance and low power usage when you are not gaming.

  • ChuckDriver - Tuesday, March 2, 2010 - link

    Nvidia probably doesn't want to allocate resources to validate a feature on a platform where it would add little value. I think Optimus GPUs also have a region of silicon called the "Copy Engine" that copies the contents of the Nvidia framebuffer to the Intel framebuffer. Nvidia might not include that on the desktop GPUs or disable it in the BIOS if present. These are my opinions, I don't have any documentation to back them up.
  • JarredWalton - Tuesday, March 2, 2010 - link

    To my knowledge, all 40nm G200/G300 parts have the Copy Engine... but it may not be on desktop chips. Anyway, NVIDIA's statements to me indicate that they just don't see it as critical on the desktop. If you can idle at around 20W, and you can use the GPU for other tasks, desktops may as well keep the GPU live at all times. (And if you're running a lower end GPU, idle power is probably under 10W.) Also, you would need to have all of the video output functions come off the IGP, and there are a lot of IGP motherboards where you are limited. How many would fully support HDMI with Optimus at 1080p? I don't know for sure.

    I still think they'll release Optimus for desktops at some point, but they don't want to spill the beans beforehand. It will probably be limited, i.e. something like "Core i3/i5 and later IGP required" to reduce the amount of validation. Honestly, though, until the notebook drivers are in lock step with the desktop drivers and all of the various bugs are worked out, Optimus can remain a mobile-only solution.
  • Penti - Tuesday, March 2, 2010 - link

    Nothing is stopping anyone from using mobile parts on desktops, though.

    PS. Sorry for accidentally hitting the report post link =P
  • ltcommanderdata - Tuesday, March 2, 2010 - link

    NVIDIA informs us that there are currently no plans for Optimus on desktops or on other OSes.

    With nVidia so adamant about Optimus not coming to other OSs can we imply that Optimus won't be coming to the next MacBook Pro refresh as rumoured and that this new 40nm Ion won't serve as a replacement for the 9400M in Apple computers?

    Any word on the TDP of this new Ion? I'm guessing it'll have to be quite a bit lower than other low-end discrete nVidia GPUs like the 305M/310M to make it worthwhile.

    In terms of how the new Ion is achieving enough PCIe bandwidth, could nVidia be implementing their PCIe link such that they can gang the transmit and receive pairs? I'm assuming peak bandwidth is mainly needed for uplink back to the chipset, so ganging the differential pair together can double bandwidth to the required 500MB/s.
  • AmdInside - Tuesday, March 2, 2010 - link

    Given that Apple has used their own Hybrid technology on their Macbooks so far, I am not the least bit surprised they were not interested in Optimus.

    I personally was at first turned off by Optimus because I thought the display engine had the same limitations of the Intel GPU (I want 1080p HDMI output from a netbook) but I see this is not the case. Thank goodness. I also see there will be a 10" ION 2 which is what I've longed for since the first ION was introduced. Finally a netbook that I can carry everywhere I go including the gym and use like a portable video player. It's March. Where can I buy the Acer Aspire One 532G?
  • JarredWalton - Tuesday, March 2, 2010 - link

    NVIDIA tells us mid to late March for these systems to show up at retail.
  • yyrkoon - Tuesday, March 2, 2010 - link

    There is a lot to think about here. But when thinking about it, you're still forced to realize this is still going to be an Atom platform. A platform that will no doubt be overpriced (maybe even more than an entry-level laptop), use barely less power than an entry-level laptop, and provide far less performance.

    I am seeing a pattern here, one that I have seen emerge in the past, when other companies ( some even far larger ) went belly up, or lost a huge portion of the PC/Portable computer market.

    They have to have the know-how, and they definitely have the backing. Is there something wrong with mixing this technology with an ARM processor, or just making a new class of netbook that uses other low-powered mobile processors? Be it in a netbook or a nettop? Oh right. The biggest gaming OS would be Windows . . . and they expect that of an Atom CPU no less. . . Yeah right.

    Yeah, I do not know. They are either mired in legality issues, or their creative side is no longer very creative. Who knows. Maybe some day they'll wise up and see the bigger picture.
  • JarredWalton - Tuesday, March 2, 2010 - link

    NVIDIA is still pushing the idea of moving more work to the GPU side of things and taking away from what the CPU needs to handle. This obviously works very well for certain tasks (e.g. video decoding, encoding, etc.) but doesn't help in other areas.

    But, remember when a 1.0GHz Pentium 3 was super fast? Atom is still a step up from there, so with the correct software solutions Atom + ION is viable for a lot of things. Running standard Windows games with no extra work done on optimization? Not so much.

    As for Apple, even when they did switchable graphics you had to log off and log back in, and the "high-end" graphics in MacBook Pro is a rather anemic 9600M. Not that NVIDIA has had much better without moving to power hungry parts, but 32 SPs is nothing to write home about. I always thought it was odd that MBP had 9400M and 9600M... sure, it's twice as fast, but still a far cry from modern desktops.

    Anyway, if NVIDIA ever does port Optimus to OS X (which despite their statements to the contrary seems like it will happen at some point), Linux would probably not be too far behind.
