The netbook market has exploded since its introduction; there are over 40 million netbooks out in the wild today, which is tremendous growth for what is essentially a new class of PC. With such a large number of mobile devices, it's only natural for companies like NVIDIA to look for ways to get a slice of the netbook pie. As a graphics company, NVIDIA is quick to point out how poorly Intel's IGPs perform, and the GMA 950 paired with Atom netbooks is particularly slow. As an alternative to Atom+GMA 950, NVIDIA created the ION platform, which would provide dramatically improved graphics along with HD video decoding.

The first such implementation combined the GeForce 9400M chipset with an Atom N270/N280 for netbooks, or an Atom 330 for a nettop. A single-core Atom CPU is just barely able to handle 720p H.264 decoding on its own (with the CoreAVC codec—other less optimized codecs would still drop frames). 1080p support? Fahgeddaboutit! NVIDIA's ION nettops provided the necessary hardware to make a tiny HTPC box capable of handling Blu-ray playback, and the CPU and chipset are efficient enough that passive cooling isn't a problem.

On the netbook side, ION was a tougher sell. 1080p support is a nice bullet feature, but when most netbooks have 1024x600 LCDs, does HD support really matter? Plus, you would need an external USB Blu-ray drive to make it work. There's still gaming, and with Flash 10.1 (now at Beta 3) acceleration you can certainly make the argument that an ION netbook provides a superior user experience compared to stock Atom netbooks, but the caveats don't end there. NVIDIA stated that their chipset power requirements were "competitive" with Intel's chipset, but they appear to be factoring performance into the equation. Our own numbers suggest that a good GMA 950+N280 solution is anywhere from 17% (H.264 decode) to over 35% (Internet surfing) more power efficient than the original ION, using the ASUS 1005HA and the HP Mini 311 as points of reference. So you'd be stuck with less battery life in exchange for more features, at a higher price as well.
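To be clear about what "X% more power efficient" means in those comparisons, here's a minimal sketch of the arithmetic. The wattage figures below are hypothetical placeholders chosen only to illustrate the math (our actual percentages come from measured battery life, not these numbers), and the `efficiency_advantage` helper is purely illustrative.

```python
# Hypothetical average power draws in watts, for illustration only;
# the 17%/35% figures in the article come from measured battery life,
# not from these made-up numbers.
gma950_n280_watts = 6.5  # assumed draw, ASUS 1005HA-class netbook while surfing
ion_watts = 8.8          # assumed draw, HP Mini 311-class netbook while surfing

def efficiency_advantage(low_watts: float, high_watts: float) -> float:
    """Percent more power efficient the lower-draw system is:
    the extra power the other system burns, relative to the baseline."""
    return (high_watts - low_watts) / low_watts * 100

delta = efficiency_advantage(gma950_n280_watts, ion_watts)
print(f"GMA 950 + N280 is about {delta:.0f}% more power efficient")
```

With these placeholder figures the delta works out to roughly 35%, which is the same kind of gap our Internet surfing tests showed.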

Things got quite a bit more complicated with the release of the Pine Trail platform and Pineview processors. Besides the fact that Pine Trail is even more power efficient (up to 70% more battery life relative to ION in Internet testing), Intel moved their IGP solution into the CPU package and eliminated the old FSB link. The Atom N450 links to the NM10 chipset with a proprietary DMI connection and NVIDIA doesn't make—and legally can't make—a compatible chipset, so using a non-Intel chipset with N450 simply isn't an option. The problem with Pine Trail is that HD video decoding remains difficult, unless you add a separate decoder chip, and gaming and other aspects of the user experience are still lackluster—N450 netbooks typically make do with Windows 7 Starter. Lucky for NVIDIA, they have some new technology called Optimus that makes all of this a moot point.

If you've got any math skills, you've probably already put two and two together to figure out what NVIDIA is announcing today. The Next Generation ION (NG-ION) platform consists of a Pineview netbook with a discrete graphics chip from NVIDIA, with Optimus allowing the GPU to switch on/off as needed. Note that there is no Optimus technology for nettop solutions, which will simply use an NVIDIA discrete GPU all the time. On a nettop that's always plugged in, NG-ION might use ~3W more power at idle, but that's not enough to worry about. There's also a benefit to just keeping things simple by using a standard discrete GPU.

Simplifying NG-ION like we just did is great for the layman, but there are plenty of other technical aspects that make things a bit more interesting. We don't have hardware for testing yet, so all we can pass along is NVIDIA's own performance information, but the numbers make sense as we'll see in a moment. We'll also discuss some of the implementation-specific details, expected availability, etc.

Comments

  • teohhanhui - Tuesday, March 2, 2010 - link

    So they have a good tech and they're saying desktop users can't get it? :(
  • beginner99 - Tuesday, March 2, 2010 - link

    yep, strange. One would assume that if it's only in the drivers, there's no additional cost to also make it available for desktops.
    Meaning it looks like this Optimus thing must have some kind of downside (additional hardware, slower performance, worse image quality?). Nothing is for free...

    Optimus would also be nice for desktops with the new i3/i5 dual-cores. You could have a decent gaming performance and low power usage if you are not gaming.

  • ChuckDriver - Tuesday, March 2, 2010 - link

    Nvidia probably doesn't want to allocate resources to validate a feature on a platform where it would add little value. I think Optimus GPUs also have a region of silicon called the "Copy Engine" that copies the contents of the Nvidia framebuffer to the Intel framebuffer. Nvidia might not include that on the desktop GPUs or disable it in the BIOS if present. These are my opinions, I don't have any documentation to back them up.
  • JarredWalton - Tuesday, March 2, 2010 - link

    To my knowledge, all 40nm G200/G300 parts have the Copy Engine... but it may not be on desktop chips. Anyway, NVIDIA's statements to me indicate that they just don't see it as critical on the desktop. If you can idle at around 20W, and you can use the GPU for other tasks, desktops may as well keep the GPU live at all times. (And if you're running a lower end GPU, idle power is probably under 10W.) Also, you would need to have all of the video output functions come off the IGP, and there are a lot of IGP motherboards where you are limited. How many would fully support HDMI with Optimus at 1080p? I don't know for sure.

    I still think they'll release Optimus for desktops at some point, but they don't want to spill the beans beforehand. It will probably be limited, i.e. something like "Core i3/i5 and later IGP required" to reduce the amount of validation. Honestly, though, until the notebook drivers are in lock step with the desktop drivers and all of the various bugs are worked out, Optimus can remain a mobile-only solution.
  • Penti - Tuesday, March 2, 2010 - link

    Nothing is stopping anyone from using mobile parts on desktops though.

    PS. Sorry for accidentally hitting the report post link =P
  • ltcommanderdata - Tuesday, March 2, 2010 - link

    NVIDIA informs us that there are currently no plans for Optimus on desktops or on other OSes.

    With nVidia so adamant about Optimus not coming to other OSs can we imply that Optimus won't be coming to the next MacBook Pro refresh as rumoured and that this new 40nm Ion won't serve as a replacement for the 9400M in Apple computers?

    Any word on the TDP of this new Ion? I'm guessing it'll have to be quite a bit lower than other low-end discrete nVidia GPUs like the 305M/310M to make it worthwhile.

    In terms of how the new Ion is achieving enough PCIe bandwidth, could nVidia be implementing their PCIe link such that they can gang the transmit and receive pairs? I'm assuming peak bandwidth is mainly needed for uplink back to the chipset, so ganging the differential pair together can double bandwidth to the required 500MB/s.
  • AmdInside - Tuesday, March 2, 2010 - link

    Given that Apple has used their own Hybrid technology on their Macbooks so far, I am not the least bit surprised they were not interested in Optimus.

    I personally was at first turned off by Optimus because I thought the display engine had the same limitations of the Intel GPU (I want 1080p HDMI output from a netbook) but I see this is not the case. Thank goodness. I also see there will be a 10" ION 2 which is what I've longed for since the first ION was introduced. Finally a netbook that I can carry everywhere I go including the gym and use like a portable video player. It's March. Where can I buy the Acer Aspire One 532G?
  • JarredWalton - Tuesday, March 2, 2010 - link

    NVIDIA tells us mid to late March for these systems to show up at retail.
  • yyrkoon - Tuesday, March 2, 2010 - link

    There is a lot to think about here. But when thinking it over, you're still forced to realize that this is still going to be an Atom platform. A platform that will no doubt be overpriced (maybe even more than an entry-level laptop), use barely less power than an entry-level laptop, and provide far less performance.

    I am seeing a pattern here, one that I have seen emerge in the past, when other companies ( some even far larger ) went belly up, or lost a huge portion of the PC/Portable computer market.

    They have to have the know-how, and they definitely have the backing. Is there something wrong with mixing this technology with an ARM processor, or just making a new class of netbook that uses other low-powered mobile processors? Be it in a netbook or a nettop? Oh right. The biggest gaming OS would be Windows... and they expect that of an Atom CPU, no less... Yeah right.

    Yeah, I do not know. They are either mired in legality issues, or their creative side is no longer very creative. Who knows. Maybe some day they'll wise up and see the bigger picture.
  • JarredWalton - Tuesday, March 2, 2010 - link

    NVIDIA is still pushing the idea of moving more work to the GPU side of things and taking away from what the CPU needs to handle. This obviously works very well for certain tasks (e.g. video decoding, encoding, etc.) but doesn't help in other areas.

    But, remember when a 1.0GHz Pentium 3 was super fast? Atom is still a step up from there, so with the correct software solutions Atom + ION is viable for a lot of things. Running standard Windows games with no extra work done on optimization? Not so much.

    As for Apple, even when they did switchable graphics you had to log off and log back in, and the "high-end" graphics in MacBook Pro is a rather anemic 9600M. Not that NVIDIA has had much better without moving to power hungry parts, but 32 SPs is nothing to write home about. I always thought it was odd that MBP had 9400M and 9600M... sure, it's twice as fast, but still a far cry from modern desktops.

    Anyway, if NVIDIA ever does port Optimus to OS X (which despite their statements to the contrary seems like it will happen at some point), Linux would probably not be too far behind.
