34 Comments


  • synaesthetic - Friday, March 05, 2010 - link

    I thought the major difference nvidia pushed for ION2 was that it would be available on non-Atom platforms (like maybe Intel CULV)...

    I guess not. Unimpressive.
    Reply
  • wicko - Wednesday, March 03, 2010 - link

    I have yet to see the Ion platform really take off here in Canada. At least in netbooks anyway. Most of them come with Intel's GMA series, the only one I've seen with an Ion is an Asus Eee (and an Ion LE in an HP netbook).

    I'd be more excited if I could buy the actual product :(
    Reply
  • CZroe - Wednesday, March 03, 2010 - link

    I don't know why this article speaks as if it will soon be available when it has already been available. For example, many already own the Acer Aspire one 532G. Just look at the AspireOneUser.com forums. Reply
  • JarredWalton - Wednesday, March 03, 2010 - link

    I'm not sure where people are buying the 532G... I can find the 532H all over the place, but not the 532G. Can you link me to a thread? Because I don't see any on the main pages; just an Engadget blog. Reply
  • anandtech02148 - Tuesday, March 02, 2010 - link

    NVIDIA is going up against Microsoft & Intel, who (except for the Zune HD)
    hold the netbook standard; it's Wintel all over again.
    Maybe Apple should use that $40 billion and buy out NVIDIA;
    it makes sense since Apple is building its own SoC.
    Reply
  • Penti - Tuesday, March 02, 2010 - link

    A good Broadcom Crystal HD alternative then, just as I thought: a GPU connected on the PCI-E bus. The Copy Engine is of course new tech that wasn't around when I speculated, but if it's just going to be used for DXVA then it's pretty moot in the netbook space. In the HTPC space there are other chips that can be used anyway, as you won't need the copy engine there. But certainly a nice addition making 1080p HDMI possible. Reply
  • darkhawkff - Tuesday, March 02, 2010 - link

    Being an HP 311 owner who is extremely happy with the little machine, I fail to see any real value added by either the new Ion or the new Atom CPUs. Granted, the old Ion wasn't exactly a powerhouse, but for what it was, it was great. Given the simple fact that (at least on my HP 311) I was able to attain a stable 2.2 GHz overclock on the Atom N280 processor, it was possible to (reasonably) play semi-recent games even on it. Testing has shown that in the >2.0 GHz range, the GPU becomes the bottleneck, not the CPU (check out www.myhpmini.com for results). I even use it to play WoW while I watch a movie or TV. Would I be able to do a 10-man or 25-man raid on it? No, but for questing and doing 5-mans it works quite well. Add in the fact that all the HP 311's hardware (outside of wireless LAN) is supported by OS X, making it the perfect mini Mac, and how can you really go wrong?

    Nvidia is just trying to continue to stay in the Netbook market, and I can't blame them. But I really see very little value added compared to current offerings. Yes, the video card is probably a little bit better, but will a GT210 really be that much better than even a 9400 at this point? I would wager no.
    Reply
  • JarredWalton - Tuesday, March 02, 2010 - link

    My own testing suggests that CULV at 1.3GHz is still a bottleneck for G210M, which is around 60% faster than 9400M because of the dedicated RAM and higher clocks. I doubt that even Atom at 2.2GHz is going to make the NG-ION the bottleneck. As for the original ION, it probably had as much of a bottleneck from the shared memory bandwidth. (Also: few people would recommend running the HP Mini 311 at 2.2GHz; 2.0GHz is where I'd draw the line, and an extra 10% in CPU clock with a much greater risk of system damage is pretty extreme.)

    So take ION, get rid of battery life limitations, and make it 50 to 100% faster. Yes, the new ION is clearly better than the old version in quite a few ways. Is it universally better? No, because Optimus has a few minor drawbacks: Win7 required--but we would all run that regardless, right?--and there's simplicity in not having to transfer data between GPU RAM and system RAM.

    Finally, comparing an overclocked HP Mini 311 to what you would get from a stock NG-ION is setting up a straw man. Of course an overclocked ION/Atom is going to be faster (or at least equal), but you'll also be able to overclock NG-ION. We'll see how it all works out when we get actual hardware.
    Reply
  • jimhsu - Tuesday, March 02, 2010 - link

    Assuming I read your post right, I also agree that the Atom remains the bottleneck... ION and ION 2 remain fundamentally CPU limited. Is there anything other than cost preventing the complete abandonment of Atom and its replacement with CULV? Reply
  • jimhsu - Tuesday, March 02, 2010 - link

    I commented on that mostly because I've personally used netbooks with Atom and find the performance completely unacceptable, even comparing with 5 year old desktops. When "simple" tasks like launching Word or Firefox become CPU limited operations (CPU pegged at 100%), it brings back unpleasant memories from the last decade. Resuming from sleep takes half a minute for the computer to be usable again. Etc, etc. Reply
  • erple2 - Tuesday, March 02, 2010 - link

    To be fair, launching FireFox on my Ubuntu 9.10 desktop (a P4 2.53 GHz, 768MB computer with a 9800Pro gfx card) also pegs the CPU at 100%... Maybe that desktop CPU is more than 5 years old? Reply
  • JarredWalton - Tuesday, March 02, 2010 - link

    The only thing preventing people from using CULV instead of Atom is cost, and as you may have guessed from my last article on the subject (CULV Roundup: Who Needs Atom?) I'm all for skipping Atom and getting CULV instead. If you need graphics, then you go with CULV+G210M with stuff like the ASUS UL series. I also find Atom to be painfully slow, but I suppose I'm more demanding of my PC than a lot of users. Reply
  • damianrobertjones - Tuesday, March 02, 2010 - link

    According to Apple, the netbook is set to fail? Then we get a graph stating that a high number of people actually use and like netbooks?

    Should we believe the trend or listen to a guy that will sell you ANYTHING at a higher than normal price?

    Reply
  • kevinqian - Tuesday, March 02, 2010 - link

    There seems to be an influx of new Core i3/i5 laptops coming in at around the $500-600 mark. Granted, they have 15.6" and larger LCDs. At that price, they essentially price out CULV laptops selling for similar money. So $400 gets you an ION netbook and $500 gets you a full-blown Core i3 laptop. Did someone at Intel screw up their market segmentation? Reply
  • Lonyo - Tuesday, March 02, 2010 - link

    How did Intel screw up market segmentation?
    NV are the ones screwing up by adding cost to what should be an inexpensive platform.
    $400 for an ION netbook isn't $400 for a netbook, and it has nothing to do with Intel. In fact, the less ION the better for them.
    Reply
  • JarredWalton - Tuesday, March 02, 2010 - link

    Atom = $300 to $400 with long battery life and low performance

    CULV = $500 to $600 with long battery life and three times the performance of Atom

    Core i3/i5 = $500 to $1000 (or more) with less than half the battery life of CULV but more than twice the performance.

    When Core i3/i5 CULV arrives, I expect it won't be significantly faster than current CULV... maybe 25% faster (at most 75% in certain scenarios)? So that's the market segmentation Intel is going for, more or less.
    Reply
  • kevinqian - Tuesday, March 02, 2010 - link

    So it comes down to picking your poison at the $500-600 price point. Do you want fast performance with a larger display, or slower performance and more portability? I guess if you want both (performance and portability), you gotta step up to a future MBP or Lenovo T4xxs. Reply
  • Penti - Tuesday, March 02, 2010 - link

    Atom should mostly move back into the embedded space now. It's great for that.

    CULV is great for consumers, especially once Flash gets official hardware acceleration. There's no CULV that can't be accelerated, so no confusion. Atom with Broadcom or ION and Win7 Home Premium will cost $400-450, and customers must make an active choice.
    Reply
  • AmdInside - Tuesday, March 02, 2010 - link

    I think the technology is just too advanced to just throw it out there for all desktops. Look at how long it has taken NVIDIA to get from their first Hybrid to Optimus. Notebooks and OEM systems are more controlled environments. I personally would not want to see it in a desktop because it would just be another feature that can fail to work properly. ATI and NVIDIA have gotten pretty good at reducing power requirements when the GPU is not doing much. I think over time they will get even better, especially when you see how little power Tegra consumes. Reply
  • Doormat - Tuesday, March 02, 2010 - link

    One of the rumors about the ION was that Nvidia was recommending OCing the PCIe link in the netbook. An extra 10% gets 275MB/s.

    Also, the whole "no other OSes" line was really depressing.
    Reply
  • teohhanhui - Tuesday, March 02, 2010 - link

    So they have a good tech and they're saying desktop users can't get it? :( Reply
  • beginner99 - Tuesday, March 02, 2010 - link

    Yep, strange. One would assume that if it's only in the drivers, there's no additional cost to also make it available for desktops.
    That suggests this Optimus thing must have some kind of downside (additional hardware, slower performance, worse image quality?). Nothing is free...

    Optimus would also be nice for desktops with the new i3/i5 dual-cores. You could have a decent gaming performance and low power usage if you are not gaming.

    Reply
  • ChuckDriver - Tuesday, March 02, 2010 - link

    Nvidia probably doesn't want to allocate resources to validate a feature on a platform where it would add little value. I think Optimus GPUs also have a region of silicon called the "Copy Engine" that copies the contents of the Nvidia framebuffer to the Intel framebuffer. Nvidia might not include that on the desktop GPUs or disable it in the BIOS if present. These are my opinions, I don't have any documentation to back them up. Reply
  • JarredWalton - Tuesday, March 02, 2010 - link

    To my knowledge, all 40nm G200/G300 parts have the Copy Engine... but it may not be on desktop chips. Anyway, NVIDIA's statements to me indicate that they just don't see it as critical on the desktop. If you can idle at around 20W, and you can use the GPU for other tasks, desktops may as well keep the GPU live at all times. (And if you're running a lower end GPU, idle power is probably under 10W.) Also, you would need to have all of the video output functions come off the IGP, and there are a lot of IGP motherboards where you are limited. How many would fully support HDMI with Optimus at 1080p? I don't know for sure.

    I still think they'll release Optimus for desktops at some point, but they don't want to spill the beans beforehand. It will probably be limited, i.e. something like "Core i3/i5 and later IGP required" to reduce the amount of validation. Honestly, though, until the notebook drivers are in lock step with the desktop drivers and all of the various bugs are worked out, Optimus can remain a mobile-only solution.
    Reply
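[Editor's note: the Copy Engine hand-off the two comments above describe can be sketched in a few lines. This is purely illustrative, modeling it as a bulk blit from the discrete GPU's framebuffer into the IGP's scan-out buffer; the function and variable names here are hypothetical, not NVIDIA's API.]

```python
# Illustrative sketch of the Optimus Copy Engine idea: the dGPU renders the
# frame, then the completed frame is copied into the IGP's framebuffer, which
# is what actually drives the display outputs.
def copy_engine_blit(gpu_framebuffer: bytearray, igp_framebuffer: bytearray) -> None:
    """Copy the dGPU's completed frame into the IGP's scan-out buffer."""
    assert len(igp_framebuffer) >= len(gpu_framebuffer)
    # One bulk copy, analogous to a DMA transfer over the PCIe link.
    igp_framebuffer[:len(gpu_framebuffer)] = gpu_framebuffer

frame = bytearray(b"\x12" * 16)   # pretend 16-byte rendered frame
scanout = bytearray(16)           # buffer the display engine reads from
copy_engine_blit(frame, scanout)
print(scanout == frame)  # True: the IGP now scans out the dGPU's frame
```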
  • Penti - Tuesday, March 02, 2010 - link

    Nothing is stopping anyone from using mobile parts on desktops, though.

    PS. Sorry for accidentally hitting the report post link =P
    Reply
  • ltcommanderdata - Tuesday, March 02, 2010 - link

    NVIDIA informs us that there are currently no plans for Optimus on desktops or on other OSes.

    With nVidia so adamant about Optimus not coming to other OSs can we imply that Optimus won't be coming to the next MacBook Pro refresh as rumoured and that this new 40nm Ion won't serve as a replacement for the 9400M in Apple computers?

    Any word on the TDP of this new Ion? I'm guessing it'll have to be quite a bit lower than other low-end discrete nVidia GPUs like the 305M/310M to make it worthwhile.

    In terms of how the new Ion is achieving enough PCIe bandwidth, could nVidia be implementing their PCIe link such that they can gang the transmit and receive pairs? I'm assuming peak bandwidth is mainly needed for uplink back to the chipset, so ganging the differential pair together can double bandwidth to the required 500MB/s.
    Reply
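[Editor's note: the ganging arithmetic in the last paragraph checks out, assuming the ~250 MB/s effective per-direction rate of a PCIe 1.x x1 lane after 8b/10b encoding. A quick sketch of the numbers, with names that are ours, not NVIDIA's:]

```python
# Effective bandwidth of a PCIe 1.x lane, per direction, after 8b/10b encoding.
GEN1_LANE_MBPS = 250

def ganged_uplink_mbps(lanes: int = 1, pairs_ganged: int = 2) -> int:
    """Bandwidth if the transmit and receive pairs could push one direction."""
    return GEN1_LANE_MBPS * lanes * pairs_ganged

# One lane, both differential pairs ganged the same way:
print(ganged_uplink_mbps())  # 500, the required MB/s figure in the comment
```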
  • AmdInside - Tuesday, March 02, 2010 - link

    Given that Apple has used their own Hybrid technology on their Macbooks so far, I am not the least bit surprised they were not interested in Optimus.

    I personally was at first turned off by Optimus because I thought the display engine had the same limitations of the Intel GPU (I want 1080p HDMI output from a netbook) but I see this is not the case. Thank goodness. I also see there will be a 10" ION 2 which is what I've longed for since the first ION was introduced. Finally a netbook that I can carry everywhere I go including the gym and use like a portable video player. It's March. Where can I buy the Acer Aspire One 532G?
    Reply
  • JarredWalton - Tuesday, March 02, 2010 - link

    NVIDIA tells us mid to late March for these systems to show up at retail. Reply
  • yyrkoon - Tuesday, March 02, 2010 - link

    There is a lot to think about here. But when thinking about it, you're still forced to realize that this is still going to be on an Atom platform. A platform that will no doubt be overpriced (maybe even more than an entry-level laptop), use barely less power than an entry-level laptop, and provide far less performance.

    I am seeing a pattern here, one that I have seen emerge in the past, when other companies ( some even far larger ) went belly up, or lost a huge portion of the PC/Portable computer market.

    They have to have the know-how, and they definitely have the backing. Is there something wrong with mixing this technology with an ARM processor, or making a new class of netbook that uses other low-powered mobile processors, be it in a netbook or a nettop? Oh right. The biggest gaming OS is Windows... and they expect that of an Atom CPU, no less. Yeah, right.

    Yeah, I do not know. They are either mired in legal issues, or their creative side is no longer very creative. Who knows. Maybe someday they'll wise up and see the bigger picture.
    Reply
  • JarredWalton - Tuesday, March 02, 2010 - link

    NVIDIA is still pushing the idea of moving more work to the GPU side of things and taking away from what the CPU needs to handle. This obviously works very well for certain tasks (e.g. video decoding, encoding, etc.) but doesn't help in other areas.

    But, remember when a 1.0GHz Pentium 3 was super fast? Atom is still a step up from there, so with the correct software solutions Atom + ION is viable for a lot of things. Running standard Windows games with no extra work done on optimization? Not so much.

    As for Apple, even when they did switchable graphics you had to log off and log back in, and the "high-end" graphics in MacBook Pro is a rather anemic 9600M. Not that NVIDIA has had much better without moving to power hungry parts, but 32 SPs is nothing to write home about. I always thought it was odd that MBP had 9400M and 9600M... sure, it's twice as fast, but still a far cry from modern desktops.

    Anyway, if NVIDIA ever does port Optimus to OS X (which despite their statements to the contrary seems like it will happen at some point), Linux would probably not be too far behind.
    Reply
  • yyrkoon - Wednesday, March 03, 2010 - link

    One of the things that gets me is that they will not / cannot port this technology to the desktop. Would it not be great to have switchable graphics on a low-powered IGP platform, and then get a boost when you need / want it? But NVIDIA still drives up the power required to use its parts on the desktop.

    But let me back up a minute. Would it not be nice to have a mobile part in a desktop for max efficiency? Let's say something like the equivalent of the 250M, with very low power usage but very good performance for the power it draws? I am thinking ~35-40W max under load.

    Even the 7600GT, for its time, could not beat those power usage numbers, and for a single monitor at around 1440x900 it did not perform terribly. The 7600GT was also one of the most power-thrifty discrete cards offered for the desktop that gave decent performance at or around this resolution. Am I wrong in thinking the 250M GPU could trump the 7600GT in both of these areas? If I am, then I am sure there is something that *can*.

    Also, look, I am pro-Microsoft. I really like Windows 7, especially the 64-bit variant of Ultimate. It runs really nicely on a "cheap" laptop with only a T3400 CPU but 4GB of memory. Anyway, what is up with NVIDIA and their "nothing but Windows" stance on this? Again, is there something wrong with the other hardware available for making better use of this technology? ARM comes to mind, as well as a different CPU produced by Intel, or even AMD.

    Maybe the above is moot, because there is already something to fill those gaps, or they do not want to compete with themselves because of the new emerging hardware (based on ARM, was it?) they seem to have announced recently. I really do not know the whole story, but it does seem rather short-sighted to me that they would limit this hardware to a single software platform, no matter which it is. Give your customers freedom while using your hardware, and perhaps they will respond in kind by buying your hardware to begin with (and all that).
    Reply
  • Penti - Tuesday, March 02, 2010 - link

    Twice as fast? What are you on?

    http://www.notebookcheck.net/NVIDIA-GeForce-9600M-...

    http://www.notebookcheck.net/NVIDIA-GeForce-9400M-...

    It's gameable with the 9600M; it's not really gameable with the integrated 9400M.
    Reply
  • JarredWalton - Tuesday, March 02, 2010 - link

    Okay, so it's "over twice as fast". It's still not a performance part. 3DMark isn't usually the best source of data for true performance. Looking to actual games, 9600M typically scores around 2 to 3 times as high as 9400M. The 9400M achieves playable frame rates at minimum details and 800x600 in nearly all games, but only about half are playable at 1366x768. Something like a 9600M is playable in all titles at 1366x768. It's still pretty anemic compared to a $100 desktop card, or a 9800M part. Reply
  • Penti - Wednesday, March 03, 2010 - link

    I was looking at the games (which are included in most reviews/benchmarks at that site).

    The 9400M does fairly well with a high-speed CPU, I'll give you that. But it's still a pain to run most games.

    Dedicated memory helps; I wonder if the NG-ION will be helped by it. Looks like it will be pretty low bandwidth. The 9600M is old, of course, but not much else has been available. I'd rather see, say, a Mobility HD5650, but that's still only comparable in performance to a 9800M GS. They fit the power envelope, though. That won't happen until Apple moves to Core i laptops, for its part. But even the difference between the 9400M and 9600M can feel enormous. You don't really need to play at higher resolutions than the laptop screen anyway. I do agree it's pretty anemic either way, though, especially for the 17" MacBook Pro, but then again it's not a gaming computer. It's not the same as the desktop, where you need to game at around 1920x1200 and have screens up to 2560x1440. Being able to play at all is pretty good on a laptop.
    Reply
