Power Consumption & Noise

The move from the original Atom platform to Pine Trail dropped power considerably. The old Intel 945G chipset pulled more power than two dual-core Atom CPUs put together. The on-die GPU and NM10 Express chipset fixed that problem.

NVIDIA’s Next Generation ION platform increases power consumption over Pine Trail (and the original ION) because you’re adding a discrete GPU on top of the platform's existing power budget. Thankfully it’s not a very power-hungry GPU.

Power Consumption: NG-ION vs. ION

                              Zotac ZBOX HD-ID11 (NG-ION)    Zotac ZBOX HD-ND02 (ION1)
Idle                          21.7W                          19.7W
Load (Playing 1080p H.264)    27.0W                          23.4W

Idle power is still in the low 20W range, and under load we saw the Zotac ZBOX HD-ID11 peak at around 27W, a few watts more than the old ION.
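
If you want to translate that idle gap into running costs, a quick back-of-the-envelope calculation does it. The sketch below is purely illustrative: the 24/7 duty cycle and the $0.12/kWh electricity rate are assumptions, not figures from this review.

    # Rough yearly cost of the idle-power difference between the two boxes.
    # Assumed (not measured here): the box idles 24/7 at $0.12 per kWh.
    HOURS_PER_YEAR = 24 * 365
    RATE_USD_PER_KWH = 0.12

    def annual_cost_usd(watts: float) -> float:
        """Yearly electricity cost in USD for a constant power draw."""
        return watts * HOURS_PER_YEAR / 1000 * RATE_USD_PER_KWH

    ng_ion = annual_cost_usd(21.7)  # Zotac ZBOX HD-ID11 (NG-ION) idle
    ion1 = annual_cost_usd(19.7)    # Zotac ZBOX HD-ND02 (ION1) idle
    print(f"NG-ION ${ng_ion:.2f}/yr vs. ION1 ${ion1:.2f}/yr "
          f"(difference: ${ng_ion - ion1:.2f}/yr)")

At that rate the 2W idle gap works out to roughly two dollars a year, so the regression is measurable but hardly meaningful.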

There's a single heatsink + fan that cools the CPU, GPU and NM10 Express chipset. At idle it's barely audible right next to the machine and unnoticeable from across the room. The problem is that the dual-core Atom D510 can put out a decent amount of heat under full load. After extended testing the fan spun up to 6500 RPM, at which point noise becomes another issue entirely. I measured 55 dB(A) an inch away from the unit with the fan running at 6500 RPM. A foot away we're still at 50 dB(A).

This only appears to happen with the CPU running at its maximum clock speed, however. The XBMC Live image I installed doesn't seem to let the Atom cores underclock themselves to 600MHz when idle, which results in heat building up very quickly. I suspect this is simply a matter of not having the right driver installed to enable EIST.
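
If you hit the same behavior under XBMC Live, the first thing to check is whether the kernel's cpufreq scaling is active at all. The snippet below is a minimal diagnostic sketch, assuming the standard Linux /sys/devices/system/cpu/cpuN/cpufreq layout; it's generic Linux plumbing rather than anything specific to this unit.

    # Minimal check of Linux cpufreq state: print governor and current
    # frequency per core. If the cpufreq directory is missing, no frequency
    # scaling driver is loaded and the cores will stay at full clock.
    import glob
    import os

    def read(path: str) -> str:
        try:
            with open(path) as f:
                return f.read().strip()
        except OSError:
            return "n/a"

    for cpu in sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*")):
        name = os.path.basename(cpu)
        cpufreq = os.path.join(cpu, "cpufreq")
        if not os.path.isdir(cpufreq):
            print(f"{name}: no cpufreq interface exposed")
            continue
        governor = read(os.path.join(cpufreq, "scaling_governor"))
        cur_khz = read(os.path.join(cpufreq, "scaling_cur_freq"))
        print(f"{name}: governor={governor}, current={cur_khz} kHz")

If the governor reads "performance" or the interface is missing entirely, the cores will sit at their full 1.66GHz regardless of load, which matches the heat build-up described above.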

Under Windows 7, used as an HTPC, the fan always remained quiet. But if you're constantly doing a lot of CPU-intensive work, expect the system to get considerably louder.

Comments

  • Shadowmaster625 - Friday, May 7, 2010 - link

    Why would Intel only give 4 PCI Express lanes? That's just ridiculous. Why would NVIDIA even mess with this Atom? Why not just use the old Atom? It's the same damn thing. Just do an LTB on the old Atom. NVIDIA should go BK for doing stupid crap like this.
  • hpmoon - Friday, May 7, 2010 - link

    Wow. So while we appreciate that Zotac sent an early review unit, they should have paid attention when most of us observed that it would be rather offensive to jack up the price a second time merely by gauging enthusiast interest. $209 --> $239 --> $259 = pissed off customers. Now that the reviews are eh, we're done with you. And it's gonna hit you hard when every reviewer bemoans how $260 is just the beginning, with $100 at a minimum in additional expense for the RAM and hard drive. For truth-in-advertising, let's get real: The HD-ID11 is just under $400.

    Moving along.
  • hemantha - Sunday, May 9, 2010 - link

    From the power consumption page - "The XBMC Live image I installed doesn't seem to let the Atom cores underclock themselves to 600MHz". I think the D510 doesn't support EIST; I believe only the Atom Nxxx parts do. So unless the motherboard supports undervolting, I don't think these can be made to run at lower clock speeds.
  • Nathelion - Monday, May 10, 2010 - link

    Is there any information on if/when a Nano-Ion combo will be out? Atom really isn't fast enough to catch my eye, and (C)ULV is too expensive.
  • sucram03 - Tuesday, May 11, 2010 - link

    Did you really just mention VIA? That's scary @_@

    Really, the fact is that unless costs are driven down, users are almost better off getting a cheap AMD Vision-powered laptop for approximately the same price. You can find some of those laptops on sale for <$450 with Athlon II X2 M300 CPUs and Radeon HD 4200s, which are good enough to accelerate any video thanks to the new Catalyst 10.4 release (H.264 decoding up to L5.1). And most have HDMI ports, Bluetooth, 802.11n, the list goes on...

    Add to that the general flexibility and portability of having a laptop (i.e. having a built-in display right there with the computer, having a battery), and although you will have higher energy usage, it is NOT going to be a major concern for most households when all you do is boot it up for playback.

    Broadcom's chipset is interesting, but it's still only able to decode up to L4.1 H.264 if I remember correctly. NVIDIA's chipsets would be the BEST to use to enable CUDA decoding and remove pretty much all limitations on accelerating any kind of video, but if you're going to have to pay the same as what you could buy a laptop for (or more), then what's the use? IMHO, AMD appears to have positioned themselves in the middle if we're talking about the HTPC/movie-playback department for a budget system. Cost, features, and benefits all seem to point in their favor.
  • CereKong - Monday, May 10, 2010 - link

    Quote:
    While manufacturers can use all four PCIe 1.0 lanes coming off Intel’s NM10 Express chipset, most have chosen to use just one leaving the remaining lanes for things like WiFi. A single PCIe 1.0 lane can only provide 250MB/s of bandwidth in either direction, hardly enough for a modern GPU. It’s because of this limitation that the next-generation ION GPU could actually perform slower than the first ION.

    So which manufacturers do provide motherboards with multiple lanes for the GPU, and if so, are there any performance differences?
  • SnazzyS - Thursday, May 13, 2010 - link

    NewEgg sold out very quickly. Looks like Logic Supply has some in stock: http://www.logicsupply.com/products/zbox_hd_id11
  • idokibovito - Friday, May 14, 2010 - link

    Not quite sure there, but I've been keeping my eye on the Acer Revo 3610, which seems to be basically the same thing as this _without_ the cooling fan! Looking at benchmarks, the new CPU is 5-10% faster (tops) and the GPU is not much faster either (because of the single PCIe lane). In some benchmarks both CPU and GPU are actually slower than ION1 (which has a GeForce 9400M instead of a GT218).
    I would prefer the new generation, even if it's just a bit faster (think VDPAU and VP3 vs. VP4). But that fan and the seemingly nonexistent real-life performance benefit keep me looking back at the Revo, which is cheaper and hardware that is known to work with XBMC and Linux without dirty patches and evening prayers.

    I can't see why this "next-gen" thing is better or even more future-proof, though I would like to. Anyone?
    Thanks
  • coutch - Monday, May 24, 2010 - link

    Any word on whether the drivers released today (beta 256) address the Flash performance issue?
  • Jackie78 - Wednesday, July 28, 2010 - link

    Which version of XBMC did you use? I guess they don't officially support DXVA-accelerated video.
