Power Consumption

The reason that a handful of execution engines within a $150 graphics card can be faster than even some of the most powerful desktop microprocessors is the use of specialized logic designed specifically for the task at hand. NVIDIA took this approach to an even greater degree by effectively making its BSP engine useful for exactly one thing: CAVLC/CABAC bitstream decoding for H.264 encoded content. Needless to say, NVIDIA's approach is not only faster than the general purpose microprocessor approach, but it should also be more power efficient.
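To illustrate why fixed-function hardware has such an advantage here, consider the shape of the work: CABAC is a binary arithmetic coder whose decoder state updates after every single bin, so the inner loop cannot be parallelized on a general purpose core. Below is a minimal, purely illustrative sketch of that loop; it takes a probability argument in place of the H.264 spec's context-state tables, and it is in no way a model of NVIDIA's BSP engine.

```python
# Purely illustrative, simplified CABAC-style binary arithmetic decoder.
# Real H.264 CABAC drives the LPS sub-interval from per-context state
# tables; this sketch takes a probability argument instead.

class CabacSketch:
    def __init__(self, bits):
        self.bits = iter(bits)
        self.range = 510                       # codIRange initial value
        self.offset = 0
        for _ in range(9):                     # codIOffset: first 9 bits
            self.offset = (self.offset << 1) | self._bit()

    def _bit(self):
        return next(self.bits, 0)

    def decode_bin(self, p_lps, mps):
        """Decode one binary symbol; every call depends on the last."""
        r_lps = max(2, int(self.range * p_lps))
        self.range -= r_lps                    # tentatively take the MPS
        if self.offset >= self.range:          # it was the LPS after all
            self.offset -= self.range
            self.range = r_lps
            bin_val = 1 - mps
        else:
            bin_val = mps
        while self.range < 256:                # bit-serial renormalization
            self.range <<= 1
            self.offset = (self.offset << 1) | self._bit()
        return bin_val

dec = CabacSketch([1, 0, 1, 1, 0] * 8)
bins = [dec.decode_bin(0.2, mps=0) for _ in range(10)]
```

Every call to decode_bin depends on the range/offset state left by the previous call, which is exactly the kind of serial, branch-heavy work a small dedicated engine can do far more efficiently per watt than a large out-of-order CPU.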

To measure the improvement in power efficiency, we outfitted our test bed with a GeForce 8600 GT and ran the Yozakura benchmark with hardware acceleration enabled and disabled. With it enabled, the 8600 GT handles 100% of the H.264 decode process; with it disabled, the host CPU (an Intel Core 2 Duo E6320) is responsible for decoding the video stream. We measured total system power consumption at the wall outlet and report the average and max values in Watts.
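For anyone reproducing this kind of measurement, here is a minimal sketch of the data reduction, assuming a power meter you can poll programmatically (ours was simply read at the wall outlet; read_watts below is a hypothetical stand-in for whatever interface your meter exposes):

```python
import time

def sample_power(read_watts, duration_s=60.0, interval_s=1.0):
    """Log wall-power readings over a benchmark run and reduce them to
    the average and max figures reported in the charts below.
    `read_watts` is whatever callable your power meter provides."""
    samples = []
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        samples.append(read_watts())
        time.sleep(interval_s)
    return sum(samples) / len(samples), max(samples)

# Example with a dummy meter; swap in a real reading function.
avg_w, max_w = sample_power(lambda: 124.8, duration_s=3, interval_s=0.5)
```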

[Chart: Power Consumption]

At idle, our test bed consumed 112W, and when decoding the most stressful H.264 encoded HD DVD we've got, power jumped up to 124.8W. Relying on the CPU alone to handle the decoding required about 8% more power, bringing the average system power usage up to 135.1W.
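The 8% figure falls directly out of the two averages:

```python
cpu_only, gpu_assisted = 135.1, 124.8        # average system power, Watts
extra = (cpu_only / gpu_assisted - 1) * 100
print(f"CPU-only decode draws {extra:.1f}% more power")   # ~8.3%
```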

[Chart: Power Consumption]

Surprisingly enough, the difference in power consumption isn't as great as we'd expect. Obviously, system performance is a completely different story, as the 8600's hardware acceleration makes multitasking while watching H.264 content actually feasible, but these numbers show the strength of Intel's 65nm manufacturing process. We do wonder what the power consumption difference would look like if a single manufacturer were able to produce a CPU and a GPU on the very same process. With AMD's acquisition of ATI, we may very well know the answer to that question in the coming years.

Comments (64)

  • bearxor - Monday, May 21, 2007 - link

    How come we still don't have an article or benchies on an 8500?
  • billd - Friday, May 4, 2007 - link

    It's a mystery to me why nvidia thinks we are interested in H.264 when there is so little material encoded in it. Of the shipping HD discs reviewed on the hidefdigest.com site, most Blu-Ray titles are encoded in MPEG-2 and most HD DVD titles are encoded in VC-1. Furthermore, there are more Blu-Ray titles encoded in VC-1 than in H.264. It would have been more helpful if nvidia had natively supported VC-1 first and introduced H.264 later, i.e.:

    Blu-Ray:
    MPEG-2 : 121
    AVC MPEG-4 : 30
    VC-1 : 46

    HD DVD:
    MPEG-2 : 2
    AVC MPEG-4 : 10
    VC-1 : 161

    Perhaps there are some TV broadcasts in H.264; however, given the low bit rate compared to HD discs, there should be little benefit to offloading from the CPU to the video card.
  • SilverTrine - Wednesday, May 2, 2007 - link

    It's not really appropriate to call ATi defunct when it has been folded into another company and hardware is still being sold under the ATi name.
  • Parhelion69 - Monday, April 30, 2007 - link

    Anand, I've seen in some previous benchmarks that software solutions using CoreAVC gave better results than hardware decoding on previous generations of ATI and NVIDIA video cards. Could you run some tests to see if this behavior still applies?

    Also, I'd love to see tests on older CPUs, like a single core Athlon 64 3000+, to see how much the hardware decoding really helps.

    Thanks a lot, I always find your reviews extremely helpful and professional. Keep up the good work!
  • Delerue - Friday, May 4, 2007 - link

    Yeah, I agree. Indeed, some people already suggested this in the Xbit Labs review discussion. Look here: http://www.xbitlabs.com/discussion/3743.html

    BTW, nice review, Anand. You're the guy that I really trust when we talk about hardware. Incidentally, have you confirmed this: 'I believe that only PowerDVD/WinDVD support the 8600's hardware acceleration at this point'? Ah! You talked about an Intervideo forum, but I can't find it. Can you give me the address, please?

    Thanks and keep going!
  • Tewt - Monday, April 30, 2007 - link

    What am I missing here? Wasn't this tech introduced in the 7xxx series? Was I getting 'partial' as opposed to 'full'? Or is this 'acceleration' versus 'decoding', and what is the difference?

    And I would like to throw in my two cents along with Parhelion. Just from general reading, I keep seeing more and more raw power being thrown at HD decoding/viewing/etc. Where is the lowest bar for watching HD with no 'hiccups'?

    I would love to see someone write code for Linux for watching HD, and we'd find out a 1GHz PIII and an ATI 8500 or Nvidia 5500 would run it just fine.

    Sorry, I thought I was watching HD content (games and downloaded trailers) just fine not too long ago with my A64 3200+ and GeForce 6600GT.

  • DerekWilson - Tuesday, May 1, 2007 - link

    games and downloaded trailers are much, much lower bitrate than what blu-ray in particular is capable of. lower powered cpus and older gpus can handle these fine; it's the heavy hitting stuff that is the problem.

    the 7 series did not offer full decode. nothing has offered full decode until now. so yes, you were getting part. much of the decode process was being performed on the cpu, while the partially decoded video was sent to the gpu for final processing.

    with the 8600/8500, the cpu handles aacs and i/o overhead, decrypting the data on the disc, and re-encrypting the data stream to send to the gpu. this is for aacs protected content, of course. games and downloaded content won't have all this going on. your hd videos will still play with less cpu intervention, especially in the case of h.264 videos.
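    To make that CPU/GPU split concrete, here is a rough sketch; every function name below is hypothetical, and none of this corresponds to a real driver or AACS API.

    ```python
    # Hypothetical sketch of the CPU/GPU split described above for
    # AACS-protected playback; all names are made up for illustration.

    def aacs_handshake(disc):          # CPU: negotiate the title key (stub)
        return b"title-key"

    def aacs_decrypt(sector, key):     # CPU: strip AACS encryption (stub)
        return sector

    def reencrypt_for_bus(data):       # CPU: re-encrypt the stream for
        return data                    # the trip to the GPU (stub)

    class Gpu:
        def submit(self, payload):     # GPU: decrypt, then 100% of the
            pass                       # H.264 decode (BSP engine etc.)

    def play_protected(disc_sectors, gpu):
        key = aacs_handshake(disc_sectors)
        for sector in disc_sectors:    # CPU also carries the I/O overhead
            gpu.submit(reencrypt_for_bus(aacs_decrypt(sector, key)))
    ```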
  • bigpow - Monday, April 30, 2007 - link

    don't these people ever learn?
    they f#$ked up the 6800GT/Ultra vs 6600GT with purevideo, and now they've done it again?
  • erikejw - Sunday, April 29, 2007 - link

    I cannot find a single word on picture quality in the article, hence I assume it is top notch and there is no difference at all.

    I have no hardware decoder on my system, and the quality of the different software decoders ranges from OK to abysmal.

    In a cheap HTPC system, a slow Athlon X2 seems to be a good fit.
    I'll build my system around one and an 8500 card.
  • DerekWilson - Tuesday, May 1, 2007 - link

    decode quality is at least equal to powerdvd software decode quality.

    nvidia will be including hd filtering/post processing for the 8600 series on par with 8800, while the 8500 may not have the processing power to fully implement all the quality features.

    we will be evaluating performance using the hd version of silicon optix hqv when finalized. and we may take a look at our beta version before that as well.
