NVIDIA has always been the underdog when it comes to video processing features on its GPUs. For years ATI dominated the market, being the first of the two to really take video decode quality and performance into account on its GPUs. Although the brand has since been folded into AMD, ATI long maintained a significant lead over NVIDIA when it came to bringing TV to your PC. ATI's All-in-Wonder series offered a much better time shifting/DVR experience than anything NVIDIA managed to muster up, and NVIDIA's answers usually arrived too late on top of that. Obviously these days most third party DVR applications have been made obsolete by the advent of Microsoft's Media Center 10-ft UI, but when the competition was tough, ATI was truly on top.

While NVIDIA eventually focused on more than just 3D performance with its GPUs, it always seemed to be one step behind ATI when it came to video processing and decoding features. Most recently, ATI was the first to offer H.264 decode acceleration on its GPUs at the end of 2005.

NVIDIA has remained mostly quiet throughout much of ATI's dominance of the video market, but for the first time in recent history, NVIDIA actually beat ATI to the punch in implementing a new video related feature. With the launch of its GeForce 8600 and 8500 GPUs, NVIDIA became the first to offer 100% GPU-based decoding of H.264 content. While we can assume that ATI will offer the same in its next-generation graphics architecture, the fact of the matter is that NVIDIA was first, and you can actually buy these cards today with full H.264 decode acceleration.

We've taken two looks at the 3D gaming performance of NVIDIA's GeForce 8600 series and came away relatively unimpressed, but for those interested in watching HD-DVD/Blu-ray content on their PCs, does NVIDIA's latest mid-range offering have any redeeming qualities?

Before we get to the performance tests, it's important to have an understanding of what the 8600/8500 are capable of doing and what they aren't. You may remember this slide from our original 8600 article:

The blocks in green illustrate what stages in the H.264 decode pipeline are now handled completely by the GPU, and you'll note that this overly simplified decode pipeline indicates that the GeForce 8600 and 8500 do everything. Adding CAVLC/CABAC decode acceleration was the last major step in offloading H.264 processing from the host CPU, and it simply wasn't done in the past because of die constraints and transistor budgets. As you'll soon see, without CAVLC/CABAC decode acceleration, high bitrate H.264 streams can still eat up close to 100% of a Core 2 Duo E6320; with the offload, things get far more reasonable.
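To make the division of labor concrete, here is a small illustrative sketch of which H.264 decode stages land on the CPU versus the GPU, before and after VP2/BSP. This is not driver code; the stage names, unit names, and generation labels are our descriptive shorthand, not NVIDIA identifiers.

```python
# Illustrative only: which H.264 decode pipeline stages run where, for
# older VP1-class NVIDIA hardware versus the GeForce 8600/8500's VP2 + BSP.
# Stage and unit names are descriptive shorthand, not driver identifiers.
H264_PIPELINE = [
    # (stage,                        VP1-era,  VP2 + BSP)
    ("CAVLC/CABAC bitstream decode", "CPU",    "GPU (BSP)"),
    ("Inverse transform",            "CPU",    "GPU (VP2)"),
    ("Motion compensation",          "GPU",    "GPU (VP2)"),
    ("In-loop deblocking",           "GPU",    "GPU (VP2)"),
]

def cpu_stages(generation: str) -> list[str]:
    """Stages still left to the host CPU for a given hardware generation."""
    col = {"VP1": 1, "VP2": 2}[generation]
    return [row[0] for row in H264_PIPELINE if row[col] == "CPU"]
```

With VP2 and the BSP, `cpu_stages("VP2")` comes back empty, which is the substance of the "100% GPU based decoding" claim; on older parts the entropy decode in particular stays on the CPU, and that is the expensive step for high bitrate streams.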

The GeForce 8600 and 8500 have a new video processor (which NVIDIA simply calls VP2) that runs at a higher clock rate than its predecessor. Couple that with a new bitstream processor (BSP) to handle CAVLC/CABAC decoding, and these two GPUs can now handle the entire H.264 decode pipeline. A third unit that wasn't present in previous GPUs also makes an appearance in the 8600/8500: the AES128 engine, which decrypts the content sent from the CPU as per the AACS specification and helps further reduce CPU overhead.
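The protected-content path can be pictured roughly as below. This is a toy sketch only: a XOR cipher stands in for AES-128, and the key and function names are ours, not part of the real AACS or NVIDIA protocol; it just shows the shape of the handshake the AES128 engine participates in.

```python
# Toy model of the protected-content path: the CPU decrypts the AACS
# stream from the disc, re-encrypts it with a session key for the trip
# across the bus, and the GPU's AES128 engine decrypts it on arrival.
# XOR stands in for AES-128 here; this is NOT the real AACS/NVIDIA protocol.

SESSION_KEY = bytes(range(16))  # hypothetical 128-bit bus session key

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Symmetric toy cipher: applying it twice with the same key is a no-op."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def cpu_side(disc_stream: bytes, disc_key: bytes) -> bytes:
    plain = xor_cipher(disc_stream, disc_key)   # AACS decrypt (toy)
    return xor_cipher(plain, SESSION_KEY)       # re-encrypt for the bus

def gpu_aes128_engine(bus_stream: bytes) -> bytes:
    return xor_cipher(bus_stream, SESSION_KEY)  # GPU-side decrypt (toy)
```

Moving that final decrypt step onto the GPU is what shaves a bit more work off the host CPU.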

Note that the offload NVIDIA has built into the G84/G86 GPUs is hardwired for H.264 decoding only; you get none of the benefit with MPEG-2 or VC-1 encoded content. Admittedly, H.264 is the most strenuous of the three, but given that VC-1 content is still quite prevalent among HD-DVD titles, it would be nice to have. Also note that as long as your decoder supports NVIDIA's VP2/BSP, any H.264 content will be accelerated. For MPEG-2 and VC-1 content, the 8600 and 8500 can only handle inverse transform, motion compensation and in-loop deblocking; the rest of the pipeline is handled by the host CPU. Older VP1 NVIDIA hardware handles only motion compensation and in-loop deblocking. ATI's current GPUs can handle inverse transform, motion compensation and in-loop deblocking, so in theory they should show lower CPU usage than the older NVIDIA GPUs on this type of content.
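The per-codec breakdown above can be summarized in a small lookup. Again, this is purely illustrative, built from the description in the text; the stage and part labels are ours, not NVIDIA's.

```python
# Illustrative summary of which decode stages each part offloads per codec,
# as described in the text. Labels are descriptive, not driver identifiers.
ALL_STAGES = {"bitstream decode", "inverse transform",
              "motion compensation", "in-loop deblocking"}

OFFLOADED = {
    ("G84/G86", "H.264"):  ALL_STAGES,
    ("G84/G86", "VC-1"):   {"inverse transform", "motion compensation",
                            "in-loop deblocking"},
    ("G84/G86", "MPEG-2"): {"inverse transform", "motion compensation",
                            "in-loop deblocking"},
    ("VP1-era", "VC-1"):   {"motion compensation", "in-loop deblocking"},
    ("VP1-era", "MPEG-2"): {"motion compensation", "in-loop deblocking"},
}

def full_gpu_decode(gpu: str, codec: str) -> bool:
    """True only when no decode stage is left to the host CPU."""
    return OFFLOADED.get((gpu, codec), set()) == ALL_STAGES
```

Only the G84/G86 + H.264 combination clears the "nothing left for the CPU" bar; everything else still leans on the host for at least the entropy decode.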

It's also worth noting that the new VP2, BSP and AES128 engines are only present in NVIDIA's G84/G86 GPUs, which are currently only used in the GeForce 8600 and 8500 cards. GeForce 8800 owners are out of luck, but NVIDIA never promised this functionality to 8800 owners, so there are no broken promises. The next time NVIDIA re-spins its high-end silicon we'd expect to see similar functionality there, but we're guessing that won't be for quite some time.

The Applications

64 Comments


  • bearxor - Monday, May 21, 2007

    How come we still don't have an article or benchies on an 8500?
  • billd - Friday, May 04, 2007

    It's a mystery to me why nvidia thinks we are interested in H.264 when there is so little material encoded in it. Of the shipping HD disks reviewed on the hidefdigest.com site, most Blu-Ray titles are encoded in MPEG-2 and most HD DVD titles are encoded in VC-1. Furthermore there are more Blu-Ray titles encoded in VC-1 than H.264. It would have been more helpful if nvidia had natively supported VC-1 first and introduced H.264 later. i.e.

    Blu-Ray:
    MPEG-2 : 121
    AVC MPEG-4 : 30
    VC-1 : 46

    HD DVD:
    MPEG-2 : 2
    AVC MPEG-4 : 10
    VC-1 : 161

    Perhaps there are some TV broadcasts in H.264 however given the low bit-rate compared to HD disks there should be little benefit offloading from the CPU to the video card.
  • SilverTrine - Wednesday, May 02, 2007

    It's not really appropriate to call ATi defunct when it has been folded into another company and hardware is still being sold under the ATi name.
  • Parhelion69 - Monday, April 30, 2007

    Anand, I've seen in some previous benchmarks that software solutions using CoreAVC gave better results than hardware decoding on previous generations of ATI and NVIDIA video cards. Could you run some tests to see if this behavior still applies?

    Also I'd love to see tests on older CPUs, like a single-core Athlon 64 3000+, to see how much the hardware decoding really helps.

    Thanks a lot, I always find your reviews extremely helpful and professional. Keep up the good work!
  • Delerue - Friday, May 04, 2007

    Yeah, I agree. Indeed, some people already suggested this in the Xbit Labs review discussion, since they missed the same things. Look here: http://www.xbitlabs.com/discussion/3743.html

    BTW, nice review, Anand. You're the guy I really trust when we talk about hardware. Also, have you confirmed this: 'I believe that only PowerDVD/WinDVD support the 8600's hardware acceleration at this point'? Ah! You talked about the InterVideo forum, but I can't find it. Can you give me the address, please?

    Thanks and keep going!
  • Tewt - Monday, April 30, 2007

    What am I missing here? Wasn't this tech introduced in the 7xxx series? Was I getting 'part' as opposed to 'full'? Or is this 'acceleration' versus 'decoding', and what is the difference?

    And I would like to throw in my two cents along with Parhelion. Just from general reading, it seems like more and more raw power keeps being thrown at HD decoding/viewing/etc. Where is the lowest bar for watching HD with no 'hiccups'?

    I would love to see someone write code for Linux for watching HD, and we'd find out a 1GHz PIII and an ATI 8500 or NVIDIA 5500 would run it just fine.

    Sorry, I thought I was watching HD content (games and downloaded trailers) just fine not too long ago with my A64 3200+ and GeForce 6600GT.

  • DerekWilson - Tuesday, May 01, 2007

    Games and downloaded trailers are at much, much lower bitrates than what Blu-ray especially is capable of. Lower-powered CPUs and older GPUs can handle these fine; it's the heavy-hitting stuff that is the problem.

    The 7 series did not offer full decode. Nothing has offered full decode until now, so yes, you were getting part. Much of the decode process was being performed on the CPU, while the partially decoded video was sent to the GPU for final processing.

    With the 8600/8500, the CPU handles AACS and I/O overhead: decrypting the data on the disc and re-encrypting the data stream to send to the GPU. This is for AACS-protected content, of course; games and downloaded content won't have all this going on. Your HD videos will still play with less CPU intervention, especially in the case of H.264 videos.
  • bigpow - Monday, April 30, 2007

    Don't these people ever learn?
    They f#$ked up the 6800GT/Ultra vs 6600GT with PureVideo, and now they've done it again?
  • erikejw - Sunday, April 29, 2007

    I cannot find a single word on picture quality in the article, hence I assume it is top notch and there is no difference at all.

    I have no hardware decoder on my system, and the quality of the different software decoders ranges from OK to abysmal.

    In a cheap HTPC system a slow Athlon X2 seems to be a good fit.
    I'll build my system around one and an 8500 card.
  • DerekWilson - Tuesday, May 01, 2007

    Decode quality is at least equal to PowerDVD software decode quality.

    NVIDIA will be including HD filtering/post-processing for the 8600 series on par with the 8800, while the 8500 may not have the processing power to fully implement all the quality features.

    We will be evaluating performance using the HD version of Silicon Optix HQV when it's finalized, and we may take a look at our beta version before that as well.
