Yozakura (H.264)

The Yozakura test isn't the highest-bitrate test we have, but it is the most stressful we've encountered because of how it uses the H.264 codec. Our benchmark starts at the beginning of chapter 1 and runs until the 1:45 mark.
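As an aside, here is a minimal sketch of how average and maximum CPU utilization figures like the ones below can be gathered: sample total CPU utilization at a fixed interval while the player decodes the clip, then report the mean and the peak. The psutil-based sampler, one-second interval, and 105-second duration are illustrative stand-ins, not a description of our exact test harness.

```python
import time
import psutil

def sample_cpu(duration_s=105, interval_s=1.0):
    """Sample total CPU utilization and return (average %, max %)."""
    psutil.cpu_percent(interval=None)  # prime the counter so the first sample is valid
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        samples.append(psutil.cpu_percent(interval=interval_s))
    return sum(samples) / len(samples), max(samples)

if __name__ == "__main__":
    # Start this while the player is decoding the benchmark clip.
    avg, peak = sample_cpu()
    print(f"Average CPU: {avg:.1f}%   Max CPU: {peak:.1f}%")
```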

Yozakura (H.264) - Average % CPU Utilization

We start off with PowerDVD, and immediately we see the tremendous difference that NVIDIA's new video decode engine offers. While even the previous-generation NVIDIA hardware still eats up more than a single CPU core, the 8600s keep average CPU utilization in the low 20% range.


All of the steps that happen outside of the green box (those not offloaded to the GPU) are responsible for any remaining CPU utilization seen when playing back H.264 content on a GeForce 8600.

Why isn't the CPU utilization down to 0%? The entire H.264 decode pipeline is handled on the GPU, but NVIDIA claims that the remaining ~20% is simply related to processing and decrypting the data coming off of the disc before it's passed on to the GPU. With an unencrypted disc, CPU utilization should be in the single digits.
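To put that claim in concrete terms, the sketch below decrypts a buffer with AES-128, the cipher used to protect content on AACS-encrypted HD discs. It is purely illustrative: the key, IV, and 2 KB unit size are made up, and real playback also involves key derivation and stream demuxing that aren't shown, but it conveys the sort of per-sector CPU work that remains even when the GPU handles the entire decode pipeline.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

# Hypothetical values -- real AACS title keys come from the disc's key exchange.
title_key = os.urandom(16)            # AES-128 key
iv = os.urandom(16)
encrypted_unit = os.urandom(2048)     # stand-in for one unit read off the disc

# AES-128-CBC decryption of a single unit; doing this for every unit of the
# stream (plus demuxing) is what keeps the CPU busy during playback.
cipher = Cipher(algorithms.AES(title_key), modes.CBC(iv))
decryptor = cipher.decryptor()
plaintext = decryptor.update(encrypted_unit) + decryptor.finalize()
print(f"Decrypted {len(plaintext)} bytes")
```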

Yozakura (H.264) - Max % CPU Utilization

The maximum CPU utilization for these two cards is still significant, but obviously much better than the 70%+ we see from the competition. Surprisingly enough, ATI's hardware actually does worse than NVIDIA's in these tests despite offloading more of the decode pipeline than the GeForce 7 or 8800.

To confirm our findings, we also ran the tests under WinDVD 8, which, as we mentioned before, doesn't support ATI hardware acceleration, so the only GPUs compared here are from NVIDIA.

Yozakura (H.264) - Average % CPU Utilization

NVIDIA's older hardware actually does worse under WinDVD 8 than under PowerDVD, but the 8600 does a lot better.

Yozakura (H.264) - Max % CPU Utilization

Maximum CPU utilization is especially improved on the 8600s under WinDVD 8; the two cards never even break 24%.

Looking at the PowerDVD and WinDVD scores, it's interesting to note that while the 8600 GTS is clearly faster in PowerDVD, the two cards are basically tied under WinDVD. There is definitely room for further optimization in PowerDVD at present, so hopefully we will see it, along with bug fixes, in a future update.

Comments

  • JarredWalton - Saturday, April 28, 2007 - link

    The peak numbers may not be truly meaningful other than indicating a potential for dropped frames. Average CPU utilization numbers are meaningful, however. Unlike SETI, there is a set amount of work that needs to be done in a specific amount of time in order to successfully decode a video. The video decoder can't just give up CPU time to lower CPU usage, because the content has to be handled or frames will be dropped. (A quick frame-budget illustration follows this comment.)

    The testing also illustrates the problem with ATI's decode acceleration on their slower cards, though: the X1600 XT is only slightly faster than doing all the work on the CPU in some instances, and in the case of VC1 it may actually add more overhead than acceleration. Whether that's due to ATI's drivers/hardware or the software application isn't clear, however. Looking at the WinDVD vs. PowerDVD figures, the impact of the application used is obviously not negligible at this point.
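Expanding on the fixed-work-budget point above: assuming film content at 24 frames per second (an assumption, since frame rates vary by title), the decoder faces a hard deadline of roughly 42 ms per frame, as the quick calculation below shows.

```python
fps = 24                       # assumed frame rate for film content on HD discs
frame_budget_ms = 1000 / fps   # time available to decode and present each frame
print(f"Decode budget per frame: {frame_budget_ms:.1f} ms")  # ~41.7 ms

# If decoding a frame takes longer than this budget, the frame arrives late and
# is dropped -- unlike a background task, the decoder can't simply yield CPU time.
```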
  • BigLan - Saturday, April 28, 2007 - link

    Does the 8600 also accelerate x264 content? It's looking like x264 will become the successor to xvid, so if these cards can, they'll be the obvious choice for HD-HTPCs.

    I guess the main question would be if windvd or powerdvd can play x264. I suspect they can't, but nero showtime should be able to.
  • MrJim - Tuesday, May 8, 2007 - link

    Accelerating x264 content would be great, but I don't know what the big media companies would think about that. Maybe ATI or Nvidia will lead the way, hopefully.
  • Xajel - Saturday, April 28, 2007 - link

    I'm just asking why those enhancements are not in the higher-end 8800 GPUs??

    I know the 8600 will be used in HTPCs more than the 8800, but that's just not a good reason not to include them!!
  • Axbattler - Saturday, April 28, 2007 - link

    Those cards came out 5 months after the 8800, which was apparently long enough for them to add the tech. I'd expect them in the 8900 (or whatever nVidia names their refresh), though. Actually, it would be interesting to see if they add it to the 8800 Ultra.
  • Xajel - Saturday, April 28, 2007 - link

    I don't expect the Ultra to have them; AFAIK the Ultra is just a tweaked version of the GTX with higher clocks for both core and RAM...
    I can expect it in the successor to my 7950GT, though
  • Spacecomber - Friday, April 27, 2007 - link

    I'm not sure I understand why Nvidia doesn't offer an upgraded version of their decoder software, instead of relying on other software companies to get something put together to work with their hardware.
  • thestain - Friday, April 27, 2007 - link

    All this tech jock sniffing with the latest and greatest, but this old reliable (http://www.newegg.com/product/product.asp?item=N82...) is a better deal, isn't it?

    For watching movies.. for the ordinary non-owner of the still expensive hd dvd players and hd dvds... for standard definition content.. even without the nice improvements nvidia has made.. seems to me that the old tech still does a pretty good job.

    What do you think of this ole 6600 compared to the 8600 in terms of price paid for the performance you are going to see and enjoy in reality?
  • DerekWilson - Saturday, April 28, 2007 - link

    the key line there is "if you have a decent cpu" ... which means c2d e6400.

    for people with slower cpus, the 6600 will not cut it and the 8600gt/8500gt will be the way to go.

    the aes-128 step still needed to be done on older hardware (as it needs to decrypt the data stream sent to it by the CPU), but using dedicated hardware rather than the shader hardware to do this should help save power or free up resources for other shader processing (post processing like noise redux, etc).
  • Treripica - Friday, April 27, 2007 - link

    I don't know if this is too far off-topic, but what PSU was used for testing?
