The Test

As we previously indicated, we need at least a Core 2 Duo E6400 to avoid dropping frames while testing graphics card decode acceleration under X-Men: The Last Stand. Because we also wanted an accurate picture of how much GPU decode acceleration really helps, we needed a CPU powerful enough to avoid dropping frames under even the most stressful load without GPU assistance. Thus we chose the Core 2 Duo X6800 for our tests. With this processor, we can more accurately see how each graphics card compares to the others and how much each is able to assist the CPU.

We tested CPU utilization by using perfmon to record data while we viewed a section of X-Men: The Last Stand. PowerDVD's bookmark feature really helped, allowing us to jump straight to the specific scene we wanted to test in Chapter 18. In this scene, the Golden Gate Bridge is being torn apart and people are running everywhere. This is one of the most stressful scenes in the movie, reaching a bitrate of over 41 Mbps at one point.

Unfortunately, we haven't found a feature in PowerDVD, or any other utility, that allows us to count dropped frames. This means we can't directly compare what happens to video quality when the CPU is running at 100%. In lieu of a dropped-frame count, we will stick with CPU overhead as our performance metric.

For reference, we recorded average and maximum CPU overhead while playing back our benchmark clip with no GPU acceleration enabled.
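
For readers who want to repeat this kind of measurement, below is a minimal sketch of how the average and maximum could be pulled out of a perfmon CSV export. The log file name and the counter-matching logic are our own assumptions; perfmon names each column after the machine and counter path, so the sketch simply searches the header for the counter that was logged.

import csv

def cpu_stats(log_path, counter_fragment="% Processor Time"):
    # Returns (average, maximum) CPU utilization from a perfmon CSV export.
    with open(log_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        # Find the first column whose header mentions the logged counter,
        # e.g. "\\HTPC\Processor(_Total)\% Processor Time" (machine name assumed).
        col = next(i for i, name in enumerate(header) if counter_fragment in name)
        samples = []
        for row in reader:
            try:
                samples.append(float(row[col]))
            except (ValueError, IndexError):
                continue  # skip blank or malformed samples
    return sum(samples) / len(samples), max(samples)

avg, peak = cpu_stats("xmen3_playback.csv")  # hypothetical log file name
print(f"average: {avg:.1f}%  maximum: {peak:.1f}%")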

Here is the rest of our test system:

Performance Test Configuration
CPU: Intel Core 2 Duo X6800
Motherboard(s): ASUS P5B Deluxe
Chipset(s): Intel P965
Chipset Drivers: Intel 7.2.2.1007 (Intel)
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Cards: Various
Video Drivers: ATI Catalyst 6.11
NVIDIA ForceWare 93.71
NVIDIA ForceWare 97.02
Desktop Resolution: 1920x1080 - 32-bit @ 60Hz
OS: Windows XP Professional SP2


86 Comments

  • charleski - Tuesday, December 19, 2006

    The only conclusion that can be taken from this article is that PowerDVD uses a very poor h.264 decoder. You got obsessed with comparing different bits of hardware and ignored the real weak link in the chain - the software.

    Pure software decoding of 1080-res h.264 can be done even on a Pentium D if you use a decent decoder such as CoreAVC or even just the one in ffdshow. You also ignored the fact that these different decoders definitely do differ in the quality of their output. PowerDVD's output is by far the worst to my eyes, the best being ffdshow closely followed by CoreAVC.
  • tronsr71 - Friday, December 15, 2006

    The article mentions that the amount of decoding offloaded by the GPU is directly tied to core clock speed (at least for Nvidia)... If this is true, why not throw in the 6600GT for comparison? They usually come clocked at 500MHz stock, but I am currently running mine at 580MHz with no modifications or extra case cooling.

    In my opinion, if you were primarily interested in Blu-Ray/HD-DVD watching on your computer or HTPC and gaming as a secondary pastime, the 6600GT would be a great inexpensive approach to supporting a less powerful CPU.

    Derek, any chance we could see some benches of this GPU thrown into the mix?
  • balazs203 - Friday, December 15, 2006

    Could somebody tell me what the frame rate of the outgoing signal from the video card is? I know that the PlayStation 3 can only output a 60 fps signal, but some standalone players can output 24 fps.
  • valnar - Wednesday, December 13, 2006

    From the 50,000-foot view, it seems just about right, or "fair" in the eyes of a new consumer. HD-DVD and Blu-ray just came out. It requires a new set-top player for those discs. If you built a new computer TODAY, the parts are readily available to handle the processing needed for decoding. One cannot always expect an older PC to work with today's needs - yes, even a PC only a year old. All in all, it sounds about right.

    I fall into the same category as most of the other posters. My PC can't do it. Build a new one (which I will do soon), and it will. Why all the complaining? I'm sure most of us need to get a new HDCP video card anyway.
  • plonk420 - Tuesday, December 12, 2006

    i can play a High Profile 1080p(25) AVC video on my X2-4600 at maybe 40-70% CPU (70% being a peak; i think it averaged 50-60%) with CoreAVC...

    now the ONLY difference is my clip was sans audio and 13mbit (i was simulating the bitrate you would get if you tried to squeeze The Matrix onto a single-layer HD DVD disc). i doubt 18mbit adds TOO much more computation...
  • plonk420 - Wednesday, December 13, 2006

    http://www.megaupload.com/?d=CLEBUGGH

    give that a try ... high profile 1080p AVC, with all CPU-sapping options on except for B-[frame-]pyramid.

    it DOES have CAVLC (IIRC), 3 B-frames, 3 Refs, 8x8 / 4x4 Transform
  • Spoelie - Friday, April 20, 2007

    CABAC is better and more CPU-sapping than CAVLC
  • Stereodude - Tuesday, December 12, 2006

    How come the results of these tests are so different from this PC Perspective review (http://www.pcper.com/article.php?aid=328&type=...)? I realize they tested HD-DVD, and this review is for Blu-ray, but H.264 is H.264. Of note is that nVidia provided an E6300 and 7600GT to them to do the review with, and it worked great (per the reviewer). Also very interesting is how the hardware acceleration dropped CPU usage from 100% down to 50% in their review on the worst-case H.264 disc, but only reduced CPU usage by ~20% with a 7600GT in this review.

    Lastly, why is nVidia recommending (http://download.nvidia.com/downloads/pvzone/Checkl...) an E6300 for H.264 Blu-ray and HD-DVD playback with a 7600GT if it's completely inadequate, as this review shows?
  • DerekWilson - Thursday, December 14, 2006

    HD-DVD movies, even those using H.264, are not as stressful. H.264 decode requirements depend on the bitrate at which the video is encoded: higher bitrates are more stressful. Blu-ray discs have the potential for much higher bitrate movies because they currently hold up to 50GB (high bitrate movies also require more space; see the back-of-envelope sketch after the comments).
  • balazs203 - Wednesday, December 13, 2006

    Maybe the bitrate of their disc is not as high as the bitrate of that part of X-Men III.

    I would not say it is completely inadequate. According to the AnandTech review, the E6300 with the 8800GTX could remain under 100% CPU utilisation even at the highest bitrate point (the 8800GTX and the 7600GT had the same worst-case CPU utilisation in the tests).
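
To put the bitrate-versus-capacity point from the exchange above in concrete terms, here is a minimal back-of-envelope sketch in Python. The two-hour runtime and the sample bitrates are illustrative assumptions, not measured values:

def movie_size_gb(avg_mbps, runtime_minutes):
    # average bitrate (Mbps) -> total bits over the runtime, then bits -> bytes -> GB
    bits = avg_mbps * 1_000_000 * runtime_minutes * 60
    return bits / 8 / 1_000_000_000

for mbps in (18, 25, 40):
    print(f"{mbps} Mbps over 120 min: {movie_size_gb(mbps, 120):.1f} GB")

At a sustained 40 Mbps, a two-hour movie already needs about 36GB, which is why the 50GB Blu-ray capacity matters for bitrates like those seen in X-Men: The Last Stand.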
