The Test

As we previously indicated, we need at least a Core 2 Duo E6400 to avoid dropping frames while testing graphics card decode acceleration under X-Men: The Last Stand. Because we also wanted an accurate picture of how much GPU decode acceleration really helps, we needed a CPU powerful enough to avoid dropping frames even under the most stressful load without GPU assistance. Thus we chose the Core 2 Extreme X6800 for our tests. Using this processor, we can more accurately see how each graphics card compares to the others and how much each is able to assist the CPU.

We tested CPU utilization by using perfmon to record data while we viewed a section of X-Men: The Last Stand. The bookmark feature helped considerably, allowing us to jump straight to the specific scene we wanted to test in Chapter 18. In this scene, the Golden Gate Bridge is being torn apart and people are running everywhere. This is one of the most stressful scenes in the movie, reaching a bitrate of over 41 Mbps at one point.
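
For anyone who wants to reproduce this kind of logging, here is a minimal sketch that drives the same Performance Monitor counter perfmon reads, but from a script. The counter path is the standard Windows one; the output file name and sample count are placeholders we chose for illustration, not values from our testing:

    # Minimal sketch: capture aggregate CPU utilization once per second
    # while the benchmark clip plays, using the same counter perfmon reads.
    # The log file name and sample count are illustrative placeholders.
    import subprocess

    subprocess.run([
        "typeperf",
        r"\Processor(_Total)\% Processor Time",  # aggregate across both cores
        "-si", "1",           # sample interval: one second
        "-sc", "120",         # stop after 120 samples (~2 minutes of playback)
        "-o", "cpu_log.csv",  # write samples to a CSV log
        "-y",                 # overwrite an existing log without prompting
    ], check=True)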

Unfortunately, we haven't found a feature in PowerDVD or any other utility that will allow us to count dropped frames. This means we can't really compare what happens to video quality when the CPU is running at 100%. In lieu of a dropped frame count, we will have to stick with CPU overhead as our performance metric.

For reference, we recorded average and maximum CPU overhead while playing back our benchmark clip with no GPU acceleration enabled.
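
To show how those two numbers fall out of the recorded log, here is a minimal sketch that summarizes a perfmon/typeperf CSV export. It assumes the first column is a timestamp and the second is the % Processor Time counter, which is the layout typeperf produces when logging a single counter:

    # Minimal sketch: report average and peak CPU utilization from a
    # perfmon/typeperf CSV log. Assumes column 0 is a timestamp and
    # column 1 is "% Processor Time"; adjust the index if the log
    # contains additional counters.
    import csv

    def summarize(path):
        samples = []
        with open(path, newline="") as f:
            reader = csv.reader(f)
            next(reader)              # skip the header row
            for row in reader:
                try:
                    samples.append(float(row[1]))
                except (IndexError, ValueError):
                    continue          # ignore blank or malformed rows
        return sum(samples) / len(samples), max(samples)

    avg, peak = summarize("cpu_log.csv")
    print(f"average CPU utilization: {avg:.1f}%, peak: {peak:.1f}%")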

Here is the rest of our test system:

Performance Test Configuration
CPU: Intel Core 2 Extreme X6800
Motherboard: ASUS P5B Deluxe
Chipset: Intel P965
Chipset Drivers: Intel 7.2.2.1007
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Cards: Various
Video Drivers: ATI Catalyst 6.11
               NVIDIA ForceWare 93.71
               NVIDIA ForceWare 97.02
Desktop Resolution: 1920x1080 - 32-bit @ 60Hz
OS: Windows XP Professional SP2


86 Comments

  • DerekWilson - Tuesday, December 12, 2006 - link

    CPU utilization would be 50% if a single core were maxed in perfmon -- PowerDVD is multithreaded, and 100% utilization represents both cores being pegged.
  • Renoir - Tuesday, December 12, 2006 - link

    Any chance of doing a quick test on a quad-core to see how many threads PowerDVD can generate, unless you already know? At the very least, from what you've said, it can evenly distribute the load across two threads, which is good.
  • DerekWilson - Thursday, December 14, 2006 - link

    We are looking into this as well, thanks for the feedback.
  • mi1stormilst - Monday, December 11, 2006 - link

    Who the crap cares, stupid movies are freaking dumb gosh! I mean who the crap watches movies on their computers anyway...freaking dorks. LOL!
  • Sunrise089 - Monday, December 11, 2006 - link

    This isn't a GPU review that tests a new game we know will be GPU limited. This is a review of a technology that relies on the CPU. Furthermore, this is a tech that obviously pushes CPUs to their limit, so the legions of people without Core 2 Duo based CPUs would probably love to know whether or not their hardware is up to the task of decoding these files. I know any AMD product is slower than the top Conroes, but since the hardware GPU acceleration obviously doesn't directly correspond to GPU performance, is it possible that AMD chips may decode Blu-ray at acceptable speeds? I don't know, but it would have been nice to learn that from this review.
  • abhaxus - Monday, December 11, 2006 - link

    I agree completely... I have an X2 3800+ clocked at 2500MHz that is not about to get retired for a Core 2 Duo.

    Why are there no AMD numbers? Considering the chip was far and away the fastest available for several years, you would think they would include CPU numbers for AMD, considering most of us with fast AMD chips only need a new GPU for current games/video. I've been waiting AGES for this type of review to decide which video card to upgrade to, and AnandTech finally runs it, and I still can't be sure. I'm left to assume that my X2 @ 2.5GHz is approximately equivalent to an E6400.
  • DerekWilson - Tuesday, December 12, 2006 - link

    Our purpose with this article was to focus on graphics hardware performance specifically.
  • Sunrise089 - Tuesday, December 12, 2006 - link

    Frankly Derek, that's absurd.

    If you only wanted to focus on the GPUs, then why test different CPUs? If you wanted to find out info about GPUs, why not look into the incredibly inconsistent performance, centered on the low correlation between GPU performance in games versus movie acceleration? Finally, why not CHANGE the focus of the review when it became apparent that which GPU one owned was far less important than which CPU you were using?

    Was it that hard to throw in a single X2 product rather than leave the article incomplete?
  • smitty3268 - Tuesday, December 12, 2006 - link

    With all due respect, if that is the case then why did you even use different CPUs? You should have kept that variable the same throughout the article by sticking with the X6800. Instead, what I read seemed to be 50% about GPUs and 50% about Core 2 Duos.

    I'd really appreciate an update with AMD numbers; even one data point would at least give me a reference point. Thanks.
  • DerekWilson - Thursday, December 14, 2006 - link

    We will be looking at CPU performance in other articles.

    The information on the CPUs we used was included to justify our choice of CPU, so that we could best demonstrate the impact of GPU acceleration.
