Final Words

We've been hearing for quite some time now that Blu-ray and HD DVD movies could prove to be too much for today's desktop microprocessors; today we finally have the proof. X-Men: The Last Stand, encoded using the H.264/MPEG-4 AVC High Profile at 1080p, requires more processing power to decode than affordable dual-core CPUs can handle. We are at a point where GPU decode acceleration is essentially required with all but the highest-end processors in order to achieve an acceptable level of quality while watching HD content on the PC.

NVIDIA hardware performs better under our current set of drivers and the beta build of PowerDVD we are using, but exactly how well GeForce 7 Series hardware handles the decode process depends more on the specific card being used than it does with ATI hardware. In general, higher-performance NVIDIA cards do better at decoding our H.264 Blu-ray content. The 7950 GX2 doesn't perform on par with the rest of the high-end NVIDIA cards, as SLI doesn't help with video decode. With the exception of the X1600 Pro, each of the ATI cards we tested affected performance almost identically.

While there isn't much more to say about performance right now, we do need to consider that we are working with an early release of our player software, and ATI and NVIDIA are always improving their driver support for video decode acceleration. While we can't count on seeing improved performance in the future on current hardware, it is always nice to know that the possibility exists. We will continue to track performance with future player and driver updates.

But no matter what we see in the future, NVIDIA has done an excellent job with the 8800 series. G80-based cards will definitely lead the way in HD video decode performance, making it possible to stick with a cheaper CPU and still get a good experience. Of course, nothing about playing HD content on the PC is cheap right now, especially if we are talking about using an 8800 in conjunction with our Blu-ray drive.

For those who don't have the money to build a computer around Blu-ray or HD DVD, a standalone player is the other option. We tested our Samsung player with X-Men: The Last Stand to see if it could handle the demands of an H.264 movie (as any good CE player should). We were happy to see that the Samsung box didn't seem to have any problems playing our movie.

As for recommendations, based on our testing, we would not suggest anything less than an Intel Core 2 Duo E6600 for use in a system designed to play HD content. The E6400 may work well enough, but not even the 8800 GTX can guarantee zero dropped frames on the E6300. ATI owners will want to lean more towards an E6700 processor, but can get away with the E6600 in a pinch. Keep in mind, however, that X-Men: The Last Stand is only one of the first H.264 movies to come out. We may see content that is more difficult to decode in the future, and a faster processor is a good way to build in performance headroom and ensure a quality HD experience on the PC.

86 Comments

  • DerekWilson - Tuesday, December 12, 2006 - link

    CPU utilization would be 50% if a single core was maxed on perfmon --

    PowerDVD is multithreaded and 100% utilization represents both cores being pegged.
  • Renoir - Tuesday, December 12, 2006 - link

    Any chance of doing a quick test on a quad-core to see how many threads PowerDVD can generate, unless you know already? At the very least, from what you've said, it can evenly distribute the load across 2 threads, which is good.
  • DerekWilson - Thursday, December 14, 2006 - link

    We are looking into this as well, thanks for the feedback.
  • mi1stormilst - Monday, December 11, 2006 - link

    Who the crap cares, stupid movies are freaking dumb gosh! I mean who the crap watches movies on their computers anyway...freaking dorks. LOL!
  • Sunrise089 - Monday, December 11, 2006 - link

    This isn't a GPU review that tests a new game that we know will be GPU limited. This is a review of a technology that relies on the CPU. Furthermore, this is a tech that obviously pushes CPUs to their limit, so the legions of people without Core 2 Duo based CPUs would probably love to know whether or not their hardware is up to the task of decoding these files. I know any AMD product is slower than the top Conroes, but since the hardware GPU acceleration obviously doesn't directly correspond to GPU performance, is it possible that AMD chips may decode Blu-ray at acceptable speeds? I don't know, but it would have been nice to learn that from this review.
  • abhaxus - Monday, December 11, 2006 - link

    I agree completely... I have an X2 3800+ clocked at 2500MHz that is not about to get retired for a Core 2 Duo.

    Why are there no AMD numbers? Considering the chip was far and away the fastest available for several years, you would think that they would include the CPU numbers for AMD, considering most of us with fast AMD chips only require a new GPU for current games/video. I've been waiting AGES for this type of review to decide what video card to upgrade to, and AnandTech finally runs it, and I still can't be sure. I'm left to assume that my X2 @ 2.5GHz is approximately equivalent to an E6400.
  • DerekWilson - Tuesday, December 12, 2006 - link

    Our purpose with this article was to focus specifically on graphics hardware performance.
  • Sunrise089 - Tuesday, December 12, 2006 - link

    Frankly Derek, that's absurd.

    If you only wanted to focus on the GPUs, then why test different CPUs? If you wanted to find out info about GPUs, why not look into the incredibly inconsistent performance, centered around the low correlation between GPU performance in games versus movie acceleration? Finally, why not CHANGE the focus of the review when it became apparent that which GPU one owned was far less important than which CPU you were using?

    Was it that hard to throw in a single X2 product rather than leave the article incomplete?
  • smitty3268 - Tuesday, December 12, 2006 - link

    With all due respect, if that is the case then why did you even use different cpus? You should have kept that variable the same throughout the article by sticking with the 6800. Instead, what I read seemed to be 50% about GPU's and 50% about Core 2 Duo's.

    I'd really appreciate an update with AMD numbers, even if you only give 1 that would at least give me a reference point. Thanks.
  • DerekWilson - Thursday, December 14, 2006 - link

    We will be looking at CPU performance in other articles.

    The information on CPU used was to justify our choice of CPU in order to best demonstrate the impact of GPU acceleration.
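The utilization arithmetic discussed in the comments above (one pegged core on a dual-core CPU reading as 50% overall) can be sketched as a quick check. The helper below is a hypothetical illustration of how tools like perfmon aggregate per-core load, not part of any real monitoring API.

```python
def overall_utilization(per_core_loads):
    """Average per-core loads (each 0-100) into one overall percentage,
    the way an aggregate CPU counter reports them."""
    return sum(per_core_loads) / len(per_core_loads)

# One core maxed on a dual core reads as 50% overall.
print(overall_utilization([100, 0]))    # 50.0

# A multithreaded decoder pegging both cores reads as 100%.
print(overall_utilization([100, 100]))  # 100.0
```

This is why a 50% reading on a dual-core system can still mean a single-threaded player has hit a hard CPU wall, while PowerDVD's multithreading lets it scale toward the full 100%.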
