H.264 Encoded HD Content: A Good Thing

Almost anything can be done in a faster, more compact, or higher quality way. Sometimes there are tradeoffs to be made, and sometimes one way of doing things is just better than another. It has been quite some time since studios began distributing movies encoded in MPEG-2 stored on DVDs. Now that new physical media are entering the market, we will see more efficient codecs enter the playing field as well.

H.264 is another name for a subset of MPEG-4 called MPEG-4 Part 10, or AVC (for Advanced Video Coding). This codec is a big step beyond MPEG-2 in terms of how heavily video of a given quality can be compressed. There are quite a few factors that make H.264 a better vehicle for video, but these are a little beyond the scope of this article. For now we will focus on the impact of H.264 and why it's a better option than MPEG-2.

The major benefit of H.264 over MPEG-2 is the smaller file size its heavier compression allows: high resolution video can be stored in much less space. This matters because even though BDs can hold 25 or 50 GB, high quality, high resolution video is not small. The higher the compression, the higher the quality of video that can fit on a disc. Alternatively, high compression leaves extra room for the all-important bonus features and extra content that we expect with any good DVD today.

Higher image quality is also inherent in H.264 thanks to several improved features of the codec. Variable block size motion compensation, better handling of interlaced video, in-loop deblocking, and better subpixel accuracy all contribute to better overall image quality. Alternatively, studios can trade the image quality advantage for an even lower bitrate, as compression artifacts don't show up as readily.

With all these advantages, there is one downside to H.264: decoding the video takes much more work than MPEG-2. High-powered, dedicated H.264 decoding hardware is required in standalone BD and HD DVD players, as a generic processor just isn't enough to handle the workload. This is understandable, as there is a tradeoff between file size/bitrate and the amount of work a CPU must do to reproduce the video, and H.264 produces very small files.

The file size vs. decoding effort tradeoff is actually fairly intuitive. Imagine completely uncompressed video where every pixel of every frame is stored in memory. The only thing we need to do to display the video is send the data to the TV. This requires almost no processing, but enormous file sizes and bandwidth from the storage media. As a reference point, uncompressed 24-bit 1080p content at 24fps (the standard frame rate for movies) would require a whopping 1.19 Gbps of bandwidth, and a 90 minute movie would need about 750 GB of storage. Obviously, some form of compression is absolutely required.
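As a quick sanity check, the figures above can be reproduced with a few lines of arithmetic (Python here, purely for illustration):

```python
# Reproduce the uncompressed-video numbers quoted above.
width, height = 1920, 1080   # 1080p resolution
bytes_per_pixel = 3          # 24-bit color
fps = 24                     # standard film frame rate

bytes_per_second = width * height * bytes_per_pixel * fps
gbps = bytes_per_second * 8 / 1e9          # bandwidth in gigabits per second
movie_bytes = bytes_per_second * 90 * 60   # a 90-minute movie
gib = movie_bytes / 2**30                  # storage in GiB

print(f"{gbps:.2f} Gbps, {gib:.0f} GiB")   # ~1.19 Gbps, ~751 GiB
```

The 750 GB figure in the text lines up with the ~751 GiB this works out to, which is roughly fifteen dual-layer Blu-ray discs for a single movie.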

When storing less data through compression, the CPU must do work to fill in the blanks before sending the video out to a display. With our previous Blu-ray test movie Click (which used MPEG-2), we saw bitrates of 50-60 Mbps throughout our test (representing somewhere between a 20:1 and 24:1 compression ratio). Moving to X-Men: The Last Stand, most of our test is at about 20 Mbps, though we do see a very short spike that hits over 40 Mbps (somewhere around a 60:1 compression ratio). We would need to compare the same section of one movie encoded in both MPEG-2 and H.264 in order to speak directly to the differences between the two, but for now we generally see at least half the bitrate with H.264 that we get with MPEG-2. We also see much lower CPU utilization with MPEG-2 because it doesn't compress the video as heavily as H.264.
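The compression ratios quoted above fall out directly from dividing the uncompressed 1080p24 bitrate by the observed bitrates. A small sketch (the 55 Mbps figure is simply the midpoint of the 50-60 Mbps range we observed, not a measured number):

```python
# Compression ratio = uncompressed bitrate / observed bitrate.
uncompressed_mbps = 1920 * 1080 * 24 * 24 / 1e6   # ~1194 Mbps for 1080p24

observed = [
    ("Click (MPEG-2, ~55 Mbps midpoint)", 55),
    ("X-Men: The Last Stand (H.264)", 20),
]
for title, mbps in observed:
    print(f"{title}: ~{uncompressed_mbps / mbps:.0f}:1")
```

This yields roughly 22:1 for the MPEG-2 title and 60:1 for the H.264 title, matching the ranges in the text.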

If we focus on our high compression codec, we'll see that higher bitrates with H.264 mean more work for the CPU. When complex scenes occur, more data is required to generate a proper image. The CPU still needs to process all this data in the same way it would with a less complex scene, and we end up seeing higher processor utilization.

The encoding process takes more work as well, and we've been told that this is part of the reason we haven't seen many H.264 BD movies before now. When getting a movie ready for sale, studios will encode it many times and have people view every frame of video to make sure nothing needs to be cleaned up. Every time a problem is found, the entire movie must be encoded again. This takes significantly more time with H.264 than with MPEG-2. Fortunately, it seems that studios are making the sacrifices they need to make in order to bring a better experience to the end user.

To sum up, while MPEG-2 is relatively easy to decode, H.264 enables smaller files with better image quality. On the down side, the time it takes to encode a movie using H.264 is much higher than required for MPEG-2, and the processing power needed to decode H.264 without dropping frames can be very large. Without GPU acceleration, not even an Intel Core 2 Duo E6600 can play X-Men: The Last Stand without dropping frames.

Before we get to the test, we'll leave you with a short list of H.264 Blu-ray titles. While we don't have the bitrate information for all of these, we chose X-Men: The Last Stand because it listed 18 Mbps video (higher than some of the others) and has some fairly complex special effects.

Blu-ray H.264 Movies:
Behind Enemy Lines
The League of Extraordinary Gentlemen
X-Men: The Last Stand
Speed
Glory Road
Gone in 60 Seconds
Eight Below
The Great Raid



  • DerekWilson - Tuesday, December 12, 2006 - link

    CPU utilization would be 50% if a single core was maxed on perfmon --

    PowerDVD is multithreaded and 100% utilization represents both cores being pegged.
  • Renoir - Tuesday, December 12, 2006 - link

    Any chance of doing a quick test on quad-core to see how many threads PowerDVD can generate, unless you know already? At the very least, from what you've said, it can evenly distribute the load across 2 threads, which is good.
  • DerekWilson - Thursday, December 14, 2006 - link

    We are looking into this as well, thanks for the feedback.
  • mi1stormilst - Monday, December 11, 2006 - link

    Who the crap cares, stupid movies are freaking dumb gosh! I mean who the crap watches movies on their computers anyway...freaking dorks. LOL!
  • Sunrise089 - Monday, December 11, 2006 - link

    This isn't a GPU review that tests a new game that we know will be GPU limited. This is a review of a technology that relies on the CPU. Furthermore, this is a tech that obviously pushes CPUs to their limit, so the legions of people without Core 2 Duo based CPUs would probably love to know whether or not their hardware is up to the task of decoding these files. I know any AMD product is slower than the top Conroes, but since the hardware GPU acceleration obviously doesn't directly correspond to GPU performance, is it possible that AMD chips may decode Blu-ray at acceptable speeds? I don't know, but it would have been nice to learn that from this review.
  • abhaxus - Monday, December 11, 2006 - link

    I agree completely... I have an X2 3800+ clocked at 2500MHz that is not about to get retired for a Core 2 Duo.

    Why are there no AMD numbers? Considering the chip was far and away the fastest available for several years, you would think that they would include the CPU numbers for AMD, considering most of us with fast AMD chips only require a new GPU for current games/video. I've been waiting AGES for this type of review to decide what video card to upgrade to, and AnandTech finally runs it, and I still can't be sure. I'm left to assume that my X2 @ 2.5GHz is approximately equivalent to an E6400.
  • DerekWilson - Tuesday, December 12, 2006 - link

    Our purpose with this article was to focus on graphics hardware performance specifically.
  • Sunrise089 - Tuesday, December 12, 2006 - link

    Frankly Derek, that's absurd.

    If you only wanted to focus on the GPUs, then why test different CPUs? If you wanted to find out info about GPUs, why not look into the incredibly inconsistent performance, centered around the low correlation between GPU performance in games versus movie acceleration? Finally, why not CHANGE the focus of the review when it became apparent that which GPU one owned was far less important than what CPU you were using?

    Was it that hard to throw in a single X2 product rather than leave the article incomplete?
  • smitty3268 - Tuesday, December 12, 2006 - link

    With all due respect, if that is the case then why did you even use different cpus? You should have kept that variable the same throughout the article by sticking with the 6800. Instead, what I read seemed to be 50% about GPU's and 50% about Core 2 Duo's.

    I'd really appreciate an update with AMD numbers, even if you only give 1 that would at least give me a reference point. Thanks.
  • DerekWilson - Thursday, December 14, 2006 - link

    We will be looking at CPU performance in other articles.

    The information on CPU used was to justify our choice of CPU in order to best demonstrate the impact of GPU acceleration.
