H.264 Encoded HD Content: A Good Thing

Almost anything can be done in a faster, more compact, or higher quality way. Sometimes there are tradeoffs to be made, and sometimes one way of doing things is simply better than another. It has been quite some time since studios began distributing movies encoded in MPEG-2 and stored on DVDs. Now that new physical media are entering the market, we will see more efficient codecs enter the playing field as well.

H.264 is another name for MPEG-4 Part 10, also known as AVC (Advanced Video Coding), one part of the broader MPEG-4 family of standards. This codec is a big step beyond MPEG-2 in terms of how heavily video of a given quality can be compressed. There are quite a few factors that make H.264 a better vehicle for video, but they are beyond the scope of this article. For now we will focus on the impact of H.264 and why it's a better option than MPEG-2.

The major benefit of H.264 over MPEG-2 is the smaller file size its heavier compression delivers: high resolution video can be stored in much less space. This is very useful because even though BDs can hold 25 or 50 GB, high quality, high resolution video is not small. The heavier the compression, the higher the quality of the video that can fill a disc. Alternatively, with heavy compression we also have extra room for the all-important bonus features and extra content that we expect with any good DVD today.
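
To put capacity and bitrate numbers in perspective, here is a minimal Python sketch (using round, hypothetical bitrates rather than figures measured from any particular title) that converts disc capacity and average bitrate into minutes of video:

```python
# Minutes of video that fit on a Blu-ray disc at a given average
# bitrate. The bitrates below are round example numbers, not
# measurements from any particular title.

GB = 1000 ** 3  # disc capacities are quoted in decimal gigabytes

def minutes_on_disc(disc_gb, avg_mbps):
    total_bits = disc_gb * GB * 8
    seconds = total_bits / (avg_mbps * 1_000_000)
    return seconds / 60

for disc_gb in (25, 50):
    for mbps in (20, 40, 60):
        print(f"{disc_gb} GB disc @ {mbps} Mbps: "
              f"{minutes_on_disc(disc_gb, mbps):5.0f} minutes")
```

At 60 Mbps, a single-layer 25 GB disc holds under an hour of video; at 20 Mbps it holds nearly three hours. That difference is exactly the headroom described above.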

Higher image quality is also inherent in H.264 thanks to several improved features of the codec. Variable block size motion compensation, better handling of interlaced video, in-loop deblocking, and better subpixel accuracy all contribute to better overall image quality. Alternatively, studios can use the image quality advantages to lower the bitrate even further, as compression artifacts don't show up as readily.
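
As one illustration of the kind of work these features do, the following toy Python sketch shows the idea behind in-loop deblocking: smoothing the pixels that straddle a block boundary when the step across the edge looks like a compression artifact rather than real detail. H.264's actual filter is adaptive and far more sophisticated; this is only a conceptual sketch.

```python
import numpy as np

def deblock_1d(row, block_size=8, threshold=12):
    # Toy deblocking pass over one row of pixels: at each block
    # boundary, if the jump across the edge is small (an artifact,
    # not a real edge), pull both sides toward their average.
    row = row.astype(np.int32).copy()
    for edge in range(block_size, len(row), block_size):
        p, q = row[edge - 1], row[edge]   # pixels straddling the boundary
        if 0 < abs(p - q) < threshold:    # small step: likely an artifact
            avg = (p + q) // 2
            row[edge - 1] = (p + avg) // 2
            row[edge] = (q + avg) // 2
    return row

# A flat area coded as two blocks with slightly different levels
# shows a visible seam at pixel 8; the filter softens it.
row = np.array([100] * 8 + [108] * 8)
print(deblock_1d(row))  # the 100 -> 108 step becomes 100, 102, 106, 108
```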

With all these advantages, there is one downside to H.264: decoding the video takes much more work than MPEG-2. High-powered, dedicated H.264 decoding hardware is required in standalone BD and HD DVD players, as a generic processor just isn't enough to handle the workload. This is understandable, as there is a tradeoff between file size/bitrate and the amount of work a CPU must do to reproduce the video, and H.264 produces very small files.

The tradeoff between file size and processing effort is actually fairly intuitive. Imagine completely uncompressed video where every pixel of every frame is stored in memory. The only thing we need to do to display the video is send the data to the TV. This requires almost no processing but a very large file size and very high bandwidth from the storage media. As a reference point, uncompressed 24-bit 1080p content at 24fps (the standard frame rate for movies) would require a whopping 1.19 Gbps of bandwidth, and a 90 minute movie would need about 750GB of storage. Obviously, some form of compression is absolutely required.
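
The arithmetic behind those figures is straightforward, as this short Python sketch shows (the ~750GB value corresponds to binary gigabytes):

```python
# Bandwidth and storage for uncompressed 24-bit 1080p video at 24fps.

width, height = 1920, 1080
bits_per_pixel = 24
fps = 24

bits_per_second = width * height * bits_per_pixel * fps
print(f"bandwidth: {bits_per_second / 1e9:.2f} Gbps")  # ~1.19 Gbps

movie_seconds = 90 * 60  # 90 minute movie
total_bytes = bits_per_second / 8 * movie_seconds
print(f"storage: {total_bytes / 2**30:.0f} GiB")       # ~751 GiB
```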

When less data is stored through compression, the CPU must do work to fill in the blanks before sending the video out to a display. With our previous Blu-ray test movie Click (which used MPEG-2), we saw bitrates of 50-60 Mbps throughout our test (representing somewhere between a 20:1 and 24:1 compression ratio). Moving to X-Men: The Last Stand, most of our test runs at about 20 Mbps (somewhere around a 60:1 compression ratio), though we do see a very short spike that hits over 40 Mbps. We would need to compare the same section of one movie encoded in both MPEG-2 and H.264 in order to speak directly to the differences between the two, but in general we see at least half the bitrate with H.264 that we get with MPEG-2. We also see much lower CPU utilization with MPEG-2 because it doesn't compress the video as heavily as H.264.
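
For reference, the compression ratios quoted above fall straight out of the uncompressed figure from the previous calculation:

```python
# Compression ratios implied by the measured bitrates, relative to
# the ~1194 Mbps uncompressed rate computed earlier.

uncompressed_mbps = 1920 * 1080 * 24 * 24 / 1e6  # ~1194 Mbps

for label, mbps in [("Click (MPEG-2), high end", 60),
                    ("Click (MPEG-2), low end", 50),
                    ("X-Men: The Last Stand (H.264), typical", 20)]:
    print(f"{label}: {uncompressed_mbps / mbps:.0f}:1")
```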

Focusing on our high compression codec, higher bitrates with H.264 mean more work for the CPU. When complex scenes occur, more data is required to reconstruct a proper image. The CPU still needs to process all of this data the same way it would for a less complex scene, so we end up seeing higher processor utilization.

The encoding process takes more work as well, and we've been told that this is part of the reason we haven't seen many H.264 BD movies before now. When getting a movie ready for sale, studios will encode it many times, with people viewing every frame of video to make sure nothing needs to be cleaned up. Every time a problem is found, the entire movie must be encoded again. This takes significantly more time with H.264 than with MPEG-2. Fortunately, it seems that studios are making the sacrifices they need to make in order to bring a better experience to the end user.

To sum up, while MPEG-2 is relatively easy to decode, H.264 enables smaller files with better image quality. On the downside, the time it takes to encode a movie using H.264 is much higher than MPEG-2 requires, and the processing power needed to decode H.264 without dropping frames can be very high. Without GPU acceleration, not even an Intel Core 2 Duo E6600 can play X-Men: The Last Stand without dropping frames.

Before we get to the test, we'll leave you with a short list of H.264 Blu-ray titles. While we don't have the bitrate information for all of these, we chose X-Men: The Last Stand because it listed 18 Mbps video (higher than some of the others) and has some fairly complex special effects.

Blu-ray H.264 Movies:
Behind Enemy Lines
The League of Extraordinary Gentlemen
X-Men: The Last Stand
Speed
Glory Road
Gone in 60 Seconds
Eight Below
The Great Raid


Comments:

  • charleski - Tuesday, December 19, 2006 - link

    The only conclusion that can be taken from this article is that PowerDVD uses a very poor h.264 decoder. You got obsessed with comparing different bits of hardware and ignored the real weak link in the chain - the software.

    Pure software decoding of 1080-res h.264 can be done even on a PentiumD if you use a decent decoder such as CoreAVC or even just the one in ffdshow. You also ignored the fact that these different decoders definitely do differ in the quality of their output. PowerDVD's output is by far the worst to my eyes, the best being ffdshow closely followed by CoreAVC.
  • tronsr71 - Friday, December 15, 2006 - link

    The article mentions that the amount of decoding offloaded by the GPU is directly tied to core clock speed (at least for Nvidia)... If this is true, why not throw in the 6600GT for comparison?? They usually come clocked at 500MHz stock, but I am currently running mine at 580MHz with no modifications or extra case cooling.

    In my opinion, if you were primarily interested in Blu-Ray/HD-DVD watching on your computer or HTPC and gaming as a secondary pastime, the 6600GT would be a great inexpensive approach to supporting a less powerful CPU.

    Derek, any chance we could see some benches of this GPU thrown into the mix?
  • balazs203 - Friday, December 15, 2006 - link

    Could somebody tell me what the frame rate of the signal coming out of the video card is? I know that the PlayStation 3 can only output a 60fps signal, but some standalone players can output 24fps.
  • valnar - Wednesday, December 13, 2006 - link

    From the 50,000 foot view, it seems just about right, or "fair" in the eyes of a new consumer. HD-DVD and BluRay just came out. It requires a new set-top player for those discs. If you built a new computer TODAY, the parts are readily available to handle the processing needed for decoding. One cannot always expect their older PC to work with today's needs - yes, even a PC only a year old. All in all, it sounds about right.

    I fall into the same category as most of the other posters. My PC can't do it. Build a new one (which I will do soon), and it will. Why all the complaining? I'm sure most of us need to get a new HDCP video card anyway.
  • plonk420 - Tuesday, December 12, 2006 - link

    i can play a High Profile 1080p(25) AVC video on my X2-4600 at maybe 40-70% CPU max (70% being a peak, i think it averaged 50-60%) with CoreAVC...

    now the ONLY difference is my clip was sans audio and 13mbit (i was simulating the bitrate you'd get if you were to try to squeeze The Matrix onto a single layer HD DVD disc). i doubt 18mbit adds TOO much more computation...
  • plonk420 - Wednesday, December 13, 2006 - link

    http://www.megaupload.com/?d=CLEBUGGH

    give that a try ... high profile 1080p AVC, with all CPU-sapping options on except for B-[frame-]pyramid.

    it DOES have CAVLC (IIRC), 3 B-frames, 3 Refs, 8x8 / 4x4 Transform
  • Spoelie - Friday, April 20, 2007 - link

    CABAC is better and more CPU-sapping than CAVLC
  • Stereodude - Tuesday, December 12, 2006 - link

    How come the results of this test are so different from this PC Perspective review (http://www.pcper.com/article.php?aid=328&type=...)? I realize they tested HD-DVD, and this review is for Blu-ray, but H.264 is H.264. Of note is that nVidia provided an E6300 and 7600GT to them to do the review with and it worked great (per the reviewer). Also very interesting is how the hardware acceleration dropped CPU usage from 100% down to 50% in their review on the worst case H.264 disc, but only reduced CPU usage by ~20% with a 7600GT in this review.

    Lastly, why is nVidia recommending an E6300 for H.264 Blu-ray and HD-DVD playback with a 7600GT (http://download.nvidia.com/downloads/pvzone/Checkl...) if it's completely inadequate as this review shows?
  • DerekWilson - Thursday, December 14, 2006 - link

    HD-DVD movies, even those using H.264, are not as stressful. H.264 decode requirements depend on the bitrate at which the video is encoded. Higher bitrates will be more stressful. Blu-ray discs have the potential for much higher bitrate movies because they currently support up to 50GB (high bitrate movies also require more space).
  • balazs203 - Wednesday, December 13, 2006 - link

    Maybe the bitrate of their disk is not as high as the bitrate of that part of XMEN III.

    I would not say it is completely inadequate. According to the AnandTech review, the E6300 with the 8800GTX could remain under 100% CPU utilisation even at the highest bitrate point (the 8800GTX and the 7600GT had the same worst case CPU utilisation in the tests).
