Introduction

In November, we published our first article featuring Blu-ray content. While we focused more on the capability of the cards we tested to play digital content protected with HDCP, we did take a preliminary look at hardware accelerated high definition video playback with the movie Click.

Our first glimpse of the processing power required to play HD content on the PC gave us a very good indication that Blu-ray movies using MPEG-2 should have no problem on a modern system, even without GPU acceleration. The Core 2 Duo E6300 is easily capable of playing back 50-60 Mbps MPEG-2 video at 1080p. Adding a GPU to the mix did make an impact, but the small boost in performance just wasn't necessary.

Today we will turn the tables and look at what happens when H.264/MPEG-4 AVC meets Blu-ray on the PC. This combination is much more demanding than MPEG-2 encoded Blu-ray movies, as H.264 achieves much higher compression at better quality, which in turn requires more processing power to decode.

Before we get to our results, it is important to talk a bit about playback of HD media on the PC. Blu-ray and HD DVD movies are copy protected with AACS, which relies on HDCP to encrypt the video signal when it is sent over a digital connection. In order to view one of these movies on an HDTV over either a DVI or HDMI connection, an HDCP enabled video card is required.

All video cards that have an HDMI connection on them should support HDCP, but the story is different with DVI. Only recently have manufacturers started including the encryption keys required for HDCP. Licensing these keys costs hardware makers money, and including HDCP functionality wasn't seen as a good investment until recently (now that Blu-ray and HD DVD players are finally available for the PC). While NVIDIA and ATI both say that most (if not all) cards based on GPUs released within the last few months will include the required hardware support, the final decision is still in the hands of the graphics card maker.

It is important to make it clear that HDCP graphics cards are only required to watch protected HD content over a digital connection. Until movie studios decide to enable the ICT (Image Constraint Token), HD movies will be watchable at full resolution over an analog connection. While analog video will work for many current users, it won't be a long term solution.

Now that we've recapped what we know about watching HD content on the PC, let's take a look at why things will be a little different now that H.264/MPEG-4 AVC encoded movies are here.

H.264 Encoded HD Content: A Good Thing
Comments

  • Stereodude - Wednesday, December 13, 2006 - link

    Also, there's this post on AVSForum (http://www.avsforum.com/avs-vb/showthread.php?p=91...). The poster had no problems playing back X-Men 3 with a "P4 3.2Ghz HT system and a Radeon X1950Pro". Clearly a 3.2GHz HT P4 isn't nearly as powerful as any of those C2D processors, nor is the X1950Pro as powerful as the various nVidia cards.
  • Stereodude - Wednesday, December 13, 2006 - link

    Perhaps, but nVidia intentionally sent them a H.264 torture test disc that's not available in the US. That also doesn't explain why the 7600GT nearly cut the CPU usage in half for one review, but only helped 20% in the other.

    Also, nVidia says an E6330 or X2 4200+ with a 7600GT is adequate for the most demanding H.264 titles. That sure doesn't agree with the conclusion of this AnandTech piece, which says you need an 8800GTX card with an E6300.
  • balazs203 - Wednesday, December 13, 2006 - link

    In the PC Perspective article they say:

    "In our testing the H.264 bit rates were higher than the VC-1 rates, in the high 18-19 Mbps up to 22 Mbps in some cases."

    That is about half the maximum bitrate of the disc AnandTech tested.
  • Stereodude - Wednesday, December 13, 2006 - link

    Since when does bitrate = difficulty to decode?
  • DerekWilson - Thursday, December 14, 2006 - link

    Bitrate does correlate with decode difficulty, because a higher bitrate means more bitstream data to process per frame.
  • frogge - Tuesday, December 12, 2006 - link

    64 bit OS vs 32 bit...
  • puffpio - Tuesday, December 12, 2006 - link

    Will you start using more updated/modern encoding CPU tests for H.264 encoding? Currently you use Quicktime right? That doesn't use many of H264's advanced features.

    Have you considered using x264 (an open source encoder of H264 that generates the best quality encodes of publicly available H264 encoders) using a standard set of encoding parameters?

    Nothing taxes a CPU better than video encoding :)
  • rain128 - Tuesday, December 12, 2006 - link

    I'm a little skeptical about those test results, because my home computer (specs in the subject line) played the Deja Vu clip (trailer 1, 1080p, downloaded from Apple's website) with CPU usage at 40-60% on the current version of the NVIDIA drivers. With older drivers (I don't know the exact version; I installed them over a year ago) the average was between 50-70%.

    For a decoder I used PowerDVD 7. I installed the trial, and even though Cyberlink's webpage says the H.264 codec doesn't work in the trial version, I had no problems with it. GSpot reported Cyberlink's H.264 codec as the default rendering path. For fullscreen playback I used BSPlayer; strangely, Windows Media Player didn't want to play the trailer even though all the other players had no problem finding the installed codecs.

    TIP: with BSPlayer you can see the dropped frame count.
  • Renoir - Tuesday, December 12, 2006 - link

    The H.264 clips on Apple's website tend to have lower bitrates than those found on Blu-ray discs, so that explains your CPU usage.
  • DerekWilson - Tuesday, December 12, 2006 - link

    This is what we have found as well, and it's also why looking at BD and HD DVD performance is more important than the downloaded clips we've looked at in the past.
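The bitrate-per-frame point debated in the comments above can be made concrete with a little arithmetic. This sketch assumes 24 fps film content and compares a bitrate near the high end quoted for the disc in this review (40 Mbps) against the ~20 Mbps figures PC Perspective reported:

```python
# How much compressed bitstream a decoder must chew through per frame,
# at a given bitrate. Assumes 24 fps film content; the 40 and 20 Mbps
# figures are taken from the numbers discussed in the comments.

def bytes_per_frame(bitrate_mbps, fps=24):
    """Compressed bytes of bitstream per frame at the given bitrate."""
    return bitrate_mbps * 1_000_000 / fps / 8

for mbps in (40, 20):
    kib = bytes_per_frame(mbps) / 1024
    print(f"{mbps} Mbps -> ~{kib:.0f} KiB of bitstream per frame")
```

Doubling the bitrate doubles the data the decoder must parse for every frame, so a disc peaking near 40 Mbps is a meaningfully harder workload than one in the 18-22 Mbps range, even before accounting for encoding-tool differences between titles.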
