The System, Tests and Performance

The system NVIDIA brought with them was a Shuttle SD31P featuring an Intel Pentium D 830 and a Toshiba TS-L802A HD-DVD drive. While this is on the low end of what we would want in a multimedia box, these specs fit the profile of someone who wants a quiet, lower power media center box for their living room. Many of the early adopters of HD-DVD and Blu-ray on the PC will likely be enthusiasts with high quality components who won't run into usability issues with the new media, but those who want to spend as little as possible may have a harder time getting HD-DVD or Blu-ray playback to work as expected. Even those who want a completely silent rig may find playback quality lacking unless some of the processing is offloaded to a graphics card.

Full specs of the system NVIDIA brought for us to test are as follows:
CPU: Intel Pentium D 830
RAM: 1GB DDR2 533
Chipset: Intel 945G
Graphics: MSI 7600GT w/ HDCP
Display: Westinghouse LVM-42W2
HD-DVD Drive: Toshiba TS-L802A
OS: Windows XP SP2
Playback Software: CyberLink PowerDVD (HD-DVD version)

In order to fully test the playback capabilities of the system, we tested the Japanese version of The Chronicles of Riddick, which uses H.264 encoding. This is by far the most strenuous video test available right now.

Sample Riddick videos

Watching the videos will give you an idea of what the highest bit-rate Japanese titles will be like without a high speed CPU or good video decode acceleration on the GPU. Keep in mind that the framerate mismatch between the output of the TV and the DV cam, the scaling, and the compression all lessen the apparent impact of dropped frames. Recording a display with a DV cam always reduces quality, so you'll just have to trust us when we say that the GPU accelerated playback was subjectively smooth and didn't show dropped frames. The differences are much more noticeable in person, and we can say without reservation that the Japanese version of Riddick is unwatchable on a Pentium D 830 without PureVideo HD at this point.

As PureVideo HD performance is dependent on the GPU's core clock speed, the impact of PureVideo will vary with core clock rather than with the 3D power of your graphics card. For example, our MSI 7600 GT HDCP runs at a default 580 MHz core clock, which makes it a more effective PureVideo card than a 7900 GT running between 450 and 500 MHz. NVIDIA allowed us to underclock the MSI card and test a variety of settings between 350 MHz and 580 MHz in order to fully understand how performance scales with GPU clock speed.

To test the performance of PureVideo HD, we ran perfmon while playing back Riddick at different GPU clock speeds. Riddick remained playable down to 450 MHz, and PureVideo HD still had some impact on CPU load even as low as 400 MHz. At 350 MHz, there was no perceptible difference between running with GPU acceleration and running without it.
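As an illustration of the kind of reduction we do on a perfmon-style log, here is a minimal sketch: it averages sampled CPU usage over a clip and counts how often the CPU was pegged (sustained 100% is where dropped frames show up). The sample values are illustrative placeholders, not our measured data.

```python
# Summarize sampled "% Processor Time" values for a clip: average load,
# plus the fraction of samples at (or effectively at) 100%.
# The sample list below is illustrative, not real perfmon output.

def summarize(samples):
    avg = sum(samples) / len(samples)
    pegged = sum(1 for s in samples if s >= 99.5) / len(samples)
    return avg, pegged

if __name__ == "__main__":
    clip = [78.0, 81.5, 84.0, 100.0, 79.5, 82.0]
    avg, pegged = summarize(clip)
    print(f"average load: {avg:.1f}%, pegged {pegged:.0%} of the time")
```

A real run would feed this the counter values logged once per second for the length of the clip rather than a hand-typed list.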

Without PureVideo HD, The Chronicles of Riddick (Japan) kept the CPU at 100% the entire time. Enabling PureVideo HD with the GPU at its retail speed (580 MHz) dropped CPU usage to about 80% on average. With other titles using VC-1 content, we saw around 80% CPU usage without GPU assistance; turning on PureVideo HD brought that down to about 60%. In general, a 580 MHz NVIDIA GPU looks capable of decreasing the load on a Pentium D 830 by about 20 percentage points in any given situation. This isn't a huge drop, but it is definitely significant and can help in the tough cases (like H.264 encoded imports).
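The offload estimate is simple arithmetic on the perfmon averages quoted above; a quick sketch, using the approximate figures from our runs:

```python
# Offload estimate: subtract average CPU usage with PureVideo HD enabled
# from the CPU-only figure for each title. The percentages are the
# approximate averages quoted in the text, not exact logged values.

def offload_points(cpu_only: float, with_gpu: float) -> float:
    """Absolute reduction in CPU usage, in percentage points."""
    return cpu_only - with_gpu

if __name__ == "__main__":
    runs = [("Riddick, H.264", 100.0, 80.0), ("VC-1 title", 80.0, 60.0)]
    for title, cpu_only, with_gpu in runs:
        print(f"{title}: {offload_points(cpu_only, with_gpu):.0f} points offloaded")
```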

We also scanned through a number of features like scene selection, bookmarks, and picture-in-picture (viewing multiple tracks at once). All of these features work as they would on a CE player, though interface consistency can be a problem: some menus and features can only be operated with the keyboard, others only with the mouse.

Our final test takes a look at system level power draw with and without PureVideo HD enabled. We ran our system through a Kill-A-Watt device and looked for maximum power draw over a specific clip of the movie. While our Kill-A-Watt doesn't record average power, we eyeballed the range and frequency of power levels and came up with a very rough average. We did this for both The Chronicles of Riddick (Japan) and Swordfish. Power draw is in Watts, listed as avg/peak. Idle power is 127W.

System Power Draw (Watts, avg/peak)
               Riddick    Swordfish
CPU Only:      185/193    179/185
PureVideo HD:  185/192    175/180
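As a sketch of how avg/peak figures like these are produced from a meter that only shows instantaneous watts: note readings at intervals during the clip, then take the mean and the maximum. The readings below are illustrative placeholders, not our actual logged measurements.

```python
# Reduce a series of instantaneous wattage readings to a rough
# average and a peak. The readings are illustrative placeholders,
# not actual Kill-A-Watt measurements.

def avg_peak(readings):
    return sum(readings) / len(readings), max(readings)

if __name__ == "__main__":
    riddick = [184, 186, 185, 192, 181, 182]  # watts, noted at intervals
    avg, peak = avg_peak(riddick)
    print(f"Riddick: {avg:.0f}/{peak} W (avg/peak)")
```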

It is pretty clear that there isn't a real power advantage with PureVideo HD at this point in time. As more of the pipeline is moved onto the GPU, the specialized hardware could potentially increase the efficiency of the process. This could lead to lower power draw, but at this point in time, all we are doing is shifting where the power is going.

We won't be able to get our hands on another drive for a while, so although we have ATI cards that feature HDCP, we are unable to compare AVIVO to PureVideo HD at this time. Certainly we are hoping to see a similar level of quality from ATI. Our comparison will be available as soon as we are able to get hardware and drivers.

Comments

  • Dismal - Tuesday, August 1, 2006 - link

    Potentially dumb question: Do all these graphics cards coming out now have support for all these 16:9 resolutions such as 1920x1080? Documentation for the cards that show what kind of resolutions they support seem scarce. I only worry because my 6800GT won’t touch 16:9 at all. I'm hoping times have changed.
  • skycat - Thursday, July 27, 2006 - link

    I'm a little bit confused here. Do we have to have a HDCP video card in order to play HD-DVD or BD?
    I have a 7800gtx video card, and a Dell UltraSharp 2407WFP monitor which supports HDCP. So if I get a HD-DVD rom drive, will I be able to play HD-DVD in full resolution via DVI?
  • Renoir - Friday, July 28, 2006 - link

    Based on Derek's answer to my similar question above, the answer appears to be no. The graphics card needs to support HDCP, although if I understand him correctly you will be able to hook up the monitor via VGA and get full resolution. Hope that helps.
  • Clauzii - Tuesday, July 25, 2006 - link

    In what GPU series did nVidia implement the hardware for PureVideo? - Since I think it took a LONG time from then till now, and still drivers are BETA????? I don't get it....
  • DerekWilson - Wednesday, July 26, 2006 - link

    Purevideo works fine and is not beta in current drivers.

    Purevideo HD, which enables playback of HDCP protected content stored on HD-DVD or Blu-ray discs, is currently in beta.

    Since HD-DVD drives and Blu-ray drives have only recently started hitting the market, it isn't surprising that this feature of Purevideo HD is still in development. But Purevideo itself has been production quality for quite some time now. I know it's been at least as long as the 7 series parts have been out, but I think it was available at some point before that. I'd have to go back and check to make sure though.

    Derek Wilson
  • Clauzii - Wednesday, July 26, 2006 - link

    Thanks :)
  • phusg - Tuesday, July 25, 2006 - link

    I didn't see any mention of load on the dual core Pentium, specifically the 'second' core. Is this being used at all? Seems to me that utilizing the second core would be much more advantageous than the 20% decrease from utilizing the GPU.
  • DerekWilson - Wednesday, July 26, 2006 - link

    When we refer to 100% processor usage on a dual core system, we mean 100% of both cores.

    In other words, if one core went unused, we would see usage of about 50%.

    In every case, load was spread fairly evenly across both cores.

    Taking that a step further to put it all together -- smooth HD-DVD playback of H.264 content requires at least 2x 3.0GHz Netburst cores and Purevideo HD on a GPU running at 450MHz or more. Alternately, more powerful CPU(s) could make up for the need of a GPU, but until we collect more data, we don't know where the crossover point is.
  • Renoir - Wednesday, July 26, 2006 - link

    I'm about to build a friend a budget pc based on the geforce 6150/430 chipset which runs at 475mhz and I would find it funny if it turns out to offer faster h.264 acceleration than the 7800GTX which another friend has which runs at 430mhz. Was thinking though, although Nvidia say that the performance of the video processor is dependent on gpu clock speed is there any difference between the processor on the 6XXX series as opposed to the 7XXX series?

    Given that I don't game on my pc but am interested in the video performance of gpus I must say I prefer the approach Nvidia is taking more than ATI's because I don't like the idea of having to buy a high end gpu just to get good hardware acceleration of video. Having said that I'm interested to see what effect the move to unified shaders has on avivo's video acceleration because I believe ATI's video acceleration is dependent on the number of pixel pipelines.
  • Renoir - Wednesday, July 26, 2006 - link

    I guess the easiest thing would be to make sure you have a cpu that can decode the highest bit rate h.264 video on the market and consider hardware acceleration a bonus. I am therefore really looking forward to your future articles which should establish how fast a cpu you need in order to not be dependent on the gpu.
