The System, Tests and Performance

The system NVIDIA brought with them was a Shuttle SD31P featuring an Intel Pentium D 830 and a Toshiba TS-L802A HD-DVD drive. While this is on the low end of what we would want to use in a multimedia box, specs like these would suit people who want a quiet, lower power media center box for their living room. Many of the early adopters of HD-DVD and Blu-Ray on the PC will likely be enthusiasts with high quality components who won't run into usability issues with the new media, but those who want to spend as little as possible may have a more difficult time getting their HD-DVD or Blu-Ray titles to work as expected. Even those who wish to run a completely silent rig may find playback quality lacking unless some of the processing is offloaded to the graphics card.

Full specs of the system NVIDIA brought for us to test are as follows:
CPU: Intel Pentium D 830
RAM: 1GB DDR2 533
Chipset: Intel 945G
Graphics: MSI 7600GT w/ HDCP
Display: Westinghouse LVM-42W2
HD-DVD Drive: Toshiba TS-L802A
OS: Windows XP SP2
Player Software: CyberLink PowerDVD (HD-DVD version)

To fully stress the playback capabilities of the system, we used the Japanese version of The Chronicles of Riddick, which is encoded in H.264. This is by far the most strenuous video test available right now.

Sample Riddick videos

Watching the videos will give you an idea of what the highest bit-rate Japanese titles will be like without a high speed CPU or good video decode acceleration on the GPU. Keep in mind that the framerate mismatch between the output of the TV and the DV cam, the scaling, and the compression all reduce the apparent impact of dropped frames. Recording a display with a DV cam always results in reduced quality, so you will just have to trust our subjective impression that GPU accelerated playback was smooth and didn't show dropped frames. The differences are much more noticeable in person, and we can say without reservation that the Japanese version of Riddick is unwatchable on a Pentium D 830 without PureVideo HD at this point.

As PureVideo HD performance is dependent on the GPU's core clock speed, the benefit of PureVideo HD scales with core clock rather than with the 3D power of your graphics card. For example, our MSI 7600 GT HDCP runs at a default core clock of 580 MHz, which makes it a more effective PureVideo card than a 7900 GT running between 450 and 500 MHz. NVIDIA allowed us to underclock the MSI card and test a range of settings between 350 MHz and 580 MHz in order to see how performance varies with GPU clock speed.

To test the performance of PureVideo HD, we ran perfmon while playing back Riddick at different GPU clock speeds. Riddick remained playable down to 450 MHz, and PureVideo HD still had some impact on performance even as low as 400 MHz. There wasn't any perceptible quality difference between running the GPU at 350 MHz and running without GPU acceleration.
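
For readers who want to reproduce this kind of logging, here is a minimal sketch of how CPU utilization could be sampled during a playback run. We used perfmon on Windows XP for the article; the snippet below substitutes the psutil package as a simple stand-in, and the file name and sampling window are arbitrary assumptions rather than our actual test script.

```python
# Minimal sketch: sample overall CPU utilization once per second while a
# clip plays, then report the average and peak. We used perfmon for the
# article itself; psutil is just a convenient stand-in here.
import csv
import psutil

SAMPLES = 120                       # assumed two-minute window; match your clip length
LOGFILE = "cpu_usage_580mhz.csv"    # hypothetical name for one GPU clock setting

readings = []
with open(LOGFILE, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["sample", "cpu_percent"])
    for i in range(SAMPLES):
        # cpu_percent(interval=1.0) blocks for one second and returns the
        # average utilization across both cores of the Pentium D.
        usage = psutil.cpu_percent(interval=1.0)
        readings.append(usage)
        writer.writerow([i, usage])

print(f"average CPU usage: {sum(readings) / len(readings):.1f}%")
print(f"peak CPU usage:    {max(readings):.1f}%")
```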

Without PureVideo HD, playing The Chronicles of Riddick (Japan) kept the CPU at 100% the entire time. Enabling PureVideo HD with the GPU at its retail speed (580 MHz) dropped CPU usage to about 80% on average. With other titles using VC-1 content, we saw CPU usage of around 80% without GPU assistance; turning on PureVideo HD brought that down to about 60%. In general, it looks like a 580 MHz NVIDIA GPU is capable of decreasing the load on a Pentium D 830 by about 20 percentage points in any given situation. This isn't a huge drop, but it is definitely significant and can help in those tough situations (like H.264 encoded imports).
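
To put those figures in perspective, the arithmetic behind the "about 20 percentage points" observation is simply the difference between the with- and without-acceleration averages; relative to the CPU-only load, that works out to a 20-25% reduction in CPU work. A quick worked version, using the approximate averages quoted above rather than exact exports from our logs:

```python
# Back-of-the-envelope numbers from the perfmon runs above (approximate
# averages, not exact values from our logs).
cases = {
    "Riddick (H.264)": {"cpu_only": 100.0, "purevideo_hd": 80.0},
    "VC-1 title":      {"cpu_only": 80.0,  "purevideo_hd": 60.0},
}

for name, c in cases.items():
    absolute = c["cpu_only"] - c["purevideo_hd"]    # percentage points saved
    relative = absolute / c["cpu_only"] * 100.0     # percent of CPU-only load
    print(f"{name}: -{absolute:.0f} points ({relative:.0f}% less CPU work)")
```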

We did scan through a number of features like scene selection, bookmarks, and picture-in-picture (viewing multiple tracks at once). All of these features work as they would on a CE player, though interface consistency can be a problem: some menus and features can only be navigated with the keyboard, while others require the mouse.


Our final test takes a look at system level power draw with and without PureVideo HD enabled. We ran our system through a Kill-A-Watt device and watched for maximum power draw over a specific clip of the movie. Since our Kill-A-Watt doesn't record average power, we eyeballed the range and frequency of power levels and came up with a very rough average. We did this for both The Chronicles of Riddick (Japan) and Swordfish. Power draw is in Watts and listed as avg/peak. Idle power is 127W.

System Power Draw (avg/peak, Watts)
                 Riddick    Swordfish
CPU Only         185/193    179/185
PureVideo HD     185/192    175/180
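
Because the Kill-A-Watt only displays instantaneous wattage, the averages in the table were estimated by watching the readout over the clip. A small sketch of that estimate is below; the sample readings are made-up placeholders standing in for what we jotted down, while the 127W idle figure is the real number quoted above.

```python
# Hypothetical wattage readings noted while watching the Kill-A-Watt during
# a clip; these are illustrative numbers, not our actual notes.
IDLE_WATTS = 127  # measured idle draw from above

samples = [183, 186, 185, 188, 184, 192, 185, 186]

avg = sum(samples) / len(samples)
peak = max(samples)

print(f"rough average draw: {avg:.0f} W ({avg - IDLE_WATTS:.0f} W over idle)")
print(f"peak draw:          {peak} W ({peak - IDLE_WATTS} W over idle)")
```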

It is pretty clear that there isn't a real power advantage with PureVideo HD at this point. As more of the pipeline is moved onto the GPU, the specialized hardware could potentially increase the efficiency of the process. That could lead to lower power draw, but for now all we are doing is shifting where the power is consumed.

We won't be able to get our hands on another drive for a while, so although we have ATI cards that feature HDCP, we are unable to compare AVIVO to PureVideo HD at this time. Certainly we are hoping to see a similar level of quality from ATI. Our comparison will be available as soon as we are able to get hardware and drivers.

Comments

  • BigLan - Monday, July 24, 2006 - link

    "Curiously, player vendors seem to be releasing different versions of their software for HD-DVD and Blu-Ray.... Hopefully CyberLink, InterVideo, et al, will merge their player versions at some point in the future, but we aren't sure of the technical reasons that might have required this initial move."

    AFAIK, the BD camp (maybe HD-DVD as well, not sure) does not allow licensees to create a device capable of playing both BD and HD-DVD, which is why there are separate versions planned. This may change if/when one of the large CE makers produces a combo standalone player, but I don't think that either InterVideo or CyberLink can afford to stand up to the licensors and risk having their licenses revoked.
  • DerekWilson - Monday, July 24, 2006 - link

    technically the devices are the drives -- and if I've got 2 drives (one HD and one BD), I don't see the reason why I should need two pieces of software. Different hardware is still required. But I could see this as a reason for initially making two players.
  • bersl2 - Saturday, July 22, 2006 - link

    HDCP is still a trap.
  • DerekWilson - Sunday, July 23, 2006 - link

    hear hear
  • SunAngel - Saturday, July 22, 2006 - link

    First off, Nvidia is doing an excellent job with PureVideo. And like the author commented, PureVideoHD should get better over time.

    However, some of the points in the article are a little alarming. First, HDCP is going to be required across the entire digital chain for resolutions at or above 720p. Second, if one link in the HDCP chain is not authenticated, the resolution will be downsized to 540p. Most current HDCP-enabled widescreen LCD TVs (or at least the ones that are reasonably affordable) can output only as high as 1366x768. Thus, trying to downsize a 1080i/p picture to a 720p resolution is a waste of computing resources. Setting the display adapter to match the resolution of the TV set will regain some computing resources and reduce the load on the processor and GPU. At this point, very few of us, including enthusiasts, have 1080p-enabled sets (I am going to bite my tongue because you can buy "full-sized" TVs with 1080p output for as little as $2500US), so forcing 1080p content onto a 720p display is moot. Third, all 6 series and 7 series NVIDIA GPUs support some sort of HD acceleration. NVIDIA GPUs 7600GT and higher have the sophisticated high-definition de-interlacing and inverse telecine support that complements PureVideoHD; otherwise the CPU will be handling the task, and SSE and 3DNow! extensions should easily handle those functions. Fourth, and the author did mention this, the playback software and PureVideoHD are both still in beta form (well, CyberLink's player is no longer in beta and can be bought on their site for $40US). If past performance with CyberLink's PureVideo-enabled player is any indication of what's ahead, I am sure anxious to enjoy the next upgrade. Overall, the article was good reading. I just feel reviews at this time should be done with what's typically in people's homes. Again, most of us don't own 1080p sets (I will not comment on those that don't even have HD sets), but quite a few of us have 720p sets. In my opinion, a better article would have been to review using 720p output on a 720p HDCP-enabled set. This way we all would have a truer view of what PureVideoHD has in store for us. On a scale of 1 to 10, I give the author a 7 for his piece. Good Luck. Cheers!
  • SunAngel - Saturday, July 22, 2006 - link

    I should clarify my comment on image downsizing. The image will be downsized if the constraint token is used. And from the look of current piracy issues I expect this to come into effect once higher resolution sets become more mainstream and vga is abandoned.
  • DerekWilson - Sunday, July 23, 2006 - link

    Thanks, we will make sure to compare video output over 720p and 1080p in future HD content reviews (and especially in our comparison between NVIDIA and ATI playback).

    Our reasoning for doing the test the way we did was something along the lines of: people buying HD-DVD or BD players and media for the PC right now very likely have a lot of disposable income and probably have no problem dropping the $1800 cost of the 1080p Westinghouse we used.

    Over the next few months as players are available and drop in price, it does make much more sense to look at 720p output as this is a much more mainstream target.

    Thanks for the feedback.

    Derek Wilson
  • Tujan - Tuesday, July 25, 2006 - link

    I find it very strange the cliche of 'disposable cash'.

    When HD-DVD, or BD, is discussed as a 'media choice. Was it really the hollywood set, or the computer set which had derived that the content would be as it is - in-the-media (on disk). Since obviously all of the support electronics consists of components that actually do not exist, or are merely speculation of future itinerary coming for some unknown. Oblivious of 'media. Or media of intention.

    I hope that somebody breaks open those boxes. Just to make sure there isn't an IBM processor in them. One of those 'Core processors or something.

    Certainly isn't the 'media'. It is obviously a 'platform.

    All of the ballyhoo about 'getting-the-equipment up. Ya know. For the most of it, this will always be a 'simulation. Of the real thing. No matter the ideals 'theater mode may take up from specs ballyhooed via 'movie makers. If you throw them something new, they will certainly consider you an old timer. Being there is nothing to compare.
    Being so that I would tend to agree that it is a constant of artificiality, that is actually the status quo.

    Hope there is an alternative to the status quo. To keep speculating of something real. Assuming this 'must be constant.
  • Zaitsev - Saturday, July 22, 2006 - link

    Page 4, 3rd paragraph, 2nd line reads: "with and with a GPU on the D 830"

    I believe it should be "with and without a GPU..."

    I'm really looking forward to the comparison with ATI cards. Interesting article, though.

    Cheers.
  • Pirks - Saturday, July 22, 2006 - link

    Ya, it's a toy review, since no real content is out there. For SERIOUS review you people have to include CoreAVC which kills any CyberLink or whatever and craps on its corpse - I almost can play 1080p videos on my Athlon XP 3200 on Socket A - WITHOUT _ANY_ GPU ACCELERATION. CyberLink and buddies can't even spit close to that. This means I'll buy A64 3800 soon for pennies, pop it in, pop CoreAVC and give nVidia and other boys a big fat finger. Haha - just try CoreAVC yourself, you won't believe your eyes!

    So unless I see comparison of some serious sort, which means including CoreAVC in addition to other big boys - that'd be just another toy review. Move along people.
