NVIDIA's PureVideo HD: HD-DVD Playback on the PC

by Derek Wilson on 7/22/2006 10:00 AM EST


  • Dismal - Tuesday, August 01, 2006 - link

    Potentially dumb question: Do all these graphics cards coming out now have support for all these 16:9 resolutions such as 1920x1080? Documentation showing what resolutions the cards support seems scarce. I only worry because my 6800GT won't touch 16:9 at all. I'm hoping times have changed.
  • skycat - Thursday, July 27, 2006 - link

    I'm a little bit confused here. Do we have to have a HDCP video card in order to play HD-DVD or BD?
    I have a 7800gtx video card, and Dell UltraSharp 2407WFP monitor which supports HDCP. So if I get a HD-DVD rom drive, will I able to play HD-DVD in full resolution via DVI?
  • Renoir - Friday, July 28, 2006 - link

    Based on Derek's answer to my similar question above, the answer appears to be no. The graphics card needs to support HDCP, although if I understand him correctly you will be able to hook up the monitor via VGA and get full resolution. Hope that helps.
  • Clauzii - Tuesday, July 25, 2006 - link

    In what GPU series did nVidia implement the hardware for PureVideo? - Since I think it took a LONG time from then till now, and still drivers are BETA????? I don't get it....
  • DerekWilson - Wednesday, July 26, 2006 - link

    Purevideo works fine and is not beta in current drivers.

    Purevideo HD, which enables playback of HDCP protected content stored on HD-DVD or Blu-ray disks, is currently in beta.

    Since HD-DVD drives and Blu-ray drives have only recently started hitting the market, it isn't surprising that this feature of Purevideo HD is still in development. But Purevideo itself has been production quality for quite some time now. I know it's been at least as long as the 7 series parts have been out, but I think it was available at some point before that. I'd have to go back and check to make sure though.

    Derek Wilson
  • Clauzii - Wednesday, July 26, 2006 - link

    Thanks :)
  • phusg - Tuesday, July 25, 2006 - link

    I didn't see any mention of load on the dual core Pentium, specifically the 'second' core. Is this being used at all? Seems to me that utilizing the second core would be much more advantageous than the 20% decrease from utilizing the GPU.
  • DerekWilson - Wednesday, July 26, 2006 - link

    When we refer to 100% processor usage on a dual core system, we mean 100% of both cores.

    In other words, if one core went unused, we would see usage of about 50%.

    In every case, load was spread fairly evenly across both cores.

    Taking that a step further to put it all together -- smooth HD-DVD playback of H.264 content requires at least 2x 3.0GHz Netburst cores and Purevideo HD on a GPU running at 450MHz or more. Alternatively, a more powerful CPU (or CPUs) could make up for the lack of a GPU, but until we collect more data, we don't know where the crossover point is.
  • Renoir - Wednesday, July 26, 2006 - link

    I'm about to build a friend a budget PC based on the GeForce 6150/430 chipset, which runs at 475MHz, and I would find it funny if it turns out to offer faster H.264 acceleration than the 7800GTX (which another friend has), which runs at 430MHz. Was thinking though, although NVIDIA say that the performance of the video processor is dependent on GPU clock speed, is there any difference between the processor on the 6xxx series as opposed to the 7xxx series?

    Given that I don't game on my PC but am interested in the video performance of GPUs, I must say I prefer the approach NVIDIA is taking more than ATI's, because I don't like the idea of having to buy a high-end GPU just to get good hardware acceleration of video. Having said that, I'm interested to see what effect the move to unified shaders has on AVIVO's video acceleration, because I believe ATI's video acceleration is dependent on the number of pixel pipelines.
  • Renoir - Wednesday, July 26, 2006 - link

    I guess the easiest thing would be to make sure you have a CPU that can decode the highest bit rate H.264 video on the market and consider hardware acceleration a bonus. I am therefore really looking forward to your future articles, which should establish how fast a CPU you need in order to not be dependent on the GPU.
  • BigLan - Monday, July 24, 2006 - link

    "Curiously, player vendors seem to be releasing different versions of their software for HD-DVD and Blu-Ray.... Hopefully CyberLink, InterVideo, et al, will merge their player versions at some point in the future, but we aren't sure of the technical reasons that might have required this initial move."

    AFAIK, the BD camp (maybe the HD-DVD camp as well, not sure) does not allow licensees to create a device capable of playing both BD and HD-DVD, which is why there are separate versions planned. This may change if/when one of the large CE makers produces a combo standalone player, but I don't think that either InterVideo or CyberLink can afford to stand up to the licensors and risk having their licenses revoked.
  • DerekWilson - Monday, July 24, 2006 - link

    technically the devices are the drives -- and if I've got 2 drives (one HD and one BD), I don't see the reason why I should need two pieces of software. Different hardware is still required. But I could see this as a reason for initially making two players.
  • bersl2 - Saturday, July 22, 2006 - link

    HDCP is still a trap.
  • DerekWilson - Sunday, July 23, 2006 - link

    hear hear
  • SunAngel - Saturday, July 22, 2006 - link

    First off, Nvidia is doing an excellent job with PureVideo. And like the author commented, PureVideoHD should get better over time.

    However, some of the points in the article are a little alarming. First, HDCP is going to be required across the entire digital range equal to and greater than 720p. Second, if one link in the HDCP chain is not authenticated the resolution will be downsized to 540p. Most current HDCP-enabled widescreen lcd tvs (or at least the ones that are reasonably affordable) can output only as high as 1366x768. Thus, trying to downsize a 1080i/p resolution picture into a 720p resolution will be a waste of computing resources. Setting the display adapter to match the resolution of the tv set will regain some computing resources and reduce the load on the processor and gpu. At this point, very few of us including enthusiasts have 1080p enabled sets (I am going to bite my tongue because you can buy "full-sized" tvs with 1080p output for as little as $2500US) so forcing 1080p content to show on a 720p display is moot. Third, all 6 series and 7 series Nvidia gpus support some sort of HD acceleration. Nvidia GPUs 7600GT and higher have the sophisticated high-definition de-interlacing and inverse telecine support that is complementary to PureVideoHD, otherwise the CPU will be handling the task. SSE and 3DNow! extentions should easily handle those functions. Fourth, and the author did mention on this, the playback software and PureVideoHD are both still in beta form (well Cyberlink's player is nolonger in beta and can be bought on their site for $40US). If past performance with Cyberlink's player with PureVideo support is any indication of what's ahead, I am sure anxious to enjoy the next upgrade. Overall, the article was good reading. I just fell reviews at this time should be done with what's typically out in people's homes. Again, most of us don't own 1080p sets (I will not comment on those that don't even have HD sets), but quite a few of us have 720p sets. In my opinion, a better article would have been to review using 720p output on a 720p HDCP enable-set. 
This way we all would have a truer view of what PureVideoHD have waiting in store for us. On a scale of 1 to 10 is give the author a 7 for his piece. Good Luck. Cheers!
  • SunAngel - Saturday, July 22, 2006 - link

    I should clarify my comment on image downsizing. The image will only be downsized if the constraint token is used. And from the look of current piracy issues, I expect this to come into effect once higher resolution sets become more mainstream and VGA is abandoned.
  • DerekWilson - Sunday, July 23, 2006 - link

    Thanks, we will make sure to compare video output over 720p and 1080p in future HD content reviews (and especially in our comparison between NVIDIA and ATI playback).

    Our reasoning for doing the test the way we did was something along the lines of -- people buying HD or BD players and media for the PC right now very likely have a lot of disposable income and probably have no problem dropping the $1800 for the cost of the 1080p Westinghouse we used.

    Over the next few months as players are available and drop in price, it does make much more sense to look at 720p output as this is a much more mainstream target.

    Thanks for the feedback.

    Derek Wilson
  • Tujan - Tuesday, July 25, 2006 - link

    I find it very strange the cliche of 'disposable cash'.

    When HD-DVD or BD is discussed as a 'media choice', was it really the Hollywood set, or the computer set, which had decided that the content would be as it is -- in the media (on disc)? Since obviously all of the support electronics consists of components that actually do not exist, or are merely speculation of future itinerary coming for some unknown. Oblivious of 'media. Or media of intention.

    I hope that somebody breaks open those boxes. Just to make sure there isn't an IBM processor in them. One of those 'Core processors or something.

    Certainly isn't the 'media'. It is obviously a 'platform.

    All of the ballyhoo about 'getting-the-equipment up. Ya know. For the most of it, this will always be a 'simulation. Of the real thing. No matter the ideals 'theater mode' may take up from specs ballyhooed via 'movie makers. If you throw them something new, they will certainly consider you an old timer. Being there is nothing to compare.
    Being so that I would tend to agree that it is a constant of artificiality that is actually the status quo.

    Hope there is an alternative to the status quo. To keep speculating of something real. Assuming this 'must be constant.
  • Zaitsev - Saturday, July 22, 2006 - link

    Page 4, 3rd paragraph, 2nd line reads: "with and with a GPU on the D 830"

    I believe it should be "with and without a GPU..."

    I'm really looking forward to the comparison with ATI cards. Interesting article, though.

  • Pirks - Saturday, July 22, 2006 - link

    Ya, it's a toy review, since no real content is out there. For SERIOUS review you people have to include CoreAVC which kills any CyberLink or whatever and craps on its corpse - I almost can play 1080p videos on my Athlon XP 3200 on Socket A - WITHOUT _ANY_ GPU ACCELERATION. CyberLink and buddies can't even spit close to that. This means I'll buy A64 3800 soon for pennies, pop it in, pop CoreAVC and give nVidia and other boys a big fat finger. Haha - just try CoreAVC yourself, you won't believe your eyes!

    So unless I see comparison of some serious sort, which means including CoreAVC in addition to other big boys - that'd be just another toy review. Move along people.
  • Delerue - Friday, September 08, 2006 - link

    I agree with Pirks. It's more about the codec than the system power. I have a Sempron 3000+ that can handle any 1080p video (WMV9 or H.264) without any GPU optimization (indeed I have a X800 XL). I think that CoreAVC is really the best codec available to decode H.264; the difference compared to the others is really huge. Try to run this video without CoreAVC and then with it (unfortunately you have to pay to get the CoreAVC codec, but I think it's worth every cent): http://www.apple.com/trailers/imax/imaxdeepsea3d/h... (1080p version, indeed). After that, try this WMV9 with the Windows default codecs (not FFDshow): http://outerspace.terra.com.br/videos/callofduty3_... (along with the H.264 above, it's one of the heaviest videos I've ever seen). You'll see that you don't need a high end machine to run 1080p videos.

    BTW, in this article here the author said that ATI can do a better job than nVidia when we're talking about 2D. And it's not only about the image quality, but performance too. He said that Purevideo seems to be more a name than a system performance helper:


    BTW, I liked your article. Well written, clear, and right to the point. But I think you forgot to say that Windows Media Player 10 has an optimization patch to run WMV9 videos faster. Look here: http://support.microsoft.com/kb/888656

    So, we're waiting for the ATI time. ;)

    P.S.: sorry for my bad english.
  • ChronoReverse - Saturday, July 22, 2006 - link

    Even more interesting is that CoreAVC is going to have GPU acceleration soon too. Here we have a decoder that when not in multi-threaded mode beats out both multi-threaded (on multiple cores) and GPU-assisted decoders.

    And because h.264 is bit-identical for all decoders, this means CoreAVC is doing something really right.
  • Pirks - Sunday, July 23, 2006 - link

    Exactly. Since CoreAVC craps on dead corpses of all the other codecs EVEN including ffdshow (jeez, I couldn't believe my eyes when I saw this!) and all of this WITHOUT GPU ACCELERATION, I don't even wanna think what's gonna happen when CoreAVC gets some boost from say 7800GS on my AGP mobo. I'll be watching 1080p videos on my 3 year old Socket A rig!! Woot! And all the dualcore fanatics can eat their fancy useless dualcores, hehe :-))
  • DerekWilson - Sunday, July 23, 2006 - link

    We will absolutely be reviewing multiple playback techniques when we have a drive for more than a day.

    The problem isn't 1080p content, as PowerDVD has no problem with 1080p American content (non-H.264), but we will be very interested in seeing the capability of other players to decode higher bit rate video encoded with H.264.

    This is a very first glimpse of the current HD media playback capabilities of the PC, so please expect more as soon as we are able to get our hands on it.
  • ChronoReverse - Sunday, July 23, 2006 - link

    To be clear: CoreAVC is a commercially available H.264 decoder. Its claim to fame is being able to decode H.264 using less CPU power than any other publicly available decoder, multi-threaded or not, GPU-assisted or not.
  • bob661 - Monday, July 24, 2006 - link

    So since US movies won't have H.264 encoding, this codec is irrelevant for US consumers, correct?
  • DerekWilson - Monday, July 24, 2006 - link

    actually, i end up importing a bunch of japanese titles, so it does end up affecting me. also, we will have to look and see if there is any quality difference between the same movie encoded in h.264 and vc-1 / mpeg2 or whatever ... especially because the h.264 encodings are done in a higher bitrate as well.
  • ChronoReverse - Monday, July 24, 2006 - link

    Typically, using a higher efficiency codec like VC-1 and H.264 implies a lower bitrate but equal perceived quality. That's why a single layer bluray disc would hold 2 hours with MPEG2 but about 4 hours with H.264 and VC-1. It's strange that your discs would be encoded with a higher bitrate compared to the MPEG2 versions (unless those were the DVD versions?)

    In any case, it's not like 1080p MPEG2 is really that relevant when it's WMV9/VC-1 and H.264 decoding that's interesting. We've had MPEG2-assist for a long time and any modern CPU should be able to decode it.
  • bobsmith1492 - Saturday, July 22, 2006 - link

    What exactly do the videos show? The one with Purevideo looks just like the one without... was the first one a bit choppy or something?
  • DerekWilson - Sunday, July 23, 2006 - link

    yes, the one without purevideo is choppy. if you look closely at the logo and the scene where the faces are rotating, you can see the stuttering.

    as we said in the article, this looked much worse in person and rendered the movie unwatchable.
  • yzkbug - Saturday, July 22, 2006 - link

    I have a $5K question (the cost of a new TV) unanswered by this article. Do you absolutely need an HDCP-enabled TV to watch HD movies on PC? The slide on page 2 shows that a monitor can be connected either via Analog (VGA or Component) or via Digital (DVI or HDMI with HDCP). So, does it mean that it is possible to watch HD over Analog without any PQ degradation? Also, does it mean that DVI without HDCP is a no-go?
  • Bowsky - Saturday, July 22, 2006 - link

    HDCP is only required to view HD content if an Image Constraint Token (ICT, I think that's its name) is present on the disc. If it's not there, the media can be played at full resolution over any connection, such as non-HDCP DVI, VGA, component, etc.

    To answer your $5,000 question, the movie companies have decided to wait until 2010 before using the ICT on any media. After that all media will be down-scaled if played over non-HDCP connections. So my answer to you is buy the HDCP television set. It won't be required immediately, but will unfortunately be required in the near future.

    Also, most new HDTVs on sale these days have HDCP, so there shouldn't be too much to worry about when buying.
  • DerekWilson - Sunday, July 23, 2006 - link

    Our understanding of the situation is that any DIGITAL playback requires HDCP or no image will be displayed (under current PC video player technology -- downscaling may be possible in the future).

    Currently all titles will be able to play full resolution Analog (component, vga), but in the future this will not be allowed either.

    non-HDCP DVI and non-HDCP HDMI will never playback full resolution HD content distributed on HD media with HDCP protection enabled. This is essentially all titles.

    If you want digital playback of HD-DVDs or BDs, you can't do it without an HDCP television. If you don't mind analog playback, you're fine for the next few years.
  • Renoir - Sunday, July 23, 2006 - link

    Just so I understand you, Derek: are you saying that full resolution playback over a digital connection will not be allowed regardless of whether the image constraint token is used, and that full res will only be allowed over analogue until the ICT is used? If so, that sucks big time. I would then have to hook up my monitor via both VGA and DVI depending on whether I'm watching an HD disc or not (assuming, of course, I still have my current monitor by the time I watch HD discs).
  • DerekWilson - Sunday, July 23, 2006 - link

    this is the way I understand it.
  • Renoir - Monday, July 24, 2006 - link

    Bummer! Was hoping the lack of the ICT would allow me to use DVI at full res. I imagine it's because they're more worried about people getting a perfect digital copy than capturing the analogue at full res and then converting it to digital. However, I'm not aware of any DVI capture devices, although there are plenty of component ones. Does anyone know of any hi-res digital capture devices, as I'm curious now :-)
  • DerekWilson - Monday, July 24, 2006 - link

    we wanted to build one to analyse video output of graphics cards without relying on screen capture utilities ... it shouldn't really be that difficult.
  • Renoir - Monday, July 24, 2006 - link

    Cool! Perhaps in terms of piracy they feel that it's less necessary to protect the full res analogue video than a bit-for-bit accurate DVI feed. If so, then they must be thinking that people would find pirated videos that were redigitised from component etc. (albeit at full res) less compelling than ones straight from DVI. What other reason(s) do they have for allowing full res over analogue but not over digital?
  • Renoir - Monday, July 24, 2006 - link

    Just realised I haven't taken any analogue copy protection, such as Macrovision, into account. Any info on this aspect? If it's present, then that would pretty much answer my last question.
  • vhx500 - Saturday, July 22, 2006 - link

    On page 2, you mention playing Riddick and Swordfish, but you are displaying a screenshot from The Bourne Supremacy?
  • DerekWilson - Sunday, July 23, 2006 - link

    The only reason we looked at Bourne Supremacy was for the features. None of the other titles we tested offered the picture in picture capability.

    And, actually, this feature is still a bit buggy with the latest PowerDVD -- audio in the window was out of sync.

    Performance and power draw of Bourne Supremacy should be on par with Swordfish.
  • DigitalFreak - Saturday, July 22, 2006 - link


    "We won't be able to get our hands on another drive for a while, so although we have ATI cards that feature HDCP, we are unable to compare AVIVO to PureVideo HD at this time."

    Couldn't you just pull the Nvidia card out and put the ATI card in and run tests, or weren't you allowed to open the box?
  • DerekWilson - Sunday, July 23, 2006 - link

    Really, it was more of a time issue. NVIDIA was only here for a few hours, and we didn't have that much time. It would have taken much longer to actually do anything useful with the ATI card we have.

    Then there is the issue of driver and player support. It isn't certain that we would even have been able to get the ATI card working with the system and the beta version of PowerDVD. And I don't know if ATI enables HDCP video decode acceleration in their current drivers (though I doubt it). I think we would have been able to get video through HDCP, but I seriously doubt it would have been a fair comparison of real-world video acceleration capability.

    Derek Wilson
  • PrinceGaz - Saturday, July 22, 2006 - link

    First post!
  • DigitalFreak - Saturday, July 22, 2006 - link

    STFU
