NVIDIA's PureVideo Driver and Encoder

There are two parts to the software side of PureVideo: the GPU driver and the PureVideo DVD decoder. The driver is simply a version of the ForceWare 67.01 driver, while the PureVideo DVD decoder is the latest update to NVIDIA's NVDVD decoder, version 1.00.65. The GPU driver is, of course, freely available to the public, while the PureVideo DVD decoder sells for $19.99 due to associated royalties. The decoder is available as a 30-day free trial from NVIDIA's website.

The PureVideo DVD decoder installs just like any other application and has a control panel associated with it. You can only access the control panel while the decoder is in use (e.g. watching a DVD) or through a media player that exposes it directly (e.g. Zoom Player). The control panel has only a few options, yet it is unnecessarily complicated.

The main options you'll want to adjust are the de-interlacing options, but unfortunately NVIDIA included two separate de-interlacing controls in the decoder's control panel that will undoubtedly confuse users.

The first control is marked De-interlace Control and has the following options: Automatic, Film, Video and Smart. Automatic mode simply uses the DVD flags to determine what the source is and applies the appropriate algorithms based on the flags.

The Film and Video modes tell the DVD decoder to treat all content as 24 fps or 30 fps content respectively. Smart mode is the option you'll want to set and it uses both flags as well as NVIDIA's own algorithms to determine the best de-interlacing to apply.
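The decision logic behind these four modes can be sketched roughly as follows. This is an illustrative reconstruction, not NVIDIA's actual code, and the function and flag names are hypothetical: DVD streams carry flags that mark content as film-originated (24 fps, telecined) or video-originated (30 fps, interlaced), and a "smart" mode can fall back to its own cadence analysis when those flags are unreliable.

```python
# Hypothetical sketch of an Automatic/Film/Video/Smart mode selector.
# "progressive_flag" stands in for the stream's film/video flags;
# "cadence_looks_like_film" stands in for the decoder's own analysis.
def choose_strategy(mode, progressive_flag, cadence_looks_like_film):
    if mode == "film":
        return "inverse-telecine"   # treat everything as 24 fps film
    if mode == "video":
        return "deinterlace"        # treat everything as 30 fps video
    if mode == "automatic":
        # Trust the DVD flags alone.
        return "inverse-telecine" if progressive_flag else "deinterlace"
    if mode == "smart":
        # Use the flags, but also fall back to cadence analysis
        # when the flags miss film content.
        if progressive_flag or cadence_looks_like_film:
            return "inverse-telecine"
        return "deinterlace"
    raise ValueError(f"unknown mode: {mode}")
```

Film content gets inverse telecine (reassembling the original progressive frames), while true video content needs genuine de-interlacing; the point of a smart mode is catching film that the flags mislabel.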

Then we have the De-interlace Mode control which has the following options: Best available, Display fields separately and Combine fields.

Display fields separately and Combine fields force bob and weave, respectively, regardless of content.
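For reference, bob and weave are the two classic de-interlacing techniques. A minimal sketch, treating each field as a list of scanlines (real implementations interpolate between lines for bob rather than duplicating them):

```python
def weave(top_field, bottom_field):
    """Combine fields: interleave the two fields line by line
    into one full-height frame. Ideal for static (film) content,
    but moving objects show comb artifacts."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

def bob(field):
    """Display fields separately: expand one field to full height
    by line doubling. No combing, but half the vertical resolution."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)  # naive duplication; real bob interpolates
    return frame
```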

Best available is the option you'll want to use for the best image quality, as it uses NVIDIA's per-pixel adaptive de-interlacing algorithms. So the combination you'll want is Smart mode with the Best available setting. NVIDIA included the other options for the tweakers in all of us; however, we'd much rather see a single control, or at least something more intuitive than what NVIDIA has put together right now.
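NVIDIA doesn't document its per-pixel algorithm, but motion-adaptive de-interlacing in general works like the sketch below: for each pixel on a missing scanline, weave in the other field's pixel where the image is static, and interpolate vertically (bob) where motion is detected. The function and its parameters are purely illustrative.

```python
def adaptive_pixel(above, below, other_field_pixel, prev_field_pixel,
                   threshold=10):
    """Reconstruct one pixel on a missing scanline.

    above/below: pixels from the current field's adjacent lines.
    other_field_pixel: the opposite field's pixel at this position.
    prev_field_pixel: the same position one frame earlier, used as a
    crude motion detector (real algorithms use richer measurements).
    """
    if abs(other_field_pixel - prev_field_pixel) <= threshold:
        return other_field_pixel      # static area: weave (full detail)
    return (above + below) // 2       # motion: interpolate (no combing)
```

Doing this per pixel is what lets static backgrounds keep full resolution while moving objects avoid comb artifacts, which is why it beats forcing either bob or weave globally.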

62 Comments

  • mcveigh - Tuesday, December 21, 2004 - link

    nserra: where do you get your info on mpg4 from?
  • tfranzese - Tuesday, December 21, 2004 - link

    I guess giving you the hardware features now means you should be entitled to the software to take advantage of it as well, huh?

    Those of you who think you're entitled to nVidia's DVD decoder are living in la la land. So now they should also give you the games that take advantage of their technology/features too?
  • Koing - Tuesday, December 21, 2004 - link

    Just don't buy an Nvidia card next time if you feel ripped off.

You paid 'expecting' a feature to work 'fully', not half-'assed' and 8 months late.

    Protest with your WALLETS.

    <-- never an early graphics adopter!

    Still with my Nvidia GF4Ti4200 :P

    Koing
  • nserra - Tuesday, December 21, 2004 - link

    http://www.ati.com/vortal/videoinnovations/flash/i...

    #29 VIRGE
ATI has had MPEG4 encoding and decoding since 18/07/2002, so what do you want?

S3 Omni is better at video than ATI or NVIDIA; the problem is the rest.... even so, too bad there isn't an S3 in the tests.

I don't know why NVIDIA doesn't charge a buck or two for each chip/card it makes, so the feature gets paid for.

Unless this encoder software that they are charging for works on any card on the market; if that's the case, they are right in doing so.
  • LoneWolf15 - Tuesday, December 21, 2004 - link

    Anand,

I think you would do your readers a great service by retesting all of this on an AMD processor. Your Pentium 4 rig had Hyper-Threading on, making this issue look far less serious than it is. Hyper-Threading P4 chips report far lower CPU utilization numbers than even high-end Athlon 64 processors. So those of us with a GeForce 6800 who are running high-end AMD hardware are getting 70-100% CPU utilization in WMV9. It explains why a lot of us are having trouble trusting nVidia now, and in the future.
  • atssaym - Tuesday, December 21, 2004 - link

    NVDVD =/= NVIDIA DVD Decoder, they are TWO DIFFERENT THINGS

    link
    http://nvidia.com/object/decoder_faq.html#othernv

    Will this product work with NVDVD and ForceWare Multimedia?
    The new software will upgrade the decoders in ForceWare Multimedia 3.0 to access the new features in the NVIDIA DVD Decoder. The NVIDIA DVD Decoder will not upgrade NVDVD 2.0 as it was a separate application based upon an old architecture.

    Decoder Homepage
    http://nvidia.com/object/dvd_decoder.html
    NVDVD Homepage
    http://nvidia.com/page/nvdvd.html
  • pataykhan - Tuesday, December 21, 2004 - link

    Nice comparison of ATI and NVIDIA

But still one thing is missing. I use CyberLink PowerDVD for watching DVDs. My question is: if I have the NVIDIA or ATI DVD decoder, should I use the hardware-accelerated decoder or should I stick with the CyberLink decoder?

CyberLink has implemented a smart software deinterlacer and image quality enhancer (CLEV-2). So I would like to see the image quality differences between the hardware and software decoders available.
  • bigpow - Tuesday, December 21, 2004 - link

    This sucks...

    ATI doesn't charge $20 for the claimed decoder/accelerator...

    And they have the ATI HD component dongle too!
  • segagenesis - Tuesday, December 21, 2004 - link

Bah, I would be kind of pissed if I had a high-end $500 card only to find out now it's worse at decoding video than a sub-$200 card. A bit of shame on nVidia.
  • Guspaz - Monday, December 20, 2004 - link

Oh, as a clarification of my previous comment (#41): I meant that I was under the impression that the DVD decoding software normally handles the de-interlacing; obviously PureVideo overrides that. But what I mean is, on ATI cards, without any type of hardware de-interlacing, wouldn't the de-interlacing differ from decoder to decoder?
