DVD Playback Quality

Now that we've laid out the background information, it's time to look at DVD playback quality. Although NVIDIA provided us with around 700MB of test data, we took it upon ourselves to put together our own test suite for image quality comparisons. We used some tests that have been used in the home theater community as de-interlacing benchmarks, as well as others that we found to be particularly good measures of image quality.

For all of our quality tests we used Zoom Player Pro, quite possibly one of the most feature-filled media players available.

Our first set of tests comes from the Secrets of Home Theater and High Fidelity test suite. The Galaxy Quest theatrical trailer isn't flagged at all, so it relies entirely on the DVD decoder's algorithms for proper de-interlacing. The default image below is ATI's X700 Pro; mouse over it to see NVIDIA's PureVideo-enabled 6600GT:



Hold mouse over image to see NVIDIA's Image Quality

NVIDIA offers a huge advantage here: the interlacing artifacts present in the ATI image are nowhere to be found in the NVIDIA image.

Next up, we have The Making of Apollo 13 documentary from the Apollo 13 DVD. Oftentimes, bonus materials on DVDs aren't properly encoded and trip up DVD decoders; let's see how ATI and NVIDIA fare here. The default image below is ATI; mouse over the image to see NVIDIA.




NVIDIA once again takes the lead here; notice the combing artifacts on the man's suit coat in the ATI image, which are not present with NVIDIA's solution.

Our final test is from the Making of The Big Lebowski documentary on The Big Lebowski DVD. The scene shows "The Jesus" licking a bowling ball. First, let's have a look at what the scene is supposed to look like just before it transitions to another frame:

Now let's have a look at how ATI and NVIDIA display the scene:




Neither ATI nor NVIDIA passes The Big Lebowski test, so what went wrong here? The correct image above was generated by using a software decoder (DScaler 5) and forcing "bob" de-interlacing, which uses none of the data from the next field when constructing the current frame. This works because this particular scene causes most DVD decoders to incorrectly weave together two fields from vastly different scenes, resulting in the artifacts seen above. It's quite disappointing that neither ATI nor NVIDIA is able to pass this test, as this is one of the most visible artifacts of poor de-interlacing quality.
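The difference between the two de-interlacing approaches can be sketched in a few lines of code. This is a minimal illustrative sketch, not any decoder's actual implementation; it assumes fields are stored as 2-D grayscale arrays, and the simple line-doubling shown here is only the most basic form of bob:

```python
import numpy as np

def weave(top_field, bottom_field):
    """Interleave two fields into one frame. If the fields come from
    different scenes (as in the Big Lebowski transition), the result
    shows combing artifacts."""
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines come from the top field
    frame[1::2] = bottom_field   # odd lines come from the bottom field
    return frame

def bob(field):
    """Build a full frame from a single field by line doubling,
    ignoring the other field entirely -- no cross-scene combing,
    at the cost of half the vertical resolution."""
    return np.repeat(field, 2, axis=0)
```

A good adaptive de-interlacer weaves where the fields match and falls back to bob-style interpolation where they don't, which is exactly the decision both decoders get wrong in this scene.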

Comments (62)

  • mcveigh - Tuesday, December 21, 2004 - link

    nserra: where do you get your info on mpg4 from?
  • tfranzese - Tuesday, December 21, 2004 - link

    I guess giving you the hardware features now means you should be entitled to the software to take advantage of it as well, huh?

    Those of you who think you're entitled to nVidia's DVD decoder are living in la la land. So now they should also give you the games that take advantage of their technology/features too?
  • Koing - Tuesday, December 21, 2004 - link

    Just don't buy an Nvidia card next time if you feel ripped off.

    You paid 'expecting' a feature to work fully, not to be half-'assed' in 8 months late.

    Protest with your WALLETS.

    <-- never an early graphics adopter!

    Still with my Nvidia GF4Ti4200 :P

    Koing
  • nserra - Tuesday, December 21, 2004 - link

    http://www.ati.com/vortal/videoinnovations/flash/i...

    #29 VIRGE
    ATI has had MPEG4 encoding and decoding since 18/07/2002, so what do you want?

    S3 Omni is better at video than ATI or NVIDIA; the problem is everything else. Even so, too bad there isn't an S3 in the tests.

    I don't know why NVIDIA doesn't charge a buck or two for each chip/card it makes, so the feature gets paid for.

    Unless this encoder software that they are charging for works on any card on the market; if that's the case, they are right in doing so.
  • LoneWolf15 - Tuesday, December 21, 2004 - link

    Anand,

    I think you would do your readers a great service by retesting all of this on an AMD processor. Your Pentium 4 rig had Hyper-Threading on, making this issue look far less serious than it is. Hyper-Threaded P4 chips report far lower CPU utilization numbers than even high-end Athlon 64 processors. So those of us with a GeForce 6800 running high-end AMD hardware are getting 70-100% CPU utilization in WMV9. It explains why a lot of us are having trouble trusting nVidia now, and in the future.
  • atssaym - Tuesday, December 21, 2004 - link

    NVDVD =/= NVIDIA DVD Decoder, they are TWO DIFFERENT THINGS

    link
    http://nvidia.com/object/decoder_faq.html#othernv

    Will this product work with NVDVD and ForceWare Multimedia?
    The new software will upgrade the decoders in ForceWare Multimedia 3.0 to access the new features in the NVIDIA DVD Decoder. The NVIDIA DVD Decoder will not upgrade NVDVD 2.0 as it was a separate application based upon an old architecture.

    Decoder Homepage
    http://nvidia.com/object/dvd_decoder.html
    NVDVD Homepage
    http://nvidia.com/page/nvdvd.html
  • pataykhan - Tuesday, December 21, 2004 - link

    Nice comparison of ATI and NVIDIA

    But one thing is still missing. I use CyberLink PowerDVD for watching DVDs. My question is: now that I have the NVIDIA or ATI DVD decoder, should I use the hardware-accelerated decoder or stick with the CyberLink decoder?

    CyberLink has implemented a smart software de-interlacer and image quality enhancer (CLEV-2), so I would like to see the image quality differences between the hardware and software decoders available.
  • bigpow - Tuesday, December 21, 2004 - link

    This sucks...

    ATI doesn't charge $20 for the claimed decoder/accelerator...

    And they have the ATI HD component dongle too!
  • segagenesis - Tuesday, December 21, 2004 - link

    Bah, I would be kind of pissed if I had a high-end $500 card only to find out now that it's worse than a sub-$200 card at decoding video. A bit of shame on nVidia.
  • Guspaz - Monday, December 20, 2004 - link

    Oh, as a clarification of my previous comment (#41): I was under the impression that the DVD decoding software normally handles the de-interlacing; obviously, PureVideo overrides that. But what I mean is, on ATI cards, without any kind of hardware de-interlacing, wouldn't the de-interlacing differ from decoder to decoder?
