Final Words

We excluded a few of the tests from our image quality comparisons, simply because they were more focused on noise reduction and both ATI and NVIDIA did equally poorly there. Instead, we decided to focus on cadence detection and de-interlacing quality, using the tests from the previous pages.

Let's first look at how all of the numbers tally up for ATI and NVIDIA:

| Test                             | NVIDIA PureVideo | ATI Avivo |
|----------------------------------|------------------|-----------|
| Color Bar/Vertical Detail        | 5                | 5         |
| Jaggies Pattern 1                | 3                | 3         |
| Jaggies Pattern 2                | 3                | 0         |
| Flag                             | 5                | 5         |
| Picture Detail                   | 0                | 0         |
| Noise Reduction                  | 0                | 0         |
| Motion Adaptive Noise Reduction  | 0                | 0         |
| 3:2 Detection                    | 10               | 10        |
| Film Cadence 2:2                 | 0                | 0         |
| Film Cadence 2:2:2:4             | 0                | 0         |
| Film Cadence 2:3:3:2             | 5                | 0         |
| Film Cadence 5:5                 | 0                | 0         |
| Film Cadence 6:4                 | 0                | 0         |
| Film Cadence 8:7                 | 0                | 0         |
| Film Cadence 3:2                 | 5                | 5         |
| Scrolling Text (Horiz)           | 5                | 5         |
| Scrolling Text (Vert)            | 10               | 5         |
| Total                            | 51               | 38        |
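
The totals are simply unweighted sums of the per-test scores. As a sanity check, here is a minimal Python sketch that reproduces the tally; it is purely illustrative (the test names and point values come straight from the table above, while the HQV benchmark itself is of course scored by eye, not by software):

```python
# Tally of the subjective per-test scores from the table above.
# Each entry maps a test name to (NVIDIA PureVideo score, ATI Avivo score).
scores = {
    "Color Bar/Vertical Detail":       (5, 5),
    "Jaggies Pattern 1":               (3, 3),
    "Jaggies Pattern 2":               (3, 0),
    "Flag":                            (5, 5),
    "Picture Detail":                  (0, 0),
    "Noise Reduction":                 (0, 0),
    "Motion Adaptive Noise Reduction": (0, 0),
    "3:2 Detection":                   (10, 10),
    "Film Cadence 2:2":                (0, 0),
    "Film Cadence 2:2:2:4":            (0, 0),
    "Film Cadence 2:3:3:2":            (5, 0),
    "Film Cadence 5:5":                (0, 0),
    "Film Cadence 6:4":                (0, 0),
    "Film Cadence 8:7":                (0, 0),
    "Film Cadence 3:2":                (5, 5),
    "Scrolling Text (Horiz)":          (5, 5),
    "Scrolling Text (Vert)":           (10, 5),
}

nvidia_total = sum(nv for nv, _ in scores.values())
ati_total = sum(ati for _, ati in scores.values())
print(f"NVIDIA PureVideo: {nvidia_total}, ATI Avivo: {ati_total}")  # 51, 38
```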

These subjective scores pretty well summarize our experience with ATI's Avivo so far. Neither ATI nor NVIDIA produced a perfect solution, but at this point Avivo is definitely a step behind NVIDIA's PureVideo in terms of de-interlacing quality.

We will be keeping tabs on ATI's Avivo as its remaining, and arguably more exciting, features get implemented in later driver revisions. For now, be sure to read our technology and gaming performance coverage on ATI's Radeon X1000 line.

20 Comments

  • ST - Thursday, October 6, 2005 - link

    Any chance you can get 1080i deinterlacing tests in the future? 480i source material is fine, but with OTA HDTV widely available now, and the 7800gt/gtx line flaunting HD spatial-temporal deinterlacing, I'm sure this is what most readers want to know about.
  • ksherman - Wednesday, October 5, 2005 - link

    I installed PureVideo, but what players actually take advantage of it?
  • rbV5 - Wednesday, October 5, 2005 - link

    It's nice to see detailed looks into a vastly overlooked area of video card performance. Kudos for using a standard to measure by; now, if we see more of this type of scrutiny from more reviewers, perhaps we'll actually get to see these features enabled rather than reading about how great it's going to be some day.

    Now lets take a good look at connectivity, custom resolution support, 1:1 pixel mapping, codec support......
  • LoneWolf15 - Wednesday, October 5, 2005 - link

    quote:

    Despite what ATI told us at our Avivo briefing last month (although ATI insists it was a miscommunication), H.264 decode acceleration is not launching alongside the R5xx GPUs. ATI is committed to bringing both H.264 decode acceleration and transcode assist by the end of the year, but for now, we have no way of testing those features.
    nVidia already fooled me with this once. They called it PureVideo and I bought a Geforce 6800 AGP and waited eagerly for driver support that never came for hardware decode of HD WMV files (or hardware encode of MPEG), because the NV40/45 design was borked. nVidia left every single user that bought an NV40/45 card in the lurch. No recourse. So everyone who bought one with the hope of using PureVideo was screwed.

    Not making that mistake again with any company. If a feature isn't supported at the time I purchase a product, that feature doesn't exist. I'm not going to believe press releases anymore, seeing as touted features can be revoked if drivers or hardware don't work out right. Never again.

    Note: I now own an ATI X800XL, and have nothing against ATI or nVidia other than that I'm too cynical to believe either of them on any feature until I see that feature in action.
  • Lifted - Wednesday, October 5, 2005 - link

    I was thinking the exact same thing. Never again will I buy something that will have features added at a later date. This is just a marketing tactic because they already know the hardware won't handle what they promised.
  • Patman2099 - Wednesday, October 5, 2005 - link

    Is it just me, or is there no mention in the article of what de-interlacing option they used on the ATI board?

    You can change it in CCC; I've found that Adaptive looks best on my Radeon 9700.

    Which de-interlacing mode was used?
  • Anand Lal Shimpi - Wednesday, October 5, 2005 - link

    I just amended the article to include this information:

    "Both the ATI and NVIDIA drivers were set to auto-detect what de-interlacing algorithm the hardware should use. We found that this setting yielded the best results for each platform in the HQV benchmark."

    If I forced the adaptive or motion adaptive settings, some of the HQV tests did worse, while none improved in image quality.

    Take care,
    Anand
  • user loser - Wednesday, October 5, 2005 - link

    Am I the only one that thinks the NV version of "De-Interlacing Quality: Vertical Detail" (page 3) is worse? Some of the red/green alternating lines are completely green or lose detail.

    Compare to the original:
    http://www.belle-nuit.com/testchart.html
    (720 x 486, NTSC)

    And how often do the different film cadence modes really get used? (Yet they get the same number of points (weight) as some more elementary tests.) And I can't tell the functional difference between ATI/NV in the second image on page 9, "De-Interlacing Quality - Mixed 3:2 Film With Added Video Titles".

    Or are the reasons for these differences only visible in moving video?

  • TheSnowman - Wednesday, October 5, 2005 - link

    [quote] And I can't tell the functional difference between ATI/NV in the second image in page 9 "De-Interlacing Quality - Mixed 3:2 Film With Added Video Titles".

    Or are the reasons for these differences only visible in moving video?[/quote]
    Nah, de-interlacing artifacts would always turn up in a proper still framegrab and be easier to see that way as well, but I can't see any de-interlacing artifacts on any of the shots that are claimed to have such issues, so I'm at a loss to understand the author's conclusions on that page. The first ATI shot does show some nasty compression for some reason or another, but I don't see any interlacing issues in the shots on that page from either ATI or Nvidia.
  • Anand Lal Shimpi - Wednesday, October 5, 2005 - link

    It's tough to see here, but those are actually supposed to be interlacing artifacts. They appear as compression artifacts here, but in motion you get a very clear lined pattern.

    Take care,
    Anand
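
For readers wondering what that "lined pattern" looks like, here is a minimal illustrative sketch (not from the article or its authors) of how naively weaving two fields captured at different instants produces the combing artifact on a moving object:

```python
# Illustrative only: why "weave" de-interlacing of a moving object produces a
# lined/combing pattern. Two fields are captured 1/60 s apart; stitching them
# back into one frame leaves the object's edges offset on alternating scanlines.

WIDTH, HEIGHT = 16, 8

def render_frame(x_pos):
    """Render a full frame with a 4-pixel-wide block ('#') at column x_pos."""
    return [
        ["#" if x_pos <= x < x_pos + 4 else "." for x in range(WIDTH)]
        for _ in range(HEIGHT)
    ]

# Even lines are sampled while the block is at x=2; odd lines 1/60 s later,
# after the block has moved to x=6 (hypothetical motion for illustration).
frame_a = render_frame(2)
frame_b = render_frame(6)

# Weave: even scanlines from the first instant, odd scanlines from the second.
woven = [frame_a[y] if y % 2 == 0 else frame_b[y] for y in range(HEIGHT)]

for row in woven:
    # Alternating rows show the block in two positions -> comb/"lined" artifact.
    print("".join(row))
```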
