Late last month, ATI announced its Avivo platform, which would become ATI's new baseline for overall picture and video quality on the PC.  The problem with last month's launch was that, without any R5xx GPUs, the Avivo platform amounted to nothing more than ATI's Theater 550 TV tuner, which is old news by this point.  Luckily, today we have ATI's Radeon X1800, X1600 and X1300 GPUs, all of which are Avivo-compliant, so we can begin actually testing the features of Avivo.  Well, not exactly.

Despite what ATI told us at our Avivo briefing last month (although ATI insists that it was a miscommunication), H.264 decode acceleration is not launching alongside the R5xx GPUs.  ATI is committed to bringing both H.264 decode acceleration and transcode assist by the end of the year, but for now, we have no way of testing those features. Update: Just to clarify, the R5xx GPUs do feature hardware support for H.264 acceleration. We have seen the decode acceleration in action on an X1800 twice, once at Computex and once at ATI's Avivo briefing in NYC. What ATI does not yet have ready is driver and application support for the acceleration, which we are hearing will be ready sometime in November, or at least by the end of the year.

We've already looked at the capture and encoding aspects of Avivo with the Theater 550, which leaves Avivo's 10-bit display pipeline, Xileon TV encoder, dual-link DVI, and ATI's enhanced de-interlacing/video scaling.  We're holding off on testing the Xileon TV encoder until we get a component dongle for the cards.

Two of the aforementioned features are easy to talk about, especially now that we have ATI's solutions in house.  For starters, the 10-bit display pipeline is truly difficult to quantify, much less demonstrate as a noticeable advantage that the R5xx GPUs offer over their predecessors in normal usage.  While there is undoubtedly some advantage, during our short time with the cards focused on Avivo testing, we weren't able to discern it. 

The next feature that's easy to talk about is the R5xx's integrated dual-link TMDS transmitter(s).  As we mentioned in our original Avivo preview, this means that any R5xx GPU should be able to support current and upcoming high-resolution LCD monitors, such as Apple's 30" Cinema Display.  It is up to the board manufacturer to decide how many dual-link DVI ports are placed on a specific board, but the GPU should support a minimum of one dual-link DVI port. 

The Radeon X1800 series will support up to two dual-link DVI ports, while the X1600 and X1300 will support up to one dual-link and one single-link port. 
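As a back-of-the-envelope check of why these 30" panels need dual-link DVI in the first place, consider the pixel clock involved. The sketch below assumes the 165 MHz single-link ceiling from the DVI 1.0 spec and approximate CVT reduced-blanking frame totals for 2560 x 1600 at 60 Hz; these figures are not from the article itself.

```python
# Rough check of why 2560x1600 exceeds a single DVI link.
# A single TMDS link tops out at a 165 MHz pixel clock (DVI 1.0 spec).
SINGLE_LINK_MAX_MHZ = 165.0

# CVT reduced-blanking timing for 2560x1600 @ 60 Hz transmits roughly
# 2720 x 1646 total pixels per frame (active area plus blanking).
total_pixels_per_frame = 2720 * 1646
refresh_hz = 60

pixel_clock_mhz = total_pixels_per_frame * refresh_hz / 1e6
print(f"Required pixel clock: {pixel_clock_mhz:.1f} MHz")
print("Fits on one link?", pixel_clock_mhz <= SINGLE_LINK_MAX_MHZ)
```

Even with reduced blanking, the result works out to roughly 269 MHz, well past what one TMDS link can carry, so the transmitter splits the stream across two links.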

We, of course, tested the new GPUs' dual-link DVI with Apple's 30" Cinema Display, and here, we ran into our first problem.  The RV515 (Radeon X1300) board that ATI sent us had only a single-link DVI output and one analog VGA output.  A quick email to ATI revealed that our board was just a reference board; the shipping version of the card will be equipped with a dual-link DVI port.

So, we switched to the Radeon X1600 card that ATI sent us (RV530), and that worked perfectly.  The card had no problems running at the 30" display's native 2560 x 1600 resolution.

With those features out of the way, it was time to test the most intricate feature of Avivo that we had available to us: ATI's updated de-interlacing and scaling algorithms.

Before proceeding, be sure that you've read our primer on why de-interlacing is necessary and what contributes to good image quality while de-interlacing.
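For a concrete picture of the trade-off involved, here is a minimal sketch of the two classic approaches, weave and bob. This is only an illustration of the concepts the primer covers, not ATI's or NVIDIA's actual algorithm:

```python
# Minimal sketch of two classic de-interlacing approaches, using plain
# lists of scanline rows as "fields". Illustrative only.

def weave(top_field, bottom_field):
    """Interleave the two fields: perfect for static/film content,
    but produces combing artifacts wherever there is motion."""
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)
        frame.append(bottom_row)
    return frame

def bob(field):
    """Line-double a single field: no combing, but halves the
    vertical detail of the final frame."""
    frame = []
    for row in field:
        frame.append(row)
        frame.append(row[:])  # repeat each line to fill the missing rows
    return frame

top = [[1, 1], [3, 3]]       # even scanlines of a 4-line frame
bottom = [[2, 2], [4, 4]]    # odd scanlines

print(weave(top, bottom))    # [[1, 1], [2, 2], [3, 3], [4, 4]]
print(bob(top))              # [[1, 1], [1, 1], [3, 3], [3, 3]]
```

Weave preserves full vertical resolution but combs on motion; bob avoids combing at the cost of detail. The adaptive algorithms being compared in this review choose between strategies like these based on detected motion.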

A New Video Quality Benchmark: HQV

20 Comments


  • ST - Thursday, October 06, 2005 - link

    Any chance you can get 1080i deinterlacing tests in the future? 480i source material is fine, but with OTA HDTV widely available now, and the 7800gt/gtx line flaunting HD spatial temporal deinterlacing, I'm sure this is what most readers want to know about.
  • ksherman - Wednesday, October 05, 2005 - link

    I installed PureVideo, but what players actually take advantage of it?
  • rbV5 - Wednesday, October 05, 2005 - link

    It's nice to see detailed looks into a vastly overlooked area of video card performance. Kudos for using a standard to measure by; now, if we see more of this type of scrutiny from more reviewers, perhaps we'll actually get to see these features enabled rather than reading about how great it's going to be some day.

    Now let's take a good look at connectivity, custom resolution support, 1:1 pixel mapping, codec support...
  • LoneWolf15 - Wednesday, October 05, 2005 - link

    quote:

    Despite what ATI told us at our Avivo briefing last month (although ATI insists it was a miscommunication), H.264 decode acceleration is not launching alongside the R5xx GPUs. ATI is committed to bringing both H.264 decode acceleration and transcode assist by the end of the year, but for now, we have no way of testing those features.
    nVidia already fooled me with this once. They called it PureVideo and I bought a Geforce 6800 AGP and waited eagerly for driver support that never came for hardware decode of HD WMV files (or hardware encode of MPEG), because the NV40/45 design was borked. nVidia left every single user that bought an NV40/45 card in the lurch. No recourse. So everyone who bought one with the hope of using PureVideo was screwed.

    Not making that mistake again with any company. If a feature isn't supported at the time I purchase a product, that feature doesn't exist. I'm not going to believe press releases anymore, seeing as touted features can be revoked if drivers or hardware don't work out right. Never again.

    Note: I now own an ATI X800XL, and have nothing against ATI or nVidia other than that I'm too cynical to believe either of them on any feature until I see that feature in action.
  • Lifted - Wednesday, October 05, 2005 - link

    I was thinking the exact same thing. Never again will I buy something that will have features added at a later date. This is just a marketing tactic because they already know the hardware won't handle what they promised.
  • Patman2099 - Wednesday, October 05, 2005 - link

    Is it just me, or is there no mention in the article of which de-interlacing option was used on the ATI board?

    You can change it in CCC; I've found that Adaptive looks best on my Radeon 9700.

    Which de-interlacing mode was used?
  • Anand Lal Shimpi - Wednesday, October 05, 2005 - link

    I just amended the article to include this information:

    "Both the ATI and NVIDIA drivers were set to auto-detect what de-interlacing algorithm the hardware should use. We found that this setting yielded the best results for each platform in the HQV benchmark."

    If I forced the adaptive or motion adaptive settings, some of the HQV tests did worse, while none improved in image quality.

    Take care,
    Anand
  • user loser - Wednesday, October 05, 2005 - link

    Am I the only one that thinks the NV version of "De-Interlacing Quality: Vertical Detail" (page 3) is worse? Some of the red/green alternating lines are completely green or lose detail.

    Compare to the original:
    http://www.belle-nuit.com/testchart.html
    (720 x 486, NTSC)

    And how often do the different film cadence modes really get used? (They get the same weight as some of the more elementary tests.) And I can't tell the functional difference between ATI/NV in the second image on page 9, "De-Interlacing Quality - Mixed 3:2 Film With Added Video Titles".

    Or are the reasons for these differences only visible in moving video?

  • TheSnowman - Wednesday, October 05, 2005 - link

    [quote] And I can't tell the functional difference between ATI/NV in the second image in page 9 "De-Interlacing Quality - Mixed 3:2 Film With Added Video Titles".

    Or are the reasons for these differences only visible in moving video?[/quote]
    Nah, de-interlacing artifacts would always turn up in a proper still frame grab and be easier to see that way as well, but I can't see any de-interlacing artifacts in any of the shots that are claimed to have such issues, so I'm at a loss to understand the author's conclusions on that page. The first ATI shot does show some nasty compression for one reason or another, but I don't see any interlacing issues in the shots on that page from either ATI or Nvidia.
  • Anand Lal Shimpi - Wednesday, October 05, 2005 - link

    It's tough to see here, but those are actually supposed to be interlacing artifacts. They appear as compression artifacts here, but in motion you get a very clear lined pattern.

    Take care,
    Anand
