De-Interlacing Quality - 3:2 Detection
"Although video programs are transmitted to your TV using one of two picture refresh rates - 30 frames per second, interlaced (30i) and 60 frames per second, progressive scan (60p) - the original program content may have vastly different refresh rates. For example, motion picture film is shot, edited, and screened with a picture refresh rate of 24 frames per second, progressive scan (24p).

To convert such programs for television, a conversion process is used to find a common mathematical relationship between the original program and the broadcast format in use. One common technique is called 3:2 pulldown. During this technique, one additional film frame is repeated in every fifth field of video - hence, the term "3:2". A complete film-to-video sequence actually has a 2:3:2:3 pattern.

A quality video processing circuit will detect the extra frame and remove it to result in a smooth presentation of motion. However, the 3:2 sequences can be corrupted during digital editing, insertion of video effects and titles, digital compositing, and intercutting with animated sequences (which often have very different cadences).

Electronic editing is the most common source of discontinuities in the 3:2 sequence. If all edits started on the first, odd-numbered field of video (often called the 'A' frame), then the job of the 3:2 circuitry in your TV would be quite simple. However, when edits do not start on the 'A' frame, your 3:2 processor can lose count and must recapture the sequence.

For this test, your TV's progressive scan image or 3:2 cadence processor must be set in "Automatic" mode, not "Film Mode". As you watch the test image, pay attention to detail in the rows of seats in the racetrack grandstand. In addition to smooth motion and image detail, observe how quickly the TV's image processor picks up the 3:2 pattern.

No more than 5 frames (about .2 seconds) should pass before this happens, which is about the time it takes the racecar to reach the "HOMESTEAD" billboard on the wall along the track. If you see a strong moiré interference pattern in the grandstand, it is evidence that the processor has not correctly detected the image cadence."
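The 2:3:2:3 cadence described in the quote can be sketched in a few lines of Python. This is a simplification that ignores field parity (top/bottom field interleaving); the function names and frame labels are illustrative, not part of the HQV benchmark:

```python
def pulldown_32(frames):
    """Convert 24p film frames to 60i fields using the 2:3:2:3 pattern.

    Each film frame contributes alternately 2 or 3 video fields,
    so 4 film frames become 10 fields (5 interlaced video frames).
    """
    fields = []
    for i, frame in enumerate(frames):
        count = 2 if i % 2 == 0 else 3  # the 2:3:2:3 cadence
        fields.extend([frame] * count)
    return fields

def detect_repeats(fields):
    """Return indices where a field repeats its predecessor.

    The periodic repeated field is the signature a de-interlacer
    looks for before locking into film mode.
    """
    return [i for i in range(1, len(fields)) if fields[i] == fields[i - 1]]

film = ["A", "B", "C", "D"]   # four 24p film frames
video = pulldown_32(film)     # ten 60i fields: A A B B B C C D D D
print(detect_repeats(video))  # [1, 3, 4, 6, 8, 9]
```

An edit that does not begin on the 'A' frame shifts this repeat pattern mid-stream, which is why the de-interlacer must re-acquire the cadence after a cut.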
Both ATI and NVIDIA detected the 3:2 sequence and properly de-interlaced the scene within the 0.2-second/5-frame limit suggested by the benchmark. What was truly interesting was that NVIDIA's solution showed no visible moiré pattern at all, not even for the first 5 frames.

For the ATI solution, there was a short period of time where the following was visible:

But before the racecar reached the "HOMESTEAD" billboard, ATI's de-interlacing kicked in and we saw this:

We never saw a screenshot similar to the first one with NVIDIA; we only saw what you see below:

But since both ATI and NVIDIA qualified for a score of 10 by the benchmark's standards, they both get the same score here, despite the discrepancy noted.

Scoring Description
10 OVERALL SHARPNESS IS GOOD, NO MOIRÉ PATTERN IS SEEN, AND THE TV LOCKS INTO FILM MODE ALMOST INSTANTLY (NO MORE THAN 5 FRAMES OR ABOUT .2 SECONDS)
5 THE IMAGE LOOKS DETAILED AND MOTION IS SMOOTH, BUT MOIRÉ IS SEEN IN THE GRANDSTAND FOR UP TO ONE HALF SECOND (ABOUT 15 FRAMES) AS THE TV SWITCHES INTO FILM MODE
0 THE TV TAKES TOO LONG TO LOCK INTO FILM MODE OR DROPS IN AND OUT OF FILM MODE, AND A STRONG MOIRÉ PATTERN IS SEEN IN THE GRANDSTAND


Comments

  • ST - Thursday, October 6, 2005 - link

    Any chance you can get 1080i deinterlacing tests in the future? 480i source material is fine, but with OTA HDTV widely available now, and the 7800gt/gtx line flaunting HD spatial temporal deinterlacing, I'm sure this is what most readers want to know about.
  • ksherman - Wednesday, October 5, 2005 - link

    I installed PureVideo, but what players actually take advantage of it?
  • rbV5 - Wednesday, October 5, 2005 - link

    It's nice to see detailed looks into a vastly overlooked area of video card performance. Kudos for using a standard to measure by; now if we see more of this type of scrutiny from more reviewers, perhaps we'll actually get to see these features enabled rather than reading about how great it's going to be some day.

    Now lets take a good look at connectivity, custom resolution support, 1:1 pixel mapping, codec support......
  • LoneWolf15 - Wednesday, October 5, 2005 - link

    quote:

    Despite what ATI told us at our Avivo briefing last month (although ATI insists it was a miscommunication), H.264 decode acceleration is not launching alongside the R5xx GPUs. ATI is committed to bringing both H.264 decode acceleration and transcode assist by the end of the year, but for now, we have no way of testing those features.
    nVidia already fooled me with this once. They called it PureVideo and I bought a Geforce 6800 AGP and waited eagerly for driver support that never came for hardware decode of HD WMV files (or hardware encode of MPEG), because the NV40/45 design was borked. nVidia left every single user that bought an NV40/45 card in the lurch. No recourse. So everyone who bought one with the hope of using PureVideo was screwed.

    Not making that mistake again with any company. If a feature isn't supported at the time I purchase a product, that feature doesn't exist. I'm not going to believe press releases anymore, seeing as touted features can be revoked if drivers or hardware don't work out right. Never again.

    Note: I now own an ATI X800XL, and have nothing against ATI or nVidia other than that I'm too cynical to believe either of them on any feature until I see that feature in action.
  • Lifted - Wednesday, October 5, 2005 - link

    I was thinking the exact same thing. Never again will I buy something that will have features added at a later date. This is just a marketing tactic because they already know the hardware won't handle what they promised.
  • Patman2099 - Wednesday, October 5, 2005 - link

    Is it just me, or is there no mention in the article of what de-interlacing option they used on the ATI board?

    You can change it in CCC; I've found that Adaptive looks best on my Radeon 9700.

    Which de-interlacing mode was used?
  • Anand Lal Shimpi - Wednesday, October 5, 2005 - link

    I just amended the article to include this information:

    "Both the ATI and NVIDIA drivers were set to auto-detect what de-interlacing algorithm the hardware should use. We found that this setting yielded the best results for each platform in the HQV benchmark."

    If I forced the adaptive or motion adaptive settings, some of the HQV tests did worse, while none improved in image quality.

    Take care,
    Anand
  • user loser - Wednesday, October 5, 2005 - link

    Am I the only one that thinks the NV version of "De-Interlacing Quality: Vertical Detail" (page 3) is worse? Some of the red/green alternating lines are completely green or lose detail.

    Compare to the original:
    http://www.belle-nuit.com/testchart.html
    (720 * 486 (NTSC) )

    And how often do the different film cadence modes get used really ? (However, they get the same amount of points (weight) as some more elementary tests.) And I can't tell the functional difference between ATI/NV in the second image in page 9 "De-Interlacing Quality - Mixed 3:2 Film With Added Video Titles".

    Or are the reasons for these differences only visible in moving video?

  • TheSnowman - Wednesday, October 5, 2005 - link

    [quote] And I can't tell the functional difference between ATI/NV in the second image in page 9 "De-Interlacing Quality - Mixed 3:2 Film With Added Video Titles".

    Or are the reasons for these differences only visible in moving video?[/quote]
    Nah, de-interlacing artifacts would always turn up in a proper still framegrab and be easier to see that way as well, but I can't see any de-interlacing artifacts on any of the shots that are claimed to have such issues, so I am at a loss to understand the author's conclusions on that page. The first ATI shot does show some nasty compression for some reason or another, but I don't see any interlacing issues in the shots on that page from either ATI or Nvidia.
  • Anand Lal Shimpi - Wednesday, October 5, 2005 - link

    It's tough to see here, but those are actually supposed to be interlacing artifacts. They appear as compression artifacts here, but in motion you get a very clear lined pattern.

    Take care,
    Anand
