Original Link: https://www.anandtech.com/show/1809



Late last month, ATI announced their Avivo platform, which would become ATI's new baseline for overall picture and video quality on the PC. The problem with ATI's launch last month was that, without any R5xx GPUs, the Avivo platform was literally nothing more than ATI's Theater 550 TV tuner, which is pretty much old news by this point. Luckily, today we have ATI's Radeon X1800, X1600 and X1300 GPUs, all of which are Avivo compliant, so we can begin actually testing the features of Avivo. Well, not exactly.

Despite what ATI told us at our Avivo briefing last month (although ATI insists that it was a miscommunication), H.264 decode acceleration is not launching alongside the R5xx GPUs.  ATI is committed to bringing both H.264 decode acceleration and transcode assist by the end of the year, but for now, we have no way of testing those features. Update: Just to clarify, the R5xx GPUs do feature hardware support for H.264 acceleration. We have seen the decode acceleration in action on an X1800 twice, once at Computex and once at ATI's Avivo briefing in NYC. What ATI does not yet have ready is driver and application support for the acceleration, which we are hearing will be ready sometime in November, or at least by the end of the year.

We've already looked at the capture and encoding aspects of Avivo with the Theater 550, which leaves Avivo's 10-bit display pipeline, Xilleon TV encoder, dual-link DVI, and ATI's enhanced de-interlacing/video scaling. We're holding off on testing the Xilleon TV encoder until we get a component dongle for the cards.

Two of the aforementioned features are quick to cover, especially now that we have ATI's hardware in house. For starters, the 10-bit display pipeline is difficult to quantify, much less demonstrate as a noticeable advantage of the R5xx GPUs over their predecessors in normal usage. While there is undoubtedly some benefit, during our short time with the cards focusing on Avivo testing, we weren't able to discern it.
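For what it's worth, the theoretical benefit is easy to quantify even if the visual one isn't: a 10-bit pipeline has 1024 shades per color channel to work with versus 256 for an 8-bit pipeline, which matters most on smooth gradients. The sketch below (our own NumPy illustration, not anything pulled from ATI's drivers) quantizes a screen-wide gradient at both precisions and counts how many distinct steps survive.

```python
import numpy as np

def gradient_levels(width=2560, bits=8):
    """Quantize an ideal 0..1 horizontal ramp to the given bit depth and
    return how many distinct shades actually survive."""
    ramp = np.linspace(0.0, 1.0, width)
    levels = 2 ** bits - 1                    # 255 steps for 8-bit, 1023 for 10-bit
    quantized = np.round(ramp * levels) / levels
    return len(np.unique(quantized))

# Across a 2560-pixel-wide ramp, an 8-bit pipeline is limited to 256 shades
# (visible banding on subtle gradients), while a 10-bit pipeline has 1024.
print(gradient_levels(bits=8))    # 256
print(gradient_levels(bits=10))   # 1024
```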

The second feature that's quick to cover is the R5xx's integrated dual-link TMDS transmitter(s).  As we mentioned in our original Avivo preview, this means that any R5xx GPU should be able to support current and upcoming high-resolution LCD monitors, such as Apple's 30" Cinema Display.  It is up to the board manufacturer to decide how many dual-link DVI ports are placed on a specific board, but the GPU should support a minimum of one dual-link DVI port.

The Radeon X1800 series will support up to two dual-link DVI ports, while the X1600 and X1300 will support up to one dual-link and one single-link port. 

We, of course, tested the new GPUs' dual-link DVI with Apple's 30" Cinema Display, and here, we ran into our first problem.  For some reason, the RV515 (Radeon X1300) board that ATI sent us had only a single-link DVI output and one analog VGA output.  A quick email to ATI revealed that the board we had was just a reference board, and that the shipping version of the card would be equipped with a dual-link DVI port.

So, we switched to the Radeon X1600 card that ATI sent us (RV530), and that worked perfectly.  The card had no problems running at the 30" display's native 2560 x 1600 resolution.
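As a quick sanity check on why dual-link matters for this display: the DVI spec caps a single TMDS link at a 165MHz pixel clock, and 2560 x 1600 at 60Hz needs well over that even with reduced blanking. The back-of-the-envelope sketch below (our own estimate, assuming roughly 12% blanking overhead rather than exact CVT timings) shows which common resolutions still fit on a single link.

```python
def approx_pixel_clock_mhz(h_active, v_active, refresh_hz=60, blanking=0.12):
    """Rough pixel clock estimate: active pixels per frame times the refresh
    rate, padded ~12% for blanking (a ballpark, not an exact CVT calculation)."""
    return h_active * v_active * refresh_hz * (1 + blanking) / 1e6

SINGLE_LINK_LIMIT_MHZ = 165.0   # per-link TMDS ceiling in the DVI spec

for w, h in [(1600, 1200), (1920, 1200), (2560, 1600)]:
    clk = approx_pixel_clock_mhz(w, h)
    verdict = "fits on a single link" if clk <= SINGLE_LINK_LIMIT_MHZ else "needs dual-link"
    print(f"{w}x{h} @ 60Hz ~ {clk:.0f}MHz -> {verdict}")
```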

With those features out of the way, it was time to test the most intricate feature of Avivo that we had available to us: ATI's updated de-interlacing and scaling algorithms.

Before proceeding, be sure that you've read our primer on why de-interlacing is necessary and what contributes to good image quality while de-interlacing.
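If you just want the short version of that primer: an interlaced source delivers the odd and even scan lines of each frame at different points in time, so the player has to rebuild full frames before display. The toy NumPy sketch below (our illustration only, not ATI's or NVIDIA's actual algorithm) shows the two simplest approaches, weave and bob, whose trade-offs come up repeatedly in the tests that follow.

```python
import numpy as np

def weave(top_field, bottom_field):
    """Weave: interleave the two fields directly. Full vertical detail on
    static content, but combing artifacts wherever there is motion."""
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines come from the top field
    frame[1::2] = bottom_field   # odd lines come from the bottom field
    return frame

def bob(field):
    """Bob: discard the other field and line-double this one. No combing,
    but vertical resolution is halved and fine detail can flicker."""
    return np.repeat(field, 2, axis=0)

# A toy 4-line frame split into two 2-line fields
top = np.array([[10, 10, 10], [30, 30, 30]], dtype=np.uint8)
bottom = np.array([[20, 20, 20], [40, 40, 40]], dtype=np.uint8)

print(weave(top, bottom))   # restores the original line order: 10 / 20 / 30 / 40
print(bob(top))             # line-doubles one field: 10 / 10 / 30 / 30
```

Motion-adaptive de-interlacers like the ones tested here effectively choose between these two on a per-pixel basis: weave where the image is static, and bob-style interpolation where there is motion.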



A New Video Quality Benchmark: HQV

HQV, a high-end video processor manufacturer, has released an excellent benchmark for evaluating the image quality of consumer electronics and PC-based video products.  The HQV Benchmark DVD is a collection of tests that stress various aspects of motion video quality, with the goal of offering a standard by which video quality can be judged.

In the past, we've used a handful of problematic DVDs to judge things like de-interlacing quality, but with HQV, we're not only able to replace those DVDs, but also able to compare de-interlacing algorithms in a much more scientific manner, and across a wider range of scenarios.

The benchmark is a simple DVD with a collection of video sequences that each test a different feature that a high quality video setup should implement.  Each video sequence is accompanied by a list of things to look for, as well as guidelines on how to rate a particular platform's success in playing the video properly.  Each test is subjectively rated and given a score from 0 - 10, higher being better.  Most tests can only be rated a "0" for failing, a "5" for meeting some of the test's requirements, or a "10" for being perfect.

The HQV Benchmark DVD is available for public purchase, and documentation is available on each of the tests that the benchmark runs.  The benchmark also provides guidelines on how to subjectively rate each test so that the results may be reproducible and scientific at the same time, despite their subjective nature.

Test Hardware

CPU: AMD Athlon 64 X2 4200+ (2.2GHz/512KBx2)
Motherboard: ASUS A8N-SLI Deluxe
Motherboard BIOS: Version 1013, dated 08/10/2005
Chipset: NVIDIA nForce4 SLI
Chipset Drivers: nForce4 6.66
Memory: OCZ PC3500 DDR 2-2-2-7
Video Cards: ATI Radeon X1600 XT / NVIDIA GeForce 7800GTX
Video Drivers: ATI Catalyst 8.173.1.2 / NVIDIA ForceWare 81.82
Desktop Resolution: 1600 x 1200 - 32-bit @ 60Hz
OS: Windows XP Professional SP2
Benchmarking Software: HQV Benchmark DVD
DVD Decoder: NVIDIA PureVideo 1.02-150

Note that we verified that ATI's image quality was the same whether we used NVIDIA's PureVideo DVD decoder or InterVideo's DVD decoder. We chose to benchmark with NVIDIA's PureVideo decoder in order to minimize the number of variables between cards.

Also note that we are only comparing ATI's X1600 XT to NVIDIA's GeForce 7800GTX. We have already compared NVIDIA's GeForce6/7 series of GPUs to ATI's previous generation of GPUs and determined that NVIDIA offered superior de-interlacing quality, so today's comparison will focus on the latest and greatest from both vendors to see if those standings have changed. Remember that the X1600 XT has the same de-interlacing engine as the X1300 and X1800, so the results here are directly applicable to all of ATI's new GPUs.

Both the ATI and NVIDIA drivers were set to auto-detect what de-interlacing algorithm the hardware should use. We found that this setting yielded the best results for each platform in the HQV benchmark.


De-Interlacing Quality: Vertical Detail

The first test on the HQV Benchmark DVD is a test of vertical resolution. From HQV's documentation:
"You will see varying degrees of detail in the alternating black-and-white bar pattern at marker "1". The more detail you see, the higher the quality of static de-interlacing that is being employed.

If these bar patterns flicker or are soft or missing, you are not seeing the full vertical resolution possible in the source.

Simple de-interlacers will not preserve the full vertical resolution of still images. A stable image with image detail at "1" and correctly shaded colors gets a passing grade."
The ATI and NVIDIA solutions both produced the same quality of picture here:



[Screenshot comparison: ATI vs. NVIDIA image quality]

The possible scores for this test are as follows:

Scoring Description
10 IMAGE DETAIL IS SEEN AT MARKER "1", NO FLICKER IS OBSERVED
5 MINOR FLICKERING IS SEEN AT MARKER "1"
0 NO IMAGE DETAIL IS SEEN AT MARKER "1"

Both images showed image detail, but there was some flickering, so both cards received a rating of 5.



De-Interlacing Quality: Jaggies Pattern 1

The next test takes another stab at the de-interlacing capabilities of the cards:
"Interlaced video creates images with scan line artifacts. When video is de-interlaced for display, some of these artifacts may not be completely eliminated. As a result, diagonal lines may appear to have stepped edges. We call these artifacts 'jaggies' as they resemble a jagged edge.

Both 480i standard definition and 1080i high definition video program formats are interlaced and thus produce images with these artifacts. (480p and 720p program formats do not.) They can be corrected with a good quality video processor. A technique called motion adaptive de-interlacing with directional interpolation is often employed to process these types of signals."
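The "directional interpolation" mentioned above is essentially edge-directed line averaging: when reconstructing a missing scan line, instead of blindly averaging the pixels directly above and below, the processor also checks diagonal neighbors and interpolates along whichever direction looks most like a continuous edge. A toy version (our own illustration, not either vendor's implementation) looks something like this:

```python
import numpy as np

def directional_interpolate(above, below):
    """Fill in one missing scan line by edge-based line averaging: for each
    pixel, average along whichever direction (left diagonal, vertical, right
    diagonal) shows the smallest difference between the line above and the
    line below, so diagonal edges get interpolated along their slope."""
    w = above.shape[0]
    out = np.empty(w, dtype=np.float32)
    for x in range(w):
        candidates = []
        for d in (-1, 0, 1):
            xa, xb = x + d, x - d
            if 0 <= xa < w and 0 <= xb < w:
                a, b = float(above[xa]), float(below[xb])
                candidates.append((abs(a - b), (a + b) / 2.0))
        out[x] = min(candidates)[1]   # the most edge-like direction wins
    return out

# A white edge that slides two pixels to the right between the two lines:
above = np.array([0, 0, 255, 255, 255, 255], dtype=np.uint8)
below = np.array([0, 0, 0, 0, 255, 255], dtype=np.uint8)

print(directional_interpolate(above, below))   # 0, 0, 0, 255, 255, 255
# Plain vertical averaging would instead put 127.5 at the two middle pixels,
# the blurred stair-step that shows up on screen as "jaggies".
```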
Both ATI and NVIDIA perform equivalently here, with the first hints of jaggies appearing as the bar enters the yellow area, which earns both cards a 3.



[Screenshot comparison: ATI vs. NVIDIA image quality]

Scoring Description
5 JAGGIES ARE NOT SEEN UNTIL THE BAR ENTERS THE GREEN AREA (< 10 DEGREES) AND THE LOGO IS FREE OF JAGGIES.
3 JAGGIES ARE NOT SEEN UNTIL THE BAR ENTERS THE YELLOW AREA (< 20 DEGREES) AND THE LOGO IS FREE OF JAGGIES.
0 JAGGIES ARE VISIBLE AS THE BAR ENTERS THE RED AREA (< 45 DEGREES) OR THE LOGO HAS JAGGIES.




De-Interlacing Quality: Jaggies Pattern 2
"Motion adaptive de-interlacing is just one step in cleaning up scan line artifacts. The video signal processor should also employ directional filtering to catch rapidly moving shapes that may change direction, speed, and angle.

In the second jaggies test, you'll see a cluster of three equally spaced white bars of the same thickness, rapidly moving up and down between a 5-degree and 35-degree angle. If all three bars appear to have jagged edges at all times, the video processor does not use directional filtering to smooth the images. If all three bars are smooth throughout the test, the video processing earns a passing grade."
NVIDIA has the clear advantage in this test: ATI's de-interlacing shows jaggies in all three bars (a score of 0), while NVIDIA's leaves only the bottom bar with jagged edges (a score of 3).



[Screenshot comparison: ATI vs. NVIDIA image quality]


Scoring Description
5 ALL THREE BARS HAVE SMOOTH EDGES AT ALL TIMES
3 THE TOP TWO BARS HAVE SMOOTH EDGES, BUT THE BOTTOM BAR DOES NOT
1 ONLY THE TOP BAR HAS SMOOTH EDGES
0 NONE OF THE BARS HAVE SMOOTH EDGES




De-Interlacing Quality: Waving Flag
"Moving bars are a good way to evaluate the quality of motion adaptive de-interlacing and directional filtering, but the ideal test is to use a real-world object with rapidly moving lines at multiple, continuously shifting angles. The flag of the United States of America, with its 13 red and white stripes, is perfectly suited to this test.

As the flag slowly furls and unfurls in the wind, the 13 bars create a natural jaggies test, much like our Jaggies #2 pattern. The cleaner the edges of the red and white bars appear, the higher the quality of motion adaptive de-interlacing in use. Some lower-cost processors intentionally blur the image to mask scan line artifacts they can't clean up, so pay attention also to background sharpness in this test!"
Both ATI and NVIDIA offered similar performance in the flag test; while neither was able to rid the flag of jagged edges completely, both did a reasonable job, and thus, both cards came away with a 5 in this test.



[Screenshot comparison: ATI vs. NVIDIA image quality]


Scoring Description
10 JAGGED EDGES ARE NOT SEEN IN THE RED AND WHITE BARS, AND THE FLAG EXHIBITS FINE DETAIL
5 SOME JAGGED EDGES ARE SEEN, AND/OR THE BACKGROUND APPEARS SOFT
0 JAGGED EDGES ARE QUITE APPARENT ALONG EDGES OF THE BARS




De-Interlacing Quality: 3:2 Detection
"Although video programs are transmitted to your TV using one of two picture refresh rates - 30 frames per second, interlaced (30i) and 60 frames per second, progressive scan (60p) - the original program content may have vastly different refresh rates. For example, motion picture film is shot, edited, and screened with a picture refresh rate of 24 frames per second, progressive scan (24p).

To convert such programs for television, a conversion process is used to find a common mathematical relationship between the original program and the broadcast format in use. One common technique is called 3:2 pulldown. During this technique, one additional film frame is repeated in every fifth field of video - hence, the term "3:2". A complete film-to-video sequence actually has a 2:3:2:3 pattern.

A quality video processing circuit will detect the extra frame and remove it to result in a smooth presentation of motion. However, the 3:2 sequences can be corrupted during digital editing, insertion of video effects and titles, digital compositing, and intercutting with animated sequences (which often have very different cadences).

Electronic editing is the most common source of discontinuities in the 3:2 sequence. If all edits started on the first, odd-numbered field of video (often called the 'A' frame), then the job of the 3:2 circuitry in your TV would be quite simple. However, when edits do not start on the 'A' frame, your 3:2 processor can lose count and must recapture the sequence.

For this test, your TV's progressive scan image or 3:2 cadence processor must be set in "Automatic" mode, not "Film Mode". As you watch the test image, pay attention to detail in the rows of seats in the racetrack grandstand. In addition to smooth motion and image detail, observe how quickly the TV's image processor picks up the 3:2 pattern.

No more than 5 frames (about .2 seconds) should pass before this happens, which is about the time it takes the racecar to reach the "HOMESTEAD" billboard on the wall along the track. If you see a strong moiré interference pattern in the grandstand, it is evidence that the processor has not correctly detected the image cadence."
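To make the cadence concrete: 24fps film is mapped onto 60 fields per second by holding each film frame for three fields, then two, and so on. A processor that spots the repeated fields can discard them and weave the original progressive frames back together. The sketch below (our own toy model of the pattern, not any vendor's detection logic) generates the field sequence for four film frames.

```python
def telecine_3_2(num_film_frames):
    """Map 24p film frames onto interlaced 60Hz fields with a 3:2 pulldown:
    each film frame is held for 3 fields, then 2, alternating (3, 2, 3, 2, ...)."""
    fields = []
    for frame in range(num_film_frames):
        for _ in range(3 if frame % 2 == 0 else 2):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))   # (source film frame, field parity)
    return fields

# Four film frames (1/6 of a second of 24p) become ten fields of 60Hz video:
for n, (frame, parity) in enumerate(telecine_3_2(4)):
    print(f"field {n}: film frame {frame} ({parity} field)")

# The giveaway for a cadence detector is the duplicated field that shows up once
# every five fields (fields 0 and 2 above are both the top field of film frame 0);
# once the pattern is locked, the extra field is dropped and the original
# progressive frames are woven back together.
```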
Both ATI and NVIDIA detected the 3:2 sequence and properly de-interlaced the scene within the benchmark's suggested limit of 5 frames (0.2 seconds), but what was truly interesting was that NVIDIA's solution didn't show any visible moiré pattern at all, not even for those first 5 frames.

For the ATI solution, there was a short period of time where the following was visible:

But before the homestead billboard, ATI's de-interlacing kicked in and we saw this:

We never saw a screenshot similar to the first one with NVIDIA; we only saw what you see below:

But since both ATI and NVIDIA qualified for a score of 10 by the benchmark's standards, they both get the same score here, despite the discrepancy noted.

Scoring Description
10 OVERALL SHARPNESS IS GOOD, NO MOIRÉ PATTERN IS SEEN, AND THE TV LOCKS INTO FILM MODE ALMOST INSTANTLY (NO MORE THAN 5 FRAMES OR ABOUT .2 SECONDS)
5 THE IMAGE LOOKS DETAILED AND MOTION IS SMOOTH, BUT MOIRÉ IS SEEN IN THE GRANDSTAND FOR UP TO ONE HALF SECOND (ABOUT 15 FRAMES) AS THE TV SWITCHES INTO FILM MODE
0 THE TV TAKES TOO LONG TO LOCK INTO FILM MODE OR DROPS IN AND OUT OF FILM MODE, AND A STRONG MOIRÉ PATTERN IS SEEN IN THE GRANDSTAND




De-Interlacing Quality: Film Cadence
"The ability of a processor to detect and correct for a given film-to-video cadence affects image detail and may introduce scan line artifacts ("jaggies") as the video processor defaults to video mode. Observe the lines in the coffee cups and watch to see if they appear to jump or flicker, a sign of incorrect cadence detection resulting in half-resolution images. The text in the newspaper may also exhibit moiré and interlaced scan line artifacts."
In this test, we have a 2:2 cadence that neither card handles properly. According to the benchmark's documentation, "documentaries shot on high speed film use 30 frames per second frame rates resulting in a 2:2 cadence."



[Screenshot comparison: ATI vs. NVIDIA image quality]


Scoring Description (5 points for each cadence)
5 THE INDIVIDUAL TEST CADENCE IS PRESENTED SMOOTHLY WITH NO FLICKERING OR JAGGIES IN THE COFFEE CUPS, NO MOIRÉ IN THE NEWSPAPER, AND NO LOSS OF RESOLUTION
0 ANY OF THE ABOVE ARTIFACTS APPEAR DURING ANY INDIVIDUAL TEST CADENCES




De-Interlacing Quality: Mixed 3:2 Film with Added Video Titles
"Filmed content edited electronically for video can introduce additional problems for a video processor. 30 fps video elements, such as title crawls and scene transitions, may confuse the processor as it tries to detect and hold a 3:2 sequence, preserving the smooth motion of 24 fps film.

A worst-case scenario is when a movie is transferred to video for broadcast or distribution on DVD and an entirely new electronic end title sequence is created. The best video processors will be able to distinguish between film and video content, converting different parts of the image on a per-pixel basis.

Look closely at the various filmed scenes to see if image detail is preserved while electronic titles crawl across and up the screen. Are diagonal lines smooth, or jagged? Is the title crawl text smooth and crisp, or does it appear soft and degraded?"
While NVIDIA does well in this first test, there are noticeable interlacing artifacts in the ATI screengrab:



[Screenshot comparison: ATI vs. NVIDIA image quality]


Scoring Description (Vertical Text Scroll)
10 THE CREDIT SEQUENCE TEXT MOVES SMOOTHLY UP THE SCREEN, THE TEXT IS SHARP AND CRISP, AND THE BACKGROUND IMAGE EXHIBITS EXCELLENT DETAIL WITHOUT SCAN LINE ARTIFACTS ("JAGGIES")
5 THE SCROLLING TEXT IS SMOOTH AND CRISP, BUT THE BACKGROUND IMAGES HAVE NOTICEABLE SCAN LINE ARTIFACTS
0 THE SCROLLING TEXT EXHIBITS NOTICEABLE TEARING OR COMBING




Final Words

We excluded a few of the tests from our image quality comparisons, simply because they were more focused on noise reduction and both ATI and NVIDIA did equally poorly there. Instead, we decided to focus on cadence detection and de-interlacing quality, using the tests from the previous pages.

Let's first look at how all of the numbers tally up for ATI and NVIDIA:

Test                                 NVIDIA PureVideo   ATI Avivo
Color Bar/Vertical Detail            5                  5
Jaggies Pattern 1                    3                  3
Jaggies Pattern 2                    3                  0
Flag                                 5                  5
Picture Detail                       0                  0
Noise Reduction                      0                  0
Motion Adaptive Noise Reduction      0                  0
3:2 Detection                        10                 10
Film Cadence 2:2                     0                  0
Film Cadence 2:2:2:4                 0                  0
Film Cadence 2:3:3:2                 5                  0
Film Cadence 5:5                     0                  0
Film Cadence 6:4                     0                  0
Film Cadence 8:7                     0                  0
Film Cadence 3:2                     5                  5
Scrolling Text (Horiz)               5                  5
Scrolling Text (Vert)                10                 5
Total                                51                 38
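For anyone who wants to double-check the arithmetic, or re-score the disc themselves, the totals are just a straight sum of the per-test scores; a quick tally using the numbers from the table above:

```python
# Per-test HQV scores from the table above: (test, NVIDIA PureVideo, ATI Avivo)
scores = [
    ("Color Bar/Vertical Detail", 5, 5),
    ("Jaggies Pattern 1", 3, 3),
    ("Jaggies Pattern 2", 3, 0),
    ("Flag", 5, 5),
    ("Picture Detail", 0, 0),
    ("Noise Reduction", 0, 0),
    ("Motion Adaptive Noise Reduction", 0, 0),
    ("3:2 Detection", 10, 10),
    ("Film Cadence 2:2", 0, 0),
    ("Film Cadence 2:2:2:4", 0, 0),
    ("Film Cadence 2:3:3:2", 5, 0),
    ("Film Cadence 5:5", 0, 0),
    ("Film Cadence 6:4", 0, 0),
    ("Film Cadence 8:7", 0, 0),
    ("Film Cadence 3:2", 5, 5),
    ("Scrolling Text (Horiz)", 5, 5),
    ("Scrolling Text (Vert)", 10, 5),
]

print("NVIDIA PureVideo:", sum(nv for _, nv, _ in scores))    # 51
print("ATI Avivo:       ", sum(ati for _, _, ati in scores))  # 38
```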

The subjective scores pretty much sum up our experience with ATI's Avivo at this point. Neither ATI nor NVIDIA produced a perfect solution, but Avivo is definitely a step behind NVIDIA's PureVideo in terms of de-interlacing quality.

We will be keeping tabs on ATI's Avivo as its remaining, and arguably more exciting, features get implemented in later driver revisions.  For now, be sure to read our technology and gaming performance coverage of ATI's Radeon X1000 line.
