HD HQV Image Quality Analysis

We have already explored Silicon Optix HD HQV in detail. The tests and what we are looking for in them have not changed since our first round. Fortunately, the ability of NVIDIA and AMD hardware to actually perform the tasks required of HD HQV has changed quite a bit.

Both AMD and NVIDIA told us to expect scores of 100 out of 100 using their latest drivers and hardware. We spent quite a bit of time and effort in fully evaluating this test. We feel that we have judged the performance of these solutions fairly and accurately despite the fact that some subjectivity is involved. Here's what we've come up with.

Silicon Optix HD HQV Scores

Card                      Noise Reduction  Video Res Loss  Jaggies  Film Res Loss  Stadium  Total
AMD Radeon HD 2900 XT           15               20           20         25           10      90
AMD Radeon HD 2600 XT           15               20           20         25           10      90
AMD Radeon HD 2600 Pro          15               20           20         25           10      90
AMD Radeon HD 2400 XT            0               20            0         25           10      55
NVIDIA GeForce 8800 GTX         25               20           20         25           10     100
NVIDIA GeForce 8600 GTS         25               20           20         25           10     100
NVIDIA GeForce 8600 GT          25               20           20         25           10     100

The bottom line is that NVIDIA comes out on top in terms of quality. We've seen arguments for scoring these cards differently, but we feel that this is the most accurate representation of the capabilities offered by each camp.

On the low end, both AMD and NVIDIA hardware begin to stumble in terms of quality. The HD 2400 XT posts quite a lackluster performance, failing both noise reduction and HD deinterlacing (jaggies), though at least it deinterlaces video at full resolution. We excluded NVIDIA's 8500 series from testing, as its video drivers have not yet been optimized for low end hardware. Even so, we have been given indications not to expect the level of performance we see from the 8600 series. Our guess is that the 8500 series will perform on par with the AMD HD 2400 series, though we will have to wait and see once NVIDIA releases a driver for this hardware.

With video decode hardware built in as a separate block of logic and post processing being handled by the shader hardware, it's clear that the horrendous 3D performance of low end parts has bled through to their video processing capability as well. This is quite disturbing, as it removes quite a bit of potential value from low cost cards that include video decode hardware.

Both AMD and NVIDIA perform flawlessly and identically in every test but noise reduction. AMD uses an adaptive noise reduction algorithm that the user cannot disable or adjust in any way. NVIDIA, on the other hand, provides an adjustable noise reduction filter. In general, we prefer having the ability to tweak our settings, but the mere presence of adjustability has no bearing on HQV scores; only the resulting image quality does.

The major issue that resulted in our scoring AMD down in noise reduction was that noise was not reduced significantly enough to match what we expected. In addition to the tests, Silicon Optix provides a visual explanation of the features tested, including noise reduction. They show a side by side video of a yellow flower (a different flower than the one presented in the actual noise reduction test). The comparison shows a noisy video on the left and a video with proper noise reduction applied on the right. The bottom line is that there is almost no noise at all in the video on the right.

During the test, although noise is reduced using AMD hardware, it is not reduced to the level of expectation set by the visual explanation of the test. Based on this assessment, we feel that AMD noise reduction deserves a score of 15 out of 25. Silicon Optix explains a score of 15 as: "The level of noise is reduced somewhat and detail is preserved." In order to achieve a higher score, we expect the noise to be reduced to the point where we do not notice any "sparkling" effect in the background of the image at all.

By contrast, setting NVIDIA's noise reduction slider anywhere between 51% and 75% gave us a higher degree of noise reduction than AMD with zero quality loss. At 75% and above we noticed no remaining noise in the image, and detail was preserved until the slider was pushed very high. With the slider at 100% we saw some detail loss, but there is no reason to crank it up that far unless your HD source is incredibly noisy (which will rarely be the case). In addition, at such high levels of noise reduction we noticed banding and artifacts in some cases. This was especially apparent in the giant space battle near the end of Serenity; computer generated special effects seemed to suffer from this issue more than other parts of the video.

While we would ideally like to see artifacts avoided at all costs, NVIDIA has provided a solution that offers much more flexibility than its competition. With a little experimentation, NVIDIA hardware can deliver a higher quality experience than AMD hardware. And because NVIDIA leaves noise reduction off by default, we feel the out-of-the-box experience for consumers will be better as well.



Comments

  • smitty3268 - Monday, July 23, 2007 - link

    No. The 2400 and 2600 support the Avivo HD feature set even with VC-1 decoding, while the G84 and G86 don't, so their quote is correct. It's a little confusing, since Avivo is ATI terminology, but it is basically equivalent to the NVIDIA hardware.
  • scosta - Monday, July 23, 2007 - link

    I think this sentence in page 1 is wrong!

    <blockquote>While the R600 based Radeon HD 2900 XT only supports the features listed as "Avivo", <b>G84 and G86</b> based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ...</blockquote>

    Don't you mean ...
    <blockquote>the features listed as "Avivo", <b>HD 2400 and HD 2600</b> based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ...</blockquote>

  • iwodo - Monday, July 23, 2007 - link


    We have to stress here that, in spite of the fact that NVIDIA and AMD expect the inclusion of video decode hardware on their low end hardware to provide significant value to end users, we absolutely cannot recommend current low end graphics card for use in systems where video decode is important. In our eyes, with the inability to provide a high quality HD experience in all cases, the HD 2400, GeForce 8500, and lower end hardware are all only suitable for use in business class or casual computing systems where neither games nor HD video play a part in the system's purpose.

    Maybe I am the only one who doesn't understand why they would not recommend a GeForce 8500 for a low end machine?
  • Chunga29 - Monday, July 23, 2007 - link

    The NVIDIA 8500 drivers are not currently working with PureVideo HD, I believe it was mentioned.
  • ssiu - Monday, July 23, 2007 - link

    NVIDIA PureVideo HD still doesn't support Windows XP, correct? That would be the deciding factor for many people (instead of a noise reduction score of 15% versus 25% etc.)
  • legoman666 - Monday, July 23, 2007 - link

    This man hit the nail on the head. A couple months ago I was on the verge of buying a new video card for my HTPC with H.264 acceleration, but upon learning that those features were only enabled for Vista (bleh) I decided not to upgrade at all.
  • DigitalFreak - Monday, July 23, 2007 - link

    Any ideas as to why the HQV scores are almost totally opposite of what The Tech Report (http://techreport.com/reviews/2007q3/radeon-hd-240...) came up with? I'd trust AT's review more, but it seems strange that the scores are so different.
  • phusg - Monday, July 23, 2007 - link

    Yes very interesting! FTA:

    Also, even on the 8600 GTS, Nvidia's noise reduction filter isn't anywhere near ready for prime-time. This routine may produce a solid score in HQV, but it introduces visible color banding during HD movie playback. AMD's algorithms quite clearly perform better.
  • DigitalFreak - Monday, July 23, 2007 - link

    I'm wondering if they ran with the noise filter at over 75% in their test. As Derek mentioned, higher than 75% produced banding. I also noticed that Derek used 163.x drivers, while TR used 162.x.

    Honestly, I wish there was an 8600 GT/GTS with HDMI out. Would really love to avoid running two cables to my receiver.
  • Gary Key - Monday, July 23, 2007 - link


    Honestly, I wish there was an 8600 GT/GTS with HDMI out. Would really love to avoid running two cables to my receiver.

    There will be in about 60 days, hardware is sampling now. ;)
