HD HQV Image Quality Analysis

We have already explored Silicon Optix HD HQV in detail. The tests and what we are looking for in them have not changed since our first round. Fortunately, the ability of NVIDIA and AMD hardware to actually perform the tasks required of HD HQV has changed quite a bit.

Both AMD and NVIDIA told us to expect scores of 100 out of 100 using their latest drivers and hardware. We spent quite a bit of time and effort in fully evaluating this test. We feel that we have judged the performance of these solutions fairly and accurately despite the fact that some subjectivity is involved. Here's what we've come up with.

Silicon Optix HD HQV Scores
Card                      Noise Reduction  Video Res Loss  Jaggies  Film Res Loss  Stadium  Total
AMD Radeon HD 2900 XT     15               20              20       25             10       90
AMD Radeon HD 2600 XT     15               20              20       25             10       90
AMD Radeon HD 2600 Pro    15               20              20       25             10       90
AMD Radeon HD 2400 XT     0                20              0        25             10       55
NVIDIA GeForce 8800 GTX   25               20              20       25             10       100
NVIDIA GeForce 8600 GTS   25               20              20       25             10       100
NVIDIA GeForce 8600 GT    25               20              20       25             10       100


The bottom line is that NVIDIA comes out on top in terms of quality. We've seen arguments for scoring these cards differently, but we feel that this is the most accurate representation of the capabilities offered by each camp.

On the low end, both AMD and NVIDIA hardware begin to stumble in terms of quality. The HD 2400 XT posts quite a lackluster performance, failing both the noise reduction and HD deinterlacing (jaggies) tests. At least it deinterlaces video at full resolution, even if the result is rough. We excluded tests of NVIDIA's 8500 series, as its video drivers have not yet been optimized for low end hardware. Even so, we have been given indications not to expect the level of performance we see from the 8600 series. We would guess that the 8500 series will perform on par with the AMD HD 2400 series, though we will have to wait and see until NVIDIA releases an optimized driver.
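The two failure modes HQV probes here, resolution loss and jaggies, come straight from the classic deinterlacing tradeoff. As a rough illustration only (not the algorithms either vendor actually ships), here is a minimal Python/NumPy sketch of the two naive approaches:

```python
import numpy as np

def weave(field_even, field_odd):
    """Weave deinterlacing: interleave the two fields into one frame.

    Preserves full vertical resolution on static content, but moving
    edges comb ("jaggies") because the fields were captured at
    different moments in time.
    """
    h, w = field_even.shape
    frame = np.empty((2 * h, w), dtype=field_even.dtype)
    frame[0::2] = field_even   # even scanlines
    frame[1::2] = field_odd    # odd scanlines
    return frame

def bob(field):
    """Bob deinterlacing: line-double a single field.

    Never combs on motion, but throws away half the vertical
    resolution -- exactly the failure the HQV "video resolution
    loss" test is designed to catch.
    """
    return np.repeat(field, 2, axis=0)
```

Weave reconstructs static content perfectly but combs on motion (the jaggies test), while bob never combs but discards half the vertical detail (the resolution loss test); a good adaptive deinterlacer switches between these behaviors per pixel.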

With video decode hardware built in as a separate block of logic and post processing being handled by the shader hardware, it's clear that the horrendous 3D performance of low end parts has bled through to their video processing capability as well. This is quite disturbing, as it removes quite a bit of potential value from low cost cards that include video decode hardware.

Both AMD and NVIDIA perform flawlessly and identically in every test but the noise reduction test. AMD uses an adaptive noise reduction algorithm that the user is unable to disable or even adjust in any way. NVIDIA, on the other hand, provides an adjustable noise reduction filter. In general, we prefer having the ability to adjust and tweak our settings, but the mere presence of this ability is irrelevant to HQV scoring; only the resulting image quality counts.
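To make the fixed-versus-adjustable distinction concrete, here is a toy strength-controlled noise reduction filter in Python/NumPy. This is purely illustrative; neither vendor's driver works this way, and mapping a 0-100% slider onto a 0.0-1.0 strength value is our own assumption:

```python
import numpy as np

def box_blur(img, radius=1):
    """Simple box blur used as the smoothing stage of the toy filter."""
    k = 2 * radius + 1
    padded = np.pad(img.astype(np.float64), radius, mode='edge')
    out = np.zeros(img.shape, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def reduce_noise(img, strength):
    """Blend the original frame with its blurred version.

    strength=0.0 leaves the frame untouched (an always-off default);
    strength=1.0 is maximum smoothing, where fine detail is averaged
    away along with the noise -- the detail loss seen at very high
    slider settings.
    """
    strength = float(np.clip(strength, 0.0, 1.0))
    return (1.0 - strength) * img + strength * box_blur(img)
```

With this sketch, moderate strengths cut the "sparkling" background noise while leaving most detail intact, and strengths near 1.0 trade detail for smoothness, mirroring the behavior described in the review: a fixed filter locks the user into one point on that curve, while a slider exposes the whole range.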

The major issue that resulted in our scoring AMD down in noise reduction was that noise was not reduced significantly enough to match what we expected. In addition to the tests, Silicon Optix provides a visual explanation of the features tested, including noise reduction. They show a side by side video of a yellow flower (a different flower than the one presented in the actual noise reduction test). The comparison shows a noisy video on the left and a video with proper noise reduction applied on the right. The bottom line is that there is almost no noise at all in the video on the right.

During the test, although noise is reduced using AMD hardware, it is not reduced to the level of expectation set by the visual explanation of the test. Based on this assessment, we feel that AMD noise reduction deserves a score of 15 out of 25. Silicon Optix explains a score of 15 as: "The level of noise is reduced somewhat and detail is preserved." In order to achieve a higher score, we expect the noise to be reduced to the point where we do not notice any "sparkling" effect in the background of the image at all.

By contrast, with NVIDIA, setting the noise reduction slider anywhere between 51% and 75% gave us a higher degree of noise reduction than AMD with zero quality loss. At 75% and higher we noticed no noise in the image at all, with no detail loss until noise reduction was set very high. Tests done with the slider at 100% show some detail loss, but there is no reason to crank it up that high unless your HD source is incredibly noisy (which is unlikely). In addition, at such high levels of noise reduction we noticed banding and artifacts in some cases, most apparent in the giant space battle near the end of Serenity. Computer generated special effects suffered from this issue more than other aspects of the video.

While ideally we would like to see artifacts avoided at all costs, NVIDIA has provided a solution that offers much more flexibility than its competition. With a little experimentation, a higher quality experience can be delivered on NVIDIA hardware than on AMD hardware. In fact, because NVIDIA sets noise reduction off by default, we feel the overall experience provided to consumers will be better.

63 Comments

  • bpt8056 - Monday, July 23, 2007 - link

    Does it have HDMI 1.3??
  • phusg - Monday, July 23, 2007 - link

    quote: "As Derek mentioned, higher than 75% produced banding."


    Indeed, which makes it strange that he gave the nvidia cards 100% scores! Sure manual control on the noise filter is nice, but 100% is 100% Derek. It working badly when set above 75% makes for a less than perfect HQV score IMHO. Personally I would have gone with knocking off 5 points from the nvidia card's noise scores for this.
  • Scrogneugneu - Monday, July 23, 2007 - link

    I would have cut points back too, but not because the image quality goes down at 100%. There's no sense in providing a slider if every position on the slider gives the same perfect image, is there?

    Giving a slider, however, isn't very user-friendly, from an average Joe's perspective. I want to dump my movie in the player and listen to it, and I want it to look great. I do not want to move a slider around for every movie to get a good picture quality. Makes me think about the Tracking on old VHS. Quite annoying.


    From a technological POV, yes, NVidia's implementation enables players to be great. From a consumer's POV, it doesn't. I wanna listen to a movie not fine tune my player.
  • Chunga29 - Monday, July 23, 2007 - link

    It's all about the drivers, people! TechReport did their review with older drivers (at least on the NVIDIA side). So in the past two weeks, NVIDIA apparently addressed some problems and AT took a look at the current results. Probably delayed the article a couple times to rerun tests as well, I bet!

    As for the above comment about the slider, what you're failing to realize is that noise reduction impacts the final output. I believe Sin City used a lot of noise intentionally, so if you watch that on ATI hardware the result will NOT be what the director wanted. A slider is a bit of a pain, but then being a videophile is also a pain at times. With an imperfect format and imperfect content, we will always have to deal with imperfect solutions. I'd take NVIDIA here as well, unless/until ATI offers the ability to shut off NR.
  • phusg - Monday, July 23, 2007 - link

    Hi Derek,
    Nice article, although I've just noticed a major omission: you didn't bench any AGP cards! There are AGP versions of the 2600 and 2400 cards and I think these are very attractive upgrades for AGP HTPC owners who are probably lacking the CPU power for full HD. The big question is whether the unidirectional AGP bus is up to the HD decode task. The previous generation ATi X1900 AGP cards reportedly had problems with HD playback.

    Hopefully you'll be able to look into this, as AFAIK no-one else has yet.

    Regards, Pete
  • ericeash - Monday, July 23, 2007 - link

    i would really like to see these tests done on an AMD x2 proc. the core 2 duo's don't need as much offloading as we do.
  • Orville - Monday, July 23, 2007 - link

    Derek,

    Thanks so much for the insightful article. I’ve been waiting on it for about a month now, I guess. You or some reader could help me out with a couple of embellishments, if you would.

    1. How much power do the ATI Radeon HD 2600 XT, Radeon HD 2600 Pro, NVIDIA GeForce 8600 GTS and GeForce 8600 GT graphics cards burn?

    2. Do all four of the above mentioned graphics cards provide HDCP for their DVI output? Do they provide simultaneous HDCP for dual DVI outputs?

    3. Do you recommend CyberLink's PowerDVD video playing software exclusively?

    Regards,

    Orville

  • DerekWilson - Monday, July 23, 2007 - link

    we'll add power numbers tonight ... sorry for the omission

    all had hdcp support, not all had hdcp over dual-link dvi support

    powerdvd and windvd are good solutions, but powerdvd is currently further along. we don't recommend it exclusively, but it is a good solution.
  • phusg - Wednesday, July 25, 2007 - link

    I still can't see them, have they been added? Thanks.
  • GlassHouse69 - Monday, July 23, 2007 - link

    I agree here, good points.

    15% cpu utilization looks great until... you find that an e4300 takes so little power that using 50% of it to decode costs only 25 watts. It is nice seeing things offloaded from the cpu... IF the video card isn't cranking up a lot of heat and power.

