In the past, when testing video playback features of PC graphics hardware, we have looked at the HQV benchmark by Silicon Optix. Over the years, HQV scores have improved, as we can see when comparing our first article on the subject to one written four months later. Current scores are nearly perfect on both NVIDIA and AMD hardware. But there is something lacking in these tests: they only provide insight into how hardware performs when handling standard definition content.

With the introduction of HD DVD and Blu-ray content, we have been waiting for a benchmark with which to test the image quality of HD playback. Graphics hardware may ultimately have less of an impact on the HD viewing experience, as media and players natively support 1080p, but it is still an important link in the chain. Interlaced media is available on both HD DVD and Blu-ray, and high quality deinterlacing at HD resolutions is just as important as it is on DVDs.
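To illustrate what high quality deinterlacing involves, here is a minimal sketch of the motion-adaptive idea (our own illustration, not the algorithm AMD or NVIDIA actually ship; the function name and threshold are assumptions for the example): where the picture is static between frames, the two fields can simply be woven together at full resolution, and where there is motion, the missing lines are interpolated to avoid combing.

```python
import numpy as np

def motion_adaptive_deinterlace(prev_top, top_field, bottom_field, threshold=12):
    """Build one progressive frame from a top/bottom field pair (luma only).

    prev_top is the top field of the previous frame, used as a crude
    per-pixel motion estimate. All inputs are (H/2, W) uint8 arrays.
    """
    h2, w = top_field.shape
    frame = np.empty((h2 * 2, w), dtype=np.float32)
    frame[0::2] = top_field  # keep the lines we actually have for this field

    # Crude motion detector: how much did this field change since last frame?
    motion = np.abs(top_field.astype(np.float32) - prev_top.astype(np.float32))

    # "Bob" candidate: interpolate the missing lines from the lines around them.
    above = top_field.astype(np.float32)
    below = np.roll(top_field, -1, axis=0).astype(np.float32)
    bob = (above + below) / 2.0

    # "Weave" candidate: use the other field's real lines at full resolution.
    weave = bottom_field.astype(np.float32)

    # Weave where static (full detail), bob where moving (no combing).
    frame[1::2] = np.where(motion > threshold, bob, weave)
    return np.clip(frame, 0, 255).astype(np.uint8)
```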

The benchmark not only looks at deinterlacing quality, but noise reduction as well. Noise can actually be more of a problem on HD video due to the clarity with which it is rendered. While much of the problem could be fixed if movie studios included noise reduction as a post-processing step, there isn't much content on which noise reduction is currently performed, likely due to a combination of the cost involved and the fact that it was less necessary at lower resolutions. In the meantime, we are left with a viewing experience that might not live up to viewers' expectations, where a little noise reduction during decoding could have a huge impact on image quality.

There are downsides to noise reduction, as it can reduce detail. This is especially true if noise was specifically added to the video for effect. We don't run into this problem often, but it is worth noting. On the whole, noise reduction will improve the clarity of the content, especially given the current trend in Hollywood to ignore the noise issue.
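As a rough illustration of that trade-off, here is a minimal temporal noise reduction sketch (an assumption-laden toy, not what any shipping decoder does): small frame-to-frame differences are treated as grain and averaged away, which cleans up flat areas like sky, while large differences are treated as real motion or detail and passed through, which is exactly where an over-aggressive filter would start eating detail.

```python
import numpy as np

def temporal_denoise(prev_frame, cur_frame, noise_threshold=8, blend=0.5):
    """Toy temporal noise reduction on a luma plane (uint8 arrays).

    Differences below noise_threshold are assumed to be noise and are
    averaged with the previous frame; larger differences are assumed to
    be motion or detail and are left untouched to avoid ghosting.
    """
    prev = prev_frame.astype(np.float32)
    cur = cur_frame.astype(np.float32)
    diff = np.abs(cur - prev)
    denoised = np.where(diff < noise_threshold,
                        blend * prev + (1.0 - blend) * cur,  # smooth the grain
                        cur)                                 # keep real change
    return denoised.astype(np.uint8)
```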

We have wanted to play with an HD version of HQV for a while, and we are glad to have our hands on this early version. Before we take a look at just how the competition stacks up, we will look at the tests themselves and Silicon Optix's scoring system.

The HD HQV Tests


Comments

  • ShizNet - Thursday, February 8, 2007 - link

    i agree with last dude - if we are talking about PC Hard/Software mixed with Cust.Electronics [40"+ LCD i guess] why not add this guy or similar to the mix? and see: should people put more money into VidCard/CPU [for best 1080p] or save for receiver/DVD in their HTPC?

    otherwise - great that you guys are getting down and dirty to address some issues and break the ice for the rest of us, before we spend all that $$$ and get middle-of-the-road performance
  • Visual - Thursday, February 8, 2007 - link

    i don't even understand exactly what you guys just tested... was this just some test disc played with a software player? why didn't you start the article with more information about the test?
    what was the system's configuration?
    what codec is used for the content, and does it have the proper flags and information needed for correct deinterlacing?
    which player app and decoders did you use, etc.?

    if there were flaws in the playback, isn't it the software's fault, not the hardware's? if there were differences on ati/nvidia hardware, isn't it because the software used their built-in capabilities improperly and in different ways? surely there can be player software that handles deinterlacing perfectly without even using any hardware acceleration...

    with a digital source like an hddvd/bluray disc, i don't think these kinds of tests can even apply. noise reduction, wtf? we're talking about digital storage, not audio tapes after all. noise can't just appear with age. if there is "noise" on the source, it was probably put there on purpose - not real "noise" but something that was meant to be there. why should the playback system remove it?
    resolution loss and jaggies are stuff related to deinterlacing, and it just pisses me off. why oh why should anyone be bothered with deinterlacing in this day and age?
    you say "Interlaced media is available on both HD DVD and Blu-ray" but from what i've heard, the majority (if not all) of hd dvd and blu-ray content is currently stored as 1080p on the discs. who would be dumb enough to produce interlaced hd content, and why?
  • DerekWilson - Thursday, February 8, 2007 - link

    I've updated page 3 of the article with information on the HD DVD player used and the drivers used for AMD and NVIDIA cards.

    The software player enabled hardware acceleration, which allows AMD and NVIDIA hardware to handle much of the decoding and deinterlacing of the HD content. This is a test of the hardware and drivers provided by AMD and NVIDIA.

    Codec doesn't matter and proper flags don't matter -- a good deinterlacing algorithm should detect the type of content being played. In fact, AMD and NVIDIA both do this for standard definition content.

    It might be possible for software HD DVD and Blu-ray players to handle proper deinterlacing themselves, but most software DVD players don't even handle standard DVDs as well as they could. There are no HD DVD or Blu-ray players that we know of that support the type of adaptive deinterlacing necessary to pass the HD HQV test.

    I do apologize if I didn't explain noise well enough.

    The problem comes in the transfer of a movie from film to digital media. The CCDs used to pick up light shining through film will absolutely introduce noise, especially in large blocks of similar color like sky. Even digital HD cameras don't have infinite color precision and will have problems with noise in similar situations, due to small fluctuations in the exact digital color at each pixel from frame to frame.

    This type of noise can be reduced by post processing, but studios usually do not do this. All you need to do is watch X-Men 3 on Blu-ray to see that noise is a huge problem.

    In addition, encoding and compression introduce noise. This noise can't be removed except in the decode process.

    Noise is a major issue in HD content: it looks horrible at high resolution, and while much of it could be fixed with post processing, that rarely happens.

    As for interlacing, most movies will definitely be progressive. But there are some that are 1080i and will need good deinterlacing support.

    The big issue, as has been pointed out elsewhere in the comments, is TV. 1080i is the standard here.

    In fact, when studios start distributing TV series on HD DVD and Blu-ray, it is very likely we will see them in interlaced format. Most of my DVD collection consists of TV series, so I consider deinterlacing an important step in HD video playback.

    As much as I dislike interlaced content in general, it is unfortunately here to stay.
  • RamarC - Friday, February 9, 2007 - link

    That a TV program is broadcast in 1080i in no way means that's the format it is captured/mastered in. "24p" is the current standard for mastering most network programming, and it can result in 720p, 1080i, or 1080p content.
    An interview with Microsoft in Audioholics magazine in January 2006 indicated that HD DVD movies will be stored in 1080p format like BD, even if initial players can only output 1080i.

    Interlaced HD DVD/Blu-ray content will be a rarity, and the performance of playback software with that content is a trivial issue.
  • ianken - Friday, February 9, 2007 - link

    " There are no HD DVD or Blu-ray players that we know of that support the type of adaptive deinterlacing necessary to pass the HD HQV test. "

    Because they don't need it - the content is 1080p.

    Silicon Optix is in the business of selling video processing chips. Their benchmark is designed to get people to look for players with their hardware.

    For properly authored discs, NR and adaptive deinterlacing are wasted.

    The thing I like about the HQV discs is that sites like this use them, which motivates ATI and NVIDIA to pass them, which gets folks a better 1080i broadcast experience. It's in the realm of poorly encoded broadcast HDTV that this stuff is important.

  • autoboy - Thursday, February 8, 2007 - link

    Sorry about being a huge pain in the ass. I really do like reading your articles about video processing and they are always quite good. For me though, there is always something that seems to be missing.

    I just found this quote from the head of the multimedia division at NVIDIA:

    FiringSquad: PureVideo seems to do more than regular bob deinterlacing when tested with the HQV Benchmark DVD. Can you give us any more details on what's being done?

    Scott Vouri: Yes, we do much more than regular ‘bob’ deinterlacing, but unfortunately we can’t disclose the algorithms behind our de-interlacing technology. I do want to point out that HQV doesn’t even test one of the best things about our spatial-temporal de-interlacing – the fact that we do it on 1080i HD content, which is quite computationally intensive.

    So it appears that they at least do adaptive deinterlacing, which means they do what they say, which means they should handle inverse telecine and 3:2 pulldown correctly as well (see the cadence-detection sketch after the comments). I just can't help but think there is something missing from your setup. They should score better than a 0. Is the HQV benchmark copy protected? Can it be played on regular MPEG-2 decoders? Is the PowerDVD hardware acceleration broken?
  • autoboy - Thursday, February 8, 2007 - link

    So the codec doesn't matter for deinterlacing? The decoder decodes the video into a sort of raw format and then the video card takes over the deinterlacing? Hmm. I didn't know that. I was under the impression that the codec was the most important part of the equation. Why is interlaced video such a mystery to most of us? I have been trying to fully understand it for 6 months, and I find out that I still don't know anything. I just want proper deinterlacing. Is that too much to ask?

    Is it really that hard to get good video playback on a PC for interlaced material? Come on...
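Several comments above ask about inverse telecine and 3:2 pulldown, so here is a toy sketch of what cadence detection involves (hypothetical helper names; not PureVideo's or any other vendor's actual algorithm). Film runs at 24 frames per second; 3:2 pulldown spreads it across 60 fields per second by repeating one field in every five of a given parity, so a detector can look for that periodic near-duplicate and, when the cadence is stable, weave the original progressive frames back together instead of deinterlacing.

```python
import numpy as np

def field_difference(a, b):
    """Mean absolute luma difference between two fields (uint8 arrays)."""
    return float(np.mean(np.abs(a.astype(np.float32) - b.astype(np.float32))))

def looks_telecined(fields, dup_threshold=2.0):
    """Heuristic 3:2 cadence check over a run of same-parity fields.

    3:2 pulldown repeats one field in every five of a given parity, so a
    telecined stream shows a near-zero difference at a regular five-field
    interval. If that periodic dip is present, inverse telecine (weaving
    the original film frames back together) beats any deinterlacer.
    """
    diffs = [field_difference(fields[i], fields[i + 1])
             for i in range(len(fields) - 1)]
    near_dups = [i for i, d in enumerate(diffs) if d < dup_threshold]
    # A stable cadence puts the repeats a multiple of five positions apart.
    return (len(near_dups) >= 2 and
            all((b - a) % 5 == 0 for a, b in zip(near_dups, near_dups[1:])))
```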
