Final Words

While noise reduction can be a good thing, well mastered, high quality compressed HD video should exhibit very little noise in the first place. We've seen our fair share of early HD releases where the noise is simply atrocious, however, and we expect it will take some studios a little time to adjust to the fact that higher resolution movies not only look better, but also reveal flaws more readily. For now (especially for movies like X-Men 3), noise reduction is highly appreciated, but down the line we hope studios will put a bit more effort into delivering a polished product.

There are cases where blending effects require a bit of added noise to give scenes a more natural feel, and a director can even crank noise way up for an artistic or dated effect. In these cases (which will hopefully account for most visible noise in the future), we want to view HD material as it was delivered. When presented with poor post processing from a studio, it is nice to have the ability to decide for ourselves how we want to view the content. This makes it clear to us that the ability to enable or disable noise reduction is an essential feature for video processors. While fully adjustable noise reduction might not be as necessary, it is absolutely appreciated, and it offers those who know what they are doing the highest potential image quality in every case. The sketch below illustrates why an adjustable strength setting, with a true bypass at zero, matters.
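To make the point concrete, here is a minimal sketch of what an adjustable, fully disableable noise reduction stage might look like. It is written in Python against a single 8-bit grayscale frame, uses a simple smoothing kernel rather than any vendor's actual algorithm, and every name in it (denoise_frame, strength) is our own illustrative choice, not part of AMD's or NVIDIA's drivers.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def denoise_frame(frame, strength):
        """Blend a smoothed copy of the frame back in, scaled by strength.

        strength = 0.0 disables the filter entirely (bit-exact passthrough);
        strength = 1.0 applies the full smoothing kernel.
        """
        if strength <= 0.0:
            return frame  # "off" must mean untouched output
        smoothed = uniform_filter(frame.astype(np.float32), size=3)
        blended = (1.0 - strength) * frame.astype(np.float32) + strength * smoothed
        return np.clip(blended, 0, 255).astype(np.uint8)

    # Example: one noisy 1080p-sized frame at three user-selected strengths
    noisy = np.clip(np.random.normal(128, 20, (1080, 1920)), 0, 255).astype(np.uint8)
    for s in (0.0, 0.5, 1.0):
        print("strength=%.1f -> std dev %.1f" % (s, denoise_frame(noisy, s).std()))

The key property is the zero-strength branch: a processor that cannot reach it is always altering the source, which is exactly the situation we object to.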

Those who stick with very well produced 1080p content may not need post processing noise reduction or deinterlacing, but they might miss out on imported content or HD releases of some TV series (depending on what studios choose to do in that area). For now, we recommend that users interested in HTPC setups stick with the tools that get the job done best regardless of the source material. The only options for HD video intensive systems today are the Radeon HD 2600 and GeForce 8600 series cards. For its better handling of noise reduction, and especially the fact that it can be turned off, we recommend the 8600 GT/GTS over the other options, despite the fact that the 2600 XT provided better CPU offloading.

We have to stress that, even though NVIDIA and AMD expect the video decode hardware in their low end parts to provide significant value to end users, we absolutely cannot recommend current low end graphics cards for systems where video decode is important. Because they cannot deliver a high quality HD experience in all cases, the HD 2400, GeForce 8500, and lower end hardware are, in our eyes, only suitable for business class or casual computing systems where neither games nor HD video play a part in the system's purpose.

AMD's UVD does beat NVIDIA's VP2 in both H.264 and VC-1 decode performance, but the margin isn't enough to make a tangible difference when actually watching movies. Performance is important, and UVD's performance is certainly impressive, but we still have to favor the 8600 for its superior image quality.

VC-1 bitstream decoding doesn't have as large an impact as H.264 bitstream decoding; we would have to drop to a significantly slower CPU for the difference to hand AMD an advantage. In the scenarios we tested, NVIDIA didn't make a serious blunder by skipping hardware VC-1 bitstream handling. At least, it wasn't as serious a blunder as AMD's decision to leave UVD out of the HD 2900 XT.
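For readers who want to sanity check offload claims like these on their own systems, a rough approach is simply to sample CPU utilization while a clip plays; the higher the average, the less decode work the GPU is absorbing. The sketch below assumes the third-party psutil package, and the player command and clip name are placeholders, not the tool or methodology behind the numbers in this review.

    import subprocess
    import psutil

    def average_cpu_during(command, sample_seconds=1.0):
        """Launch a player and return the mean system-wide CPU % until it exits."""
        player = subprocess.Popen(command)
        samples = []
        while player.poll() is None:  # player still running
            # cpu_percent blocks for the interval and reports utilization over it
            samples.append(psutil.cpu_percent(interval=sample_seconds))
        return sum(samples) / len(samples) if samples else 0.0

    # Hypothetical invocation; substitute your own player and VC-1/H.264 clip.
    print("average CPU: %.1f%%" % average_cpu_during(["mpv", "serenity_clip.mkv"]))

Comparing the same clip with hardware acceleration toggled on and off in the driver gives a reasonable picture of how much work UVD or VP2 is actually taking off the CPU.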

In the future, we won't "need" H.264 or VC-1 decode on our GPUs either (just as we don't "need" MPEG-2 acceleration with today's CPUs), but we don't see this as a valid excuse to withhold a full range of functionality from end users; "need" is a relative term at best. We can do decent realtime 3D on CPUs these days, but we don't see graphics card companies saying "this card will be paired with a high end CPU, so we decided not to implement [insert key 3D feature] in hardware." We want to see AMD and NVIDIA include across the board support for video features in future product lineups. Saving CPU cycles isn't an exclusive desire of owners of low end hardware, and when we buy higher end hardware we expect higher performance.

Comments

  • smitty3268 - Monday, July 23, 2007 - link

    No. The 2400 and 2600 support the Avivo HD feature set even for VC-1 decoding, while the G84 and G86 don't, so the article's quote is correct, if a little confusing since Avivo is ATI terminology. Otherwise the NVIDIA hardware is basically equivalent.
  • scosta - Monday, July 23, 2007 - link

    I think this sentence on page 1 is wrong:

    "While the R600 based Radeon HD 2900 XT only supports the features listed as "Avivo", G84 and G86 based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ..."

    Don't you mean:

    "... the features listed as "Avivo", HD 2400 and HD 2600 based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ..."

    Regards
  • iwodo - Monday, July 23, 2007 - link

    quote:

    We have to stress that, even though NVIDIA and AMD expect the video decode hardware in their low end parts to provide significant value to end users, we absolutely cannot recommend current low end graphics cards for systems where video decode is important. Because they cannot deliver a high quality HD experience in all cases, the HD 2400, GeForce 8500, and lower end hardware are, in our eyes, only suitable for business class or casual computing systems where neither games nor HD video play a part in the system's purpose.

    Maybe I'm the only one who doesn't understand: why would they not recommend a GeForce 8500 for a low-end machine?
  • Chunga29 - Monday, July 23, 2007 - link

    I believe it was mentioned that the NVIDIA 8500 drivers are not currently working with PureVideo HD.
  • ssiu - Monday, July 23, 2007 - link

    NVIDIA PureVideo HD still doesn't support Windows XP, correct? That would be the deciding factor for many people (instead of a noise reduction score of 15% versus 25% etc.)
  • legoman666 - Monday, July 23, 2007 - link

    This man hit the nail on the head. A couple of months ago I was on the verge of buying a new video card with H.264 acceleration for my HTPC, but upon learning that those features were only enabled for Vista (bleh), I decided not to upgrade at all.
  • DigitalFreak - Monday, July 23, 2007 - link

    Any ideas as to why the HQV scores are almost totally opposite of what The Tech Report came up with (http://techreport.com/reviews/2007q3/radeon-hd-240...)? I'd trust AT's review more, but it seems strange that the scores are so different.
  • phusg - Monday, July 23, 2007 - link

    Yes, very interesting! FTA:
    quote:

    Also, even on the 8600 GTS, Nvidia's noise reduction filter isn't anywhere near ready for prime-time. This routine may produce a solid score in HQV, but it introduces visible color banding during HD movie playback. AMD's algorithms quite clearly perform better.
  • DigitalFreak - Monday, July 23, 2007 - link

    I'm wondering if they ran with the noise filter at over 75% in their test. As Derek mentioned, higher than 75% produced banding. I also noticed that Derek used 163.x drivers, while TR used 162.x.

    Honestly, I wish there was an 8600 GT/GTS with HDMI out. Would really love to avoid running two cables to my receiver.
  • Gary Key - Monday, July 23, 2007 - link

    quote:

    Honestly, I wish there was an 8600 GT/GTS with HDMI out. Would really love to avoid running two cables to my receiver.


    There will be in about 60 days; hardware is sampling now. ;)
