Final Words

While noise reduction can be a good thing, well-mastered, high-quality compressed HD video should keep noise to a minimum in the first place. We've seen our fair share of early HD releases where the noise is simply atrocious, however, and we expect it will take some studios a little time to adjust to the fact that higher resolution movies not only look better, but reveal flaws more readily as well. For now (especially for movies like X-Men 3), noise reduction is highly appreciated, but down the line we hope studios will put a bit more effort into delivering a polished product.

There are cases where blending effects require a bit of added noise to give scenes a more natural feel. Noise can even be cranked way up by a director for an artistic or dated effect. In these cases (which will hopefully be most cases where noise is evident in the future), we want to view HD material as it was delivered. But when presented with poor post processing from a studio, it is nice to have the ability to decide for ourselves how we want to view the content. This makes it clear to us that the ability to enable or disable noise reduction is an essential feature for video processors. While fully adjustable noise reduction might not be as necessary, it is absolutely appreciated and offers those who know what they are doing the highest potential image quality in every case.

Those who choose to stick with very well produced 1080p content may not need post processing noise reduction or deinterlacing, but they might miss out on imported content or HD releases of some TV series (depending on what studios choose to do in that area). For now, we're going to recommend that users interested in HTPC setups stick with the tools that can get the job done best no matter what the source material is. The only options for HD video intensive systems today are the Radeon HD 2600 and GeForce 8600 series cards. For its better handling of noise reduction (and especially the fact that it can be turned off), we recommend the 8600 GT/GTS over the other options, despite the fact that the 2600 XT provided better CPU offloading.

We have to stress here that, although NVIDIA and AMD expect the inclusion of video decode hardware on their low end parts to provide significant value to end users, we absolutely cannot recommend current low end graphics cards for use in systems where video decode is important. In our eyes, given their inability to provide a high quality HD experience in all cases, the HD 2400, GeForce 8500, and lower end hardware are only suitable for business class or casual computing systems where neither games nor HD video play a part in the system's purpose.

AMD's UVD does beat out NVIDIA's VP2 in both H.264 and VC-1 decode performance, but the advantage isn't large enough to make a tangible difference when actually watching movies. Performance is important, and UVD performance is certainly impressive, but we still have to favor the 8600 for its superior image quality.

VC-1 bitstream decoding doesn't have as large an impact as H.264 bitstream decoding. We would have to drop down to a significantly slower CPU in order for the difference to offer AMD an advantage. In the scenarios we tested, we feel that NVIDIA didn't make a serious blunder in skipping the inclusion of hardware to handle VC-1 bitstreams. At least, they didn't make as serious a blunder as AMD did by not including UVD in their HD 2900 XT.

In the future, we won't "need" H.264 or VC-1 decode on our GPUs either (just as we no longer "need" MPEG-2 acceleration given today's CPUs), but we don't see this as a valid excuse not to provide a full range of functionality for end users. And need is a relative term at best. We can do good realtime 3D on CPUs these days, but we don't see graphics card companies saying "this card will be paired with a high end CPU so we decided not to implement [insert key 3D feature] in hardware." We want to see AMD and NVIDIA include across-the-board support for video features in future product lineups. Saving CPU cycles isn't an exclusive desire of owners of low end hardware, and when we buy higher end hardware we expect higher performance.
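To make the CPU offloading discussion concrete, here is a small illustrative sketch (ours, not output from any tool used in this review) of how sampled utilization numbers translate into the headroom left for background tasks during playback. The sample values are hypothetical, loosely modeled on the kind of VC-1 figures discussed here (single digits with full bitstream offload versus roughly 20% without):

```python
# Illustrative sketch only: turning sampled CPU-utilization percentages
# from a playback run into average load and remaining headroom.
# The sample lists below are hypothetical, not measured results.

def average_load(samples):
    """Mean CPU utilization (percent) across playback samples."""
    return sum(samples) / len(samples)

def headroom(samples):
    """Idle CPU (percent) left for background tasks during playback."""
    return 100.0 - average_load(samples)

def background_task_fits(samples, task_pct):
    """True if a task needing task_pct of the CPU fits alongside playback."""
    return headroom(samples) >= task_pct

# Hypothetical runs: full bitstream offload vs. partial offload.
full_offload = [6.5, 7.0, 7.5, 7.0]      # averages 7.0% load
partial_offload = [18.0, 19.5, 20.0, 19.3]  # averages 19.2% load

print(headroom(full_offload))     # roughly 93% idle
print(headroom(partial_offload))  # roughly 80% idle
```

Either way the CPU is mostly idle during playback, which is why we weight image quality more heavily than raw offload numbers in our conclusion.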

63 Comments

  • erwos - Monday, July 23, 2007 - link

    Does it? Because I thought that was only for MPEG-2. Link?
  • smitty3268 - Monday, July 23, 2007 - link

    Most drivers only support it with MPEG-2, but that doesn't mean it isn't capable of more. Looking again, I'm a little unclear about how much work would be required to get it working. I'm not sure if it is completely done and just requires support from the hardware vendors or if it also needs some additional work before that happens.

    http://www.mythtv.org/wiki/index.php/XvMC
    http://en.wikipedia.org/wiki/X-Video_Motion_Compen...
  • Per Hansson - Monday, July 23, 2007 - link

    Hi, it would be really interesting to see similar tests done on Linux as well.

    For example, how cheap an HTPC rig can you build, with free software too, that still provides better features than any of the commercial solutions?

    I think many of us have some old hardware lying around, and this article brings up ideas. Pairing an old computer with an (AGP?) ATI 2600 card in a nice HTPC chassis under the TV would perhaps provide an ideal solution?
  • jojo4u - Monday, July 23, 2007 - link

    Linux is not practical. You would have to crack AACS and dump the disc first.
  • Per Hansson - Monday, July 23, 2007 - link

    Hmm, I did not realize that.

    However, an HTPC can still be built as a player for satellite data, for example; granted, configuring all of that with a subscription card will not be for the faint of heart. But then again, the Dreambox 8000 is not available yet, only a new decoder from Kathrein, the UFS910, with no decent software (yet).
  • jojo4u - Monday, July 23, 2007 - link

    Hi Derek,

    Good review. However, based on a review in the German magazine c't, I have some suggestions and additions:
    PowerDVD patch 2911, Catalyst 7.6, Nvidia 158.24
    - the GeForce G84/85 lacks not only VC-1 but also MPEG-2 bitstream processing.
    - the HD 2400 does not have MPEG-2 bitstream processing, frequency transform, or pixel prediction, or they are not activated.
    - A single core Athlon is significantly worse than a single core Pentium IV. The reason is AACS: decryption puts a huge load on the CPU and is optimized for Intel CPUs (9%->39% H.264, Pentium IV, Casino Royale). Perhaps later patches improved the situation (as your Yozakura results suggest?)
    - VC-1 on the Radeons and GeForces showed picture distortions, but based on your review this seems to be fixed now.

    Combinations of Athlon 3500+, X2 6000+, Pentium IV 3.2 GHz, Pentium E2160 and HD 2400/2600, GeForce 8600 GTS that resulted in lagging in MPEG-2, VC-1, or H.264:
    3500+ + 690G/2400/2600/8600
    6000+ + 690G
    Pentium IV + 8600
  • Chunga29 - Monday, July 23, 2007 - link

    Why run with older drivers? If these features are important to you, you will need to stay on top of the driver game. Would have been interesting to see AMD chips in there, but then that would require a different motherboard as well. I think the use of a P4 560 was perfectly acceptable - it's a low-end CPU and if it can handle playback with the 2600/8600 then Athlons will be fine as well.
  • 8steve8 - Monday, July 23, 2007 - link

    nice article..

    but, while I usually think AnandTech conclusions are insightful and spot on, it seems odd not to give props to the 2600 XT, which dominated the benchmarks.

    for the occasional gamer who often likes watching videos, the 2600 XT seems a great choice, better than the 8600 GTS.

    for example, for VC-1 on a low-end Core 2 Duo the difference between 7% and 19.2% matters, especially if the person likes watching a video while working or browsing or whatever...

    can AMD add noise reduction options later with a driver update?
  • defter - Tuesday, July 24, 2007 - link

    quote:

    for example, for VC-1 on a low-end Core 2 Duo the difference between 7% and 19.2% matters, especially if the person likes watching a video while working or browsing or whatever...


    How can that matter? Even in the worst case you have 80% of your CPU time idle.

    Besides, how can you "work" while watching a video at the same time? And don't try to tell me that a web browser takes over 80% of the CPU on a Core 2 Duo system...
  • drebo - Monday, July 23, 2007 - link

    quote:

    it seems odd not to give props to the 2600 XT, which dominated the benchmarks.


    We all know why this is.

    I'll give you a hint: look at the overwhelming presence of Intel advertising on this site.

    It doesn't take a genius to figure it out. That's why I don't take the video and CPU reviews on this site seriously anymore.
