Final Words

While noise reduction can be a good thing, well mastered, high quality compressed HD video shouldn't need much of it in the first place. We've seen our fair share of early HD releases where noise is simply atrocious, however, and we expect it will take some studios a little time to adjust to the fact that higher resolution movies not only look better, but reveal flaws more readily as well. For now (especially for movies like X-Men 3), noise reduction is highly appreciated, but down the line we hope that studios will put a bit more effort into delivering a polished product.

There are cases where blending effects require a bit of added noise to give scenes a more natural feel, and noise can even be cranked way up by a director for an artistic or dated effect. In these cases (which will hopefully be most cases where noise is evident in the future), we want to view HD material as it was delivered. At the same time, when presented with poor post-processing from a studio, it is nice to be able to decide for ourselves how we want to view the content. This makes it clear to us that the ability to enable or disable noise reduction is an essential feature for video processors. Fully adjustable noise reduction might not be as necessary, but it is absolutely appreciated and gives those who know what they are doing the highest potential image quality in every case.
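
To make the idea concrete, the simplest form of adjustable noise reduction is a temporal blend of each frame with the previous one. The sketch below is our own illustration of the concept, not any vendor's actual implementation; note how a single strength parameter covers both requirements: zero disables processing entirely, and higher values trade noise suppression against ghosting on motion.

    #include <stddef.h>
    #include <stdint.h>

    /* Illustrative temporal noise reduction: blend the current frame with
     * the previous one. strength = 0.0 delivers the frame exactly as
     * mastered (NR "off"); higher values (up to 1.0) suppress more noise
     * at the risk of ghosting. A sketch of the concept only. */
    void temporal_nr(uint8_t *cur, const uint8_t *prev, size_t n, float strength)
    {
        if (strength <= 0.0f)
            return; /* NR disabled: pass the source through untouched */
        for (size_t i = 0; i < n; i++)
            cur[i] = (uint8_t)((1.0f - strength) * cur[i]
                               + strength * prev[i] + 0.5f);
    }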

Those who stick with very well produced 1080p content may not need post-processing noise reduction or deinterlacing, but they might miss out on imported content or HD releases of some TV series (depending on what studios choose to do in that area). For now, we're going to recommend that users interested in HTPC setups stick with the tools that can get the job done best no matter what the source material is. The only options for HD video intensive systems today are the Radeon HD 2600 and GeForce 8600 series cards. For its better handling of noise reduction (and especially the fact that it can be turned off), we recommend the 8600 GT/GTS above the other options, in spite of the fact that the 2600 XT provided better CPU offloading.

We have to stress that, although NVIDIA and AMD expect the inclusion of video decode hardware on their low end parts to provide significant value to end users, we absolutely cannot recommend current low end graphics cards for use in systems where video decode is important. Because they cannot provide a high quality HD experience in all cases, the HD 2400, GeForce 8500, and lower end hardware are in our eyes only suitable for business class or casual computing systems where neither games nor HD video play a part in the system's purpose.

AMD's UVD does beat out NVIDIA's VP2 in both H.264 and VC-1 decode performance. However, the gap isn't large enough to make a tangible difference when actually watching movies. Performance is important, and UVD performance is certainly impressive, but we still have to favor the 8600 for its superior image quality.

VC-1 bitstream decoding doesn't have as large an impact as H.264 bitstream decoding. We would have to drop down to a significantly slower CPU in order for the difference to offer AMD an advantage. In the scenarios we tested, we feel that NVIDIA didn't make a serious blunder in skipping the inclusion of hardware to handle VC-1 bitstreams. At least, they didn't make as serious a blunder as AMD did by not including UVD in their HD 2900 XT.

In the future, we won't "need" H.264 or VC-1 decode on our GPUs either (just as we don't "need" MPEG-2 acceleration with today's CPUs), but we don't see this as a valid excuse to withhold a full range of functionality from end users. Need is a relative term at best. We can do decent realtime 3D on CPUs these days, but we don't see graphics card companies saying "this card will be paired with a high end CPU so we decided not to implement [insert key 3D feature] in hardware." We want to see AMD and NVIDIA include across the board support for video features in future product lineups. Saving CPU cycles isn't an exclusive desire of owners of low end hardware, and when we buy higher end hardware we expect higher performance.

63 Comments

  • TA152H - Monday, July 23, 2007

    Just my opinion, but I would save money on PowerDVD if you are buying ATI and just use theirs. PowerDVD is not cheap, and I personally do not like it as much, but I am sure others do. He has to use it, of course, because how else would he be able to test Nvidia and ATI on the same software? But it's not a trivial expense, and the ATI stuff works well enough that it seems, to me, an unnecessary one. You might be happier spending that money on hardware instead of PowerDVD. Again, all this assumes an ATI card purchase.
  • phusg - Monday, July 23, 2007

    Good questions. From what I've seen the 2600 Pro is the least power hungry card at under 50W. Any chance you could shed some light, Derek?
  • TA152H - Monday, July 23, 2007

    Choosing a Pentium 4 560 is a really strange choice; do you think there are a lot of them out there with PCI-E waiting to upgrade to one of these cards? It's a minor point, but I think a Pentium D 805 would have been an excellent choice, since a lot of people bought them, many on PCI-E based motherboards, and it would be a much more interesting data point.

    My next point is the expectation for the 2900 XT. I totally disagree that this is something they needed to add, because what they are saying is absolutely true. Someone who buys this item will almost certainly pair it with a very capable CPU, and since high end processors are dual core, it's not as if you can't do something else while the CPU is assisting with decode. It's not free: you pay for it in cost, in power use, and in heat, and it's going to be wasted the vast majority of the time. Considering the power use of the 2900 is appalling already, adding to it for something of such questionable usefulness is highly undesirable.

    I think they should be congratulated for using intelligent feature targeting for their products, rather than bloating a product with useless features and making people pay for it.
  • johnsonx - Tuesday, July 24, 2007

    Clearly, the point was to get a single-core point of reference. While admittedly that exact CPU would be a slightly rare case, it's a simple matter to benchmark it since it fits the same 775 mainboard as the two Core2 chips. A PD805 wouldn't be much use to compare, as it would simply be a bit slower than the E4300... so what? The P4 560 makes a reasonable proxy for the variety of good performing single-core P4's and Athlon64's out there, while the E4300 stands in for all the X2's.
  • TA152H - Tuesday, July 24, 2007

    Are you crazy?

    The Pentium D 805 is a very popular and widely used chip, and it represents an entirely different architecture. It would be an extremely valid data point because it's a popular item. It's not "a little slower"; it has completely different performance characteristics.

    A Pentium 560 owner will probably never buy this card, and many of these owners are not even on a PCI-E platform. I wouldn't even have had a problem if they had used a single core Sempron, but a Pentium 560 makes no sense at all. People are still buying the 805, in fact, and you don't think popping one of these cards in with an 805 while waiting for Penryn to come out is something people think about? Or a similar Pentium D? Except they won't know how it performs. Luckily, though, they'll know how the Pentium 560 performs, because I'm sure that's their next choice.

  • 100proof - Monday, July 23, 2007

    Derek,

    Seeing as this is an article concerning media decoding with an emphasis on HD media playback, shouldn't Anandtech be applying some pressure on Nvidia to support open drivers for Linux? MythTV and XBMC are promising HTPC options, perfectly suited to this test scenario.

    Why should h.264 offloading be exclusive to users of Microsoft operating systems?
  • 100proof - Monday, July 23, 2007

    This complaint applies to ATI/AMD as well.
  • erwos - Monday, July 23, 2007

    Linux doesn't have a framework to support H.264 or VC-1 acceleration yet. When that happens, I would expect the binary drivers to catch up fairly quickly.
  • smitty3268 - Monday, July 23, 2007

    Actually, it does. The problem is that it is open source, while the MS equivalent is closed. ATI/NVIDIA don't want to share their specs in an open manner and never came up with a suitable API to make public.
  • wien - Monday, July 23, 2007

    Well, gstreamer allows for closed source plug-ins since it's licensed under LGPL. Fluendo has already implemented a lot of proprietary (patented) codecs in gstreamer. With the required features exposed through the driver, it shouldn't be too hard for the IHVs to do the same with hardware accelerated H.264/VC-1.

    It's probably not worth their time yet though...
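
    To illustrate the application side of the point above, here is a rough sketch in C of what a hardware accelerated playback pipeline could look like through gstreamer. The decoder element name "hwh264dec" is made up, a stand-in for whatever an IHV might actually ship; everything else is the stock gstreamer API.

        #include <gst/gst.h>

        int main(int argc, char *argv[])
        {
            gst_init(&argc, &argv);

            /* "hwh264dec" is a hypothetical hardware-accelerated decoder;
             * the rest of the pipeline is standard gstreamer elements. */
            GError *err = NULL;
            GstElement *pipeline = gst_parse_launch(
                "filesrc location=clip.h264 ! hwh264dec ! autovideosink", &err);
            if (pipeline == NULL) {
                g_printerr("Failed to build pipeline: %s\n", err->message);
                return 1;
            }

            gst_element_set_state(pipeline, GST_STATE_PLAYING);

            /* Block until the stream ends or errors out. */
            GstBus *bus = gst_element_get_bus(pipeline);
            gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
                                       GST_MESSAGE_EOS | GST_MESSAGE_ERROR);

            gst_element_set_state(pipeline, GST_STATE_NULL);
            gst_object_unref(bus);
            gst_object_unref(pipeline);
            return 0;
        }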
