Final Words

While noise reduction can be a good thing, well mastered, high quality compressed HD video shouldn't need much of it in the first place. We've seen our fair share of early HD releases where the noise is simply atrocious, however, and we expect it will take some studios a little time to adjust to the fact that higher resolution movies not only look better, they also reveal flaws more readily. For now (especially for movies like X-Men 3), noise reduction is highly appreciated, but down the line we hope that studios will put a bit more effort into delivering a polished product.

There are cases where blending effects require a bit of added noise to give scenes a more natural feel, and noise can even be cranked way up by a director for an artistic or dated effect. In these cases (which will hopefully be most of the cases where noise is evident in the future), we want to view HD material as it was delivered. At the same time, when presented with poor post-processing from a studio, it is nice to be able to decide for ourselves how we want to view the content. This makes it clear to us that the ability to enable or disable noise reduction is an essential feature for video processors. Fully adjustable noise reduction might not be as necessary, but it is absolutely appreciated and offers those who know what they are doing the highest potential image quality in every case.
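To make the "adjustable" part concrete, here is a minimal sketch of what a strength-controlled spatial noise reducer looks like, assuming a single 8-bit luma plane; the function name and parameter are our own illustration, not any vendor's actual implementation:

    import numpy as np

    def denoise_luma(frame, strength):
        """Blend each pixel toward its 3x3 neighborhood mean.

        strength = 0.0 disables the filter (source passthrough);
        strength = 1.0 replaces every pixel with the local mean.
        """
        if strength <= 0.0:
            return frame  # noise reduction off: show the content as delivered
        h, w = frame.shape
        padded = np.pad(frame.astype(np.float32), 1, mode="edge")
        acc = np.zeros((h, w), dtype=np.float32)
        for dy in range(3):  # accumulate the 3x3 box around each pixel
            for dx in range(3):
                acc += padded[dy:dy + h, dx:dx + w]
        smoothed = acc / 9.0
        out = (1.0 - strength) * frame.astype(np.float32) + strength * smoothed
        return np.clip(out, 0, 255).astype(frame.dtype)

Real video processors use far more sophisticated motion-adaptive filtering, but the key point is the strength parameter: at zero the source passes through untouched, which is exactly the off switch we want every video processor to expose.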

Those who choose to stick with very well produced 1080p content may not need post-processing noise reduction or deinterlacing, but they might miss out on imported content or HD releases of some TV series (depending on what studios choose to do in that area). For now, we're going to recommend that users interested in HTPC setups stick with the tools that can get the job done best no matter what the source material is. The only options for HD video intensive systems today are the Radeon HD 2600 and GeForce 8600 series cards. For its better handling of noise reduction (and especially the fact that it can be disabled), we recommend the 8600 GT/GTS over the other options, despite the 2600 XT's better CPU offloading.

We have to stress that, although NVIDIA and AMD expect the video decode hardware included on their low end parts to provide significant value to end users, we absolutely cannot recommend current low end graphics cards for systems where video decode is important. Because they cannot deliver a high quality HD experience in all cases, the HD 2400, GeForce 8500, and lower end hardware are, in our eyes, only suitable for business class or casual computing systems where neither games nor HD video play a part in the system's purpose.

AMD's UVD does beat out NVIDIA's VP2 in both H.264 and VC-1 decode performance, but the advantage isn't enough to make a tangible difference when actually watching movies. Performance is important, and UVD's performance is certainly impressive, but we still have to favor the 8600 for its superior image quality.

VC-1 bitstream decoding doesn't have as large an impact as H.264 bitstream decoding; we would have to drop down to a significantly slower CPU before the difference offered AMD an advantage. In the scenarios we tested, NVIDIA didn't make a serious blunder by omitting hardware VC-1 bitstream decoding. At least, it wasn't as serious a blunder as AMD leaving UVD out of the HD 2900 XT.
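For readers who want to gauge how much bitstream offload matters on their own hardware, conclusions like this boil down to sampling CPU utilization during a decode run with acceleration enabled and disabled. Here's a rough sketch of such a measurement, assuming a recent ffmpeg build with DXVA2 support and the third-party psutil package (the clip name is a placeholder):

    import subprocess
    import psutil  # third-party: pip install psutil

    def avg_cpu_during(cmd):
        # Run a decode command, sampling system-wide CPU use once per
        # second until it exits, and return the mean utilization.
        proc = subprocess.Popen(cmd, stdout=subprocess.DEVNULL,
                                stderr=subprocess.DEVNULL)
        samples = []
        while proc.poll() is None:
            samples.append(psutil.cpu_percent(interval=1.0))
        return sum(samples) / max(len(samples), 1)

    # Decode to a null sink: once in software, once with DXVA2 assist.
    sw = avg_cpu_during(["ffmpeg", "-i", "clip.mkv", "-f", "null", "-"])
    hw = avg_cpu_during(["ffmpeg", "-hwaccel", "dxva2", "-i", "clip.mkv",
                         "-f", "null", "-"])
    print(f"software decode: {sw:.1f}%  hardware-assisted: {hw:.1f}%")

A small gap between the two numbers means bitstream offload buys you little on that processor; on a significantly slower CPU the same test will show a much larger spread, which is exactly the trade-off at work here.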

In the future, we won't "need" H.264 or VC-1 decode on our GPUs either (just as we don't "need" MPEG-2 acceleration with today's CPUs), but we don't see this as a valid excuse to skip providing a full range of functionality to end users. And "need" is a relative term at best. We can do decent realtime 3D on CPUs these days, but we don't see graphics card companies saying "this card will be paired with a high end CPU, so we decided not to implement [insert key 3D feature] in hardware." We want to see AMD and NVIDIA include across-the-board support for video features in future product lineups. Saving CPU cycles isn't an exclusive desire of owners of low end hardware, and when we buy higher end hardware we expect higher performance.

Comments

  • DigitalFreak - Monday, July 23, 2007 - link

    Based on Derek's results, I ordered the parts for my new HTPC.

    C2D E6850 (Newegg are bastards for pricing this at $325, but I didn't want to wait)
    Intel P35 board
    2GB PC2-800
    Gigabyte 8600GTS (passive cooling)

    Wish there was an 8600 board with HDMI out, but oh well...
  • SunAngel - Monday, July 23, 2007 - link

    You confirmed my suspicions all along. I always wondered if the true motive for SLI and Crossfire was to double the benefits of GPU processing rather than separate the graphics performance of 3D and video acceleration. In my eyes, I see SLI and Crossfire being a "bridge" for 3D graphics and video acceleration cards. What I am referring to is the PCIex16 (16 lane) slot being for high powered 3D GPUs and the PCIex16 (8 lane) slot being for video acceleration GPUs.

    It is obvious between the HD2900XT and the HD2600XT that one is great at rendering 3D game graphics while the other is great at accelerating motion picture movies.

    Personally, I think this is an okay tactic by the card manufacturers. It segments the performance a little bit better. I do not game the least bit, so the high end cards are something I don't want. But my tastes are different than others who do want them. Those that desire both can have their cake and eat it too, by using a dual PCIex16 motherboard and installing each type of card.

    Overall, good article. You enlightened my purchasing decision. With all the talk about futureproofing that was going around for a while, buying a dual PCIex16 motherboard makes a lot of sense now.
  • TA152H - Monday, July 23, 2007 - link

    I don't think you understand the point of the cards.

    If you buy the 2900 and a high end processor, you will not have any problems with HD playback, that's the whole point. You don't need a 2600 to go with it. The number of people who buy something as expensive as the 2900XT along with a low end processor that is incapable of playing back HD is very, very low, to the point where ATI decided it wasn't worth including UVD.

    So, no, you wouldn't get a 2600 to go with it, you'd get a good processor and the 2900 and that's all you'd need to have the best of both worlds.
  • Chunga29 - Monday, July 23, 2007 - link

    Yes, if by "best" you mean:
    - Higher CPU utilization when viewing any HD video content, compared to 8800
    - Generally lower price/performance in games compared to 8800
    - More flaky (in my experience) drivers than 8800 (though I believe AMD might actually be better on Vista - not that I really care at this point)

    Don't pat AMD on the back for skipping UVD on R600. NVIDIA didn't bother to work on VP2 for G80, and yet no one is congratulating them on the decision. I agree that the omission is not the end of the world, mostly because I don't think people running 8800/X2900 cards are really all that concerned with H.264 video. If I were looking to go Blu-ray or HD-DVD, I'd be looking at a set-top box to hook up to my HDTV.

    My PC is connected to a 24" LCD that I use for work, not watching movies, and trying to put it next to the TV is more effort than it's worth. Unless H.264 suddenly makes a difference for YouTube and the like (hey - I'd love to see higher quality videos online), I can't say I'm all that concerned. Seems to me there's just a vocal minority whining about the latest features that are used by less than 10% of people.

    UVD, PureVideo HD, and a partridge in a pear tree: it's all moot to me!
  • TA152H - Tuesday, July 24, 2007 - link

    OK, do you understand the meaning of the word "context"?

    I'm not going into the merits of Nvidia and ATI. I have used both, I consider Nvidia junk, and I do not buy them. If you have had better luck, then go with them. That's not the point, but anyone with any reading comprehension should have figured that out.

    He was talking about putting a 2600 and a 2900 on the same motherboard to get the best of both worlds, meaning having all the performance of the 2900 yet getting the HD capabilities of the 2900. Do you understand that?

    My point is you don't need the 2600 to get "the best of both worlds", you just need a good processor and you will not miss that feature. I think Nvidia made the right choice too. Most people are morons, and they want just because they want, and they fail to realize nothing is free. Including useless features at a cost is a bad idea, and ATI did the right thing not to, even though you'll have the idiots that think they are missing out on something. Yes, you are, you're missing out on additional cost, additional electricity use, and additional heat dissipation. You don't need it if you buy a reasonable processor for the item. That's the point. Try to understand context better, and realize what he meant by the best of both worlds.
  • strikeback03 - Wednesday, July 25, 2007 - link

    Assuming your "good processor" falls somewhere between the two tested C2D processors, dropping UVD boosts average processor usage around 42% in Transporter2, 44% in Yozakura, and 24% in Serenity. So which uses more electricity and generates more heat - the additional transistors needed for UVD on the 2900, or moving your CPU off idle to do the work?
  • TA152H - Tuesday, July 24, 2007 - link

    Previous post should have said "HD capability of the 2600".
  • Chunga29 - Tuesday, July 24, 2007 - link

    For someone trying to act superior, you need to take a look in the mirror (and the dictionary) for a moment. I agree it's silly to use something like a 2600 and 2900 in the same system. However, if you want the "best of both worlds", let's consider for a minute what that means:

    Best (courtesy of Merriam-Webster):
    1 : excelling all others
    2 : most productive of good: offering or producing the greatest advantage, utility, or satisfaction

    So, if you truly want the best of both worlds, what you really want is:

    UVD from ATI RV630
    3D from NVIDIA G80

    Anything less than that is not the "best" anymore (though I'm sure some ATI fans would put R600 3D above G80 for various reasons).

    Try ditching the superlatives instead of copping an attitude and constantly defending every post you make. If you want to say that R600 with a fast CPU is more than sufficient for H.264 playback as well as providing good 3D performance, you're right. The same goes for G80. If you want to argue that it may have been difficult and not entirely necessary to cram UVD into R600, you can do that, but others will disagree.

    Since they were at something like 700 million transistors, they may have been out of room. That seems very bloated (especially considering the final performance), but how many transistors are required for UVD? I'd say it was certainly possible to get UVD in there, but would the benefit be worth the cost? Given the delays, it probably was best to scrap UVD. However, the resulting product certainly isn't able to offer the best possible feature set in every area. In fact, I'd say it's second in practically every area to other GPUs (G80/86 and RV630, depending on the feature). As others have pointed out in the past, that's a lot like the NV30 launch.
  • autoboy - Monday, July 23, 2007 - link

    What??
  • scosta - Monday, July 23, 2007 - link

    I think this sentence on page 1 is wrong!

    quote:

    While the R600 based Radeon HD 2900 XT only supports the features listed as "Avivo", G84 and G86 based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ...


    Don't you mean ...

    quote:

    While the R600 based Radeon HD 2900 XT only supports the features listed as "Avivo", HD 2400 and HD 2600 based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ...


    Regards
