Yozakura (High Complexity H.264) Performance

H.264 offers quite a range of options, and not everyone takes advantage of the more advanced features. Yozakura is encoded in 1080i at 25 Mbps. This is fairly low compared to H.264's maximum bitrates, but the video is still very CPU intensive because it is encoded using macroblock adaptive frame/field (MBAFF) coding. MBAFF is a high quality technique for preserving maximum visual fidelity in interlaced video: frame or field coding is selected adaptively for each macroblock pair based on how much motion is present.
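
To make the idea more concrete, here is a minimal sketch, in Python, of what a per-macroblock-pair frame/field decision might look like. This is purely illustrative and our own simplification rather than code from any real encoder; the motion metric, the threshold value, and the function names are all assumptions for demonstration.

```python
import numpy as np

def mbaff_mode_for_pair(top_field, bottom_field, motion_threshold=12.0):
    """Illustrative MBAFF-style decision for one 16x32 macroblock pair.

    If the two fields differ significantly (motion between fields), field
    coding usually compresses better; if they are similar, frame coding wins.
    The metric and threshold here are made up for demonstration only.
    """
    motion = np.mean(np.abs(top_field.astype(np.int16) - bottom_field.astype(np.int16)))
    return "field" if motion > motion_threshold else "frame"

def choose_mbaff_modes(luma_plane, threshold=12.0):
    """Walk a frame in 16x32 macroblock pairs and pick a coding mode for each."""
    h, w = luma_plane.shape
    modes = []
    for y in range(0, h - 31, 32):
        for x in range(0, w - 15, 16):
            pair = luma_plane[y:y + 32, x:x + 16]
            top, bottom = pair[0::2, :], pair[1::2, :]  # de-interleave the two fields
            modes.append(((y, x), mbaff_mode_for_pair(top, bottom, threshold)))
    return modes

if __name__ == "__main__":
    # Synthetic 1080i-sized luma plane, just to exercise the loop.
    frame = np.random.randint(0, 256, (1088, 1920), dtype=np.uint8)
    decisions = choose_mbaff_modes(frame)
    field_pairs = sum(1 for _, mode in decisions if mode == "field")
    print(field_pairs, "of", len(decisions), "macroblock pairs chose field coding")
```

The point of the sketch is simply that the frame/field choice is made locally, pair by pair, which is part of why decoding MBAFF content is so much more demanding than plain frame or field coding.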

While 1080p is clearly Hollywood's resolution of choice, there is 1080i encoded content out there now and more is likely to come. As TV shows transition to HD, we will likely see 1080i as the format of choice, because it is the format in which most HDTV channels are broadcast (over-the-air and otherwise), with 720p being the other option. It's nice to know that H.264 offers high quality interlaced HD encoding options, and we hope content authors who decide to release their creations in 1080i will take advantage of features like MBAFF.

Additionally, good deinterlacing is essential to an enjoyable experience with movies like this. Poorly deinterlaced HD content is not only sad to watch, but gives this author quite a headache; jaggies and feathering are horrible distractions at this resolution. As long as you stick with a Radeon HD 2600 or GeForce 8600 series card or higher, you should be fine here. Anything slower just won't cut it when trying to watch 1080i on a progressive scan display.
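
For readers wondering where "feathering" and "jaggies" actually come from, here is a small illustrative sketch (our own, not any vendor's deinterlacer) of the two simplest deinterlacing approaches: weave, which interleaves the two fields and produces comb-like feathering wherever there is motion between them, and bob, which line-doubles a single field and trades vertical detail for the jaggies mentioned above. Good hardware deinterlacers are far more sophisticated, blending approaches adaptively per pixel.

```python
import numpy as np

def weave(top_field, bottom_field):
    """Weave deinterlacing: interleave the two fields back into one frame.

    Perfect for static content, but any motion between the fields shows up
    as the comb-like "feathering" artifact described above.
    """
    h, w = top_field.shape
    frame = np.empty((h * 2, w), dtype=top_field.dtype)
    frame[0::2, :] = top_field
    frame[1::2, :] = bottom_field
    return frame

def bob(field):
    """Bob deinterlacing: line-double a single field.

    No feathering, but vertical resolution is halved, which is where the
    jaggies on diagonal edges come from.
    """
    return np.repeat(field, 2, axis=0)

if __name__ == "__main__":
    top = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)     # 1080i top field
    bottom = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)  # 1080i bottom field
    print(weave(top, bottom).shape, bob(top).shape)  # both outputs are 1080x1920 frames
```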

[Graph: Virtual Trip: Yozakura Performance (high end CPU)]

Our high end CPU is able to cope fairly well, with the 8800 GTX besting the 2900 XT, while UVD's lead over VP2 puts the 2600 XT ahead of the 8600 GTS.

[Graph: Virtual Trip: Yozakura Performance (cheap yet current CPU)]

For our cheap yet current processor, we do see utilization go up, but the hardware with bitstream decoding maintains very low overhead. All of our GPUs maintain good performance when paired with this level of processor. Of course, we would likely not see the high end GPUs matched with such a CPU (unless we are looking at notebooks, but that's a whole other article).

[Graph: Virtual Trip: Yozakura Performance (older hardware)]

For our older hardware, Yozakura is simply not watchable without bitstream decoding. With the numbers for the high end AMD and NVIDIA GPUs even worse than in our Transporter 2 trailer test, it's clear that NetBurst does not like whatever Yozakura is doing. It may be that decoding the bitstream when MBAFF is in use is branch heavy, causing lots of pipeline stalls. All we can say for sure is that, once again, GPU accelerated bitstream decoding is necessary to watch H.264 content like this on older/slower hardware.

Comments

  • DigitalFreak - Monday, July 23, 2007 - link

    Based on Derek's results, I ordered the parts for my new HTPC.

    C2D E6850 (Newegg are bastards for pricing this at $325, but I didn't want to wait)
    Intel P35 board
    2GB PC2-800
    Gigabyte 8600GTS (passive cooling)

    Wished there was an 8600 board with HDMI out, but oh well...
  • SunAngel - Monday, July 23, 2007 - link

    You confirmed my suspicions all along. I always wondered if the true motive for SLI and Crossfire was to double the benefits of GPU processing rather than to separate 3D graphics performance from video acceleration. In my eyes, I see SLI and Crossfire being a "bridge" for 3D graphics and video acceleration cards. What I am referring to is the PCIex16 (16 lane) slot being for high powered 3D GPUs and the PCIex16 (8 lane) slot being for video acceleration GPUs.

    It is obvious between the HD2900XT and the HD2600XT that one is great at rendering 3D game graphics while the other is great at accelerating motion picture playback.

    Personally, this is an okay tactic by the card manufacturers. It segments the performance a little bit better. I do not game the least bit, so the high end cards are something I don't want. But my tastes are different from others who do want them. Those that desire both can have their cake and eat it too, by using a dual PCIex16 motherboard and installing one card of each type.

    Overall, good article. You enlightened my purchasing decision. With all the talk about futureproofing that was going around for a while, buying a dual PCIex16 motherboard makes a lot of sense now.
  • TA152H - Monday, July 23, 2007 - link

    I don't think you understand the point of the cards.

    If you buy the 2900 and a high end processor, you will not have any problems with HD playback; that's the whole point. You don't need a 2600 to go with it. The number of people who buy something as expensive as the 2900XT along with a low end processor that is incapable of playing back HD is very, very low - to the point where ATI decided that combination wasn't worth designing for.

    So, no, you wouldn't get a 2600 to go with it, you'd get a good processor and the 2900 and that's all you'd need to have the best of both worlds.
  • Chunga29 - Monday, July 23, 2007 - link

    Yes, if by "best" you mean:
    - Higher CPU utilization when viewing any HD video content, compared to 8800
    - Generally lower price/performance in games compared to 8800
    - More flaky (in my experience) drivers than 8800 (though I believe AMD might actually be better on Vista - not that I really care at this point)

    Don't pat AMD on the back for skipping UVD on R600. NVIDIA didn't bother to work on VP2 for G80, and yet no one is congratulating them on the decision. I agree that the omission is not the end of the world, mostly because I don't think people running 8800/X2900 cards are really all that concerned with H.264 video. If I were looking to go Blu-ray or HD-DVD, I'd be looking at a set-top box to hook up to my HDTV.

    My PC is connected to a 24" LCD that I use for work, not watching movies, and trying to put it next to the TV is more effort than it's worth. Unless H.264 suddenly makes a difference for YouTube and the like (hey - I'd love to see higher quality videos online), I can't say I'm all that concerned. Seems to me there's just a vocal minority whining about the latest features that are used by less than 10% of people.

    UVD, PureVideo HD, and a partridge in a pear tree: it's all moot to me!
  • TA152H - Tuesday, July 24, 2007 - link

    OK, do you understand the meaning of the word "context"?

    I'm not going into the merits of Nvidia and ATI. I have used both, I consider Nvidia junk, and I do not buy them. If you have had better luck, then go with them. That's not the point, but anyone with any reading comprehension should have figured that out.

    He was talking about putting a 2600 and a 2900 on the same motherboard to get the best of both worlds, meaning having all the performance of the 2900 yet getting the HD capabilities of the 2900. Do you understand that?

    My point is you don't need the 2600 to get "the best of both worlds", you just need a good processor and you will not miss that feature. I think Nvidia made the right choice too. Most people are morons; they want things just because they want them, and they fail to realize nothing is free. Including useless features at a cost is a bad idea, and ATI did the right thing not to, even though you'll have the idiots that think they are missing out on something. Yes, you are: you're missing out on additional cost, additional electricity use, and additional heat dissipation. You don't need it if you buy a reasonable processor to pair with the card. That's the point. Try to understand context better, and realize what he meant by the best of both worlds.
  • strikeback03 - Wednesday, July 25, 2007 - link

    Assuming your "good processor" falls somewhere between the two tested C2D processors, dropping UVD boosts average processor usage around 42% in Transporter2, 44% in Yozakura, and 24% in Serenity. So which uses more electricity and generates more heat - the additional transistors needed for UVD on the 2900, or moving your CPU off idle to do the work?
  • TA152H - Tuesday, July 24, 2007 - link

    Previous post should have said "HD capability of the 2600".
  • Chunga29 - Tuesday, July 24, 2007 - link

    For someone trying to act superior, you need to take a look in the mirror (and the dictionary) for a moment. I agree it's silly to use something like a 2600 and 2900 in the same system. However, if you want the "best of both worlds", let's consider for a minute what that means:

    Best (courtesy of Merriam-Webster):
    1 : excelling all others
    2 : most productive of good: offering or producing the greatest advantage, utility, or satisfaction

    So, if you truly want the best of both worlds, what you really want is:

    UVD from ATI RV630
    3D from NVIDIA G80

    Anything less than that is not the "best" anymore (though I'm sure some ATI fans would put R600 3D above G80 for various reasons).

    Try ditching the superlatives instead of copping an attitude and constantly defending every post you make. If you want to say that R600 with a fast CPU is more than sufficient for H.264 playback as well as providing good 3D performance, you're right. The same goes for G80. If you want to argue that it may have been difficult and not entirely necessary to cram UVD into R600, you can do that, but others will disagree.

    Since they were at something like 700 million transistors, they may have been out of room. That seems very bloated (especially considering the final performance), but how many transistors are required for UVD? I'd say it was certainly possible to get UVD in there, but would the benefit be worth the cost? Given the delays, it probably was best to scrap UVD. However, the resulting product certainly isn't able to offer the best possible feature set in every area. In fact, I'd say it's second in practically every area to other GPUs (G80/86 and RV630, depending on the feature). As others have pointed out in the past, that's a lot like the NV30 launch.
  • autoboy - Monday, July 23, 2007 - link

    What??
  • scosta - Monday, July 23, 2007 - link

    I think this sentence on page 1 is wrong!

    quote:

    While the R600 based Radeon HD 2900 XT only supports the features listed as "Avivo", G84 and G86 based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ...


    Dont you mean ...

    quote:

    While the R600 based Radeon HD 2900 XT only supports the features listed as "Avivo", HD 2400 and HD 2600 based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ...


    Regards
