The Test

Our test setup consisted of three processors: a high-end part, a low-end part, and a previous-generation part. Our goal was to evaluate how much difference hardware decode makes for each of these classes of CPU, and to determine how much value video offload really brings to the table today.

Performance Test Configuration:
CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Intel Core 2 Duo E4300 (1.8GHz/2MB)
Intel Pentium 4 560 (3.6GHz)
Motherboard: ASUS P5W-DH
Chipset: Intel 975X
Chipset Drivers: Intel 8.2.0.1014
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 8.38.9.1-rc2
NVIDIA ForceWare 163.11
Desktop Resolution: 1920 x 1080 - 32-bit @ 60Hz
OS: Windows Vista x86


We are using PowerDVD Ultra 7.3 with patch 3104a applied. This patch fixed many of our playback issues and brought PowerDVD up to the level we wanted and expected. We did, however, have difficulty disabling GPU acceleration in this version of PowerDVD, so we are unable to present CPU-only decode numbers. From our previous experience, though, only CPUs faster than an E6600 can guarantee smooth decoding in the absence of GPU acceleration.

As for video tests, we have the final version of Silicon Optix HD HQV for HD-DVD, and we will score these subjective tests to the best of our ability using the criteria Silicon Optix provides and the example clips included on the disc.

For performance testing we used perfmon to record average CPU utilization over 100 seconds (the default loop time). Our performance tests include three clips: the Transporter 2 trailer from The League of Extraordinary Gentlemen Blu-ray disc (H.264), Yozakura (H.264), and Serenity (VC-1). All of these tests proved very consistent in performance under each of our hardware configurations, so for readability's sake we will report only average CPU utilization.
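For readers who want to replicate this kind of measurement, the sketch below shows one way to log average CPU utilization over a 100-second window. It is a rough stand-in for perfmon, not our actual test harness; it assumes the third-party psutil package, and the one-second sample interval is an illustrative choice.

```python
# A minimal sketch (not our actual methodology) of averaging total CPU
# utilization over a 100-second window, similar to what perfmon reports.
import time
import psutil

def average_cpu_utilization(duration_s: float = 100.0,
                            interval_s: float = 1.0) -> float:
    """Sample total CPU utilization once per interval and return the mean."""
    samples = []
    end = time.monotonic() + duration_s
    # Prime psutil's internal counters so the first reading is meaningful.
    psutil.cpu_percent(interval=None)
    while time.monotonic() < end:
        # cpu_percent(interval=interval_s) blocks for the interval and
        # returns utilization across all cores for that window.
        samples.append(psutil.cpu_percent(interval=interval_s))
    return sum(samples) / len(samples)

if __name__ == "__main__":
    print(f"Average CPU utilization: {average_cpu_utilization():.1f}%")
```

Run this while the playback loop is active; the reported mean corresponds to the "average CPU utilization" figures in our charts.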

Comments

  • Wozza - Monday, March 17, 2008 - link

    "As TV shows transition to HD, we will likely see 1080i as the choice format due to the fact that this is the format in which most HDTV channels are broadcast (over-the-air and otherwise), 720p being the other option."

    I would like to point out that 1080i has become a popular broadcast standard because of its lower broadcast bandwidth requirements. TV shows are generally mastered in 1080p; 1080i dubs are then pulled from those masters and delivered to broadcasters (although some networks still don't work with HD at all; MTV, for instance, takes all deliveries on Digital Betacam). Pretty much the only people shooting and mastering in 1080i are live sports, some talk shows, reality TV, and the evening news.
    Probably 90% of TV and film related Blu-rays will be 1080p.
  • redpriest_ - Monday, July 23, 2007 - link

    Hint: They didn't. What AnandTech isn't telling you is that NO NVIDIA card supports HDCP over dual-link DVI, so yeah, you know that hot and fancy 30" LCD with its gorgeous 2560x1600 res? You need to drop it down to 1280x800 to get it to work with an NVIDIA solution.

    This is a very significant problem, and I for one applaud ATI for including HDCP over dual-link DVI.
  • DigitalFreak - Wednesday, July 25, 2007 - link

    Pwnd!
  • defter - Tuesday, July 24, 2007 - link

    You are wrong.

    Check Anand's 8600 review; it clearly states that the 8600/8500 cards support HDCP over dual-link DVI.
  • DigitalFreak - Monday, July 23, 2007 - link

    http://guru3d.com/article/Videocards/443/5/
  • Chadder007 - Monday, July 23, 2007 - link

    I see the ATI cards lower CPU usage, but how do the power readings compare when the GPU is doing the decoding versus the CPU?
  • chris92314 - Monday, July 23, 2007 - link

    Does the HD video acceleration work with other programs, and with non-Blu-ray/HD-DVD sources? For example, if I wanted to watch an H.264-encoded .mkv file, would I still see the performance and image enhancements?
  • GPett - Monday, July 23, 2007 - link

    Well, what annoys me is that there used to be all-in-wonder video cards for this kinda stuff. I do not mind a product line that has TV tuners and HD playback codecs, but not at the expense of 3d performance.

    It is a mistake for ATI and Nvidia to try to include this stuff on all video cards. The current 2xxx and 8xxx generation of video cards might not have been as pathetic had the two GPU giants focused on actually making a good GPU instead of adding features that not everyone wants.

    I am sure lots of people watch movies on their computer. I do not. I don't want a GPU with those features. I want a GPU that is good at playing games.
  • autoboy - Wednesday, July 25, 2007 - link

    All in Wonder cards are a totally different beast. The All in Wonder was simply a combination of a TV tuner (and a rather poor one) and a normal graphics chip; the tuner records TV and has nothing to do with playback. ATI no longer sells All in Wonder cards because the two halves aged at different rates: the graphics core went obsolete quickly while the tuner did not, forcing the buyer to replace an expensive AIW card when only the graphics part was outdated. A separate tuner card made much more sense.

    Playback of video is a totally different thing, and the AIW cards performed exactly the same as regular video cards based on the same chip. At the time, playing video on the PC was rarer, and video playback was essentially the same across cards because none of them offered hardware deinterlacing. Now video on the PC is abundant and is the new killer app (besides graphics) driving PC performance, storage, and internet speed.

    Nvidia was first to the party with Purevideo, which did hardware deinterlacing for DVDs and SD TV on the video card instead of in software. It was far superior to any software solution at the time (save a few diehard fans of Dscaler with IVTC) and came out at exactly the right time, with the introduction of Media Center, cheap TV tuner cards, and HD video. Now Purevideo 2 and AVIVO HD bring the same high quality deinterlacing to HD video for MPEG-2 (the 7600GT and up could already do HD MPEG-2 deinterlacing) as well as VC-1 and H.264 content.

    If you don't think this is important, remember that all the new satellite HD broadcasts coming online are 1080i H.264, which requires deinterlacing to look its best, and products already exist (if you are willing to work for it) that let you record this content on your computer. Also, new TV series are likely to be released in 1080i on HD discs because that is their most common broadcast format. If you don't need this, fine, but they sell a lot of cards to people who do.
  • autoboy - Wednesday, July 25, 2007 - link

    Oh, I forgot to mention that only the video decode acceleration requires extra transistors; the deinterlacing calculations are done on the programmable shaders, requiring no additional hardware, just extra code in the drivers. The faster the video card, the better your deinterlacing, which explains why the 2400 and the 8500 cannot get perfect scores on the HQV tests. You can verify this on HD 2x00 cards by watching the GPU % in RivaTuner while forcing different adaptive deinterlacing modes in CCC. This only works in XP, btw.
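To make the shader-based "adaptive deinterlacing" idea from the comments above concrete, here is a minimal editorial sketch of per-pixel motion-adaptive deinterlacing: weave where the picture is static across frames (keeping full vertical resolution) and fall back to bob interpolation where motion is detected. This is our own illustration, not ATI's or NVIDIA's actual algorithm; the grayscale frame layout and motion threshold are assumptions for the example.

```python
# A minimal, editorial sketch of per-pixel motion-adaptive deinterlacing,
# the general technique discussed above; NOT ATI's or NVIDIA's shader code.
import numpy as np

def deinterlace_adaptive(prev_frame: np.ndarray,
                         curr_frame: np.ndarray,
                         top_field_first: bool = True,
                         motion_threshold: float = 12.0) -> np.ndarray:
    """Deinterlace curr_frame (H x W grayscale) using prev_frame for motion.

    Static pixels keep the weaved (full vertical resolution) value;
    moving pixels are replaced by the average of the lines above and
    below from the opposite field (bob interpolation).
    """
    out = curr_frame.astype(np.float32)
    h, _ = curr_frame.shape
    # Lines belonging to the "other" field, which may exhibit combing.
    start = 1 if top_field_first else 0
    for y in range(start, h, 2):
        above = out[max(y - 1, 0)]
        below = out[min(y + 1, h - 1)]
        bob = (above + below) / 2.0  # spatial interpolation
        # Per-pixel motion estimate: difference against the previous frame.
        motion = np.abs(curr_frame[y].astype(np.float32) -
                        prev_frame[y].astype(np.float32))
        moving = motion > motion_threshold
        # Weave where static, bob where moving.
        out[y] = np.where(moving, bob, out[y])
    return out.astype(curr_frame.dtype)
```

In a real player this per-pixel decision runs in a pixel shader across the whole frame, which is why, as autoboy notes, deinterlacing quality scales with shader throughput and slower parts like the 2400 and 8500 fall short on HQV.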
