The Test

Our test setup consisted of multiple processors: a high-end, a low-end, and a previous-generation part. Our goal was to evaluate how much difference hardware decode makes for each of these classes of CPU, and to determine how much value video offload really brings to the table today.

Performance Test Configuration:
CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Intel Core 2 Duo E4300 (1.8GHz/2MB)
Intel Pentium 4 560 (3.6GHz)
Motherboard: ASUS P5W-DH
Chipset: Intel 975X
Chipset Drivers: Intel 8.2.0.1014
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 8.38.9.1-rc2
NVIDIA ForceWare 163.11
Desktop Resolution: 1920 x 1080 - 32-bit @ 60Hz
OS: Windows Vista x86


We are using PowerDVD Ultra 7.3 with patch 3104a applied. This patch fixed many of our playback issues and brought PowerDVD up to the level we wanted and expected. We did, however, have difficulty disabling GPU acceleration with this version of PowerDVD, so we will be unable to present CPU-only decoding numbers. From our previous experience, though, only CPUs faster than an E6600 can guarantee smooth decoding in the absence of GPU acceleration.

As for video tests, we have the final version of Silicon Optix HD HQV for HD-DVD, and we will be scoring these subjective tests to the best of our ability using the criteria provided by Silicon Optix and the examples they provide on their disc.

For performance we used perfmon to record average CPU utilization over 100 seconds (the default loop time). Our performance tests will include three different clips: The Transporter 2 trailer from The League of Extraordinary Gentlemen Blu-ray disc (H.264), Yozakura (H.264), and Serenity (VC-1). All of these tests proved to be very consistent in performance under each of our hardware configurations. Therefore, for readability's sake, we will only be reporting average CPU overhead.
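For readers who want to reproduce this kind of measurement, perfmon (or its command-line cousin, typeperf) can log the "% Processor Time" counter to a CSV file, which can then be averaged over the playback loop. Below is a minimal illustrative sketch in Python, with a hypothetical log excerpt; the machine name, timestamps, and values are made up for demonstration and are not from our test runs.

```python
import csv
import io

def average_cpu_utilization(csv_text):
    """Average the '% Processor Time' column of a perfmon/typeperf CSV export.

    perfmon logs one sample per row: the first column is a timestamp and
    each subsequent column is a counter value. This sketch assumes a single
    counter column alongside the timestamp.
    """
    reader = csv.reader(io.StringIO(csv_text))
    next(reader)  # skip the header row
    samples = [float(row[1]) for row in reader if row]
    return sum(samples) / len(samples)

# Hypothetical excerpt of a typeperf log captured during a playback loop.
log = '''"(PDH-CSV 4.0)","\\\\HTPC\\Processor(_Total)\\% Processor Time"
"07/23/2007 12:00:01","18.2"
"07/23/2007 12:00:02","21.7"
"07/23/2007 12:00:03","16.4"
'''
print(round(average_cpu_utilization(log), 1))  # -> 18.8
```

In practice you would sample once per second for the full 100-second loop rather than the three rows shown here; averaging over the whole loop smooths out the spikes that occur at scene changes.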

63 Comments

  • bpt8056 - Monday, July 23, 2007 - link

    Does it have HDMI 1.3?
  • phusg - Monday, July 23, 2007 - link

    quote:

    As Derek mentioned, higher than 75% produced banding.


    Indeed, which makes it strange that he gave the NVIDIA cards 100% scores! Sure, manual control over the noise filter is nice, but 100% is 100%, Derek. It working badly when set above 75% makes for a less than perfect HQV score IMHO. Personally I would have gone with knocking 5 points off the NVIDIA cards' noise scores for this.
  • Scrogneugneu - Monday, July 23, 2007 - link

    I would have cut points back too, but not because the image quality goes down at 100%. There's no sense in providing a slider if every position on the slider gives the same perfect image, is there?

    Giving a slider, however, isn't very user-friendly from an average Joe's perspective. I want to drop my movie in the player and watch it, and I want it to look great. I do not want to move a slider around for every movie to get good picture quality. It makes me think of the tracking on old VHS players. Quite annoying.


    From a technological POV, yes, NVIDIA's implementation enables players to be great. From a consumer's POV, it doesn't. I wanna watch a movie, not fine-tune my player.
  • Chunga29 - Monday, July 23, 2007 - link

    It's all about the drivers, people! TechReport did their review with older drivers (at least on the NVIDIA side). So in the past two weeks, NVIDIA apparently addressed some problems and AT took a look at the current results. Probably delayed the article a couple times to rerun tests as well, I bet!

    As for the above comment about the slider, what you're failing to realize is that noise reduction impacts the final output. I believe Sin City used a lot of noise intentionally, so if you watch that on ATI hardware the result will NOT be what the director wanted. A slider is a bit of a pain, but then being a videophile is also a pain at times. With an imperfect format and imperfect content, we will always have to deal with imperfect solutions. I'd take NVIDIA here as well, unless/until ATI offers the ability to shut off NR.
  • phusg - Monday, July 23, 2007 - link

    Hi Derek,
    Nice article, although I've just noticed a major omission: you didn't bench any AGP cards! There are AGP versions of the 2600 and 2400 cards and I think these are very attractive upgrades for AGP HTPC owners who are probably lacking the CPU power for full HD. The big question is whether the unidirectional AGP bus is up to the HD decode task. The previous generation ATi X1900 AGP cards reportedly had problems with HD playback.

    Hopefully you'll be able to look into this, as AFAIK no-one else has yet.

    Regards, Pete
  • ericeash - Monday, July 23, 2007 - link

    I would really like to see these tests done on an AMD X2 proc. The Core 2 Duos don't need as much offloading as we do.
  • Orville - Monday, July 23, 2007 - link

    Derek,

    Thanks so much for the insightful article. I’ve been waiting on it for about a month now, I guess. You or some reader could help me out with a couple of embellishments, if you would.

    1. How much power do the ATI Radeon HD 2600 XT, Radeon HD 2600 Pro, NVIDIA GeForce 8600 GTS, and GeForce 8600 GT graphics cards burn?

    2. Do all four of the above-mentioned graphics cards provide HDCP on their DVI output? Do they provide simultaneous HDCP on dual DVI outputs?

    3. Do you recommend CyberLink's PowerDVD video playback software exclusively?

    Regards,

    Orville

  • DerekWilson - Monday, July 23, 2007 - link

    we'll add power numbers tonight ... sorry for the omission

    all had hdcp support, not all had hdcp over dual-link dvi support

    powerdvd and windvd are good solutions, but powerdvd is currently further along. we don't recommend it exclusively, but it is a good solution.
  • phusg - Wednesday, July 25, 2007 - link

    I still can't see them, have they been added? Thanks.
  • GlassHouse69 - Monday, July 23, 2007 - link

    I agree here, good points.

    15% CPU utilization looks great until... you find that an E4300 draws so little power that using 50% of it to decode amounts to only about 25 watts. It is nice seeing things offloaded from the CPU... IF the video card isn't cranking out a lot of heat and power.
