Before proceeding to the conclusions, let us deal with a couple of topics which didn't fit into any of the preceding sections.

First off, we have some power consumption numbers. In addition to idle power, we also measure the average power consumption of the testbed over a 15-minute interval while playing back a 1080p24 MKV file in MPC-HC.
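
For readers who want to reproduce this kind of measurement, here is a minimal sketch of how an average playback figure can be computed from a power-meter log. The 1 Hz sampling rate, CSV layout, and file name are assumptions for illustration, not a description of our actual setup.

```python
# Average power from (elapsed_seconds, watts) samples logged by a power meter.
# The CSV layout and 1-second sampling interval are assumptions for illustration.
import csv

def average_power(log_path):
    watts = []
    with open(log_path, newline="") as f:
        for row in csv.reader(f):
            # Each row is assumed to be "elapsed_seconds,watts"
            watts.append(float(row[1]))
    return sum(watts) / len(watts)

if __name__ == "__main__":
    # A 15-minute capture at 1 Hz would contain roughly 900 samples.
    print(f"Average playback power: {average_power('playback_log.csv'):.1f} W")
```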

 
HTPC Testbed Power Consumption

Configuration                  Idle Power (W)    Playback Power (W)
HTPC Testbed (Core i5-680)     56.6              67.7
AMD 6450                       66.4              84.9
MSI 6450                       66.2              78.4
Sapphire 6570                  66.7              79.6
NVIDIA GT 430                  65.7              76.0
MSI GT 520                     67.0              73.4

There is not much to infer from the above power consumption numbers except that the GDDR5-based AMD 6450 is best avoided. All the cards idle at roughly the same level, and the AMD cards draw slightly more power during video playback.

I am sure many readers are also interested in the performance of the GPUs with 3D videos. With the latest PowerDVD and Total Media Theater builds, all the 3D Blu-rays we tried played back without issues. Beyond this, we didn't feel it necessary to devote time to developing a benchmarking methodology for 3D videos. There is no standardized way to store and transfer 3D videos: 3D Blu-ray ISOs are different from the 3D MKV format, which, in turn, differs from the formats adopted by some camcorder manufacturers. In our opinion, the 3D ecosystem for HTPCs is still a mess. It is no secret that NVIDIA has invested heavily in the 3D ecosystem. In addition to supporting 3D movies, they also supply software to view stereoscopic photographs. If you plan on connecting your HTPC to a 3D TV and also intend to invest in 3D cameras or camcorders, the NVIDIA GPUs are the better choice (purely from a support viewpoint). If all you want to do is play back your 3D Blu-rays, any current GPU solution (Intel, AMD, or NVIDIA) should be fine. Note that SBS/TAB (side-by-side / top-and-bottom) 3D streams (as used in TV broadcasts) are likely to perform similarly to 2D 720p/1080i content, as illustrated in the sketch below.
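
To see why half-resolution SBS streams don't stress the decoder any more than 2D content, consider how a side-by-side frame is packed: each eye's view is squeezed to half width so the combined frame keeps an ordinary 2D resolution. The sketch below illustrates the packing step with Pillow; the file names are hypothetical and this is purely an illustration, not part of our test methodology.

```python
# Pack a left-eye and right-eye frame into a single half-width side-by-side (SBS) frame.
# A 3D broadcast effectively does this per frame before encoding, so the encoded stream
# looks like an ordinary 2D frame to the decoder. File names here are hypothetical.
from PIL import Image

def pack_sbs(left_path, right_path, out_path):
    left = Image.open(left_path)
    right = Image.open(right_path)
    w, h = left.size
    # Squeeze each view to half width so the packed frame keeps the original 2D resolution.
    left_half = left.resize((w // 2, h))
    right_half = right.resize((w // 2, h))
    sbs = Image.new("RGB", (w, h))
    sbs.paste(left_half, (0, 0))
    sbs.paste(right_half, (w // 2, 0))
    sbs.save(out_path)

pack_sbs("left_eye.png", "right_eye.png", "sbs_frame.png")
```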

From a broadcast perspective, MPEG-2 is a mature codec, but it is not very efficient at HD resolutions, and H.264 is now widely preferred. Current H.264 broadcast encoders take in raw 4:2:2 10-bit data but compress it using 8-bit 4:2:0 encoders. Recently, companies have put forward 10-bit 4:2:2 encoding [PDF] as a way to boost the efficiency of H.264 encoding. Unfortunately, none of the GPUs support decoding such streams (encoded with the High10 profile). Considering that 10-bit 4:2:2 is only now finding acceptance within the professional community, we wouldn't fault the GPU vendors too much. However, x264 has started implementing 10-bit support, making it possible for users to generate / back up videos in the new profile (a sample encode is sketched below). We would like GPU vendors to provide decode support for the High10 AVC profile in their mainstream consumer offerings as soon as possible.
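
For readers who want to check whether their own playback chain copes with High10 content, the sketch below generates a short 10-bit test clip by invoking FFmpeg with libx264. This assumes an ffmpeg binary whose libx264 was built with 10-bit support; the input file name is hypothetical, and this is not the encode pipeline used in this review.

```python
# Create a short High10 (10-bit 4:2:0) H.264 test clip to see whether a given
# decoder/GPU handles it or falls back to software decode.
# Assumes ffmpeg is on the PATH and its libx264 supports 10-bit output;
# the input file name is hypothetical.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "sample_1080p24.mkv",
    "-t", "30",                      # a 30-second excerpt is enough for a decode test
    "-c:v", "libx264",
    "-profile:v", "high10",          # request the High10 AVC profile
    "-pix_fmt", "yuv420p10le",       # 10-bit 4:2:0 pixel format
    "-c:a", "copy",
    "high10_test.mkv",
], check=True)
```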

Comments

  • jwilliams4200 - Monday, June 13, 2011 - link

    All the numbers add up correctly now. Thanks for monitoring the comments and fixing the errors!
  • Samus - Monday, June 13, 2011 - link

    Honestly, my Geforce 210 has been chillin' in my HTPC for 2+ years, and works perfectly :)
  • josephclemente - Monday, June 13, 2011 - link

    If I am running a Sandy Bridge system with Intel HD Graphics 3000, do these cards have any benefit over integrated graphics? What is AnandTech's HQV benchmark score?

    I tried searching for scores, but people say this is subjective and one reviewer may differ from another. One site says 196 and another in the low 100s. What does this reviewer say?
  • ganeshts - Monday, June 13, 2011 - link

    Give me a couple of weeks. I will be getting a test system soon with the HD 3000, and I will do detailed HQV benchmarking in that review too.
  • dmsher99@gmail.com - Tuesday, June 14, 2011 - link

    I recently built an HTPC with a Core i5-2500K on an ASUS P8H67 EVO with a Ceton InfiniTV cable card. Note that the Intel driver is fundamentally flawed and will destroy a system if patched. See the Intel communities thread 20439 for more details.

    Besides causing BSODs over HDMI output when patched, the stable versions have their own sets of bugs, including a memory leak when watching some premium content on HD channels that crashed WMC. Intel appears to have one part-time developer working on this problem, but every test driver he puts out breaks more than it fixes. Watching the same content with a system running an NVIDIA GPU, the memory leak goes away.

    In my opinion, second-gen SB chips are just not ready for prime time in a fully loaded HTPC.
  • jwilliams4200 - Monday, June 13, 2011 - link

    "The first shot shows the appearance of the video without denoising turned on. The second shot shows the performance with denoising turned off. "

    Heads I win, tails you lose!
  • ganeshts - Monday, June 13, 2011 - link

    Again, sorry for the slip-up, and thanks for bringing it to our notice. Fixed it. Hopefully, the gallery pictures cleared up the confusion (particularly the Noise Reduction entry in the NVIDIA Control Panel).
  • stmok - Monday, June 13, 2011 - link

    Looking through various driver release README files, it appears the mobile Nvidia Quadro NVS 4200M (PCI Device ID: 0x1056) also has this feature set.

    The first stable Linux driver (x86) to introduce support for Feature Set D is the 270.41.03 release.
    => ftp://download.nvidia.com/XFree86/Linux-x86/270.41...

    It shows that only the GeForce GT 520 and Quadro NVS 4200M support Feature Set D.

    The most recent one confirms that they are still the only models to support it.
    => ftp://download.nvidia.com/XFree86/Linux-x86/275.09...
  • ganeshts - Monday, June 13, 2011 - link

    Thanks for bringing it to our notice. When that page was being written (around 2 weeks back), the README indicated that the GT 520 was the only GPU supporting Feature Set D. We will let the article stand as-is, and I am sure readers perusing the comments will become aware of this new GPU.
  • havoti97 - Monday, June 13, 2011 - link

    So basically the app store's purpose is to attract submissions of ideas for features of their next OS, uncompensated of course. All the other crap/fart apps not worthy of that are approved, and people make pennies off those.
