The lack of a standardized HTPC GPU evaluation methodology has always put us in a quandary when covering low-end / integrated GPUs. To address this, I had a long discussion with Andrew Van Til, Mathias Rauen and Hendrik Leppkes, all developers of popular open source multimedia software. The methodology we developed is presented below.

The first step is to ensure that all the post-processing steps work as expected; HQV benchmarking gives us an idea of this. Once a card's post-processed videos pass visual inspection, we need to gauge how much time is left for the GPU to perform further post-processing activities. These may include specialized scaling algorithms, higher bit-depth processing, etc., as implemented by custom MPC-HC shaders / renderers like madVR.

Deinterlacing and cadence detection are aspects which affect almost all HTPC users. Other aspects, such as denoising, edge sharpening and dynamic contrast enhancement, are not needed in the mainstream HTPC user's usage scenario. Most mainstream videos being watched are either from a Blu-ray source (or re-encoded offline from one) or TV shows which need deinterlacing (if they are in 480i / 1080i format).
 

[Screenshots: Denoising OFF vs. Denoising ON]
Under what circumstances would a GPU run out of steam for such post processing?

The intent of the benchmark is to first disable all post processing and check how fast the decoder can pump out decoded frames. In the typical scenario, we expect post processing to take more time than decoding. Identifying the stage which decides the throughput of the decoded frames tells us whether more post-processing steps can be added; this is similar to a pipeline whose operating frequency is decided by its slowest stage. We then enable the post-processing steps one by one and observe how the throughput is affected.

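As a concrete illustration of the headroom arithmetic behind this approach, here is a minimal C++ sketch. The frame rates used are hypothetical placeholders, not measured results from any of the cards in this review:

```cpp
// Minimal sketch of the per-frame headroom arithmetic behind the methodology.
// All fps figures below are hypothetical, not measured results.
#include <cstdio>

int main() {
    const double content_fps     = 23.976; // typical Blu-ray frame rate
    const double decode_only_fps = 90.0;   // hypothetical decode-only throughput

    // The time budget per frame at the content's frame rate, minus the time
    // the decoder itself needs, is what remains for post processing.
    const double budget_ms   = 1000.0 / content_fps;
    const double decode_ms   = 1000.0 / decode_only_fps;
    const double headroom_ms = budget_ms - decode_ms;

    std::printf("Per-frame budget   : %6.2f ms\n", budget_ms);
    std::printf("Decode time        : %6.2f ms\n", decode_ms);
    std::printf("Post-proc headroom : %6.2f ms\n", headroom_ms);

    // Pipeline view: overall throughput is set by the slowest stage. If an
    // enabled post-processing step needs more than headroom_ms per frame,
    // the measured throughput falls below content_fps and frames get dropped.
    return 0;
}
```
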
DXVA Checker enables us to measure the performance of the DXVA decoders. We use a standard set of 1080p / 1080i H.264, MPEG-2 and VC-1 clips, along with 1080p DivX / Xvid and MS-MPEG4 clips. The Cyberlink PowerDVD 11, ArcSoft TotalMedia Theatre 5 and MPC-HC video decoders were registered under DirectShow, and DXVA Checker was used to identify which of them could take advantage of DXVA2 and render under EVR for the sample clips. An interesting aspect to note was that none of the decoders could process the 1080i VC-1 or MPEG-4 clips with DXVA2.

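For readers curious about what a tool like DXVA Checker queries under the hood, the following C++ sketch enumerates the DXVA2 decoder profile GUIDs a driver exposes, using the public DXVA2 API. This is only an illustration of the underlying API (error handling omitted for brevity), not DXVA Checker's actual implementation:

```cpp
// Sketch: list the DXVA2 decoder profiles (H.264 VLD, VC-1, MPEG-2, ...)
// advertised by the GPU driver. Error handling omitted for brevity.
#include <windows.h>
#include <initguid.h>
#include <d3d9.h>
#include <dxva2api.h>
#include <cwchar>
#pragma comment(lib, "d3d9.lib")
#pragma comment(lib, "dxva2.lib")
#pragma comment(lib, "ole32.lib")

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;
    pp.hDeviceWindow    = GetDesktopWindow(); // dummy window for an offscreen device

    IDirect3DDevice9* dev = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, pp.hDeviceWindow,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &dev);

    // Ask the driver for its video decoder service and list every decoder
    // profile GUID it advertises.
    IDirectXVideoDecoderService* svc = nullptr;
    DXVA2CreateVideoService(dev, IID_IDirectXVideoDecoderService, (void**)&svc);

    UINT  count = 0;
    GUID* guids = nullptr;
    svc->GetDecoderDeviceGuids(&count, &guids);

    for (UINT i = 0; i < count; ++i) {
        wchar_t s[64];
        StringFromGUID2(guids[i], s, 64);
        std::wprintf(L"Decoder profile: %s\n", s);
    }

    CoTaskMemFree(guids);
    svc->Release();
    dev->Release();
    d3d->Release();
    return 0;
}
```

Comparing the reported GUIDs against the well-known profile constants (e.g. DXVA2_ModeH264_VLD_NoFGT) shows which codecs the driver can accelerate; whether a given DirectShow decoder actually uses them is what the DXVA Checker runs above verify.
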
Note that the results in the next section list all the cards being tested. However, the 6450s and the GT 520 shouldn't really be taken seriously because of the issues pointed out in the previous sections.

Comments

  • jwilliams4200 - Monday, June 13, 2011 - link

    All the numbers add up correctly now. Thanks for monitoring the comments and fixing the errors!
  • Samus - Monday, June 13, 2011 - link

    Honestly, my Geforce 210 has been chillin' in my HTPC for 2+ years, and works perfectly :)
  • josephclemente - Monday, June 13, 2011 - link

    If I am running a Sandy Bridge system with Intel HD Graphics 3000, do these cards have any benefit over integrated graphics? What is Anandtech's HQV Benchmark score?

    I tried searching for scores, but people say this is subjective and one reviewer may differ from another. One site says 196 and another in the low 100's. What does this reviewer say?
  • ganeshts - Monday, June 13, 2011 - link

    Give me a couple of weeks. I will be getting a test system soon with the HD 3000, and I will do detailed HQV benchmarking in that review too.
  • dmsher99@gmail.com - Tuesday, June 14, 2011 - link

    I recently built a HTPC with a core i5-2500k on a ASUS P8H67 EVO with a Ceton InfiniTV cable card. Note that the Intel driver is fundamentally flawed and will destroy a system if patched. See the Intel communities thread 20439 for more details.

    Besides causing BSODs over HDMI output when patched, the stable versions have their own sets of bugs, including a memory leak when watching some premium content on HD channels that crashes WMC. Intel appears to have one part-time developer working on this problem, but every test driver he puts out breaks more than it fixes. Watch the same content on a system running an NVIDIA GPU and the memory leak goes away.

    In my opinion, second-gen SB chips are just not ready for prime time in a fully loaded HTPC.
  • jwilliams4200 - Monday, June 13, 2011 - link

    "The first shot shows the appearance of the video without denoising turned on. The second shot shows the performance with denoising turned off. "

    Heads I win, tails you lose!
  • ganeshts - Monday, June 13, 2011 - link

    Again, sorry for the slip-up, and thanks for bringing it to our notice. Fixed it. Hopefully, the gallery pictures cleared up the confusion (particularly the Noise Reduction entry in the NVIDIA Control Panel)
  • stmok - Monday, June 13, 2011 - link

    Looking through various driver release README files, it appears the mobile Nvidia Quadro NVS 4200M (PCI Device ID: 0x1056) also has this feature set.

    The first stable Linux driver (x86) to introduce support for Feature Set D is 270.41.03 release.
    => ftp://download.nvidia.com/XFree86/Linux-x86/270.41...

    It shows only the Geforce GT 520 and Quadro NVS 4200M support Feature Set D.

    The most recent one confirms that they are still the only models to support it.
    => ftp://download.nvidia.com/XFree86/Linux-x86/275.09...
  • ganeshts - Monday, June 13, 2011 - link

    Thanks for bringing it to our notice. When that page was being written (around 2 weeks back), the README indicated that the GT 520 was the only GPU supporting Feature Set D. We will let the article stand as-is, and I am sure readers perusing the comments will become aware of this new GPU.
  • havoti97 - Monday, June 13, 2011 - link

    So basically the app store's purpose is to attract submissions of ideas for features of their next OS, uncompensated of course. All the other crap/fart apps not worthy are approved, and people make pennies off those.
