We will see in the 'DXVA Benchmarking' section that denoising is one of the more GPU-intensive video post-processing tasks. To put that in perspective, let us take a look at the denoising performance of each card and the factors that affect it.

In each of the galleries above, you can see a screenshot of a noisy video being played back with PowerDVD. The first shot shows the appearance of the video with denoising turned off, while the second shows the result with denoising enabled. For both cards, it can be seen that the denoising kicks in as expected, and this is also reflected in the relevant HQV benchmark section. With denoising turned on, note that the GPU load increases from 75% to 81% for the GT 520, while the corresponding increase for the GT 430 is much smaller.
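For readers who want to reproduce this kind of measurement on their own setup, a minimal sketch is given below: it polls the GPU utilization once a second while the clip plays and reports the spread. It assumes an NVIDIA card with nvidia-smi on the PATH; the script, its 30-second sampling window and its output format are our own illustration rather than the tool used for the screenshots above.

    import subprocess
    import time

    def gpu_utilization_percent():
        # Ask nvidia-smi for the current GPU utilization in percent.
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        return int(out.strip().splitlines()[0])

    if __name__ == "__main__":
        samples = []
        for _ in range(30):                  # sample for roughly 30 seconds
            samples.append(gpu_utilization_percent())
            time.sleep(1)
        print("min/avg/max GPU load: %d%% / %d%% / %d%%"
              % (min(samples), sum(samples) // len(samples), max(samples)))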

Is it similarly straightforward to test the denoising performance on the AMD GPUs? Unfortunately, it is not. AMD has a nifty feature called 'Enforce Smooth Video Playback' (ESVP) in the Catalyst Control Center. Simply put, the drivers automatically turn off post-processing features if they find that the card is not powerful enough to apply them in real time. How well does this feature work? Since we are on the topic of denoising, let us check that first.

The first shot shows the noisy video being played back with ESVP on and the denoising options turned off. The second and third shots show the denoising options (Denoise and Mosquito Noise Reduction) taking effect; note the GPU load increasing from 40% to 49%. The fourth shot in the gallery shows that ESVP has no effect on the denoising here. Turning off ESVP increases the GPU load from 49% to 88%, which implies that some other post-processing option was enabled in the CCC but didn't actually kick in because the card was judged too weak.
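AMD does not document how ESVP decides what to drop, but conceptually it boils down to shedding optional post-processing stages whenever the pipeline keeps missing its real-time deadline. The sketch below is only a rough model of that idea; the frame-budget check, the miss threshold and the way stages are shed are our assumptions, not AMD's actual driver logic.

    import time
    from collections import deque

    FRAME_BUDGET_S = 1 / 60.0    # 1080p60: roughly 16.7 ms per frame

    def play(frames, decode, optional_stages):
        # Decode every frame, then apply optional post-processing stages
        # (e.g. denoise, mosquito noise reduction). If too many recent
        # frames blow the real-time budget, silently drop the last optional
        # stage -- loosely what 'Enforce Smooth Video Playback' appears to do.
        enabled = list(optional_stages)
        recent_misses = deque(maxlen=30)
        for frame in frames:
            start = time.perf_counter()
            image = decode(frame)
            for stage in enabled:
                image = stage(image)
            recent_misses.append(time.perf_counter() - start > FRAME_BUDGET_S)
            if enabled and sum(recent_misses) > len(recent_misses) // 3:
                enabled.pop()                # shed post-processing, keep decode
                recent_misses.clear()
            yield image

In the 6450's case, the real problem is that this shedding is silent: the denoising checkboxes stay ticked in the CCC even though the corresponding stages are no longer being applied.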

Moving on to the MSI 6450, the gallery below presents two shots.

The first one forces the denoising algorithms to take effect by disabling ESVP. Note that the GPU load rockets up to 100%, and the video soon turns into a slideshow. In the second shot, ESVP is turned on and the denoising algorithms are also enabled. It was quite evident that the denoising didn't take effect: the drivers silently turned it off. This can also be inferred from the fact that enabling the denoising options did not push the GPU load back up to 100%.

AMD acknowledged the issue and indicated that they are working on a fix. I have little doubt that this will be resolved soon, because the same files play back from a Blu-ray disc with all the post-processing options enabled. However, with the current drivers, the DDR3-based 6450 suffers heavily.

The Sapphire 6570 is, thankfully, not an ESVP mess like the 6450. The gallery below presents two shots.

The first one has ESVP on, but the denoising algorithms are off. The video is clearly noisy, and GPU utilization is pegged at 52%. In the second shot, ESVP is off (which means that almost all the video post processing algorithms except brightness level adjustments are forced to take effect). GPU utilization shoots up to 76%, but the end results are very good. It is a matter of personal taste, but the addition of mosquito noise reduction seems to make the AMD denoising results much better than NVIDIA's.
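To give a feel for what a spatial denoiser does to a frame like the ones in these screenshots, the snippet below runs OpenCV's non-local means filter on a single extracted frame. This is a generic denoiser, not the algorithm AMD or NVIDIA ship in their drivers (and not a mosquito-noise-specific filter), and the file names and filter strengths are placeholders.

    import cv2

    # Hypothetical frame grab from the noisy clip.
    frame = cv2.imread("noisy_frame.png")

    denoised = cv2.fastNlMeansDenoisingColored(
        frame, None,
        6,      # h: luma filter strength
        6,      # hColor: chroma filter strength
        7,      # templateWindowSize
        21,     # searchWindowSize
    )
    cv2.imwrite("denoised_frame.png", denoised)

Comparing a frame grab before and after a filter like this is also a convenient way to sanity-check whether a driver-side denoiser actually kicked in.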

Let us come back to the ESVP mess on the 6450s. The intent of ESVP is to make sure that the decoder puts out each decoded frame within the required time, and it is reasonable to forsake post-processing steps if the GPU is unable to keep up. We saw in the 'Custom Refresh Rates' section that both the 6450s were unable to keep up with 1080p60 H264 decoding. Those tests were run with ESVP turned on. The gallery below shows how the same video can be played back with all the post-processing options (including ESVP) turned off.

It is clear that the UVD engine in the 6450 can handle 1080p60 H264 decoding; it is the combination of ESVP and the other post-processing features which makes AVCHD clips unplayable on the 6450s. The last two shots in the gallery are from the MSI 6450. They show that 1080p60 H264 decode with all the CCC options turned off results in a GPU load of 36%, while turning on ESVP makes the load shoot up to 100% and results in jerky playback. This, however, has not been acknowledged by AMD as a problem yet.
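A quick back-of-envelope calculation shows how thin the margin is. If we read the reported GPU load loosely as the fraction of each frame interval the chip is busy (an approximation, since decode runs on the UVD block while post-processing runs on the shaders), the numbers above work out as follows.

    fps = 60
    budget_ms = 1000.0 / fps    # ~16.7 ms available per frame at 1080p60
    decode_load = 0.36          # MSI 6450, all CCC options off (from above)
    decode_ms = decode_load * budget_ms

    print("Frame budget:              %.1f ms" % budget_ms)
    print("Decode alone (~36%% load): %.1f ms" % decode_ms)
    print("Left for post-processing:  %.1f ms" % (budget_ms - decode_ms))
    # Once ESVP and the other CCC options push the load to 100%, there is
    # no headroom left, frames miss the 16.7 ms deadline, and playback
    # turns jerky.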

In addition, the gallery below shows screenshots of a 1080p24 video being played back on the MSI 6450 (DDR3 based, lower core clock) in PowerDVD 11 and MPC-HC.

In both cases, the GPU load regularly spikes up to 100%, resulting in very noticeable stutters in video playback. The problem was reproducible with MPC-HC as well as PowerDVD. We suspect that a combination of AMD's drivers and the lower core clock of the MSI 6450 is causing this issue.

The takeaway from this section is that the AMD drivers need a lot of work with respect to ESVP on the 6450s. The denoising performance of both NVIDIA cards is passable, though I personally find AMD's denoising implementation (in the 6570) to be better. However, I strongly recommend that readers avoid the 6450s for some time to come.


70 Comments


  • jwilliams4200 - Monday, June 13, 2011 - link

    All the numbers add up correctly now. Thanks for monitoring the comments and fixing the errors!
  • Samus - Monday, June 13, 2011 - link

    Honestly, my Geforce 210 has been chillin' in my HTPC for 2+ years, and works perfectly :)
  • josephclemente - Monday, June 13, 2011 - link

    If I am running a Sandy Bridge system with Intel HD Graphics 3000, do these cards have any benefit over integrated graphics? What is Anandtech's HQV Benchmark score?

    I tried searching for scores, but people say this is subjective and one reviewer may differ from another. One site says 196 and another in the low 100's. What does this reviewer say?
  • ganeshts - Monday, June 13, 2011 - link

    Give me a couple of weeks. I will be getting a test system soon with the HD 3000, and I will do detailed HQV benchmarking in that review too.
  • dmsher99@gmail.com - Tuesday, June 14, 2011 - link

    I recently built a HTPC with a core i5-2500k on a ASUS P8H67 EVO with a Ceton InfiniTV cable card. Note that the Intel driver is fundamentally flawed and will destroy a system if patched. See the Intel communities thread 20439 for more details.

    Besides causing BSODs over HDMI output when patched, the stable versions have their own sets of bugs, including a memory bleed when watching some premium content on HD channels that crashed WMC. Intel appears to have one part-time developer working on this problem, but every test driver he puts out breaks more than it fixes. Watching the same content on a system running an NVIDIA GPU, the memory bleed goes away.

    In my opinion, the second gen SB chips are just not ready for prime time in a fully loaded HTPC.
  • jwilliams4200 - Monday, June 13, 2011 - link

    "The first shot shows the appearance of the video without denoising turned on. The second shot shows the performance with denoising turned off. "

    Heads I win, tails you lose!
  • ganeshts - Monday, June 13, 2011 - link

    Again, sorry for the slip-up, and thanks for bringing it to our notice. Fixed it. Hopefully, the gallery pictures cleared up the confusion (particularly the Noise Reduction entry in the NVIDIA Control Panel)
  • stmok - Monday, June 13, 2011 - link

    Looking through various driver release README files, it appears the mobile Nvidia Quadro NVS 4200M (PCI Device ID: 0x1056) also has this feature set.

    The first stable Linux driver (x86) to introduce support for Feature Set D is 270.41.03 release.
    => ftp://download.nvidia.com/XFree86/Linux-x86/270.41...

    It shows only the Geforce GT 520 and Quadro NVS 4200M support Feature Set D.

    The most recent one confirms that they are still the only models to support it.
    => ftp://download.nvidia.com/XFree86/Linux-x86/275.09...
  • ganeshts - Monday, June 13, 2011 - link

    Thanks for bringing it to our notice. When that page was being written (around 2 weeks back), the README indicated that the GT 520 was the only GPU supporting Feature Set D. We will let the article stand as-is, and I am sure readers perusing the comments will become aware of this new GPU.
  • havoti97 - Monday, June 13, 2011 - link

    So basically the app store's purpose is to attract submissions of ideas for features of their next OS, uncompensated of course. All the other crap/fart apps not worthy are approved and people make pennies off those.
