The author of LAV Splitter / Audio Decoder has another nifty tool coded up for HTPC users with NVIDIA cards. Built on the CUDA SDK, it is called LAV CUVID. Despite the name, the decoder does not use CUDA compute kernels to decode. NVIDIA provides an extension to CUDA called CUVID, which simply provides access to the GPU's dedicated hardware decoder.
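As a concrete illustration, here is a minimal sketch of what talking to CUVID looks like from C++. The parameter choices below (surface counts, deinterlacing mode) are illustrative assumptions for the example, not LAV CUVID's actual settings, and error handling is omitted:

```cpp
// Minimal sketch: creating an H.264 decoder via NVIDIA's CUVID extension
// (nvcuvid.h). Assumes a CUDA context is already current on the device.
#include <cuda.h>
#include <nvcuvid.h>

CUvideodecoder createH264Decoder(unsigned int width, unsigned int height)
{
    CUVIDDECODECREATEINFO info = {};
    info.CodecType           = cudaVideoCodec_H264;
    info.ulWidth             = width;
    info.ulHeight            = height;
    info.ulNumDecodeSurfaces = 8;   // assumption: modest surface pool
    info.ChromaFormat        = cudaVideoChromaFormat_420;
    info.OutputFormat        = cudaVideoSurfaceFormat_NV12;
    info.ulTargetWidth       = width;
    info.ulTargetHeight      = height;
    info.ulNumOutputSurfaces = 2;
    info.DeinterlaceMode     = cudaVideoDeinterlaceMode_Adaptive;

    CUvideodecoder decoder = nullptr;
    // No CUDA kernels involved: this call drives the fixed-function decode engine.
    cuvidCreateDecoder(&decoder, &info);
    return decoder;
}
```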

The only unfortunate aspect of LAV CUVID is that it is restricted to NVIDIA GPUs. While OpenCL may be an open alternative to CUDA, it does not provide a CUVID-like video decoding extension on its own. ATI/AMD has the OpenVideoDecode (OVD) API, an extension to OpenCL, but despite being open it hasn't gained as much traction as CUDA. The AMD APIs are also fairly new and probably not mature enough for developers to focus their attention on them yet. Intel offers a similar API through its Media SDK; again, the lack of adoption seems to turn developers away.

On Linux, there is the VA-API abstraction layer, which is natively supported by Intel and has compatibility layers on top of VDPAU (NVIDIA) and OVD (ATI/AMD). On Linux, therefore, it is theoretically possible for developers to create a hardware video decoder that works across GPU vendors. However, Linux has no support for HD audio bitstreaming. On Windows, developers who want a decoder that works on all three vendors' GPUs are forced to use DXVA(2).
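To make the abstraction concrete, here is a hedged sketch of probing decode support through VA-API; the same calls work whether the backend is Intel's native driver or the VDPAU/OVD compatibility layers. An X11 display is assumed and error handling is omitted:

```cpp
// Sketch: enumerate the hardware decode profiles VA-API exposes on this system.
#include <va/va.h>
#include <va/va_x11.h>
#include <X11/Xlib.h>
#include <cstdio>
#include <vector>

int main()
{
    Display *x11 = XOpenDisplay(nullptr);
    VADisplay va = vaGetDisplay(x11);

    int major = 0, minor = 0;
    vaInitialize(va, &major, &minor);

    std::vector<VAProfile> profiles(vaMaxNumProfiles(va));
    int count = 0;
    // Reports e.g. VAProfileH264High or VAProfileVC1Advanced, regardless of
    // whether the work ultimately lands on Intel, NVIDIA, or AMD hardware.
    vaQueryConfigProfiles(va, profiles.data(), &count);
    std::printf("VA-API %d.%d: %d decode profiles\n", major, minor, count);

    vaTerminate(va);
    XCloseDisplay(x11);
    return 0;
}
```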

Is there an incentive for NVIDIA users to shift from the tried and tested MPC Video Decoder (which uses DXVA(2))? I personally use LAV CUVID as my preferred decoder on NVIDIA systems for the following reasons:

  • Support for uniform hardware acceleration across multiple codecs: In theory, every mode listed as supported by DXVA Checker should be usable by the software decoders. Unfortunately, that is not the case. This becomes evident when the 'Check DirectShow/MediaFoundation Decoders' feature is used to verify compatibility with an MPEG-4 or interlaced VC-1 stream: the mode either comes out as 'Unsupported', or is active only under DXVA1 with the VMR (Video Mixing Renderer). LAV CUVID doesn't show DXVA support under DXVA Checker (because it really doesn't use DXVA). However, analysis of the GPU/CPU load reveals that its performance and GPU usage are very similar to those of the DXVA2 decoders. Furthermore, all our GPU stress test clips were hardware accelerated except for the MS-MPEG4 clip.
  • Support for choice of renderer: For the average Windows 7 HTPC user, the EVR (Enhanced Video Renderer) is much better than the VMR, since it contains multiple enhancements which are beyond the scope of this piece.

Almost all DXVA2 decoders can connect to the EVR. Advanced HTPC users are more demanding, however, and want more post-processing than the EVR provides. This is where madVR enters the scene, with support for multiple post-processing steps which we will cover further down in this section. Unfortunately, it doesn't interface with DXVA decoders. The LAV CUVID decoder can feed all of these renderers, and is not restricted like the other DXVA2 decoders.
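For readers wondering what "interfacing to a renderer" amounts to, the following is a simplified sketch (not MPC-HC's actual code) of how a DirectShow player forces a particular renderer into the graph. COM initialization is assumed; EVR ships with Windows, and madVR would be instantiated the same way through the CLSID it registers:

```cpp
// Sketch: build a DirectShow graph with an explicitly chosen video renderer.
#include <dshow.h>
#include <evr.h>      // CLSID_EnhancedVideoRenderer
#pragma comment(lib, "strmiids.lib")

HRESULT playWithRenderer(const wchar_t *file, REFCLSID rendererClsid)
{
    IGraphBuilder *graph = nullptr;
    HRESULT hr = CoCreateInstance(CLSID_FilterGraph, nullptr,
                                  CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&graph));
    if (FAILED(hr)) return hr;

    IBaseFilter *renderer = nullptr;
    hr = CoCreateInstance(rendererClsid, nullptr,
                          CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&renderer));
    if (SUCCEEDED(hr))
        hr = graph->AddFilter(renderer, L"Video Renderer");

    // Intelligent Connect builds the rest of the chain around the
    // renderer we pre-loaded into the graph.
    if (SUCCEEDED(hr))
        hr = graph->RenderFile(file, nullptr);

    if (renderer) renderer->Release();
    graph->Release();
    return hr;
}
// Usage: playWithRenderer(L"clip.mkv", CLSID_EnhancedVideoRenderer);
```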

Starting with v0.8, LAV CUVID has an installation program. Prior to that, the filters had to be registered manually, as shown in the gallery below.

After downloading and extracting the archive, the installation batch script needs to be run with administrator privileges. Once the filter is successfully registered, your favorite DirectShow player can be configured to use LAV CUVID. The setup process for MPC-HC is shown in the gallery: make sure the internal transform filters for the codecs you want LAV CUVID to decode are unselected, then add LAV CUVID in the External Filters section and set it to 'Prefer'.
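Under the hood, the batch script is just invoking regsvr32, which does little more than the following (the DLL filename here is illustrative, not the filter's actual name):

```cpp
// Sketch of what 'regsvr32 <filter>.ax' does: load the DLL and call its
// DllRegisterServer export. Needs elevation, since the filter's CLSID is
// written under HKEY_CLASSES_ROOT.
#include <windows.h>

HRESULT registerFilter(const wchar_t *dllPath)
{
    HMODULE dll = LoadLibraryW(dllPath);
    if (!dll) return HRESULT_FROM_WIN32(GetLastError());

    typedef HRESULT (STDAPICALLTYPE *RegisterFn)();
    RegisterFn regServer = (RegisterFn)GetProcAddress(dll, "DllRegisterServer");
    HRESULT hr = regServer ? regServer() : E_NOINTERFACE;

    FreeLibrary(dll);
    return hr;
}
// Usage (illustrative filename): registerFilter(L"LAVCUVID.ax");
```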

Here is a sample screenshot with EVR CP statistics for an MKV file played back with LAV Splitter, LAV Audio Decoder, and the LAV CUVID Decoder.


Now, let us shift our focus to madVR, a replacement for the EVR renderer which can be downloaded here. Currently, madVR does not perform deinterlacing, noise reduction, edge enhancement, or other such post-processing steps by itself; these need to be done before the frame is presented to madVR for rendering. When using a DXVA decoder, these steps are enabled from the NVIDIA or AMD control panel settings. With the LAV CUVID decoder, we get whatever post-processing steps are enabled in the drivers. The decoded frames are copied back to the system RAM for madVR to use.
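The copy-back step looks roughly like this in CUVID terms; this is a hedged sketch with an assumed NV12 surface layout, not the decoder's actual code:

```cpp
// Sketch: map a decoded frame out of the CUVID decoder and copy it to
// system RAM, where a renderer such as madVR can consume it.
#include <cuda.h>
#include <nvcuvid.h>
#include <vector>

std::vector<unsigned char> copyFrameToSystemRAM(CUvideodecoder dec,
                                                int picIdx,
                                                unsigned int height)
{
    CUVIDPROCPARAMS vpp = {};
    CUdeviceptr devFrame = 0;
    unsigned int pitch = 0;
    cuvidMapVideoFrame(dec, picIdx, &devFrame, &pitch, &vpp);

    // NV12: full-height luma plane plus a half-height interleaved chroma plane.
    size_t bytes = (size_t)pitch * height * 3 / 2;
    std::vector<unsigned char> host(bytes);
    cuMemcpyDtoH(host.data(), devFrame, bytes);   // GPU -> system RAM

    cuvidUnmapVideoFrame(dec, devFrame);
    return host;
}
```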

The madVR renderer uses the GPU pixel shader hardware for the following steps:

  1. Chroma upsampling
  2. High bit-depth color conversion
  3. Scaling
  4. Display calibration (optional, if you have your own meter)
  5. Dithering from the internal calculation bit-depth (32-bit+) down to the display bit-depth (8-bit)

All of these steps are performed at a higher bit-depth, and with higher quality, than the standard GPU post-processing algorithms.
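As an example of why the final dithering step matters, here is a toy C++ version of the idea: an ordered-dither threshold decides the rounding direction, so that the average over a small area preserves the high-precision level instead of banding. This is a deliberate simplification; madVR's actual dithering is more sophisticated:

```cpp
// Toy illustration: quantize a high-precision pixel value to 8 bits using a
// 4x4 Bayer threshold matrix instead of plain rounding.
#include <algorithm>
#include <cmath>
#include <cstdint>

static const float kBayer4x4[4][4] = {   // thresholds in [0, 1)
    { 0.0f/16,  8.0f/16,  2.0f/16, 10.0f/16},
    {12.0f/16,  4.0f/16, 14.0f/16,  6.0f/16},
    { 3.0f/16, 11.0f/16,  1.0f/16,  9.0f/16},
    {15.0f/16,  7.0f/16, 13.0f/16,  5.0f/16},
};

// value: pixel intensity in [0, 1] at full internal precision; (x, y): position.
uint8_t ditherTo8Bit(float value, int x, int y)
{
    float scaled = value * 255.0f;
    float base = std::floor(scaled);
    // Round up when the leftover fraction beats the local threshold, so
    // neighboring pixels average out to the original high-precision level.
    if (scaled - base > kBayer4x4[y & 3][x & 3])
        base += 1.0f;
    return (uint8_t)std::clamp(base, 0.0f, 255.0f);
}
```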

The gallery below gives an overview of how to install madVR and configure it appropriately.

After downloading and extracting the archive, run the installation batch script to register the renderer filter. By default, madVR is grayed out under the 'DirectShow Video' section of the MPC-HC 'Output' options; after registration of the madVR filter, it becomes possible to select it. When you play a video with the new output settings, a security warning may pop up asking for permission to run the madVR control application. Allowing it to run creates a tray icon for controlling the madVR settings, as shown in the fifth screenshot in the gallery. Screenshots 6 through 12 show the various madVR post-processing options.

madVR requires a very powerful GPU to function well. Do the GT 430 and GT 520 cut it for the full madVR experience? We will try to find out in the next section.


  • jwilliams4200 - Monday, June 13, 2011

    All the numbers add up correctly now. Thanks for monitoring the comments and fixing the errors!
  • Samus - Monday, June 13, 2011

    Honestly, my Geforce 210 has been chillin' in my HTPC for 2+ years, and works perfectly :)
  • josephclemente - Monday, June 13, 2011

    If I am running a Sandy Bridge system with Intel HD Graphics 3000, do these cards have any benefit over integrated graphics? What is Anandtech's HQV Benchmark score?

    I tried searching for scores, but people say this is subjective and one reviewer may differ from another. One site says 196 and another in the low 100's. What does this reviewer say?
  • ganeshts - Monday, June 13, 2011

    Give me a couple of weeks. I will be getting a test system soon with the HD 3000, and I will do detailed HQV benchmarking in that review too.
  • dmsher99@gmail.com - Tuesday, June 14, 2011

    I recently built an HTPC with a Core i5-2500K on an ASUS P8H67 EVO with a Ceton InfiniTV cable card. Note that the Intel driver is fundamentally flawed and will destroy a system if patched. See the Intel communities thread 20439 for more details.

    Besides causing BSODs over HDMI output when patched, the stable versions have their own sets of bugs, including a memory leak when watching some premium content on HD channels that crashed WMC. Intel appears to have one part-time developer working on this problem, but every test driver he puts out breaks more than it fixes. Watch the same content on a system running an NVIDIA GPU and the memory leak goes away.

    In my opinion, second-gen Sandy Bridge chips are just not ready for prime time in a fully loaded HTPC.
  • jwilliams4200 - Monday, June 13, 2011

    "The first shot shows the appearance of the video without denoising turned on. The second shot shows the performance with denoising turned off. "

    Heads I win, tails you lose!
  • ganeshts - Monday, June 13, 2011

    Again, sorry for the slip-up, and thanks for bringing it to our notice. Fixed it. Hopefully, the gallery pictures cleared up the confusion (particularly the Noise Reduction entry in the NVIDIA Control Panel)
  • stmok - Monday, June 13, 2011

    Looking through various driver release README files, it appears the mobile Nvidia Quadro NVS 4200M (PCI Device ID: 0x1056) also has this feature set.

    The first stable Linux driver (x86) to introduce support for Feature Set D is 270.41.03 release.
    => ftp://download.nvidia.com/XFree86/Linux-x86/270.41...

    It shows only the Geforce GT 520 and Quadro NVS 4200M support Feature Set D.

    The most recent one confirms that they are still the only models to support it.
    => ftp://download.nvidia.com/XFree86/Linux-x86/275.09...
  • ganeshts - Monday, June 13, 2011

    Thanks for bringing it to our notice. When that page was being written (around 2 weeks back), the README indicated that the GT 520 was the only GPU supporting Feature Set D. We will let the article stand as-is, and I am sure readers perusing the comments will become aware of this new GPU.
  • havoti97 - Monday, June 13, 2011

    So basically the app store's purpose is to attract submissions of ideas for features of their next OS, uncompensated of course. All the other crap/fart apps not worthy of that are approved, and people make pennies off those.
