We discussed madVR in extensive detail in our Discrete HTPC GPU Shootout piece. In all our HTPC reviews dealing with madVR, we restrict ourselves to the high quality settings suggested by Mathias Rauen (4-tap Lanczos for luma scaling and SoftCubic with softness 70 for chroma scaling). We make no quality tradeoffs for performance, and deinterlacing is enabled (and forced active in doubtful cases). Full Screen Exclusive (FSE) mode works better on the whole than Full Screen Windowed (FSW), and all the queues were set to their maximum values. In v301.24 of the NVIDIA drivers, madVR doesn't work if presentation is done on a separate device, so that option had to be turned off.

LAV Video Decoder can connect to the madVR renderer under the following hardware decode settings:

  1. None (Software decoding using avcodec)
  2. QuickSync (QS Decoder on supported systems - Intel Sandy Bridge and Ivy Bridge)
  3. NVIDIA CUVID
  4. DXVA2 Copy-Back (DXVA2 CB)

Note that the native DXVA2 mode doesn't connect to the madVR renderer. In our experiments, we tried out all of the above except QuickSync. The relevant graphs are presented below.


Resource Usage Comparison - Software Decode vs. DXVA2 Copy-Back vs. LAV CUVID (FSE & FSW) with madVR

In both software decode and DXVA2 CB modes, GPU core utilization shot up over 90%. Whenever there were sudden spikes above 90%, the presentation and render queues in madVR dropped to alarmingly low levels, resulting in dropped frames. DXVA2 Copy-Back mode also increased the memory controller load. LAV CUVID in FSE (Full Screen Exclusive) mode had the lowest GPU core utilization (around 83% at most for the 1080i60 VC-1 clip). In FSW mode, utilization went up slightly but still remained below 90%. The takeaway from this section is that if the end user is going to use madVR as the renderer, CUVID should be the video decoder of choice, particularly for high frame rates and resolutions. CPU utilization when using madVR is slightly higher than with EVR.
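As a sketch, the recommendation from this section could be encoded as a simple selection helper. The mode names mirror LAV Video Decoder's options, but the function itself is hypothetical, not part of any actual LAV Filters API:

```python
def pick_hw_decoder(renderer, gpu_vendor):
    """Suggest a LAV Video hardware decode mode for a renderer/GPU pair.

    Hypothetical helper reflecting this section's findings: with madVR on
    an NVIDIA GPU, CUVID gave the lowest GPU core utilization; QuickSync
    applies only to supported Intel systems; native DXVA2 cannot connect
    to madVR, so the copy-back variant is the remaining hardware option.
    """
    if renderer == "madVR":
        if gpu_vendor == "NVIDIA":
            return "CUVID"
        if gpu_vendor == "Intel":
            return "QuickSync"
        return "DXVA2 Copy-Back"  # native DXVA2 doesn't connect to madVR
    return "DXVA2 Native"

print(pick_hw_decoder("madVR", "NVIDIA"))  # CUVID
```

In practice this choice is made in the LAV Video Decoder configuration dialog rather than in code; the sketch simply records the decision logic.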

An important point to note with respect to the settings is that both LAV Video Decoder and madVR have deinterlacing options. It is best to turn off deinterlacing in LAV Video Decoder (set Hardware Deinterlacing to Weave (none)). Performing the deinterlacing closer to the presentation stage (i.e., in the madVR renderer) reduces the memory controller loading and is generally easier on the GPU (less chance of dropped frames).

60 Comments
  • BPB - Monday, May 7, 2012 - link

    I am planning on doing the same thing. Been trying it out with my notebook and like the way it's working with my HDHomerun Prime, so it looks like I'm losing a notebook but gaining an HTPC that's going to use little power and can be unplugged and still used as a laptop when needed. Now I have to get a good-sized external HDD and I am set. The nice thing is when the notebook is in use as a notebook I can use my desktop PC and Xbox combination to record/watch TV.
  • IntoxicatedPuma - Tuesday, May 8, 2012 - link

    Yeah you could easily buy an Asus U36 series for around $600-$650 with similar performance. I don't know that I agree with the article about desktop CPUs being noisy and hot. For half the price of that machine, you could build an H61 machine with a 2100T, the same hard drive, and an equivalent video card that was about the same size, used about the same amount of power, and wouldn't run any hotter or be noticeably louder.
  • yottabit - Monday, May 7, 2012 - link

    Regarding the line:
    "We are a little worried about the full loading power consumption being more than what the power supply is rated for"

    I'm not sure this is true since you are comparing apples to oranges. Power supplies are typically rated for DC OUTPUT but you are comparing the rated DC output to the draw at the wall. Assuming the PSU is 80% efficient, then a 90W rating should equal approximately 90W/0.8 = 112.5 W at the wall. Just food for thought, I see this error commonly.
  • ganeshts - Monday, May 7, 2012 - link

    Thanks for the pointer. The power consumption of > 109 W is still more than that of the first generation Vision 3D which was 82 W. This still makes us worried. I am trying to determine the power efficiency of the PSU (Delta Electronics ADP-90CD DB).
  • Angengkiat - Monday, May 7, 2012 - link

    Hi Ganesh,
    Any idea what software to use if we want to play 3D nicely on the machine? I am using TMT5, but it does not seem to display the same 3D effects as a dedicated Blu-ray player.

    Thanks!
    EK
  • ganeshts - Monday, May 7, 2012 - link

    That is a bit surprising. TMT 5 has full 3D Blu-ray support. Maybe the 3D Blu-ray player is assuming some settings which have to be configured in TMT 5 (like the depth of view). Also, did you run the NVIDIA 3D display setup?
  • MichaelD - Monday, May 7, 2012 - link

    $1.2K. REALLY? That's just nuts. Nice piece of hardware, but not worth what they're asking for it. Plus at this price point there had better be an SSD in there. At least a 64GB for the OS and programs. There's enough room in the chassis for a second 2.5" drive. They should've done a 2-drive, SSD/HDD combo at this price.
  • ganeshts - Monday, May 7, 2012 - link

    Agreed :) I have recommended the same to ASRock.
  • tctc - Monday, May 7, 2012 - link

    Hi - couple of questions about the twin GPU configuration

    1. What determines which GPU is used by a particular application?
    2. Can the iGPU be disabled so that only the NVIDIA 540 is used?

    Regards,
    tctc
  • ganeshts - Monday, May 7, 2012 - link

    Yes, this is handled by Virtu. If you don't install Virtu, the 540M is the only one that is used. You need Virtu to choose applications for which the iGPU gets used (commonly MediaConverter / any app for which you want to use QuickSync)
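The power-supply efficiency arithmetic raised in the comments above can be checked with a quick calculation (the 80% efficiency figure is the commenter's assumption, not a measured value for the ADP-90CD DB):

```python
def wall_draw(dc_output_w, efficiency):
    """AC draw at the wall for a given DC output and PSU efficiency.

    A PSU's wattage rating describes its DC output; the wall draw is
    higher because conversion losses are paid on the AC side.
    """
    return dc_output_w / efficiency

# A 90 W rated PSU at an assumed 80% efficiency draws about 112.5 W at the wall.
print(round(wall_draw(90, 0.8), 1))  # 112.5
```

This is why a measured wall draw slightly above the PSU's DC rating does not, by itself, mean the supply is overloaded.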
