Advanced HTPC users adopt specialized renderers such as madVR, which provide better quality when rendering video. Unlike the standard EVR-CP (Enhanced Video Renderer - Custom Presenter), which doesn't stress the GPU much, renderers like madVR are very GPU-intensive. This has often been the sole reason for many HTPC users to opt for NVIDIA or AMD discrete cards in their HTPCs. Traditionally, Intel GPUs have lacked the performance necessary for madVR to function properly (particularly with high definition streams). We ran some experiments to check whether Ivy Bridge manages any improvement.

Using our testbed with 4 GB of DRAM running at DDR3-1333 9-9-9-24, we took one clip each of 1080i60 H.264, 1080i60 VC-1, 1080i60 MPEG-2, 576i50 H.264, 480i60 MPEG-2 and 1080p60 H.264, and tabulated the CPU and GPU usage for various combinations of decoders and renderers. It is quite obvious that using madVR drives up the CPU usage compared to pure DXVA mode (with the EVR-CP renderer). This is because the CPU needs to copy the decoded frames back to system memory for madVR to execute its GPU algorithms. A single star against the GPU usage indicates between 5 and 10 dropped frames over a 3 minute duration; double stars indicate that the number of dropped frames was high and that the drops were clearly visible to the naked eye.
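To put the copy-back requirement in perspective, here is a rough back-of-envelope sketch (illustrative arithmetic, not a measurement from our testbed) of how much decoded video data has to be copied from GPU memory into system RAM each second, assuming NV12 surfaces at 1.5 bytes per pixel and ignoring pitch/padding. On integrated graphics, madVR's own render passes hit the same shared system memory, so the total traffic is a multiple of these figures.

```python
# Rough estimate of DXVA2 copy-back traffic for madVR (NV12 = 1.5 bytes/pixel).
# Illustrative only; real surfaces carry extra pitch/padding bytes.

def copyback_mb_per_s(width, height, frames_per_s, bytes_per_pixel=1.5):
    frame_bytes = width * height * bytes_per_pixel
    return frame_bytes * frames_per_s / (1024 ** 2)

for label, (w, h, fps) in {
    "1080i60 (decoded as 30 full frames/s)": (1920, 1080, 30),
    "1080p60": (1920, 1080, 60),
}.items():
    print(f"{label}: ~{copyback_mb_per_s(w, h, fps):.0f} MB/s copied back to system RAM")
```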

DDR3-1333 [9-9-9-24] (each cell: CPU % / GPU %)

| Stream | madVR 0.82.5 + QuickSync Decoder | madVR 0.82.5 + DXVA2 Copy-Back | madVR 0.82.5 + DXVA2 (SW Fallback) | EVR-CP 1.6.1.4235 + DXVA2 | EVR-CP 1.6.1.4235 + QuickSync Decoder |
|---|---|---|---|---|---|
| 480i60 MPEG-2 | 3 / 74 | 3 / 74 | 4 / 74 | 5 / 28 | 5 / 28 |
| 576i50 H.264 | 3 / 59 | 3 / 58 | 4 / 58 | 5 / 25 | 5 / 27 |
| 1080i60 H.264 | 14 / 86** | 11 / 86** | 14 / 81* | 6 / 42 | 8 / 48 |
| 1080i60 VC-1 | 13 / 84** | 13 / 80* | 13 / 80* | 13 / 47 | 8 / 47 |
| 1080i60 MPEG-2 | 12 / 82** | 12 / 80** | 9 / 78** | 5 / 44 | 9 / 48 |
| 1080p60 H.264 | 18 / 97* | 20 / 97** | 18 / 96** | 5 / 44 | 12 / 50 |

(* = 5-10 dropped frames over 3 minutes; ** = heavy, clearly visible frame drops)

With DDR3-1333, it is evident that 1080i60 streams just can't be processed through madVR without becoming unwatchable; memory bandwidth constraints are quite problematic for madVR. So we decided to overclock the memory a bit, and got the G.Skill ECO RAM running at DDR3-1600 without affecting the latency. Of course, we made sure the system was stable running Prime95 for a couple of hours before proceeding with the testing. With the new memory configuration, GPU usage improved considerably, and we were able to get madVR to render even 1080p60 videos without dropping frames.
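For context, the theoretical peak bandwidth of a dual-channel DDR3 configuration is simply the transfer rate multiplied by 8 bytes per transfer per channel. The sketch below is illustrative arithmetic only; the HD 4000 has to share this pool with the CPU, so madVR never sees the full figure.

```python
# Theoretical peak bandwidth of a dual-channel DDR3 configuration:
# transfer rate (MT/s) x 8 bytes per transfer x 2 channels.

def peak_gb_per_s(transfer_rate_mt_s, channels=2, bytes_per_transfer=8):
    return transfer_rate_mt_s * bytes_per_transfer * channels / 1000

for speed in (1333, 1600):
    print(f"DDR3-{speed}: ~{peak_gb_per_s(speed):.1f} GB/s peak (dual channel)")
```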

DDR3-1600 [9-9-9-24] (each cell: CPU % / GPU %)

| Stream | madVR 0.82.5 + QuickSync Decoder | madVR 0.82.5 + DXVA2 Copy-Back | madVR 0.82.5 + DXVA2 (SW Fallback) | EVR-CP 1.6.1.4235 + DXVA2 | EVR-CP 1.6.1.4235 + QuickSync Decoder |
|---|---|---|---|---|---|
| 480i60 MPEG-2 | 2 / 76 | 2 / 76 | 2 / 73 | 5 / 27 | 5 / 27 |
| 576i50 H.264 | 2 / 57 | 2 / 57 | 3 / 57 | 5 / 25 | 5 / 24 |
| 1080i60 H.264 | 7 / 77 | 11 / 74 | 12 / 74 | 6 / 40 | 9 / 40 |
| 1080i60 VC-1 | 7 / 76 | 11 / 75 | 12 / 79 | 12 / 40 | 8 / 40 |
| 1080i60 MPEG-2 | 6 / 74 | 6 / 74* | 8 / 75* | 5 / 39 | 9 / 40 |
| 1080p60 H.264 | 13 / 82 | 14 / 84 | 14 / 80 | 6 / 41 | 10 / 42 |

However, the 5-10 dropped frames in the 1080i60 MPEG-2 clip continued to bother me. I tried to overclock G.Skill's DDR3-1600 rated DRAM further, but was unable to reach DDR3-1800 without relaxing the timings. With a working configuration of DDR3-1800 12-12-12-32, I repeated the tests, but found that the figures didn't improve.

DDR3-1800 [12-12-12-32] (each cell: CPU % / GPU %)

| Stream | madVR 0.82.5 + QuickSync Decoder | madVR 0.82.5 + DXVA2 Copy-Back | madVR 0.82.5 + DXVA2 (SW Fallback) | EVR-CP 1.6.1.4235 + DXVA2 | EVR-CP 1.6.1.4235 + QuickSync Decoder |
|---|---|---|---|---|---|
| 480i60 MPEG-2 | 2 / 75 | 2 / 75 | 2 / 72 | 5 / 27 | 5 / 27 |
| 576i50 H.264 | 2 / 57 | 2 / 57 | 3 / 57 | 5 / 25 | 5 / 24 |
| 1080i60 H.264 | 7 / 74 | 11 / 73 | 12 / 74 | 6 / 39 | 9 / 40 |
| 1080i60 VC-1 | 7 / 74 | 11 / 74 | 12 / 77 | 12 / 39 | 8 / 40 |
| 1080i60 MPEG-2 | 6 / 74 | 6 / 74* | 8 / 74* | 5 / 39 | 9 / 40 |
| 1080p60 H.264 | 12 / 84 | 14 / 84 | 14 / 80 | 6 / 41 | 10 / 42 |

My inference is that low memory latency is as important as high bandwidth for madVR to function effectively. I am positive that, with a judicious choice of DRAM, it is possible to get madVR functioning flawlessly on the Ivy Bridge platform. Of course, more testing needs to be done with other algorithms, but the outlook is quite positive.
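A quick way to see the latency side of this is to convert the rated timings into absolute access times: CAS latency is specified in memory-clock cycles, and the memory clock is half the DDR transfer rate. The sketch below (illustrative arithmetic, not a measurement) shows that DDR3-1800 at CL12 actually has a slightly worse first-word latency than DDR3-1600 at CL9, which is consistent with the numbers refusing to improve.

```python
# First-word access latency implied by the CAS latency rating.
# Memory clock (MHz) is half the DDR transfer rate (MT/s).

def cas_latency_ns(transfer_rate_mt_s, cl_cycles):
    clock_mhz = transfer_rate_mt_s / 2
    return cl_cycles / clock_mhz * 1000

for label, (rate, cl) in {
    "DDR3-1333 CL9":  (1333, 9),
    "DDR3-1600 CL9":  (1600, 9),
    "DDR3-1800 CL12": (1800, 12),
}.items():
    print(f"{label}: ~{cas_latency_ns(rate, cl):.2f} ns to first word")
```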

In all this talk about madVR, let us not forget the efficient QuickSync / native DXVA2 decoders in combination with EVR-CP. With low CPU usage and moderate GPU usage, these combinations deliver satisfactory results for the general HTPC crowd.

Comments

  • MGSsancho - Monday, April 23, 2012 - link

    While I agree with almost everything, there is something I would like to nitpick on: when making a digital copy of old film, in whatever format you use, more often than not a lot of touching up needs to be done. The Wizard of Oz and all the 007 films are examples. (I am ignoring the remastering of Star Wars and Lucas deciding to add in 'features' instead of giving us a cleaned-up remaster sans bonuses.) Still, when you're spending millions on a remaster, I expect them at least not to muddy the entire thing up.

    However, I feel we need to bring in higher bitrates first. I will not apologize for this: yes, encoders are great, but a 4 Mbps 1080p stream still is not as nice as a 20-60 Mbps VBR Blu-ray film. My feeling is that a craptastic 4K or even 2K bitrate will ruin the experience for the uninformed. Also notice I am ignoring an entirely different debate about whether the current infrastructure can handle true HD streaming to every household, at least in the US.
  • nathanddrews - Monday, April 23, 2012 - link

    Higher bit rates will be inherent with 4K or 2K over 1080p, but bit rates aren't the be-all end-all. 4K will likely use HEVC (H.265), which offers double the compression of H.264 with better quality.

    Fixing scratches, tears, or other issues with film elements should never be a reason for mass application of filtering.
  • SlyNine - Tuesday, April 24, 2012 - link

    H.264 doesn't even offer 2x the compression over MPEG-2. I doubt H.265 offers 2x over H.264.

    "This means that the HEVC codec can achieve the same quality as H.264 with a bitrate saving of around 39-44%."

    Source http://www.vcodex.com/h265.html
  • Casper42 - Monday, April 23, 2012 - link

    I LOL'd at "Walmart Black Friday" Nathan :)

    And for the OP, 32", really?
    It's completely understandable that you don't see the difference on a screen that size.
    Step up to a 60" screen and then go compare 720p to 1080p. (Who uses 1080i anymore? Oh, that's right, crappy 32" LCDs. Don't get me wrong, I own two, but they go in the bedroom and my office, not my family room.)

    I think 60" +/- 5" is pretty much the norm nowadays for the average middle-class family's main movie-watching TV.
  • anirudhs - Monday, April 23, 2012 - link

    Cable TV maxes out at 1080i (I have Time Warner). My TV can do 1080p.
  • nathanddrews - Monday, April 23, 2012 - link

    1080i @ 60 fields per second, when deinterlaced, is the same as 1080p @ 30 frames per second. The picture quality is almost entirely dependent upon your display's ability to deinterlace. However, cable TV is generally of a lower bit rate than OTA or satellite.
  • SlyNine - Tuesday, April 24, 2012 - link

    Yeah, but because of shimmering effects, progressive images almost always look better.

    If the video is 2:2 or 3:2, many TVs can now build the fields into a progressive image.
  • Exodite - Tuesday, April 24, 2012 - link

    In the US, possibly, but I dare say 55-60" TVs are far from the norm everywhere.
  • peterfares - Thursday, September 27, 2012 - link

    2560x 27" and 30" monitors are NOT very pixel dense. The 27" is slightly more dense (~12.5% more) than a standard display, but the 30" is only about 4% more dense than a standard display.

    A 1920x1080 13.3" display is 71.88% more dense than a standard display.
  • dcaxax - Tuesday, April 24, 2012 - link

    On a 32" you will certainly not see a difference between 720p and 1080p; it is barely visible on a 40". Once you go to 52"+, however, the difference becomes visible.

    On a 61" screen, as you suggest, the difference will be quite visible.

    Having said that, I am still very happy with the quality of properly mastered DVDs, which are only 576p, on my 47" TV.

    It's not that I can't tell the difference, it's just that it doesn't matter to me that much, which is why I also don't bother with madVR and all that, and just stick to Windows Media Center for my HTPC.

    Everyone's priorities are different.
