Advanced HTPC users adopt specialized renderers such as madVR, which provide better quality when rendering videos. Unlike the standard EVR-CP (Enhanced Video Renderer - Custom Presenter), which doesn't stress the GPU much, renderers like madVR are very GPU-intensive. This has often been the sole reason for many HTPC users to opt for NVIDIA or AMD cards in their HTPCs. Traditionally, Intel GPUs have lacked the performance necessary for madVR to function properly (particularly with high-definition streams). We ran some experiments to check whether Ivy Bridge manages any improvement.

Using our testbed with 4 GB of DRAM running at DDR3-1333 9-9-9-24, we took one clip each of 1080i60 H.264, 1080i60 VC-1, 1080i60 MPEG-2, 576i50 H.264, 480i60 MPEG-2 and 1080p60 H.264, and tabulated the CPU and GPU usage under various combinations of decoders and renderers. It is quite obvious that using madVR drives up the CPU usage compared to pure DXVA mode (with the EVR-CP renderer). This is because the CPU needs to copy the decoded frames back to system memory for madVR to execute its GPU algorithms. A single star against the GPU usage indicates between 5 and 10 dropped frames over a 3 minute duration; double stars indicate that the number of dropped frames was high and that the dropping of frames was clearly visible to the naked eye.
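The star annotations can be expressed as a simple bucketing rule. The sketch below is purely illustrative (the function name and the exact double-star threshold are our own framing; the article only states that double-star drop counts were "high"):

```python
# Illustrative sketch of how the dropped-frame annotations in the tables
# were assigned. Thresholds for "**" are an assumption (anything beyond
# the single-star 5-10 range); madVR simply reports presented/dropped
# frame statistics during playback.

def star_rating(dropped_frames: int) -> str:
    """Map a 3-minute drop count to the annotation used in the tables."""
    if 5 <= dropped_frames <= 10:
        return "*"    # single star: 5-10 drops over 3 minutes
    if dropped_frames > 10:
        return "**"   # double star: dropping visible to the naked eye
    return ""         # 0-4 drops: below our annotation threshold

# Expected frames for a 3 minute 1080p60 clip
expected = 60 * 180            # 10,800 frames
presented = 10792              # hypothetical renderer statistics readout
print(star_rating(expected - presented))  # 8 drops -> "*"
```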

DDR3-1333 [9-9-9-24], CPU / GPU usage (%)

                    ------------ madVR 0.82.5 ------------    ---- EVR-CP 1.6.1.4235 ----
                    QuickSync    DXVA2        DXVA2           DXVA2        QuickSync
                    Decoder      Copy-Back    (SW Fallback)                Decoder
                    CPU  GPU     CPU  GPU     CPU  GPU        CPU  GPU     CPU  GPU
480i60  MPEG-2        3   74       3   74       4   74          5   28       5   28
576i50  H.264         3   59       3   58       4   58          5   25       5   27
1080i60 H.264        14   86**    11   86**    14   81*         6   42       8   48
1080i60 VC-1         13   84**    13   80*     13   80*        13   47       8   47
1080i60 MPEG-2       12   82**    12   80**     9   78**        5   44       9   48
1080p60 H.264        18   97*     20   97**    18   96**        5   44      12   50

With DDR3-1333, it is evident that 1080i60 streams just can't get processed through madVR without becoming unwatchable. Memory bandwidth constraints are quite problematic for madVR. So, we decided to overclock the memory a bit, and got the G.Skill ECO RAM running at DDR3-1600 without affecting the latency. Of course, we made sure that the system was stable running Prime95 for a couple of hours before proceeding with the testing. With the new memory configuration, we see that the GPU usage improved considerably, and we were able to get madVR to render even 1080p60 videos without dropping frames.
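To see why memory speed matters at all here, a back-of-the-envelope calculation helps: Ivy Bridge's integrated GPU has no dedicated VRAM, so the decoded frames, the copy-back traffic and every madVR processing pass all contend for the same system DRAM. The sketch below uses theoretical peak figures and assumes a dual-channel configuration and NV12 decoded frames; these are our assumptions, not measured values:

```python
# Rough bandwidth arithmetic for the shared-memory iGPU scenario.
# Peak bandwidth and frame sizes are theoretical; actual sustained
# bandwidth and madVR's per-frame traffic will differ.

def dual_channel_bandwidth_gbps(mt_per_s: int) -> float:
    """Peak theoretical bandwidth: transfers/s x 8 bytes x 2 channels."""
    return mt_per_s * 8 * 2 / 1000  # GB/s

# One decoded 1080p frame in NV12 (12 bits/pixel = 1.5 bytes/pixel)
frame_bytes = 1920 * 1080 * 1.5              # ~3.1 MB per frame
copyback_mbps = frame_bytes * 60 * 2 / 1e6   # read + write at 60 fps

print(f"DDR3-1333 peak: {dual_channel_bandwidth_gbps(1333):.1f} GB/s")
print(f"DDR3-1600 peak: {dual_channel_bandwidth_gbps(1600):.1f} GB/s")
print(f"1080p60 copy-back alone: ~{copyback_mbps:.0f} MB/s")
```

The copy-back itself is a small fraction of peak bandwidth; it is the GPU's scaling and rendering passes over that same DRAM that eat up the rest, which is why the jump from 21.3 GB/s to 25.6 GB/s is visible in the GPU usage numbers.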

DDR3-1600 [9-9-9-24], CPU / GPU usage (%)

                    ------------ madVR 0.82.5 ------------    ---- EVR-CP 1.6.1.4235 ----
                    QuickSync    DXVA2        DXVA2           DXVA2        QuickSync
                    Decoder      Copy-Back    (SW Fallback)                Decoder
                    CPU  GPU     CPU  GPU     CPU  GPU        CPU  GPU     CPU  GPU
480i60  MPEG-2        2   76       2   76       2   73          5   27       5   27
576i50  H.264         2   57       2   57       3   57          5   25       5   24
1080i60 H.264         7   77      11   74      12   74          6   40       9   40
1080i60 VC-1          7   76      11   75      12   79         12   40       8   40
1080i60 MPEG-2        6   74       6   74*      8   75*         5   39       9   40
1080p60 H.264        13   82      14   84      14   80          6   41      10   42

However, the 5-10 dropped frames in the 1080i60 MPEG-2 clip continued to bother me. I tried to overclock G.Skill's DDR3-1600 rated DRAM, but was unable to reach DDR3-1800 without sacrificing latency. With a working configuration of DDR3-1800 12-12-12-32, I repeated the tests, but found that the figures didn't improve.

DDR3-1800 [12-12-12-32], CPU / GPU usage (%)

                    ------------ madVR 0.82.5 ------------    ---- EVR-CP 1.6.1.4235 ----
                    QuickSync    DXVA2        DXVA2           DXVA2        QuickSync
                    Decoder      Copy-Back    (SW Fallback)                Decoder
                    CPU  GPU     CPU  GPU     CPU  GPU        CPU  GPU     CPU  GPU
480i60  MPEG-2        2   75       2   75       2   72          5   27       5   27
576i50  H.264         2   57       2   57       3   57          5   25       5   24
1080i60 H.264         7   74      11   73      12   74          6   39       9   40
1080i60 VC-1          7   74      11   74      12   77         12   39       8   40
1080i60 MPEG-2        6   74       6   74*      8   74*         5   39       9   40
1080p60 H.264        12   84      14   84      14   80          6   41      10   42

My inference is that low memory latency is as important as high bandwidth for madVR to function effectively. I am positive that with a judicious choice of DRAM, it is possible to get madVR functioning flawlessly on the Ivy Bridge platform. Of course, more testing needs to be done with other algorithms, but the outlook is quite positive.
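The timing arithmetic supports this inference: first-access latency in nanoseconds is the CAS cycle count divided by the memory I/O clock (half the DDR transfer rate), and on that measure the DDR3-1800 CL12 configuration is actually no faster than DDR3-1600 CL9. A quick sketch of the calculation for the three configurations tested:

```python
# CAS latency in wall-clock terms for the three tested configurations.
# This considers only the CL figure; full access latency also involves
# tRCD/tRP, but the trend is the same.

def cas_latency_ns(ddr_rate: int, cl: int) -> float:
    """CL cycles / I/O clock (MHz, half the DDR rate) -> latency in ns."""
    io_clock_mhz = ddr_rate / 2
    return cl / io_clock_mhz * 1000

for rate, cl in [(1333, 9), (1600, 9), (1800, 12)]:
    print(f"DDR3-{rate} CL{cl}: {cas_latency_ns(rate, cl):.2f} ns")
```

DDR3-1600 CL9 works out to 11.25 ns against roughly 13.3 ns for DDR3-1800 CL12, which is consistent with the figures not improving despite the extra bandwidth.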

In all this talk about madVR, let us not forget the efficient QuickSync / native DXVA2 decoders in combination with EVR-CP. With low CPU usage and moderate GPU usage, these combinations deliver satisfactory results for the general HTPC crowd.

70 Comments

  • anirudhs - Monday, April 23, 2012 - link

    I can barely notice the difference between 720P and 1080I on my 32" LCD. Will people notice the difference between 1080P and 4K on a 61" screen?

    It seems we have crossed the point where improvements in HD video playback on Sandy Bridge and post-Sandy Bridge machines are discernible to normal people with normal screens.

    I spoke to a high-end audiophile/videophile dealer, and he tells me that the state of video technology (Blu-Ray) is pretty stable. In fact, it is more stable than it has ever been in the past 40 years. I don't think "improvements" like 4K are going to be noticed by those other consumers in the top 1%. This seems like a first-world problem to me - how to cope with the arrival of 4K?
  • digitalrefuse - Monday, April 23, 2012 - link

    ... Anything being discussed on a Web site like Anandtech is going to be "a first-world problem"...

    That being said, there's not much of a difference between 720 lines of non-interlaced picture and 1080 lines of interlaced picture... If anything a 720P picture tends to be a little better looking than 1080I.

    The transition to 4K can't come soon enough. I'm less concerned with video playback and more concerned with desktop real estate - I'd love to have one monitor with more resolution than two 1080P monitors in tandem.
  • ganeshts - Monday, April 23, 2012 - link

    OK, one of my favourite topics :)

    Why does an iOS device's Retina Display work in the minds of the consumers? What prevents one from wishing for a Retina Display in the TV or computer monitor? The latter is what will drive 4K adoption.

    The reason 4K will definitely get a warmer welcome compared to 3D is the fact that there are no ill-effects (eye strain / headaches) in 4K compared to 3D.
  • Exodite - Monday, April 23, 2012 - link

    We can certainly hope, though with 1080p having been the de-facto high-end standard for desktops for almost a decade I'm not holding my breath.

    Until there's an affordable alternative for improving vertical resolution on the desktop I'll stick to my two 1280*1024 displays.

    Don't get me wrong, I'd love to see the improvements in resolution made in mobile displays spill over into the desktop but I'd not be surprised if the most affordable way of getting a 2048*1536 display on the desktop ends up being a gutted Wi-Fi iPad blu-tacked to your current desktop display.
  • aliasfox - Monday, April 23, 2012 - link

    It would be IPS, too!

    :-P
  • Exodite - Monday, April 23, 2012 - link

    Personally I couldn't care less about IPS, though I acknowledge some do.

    Any trade-off in latency or ghosting just isn't worth it, as accurate color reproduction and better viewing angles just doesn't matter to me.
  • ZekkPacus - Monday, April 23, 2012 - link

    Higher latency and ghosting that maybe one in fifty thousand users will notice, if that. This issue has been blown out of all proportion by the measurable stats at all costs brigade - MY SCREEN HAS 2MS SO IT MUST BE BETTER. The average human eye cannot detect any kind of ghosting/input lag in anything under a 10-14ms refresh window. Only the most seasoned pro gamers would notice, and only if you sat the monitors side by side.

    A slight loss in meaningless statistics is worth it if you get better, more vibrant looking pictures and something where you CAN actually see the difference.
  • SlyNine - Tuesday, April 24, 2012 - link

    I take it you've done hundreds of hours of research and documented your studies and methodology so we can look at the results.

    What if Anand did videocard reviews the same way you're spouting out these "facts"? They would be worthless conjecture, just like your information.

    Drop the "but it's a really small number" argument. Until you really document what the human eye/brain is capable of, all you're saying is that it's a really small number.

    Well, THz is a really small number too. And the human body can pick up things as small as 700 terahertz. It's called the EYE!
  • Exodite - Tuesday, April 24, 2012 - link

    Look, you're of a different opinion - that's fine.

    I, however, don't want IPS.

    Because I can't appreciate the "vibrant" colors, nor the better accuracy or bigger viewing angles.

    Indeed, my preferred display has a slightly cold hue and I always turn saturation and brightness way down because it makes the display more restful for my eyes.

    I work with text and when I don't do that I play games.

    I'd much rather have a 120Hz display with even lower latency than I'd take any improvement in areas that I don't care about and won't even notice.

    Also, if you're going to make outlandish claims about how many people can or cannot notice this or that you should probably back it up.
  • Samus - Tuesday, April 24, 2012 - link

    Exodite, you act like IPS has awful latency or something.

    If we were talking about PVA, I wouldn't be responding to an otherwise reasonable argument, but we're not. The latency between IPS and TN is virtually identical, especially to the human eye and mind. High-speed (1/1000 s) cameras are required to even measure the difference between IPS and TN.

    Yes, TN is 'superior' with its 2ms latency, but IPS is superior with its <6ms latency, 97.4% Adobe RGB accuracy, 180 degree bi-plane viewing angles, and lower power consumption/heat output (either in LED or cold cathode configurations) due to less grid processing.

    This argument is closed. Anybody who says they can tell a difference between 2ms and sub-6ms displays is being a whiny bitch.
