Network Streaming Performance - Netflix

The Intel HD 4000 drivers enable hardware-accelerated decode and rendering for Netflix streams in Silverlight on Windows 7. The Windows 8 Netflix app has been a bit of a mystery: it is not readily evident whether hardware acceleration is present, or whether it is actually being utilized. In this section, we take a look at how Netflix behaves on Windows 7 and Windows 8. The ISP at my location is Comcast, and unfortunately, there is no access to Netflix's 5 Mbps Super HD streams yet.

Netflix on Windows 7

On Windows 7, Netflix needs the Silverlight plug-in to be installed. We played back our standard test stream using Internet Explorer.

The Silverlight plug-in attempts GPU acceleration and enables it on our system. Manual stream selection is available. The maximum playback quality has a bitrate of 64 kbps for the audio and 3 Mbps for the video.

We will take a look at the efficiency of the system while playing back the stream towards the end of this section.

Netflix on Windows 8

On Windows 8, Netflix is streamed through a Metro app. Fortunately, the same debug shortcut keys used in the Silverlight version work here too. A nice touch is that the manual stream selection and the playback statistics OSD can be displayed simultaneously.

It is not immediately evident whether hardware acceleration is being utilized. However, what stands out immediately is that the video playback bitrate can go as high as 3.85 Mbps. Audio still remains at a lowly 64 kbps. Hopefully, a future update to the Netflix app will provide the higher-quality soundtracks available on specialized media streamers.

Netflix Power Consumption - Windows 7 vs Windows 8

While the Silverlight plug-in OSD helpfully reports that GPU acceleration is being taken advantage of, it doesn't indicate the efficiency in any way. On the other hand, the Windows 8 app doesn't report GPU acceleration status at all. To determine the actual efficiency of Netflix playback, we recorded power consumption at the wall for both scenarios over a 10-minute interval in the middle of the stream.

The graph presents some very interesting results. On Windows 8, the system consumes much less power while delivering a higher-quality stream. GPU acceleration makes the app's streaming more than 30% more efficient than the Silverlight version; compared to Silverlight on Windows 7, the Netflix app is approximately 35% more efficient. Windows 8 by itself also seems to consume less power. Considering these results, if Netflix forms any part of your HTPC usage scenario, upgrading from Windows 7 to Windows 8 is a no-brainer.
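For readers curious how the efficiency percentages above are derived from at-the-wall power readings, the sketch below shows the arithmetic. The wattage values used here are placeholders for illustration only, not the article's actual measurements (those appear in the graph).

```python
# Hypothetical at-the-wall power readings in watts; placeholders only,
# not the article's measured values.
win7_silverlight_w = 30.0   # Netflix via Silverlight on Windows 7
win8_app_w = 19.5           # Netflix Metro app on Windows 8

# Relative efficiency gain: how much less power the Windows 8 app
# draws for the same playback task, as a fraction of the baseline.
gain = (win7_silverlight_w - win8_app_w) / win7_silverlight_w
print(f"Netflix app draws {gain:.0%} less power than Win7 Silverlight")
```

With these placeholder readings, the script reports a 35% gain, matching the ballpark of the article's comparison against Windows 7 Silverlight.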

Comments

  • HighTech4US - Sunday, January 20, 2013 - link

    Agree, I see no other overall complete platform that would be better (or even equal) for a 4-OTA Tuner DVR with unlimited storage (only limited by disk size) with free EPG that Windows 7 Media Center provides.

    And by tricking out 7MC with MediaBrowser, MediaControl, SHARK007 Codecs I have a complete on demand system that can play any type of media.

    I use MediaCenterMaster to get program meta information, backdrops and thumbnails for MediaBrowser.

    I also use MakeMKV to rip my DVDs and VideoReDo TVSuite h.264 to edit recorded TV shows and convert them to H.264 MKVs.

    Oh and 7MC can show your digital pictures as a slide show on your big screen with background music.

    I also love the screen saver where it shows random pictures from your picture library then zooms to one (or more) from a folder. When I first got this enabled the wife spent 45 minutes just watching the screen saver.
  • powerarmour - Monday, January 21, 2013 - link

    Agreed, WMC is the only EPG-based tuner app that can correctly use Freeview HD DVB-T2 tuners in the UK; there are no other usable HTPC alternatives.
  • psuedonymous - Sunday, January 20, 2013 - link

    Question: why was the obsolete 2-pass method used instead of the faster (and more common) CRF? Was the encoding benchmark intended as an artificial CPU-stressing benchmark rather than a 'real world' encoding benchmark?
  • ganeshts - Sunday, January 20, 2013 - link

    Hmm.. that is what Graysky's benchmark does, and it keeps the setting consistent across different systems when you want to see how much better or worse your system is, when compared to someone else's.

    FWIW, pass 1 stresses the memory subsystem, while pass 2 stresses the CPU.
  • ganeshts - Sunday, January 20, 2013 - link

    Thanks for the info. I was looking at the FAQ hosted by TechARP here: http://www.techarp.com/showarticle.aspx?artno=442&...

    Also, look at Ian's test with various memory speeds here using the same processor (last section on this page):

    http://www.anandtech.com/show/6372/memory-performa...

    There is definitely an impact on pass 1 performance using different memory speeds and the impact is more than on pass 2.
  • Iketh - Sunday, January 20, 2013 - link

    Why is Prime95 v25.9 used? That is grossly outdated. The latest official 27.7 is needed to tax Ivy Bridge with AVX instructions. All those temps and watts you got will increase significantly. Please revise your Prime95. An oversight like this is unacceptable.

    Not to mention the latest Intel compilers have been implementing AVX instructions for like 6+ months now even if the programmer didn't specifically write for it. AND Handbrake has been using AVX in about that same timeframe and is only increasing.....
  • ganeshts - Sunday, January 20, 2013 - link

    I will definitely do some experiments with the new Prime95 and report back.
  • ganeshts - Monday, January 21, 2013 - link

    I repeated the CPU loading with the latest Prime95 (v27.7):

    http://i.imgur.com/lK0zqjR.png

    The readings didn't go up significantly, but, yes, there is an increase. The power consumption at the wall increased from 58.25 W to 62.56 W.

    Thanks for bringing this to our attention, and we will make sure future reviews use the updated Prime95.
  • ganeshts - Monday, January 21, 2013 - link

    Oh, but, with full GPU and CPU loading (using Furmark 1.10.3 - latest), the power at the wall is only 89.77 W (compared to 88.75 W earlier). The ~40 W / ~15 W TDP distribution between the CPU and the GPU still remains the same.

    http://i.imgur.com/soCGAyk.jpg

    I don't expect the steady state temperatures to be that different because the power increase at the wall is only 1 W.
  • ganeshts - Sunday, January 20, 2013 - link

    Yes, the scaling algorithms affect the performance a lot.

    That is why I mentioned that we used the default settings: Bicubic with sharpness 75 for chroma (no anti-ringing filter), Lanczos 3-tap for image upscaling / Catmull-Rom for image downscaling (no anti-ringing filter or linear light scaling).

    We will look at other scaling algorithms and their performance on the HD 4000 / GT 640 / AMD 7750 in the third part of the HTPC series.

    Also, note that if you are using the HD 4000 (or any other Intel HD Graphics), I would strongly suggest looking at DXVA scaling. Users might be surprised at the quality delivered without taxing the GPU too much.
