HTPC Aspects : Network Streaming Performance

Windows 7-based HTPCs need hardware acceleration in both Adobe Flash and Microsoft Silverlight for optimal streaming performance with YouTube and Netflix. The move to Windows 8.1 has made Silverlight unnecessary. The Netflix app on Windows 8.x brings an HTPC's capabilities on par with dedicated streaming consoles, with support for Super HD (6 Mbps) streams as well as Dolby Digital Plus bitstreaming. The latest app also renders video in a way that makes taking screenshots an exercise in frustration.

As the above photograph shows, the Netflix app can be set to bitstream Dolby Digital Plus to the AV receiver, and the 750 Ti supports it. The video and audio streams come in at 5.8 Mbps and 192 kbps respectively. It is not immediately evident whether GPU acceleration is being utilized, but tracking the GPU / VPU loading and PC power consumption numbers makes it obvious that software decode is not at work in the Netflix app.
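
A minimal sketch of this kind of GPU / VPU load tracking is below. It assumes an NVIDIA card and an nvidia-smi build recent enough to expose the utilization.decoder query; the review's own logging tools are not specified, and wall power has to come from an external meter, so it is not captured here:

```python
#!/usr/bin/env python3
"""Sketch: poll GPU and video-engine (VPU) utilization while a stream plays."""
import subprocess
import time

def sample():
    # With csv,noheader,nounits the output looks like: "12, 26"
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,utilization.decoder",
         "--format=csv,noheader,nounits"],
        text=True)
    gpu, dec = (float(v) for v in out.strip().split(","))
    return gpu, dec

def monitor(duration_s=60, interval_s=1.0):
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        samples.append(sample())
        time.sleep(interval_s)
    gpu_avg = sum(g for g, _ in samples) / len(samples)
    dec_avg = sum(d for _, d in samples) / len(samples)
    # A clearly non-zero decoder load alongside a modest GPU load points to
    # fixed-function hardware decode rather than software decode.
    print(f"avg GPU load: {gpu_avg:.2f}%  avg decoder (VPU) load: {dec_avg:.2f}%")

if __name__ == "__main__":
    monitor()
```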

Unlike Silverlight, Adobe Flash continues to maintain some relevance. YouTube still uses Flash to serve FLV (at SD resolutions) and MP4 (at both SD and HD resolutions) streams, and YouTube's debug OSD indicates whether hardware acceleration is in use.

Similar to our Netflix streaming test, we recorded GPU / VPU loading as well as power consumption at the wall when streaming the 1080p version of the sample YouTube clip. The table below presents the relevant numbers for various configurations and streaming services.

Streaming Video Performance

                            |        Netflix         |        YouTube
                            | GPU/VPU Load |  Power  | GPU/VPU Load |  Power
NVIDIA GeForce GTX 750 Ti   | 11.95/12.65% | 56.44 W | 16.26/15.74% | 55.45 W
NVIDIA GeForce GT 640       |  5.99/25.80% | 58.89 W | 15.57/25.72% | 58.93 W
AMD Radeon HD 7750          |        0.72% | 66.79 W |        3.57% | 67.11 W

NVIDIA has been touting Maxwell's low-power nature, and the GTX 750 Ti proves to be the most power-efficient of the three candidates when it comes to GPU support for streaming services.
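
As a quick sanity check on that conclusion, the snippet below (values transcribed from the table above, card labels shortened) computes each card's power premium over the most efficient result for each service:

```python
# Wall-power draw (W) while streaming, transcribed from the table above.
power = {
    "GTX 750 Ti": {"Netflix": 56.44, "YouTube": 55.45},
    "GT 640":     {"Netflix": 58.89, "YouTube": 58.93},
    "HD 7750":    {"Netflix": 66.79, "YouTube": 67.11},
}

for service in ("Netflix", "YouTube"):
    # The lowest draw for each service is the efficiency baseline.
    best = min(p[service] for p in power.values())
    for card, p in power.items():
        print(f"{service:8} {card:11} {p[service]:6.2f} W  (+{p[service] - best:5.2f} W)")
```

The GTX 750 Ti draws roughly 10-12 W less at the wall than the HD 7750 on both services.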

177 Comments

  • texasti89 - Tuesday, February 18, 2014 - link

    http://media.bestofmicro.com/4/R/422667/original/F...
  • texasti89 - Tuesday, February 18, 2014 - link

    Also, I was referring to the 750 Ti (60W), not the 750 (55W), in my comment. Words in the article reflect the reviewers' opinions. Benchmark results from various tech websites give the same conclusion.
  • texasti89 - Tuesday, February 18, 2014 - link

    Another one to look at : http://www.techpowerup.com/reviews/NVIDIA/GeForce_...
  • tspacie - Tuesday, February 18, 2014 - link

    [Coming soon to a flu near you]

    This is a caching error or similar on page 4, right?
  • mindbomb - Tuesday, February 18, 2014 - link

    Hello Ryan and Ganesh. I'd like to point out for your video tests that there is no luma upscaling or image doubling for a 1080p video on a 1080p display, since the luma needs no scaling. You need to test those with a 720p video, and they are mutually exclusive: image doubling converts 1280x720 to 2560x1440, at which point you need to downscale rather than upscale. (A worked sketch of this arithmetic follows the comments.)
  • ganeshts - Tuesday, February 18, 2014 - link

    Luma upscaling is present for 480i / 576i / 720p videos and downscaling for the 4Kp30 video. We have nine different sample streams.
  • jwcalla - Tuesday, February 18, 2014 - link

    I'd like to see AT adopt some OpenGL benchmarks in the future.

    Us OpenGL consumers are out here. :)
  • Ryan Smith - Thursday, February 20, 2014 - link

    So would I. But at the moment there aren't any meaningful games using OpenGL that are suitable for benchmarking. After Wolfenstein went out of date and Rage was capped at 60fps, we ended up stuck in that respect.
  • Roland00Address - Tuesday, February 18, 2014 - link

    Feel better, Ryan, don't let the flu get you down! (Or is it Ganesh T S?)

    Looks like NVIDIA has an 8800 GT/9800 GT on its hands (for different reasons than the original 8800 GT)
  • Hrel - Tuesday, February 18, 2014 - link

    Seriously impressive performance/watt figures in here. Makes me wonder when we're going to see this applied to their higher-end GPUs.

    Looking at TSMC's site, they are already producing at 20nm in two fabs. Starting in May of this year they'll have a third up. Do you think it's likely May/June is when we'll see Maxwell make its way into higher-end GPUs, accompanied by a shift to 20nm?

    That approach would make sense to me: they'd have new product out in time for summer sales and enough time to ramp production and satiate early adopters before back-to-school specials start up.

    On a personal note: I'm still running a GTX 460, and the GTX 750 Ti seems to be faster in almost every scenario at lower power draw in a smaller package. So that's pretty cool. But since TSMC is already producing 20nm chips, I'm going to wait until this architecture can be applied at a smaller manufacturing process. That GPU is in a media PC, so gaming is a tertiary concern anyway.
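
As an aside on mindbomb's comment above, the scaling arithmetic is easy to make concrete (illustrative numbers only, no actual video processing):

```python
# Why image doubling a 720p source for a 1080p display ends in a downscale,
# while plain luma upscaling does not.
src_w, src_h = 1280, 720       # 720p source
disp_w, disp_h = 1920, 1080    # 1080p display

# Plain luma upscaling: 1280x720 -> 1920x1080
print(disp_w / src_w, disp_h / src_h)    # 1.5 1.5   -> upscale

# Image doubling first: 1280x720 -> 2560x1440, which exceeds the display,
# so the doubled image must then be *down*scaled to 1920x1080.
dbl_w, dbl_h = 2 * src_w, 2 * src_h
print(dbl_w, dbl_h)                      # 2560 1440
print(disp_w / dbl_w, disp_h / dbl_h)    # 0.75 0.75 -> downscale
```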
