HTPC Aspects : Network Streaming Performance

Windows 7-based HTPCs need hardware acceleration in both Adobe Flash and Microsoft Silverlight for optimal streaming performance with YouTube and Netflix. The move to Windows 8.1 has made Silverlight unnecessary. The Netflix app on Windows 8.x brings an HTPC's capabilities on par with those of dedicated streaming consoles, with support for Super HD (6 Mbps) streams as well as Dolby Digital Plus bitstreaming. The latest app also renders video in a way that makes taking screenshots an exercise in frustration.

As the above photograph shows, the Netflix app can be set to bitstream Dolby Digital Plus to the AV receiver, and the 750 Ti supports it. The video and audio streams are at 5.8 Mbps and 192 kbps respectively. It is not immediately evident whether GPU acceleration is being utilized, but tracking the GPU / VPU loading and PC power consumption numbers makes it obvious that the Netflix app is not relying on software decode.
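
For readers who want to reproduce this kind of measurement, the sketch below shows one way to log GPU and video engine (VPU) utilization while a stream is playing. It is a minimal sketch, not the tooling used for this review: it assumes an NVIDIA card with nvidia-smi on the PATH and a driver recent enough to expose the utilization.decoder query field, and power at the wall still has to come from an external meter.

```python
import csv
import subprocess
import time

# Poll nvidia-smi once a second for 3D engine and video-decode engine
# utilization, and log the samples to a CSV file for later averaging.
QUERY = "utilization.gpu,utilization.decoder"

def sample():
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        text=True,
    )
    gpu, vpu = (field.strip() for field in out.splitlines()[0].split(","))
    return float(gpu), float(vpu)

if __name__ == "__main__":
    with open("streaming_load.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s", "gpu_pct", "vpu_pct"])
        start = time.time()
        for _ in range(120):  # two minutes of playback
            gpu, vpu = sample()
            writer.writerow([round(time.time() - start, 1), gpu, vpu])
            time.sleep(1)
```

Sustained decoder activity alongside a near-idle 3D engine is the signature of fixed-function hardware decode; a software decoder would instead show elevated CPU usage and higher power consumption at the wall.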

Unlike Silverlight, Adobe Flash retains some relevance for now. YouTube continues to use Flash to serve FLV streams (at SD resolutions) and MP4 streams (at both SD and HD resolutions). YouTube's debug OSD indicates whether hardware acceleration is being used.

Similar to our Netflix streaming test, we recorded GPU / VPU loading as well as power consumption at the wall when streaming the 1080p version of the sample YouTube clip. The table below presents the relevant numbers for various configurations and streaming services.

Streaming Video Performance
                             Netflix                   YouTube
                             GPU/VPU Load   Power      GPU/VPU Load   Power
NVIDIA GeForce GTX 750 Ti    11.95/12.65%   56.44 W    16.26/15.74%   55.45 W
NVIDIA GeForce GT 640        5.99/25.80%    58.89 W    15.57/25.72%   58.93 W
AMD Radeon HD 7750           0.72%          66.79 W    3.57%          67.11 W

NVIDIA has been touting Maxwell's low-power nature, and the GTX 750 Ti proves to be the most power-efficient of the three candidates when it comes to GPU-accelerated streaming services.

Comments

  • Mondozai - Wednesday, February 19, 2014 - link

    Wait for the 800 series budget cards if you have the patience. Hopefully no more than 4-5 months if TSMC does very well on 20nm.
  • Jeffrey Bosboom - Wednesday, February 19, 2014 - link

    I understand the absolute hashrate on these cards will be low, but I'm interested to know how the focus on power consumption improves mining performance per watt. (Though I can't imagine these lowish-end cards would be used, even if efficient, given the fixed cost of the motherboards to put them in.)
  • Antronman - Wednesday, February 19, 2014 - link

    Nvidia's best cards have tiny hash rates compared to 95% of the AMD GPUs ever released.
  • JarredWalton - Wednesday, February 19, 2014 - link

    Apparently you're not up to speed on the latest developments. GTX 780 Ti as an example is now hitting about 700 KHash in scrypt, and word is the GTX 750 will be pretty competitive with 250-260 KHash at stock and much lower power consumption. Some people have actually put real effort into optimizing CUDAminer now, so while AMD still has an advantage, it's not nearly as large as it used to be. You could even make the argument that based on perf/watt in mining, some of NVIDIA's cards might even match AMD's top GPUs.
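
A quick back-of-the-envelope check of the perf/watt claim, using the hash rates quoted above and rated TDPs (250 W and 55 W) as assumed stand-ins for actual mining power draw, which will differ in practice:

```python
# Scrypt efficiency from the hash rates quoted above. The wattages are
# rated TDPs used as assumed proxies for mining draw, not measured values.
cards = {
    "GTX 780 Ti": (700, 250),  # KHash/s, assumed board power in watts
    "GTX 750":    (255, 55),
}

for name, (khash, watts) in cards.items():
    print(f"{name}: {khash / watts:.2f} KHash/W")
# GTX 780 Ti: 2.80 KHash/W
# GTX 750: 4.64 KHash/W
```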
  • darthrevan13 - Wednesday, February 19, 2014 - link

    Why did they choose to retire the 650 Ti Boost and replace it with the 750 Ti? The 650 Ti Boost is a much better card for high-end games because of its memory interface. They should have marketed the 750 Ti as the 750 and the 750 as the 740.

    And why on earth did they not include full support for HEVC and DX11.2? You're limiting the industry's adoption for years to come because of your move. I hope they will fix this in the next-generation 800 cards or when they transition to 20nm.
  • Ryan Smith - Thursday, February 20, 2014 - link

    Not speaking for NV here, but keep in mind that 650 Ti Boost is a cut-down GK106 chip. All things considered, 750 Ti will be significantly cheaper to produce for similar performance.

    NVIDIA really only needed it to counter Bonaire, and now that they have GM107 that's no longer the case.
  • FXi - Wednesday, February 19, 2014 - link

    No DX 11.2 or even 11.1 support? For THAT price??
    Pass...
  • rish95 - Wednesday, February 19, 2014 - link

    According to GeForce.com it supports 11.2. Not sure what's up with this:

    http://www.geforce.com/hardware/desktop-gpus/gefor...
  • willis936 - Wednesday, February 19, 2014 - link

    You don't need to be compliant to support something. Compliance means you meet all of the required criteria; support means you can run it without necessarily having all the bells and whistles. If console hardware has DX compliance, the devs will take advantage of that, and when games are ported you'll lose some of the neat graphics tricks. Some of them might still be doable in software; you'll just need a bigger GPU to get the same frame rates :p Some things might not be possible in software at all, though. Idk enough about DX to say.
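
The compliance-versus-support distinction can be seen programmatically: a GPU reports a Direct3D feature level, while DirectX 11.2 extras such as tiled resources are optional caps layered on top of it (there is no feature level 11_2; those caps are queried separately via CheckFeatureSupport). Below is a minimal Windows-only sketch, via ctypes, that asks the runtime for the highest feature level of the default adapter; it is illustrative only, not how vendor spec sheets are generated.

```python
import ctypes

# Query the highest Direct3D feature level on the default adapter.
# D3D11CreateDevice accepts NULL for the device/context out-parameters,
# so only the achieved feature level is returned.
D3D_DRIVER_TYPE_HARDWARE = 1
D3D11_SDK_VERSION = 7
LEVELS = {0xB100: "11_1", 0xB000: "11_0", 0xA100: "10_1", 0xA000: "10_0"}

def highest_feature_level():
    d3d11 = ctypes.windll.d3d11
    requested = (ctypes.c_uint * len(LEVELS))(*LEVELS)  # descending order
    achieved = ctypes.c_uint(0)
    hr = d3d11.D3D11CreateDevice(
        None, D3D_DRIVER_TYPE_HARDWARE, None, 0,
        requested, len(LEVELS), D3D11_SDK_VERSION,
        None, ctypes.byref(achieved), None,
    )
    if hr < 0:
        # A pre-11.1 runtime rejects a list containing 11_1; retry without it.
        requested = (ctypes.c_uint * 3)(0xB000, 0xA100, 0xA000)
        hr = d3d11.D3D11CreateDevice(
            None, D3D_DRIVER_TYPE_HARDWARE, None, 0,
            requested, 3, D3D11_SDK_VERSION,
            None, ctypes.byref(achieved), None,
        )
    if hr < 0:
        raise OSError(f"D3D11CreateDevice failed (HRESULT 0x{hr & 0xFFFFFFFF:08X})")
    return LEVELS.get(achieved.value, hex(achieved.value))

if __name__ == "__main__":
    print("Highest supported feature level:", highest_feature_level())
```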
  • sourav - Wednesday, February 19, 2014 - link

    Will this card work in a PCIe v2 slot?
