HTPC Aspects : Introduction

Home Theater PC (HTPC) enthusiasts keep close tabs on the launch of discrete GPUs that don't need a PCIe power connector. Such cards make it easy to upgrade an old PC with a low-wattage PSU into a multimedia powerhouse. Over the last decade or so, GPUs have implemented HTPC functionality in response to consumer demand as well as changing and expected market trends. In the beginning, we had hardware-accelerated MPEG-2 decode. This was followed by H.264 / VC-1 acceleration (thanks to the emergence of Blu-rays), HD audio bitstreaming and 3D video support. More recently, we have gained support for decode and playback of 4K video.

4K presents tangible benefits to consumers (unlike 3D), and market adoption is growing rapidly. In many respects, this is similar to how people migrated to 720p and 1080i TV sets when vendors started promoting high definition (HD). We know that those early adopters were stuck with expensive CRT-based TVs when LCD-based 1080p sets came to market at very reasonable prices. While there is no 'CRT-to-LCD'-like sea change on the horizon, end users need to keep in mind the imminent launch of HDMI '2.0' (the HDMI consortium wants to do away with version numbers for reasons known only to them) with 4Kp60 capability, along with display sinks fully compliant with that standard.
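To put the 4Kp60 requirement in perspective, here is a minimal back-of-the-envelope sketch (in Python, purely for illustration) of the pixel clocks involved. The 4400 x 2250 total timing and the 340 MHz / 600 MHz TMDS clock limits are the commonly quoted figures for the standard 4K HDMI timing and for HDMI 1.4 / 2.0 respectively; treat them as approximations rather than spec extracts.

# Rough check of why 4Kp60 needs HDMI "2.0"-class bandwidth.
# Figures are the commonly quoted 4K HDMI timing and TMDS clock limits,
# used here as approximations rather than spec extracts.

H_TOTAL, V_TOTAL = 4400, 2250          # 3840x2160 active + blanking
HDMI_1_4_MAX_MHZ = 340                 # approximate max TMDS clock, HDMI 1.4
HDMI_2_0_MAX_MHZ = 600                 # approximate max TMDS clock, HDMI 2.0

for refresh in (24, 30, 60):
    pixel_clock_mhz = H_TOTAL * V_TOTAL * refresh / 1e6
    ok_14 = pixel_clock_mhz <= HDMI_1_4_MAX_MHZ
    ok_20 = pixel_clock_mhz <= HDMI_2_0_MAX_MHZ
    print(f"4Kp{refresh}: ~{pixel_clock_mhz:.0f} MHz pixel clock -> "
          f"HDMI 1.4: {'yes' if ok_14 else 'no'}, "
          f"HDMI 2.0: {'yes' if ok_20 else 'no'}")

# 4Kp24 (~238 MHz) and 4Kp30 (~297 MHz) fit within HDMI 1.4;
# 4Kp60 (~594 MHz) needs the higher HDMI 2.0 limit.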

In the near future, most of the 4K material reaching consumers is expected to be encoded in H.264. Consumer devices such as GoPro cameras still record 4K in that codec only. From an HTPC GPU perspective, support for 4K H.264 decoding is therefore imperative. Most real-time encoding activities also use H.264; a good HEVC (H.265) encoder would definitely be more efficient in terms of bitrate, but it is very difficult to make a good HEVC encoder operate in real time. Archiving content wouldn't be a problem, though, so content from streaming services and local backups (where the encoding is done offline) can be expected to move to HEVC first. A future-proof HTPC GPU would be capable of HEVC decode too.
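To illustrate the archival scenario above, here is a minimal sketch in Python of how an offline HEVC re-encode might sit alongside the kind of fast H.264 encode a real-time job would use. It assumes an ffmpeg build with libx264 and libx265 on the PATH; the file names and quality settings are arbitrary placeholders, not recommendations.

# Sketch: offline HEVC (x265) archival encode vs. a fast H.264 encode of the
# kind real-time capture/streaming would use. Assumes an ffmpeg binary with
# libx264/libx265 is on the PATH; filenames and settings are placeholders.
import subprocess

def encode_hevc_archive(src: str, dst: str) -> None:
    """Offline archival encode: slow preset trades time for bitrate efficiency."""
    subprocess.run([
        "ffmpeg", "-i", src,
        "-c:v", "libx265", "-preset", "slow", "-crf", "28",
        "-c:a", "copy", dst,
    ], check=True)

def encode_h264_realtime(src: str, dst: str) -> None:
    """Fast H.264 encode, representative of real-time workloads."""
    subprocess.run([
        "ffmpeg", "-i", src,
        "-c:v", "libx264", "-preset", "veryfast", "-crf", "23",
        "-c:a", "copy", dst,
    ], check=True)

if __name__ == "__main__":
    # Example usage with a hypothetical GoPro clip.
    encode_hevc_archive("gopro_4k_clip.mp4", "archive_hevc.mkv")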

Where does the Maxwell-based GTX 750 Ti stand when the above factors are taken into account? Make no mistake, the NVIDIA GT 640 happens to be our favourite HTPC GPU when 4K capability is considered an absolute necessity. On paper, the GTX 750 Ti appears to be a great candidate to take over the reins from the GT 640. In order to evaluate its HTPC credentials, we put the GTX 750 Ti to the test against the Zotac GT 640 as well as the Sapphire Radeon HD 7750.

In our HTPC coverage, we first look at GPU support for network streaming services, followed by hardware decoder performance for local file playback; this section also covers madVR. In the third section, we take a look at miscellaneous HTPC aspects such as refresh rate accuracy and hardware encoder performance.
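Refresh rate accuracy is examined later, but a quick worked example shows why it matters: any mismatch between the display's actual refresh rate and the source frame rate forces the renderer to drop or repeat a frame at a predictable interval. A minimal sketch, with the 23.970 Hz display clock chosen as an arbitrary example of a slightly-off GPU:

# How a small refresh-rate error turns into periodic dropped/repeated frames.
# The source rate is the usual 24000/1001 "23.976" fps film cadence; the
# display rate is a hypothetical example of a GPU clocking slightly off target.

source_fps = 24000 / 1001          # ~23.976 fps film content
display_hz = 23.970                # hypothetical slightly-off refresh rate

drift_per_second = abs(display_hz - source_fps)   # frames of drift per second
seconds_per_glitch = 1 / drift_per_second         # one drop/repeat this often

print(f"Drift: {drift_per_second:.6f} frames/s -> one dropped/repeated frame "
      f"every ~{seconds_per_glitch / 60:.1f} minutes")

# With these example numbers the renderer drops or repeats a frame
# roughly every 2.8 minutes.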

The HTPC credentials of the cards were evaluated using the following testbed configuration:

NVIDIA GTX 750 Ti HTPC Testbed Setup
Processor          Intel Core i7-3770K - 3.50 GHz (Turbo to 3.9 GHz)
GPUs               NVIDIA GTX 750 Ti / Zotac GT 640 / Sapphire Radeon HD 7750
Motherboard        Asus P8H77-M Pro uATX
OS Drive           Seagate Barracuda XT 2 TB
Secondary Drives   OCZ Vertex 2 60 GB SSD + Corsair P3 128 GB SSD
Memory             G.SKILL ECO Series 4 GB (2 x 2 GB) DDR3-1333 SDRAM (PC3 10666) F3-10666CL7D-4GBECO CAS 9-9-9-24
Case               Antec VERIS Fusion Remote Max
Power Supply       Antec TruePower New TP-550 550W
Operating System   Windows 8.1 Pro
Display / AVR      Sony KDL46EX720 + Pioneer Elite VSX-32 / Acer H243H
Graphics Drivers   GeForce v334.69 / Catalyst 14.1 Beta
Software           CyberLink PowerDVD 13, MPC-HC 1.7.3, madVR 0.87.4

All three cards were evaluated using the same hardware and software configuration. The Sapphire Radeon HD 7750 has a slight advantage in the power consumption department thanks to its passive cooling solution (no fan to power). Other than that, we are making an apples-to-apples comparison when discussing power consumption numbers for various activities in the next few sections.

Comments (177)

  • Mondozai - Wednesday, February 19, 2014 - link

    Wait for 800 series budget cards if you have the patience. Hopefully no more than 4-5 months if TSMC does very well on 20.
  • Jeffrey Bosboom - Wednesday, February 19, 2014 - link

    I understand the absolute hashrate on these cards will be low, but I'm interested to know how the focus on power consumption improves mining performance per watt. (Though I can't imagine these lowish-end cards would be used, even if efficient, due to the fixed cost of motherboards to put them in.)
  • Antronman - Wednesday, February 19, 2014 - link

    Nvidia's best cards have tiny hash rates compared to 95% of the AMD GPUs ever released.
  • JarredWalton - Wednesday, February 19, 2014 - link

    Apparently you're not up to speed on the latest developments. GTX 780 Ti as an example is now hitting about 700 KHash in scrypt, and word is the GTX 750 will be pretty competitive with 250-260 KHash at stock and much lower power consumption. Some people have actually put real effort into optimizing CUDAminer now, so while AMD still has an advantage, it's not nearly as large as it used to be. You could even make the argument that based on perf/watt in mining, some of NVIDIA's cards might even match AMD's top GPUs.
  • darthrevan13 - Wednesday, February 19, 2014 - link

    Why did they choose to retire the 650 Ti Boost and replace it with the 750 Ti? The 650 Ti B is a much better card for high end games because of the memory interface. They should have marketed the 750 Ti as the 750 and the 750 as the 740.

    And why on earth did they not include full support for HEVC and DX11.2? You're limiting the industry's adoption for years to come because of your move. I hope they will fix this in the next generation 800 cards or when they transition to 20nm.
  • Ryan Smith - Thursday, February 20, 2014 - link

    Not speaking for NV here, but keep in mind that 650 Ti Boost is a cut-down GK106 chip. All things considered, 750 Ti will be significantly cheaper to produce for similar performance.

    NVIDIA really only needed it to counter Bonaire, and now that they have GM107 that's no longer the case.
  • FXi - Wednesday, February 19, 2014 - link

    No DX 11.2 or even 11.1 support? For THAT price??
    Pass...
  • rish95 - Wednesday, February 19, 2014 - link

    According to GeForce.com it supports 11.2. Not sure what's up with this:

    http://www.geforce.com/hardware/desktop-gpus/gefor...
  • willis936 - Wednesday, February 19, 2014 - link

    You don't need to be compliant to support something. Compliance means you meet all required criteria. Support means you can run it without having necessarily all the bells and whistles. If console hardware has DX compliance then the devs will take advantage of that and when they're ported you'll lose some of the neat graphics tricks. They might still be able to be done in software, you'll just need a bigger GPU to get the same frame rates :p Some things might not be able to be done in software though. Idk enough about DX to say.
  • sourav - Wednesday, February 19, 2014 - link

    Will this work in a PCIe v2 slot?
