HTPC Aspects : 4K Decode and Display

From an HTPC perspective, GPUs over the last two generations have done little to tempt users into upgrading since HD audio bitstreaming became a commodity feature. The appearance of 3D TVs called for some updates from the GPU vendors, but the technology never picked up at the pace the industry wanted.

With the 3D craze having been milked dry, it is now time for a new buzzword: 4K. Retina displays have become the focus of much discussion, thanks to Apple's promotion, and, at Computex, we saw the introduction of products with 11" and 13" screens at 1080p resolution. It is not hard to imagine 4K panels becoming commonplace in TVs of 32" and larger, and even in monitors of 24" and larger.

The one aspect 4K has going for it is that the higher resolution (for video, at least) is unlikely to have any ill effects on viewers' health. Unlike 3D, which caused discomfort for a number of consumers, 4K should have much smoother sailing in gaining marketplace acceptance. In addition, 4K is the natural step towards a more immersive experience. As such, we are more positive about 4K, from both a consumer and an industry perspective, than we ever were about the 3D initiative.

For early adopters, the current issue with the 4K ecosystem is that HDMI officially supports only up to 4096 x 2160 @ 24 Hz and 3840 x 2160 @ 30 Hz. For a smooth desktop experience at 4K resolution, it is imperative that we get a 60 Hz refresh rate at 4096 x 2160; that is scheduled to come in the next update to the HDMI specification. It is also unfortunate that we are restricted to 4096 x 2160 as the maximum resolution, when the official 4K cinema specifications are only slightly larger at 4096 x 2304.
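Some back-of-the-envelope math shows why. The sketch below (our own arithmetic, using raw active-pixel rates; real HDMI timings add blanking intervals on top of these figures) compares each mode against the roughly 340 MHz TMDS clock ceiling of the current specification:

```python
# Rough HDMI bandwidth sanity check: active pixels per second vs. the
# ~340 MHz TMDS clock ceiling of the current HDMI specification.
# Real timings include blanking, so actual pixel clocks run higher.
MAX_TMDS_MHZ = 340

modes = [
    ("4096 x 2160 @ 24 Hz", 4096, 2160, 24),
    ("3840 x 2160 @ 30 Hz", 3840, 2160, 30),
    ("4096 x 2160 @ 60 Hz", 4096, 2160, 60),  # what the desktop needs
]

for name, w, h, hz in modes:
    mpx = w * h * hz / 1e6  # raw active-pixel rate, Mpixels/s
    verdict = "fits under" if mpx <= MAX_TMDS_MHZ else "exceeds"
    print(f"{name}: {mpx:.0f} Mpx/s {verdict} the ~340 MHz limit")
```

Even before blanking is accounted for, 4K @ 60 Hz blows past the link budget, which is why it has to wait for the next revision of the specification.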

In any case, the Zotac GT 640 that we are looking at today is compliant with the current HDMI 4K specifications. The 4K resolution is available over all three ports (using an appropriate DVI to HDMI converter). Both DVI ports are also capable of carrying audio.

AMD's GCN lineup is also compliant with the HDMI 4K specifications, but it is the GT 640 which has excited us enough to talk about this in detail. While AMD's 4K hardware decode remains an unusable feature for the general consumer right now, NVIDIA's Kepler implementation fares much better.

Due to the aforementioned issues with the mini-HDMI port on Zotac's card, we tested 4K output over the dual-link DVI port, connected to the Sony VPL-VW1000ES through a DL-DVI to HDMI adapter.

Currently, native DXVA mode implementations tend to crash the system. However, using LAV Filters 0.50.5 in CUVID or DXVA2 copy-back mode, we were able to decode H.264 streams at resolutions greater than 1080p using the GPU.
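LAV Filters does this inside a DirectShow graph; for readers who want to exercise the same fixed-function NVIDIA decoder outside of a media player, the sketch below is a minimal illustration of ours (assuming an FFmpeg build compiled with CUVID support; "sample_4k.mp4" is a placeholder for any >1080p H.264 clip):

```python
import subprocess

# Benchmark GPU-accelerated H.264 decode via FFmpeg's CUVID wrapper.
cmd = [
    "ffmpeg", "-hide_banner",
    "-c:v", "h264_cuvid",   # NVIDIA fixed-function decoder (the VPU)
    "-i", "sample_4k.mp4",  # placeholder clip name
    "-f", "null", "-",      # decode only; discard the frames
]
result = subprocess.run(cmd, capture_output=True, text=True)

# FFmpeg reports fps/speed on stderr; the last line carries the summary.
lines = result.stderr.strip().splitlines()
print(lines[-1] if lines else "no output; is ffmpeg on the PATH?")
```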

[Screenshot: playback of a 4096 x 2304 H.264 stream]

The screenshot above shows the playback of a 4096 x 2304 H.264 stream at 24 fps.

We see that the GPU's VPU is at approximately 60% load. EVR-CP doesn't load up the GPU core too much (less than 50% core utilization). Note that the maximum refresh rate possible at 4096 x 2160 is only 24 Hz, as indicated by the EVR-CP statistics. Another point to note is that the LAV Video Decoder is operating in CUVID mode.

CUVID acceleration also works for videos of arbitrary resolution above 1080p. The screenshot below shows flawless decode acceleration of a 3412 x 1920 video at 25 fps. At 3840 x 2160 (Quad FHD), the GPU is able to drive the desktop at a refresh rate of 29.97 Hz. In this case, the VPU load is a bit lower (around 45%), as expected.
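The lower VPU load lines up with simple pixel-rate scaling (our arithmetic, assuming decode load is roughly proportional to pixels decoded per second):

```python
# Decode load should scale roughly with pixels decoded per second.
ref_rate = 4096 * 2304 * 24   # first clip: ~226 Mpx/s at ~60% VPU load
new_rate = 3412 * 1920 * 25   # second clip: ~164 Mpx/s

predicted = 60 * new_rate / ref_rate
print(f"predicted VPU load: ~{predicted:.0f}%")  # ~43%, vs. ~45% observed
```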

[Screenshot: decode acceleration of a 3412 x 1920 video at 25 fps on a 3840 x 2160 desktop]

How well do 4K decode and rendering work with other combinations of decoders and renderers? The usage graphs below present the CPU package power, GPU core load, memory controller load, video engine load and video bus load when playing our 4K test clip (the original version of this YouTube video) on a 1080p display (which is probably how most consumers will experience 4K content for some time to come). As usual, we accept no quality tradeoffs in madVR and go with the high quality settings that we have used in previous reviews.

[Usage graphs: CPU package power, GPU core load, memory controller load, video engine load and video bus load for various decoder / renderer combinations]

It is immediately obvious that the GT 640 is not in any way up to the task of madVR processing on 4K content, even when it is just downscaling to 1080p. As evident from the graphs above, the core is maxed out whenever madVR is the renderer, irrespective of the decoder used. Our suggestion is to retain EVR-CP as the renderer for all 4K content.
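Rough arithmetic (ours) illustrates why a 1080p output target doesn't rescue the GT 640: the renderer still has to read and filter the full decoded 4K surface every frame:

```python
# The renderer's input is the decoded 4K surface, so even with a 1080p
# target, madVR must ingest several times the pixels of a Blu-ray frame.
fourk_in = 4096 * 2304 * 24 / 1e6   # ~226 Mpx/s fed to the scaler
bluray_in = 1920 * 1080 * 24 / 1e6  # ~50 Mpx/s for a typical Blu-ray
print(f"{fourk_in:.0f} vs {bluray_in:.0f} Mpx/s "
      f"-> {fourk_in / bluray_in:.1f}x the scaling workload")
```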

Comments

  • cjs150 - Thursday, June 21, 2012 - link

    "God forbid there be a technical reason for it.... "

    Intel and Nvidia have had several generations of chips to fix any technical issue and didn't (HD4000 is good enough though). AMD has been pretty close to the correct frame rate for a while.

    But it is not enough to have the capability to run at the correct frame rate if you make it too difficult to change the frame rate to the correct setting. That is not a hardware issue, just bad software design.
  • UltraTech79 - Wednesday, June 20, 2012 - link

    Anyone else really disappointed in 4K still being standardized around 24 fps? I thought 60 would be the minimum standard by now, with 120 in higher-end displays. 24 is crap. Anyone who has seen a movie recorded at 48+ FPS knows what I'm talking about.

    This is like putting shitty unleaded gas into a super high-tech racecar.
  • cjs150 - Thursday, June 21, 2012 - link

    You do know that Blu-ray is displayed at 23.976 FPS? That looks very good to me.

    Please do not confuse screen refresh rates with frame rates. Screen refresh on most large TVs runs at between 60 and 120 Hz; anything below 60 tends to look crap. (If you want real crap, try running American TV on a European PAL system - I mean crap in a technical sense, not creatively!)

    I must admit that having an fps of 23.976 rather than some round number such as 24 (or higher) is rather daft, and some new films are coming out with much higher frame rates. I have a horrible recollection that the reason for such an odd rate is historic - something to do with the length of 35mm film needed per second. The problem is I cannot remember whether that was simply because 35mm film was expensive and this was the minimum rate for smooth movement, or whether it goes right back to the days when film had a tendency to catch light, and it was the maximum speed you could run film through a projector without friction setting it alight. No doubt there is an expert on this site who could explain precisely why we ended up with such a silly number as the standard.
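For what it's worth, the 24 fps base rate does go back to 35mm sound film, but the 0.1% offset is NTSC color television's doing: the color subcarrier forced the nominal 60 Hz field rate down by a factor of 1000/1001, and film transferred with 3:2 pulldown inherited the same ratio. A quick derivation (a minimal Python sketch):

```python
from fractions import Fraction

# NTSC color broadcasting pulled the nominal 60 Hz field rate down by
# 1000/1001 to keep the color subcarrier clear of the audio carrier;
# everything downstream inherited the ratio.
factor = Fraction(1000, 1001)
fields = 60 * factor  # 60000/1001 ~= 59.94 fields/s
frames = 30 * factor  # 29.97 frames/s
film   = 24 * factor  # 23.976 fps after 3:2 pulldown telecine
print(float(fields), float(frames), float(film))
```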
  • UltraTech79 - Friday, June 22, 2012 - link

    You are confusing things here. I clearly said 120 (fps) would need higher-end displays (120 Hz). I was rounding 23.976 FPS up to 24, give me a break.

    That it looks good /to you/ is wholly irrelevant. Do you realize how many people said "it looks very good to me" about SD while resisting the HD movement? Or how many will say it again about 1080p, thinking 4K is too much? It's a ridiculous mindset.

    My point was that we are upping the resolution but leaving another very important aspect, which we also need to improve, in the dust. Even audio is moving faster than movie frame rates, and now that most places are switching to digital, the cost to go to the next step has dropped dramatically.
  • nathanddrews - Friday, June 22, 2012 - link

    It was NVIDIA's choice to only implement 4K @ 24Hz (23.xxx) due to limitations of HDMI. If NVIDIA had optimized around DisplayPort, you could then have 4K @ 60Hz.

    For computer use, anything under 60Hz is unacceptable. For movies, 24Hz has been the standard for a century - all film is 24fps and most movies are still shot on film. In the next decade, there will be more and more films that will use 48, 60, even 120fps. Cameron was cock-blocked by the studio when he wanted to film Avatar at 60fps, but he may get his wish for the sequels. Jackson is currently filming The Hobbit at 48fps. Eventually all will be right with the world.
  • karasaj - Wednesday, June 20, 2012 - link

    If we wanted to use this to compare a 640M or 640M LE to the GT 640, is that doable? If it's built on the same chip (both have 384 CUDA cores), can we just scale the numbers by a rough percentage of the core clock speed to estimate what the respective cards would put out? I.e., the 640M LE has a clock of 500 MHz and the 640M is ~625 MHz. Could we expect ~55% of this for the 640M LE and ~67% for the 640M? Assuming DDR3 on both, so as not to have that kind of difference.
  • Ryan Smith - Wednesday, June 20, 2012 - link

    It would be fairly easy to test a desktop card at a mobile card's clocks (assuming memory type and functional unit count are equal), but you can't extrapolate performance like that because there's more to performance than clockspeeds. In practice performance shouldn't drop by that much, since we're already memory bandwidth bottlenecked with DDR3.
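To put rough numbers on that (purely illustrative, assuming the desktop GT 640's 900 MHz core clock): naive clock scaling only describes a fully compute-bound workload, while a fully bandwidth-bound one would hardly drop at all with the same DDR3 setup:

```python
# Toy bounds on mobile-part performance relative to the desktop GT 640.
# A compute-bound workload scales with core clock; a memory-bandwidth-
# bound one barely moves if the DDR3 configuration is unchanged.
DESKTOP_CLOCK_MHZ = 900.0  # assumed desktop GT 640 (DDR3) core clock

for name, clock in [("640M LE", 500.0), ("640M", 625.0)]:
    compute_bound = clock / DESKTOP_CLOCK_MHZ
    print(f"{name}: ~{compute_bound:.0%} of GT 640 if compute-bound, "
          f"near 100% if fully bandwidth-bound")
```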
  • jstabb - Wednesday, June 20, 2012 - link

    Can you verify whether creating a custom resolution breaks 3D (frame-packed) Blu-ray playback?

    With my GT430, once a custom resolution has been created for 23/24 Hz, that custom resolution overrides the 3D frame-packed resolution created when 3D Vision is enabled. The driver appears to have simple fall-through logic: if a custom resolution is defined for the selected resolution/refresh rate, it is always used; failing that, it uses a 3D resolution if one is defined; failing that, it uses the default 2D resolution.

    This issue made the custom resolution feature useless to me on the GT430 and pushed me to an AMD solution for its better OOTB refresh rate matching. I'd like to consider this card if the issue has been resolved.

    Thanks for the great review!
  • MrSpadge - Wednesday, June 20, 2012 - link

    It consumes just about as much as the 800 MHz HD 7750, yet performs miserably in comparison. This is an amazing win for AMD, especially when you compare how the GTX 680 and HD 7970 stack up!
  • UltraTech79 - Wednesday, June 20, 2012 - link

    This performs about as well as an 8800 GTS at twice the price, or delivers half the performance of a GTX 460 at the same price.

    These should have been priced at $59.99.
