An interesting feature has turned up in NVIDIA’s latest drivers: the ability to drive certain displays over HDMI at 4K@60Hz. This is a feat that would normally require HDMI 2.0 – a standard not yet available in any shipping GPU – so to say it’s unexpected is a bit of an understatement. As it turns out, however, the situation is not as cut and dried as it first appears, and there is a notable catch.

As first discovered by users, including AT Forums user saeedkunna, when Kepler-based video cards running NVIDIA’s R340 drivers are paired with very recent 4K TVs, they gain the ability to drive those displays at 4K@60Hz over HDMI 1.4. These setups were previously limited to 4K@30Hz by HDMI 1.4’s available bandwidth, and while that limitation hasn’t gone anywhere, TV manufacturers – and now NVIDIA – have implemented an interesting workaround that teeters between clever and awful.

Lacking the available bandwidth to fully support 4K@60Hz until the arrival of HDMI 2.0, the latest crop of 4K TVs such as the Sony XBR 55X900A and Samsung UE40HU6900 have implemented what amounts to a lower image quality mode that allows for a 4K@60Hz signal to fit within HDMI 1.4’s 8.16Gbps bandwidth limit. To accomplish this, manufacturers are making use of chroma subsampling to reduce the amount of chroma (color) data that needs to be transmitted, thereby freeing up enough bandwidth to increase the image resolution from 1080p to 4K.
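The arithmetic behind the workaround is straightforward. A rough back-of-the-envelope sketch in Python (ignoring blanking intervals and TMDS encoding overhead, so these are floor figures for the active pixel data alone):

```python
# Rough bandwidth arithmetic for 4K@60Hz over HDMI 1.4
# (active pixels only; blanking intervals and encoding overhead ignored).
WIDTH, HEIGHT, HZ = 3840, 2160, 60
pixels_per_sec = WIDTH * HEIGHT * HZ

# RGB / Y'CbCr 4:4:4: 24 bits per pixel (8 bits per channel)
rgb_gbps = pixels_per_sec * 24 / 1e9

# Y'CbCr 4:2:0: full-resolution luma (8 bits/pixel) plus one Cb and one Cr
# sample per 2x2 block (16 bits per 4 pixels -> 4 bits/pixel on average),
# for 12 bits per pixel overall.
ycbcr420_gbps = pixels_per_sec * 12 / 1e9

print(f"4:4:4 -> {rgb_gbps:.2f} Gbps")      # ~11.94 Gbps: over the 8.16Gbps limit
print(f"4:2:0 -> {ycbcr420_gbps:.2f} Gbps") # ~5.97 Gbps: fits within HDMI 1.4
```

Even before accounting for blanking, full-chroma 4K@60Hz needs roughly 11.94Gbps of pixel data, well past HDMI 1.4’s 8.16Gbps; halving the per-pixel cost via 4:2:0 brings it comfortably under the limit.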


An example of a current generation 4K TV: Sony's XBR 55X900A

Specifically, manufacturers are making use of Y'CbCr 4:2:0 subsampling, a lower quality sampling mode that requires just ¼ of the color information of regular Y'CbCr 4:4:4 or RGB sampling. By using this mode, manufacturers are able to transmit an image with full resolution luma (brightness) but only a fraction of the chroma resolution, achieving the necessary bandwidth savings.
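Mechanically, 4:2:0 keeps every luma sample but stores a single chroma pair per 2×2 pixel block. A minimal sketch (the averaging filter here is an illustrative choice; real encoders use various filters):

```python
# Sketch of Y'CbCr 4:2:0 chroma subsampling: luma (Y') is untouched, but
# each chroma plane (Cb, Cr) keeps only one sample per 2x2 pixel block.
def subsample_420(chroma):
    """Average each 2x2 block of a chroma plane (a list of lists)."""
    h, w = len(chroma), len(chroma[0])
    return [[(chroma[y][x] + chroma[y][x + 1] +
              chroma[y + 1][x] + chroma[y + 1][x + 1]) // 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# A 4x4 chroma plane collapses to 2x2: only 1/4 of the samples survive.
cb = [[100, 100, 200, 200],
      [100, 100, 200, 200],
      [ 50,  50,  50,  50],
      [ 50,  50,  50,  50]]
print(subsample_420(cb))  # [[100, 200], [50, 50]]
```

The luma plane stays at 3840×2160 while each chroma plane drops to 1920×1080, which is where the ¼ figure comes from.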


Wikipedia: diagram on chroma subsampling

The use of chroma subsampling is as old as color television itself; however, its use in this fashion is uncommon. Most HDMI PC-to-TV setups to date use RGB or 4:4:4 sampling, both of which are full resolution and functionally lossless. 4:2:0 sampling, on the other hand, is not normally used for the last stage of transmission between source and sink devices – in fact HDMI didn’t even officially support it until recently – and is instead used in the storage of the source material itself, be it Blu-ray discs, TV broadcasts, or streaming videos.

Perceptually 4:2:0 is an efficient way to throw out unnecessary data, making it a good way to pack video, but at the end of the day it’s still ¼ the color information of a full resolution image. Since video sources are already 4:2:0 this ends up being a clever way to transmit video to a TV, as at the most basic level a higher quality mode would be redundant (post-processing aside). But while this works well for video it also only works well for video; for desktop workloads it significantly degrades the image as the color information needed to drive subpixel-accurate text and GUIs is lost.
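The desktop penalty is easy to demonstrate: a one-pixel-wide color transition simply cannot survive when one chroma sample covers multiple pixels. A hypothetical 1D round trip (pair-averaging down, nearest-neighbor back up) shows the smearing:

```python
# Why 4:2:0 hurts text: subsample a sharp chroma edge 2:1, then upsample
# it back, and the color boundary is smeared across the wrong pixels.
def round_trip_420(row):
    """Subsample a 1D chroma row 2:1 (average pairs), then upsample by repetition."""
    sub = [(row[i] + row[i + 1]) // 2 for i in range(0, len(row), 2)]
    return [v for v in sub for _ in (0, 1)]

edge = [0, 0, 0, 255, 255, 255]  # sharp chroma edge between pixels 2 and 3
print(round_trip_420(edge))      # [0, 0, 127, 127, 255, 255] -> edge smeared
```

The crisp transition comes back as a two-pixel halfway band, which is exactly the sort of chroma fringing that wrecks subpixel-rendered text while going unnoticed in natural video.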

In any case, with 4:2:0 4K TVs already on the market, NVIDIA has confirmed that they are enabling 4:2:0 4K output on Kepler cards with their R340 drivers. What this means is that Kepler cards can drive 4:2:0 4K TVs at 60Hz today, but they are doing so in a manner that’s only useful for video. For HTPCs this ends up being a good compromise and as far as we can gather this is a clever move on NVIDIA’s part. But for anyone who is seeing the news of NVIDIA supporting 4K@60Hz over HDMI and hoping to use a TV as a desktop monitor, this will still come up short. Until the next generation of video cards and TVs hit the market with full HDMI 2.0 support (4:4:4 and/or RGB), DisplayPort 1.2 will remain the only way to transmit a full resolution 4K image.


  • Mr Perfect - Saturday, June 21, 2014 - link

    Yes, you're right.

    I was thinking he was comparing a 30Hz panel(that has a pixel response that is only fast enough to support 30Hz) to a 60Hz panel(with a pixel response fast enough to support 60Hz).
    Reply
  • cheinonen - Saturday, June 21, 2014 - link

    That's incorrect. In a 60 Hz 4K display, the pixels are always going to refresh at 60 Hz. If it has a 30 Hz signal, then it refreshes each frame twice. If you have a 24 Hz film signal, then you have 3:2 pull-down applied and get judder. Panels that run at 120 Hz in TVs work the same way, only you can eliminate judder by repeating each frame 5 times.

    The benefit here is you can do 4K video content that is at 60 Hz (like the World Cup right now) natively. Since all consumer video content is already 4:2:0 you don't lose any resolution. As many displays convert everything back to 4:2:0 (even if they receive a 4:2:2, 4:4:4 or RGB signal) before sending it to the screen, you won't notice a difference in color detail as well.
    Reply
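The frame-repeat cadences described in this comment can be sketched numerically; a minimal illustration (the function name is my own):

```python
# Frame-repeat cadence for fitting a source frame rate into a fixed panel
# refresh: 24fps on a 60Hz panel alternates 3 and 2 repeats (3:2 pull-down,
# the source of judder), while a 120Hz panel divides 24fps evenly.
def cadence(fps, refresh_hz, frames=4):
    """How many panel refreshes each of the first few source frames occupies."""
    repeats, shown = [], 0
    for i in range(1, frames + 1):
        target = int(i * refresh_hz / fps + 0.5)  # refreshes elapsed by frame i
        repeats.append(target - shown)
        shown = target
    return repeats

print(cadence(24, 60))   # [3, 2, 3, 2] -> uneven repeats: judder
print(cadence(24, 120))  # [5, 5, 5, 5] -> even repeats: no judder
print(cadence(30, 60))   # [2, 2, 2, 2] -> each frame simply shown twice
```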
  • Mr Perfect - Saturday, June 21, 2014 - link

    Hmm, so would the 60Hz have more or less judder on a 24FPS film than 30Hz? I'd think the higher refresh would be more likely to be ready when the frame changes. Reply
  • crimsonson - Sunday, June 22, 2014 - link

    You have a fundamental misunderstanding of chroma subsampling and how it works.

    If you subsample a video that is already a subsample from previous, you further degrade color accuracy and quality.

    And I am unsure what you mean by "convert everything back to 4:2:0". That is not how chroma subsampling works.
    Reply
  • crimsonson - Sunday, June 22, 2014 - link

    BTW - you actually lose overall resolution resolving power with chroma subsampling. Reply
  • Sivar - Friday, June 20, 2014 - link

    Interesting stuff, and great article. Packed with useful info and an example in only 7 paragraphs. It's great to feel like I've learned something important and only spent 5 minutes doing it.

    Why does 4:2:0 work so well in video if not on the desktop? Is it because the video quality is already compromised at 4:2:0 or that video inherently needs less color information because of the tendency for smooth gradations and fast motion, compared to the sharp contrasts in desktop video?
    Reply
  • Ryan Smith - Friday, June 20, 2014 - link

    For a better explanation I recommend the Wikipedia article on chroma subsampling, but basically when it comes to images/video the human eye is less sensitive to chroma information than it is luma. Whether someone's skin is sampled 4:4:4 or 4:2:0 is unlikely to be noticed except in extreme circumstances.

    However as you correctly note, text and GUIs have sharp contrasts that natural images do not. This is further exacerbated by the fact that we use subpixel rendering techniques (e.g. ClearType) to improve the readability of text on these relatively low DPI displays. Since 4:2:0 reduces contrast at the edge, it essentially blurs text and makes it harder to read.

    Also of note: JPEG also uses chroma subsampling. Which is part of the reason why it's so hard on text, especially at higher compression ratios where 4:2:0 is used.
    Reply
  • dmytty - Friday, June 20, 2014 - link

    I was surprised that there were no active display adapters shown at Computex '14 for DP 1.2 to HDMI 2.0 with support for 4k 60 fps. Any input? Reply
  • Ryan Smith - Saturday, June 21, 2014 - link

    There's a multitude of potential reasons. At the end of the day the HDMI 2.0 spec was finalized less than a year ago (you usually need a year to bring new silicon like that to market) and I'm not sure if one could do it with just the power provided by DP; you may need to use external USB power, like DL-DVI adapters (which would drive up the cost significantly). Reply
  • dmytty - Saturday, June 21, 2014 - link

    The HDMI 2.0 tx/rx silicon has been ready for some time. As far as power goes, Thunderbolt would have that solved...and think of all those Retina loving Mac fans with Thunderbolt and Retina display needs. Reply
