An interesting feature has turned up in NVIDIA’s latest drivers: the ability to drive certain displays over HDMI at 4K@60Hz. This is a feat that would normally require HDMI 2.0 – a feature not available in any GPU shipping thus far – so to say it’s unexpected is a bit of an understatement. As it turns out, however, the situation is not as cut and dried as it first appears, and there is a notable catch.

As first discovered by users, including AT Forums user saeedkunna, when Kepler based video cards running NVIDIA’s R340 drivers are paired with very recent 4K TVs, they gain the ability to output to those displays at 4K@60Hz over HDMI 1.4. These setups were previously limited to 4K@30Hz by HDMI’s available bandwidth, and while that limitation hasn’t gone anywhere, TV manufacturers and now NVIDIA have implemented a workaround that teeters between clever and awful.

With HDMI 1.4 lacking the bandwidth to fully support 4K@60Hz, and HDMI 2.0 not yet available, the latest crop of 4K TVs such as the Sony XBR 55X900A and Samsung UE40HU6900 have implemented what amounts to a lower image quality mode that allows a 4K@60Hz signal to fit within HDMI 1.4’s 8.16Gbps bandwidth limit. To accomplish this, manufacturers are making use of chroma subsampling to reduce the amount of chroma (color) data that needs to be transmitted, thereby freeing up enough bandwidth to increase the image resolution from 1080p to 4K.


An example of a current generation 4K TV: Sony's XBR 55X900A
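
To put rough numbers on the bandwidth problem, here is a quick back-of-the-envelope sketch in Python. It counts active pixels only and ignores blanking intervals and link encoding overhead, so real-world figures run higher, but the relative sizes are what matter:

    # Rough HDMI bandwidth math: active pixels only, no blanking or
    # encoding overhead, so actual link requirements are somewhat higher.

    def bitrate_gbps(width, height, fps, bits_per_pixel):
        """Raw video bitrate in Gbps for the given mode."""
        return width * height * fps * bits_per_pixel / 1e9

    HDMI_14_DATA_RATE_GBPS = 8.16  # the limit cited above

    # Full-resolution RGB or Y'CbCr 4:4:4: 24 bits per pixel.
    full_444 = bitrate_gbps(3840, 2160, 60, 24)

    # Y'CbCr 4:2:0: 8 bits of luma per pixel plus two 8-bit chroma samples
    # shared by every 2x2 block, for an average of 12 bits per pixel.
    subsampled_420 = bitrate_gbps(3840, 2160, 60, 12)

    print(f"4K60 4:4:4: {full_444:.1f} Gbps")        # ~11.9 Gbps, over the limit
    print(f"4K60 4:2:0: {subsampled_420:.1f} Gbps")  # ~6.0 Gbps, fits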

Specifically, manufacturers are making use of Y'CbCr 4:2:0 subsampling, a lower quality sampling mode that requires ¼ the color information of regular Y'CbCr 4:4:4 sampling or RGB sampling. By using this sampling mode manufacturers are able to transmit an image that carries full resolution luma (brightness) but only a fraction of the chroma resolution, achieving the necessary bandwidth savings.


Wikipedia: diagram on chroma subsampling
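
For the curious, the mechanics of 4:2:0 are simple to sketch: keep every luma sample, and keep one chroma sample per 2x2 block of pixels. The NumPy snippet below is an illustrative approximation (real encoders use proper filtering rather than a plain block average), but it shows where the ¼ figure comes from:

    import numpy as np

    def subsample_420(ycbcr):
        """Keep full-resolution Y'; average Cb and Cr over 2x2 blocks.

        ycbcr: float array of shape (H, W, 3) with H and W even.
        Returns the luma plane plus the two quarter-size chroma planes.
        """
        h, w, _ = ycbcr.shape
        y = ycbcr[:, :, 0]
        # Group each chroma plane into 2x2 blocks and average each block.
        cb = ycbcr[:, :, 1].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        cr = ycbcr[:, :, 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        return y, cb, cr

    # A UHD frame: 3840x2160 luma samples, but only 1920x1080 samples each
    # of Cb and Cr after subsampling -- 1/4 of the original chroma data.
    frame = np.zeros((2160, 3840, 3), dtype=np.float32)
    y, cb, cr = subsample_420(frame)
    print(y.shape, cb.shape, cr.shape)  # (2160, 3840) (1080, 1920) (1080, 1920)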

The use of chroma subsampling is as old as color television itself; however, using it in this fashion is uncommon. Most HDMI PC-to-TV setups to date use RGB or 4:4:4 sampling, both of which are full resolution and functionally lossless. 4:2:0 sampling, on the other hand, is not normally used for the final transmission stage between source and sink devices – in fact HDMI didn’t even officially support it until recently – and is instead used in the storage of the source material itself, be it Blu-ray discs, TV broadcasts, or streaming videos.

Perceptually 4:2:0 is an efficient way to throw out unnecessary data, making it a good way to pack video, but at the end of the day it’s still ¼ the color information of a full resolution image. Since video sources are already 4:2:0 this ends up being a clever way to transmit video to a TV, as at the most basic level a higher quality mode would be redundant (post-processing aside). But while this works well for video it also only works well for video; for desktop workloads it significantly degrades the image as the color information needed to drive subpixel-accurate text and GUIs is lost.
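
A toy example shows why fine, color-dependent detail suffers. Take two neighbouring pixels whose chroma values sit on opposite sides of the neutral point, the sort of pattern that subpixel-rendered text and sharp GUI edges produce; the Cb values below are made up purely for illustration:

    import numpy as np

    # Two adjacent pixels with opposing chroma (hypothetical Cb samples,
    # centred on the neutral value of 128), repeated on two rows so that
    # together they form one 2x2 block.
    cb = np.array([[ 90.0, 166.0],
                   [ 90.0, 166.0]])

    # 4:2:0 keeps a single chroma sample per 2x2 block: the block average.
    cb_420 = cb.reshape(1, 2, 1, 2).mean(axis=(1, 3))
    print(cb_420)  # [[128.]] -- the opposing colors cancel out to neutral

Averaged over the block, the opposing chroma samples collapse to neutral, so the color detail is simply gone; only the luma differences survive, which is fine for film content but not for text and GUIs drawn with color-dependent rendering.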

In any case, with 4:2:0 4K TVs already on the market, NVIDIA has confirmed that they are enabling 4:2:0 4K output on Kepler cards with their R340 drivers. What this means is that Kepler cards can drive 4:2:0 4K TVs at 60Hz today, but they are doing so in a manner that’s only useful for video. For HTPCs this ends up being a good compromise, and as far as we can gather this is a clever move on NVIDIA’s part. But for anyone who has seen the news of NVIDIA supporting 4K@60Hz over HDMI and is hoping to use a TV as a desktop monitor, this will still come up short. Until the next generation of video cards and TVs hit the market with full HDMI 2.0 support (4:4:4 and/or RGB), DisplayPort 1.2 will remain the only way to transmit a full resolution 4K image.

Comments

  • surt - Friday, June 20, 2014 - link

    Yawn. Wake me when I can run 4k 120p.
  • SirKnobsworth - Saturday, June 21, 2014 - link

    DisplayPort 1.3 will allow this, and the spec is due out some time this year. Hopefully we'll start seeing it in actual devices next year.
  • haardrr - Sunday, June 22, 2014 - link

    Funny for 120Hz 4K... 12GHz... 3840x2160x12x120... (x resolution * y resolution) * bit depth * frame rate = 12GHz because it is digital. So unless some magic modulation scheme is applied, it's not going to happen.
  • willis936 - Monday, June 23, 2014 - link

    I'd like to point out that it's 24 bits per pixel, so the required bitrate is closer to 24 Gbps, and with typical two-state NRZ signaling you need at least 12 GHz. You had the right number but for the wrong reasons.
  • sheh - Friday, June 20, 2014 - link

    What about TVs finally supporting (officially) 1080p at 120Hz? I still don't understand why almost all TVs with >60Hz panels don't support it when there's no lack of bandwidth.
  • eio - Friday, June 20, 2014 - link

    Great workaround... but I wonder what the benefit of 60Hz would be for video without 3D?
  • knightspawn1138 - Friday, June 20, 2014 - link

    The benefit is that the TV doesn't have to invent those in-between frames to fill the screen during the extra refresh cycles. Right now, when a 120 Hz TV gets a signal at 30 or 60 Hz, it is supposed to just re-display the same image until the next new one comes in. In a 30-to-120 situation the same image is displayed 4 times in a row until the next one arrives, while a 60-to-120 situation displays the same image 2 times in a row. When the TV has extra processing (like "Auto Motion Plus," "MotionFlow," or "TruMotion") applied to the signal, what's happening is that the TV is guessing what the next frame will look like and filling the in-between refreshes with those best guesses instead of the same unaltered frame. So if your computer can send more frames per second to the TV, the TV does less guessing (which means the image you see will have fewer mistakes), and that is where the real benefit is.
  • Death666Angel - Friday, June 20, 2014 - link

    What about the only scenario where this doesn't involve massive picture quality loss, i.e. 24 fps video? You only have 24 frames per second of video there, so whether you display it at 30Hz or 60 Hz (or higher), you don't get any benefit.
  • Mr Perfect - Friday, June 20, 2014 - link

    If I'm understanding correctly, you're just going to have lower panel lag. The panel switching at 60Hz will change the pixel state twice as fast as the one switching at 30Hz(naturally), so the pixels will be at the correct color sooner. You can see the effect in 120Hz monitor reviews, where the 120Hz panels have all finished changing color before the new frame comes in, but the 60Hz panels still have ghosts of the previous frame. See the following monitor review for examples.

    http://www.tftcentral.co.uk/reviews/asus_vg278he.h...
  • SlyNine - Saturday, June 21, 2014 - link

    Unless I'm mistaken, it doesn't affect the pixel response. The pixels attempt to change as fast as they can; what it changes is how many requests are made per second.

    You could attempt to drive an IPS panel at 120Hz, but since the pixels will be changing no faster than before (which was barely fast enough for 60 fps), you get an old image that persists across multiple frames.

    That's my understanding.
