4K for the Masses

After our experience with Trinity and Ivy Bridge builds for HTPC purposes, we had concluded that a discrete GPU was necessary only for advanced rendering (madVR's resource-intensive scaling algorithms) or 4K support. In fact, the 4K media player Sony supplied along with its $25K 84" 4K TV was a Dell XPS desktop PC, with an AMD graphics card's HDMI output providing the 4K signal to the TV. Ivy Bridge gained 4K display support last October, but not over the HDMI port (which is the only way to get 4K content onto supported TVs).

The good news is that Haswell's 4K over HDMI works well, in a limited sort of way. In our first experiment, we connected our build to a Sony XBR-84X900 84" 4K LED TV. The full set of supported 4K resolutions (4096x2160 @ 23 Hz and 24 Hz, as well as 3840x2160 @ 23 Hz, 24 Hz, 25 Hz, 29 Hz and 30 Hz) was driven without issues.
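These refresh-rate caps follow directly from HDMI 1.4's single-link TMDS bandwidth. The Python sketch below estimates the pixel clock for each listed mode from the HDMI 1.4a 4K frame timings (the total frame sizes are from the spec, but treat the exact blanking figures as assumptions worth double-checking); every mode lands at 297 MHz, comfortably under the 340 MHz ceiling, which is also why 4K @ 60 Hz is out of reach on this port.

```python
# Rough pixel-clock estimates for the 4K modes listed above, using the
# HDMI 1.4a 4K timings (the 23 Hz entries are the 23.976 variants of 24 Hz).
MODES = {
    # (h_active, v_active, refresh_hz): (h_total, v_total) incl. blanking
    (3840, 2160, 30): (4400, 2250),
    (3840, 2160, 25): (5280, 2250),
    (3840, 2160, 24): (5500, 2250),
    (4096, 2160, 24): (5500, 2250),
}

HDMI_14_TMDS_LIMIT_MHZ = 340  # single-link TMDS character-rate ceiling

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    # Pixel clock = total pixels per frame (active + blanking) x refresh rate
    return h_total * v_total * refresh_hz / 1e6

for (h_a, v_a, hz), (h_t, v_t) in MODES.items():
    clk = pixel_clock_mhz(h_t, v_t, hz)
    verdict = "fits" if clk <= HDMI_14_TMDS_LIMIT_MHZ else "exceeds"
    print(f"{h_a}x{v_a} @ {hz} Hz -> {clk:.0f} MHz ({verdict} HDMI 1.4 limit)")
```

Doubling any of these to 50/60 Hz would need roughly 594 MHz, which is why 4K @ 60 Hz had to wait for a later HDMI revision.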

4K H.264 decode using the DXVA2 Native and QuickSync modes in LAV Video Decoder works without issues (this worked well on Ivy Bridge too; Ivy Bridge just lacked the ability to output 4K over HDMI or any other single video link). Using madVR at 4K is out of the question (even with DXVA2 scaling), but EVR and EVR-CP both work without dropping any frames.
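Readers who want a quick way to exercise hardware H.264 decode outside a DirectShow graph can use ffmpeg's DXVA2 hwaccel. This is a hedged sketch, not the harness used for the review: it assumes a Windows ffmpeg build compiled with dxva2 support, and the sample file name is a hypothetical placeholder.

```python
import shlex

def dxva2_decode_cmd(path):
    """Build an ffmpeg command line that hardware-decodes `path` via
    DXVA2 and discards the frames -- useful for timing pure decode speed."""
    return [
        "ffmpeg",
        "-hwaccel", "dxva2",   # GPU decode path, analogous to LAV's DXVA2 Native
        "-i", path,            # e.g. a 4K H.264 sample clip
        "-f", "null", "-",     # no encode or render; decode only
    ]

# On a Windows box, run it with: subprocess.run(dxva2_decode_cmd("sample_4k_h264.mkv"))
print(shlex.join(dxva2_decode_cmd("sample_4k_h264.mkv")))
```

The null muxer makes ffmpeg throw away decoded frames, so the run time reflects decode throughput rather than rendering or disk speed.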

Now, for the bad news: if you are hoping to drive the ~$1300 Seiki Digital SE50UY04 50" 4K TV (the cheapest 4K TV on the market right now), I would suggest some caution. Our build tried to drive 3840x2160 @ 30 Hz to the Seiki TV on boot, but the HDMI link would rarely lock (the display kept flickering on and off), and the chances of locking decreased with HDMI cable length. The NVIDIA GT 640s that we tested in the same setup, with the same cables and TV, managed to drive the 4K Quad FHD resolutions without problems. We were able to reproduce the issue with multiple Seiki units.

At this juncture, we are not sure whether this is an issue with the ASRock Z87E-ITX board in particular or a problem for all Haswell boards. Intel suggested that the HDMI level shifter used by ASRock might not be up to the mark for 4K output, but that doesn't explain why the output to the Sony 84" TV worked without issues. In short, if you have a Seiki 4K TV and want to drive it with a PC, we would suggest using an NVIDIA GT 640 or better / AMD 7750 or better for now. We will update this section once we reach closure on the issue with ASRock / Intel.

  • eio - Sunday, June 23, 2013 - link

    great example! very interesting.
    I agree with Montage that for most snapshots, HD4600 is significantly better than HD4000, retaining much more texture, even for this frame 4 in 1080p.
    but in 720p HD4600 shows the trade-off of keeping more fine-grained texture: it looks like HD4600 has regressed in low-contrast, large-scale structural information.
    as you said, this type of regression can be more evident in video than in snapshots.
  • eio - Sunday, June 23, 2013 - link

    another thing that surprises me is that x264 is the clear loser in this test. I don't understand why; what are the specific params that HandBrake used to call x264?
  • nevcairiel - Monday, June 3, 2013 - link

    @ganeshts

    I'm curious, what did you use for DXVA2N testing of VC-1?
    LAV Video doesn't support VC-1 DXVA2 on Intel, at least on Ivy Bridge, and i doubt Haswell changed much (although it would be a nice surprise, i'll see for myself in a few days)
  • ganeshts - Monday, June 3, 2013 - link

    Hendrik,

    I made a note that DXVA2N for interlaced VC-1 has software fallback.

    That issue is still not fixed in Haswell. That is why you see QuickSync consuming lower power compared to DXVA2N for the interlaced VC-1 sample.
  • zilexa - Monday, June 3, 2013 - link

    To be honest, now that I have a near-perfect Raspberry Pi setup, I would never buy a Core ix/AMD Ax HTPC anymore. Huge waste of money for an almost unnoticeable image quality improvement.
    The Raspberry Pi uses 6.5 W at most, usually much less. Speed in XBMC is no longer an issue, and it plays back all my movies just fine (Batman IMAX x264 rip, 7-15MBps). I play mostly downloaded TV shows, streams and occasionally a movie. It also takes care of the whole download process in the background, so I don't even have a computer at home anymore. I sold my old AMD 780G based Silverstone M2 HTPC for €170 and it was the best decision ever.

    Still cool to read about the high-end possibilities of HTPC/madVR, or really just video playback and encoding, because that's what this is actually about. But I would never buy a system to support this. An HTPC, in my opinion, is for being in lazy mode: playing back your shows/movies and watching your photos and streams in good HD quality and audio.

    If you need an HTPC, in my opinion there is no need for such an investment in a computer system that is meant for a huge variety of computing tasks.
  • jwcalla - Monday, June 3, 2013 - link

    It's going to depend on individual needs of course, and I think your Raspberry Pi is at the other end of the extreme, but otherwise I kind of have the same reaction. This has got to be an $800+ build for an HTPC, and I begin to wonder whether that is a practical approach.

    Given that Intel's entire marketing strategy is to oversell to the consumer (i.e., sell him much more than he really needs), it seems that sometimes these reviews follow the strategy too closely. For an HTPC? Core i3 at the max, and even that's being generous. If one has workloads like transcoding and such, then maybe a higher-end box is needed. But then I question whether that kind of stuff belongs on an HTPC.
  • superjim - Monday, June 3, 2013 - link

    Play back a raw M2TS 1080p 60fps file on your Pi and get back to me.
  • phoenix_rizzen - Monday, June 3, 2013 - link

    How did you get around the "interface is not accelerated" issue on the RPi? I found it completely useless when trying to navigate the XBMC interface itself (you know, to select the show to watch). Sure, once the video was loaded, and processing moved over to the hardware decoder, things ran smooth as silk.

    I sold my RPi two weeks after receiving it due to this issue. It just wasn't worth the headaches. I've since moved to a quad-core Athlon II running off an SSD with a fanless NVIDIA dGPU. So much nicer to work with.
  • vlado08 - Monday, June 3, 2013 - link

    What about Frame Rate Conversion (FRC) capability?
  • ericgl21 - Monday, June 3, 2013 - link

    Ganesh,

    Let's assume you have two 4K/60p video files playing in a loop at the same time for a duration of 3 hours.
    Could Iris or Iris Pro play those two video streams simultaneously, without dropping frames and without the processor throttling throughout the entire playback?
    I mean connecting two 4K TVs, one to the HDMI port and the other to the DisplayPort, and outputting one video to each TV. Would you say Iris / Iris Pro is up to this task? Could you test this scenario?
