A Near-Perfect HTPC

Since 2006 Intel’s graphics cores have supported sending 8-channel LPCM audio over HDMI. In 2010 Intel enabled bitstreaming of up to eight channels of lossless audio typically found on Blu-ray discs via Dolby TrueHD and DTS-HD MA codecs. Intel’s HD Graphics 3000/2000 don’t add anything new in the way of audio or video codec support.

Dolby Digital, TrueHD (up to 7.1), DTS, and DTS-HD MA (up to 7.1) can all be bitstreamed over HDMI. Decoded audio can also be sent over HDMI. From a video standpoint, H.264, VC-1 and MPEG-2 are all hardware accelerated. The new GPU also enables HDMI 1.4 and Blu-ray 3D support. Let’s run down the list:

Dolby TrueHD Bitstreaming? Works:

DTS-HD MA bitstreaming? Yep:

Blu-ray 3D? Make that three:

How about 23.976 fps playback? Sorry guys, even raking in $11 billion a quarter doesn’t make you perfect.

Here’s the sitch: most movie content is stored at 23.976 fps (more precisely 24 × 1000/1001, a holdover from NTSC timing) but is commonly, if incorrectly, referred to as 24p or 24 fps. That sub-30 fps frame rate is what makes movies look like, well, movies and not soap operas (it’s also why interpolated 120Hz modes on TVs make movies look cheesy: they smooth out the 24 fps film effect). A smaller portion of content is actually mastered at 24.000 fps and is also referred to as 24p.

In order to smoothly play back either of these formats you need a player and a display device capable of supporting the frame rate. Many high-end TVs and projectors support this just fine; on the playback side, however, Intel only supports the less popular of the two: 24.000Hz.

This isn’t intentional, but rather the propagation of an oversight that started back with Clarkdale. Despite its great power consumption and feature characteristics, Clarkdale had one glaring issue that home theater enthusiasts discovered: even with a 23Hz setting in the driver, Intel’s GPU would never output anything other than 24Hz to a display.

The limitation is entirely in hardware, specifically in what’s supported by the 5-series PCH (remember that display output is routed from the processor’s GPU to the video outputs via the PCH). One side effect of trying to maintain Intel’s aggressive tick-tock release cadence is that there’s a lot of design reuse. While Sandy Bridge was a significant architectural redesign, the risk was mitigated by reusing much of the 5-series PCH design. As a result, the hardware limitation that prevented a 23.976Hz refresh rate made its way into the 6-series PCH before Intel discovered the root cause.

Intel had enough time to go in and fix the problem in the 6-series chipsets; however, doing so would have put the chipset schedule at risk, as the fix requires a non-trivial amount of rework. Not wanting to introduce more risk into an already risky project (brand new out-of-order architecture, first on-die GPU, new GPU architecture, first integrated PLL), Intel chose not to address it this round, which is why we still have the problem today.


Note the frame rate

What happens when you try to play 23.976 fps content on a display that refreshes itself 24.000 times per second? You get a repeated frame approximately every 40 seconds to synchronize the source frame rate with the display frame rate. That repeated frame appears to your eyes as judder in motion, particularly evident in scenes involving a panning camera.
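
As a quick sanity check on that figure, here’s a minimal back-of-the-envelope sketch (purely illustrative arithmetic, nothing to do with how the driver actually schedules frames): the display completes about 0.024 more refreshes per second than the content supplies frames, so one frame has to be shown twice roughly every 42 seconds.

```python
# Back-of-the-envelope estimate (illustrative only): how often must a frame be
# repeated when 23.976 fps content is shown on a display refreshing at 24.000Hz?

CONTENT_FPS = 24000 / 1001   # "23.976 fps" is really 24 * 1000/1001
DISPLAY_HZ = 24.000          # the only rate Sandy Bridge's GPU actually outputs

surplus = DISPLAY_HZ - CONTENT_FPS    # extra display refreshes per second
seconds_per_repeat = 1 / surplus      # one repeated frame every N seconds

print(f"Surplus refreshes per second: {surplus:.4f}")                  # ~0.0240
print(f"One repeated frame roughly every {seconds_per_repeat:.1f} s")  # ~41.7 s
```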

How big of an issue this is depends on the user. Some can just ignore the judder, others will attempt to smooth it out by setting their display to 60Hz, while others will be driven absolutely insane by it.

If you fall into the latter category, your only option for resolution is to buy a discrete graphics card. Currently AMD’s Radeon HD 5000 and 6000 series GPUs correctly output a 23.976Hz refresh rate if requested. These GPUs also support bitstreaming Dolby TrueHD and DTS-HD MA, while the 6000 series supports HDMI 1.4a and stereoscopic 3D. The same is true for NVIDIA’s GeForce GT 430, which happens to be a pretty decent discrete HTPC card.

Intel has committed to addressing the problem in the next major platform revision, which unfortunately appears to be Ivy Bridge in 2012. There is a short-term option for HTPC users absolutely set on Sandy Bridge: Intel has a software workaround that enables 23.97Hz output. There’s still a frame rate mismatch at 23.97Hz, but it’s significantly reduced compared to the current 24.000Hz-only situation.
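
Running the same back-of-the-envelope arithmetic as above (and assuming the workaround really does output 23.970Hz rather than something even closer to 23.976Hz), the mismatch shrinks from roughly 0.024 to roughly 0.006 frames per second, so a repeated frame would only be needed about once every 166 seconds (just under three minutes) instead of about once every 42 seconds.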

MPC-HC Compatibility Problems

Just a heads up: Media Player Classic Home Cinema doesn't currently play well with Sandy Bridge. Enabling DXVA acceleration in MPC-HC will cause stuttering and image quality issues during playback. As far as I know it's an issue with MPC-HC not properly detecting SNB. Intel has reached out to the developer for a fix.

Comments

  • DanNeely - Monday, January 3, 2011

    The increased power efficiency might allow Apple to squeeze a GPU onto their smaller laptop boards without losing runtime due to the smaller battery.
  • yuhong - Monday, January 3, 2011

    "Unlike P55, you can set your SATA controller to compatible/legacy IDE mode. This is something you could do on X58 but not on P55. It’s useful for running HDDERASE to secure erase your SSD for example"
    Or running old OSes.
  • DominionSeraph - Monday, January 3, 2011

    "taking the original Casino Royale Blu-ray, stripping it of its DRM"

    Whoa, that's illegal.
  • RussianSensation - Monday, January 3, 2011

    It would have been nice to include 1st generation Core i7 processors such as 860/870/920-975 in Starcraft 2 bench as it seems to be very CPU intensive.

    Also, perhaps a section with overclocking which shows us how far 2500k/2600k can go on air cooling with safe voltage limits (say 1.35V) would have been much appreciated.
  • Hrel - Monday, January 3, 2011

    Sounds like this is SO high end it should be the server market. I mean, why make yet ANOTHER socket for servers that use basically the same CPUs? Everything's converging and I'd just really like to see server mobos converge into "High End Desktop" mobos. I mean seriously, my E8400 OC'd with a GTX460 is more power than I need. A quad would help with the video editing I do in HD but it works fine now, and with GPU accelerated rendering the rendering times are totally reasonable. I just can't imagine anyone NEEDING a home computer more powerful than the LGA-1155 socket can provide. Hell, 80-90% of people are probably fine with the power Sandy Bridge gives in laptops now.
  • mtoma - Monday, January 3, 2011

    Perhaps it is like you say, however it's always good for buyers to decide if they want server-like features in a PC. I don't like manufacturers to dictate to me only one way to do it (like Intel does now with the odd combination of HD3000 graphics - Intel H67 chipset). Let us not forget that for a long time, all we had were 4 slots for RAM and 4-6 SATA connections (like you probably have). Intel X58 changed all that: suddenly we had the option of having 6 slots for RAM, 6-8 SATA connections and enough PCI-Express lanes.
    I only hope that LGA 2011 brings back those features, because like you said: it's not only the performance we need, but also the features.
    And, remember that the software doesn't stay still; it usually requires multiple processor cores (video transcoding, antivirus scanning, HDD defragmenting, a modern OS, and so on...).
    All this aside, the main issue remains: Intel must be persuaded to stop looting users' money and implement only one socket at a time. I usually support Intel, but in this regard, AMD deserves congratulations!
  • DanNeely - Monday, January 3, 2011

    LGA 2011 is a high end desktop/server convergence socket. Intel started doing this in 2008, with all but the highest end server parts sharing LGA 1366 with top end desktop systems. The exceptions were quad/octo socket CPUs, and those using enormous amounts of RAM, which used LGA 1567.

    The main reason why LGA 1155 isn't suitable for really high end machines is that it doesn't have the memory bandwidth to feed hex and octo core CPUs. It's also limited to 16 PCIe 2.0 lanes on the CPU vs 36 PCIe 3.0 lanes on LGA 2011. For most consumer systems that won't matter, but 3/4 GPU card systems will start losing a bit of performance when running in a 4x slot (only a few percent, but people who spend $1000-2000 on GPUs want every last frame they can get), and high end servers with multiple 10Gb Ethernet cards and PCIe SSD devices also begin running into bottlenecks.

    Not spending an extra dollar or five per system on the QPI connections (only used in multi-socket systems) in LGA 1155 also adds up to major savings across the hundreds of millions of systems Intel is planning to sell.
  • Hrel - Monday, January 3, 2011

    I'm confused by the upset over playing video at 23.976Hz. "It makes movies look like, well, movies instead of tv shows"? What? Wouldn't recording at a lower frame rate just mean there's missed detail, especially in fast action scenes? Isn't that why HD runs at 60fps instead of 30fps? Isn't more FPS good as long as it's played back at the appropriate speed, i.e. whatever it's filmed at? I don't understand the complaint.

    On a related note, Hollywood and the world need to just agree that everything gets recorded and played back at 60fps at 1920x1080. No variation AT ALL! That way everything would just work. Or better yet 120fps and with the ability to turn 3D on and off as you see fit. Whatever FPS is best. I've always been told higher is better.
  • chokran - Monday, January 3, 2011

    You are right about having more detail when filming with higher FPS, but this isn't about it being good or bad; it's more a matter of tradition and visual style.
    The look movies have these days, the one we got accustomed to, is mainly achieved by filming in 24p, or 23.976 to be precise. The look you get when filming with higher FPS just doesn't look like cinema anymore but TV. At least to me. A good article on this:
    http://www.videopia.org/index.php/read/shorts-main...
    The problem with movies looking like TV can be tested at home if you've got a TV that has some kind of motion interpolation, e.g. MotionFlow as Sony calls it or Intelligent Frame Creation at Panasonic. When turned on, you can see the soap opera effect created by the added frames. There are people that don't see it and some that do and like it, but I have to turn it off since it doesn't look "natural" to me.
  • CyberAngel - Thursday, January 6, 2011

    http://en.wikipedia.org/wiki/Showscan
