A Near-Perfect HTPC

Since 2006 Intel’s graphics cores have supported sending 8-channel LPCM audio over HDMI. In 2010 Intel enabled bitstreaming of up to eight channels of lossless audio typically found on Blu-ray discs via Dolby TrueHD and DTS-HD MA codecs. Intel’s HD Graphics 3000/2000 don’t add anything new in the way of audio or video codec support.

Dolby Digital, TrueHD (up to 7.1), DTS, DTS-HD MA (up to 7.1) can all be bitstreamed over HDMI. Decoded audio can also be sent over HDMI. From a video standpoint, H.264, VC-1 and MPEG-2 are all hardware accelerated. The new GPU enables HDMI 1.4 and Blu-ray 3D support. Let’s run down the list:

Dolby TrueHD Bitstreaming? Works:

DTS HD-MA bitstreaming? Yep:

Blu-ray 3D? Make that three:

How about 23.976 fps playback? Sorry guys, even raking in $11 billion a quarter doesn’t make you perfect.

Here’s the situation: most movie content is stored at 23.976 fps but incorrectly referred to as 24p or 24 fps. That sub-30 fps frame rate is what makes movies look like, well, movies and not soap operas (this is also why interpolated 120Hz modes on TVs make movies look cheesy, since they smooth out the 24 fps film effect). A smaller portion of content is actually mastered at 24.000 fps and is also referred to as 24p.

In order to smoothly play back either of these formats you need a player and a display device capable of supporting the frame rate. Many high-end TVs and projectors support this just fine; on the playback side, however, Intel only supports the less popular of the two: 24.000Hz.

This isn’t intentional, but rather the propagation of an oversight that started back with Clarkdale. Despite having great power consumption and feature characteristics, Clarkdale had one glaring issue that home theater enthusiasts discovered: even with a 23Hz setting in the driver, Intel’s GPU would never output anything other than 24Hz to a display.

The limitation is entirely in hardware, particularly in what’s supported by the 5-series PCH (remember that display output is routed from the processor’s GPU to the video outputs via the PCH). One side effect of trying to maintain Intel’s aggressive tick-tock release cadence is there’s a lot of design reuse. While Sandy Bridge was a significant architectural redesign, the risk was mitigated by reusing much of the 5-series PCH design. As a result, the hardware limitation that prevented a 23.976Hz refresh rate made its way into the 6-series PCH before Intel discovered the root cause.

Intel had enough time to go in and fix the problem in the 6-series chipsets; however, doing so would have put the chipset schedule at risk, as the fix requires a non-trivial amount of work. Not wanting to introduce more risk into an already risky project (brand new out-of-order architecture, first on-die GPU, new GPU architecture, first integrated PLL), Intel chose not to address it this round, which is why we still have the problem today.


Note the frame rate

What happens when you try to play 23.976 fps content on a display that refreshes itself 24.000 times per second? You get a repeated frame approximately every 40 seconds to synchronize the source frame rate with the display frame rate. That repeated frame appears to your eyes as judder in motion, particularly evident in scenes involving a panning camera.
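
The arithmetic behind that figure is easy to sketch. Here is a minimal back-of-the-envelope calculation in Python, assuming the source runs at exactly 24000/1001 fps (the precise value behind the rounded 23.976 number):

# Why a frame repeats roughly every 40 seconds at a fixed 24.000Hz refresh
source_fps = 24000 / 1001        # ~23.976 fps, how most film content is actually stored
display_hz = 24.000              # the only "24p" rate Intel's GPU will output

# The display refreshes slightly faster than the source delivers frames, so one
# frame must be shown twice every 1 / (display_hz - source_fps) seconds.
repeat_interval = 1 / (display_hz - source_fps)
print(f"frame repeated roughly every {repeat_interval:.1f} seconds")   # ~41.7 seconds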

How big of an issue this is depends on the user. Some can just ignore the judder, others will attempt to smooth it out by setting their display to 60Hz, while others will be driven absolutely insane by it.

If you fall into the latter category, your only option for resolution is to buy a discrete graphics card. Currently AMD’s Radeon HD 5000 and 6000 series GPUs correctly output a 23.976Hz refresh rate if requested. These GPUs also support bitstreaming Dolby TrueHD and DTS-HD MA, while the 6000 series supports HDMI 1.4a and stereoscopic 3D. The same is true for NVIDIA’s GeForce GT 430, which happens to be a pretty decent discrete HTPC card.

Intel has committed to addressing the problem in the next major platform revision, which unfortunately seems to be Ivy Bridge in 2012. There is a short-term solution for HTPC users absolutely set on Sandy Bridge: Intel has a software workaround that enables 23.97Hz output. There’s still a frame rate mismatch at 23.97Hz, but it would be significantly reduced compared to the current 24.000Hz-only situation.
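
Extending the same back-of-the-envelope math shows why the mismatch shrinks so much. This assumes the workaround drives the display at exactly 23.970Hz; the precise output rate isn't specified, so treat this purely as an illustration:

# Same calculation with the assumed 23.970Hz output of Intel's software workaround
source_fps = 24000 / 1001            # ~23.976 fps film content
workaround_hz = 23.970               # assumed exact output rate (not confirmed by Intel)

# The display is now marginally slower than the source, so a frame is dropped
# (rather than repeated) once per interval below.
drop_interval = 1 / abs(source_fps - workaround_hz)
print(f"one frame dropped roughly every {drop_interval:.0f} seconds")  # ~166s vs ~42s at 24.000Hz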

MPC-HC Compatibility Problems

Just a heads up: Media Player Classic Home Cinema doesn't currently play well with Sandy Bridge. Enabling DXVA acceleration in MPC-HC will cause stuttering and image quality issues during playback. As far as I know, the issue is with MPC-HC not properly detecting SNB. Intel has reached out to the developer for a fix.

Comments

  • nuudles - Monday, January 3, 2011 - link

    Anand, I'm not the biggest Intel fan (due to their past grey area dealings) but I don't think the naming is that confusing. As I understand it they will move to the 3x00 series with Ivy Bridge; basically the higher the second number, the faster the chip.

    It would be nice if there was something in the name to easily tell consumers the number of cores and threads, but the majority of consumers just want the fastest chip for their money and don't care how many cores or threads it has.

    The ix part tells enthusiasts the number of cores/threads/turbo with the i3 having 2/4/no, the i5 having 4/4/yes and i7 4/8/yes. I find this much simpler than the 2010 chips which had some dual and some quad core i5 chips for example.

    I think AMD's GPUs have a sensible naming convention (except for the 68/6900 renaming) without the additional i3/i5/i7 modifier, by using the second number as the tier indicator while maintaining the rule of thumb of "a higher number within a generation means faster". If Intel adopted something similar it would have been better.

    That said, I wish they'd stick with a naming convention for at least 3 or 4 generations...
  • nimsaw - Monday, January 3, 2011 - link

    ",,but until then you either have to use the integrated GPU alone or run a multimonitor setup with one monitor connected to Intel’s GPU in order to use Quick Sync"

    So have you tested the transcoding with QS by using an H67 chipset based motherboard? The test rig never mentions any H67 motherboard. I am somehow not able to follow how you got the scores for the transcode test. How do you select the codepath if switching graphics on a desktop motherboard is not possible? Please throw some light on it as I am a bit confused here. You say that QS gives a better quality output than the GTX 460, so does that mean I need not invest in a discrete GPU if I am not gaming? Moreover, why should I be forced to use the discrete GPU in a P67 board when, according to your tests, Intel's QS is giving a better output?
  • Anand Lal Shimpi - Monday, January 3, 2011 - link

    I need to update the test table. All of the Quick Sync tests were run on Intel's H67 motherboard. Presently if you want to use Quick Sync you'll need to have an H67 motherboard. Hopefully Z68 + switchable graphics will fix this in Q2.

    Take care,
    Anand
  • 7Enigma - Monday, January 3, 2011 - link

    I think this needs to be a front page comment because it is a serious deficiency that all of your reviews fail to properly describe. I read them all and it wasn't until the comments came out that this was brought to light. Seriously SNB is a fantastic chip but this CPU/mobo issue is not insignificant for a lot of people.
  • Wurmer - Monday, January 3, 2011 - link

    I haven't read through all the comments and sorry if it's been said, but I find it weird that the most "enthusiast" chip, the K, comes with the better IGP when most people buying this chip will for the most part end up buying a discrete GPU.
  • Akv - Monday, January 3, 2011 - link

    It's being said in reviews from China to France to Brazil, etc.
  • nimsaw - Monday, January 3, 2011 - link

    Strangely enough I also have the same query. What is the point of better integrated graphics when you cannot use them on a P67 mobo?
    Also, I came across this screenshot

    http://news.softpedia.com/newsImage/Intel-Sandy-Br...

    where in the right-hand corner there is a drop-down menu with Intel Quick Sync selected. Will you see a discrete GPU if you expand it? Does it not mean switching between graphics solutions? In the review it's mentioned that switchable graphics has yet to find its way into desktop mobos.
  • sticks435 - Tuesday, January 4, 2011 - link

    It looks like that drop-down is dithered, which means it's only displaying the QS option at the moment, but it could offer multiple options in the future, or maybe if you had 2 graphics cards, etc.
  • HangFire - Monday, January 3, 2011 - link

    You are comparing video and not chipsets, right?

    I also take issue with the statement that the 890GX (really HD 4290) is the current onboard video cream of the crop. Test after test (on other sites) shows it to be a bit slower than the HD4250, even though it has higher specs.

    I also think Intel is going to have a problem with folks comparing their onboard HD3000 to AMD's HD 4290, it just sounds older and slower.

    No word on Linux video drivers for the new HD2000 and HD3000? Considering what a mess KMS has made of the old i810 drivers, we may be entering an era where accelerated onboard Intel video is no longer supported on Linux.
  • mino - Wednesday, January 5, 2011 - link

    Actually, 890GX is just a re-badged 780G from 2008 with sideport memory.

    And no, the HD4250 is NOT faster. While some specific implementation of the 890GX without sideport _might_ be slower, it would also be cheaper and not really a "proper" representative.
    (An 890GX without sideport is like saying an i3 with dual channel RAM is "faster" in games than an i5 with single channel RAM...)
