A Near-Perfect HTPC

Since 2006 Intel’s graphics cores have supported sending 8-channel LPCM audio over HDMI. In 2010 Intel added bitstreaming of up to eight channels of the lossless audio typically found on Blu-ray discs via the Dolby TrueHD and DTS-HD MA codecs. Intel’s HD Graphics 3000/2000 don’t add anything new in the way of audio or video codec support.

Dolby Digital, Dolby TrueHD (up to 7.1), DTS, and DTS-HD MA (up to 7.1) can all be bitstreamed over HDMI. Decoded audio can also be sent over HDMI. From a video standpoint, H.264, VC-1 and MPEG-2 are all hardware accelerated. The new GPU enables HDMI 1.4 and Blu-ray 3D support. Let’s run down the list:

Dolby TrueHD Bitstreaming? Works:

DTS-HD MA bitstreaming? Yep:

Blu-ray 3D? Make that three:

How about 23.976 fps playback? Sorry guys, even raking in $11 billion a quarter doesn’t make you perfect.

Here’s the sitch: most movie content is stored at 23.976 fps but incorrectly referred to as 24p or 24 fps. That sub-30 fps frame rate is what makes movies look like, well, movies and not soap operas (this is also why interpolated 120Hz modes on TVs make movies look cheesy, since they smooth out the 24 fps film effect). A smaller portion of content is actually mastered at 24.000 fps and is also referred to as 24p.

To smoothly play back either of these formats you need a player and a display device capable of supporting the frame rate. Many high-end TVs and projectors support this just fine; on the playback side, however, Intel only supports the less popular of the two: 24.000Hz.

This isn’t intentional, but rather the propagation of an oversight that dates back to Clarkdale. Despite its great power consumption and feature characteristics, Clarkdale had one glaring issue that home theater enthusiasts discovered: despite having a 23Hz setting in the driver, Intel’s GPU would never output anything other than 24Hz to a display.

The limitation is entirely in hardware, particularly in what’s supported by the 5-series PCH (remember that display output is routed from the processor’s GPU to the video outputs via the PCH). One side effect of trying to maintain Intel’s aggressive tick-tock release cadence is there’s a lot of design reuse. While Sandy Bridge was a significant architectural redesign, the risk was mitigated by reusing much of the 5-series PCH design. As a result, the hardware limitation that prevented a 23.976Hz refresh rate made its way into the 6-series PCH before Intel discovered the root cause.

Intel had enough time to go in and fix the problem in the 6-series chipsets; however, the fix required a non-trivial amount of work and would have put the chipset schedule at risk. Not wanting to introduce more risk into an already risky project (brand new out-of-order architecture, first on-die GPU, new GPU architecture, first integrated PLL), Intel chose not to address it this round, which is why we still have the problem today.


Note the frame rate

What happens when you try to play 23.976 fps content on a display that refreshes itself 24.000 times per second? You get a repeated frame approximately every 40 seconds to synchronize the source frame rate with the display frame rate. That repeated frame appears to your eyes as judder in motion, particularly evident in scenes involving a panning camera.
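The arithmetic behind that figure is straightforward. A minimal sketch (using the fact that "23.976 fps" is shorthand for the NTSC-derived rate 24000/1001):

```python
# Interval between repeated frames when 23.976 fps content plays
# on a 24.000Hz display. Exact fractions avoid floating-point noise.
from fractions import Fraction

content = Fraction("24000/1001")   # "23.976" fps is really 24000/1001
display = Fraction(24)             # 24.000Hz panel refresh

# The display runs ahead of the content by (display - content) frames
# per second, so one frame must be repeated every 1/(display - content)
# seconds to stay in sync.
interval = 1 / (display - content)
print(float(interval))  # ~41.7 seconds between repeated frames
```

That one extra frame roughly every 40 seconds is the judder described above.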

How big an issue this is depends on the user. Some can just ignore the judder, others will attempt to smooth it out by setting their display to 60Hz, while others will be driven absolutely insane by it.

If you fall into that last category, your only option for resolution is to buy a discrete graphics card. Currently AMD’s Radeon HD 5000 and 6000 series GPUs correctly output a 23.976Hz refresh rate if requested. These GPUs also support bitstreaming Dolby TrueHD and DTS-HD MA, while the 6000 series supports HDMI 1.4a and stereoscopic 3D. The same is true for NVIDIA’s GeForce GT 430, which happens to be a pretty decent discrete HTPC card.

Intel has committed to addressing the problem in the next major platform revision, which unfortunately seems to be Ivy Bridge in 2012. There is a short-term solution for HTPC users absolutely set on Sandy Bridge. Intel has a software workaround that enables 23.97Hz output. There’s still a frame rate mismatch at 23.97Hz, but it would be significantly reduced compared to the current 24.000Hz-only situation.
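To put the workaround in perspective, here is a rough comparison of the frame-repeat intervals; treating the driver’s 23.97Hz output as exactly that value is an assumption, since Intel doesn’t document the precise rate:

```python
# Compare sync-hiccup frequency: stock 24.000Hz output vs the
# 23.97Hz software workaround, for 23.976 (24000/1001) fps content.
from fractions import Fraction

content = Fraction("24000/1001")  # film content frame rate

def mismatch_interval(display_hz):
    """Seconds between repeated/dropped frames at a given refresh rate."""
    return abs(1 / (display_hz - content))

stock = mismatch_interval(Fraction(24))            # current 24.000Hz-only output
workaround = mismatch_interval(Fraction("23.97"))  # assumed exact 23.97Hz

print(float(stock))       # ~41.7s between hiccups
print(float(workaround))  # ~166s between hiccups
```

Under that assumption the workaround makes the mismatch roughly a quarter as frequent, which matches the "significantly reduced" claim.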

MPC-HC Compatibility Problems

Just a heads up: Media Player Classic Home Cinema doesn't currently play well with Sandy Bridge. Enabling DXVA acceleration in MPC-HC will cause stuttering and image quality issues during playback. As far as I know it's an issue with MPC-HC not properly detecting SNB. Intel has reached out to the developer for a fix.


282 Comments


  • mastrdrver - Monday, January 03, 2011 - link

    Was just looking at the downloadable pictures, comparing them, and noticed a couple of differences. Maybe they are just a driver tweak, but I thought I remembered ATI and/or nVidia getting slammed in the past for pulling similar tactics.

    The first thing I noticed was when comparing the AA shots in COD. It appears that maybe the Sandy Bridge graphics isn't applying AA to the twigs on the ground. Or is this just an appearance thing, where Intel might have a different algorithm that's causing this?

    The second is a little more obvious to me. In the Dirt 2 pictures I noticed that Sandy Bridge is blurring and not clearly rendering the distant objects. The sign on the right side is what caught my eye.

    One last thing is the DAO pictures. I've seen someone (in the past) post up pictures of the same exact place in the game. The quality looks a lot better than what Anand has shown, and I was wondering if that is correct. I don't have the game, so I have no way to confirm.

    As always, Anand, I appreciate the time you and your staff take to do all of your articles and the quality that results. It's just one of the reasons why I've always found myself coming back here ever since the early years of your website.
    Reply
  • RagingDragon - Monday, January 03, 2011 - link

    Why don't K series parts get the full suite of virtualization features? Reply
  • xxtypersxx - Monday, January 03, 2011 - link

    Anand,
    Great review as always, I love the in depth feature analysis that Anandtech provides.

    BIOS updates have been released for Gigabyte, Asus, and Intel P67 boards that correct an internal PLL overvolt issue that was artificially limiting overclocks. Users in the thread over at HWbot are reporting that processors that were stuck at 4.8GHz before are now hitting 5.4GHz.
    http://hwbot.org/forum/showthread.php?t=15952

    Would you be able to do a quick update on the overclocking results for your chips with the new BIOS updates?
    Reply
  • Gothmoth - Monday, January 03, 2011 - link

    ".....Sandy Bridge will be worth the upgrade for Quick Sync alone."

    You say that, and a few pages before you say it will not work on PCs with a discrete graphics card.

    I don't know about you, but video encoding here is done on performance systems, systems that have discrete GFX cards like a 460 GTX or better.

    And I think most enthusiasts will buy a P67 mainboard, and that would mean NO QUICK SYNC for them.

    So please do an update on your review and clarify what exactly happens when you use a P67 mainboard with a discrete GFX card.

    Will Quick Sync really not work...??
    Reply
  • Gothmoth - Monday, January 03, 2011 - link

    Please make clear how you tested Quick Sync in your review.

    I saw a few comments from people who are confused about your review. I guess you tested Quick Sync on an H67 mainboard, but I did not notice that you mentioned that in the text.

    To me it looks like Intel is screwing the users who buy these 1st-generation Sandy Bridge chipsets.

    I will wait for Z68, that's for sure......
    Reply
  • Manabu - Monday, January 03, 2011 - link

    In the Quick Sync test I missed a comparison with x264, which is currently the fastest and highest quality encoder for H.264, on a fast CPU. For example, using the superfast and veryslow presets (one for speed with reasonable quality, the other for quality with reasonable speed). Also, at too high a bitrate, even the crappiest encoder will look good...

    I also wanted to see how low you can undervolt an i5-2400 once it has hit the overclocking cap, and what the power consumption is then. The same for the other locked CPUs would be cool too. Also, what is the power consumption of the Sandy Bridge CPUs running the Quick Sync hardware encoder?
    Reply
  • NJoy - Monday, January 03, 2011 - link

    Wow, what a SLAP in AMD's face! The idea they nursed for gazillion years and were set to finally release somewhere this week is brought to you, dear customer, first to the market, with a sudden change in NDA deadline to please you sooner with a hyperperformer from Intel. Who cares that NDAs make an important play in all planning activities, PR, logistics and whatever follows - what matters is that they are first to put the GPU on-die and this is what the average Joe will now know, with a bit of PR, perhaps. Snatch another design win. Hey, AMD, remember that pocket money the court ordered us to pay you? SLAP! And the licence? SLAP! Nicely planned and executed whilst everyone was so distracted with the DAAMIT versus nVidia battles and, ironically, a lack of leaks from the red camp.
    I just hope Bulldozer will kick some ass, even though I doubt it's really going to happen...
    Reply
  • DanNeely - Monday, January 03, 2011 - link

    If AMD didn't put a steel toed boot into their own nuts by blowing the original 09Q3 release date for fusion I'd have more sympathy for them. Intel won because they made their launch date while the competition blew theirs by at least half a year. Reply
  • GeorgeH - Monday, January 03, 2011 - link

    With the unlocked multipliers, the only substantive difference between the 2500K and the 2600K is hyperthreading. Looking at the benchmarks here, it appears that at equivalent clockspeeds the 2600K might actually perform worse on average than the 2500K, especially if gaming is a high priority.

    A short article running both the 2500K and the 2600K at equal speeds (say "stock" @3.4GHz and overclocked @4.4GHz) might be very interesting, especially as a possible point of comparison for AMD's SMT approach with Bulldozer.

    Right now it looks like if you're not careful you could end up paying ~$100 more for a 2600K instead of a 2500K and end up with worse performance.
    Reply
  • Gothmoth - Monday, January 03, 2011 - link

    And which benchmarks are you speaking about?

    As Anand wrote, HT has no negative influence on performance.
    Reply
