One of the main reasons for HTPC purists to pass over Intel's integrated GPUs was the lack of a proper 23.976 Hz refresh rate. Up to and including Clarkdale, the Intel GPUs refreshed the display at 24 Hz even when set to 23 Hz. When Sandy Bridge launched, it was discovered that the 23 Hz setting could be activated and made to function as intended if UAC was disabled. With the v2372 drivers, disabling UAC became unnecessary. While this didn't result in a perfect 23.976 Hz (the refresh rate locked in around 23.972 Hz), it was definitely much better than the earlier situation.

How does Ivy Bridge fare? The short story is that the behaviour on the P8H77-M Pro board is very similar to Sandy Bridge. As the screenshot below shows, the refresh rate is quite stable around 23.973 Hz. This is as good (or as bad) as what the AMD and NVIDIA GPU cards deliver.

The good news is that Intel claims this issue is fully resolved in the latest production BIOS for its own motherboards. This means that BIOS updates from other manufacturers should also bring the fix to current boards. Hopefully, we will be able to independently test and confirm this soon.

It is not only the 23 Hz setting that is off the mark by a small amount. Other refresh rates suffer from similar problems, with videos played back at those frame rates dropping a frame every 5 minutes or so. The gallery below shows some of the other refresh rates that we tested.
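To put the deviation in perspective, the short Python sketch below estimates how often a frame has to be dropped (or repeated) for a given mismatch between the display refresh rate and the video frame rate. This is just an illustrative back-of-the-envelope calculation, not part of our test suite; the 23.973 Hz figure is the measured value quoted above and 24000/1001 Hz is the nominal film rate.

# Rough estimate of how often a frame is dropped (or repeated) when the
# display refresh rate does not exactly match the video frame rate.

def seconds_per_dropped_frame(display_hz, video_fps):
    # Drift accumulates at |display_hz - video_fps| frames per second;
    # one full frame of accumulated drift forces a drop or a repeat.
    return 1.0 / abs(display_hz - video_fps)

video_fps = 24000.0 / 1001    # nominal "23.976 fps" film rate
measured_hz = 23.973          # refresh rate measured on the Ivy Bridge testbed

t = seconds_per_dropped_frame(measured_hz, video_fps)
print("A frame is dropped roughly every %.0f seconds (~%.1f minutes)" % (t, t / 60))
# With these numbers the drift works out to roughly 330 seconds, i.e. about
# 5.5 minutes, in line with the behaviour described above.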

Another aspect of Intel's GPU control panel that we found irritating is the custom resolution section. Intel seems to rely heavily on the EDID and doesn't allow the user to input any frequency not advertised by the display. Our testbed display (a Sony Bravia KDL46EX720) doesn't indicate PAL compatibility in its EDID information. I was able to play back PAL videos with matched refresh rates using the Vision 3D (NVIDIA GT 425M) as well as the AMD 7750. However, Intel's control panel wouldn't allow me to set 50 Hz as the display refresh rate. It is possible that an EDID override might help, but we can't help complaining that Intel's control panel isn't as user friendly as NVIDIA's, despite the availability of a custom resolutions section in the control panel.

70 Comments

  • Exodite - Tuesday, April 24, 2012 - link

    Anyone who says they can tell any difference between a 65% and 95% color gamut is a whiny bitch.

    See, I can play that game too!

    Even if I were to buy your "factual" argument, and I don't, I've clearly stated that I care nothing about the things you consider advantages.

    I sit facing the center of my display, brightness and gamma are turned down to minimum levels, and saturation is low. Measured power draw at the socket is 9 W.

    It's a 2 ms TN panel, obviously.

    All I want is more vertical space at a reasonable price, though a 120 Hz display would be nice as well.

    My friend is running a 5 ms 1080p eIPS display, and between that and what I have I'd still pick my current display.

    End of the day it's personal preference, which I made abundantly clear in my first post.

    Though it seems displays, and IPS panels in general, are starting to attract the same amount of douchiness as the audiophile community.
  • Old_Fogie_Late_Bloomer - Tuesday, April 24, 2012 - link

    Oh, I know I shouldn't--REALLY shouldn't--get involved in this. But you would have to be monochromatically colorblind in order to not see the difference between 65% and 95% color gamut.

    I'm not saying that the 95% gamut is better for everyone; in fact, unless the 95% monitor has a decent sRGB setting, the 65% monitor is probably better for most people. But to suggest that you have to be a hyper-sensitive "whiny b---h" to tell the difference between the two is to take an indefensible position.
  • Exodite - Tuesday, April 24, 2012 - link

    Yeah, you shouldn't have gotten into this.

    Point being that whatever the difference is, I bet you the same can be said about latency.

    Besides, as I've said from the start it's about the things that you personally appreciate.

    My preferred settings absolutely destroy any kind of color fidelity anyway, and that doesn't even slightly matter as I don't work with professional imagery.

    But I can most definitely appreciate the difference between TN and even eIPS when it comes to gaming. And I consider the former superior.

    I don't /mind/ higher color fidelity or better viewing angles, I'm just sure as hell not going to pay any extra for it.
  • Old_Fogie_Late_Bloomer - Wednesday, April 25, 2012 - link

    I agree completely that, as you say, "it's about the things you personally appreciate." If you have color settings you like that work on a TN monitor that you can stand to deal with for long periods of time without eye strain, I would never tell you that you should not use them because they don't conform to some arbitrary standard. Everybody's eyes and brain wiring are different, and there are plenty of reasons why people use computers that don't involve color accuracy.

    But as it happens, you picked a poor counterexample, because I defy you to put a Dell U2412M (~68% of aRGB) next to a U2410 set to aRGB mode (somewhere close to 100% of aRGB) and tell me you can't see a difference.

    For that matter, I challenge you to find me someone who literally can't see the difference between the two in terms of color reproduction. That person will have something seriously wrong with their color vision.
  • Exodite - Wednesday, April 25, 2012 - link

    To be fair, the counterexample wasn't about being correct, because the poster I replied to wasn't, but rather about showing what an asshat argument he was making.

    That said it's about the frame of reference.

    Would you be able to tell the difference working with RAW images pulled from your DSLR or other high-quality imagery?

    Sure, side-by-side I have no doubt you would.

    Would you be able to tell the difference when viewing the desktop, a simple web form, or an editor where the only colors are black, white, two shades of blue, and grey?

    Especially once both displays are calibrated to the point I'm comfortable with them. (Cold hue, 0% brightness, low saturation, negative gamma, high contrast.)

    I dare say not.
  • DarkUltra - Monday, April 30, 2012 - link

    I'd like to see a "blind" test on this. Is there a perceived difference between 6 ms and 2 ms? Blind as in the test subjects (nyahahaa) do not know which response time they are looking at.

    Test with both a 60 Hz and a 120 Hz display. I would guess the moving object, an Explorer window for instance, would simply be easier to look at and would look less blurred as it moves across the screen. People used to fast-paced gaming on CRT monitors or "3D ready" 120 Hz monitors would see more of a difference.
  • Origin32 - Saturday, April 28, 2012 - link

    I really don't see any need for improvement in video resolution just yet. I myself have nearly perfect eyesight and can be extremely annoyed by artifacts, blocky compression, etc., but I find 720p to be detailed enough even for action movies that rely solely on special effects. In most movies 1080p appears too sharp to me; add to that the fact that most movies are already oversharpened and post-processed, plus the increased bitrate (and therefore file size) of 1080p, and I see more downside than upside to it.
    This all goes double for 4K video.

    That being said, I do still want 4K badly for gaming, viewing pictures, and reading text; there are tons of things it'll be useful for.
    But not for film, not for me.
  • Old_Fogie_Late_Bloomer - Monday, April 23, 2012 - link

    Another advantage of a 4K screen (one that has at least 2160 vertical resolution) is that you could have alternating-line passive 3D at full 1080p resolution for each eye. I'm not an expert on how this all works, but it seems to me that the circular polarization layer is a sort of afterthought for the LCD manufacturing process, which is why vertical viewing angles are narrow (there's a gap between the pixels and the 3D polarizing layer).

    In my opinion, it would be pretty awesome if that layer were integrated into the panel in such a way that vertical viewing angles weren't an issue, and so that any monitor is basically a 3D monitor (especially high-quality IPS displays). But I don't really know how practical that is.
  • peterfares - Thursday, September 27, 2012 - link

    A 2560x1600 monitor (available for years) has 1.975 times as many pixels as a 1920x1080 screen.

    4K would be even better, though!
  • nathanddrews - Monday, April 23, 2012 - link

    4K is a very big deal for a couple of reasons: pixel density and film transparency.

    From the perspective of pixel density, I happily point to the ASUS Transformer 1080p, iPad 3, and any 2560x 27" or 30" monitor. Once you go dense, you never go... back... Anyway, as great as 1080p is, as great as Blu-ray is, it could be so much better! I project 1080p at about 120" in my dedicated home theater - it looks great - but I will upgrade to 4K without hesitation.

    Which leads me to the concept of film transparency. While many modern movies are natively being shot in 4K using RED or similar digital cameras, the majority are still on good ol' 35mm film. 4K is considered by most professionals and enthusiasts to be the baseline for an excellent transfer of a 35mm source to the digital space - some argue 6K-8K is ideal. Factor in 65mm, 70mm, and IMAX and you want to scan your original negative in at least 8K to capture all the fine detail (as far as I know, no one is professionally scanning above 8K yet).

    Of course, recording on RED 4K or scanning 35mm at 4K or 8K is a pointless venture if video filtering like noise reduction or edge enhancement is applied during the mastering or encoding process. Like smearing poop on a diamond.

    You can't bring up "normal" people when discussing the bleeding edge. The argument is moot. Those folks don't jump on board for any new technology until it hits the Walmart Black Friday ad.
