HTPC enthusiasts are often concerned about the picture quality their systems put out. While this is a very subjective metric, we have taken as objective an approach as possible, using the HQV 2.0 benchmark in our HTPC reviews to evaluate GPUs' video post-processing capabilities. The HQV benchmarking procedure has been heavily promoted by AMD, and Intel also seems to be putting its weight behind it.

The control panel for the Ivy Bridge GPU exposes a number of interesting video post-processing controls which earlier drivers lacked. The most interesting of these is the ability to perform noise reduction on a per-channel basis, i.e., for luma only or for both luma and chroma. More options are always good for consumers, and the interface makes it simple to leave the decision making to the drivers or the application. An explicit skin tone correction option is also available.
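
To make the luma/chroma distinction concrete, here is a minimal sketch of per-channel noise reduction in Python with OpenCV. This is purely illustrative, not Intel's driver implementation; the helper name, filter choice, and strengths are our own assumptions.

```python
# Illustrative sketch only -- not Intel's driver code. It shows what "luma only"
# versus "luma and chroma" noise reduction means for a decoded video frame.
import cv2

def denoise_frame(frame_bgr, chroma=False, strength=7):
    """Denoise the luma channel; optionally denoise the chroma channels too."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    y, cr, cb = cv2.split(ycrcb)
    # Luma carries the detail; an edge-preserving filter avoids smearing it.
    y = cv2.bilateralFilter(y, d=5, sigmaColor=strength * 5, sigmaSpace=5)
    if chroma:
        # Chroma noise shows up as colored blotches; filter Cr/Cb more gently.
        cr = cv2.bilateralFilter(cr, d=5, sigmaColor=strength * 3, sigmaSpace=5)
        cb = cv2.bilateralFilter(cb, d=5, sigmaColor=strength * 3, sigmaSpace=5)
    return cv2.cvtColor(cv2.merge((y, cr, cb)), cv2.COLOR_YCrCb2BGR)
```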

HQV scores need to be taken with a grain of salt. In particular, one must check the tests where a GPU lost points; if those tests don't reflect the reader's usage scenario, the handicap can probably be ignored. It is therefore essential to compare the scores for each test rather than just the totals (the sketch after the table below illustrates this sort of per-test comparison).

The HQV 2.0 test suite consists of 39 different streams divided into 4 classes. For the Ivy Bridge HTPC, we used CyberLink PowerDVD 12 with TrueTheater disabled and hardware acceleration enabled to play back the HQV streams. The playback device was assigned a score for each stream, depending on how well it was rendered. Each test was repeated multiple times to ensure that the correct score was assigned. The scoring details are available in the testing guide from HQV.

Blu-rays are usually mastered very carefully. Any video post-processing (other than deinterlacing) that is needed is handled before the disc is mastered. In this context, we don't think it is a great idea to run the HQV benchmark videos off the disc; instead, we play the streams after copying them over to the hard disk. How do the scores compare to what the Sandy Bridge and Llano obtained at launch?

In the table below, we indicate the maximum score possible for each test and how much each GPU was able to score. The HD3000 results are from the Core i5-2520M with Intel's 15.22.2.64.2372 drivers, the AMD 6550D was tested with Catalyst 11.6 (driver version 8.862 RC1), and the HD4000 with driver version 8.15.10.2696.

 
HQV 2.0 Benchmark
Test Class | Chapter | Test | Max. Score | Intel HD3000 | AMD 6550D (Local file) | Intel HD4000
Video Conversion | Video Resolution | Dial | 5 | 5 | 4 | 5
 | | Dial with Static Pattern | 5 | 5 | 5 | 5
 | | Gray Bars | 5 | 5 | 5 | 5
 | | Violin | 5 | 5 | 5 | 5
 | Film Resolution | Stadium 2:2 | 5 | 5 | 5 | 5
 | | Stadium 3:2 | 5 | 5 | 5 | 5
 | Overlay On Film | Horizontal Text Scroll | 5 | 3 | 5 | 3
 | | Vertical Text Scroll | 5 | 5 | 5 | 5
 | Cadence Response Time | Transition to 3:2 Lock | 5 | 5 | 5 | 5
 | | Transition to 2:2 Lock | 5 | 5 | 5 | 5
 | Multi-Cadence | 2:2:2:4 24 FPS DVCam Video | 5 | 5 | 5 | 5
 | | 2:3:3:2 24 FPS DVCam Video | 5 | 5 | 5 | 5
 | | 3:2:3:2:2 24 FPS Vari-Speed | 5 | 5 | 5 | 5
 | | 5:5 12 FPS Animation | 5 | 5 | 5 | 5
 | | 6:4 12 FPS Animation | 5 | 5 | 5 | 5
 | | 8:7 8 FPS Animation | 5 | 5 | 5 | 5
 | Color Upsampling Errors | Interlace Chroma Problem (ICP) | 5 | 2 | 2 | 5
 | | Chroma Upsampling Error (CUE) | 5 | 2 | 2 | 5
Noise and Artifact Reduction | Random Noise | SailBoat | 5 | 5 | 5 | 5
 | | Flower | 5 | 5 | 5 | 5
 | | Sunrise | 5 | 5 | 5 | 5
 | | Harbour Night | 5 | 5 | 5 | 5
 | Compression Artifacts | Scrolling Text | 5 | 3 | 3 | 5
 | | Roller Coaster | 5 | 3 | 3 | 5
 | | Ferris Wheel | 5 | 3 | 3 | 5
 | | Bridge Traffic | 5 | 3 | 3 | 5
 | Upscaled Compression Artifacts | Text Pattern | 5 | 3 | 3 | 3
 | | Roller Coaster | 5 | 3 | 3 | 3
 | | Ferris Wheel | 5 | 3 | 3 | 3
 | | Bridge Traffic | 5 | 3 | 3 | 3
Image Scaling and Enhancements | Scaling and Filtering | Luminance Frequency Bands | 5 | 5 | 5 | 5
 | | Chrominance Frequency Bands | 5 | 5 | 5 | 5
 | | Vanishing Text | 5 | 5 | 5 | 5
 | Resolution Enhancement | Brook, Mountain, Flower, Hair, Wood | 15 | 15 | 15 | 15
Adaptive Processing | Contrast Enhancement | Theme Park | 5 | 5 | 5 | 5
 | | Driftwood | 5 | 5 | 5 | 5
 | | Beach at Dusk | 5 | 2 | 5 | 5
 | | White and Black Cats | 5 | 5 | 5 | 5
 | Skin Tone Correction | Skin Tones | 10 | 0 | 7 | 7
Total Score | | | 210 | 173 | 184 | 197
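
Since we argue above that per-test results matter more than the total, here is a small Python sketch of that kind of comparison. The dictionary holds just a subset of the rows from the table; extend it with the remaining tests as needed.

```python
# Subset of the HQV 2.0 table above: test -> (max, HD3000, 6550D, HD4000).
scores = {
    "Horizontal Text Scroll": (5, 3, 5, 3),
    "Interlace Chroma Problem (ICP)": (5, 2, 2, 5),
    "Upscaled Compression: Text Pattern": (5, 3, 3, 3),
    "Skin Tones": (10, 0, 7, 7),
}
gpus = ("Intel HD3000", "AMD 6550D", "Intel HD4000")
max_total = sum(row[0] for row in scores.values())

# Tally each GPU's total and list the tests where it dropped points.
for i, gpu in enumerate(gpus, start=1):
    total = sum(row[i] for row in scores.values())
    lost = [test for test, row in scores.items() if row[i] < row[0]]
    print(f"{gpu}: {total}/{max_total}; points lost in: {', '.join(lost)}")
```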

A look at the table above reveals that Intel has caught up with the competition in terms of HQV scores. In fact, it has comfortably surpassed what Llano managed at launch. Many of the driver problems plaguing AMD's GPUs had still not been fixed when we looked at the AMD 7750 a couple of months back, so Llano's scores have likely not budged much from those above. The HD4000's score of 197 ties what we obtained for the 6570 in our discrete HTPC GPU shootout.

Comments

  • Exodite - Tuesday, April 24, 2012

    Anyone who says they can tell any difference between a 65% and a 95% color gamut is a whiny bitch.

    See, I can play that game too!

    Even if I were to buy your "factual" argument, and I don't, I've clearly stated that I care nothing about the things you consider advantages.

    I sit facing the center of my display, brightness and gamma are turned down to minimum levels, and saturation is low. Measured power draw at the socket is 9W.

    It's a 2ms TN panel, obviously.

    All I want is more vertical space at a reasonable price, though a 120Hz display would be nice as well.

    My friend is running a 5ms 1080p eIPS display, and between that and what I have I'd still pick my current display.

    End of the day it's personal preference, which I made abundantly clear in my first post.

    Though it seems displays, and IPS panels in general, are starting to attract the same amount of douchiness as the audiophile community.
  • Old_Fogie_Late_Bloomer - Tuesday, April 24, 2012

    Oh, I know I shouldn't--REALLY shouldn't--get involved in this. But you would have to be monochromatically colorblind in order to not see the difference between 65% and 95% color gamut.

    I'm not saying that the 95% gamut is better for everyone; in fact, unless the 95% monitor has a decent sRGB setting, the 65% monitor is probably better for most people. But to suggest that you have to be a hyper-sensitive "whiny b---h" to tell the difference between the two is to take an indefensible position.
  • Exodite - Tuesday, April 24, 2012

    Yeah, you shouldn't have gotten into this.

    Point being that whatever the difference is, I bet you the same can be said about latency.

    Besides, as I've said from the start it's about the things that you personally appreciate.

    My preferred settings absolutely destroy any kind of color fidelity anyway, and that doesn't even slightly matter as I don't work with professional imagery.

    But I can most definitely appreciate the difference between TN and even eIPS when it comes to gaming. And I consider the former superior.

    I don't /mind/ higher color fidelity or better viewing angles, I'm just sure as hell not going to pay any extra for it.
  • Old_Fogie_Late_Bloomer - Wednesday, April 25, 2012

    I agree completely that, as you say, "it's about the things you personally appreciate." If you have color settings you like that work on a TN monitor that you can stand to deal with for long periods of time without eye strain, I would never tell you that you should not use them because they don't conform to some arbitrary standard. Everybody's eyes and brain wiring are different, and there are plenty of reasons why people use computers that don't involve color accuracy.

    But as it happens, you picked a poor counterexample, because I defy you to put a Dell U2412M (~68% of aRGB) next to a U2410 set to aRGB mode (somewhere close to 100% of aRGB) and tell me you can't see a difference.

    For that matter, I challenge you to find me someone who literally can't see the difference between the two in terms of color reproduction. That person will have something seriously wrong with their color vision.
  • Exodite - Wednesday, April 25, 2012

    To be fair, the counterexample wasn't about being correct, because the poster I replied to wasn't, but rather about showing what an asshat argument he was making.

    That said it's about the frame of reference.

    Would you be able to tell the difference working with RAW images pulled from your DSLR or other high-quality imagery?

    Sure, side-by-side I have no doubt you would.

    Would you be able to tell the difference when viewing the desktop, a simple web form, or an editor where the only colors are black, white, two shades of blue, and grey?

    Especially once both displays are calibrated to the point I'm comfortable with them. (Cold hue, 0% brightness, low saturation, negative gamma, high contrast.)

    I dare say not.
  • DarkUltra - Monday, April 30, 2012

    I'd like to see a "blind" test on this. Is there a perceived difference between 6ms and 2ms? Blind as in the test subjects (nyahahaa) do not know which response time they are looking at.

    Test with both a 60Hz and a 120Hz display. I would guess the moving object, an explorer window for instance, would simply be easier to look at and look less blurred as it moves across the screen. People used to fast-paced gaming on CRT monitors or "3D ready" 120Hz monitors would see more of a difference.
  • Origin32 - Saturday, April 28, 2012

    I really don't see any need for improvement in video resolution just yet. I myself have nearly perfect eyesight and can be extremely annoyed by artifacts, blocky compression, etc., but I find 720p detailed enough even for action movies that rely solely on special effects. In most movies 1080p appears too sharp to me; add to that the fact that most movies are already oversharpened and post-processed, plus the increased bitrate (and therefore file size) of 1080p, and I see more downside than upside to it.
    This all goes double for 4K video.

    That being said, I do still want 4K badly for gaming, viewing pictures, and reading text; there are tons of things it'll be useful for.
    But not for film, not for me.
  • Old_Fogie_Late_Bloomer - Monday, April 23, 2012

    Another advantage of a 4K screen (one with at least 2160 lines of vertical resolution) is that you could have alternating-line passive 3D at full 1080p resolution for each eye. I'm not an expert on how this all works, but it seems to me that the circular polarization layer is a sort of afterthought in the LCD manufacturing process, which is why vertical viewing angles are narrow (there's a gap between the pixels and the 3D polarizing layer).

    In my opinion, it would be pretty awesome if that layer were integrated into the panel in such a way that vertical viewing angles weren't an issue, and so that any monitor is basically a 3D monitor (especially high-quality IPS displays). But I don't really know how practical that is.
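
    A quick sketch of the line interleave described above, in Python with numpy (the shapes are illustrative; a real panel would also scale each eye's image to its native width):

    ```python
    # Row-interleaved passive 3D on a 3840x2160 panel: even lines carry the
    # left-eye image, odd lines the right-eye image, so each eye gets a full
    # 1080 lines of vertical resolution.
    import numpy as np

    left = np.zeros((1080, 3840, 3), dtype=np.uint8)       # left-eye frame
    right = np.full((1080, 3840, 3), 255, dtype=np.uint8)  # right-eye frame

    panel = np.empty((2160, 3840, 3), dtype=np.uint8)
    panel[0::2] = left   # even rows -> one circular polarization
    panel[1::2] = right  # odd rows  -> the opposite polarization
    ```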
  • peterfares - Thursday, September 27, 2012

    A 2560x1600 monitor (available for years) has 1.975 times as many pixels as a 1920x1080 screen.
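    (The arithmetic: 2560 × 1600 = 4,096,000 pixels versus 1920 × 1080 = 2,073,600 pixels, and 4,096,000 / 2,073,600 ≈ 1.975.)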

    4K would be even better, though!
  • nathanddrews - Monday, April 23, 2012

    4K is a very big deal for a couple reasons: pixel density and film transparency.

    From the perspective of pixel density, I happily point to the ASUS Transformer 1080p, the iPad 3, and any 2560x1440 27" or 2560x1600 30" monitor. Once you go dense, you never go... back... Anyway, as great as 1080p is, as great as Blu-ray is, it could be so much better! I project 1080p at about 120" in my dedicated home theater - it looks great - but I will upgrade to 4K without hesitation.

    Which leads me to the concept of film transparency. While many modern movies are natively being shot in 4K using RED or similar digital cameras, the majority are still on good ol' 35mm film. 4K is considered by most professionals and enthusiasts to be the baseline for an excellent transfer of a 35mm source to the digital space - some argue 6K-8K is ideal. Factor in 65mm, 70mm, and IMAX and you want to scan your original negative in at least 8K to capture all the fine detail (as far as I know, no one is professionally scanning above 8K yet).

    Of course, recording on RED 4K or scanning 35mm at 4K or 8K is a pointless venture if video filtering like noise reduction or edge enhancement is applied during the mastering or encoding process. Like smearing poop on a diamond.

    You can't bring up "normal" people when discussing the bleeding edge. The argument is moot. Those folks don't jump on board for any new technology until it hits the Walmart Black Friday ad.
