HTPC Aspects : HQV 2.0 Benchmarking and Video Post Processing in Action

HTPC enthusiasts are often concerned about the quality of the pictures output by the system. While picture quality is a very subjective metric, we have decided to take as objective an approach as possible. Starting with the Core 100 review in 2010, we have been using the HQV 2.0 benchmark for this purpose.

The HQV 2.0 test suite consists of 39 different streams divided into 4 classes. The playback device is assigned a score for each stream depending on how well it plays it back. Each test was repeated multiple times to ensure that the correct score was assigned. The scoring details are available in the testing guide on the HQV website.

In the table below, we list the maximum score possible for each test along with the score the Zotac GT 640 managed. As mentioned in the previous section, we used NVIDIA Graphics Driver v301.42 for the benchmarking.

 
HQV 2.0 Benchmark - Zotac GT 640

| Test Class | Chapter | Test | Max. Score | Zotac GT 640 |
|------------|---------|------|------------|--------------|
| Video Conversion | Video Resolution | Dial | 5 | 5 |
| | | Dial with Static Pattern | 5 | 5 |
| | | Gray Bars | 5 | 5 |
| | | Violin | 5 | 5 |
| | Film Resolution | Stadium 2:2 | 5 | 5 |
| | | Stadium 3:2 | 5 | 5 |
| | Overlay On Film | Horizontal Text Scroll | 5 | 5 |
| | | Vertical Text Scroll | 5 | 3 |
| | Cadence Response Time | Transition to 3:2 Lock | 5 | 5 |
| | | Transition to 2:2 Lock | 5 | 5 |
| | Multi-Cadence | 2:2:2:4 24 FPS DVCam Video | 5 | 5 |
| | | 2:3:3:2 24 FPS DVCam Video | 5 | 5 |
| | | 3:2:3:2:2 24 FPS Vari-Speed | 5 | 5 |
| | | 5:5 12 FPS Animation | 5 | 5 |
| | | 6:4 12 FPS Animation | 5 | 5 |
| | | 8:7 8 FPS Animation | 5 | 5 |
| | Color Upsampling Errors | Interlace Chroma Problem (ICP) | 5 | 5 |
| | | Chroma Upsampling Error (CUE) | 5 | 5 |
| Noise and Artifact Reduction | Random Noise | SailBoat | 5 | 5 |
| | | Flower | 5 | 5 |
| | | Sunrise | 5 | 5 |
| | | Harbour Night | 5 | 5 |
| | Compression Artifacts | Scrolling Text | 5 | 5 |
| | | Roller Coaster | 5 | 5 |
| | | Ferris Wheel | 5 | 5 |
| | | Bridge Traffic | 5 | 5 |
| | Upscaled Compression Artifacts | Text Pattern | 5 | 3 |
| | | Roller Coaster | 5 | 3 |
| | | Ferris Wheel | 5 | 3 |
| | | Bridge Traffic | 5 | 3 |
| Image Scaling and Enhancements | Scaling and Filtering | Luminance Frequency Bands | 5 | 5 |
| | | Chrominance Frequency Bands | 5 | 5 |
| | | Vanishing Text | 5 | 5 |
| | Resolution Enhancement | Brook, Mountain, Flower, Hair, Wood | 15 | 15 |
| Adaptive Processing | Contrast Enhancement | Theme Park | 5 | 2 |
| | | Driftwood | 5 | 2 |
| | | Beach at Dusk | 5 | 2 |
| | | White and Black Cats | 5 | 2 |
| | Skin Tone Correction | Skin Tones | 10 | 0 |
| | | Total Score | 210 | 178 |
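As a quick sanity check on the totals, the per-class subtotals can be tallied programmatically. The snippet below is simply our own bookkeeping of the scores listed in the table above; it reproduces the 178/210 result.

```python
# Per-class (max, scored) subtotals taken from the table above.
classes = {
    "Video Conversion":               (90, 88),  # 18 tests; Vertical Text Scroll lost 2 points
    "Noise and Artifact Reduction":   (60, 52),  # 4 upscaled-artifact tests scored 3/5
    "Image Scaling and Enhancements": (30, 30),
    "Adaptive Processing":            (30, 8),   # contrast 2/5 x 4, skin tones 0/10
}

max_total = sum(m for m, _ in classes.values())
scored    = sum(s for _, s in classes.values())
print(f"{scored}/{max_total}")  # -> 178/210
```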

We find that the score closely tracks what we obtained for the GT 540M in the ASRock Vision 3D 252B review. In fact, the only difference is that the response in the horizontal text scroll test has improved a bit, earning two more points in that test. Given that the GT 540M had no trouble deinterlacing 1080i60 content, it was no surprise to find that the GT 640 sailed through those tests. The chroma upsampling algorithms are passable, and there is no difference in quality between what we obtained with the GT 540M and what we got with the GT 640. In the next section, we will look at some rendering benchmarks to see how deinterlacing operations load up the GPU.

In our review of the video post processing features of the GT 540M, we indicated that the contrast enhancement and skin tone correction features didn't work. We found no change with the v301.42 drivers. However, we did find contrast enhancement working with the black level test clip in the AVS HD 709 calibration suite. This suggests that the dynamic contrast enhancement feature in the NVIDIA drivers does engage, but not as effectively as Intel's or AMD's implementations.

Should the low HQV score or the lack of proper dynamic contrast enhancement prevent you from choosing the GT 640 for your HTPC? Definitely not. The nice thing about NVIDIA GPUs is that plenty of HTPC software packages are available to take advantage of the GPU's resources. As long as the hardware deinterlacer works (it does, as the HQV scores for those tests indicate) and there are enough shaders and other compute resources available to let madVR work its magic, the HTPC end user has no reason to worry. Advanced HTPC users tend to distrust post processing done by the drivers anyway, and would rather not let the driver mess with the video output by applying its custom post processing algorithms (which tend to break with every new driver release).

However, video post processing algorithms are not the only issue-prone HTPC aspect of the driver. Proper black levels are necessary irrespective of the color space being output, and the gallery below shows that the driver's behavior doesn't correlate in any way with the settings in the control panel. The NVIDIA drivers seem to maintain two separate modes for the limited (16-235) and full (0-255) settings: one global (desktop, photos, etc.) and one for videos. In YCbCr mode, the global mode is set to limited (16-235) for all built-in resolutions and to full (0-255) for all custom resolutions, with no way to change this (the gallery below shows the correct dynamic range being chosen in RGB mode for still photos / desktop). The dynamic ranges for video and for the desktop can also differ, and toggling the dynamic contrast enhancement checkbox seems to affect this setting as well. In addition, there is no way to explicitly choose RGB Full or RGB Limited in the current drivers.
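For readers unfamiliar with the two ranges: limited-range video places black at code 16 and white at code 235, and mapping between the two ranges is a simple linear scale. The sketch below is our own illustration of that math, not driver code; when the driver picks the wrong range, this conversion gets skipped or applied twice, which is exactly why blacks end up washed out or crushed.

```python
def limited_to_full(y: int) -> int:
    """Expand limited-range (16-235) video levels to full-range (0-255) PC levels."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

def full_to_limited(y: int) -> int:
    """Compress full-range (0-255) PC levels to limited-range (16-235) video levels."""
    return round(y * 219 / 255) + 16

# If a limited-range signal is displayed without expansion, black sits at
# code 16 instead of 0 and looks washed out; if expansion is applied twice,
# shadow detail gets crushed to black.
print(limited_to_full(16))   # -> 0   (video black maps to PC black)
print(limited_to_full(235))  # -> 255 (video white maps to PC white)
print(full_to_limited(0))    # -> 16
```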

This dynamic range issue was apparently present back in the Vista days and was fixed at the time. There appears to have been a recent regression, and we have been observing problems since at least May 2011. A method to work around the issue has been outlined on Microsoft's official Windows community forums. It is disappointing that NVIDIA has still not fixed the issue, despite the bug being a major annoyance for many HTPC users.

Comments

  • cjs150 - Thursday, June 21, 2012 - link

    "God forbid there be a technical reason for it.... "

    Intel and Nvidia have had several generations of chips to fix any technical issues and didn't (the HD 4000 is good enough, though). AMD has been pretty close to the correct frame rate for a while.

    But it is not enough to have the capability to run at the correct frame rate if you make it too difficult to change the frame rate to the correct setting. That is not a hardware issue, just bad software design.
  • UltraTech79 - Wednesday, June 20, 2012 - link

    Anyone else really disappointed in 4K still being standardized around 24 fps? I thought 60 would be the minimum standard by now, with 120 in higher-end displays. 24 is crap. Anyone that has seen a movie recorded at 48+ FPS knows what I'm talking about.

    This is like putting shitty unleaded gas into a super high-tech racecar.
  • cjs150 - Thursday, June 21, 2012 - link

    You do know that Blu-ray is displayed at 23.976 FPS? That looks very good to me.

    Please do not confuse screen refresh rates with frame rates. Screen refresh on most large TVs runs at between 60 and 120 Hz; anything below 60 tends to look crap. (If you want real crap, try running American TV on a European PAL system - I mean crap in a technical sense, not creatively!)

    I must admit that having an fps of 23.976 rather than some round number such as 24 (or higher) is rather daft, and some new films are coming out with much higher frame rates. I have a horrible recollection that the reason for such an odd fps is historic - something to do with the length of 35mm film needed per second. The problem is I cannot remember whether that was simply because 35mm film was expensive and this was the minimum rate that provided smooth movement, or whether it goes right back to the days when film had a tendency to catch light, and it was the maximum speed you could run film through a projector without friction setting it alight. No doubt there is an expert on this site who could explain precisely why we ended up with such a silly number as the standard.
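For what it's worth, the odd number is not a 35mm-era artifact: film itself runs at exactly 24 fps, and the 23.976 figure comes from NTSC color television, whose field rate was slowed by a factor of 1000/1001 to keep the color subcarrier from beating against the audio carrier. Film played back at NTSC timing therefore runs at

$$24 \times \frac{1000}{1001} = \frac{24000}{1001} \approx 23.976\ \text{fps}$$

and Blu-ray "24p" inherited that rate.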
  • UltraTech79 - Friday, June 22, 2012 - link

    You are confusing things here. I clearly said 120 (fps) would need higher-end displays (120 Hz). I was rounding 23.976 fps up to 24 - give me a break.

    That it looks good /to you/ is wholly irrelevant. Do you realize how many people said "it looks very good to me" about SD when resisting the HD movement? Or how many will say it again about 1080p, thinking 4K is too much? It's a ridiculous mindset.

    My point was that we are upping the resolution but leaving another very important aspect in the dust that we need to improve. Even audio is moving faster than frame rates in movies, and now that most places are switching to digital, the cost to go to the next step has dropped dramatically.
  • nathanddrews - Friday, June 22, 2012 - link

    It was NVIDIA's choice to only implement 4K @ 24Hz (23.xxx) due to limitations of HDMI. If NVIDIA had optimized around DisplayPort, you could then have 4K @ 60Hz.

    For computer use, anything under 60Hz is unacceptable. For movies, 24Hz has been the standard for a century - all film is 24fps and most movies are still shot on film. In the next decade, there will be more and more films that will use 48, 60, even 120fps. Cameron was cock-blocked by the studio when he wanted to film Avatar at 60fps, but he may get his wish for the sequels. Jackson is currently filming The Hobbit at 48fps. Eventually all will be right with the world.
  • karasaj - Wednesday, June 20, 2012 - link

    If we wanted to use this to compare a 640M or 640M LE to the GT 640, is this doable? If it's built on the same GPU (both have 384 CUDA cores), can we just reduce the numbers by a rough percentage of the core clock speed to get rough numbers for the respective cards? I.e., the 640M LE has a clock of 500MHz and the 640M ~625MHz. Could we expect ~55% of this for the 640M LE and ~67% for the 640M? Assuming DDR3 on both so as not to have that kind of difference.
  • Ryan Smith - Wednesday, June 20, 2012 - link

    It would be fairly easy to test a desktop card at a mobile card's clocks (assuming memory type and functional unit count were equal), but you can't extrapolate performance like that because there's more to performance than clockspeeds. In practice, performance shouldn't drop by that much, since we're already memory bandwidth bottlenecked with DDR3.
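For illustration, the back-of-the-envelope clock-ratio estimate proposed above looks like the sketch below. It is our own example, taking the desktop GT 640 (DDR3) model's 900 MHz core clock as the reference, and it embodies exactly the assumption the reply warns about: linear scaling with core clock.

```python
# Naive clock-ratio scaling (the back-of-the-envelope method from the
# comment above). It assumes performance scales linearly with core clock,
# which it doesn't when DDR3 memory bandwidth is the real bottleneck.
GT640_CLOCK_MHZ = 900  # desktop GT 640 (DDR3) reference core clock

for name, clock_mhz in [("GT 640M LE", 500), ("GT 640M", 625)]:
    ratio = clock_mhz / GT640_CLOCK_MHZ
    print(f"{name}: ~{ratio:.0%} of desktop GT 640 performance")
# -> ~56% and ~69%; a real part would land higher than this estimate,
#    since memory bandwidth rather than core clock is the limiting factor.
```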
  • jstabb - Wednesday, June 20, 2012 - link

    Can you verify whether creating a custom resolution breaks 3D (frame-packed) Blu-ray playback?

    With my GT 430, once a custom resolution has been created for 23/24Hz, that custom resolution overrides the 3D frame-packed resolution created when 3D Vision is enabled. The driver appears to use simple fall-through logic (sketched in code form below): if a custom resolution is defined for the selected resolution/refresh rate, it is always used; failing that, it will use a 3D resolution if one is defined; failing that, it will use the default 2D resolution.

    This issue made the custom resolution feature useless to me on the GT 430 and pushed me to an AMD solution for its better out-of-the-box refresh rate matching. I'd like to consider this card if the issue has been resolved.

    Thanks for the great review!
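The fall-through selection described in the comment above, expressed as a short sketch. This is our reading of the reported GT 430 behavior, not actual NVIDIA driver code:

```python
def pick_mode(resolution, refresh, custom_modes, s3d_modes, default_2d):
    """Mode selection as reported for the GT 430: a user-defined custom
    resolution always wins, even over the 3D frame-packed mode that
    3D Vision creates for the same resolution/refresh rate."""
    key = (resolution, refresh)
    if key in custom_modes:   # user-defined custom resolution exists
        return custom_modes[key]
    if key in s3d_modes:      # 3D Vision frame-packed mode, if enabled
        return s3d_modes[key]
    return default_2d         # fall back to the stock 2D mode
```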
  • MrSpadge - Wednesday, June 20, 2012 - link

    It consumes just about as much as the HD 7750-800, yet performs miserably in comparison. This is an amazing win for AMD, especially when comparing the GTX 680 and HD 7970!
  • UltraTech79 - Wednesday, June 20, 2012 - link

    This performs about as well as an 8800 GTS for twice the price. Or half the performance of a GTX 460 for the same price.

    These should have been priced at $59.99.
