Intel provided us with a Core i7-3770K processor, and Asus was kind enough to supply the HTPC-friendly P8H77-M Pro motherboard for our test drive. Purists might balk at the idea of an overclockable 77W TDP processor being used in tests meant to analyze HTPC capabilities. However, the Core i7-3770K comes with Intel HD Graphics 4000, the highest-end GPU in the Ivy Bridge lineup. Using it as the review platform shows readers the maximum HTPC capabilities Ivy Bridge has to offer.

The table below presents the hardware components of our Ivy Bridge HTPC testbed:

Ivy Bridge HTPC Testbed Setup

Processor: Intel Core i7-3770K - 3.50 GHz (Turbo to 3.9 GHz)
           Intel HD Graphics 4000 - 650 MHz (max. dynamic frequency of 1150 MHz)
Motherboard: Asus P8H77-M Pro uATX
OS Drive: Seagate Barracuda XT 2 TB
Memory: G.SKILL ECO Series 4GB (2 x 2GB) DDR3-1333 SDRAM (PC3 10666) F3-10666CL7D-4GBECO CAS 9-9-9-24
        G.SKILL Ripjaws Z Series 16GB (2 x 8GB) DDR3-1600 SDRAM (PC3 12800) F3-12800CL10Q2-64GBZL CAS 10-10-10-30
Optical Drive: ASUS 8X Blu-ray Drive Model BC-08B1ST
Case: Antec VERIS Fusion Remote Max
Power Supply: Antec TruePower New TP-550 550W
Operating System: Windows 7 Ultimate x64 SP1
Display / AVR: Acer H243H / Pioneer Elite VSX-32 + Sony Bravia KDL46EX720

The Asus P8H77-M Pro makes for a nice HTPC / general-purpose board for consumers not interested in overclocking their CPU. It also has two PCI-E x16 slots (one running at x16 with PCI-E 3.0, the other at x4 with PCI-E 2.0) and two PCI-E x1 slots for those interested in adding gaming cards or TV tuners / video capture cards.

Readers might wonder about the two different flavours of DRAM in the testbed. Note that only one of them was installed at any given time.

As readers will see in a later section, memory bandwidth and latency can play a very important role in video post-processing performance. To that end, we ran our decode / post-processing tests with three distinct configurations. The ECO modules were run at DDR3-1333 (9-9-9-24) and also at DDR3-1600 (9-9-9-24), while the Ripjaws Z modules were overclocked to DDR3-1800 (12-12-12-32). The ability to overclock the G.Skill DRAM modules was quite useful in gaining insight into the effect of memory bandwidth and latency on video post-processing with the integrated GPU.
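
For context, the raw numbers behind those three configurations are easy to work out: a dual-channel DDR3 interface moves 8 bytes per channel per transfer, so theoretical peak bandwidth is simply the effective transfer rate times 16 bytes. The sketch below is only a back-of-the-envelope calculation of the theoretical peaks (latency depends on the timings and is not captured here):

```python
# Back-of-the-envelope peak bandwidth for a dual-channel DDR3 setup:
# (effective transfer rate in MT/s) x (8 bytes per 64-bit transfer) x (2 channels)

def ddr3_peak_bandwidth_gbps(transfer_rate_mts, channels=2, bus_bytes=8):
    """Theoretical peak bandwidth in GB/s (ignores timings and bus efficiency)."""
    return transfer_rate_mts * bus_bytes * channels / 1000

for speed, timings in [(1333, "9-9-9-24"), (1600, "9-9-9-24"), (1800, "12-12-12-32")]:
    print(f"DDR3-{speed} ({timings}): {ddr3_peak_bandwidth_gbps(speed):.1f} GB/s peak")

# DDR3-1333: 21.3 GB/s; DDR3-1600: 25.6 GB/s; DDR3-1800: 28.8 GB/s
```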

The software setup for the Ivy Bridge HTPC testbed involved the following:

Ivy Bridge HTPC Testbed Software Setup

Blu-ray Playback Software: CyberLink PowerDVD 12
Media Player: MPC-HC v1.6.1.4235
Splitter / Decoder: LAV Filters 0.50.1
Renderers: EVR-CP (integrated in MPC-HC v1.6.1.4235)
           madVR v0.82.5

The madVR renderer settings were fixed as below for testing purposes:

  1. Decoding features disabled
  2. Deinterlacing set to:
    • automatically activated when needed (activate when in doubt)
    • automatic source type detection (i.e., "disable automatic source type detection" is left unchecked)
    • only look at pixels in the frame center
    • be performed in a separate thread
  3. Scaling algorithms were set as below (a sketch of the Lanczos kernel follows this list):
    • Chroma upscaling set to default (SoftCubic with softness of 100)
    • Luma upscaling set to default (Lanczos with 4 taps)
    • Luma downscaling set to default (Lanczos with 4 taps)
  4. Rendering parameters were set as below:
    • Start of playback was delayed until the render queue filled up
    • A separate device was used for presentation, and D3D11 was used
    • CPU and GPU queue sizes were set to 32 and 24 respectively
    • Under windowed mode, the number of backbuffers was set to 8, and the GPU was set to be flushed after intermediate render steps as well as the last render step. In addition, the GPU was set to wait (sleep) after the last render step.
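
On the scaling settings in item 3: the Lanczos scaler is a windowed-sinc filter, and "Lanczos with 4 taps" weights four source pixels per axis, which corresponds to a lobe count of a = 2 (that mapping is our reading of the tap count). The sketch below is our own illustration in Python, not madVR's GPU shader code:

```python
import math

def lanczos_kernel(x, a=2):
    """Lanczos windowed sinc: sinc(x) * sinc(x / a) for |x| < a, else 0.
    a = 2 yields a 4-tap filter per axis (our reading of madVR's '4 taps')."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def resample_1d(samples, pos, a=2):
    """Interpolate a 1-D signal at fractional position pos using 2*a taps."""
    left = math.floor(pos) - a + 1
    taps = [(i, lanczos_kernel(pos - i, a)) for i in range(left, left + 2 * a)]
    norm = sum(w for _, w in taps)  # normalize weights so they sum to 1
    total = 0.0
    for i, w in taps:
        j = max(0, min(i, len(samples) - 1))  # clamp indices at the edges
        total += samples[j] * w
    return total / norm

# Upscale an 8-sample ramp to 16 samples:
src = [0, 16, 32, 48, 64, 80, 96, 112]
dst = [resample_1d(src, i * (len(src) - 1) / 15) for i in range(16)]
```

Upscaling simply evaluates the resampler at fractional source positions; madVR performs the equivalent per-axis filtering on the GPU.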

Exclusive mode settings were not applicable to our testbed, because we found full screen exclusive mode to perform generally worse than full screen windowed mode. Also, none of the options to trade quality for performance were checked.
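
To make the queue-related settings above concrete, the sketch below is a hypothetical model (names and structure are ours, not madVR's internals) of the pipeline those options describe: a 32-entry CPU-side decode queue feeding a 24-entry GPU-side render queue, with presentation held back until the render queue fills.

```python
import queue
import threading
import time

CPU_QUEUE_SIZE = 32   # decoded frames awaiting GPU post-processing
GPU_QUEUE_SIZE = 24   # processed frames awaiting presentation

cpu_q = queue.Queue(maxsize=CPU_QUEUE_SIZE)
gpu_q = queue.Queue(maxsize=GPU_QUEUE_SIZE)

def decoder(num_frames):
    """CPU thread: decode frames; blocks when the CPU queue is full."""
    for n in range(num_frames):
        cpu_q.put(f"frame-{n}")

def renderer():
    """GPU thread: post-process decoded frames into the presentation queue."""
    while True:
        frame = cpu_q.get()
        gpu_q.put(f"rendered({frame})")  # stand-in for deinterlace/scale steps

def presenter(fps=24):
    """Present frames at display cadence; playback start is delayed until
    the render queue has filled, as in our madVR configuration."""
    while gpu_q.qsize() < GPU_QUEUE_SIZE:
        time.sleep(0.001)                # in real code: a condition variable
    while True:
        print(gpu_q.get())
        time.sleep(1 / fps)

threading.Thread(target=decoder, args=(240,), daemon=True).start()
threading.Thread(target=renderer, daemon=True).start()
threading.Thread(target=presenter, daemon=True).start()
time.sleep(2)  # let a couple of seconds of "playback" run
```

Deep queues like these let the pipeline absorb momentary decode or render slowdowns without dropping frames, which is the point of delaying playback start until they are full.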

Comments

  • Exodite - Tuesday, April 24, 2012 - link

    Anyone that says they can tell any difference between a 65% and 95% color gamut is a whiny bitch.

    See, I can play that game too!

    Even if I were to buy your "factual" argument, and I don't, I've clearly stated that I care nothing about the things you consider advantages.

    I sit facing the center of my display, brightness and gamma are turned down to minimum levels, and saturation is low. Measured power draw at the socket is 9W.

    It's a 2 ms TN panel, obviously.

    All I want is more vertical space at a reasonable price, though a 120Hz display would be nice as well.

    My friend is running a 5 ms 1080p eIPS display, and between that and what I have, I'd still pick my current display.

    End of the day it's personal preference, which I made abundantly clear in my first post.

    Though it seems displays, and IPS panels in general, are starting to attract the same amount of douchiness as the audiophile community.
  • Old_Fogie_Late_Bloomer - Tuesday, April 24, 2012 - link

    Oh, I know I shouldn't--REALLY shouldn't--get involved in this. But you would have to be monochromatically colorblind in order to not see the difference between 65% and 95% color gamut.

    I'm not saying that the 95% gamut is better for everyone; in fact, unless the 95% monitor has a decent sRGB setting, the 65% monitor is probably better for most people. But to suggest that you have to be a hyper-sensitive "whiny b---h" to tell the difference between the two is to take an indefensible position.
  • Exodite - Tuesday, April 24, 2012 - link

    Yeah, you shouldn't have gotten into this.

    Point being that whatever the difference is I bet you the same can be said about latency.

    Besides, as I've said from the start it's about the things that you personally appreciate.

    My preferred settings absolutely destroy any kind of color fidelity anyway, and that doesn't even slightly matter as I don't work with professional imagery.

    But I can most definitely appreciate the difference between TN and even eIPS when it comes to gaming. And I consider the former superior.

    I don't /mind/ higher color fidelity or better viewing angles, I'm just sure as hell not going to pay any extra for it.
  • Old_Fogie_Late_Bloomer - Wednesday, April 25, 2012 - link

    I agree completely that, as you say, "it's about the things you personally appreciate." If you have color settings you like that work on a TN monitor that you can stand to deal with for long periods of time without eye strain, I would never tell you that you should not use them because they don't conform to some arbitrary standard. Everybody's eyes and brain wiring are different, and there are plenty of reasons why people use computers that don't involve color accuracy.

    But as it happens, you picked a poor counterexample, because I defy you to put a Dell U2412M (~68% of aRGB) next to a U2410 set to aRGB mode (somewhere close to 100% of aRGB) and tell me you can't see a difference.

    For that matter, I challenge you to find me someone who literally can't see the difference between the two in terms of color reproduction. That person will have something seriously wrong with their color vision.
  • Exodite - Wednesday, April 25, 2012 - link

    To be fair, the counterexample wasn't about being correct, because the poster I replied to wasn't, but rather about showing what an asshat argument he was making.

    That said it's about the frame of reference.

    Would you be able to tell the difference working with RAW images pulled from your DSLR or other high-quality imagery?

    Sure, side-by-side I have no doubt you would.

    Would you be able to tell the difference when viewing the desktop, a simple web form, or an editor where the only colors are black, white, two shades of blue, and grey?

    Especially once both displays are calibrated to the point I'm comfortable with them. (Cold hue, 0% brightness, low saturation, negative gamma, high contrast.)

    I dare say not.
  • DarkUltra - Monday, April 30, 2012 - link

    I'd like to see a "blind" test on this. Is there a perceived difference between 6 ms and 2 ms? Blind as in the test subjects (nyahahaa) do not know which response time they are looking at.

    Test with both a 60 Hz and a 120 Hz display. I would guess a moving object, an explorer window for instance, would simply be easier to look at and less blurred as it moves across the screen. People used to fast-paced gaming on CRT monitors or "3D ready" 120 Hz monitors would see more of a difference.
  • Origin32 - Saturday, April 28, 2012 - link

    I really don't see any need for improvement in video resolution just yet. I have nearly perfect eyesight and can be extremely annoyed by artifacts, blocky compression, etc., but I find 720p detailed enough even for action movies that rely solely on special effects. In most movies 1080p appears too sharp to me. Add to that the fact that most movies are already oversharpened and post-processed, plus the increased bitrate (and therefore file size) of 1080p, and I see more downside than upside to it.
    This all goes double for 4K video.

    That being said, I do still want 4K badly for gaming, viewing pictures, reading text, there's tons of things it'll be useful for.
    But not for film, not for me.
  • Old_Fogie_Late_Bloomer - Monday, April 23, 2012 - link

    Another advantage of a 4K screen (one that has at least 2160 vertical resolution) is that you could have alternating-line passive 3D at full 1080p resolution for each eye. I'm not an expert on how this all works, but it seems to me that the circular polarization layer is a sort of afterthought for the LCD manufacturing process, which is why vertical viewing angles are narrow (there's a gap between the pixels and the 3D polarizing layer).

    In my opinion, it would be pretty awesome if that layer were integrated into the panel in such a way that vertical viewing angles weren't an issue, and so that any monitor is basically a 3D monitor (especially high-quality IPS displays). But I don't really know how practical that is.
  • peterfares - Thursday, September 27, 2012 - link

    A 2560x1600 monitor (available for years) has 1.975 times as many pixels as a 1920x1080 screen.

    4K would be even better, though!
  • nathanddrews - Monday, April 23, 2012 - link

    4K is a very big deal for a couple reasons: pixel density and film transparency.

    From the perspective of pixel density, I happily point to the ASUS Transformer 1080p, the iPad 3, and any 2560x1440 27" or 2560x1600 30" monitor. Once you go dense, you never go... back... Anyway, as great as 1080p is, as great as Blu-ray is, it could be so much better! I project 1080p at about 120" in my dedicated home theater - it looks great - but I will upgrade to 4K without hesitation.

    Which leads me to the concept of film transparency. While many modern movies are natively being shot in 4K using RED or similar digital cameras, the majority are still on good ol' 35mm film. 4K is considered by most professionals and enthusiasts to be the baseline for an excellent transfer of a 35mm source to the digital space - some argue 6K-8K is ideal. Factor in 65mm, 70mm, and IMAX and you want to scan your original negative in at least 8K to capture all the fine detail (as far as I know, no one is professionally scanning above 8K yet).

    Of course, recording on RED 4K or scanning 35mm at 4K or 8K is a pointless venture if video filtering like noise reduction or edge enhancement is applied during the mastering or encoding process. Like smearing poop on a diamond.

    You can't bring up "normal" people when discussing the bleeding edge. The argument is moot. Those folks don't jump on board for any new technology until it hits the Walmart Black Friday ad.
