Acer XB280HK: sRGB Calibration and Bench Tests

Pre-calibration, the Acer has a blue tint to the grayscale and a very strange bump past 95%. A bump like this typically means the contrast is set too high, causing the panel to run out of one color before the others; in this case it appears to run out of red and green, causing the blue levels to spike. The gamma also keeps rising, pushing the grayscale dE2000 values to 3.0 at points.

Colors are fairly well behaved, with the dE2000 values for the color checker staying below 3.0 for most of the range. They come close enough to 3.0 that you could pick out the errors from accurate colors in a static side-by-side comparison, but for non-professionals the display performs reasonably well.
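
For readers unfamiliar with the metric, dE2000 (CIEDE2000) is a perceptual color-difference number that weights lightness, chroma, and hue errors separately; the full formula is lengthy, so the sketch below uses its much simpler predecessor, CIE76 (plain Euclidean distance in CIELAB), just to illustrate what a "color error" value represents. This is not the formula CalMAN uses, and the Lab values are invented for the example.

```python
import math

# CIE76: Euclidean distance between two colors in CIELAB space.
# Rule of thumb: errors around 1.0 are near the threshold of visibility,
# errors around 3.0 are visible in a direct side-by-side comparison.

def delta_e_76(lab1, lab2):
    """CIE76 color difference between two (L*, a*, b*) triplets."""
    return math.dist(lab1, lab2)

reference = (53.2, 80.1, 67.2)   # hypothetical sRGB red reference patch
measured  = (54.0, 77.5, 64.8)   # hypothetical measured patch
print(f"dE76 = {delta_e_76(reference, measured):.2f}")
```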

For calibration, we use SpectraCal CalMAN 5.3.5 with our own custom workflow. We target 200 cd/m2 of light output with a gamma of 2.2 and the sRGB color gamut, which corresponds to a general real-world use case. The meters used are an i1Pro2 provided by X-Rite and a SpectraCal C6. All measurements use APL 50% patterns except for uniformity testing, which uses full field.
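
As a minimal sketch of what the luminance and gamma targets imply (this is not CalMAN's internal math, and it ignores the display's non-zero black level, which a real workflow accounts for), the expected luminance at each grayscale step follows a simple power law:

```python
# Grayscale targets implied by the settings above: 200 cd/m2 peak white
# and a 2.2 power-law gamma.

WHITE_NITS = 200.0   # target white level (cd/m2)
GAMMA = 2.2          # target gamma

def target_luminance(stimulus: float) -> float:
    """Expected luminance in cd/m2 for a video stimulus in [0, 1]."""
    return WHITE_NITS * stimulus ** GAMMA

for pct in (10, 25, 50, 75, 95, 100):
    print(f"{pct:3d}% stimulus -> {target_luminance(pct / 100):7.2f} cd/m2")
```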

                        Pre-Calibration   Post-Calibration,   Post-Calibration,
                                          200 cd/m2           80 cd/m2
White Level (cd/m2)     200.0             200.4               78.8
Black Level (cd/m2)     0.2602            0.2723              0.1157
Contrast Ratio          769:1             736:1               681:1
Gamma (Average)         2.31              2.18                2.60
Color Temperature       7065K             6629K               6493K
Grayscale dE2000        2.18              0.44                0.59
Color Checker dE2000    2.42              1.60                1.55
Saturations dE2000      2.35              1.36                1.48
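
The contrast ratios in the table are simply the measured white level divided by the measured black level; a quick sketch reproducing them from the numbers above:

```python
# Contrast ratio = white luminance / black luminance, using the table values.

results = {
    "Pre-calibration":             (200.0, 0.2602),
    "Post-calibration, 200 cd/m2": (200.4, 0.2723),
    "Post-calibration, 80 cd/m2":  (78.8,  0.1157),
}

for name, (white, black) in results.items():
    print(f"{name}: {white / black:.0f}:1")
```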

Post-calibration, the RGB balance and gamma are almost perfect. The contrast ratio is only 736:1, but that isn't much of a drop from the pre-calibration level of 769:1. Color errors are reduced, but as I'll show here, that is only because the luminance levels are fixed. Unless a monitor has a 3D LUT, you cannot correct for over-saturation or tint errors in the display itself. Using an ICC profile and an ICC-aware application you can fix some of those, but most applications are not ICC aware. Below you'll see the color checker charts broken out into three different errors: Luminance, Color, and Hue. Color and Hue are what we cannot fix, while Luminance we can.

As we can see, the DeltaL values are almost perfect now, but the DeltaC and DeltaH values are basically identical to before. Unless you have either ICC-aware applications or a monitor with a 3D LUT, this is all you will ever be able to do to correct a display: grayscale and gamma improve, but a display still needs to produce accurate colors to be truly correct.
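
To make the distinction concrete, here is a hedged sketch (all values invented for illustration) of why a standard calibration can fix luminance but not hue or saturation: the per-channel 1D correction an ICC profile loads into the video card applies an independent curve to each of R, G, and B, so a pure primary stays a pure primary, whereas a 3D LUT remaps whole RGB triplets.

```python
# 1D per-channel LUT vs. 3D LUT, schematically.

def apply_1d_lut(rgb, curves):
    """curves: three independent per-channel transfer functions."""
    r, g, b = rgb
    return (curves[0](r), curves[1](g), curves[2](b))

def apply_3d_lut(rgb, lut):
    """lut: mapping from an input RGB triplet to a corrected triplet."""
    return lut.get(rgb, rgb)

darken = lambda x: 0.95 * x   # a 1D LUT can only rescale each channel
print(apply_1d_lut((1.0, 0.0, 0.0), (darken, darken, darken)))
# -> (0.95, 0.0, 0.0): still a pure (possibly oversaturated) red

lut_3d = {(1.0, 0.0, 0.0): (0.93, 0.05, 0.02)}   # hypothetical correction
print(apply_3d_lut((1.0, 0.0, 0.0), lut_3d))
# -> mixes in a little green and blue, pulling red back toward the sRGB primary
```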

Targeting 80 cd/m2 and the sRGB gamma curve, we see similar results. The contrast ratio drops even further, but that almost always happens. Colors show the same issues we've seen throughout, with the DeltaL improving but not the Hue or Saturation.
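
For reference, the sRGB curve targeted here is not a plain 2.2 power law; a minimal sketch of the standard sRGB transfer function, scaled to the 80 cd/m2 target, shows the difference:

```python
# The sRGB EOTF is linear near black and follows a 2.4 power curve above
# the knee, which averages out to roughly gamma 2.2 overall.

def srgb_eotf(v: float) -> float:
    """Relative luminance (0..1) for an encoded sRGB value v in [0, 1]."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

WHITE_NITS = 80.0   # the review's second calibration target
for pct in (10, 25, 50, 100):
    print(f"{pct:3d}% stimulus -> {WHITE_NITS * srgb_eotf(pct / 100):6.2f} cd/m2")
```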

Color accuracy on the Acer is okay but not fantastic. Since the pre-calibration color numbers are almost all below a dE2000 of 3.0, most people will be fine with it. Many 4K displays to this point have focused on designers and photo editors, but the Acer is very much a gaming display, and in practice few gamers will notice anything amiss with the colors unless a display is really far off, which is certainly not the case here.

Comments
  • MrSpadge - Thursday, January 29, 2015 - link

    Jarred, please test his claims and modded drivers! He surely comes across as dubious, but if he's correct, that's a real bomb waiting to explode.
  • SkyBill40 - Thursday, January 29, 2015 - link

    There's one huge thing he's doing that makes his claim patently false: he's running that game WINDOWED, and G-Sync only works full screen. Period. So, in essence, while he does make an interesting point... he's full of shit.
  • JarredWalton - Thursday, January 29, 2015 - link

    The current (updated) video is running the pendulum fullscreen, but again... his claims are dubious at best. "Look, it has an Altera FPGA. The only thing that's good for is security!" Ummm... does he even know what FPGA means? Field Programmable Gate Array, as in, you can program it to do pretty much anything you want within the confines of the number of gates available. Also, the suggestion that G-SYNC (which was released before AMD ever even talked about FreeSync) is the same as FreeSync is ludicrous.

    FWIW, I've seen laptop displays that can run at 50Hz before, so with this demo seemingly running at a static 50 FPS, it's not really that crazy to have "modded drivers" work. Sure, the drivers allow you to apparently turn G-SYNC on or off, but he could mod the drivers to actually turn triple buffering on/off and I doubt most of us could tell the difference via an Internet video.

    He needs to show it running a game with a variable FPS (with a FRAPS counter), and he needs to zoom out enough that we can see the full laptop and not just a portion of the screen. Take a high speed video of that -- with the camera mounted on a tripod and not in his hands -- and someone could actually try stepping through the frames to see how long each frame is on screen. It would be a pain in the butt for certain, but it would at least make his claims plausible.

    My take is that if G-SYNC is basically hacked, it would have come to light a long time ago. Oh, wait -- the random guy on the Internet with his modded drivers (anyone have time to do a "diff" and see what has changed?) is smarter than all of the engineers at AMD, the display companies, etc.
  • SkyBill40 - Thursday, January 29, 2015 - link

    I agree with you and appreciate your more in depth commentary on it. I still, like you, find his claim(s) to be quite dubious and likely to be pure crap.
  • JarredWalton - Saturday, January 31, 2015 - link

    Turns out this is NOT someone doing a hack to enable G-SYNC; it's an alpha leak of NVIDIA's drivers where they're trying to make G-SYNC work with laptops. PCPer did a more in-depth look at the drivers here:
    http://www.pcper.com/reviews/Graphics-Cards/Mobile...

    So, not too surprisingly, it might be possible to get most of the G-SYNC functionality with drivers alone, but it still requires more work. It also requires no Optimus (for now), and you need a better-than-average display to drive it.
  • Will Robinson - Thursday, January 29, 2015 - link

    Thanx for reposting that link Pork. I posted it yesterday, but it seems some people want to accuse him of being a conspiracy theorist or somehow not of sound mind rather than evaluate his conclusions with an open mind.
    I wondered if we would get an official response from AT.
  • nos024 - Thursday, January 29, 2015 - link

    Technically, if AMD is the only one supporting "FreeSync" you'll still be so-called "vendor-locked"? No?

    As a PC gamer you only have two choices for high-performance gaming video cards. So I don't understand this so-called vendor-lock debate with G-Sync and FreeSync. Just because G-Sync comes in the form of a chip and FreeSync comes with the new version of DisplayPort, it's the same deal.
  • SkyBill40 - Thursday, January 29, 2015 - link

    No, it's not the same thing. G-Sync is wholly proprietary and its effects will *not* work without a G-Sync capable video card; on the contrary, FreeSync is just that: free to whatever card you have, no matter the vendor. It's open source, so there are no proprietary chips in the design. It just works. Period.
  • nos024 - Thursday, January 29, 2015 - link

    What do you mean it just works? If Nvidia decides not to support it, AMD becomes the only one to support it, which means vendor-lock anyways.

    So you are saying if I decide to use Intel's IGP (given it comes with the correct DisplayPort version), I need no additional support from Intel (driver) and FreeSync will just work? I don't think it's THAT easy. Bottom line is, you will be locked to an AMD graphics card IF AMD is the only one supporting it. It doesn't matter how it is implemented into the hardware - it's all about support.

    The only thing it has going for it is that, from a monitor manufacturer's point of view, there's no royalty paid to AMD to adopt the technology.
  • Black Obsidian - Thursday, January 29, 2015 - link

    And no additional (monitor) hardware required.
    And it's part of the DisplayPort spec.
    And any GPU manufacturer that wants to support it is free to do so.

    The only thing that Freesync has going *against* it is that nVidia might decide to be a bag of dicks and refuse to support an open standard in favor of their added-cost (read: added-profit) proprietary solution.
