Acer XB280HK: sRGB Calibration and Bench Tests

Pre-calibration, the Acer has a blue tint to the grayscale and a strange bump past 95%. A bump like this typically means the contrast is set too high, causing the panel to run out of one color channel before the others. In this case it seems to be running out of red and green, causing the blue levels to spike. The gamma keeps rising as well, pushing the grayscale dE2000 values to 3.0 at points.

Colors are fairly well behaved, with the dE2000 values for the color checker staying below 3.0 for most of the range. Many of them sit very close to 3.0, so on static images you can tell the difference from accurate colors, but for non-professionals the display performs reasonably well.
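
For readers who want to see where a dE2000 number comes from, here is a minimal sketch computing the CIEDE2000 difference for a single patch with scikit-image; the Lab values are illustrative placeholders, not measurements from this review.

```python
# Minimal sketch: computing a dE2000 error for a single color patch.
# The Lab values below are illustrative placeholders, not measurements
# from the review.
import numpy as np
from skimage.color import deltaE_ciede2000

target_lab = np.array([65.0, 18.0, -30.0])    # what sRGB says the patch should be
measured_lab = np.array([64.2, 20.5, -27.8])  # what the meter reports off the panel

dE = deltaE_ciede2000(target_lab, measured_lab)
print(f"dE2000 = {dE:.2f}")  # values under ~3.0 are hard to spot in normal content
```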

For calibration, we use SpectraCal CalMAN 5.3.5 with our own custom workflow. We target 200 cd/m2 of light output with a gamma of 2.2 and the sRGB color gamut, which corresponds to a general real-world use case. The meters used are an i1Pro2 provided by X-Rite and a SpectraCal C6. All measurements use APL 50% patterns except for uniformity testing, which uses full field.
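
As a rough illustration of what the 200 cd/m2, gamma 2.2 target means in practice, here is a minimal sketch of the target luminance for each grayscale step using a pure power law scaled to the white level; this is a simplified model (no black-level compensation), not CalMAN's actual workflow math.

```python
# Simplified sketch of grayscale targets for a 200 cd/m2, gamma 2.2 calibration.
# This ignores black-level offset handling, which a real workflow accounts for.

WHITE_LEVEL = 200.0  # cd/m2, the review's target
GAMMA = 2.2

def target_luminance(stimulus_percent: float) -> float:
    """Target luminance for a gray patch at the given stimulus level (0-100%)."""
    signal = stimulus_percent / 100.0
    return WHITE_LEVEL * (signal ** GAMMA)

for step in range(0, 101, 10):
    print(f"{step:3d}% stimulus -> {target_luminance(step):7.2f} cd/m2")
```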

                        Pre-Calibration   Post-Calibration, 200 cd/m2   Post-Calibration, 80 cd/m2
White Level (cd/m2)     200.0             200.4                         78.8
Black Level (cd/m2)     0.2602            0.2723                        0.1157
Contrast Ratio          769:1             736:1                         681:1
Gamma (Average)         2.31              2.18                          2.60
Color Temperature       7065K             6629K                         6493K
Grayscale dE2000        2.18              0.44                          0.59
Color Checker dE2000    2.42              1.60                          1.55
Saturations dE2000      2.35              1.36                          1.48

Post-calibration, the RGB balance and gamma are almost perfect. The contrast ratio is only 736:1, but that isn't much of a drop from the pre-calibration level of 769:1. Color errors are reduced, but as I'll show here, that is only because the luminance levels are fixed. Unless a monitor has a 3D LUT, you cannot correct for over-saturation or tint errors in a display. Using an ICC profile and an ICC-aware application you can fix some of those, but most applications are not ICC aware. Below you'll see the color checker charts broken out into three different errors: Luminance, Color, and Hue. Color and Hue are what we cannot fix, while Luminance we can.
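
To make the Luminance/Color/Hue split concrete, below is a minimal sketch of decomposing a CIELAB error into lightness, chroma, and hue components using the classic ΔL*/ΔC*/ΔH* relationship; it illustrates the idea rather than reproducing CalMAN's exact math.

```python
# Sketch: splitting a CIELAB color error into lightness, chroma, and hue parts.
# DeltaL can be corrected by adjusting the grayscale/gamma; DeltaC and DeltaH
# generally cannot be fixed without a 3D LUT or an ICC-aware application.
import math

def decompose_lab_error(ref, meas):
    """Return (dL, dC, dH) between a reference and measured Lab triplet."""
    L1, a1, b1 = ref
    L2, a2, b2 = meas
    dL = L2 - L1
    dC = math.hypot(a2, b2) - math.hypot(a1, b1)
    # Hue difference is what remains of the total Lab distance after
    # removing the lightness and chroma components.
    dE_ab_sq = (L2 - L1) ** 2 + (a2 - a1) ** 2 + (b2 - b1) ** 2
    dH = math.sqrt(max(dE_ab_sq - dL ** 2 - dC ** 2, 0.0))
    return dL, dC, dH

# Illustrative numbers only, not the review's measurements.
dL, dC, dH = decompose_lab_error((50.0, 40.0, 20.0), (48.5, 44.0, 21.0))
print(f"dL = {dL:+.2f}, dC = {dC:+.2f}, dH = {dH:.2f}")
```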

As we can see, the DeltaL values are almost perfect now, but the DeltaC and DeltaH values are basically identical to before. Unless you have either ICC-aware applications or a monitor with a 3D LUT, this is all you'll ever be able to do to correct a display. Grayscale and gamma improve, but a display still needs accurate colors to be truly correct.

Targeting 80 cd/m2 and the sRGB gamma curve, we see similar results. The contrast ratio drops even further, but that almost always happens. Colors have the same issues we've seen throughout, with DeltaL improving but not the hue or saturation.
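
Because this pass targets the sRGB gamma curve rather than a flat 2.2 power law, here is a minimal sketch of the piecewise sRGB EOTF (per IEC 61966-2-1) compared against gamma 2.2, scaled to the 80 cd/m2 white level for context.

```python
# Sketch: the piecewise sRGB EOTF versus a pure 2.2 power law,
# scaled to an 80 cd/m2 white level.

WHITE_LEVEL = 80.0  # cd/m2

def srgb_eotf(v: float) -> float:
    """sRGB electro-optical transfer function (IEC 61966-2-1), input 0..1."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def power_22(v: float) -> float:
    return v ** 2.2

for pct in (10, 25, 50, 75, 100):
    v = pct / 100.0
    print(f"{pct:3d}%: sRGB -> {WHITE_LEVEL * srgb_eotf(v):6.2f} cd/m2, "
          f"gamma 2.2 -> {WHITE_LEVEL * power_22(v):6.2f} cd/m2")
```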

Color accuracy on the Acer is okay but not fantastic. Since the pre-calibration numbers for colors are almost all below dE2000 levels of 3.0, most people will be fine with it. Many 4K displays to this point have focused on designers and photo editors, but the Acer is very much a gaming display, and in practice few gamers will notice anything wrong with the colors unless a display is far off, and that's certainly not the case here.

69 Comments

  • JarredWalton - Thursday, January 29, 2015 - link

    This remains to be seen. Adaptive VSYNC is part of DisplayPort now, but I don't believe it's required -- it's optional. Which means that it almost certainly requires something in addition to just supporting DisplayPort. What FreeSync has going against it is that it is basically a copy of something NVIDIA created, released as an open standard, but the only graphics company currently interested in supporting it is AMD. If NVIDIA hadn't created G-SYNC, would we even have something coming in March called FreeSync?

    My bet is FreeSync ends up requiring:
    1) Appropriate driver level support.
    2) Some minimum level of hardware support on the GPU (i.e. I bet it won't work on anything prior to GCN cards)
    3) Most likely a more complex scaler in the display to make adaptive VSYNC work.
    4) A better panel to handle the needs of adaptive VSYNC.

    We'll see what happens when FreeSync actually ships. If Intel supports it, that's a huge win. That's also very much an "IF" not "WHEN". Remember how long it took Intel to get the 23.97Hz video stuff working?
  • Black Obsidian - Thursday, January 29, 2015 - link

    I agree with you on 1 and 2, and *possibly* 3, but I wouldn't bet on that last one myself, nor on #4. The monitor reviewed here--with a 60Hz maximum and pixel decay under 40Hz--would seem to suggest that a better panel isn't at all necessary.

    I also completely agree that, absent G-SYNC, Freesync very likely wouldn't exist. But that's often the way of things: someone comes up with a novel feature, the market sees value in it, and a variant of it becomes standardized.

    G-SYNC is brilliant, but G-SYNC is also clumsy, because of the compromises that are necessary when you can't exert control over all of the systems you depend on. Now that a proper standard exists, those compromises are no longer necessary, and the appropriate thing to do is to stop making them and transfer those resources elsewhere. This, of course, assumes that Freesync doesn't come with greater compromises of its own, but there's presently no reason to expect that it does.

    As for Intel, the 23.97Hz issue persisted as long as it did because you could round down the number of people who really cared to "nobody." It's possible that the number of people who care about Freesync in an IGP rounds similarly too, of course.
  • andrewaggb - Thursday, January 29, 2015 - link

    Freesync in an IGP for laptops and tablets would be a big deal I think.
  • nos024 - Friday, January 30, 2015 - link

    That's exactly what I am saying. Basically, we have only two GPU choices for PC gaming, nVidia or AMD. I'd understand the vendor-lock argument if there was a third and fourth player, but if nVidia doesn't support Free-Sync, you are basically locked into AMD GPUs for Freesync gaming.

    I'm sure nVidia can reduce the royalty fee or eliminate it completely, but you know what? There's nothing competing against it right now.

    nVidia seems to get away with lots of things, e.g. for a motherboard to implement SLI, the maker needs to license it, and it only comes on enthusiast chipsets (Z77/Z87/Z97). Xfire comes free with all Intel chipsets - yet SLI is still pretty popular...just saying.
  • anubis44 - Tuesday, February 3, 2015 - link

    I say let nVidia be a bag of dicks and refuse to support the open standard. Then we'll see their true colours and know to boycott them, the greedy bastards.
  • SkyBill40 - Thursday, January 29, 2015 - link

    I think BO and Jarred have it pretty much covered.
  • anubis44 - Tuesday, February 3, 2015 - link

    Somebody'll hack the nVidia drivers to make nVidia cards work with Freesync, kind of like the customized Omega drivers for ATI/AMD graphics cards a few years ago. You can count on it. nVidia wants to charge you for something that can easily be done without paying them any licensing fees. I think we should just say no to that.
  • MrSpadge - Thursday, January 29, 2015 - link

    If Intel were smart they'd simply add FreeSync support to their driver. nVidia gamers could use Virtu to have the Intel IGP, with FreeSync, output the signal from their cards. Non-gamers would finally get stutter-free video and G-Sync would be dead.

    Whether or not Intel ever takes this route, they could do so, and hence FreeSync is not "vendor-locked".
  • phoenix_rizzen - Thursday, January 29, 2015 - link

    "The high resolution also means working in normal applications at 100% scaling can be a bit of an eyestrain (please, no comments from the young bucks with eagle eyes; it’s a real concern and I speak from personal experience), and running at 125% or 150% scaling doesn’t always work properly. Before anyone starts to talk about how DPI scaling has improved, let me quickly point out that during the holiday season, at least three major games I know of shipped in a state where they would break if your Windows DPI was set to something other than 100%. Oops. I keep hoping things will improve, but the software support for HiDPI still lags behind where it ought to be."

    This is something I just don't understand. How can it be so hard?

    An inch is an inch is an inch, it never changes. The number of pixels per inch does change, though, as the resolution changes. Why is it so hard for the graphics driver to adapt?

    A 12pt character should be the exact same size on every monitor, regardless of the DPI, regardless of the resolution, regardless of the screen size.

    We've perfected this in the print world. Why hasn't it carried over to the video world? Why isn't this built into every OS by default?

    Just seems bizarre that we can print at 150x150, 300x300, 600x600, 1200x1200, and various other resolutions in between without the characters changing size (12pt is 12pt at every resolution) and yet this doesn't work on computer screens.
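
The arithmetic being asked about here is simple: a point is 1/72 of an inch, so the pixel size of a 12pt character follows directly from the display's true pixels per inch. A minimal sketch (the PPI figures are approximate):

```python
# Sketch: pixels needed for a 12pt character to stay the same physical size
# at different pixel densities. One point = 1/72 inch. PPI values approximate.

def points_to_pixels(points: float, ppi: float) -> float:
    return points * ppi / 72.0

displays = [
    ('24" 1080p (~92 PPI)', 92),
    ('28" 4K like this Acer (~157 PPI)', 157),
    ("300 dpi print", 300),
]
for name, ppi in displays:
    print(f"{name}: 12pt -> {points_to_pixels(12, ppi):.1f} px")
```
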
  • DanNeely - Thursday, January 29, 2015 - link

    It's not the drivers; it's the applications. The basic win32 APIs (like all mainstream UI APIs from the era) are raster based and use pixels as the standard item size and spacing unit. This was done because on the slower hardware of the era the overhead from trying to do everything in inches or cm was an unacceptable performance hit when the range of DPIs that they needed to work on wasn't wide enough for it to be a problem.

    You can make applications built on them work with DPI scaling, but it would be a lot of work. At a minimum, everywhere you're doing layout/size calculations you'd need to multiply the numbers you're computing for sizes and positions by the scaling factor. I suspect that if you wanted to avoid occasional bits of low-level jerkiness when resizing, you'd also need to add a bunch of twiddles to manage the remainders you get when scaling doesn't give integral sizes (e.g. 13 * 1.25 = 16.25). If you have any custom controls that you're drawing yourself, you'd need to redo their paint methods as well. It didn't help that prior to Windows 8, you had to log out and back in to change the DPI scaling level, which made debugging it very painful for anyone who tried to make it work.

    Newer interface libraries are pixel independent and do all the messy work for you, but swapping one in is a major rewrite. For Windows, the first one from MS was Windows Presentation Foundation (WPF), which launched in 2006 and was .net only. You can mix C/C++ and .net in a single application, but it's messy and annoying to do at best. Windows 8 was the first version to offer a descendant of WPF to C++ applications directly, but between the lack of compatibility with Win7 systems (meaning two different UIs to maintain) and the general dislike of the non-windowed nature of Metro applications, it hasn't gained much traction in the market.

    Disclosure: I'm a software developer whose duties include maintaining several internal or single-customer line-of-business applications written in .net using the non-DPI-aware Windows Forms UI library. Barring internal systems being upgraded to Win 8 or higher (presumably Win10) and high DPI displays, or a request from one of our customers to make it happen (along with enough money to pay for it), I don't see any of what I maintain getting the level of rewrite needed to retrofit DPI awareness.
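
To make the scaling arithmetic described in the comment above concrete, here is a minimal sketch of what retrofitting a pixel-based layout looks like: every hard-coded size gets multiplied by the scale factor, and non-integral results (like 13 * 1.25 = 16.25) have to be rounded somewhere. The names are illustrative, not from any real UI framework.

```python
# Sketch: scaling hard-coded pixel sizes for DPI awareness.
# Names are illustrative; real frameworks (WPF, etc.) handle this internally.

def scale(px: int, factor: float) -> int:
    # Non-integral results have to be rounded; rounding half-up consistently
    # keeps controls from drifting by a pixel relative to each other.
    return int(px * factor + 0.5)

BASE_LAYOUT = {"button_width": 75, "button_height": 23, "padding": 13, "font_px": 12}

for factor in (1.0, 1.25, 1.5):  # 100%, 125%, 150% Windows scaling
    scaled = {name: scale(v, factor) for name, v in BASE_LAYOUT.items()}
    print(f"{int(factor * 100)}%: {scaled}")
```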
