Acer XB280HK: Brightness and Contrast

Using a full white screen and setting the backlight to maximum, the Acer manages to produce 292 cd/m2 of brightness. That is pretty bright, though not nearly as bright as some displays can get. I don’t think anyone will really have an issue with this level unless there is direct sun on the display (in which case: close your blinds or move the display). The matte finish should cut down on reflections as well. Setting the backlight to minimum drops this to 34 cd/m2 – dim enough for any real-world use, but not so dim that we’re losing flexibility in the settings.

White Level – i1Pro and C6

At maximum backlight, the black level is 0.3787 cd/m2, which is relatively high considering the white level. At minimum backlight this falls to 0.0453 cd/m2, but on its own that number is almost meaningless; black levels only matter relative to the white level. We’ll see how the display really performs when we get to the contrast ratios.

Black Level – i1Pro and C6

The contrast ratios for the Acer are decidedly average. At around 770:1 they are on the lower side even for a TN display. With movies, blacks will not be great and the overall image won’t pop as much as it would on a good IPS or VA display. For gaming it should be fine, since the lighter blacks actually make it a bit easier to see into shadows, but for watching movies it won’t look as good.

Contrast Ratio – i1Pro and C6
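
As a quick sanity check, the contrast ratio is simply the white level divided by the black level, and the measured values above reproduce the result:

292 / 0.3787 ≈ 771:1 (maximum backlight)
34 / 0.0453 ≈ 750:1 (minimum backlight)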

The Acer posts just acceptable numbers across the board here. It’s bright, but brightness alone can’t make up for mediocre blacks. The contrast ratios are only okay even for a TN panel at this point, though I wonder how much 4K plays a role in this: smaller pixels are probably harder to drive fully open and fully closed, so contrast ratios are likely to suffer a bit. In time this will likely improve, but the Acer is only mediocre to fair at the moment.

Comments

  • JarredWalton - Thursday, January 29, 2015

    This remains to be seen. Adaptive-Sync is part of DisplayPort now, but I don't believe it's required -- it's optional. Which means that it almost certainly requires something in addition to just supporting DisplayPort. What FreeSync has going against it is that it is basically a copy of something NVIDIA created, released as an open standard, but the only graphics company currently interested in supporting it is AMD. If NVIDIA hadn't created G-SYNC, would we even have something coming in March called FreeSync?

    My bet is FreeSync ends up requiring:
    1) Appropriate driver level support.
    2) Some minimum level of hardware support on the GPU (i.e. I bet it won't work on anything prior to GCN cards)
    3) Most likely a more complex scaler in the display to make Adaptive-Sync work.
    4) A better panel to handle the needs of Adaptive-Sync.

    We'll see what happens when FreeSync actually ships. If Intel supports it, that's a huge win. That's also very much an "IF" not "WHEN". Remember how long it took Intel to get the 23.97Hz video stuff working?
  • Black Obsidian - Thursday, January 29, 2015

    I agree with you on 1 and 2, and *possibly* 3, but I wouldn't bet on that last one myself, nor on #4. The monitor reviewed here--with a 60Hz maximum and pixel decay under 40Hz--would seem to suggest that a better panel isn't at all necessary.

    I also completely agree that, absent G-SYNC, FreeSync very likely wouldn't exist. But that's often the way of things: someone comes up with a novel feature, the market sees value in it, and a variant of it becomes standardized.

    G-SYNC is brilliant, but G-SYNC is also clumsy, because of the compromises that are necessary when you can't exert control over all of the systems you depend on. Now that a proper standard exists, those compromises are no longer necessary, and the appropriate thing to do is to stop making them and transfer those resources elsewhere. This, of course, assumes that FreeSync doesn't come with greater compromises of its own, but there's presently no reason to expect that it does.

    As for Intel, the 23.97Hz issue persisted as long as it did because you could round down the number of people who really cared to "nobody." It's possible that the number of people who care about FreeSync in an IGP rounds similarly, of course.
  • andrewaggb - Thursday, January 29, 2015

    FreeSync in an IGP for laptops and tablets would be a big deal, I think.
  • nos024 - Friday, January 30, 2015

    That's exactly what I am saying. Basically, we have only two GPU choices for PC gaming, nVidia or AMD. I'd understand the vendor-lock argument if there were a third and fourth player, but if nVidia doesn't support FreeSync, you are basically locked into AMD GPUs for FreeSync gaming.

    I'm sure nVidia can reduce the royalty fee or eliminate it completely, but you know what? There's nothing competing against it right now.

    nVidia seems to get away with lots of things. For example, for a motherboard to implement SLI, the maker needs to license it, and it only comes on enthusiast chipsets (Z77/Z87/Z97). Xfire comes free with all Intel chipsets - yet SLI is still pretty popular... just saying.
  • anubis44 - Tuesday, February 3, 2015

    I say let nVidia be a bag of dicks and refuse to support the open standard. Then we'll see their true colours and know to boycott them, the greedy bastards.
  • SkyBill40 - Thursday, January 29, 2015

    I think BO and Jarred have it pretty much covered.
  • anubis44 - Tuesday, February 3, 2015

    Somebody'll hack the nVidia drivers to make nVidia cards work with FreeSync, kind of like the customized Omega drivers for ATI/AMD graphics cards a few years ago. You can count on it. nVidia wants to charge you for something that can easily be done without paying them any licensing fees. I think we should just say no to that.
  • MrSpadge - Thursday, January 29, 2015

    If Intel were smart they'd simply add FreeSync support to their driver. nVidia gamers could use Virtu to route the signal from their cards through the Intel IGP and get FreeSync. Non-gamers would finally get stutter-free video, and G-SYNC would be dead.

    Whether or not Intel ever takes this route, they could do so, and hence FreeSync is not "vendor-locked".
  • phoenix_rizzen - Thursday, January 29, 2015

    "The high resolution also means working in normal applications at 100% scaling can be a bit of an eyestrain (please, no comments from the young bucks with eagle eyes; it’s a real concern and I speak from personal experience), and running at 125% or 150% scaling doesn’t always work properly. Before anyone starts to talk about how DPI scaling has improved, let me quickly point out that during the holiday season, at least three major games I know of shipped in a state where they would break if your Windows DPI was set to something other than 100%. Oops. I keep hoping things will improve, but the software support for HiDPI still lags behind where it ought to be."

    This is something I just don't understand. How can it be so hard?

    An inch is an inch is an inch; it never changes. The number of pixels per inch does change, though, as the resolution changes. Why is it so hard for the graphics driver to adapt?

    A 12pt character should be the exact same size on every monitor, regardless of the DPI, regardless of the resolution, regardless of the screen size.

    We've perfected this in the print world. Why hasn't it carried over to the video world? Why isn't this built into every OS by default?

    Just seems bizarre that we can print at 150x150, 300x300, 600x600, 1200x1200, and various other resolutions in between without the characters changing size (12pt is 12pt at every resolution) and yet this doesn't work on computer screens.
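
    (For reference, the arithmetic the print world relies on is straightforward: a point is 1/72 of an inch, so pixel height = points / 72 x DPI. 12pt text is therefore 16 pixels tall at 96 DPI, 20 pixels at 120 DPI (125% scaling), and 50 dots at a printer's 300 DPI. The physical size never changes; only the number of dots used to draw it does.)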
  • DanNeely - Thursday, January 29, 2015

    It's not the drivers; it's the applications. The basic win32 APIs (like all mainstream UI APIs of the era) are raster based and use pixels as the standard item size and spacing unit. This was done because, on the slower hardware of the era, the overhead of trying to do everything in inches or cm was an unacceptable performance hit, and the range of DPIs the APIs needed to work across wasn't wide enough for it to be a problem.

    You can make applications built on them work with DPI scaling, but it would be a lot of work. At a minimum, everywhere you're doing layout/size calculations you'd need to multiply the numbers you're computing for sizes and positions by the scaling factor (sketched below). I suspect that to avoid bits of occasional low-level jerkiness when resizing you'd also need to add a bunch of twiddles to manage the remainders you get when scaling doesn't give integral sizes (e.g. 13 * 1.25 = 16.25). If you have any custom controls that you're drawing yourself, you'd need to redo their paint methods as well. It didn't help that prior to Windows 8 you had to log out and back in to change the DPI scaling level, which made debugging very painful for anyone who tried to make it work.
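
    To make that concrete, here's a minimal sketch in classic Win32 C++ of that per-calculation scaling; the control sizes are made up, and GetDeviceCaps/MulDiv are the era-appropriate APIs:

        #include <windows.h>

        // Query the system DPI; 96 = 100% scaling, 120 = 125%, 144 = 150%.
        static int QuerySystemDpi()
        {
            HDC hdc = GetDC(NULL);                    // screen device context
            int dpi = GetDeviceCaps(hdc, LOGPIXELSX); // horizontal DPI
            ReleaseDC(NULL, hdc);
            return dpi;
        }

        // Scale a design-time (96 DPI) pixel value. MulDiv rounds to the
        // nearest integer, which is exactly where the fractional remainders
        // creep in: 13 px at 125% is 16.25 px, and something has to absorb
        // the leftover .25.
        static int ScalePx(int designPx, int dpi)
        {
            return MulDiv(designPx, dpi, 96);
        }

        // Hypothetical layout code: every size and position must be wrapped.
        void LayoutButton(HWND hButton, int dpi)
        {
            MoveWindow(hButton,
                       ScalePx(10, dpi), ScalePx(10, dpi),  // x, y
                       ScalePx(75, dpi), ScalePx(23, dpi),  // width, height
                       TRUE);
        }

    Multiply that by every dialog and custom control in a large codebase and the scale of the retrofit becomes clear.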

    Newer interface libraries are pixel-independent and do all the messy work for you, but swapping one in is a major rewrite. For Windows, the first one from MS was Windows Presentation Foundation (WPF), which launched in 2006 and was .net only. You can mix C/C++ and .net in a single application, but it's going to be messy and annoying to do at best. Windows 8 was the first version to offer a descendant of WPF to C++ applications directly, but between the lack of compatibility with Win7 systems (meaning two different UIs to maintain) and the general dislike of the non-windowed nature of Metro applications, it hasn't gained much traction in the market.

    Disclosure: I'm a software developer whose duties include maintaining several internal or single-customer line-of-business applications written in .net using the non-DPI-aware Windows Forms UI library. Barring internal systems being upgraded to Win 8 or higher (presumably Win10) and high-DPI displays, or a request from one of our customers to make it happen (along with enough money to pay for it), I don't see any of what I maintain getting the level of rewrite needed to retrofit DPI awareness.
