Acer XB280HK: Brightness and Contrast

Using a full white screen and setting the backlight to maximum, the Acer manages to produce 292 cd/m2 of brightness. That is pretty bright, though not nearly as bright as some displays can get. I don’t think anyone will really have an issue with this level unless there is direct sunlight on the display (in which case: close your blinds or move the display). The matte finish should cut down on reflections as well. Setting the backlight to minimum drops this to 34 cd/m2 – dark enough for any real-world use, but not so dark that we’re losing flexibility in the settings.

White Level – i1Pro and C6

At maximum backlight, the black level is 0.3787 cd/m2, which is relatively bright considering the white level. At minimum this falls to 0.0453 cd/m2, but again, without the corresponding white level this value is almost meaningless on its own. We’ll see how it really stacks up when we get to the contrast ratios.

Black Level – i1Pro and C6

The contrast ratios for the Acer are decidedly average. At around 770:1 they are on the lower side even for a TN display. With movies, blacks will not be great and the overall image won’t pop as much as it would on a good IPS or VA display. For gaming it should be fine – the lighter blacks actually make it a bit easier to see into shadows – but for watching movies it won’t look as good.
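Since the contrast ratio is simply the white level divided by the black level, the figures above are easy to sanity check. Here’s a minimal sketch in Python using the measured values from this review (purely illustrative):

    # Contrast ratio = white luminance / black luminance (both in cd/m2).
    # Measured values taken from this review (i1Pro + C6).
    measurements = {
        "max backlight": {"white": 292.0, "black": 0.3787},
        "min backlight": {"white": 34.0, "black": 0.0453},
    }

    for setting, m in measurements.items():
        ratio = m["white"] / m["black"]
        print(f"{setting}: {ratio:.0f}:1")

    # Output:
    # max backlight: 771:1
    # min backlight: 751:1

Both backlight settings land in the same rough 750-770:1 range, which is why the backlight level barely moves the contrast results.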

Contrast Ratio – i1Pro and C6

The Acer posts just acceptable numbers across the board here. It’s reasonably bright, but brightness alone can’t make up for middling contrast. The contrast ratios are only okay even for a TN panel at this point, though I wonder how much 4K plays a role: smaller pixels are likely harder to drive fully open and fully closed, so contrast ratios probably suffer a bit as a result. In time this will likely improve, but the Acer is only mediocre to fair at the moment.

Comments

  • MrSpadge - Thursday, January 29, 2015 - link

    Jarred, please test his claims and modded drivers! He surely comes across as dubious, but if he's correct that's a real bomb waiting to explode.
  • SkyBill40 - Thursday, January 29, 2015 - link

    There's one huge thing he's doing that makes his claim patently false: he's running that game WINDOWED, and G-Sync only works full screen. Period. So, in essence, while he does make an interesting point... he's full of shit.
  • JarredWalton - Thursday, January 29, 2015 - link

    The current (updated) video is running the pendulum fullscreen, but again... his claims are dubious at best. "Look, it has an Altera FPGA. The only thing that's good for is security!" Ummm... does he even know what FPGA means? Field Programmable Gate Array, as in, you can program it to do pretty much anything you want within the confines of the number of gates available. Also, the suggestion that G-SYNC (which was released before AMD ever even talked about FreeSync) is the same as FreeSync is ludicrous.

    FWIW, I've seen laptop displays that can run at 50Hz before, so with this demo apparently running at a static 50 FPS, it's not really that crazy for "modded drivers" to work. Sure, the drivers apparently let you turn G-SYNC on or off, but he could just as easily mod the drivers to turn triple buffering on/off, and I doubt most of us could tell the difference via an Internet video.

    He needs to show it running a game with a variable FPS (with a FRAPS counter), and he needs to zoom out enough that we can see the full laptop and not just a portion of the screen. Take a high-speed video of that -- with the camera mounted on a tripod and not in his hands -- and someone could actually try stepping through the frames to see how long each frame is on screen. It would be a pain in the butt for certain, but it would at least make his claims plausible. (A rough sketch of that frame-stepping analysis follows this comment.)

    My take is that if G-SYNC could be hacked this easily, it would have come to light a long time ago. Oh, wait -- the random guy on the Internet with his modded drivers (anyone have time to do a "diff" and see what has changed?) is smarter than all of the engineers at AMD, the display companies, etc.
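For anyone curious what that frame-stepping analysis might look like in practice, here is a minimal, untested sketch (Python with OpenCV; the capture filename and the change threshold are assumptions that would need tuning for a real test, and a real analysis would want a much higher capture rate than the display refresh):

    import cv2
    import numpy as np

    # Hypothetical high-speed (e.g. 240 fps) capture of the laptop screen.
    cap = cv2.VideoCapture("highspeed_capture.mp4")
    CHANGE_THRESHOLD = 8.0  # mean absolute pixel difference; tune per setup

    durations = []   # capture frames each rendered frame stayed on screen
    prev_gray = None
    persist = 0

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev_gray is not None:
            diff = float(np.mean(cv2.absdiff(gray, prev_gray)))
            if diff > CHANGE_THRESHOLD:
                durations.append(persist)  # previous rendered frame ended
                persist = 0
        persist += 1
        prev_gray = gray

    cap.release()
    print(durations)

With variable-FPS content, true G-SYNC should show smoothly varying persistence times, whereas a fixed refresh rate with V-Sync tends to produce durations quantized to multiples of the refresh interval.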
  • SkyBill40 - Thursday, January 29, 2015 - link

    I agree with you and appreciate your more in-depth commentary on it. Like you, I still find his claim(s) to be quite dubious and likely pure crap.
  • JarredWalton - Saturday, January 31, 2015 - link

    Turns out this is NOT someone hacking in G-SYNC support; it's an alpha leak of NVIDIA's drivers where they're trying to make G-SYNC work with laptops. PCPer took a more in-depth look at the drivers here:
    http://www.pcper.com/reviews/Graphics-Cards/Mobile...

    So, not too surprisingly, it might be possible to get most of the G-SYNC functionality with drivers alone, but it still requires more work. It also means no Optimus (for now), and you need a better-than-average laptop display to drive it.
  • Will Robinson - Thursday, January 29, 2015 - link

    Thanx for reposting that link Pork. I posted it yesterday, but it seems some people would rather accuse him of being a conspiracy theorist or somehow not of sound mind than evaluate his conclusions with an open mind.
    I wondered if we would get an official response from AT.
  • nos024 - Thursday, January 29, 2015 - link

    Technically, if AMD ends up being the only one supporting "FreeSync", you're still so-called "vendor-locked", no?

    As a PC gamer you only have two choices for high-performance gaming video cards, so I don't understand this so-called vendor-lock debate with G-Sync and FreeSync. Just because G-Sync comes in the form of a chip and FreeSync comes with the new version of DisplayPort doesn't change that – it's the same deal.
  • SkyBill40 - Thursday, January 29, 2015 - link

    No, it's not the same thing. G-Sync is wholly proprietary, and its effects will *not* work without a G-Sync-capable video card; FreeSync, on the contrary, is just that: free to whatever card you have, no matter the vendor. It's an open standard, so there are no proprietary chips in the design. It just works. Period.
  • nos024 - Thursday, January 29, 2015 - link

    What do you mean, it just works? If Nvidia decides not to support it, AMD becomes the only one to support it, which means vendor lock-in anyway.

    So you're saying that if I decide to use Intel's IGP (given it comes with the correct DisplayPort version), I need no additional driver support from Intel and FreeSync will just work? I don't think it's THAT easy. The bottom line is, you will be locked to an AMD graphics card IF AMD is the only one supporting it. It doesn't matter how it's implemented in the hardware – it's all about support.

    The only thing it has going for it is that, from a monitor manufacturer's point of view, there's no royalty paid to AMD to adopt the technology.
  • Black Obsidian - Thursday, January 29, 2015 - link

    And no additional (monitor) hardware required.
    And it's part of the DisplayPort spec.
    And any GPU manufacturer that wants to support it is free to do so.

    The only thing that Freesync has going *against* it is that nVidia might decide to be a bag of dicks and refuse to support an open standard in favor of their added-cost (read: added-profit) proprietary solution.
