Acer XB280HK Conclusion - Performance

Looking only at the objective numbers, the Acer XB280HK has a few issues, the most notable being the so-so uniformity of the backlight. The over-saturation of blue is also a problem, though blue is the color where errors are least visible to the eye. The worst core result is that the contrast ratio never gets above 780:1, which robs the display of much of the pop that other panels offer today.

From a subjective, ergonomic perspective, the Acer does many things well. The menu system is very good and easy to navigate. There are four USB 3.0 ports, with two on the side that provide quick access for flash drives and other peripherals. The stand offers a wide range of flexibility and makes it easy to align the display for your field of vision.

4K itself can still be a bit of a clunky solution at times. We are still using DisplayPort 1.2, which previously meant relying on MST to reach 60Hz, though newer controllers have moved past this. DisplayPort 1.3 and HDMI 2.0 are both going to be available this year and will move past these issues completely, but that also means you'll need a new GPU. For people looking to upgrade that will be fine, but if you just invested in enough GPU power to drive a 4K display, you might not want to upgrade just yet. We're also still waiting for software to catch up on DPI scaling, as running at anything other than 100% scaling still introduces issues.
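To put some rough numbers on the bandwidth side of this, here's a quick back-of-the-envelope sketch in Python; the link rate and encoding figures come from the DP 1.2 spec, while the pixel clock is an approximation (reduced blanking), not anything from Acer's spec sheet:

```python
# Back-of-the-envelope: can DisplayPort 1.2 carry 4K at 60Hz on a single stream?
# Assumed figures: HBR2 link rate and 8b/10b coding per the DP 1.2 spec; the
# pixel clock is approximate (CVT-R2 reduced blanking).

LANES = 4
HBR2_GBPS_PER_LANE = 5.4      # raw line rate per lane
ENCODING_EFFICIENCY = 8 / 10  # 8b/10b symbol coding overhead

link_capacity = LANES * HBR2_GBPS_PER_LANE * ENCODING_EFFICIENCY  # ~17.28 Gbps

PIXEL_CLOCK_MHZ = 533.25      # ~3840x2160 @ 60Hz with reduced blanking
BITS_PER_PIXEL = 24           # 8-bit RGB

required = PIXEL_CLOCK_MHZ * 1e6 * BITS_PER_PIXEL / 1e9  # ~12.8 Gbps

print(f"DP 1.2 effective capacity: {link_capacity:.2f} Gbps")
print(f"4K60 (24bpp) requirement:  {required:.2f} Gbps")
print("Fits in a single stream:", required < link_capacity)
```

The takeaway is that DP 1.2 has the raw bandwidth for single-stream 4K60; the old MST requirement came from early timing controllers that couldn't process the full pixel clock as one stream, which is why newer controllers fixed this without waiting for a new DisplayPort revision.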

What About G-SYNC?

The question most will have is whether or not G-SYNC is worth the price of entry. All things being equal, having G-SYNC available is definitely nice, and it would be great to see all future displays incorporate such technology, but having G-SYNC inherently tied to NVIDIA GPUs makes it a less than perfect solution. AMD has their "me too" FreeSync technology coming out, and they worked with the VESA standards group to make Adaptive-Sync a part of the DisplayPort standard. They've used that to enable FreeSync... and then gave up all royalties and licensing fees for the technology. What that actually means for retail pricing, however, is still open for debate, though we should have the answer by March when the first FreeSync displays begin shipping.

Realistically, keeping the price as low as possible and using open standards is a good way to win long-term support for a technology, but G-SYNC was first and NVIDIA deserves plenty of credit. The technology is certainly a boon to gamers, especially when you're running GPUs that can't push 60+ FPS. Options like 144Hz G-SYNC displays make the technology even more desirable, as we've been stuck at 60Hz with LCDs for far too long (3D displays being a tangent of sorts). In my experience playing a large collection of games, never did I feel like G-SYNC resulted in an inferior experience compared to the alternatives, and with appropriate settings it is generally superior.

As far as the Acer 4K display goes, G-SYNC is also quite helpful, as a lot of NVIDIA GPUs (even in SLI) struggle to run many games at 4K. Where G-SYNC doesn't add much benefit is in games where you're already pushing 60+ FPS, and there are certainly plenty of times where that's true on older titles. Ultimately, how much you want G-SYNC is going to depend on what sort of hardware you have and how much you're bothered by things like stutter, tearing, and lag.
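To illustrate why adaptive refresh matters most below 60 FPS, here's a toy model comparing on-screen frame times with v-sync on a fixed 60Hz panel against an adaptive panel. The render times are hypothetical, and the model ignores a real panel's minimum refresh limit:

```python
import math

# Toy model: with v-sync on a fixed 60Hz panel, any frame that misses the
# 16.7ms deadline is held until the *next* refresh, so on-screen times bounce
# between 16.7ms and 33.3ms (visible stutter). With adaptive refresh, the
# panel scans out as soon as the frame is ready.

REFRESH_MS = 1000 / 60  # 16.67ms interval on a fixed 60Hz panel

render_times_ms = [14, 18, 22, 16, 25, 17, 30, 15]  # hypothetical GPU near 60 FPS

def vsync_display_time(render_ms):
    """Frame is held until the next fixed refresh boundary."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for r in render_times_ms:
    print(f"render {r:5.1f}ms -> v-sync {vsync_display_time(r):5.1f}ms, adaptive {r:5.1f}ms")
```

Every frame that misses the 16.7ms window doubles its on-screen time to 33.3ms under v-sync, and that oscillation is precisely the stutter G-SYNC removes.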

If those are items you rarely think about, you can hold off and wait for the technology to continue to improve, while also waiting to see if a victor emerges in the G-SYNC vs. FreeSync "war". Ideally, we'd see the two competing solutions merge, as that would be a real victory for consumers, but for the next few years we suspect NVIDIA will continue to back G-SYNC while AMD remains the only GPU vendor supporting FreeSync. For those that are bothered by stutter, tearing, and lag, the recommendation is easier: if you run an NVIDIA GPU, G-SYNC works, and it's a real value add for your next display upgrade.

On a related subject, so far all of the G-SYNC displays have been desktop models, but it would be nice to see laptops with internal G-SYNC (or FreeSync) panels. For one, laptops tend to have far more limited graphics hardware, so getting most games above 60 FPS on a mainstream laptop can be difficult if not impossible. There are again obstacles to doing this, switchable graphics for example, and no one wants to add $100 or more to the cost of a laptop unless the added functionality is highly marketable and in demand. Regardless of the technical hurdles, at some point we'd like to see adaptive refresh rates on more than just desktops; for now, G-SYNC remains a desktop display exclusive (though there are laptops that support G-SYNC on external displays).

Final Thoughts

As a display on its own, the Acer XB280HK doesn't offer the best performance, but it's still acceptable in all the important areas. As you can guess, however, the only real reason to buy this display is if you want G-SYNC, and more specifically you want it at 4K. This is the only current solution for that niche, and it's very much a niche market. Driving 4K gaming requires a lot of graphics hardware, so at the very least you should have a couple of GTX 970 cards to make good use of the display.

If you do have the hardware, the result is for the most part a great gaming experience, but you really have to be sold on 4K gaming. The XB280HK can also run with G-SYNC at lower resolutions, but you're still fundamentally limited to 60Hz and lower refresh rates. The main alternative right now is the ASUS ROG Swift PG278Q with its QHD 144Hz panel; Jarred prefers the higher refresh rate and more sensible native resolution, but there's plenty of personal preference at play. Acer also has the upcoming XB270HU, a QHD 144Hz G-SYNC display with an IPS panel, but that won't start shipping for another month or two at least, and pricing is not yet known.

While the above are certainly alternatives to keep an eye on, for 4K gaming on NVIDIA GPUs it looks like the XB280HK will remain the primary option. The price of nearly $800 is pretty steep, but then if you're seriously considering 4K gaming in the first place you should have around $800 (or more) just in graphics cards already, and a good display can easily last half a decade or more. Even if FreeSync ends up winning in the market, existing G-SYNC displays should continue working fine as the drivers and hardware shouldn't need any tweaks. Buying such a display today is certainly the bleeding edge, and we'll likely see better alternatives in the coming year (e.g. IPS panels, DisplayPort 1.3/HDMI 2.0, etc.), but this is currently the only game in town.

Comments

  • MrSpadge - Thursday, January 29, 2015 - link

    Jarred, please test his claims and modded drivers! He surely comes across as dubious, but if he's correct that's a real bomb waiting to explode.
  • SkyBill40 - Thursday, January 29, 2015 - link

    There's a huge thing that he's doing that makes his claim patently false: he's running that game WINDOWED and G-Sync only works full screen. Period. So, in essence, while he does make an interesting point... he's full of shit.
  • JarredWalton - Thursday, January 29, 2015 - link

    The current (updated) video is running the pendulum fullscreen, but again... his claims are dubious at best. "Look, it has an Altera FPGA. The only thing that's good for is security!" Ummm... does he even know what FPGA means? Field Programmable Gate Array, as in, you can program it to do pretty much anything you want within the confines of the number of gates available. Also, the suggestion that G-SYNC (which was released before AMD ever even talked about FreeSync) is the same as FreeSync is ludicrous.

    FWIW, I've seen laptop displays that can run at 50Hz before, so with this demo apparently running at a static 50 FPS, it's not really that crazy for "modded drivers" to work. Sure, the drivers apparently let you turn G-SYNC on or off, but he could mod the drivers to actually turn triple buffering on/off and I doubt most of us could tell the difference via an Internet video.

    He needs to show it running a game with a variable FPS (with a FRAPS counter), and he needs to zoom out enough that we can see the full laptop and not just a portion of the screen. Take a high speed video of that -- with the camera mounted on a tripod and not in his hands -- and someone could actually try stepping through the frames to see how long each frame is on screen. It would be a pain in the butt for certain, but it would at least make his claims plausible.

    My take is that if G-SYNC is basically hacked, it would have come to light a long time ago. Oh, wait -- the random guy on the Internet with his modded drivers (anyone have time to do a "diff" and see what has changed?) is smarter than all of the engineers at AMD, the display companies, etc.
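As an editorial aside, the frame-stepping methodology Jarred describes above is straightforward to script rather than doing it by hand. Below is a minimal sketch using OpenCV; the capture filename, capture rate, and difference threshold are all assumptions, and a real measurement would need a locked camera and a known high-speed frame rate:

```python
import cv2
import numpy as np

# Count how many high-speed-capture frames each *displayed* frame persists.
# Near-identical consecutive capture frames mean the panel was still showing
# the same rendered frame.

cap = cv2.VideoCapture("capture.mp4")  # hypothetical 240fps tripod recording
persistence = []  # on-screen duration of each displayed frame, in capture frames
prev = None
run = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if prev is not None:
        diff = float(np.mean(cv2.absdiff(gray, prev)))
        if diff < 2.0:       # threshold is a guess; tune per capture
            run += 1         # same displayed frame, still on screen
        else:
            persistence.append(run + 1)  # new frame appeared; log the old one
            run = 0
    prev = gray

if prev is not None:
    persistence.append(run + 1)  # flush the final displayed frame
cap.release()
print("On-screen durations (capture frames):", persistence)
```

With a fixed-refresh panel the durations cluster at multiples of the refresh interval; with working adaptive refresh they track the game's render times instead.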
  • SkyBill40 - Thursday, January 29, 2015 - link

    I agree with you and appreciate your more in depth commentary on it. I still, like you, find his claim(s) to be quite dubious and likely to be pure crap.
  • JarredWalton - Saturday, January 31, 2015 - link

    Turns out this is NOT someone doing a hack to enable G-SYNC; it's an alpha leak of NVIDIA's drivers where they're trying to make G-SYNC work with laptops. PCPer did a more in-depth look at the drivers here:
    http://www.pcper.com/reviews/Graphics-Cards/Mobile...

    So, not too surprisingly, it might be possible to get most of the G-SYNC functionality with drivers alone, but it still requires more work. It also requires a laptop without Optimus (for now), along with a better than average display panel.
  • Will Robinson - Thursday, January 29, 2015 - link

    Thanx for reposting that link Pork. I posted it yesterday, but it seems some people want to accuse him of being a conspiracy theorist or somehow not of sound mind rather than evaluate his conclusions with an open mind.
    I wondered if we would get an official response from AT.
  • nos024 - Thursday, January 29, 2015 - link

    Technically, if AMD is the only one supporting "FreeSync" you'll still be so-called "vendor-locked"? No?

    As a PC gamer you only have two choices for high performance gaming video cards, so I don't understand this so-called vendor-lock debate with G-Sync and FreeSync. Whether G-Sync comes in the form of a chip or FreeSync comes with the new version of DisplayPort, it's the same deal.
  • SkyBill40 - Thursday, January 29, 2015 - link

    No, it's not the same thing. G-Sync is wholly proprietary, and the effects of it will *not* work without a G-Sync capable video card; on the contrary, FreeSync is just that: free to whatever card you have, no matter the vendor. It's open source, and therefore there are no proprietary chips in the design. It just works. Period.
  • nos024 - Thursday, January 29, 2015 - link

    What do you mean it just works? If Nvidia decides not to support it, AMD becomes the only one to support it, which means vendor-lock anyways.

    So you are saying if I decide to use Intel's IGP (given it comes with the correct DisplayPort version), I need no additional support from Intel (drivers) and FreeSync will just work? I don't think it's THAT easy. Bottom line is, you will be locked to AMD graphics cards IF AMD is the only one supporting it. It doesn't matter how it is implemented in the hardware - it's all about support.

    The only thing it has going for it is that there's no royalty paid to AMD to adopt the technology from a monitor manufacturing point of view.
  • Black Obsidian - Thursday, January 29, 2015 - link

    And no additional (monitor) hardware required.
    And it's part of the DisplayPort spec.
    And any GPU manufacturer that wants to support it is free to do so.

    The only thing that Freesync has going *against* it is that nVidia might decide to be a bag of dicks and refuse to support an open standard in favor of their added-cost (read: added-profit) proprietary solution.
