Acer XB280HK Conclusion - Performance

Looking only at the objective numbers for the Acer XB280HK, there are a few issues, the most notable being the so-so uniformity of the backlight. The over-saturation of blue is a concern, though errors in blue are the least visible to the eye. The worst main spec is the contrast ratio, which doesn't get above 780:1 and robs the display of much of the pop that other panels offer today.

From a subjective, ergonomic perspective, the Acer does many things well. The menu system is very good and easy to navigate. There are four USB 3.0 ports, with two on the side that provide quick access for flash drives and other peripherals. The stand offers a wide range of adjustability and makes it easy to align the display with your field of vision.

4K is still a bit of a clunky solution at times. We're still using DisplayPort 1.2, which previously meant relying on MST for 60Hz refresh rates, though newer controllers have moved past this. DisplayPort 1.3 and HDMI 2.0 are both coming this year and will move completely past these issues, but that also means you'll need a new GPU. For people looking to upgrade that will be fine, but if you just invested in enough graphics power to support a 4K display, you might not want to upgrade again just yet. We're also still waiting for software to catch up on DPI scaling support, as running at anything other than 100% scaling still introduces issues.

What About G-SYNC?

The question most will have is whether or not G-SYNC is worth the price of entry. All things being equal, having G-SYNC available is definitely nice, and it would be great to see all future displays incorporate such technology, but having G-SYNC inherently linked to NVIDIA GPUs makes it a less than perfect solution. AMD has their "me too" FreeSync technology coming out, having worked with the VESA standards group to make Adaptive-Sync a part of future DisplayPort protocols. They've used that to enable FreeSync... and then gave up all royalties and licensing for the technology. What that actually means for retail pricing, however, is still open for debate, though we should have the answer by March when the first FreeSync displays begin shipping.

Realistically, keeping the price as low as possible and using open standards is a good way to win long-term support for a technology, but G-SYNC was first and NVIDIA deserves plenty of credit. The technology is certainly a boon to gamers, especially when you're running GPUs that can't sustain 60+ FPS. The arrival of 144Hz G-SYNC displays makes the technology even more desirable, as we've been stuck at 60Hz with LCDs for far too long (3D displays being a tangent of sorts). In my experience playing a large collection of games, never did I feel like G-SYNC resulted in an inferior experience compared to the alternatives, and with appropriate settings it is generally superior.

As far as the Acer 4K display goes, G-SYNC is also quite helpful as a lot of NVIDIA GPUs (even in SLI) struggle with running many games at 4K. Where G-SYNC doesn't add much benefit is for games where you're already pushing 60+ FPS, and there are certainly plenty of times where that's true on older titles. Ultimately, how much you want G-SYNC is going to depend on what sort of hardware you have and how much you're bothered by things like stutter, tearing, and lag.

If those are items you rarely think about, you can hold off and wait for the technology to continue to improve, and in the meantime see whether a victor emerges in the G-SYNC vs. FreeSync "war". Ideally, we'd see the two competing solutions merge, as that would be a real victory for consumers, but for the next few years we suspect NVIDIA will continue to back G-SYNC while AMD remains the only GPU company supporting FreeSync. For those that are bothered by stutter, tearing, and lag, the recommendation is easier: if you run an NVIDIA GPU, G-SYNC works, and it's a real value add for your next display upgrade.

On a related subject, so far all of the G-SYNC displays have been on desktops, but it would really be nice to see laptops with internal G-SYNC (or FreeSync) panels. For one, laptops tend to have far more limited graphics hardware, so getting most games above 60 FPS on a mainstream laptop can be difficult if not impossible. There are again obstacles to doing this, for example switchable graphics, plus no one wants to add $100 or more to the cost of a laptop if they don’t view the added functionality as something highly marketable and in demand. Regardless of the technical hurdles, at some point we’d like to see adaptive refresh rates on more than just desktops; for now, G-SYNC remains a desktop display exclusive (though there are laptops with support for G-SYNC on external displays).

Final Thoughts

While as a display on its own the Acer XB280HK doesn't offer the best performance, it's still acceptable in all the important areas. As you can guess, however, the only real reason to buy this display is if you want G-SYNC, and more importantly you want it at 4K. This is the only current solution for that niche, and it's very much a niche market. Driving 4K gaming requires a lot of graphics hardware, so at the very least you should have a couple of GTX 970 cards to make good use of the display.

If you do have the hardware, the result is for the most part a great gaming experience, but you really have to be sold on 4K gaming. The XB280HK can also run with G-SYNC at lower resolutions, but you're still fundamentally limited to 60Hz and lower refresh rates. The main alternative right now is the ASUS ROG Swift PG278Q with its QHD 144Hz panel; Jarred prefers the higher refresh rate and more sensible native resolution, but there's plenty of personal preference at play. Acer also has the upcoming XB270HU, an IPS QHD 144Hz G-SYNC display, but that won't start shipping for another month or two at least, and pricing is not yet known.

While the above are certainly alternatives to keep an eye on, for 4K gaming on NVIDIA GPUs it looks like the XB280HK will remain the primary option. The price of nearly $800 is pretty steep, but then if you're seriously considering 4K gaming in the first place you should have around $800 (or more) just in graphics cards already, and a good display can easily last half a decade or more. Even if FreeSync ends up winning in the market, existing G-SYNC displays should continue working fine as the drivers and hardware shouldn't need any tweaks. Buying such a display today is certainly the bleeding edge, and we'll likely see better alternatives in the coming year (e.g. IPS panels, DisplayPort 1.3/HDMI 2.0, etc.), but this is currently the only game in town.

Comments

  • JarredWalton - Wednesday, January 28, 2015 - link

    I suppose the question on 4K gaming is this: would you rather have 4K medium or QHD high settings (possibly even QHD ultra)? There are certainly games where 4K high or ultra is possible with a more moderate GPU, but most of the big holiday releases come close to using 3GB RAM for textures at ultra settings, and dropping to high in many cases still isn't enough. I think people really after 4K gaming in the first place will want to do it at high or ultra settings, rather than to juggle quality against resolution, but to each his own.
  • DigitalFreak - Wednesday, January 28, 2015 - link

    I had the Dell P2715Q for a bit and swapped it for the U3415W. I really didn't like the trade-offs you have to make with 4k (performance, etc.), and didn't really notice that much of a difference in graphics quality.
  • Mustalainen - Thursday, January 29, 2015 - link

    I also looked at that monitor (the U3415W). It is beautiful, but it came down to the fact that it was priced at 990 euro. In hindsight I think I'm happier with 4K, as text is so sharp. I also like having 2 or more monitors, as I can run one application in full screen on one monitor while being able to see what's happening in the other applications on the other monitors. I don't know if I would want to put yet another monitor beside that 34"; maybe it works great, maybe not, I don't dare comment on that. The important thing is that everyone has the hardware that fits them best.

    I'm mostly happy that companies seem to be releasing a variety of monitors at reasonable price points. It felt like 20"-22" monitors were stuck at 1080p forever while mobile phone screens improved every month. Let's hope that improvements will continue in both markets.
  • Mustalainen - Thursday, January 29, 2015 - link

    Oh, I can't edit. It was supposed to say 20"-27" were stuck...
  • Mustalainen - Thursday, January 29, 2015 - link

    Jarred, you are probably correct. I just wanted to give an alternative opinion for those who are looking at 4K, leaning towards work (involving a lot of text), and happy with not maxing out the graphics. I feel happy with 4K. It feels like "something new": you have a lot of area to work with, and scaling is almost a non-issue in Win8.1 (with most of the applications I use).
  • Taristin - Wednesday, January 28, 2015 - link

    Acers always have that blue tint problem. I have 3 Acer monitors on my desktop, and each of them leans too far into the blue spectrum, even after playing with calibration. It leads to some rapid eyestrain.
  • B3an - Thursday, January 29, 2015 - link

    TN panel? Nope.

    FreeSync or fuck off.
  • Pork@III - Thursday, January 29, 2015 - link

    Yes! Indeed!

    If it says TN, the only use I have for it is fencing in my pigsty.
  • Pork@III - Thursday, January 29, 2015 - link

    WoW I read this article now: http://gamenab.net/2015/01/26/truth-about-the-g-sy...
    Cheers to those who paid a lot of money for a display with a G-SYNC module!
  • JarredWalton - Thursday, January 29, 2015 - link

    Interesting, though a bit too laced with conspiracy theory stuff to convince me he's not off his rocker. I'd like to see a game with clear videos of VSYNC Off, On, and G-SYNC modes on that laptop. Part of the issue, of course, is that with a cell phone video of a display it's going to be difficult to tell whether the refresh rate is really 50Hz, 60Hz, or, more importantly, variable. The pendulum demo is a bit too staged for "proof".
