Acer XB280HK Conclusion - Performance

Looking only at the objective numbers, the Acer XB280HK has a few issues, the most notable being the so-so uniformity of the backlight. Blue is over-saturated, though blue is also the color where errors are least visible to the eye. The worst of the main specs is the contrast ratio, which doesn't get above 780:1 and robs the display of much of the pop that other panels offer today.

From a subjective, ergonomic perspective, the Acer does many things well. The menu system is very good and easy to navigate. There are four USB 3.0 ports, with two on the side providing quick access for flash drives and other peripherals. The stand offers a wide range of adjustability and makes it easy to align the display to your field of vision.

4K in general is still a bit of a clunky solution at times. We are still using DisplayPort 1.2, which previously meant relying on MST (tiled multi-stream transport) for 60Hz refresh rates, though newer controllers have moved past that. DisplayPort 1.3 and HDMI 2.0 are both going to be available this year and will remove these limitations entirely, but that also means you'll need a new GPU. For people looking to upgrade that will be fine, but if you just invested in enough graphics power to drive a 4K display, you might not want to upgrade again just yet. We're also still waiting for software to catch up on DPI scaling, as running at anything other than 100% scaling still introduces issues.

What About G-SYNC?

The question most people will have is whether or not G-SYNC is worth the price of entry. All things being equal, having G-SYNC available is definitely nice, and it would be great to see all future displays incorporate such technology, but having G-SYNC inherently tied to NVIDIA GPUs makes it a less than perfect solution. AMD has their "me too" FreeSync technology coming out, and they've worked with the VESA standards group to make Adaptive-Sync part of the DisplayPort standard. They've used that to enable FreeSync...and then given up all royalties and licensing fees for the technology. What that actually means for retail pricing, however, is still open for debate, though we should have the answer by March when the first FreeSync displays begin shipping.

Realistically, keeping the price as low as possible and using open standards is a good way to win long-term support for a technology, but G-SYNC was first and NVIDIA deserves plenty of credit. The technology is certainly a boon to gamers, especially when you're running GPUs that can't push 60+ FPS. Options like 144Hz G-SYNC displays simply make the technology even more desirable, as we've been stuck at 60Hz with LCDs for far too long (3D displays being a tangent of sorts). In my experience playing a large collection of games, I never felt that G-SYNC resulted in an inferior experience compared to the alternatives, and with appropriate settings it is generally superior.

As far as the Acer 4K display goes, G-SYNC is also quite helpful, as plenty of NVIDIA GPUs (even in SLI) struggle to run many games at 4K. Where G-SYNC doesn't add much benefit is in games where you're already pushing 60+ FPS, which is certainly true plenty of the time with older titles. Ultimately, how much you want G-SYNC will depend on what sort of hardware you have and how much you're bothered by things like stutter, tearing, and lag.

If those are items you rarely think about, you can hold off, wait for the technology to continue to improve, and see whether a victor emerges in the G-SYNC vs. FreeSync "war". Ideally, the two competing solutions would merge, which would be a real victory for consumers, but for the next few years we suspect NVIDIA will continue to support G-SYNC while AMD remains the only GPU company supporting FreeSync. For those who are bothered by stutter, tearing, and lag, the recommendation is easier: if you run an NVIDIA GPU, G-SYNC works, and it's a real value add for your next display upgrade.

On a related subject, so far all of the G-SYNC displays have been desktop displays, but it would be nice to see laptops with internal G-SYNC (or FreeSync) panels. For one, laptops tend to have far more limited graphics hardware, so getting most games above 60 FPS on a mainstream laptop can be difficult if not impossible. There are obstacles here as well (switchable graphics, for example), and no one wants to add $100 or more to the cost of a laptop unless the added functionality is highly marketable and in demand. Regardless of the technical hurdles, at some point we'd like to see adaptive refresh rates on more than just desktops; for now, G-SYNC remains a desktop display exclusive (though there are laptops that support G-SYNC on external displays).

Final Thoughts

As a display on its own, the Acer XB280HK doesn't offer the best performance, but it's still acceptable in all the important areas. As you can guess, however, the only real reason to buy this display is if you want G-SYNC, and more importantly you want it at 4K. This is the only current solution for that niche, and it's very much a niche market. Driving 4K gaming requires a lot of graphics hardware, so at the very least you should have a couple of GTX 970 cards to make good use of the display.

If you do have the hardware, the result is a great gaming experience for the most part, but you really have to be sold on 4K gaming. The XB280HK can also run with G-SYNC at lower resolutions, but you're still fundamentally limited to 60Hz and lower refresh rates. The main alternative right now is the ASUS ROG Swift PG278Q with its QHD 144Hz panel; Jarred prefers the higher refresh rate and more sensible native resolution, but there's plenty of personal preference at play. Acer also has the upcoming XB270HU, an IPS QHD 144Hz G-SYNC display, but that won't start shipping for another month or two at least and pricing is not yet known.

While the above are certainly alternatives to keep an eye on, for 4K gaming on NVIDIA GPUs the XB280HK looks set to remain the primary option. The price of nearly $800 is pretty steep, but if you're seriously considering 4K gaming in the first place, you should already have around $800 (or more) in graphics cards alone, and a good display can easily last half a decade or more. Even if FreeSync ends up winning in the market, existing G-SYNC displays should continue working fine, as the drivers and hardware shouldn't need any tweaks. Buying such a display today certainly puts you on the bleeding edge, and we'll likely see better alternatives in the coming year (IPS panels, DisplayPort 1.3/HDMI 2.0, etc.), but for now this is the only game in town.

69 Comments

  • JarredWalton - Thursday, January 29, 2015

    This remains to be seen. Adaptive-Sync is part of DisplayPort now, but I don't believe it's required -- it's optional. Which means that FreeSync almost certainly requires something in addition to just supporting DisplayPort. What FreeSync has going against it is that it is basically a copy of something NVIDIA created, released as an open standard, but the only graphics company currently interested in supporting it is AMD. If NVIDIA hadn't created G-SYNC, would we even have something coming in March called FreeSync?

    My bet is FreeSync ends up requiring:
    1) Appropriate driver level support.
    2) Some minimum level of hardware support on the GPU (i.e. I bet it won't work on anything prior to GCN cards)
    3) Most likely a more complex scaler in the display to make adaptive sync work.
    4) A better panel to handle the needs of adaptive sync.

    We'll see what happens when FreeSync actually ships. If Intel supports it, that's a huge win. That's also very much an "IF" not "WHEN". Remember how long it took Intel to get the 23.97Hz video stuff working?
  • Black Obsidian - Thursday, January 29, 2015

    I agree with you on 1 and 2, and *possibly* 3, but I wouldn't bet on that one myself, nor on #4. The monitor reviewed here--with a 60Hz maximum and pixel decay under 40Hz--would seem to suggest that a better panel isn't at all necessary.

    I also completely agree that, absent G-SYNC, Freesync very likely wouldn't exist. But that's often the way of things: someone comes up with a novel feature, the market sees value in it, and a variant of it becomes standardized.

    G-SYNC is brilliant, but G-SYNC is also clumsy, because of the compromises that are necessary when you can't exert control over all of the systems you depend on. Now that a proper standard exists, those compromises are no longer necessary, and the appropriate thing to do is to stop making them and transfer those resources elsewhere. This, of course, assumes that Freesync doesn't come with greater compromises of its own, but there's presently no reason to expect that it does.

    As for Intel, the 23.97Hz issue persisted as long as it did because you could round down the number of people who really cared to "nobody." It's possible that the number of people who care about Freesync in an IGP rounds similarly too, of course.
  • andrewaggb - Thursday, January 29, 2015

    Freesync in an IGP for laptops and tablets would be a big deal I think.
  • nos024 - Friday, January 30, 2015

    That's exactly what I am saying. Basically, we have only two GPU choices for PC gaming: nVidia or AMD. I'd understand the vendor-lock argument if there were a third and fourth player, but if nVidia doesn't support FreeSync, you are basically locked into AMD GPUs for FreeSync gaming.

    I'm sure nVidia can reduce the royalty fee or eliminate it completely, but you know what? There's nothing competing against it right now.

    nVidia seems to get away with lots of things. For example, for a motherboard to support SLI, it needs to be licensed, and it only comes on enthusiast chipsets (Z77/Z87/Z97). CrossFire comes free with all Intel chipsets, yet SLI is still pretty popular...just saying.
  • anubis44 - Tuesday, February 3, 2015

    I say let nVidia be a bag of dicks and refuse to support the open standard. Then we'll see their true colours and know to boycott them, the greedy bastards.
  • SkyBill40 - Thursday, January 29, 2015

    I think BO and Jarred have it pretty much covered.
  • anubis44 - Tuesday, February 3, 2015

    Somebody'll hack the nVidia drivers to make nVidia cards work with FreeSync, kind of like the customized Omega drivers for ATI/AMD graphics cards a few years ago. You can count on it. nVidia wants to charge you for something that can easily be done without paying them any licensing fees. I think we should just say no to that.
  • MrSpadge - Thursday, January 29, 2015

    If Intel were smart they'd simply add FreeSync support to their driver. nVidia gamers could then use Virtu to route their cards' output through the Intel IGP with FreeSync, non-gamers would finally get stutter-free video, and G-SYNC would be dead.

    Whether or not Intel ever takes this route, they could, and hence FreeSync is not "vendor-locked".
  • phoenix_rizzen - Thursday, January 29, 2015

    "The high resolution also means working in normal applications at 100% scaling can be a bit of an eyestrain (please, no comments from the young bucks with eagle eyes; it’s a real concern and I speak from personal experience), and running at 125% or 150% scaling doesn’t always work properly. Before anyone starts to talk about how DPI scaling has improved, let me quickly point out that during the holiday season, at least three major games I know of shipped in a state where they would break if your Windows DPI was set to something other than 100%. Oops. I keep hoping things will improve, but the software support for HiDPI still lags behind where it ought to be."

    This is something I just don't understand. How can it be so hard?

    An inch is an inch is an inch, it never changes. The number of pixels per inch does change, though, as the resolution changes. Why is it so hard for the graphics driver to adapt?

    A 12pt character should be the exact same size on every monitor, regardless of the DPI, regardless of the resolution, regardless of the screen size.

    We've perfected this in the print world. Why hasn't it carried over to the video world? Why isn't this built into every OS by default?

    Just seems bizarre that we can print at 150x150, 300x300, 600x600, 1200x1200, and various other resolutions in between without the characters changing size (12pt is 12pt at every resolution) and yet this doesn't work on computer screens.
  • DanNeely - Thursday, January 29, 2015

    It's not the drivers; it's the applications. The basic Win32 APIs (like all mainstream UI APIs from that era) are raster-based and use pixels as the standard unit for item sizes and spacing. This was done because, on the slower hardware of the era, the overhead of trying to do everything in inches or centimeters was an unacceptable performance hit, and the range of DPIs the APIs needed to work with wasn't wide enough for it to be a problem.

    You can make applications built on them work with DPI scaling, but it would be a lot of work. At a minimum, everywhere you're doing layout/size calculations you'd need to multiply the computed sizes and positions by the scaling factor (a brief sketch of what that looks like follows the comment thread). I suspect that to avoid bits of occasional low-level jerkiness when resizing you'd also need a bunch of twiddles to manage the remainders you get when scaling doesn't produce integral sizes (e.g. 13 * 1.25 = 16.25). If you have any custom controls that you draw yourself, you'd need to redo their paint methods as well. It didn't help that prior to Windows 8 you had to log out and back in to change the DPI scaling level, which made debugging very painful for anyone who tried to make it work.

    Newer interface libraries are pixel-independent and do all the messy work for you, but swapping one in is a major rewrite. For Windows, the first one from MS was Windows Presentation Foundation (WPF), which launched in 2006 and was .NET only. You can mix C/C++ and .NET in a single application, but it's going to be messy and annoying to do at best. Windows 8 was the first version to offer a descendant of WPF to C++ applications directly, but between the lack of compatibility with Win7 systems (meaning the need to maintain two different UIs) and the general dislike of the non-windowed nature of Metro applications, it hasn't gained much traction in the market.

    Disclosure: I'm a software developer whose duties include maintaining several internal or single-customer line-of-business applications written in .NET using the non-DPI-aware Windows Forms UI library. Barring internal systems being upgraded to Win 8 or higher (presumably Win10) and high-DPI displays, or a request from one of our customers to make it happen (along with enough money to pay for it), I don't see any of what I maintain getting the level of rewrite needed to retrofit DPI awareness.
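
To make the scope of that retrofit a bit more concrete, below is a minimal Win32/C++ sketch along the lines DanNeely describes. It is illustrative only: the helper names (InitDpi, PointsToPixels, ScalePx, LayoutOkButton) are hypothetical, it assumes the standard 96 DPI baseline for 100% scaling, and a real application would also need to declare itself DPI-aware in its manifest and touch far more than one button.

```cpp
#include <windows.h>

static int g_dpiX = 96;  // effective horizontal DPI, queried once at startup

void InitDpi()
{
    // 96 at 100% scaling, 120 at 125%, 144 at 150%. Note that a process which
    // has not declared itself DPI-aware may simply be told 96 here, with
    // Windows bitmap-scaling its output instead.
    HDC screen = GetDC(NULL);
    g_dpiX = GetDeviceCaps(screen, LOGPIXELSX);
    ReleaseDC(NULL, screen);
}

// Points are a physical unit (1 pt = 1/72 inch), which is why the same "12pt"
// text needs a different number of pixels on every panel density:
// 16 px at 96 DPI, ~26 px on a 157 PPI 28" 4K panel like the XB280HK.
int PointsToPixels(int points)
{
    return MulDiv(points, g_dpiX, 72);
}

// Scale a position or size that was hard-coded for 96 DPI to the current DPI.
// MulDiv rounds once per value, which sidesteps the fractional remainders
// mentioned above (13 px at 125% is 16.25; this returns 16).
int ScalePx(int designPx)
{
    return MulDiv(designPx, g_dpiX, 96);
}

void LayoutOkButton(HWND button)
{
    // In a non-DPI-aware app, every call site like this has to be touched.
    MoveWindow(button, ScalePx(10), ScalePx(10), ScalePx(100), ScalePx(24), TRUE);
}
```

Multiplying every hard-coded coordinate this way, and rounding once per value instead of carrying fractions through the layout code, is essentially the busywork that newer, resolution-independent libraries such as WPF handle automatically.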
