Acer XB280HK Conclusion - Performance

Looking only at the objective numbers, the Acer XB280HK has a few issues, the most notable being the so-so uniformity of the backlight. The over-saturation of blue is a concern as well, though blue is the color where errors are least visible to the eye. The worst of the main specs is the contrast ratio, which doesn’t get above 780:1 and robs the display of much of the pop that other panels have today.

From a subjective, ergonomic perspective, the Acer does many things well. The menu system is very good and easy to navigate. There are four USB 3.0 ports, with two on the side providing quick access for flash drives and other peripherals. The stand offers a wide range of adjustability and makes it easy to align the display with your field of vision.

4K remains a bit of a clunky solution at times. We are still using DisplayPort 1.2, which previously meant relying on MST for 60Hz refresh rates, though newer display controllers can now drive 4K at 60Hz over a single stream. DisplayPort 1.3 and HDMI 2.0 are both going to be available this year, completely moving past these issues, but that also means you’ll need a new GPU. For people looking to upgrade that will be fine, but if you just invested in enough hardware to support a 4K display, you might not want to upgrade just yet. We’re also still waiting for software to catch up on DPI scaling, as running at anything other than 100% scaling still introduces issues.
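As a rough sanity check on why a single DisplayPort 1.2 stream can handle 4K60 at all, here's a back-of-the-envelope calculation (my own numbers, not figures from the review; the ~7% blanking overhead is an approximation of reduced-blanking timings):

```python
# Does a single DisplayPort 1.2 stream have the bandwidth for 4K at 60Hz?

def dp12_payload_gbps(lanes=4, lane_rate_gbps=5.4):
    # HBR2 runs at 5.4 Gbit/s per lane; 8b/10b encoding leaves 80% for payload
    return lanes * lane_rate_gbps * 0.8

def video_rate_gbps(h, v, hz, bpc=8, channels=3, blanking=1.07):
    # Pixel data rate with an approximate ~7% reduced-blanking overhead
    return h * v * hz * bpc * channels * blanking / 1e9

dp = dp12_payload_gbps()                  # ~17.28 Gbit/s usable
uhd60 = video_rate_gbps(3840, 2160, 60)   # ~12.78 Gbit/s needed
print(f"DP 1.2 payload: {dp:.2f} Gbit/s, 4K60 needs: {uhd60:.2f} Gbit/s")
print("fits in one stream:", uhd60 < dp)
```

The link itself always had the headroom; it was the early sink-side controllers that had to split the image into two MST tiles.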

What About G-SYNC?

The question most will have is whether or not G-SYNC is worth the price of entry. All things being equal, having G-SYNC available is definitely nice and it would be great to see all future displays incorporate such technology, but having G-SYNC inherently tied to NVIDIA GPUs makes it a less than perfect solution. AMD has its “me too” FreeSync technology coming out, and has worked with the VESA standards group to make Adaptive-Sync a part of future DisplayPort protocols. They’ve used that to enable FreeSync... and then gave up all royalties and licensing for the technology. What that actually means for retail pricing, however, is still open for debate, though we should have the answer by March when the first FreeSync displays begin shipping.

Realistically, keeping the price as low as possible and using open standards is a good way to win long-term support for a technology, but G-SYNC was first and NVIDIA deserves plenty of credit. The technology is certainly a boon to gamers, especially when you're running GPUs that can't push 60+ FPS. Other 144Hz G-SYNC displays simply make the technology even more desirable, as we've been stuck at 60Hz with LCDs for far too long (3D displays being a tangent of sorts). In my experience playing a large collection of games, never did I feel like G-SYNC resulted in an inferior experience compared to the alternatives, and with appropriate settings it is generally superior.
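To make the stutter argument concrete, here's a small illustrative sketch (my own example, not code from the article) of frame pacing at roughly 45 FPS on a fixed 60Hz panel with VSync versus an adaptive-refresh panel:

```python
# Why a ~45 FPS game judders on a fixed 60Hz display but not with
# adaptive refresh: compare how long each frame stays on screen.

def vsync_display_times_ms(render_ms, refresh_ms=1000 / 60, frames=6):
    """Each frame waits for the next fixed refresh tick after it renders."""
    times, ready, tick = [], 0.0, 0.0
    for _ in range(frames):
        ready += render_ms
        while tick < ready:          # advance to next refresh boundary
            tick += refresh_ms
        times.append(tick)
    # On-screen duration of a frame = gap between presentation ticks
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

def adaptive_display_times_ms(render_ms, frames=6):
    """The panel refreshes the moment a frame is ready: uniform pacing."""
    return [round(render_ms, 1)] * (frames - 1)

render_ms = 22.0                          # ~45 FPS
print(vsync_display_times_ms(render_ms))     # [16.7, 16.7, 33.3, 16.7, 16.7]
print(adaptive_display_times_ms(render_ms))  # [22.0, 22.0, 22.0, 22.0, 22.0]
```

The periodic 33.3ms hitch in the VSync case is exactly the judder gamers perceive; with adaptive refresh every frame is held for the same interval.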

As far as the Acer 4K display goes, G-SYNC is also quite helpful as a lot of NVIDIA GPUs (even in SLI) struggle with running many games at 4K. Where G-SYNC doesn't add much benefit is for games where you're already pushing 60+ FPS, and there are certainly plenty of times where that's true on older titles. Ultimately, how much you want G-SYNC is going to depend on what sort of hardware you have and how much you're bothered by things like stutter, tearing, and lag.

If those are items you rarely think about, you can hold off and wait for the technology to continue to improve, at the same time waiting to see if a victor emerges in the G-SYNC vs. FreeSync "war". Ideally, we’d see the two competing solutions merge, as that would be a real victory for consumers, but for the next few years we suspect NVIDIA will continue to support G-SYNC and the only company supporting FreeSync with their GPUs will be AMD. For those who are bothered by stutter, tearing, and lag, the recommendation is a bit easier: if you run an NVIDIA GPU, G-SYNC works and it's a real value add for your next display upgrade.

On a related subject, so far all of the G-SYNC displays have been desktop models, but it would be really nice to see laptops with internal G-SYNC (or FreeSync) panels. For one, laptops tend to have far more limited graphics hardware, so getting most games above 60 FPS on a mainstream laptop can be difficult if not impossible. There are again obstacles to doing this, switchable graphics for example, plus no one wants to add $100 or more to the cost of a laptop unless they view the added functionality as highly marketable and in demand. Regardless of the technical hurdles, at some point we’d like to see adaptive refresh rates on more than just desktops; for now, G-SYNC remains a desktop display exclusive (though there are laptops with support for G-SYNC on external displays).

Final Thoughts

While as a display on its own the Acer XB280HK doesn't offer the best performance, it's still acceptable in all the important areas. As you can guess, however, the only real reason to buy this display is if you want G-SYNC, and more importantly you want it at 4K. This is the only current solution for that niche, and it's very much a niche market. Driving 4K gaming requires a lot of graphics hardware, so at the very least you should have a couple of GTX 970 cards to make good use of the display.

If you do have the hardware, the result is a great gaming experience for the most part, but you really have to be sold on 4K gaming. The XB280HK can also run with G-SYNC at lower resolutions, but you're still fundamentally limited to 60Hz and lower refresh rates. The main alternative right now is the ASUS ROG Swift PG278Q with its QHD 144Hz panel; Jarred prefers the higher refresh rate and more sensible native resolution, but there's plenty of personal preference at play. Acer also has the upcoming XB270HU, an IPS QHD 144Hz G-SYNC display, but that won't start shipping for another month or two at least and pricing is not yet known.

While the above are certainly alternatives to keep an eye on, for 4K gaming on NVIDIA GPUs it looks like the XB280HK will remain the primary option. The price of nearly $800 is pretty steep, but then if you're seriously considering 4K gaming in the first place you should have around $800 (or more) just in graphics cards already, and a good display can easily last half a decade or more. Even if FreeSync ends up winning in the market, existing G-SYNC displays should continue working fine as the drivers and hardware shouldn't need any tweaks. Buying such a display today is certainly the bleeding edge, and we'll likely see better alternatives in the coming year (e.g. IPS panels, DisplayPort 1.3/HDMI 2.0, etc.), but this is currently the only game in town.

Comments

  • inighthawki - Friday, January 30, 2015 - link

    In what way is it incorrect?
  • perpetualdark - Wednesday, February 4, 2015 - link

    Hertz refers to cycles per second, and with G-Sync the display matches its refresh cycles per second to the frames per second the graphics card is able to send to the display, so in actuality, Hertz is indeed the correct term and it is being used correctly. At 45 FPS, the monitor is also at a 45Hz refresh rate.
  • edzieba - Wednesday, January 28, 2015 - link

    "We are still using DisplayPort 1.2 which means utilizing MST for 60Hz refresh rates." Huh-what? DP1.2 has the bandwidth to carry 4k60 with a single stream. Previous display controllers could not do so unless paired, but that was a problem at the sink end. There are several 4k60 SST monitors available now (e.g. P2415Q).
  • TallestJon96 - Wednesday, January 28, 2015 - link

    Sync is a great way to make 4k more stable and usable. However, this is proprietary, costs more, and 4k scaling is just ok. Anyone interested in this is better off waiting for a better, cheaper solution that isn't stuck with NVIDIA.
    As mentioned before, the SWIFT is simply a better option, better performance at 1440p, better UI scaling, higher maximum FPS. Only downside is lower Res, but 1440p certainly isn't bad.
    A very niche product with a premium, but all that being said I bet Crysis at 4k with G-Sync is amazing.
  • Tunnah - Wednesday, January 28, 2015 - link

    "Other 4K 28” IPS displays cost at least as much and lack G-SYNC, making them a much worse choice for gaming than the Acer. "

    But you leave out the fact that 4K 28" TN panels are a helluva lot cheaper. Gamers typically look for TN panels anyway because of refresh issues, so the comparison should be to other TN panels, not to IPS, and in that comparison G-SYNC is extremely expensive. It's a neat feature and all, but I would argue it's much better to spend the extra on competent graphics cards that could sustain 60fps rather than a monitor that handles the framerate drop better.
  • Tunnah - Wednesday, January 28, 2015 - link

    Response time issues even
  • Midwayman - Wednesday, January 28, 2015 - link

    If it ran 1080 @ 144hz as well as 4k@ 60hz this would be a winning combo. Getting stuck with 60hz really sucks for FPS games. I wouldn't mind playing my RPGs at 40-60fps with gsync though.
  • DanNeely - Wednesday, January 28, 2015 - link

    "Like most G-SYNC displays, the Acer has but a single DisplayPort input. G-SYNC only works with DisplayPort, and if you didn’t care about G-SYNC you would have bought a different monitor."

    Running a second or third cable and hitting the switch input button on your monitor if you occasionally need to put a real screen on a second box is a lot easier than swapping the cable behind the monitor and a lot cheaper than a non-VGA KVM (and the only 4k capable options on the market are crazy expensive).

    The real reason is probably that nVidia was trying to limit the price premium from getting any higher than it already is, and avoiding a second input helped simplify the chip design. (In addition to the time element for a bigger design, big FPGAs aren't cheap.)
  • JarredWalton - Wednesday, January 28, 2015 - link

    Well, you're not going to do 60Hz at 4K with dual-link DVI, and HDMI 2.0 wasn't available when this was being developed. A second input might have been nice, but that's just an added expense and not likely to be used a lot IMO. You're right on keeping the cost down, though -- $800 is already a lot to ask, and if you had to charge $900 to get additional inputs I don't think most people would bite.
  • Mustalainen - Wednesday, January 28, 2015 - link

    I was waiting for the DELL P2715Q but decided to get this monitor instead (about 2 weeks ago). Before I got this I borrowed an ASUS ROG SWIFT PG278Q that I used for a couple of weeks. The SWIFT was probably the best monitor that I had used until that point in time. But to be completely honest, I like the XB280HK better. The colors, viewing angles (and so on) are pretty much the same (in my opinion) as I did my "noob" comparison. My monitor has some minor backlight bleed at the bottom, barely noticeable, while the SWIFT seems "flawless". The SWIFT felt as if it was built better and has better materials. Still, the 4k was a deal breaker for me. The picture just looks so much better compared to 1440p. The difference between 1440p and 4k? Well, after using the XB280HK I started to think that my old 24" 1200p was broken. It just looked as if it had these huge pixels. This never happened with the SWIFT. And the hertz? Well, I'm not a gamer. I play some RPGs now and then but most of the time my screen is filled with text and code. The 60hz seems to be sufficient in these cases. I got the XB280HK for 599 euro and compared to other monitors in that price range it felt like a good option. I'm very happy with it and dare to recommend this to anyone thinking about getting a 4k monitor. If IPS is your thing, wait for the DELL. This is probably the only regret I have (not having the patience to wait for the DELL).

    I would also like to point out that the hype of running a 4k monitor seems to be exaggerated. I manage to run my games at medium settings with a single GTX 660. Considering I run 3 monitors with different resolutions and still have playable fps, this just shows that you don't need a 980 or a 295 to power one of these things (maybe if the settings are maxed out and you want max fps).
