A Brief Overview of G-SYNC

While the other performance characteristics of a monitor are usually the primary concern, with a G-SYNC display the raison d'être is gaming. As such, we’re going to start by going over the gaming aspects of the XB280HK. If you don’t care about gaming, there’s really not much point in paying the price premium for a G-SYNC enabled display; you could get very much the same experience and quality at a lower price. And while it should go without saying, let’s make this clear: you’ll want an NVIDIA GTX-level graphics card to take advantage of G-SYNC. With that out of the way, let’s talk briefly about what G-SYNC does and why it’s useful for gaming. We’ve covered a lot of this before, but for those less familiar with why technologies like G-SYNC are beneficial, we’ll cover the highlights.

While games render their content internally, we view the final output on a display. The problem with this approach is that the display normally updates the content being shown at fixed intervals, and in the era of LCDs that usually means a refresh rate of 60Hz (though there are LCDs that can run at higher rates). If a game renders faster than 60 FPS (Frames Per Second), there’s extra work being done that doesn’t provide much benefit, whereas rendering slower than 60 FPS creates problems. Since the display content is only updated 60 times per second, there are several options for handling the mismatch:

  1. Show the same content on the screen for two consecutive updates (VSYNC On).
  2. Show the latest frame as soon as possible, even if the change occurs during a screen update (VSYNC Off).
  3. Create additional rendering buffers so that the internal rendering rate isn’t limited by the refresh rate (Triple Buffering).

As you might have guessed, there are problems with each of these approaches. With VSYNC enabled, any frame that isn’t ready in time has to wait for the next refresh, which can drop your effective frame rate to half (or one-third, one-fourth, etc.) of your refresh rate. In the worst case, if you have a PC capable of rendering a game at 59 FPS, every frame will take just a bit too long to finish and will sit waiting for the next update, effectively giving you 30 FPS. Perhaps even worse, if a game runs between 55-65 FPS, you’ll get a bunch of frames at 60 FPS and a bunch more at 30 FPS, which can result in the game stuttering or feeling choppy.
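
To put numbers on that worst case, here’s a small standalone sketch (a simplified model of double-buffered VSYNC rather than an actual rendering loop, using the 59 FPS render time from the example above):

    // vsync_on.cpp -- simplified model of double-buffered VSYNC on a 60Hz display.
    // Assumes the renderer can't start the next frame until the previous one has
    // been swapped at a vblank (a simplification of real driver behavior).
    #include <cmath>
    #include <cstdio>

    int main() {
        const double refreshMs  = 1000.0 / 60.0;  // vblank every ~16.67ms
        const double renderMs   = 1000.0 / 59.0;  // the GPU needs ~16.95ms per frame
        const double durationMs = 1000.0;

        double t = 0.0;
        int framesShown = 0;
        while (t < durationMs) {
            t += renderMs;                             // render the frame
            t = std::ceil(t / refreshMs) * refreshMs;  // wait for the next vblank to swap
            ++framesShown;
        }
        // Every frame just misses a vblank and is held for two refreshes instead of one.
        std::printf("Frames shown in one second: %d\n", framesShown);  // ~30
        return 0;
    }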

Turning VSYNC off only partially addresses the problem, as now the display will get parts of two or more frames each update, leading to image tearing. Triple buffering tries to get around the issue by using two off-screen buffers in addition to the on-screen buffer, allowing the game to keep rendering as fast as possible (one of the back buffers always holds a complete frame). However, it can result in multiple frames being drawn but never displayed, it requires even more VRAM (which can be particularly problematic with 4K content), and it can potentially introduce an extra frame of lag between input sent to the PC and when that input shows up on the screen.
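
The same sort of simplified model shows the triple-buffering trade-off, assuming the renderer is never blocked and the display picks up the most recently completed frame at each vblank (real implementations vary):

    // triple_buffer.cpp -- simplified model of triple buffering on a 60Hz display.
    // With two back buffers the renderer never waits, but any frame that gets
    // superseded before the next vblank is drawn and then discarded.
    #include <cstdio>

    int main() {
        const double refreshMs  = 1000.0 / 60.0;  // vblank every ~16.67ms
        const double renderMs   = 8.0;            // a fast GPU: ~125 FPS internally
        const double durationMs = 1000.0;

        const int framesRendered = static_cast<int>(durationMs / renderMs);
        int framesShown = 0;
        int lastShown = 0;
        for (double vblank = refreshMs; vblank <= durationMs; vblank += refreshMs) {
            int latestComplete = static_cast<int>(vblank / renderMs);  // frames finished so far
            if (latestComplete > lastShown) {  // a newer frame is ready, so display it
                lastShown = latestComplete;
                ++framesShown;
            }
        }
        std::printf("Rendered: %d, shown: %d, drawn but never displayed: %d\n",
                    framesRendered, framesShown, framesRendered - framesShown);
        return 0;  // roughly 125 rendered, 60 shown, ~65 discarded
    }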

G-SYNC is thus a solution to the above problems, allowing for adaptive refresh rates. Your PC can render frames as fast as it’s able, and the display will swap to the latest frame as soon as it’s available. The result can be a rather dramatic difference in the smoothness of a game, particularly when you’re only able to hit 40-50 FPS, and this is all done with no image tearing. It’s a win-win scenario...with a few drawbacks.
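
For comparison, here’s the adaptive refresh version of the same simplified model: the display simply waits until the frame is ready, as long as the interval stays within the panel’s supported range (30-60Hz in this case; how the module handles rates below the panel’s minimum is glossed over here):

    // adaptive_refresh.cpp -- simplified model of a variable refresh display:
    // each refresh happens as soon as the new frame is complete, clamped to the
    // interval range the panel supports. (std::clamp requires C++17.)
    #include <algorithm>
    #include <cstdio>

    int main() {
        const double minIntervalMs = 1000.0 / 60.0;  // fastest refresh the panel supports (60Hz)
        const double maxIntervalMs = 1000.0 / 30.0;  // slowest refresh the panel supports (30Hz)
        const double renderMs      = 1000.0 / 45.0;  // the GPU manages ~45 FPS
        const double durationMs    = 1000.0;

        double t = 0.0;
        int framesShown = 0;
        while (t < durationMs) {
            t += std::clamp(renderMs, minIntervalMs, maxIntervalMs);  // refresh when ready
            ++framesShown;
        }
        // ~45 evenly spaced frames: no tearing, and no falling back to 30 FPS.
        std::printf("Frames shown in one second: %d\n", framesShown);
        return 0;
    }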

The first, as noted above, is that G-SYNC is for NVIDIA GPUs only. (AMD’s FreeSync alternative should start showing up in displays later this month, as working products were demoed at CES.) The second is cost: licensing G-SYNC technology from NVIDIA, along with some additional hardware requirements, means that G-SYNC displays carry a price premium compared to otherwise identical non-G-SYNC hardware.

4K G-SYNC in Practice

We’ve had G-SYNC displays for most of the past year, though the earliest option was a DIY kit that basically required modding your monitor. The Acer XB280HK is something new: a 4Kp60 G-SYNC display. That’s potentially important because while high-end GPUs can easily run most games at frame rates above 60 FPS at 1920x1080 and even 2560x1440, even a couple of GTX 980 GPUs will struggle to break 60 FPS at 4K in a lot of recent releases. My personal feeling is that G-SYNC with 60Hz displays makes the most sense when you can reach 40-55 FPS; if you’re running slower than that, you need to lower the quality settings or resolution, while if you’re running faster than that it’s close enough to 60 FPS that a few minor tweaks to settings (or a slight overclock of the GPU) can make up the difference.

To get straight to the point, G-SYNC on the XB280HK works just as you would expect. For those times when frame rates are hovering in the “optimal” 40+ FPS range, it’s great to get the improved smoothness and lack of tearing. In fact, in many cases even when you’re able to average more than 60 FPS, G-SYNC is beneficial as it keeps even the minimum frame rates feeling as smooth as possible – so if you’re getting occasional dips to 50 FPS but mostly staying at or above 60, you won’t notice the stutter much if at all. Without G-SYNC (and with VSYNC enabled), those dips end up dropping to 30 FPS, and that’s a big enough difference that you can see and feel it.

There are problems, however, and the biggest is that the native resolution of 3840x2160 still isn’t really ready for prime time (i.e. mainstream users). If you’re running a single GPU, you’re definitely going to fall short of 40 FPS in plenty of games, so you’ll need to further reduce the image quality or lower the resolution – and in fact, there are plenty of times where I’ve run the XB280HK at QHD or even 1080p to improve frame rates (depending on the game and GPU I was using). But why buy a 4K screen to run it at QHD or 1080p? Along with this, while G-SYNC can refresh the panel at rates as low as 30Hz, I find that below 40Hz the pixels on the screen start to decay, resulting in a slight flicker; hence the desire to stay above 40 FPS.

The high resolution also means working in normal applications at 100% scaling can be a bit of an eyestrain (please, no comments from the young bucks with eagle eyes; it’s a real concern and I speak from personal experience), and running at 125% or 150% scaling doesn’t always work properly. Before anyone starts to talk about how DPI scaling has improved, let me quickly point out that during the holiday season, at least three major games I know of shipped in a state where they would break if your Windows DPI was set to something other than 100%. Oops. I keep hoping things will improve, but the software support for HiDPI still lags behind where it ought to be.
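
For reference, the Windows scaling percentages correspond directly to reported DPI values (100% is 96 DPI, 125% is 120, 150% is 144). Here’s a quick Win32 sketch of how an application can check what it’s actually dealing with; this is just an illustration, not something used for the review:

    // dpi_query.cpp -- query the effective system DPI and the Windows scale factor.
    // Link against user32.lib and gdi32.lib. Without declaring DPI awareness
    // (via SetProcessDPIAware or a manifest), a process on a scaled desktop is
    // virtualized and always sees 96 DPI.
    #include <windows.h>
    #include <cstdio>

    int main() {
        SetProcessDPIAware();                      // opt out of DPI virtualization
        HDC screen = GetDC(nullptr);
        int dpiX = GetDeviceCaps(screen, LOGPIXELSX);
        ReleaseDC(nullptr, screen);

        int scalePercent = MulDiv(dpiX, 100, 96);  // 96 -> 100%, 120 -> 125%, 144 -> 150%
        std::printf("System DPI: %d (%d%% scaling)\n", dpiX, scalePercent);
        // Applications that quietly assume 96 DPI everywhere are typically the
        // ones that misbehave at 125% or 150%.
        return 0;
    }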

The other problem with 4Kp60 is that… well, 60Hz just isn’t the greatest experience in the world. I have an older 1080p 3D Vision display that can run the Windows desktop at 120Hz, and while it’s not the sort of night and day difference you get with some technologies, I definitely think 75-85Hz would be a much better “default” than 60Hz. There’s also something to be said for tearing being less noticeable at higher refresh rates. And there’s an alternative to 4Kp60 G-SYNC: 1440p144 (QHD at 144Hz) G-SYNC displays also exist.

Without getting too far off the subject, we have a review of the ASUS ROG Swift PG278Q in the works. Personally, I find the experience of QHD G-SYNC with refresh rates of 30-144Hz to be superior to 4K 30-60Hz G-SYNC for the vast majority of use cases. Others will likely disagree and that’s fine, but on a 27” or 28” panel I just feel QHD is a better option – not to mention that acceptable frame rates are much easier to achieve at QHD than at 4K.

There are some other aspects of using this display worth noting, and while they’re probably not a huge issue since most people will be using the XB280HK with NVIDIA GPUs, the behavior of my XB280HK with AMD GPUs has at times been quirky. For example, I purchased a longer DisplayPort cable because the included 2m cable is a bit of a tight reach for my work area. The 3m cable I bought worked fine on all the NVIDIA GPUs I tested, but when I switched to an AMD GPU… no such luck. I had to drop to 4K @ 24Hz to get a stable image, so I ended up moving my PC around and going back to the original cable.

I’ve also noticed that a very large number of games don’t properly scale lower resolutions to fill the whole screen on AMD GPUs, so running QHD with AMD GPUs often results in a lot of black borders. Perhaps even worse, every game with Mantle support that I’ve tried fails to scale the resolution to the full screen when using Mantle. So Dragon Age: Inquisition at 1080p with Mantle fills the middle quarter of the display area and everything else is black. The problem would seem to lie with the drivers and/or Mantle, but it’s odd nonetheless – odd and undesirable; let’s hope this gets fixed.

Comments

  • JarredWalton - Thursday, January 29, 2015 - link

    This remains to be seen. Adaptive-Sync is part of DisplayPort now, but I don't believe it's required -- it's optional. Which means that it almost certainly requires something in addition to just supporting DisplayPort. What FreeSync has going against it is that it is basically a copy of something NVIDIA created, released as an open standard, but the only graphics company currently interested in supporting it is AMD. If NVIDIA hadn't created G-SYNC, would we even have something coming in March called FreeSync?

    My bet is FreeSync ends up requiring:
    1) Appropriate driver level support.
    2) Some minimum level of hardware support on the GPU (i.e. I bet it won't work on anything prior to GCN cards)
    3) Most likely a more complex scaler in the display to make adaptive sync work.
    4) A better panel to handle the needs of adaptive sync.

    We'll see what happens when FreeSync actually ships. If Intel supports it, that's a huge win. That's also very much an "IF" not "WHEN". Remember how long it took Intel to get the 23.97Hz video stuff working?
  • Black Obsidian - Thursday, January 29, 2015 - link

    I agree with you on 1 and 2, and *possibly* 3, but I wouldn't bet on that last one myself, nor on #4. The monitor reviewed here--with a 60Hz maximum and pixel decay under 40Hz--would seem to suggest that a better panel isn't at all necessary.

    I also completely agree that, absent G-SYNC, Freesync very likely wouldn't exist. But that's often the way of things: someone comes up with a novel feature, the market sees value in it, and a variant of it becomes standardized.

    G-SYNC is brilliant, but G-SYNC is also clumsy, because of the compromises that are necessary when you can't exert control over all of the systems you depend on. Now that a proper standard exists, those compromises are no longer necessary, and the appropriate thing to do is to stop making them and transfer those resources elsewhere. This, of course, assumes that Freesync doesn't come with greater compromises of its own, but there's presently no reason to expect that it does.

    As for Intel, the 23.97Hz issue persisted as long as it did because you could round down the number of people who really cared to "nobody." It's possible that the number of people who care about Freesync in an IGP rounds similarly too, of course.
  • andrewaggb - Thursday, January 29, 2015 - link

    Freesync in an IGP for laptops and tablets would be a big deal I think.
  • nos024 - Friday, January 30, 2015 - link

    That's exactly what I am saying. Basically, we have only two GPU choices for PC gaming, nVidia or AMD. I'd understand the vendor-lock argument if there were a third or fourth player, but if nVidia doesn't support FreeSync, you are basically locked into AMD GPUs for FreeSync gaming.

    I'm sure nVidia can reduce the royalty fee or eliminate it completely, but you know what? There's nothing competing against it right now.

    nVidia seems to get away with lots of things, e.g. for a motherboard to support SLI it needs to be licensed, and SLI only comes on enthusiast chipsets (Z77/Z87/Z97). CrossFire comes free with all Intel chipsets - yet SLI is still pretty popular...just saying.
  • anubis44 - Tuesday, February 3, 2015 - link

    I say let nVidia be a bag of dicks and refuse to support the open standard. Then we'll see their true colours and know to boycott them, the greedy bastards.
  • SkyBill40 - Thursday, January 29, 2015 - link

    I think BO and Jarred have it pretty much covered.
  • anubis44 - Tuesday, February 3, 2015 - link

    Somebody'll hack the nVidia drivers to make nVidia cards work with FreeSync, kind of like the customized Omega drivers for ATI/AMD graphics cards a few years ago. You can count on it. nVidia wants to charge you for something that can easily be done without paying them any licensing fees. I think we should just say no to that.
  • MrSpadge - Thursday, January 29, 2015 - link

    If Intel were smart they'd simply add FreeSync support to their driver. nVidia gamers could use Virtu to route their cards' output through the Intel IGP and its FreeSync display connection. Non-gamers would finally get stutter-free video and G-SYNC would be dead.

    Whether or not Intel ever takes this route, they could do so, and hence FreeSync is not "vendor-locked".
  • phoenix_rizzen - Thursday, January 29, 2015 - link

    "The high resolution also means working in normal applications at 100% scaling can be a bit of an eyestrain (please, no comments from the young bucks with eagle eyes; it’s a real concern and I speak from personal experience), and running at 125% or 150% scaling doesn’t always work properly. Before anyone starts to talk about how DPI scaling has improved, let me quickly point out that during the holiday season, at least three major games I know of shipped in a state where they would break if your Windows DPI was set to something other than 100%. Oops. I keep hoping things will improve, but the software support for HiDPI still lags behind where it ought to be."

    This is something I just don't understand. How can it be so hard?

    An inch is an inch is an inch, it never changes. The number of pixels per inch does change, though, as the resolution changes. Why is it so hard for the graphics driver to adapt?

    A 12pt character should be the exact same size on every monitor, regardless of the DPI, regardless of the resolution, regardless of the screen size.

    We've perfected this in the print world. Why hasn't it carried over to the video world? Why isn't this built into every OS by default?

    Just seems bizarre that we can print at 150x150, 300x300, 600x600, 1200x1200, and various other resolutions in between without the characters changing size (12pt is 12pt at every resolution) and yet this doesn't work on computer screens.
  • DanNeely - Thursday, January 29, 2015 - link

    It's not the drivers; it's the applications. The basic Win32 APIs (like all mainstream UI APIs from the era) are raster based and use pixels as the standard item size and spacing unit. This was done because, on the slower hardware of the era, the overhead of trying to do everything in inches or cm was an unacceptable performance hit, and the range of DPIs the APIs needed to work with wasn't wide enough for it to be a problem.

    You can make applications built on them work with DPI scaling, but it would be a lot of work. At a minimum, everywhere you're doing layout/size calculations you'd need to multiply the numbers you're computing for sizes and positions by the scaling factor. I suspect that if you wanted to avoid occasional bits of low-level jerkiness when resizing, you'd also need to add a bunch of twiddles to manage the remainders you get when scaling doesn't give integral sizes (e.g. 13 * 1.25 = 16.25). If you have any custom controls that you're drawing yourself, you'd need to redo their paint methods as well. It didn't help that prior to Windows 8 you had to log out and back in to change the DPI scaling level, which made debugging it very painful for anyone who tried to make it work.
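
    To make that concrete, here is a minimal sketch of the kind of change involved, using the illustrative 13px size and 1.25 factor from above (generic code, not from any particular application):

        // Every place the old code used a hard-coded pixel size, multiply by the
        // current DPI over the 96 DPI (100%) baseline. MulDiv rounds the fractional
        // results (13 * 1.25 = 16.25 -> 16) so controls don't drift by odd pixels.
        #include <windows.h>
        #include <cstdio>

        int ScaleForDpi(int basePixels, int dpi) {
            return MulDiv(basePixels, dpi, 96);
        }

        int main() {
            const int dpis[] = { 96, 120, 144 };  // 100%, 125%, 150% scaling
            for (int dpi : dpis) {
                std::printf("%d DPI: 13px margin -> %dpx, 24px row -> %dpx\n",
                            dpi, ScaleForDpi(13, dpi), ScaleForDpi(24, dpi));
            }
            return 0;
        }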

    Newer interface libraries are pixel independent and do all the messy work for you, but swapping one in is a major rewrite. For Windows, the first one from MS was Windows Presentation Foundation (WPF), which launched in 2006 and was .NET only. You can mix C/C++ and .NET in a single application, but it's messy and annoying to do at best. Windows 8 was the first version to offer a descendant of WPF to C++ applications directly, but between the lack of compatibility with Win7 systems (meaning the need to maintain two different UIs) and the general dislike of the non-windowed nature of Metro applications, it hasn't gained much traction in the market.

    Disclosure: I'm a software developer whose duties include maintaining several internal or single-customer line-of-business applications written in .NET using the non-DPI-aware Windows Forms UI library. Barring our internal systems being upgraded to Win8 or higher (presumably Win10) and high-DPI displays, or a request from one of our customers to make it happen (along with enough money to pay for it), I don't see any of what I maintain getting the level of rewrite needed to retrofit DPI awareness.
