Final Words

After spending a few days with G-Sync, I’m just as convinced as I was in Montreal. The technology, albeit a relatively simple manipulation of display timing, is a key ingredient in delivering a substantially better gaming experience.

In pathological cases the impact can be shocking, particularly if you’re coming from a 60Hz panel today (with or without v-sync). The smoothness afforded by G-Sync is just awesome. I didn’t even realize how much v-sync-related stutter I had simply come to accept. I’d frequently find a scene that stuttered badly with v-sync enabled and approach it fully expecting G-Sync to somehow fail at smoothing things out this time. I always came away impressed. G-Sync also lowered my minimum frame rate requirement for not being distracted by stuttering. Dropping below 30 fps is still bothersome, but in all of the games I tested, as long as I could keep frame rates north of 35 fps the overall experience was great.

In many situations the impact of G-Sync can be subtle. If you’re not overly bothered by tearing, or are OK with v-sync stuttering, there’s really nothing G-Sync can offer you. There’s also the fact that G-Sync addresses a problem that isn’t visible 100% of the time. Unlike moving to a higher resolution or increasing quality settings, G-Sync’s value is best realized in specific scenarios with a lot of frame rate variability - particularly between 30 and 60 fps. Staying in that sweet spot is tougher on a 1080p panel, especially if you’ve already invested in a fairly fast video card.

If you’re already running games at a fairly constant 60 fps, what G-Sync will allow you to do is to crank up quality levels even more without significantly reducing the smoothness of your experience. I feel like G-Sync will be of even more importance with higher resolution displays where it’s a lot harder to maintain 60 fps. Ideally I’d love to see a 2560 x 1440 G-Sync display with an IPS panel that maybe even ships properly calibrated from the factory. I suspect we’ll at least get the former.

There's also the question of what happens once game developers can assume the world is running on displays with variable refresh rates. All of a sudden, targeting frame rates between 30 and 60 fps becomes far less of a tradeoff.

NVIDIA hasn’t disclosed much about G-Sync pricing, although ASUS has already given us a little guidance. The VG248QE currently sells for $280 on Newegg, while the upcoming G-Sync-enabled flavor will apparently sell for $400. That $120 premium can be a tough pill to swallow. A more than 40% increase in display cost is steep, which is another reason I feel NVIDIA might have more success pushing G-Sync as part of a higher end display. On the flip side, NVIDIA could get those costs down by migrating from an FPGA to an ASIC, although to justify that move we’d have to see pretty broad adoption of G-Sync.

Some system integrators will be selling the aftermarket upgraded VG248QE between now and CES, but you can expect other displays to be announced over the coming months. NVIDIA still hasn't figured out if/how it wants to handle end user upgrades for those who already own VG248QE displays.

I feel like NVIDIA is slowly but surely assembling the components of a truly next-generation gaming experience. With all of the new consoles launched, the bar is set for the next several years. PCs already exceed what consoles are capable of in terms of performance, but the focus going forward really needs to be on improving ease of use as well as the rest of the experience. Things like GeForce Experience are a step in the right direction, but they need far more polish and, honestly, integration into something like Steam. G-Sync just adds to the list. For PC gaming to continue to thrive, it needs to evolve into something even more polished than it is today. It’s not enough to offer higher resolution and a better-looking image than other platforms; it’s just as important to provide a smoother and more consistent experience. G-Sync attempts to do just that, and succeeds.

With G-Sync enabled, I began to expect, even demand, more visually from my games. Aliasing and other rendering imperfections were far more pronounced once a big portion of the stuttering was removed. G-Sync isn't the final solution, but rather the first item on a long list of things that need improving. There are use cases for G-Sync outside of gaming as well; streaming video, where bandwidth constraints force a variable frame rate, is one I’ve heard tossed around.

Although G-Sync is limited to NVIDIA hardware (GeForce GTX 650 Ti Boost or greater), the implementation seems simple enough that other manufacturers should be able to do something similar. That’s obviously the biggest issue with what we have here today: it only works with NVIDIA hardware. For die-hard NVIDIA fans, I can absolutely see a G-Sync monitor as a worthy investment. You might just want to wait for more displays to hit the market first.

Comments (193)

  • extide - Thursday, December 12, 2013 - link

    I totally agree. This could seemingly be solved with a much simpler solution that included a new display protocol/standard. Hopefully this is just the tip of the iceberg, and a more sensible solution will come in the future.
  • Pastuch - Thursday, December 12, 2013 - link

    Solid points... If someone gave me a LB monitor I wouldn't use it though, because I love 1440p. I was close to buying a 27" Lightboost monitor, but when I saw my friend's Qnix I changed my mind instantly. The tradeoffs to get LB are too drastic. G-Sync looks to make lower fps feel better; I'd rather have higher fps.
  • fade2blac - Thursday, December 12, 2013 - link

    Shouldn't Adaptive V-Sync be thrown into the mix as well? I thought this was also supposed to be a way to improve the user experience in situations where framerates drop below your display's refresh rate. G-Sync seems to be a better and more direct solution to the problem, but it requires one to buy new specialized (a.k.a. more expensive) hardware and also (currently) limits connectivity options.

    Ideally, I would much rather that this pushes development of an open standard that leverages DVI/HDMI/DP, which will likely require "smarter" displays, but doesn't discriminate on the GPU and connectivity side. Further fragmenting the market by implementing yet another proprietary solution to an otherwise universal problem will severely limit the adoption. I assume there are patents, etc. that likely prevent anyone from implementing similar solutions without having to license this "novel" idea from nVidia.
  • eanazag - Thursday, December 12, 2013 - link

    I think this may make more sense on gaming laptops. I say this because high-res panels on laptops could be better supported with less than a top-of-the-line GPU that costs $300-400.
  • ant6n - Thursday, December 12, 2013 - link

    On the first page, the second diagram illustrates "V-Sync off causes 'Tearing'". But in the image, why does the GPU wait until the monitor's refresh to start drawing the new frame? I thought the point was that the GPU waits to display the new frame until the monitor refreshes; there's no reason to wait for the refresh to start drawing. And in the given example there would be no lag, because the three frames could be drawn in time if the GPU didn't wait.

    Another question: if your card can't push 60Hz, why not just run the game at 40Hz and v-sync to that? If the GPU can push at least 40Hz, there will be no stuttering.
  • Traciatim - Thursday, December 12, 2013 - link

    Because monitors generally run at fixed refresh rates, and you can't easily just sync to 40Hz. 40Hz content on a 60Hz panel would mean judder, since some frames would be displayed for different lengths of time, which you pick up as unnatural movement even if your frame rate is pretty high. A better option would be to sync to evenly divisible numbers: if you have a 144Hz panel, you draw a new frame every refresh for 144 fps, every 2 refreshes for 72 fps, every 3 for 48 fps, etc.

    That or you could just use G-Sync and tell the monitor when the frame is ready so you don't ever have to wait around for the monitor.
  • ant6n - Thursday, December 12, 2013 - link

    Are you saying that panels have a fixed internal refresh rate? I know they have a fixed native resolution, but the refresh rate should be whatever comes in from the PC.
  • SlyNine - Friday, December 13, 2013 - link

    Umm, that's kinda the whole point of G-Sync: monitors run at a fixed Hz. In fact, that's the whole point of V-sync.
  • ant6n - Friday, December 13, 2013 - link

    Of course they run at a fixed Hz at any given time. But Traciatim seems to claim an LCD panel will always display at 60Hz, even if the GPU drives it at 40Hz, resulting in judder.
  • Floflo81 - Monday, December 16, 2013 - link

    It highly depends on the monitor, and is rarely mentioned in the technical specs.
    My ASUS PA238Q accepts any refresh rate between 40 and 70Hz, but anything other than 60 Hz indeed results in judder (skipped or missing frames...) because the panel always runs at 60 Hz.
    But some other screens behave differently.
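
The evenly-divisible frame pacing Traciatim describes above can be sketched in a few lines of Python. This is a minimal illustration of the arithmetic only, not anything from the article; the function name and the 20 fps cutoff are my own assumptions:

```python
# On a fixed-refresh panel, the judder-free frame rates are those where each
# frame is held on screen for a whole number of refresh cycles (panel_hz / n).

def judder_free_rates(panel_hz, min_fps=20):
    """Frame rates that divide evenly into the panel's refresh rate."""
    rates = []
    hold = 1  # number of refresh cycles each frame stays on screen
    while panel_hz / hold >= min_fps:
        rates.append(panel_hz / hold)
        hold += 1
    return rates

print(judder_free_rates(144))  # [144.0, 72.0, 48.0, 36.0, 28.8, 24.0, 20.571...]
print(judder_free_rates(60))   # [60.0, 30.0, 20.0]
```

Any frame rate not on that list forces some frames to sit on screen longer than others, which is the judder being discussed; G-Sync sidesteps the whole table by letting the panel refresh whenever a frame is ready.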
