Final Words

After spending a few days with G-Sync, I’m just as convinced as I was in Montreal. The technology, albeit a relatively simple manipulation of display timing, is a key ingredient in delivering a substantially better gaming experience.

In pathological cases the impact can be shocking, particularly if you’re coming from a 60Hz panel today (with or without v-sync). The smoothness afforded by G-Sync is just awesome. I didn’t even realize how much v-sync related stutter I had simply come to accept. I’d frequently find a scene that stuttered a lot with v-sync enabled and approach it fully expecting G-Sync to somehow fail at smoothing things out this time. I always came away impressed. G-Sync also lowered the minimum frame rate I needed before stuttering became a distraction. Dropping below 30 fps is still bothersome, but in all of the games I tested, as long as I could keep frame rates north of 35 fps the overall experience was great.
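To make the stutter mechanism concrete, here’s a minimal C++ sketch of my own (not NVIDIA’s implementation; it ignores buffering back-pressure and uses made-up frame times). With v-sync, a finished frame waits for the next 16.7ms refresh boundary, so similar amounts of GPU work reach the screen at uneven intervals; with a variable refresh display, the screen updates the moment the frame is done.

```cpp
// Illustrative sketch only: why fixed-refresh v-sync turns variable frame
// times into stutter. Simplified model; ignores double-buffer back-pressure.
#include <cmath>
#include <cstdio>

int main() {
    const double refresh = 1000.0 / 60.0;                 // 16.7 ms per refresh at 60Hz
    const double frameTimes[] = {20.0, 24.0, 18.0, 30.0}; // hypothetical GPU frame times (ms)

    double done   = 0.0; // wall-clock time at which the GPU finishes each frame
    double prevVs = 0.0; // when the previous frame reached the screen under v-sync
    double prevVr = 0.0; // same, on a variable refresh display
    for (double ft : frameTimes) {
        done += ft;
        // v-sync: a finished frame waits for the next refresh boundary
        double vs = std::ceil(done / refresh) * refresh;
        printf("GPU work %4.1f ms | screen-update interval: %4.1f ms (v-sync) vs %4.1f ms (variable refresh)\n",
               ft, vs - prevVs, done - prevVr);
        prevVs = vs;
        prevVr = done;
    }
    // Output: frames that each took 18 - 30 ms of GPU work hit the screen at
    // alternating 16.7/33.3 ms intervals under v-sync (stutter), but track
    // the render time exactly with variable refresh (smooth).
    return 0;
}
```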

In many situations the impact of G-Sync can be subtle. If you’re not overly bothered by tearing, or are OK with v-sync stuttering, there’s really nothing G-Sync can offer you. There’s also the fact that G-Sync addresses a problem that isn’t visible 100% of the time. Unlike moving to a higher resolution or increasing quality settings, G-Sync’s value is best realized in specific scenarios with a lot of frame rate variability - particularly between 30 and 60 fps. Staying in that sweet spot is tougher to do on a 1080p panel, especially if you’ve already invested in a pretty fast video card.

If you’re already running games at a fairly constant 60 fps, G-Sync will let you crank up quality levels even further without significantly reducing the smoothness of your experience. I feel like G-Sync will matter even more with higher resolution displays, where it’s a lot harder to maintain 60 fps. Ideally I’d love to see a 2560 x 1440 G-Sync display with an IPS panel that maybe even ships properly calibrated from the factory. I suspect we’ll at least get the former.

There's also the question of what happens once game developers can assume the world is running on displays with variable refresh rates. All of a sudden, targeting frame rates between 30 and 60 fps becomes far less of a tradeoff.
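As a purely hypothetical illustration of that tradeoff, the C++ sketch below shows what an engine-side quality governor might look like once variable refresh can be assumed: instead of locking to 30 or 60 fps, it nudges a detail knob to keep frame times inside a window. The function, thresholds, and step sizes are all invented here, with the ~35 fps floor borrowed from my experience above.

```cpp
// Hypothetical sketch, not from any shipping engine: with a variable refresh
// display an engine can target a frame-time *window* rather than a hard
// 30 or 60 fps lock.
#include <algorithm>
#include <cstdio>

static double quality = 1.0; // detail knob: 0.0 = lowest, 1.0 = highest

void adaptQuality(double lastFrameMs) {
    const double fastMs = 1000.0 / 60.0; // below this, extra fps buys little
    const double slowMs = 1000.0 / 35.0; // stay north of ~35 fps
    if (lastFrameMs > slowMs)
        quality = std::max(0.0, quality - 0.05); // shed detail before stutter sets in
    else if (lastFrameMs < fastMs)
        quality = std::min(1.0, quality + 0.01); // slowly claw detail back
    // Anything in between needs no action: the display simply matches the rate.
}

int main() {
    const double frames[] = {16.0, 25.0, 31.0, 29.0, 22.0}; // made-up frame times (ms)
    for (double ms : frames) {
        adaptQuality(ms);
        printf("frame %4.1f ms -> quality %.2f\n", ms, quality);
    }
    return 0;
}
```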

NVIDIA hasn’t disclosed much about G-Sync pricing, although ASUS has already given us a little guidance. The VG248QE currently sells for $280 on Newegg, while the upcoming G-Sync enabled flavor will apparently sell for $400. The $120 premium can be a tough pill to swallow. A roughly 43% increase in display cost is steep, which is another reason why I feel like NVIDIA might have a little more success pushing G-Sync as part of a higher end display. On the flip side, NVIDIA could easily get those costs down by migrating from an FPGA to an ASIC, although to justify that move we’d have to see pretty broad adoption of G-Sync.

Some system integrators will be selling the aftermarket upgraded VG248QE between now and CES, but you can expect other displays to be announced over the coming months. NVIDIA still hasn't figured out if/how it wants to handle end user upgrades for those who already own VG248QE displays.

I feel like NVIDIA is slowly but surely assembling the components of a truly next-generation gaming experience. With all of the new consoles launched, the bar is set for the next several years. PCs already exceed what consoles are capable of in terms of performance, but the focus going forward really needs to be on improving ease of use as well as the rest of the experience. Things like GeForce Experience are a step in the right direction, but they need far more polish and, honestly, integration into something like Steam. G-Sync just adds to the list. For PC gaming to continue to thrive, it needs to evolve into something even more polished than it is today. It’s not enough to offer higher resolution and a better looking image than what you can get on other platforms; it’s just as important to provide a smoother and more consistent experience. G-Sync attempts to do just that, and succeeds.

With G-Sync enabled, I began to expect/demand more visually from my games. Aliasing and other rendering imperfections were far more pronounced once a big portion of the stuttering was removed. G-Sync isn't the final solution, but rather the first on a long list of things that need improving. There are use cases for G-Sync outside of gaming as well; streaming video, where bandwidth constraints force a variable frame rate, is one I’ve heard passed around.

Although G-Sync is limited to NVIDIA hardware (GeForce GTX 650 Ti Boost or greater), the implementation seems simple enough that other manufacturers should be able to do something similar. That’s obviously the biggest issue with what we have here today - it only works with NVIDIA hardware. For die hard NVIDIA fans, I can absolutely see a G-Sync monitor as being a worthy investment. You might just want to wait for some more displays to hit the market first.

Comments

  • Ryan Smith - Sunday, December 15, 2013 - link

    That's Volta (2016?), not Maxwell.

    http://images.anandtech.com/doci/6846/GPURoadmap.j...
  • Totally - Friday, December 13, 2013 - link

    I just don't see why they can't integrate this into the video card completely, and work out a method with display manufacturers to bypass the scaler. Closest guess I have is Nvidia doesn't want to add that $120 (if that) to their cards.
  • MrSpadge - Saturday, December 14, 2013 - link

    Thanks for that very interesting review, Anand! Glad you're not only doing iWhatever by now ;)

    Some people have brought up the point that one could simply get more raw GPU horsepower and push for high frame rates with VSync on. I think GSync is superior, in fact I'd formulate it the other way around: it could let you get away with a smaller GPU, since 30 - 60 fps is fine with GSync on. Apart from buying the GPUs this also saves on power consumption, cooling requirements, noise etc.

    And this could go really well with mechanisms built into the game engines to ensure a certain minimum frame rate by dynamically skipping or reducing the complexity of less important stuff.

    And decoupling of the AI and interface from the display refresh, of course.
  • godihatework - Saturday, December 14, 2013 - link

    My question is: will this ever make it to laptops? I think the potential benefit is much higher in a mid-range laptop scenario than in a high-end gaming desktop.
  • mutantmagnet - Saturday, December 14, 2013 - link

    http://www.youtube.com/watch?v=KhLYYYvFp9A&t=4...

    When will you guys talk about G-Sync's alternative modes? I was curious about what improvements they made to Lightboost with their Low Persistence Mode. I was a little shocked to see you say G-Sync won't have much benefit for those who can push out a lot of FPS, which is correct, but LPM is supposed to address those types of people and it wasn't mentioned.
  • nand - Saturday, December 14, 2013 - link

    When are DIY modding kits coming out?
  • coolhund - Saturday, December 14, 2013 - link

    Stuttering? Stuttering? What?
    I have never experienced stuttering with Vsync on, except when I turn pre-rendered frames up too high. They should always be set to 0 or 1, never more, or you GET stuttering and input lag. Also, LCDs have far too much motion blur to notice slight stuttering.
  • Murloc - Sunday, December 15, 2013 - link

    I've never experienced tearing nor stuttering, vsync enabled or not.
    Weird.
  • frag85 - Sunday, December 15, 2013 - link

    Unfortunately, with this being a proprietary hardware/software 'gimmick' I don't see it taking off. A standard needs to be created for this to be viable for everyone. If it can be adopted by everyone (ATI, Intel, Matrox, VIA etc.) it will no longer be a gimmick.
  • lilkwarrior - Tuesday, December 17, 2013 - link

    This is hardly a gimmick. This is absolutely a game changer that makes Nvidia have all the cards as far as capturing high-end gamers.

    Of course it makes sense for it to eventually be licensed out to ALL high-end gamers (people willing to buy Nvidia's and ATI's flagship GPU cards around this time of the year annually), but it also makes sense for them not to do that for a year or so.

    The benefits are a no-brainer, even though the problem it solves isn't necessarily critical unless you're already in the niche part of the audience that values high-quality entertainment.

    It's not too different from retina displays by Apple early on, if you ignore the closed access to this technology: you have to SEE it to believe it, but the problems it solves make it a no-brainer if you're already willing to pay that much for a laptop/desktop.

    If you're deciding between the best tier of Nvidia card or ATI card right now for gaming, this technology--along with PhysX--makes Nvidia the rational choice right now to side with.

    They undisputedly have the best cards this year, Nvidia provides the better proprietary technology that enhances games in a progressively-enhancing way (if you don't have it, you don't notice it; if you do have it, it's very satisfying), and their SLI is more consistent in what it delivers compared to Crossfire, which has had issues ATI is still solving to bring parity back to that discussion.

    G-Sync being exclusive to Nvidia graphics cards for a year isn't too much of a big deal.

    I wouldn't be surprised nonetheless if they license it after a year or so to offset any attempt for someone to research their own answer to it.

    My only concern is the component that replaces the monitor scaler: would it pose a problem with non-Nvidia cards in the future?
