Final Words

After spending a few days with G-Sync, I’m just as convinced as I was in Montreal. The technology, albeit a relatively simple manipulation of display timing, is a key ingredient in delivering a substantially better gaming experience.

In pathological cases the impact can be shocking, particularly if you’re coming from a 60Hz panel today (with or without v-sync). The smoothness afforded by G-Sync is just awesome. I didn’t even realize how much of the v-sync related stutter I had simply come to accept. I’d frequently find a scene that stuttered a lot with v-sync enabled and approach it fully expecting G-Sync to somehow fail at smoothing things out this time. I always came away impressed. G-Sync also lowered the minimum frame rate I needed before stutter became a distraction. Dropping below 30 fps is still bothersome, but in all of the games I tested, as long as I could keep frame rates north of 35 fps the overall experience was great.

In many situations the impact of G-Sync can be subtle. If you’re not overly bothered by tearing or are ok with v-sync stuttering, there’s really nothing G-Sync can offer you. There’s also the fact that G-Sync addresses a problem that isn’t visible 100% of the time. Unlike moving to a higher resolution or increasing quality settings, G-Sync’s value is best realized in specific scenarios where there’s a lot of frame rate variability - particularly between 30 and 60 fps. Staying in that sweet spot is tougher to do on a 1080p panel, especially if you’ve already invested in a pretty fast video card.

If you’re already running games at a fairly constant 60 fps, G-Sync will let you crank up quality levels even further without significantly reducing the smoothness of your experience. I feel like G-Sync will be of even more importance with higher resolution displays, where it’s a lot harder to maintain 60 fps. Ideally I’d love to see a 2560 x 1440 G-Sync display with an IPS panel that maybe even ships properly calibrated from the factory. I suspect we’ll at least get the former.

There's also the question of what happens once game developers can assume the world is running on displays with variable refresh rates. All of a sudden, targeting frame rates between 30 and 60 fps becomes far less of a tradeoff.

NVIDIA hasn’t disclosed much about G-Sync pricing, although ASUS has already given us a little guidance. The VG248QE currently sells for $280 on Newegg, while the upcoming G-Sync enabled flavor will apparently sell for $400. That $120 premium can be a tough pill to swallow. An increase of more than 40% in display cost is steep, which is another reason why I feel like NVIDIA might have a little more success pushing G-Sync as part of a higher end display. On the flip side, NVIDIA could easily get those costs down by migrating from an FPGA to an ASIC, although to justify that move we’d have to see pretty broad adoption of G-Sync.

Some system integrators will be selling the aftermarket upgraded VG248QE between now and CES, but you can expect other displays to be announced over the coming months. NVIDIA still hasn't figured out if/how it wants to handle end user upgrades for those who already own VG248QE displays.

I feel like NVIDIA is slowly but surely assembling the components of a truly next-generation gaming experience. With all of the new consoles launched, the bar is set for the next several years. PCs already exceed what consoles are capable of in terms of performance, but the focus going forward really needs to be on improving ease of use as well as the rest of the experience. Things like GeForce Experience are a step in the right direction, but they need far more polish and, honestly, integration into something like Steam. G-Sync just adds to the list. For PC gaming to continue to thrive, it needs to evolve into something even more polished than it is today. It’s not enough to offer higher resolution and a better looking image than other platforms; it’s just as important to provide a smoother and more consistent experience. G-Sync attempts to do just that, and succeeds.

With G-Sync enabled, I began to expect and demand more visually from my games. Aliasing and other rendering imperfections were far more pronounced once a big portion of the stuttering was removed. G-Sync isn't the final solution, but rather the first on a long list of things that need improving. There are other use cases for G-Sync outside of gaming as well. Streaming video, where bandwidth constraints force a variable frame rate, is one I’ve heard passed around.

Although G-Sync is limited to NVIDIA hardware (GeForce GTX 650 Ti Boost or greater), the implementation seems simple enough that other manufacturers should be able to do something similar. That limitation is obviously the biggest issue with what we have here today: it only works with NVIDIA hardware. For die-hard NVIDIA fans, I can absolutely see a G-Sync monitor as a worthy investment. You might just want to wait for some more displays to hit the market first.

Comments

  • just4U - Friday, December 13, 2013 - link

    The only stumbling block I really have is being tied to one video chip maker because of the Monitor I buy. That's a problem for me...Stuff like this has to be a group effort.
  • butdoesitwork - Friday, December 13, 2013 - link

    "The interval remains today in LCD flat panels, although it’s technically unnecessary."

    Technically unnecessary from whose perspective? Anand, I'm sure you meant the monitor's perspective, but this otherwise benign comment on VBLANK is misleading at best and dangerous at worst. The last thing we need is folks going around stirring the pot saying things aren't needed. Some bean counter might actually try to muck things up.

    VBLANK most certainly IS "technically" needed on the other end ---- every device from your Atari VCS to your GDDR5 graphics card!

    VBLANK is the only precious time you have to do anything behind the display's back.
    On the Atari VCS, that was the only CPU time you had to run the game program.
    On most consoles (NES on up), that was the only time you had to copy finished settings for the next frame to the GPU. (And you use HBLANK for per-line effects, too. Amiga or SNES anyone?)

    On most MODERN consoles (GameCube through Xbox One), you need to copy the rendered frame from internal memory to some external memory for display. And while you can have as many such external buffers as you need (meaning the copy could happen any time), you can bet some enterprising programmers use only one (to save RAM footprint). In that case VBLANK is the ONLY time you have to perform the copy without tearing.

    On any modern graphics card, VBLANK is the precious time you have to hide mode changes of nondeterministic duration that might otherwise cause display corruption - notably GDDR5 retraining operations, or getting out of any crazy power saving modes. Of course it's true all GPU display controllers have FIFOs and special priority access to avoid display corruption due to memory bandwidth starvation, but some things you just cannot predict well enough, and proper scheduling is a must.
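
    To put a number on how precious that window is, here's a quick back-of-the-envelope in Python, using the standard 1080p60 timing (1125 total lines, 1080 active) and an assumed round-number copy bandwidth; illustrative figures, not measurements:

        # How long is VBLANK on a classic fixed-refresh 1080p60 display,
        # and how much data fits "behind the display's back"?
        total_lines, active_lines, refresh = 1125, 1080, 60.0

        frame_time = 1.0 / refresh
        vblank_time = frame_time * (total_lines - active_lines) / total_lines
        print(f"VBLANK lasts {vblank_time*1e3:.2f} ms per frame")   # ~0.67 ms

        bandwidth = 20e9   # assumed 20 GB/s effective copy rate, not measured
        print(f"~{vblank_time*bandwidth/1e6:.0f} MB fits in that window")

        # One 32-bit 1080p frame, what a single-buffered console must move:
        print(f"one frame is {1920*1080*4/1e6:.1f} MB")

    With those assumptions a full frame copy only just fits, which is exactly why a single-buffered setup has no slack outside VBLANK.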
  • DesktopMan - Friday, December 13, 2013 - link

    G-Sync isn't VBLANK, so, yeah, if you have G-Sync you don't need VBLANK. You can take your time rendering, not worry about what the monitor is doing, and push your updated frame once it's ready. This moves the timing responsibility from the monitor to the GPU, which obviously is a lot more flexible.

    If you need time to do GPU configuration or other low level stuff as you mention, just do that work and push the next frame when it's done. None of it will result in display corruption, because you are not writing to the display. You really can rethink the whole setup from the bottom up with this. Comparing it to systems that don't work this way is kinda meaningless.
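
    A toy Python simulation of the two presentation models being contrasted here; the render times are made-up numbers and the v-sync side assumes simple double buffering, so treat it as a sketch of the idea rather than how any real driver behaves:

        import random

        random.seed(1)
        VSYNC = 1.0 / 60
        frame_times = [random.uniform(0.014, 0.024) for _ in range(8)]  # 14-24 ms renders

        def visible_fixed(times):
            """Fixed refresh: a finished frame waits for the next refresh boundary."""
            t, out = 0.0, []
            for ft in times:
                done = t + ft
                flip = (int(done / VSYNC) + 1) * VSYNC   # next 60 Hz boundary
                out.append(flip)
                t = flip            # double buffering: next frame starts at the flip
            return out

        def visible_variable(times):
            """Variable refresh: the panel refreshes the moment the frame is ready."""
            t, out = 0.0, []
            for ft in times:
                t += ft
                out.append(t)
            return out

        for v, g in zip(visible_fixed(frame_times), visible_variable(frame_times)):
            print(f"v-sync: {v*1000:6.1f} ms   variable: {g*1000:6.1f} ms")

    The fixed-refresh column quantizes every frame to a 16.7 ms grid; the variable column just shows frames when they're done.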
  • mdrejhon - Friday, December 13, 2013 - link

    Although true -- especially for DisplayPort packet data and LCD panels -- this is not the only way to interpret GSYNC.

    Scientifically, GSYNC can be interpreted as a variable-length VBLANK.

    Remember the old analog TVs - the rolling picture when VHOLD was bad? That black bar is VBLANK (also called VSYNC). With variable refresh rates, that black bar becomes variable-height, padding the time between refreshes. This is one way to conceptually understand GSYNC if you're an old-timer television engineer. You could theoretically do GSYNC over an analog cable this way, via the variable-length blanking interval technique.
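
    A small Python sketch of that interpretation: hold the line rate fixed and stretch the blanking interval to pad out whatever frame time the GPU needs. The 1125-line/144 Hz figures are assumptions for illustration, not the actual GSYNC signal parameters:

        ACTIVE_LINES = 1080
        LINE_TIME = (1.0 / 144) / 1125   # seconds per scanline at the max refresh

        def blanking_lines(frame_time):
            """Blank lines needed so the next refresh starts when the frame is ready."""
            total_lines = frame_time / LINE_TIME
            return max(0, int(total_lines) - ACTIVE_LINES)

        for fps in (144, 100, 60, 40):
            print(f"{fps:3d} fps -> {blanking_lines(1.0/fps):5d} lines of VBLANK padding")

    At the 144 fps maximum this gives the usual 45-line blanking interval; at lower frame rates the "black bar" simply grows.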
  • mdrejhon - Friday, December 13, 2013 - link

    Yeah, put this into perspective:
    "Refresh rates" is an artificial invention
    "Frame rate" is an artifical invention

    We had to invent them about a century ago, when the first movies came out (19th century) and then the first televisions (early 20th century). There was no other way to display recorded motion to framerateless human eyes, so we had to come up with the invention of a discrete series of images, which necessitates an interval between them. Continuous, real-life motion has no "interval", no "refresh rate", no "frame rate".
  • darkfalz - Friday, December 13, 2013 - link

    Will this polling performance hit be resolvable by future driver versions, or only by hardware changes?
  • Strulf - Friday, December 13, 2013 - link

    While you can still see the effects at 120 Hz, tearing and lag are barely visible at all at 144 Hz. At this point, I can easily do without G-Sync.
    G-Sync is certainly a nice technology if you use a 60 Hz monitor and an NVIDIA card, but nearly the same effect can be achieved with lots of Hz. A 27" 1440p@144Hz monitor might be quite expensive though. ;)
  • emn13 - Friday, December 13, 2013 - link

    The article states that at any framerate below 30fps G-Sync doesn't work, since the panel is forced to refresh at a minimum of 30Hz. That conclusion doesn't make sense: unlike a "true" 30Hz refresh rate, after every forced refresh G-Sync can allow a quicker refresh again.

    Since the refresh latency on this panel is 1s/144 ≈ 7ms, and a forced 30Hz refresh takes ~33ms, a frame whose rendering takes longer than 33ms but shorter than ~40ms will finish during the forced refresh and will need to wait for it to complete. Translated, that means you only get stuttering if the instantaneous frame rate is between 25 and 30 fps. In practice, I'd expect frame rates to rarely be that consistent; you'll get some >30fps and some <25fps moments, so even in a bad case I'd expect the stuttering to be reduced somewhat - at least, were it not for the additional overhead due to polling.
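
    The arithmetic above, worked through in Python (the 144 Hz scan-out and 30 Hz forced-refresh figures come from the article; the collision-window logic is this comment's reasoning, not a measured behaviour):

        scanout = 1.0 / 144   # ~6.9 ms to scan one refresh out to the panel
        forced  = 1.0 / 30    # ~33.3 ms: panel self-refreshes if no frame arrives

        # A frame that completes while a forced refresh is still scanning out
        # must wait for it to finish, so the collision window is:
        lo, hi = forced, forced + scanout
        print(f"collision if frame time is between {lo*1e3:.1f} and {hi*1e3:.1f} ms")
        print(f"i.e. instantaneous frame rate between {1/hi:.1f} and {1/lo:.1f} fps")
        # -> ~24.8 to 30.0 fps, the "between 25 and 30 fps" window above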
  • mdrejhon - Friday, December 13, 2013 - link

    And since the frame scan-out time is 1/144 sec, that's one very tiny stutter (6.9ms) in a forest of high-latency frames (33ms+).
  • 1Angelreloaded - Friday, December 13, 2013 - link

    Just a thought, but one thing you didn't really take into account is NVIDIA's next GPU series, Maxwell, which supposedly will have its VRAM right next to the die and will share stored data with direct access from the CPU. Take that into account along with G-Sync and you can see what will happen to framerates if they can pull off a die shrink as well.
    At this point I think monitor technology is so far behind, and so profit-milked, that the industry is being stifled; 1080p has had a far longer run than we could have expected, partially due to mainstream HDTVs. We should have had the next jump in resolution as a standard over a year ago in the $200-$300 price range, with lower-latency IPS panels (or other display types) in that range as well. LED backlighting carried a heavy price premium when in reality it's far cheaper to produce, since it takes hazardous-waste CFL bulbs, which cost more and carry disposal fees, out of the equation.
