How it Plays

The requirements for G-Sync are straightforward. You need a G-Sync enabled display (in this case the modified ASUS VG248QE is the only one “available”; more on this later). You need a GeForce GTX 650 Ti Boost or better with a DisplayPort connector, a DP 1.2 cable, a game capable of running in full screen mode (G-Sync reverts to v-sync if you run in a window) and Windows 7 or 8.1.

G-Sync enabled drivers are already available at GeForce.com (R331.93). Once you’ve met all of the requirements you’ll see the appropriate G-Sync toggles in NVIDIA’s control panel. Even with G-Sync on you can still control the display’s refresh rate. To maximize the impact of G-Sync, NVIDIA’s reviewer’s guide recommends testing v-sync on/off at 60Hz but G-Sync at 144Hz. For the sake of not being silly, I ran all of my comparisons at either 60Hz or 144Hz and never mixed the two, in order to isolate the impact of G-Sync alone.

NVIDIA sampled the same pendulum demo it used in Montreal a couple of months ago to demonstrate G-Sync, but I spent the vast majority of my time with the G-Sync display playing actual games.

I’ve been using Falcon NW’s Tiki system for any experiential testing ever since it showed up with NVIDIA’s Titan earlier this year. Naturally that’s where I started with the G-Sync display. Unfortunately the combination didn’t fare all that well, with the system exhibiting hard locks and very low in-game frame rates with the G-Sync display attached. I didn’t have enough time to debug the setup further and plan to ship the system to NVIDIA as soon as possible to see if they can find the root cause of the problem. Thankfully, switching to a Z87 testbed with an EVGA GeForce GTX 760 proved totally problem-free with the G-Sync display.

At a high level, the sweet spot for G-Sync is going to be a situation where your frame rate regularly varies between 30 and 60 fps. Game/hardware/settings combinations that result in frame rates below 30 fps will still exhibit stuttering, since the G-Sync display will be forced to repeat frames. Similarly, if your frame rate is equal to your refresh rate (60, 120 or 144 fps in this case), you won’t really see any advantages over plain old v-sync.
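To make that behavior concrete, here is a rough Python sketch of when frames actually reach the screen under fixed-refresh v-sync versus a variable-refresh scheme like G-Sync. It is a toy model for illustration only, not NVIDIA's implementation: the 60Hz v-sync interval, the ~33ms maximum hold time (the 30Hz floor) and the ~7ms (1/144s) scan-out are assumptions drawn from the behavior described in this review.

# Toy model: when does each frame actually reach the screen?
# Assumptions (illustrative only): 60Hz fixed refresh for v-sync,
# a 30Hz minimum and 144Hz maximum refresh for the variable case.

REFRESH_60HZ = 1000 / 60    # ~16.7 ms between fixed v-sync scan-outs
MAX_HOLD     = 1000 / 30    # ~33.3 ms before the panel must repeat a frame
SCAN_OUT     = 1000 / 144   # ~6.9 ms to scan one frame onto this panel

def vsync_present(render_times_ms):
    """Each frame waits for the next fixed 60Hz refresh boundary."""
    t, shown = 0.0, []
    for r in render_times_ms:
        t += r                                      # frame finishes rendering
        next_tick = (t // REFRESH_60HZ + 1) * REFRESH_60HZ
        shown.append(next_tick)                     # displayed at the next tick
    return shown

def variable_present(render_times_ms):
    """Each frame is shown when ready, unless a forced repeat is in flight."""
    t, last, shown = 0.0, 0.0, []
    for r in render_times_ms:
        t += r
        if t - last > MAX_HOLD:                     # panel already repeated the old frame
            t = max(t, last + MAX_HOLD + SCAN_OUT)  # wait for that repeat scan to finish
        shown.append(t)
        last = t
    return shown

# Frame times mostly in the 40 - 60 fps sweet spot, with one slow (38ms) frame:
frames = [17, 22, 38, 19, 24, 16, 21]
print("v-sync  :", [round(x, 1) for x in vsync_present(frames)])
print("variable:", [round(x, 1) for x in variable_present(frames)])

The v-sync timestamps quantize to 16.7ms boundaries, so a frame that just misses a tick gets held back (the judder you feel), while the variable-refresh timestamps simply track when each frame was finished.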

I've put together a quick 4K video showing v-sync off, v-sync on and G-Sync on, all at 60Hz, while running Bioshock Infinite on my GTX 760 testbed. I captured each video at 720p60 and put them all side by side (thus making up the 3840 pixel width of the video). I slowed the video down by 50% in order to better demonstrate the impact of each setting. The biggest differences tend to be at the very beginning of the video. You'll see tons of tearing with v-sync off, some stutter with v-sync on, and a much smoother overall experience with G-Sync on.

While the comparison above does a great job showing off the three different modes we tested at 60Hz, I also put together a 2x1 comparison of v-sync and G-Sync to make things even clearer. Here you're just looking for the stuttering on the v-sync setup, particularly at the very beginning of the video.

Assassin’s Creed IV

I started out playing Assassin’s Creed IV multiplayer with v-sync off. I used GeForce Experience to determine the game quality settings, which ended up maxed out even on my GeForce GTX 760 test hardware. With v-sync off and the display set to 60Hz, there was just tons of tearing everywhere. In AC4 the tearing was arguably even worse than usual, as it seemed to take place in the upper 40% of the display, dangerously close to where my eyes were focused most of the time. Playing with v-sync off was clearly not an option for me.

Next I enabled v-sync with the refresh rate left at 60Hz. Much of AC4 renders at 60 fps, although in some scenes, both outdoors and indoors, I saw frame rates drop down into the 40 - 51 fps range. With v-sync enabled I started noticing stuttering, especially as I moved the camera around and the complexity of what was being rendered varied. In some scenes the stuttering was pretty noticeable. I played through a bunch of rounds with v-sync enabled before enabling G-Sync.

I enabled G-Sync, once again leaving the refresh rate at 60Hz and dove back into the game. I was shocked; virtually all stuttering vanished. I had to keep FRAPS running to remind me of areas where I should be seeing stuttering. The combination of fast enough hardware to keep the frame rate in the G-Sync sweet spot of 40 - 60 fps and the G-Sync display itself produced a level of smoothness that I hadn’t seen before. I actually realized that I was playing Assassin’s Creed IV with an Xbox 360 controller literally two feet away from my PS4 and having a substantially better experience. 

Batman: Arkham Origins

Next up on my list was Batman: Arkham Origins. I hadn’t played the past couple of Batman games but they always seemed interesting to me, so I was glad to spend some time with this one. Having skipped the previous ones, I obviously didn’t have the repetitive/unoriginal criticisms of the game that some others seemed to have had. Instead I enjoyed its pace and thought it was a decent way to kill some time (or in this case, test a G-Sync display).

Once again I started off with v-sync off and the display set to 60Hz. For a while I didn’t see any tearing, at least until I ended up inside a tower during the second mission of the game. I was panning across a small room and immediately encountered a ridiculous amount of tearing, even worse than in Assassin’s Creed. What’s interesting is that the tearing in Batman felt more limited in frequency than in AC4’s multiplayer, but when it happened it was substantially worse.

Next up was v-sync on, once again at 60Hz. Here I noticed sharp variations in frame rate resulting in tons of stutter. The stutter was pretty consistent both outdoors (panning across the city) and indoors (while fighting large groups of enemies). I remember seeing the stutter and noting that it was just something I’d grown used to expecting. Traditionally I’d fight this on a 60Hz panel by lowering quality settings to at least spend more time at 60 fps. With G-Sync enabled, it turns out I wouldn’t have to.

The improvement to Batman was insane. I kept expecting it to somehow not work, but G-Sync really did smooth out the vast majority of stuttering I encountered in the game - all without touching a single quality setting. You can still see some hiccups, but they are the result of other things (CPU limitations, streaming textures, etc…). That brings up another point about G-Sync: once you remove GPU/display synchronization as a source of stutter, all other visual artifacts become even more obvious. Things like aliasing and texture crawl/shimmer become more distracting. The good news is you can address those things, often with a faster GPU, which all of a sudden makes the G-Sync play an even smarter one on NVIDIA’s part. Playing with G-Sync enabled raises my expectations for literally all other parts of the visual experience.

Sleeping Dogs

I’ve been wanting to play Sleeping Dogs ever since it came out, and the G-Sync review gave me the opportunity to do just that. I like the premise and the change of scenery compared to the sandbox games I’m used to (read: GTA), and at least thus far I can put up with the not-quite-perfect camera and fairly uninspired driving feel. The bigger story here is that running Sleeping Dogs at max quality settings gave my GTX 760 enough of a workout to really showcase the limits of G-Sync.

With v-sync (60Hz) on I typically saw frame rates around 30 - 45 fps, but there were many situations where the frame rate would drop down to 28 fps. I was really curious to see what the impact of G-Sync was here since below 30 fps G-Sync would repeat frames to maintain a 30Hz refresh on the display itself.

The first thing I noticed after enabling G-Sync was that my instantaneous frame rate (according to FRAPS) dropped from 27-28 fps down to 25-26 fps. This is the G-Sync polling overhead I mentioned earlier. Not only did the frame rate drop, but the display also had to start repeating frames, which resulted in a substantially worse experience. The only solution here was to decrease quality settings to get frame rates back up again. I was glad I ran into this situation, as it shows that while G-Sync may be a great solution to improve playability, you still need a fast enough GPU to drive the whole thing.
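Some rough back-of-the-envelope math shows why dropping just below 30 fps hurts disproportionately. The Python sketch below assumes the panel's ~30Hz minimum refresh and 1/144s scan-out discussed in this review; the exact scheduling inside the G-Sync module isn't public, so treat the numbers as illustrative rather than exact (and note it ignores the polling overhead entirely).

# Rough arithmetic for sub-30 fps behavior.
# Assumptions: ~33.3 ms maximum hold (30Hz floor), ~6.9 ms scan-out (1/144s).
min_refresh_ms = 1000 / 30
scan_out_ms    = 1000 / 144

for fps in (31, 28, 26):
    frame_ms = 1000 / fps
    if frame_ms <= min_refresh_ms:
        print(f"{fps} fps: frame ready in {frame_ms:.1f} ms, displayed immediately")
    else:
        # The panel has already begun repeating the previous frame, so the new
        # frame can wait until that repeat scan finishes before it appears.
        worst_wait_ms = max(min_refresh_ms + scan_out_ms - frame_ms, 0)
        print(f"{fps} fps: frame ready in {frame_ms:.1f} ms, may wait up to "
              f"{worst_wait_ms:.1f} ms behind a repeated frame")

Frames that take roughly 33 - 40ms to render are exactly the ones that land behind a forced repeat, which is the range this Sleeping Dogs scenario fell into at 25 - 26 fps.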

Dota 2 & StarCraft II

The impact of G-Sync can also be reduced at the other end of the spectrum. I tried both Dota 2 and StarCraft II with my GTX 760/G-Sync test system, and in both cases I didn’t have a substantially better experience than with v-sync alone. Both games ran well enough on my 1080p testbed to almost always be at 60 fps, which made v-sync and G-Sync interchangeable in terms of experience.

Bioshock Infinite @ 144Hz

Up to this point all of my testing kept the refresh rate stuck at 60Hz. I was curious to see what the impact would be of running everything at 144Hz, so I did just that. This time I turned to Bioshock Infinite, whose integrated benchmark mode is a great test as there’s tons of visible tearing or stuttering depending on whether or not you have v-sync enabled.

Increasing the refresh rate to 144Hz definitely reduced the amount of tearing visible with v-sync disabled. I’d call it a substantial improvement, although not quite perfect. Enabling v-sync at 144Hz got rid of the tearing but still kept a substantial amount of stuttering, particularly at the very beginning of the benchmark loop. Finally, enabling G-Sync fixed almost everything. The G-Sync on scenario was just super smooth with only a few hiccups.

What’s interesting to me about this last situation is that if 120/144Hz reduces tearing to the point where you’re OK with it, G-Sync may be a solution to a problem you no longer care about. If you’re hypersensitive to tearing, however, there’s still value in G-Sync even at these high refresh rates.

 

Comments

  • just4U - Friday, December 13, 2013

    The only stumbling block I really have is being tied to one video chip maker because of the Monitor I buy. That's a problem for me...Stuff like this has to be a group effort.
  • butdoesitwork - Friday, December 13, 2013

    "The interval remains today in LCD flat panels, although it’s technically unnecessary."

    Technically unnecessary from whose perspective? Anand, I'm sure you meant the monitor's perspective, but this otherwise benign comment on VBLANK is misleading at best and dangerous at worst. The last thing we need is folks going around stirring the pot saying things aren't needed. Some bean counter might actually try to muck things up.

    VBLANK most certainly IS "technically" needed on the other end: every device from your Atari VCS to your GDDR5 graphics card!

    VBLANK is the only precious time you have to do anything behind the display's back.
    On the Atari VCS, that was the only CPU time you had to run the game program.
    On most consoles (NES on up), that was the only time you had to copy finished settings for the next frame to the GPU. (And you use HBLANK for per-line effects, too. Amiga or SNES anyone?)

    On most MODERN consoles (GameCube through XBox One), you need to copy the rendered frame from internal memory to some external memory for display. And while you can have as many such external buffers as you need (meaning the copy could happen any time), you can bet some enterprising programmers use only one (to save RAM footprint). In that case VBLANK is the ONLY time you have to perform the copy without tearing.

    On any modern graphics card, VBLANK is the precious time you have to hide nondeterministic duration mode changes which might cause display corruption otherwise. Notably GDDR5 retraining operations. Or getting out of any crazy power saving modes. Of course it's true all GPU display controllers have FIFOs and special priority access to avoid display corruption due to memory bandwidth starvation, but some things you just cannot predict well enough, and proper scheduling is a must.
  • DesktopMan - Friday, December 13, 2013

    G-Sync isn't VBLANK, so, yeah, if you have G-Sync you don't need VBLANK. You can take your time rendering, not worried about what the monitor is doing, and push your updated frame once the frame is ready. This moves the timing responsibility from the monitor to the GPU, which obviously is a lot more flexible.

    If you need time to do GPU configuration or other low level stuff as you mention, then just do that work and push the next frame when it's done. None of it will result in display corruption, because you are not writing to the display. You really can rethink the whole setup from the bottom up with this. Comparing it to systems that don't work this way is kinda meaningless.
  • mdrejhon - Friday, December 13, 2013

    Although true -- especially for DisplayPort packet data and LCD panels -- this is not the only way to interpret GSYNC.

    Scientifically, GSYNC can be interpreted as a variable-length VBLANK.

    Remember the old analog TVs -- the rolling picture when VHOLD was bad -- that black bar is VBLANK (also called VSYNC). With variable refresh rates, that black bar now becomes variable-height, padding time between refreshes. This is one way to conceptually understand GSYNC, if you're an old-timer television engineer. You can theoretically do GSYNC over an analog cable this way, via the variable-length blanking interval technique.
  • mdrejhon - Friday, December 13, 2013

    Yeah, put this into perspective:
    "Refresh rate" is an artificial invention
    "Frame rate" is an artificial invention

    We had to invent them about a century ago, when the first movies came out (19th century), and then the first televisions (early 20th century). There was no other way to display recorded motion to framerateless human eyes, so we had to come up with the invention of a discrete series of images, which necessitates an interval between them. Continuous, real-life motion has no "interval", no "refresh rate", no "frame rate".
  • darkfalz - Friday, December 13, 2013

    Will this polling performance hit be resolvable by future driver versions, or only by hardware changes?
  • Strulf - Friday, December 13, 2013

    While you can still see the effects at 120 Hz, tearing or lag is barely visible at all anymore at 144 Hz. At this point, I can easily do without G-Sync.
    G-Sync is certainly a nice technology if you use a 60 Hz monitor and an NVIDIA card. But nearly the same effect can be achieved with lots of Hz. A 27" 1440p@144Hz monitor might be quite expensive though. ;)
  • emn13 - Friday, December 13, 2013

    The article states that at any framerate below 30fps G-Sync doesn't work, since a panel refresh is forced at a minimum of 30Hz. That conclusion doesn't quite make sense; unlike a "true" 30Hz refresh rate, after every forced refresh G-Sync can allow a quicker refresh again.

    Since the refresh (scan-out) latency is 1s/144 ~ 7ms on this panel, and a 30Hz refresh interval is ~33ms, a frame that takes longer than 33ms but shorter than 40ms to render will finish during the forced refresh and will need to wait for it. Translated, that means you only get stuttering if the instantaneous frame rate is between 25 and 30 fps. In practice, I'd expect frame rates to rarely be that consistent; you'll get some >30fps and some <25fps moments, so even in a bad case I'd expect the stuttering to be reduced somewhat; at least, were it not for the additional overhead due to polling.
  • mdrejhon - Friday, December 13, 2013

    And since the frame scan-out time is 1/144 sec, that's one very tiny stutter (6.9ms) in a forest of high-latency frames (>33ms).
  • 1Angelreloaded - Friday, December 13, 2013

    Just a thought, but the one thing you really didn't take into account was NVIDIA's next GPU series, Maxwell, which supposedly will have the VRAM right next to the die and will share stored data with direct access from the CPU. If you take that into account, along with G-Sync, you can see what will happen to frame rates if they can pull off a die shrink as well.
    At this point I think monitor technologies are so far behind, and profit-milked to the point of stifling the industry, that the 1080p cycle has lasted far longer than we could have expected, partially due to mainstream HDTVs. We should have had the next jump in resolution as a standard over a year ago in the $200 - $300 price range, with standard IPS panels with lower response times in that range as well, or been introduced to other display types. LED backlighting carried a heavy price premium when in reality it is far cheaper to produce, because the hazardous-waste CCFL tubes, which cost more and carry disposal fees, are taken out of the equation.
