Earlier today NVIDIA announced G-Sync, its variable refresh rate technology for displays. The basic premise is simple. Displays refresh themselves at a fixed interval, but GPUs render frames at a completely independent rate. The disconnect between the two is one source of stuttering. You can disable v-sync to try to work around it, but the end result is at best tearing and at worst both stuttering and tearing.

NVIDIA's G-Sync is a combination of software and hardware technologies that allows a modern GeForce GPU to control a variable display refresh rate on a monitor equipped with a G-Sync module. In traditional setups a display will refresh the screen at a fixed interval, but in a G-Sync enabled setup the display won't refresh the screen until it's given a new frame from the GPU.
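
To make the refresh/render disconnect concrete, here's a simplified model of when frames actually reach the screen with a fixed 60Hz v-sync'd refresh versus a G-Sync-style display that refreshes whenever a frame arrives. This is purely my own illustration, not NVIDIA code, and the frame times are made up:

```python
# Simplified illustration (not NVIDIA code): when do frames reach the screen under
# a fixed 60Hz v-sync'd refresh vs. a refresh-on-demand (G-Sync-style) panel?
REFRESH_INTERVAL_MS = 1000.0 / 60.0
frame_done_times = [14.0, 30.0, 52.0, 61.0, 90.0]  # GPU frame completion times (ms)

# Fixed refresh: a finished frame waits for the next free refresh tick.
vsync_display, next_tick = [], REFRESH_INTERVAL_MS
for done in frame_done_times:
    while next_tick < done:              # missed this tick, wait for the next one
        next_tick += REFRESH_INTERVAL_MS
    vsync_display.append(next_tick)
    next_tick += REFRESH_INTERVAL_MS     # this tick is taken; later frames need a later one

# G-Sync-style: the panel refreshes the moment the new frame is ready.
gsync_display = frame_done_times

print("v-sync:", [round(t, 1) for t in vsync_display])  # on-screen times snap to 16.7ms ticks
print("g-sync:", [round(t, 1) for t in gsync_display])  # on-screen times track the GPU directly
```

The v-sync'd presentation intervals get quantized into multiples of 16.7ms, which is exactly the stutter described above, while the G-Sync intervals simply mirror the GPU's own frame pacing.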

NVIDIA demonstrated the technology on 144Hz ASUS panels, which obviously caps the max GPU present rate at 144 fps, although that's a limit of the panel rather than of G-Sync itself. There's a lower bound of 30Hz as well, since below that you begin to run into issues with flickering. If the frame rate drops below 30 fps, the display will present duplicates of each frame.
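
NVIDIA didn't detail exactly how the module handles that low end, but the behavior it describes would look something like the sketch below. The numbers and logic here are my own guesses for illustration, not NVIDIA's implementation:

```python
# Illustrative guess (not NVIDIA's implementation): keep the panel inside its
# 30-144Hz window by re-scanning the previous frame when the GPU takes too long.
MIN_REFRESH_MS = 1000.0 / 144.0   # fastest the panel can be driven (~6.9ms)
MAX_REFRESH_MS = 1000.0 / 30.0    # longest the panel can hold a frame (~33.3ms)

def refreshes_for_frame(frame_time_ms):
    """Return the refreshes the panel performs while waiting on one GPU frame."""
    refreshes, waited = [], 0.0
    # If the new frame still hasn't arrived after ~33ms, repeat the previous one.
    while frame_time_ms - waited > MAX_REFRESH_MS:
        refreshes.append(("repeat previous frame", round(MAX_REFRESH_MS, 1)))
        waited += MAX_REFRESH_MS
    # The new frame itself can't be scanned out faster than the 144Hz limit allows.
    refreshes.append(("new frame", round(max(frame_time_ms - waited, MIN_REFRESH_MS), 1)))
    return refreshes

print(refreshes_for_frame(20.0))  # 50 fps: a single refresh, 20ms after the last one
print(refreshes_for_frame(45.0))  # ~22 fps: the previous frame gets repeated once
```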

There's a bunch of other work done on the G-Sync module side to deal with some funny effects of LCDs when driven asynchronously. NVIDIA wouldn't go into great detail other than to say that there are considerations that need to be taken into account.

The first native G-Sync enabled monitors won't show up until Q1 of next year; however, NVIDIA will be releasing the G-Sync board for modding before the end of this year. The board initially supports ASUS's VG248QE monitor: end-users will be able to mod their monitor to install it, or professional modders will sell pre-modified monitors. Otherwise, in Q1 of next year ASUS will be selling the VG248QE with the G-Sync board built in for $399, while BenQ, Philips, and ViewSonic have also committed to rolling out their own G-Sync equipped monitors next year. I'm hearing that NVIDIA eventually wants to get the module down to below $100. The G-Sync module itself looks like this:

There's a controller and at least 3 x 256MB memory devices on the board, although I'm guessing there's more on the back of the board. NVIDIA isn't giving us a lot of detail here so we'll have to deal with just a shot of the board for now.

Meanwhile we do have limited information on the interface itself; G-Sync is designed to work over DisplayPort (since it’s packet based), with NVIDIA manipulating the timing of the v-blank signal to indicate a refresh. Importantly, this indicates that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.
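
If that's the mechanism, the timing math is straightforward: the total refresh period is the fixed scanout time plus a variable blanking interval, and the blanking interval simply absorbs however long the GPU takes. Here's a back-of-the-envelope sketch, with the scanout and blanking figures assumed for illustration rather than taken from NVIDIA:

```python
# Back-of-the-envelope sketch (assumed figures): stretch the vertical blanking
# interval so the next scanout only starts once a new frame is actually ready.
SCANOUT_MS = 1000.0 / 144.0   # time to scan a frame out to a 144Hz panel
MIN_VBLANK_MS = 0.5           # a typical minimum blanking interval, roughly

def vblank_for_frame(frame_time_ms):
    """How long the blanking period is held to match a given GPU frame time."""
    # Refresh period = scanout + v-blank, so the v-blank absorbs the slack.
    return max(frame_time_ms - SCANOUT_MS, MIN_VBLANK_MS)

for fps in (144, 90, 60, 45):
    frame_time = 1000.0 / fps
    print(f"{fps:3d} fps -> v-blank held for ~{vblank_for_frame(frame_time):.1f} ms")
```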

Although we only have limited information on the technology at this time, the good news is we got a bunch of cool demos of G-Sync at the event today. I'm going to have to describe most of what I saw since it's difficult to present this otherwise. NVIDIA had two identical systems configured with GeForce GTX 760s; both featured the same ASUS 144Hz displays, but only one of them had NVIDIA's G-Sync module installed. NVIDIA ran through a couple of demos to show the benefits of G-Sync, and they were awesome.

The first demo was a swinging pendulum. NVIDIA's demo harness allows you to set min/max frame times, and for the initial test case we saw both systems running at a fixed 60 fps. The performance on both systems was identical as was the visual experience. I noticed no stuttering, and since v-sync was on there was no visible tearing either. Then things got interesting.

NVIDIA then dropped the frame rate on both systems down to 50 fps, once again static. The traditional system started to exhibit stuttering as we saw the effects of having a mismatched GPU frame rate and monitor refresh rate. Since the case itself was pathological in nature (you don't always have a constant mismatch between the two), the stuttering was extremely pronounced. The same demo on the G-Sync system? Flawless, smooth.

NVIDIA then dropped the frame rate even more, down to an average of around 45 fps but also introduced variability in frame times, making the demo even more realistic. Once again, the traditional setup with v-sync enabled was a stuttering mess while the G-Sync system didn't skip a beat.

Next up was disabling v-sync in hopes of reducing stuttering, which resulted in both stuttering (there's still a refresh rate/fps mismatch) and now tearing. The G-Sync system, once again, handled the test case perfectly. It delivered the same smoothness and visual experience as if we were looking at a game rendering perfectly at a constant 60 fps. It's sort of ridiculous and completely changes the overall user experience. Drops in frame rate no longer have to be drops in smoothness. Game devs relying on the presence of G-Sync can throw higher quality effects at a scene since they don't need to be as afraid of frame rate excursions below 60 fps.

Switching gears, NVIDIA also ran a real world demonstration by spinning the camera around Lara Croft in Tomb Raider. The stutter/tearing effects weren't as pronounced as in NVIDIA's test case, but they were definitely present on the traditional system and completely absent on the G-Sync machine. I can't stress enough just how smooth the G-Sync experience was; it's a game changer.

Technologies like GeForce Experience, an abundance of GPU performance, and G-Sync can work together to deliver a new level of smoothness, image quality and experience in games. We've seen a resurgence of PC gaming over the past few years, but G-Sync has the potential to take the PC gaming experience to a completely new level.

Update: NVIDIA has posted a bit more information about G-Sync, including the specs of the modified Asus VG248QE monitor, and the system requirements.

NVIDIA G-Sync System Requirements
  Video Card:       GeForce GTX 650 Ti Boost or Higher
  Display:          G-Sync Equipped Display
  Driver:           R331.58 or Higher
  Operating System: Windows 7/8/8.1

217 Comments

  • ninjaquick - Thursday, October 24, 2013 - link

    G-Sync will most likely spawn a VESA standard counterpart, which will be a driver update away for AMD cards. Hell, even TrueAudio and Mantle are something Nvidia could get into, if they wanted. Leave it to nvidia to create a closed-source solution where their competitor has been doing open-source, dev- and consumer-friendly work all along.
  • LedHed - Monday, October 28, 2013 - link

    "where their competitor has been doing open-source dev-and consumer- friendly work all along."

    I hope you are joking... Mantle could have changed the PC community as a whole if AMD had simply opted to license the API out and NVIDIA could port it through CUDA easily. Instead AMD is holding on to it like a newborn baby and will essentially squander the opportunity to create an almost seamless transition from console to PC. Mantle is extremely proprietary (the opposite of open source) and TrueAudio is also proprietary, so how you came to the conclusion anything they are doing is open source is beyond me. In my opinion, Mantle will be implemented into games (the few AMD can afford, look at their current stock trends) in a similar fashion to how we are seeing PhysX implemented into games (Batman Origins). No one is going to want to recode their entire game around a new API that only a tiny percentage of gamers can use (compare Mantle-compatible GPUs to all the rest currently used according to the latest Steam survey), so I believe they will code specific elements of a game, just like you see with PhysX effects. I honestly doubt we are going to see anything groundbreaking with BF4 and Mantle considering BF4's engine was already well established before AMD even approached DICE with Mantle. Obviously this is all my opinion, but unlike most posters in here, I am an author for a well-known GPU site.
  • Exodite - Friday, October 18, 2013 - link

    Did you get to see how this impacts mouse response?

    The reason I personally skip V-Sync when gaming is that I consider even minimal mouse lag far, far worse than any amount of tearing and the like. Since G-Sync has to be messing with what comes from the computer, I'm, needless to say, concerned that it'll introduce gameplay issues (with the mouse controls) while providing better image quality.
  • Maxwell_88 - Friday, October 18, 2013 - link

    I doubt it will impact mouse response in any way. As far as I understand it, the monitor will only scan out from the framebuffer when the GPU has a frame ready.
  • Exodite - Friday, October 18, 2013 - link

    That sounds encouraging, I hope you're correct.
  • nathanddrews - Friday, October 18, 2013 - link

    Sounds about right.
  • inighthawki - Friday, October 18, 2013 - link

    It will still have the same impact if the game renders faster than the display can refresh. The actual cause of input latency with vsync enabled is the game rendering more quickly than 60fps. In order to do so without artificially impacting performance, GPU rendering of frames is queued asynchronously. Games do not start rendering frames at vsync; they start as soon as the previous frame is done. This means if you have a 16ms frame (60Hz) and you finish rendering in 1ms, the game will proceed immediately to render frame 2, then frame 3, and so on (up until a point where the OS blocks you from continuing to prevent too much queued work). What you end up with is that you've rendered 3 frames in 3ms, but they're now queued up and you have to wait ~50ms to see the results (3 frames later). A variable framerate doesn't change this; it simply makes things smoother. If you have G-Sync on a 60Hz display and you continue to render at 200Hz, you will see the same thing happen. G-Sync is only beneficial for the flip side, where the rendering cannot keep up with the display rate. It solves the common issue of the framerate halving with vsync enabled when frames take longer than the vsync period, but it won't solve the issues with mouse latency.
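
    To put rough numbers on that queue (a three-frame render-ahead and a 60Hz panel assumed, purely for illustration):

    ```python
    # Rough, assumed numbers for the render-ahead latency described above.
    REFRESH_MS = 1000.0 / 60.0   # 60Hz display period
    QUEUE_DEPTH = 3              # roughly three frames allowed in flight by the OS/driver

    # Each queued frame occupies one full refresh before the next can be shown,
    # so the newest frame waits behind the entire queue.
    latency_ms = QUEUE_DEPTH * REFRESH_MS
    print(f"~{latency_ms:.0f} ms from rendering a frame to seeing it on screen")
    # A variable refresh doesn't shrink this queue; it only helps when frames
    # arrive more slowly than the display can refresh.
    ```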
  • nafhan - Friday, October 18, 2013 - link

    It sounds like this should and will be coupled with high refresh rate monitors... which pretty much negates everything negative you said.
  • inighthawki - Friday, October 18, 2013 - link

    While that would be ideal, I see no reason why this technology would be limited to high refresh rate monitors. 60Hz monitors can certainly make very good use of this technology.

    Higher refresh rates will certainly help, but they don't necessarily "negate" the issue. There are, and still will be, three frames of latency; they will just be smaller frames.
  • datex - Friday, October 18, 2013 - link

    As an indie game dev, what can I do to take advantage of G-Sync? Is there a developer FAQ or anything?

    Now looking into ways to not buffer 3 frames in advance...
