Earlier today NVIDIA announced G-Sync, its variable refresh rate technology for displays. The basic premise is simple: displays refresh themselves at a fixed interval, but GPUs render frames at a completely independent rate. The disconnect between the two is one source of stuttering. You can disable v-sync to try to work around it, but the end result is tearing at best, and stuttering plus tearing at worst.

NVIDIA's G-Sync is a combination of software and hardware technologies that allows a modern GeForce GPU to control a variable display refresh rate on a monitor equipped with a G-Sync module. In traditional setups a display will refresh the screen at a fixed interval, but in a G-Sync enabled setup the display won't refresh the screen until it's given a new frame from the GPU.
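To make the difference concrete, here's a rough sketch of the two refresh models in Python. This is purely illustrative on my part, not anything NVIDIA has published; the frame queue, scan_out() and get_time_ms() hooks are hypothetical stand-ins for what the display controller actually does.

    # Illustrative model only; not NVIDIA's implementation.
    REFRESH_INTERVAL_MS = 1000 / 60   # a traditional fixed 60Hz panel

    def traditional_display(frame_queue, get_time_ms, scan_out):
        # The panel refreshes on its own fixed clock. At every tick it shows
        # whatever frame happens to be ready, repeating the previous one if the
        # GPU hasn't finished a new frame yet (which is where stutter comes from).
        next_tick = get_time_ms()
        current = None
        while True:
            while get_time_ms() < next_tick:
                pass                      # wait for the fixed refresh tick
            if frame_queue:
                current = frame_queue.pop(0)
            scan_out(current)
            next_tick += REFRESH_INTERVAL_MS

    def gsync_style_display(frame_queue, scan_out):
        # The panel waits on the GPU instead: each completed frame triggers a
        # refresh, so every frame is shown exactly once, as soon as it's done
        # (within the panel's minimum and maximum refresh rates).
        while True:
            while not frame_queue:
                pass                      # hold the current image until a new frame arrives
            scan_out(frame_queue.pop(0))

The key difference is simply who drives the clock: in the traditional loop the panel does, while in the G-Sync style loop the GPU does.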

NVIDIA demonstrated the technology on 144Hz ASUS panels, which caps the maximum present rate at 144 fps, although that's a limit of the panel rather than of G-Sync. There's a lower bound of 30Hz as well, since anything below that starts to run into flickering issues. If the frame rate drops below 30 fps, the display will present duplicates of each frame.
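Here's how I read that 30Hz floor in practice; the math below is my own back-of-the-envelope sketch rather than NVIDIA's documented behavior.

    import math

    MIN_REFRESH_HZ = 30
    MAX_HOLD_MS = 1000 / MIN_REFRESH_HZ     # ~33.3ms before the panel has to be rescanned

    def scans_for_frame(frame_time_ms):
        # If the GPU takes longer than ~33.3ms to deliver the next frame, the
        # display re-scans the previous frame enough times to avoid flicker.
        return max(1, math.ceil(frame_time_ms / MAX_HOLD_MS))

    print(scans_for_frame(25))    # 40 fps: one scan per frame, fully variable
    print(scans_for_frame(50))    # 20 fps: each frame gets scanned out twice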

There's a bunch of other work done on the G-Sync module side to deal with some of the odd effects LCDs exhibit when driven asynchronously. NVIDIA wouldn't go into great detail, other than to say that there are considerations that need to be taken into account.

The first native G-Sync enabled monitors won't show up until Q1 of next year; however, NVIDIA will be releasing the G-Sync board for modding before the end of this year. The board will initially support ASUS's VG248QE monitor; end users will be able to mod their monitor to install it, or alternatively professional modders will sell pre-modified monitors. Otherwise, in Q1 of next year ASUS will sell the VG248QE with the G-Sync board built in for $399, while BenQ, Philips, and ViewSonic have also committed to rolling out their own G-Sync equipped monitors next year. I'm hearing that NVIDIA wants to try to get the module down to below $100 eventually. The G-Sync module itself looks like this:

There's a controller and at least 3 x 256MB memory devices on the board, although I'm guessing there's more on the back. NVIDIA isn't giving us a lot of detail here, so we'll have to make do with a shot of the board for now.

Meanwhile we do have limited information on the interface itself; G-Sync is designed to work over DisplayPort (since it’s packet based), with NVIDIA manipulating the timing of the v-blank signal to indicate a refresh. Importantly, this indicates that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.
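Based on that description, my mental model of the signaling looks something like the sketch below. The minimum blanking figure is an assumption on my part, and none of the real protocol details have been disclosed, so treat this as conceptual only.

    VISIBLE_SCANOUT_MS = 1000 / 144     # time to transmit one frame's active lines on a 144Hz panel
    MIN_VBLANK_MS = 0.5                 # assumed minimum blanking interval (illustrative value)

    def refresh_period_ms(frame_time_ms):
        # The source holds the link in vertical blanking until the next frame is
        # ready, so the effective refresh period tracks the GPU's frame time
        # instead of a fixed clock (within the panel's 30-144Hz window).
        return max(VISIBLE_SCANOUT_MS + MIN_VBLANK_MS, frame_time_ms)

    for ft in (8.0, 16.7, 22.2):        # ~125, ~60, and ~45 fps frame times
        print(round(1000 / refresh_period_ms(ft), 1), "Hz")

In other words, the display's refresh simply follows whenever the GPU finishes a frame, rather than the GPU having to hit the display's schedule.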

Although we only have limited information on the technology at this time, the good news is we got a bunch of cool demos of G-Sync at the event today. I'm going to have to describe most of what I saw, since it's difficult to present this otherwise. NVIDIA had two identical systems configured with GeForce GTX 760s; both featured the same ASUS 144Hz displays, but only one of them had NVIDIA's G-Sync module installed. NVIDIA ran through a couple of demos to show the benefits of G-Sync, and they were awesome.

The first demo was a swinging pendulum. NVIDIA's demo harness allows you to set min/max frame times, and for the initial test case we saw both systems running at a fixed 60 fps. The performance on both systems was identical, as was the visual experience. I noticed no stuttering, and since v-sync was on there was no visible tearing either. Then things got interesting.

NVIDIA then dropped the frame rate on both systems down to 50 fps, once again static. The traditional system started to exhibit stuttering as we saw the effects of a mismatched GPU frame rate and monitor refresh rate. Since the test case itself was pathological in nature (you don't always have a constant mismatch between the two), the stuttering was extremely pronounced. The same demo on the G-Sync system? Flawless and smooth.
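To put some numbers to why a steady 50 fps looks so bad on a fixed 60Hz panel with v-sync on, here's the quick illustrative math (my own, not NVIDIA's demo code):

    import math

    REFRESH_MS = 1000 / 60     # fixed 60Hz panel
    FRAME_MS = 1000 / 50       # GPU finishing a new frame every 20ms

    flips = []
    for n in range(1, 7):
        ready = n * FRAME_MS                                      # when frame n finishes rendering
        flips.append(math.ceil(ready / REFRESH_MS) * REFRESH_MS)  # next refresh it can appear on

    on_screen = [round(b - a, 1) for a, b in zip(flips, flips[1:])]
    print(on_screen)    # [16.7, 16.7, 16.7, 16.7, 33.3] - every fifth frame lingers for two refreshes

That uneven 16.7ms/33.3ms cadence is exactly the judder you see on the traditional system; on the G-Sync system every frame simply stays on screen for 20ms.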

NVIDIA then dropped the frame rate even more, down to an average of around 45 fps but also introduced variability in frame times, making the demo even more realistic. Once again, the traditional setup with v-sync enabled was a stuttering mess while the G-Sync system didn't skip a beat.

Next up was disabling v-sync, with hopes of reducing stuttering, resulting in both stuttering (there's still a refresh rate/fps mismatch) and now tearing. The G-Sync system, once again, handled the test case perfectly. It delivered the same smoothness and visual experience as if we were looking at a game rendering perfectly at a constant 60 fps. It's sort of ridiculous and completely changes the overall user experience. Drops in frame rate no longer have to be drops in smoothness. Game devs relying on the presence of G-Sync can throw higher quality effects at a scene since they don't need to be as afraid of frame rate excursions below 60 fps.
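For anyone who hasn't stared at tearing up close: with v-sync off, the buffer swap happens mid-scanout, and the tear shows up at whatever scanline the panel was drawing at that instant. A quick sketch with my own assumed numbers (a 1080-line panel scanned top to bottom):

    REFRESH_MS = 1000 / 60
    VISIBLE_LINES = 1080    # assumed 1080p panel

    def tear_line(swap_time_ms):
        # Lines above this point were already scanned from the old frame; lines
        # below it come from the newly swapped frame, which is the visible tear.
        return int(VISIBLE_LINES * ((swap_time_ms % REFRESH_MS) / REFRESH_MS))

    print(tear_line(4.0))     # a swap ~4ms into scanout tears around line 259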

Switching gears, NVIDIA also ran a real world demonstration by spinning the camera around Lara Croft in Tomb Raider. The stutter/tearing effects weren't as pronounced as in NVIDIA's test case, but both were definitely present on the traditional system and completely absent on the G-Sync machine. I can't stress enough just how smooth the G-Sync experience was; it's a game changer.

Technologies like GeForce Experience, an abundance of GPU performance, and G-Sync can work together to deliver a new level of smoothness, image quality, and overall experience in games. We've seen a resurgence of PC gaming over the past few years, but G-Sync has the potential to take the PC gaming experience to a completely new level.

Update: NVIDIA has posted a bit more information about G-Sync, including the specs of the modified Asus VG248QE monitor, and the system requirements.

NVIDIA G-Sync System Requirements
Video Card: GeForce GTX 650 Ti Boost or higher
Display: G-Sync equipped display
Driver: R331.58 or higher
Operating System: Windows 7/8/8.1

Comments

  • Braincruser - Friday, October 18, 2013

    Also, people with cheap graphics cards will be locked out of the GTX-and-above club that NVIDIA has set.
  • RoninX - Friday, October 18, 2013

    The point is that you only need to buy the monitor once.

    So, for example, I have a moderately high-end (but not crazy high-end) GTX 680. When it came out, it could pretty much hit 60 Hz with all of the games turned up to max settings. Now, there are a few where it might dip into the upper 50s. The problem is that if you have V-Sync on, 59 Hz is actually just 30 Hz. And if V-Sync is off, you have tearing.

    G-SYNC solves this problem, and you only have to pay the price once, instead of increasing your GPU upgrade schedule from, say, once every two years (my schedule) to every year.

    Of course, I wouldn't settle for a cheap TN panel. I'm currently using a Dell Ultrasharp U2410 IPS display, and I would only buy a G-SYNC monitor that could match or better this picture quality.
  • RoninX - Friday, October 18, 2013

    I should clarify what I mean. 59Hz is obviously better on average than 30Hz, even with V-Sync, but you will get stuttering that drops the update rate to 30 Hz between some frames.
  • medi02 - Sunday, October 20, 2013

    "The problem is that if you have V-Sync on, 59 Hz is actually just 30 Hz."

    Oh dear...
  • nathanddrews - Friday, October 18, 2013

    I believe you're looking at this the wrong way. The problem that I see with your reasoning is the 60Hz threshold, which many gamers with expensive systems consider the bottom of the barrel. I could go on regarding lightboost and Catleap and custom overdrive boards, but this solves all that.

    Vsync on causes lag. Vsync off causes tearing. This will let the GPU take full advantage of frequencies all the way up to 144Hz with no lag or tearing. It's the best of all worlds, IMO.

    Now we just need 120+Hz 4K+ monitors...
  • GiantPandaMan - Friday, October 18, 2013

    I tend to agree with PPalmgren. The people who would most benefit from this would be better off buying a more expensive video card.

    There is a different class of user, which you mention, that overdrives monitors and goes 120+Hz. Sure, this will benefit them, but the size of that group is vanishingly small. First, they have to have money. Second, they have to prioritize framerate (TN monitors) over display quality. (Good IPS monitors don't usually support framerates higher than 60 Hz. I hesitate to put overclocked Catleaps into the high quality IPS display category. Not that they're bad mind you.) Third, to drive such massive displays at high framerates it's almost required to go multi-GPU. Multi-GPU configurations have their own problems with keeping smooth framerates.

    Lastly: this has to be an open standard. If Intel or AMD GPUs can't run it, then it'll be stuck in a very small niche market. Personally I'd hope NVIDIA goes the open route and turns it into a cheap and almost standard feature on monitors. I'm not keeping my fingers crossed, though.
  • nathanddrews - Friday, October 18, 2013

    To be fair, there are some very good TN panels out there, but it's a null issue to me. Frame rate is king for me. I guess we'll just have to wait and see what happens in the coming year(s) with GSync. I'm hopeful that this will push some new approaches to TN panels as well as IPS. OLED? <RandyMarsh>"Nyomygot"</RandyMarsh>

    I just got done reading the Q&A live blog and it sounds like NVIDIA has left the door open to licensing this tech beyond their own hardware, which is great news! Of course, the cost will have to come down a lot for the hardware and licensing, whatever those costs are.

    "Carmack: is G-Sync hardware licensable?
    NV: we haven't precluded that"

    Also:
    "Carmack: I've got a lightboost monitor on my desktop, it'll probably move to G-Sync"

    If it's good enough for Carmack, it's good enough for me. ;-)
  • Yojimbo - Saturday, October 19, 2013

    A device doesn't have to just integrate into the current market; it can alter the market. Developers develop games with the situation at hand to produce the best experience they can. If the main reason for not rendering at less than 60fps is that it causes visual artifacts due to asynchronicity between the monitor and the video card (I don't know if this is the case), then removing that issue will allow them to create games where they allow frame rates to drop below 60fps for WHATEVER hardware profile they might be targeting, even the high end profiles. This means that they can include more features, because it effectively shifts the target frame rate downward, and so equivalently gives everyone who has a compliant device a virtual upward shift in ability. Of course there would need to be widespread adoption in the marketplace for such a shift to be meaningful to the developers at the high end.
  • n0b0dykn0ws - Friday, October 18, 2013

    But will they get 23.976 right?
  • Ryan Smith - Friday, October 18, 2013

    It can't drop below 30Hz, so you'd technically have to run at double that.
