Earlier today NVIDIA announced G-Sync, its variable refresh rate technology for displays. The basic premise is simple: displays refresh themselves at a fixed interval, while GPUs render frames at a completely independent rate. The disconnect between the two is one source of stuttering. You can disable v-sync to try to work around it, but the result is at best tearing and at worst both stuttering and tearing.

NVIDIA's G-Sync is a combination of software and hardware technologies that allows a modern GeForce GPU to control a variable display refresh rate on a monitor equipped with a G-Sync module. In traditional setups a display will refresh the screen at a fixed interval, but in a G-Sync enabled setup the display won't refresh the screen until it's given a new frame from the GPU.
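To make the contrast concrete, here's a minimal sketch, in Python, of when frames actually reach the screen under each model. The frame times, the 60Hz/144Hz figures, and the clamping behavior are illustrative assumptions rather than details NVIDIA has confirmed.

```python
# Minimal sketch of the difference in presentation timing (all numbers invented).
# Fixed refresh: a finished frame waits for the next 60Hz tick.
# G-Sync-style variable refresh: the display scans out as soon as a frame is ready,
# clamped so it never refreshes faster than the panel's 144Hz maximum.
import math

def present_fixed(frame_ready_ms, refresh_hz=60):
    tick = 1000.0 / refresh_hz
    return [math.ceil(t / tick) * tick for t in frame_ready_ms]

def present_variable(frame_ready_ms, max_hz=144):
    min_gap = 1000.0 / max_hz
    out, last = [], -min_gap
    for t in frame_ready_ms:
        last = max(t, last + min_gap)
        out.append(last)
    return out

# A GPU with slightly uneven frame times (roughly 45-55 fps):
ready = [0.0, 19.0, 41.0, 60.0, 83.0, 104.0]
print("fixed 60Hz:", [round(t, 1) for t in present_fixed(ready)])     # uneven 16.7/33.3 ms gaps
print("variable  :", [round(t, 1) for t in present_variable(ready)])  # tracks the render cadence
```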

NVIDIA demonstrated the technology on 144Hz ASUS panels, which caps the maximum GPU present rate at 144 fps, although that's a limit of the panel rather than of G-Sync. There's a lower bound of 30Hz as well, since refreshing any slower than that starts to introduce visible flicker. If the frame rate drops below 30 fps, the display will present duplicates of each frame.
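NVIDIA hasn't documented exactly how that duplication is handled, but the arithmetic behind keeping the panel at or above its 30Hz floor is straightforward. A minimal sketch, assuming the module simply repeats the last frame as needed:

```python
# Sketch of the 30Hz floor: if the next frame takes longer than ~33.3 ms to arrive,
# the display scans out the previous frame again rather than waiting indefinitely.
# This is my reading of the behavior NVIDIA described, not their actual logic.
import math

MIN_HZ = 30
MAX_WAIT_MS = 1000.0 / MIN_HZ  # ~33.3 ms

def scanouts_per_frame(frame_time_ms):
    """How many times a frame is scanned out before its successor shows up."""
    if frame_time_ms <= MAX_WAIT_MS:
        return 1  # within the panel's range: one scan-out per rendered frame
    # below 30 fps: repeat the frame often enough to stay at or above 30Hz
    return math.ceil(frame_time_ms / MAX_WAIT_MS)

for ft in (16.7, 33.3, 40.0, 50.0, 100.0):
    print(f"{1000 / ft:5.1f} fps -> each frame scanned out {scanouts_per_frame(ft)}x")
```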

The G-Sync module also does a bunch of additional work to deal with some of the odd behaviors LCD panels exhibit when driven asynchronously. NVIDIA wouldn't go into great detail other than to say that there are considerations that need to be taken into account.

The first native G-Sync enabled monitors won't show up until Q1 of next year; however, NVIDIA will be releasing the G-Sync board for modding before the end of this year. The board initially supports ASUS's VG248QE monitor: end-users will be able to mod their own monitors to install it, or buy pre-modified monitors from professional modders. In Q1 ASUS will also sell the VG248QE with the G-Sync board built in for $399, while BenQ, Philips, and ViewSonic have committed to rolling out their own G-Sync equipped monitors next year as well. I'm hearing that NVIDIA eventually wants to get the module below $100. The G-Sync module itself looks like this:

There's a controller and at least three 256MB memory devices on the board, although I'm guessing there may be more on the back. NVIDIA isn't giving us a lot of detail here, so we'll have to make do with just a shot of the board for now.

Meanwhile we do have limited information on the interface itself; G-Sync is designed to work over DisplayPort (since it’s packet based), with NVIDIA manipulating the timing of the v-blank signal to indicate a refresh. Importantly, this indicates that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.
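NVIDIA hasn't said how the GPU and module coordinate this at the protocol level, but conceptually a source can delay the next refresh simply by stretching the vertical blanking period while keeping the pixel clock and per-line timing fixed. A back-of-the-envelope sketch in Python; the panel timing numbers here are assumptions for illustration, not the VG248QE's actual values:

```python
# Back-of-the-envelope: variable refresh by stretching the vertical blanking interval.
# Keep the pixel clock and per-line timing fixed, then pad the blanking region with
# extra lines so the next refresh begins exactly when the GPU delivers a frame.
# Timing figures below are assumed for illustration, not the VG248QE's real values.

ACTIVE_LINES = 1080       # visible lines per frame
BASE_TOTAL_LINES = 1100   # assumed vertical total (active + nominal blanking) at 144Hz
BASE_REFRESH_HZ = 144

LINE_TIME_MS = 1000.0 / (BASE_REFRESH_HZ * BASE_TOTAL_LINES)  # time to scan one line

def blanking_lines_for(target_frame_ms):
    """Total blanking lines needed so one refresh lasts target_frame_ms."""
    total_lines = round(target_frame_ms / LINE_TIME_MS)
    return max(0, total_lines - ACTIVE_LINES)

for fps in (144, 100, 60, 45, 30):
    frame_ms = 1000.0 / fps
    print(f"{fps:3d} fps -> {frame_ms:5.1f} ms frame -> ~{blanking_lines_for(frame_ms)} blanking lines")
```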

Although we only have limited information on the technology at this time, the good news is we got a bunch of cool demos of G-Sync at the event today. I'm going to have to describe most of what I saw since it's difficult to present this otherwise. NVIDIA had two identical systems configured with GeForce GTX 760s; both featured the same ASUS 144Hz displays, but only one of them had NVIDIA's G-Sync module installed. NVIDIA ran through a couple of demos to show the benefits of G-Sync, and they were awesome.

The first demo was a swinging pendulum. NVIDIA's demo harness allows you to set min/max frame times, and for the initial test case we saw both systems running at a fixed 60 fps. The performance on both systems was identical as was the visual experience. I noticed no stuttering, and since v-sync was on there was no visible tearing either. Then things got interesting.

NVIDIA then dropped the frame rate on both systems down to 50 fps, once again static. The traditional system started to exhibit stuttering as we saw the effects of having a mismatched GPU frame rate and monitor refresh rate. Since the case itself was pathological in nature (you don't always have a constant mismatch between the two), the stuttering was extremely pronounced. The same demo on the G-Sync system? Flawless, smooth.
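To put numbers on why a constant 50 fps looks so rough on a fixed 60Hz panel: with frames finishing every 20 ms and refreshes arriving every 16.7 ms, every fifth frame ends up held for two refreshes, a hitch ten times per second. The model below is a simplification that ignores swap-chain buffering details:

```python
# Simplified model of constant 50 fps content on a fixed 60Hz, v-sync'd display:
# each frame is shown starting at the first refresh after it finishes rendering.
# Real swap-chain buffering is ignored; this only illustrates the cadence.
import math

REFRESH_MS = 1000.0 / 60  # ~16.7 ms between refreshes
FRAME_MS = 1000.0 / 50    # 20 ms to render each frame

def visible_durations(num_frames):
    durations = []
    for i in range(num_frames):
        shown = math.ceil((i + 1) * FRAME_MS / REFRESH_MS)      # refresh that shows frame i
        replaced = math.ceil((i + 2) * FRAME_MS / REFRESH_MS)   # refresh that shows frame i+1
        durations.append((replaced - shown) * REFRESH_MS)
    return durations

print([round(d, 1) for d in visible_durations(10)])
# -> [16.7, 16.7, 16.7, 16.7, 33.3, ...]: every fifth frame lingers for two refreshes,
#    a 10Hz hitch the eye reads as stutter even though the frame rate is "constant".
```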

NVIDIA then dropped the frame rate even more, down to an average of around 45 fps but also introduced variability in frame times, making the demo even more realistic. Once again, the traditional setup with v-sync enabled was a stuttering mess while the G-Sync system didn't skip a beat.

Next up was disabling v-sync in hopes of reducing stuttering, which resulted in both stuttering (there's still a refresh rate/fps mismatch) and now tearing. The G-Sync system, once again, handled the test case perfectly. It delivered the same smoothness and visual experience as if we were looking at a game rendering perfectly at a constant 60 fps. It's sort of ridiculous and completely changes the overall user experience. Drops in frame rate no longer have to be drops in smoothness. Game devs relying on the presence of G-Sync can throw higher quality effects at a scene since they don't need to be as afraid of frame rate excursions below 60 fps.

Switching gears, NVIDIA also ran a real world demonstration by spinning the camera around Lara Croft in Tomb Raider. The stutter/tearing effects weren't as pronounced as in NVIDIA's test case, but both were definitely present on the traditional system and completely absent on the G-Sync machine. I can't stress enough just how smooth the G-Sync experience was; it's a game changer.

Technologies like GeForce Experience, an abundance of GPU performance, and G-Sync can work together to deliver a new level of smoothness, image quality, and overall experience in games. We've seen a resurgence of PC gaming over the past few years, but G-Sync has the potential to take the PC gaming experience to a completely new level.

Update: NVIDIA has posted a bit more information about G-Sync, including the specs of the modified Asus VG248QE monitor, and the system requirements.

NVIDIA G-Sync System Requirements
Video Card: GeForce GTX 650 Ti Boost or higher
Display: G-Sync equipped display
Driver: R331.58 or higher
Operating System: Windows 7 / 8 / 8.1

Comments

  • inighthawki - Saturday, October 19, 2013 - link

    That's impossible; there is ALWAYS buffering if you are rendering faster than the display's refresh rate. If the display maxes out at 144Hz, then you cannot render faster than 144Hz without queuing frames ahead. The only thing that reduces input lag here is that frame intervals are shorter at 144Hz, so there is less time to drain a queue of frames, leading to a better worst case.
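To put rough numbers behind that worst case, here's the arithmetic; the three-frame queue depth is an assumed example, not anything NVIDIA or the commenter specified:

```python
# Rough numbers behind "less time to drain a queue of frames" at 144Hz:
# with N frames queued ahead of the display, new input sits roughly N refresh
# intervals away from the screen. A queue depth of 3 is just an assumed example.

def queued_latency_ms(refresh_hz, queued_frames=3):
    return queued_frames * 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz:3d}Hz with 3 frames queued -> ~{queued_latency_ms(hz):.1f} ms of added latency")
# 60Hz: ~50 ms vs. 144Hz: ~20.8 ms -- the same queue drains in well under half the time
```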
  • Kevin G - Friday, October 18, 2013 - link

    I'm wondering how G-Sync timing is sent to the display. I presume it is over DisplayPort as it has a dedicated AUX channel that could be used for this.

    The other thing that springs to mind is the idea that this is a variant of panel self refresh (PSR). This would allow the panels to refresh on their own independent of the host GPU's timing. PSR was designed as a power saving technique as it would allow the GPU to power down in mobile devices. My guess is that nVidia thought of a nice high performance usage for it by coordinating the refresh rate. This would explain the presence of the 1.5 GB buffer on the G-Sync card (though I'm thinking that the actual amount is 192 MB by way of six 256 Mbit chips).

    I would also fathom that this technology can be used to shave off a few hundred microseconds in the process by preemptively starting the display refresh before the new frame buffer is finished. This is what causes tearing with V-Sync disabled but here with a software driven timing algorithm, the swap to a new frame buffer could happen when it is >90% complete.

    I'm also enthusiastic about G-Sync becoming part of Oculus Rift. This seems like a solution to what Carmack and others have been searching for in VR. In fact, I'm kinda surprised that Jen-Hsun didn't reference Carmack's new position at Oculus when talking about G-Sync. Sure, he's more famous as a programmer but the Oculus side of his work is where G-Sync will have the most impact.
  • errorr - Friday, October 18, 2013 - link

    This was exactly what I thought when I saw this.
  • repoman27 - Friday, October 18, 2013 - link

    I highly doubt the AUX channel will have anything to do with timing. DP is packet based, and as Anand pointed out NVIDIA is just manipulating the timing of the v-blank signal within that packet stream to indicate a refresh.

    Also, there is clearly 756 MB of DDR3L SDRAM on that module if you look at that picture closely. H5TC2G63FFR = http://www.skhynix.com/products/consumer/view.jsp?...
  • Kevin G - Sunday, October 20, 2013 - link

    Good catch on the RAM parts. Earlier shots of that board weren't high res enough to make out the part numbers. It could be either 768 MB or 1.5 GB if the back side is populated too (which I haven't seen a picture of).

    The Saturday update to this article does point toward this being a twist on PSR, as the modification goes pure DisplayPort. I wouldn't have expected that the PSR buffers would need to be so large. Even at cinematic 4096 x 2304 resolution that's 21 times the amount necessary for a 32 bit frame buffer. 768 MB has enough space for a couple of 8K buffers at 64 bit color depth.

    My guess is that this could be the first phase of the technology's introduction. Phase 2 would likely have the monitor accept multiple DisplayPort inputs from different nVidia cards in SLI. The monitor would buffer the frames from either video card and pick what is best based upon its own timing algorithm. This would essentially allow the video cards to operate asynchronously with respect to the monitor's refresh and even each other.

    Other possibilities would include multi-monitor setups having one GPU dedicated purely to rendering what is on one screen. Rendering in that fashion is notoriously hard to load balance, as what appears on one screen may be easier to render than the others. This produces an asynchronous result on a per-monitor rendering basis, but if the monitors themselves are handling the refresh, then this problem may become a non-issue. The other downside of per-GPU, per-monitor rendering is that it'll likely be buggy under current APIs due to how things are rendered as a single logical surface (i.e. the frame buffer covers all three displays).
  • repoman27 - Tuesday, October 22, 2013 - link

    So clearly when I said 756 MB, I really meant 768 MB (d'oh!). But anyways, I strongly suspect that only one side is populated and that's all she wrote. Either way, it's quite a bit of RAM.

    However, if this works as I surmise and just negotiates the fastest possible DisplayPort main link rate and then uses bit stuffing between frames, this could be a way of buffering that entire firehose. DP 1.2 can shift 2 GB/s, which would make 768 MB a 373 ms buffer if it was able to be used in its entirety. Or to put it another way, at 30 fps the G-Sync chip may need to ingest 68.7 MB of DP data to get one displayable frame.
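For anyone checking the figures in this exchange, the arithmetic is spelled out below. It assumes uncompressed frame buffers and DP 1.2's roughly 17.28 Gbps of main-link payload after 8b/10b coding:

```python
# Spelling out the memory and bandwidth figures discussed above.
# Assumptions: uncompressed frame buffers; DP 1.2 HBR2 x4 = 21.6 Gbps raw,
# ~17.28 Gbps of payload after 8b/10b coding.

MIB = 2**20

def frame_mib(width, height, bits_per_pixel):
    return width * height * bits_per_pixel / 8 / MIB

module_mib = 3 * 2048 / 8                 # three 2Gbit H5TC2G63FFR chips -> 768 MB
cinema_4k  = frame_mib(4096, 2304, 32)    # ~36 MiB per 32-bit "cinema 4K" frame
eight_k    = frame_mib(7680, 4320, 64)    # ~253 MiB per 64-bit 8K frame

dp12_bytes_per_s = 17.28e9 / 8            # ~2.16 GB/s of main-link payload

print(f"module: {module_mib:.0f} MB, ~{module_mib / cinema_4k:.0f}x a 4096x2304 32-bit frame")
print(f"8K @ 64bpp: {eight_k:.0f} MiB per frame -> {module_mib / eight_k:.1f} frames fit")
print(f"768 MB buffers ~{module_mib * MIB / dp12_bytes_per_s * 1000:.0f} ms of a saturated DP 1.2 link")
print(f"one 30 fps frame interval of DP 1.2 data: ~{dp12_bytes_per_s / 30 / MIB:.1f} MiB")
```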
  • Kevin G - Wednesday, October 23, 2013 - link

    I was thinking that the tech could scale to support 120 or 144 Hz refresh rates at 4K resolution. While DP 1.2 is fast, it isn't fast enough to handle that resolution and refresh rate simultaneously. To sidestep the issue, the monitor uses two DP links. Each frame from each link gets buffered, but the display itself picks what would be the most appropriate based upon when they arrive. The only means of getting such high frame rates would be to use SLI. This allows each video card to feed the host display. If the display itself is picking what frames are displayed, then the synchronization between each video card can be removed on the host PC.

    Though by my calculations, a 3840 x 2160 frame would consume 33.2 MByte of data. At 60 Hz, the 768 MB of memory would be roughly a 385 ms buffer.
  • PPalmgren - Friday, October 18, 2013 - link

    Excellent idea, but it's going to be really, really, really hard to sell to the people who need it.

    G-Sync targets systems that regularly dip below the 60 fps threshold, and it costs over $100 at the moment. When you're buying/building a system, do you spend that extra money on a better card so you never have the problems it corrects, or do you buy G-Sync?

    Maybe it will get life from the big spenders on high refresh rate monitors, but it's gonna take a lot of production to push that price down. The people who need this the most are the ones that are least likely to afford it.
  • kyuu - Friday, October 18, 2013 - link

    Yeah, that's my thought. People with cheap graphics cards aren't likely to be looking to spend a couple hundred dollars on a monitor. People with more expensive graphics cards who might have the money likely aren't having issues with staying at or above 60 FPS.

    For me, spending a couple hundred dollars on a new monitor that's going to have a mediocre TN panel and getting locked into one vendor for graphics cards = no go.
  • kyuu - Friday, October 18, 2013 - link

    I should say a couple hundred dollars over and above the normal price of the monitor.
