Earlier today NVIDIA announced G-Sync, its variable refresh rate technology for displays. The basic premise is simple. Displays refresh themselves at a fixed interval, but GPUs render frames at a completely independent rate. The disconnect between the two is one source of stuttering. You can disable v-sync to try to work around it, but the result is tearing at best, and stuttering plus tearing at worst.

NVIDIA's G-Sync is a combination of software and hardware technologies that allows a modern GeForce GPU to control a variable display refresh rate on a monitor equipped with a G-Sync module. In traditional setups a display will refresh the screen at a fixed interval, but in a G-Sync enabled setup the display won't refresh the screen until it's given a new frame from the GPU.
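
To make the contrast concrete, here's a minimal Python sketch (my own simplification; the 60Hz and 50 fps numbers and the function names are illustrative, not NVIDIA's, and it assumes the GPU keeps delivering a new frame every 20 ms regardless of the display):

```python
# Illustrative only: compare when rendered frames become visible on a
# fixed-refresh display vs. a display that refreshes whenever a frame arrives.

def fixed_refresh_display(frame_done_ms, refresh_hz=60.0):
    """V-sync on a fixed-interval display: each frame waits for the next tick."""
    period = 1000.0 / refresh_hz
    return [(int(t // period) + 1) * period for t in frame_done_ms]

def variable_refresh_display(frame_done_ms):
    """G-Sync-style behavior: the display refreshes when the frame arrives."""
    return list(frame_done_ms)

gpu_frames = [i * 20.0 for i in range(1, 7)]   # a steady 50 fps (20 ms per frame)
print("fixed 60 Hz:", [round(t, 1) for t in fixed_refresh_display(gpu_frames)])
print("on demand  :", variable_refresh_display(gpu_frames))
```

On the fixed-refresh display the evenly spaced 20 ms frames land at uneven intervals (mostly 16.7 ms apart, with an occasional 33.3 ms gap), while the on-demand display shows each frame the moment it's ready; that unevenness is the stutter described above.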

NVIDIA demonstrated the technology on 144Hz ASUS panels, which caps the maximum GPU present rate at 144 fps, although that's a limit of the panel rather than of G-Sync itself. There's a lower bound of 30Hz as well, since anything below that starts to run into issues with flickering. If the frame rate drops below 30 fps, the display will present duplicates of each frame.
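
Based purely on the behavior described above (a 30Hz to 144Hz window, with frame duplication below 30 fps), a rough sketch of how a minimum-refresh floor could be handled might look like the following; NVIDIA hasn't published the module's actual logic, so treat the details as guesswork:

```python
# Rough sketch: pick the refresh interval(s) used to show one frame, duplicating
# frames when the source rate falls below the panel's minimum. The 30-144Hz
# window comes from the article; the logic itself is a guess, not NVIDIA's.

PANEL_MIN_HZ = 30.0
PANEL_MAX_HZ = 144.0
MIN_INTERVAL_MS = 1000.0 / PANEL_MAX_HZ   # ~6.9 ms
MAX_INTERVAL_MS = 1000.0 / PANEL_MIN_HZ   # ~33.3 ms

def refreshes_for_frame(frame_time_ms):
    """Return the refresh interval(s) used to display one rendered frame."""
    if frame_time_ms < MIN_INTERVAL_MS:
        # GPU faster than 144 fps: presents are held to the panel's max rate.
        return [MIN_INTERVAL_MS]
    if frame_time_ms <= MAX_INTERVAL_MS:
        # Normal G-Sync range: one refresh, exactly when the frame arrives.
        return [frame_time_ms]
    # Below 30 fps: repeat the frame so the panel never waits long enough to flicker.
    repeats = int(-(-frame_time_ms // MAX_INTERVAL_MS))   # ceiling division
    return [frame_time_ms / repeats] * repeats

print(refreshes_for_frame(20.0))   # 50 fps -> [20.0]: one scan per frame
print(refreshes_for_frame(40.0))   # 25 fps -> [20.0, 20.0]: same frame scanned twice
```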

There's a bunch of other work done on the G-Sync module side to deal with some funny effects of LCDs when driven asynchronously. NVIDIA wouldn't go into great detail other than to say that there are considerations that need to be taken into account.

The first native G-Sync enabled monitors won't show up until Q1 of next year; however, NVIDIA will be releasing the G-Sync board for modding before the end of this year. The board initially supports ASUS's VG248QE monitor: end-users will be able to mod that monitor to install it themselves, or professional modders will sell pre-modified monitors. In Q1 of next year ASUS will also sell the VG248QE with the G-Sync board built in for $399, while BenQ, Philips, and ViewSonic have committed to rolling out their own G-Sync equipped monitors next year as well. I'm hearing that NVIDIA eventually wants to get the module down to below $100. The G-Sync module itself looks like this:

There's a controller and at least 3 x 256MB memory devices on the board, although I'm guessing there's more on the back of the board. NVIDIA isn't giving us a lot of detail here so we'll have to deal with just a shot of the board for now.

Meanwhile we do have limited information on the interface itself; G-Sync is designed to work over DisplayPort (since it’s packet based), with NVIDIA manipulating the timing of the v-blank signal to indicate a refresh. Importantly, this indicates that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.
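
As a purely conceptual sketch of that idea (this is not DisplayPort code; the function name and the scan/blanking times are invented for illustration), the source effectively stretches the blanking period until it has a new frame to send, so the panel's refresh cadence ends up following the GPU:

```python
# Conceptual sketch only: a source that starts the next scan-out when a new
# frame is ready, rather than on a fixed clock. scan_ms and min_vblank_ms are
# invented illustrative numbers, not real DisplayPort timings.

def scanout_start_times(frame_ready_ms, scan_ms=6.5, min_vblank_ms=0.45):
    """Return when each frame starts scanning out if v-blank is simply held
    until the next frame arrives."""
    starts, panel_busy_until = [], 0.0
    for t in frame_ready_ms:
        start = max(t, panel_busy_until)     # wait for the frame and for the panel
        starts.append(start)
        panel_busy_until = start + scan_ms + min_vblank_ms
    return starts

# The refresh cadence follows the GPU's (irregular) frame delivery:
print(scanout_start_times([20.0, 40.0, 58.0, 90.0]))
```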

Although we only have limited information on the technology at this time, the good news is we got a bunch of cool demos of G-Sync at the event today. I'm going to have to describe most of what I saw since it's difficult to present this otherwise. NVIDIA had two identical systems configured with GeForce GTX 760s, both featured the same ASUS 144Hz displays but only one of them had NVIDIA's G-Sync module installed. NVIDIA ran through a couple of demos to show the benefits of G-Sync, and they were awesome.

The first demo was a swinging pendulum. NVIDIA's demo harness allows you to set min/max frame times, and for the initial test case we saw both systems running at a fixed 60 fps. The performance on both systems was identical as was the visual experience. I noticed no stuttering, and since v-sync was on there was no visible tearing either. Then things got interesting.

NVIDIA then dropped the frame rate on both systems down to 50 fps, once again static. The traditional system started to exhibit stuttering as we saw the effects of a mismatched GPU frame rate and monitor refresh rate. Since the case itself was pathological in nature (you don't always have a constant mismatch between the two), the stuttering was extremely pronounced. The same demo on the G-Sync system? Flawless and smooth.

NVIDIA then dropped the frame rate even more, down to an average of around 45 fps but also introduced variability in frame times, making the demo even more realistic. Once again, the traditional setup with v-sync enabled was a stuttering mess while the G-Sync system didn't skip a beat.

Next up was disabling v-sync in hopes of reducing stuttering, which resulted in both stuttering (there's still a refresh rate/fps mismatch) and now tearing. The G-Sync system, once again, handled the test case perfectly. It delivered the same smoothness and visual experience as if we were looking at a game rendering at a constant 60 fps. It's sort of ridiculous and completely changes the overall user experience. Drops in frame rate no longer have to be drops in smoothness. Game devs relying on the presence of G-Sync can throw higher quality effects at a scene, since they don't need to be as afraid of frame rate excursions below 60 fps.

Switching gears, NVIDIA also ran a real-world demonstration by spinning the camera around Lara Croft in Tomb Raider. The stutter/tearing effects weren't as pronounced as in NVIDIA's test case, but they were definitely present on the traditional system and completely absent on the G-Sync machine. I can't stress enough just how smooth the G-Sync experience was; it's a game changer.

Technologies like GeForce Experience, plentiful GPU performance, and G-Sync can work together to deliver a new level of smoothness, image quality, and experience in games. We've seen a resurgence of PC gaming over the past few years, but G-Sync has the potential to take the PC gaming experience to a completely new level.

Update: NVIDIA has posted a bit more information about G-Sync, including the specs of the modified Asus VG248QE monitor, and the system requirements.

NVIDIA G-Sync System Requirements
Video Card: GeForce GTX 650 Ti Boost or higher
Display: G-Sync equipped display
Driver: R331.58 or higher
Operating System: Windows 7/8/8.1

Comments

  • Soulwager - Sunday, October 20, 2013 - link

    Say you have a 50fps constant framerate and a 144Hz monitor. With v-sync on, the game engine thinks each frame is on screen for exactly 20ms, but the actual time each frame is on screen is either 21ms or 14ms. Most of the frames will be 21ms, but whenever the frame rate catches up to the scan line you'll get 7ms of judder. This means smooth motion from the game engine's perspective will appear to hiccup forward on the monitor. With 45fps G-Sync, the game engine thinks each frame takes 22ms, and that's what it looks like on the monitor.
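
The arithmetic above is easy to check. A quick sketch (mine, not the commenter's) using the same assumptions, a game rendering at a steady 50 fps and a v-synced 144Hz panel that shows each frame on the next refresh tick:

```python
# How long does each frame stay on screen at 50 fps on a v-synced 144 Hz panel?

REFRESH_MS = 1000.0 / 144.0   # ~6.94 ms per refresh
FRAME_MS = 1000.0 / 50.0      # 20 ms of game time per frame

shown_at = []
for i in range(1, 11):
    t_ready = i * FRAME_MS                                          # GPU finishes the frame
    shown_at.append((int(t_ready // REFRESH_MS) + 1) * REFRESH_MS)  # next refresh tick

on_screen = [b - a for a, b in zip(shown_at, shown_at[1:])]
print([round(x, 1) for x in on_screen])
# -> mostly ~20.8 ms (3 refreshes) with an occasional ~13.9 ms (2 refreshes):
#    the 21 ms / 14 ms alternation and roughly 7 ms of judder described above.
```
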
  • SlyNine - Sunday, October 20, 2013 - link

    I thought that was the point of v-sync dropping to a multiple of the refresh rate, so the frames could be timed correctly? So you're saying that it continues to render at 50fps but shows at 48fps until a frame is dropped to catch up?
  • Soulwager - Sunday, October 20, 2013 - link

    All v-sync does is limit when buffer swaps can take place in order to avoid tearing, but if you're using double buffering with v-sync, that can cause your framerate to drop to 1/2 or 1/3 of your refresh rate; in that case you'd have 48fps instead of 50fps. Triple buffering means you can keep the GPU busy while a completed frame waits for the next display refresh, which is generally good, because if your framerate is 50fps on a 144Hz monitor you're quite close to the 1/3 threshold and would likely dip down into 1/4-refresh-rate territory occasionally, and that's worse than occasionally bumping up into 1/2-refresh-rate territory. There may be additional frame metering going on, but that's more complicated.
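
To put numbers on the 1/2 and 1/3 refresh-rate point, here's a simplified model (my sketch, not the commenter's) of double-buffered v-sync, where the effective frame time snaps up to a whole number of refresh intervals:

```python
# Simplified model of double-buffered v-sync: the GPU can't start a new frame
# until the previous one is swapped at a refresh tick, so the effective frame
# time rounds up to a whole number of refresh intervals.

def double_buffered_fps(render_ms, refresh_hz=144.0):
    period = 1000.0 / refresh_hz
    intervals = int(-(-render_ms // period))   # ceiling: wait for the next refresh
    return 1000.0 / (intervals * period)

print(round(double_buffered_fps(20.0), 1))   # 48.0 fps: 144/3, as described above
print(round(double_buffered_fps(21.0), 1))   # 36.0 fps: slip past 1/3 and you land on 144/4
```
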
  • wojtek - Sunday, October 20, 2013 - link

    It is negligible and unnoticeable to the human eye. Do the math: 60 Hz is 16.7 ms and is considered smooth; 7 ms is not even half of that value.
  • Soulwager - Monday, October 21, 2013 - link

    60Hz is "smooth" if it's consistently 60fps, but you can still tell the difference between 60Hz and 120Hz. 45fps on a 60Hz display doesn't look as good as a rock-solid 30fps on a 60Hz display. This is because your eye (really your brain) is far better at detecting inconsistent movement than it is at noticing missing information.
  • wojtek - Monday, October 21, 2013 - link

    Let's put it this way to be clearer: if you have a stable 60 Hz, your brain does not notice anything that is or isn't added between two consecutive 16 ms frames. Actually, an old 50/60 Hz TV/CRT just has a black screen there.
    Now, this 7 ms drift means that in most cases you will have one additional frame between those 60 Hz frames, and in rare cases, when you miss the v-sync, that additional frame will be skipped and drawn on the 60 Hz schedule.
    There is no possibility you'd see any stuttering, just as there was no possibility you'd see the dark luminophore when the cathode ray was scanning somewhere else on a CRT screen.
  • Meaker10 - Monday, October 21, 2013 - link

    CRTs were never black between frames; the phosphor on the surface of the display glows for a short period of time and becomes dimmer until the beam hits it again.
  • wojtek - Monday, October 21, 2013 - link

    And there's more. 120 Hz was added to LCDs to do 3D with shutter glasses, not to make motion smoother. Back in the days of CRT dominance, refresh rates higher than 60 Hz were used only because they were less tiring on the eyes, not because they were smoother. And even then the dominant comfortable refresh rate was about 85 Hz; there were rare 100 Hz CRTs and nothing like 120 Hz.
    Remember, on a CRT you really always have a black screen with a tiny lit point that scans the entire area, and this scan fades on your retina so slowly that you get the impression of a stable picture.

    Nowadays we have LCDs with liquid crystals that are much less responsive than a luminophore. The best manage a full black-to-black transition in about 12 ms and gray-to-gray in 4 ms with overdrive. That means the picture you are looking at stays on screen, and on your retina, much longer than that single lit point on a CRT screen.
    This simply leads to a situation where noticing any change in a sub-120 Hz scenario is even harder than on a regular CRT.

    So if you have a 120/144 Hz LCD, any stuttering you see is caused by a software hitch, not by v-sync desynchronization.
  • Soulwager - Monday, October 21, 2013 - link

    I'm not talking about stuttering, I'm talking about judder, which is when an object that should be moving at a constant rate appears to change speed due to asynchronous clocks. What's important isn't the presence of an additional frame; it's that the additional frame isn't where you expect it to be. Here's an illustration: http://i.imgur.com/i690R1p.png
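
For anyone who can't open the image, here's a rough numerical version of the same argument (my sketch, not the linked chart): an object moving at constant speed, rendered at 50 fps but presented on a v-synced 144Hz display, falls behind by a steadily growing amount and then snaps forward when the timing catches up; that snap is the judder.

```python
# Illustrative numbers only: the engine samples motion every 20 ms (50 fps),
# but a v-synced 144 Hz display can only show frames on 1/144 s ticks, so the
# on-screen position lags by a varying amount.

REFRESH_MS = 1000.0 / 144.0
FRAME_MS = 1000.0 / 50.0
SPEED = 1.0   # position units per ms, arbitrary

for i in range(1, 11):
    t_render = i * FRAME_MS                                      # motion sampled here
    t_display = (int(t_render // REFRESH_MS) + 1) * REFRESH_MS   # shown on the next tick
    lag = SPEED * (t_display - t_render)                         # apparent position error
    print(f"frame {i:2d}: rendered for t={t_render:6.1f} ms, "
          f"shown at t={t_display:6.1f} ms, lag {lag:4.1f}")

# The lag grows from ~0.8 to ~6.7 position units over eight frames, then snaps
# back to ~0.6 at frame 9: the "hiccup forward" from the earlier comment.
```
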
  • wojtek - Monday, October 21, 2013 - link

    This chart is just plain wrong. First, it should be discrete, not continuous. Second, what does "position" mean: pixels, inches, meters? Third, it shows the drift that I was talking about, and I'm telling you that your retina is not able to adapt fast enough to notice this position discontinuity. And last but not least, triple buffering adds significant latency that you want to avoid as a gamer.
