Earlier today NVIDIA announced G-Sync, its variable refresh rate technology for displays. The basic premise is simple. Displays refresh themselves at a fixed interval, but GPUs render frames at a completely independent frame rate. The disconnect between the two is one source of stuttering. You can disable v-sync to try to work around it, but the end result is at best tearing, and at worst both stuttering and tearing.
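
To make the mismatch concrete, here's a minimal sketch of why an unsynchronized buffer flip produces a tear line: the flip lands partway through the panel's scan-out, and the tear appears wherever scan-out happened to be at that instant. This is my own illustration of the general problem, not anything NVIDIA showed; the refresh rate, resolution, and flip times are assumptions.

```python
# Rough illustration: where a tear line lands when a buffer flip isn't
# synchronized to the display's scan-out. All numbers are assumptions.
REFRESH_MS = 1000 / 60   # a conventional 60 Hz panel
PANEL_ROWS = 1080        # rows scanned out top to bottom each refresh

def tear_row(flip_time_ms: float) -> int:
    """Approximate row of the tear if the flip happens at flip_time_ms."""
    scan_progress = (flip_time_ms % REFRESH_MS) / REFRESH_MS
    return int(scan_progress * PANEL_ROWS)

for flip in (3.0, 9.0, 21.0):   # hypothetical flip times in milliseconds
    print(f"flip at {flip:4.1f} ms -> tear near row {tear_row(flip)}")
```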

NVIDIA's G-Sync is a combination of software and hardware technologies that allows a modern GeForce GPU to control a variable display refresh rate on a monitor equipped with a G-Sync module. In traditional setups a display will refresh the screen at a fixed interval, but in a G-Sync enabled setup the display won't refresh the screen until it's given a new frame from the GPU.
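
As a rough mental model, the difference between the two setups comes down to when a finished frame actually reaches the screen. The sketch below is my own simplification (assumed render times, swap-chain back-pressure ignored), not NVIDIA's implementation:

```python
import math

render_times_ms = [20, 24, 18, 22, 26]   # hypothetical per-frame GPU render times
REFRESH_MS = 1000 / 60                   # fixed-refresh display interval

# Fixed refresh with v-sync: a finished frame waits for the next refresh boundary.
t = 0.0
for r in render_times_ms:
    t += r
    scanout = math.ceil(t / REFRESH_MS) * REFRESH_MS
    print(f"fixed refresh   : frame ready at {t:6.1f} ms, shown at {scanout:6.1f} ms")

# G-Sync-style variable refresh: the display refreshes when the frame arrives,
# so (within the panel's limits) 'shown' is the same as 'ready'.
t = 0.0
for r in render_times_ms:
    t += r
    print(f"variable refresh: frame ready at {t:6.1f} ms, shown at {t:6.1f} ms")
```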

NVIDIA demonstrated the technology on 144Hz ASUS panels, which obviously caps the max GPU present rate at 144 fps, although that's not a limit of G-Sync itself. There's a lower bound of 30Hz as well, since below that you begin to run into issues with flickering. If the frame rate drops below 30 fps, the display will present duplicates of each frame.
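
NVIDIA didn't describe exactly how the module implements that floor, but the described behaviour amounts to something like the sketch below (assumed logic and frame times, not the shipping algorithm): once a frame takes longer than ~33.3 ms, the previous frame is shown again so the panel never waits below 30 Hz.

```python
import math

MAX_HOLD_MS = 1000 / 30   # longest the panel holds a single frame (~33.3 ms)

def times_shown(frame_time_ms: float) -> int:
    """How many refreshes a frame occupies before the next frame arrives."""
    if frame_time_ms <= MAX_HOLD_MS:
        return 1                                   # normal case: one refresh per frame
    return math.ceil(frame_time_ms / MAX_HOLD_MS)  # duplicate until the next frame lands

for ft in (10, 25, 40, 70):   # hypothetical frame times in milliseconds
    print(f"{ft:3d} ms frame -> presented {times_shown(ft)} time(s)")
```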

There's a bunch of other work done on the G-Sync module side to deal with some funny effects of LCDs when driven asynchronously. NVIDIA wouldn't go into great detail other than to say that there are considerations that need to be taken into account.

The first native G-Sync enabled monitors won't show up until Q1 next year; however, NVIDIA will be releasing the G-Sync board for modding before the end of this year. The board will initially support ASUS's VG248QE monitor: end-users will be able to mod their monitor to install the board, or alternatively professional modders will be selling pre-modified monitors. Otherwise, in Q1 of next year ASUS will be selling the VG248QE with the G-Sync board built in for $399, while BenQ, Philips, and ViewSonic have also committed to rolling out their own G-Sync equipped monitors next year. I'm hearing that NVIDIA wants to try to get the module down below $100 eventually. The G-Sync module itself looks like this:

There's a controller and at least three 256MB memory devices on the board, although I'm guessing there are more on the back. NVIDIA isn't giving us a lot of detail here, so we'll have to make do with just a shot of the board for now.

Meanwhile we do have limited information on the interface itself; G-Sync is designed to work over DisplayPort (since it’s packet based), with NVIDIA manipulating the timing of the v-blank signal to indicate a refresh. Importantly, this indicates that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.
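
The exact DisplayPort signalling is NVIDIA's to detail, but conceptually the v-blank manipulation reduces to something like the following sketch; the clamping bounds come from the panel limits mentioned earlier, while the function itself and its sample inputs are illustrative assumptions:

```python
MIN_INTERVAL_MS = 1000 / 144   # the panel can't refresh faster than 144 Hz
MAX_INTERVAL_MS = 1000 / 30    # and shouldn't hold a frame longer than ~33.3 ms

def vblank_hold(ms_until_next_frame: float) -> float:
    """How long to extend the blanking interval before starting the next scan-out."""
    return min(max(ms_until_next_frame, MIN_INTERVAL_MS), MAX_INTERVAL_MS)

print(vblank_hold(4.0))    # ~6.9  -> clamped to the 144 Hz ceiling
print(vblank_hold(21.0))   # 21.0  -> refresh exactly when the next frame is ready
print(vblank_hold(50.0))   # ~33.3 -> re-present before dropping below 30 Hz
```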

Although we only have limited information on the technology at this time, the good news is we got a bunch of cool demos of G-Sync at the event today. I'm going to have to describe most of what I saw since it's difficult to present this otherwise. NVIDIA had two identical systems configured with GeForce GTX 760s; both featured the same ASUS 144Hz displays, but only one of them had NVIDIA's G-Sync module installed. NVIDIA ran through a couple of demos to show the benefits of G-Sync, and they were awesome.

The first demo was a swinging pendulum. NVIDIA's demo harness allows you to set min/max frame times, and for the initial test case we saw both systems running at a fixed 60 fps. The performance on both systems was identical, as was the visual experience. I noticed no stuttering, and since v-sync was on there was no visible tearing either. Then things got interesting.

NVIDIA then dropped the frame rate on both systems down to 50 fps, once again static. The traditional system started to exhibit stuttering as we saw the effects of having a mismatched GPU frame rate and monitor refresh rate. Since the case itself was pathological in nature (you don't always have a constant mismatch between the two), the stuttering was extremely pronounced. The same demo on the G-Sync system? Flawless, smooth.
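
The arithmetic behind this kind of mismatch is easy to illustrate with a 60 Hz v-synced display as an example (the demo panels were 144 Hz; the refresh rate here is just for illustration): at a constant 50 fps every frame takes 20 ms, but scan-outs only happen on 16.7 ms boundaries, so on-screen frame durations alternate between one and two refresh intervals. A quick sketch with my own numbers, not NVIDIA's demo harness:

```python
import math

REFRESH_MS = 1000 / 60   # 60 Hz panel
FRAME_MS = 1000 / 50     # constant 50 fps render rate

scanouts = []
t = 0.0
for _ in range(6):
    t += FRAME_MS
    # With v-sync, each finished frame is held back to the next refresh boundary.
    scanouts.append(math.ceil(t / REFRESH_MS) * REFRESH_MS)

on_screen = [round(b - a, 1) for a, b in zip(scanouts, scanouts[1:])]
print(on_screen)   # [16.7, 16.7, 16.7, 16.7, 33.3] -> every fifth frame lingers twice as long
```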

NVIDIA then dropped the frame rate even more, down to an average of around 45 fps but also introduced variability in frame times, making the demo even more realistic. Once again, the traditional setup with v-sync enabled was a stuttering mess while the G-Sync system didn't skip a beat.

Next up was disabling v-sync in the hopes of reducing stuttering, which resulted in both stuttering (there's still a refresh rate/fps mismatch) and now tearing. The G-Sync system, once again, handled the test case perfectly. It delivered the same smoothness and visual experience as if we were looking at a game rendering perfectly at a constant 60 fps. It's sort of ridiculous and completely changes the overall user experience. Drops in frame rate no longer have to be drops in smoothness. Game devs relying on the presence of G-Sync can throw higher quality effects at a scene since they don't need to be as afraid of frame rate excursions below 60 fps.

Switching gears, NVIDIA also ran a real world demonstration by spinning the camera around Lara Croft in Tomb Raider. The stutter/tearing effects weren't as pronounced as in NVIDIA's test case, but they were both definitely present on the traditional system and completely absent on the G-Sync machine. I can't stress enough just how smooth the G-Sync experience was; it's a game changer.

Technologies like GeForce Experience, an abundance of GPU performance, and G-Sync can really work together to deliver a new level of smoothness, image quality, and overall experience in games. We've seen a resurgence of PC gaming over the past few years, but G-Sync has the potential to take the PC gaming experience to a completely new level.

Update: NVIDIA has posted a bit more information about G-Sync, including the specs of the modified Asus VG248QE monitor, and the system requirements.

NVIDIA G-Sync System Requirements
Video Card: GeForce GTX 650 Ti Boost or higher
Display: G-Sync equipped display
Driver: R331.58 or higher
Operating System: Windows 7/8/8.1

Comments

  • Soulwager - Wednesday, October 23, 2013 - link

    10 ms of what? Latency? Strobe interval? Frametime variance? Just because you can't see a light flickering above X Hz when it's stationary doesn't mean you can't see it flickering when it's moving. If you don't believe me, try it: build a timer circuit that flashes an LED 100 times a second, or however fast it takes to give the illusion of a solid light, then wave it around in a dark room. You WILL see it flashing until you get up to frequencies that would sound absurd for a computer monitor.
  • wojtek - Wednesday, October 23, 2013 - link

    This is what I'm talking about with strobing, but you are watching a stable monitor, or a constant place in the motion, where awareness is focused in one place! Take this flashing circuit, focus on it, and do some motion. You will see no flashing at all.
  • Soulwager - Wednesday, October 23, 2013 - link

    There is a LOT more to it than just getting rid of flicker. It's like comparing pure audio reaction time (~150 ms) to synchronizing to a beat (<20 ms).

    When I'm moving the mouse around in a game, it's moving at something like 5k pixels per second. This means ±8 ms of variance in frame time corresponds to a mouse inaccuracy of something like 80 pixels. To get around this you need to have a ton of practice with the mouse, so you know how far the cursor is going to move without seeing it.

    The equivalent of latency would be the LED appearing to trail its actual position as you move your arm around. Variance would be its apparent position relative to its actual position changing as it's moved at a constant rate. Even if you can't see it watching non-interactive content, you can certainly feel it when looking around in an FPS or moving a mouse anywhere.
  • wojtek - Wednesday, October 23, 2013 - link

    When you move the mouse cursor/screen with a framerate below 60 Hz you see much more latency (usually more than 20-30 ms) that is _clearly visible_, and an additional +/- 8 ms of latency is an almost negligible component of it, taking into account the additional motion blur that it produces. This is what I'm trying to say, but maybe you must try it for yourself.
    See how much more blurred _and_ shifted back the 30 FPS animation is compared to the 60 FPS one.
    http://www.testufo.com/#test=framerates
    Now we are talking about animation below 60 FPS on a 120 Hz screen that causes more visual blur than 60 FPS, and the timing inaccuracy you are so afraid of is not bigger than half of the 60 FPS blur on a 120 FPS move! This sub-60 FPS animation, and the object you track, is already shifted back and blurred more than the 60 FPS one no matter how accurately you sync it with the buffer!
  • Soulwager - Wednesday, October 23, 2013 - link

    Absolute latency has zero impact on how smooth the animation looks; it has nothing to do with strobing, and it has nothing to do with stutter. It's just a delay. You can't see it unless the system is reacting to your input, which is what happens in a game. If latency is consistent, you can adapt to it, by leading your target for example. If it's inconsistent, you can't predict where something is going to be nearly as accurately, because you don't know how far the animation is lagging behind the game engine.

    http://www.youtube.com/watch?feature=player_embedd...

    Here, you can SEE THE DIFFERENCE between 10 ms latency and 1 ms latency. Now imagine that instead of smoothly following on the 10 ms clip, the white blob is some random point between where it is and where it would be on a 20 ms latency test, and that that white blob is what you have to use to figure out where the finger is now. That is why I have more of a problem with variance in frame time than with absolute latency (which is also bad). I'm not just theorycrafting here, I've actually experienced what this problem feels like, and the only solution I've found so far is to play on low enough settings that you NEVER drop below your v-sync framerate threshold. If G-Sync fixes this, which it sounds like it will, I'll buy it in an instant.
  • wojtek - Thursday, October 24, 2013 - link

    What you see in this video is the _input_ latency; the refresh rate of the screen is still 60 Hz! You see your finger _and_ the painted square at the same time, compared simultaneously. Even if you see it 10 seconds later you will see the same shift between your finger and the square, and that is independent of the reaction you will perform. That's why modern gaming mice have >1000 Hz sampling. Contrary to what you said, the real problem is _when_ you see it, and in practice how fast you can react. In fact any latency bigger than 20 ms has noticeable lag between your move and the observed action on the screen, and is very uncomfortable, especially for hardcore gamers, where such latency is unacceptable. A sub-60 Hz experience should be avoided in hardcore gaming at all costs, even if it means temporarily disabling some visual improvements! This technique is already used in consoles (e.g. temporal upscaling of lower resolution rendering).
  • Soulwager - Thursday, October 24, 2013 - link

    The "screen" in the video I posted is a research projector, it sure as hell isn't running at 60hz. Gaming mice have high sample rates because gamers click their target before they see their cursor in the right place on screen. If the mouse interface was interrupt based, it wouldn't need a high poll rate. With G-Sync, you're putting the monitor's interface on an interrupt instead of cranking up it's poll rate.

    Also, going from a normal 125 Hz USB mouse polling rate to 1 kHz? 7 ms max benefit in input latency, an average of half that. The difference is people had PS/2 to compare it to, which is interrupt based.

    I think you misunderstood what exactly it was about human vision that was being measured in that research paper you posted.
  • wojtek - Thursday, October 24, 2013 - link

    Then ask yourself why you have no such effect on your screen. I have a 60 Hz display, a normal mouse, and a gaming mouse. No visible lag on either device. I can even put my mouse on the screen and compare it directly with the cursor. Nothing! Magic?

    You clearly do not understand that process.

    The problem with touch displays is the sample rate of touch input, nothing more. The touch samples are taken in order, then the draw code is executed for each sample, and the screen is drawn on the closest refresh. All processing from touch to display usually takes about ~40 ms with current touch panels. That is the shift you are observing.
    If the input took 1 ms, and the code were executed in under 16 ms, then you would see the square at the closest frame refresh on a 60 Hz display, which is the case, e.g., on my monitor with my hardware, and I believe for most current PC users.
    So no. It's not magic.
  • Soulwager - Thursday, October 24, 2013 - link

    I DO have that effect on my screen, so do you; you're just so familiar with it that you don't notice. If you actually dropped your mouse sensitivity so it was 1:1 to the pixel density of your monitor and moved the mouse around AT THE SPEEDS USED IN A COMPETITIVE GAME, you would realize that what you said is bullshit.

    Even now I don't know what point you're trying to make. There are obvious tradeoffs with v-sync and latency that you simply don't have to make with G-Sync. You said yourself no competitive gamer would tolerate a framerate below 60 fps. Take a moment to think about WHY that's true, and then re-read everything I've said.
  • wojtek - Thursday, October 24, 2013 - link

    "I DO have that effect on my screen, so do you"

    No I don't. You have some hardware or software issue then. It's clearly noticeable when you do v-sync below 60 Hz, or with most triple buffering options in games, but it's not when it is perfectly synchronized at >=60 Hz. Believe it or not, I have explained it to you. I can't help you more.
