Earlier today NVIDIA announced G-Sync, its variable refresh rate technology for displays. The basic premise is simple. Displays refresh themselves at a fixed interval, but GPUs render frames at a completely independent frame rate. The disconnect between the two is one source of stuttering. You can disable v-sync to try to work around it, but the end result is tearing at best, and stuttering and tearing at worst.

NVIDIA's G-Sync is a combination of software and hardware technologies that allows a modern GeForce GPU to control a variable display refresh rate on a monitor equipped with a G-Sync module. In traditional setups a display will refresh the screen at a fixed interval, but in a G-Sync enabled setup the display won't refresh the screen until it's given a new frame from the GPU.
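To make the timing difference concrete, here's a minimal sketch (my own illustration of the timing model, not NVIDIA's implementation, and it ignores v-sync back-pressure on the GPU) that works out how long each frame stays on screen when a steady 50 fps render stream hits a fixed 60Hz v-synced display versus a display that refreshes whenever a frame is ready:

    from fractions import Fraction
    import math

    REFRESH = Fraction(1, 60)   # fixed refresh interval of a 60Hz panel (s)
    FRAME   = Fraction(1, 50)   # the GPU delivers a new frame every 20 ms

    def onscreen_ms(present_times):
        """How long each presented frame stays on screen, in milliseconds."""
        return [float((b - a) * 1000) for a, b in zip(present_times, present_times[1:])]

    # v-sync on a fixed 60Hz panel: every finished frame waits for the next refresh tick
    vsync = [math.ceil((i + 1) * FRAME / REFRESH) * REFRESH for i in range(12)]

    # G-Sync-style: the panel refreshes the moment a frame is ready
    gsync = [(i + 1) * FRAME for i in range(12)]

    print(onscreen_ms(vsync))   # mostly ~16.7 ms, with a ~33.3 ms hitch every fifth frame
    print(onscreen_ms(gsync))   # a steady 20 ms every time

That ~33.3 ms hitch, repeated ten times a second, is exactly the judder you see when a fixed-refresh display and the GPU fall out of step.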

NVIDIA demonstrated the technology on 144Hz ASUS panels, which obviously caps the max GPU present rate at 144 fps, although that's not a limit of G-Sync itself. There's a lower bound of 30Hz as well, since below that you begin to run into issues with flickering. If the frame rate drops below 30 fps, the display will present duplicates of each frame.
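NVIDIA didn't describe the exact re-display logic, so the following is just a sketch of the obvious approach (the function and constant names are mine): show each frame the smallest whole number of times that keeps the panel at or above its 30Hz floor.

    import math

    PANEL_MIN_HZ, PANEL_MAX_HZ = 30, 144   # limits quoted for the demo panels

    def refresh_plan(fps):
        """Return (times each frame is scanned out, resulting panel refresh rate)."""
        if fps >= PANEL_MIN_HZ:
            return 1, min(fps, PANEL_MAX_HZ)
        repeats = math.ceil(PANEL_MIN_HZ / fps)   # duplicate frames to stay at/above 30Hz
        return repeats, fps * repeats

    print(refresh_plan(45))   # (1, 45) -> one refresh per frame
    print(refresh_plan(22))   # (2, 44) -> each frame scanned out twice
    print(refresh_plan(9))    # (4, 36) -> each frame scanned out four times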

There's a bunch of other work done on the G-Sync module side to deal with some funny effects of LCDs when driven asynchronously. NVIDIA wouldn't go into great detail other than to say that there are considerations that need to be taken into account.

The first native G-Sync enabled monitors won't show up until Q1 of next year; however, NVIDIA will be releasing the G-Sync board for modding before the end of this year. Initially the board will support ASUS's VG248QE monitor: end users will be able to mod their monitor to install it, or alternatively professional modders will sell pre-modified monitors. Otherwise, in Q1 of next year ASUS will sell the VG248QE with the G-Sync board built in for $399, while BenQ, Philips, and ViewSonic have also committed to rolling out their own G-Sync equipped monitors next year. I'm hearing that NVIDIA eventually wants to get the module down to below $100. The G-Sync module itself looks like this:

There's a controller and at least 3 x 256MB memory devices on the board, although I'm guessing there's more on the back. NVIDIA isn't giving us a lot of detail here, so we'll have to make do with just a shot of the board for now.

Meanwhile we do have limited information on the interface itself; G-Sync is designed to work over DisplayPort (since it's packet based), with NVIDIA manipulating the timing of the v-blank signal to indicate a refresh. Importantly, this suggests that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.
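NVIDIA hasn't published the exact timing details, but the arithmetic is easy to sketch. At the panel's 144Hz maximum, scanning out one frame takes roughly 6.9 ms; to present at, say, 50 fps the display would simply sit in vertical blanking for the remaining ~13 ms until the next frame arrives. The numbers and names below are purely illustrative:

    SCANOUT_MS      = 1000 / 144   # ~6.9 ms to scan out one frame at the panel's max rate
    MIN_INTERVAL_MS = 1000 / 144   # can't refresh faster than 144Hz
    MAX_INTERVAL_MS = 1000 / 30    # must refresh at least every ~33 ms (30Hz floor)

    def vblank_extension_ms(frame_time_ms):
        """Extra time spent in vertical blanking while waiting for the next frame."""
        interval = min(max(frame_time_ms, MIN_INTERVAL_MS), MAX_INTERVAL_MS)
        return interval - SCANOUT_MS

    print(round(vblank_extension_ms(20.0), 1))   # 50 fps  -> ~13.1 ms of extended v-blank
    print(round(vblank_extension_ms(5.0), 1))    # 200 fps -> 0.0 ms (capped at 144Hz)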

Although we only have limited information on the technology at this time, the good news is we got a bunch of cool demos of G-Sync at the event today. I'm going to have to describe most of what I saw since it's difficult to present this otherwise. NVIDIA had two identical systems configured with GeForce GTX 760s, both featured the same ASUS 144Hz displays but only one of them had NVIDIA's G-Sync module installed. NVIDIA ran through a couple of demos to show the benefits of G-Sync, and they were awesome.

The first demo was a swinging pendulum. NVIDIA's demo harness allows you to set min/max frame times, and for the initial test case we saw both systems running at a fixed 60 fps. The performance on both systems was identical as was the visual experience. I noticed no stuttering, and since v-sync was on there was no visible tearing either. Then things got interesting.

NVIDIA then dropped the frame rate on both systems down to 50 fps, once again static. The traditional system started to exhibit stuttering as we saw the effects of a mismatched GPU frame rate and monitor refresh rate. Since the test case itself was pathological in nature (you don't always have a constant mismatch between the two), the stuttering was extremely pronounced. The same demo on the G-Sync system? Flawless, smooth.

NVIDIA then dropped the frame rate even more, down to an average of around 45 fps but also introduced variability in frame times, making the demo even more realistic. Once again, the traditional setup with v-sync enabled was a stuttering mess while the G-Sync system didn't skip a beat.

Next up was disabling v-sync in hopes of reducing stuttering, which resulted in both stuttering (there's still a refresh rate/fps mismatch) and now tearing. The G-Sync system, once again, handled the test case perfectly. It delivered the same smoothness and visual experience as if we were looking at a game rendering perfectly at a constant 60 fps. It's sort of ridiculous and completely changes the overall user experience. Drops in frame rate no longer have to be drops in smoothness. Game developers relying on the presence of G-Sync can throw higher quality effects at a scene since they don't need to be as afraid of frame rate excursions below 60 fps.

Switching gears, NVIDIA also ran a real world demonstration by spinning the camera around Lara Croft in Tomb Raider. The stutter/tearing effects weren't as pronounced as in NVIDIA's test case, but both were definitely present on the traditional system and completely absent on the G-Sync machine. I can't stress enough just how smooth the G-Sync experience was; it's a game changer.

The combination of technologies like GeForce Experience, a ton of GPU performance, and G-Sync can really deliver a new level of smoothness, image quality and experience in games. We've seen a resurgence of PC gaming over the past few years, but G-Sync has the potential to take the PC gaming experience to a completely new level.

Update: NVIDIA has posted a bit more information about G-Sync, including the specs of the modified ASUS VG248QE monitor and the system requirements.

NVIDIA G-Sync System Requirements
Video Card: GeForce GTX 650 Ti Boost or higher
Display: G-Sync equipped display
Driver: R331.58 or higher
Operating System: Windows 7/8/8.1

Comments

  • mdrejhon - Monday, October 21, 2013 - link

    Ooops, the link to TestUFO Panning Map Test is wrong. The correct link is:
    http://www.testufo.com/photo#photo=toronto-map.png...
    It displays a map that pans sideways at high speeds, at a framerate matching the refresh rate.
  • wojtek - Monday, October 21, 2013 - link

    Do not forget that low frequency strobing is very tiring and unhealthy for the human eye. That is why CRTs, which are by nature always strobing, ran at refresh rates higher than 60 Hz, i.e. 75 or 85 Hz.
    Now you are saying that we should actually do the opposite, and this is progress? How ridiculous is that!
  • mdrejhon - Tuesday, October 22, 2013 - link

    Three things:

    - There's no way to have low persistence without strobing, unless we have 1000fps@1000Hz to gain the motion clarity of CRT without flicker. We're trading off flicker for motion blur.

    - The feature can be turned ON/OFF. A CRT couldn't allow you to turn off its flicker. LightBoost and GSYNC allow you to do so.

    - Some of us get motion blur headaches. Eizo's FDF2405W, a 240Hz monitor, uses a strobe mode to reduce motion blur eyestrain. LightBoost gives me less eyestrain (it emulates a 120Hz CRT) because my eyes are no longer tired by motion blur.
  • wojtek - Tuesday, October 22, 2013 - link

    In fact it is the opposite! Motion blur _is natural_ for human eyes; if it weren't, you would get motion blur headaches in the cinema as well, which I believe is not the case. The technology you mention has nothing to do with _proper motion blur_; it just tries to reduce LCD unresponsiveness, and in fact it is old technology that many people simply dismissed. Look for example at the BenQ FP241WZ http://www.tftcentral.co.uk/articles/content/benq_... with Black Frame Insertion. It was present in LCDs 7 years ago.

    Search Google for people's opinions; it was barely accepted. It may be acceptable at higher frame rates, like on a CRT, but for 24-60 FPS it is just a nightmare for the eyes. We already learned that lesson. No reason to repeat it.
  • mdrejhon - Tuesday, October 22, 2013 - link

    Human motion blur should be 100% natural. Displays should not force extra motion blur upon eyes above-and-beyond human limitations. Sometimes we want the display to do it for artistic purposes. Otherwise, Holodeck displays will never arrive.

    Multiple things:
    -- Flicker DOES increase eyestrain.
    -- But trying to use focussing muscles to focus on motion blurry images (e.g. www.testufo.com/photo) for long periods DOES increase eyestrain too.

    Some people are less bothered by one than the other.
    For example, see the testimonials of people who love it, and get less strain:
    http://www.blurbusters.com/lightboost/testimonials...
    Some people stopped FPS gaming when switching from CRT to LCD because of the motion blur problem, and have only returned to FPS gaming because of strobing.

    Remember, everybody's vision is different.
    Some people are color blind. Some people are not.
    Some people hear better. Some people don't hear as well.
    Some people notice certain image artifacts. Others don't.
    Some people see flicker better. Others don't.
    etc.
  • mdrejhon - Tuesday, October 22, 2013 - link

    Also see: http://www.eizo.com/global/products/duravision/fdf...

    "Blur Reduction with 240 Hz Refresh Rate
    The monitor converts 120 Hz input signals to 240 Hz to reduce ghosting and blurring that is caused during frame changes. This greatly improves sharpness and visibility and reduces eye fatigue that occurs when viewing scrolling or moving images."

    And it uses strobing, according to page 15 of its manual:
    http://www.blurbusters.com/eizo-240hz-va-monitor-u...

    Fortunately, the strobing is optional, but vision research has shown that strobing DOES alleviate motion blur strain FOR SOME PEOPLE (MAYBE NOT YOU), because the eye's focussing muscles don't have to struggle as much: you can't undo unnatural, display-forced motion blur by having your eyes' focussing muscles hunt back-and-forth through the motion blurry images.
  • wojtek - Tuesday, October 22, 2013 - link

    I completely agree on reducing the motion blur imposed by LCDs; in fact it is not natural motion blur, it is just blending blur, similar to some old game techniques that simulate motion blur by blending a queue of frames. This is a bad idea.

    Everything I am saying is that we should improve high framerate displays to eliminate ghosting and have good, practical and unnoticeable margins with stable v-sync and stable picture reproduction, rather than doing the opposite: trying to achieve a variable low framerate produced by a slow GPU, which makes no sense because of flickering and all the additional effort needed to make it sane.
    You are simply trying to solve a problem that does not exist in a high framerate world, which can reproduce stable low framerate conditions to human eyes!

    Proper motion blur, with correct optical parameters, should be calculated on the fly by the game engine and presented on a display that has no additional flaws. This is the way to go. And it is orthogonal to the refresh rate itself, which is just a parameter in such calculations. If you have enough frame slots (120/144 FPS) you don't need to redesign anything, just make sure content is presented with accuracy higher than 60 Hz, which gives you a natural impression of smoothness.

    Now, I understand NVIDIA and its "inventions", because it is a business and they care about money. They need to convince people to buy their hardware, and the more unique features they provide, the more of an advantage they may gain. The only problem is the merit of their inventions, and in the G-Sync case I see no merit at all, just as I see no merit in buying expensive HDMI cables that don't improve anything, or in listening to 24/192 music that actually produces additional distortion in most receivers because of their nonlinearity.

    That is all my point. Nothing more.
  • mdrejhon - Tuesday, October 22, 2013 - link

    > I completely agree on reducing the motion blur imposed by LCDs; in fact it is not natural
    > motion blur, it is just blending blur, similar to some old game techniques that simulate
    > motion blur by blending a queue of frames. This is a bad idea.

    That's not the cause of LCD motion blur on modern LCDs.
    See http://www.testufo.com/eyetracking for what really causes motion blur on modern LCDs.

    GtG is only a tiny fraction of a refresh; most motion blur is now caused by persistence. LCD pixel transitions more closely resemble a square wave today: http://www.testufo.com/mprt ... (otherwise the checkerboard pattern illusion would be impossible). GtG is a tiny fraction of a refresh; persistence/stability is most of a refresh.

    This motion blur unfortunately happens on OLEDs too, so they have to strobe OLEDs as well:
    http://www.blurbusters.com/faq/oled-motion-blur/

    Motion blur occurs on all flickerfree displays, even if they have instant 0ms GtG pixel transitions. As you track moving objects on a flickerfree display, your eyes are blurring each continuously-displayed refresh. Your eyes are in a different position at the beginning of a refresh than at the end of a refresh.

    To have 1ms of persistence without using flicker, we need 1000fps@1000Hz to solve strobing AND motion blur. You have to fill all the black gaps with 1ms frames, and all frames have to be unique. 1ms of persistence translates to 1 pixelwidth of tracking motion blur (forced upon you) for every 1000 pixels/second.

    Same motion clarity of a 2ms strobe backlight = you need 500fps@500Hz nonstrobed
    Same motion clarity of a 1ms strobe backlight = you need 1000fps@1000Hz nonstrobed
    Same motion clarity of a 0.5ms strobe backlight = you need 2000fps@2000Hz nonstrobed
    etc.

    Since silly framerates are impossible, we are stuck with strobing for now if we want to eliminate motion blur (regardless of display technology, not just LCD).
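
    If you want to plug in your own numbers, the arithmetic is simple (illustrative Python, figures match the examples above):

        def blur_px(persistence_ms, tracking_speed_px_per_s):
            """Tracking motion blur: how far the eye moves while one frame is held."""
            return tracking_speed_px_per_s * persistence_ms / 1000.0

        def equivalent_nonstrobed_hz(persistence_ms):
            """Refresh (and frame) rate needed for the same clarity without strobing."""
            return 1000.0 / persistence_ms

        print(blur_px(1.0, 1000))              # 1 ms persistence @ 1000 px/s -> 1.0 px of blur
        print(blur_px(1000 / 60, 1000))        # full-persistence 60Hz hold   -> ~16.7 px of blur
        print(equivalent_nonstrobed_hz(2.0))   # 2 ms strobe   -> 500fps@500Hz nonstrobed
        print(equivalent_nonstrobed_hz(0.5))   # 0.5 ms strobe -> 2000fps@2000Hz nonstrobed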
  • wojtek - Tuesday, October 22, 2013 - link

    This is valid reasoning only to some extent, because even when you are tracking "moving pixels", that tracking is not more accurate/sensitive than your retina's responsiveness, which is far lower than 1000Hz and in fact is more content dependent than a fixed timing http://neuroscience.uth.tmc.edu/s2/chapter15.html

    But strobing over 100Hz is OK since it is not tiring for the eyes. Strobing at 24-60 Hz is tiring and unhealthy for longer exposure. That is a medical fact.
    All in all, as I said, we should focus on higher FPS rather than accurate low FPS, because the latter has no benefits.
  • wojtek - Tuesday, October 22, 2013 - link

    Ah, and keep in mind that the motion blur you are talking about is also resolution dependent. The higher the PPI, the less blur, because there is a smaller gap between transitions, and that in turn brings us back to FPS, which uses more pixels to draw movement when it is higher.
