Final Words

After spending a few days with G-Sync, I’m just as convinced as I was in Montreal. The technology, albeit a relatively simple manipulation of display timing, is a key ingredient in delivering a substantially better gaming experience.

In pathological cases the impact can be shocking, particularly if you’re coming from a 60Hz panel today (with or without v-sync). The smoothness afforded by G-Sync is just awesome. I didn’t even realize how much v-sync related stutter I had simply come to accept. I’d frequently find a scene that stuttered a lot with v-sync enabled and approach it fully expecting G-Sync to somehow fail at smoothing things out this time. I always came away impressed. G-Sync also lowered the minimum frame rate I needed before stuttering became distracting. Dropping below 30 fps is still bothersome, but in all of the games I tested, as long as I could keep frame rates north of 35 fps the overall experience was great.
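
That minimum frame rate makes sense when you look at the timing math. With double-buffered v-sync on a 60Hz panel, a finished frame has to wait for the next refresh, so any frame that takes longer than ~16.7ms to render is held for two refreshes and effectively displays at 30 fps; a variable refresh display simply scans the frame out when it’s done. A minimal sketch of that behavior (in Python, with hypothetical render times rather than measurements from this review):

    import math

    REFRESH_MS = 1000 / 60  # one 60Hz refresh interval, ~16.7ms

    def vsync_display_ms(render_ms):
        # Double-buffered v-sync: a finished frame waits for the next
        # refresh, so on-screen time rounds up to whole refresh intervals.
        return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

    def variable_refresh_display_ms(render_ms):
        # G-Sync style variable refresh: the panel refreshes when the
        # frame is ready (within its supported range).
        return render_ms

    for render_ms in (14.0, 18.0, 24.0, 30.0):  # hypothetical render times
        v = vsync_display_ms(render_ms)
        g = variable_refresh_display_ms(render_ms)
        print(f"{render_ms:4.1f}ms render -> v-sync {v:4.1f}ms ({1000/v:.0f} fps), "
              f"variable refresh {g:4.1f}ms ({1000/g:.0f} fps)")

The 18ms case is the telling one: a GPU pacing at ~55 fps gets quantized down to 30 fps by v-sync, while a variable refresh display shows it at its natural rate - which is exactly the stutter G-Sync removes.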

In many situations the impact of G-Sync can be subtle. If you’re not overly bothered by tearing or are ok with v-sync stuttering, there’s really nothing G-Sync can offer you. There’s also the fact that G-Sync addresses a problem that isn’t visible 100% of the time. Unlike moving to a higher resolution or increasing quality settings, G-Sync’s value is best realized in specific scenarios where there’s a lot of frame rate variability - particularly between 30 and 60 fps. Staying in that sweet spot is tougher to do on a 1080p panel, especially if you’ve already invested in a pretty fast video card.

If you’re already running games at a fairly constant 60 fps, what G-Sync will allow you to do is to crank up quality levels even more without significantly reducing the smoothness of your experience. I feel like G-Sync will be of even more importance with higher resolution displays where it’s a lot harder to maintain 60 fps. Ideally I’d love to see a 2560 x 1440 G-Sync display with an IPS panel that maybe even ships properly calibrated from the factory. I suspect we’ll at least get the former.

There's also the question of what happens once game developers can assume the world is running on displays with variable refresh rates. All of a sudden, targeting frame rates between 30 and 60 fps becomes far less of a tradeoff.

NVIDIA hasn’t disclosed much about G-Sync pricing, although ASUS has already given us a little guidance. The VG248QE currently sells for $280 on Newegg, while the upcoming G-Sync enabled flavor will apparently sell for $400. That $120 premium can be a tough pill to swallow. A roughly 43% increase in display cost is steep, which is another reason why I feel like NVIDIA might have more success pushing G-Sync as part of a higher end display. On the flip side, NVIDIA could easily get those costs down by migrating from an FPGA to an ASIC, although to justify that move we’d have to see pretty broad adoption of G-Sync.

Some system integrators will be selling the aftermarket upgraded VG248QE between now and CES, but you can expect other displays to be announced over the coming months. NVIDIA still hasn't figured out if/how it wants to handle end user upgrades for those who already own VG248QE displays.

I feel like NVIDIA is slowly but surely assembling the components of a truly next-generation gaming experience. With all of the new consoles launched, the bar is set for the next several years. PCs already exceed what consoles are capable of in terms of performance, but the focus going forward really needs to be on improving ease of use as well as the rest of the experience. Things like GeForce Experience are a step in the right direction, but they need far more polish and, honestly, integration into something like Steam. G-Sync adds to the list. For PC gaming to continue to thrive, it needs to evolve into something even more polished than it is today. It’s not enough to just offer higher resolution and a better looking image than what you can get on other platforms; it’s very important to provide a smoother and more consistent experience as well. G-Sync attempts to do just that, and succeeds.

With G-Sync enabled, I began to expect/demand more visually from my games. Aliasing and other rendering imperfections were far more pronounced once a big portion of the stuttering was removed. G-Sync isn’t the final solution, but rather the first fix on a long list of things that need improving. There are other use cases for G-Sync outside of gaming as well. Streaming video, where bandwidth constraints force a variable frame rate, is another one I’ve heard tossed around.

Although G-Sync is limited to NVIDIA hardware (GeForce GTX 650 Ti Boost or greater), the implementation seems simple enough that other manufacturers should be able to do something similar. That’s obviously the biggest issue with what we have here today - it only works with NVIDIA hardware. For die-hard NVIDIA fans, I can absolutely see a G-Sync monitor as being a worthy investment. You might just want to wait for some more displays to hit the market first.

Comments

  • Jelic - Thursday, December 12, 2013

    Hmm, interesting article. I actually currently have the ASUS VG248QE. While gsync sounds intriguing, what I find even more promising is the use of lightboost to give CRT-like quality to the panel. With my current setup I have a GTX 680 with the max framerate limited via EVGA's OC tool to 120fps. On a 1080p screen, with a 120hz refresh rate and 2d lightboost enabled, you get absolutely no motion blur, very little tearing, and overall an amazing gaming experience. Since you have the hardware already, I'd be interested in hearing your opinion on 2d lightboost + gsync (at 120hz), and if that makes any difference. Also I'd love it if Anandtech did an article on lightboosted monitors as well! My ideal monitor would be something like a 27in 2560x1600 IPS panel with 120hz lightboost supported... of course I'd need something like dual 780s to get the most out of it, but it'd be well worth it to me heh.
  • DesktopMan - Friday, December 13, 2013

    Lightboost doesn't work well on low framerates since you'd see the backlight flicker. If you flicker it more than once per frame you introduce retina blur again. It works best at high, stable framerates. G-Sync would still be useful with lightboost if your framerate hovers between 60 and 120 though.
  • mdrejhon - Friday, December 13, 2013

    Just so all readers know, the great news is there are several different strobe backlights now:

    - LightBoost
    - Official sequel to LightBoost (coming with G-SYNC monitors), mentioned by John Carmack
    - EIZO's FG2421
    - BENQ Blur Reduction (behaves like LightBoost, but via monitor menus)
    - Sony's Motionflow "Impulse" (GAME MODE strobe backlight, low lag, no interpolation)

    Some of them darken a lot, and others darken less. Some have better contrast ratios, and much better colors. Some of them (BENQ Z-series) can strobe at 75Hz and 85Hz, if you want zero motion blur with a bit less GPU horsepower. Some of them are zero-ghost (no double-image effect). But you can't "get it all" simultaneously.

    From my experience playing on the EIZO FG2421 (warmed up after 30 mins to reduce VA ghosting on cold panels), it's lovely to have a bright and colorful picture, something that LightBoost has difficulty with. The VA panel ghosts a bit more (until it warms up), but when I sustain 120fps@120Hz (Bioshock Infinite, VSYNC ON on a GeForce Titan), it produces spectacular motion quality, the most CRT-like quality I have ever seen.

    Now, if I fall below 100fps a lot, like Battlefield 4, I prefer G-SYNC because it does an amazing job of eliminating stutters during fluctuating framerates.
  • blackoctagon - Sunday, December 15, 2013

    And does G-Sync offer any benefit if you're ALREADY at 120fps@120Hz? Because, if so, surely someone needs to review the VG248QE with both G-Sync and LightBoost enabled at the same time :)
  • web-cyborg - Thursday, December 12, 2013

    All of those articles focus on the variable hz function of g-sync and not the supposed "superior to lightboost" backlight strobing option. The articles say "30 to 40 fps is 'fine'", with 40 being the sweet spot. I would disagree. These same people complain about marginal input lag milliseconds, yet accept long "freeze-frame" milliseconds with open arms in order to get more eye candy. I think people will be cranking up their graphics settings and getting 30 - 40fps. At 30fps you are frozen on the same frame of world action for 33.3ms while the 120hz+120fps user sees 4 game world action update "slices". At 40fps you are seeing the same frozen slice of game world action for 25ms, while the 120hz+120fps user sees 3 action slice updates. This makes you see new action later and gives you fewer opportunities to initiate action (fewer "dots" per dotted line length), and then you add input lag to the already out of date game world state you are acting on. Additionally, higher hz + higher frame rates provide aesthetically smoother control and higher motion and animation definition. Of course 120hz also cuts the continual FoV movement blur of the entire viewport by 50% (vs the 60hz baseline's full smearing "outside of the lines" blur), and backlight strobing at high hz essentially eliminates FoV blur (eizo FG2421 now, the "superior to lightboost" backlight strobing mode of g-sync monitors in the future supposedly).
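
    The frame-hold arithmetic above is easy to sanity check with a couple of lines of Python (my own quick sketch; hold time is just 1000/fps):

        def hold_ms(fps):
            # How long one rendered frame stays on screen at a steady rate
            return 1000 / fps

        for fps in (30, 40, 60, 120):
            held = hold_ms(fps)
            slices = held / hold_ms(120)  # 120fps updates in the same span
            print(f"{fps:3d}fps: frame held {held:5.1f}ms "
                  f"= {slices:.0f} update(s) seen by a 120hz+120fps player")

    That prints 33.3ms/4 updates at 30fps and 25.0ms/3 updates at 40fps, matching the numbers above.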
  • web-cyborg - Thursday, December 12, 2013

    60hz vs 120hz vs backlight strobing. Note that newer monitors like the eizo FG2421 and the future "superior to lightboost" backlight functionality of g-sync's strobe mode (unfortunately mutually exclusive with the variable hz mode) do not/will not suffer the lowered brightness and muted colors of the lightboost "hack" shown in these examples. However, they will still eliminate the blur shown in these examples.
    http://www.blurbusters.com/faq/60vs120vslb/
    Now remember that in reality it's not just a single simple cell shaded cartoon object moving across your screen; rather, your entire 1st/3rd person viewport of high detail textures, depth via bump mapping, "geography"/terrain, architectures and creatures is smeared "outside of the lines" or "shadow masks" of everything on screen every time you move your FoV at 60hz - more within the "shadow masks" of onscreen objects at 120hz, but still losing all detail, textures and bump mapping - with essentially zero blur when using backlight strobing over 100hz.
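
    To put rough numbers on the 50% figure: perceived sample-and-hold blur is roughly your eye-tracking speed multiplied by how long each frame stays lit, so halving persistence halves the smear, and a short strobe flash nearly eliminates it. A back-of-envelope sketch (the panning speed and flash length are illustrative guesses, not panel specs):

        PAN_SPEED_PPS = 960  # hypothetical panning speed, pixels per second

        def blur_px(persistence_ms):
            # Approximate smear width while the eye tracks moving content
            return PAN_SPEED_PPS * persistence_ms / 1000

        for label, persistence_ms in (
            ("60hz sample-and-hold", 1000 / 60),    # pixel lit ~16.7ms
            ("120hz sample-and-hold", 1000 / 120),  # pixel lit ~8.3ms
            ("120hz strobed, ~1.4ms flash", 1.4),   # backlight off most of the frame
        ):
            print(f"{label}: ~{blur_px(persistence_ms):.1f}px of smear")

    At this speed that works out to ~16px of smear at 60hz, ~8px at 120hz, and just over 1px with strobing.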
  • web-cyborg - Thursday, December 12, 2013

    I'm more interested in high fps and zero blur obviously, even if I have to turn down the ever higher *arbitrarily set by devs* graphics ceiling "carrot" that people keep chasing (that ceiling could be magnitudes higher if they wanted).
    I still play some "dated" games too... fps is high.

    You are seeing multiple frames skipped and are behind a 120hz+120fps user, watching "freeze-frames" for 25ms to 33.3ms at 40fps and 30fps respectively, and every time you move your FoV you are smearing the entire viewport into what can't even be defined as a solid grid resolution to your eyes/brain. So much for high rez.
    I think people are sacrificing a lot motion-, animation-, and control-wise, as well as sacrificing seeing action sooner and getting more and earlier opportunities to initiate actions - all to reach for higher still-detail eye candy.
    You don't play a screen shot :b
  • mdrejhon - Thursday, December 12, 2013

    Hello fellow guys at AnandTech -- I've created a new TestUFO animation (via software interpolation) that simulates the smooth framerate ramping that G-SYNC can do:

    http://www.testufo.com/stutter#demo=gsync

    It shows off stutter-free frame rate variances as well. I created this unique animation (I think it's the only one of its kind on the Internet) for the Blur Busters preview of G-SYNC.
  • HisDivineOrder - Thursday, December 12, 2013

    The "biggest issue with what we have here today" is not that it's nVidia only. That's a big issue, to be sure.

    The biggest issue is that there are a LOT of us who have fantastic displays that we paid high dollar for and will not go down to 16:9 or TN panels. Hell, a lot of us won't even go and spend the same money we just spent on our incredibly expensive and incredibly hard to resell monitors to get this technology, which should 1) have been included in the LCD spec from the start and 2) have a way of being implemented that involves something other than tossing our old monitor in the bin.

    They need to make an adapter box for monitors without built-in scalers that translates what they're doing to DVI-D. Otherwise, there are a LOT of people who won't get any use out of this technology until they get around to making 4K monitors that include it with IPS at an even semi-reasonable price.

    Really, the biggest problem is they didn't find a way to adapt it for all monitors.
  • web-cyborg - Friday, December 13, 2013

    In regard to the backlight strobing functionality, the eizo FG2421 is a high hz VA panel whose backlight strobing "zero blur" capability is independent of gpu camps.

    We are talking about gaming usage. Practically all 1st/3rd person games use HOR+ / virtual cinematography, which means you see more of a scene in 16:9 mode, even if you have to run 16:9 mode on a 16:10 monitor. 16:10 mode basically cuts the sides off.
    http://www.web-cyb.org/images/lcds/HOR-plus_scenes...

    Gpu upgrades can run $500 - $1000 now too for high end, and somewhere in between or double that for dual gpus. 16:10 vs 16:9 is really a bigger deal at 1080 vs 1200, even for desktop use. A 16:10 30" isn't as much of a real-estate difference from a 2560x1440 27" as the size suggests; the 30" pixels are a lot larger. Here is a graphic I made to show three common resolutions compared at the same ppi, or equivalent perceived ppi at viewing distances.
    http://www.web-cyb.org/images/lcds/4k_vs_27in_vs_3...

    Imo for the time being you are better off using two different monitors - one for gaming and one for desktop/apps - instead of trying to get both in one monitor and getting worse performance/greater trade-offs combined in one (i.e. 60hz vs 120hz, lack of backlight strobing or gsync, resolutions too high to maintain high fps at high+/ultra gfx settings relative to your gpu budget, resolutions too low for quality desktop/app usage, lots of tradeoffs, etc.).

    Upgrades to display and gpu technology are the nature of the beast really. Up until now you would be better off getting a korean knock-off 2560x1440 ips or the american mfg versions for $400 or less and putting a good 120hz gaming monitor next to it imo. The eizo FG2421 24" VA backlight strobing model is around $500, so for $900+ (and a good gpu of course) you could have the best of both worlds pretty much for the time being. Going forward we know g-sync will have backlight strobing functionality, but we don't know if any of the higher resolution monitors due to come out with g-sync will have the 100hz+ required to support strobing adequately. If they don't, we are back to major tradeoffs between gaming and desktop use (low hz -> low motion+animation definition/far fewer game action updates shown per second/lower control definition, full 60hz baseline smear blurring out all detail and textures during continual FoV movement/motion flow).
