Final Words

After spending a few days with G-Sync, I’m just as convinced as I was in Montreal. The technology, albeit a relatively simple manipulation of display timing, is a key ingredient in delivering a substantially better gaming experience.

In pathological cases the impact can be shocking, particularly if you’re coming from a 60Hz panel today (with or without v-sync). The smoothness afforded by G-Sync is just awesome. I didn’t even realize how much of the v-sync-related stutter I had simply come to accept. I’d frequently find a scene that stuttered a lot with v-sync enabled and approach it fully expecting G-Sync to somehow fail at smoothing things out this time. I always came away impressed. G-Sync also lowered the minimum frame rate at which stuttering stopped distracting me. Dropping below 30 fps is still bothersome, but in all of the games I tested, as long as I could keep frame rates north of 35 fps the overall experience was great.
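To make the stutter mechanism concrete, here's a small illustrative sketch (not NVIDIA's actual implementation; the render times and the 144Hz panel floor are hypothetical) of how long each frame stays on screen with fixed 60Hz v-sync versus a variable-refresh panel:

```python
# Illustrative sketch: on-screen time per frame under fixed-refresh v-sync
# vs. a variable-refresh display. All render times below are hypothetical.
import math

REFRESH_MS = 1000 / 60  # one scan interval of a 60Hz panel, ~16.7 ms

def vsync_display_times(render_times_ms):
    """With v-sync, a finished frame waits for the next refresh boundary,
    so on-screen time is quantized to whole multiples of the refresh interval."""
    return [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in render_times_ms]

def gsync_display_times(render_times_ms):
    """With a variable refresh, the panel scans out as soon as the frame is
    done (limited only by the panel's maximum refresh rate), so on-screen
    time tracks render time. Assumes a hypothetical 144Hz panel floor."""
    return [max(t, 1000 / 144) for t in render_times_ms]

# A game hovering just under 60 fps: render times of ~17-20 ms per frame.
renders = [17.0, 18.5, 20.0, 17.5, 19.0]
print(vsync_display_times(renders))  # every frame held for two intervals (~33.3 ms)
print(gsync_display_times(renders))  # on-screen times follow render times
```

The sketch shows the cliff the article describes: a game rendering at 50-58 fps gets quantized down to an effective 30 fps under 60Hz v-sync, while a variable-refresh display lets each frame go out as soon as it's ready.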

In many situations the impact of G-Sync can be subtle. If you’re not overly bothered by tearing or are okay with v-sync stuttering, there’s really nothing G-Sync can offer you. There’s also the fact that G-Sync addresses a problem that isn’t visible 100% of the time. Unlike moving to a higher resolution or increasing quality settings, G-Sync’s value is best realized in specific scenarios where there’s a lot of frame rate variability - particularly between 30 and 60 fps. Staying in that sweet spot is tougher to do on a 1080p panel, especially if you’ve already invested in a pretty fast video card.

If you’re already running games at a fairly constant 60 fps, what G-Sync will allow you to do is to crank up quality levels even more without significantly reducing the smoothness of your experience. I feel like G-Sync will be of even more importance with higher resolution displays where it’s a lot harder to maintain 60 fps. Ideally I’d love to see a 2560 x 1440 G-Sync display with an IPS panel that maybe even ships properly calibrated from the factory. I suspect we’ll at least get the former.

There's also the question of what happens once game developers can assume the world is running on displays with variable refresh rates. All of a sudden, targeting frame rates between 30 and 60 fps becomes far less of a tradeoff.

NVIDIA hasn’t disclosed much about G-Sync pricing, although ASUS has already given us a little guidance. The VG248QE currently sells for $280 on Newegg, while the upcoming G-Sync enabled flavor will apparently be sold for $400. The $120 premium can be a tough pill to swallow. A more than 40% increase in display cost is steep, which is another reason why I feel like NVIDIA might have a little more success pushing G-Sync as part of a higher end display. On the flip side, NVIDIA could easily get those costs down by migrating from an FPGA to an ASIC, although to justify that move we’d have to see pretty broad adoption of G-Sync.

Some system integrators will be selling the aftermarket upgraded VG248QE between now and CES, but you can expect other displays to be announced over the coming months. NVIDIA still hasn't figured out if/how it wants to handle end user upgrades for those who already own VG248QE displays.

I feel like NVIDIA is slowly but surely assembling a bunch of components of a truly next-generation gaming experience. With all of the new consoles launched, the bar is set for the next several years. PCs already exceed what consoles are capable of in terms of performance, but the focus going forward really needs to be on improving ease of use as well as the rest of the experience. Things like GeForce Experience are a step in the right direction, but they need far more polish and honestly, integration into something like Steam. G-Sync just adds to the list. For PC gaming to continue to thrive, it needs to evolve into something even more polished than it is today. It’s not enough to just offer higher resolution and a better looking image than what you can get on other platforms, it’s very important to provide a smoother and more consistent experience as well. G-Sync attempts to and succeeds at doing just that.

With G-Sync enabled, I began to expect/demand more visually from my games. Aliasing and other rendering imperfections were far more pronounced now that a big portion of stuttering was removed. G-Sync isn't the final solution, but rather the first on a long list of things that need improving. There are other use cases for G-Sync outside of gaming as well. Streaming video where bandwidth constraints force a variable frame rate is another one I’ve heard passed around.

Although G-Sync is limited to NVIDIA hardware (GeForce GTX 650 Ti Boost or greater), the implementation seems simple enough that other manufacturers should be able to do something similar. That’s obviously the biggest issue with what we have here today - it only works with NVIDIA hardware. For die-hard NVIDIA fans, I can absolutely see a G-Sync monitor as being a worthy investment. You might just want to wait for some more displays to hit the market first.


193 Comments


  • tipoo - Thursday, December 12, 2013 - link

    Good to hear it mostly works well, if you can keep the framerate high enough. This with a high end computer and Oculus Rift would be an amazing combination, I hope both take off.
  • smunter6 - Thursday, December 12, 2013 - link

    Want to make any wild guesses as to what John Carmack's working on over in Oculus Rift's secret labs?
  • GiantPandaMan - Thursday, December 12, 2013 - link

    Except the Oculus Rift probably won't have it. They love non-proprietary stuff and G-Sync lands firmly in the proprietary category.

    Make it a standard, make it cost about $10 more to implement rather than $120 and this will take off. I don't see this happening, though. NVidia just doesn't operate in that manner, unfortunately. It would make gaming so much better for the people who really need it--those with sub-par video cards.

    No display maker is going to make a key component (the scaler) beholden only to a single manufacturer (nVidia). The technology needs to be licensed so it becomes an industry standard so that manufacturers can put it into their displays without having to rely on a single OEM.
  • psuedonymous - Thursday, December 12, 2013 - link

    Carmack himself mentioned at the panel after the G-sync reveal that the first consumer release of the Oculus would NOT contain G-sync, but that is definitely something they want to incorporate.
    My guess is the reason being the use of LVDS as the sole panel interface. There simply AREN'T any decent 5.6"-6" panels using LVDS. Nobody makes them. The relatively bad (6-bit FRC, crummy colours compared to modern panels, low fill-factor, low resolution, too big to be used efficiently) panel was a compromise in that it was the only one readily available in volume and compatible with the existing LVDS board. Phone/tablet panels in the correct size, resolution and quality range are all MIPI DSI, with the exception of the Retina iPad Mini, which uses an eDP panel like the iPad 3 onwards. Except that panel is still too large, and will be unavailable in volume until Apple decide to reduce their orders in 6 months or so. The current 1080p prototype uses one of the early DVI->MIPI chips (probably on an in-house board) because it's the only way to actually drive the panels available.
  • GiantPandaMan - Thursday, December 12, 2013 - link

    Interesting information. Thanks for posting it.

    As useful as G-Sync would be for something like Oculus (especially for reducing motion sickness) it's still far too expensive to implement. Oculus, itself, wants to hold the line at $300. There's simply no way to absorb a $120 premium within that price.

    Then there's the fact that Oculus would benefit far more from 120Hz panels than it would from G-Sync. Honestly, I can't imagine Carmack or Oculus ever bad mouthing a new technology that they could benefit from in the future, but the fact remains there are so many other things that would be more cost effective for Oculus to do first. Higher resolution, 1920x2160 say; higher refresh rates, 120Hz. Personally I hope they think about using some of the projector panels. They're smaller, lighter, and already have both the color depth and refresh rates. The only problem, of course, is they're probably too small and may be too expensive.
  • psuedonymous - Friday, December 13, 2013 - link

    They specifically avoid making a microdisplay-based HMD, because of the tradeoffs that every previous microdisplay HMD has had to make. Because the displays are small, you need some hefty optics to view the image, and these must be complicated in order to correct for distortion (as unlike the large-panel software-corrected approach the Rift uses, distortion with a much smaller display would be so great it could not be effectively corrected). This means the optics are bulky, heavy and expensive. And that goes doubly so if you want a large field-of-view (compare the Oculus 90° horizontal FoV to the HMD-1/2/3's 45° hFoV, and the HMD series were praised for their unusually large FoV compared to competing models). In fact, the only large FoV HMD I know of using microdisplays is the Sensics Pisight (http://sensics.com/head-mounted-displays/technolog... a huge 24-display monster that costs well in excess of $20,000.

    And anything other than a tristimulus subpixel microdisplay (a tiny transmissive LCD) will have chromatic fringing when you look around due to sequential colour (http://blogs.valvesoftware.com/abrash/why-virtual-...
  • GiantPandaMan - Friday, December 13, 2013 - link

    Ahh, so I guess my fears on using projector panels are true. Damn. I guess we're going to be stuck with 60Hz on the Oculus for a while. I just don't see phone displays moving up in refresh rates anytime soon.

    I really want the Pisight now, but, unfortunately, I need to do things like eat and have shelter. :P
  • Black Obsidian - Thursday, December 12, 2013 - link

    G-Sync seems to live in a very small niche. How many people both:
    A) Need better performance
    *and*
    B) Need a new monitor as well
    ?

    Absent those two conditions, aren't people simply better off investing the ~$400 a G-Sync monitor would cost in, you know, a better video card instead? I experience neither tearing nor stuttering, because my absurd triple-slot, factory-overclocked R7970 has no problem pushing any game I play well beyond 60FPS. A special monitor would cost 80% what that card did at launch, so G-Sync seems like a bit of a non-starter to me, unless there's something I'm missing here.
  • IanCutress - Thursday, December 12, 2013 - link

    For the gamer that has it all?

    I'm interested in G-Sync at 4K, where the need for AA is reduced and you're battling against 30-60 FPS numbers. But for those users who are in the mid range GPU market, having a good monitor that will last 5-10 years might be cheaper than a large GPU or system upgrade.

    It's just another piece of the puzzle, one that will hopefully become standard. Think about it - in an ideal world, shouldn't this have been implemented from the start?
