Final Words

After spending a few days with G-Sync, I’m just as convinced as I was in Montreal. The technology, albeit a relatively simple manipulation of display timing, is a key ingredient in delivering a substantially better gaming experience.

In pathological cases the impact can be shocking, particularly if you’re coming from a 60Hz panel today (with or without v-sync). The smoothness afforded by G-Sync is just awesome. I didn’t even realize how much v-sync related stutter I had simply come to accept. I’d frequently find a scene that stuttered badly with v-sync enabled and approach it fully expecting G-Sync to fail at smoothing things out this time, and I always came away impressed. G-Sync also lowered the minimum frame rate I needed to avoid being distracted by stutter. Dropping below 30 fps is still bothersome, but in all of the games I tested, as long as I could keep frame rates north of 35 fps the overall experience was great.

In many situations the impact of G-Sync can be subtle. If you’re not bothered by tearing or are OK with v-sync stuttering, there’s really nothing G-Sync can offer you. There’s also the fact that G-Sync addresses a problem that isn’t visible 100% of the time. Unlike moving to a higher resolution or increasing quality settings, G-Sync’s value is best realized in scenarios with a lot of frame rate variability - particularly between 30 and 60 fps. Staying in that range is tougher to do on a 1080p panel, especially if you’ve already invested in a pretty fast video card.
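
To make that 30 to 60 fps sweet spot concrete, here’s a deliberately simplified sketch of the effect (my own illustration with made-up render times, not anything NVIDIA has published). With v-sync on a 60Hz panel, a completed frame can only be swapped on a refresh boundary, so on-screen frame times snap between 16.7ms and 33.3ms even when render times vary smoothly; with a variable refresh display, on-screen time simply follows render time.

```python
import math

# A back-of-the-envelope model of frame pacing, not NVIDIA's implementation.
VSYNC_HZ = 60                         # fixed refresh rate of a typical 60Hz panel
GSYNC_MAX_HZ = 144                    # maximum refresh rate of the VG248QE
VSYNC_MS = 1000.0 / VSYNC_HZ          # ~16.7ms between fixed refreshes
GSYNC_MIN_MS = 1000.0 / GSYNC_MAX_HZ  # ~6.9ms minimum frame-to-frame interval

# Hypothetical render times (ms) for a game bouncing around the 30 - 60 fps range
render_ms = [14.0, 18.0, 24.0, 15.0, 31.0, 16.0]

def vsync_display_times(times):
    # With v-sync, a finished frame waits for the next refresh boundary, so
    # every frame persists on screen for a whole multiple of 16.7ms.
    return [math.ceil(t / VSYNC_MS) * VSYNC_MS for t in times]

def gsync_display_times(times):
    # With variable refresh, the panel scans out as soon as the frame is
    # ready, so display time tracks render time (bounded below by the
    # panel's fastest refresh interval).
    return [max(t, GSYNC_MIN_MS) for t in times]

print("render:", render_ms)
print("v-sync:", [round(t, 1) for t in vsync_display_times(render_ms)])  # [16.7, 33.3, 33.3, 16.7, 33.3, 16.7]
print("g-sync:", [round(t, 1) for t in gsync_display_times(render_ms)])  # [14.0, 18.0, 24.0, 15.0, 31.0, 16.0]
```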

If you’re already running games at a fairly constant 60 fps, G-Sync will let you crank up quality levels even further without significantly reducing the smoothness of your experience. I feel like G-Sync will be even more important with higher resolution displays, where it’s a lot harder to maintain 60 fps. Ideally I’d love to see a 2560 x 1440 G-Sync display with an IPS panel that maybe even ships properly calibrated from the factory. I suspect we’ll at least get the former.

There's also the question of what happens if game developers can assume the world is running on displays with variable refresh rates. All of a sudden, targeting frame rates between 30 and 60 fps becomes far less of a tradeoff.

NVIDIA hasn’t disclosed much about G-Sync pricing, although ASUS has already given us a little guidance. The VG248QE currently sells for $280 on Newegg, while the upcoming G-Sync enabled flavor will apparently sell for $400. The $120 premium can be a tough pill to swallow. A more than 40% increase in display cost is steep, which is another reason why I feel like NVIDIA might have a little more success pushing G-Sync as part of a higher end display. On the flip side, NVIDIA could easily get those costs down by migrating from an FPGA to an ASIC, although to justify that move we’d have to see pretty broad adoption of G-Sync.

Some system integrators will be selling the aftermarket upgraded VG248QE between now and CES, but you can expect other displays to be announced over the coming months. NVIDIA still hasn't figured out if/how it wants to handle end user upgrades for those who already own VG248QE displays.

I feel like NVIDIA is slowly but surely assembling the components of a truly next-generation gaming experience. With all of the new consoles launched, the bar is set for the next several years. PCs already exceed what consoles are capable of in terms of performance, but the focus going forward really needs to be on improving ease of use as well as the rest of the experience. Things like GeForce Experience are a step in the right direction, but they need far more polish and, honestly, integration into something like Steam. G-Sync just adds to the list. For PC gaming to continue to thrive, it needs to evolve into something even more polished than it is today. It’s not enough to offer higher resolution and a better looking image than other platforms; it’s just as important to provide a smoother and more consistent experience. G-Sync attempts to do exactly that, and it succeeds.

With G-Sync enabled, I began to expect/demand more visually from my games. Aliasing and other rendering imperfections were far more pronounced now that a big portion of the stuttering was removed. G-Sync isn't the final solution, but rather the first item on a long list of things that need improving. There are also use cases for G-Sync outside of gaming; streaming video, where bandwidth constraints force a variable frame rate, is one I’ve heard passed around.

Although G-Sync is limited to NVIDIA hardware (GeForce GTX 650 Ti Boost or greater), the implementation seems simple enough that other manufacturers should be able to do something similar. That’s obviously the biggest issue with what we have here today: it only works with NVIDIA hardware. For die-hard NVIDIA fans, I can absolutely see a G-Sync monitor being a worthy investment. You might just want to wait for a few more displays to hit the market first.

Comments

  • nathanddrews - Thursday, December 12, 2013

    "Can't keep a constant XXfps at XXXXp because our GPUs are too slow? Here, buy this thing that makes your display go slower!"

    I'm a bit torn on G-Sync. On the one hand, it removes some glaring issues that have plagued gamers for years. On the other, it's basically a beard. 15 years ago, you could play a game at insane FPS and refresh rates on CRT. Games were simple with small textures and almost no particle effects. 10 years ago, LCDs became affordable and suddenly everyone was capped at 60Hz and consoles were locked at 30fps or 60fps. Games were more complex, requiring faster hardware, but the slow LCDs made it less noticeable. Now we're moving on to LCDs that operate at 144Hz and 4K displays capped at 60Hz. G-Sync is a band-aid. The REAL problem is that GPU makers (NVIDIA/AMD) have not kept up with the pace of resolution requirements and game complexity.

    Like most reviews point out, it all comes down to what you're used to. I'm still using a CRT, 1920x1200@96Hz (sometimes lower, sometimes higher). I have all my games set up to maximize FPS for the target resolution and usually don't use vsync. Screen tearing is not as noticeable due to the high frame rate, instant response time, and the nonexistent lag that comes from CRT tech. G-Sync appeals to me because it would allow me to avoid the most glaring pitfalls of LCD tech and my inability to turn up eye candy to the max without buying all the highest-end hardware. But like I said, this is really just a band-aid and I'm not sure I want to reward this laziness.

    G-Sync hasn't earned my dollar yet. I know my next display purchase will be 4K, but I'm not content with 60Hz LCD. DP 1.3 is on the way, bringing with it 4K and 8K support at significantly higher refresh rates along with 3-D and all that jazz. Will AMD have a response to G-Sync or will they be able to license it for Hawaii 2.0? Will someone develop an open spec that requires minimal hardware to implement for broader adoption? Will GPU makers significantly push performance to make G-Sync obsolete? My CRT hopefully has a couple years left in her, so I hope I can weather the oncoming storm (not a DW reference).
  • Yojimbo - Friday, December 13, 2013

    Obviously there's a limit to how good of a video card you can get. This pushes the upper bound on the experience by making frame rates down to 35fps acceptable instead of requiring 60fps. As for the cost analysis of buying a faster card for those not in the market for top tier cards, remember that most users upgrade video cards far more often than monitors. Over the life of a monitor, you'd have to keep buying more expensive video cards with each upgrade to match the experience a G-Sync-enabled monitor delivers with less expensive cards.
  • hoboville - Thursday, December 12, 2013

    It's kind of a stopgap device for those who don't want to shell out the extra cash for a better/second GPU. But even then, the cost of getting a new monitor would seem to offset the cost of a better/second GPU.

    Anand hit the nail on the head when he pointed out that if you are getting a minimum FPS of 60, then vsync should be fine for you. At 1440p+ resolutions, even dual GPUs will start to encounter slowdowns, so it makes sense to invest in G-Sync, because minimum frame rates will be lower. Also, as your hardware ages relative to the games you play, having G-Sync will be good because you'll get a smooth experience without having to buy a new GPU / CPU. Old hardware will retain its relevance longer.
  • Mr Perfect - Thursday, December 12, 2013

    Don't forget C!

    C) Have, or are willing to go buy, an nvidia GPU to use with the screen.

    It's always a little disappointing when a manufacturer spends a lot of time and money making some cool new feature, only to have it die because it's proprietary. If this were part of DirectX or some other industry standard, maybe it would take off.
  • kwrzesien - Thursday, December 12, 2013

    They might as well make an entirely new connector and cable.
  • Dribble - Friday, December 13, 2013

    Got to lol there - DirectX is proprietary. From the manufacturer's point of view (which in this case is nvidia), if they have spent all that time and money, how do they get it back if they just give the tech away for free?
  • Sadrak85 - Thursday, December 12, 2013

    As a person who outgrew 1080p a while back and now has a 3x1 setup, I'm certainly hungry for more features in my monitors, if they're going to continue to cost the same (and they appear to have no intention of having a race to the bottom).
  • Yojimbo - Friday, December 13, 2013

    It may be niche for the next couple years, but it seems like a technology which is destined to eventually become ubiquitous. It's common sense and has real world results. As the industry matures, the holes in the experience will be filled in.
  • Samus - Saturday, December 14, 2013

    For what probably costs $20 in hardware, they can charge a $50-$100 price premium on a high end display for G-Sync. It's definitely niche, but so are video cards costing over $200 and they sell quite well.
  • ArmedandDangerous - Sunday, December 15, 2013

    Well, an FPGA isn't cheap, RAM on it isn't cheap, and it definitely doesn't cost $20 in materials.
