We don't often get involved in discussions of what options people should always enable when they play games; rather, we tend to focus on what we test with. Honestly, our recommended settings for playing the games we test would be very similar to the settings we use to benchmark, with one very important exception: we would enable triple buffering (which implies vsync) whenever possible. While it's not an available option in all games, it really needs to be, and we are here to make the case for why gamers should use triple buffering and why developers need to support it.

When it comes to anything regarding vsync, gamers most often swear by forcing it off in the driver or disabling it in the game. In fact, this is what we do when benchmarking, because it allows us to see more clearly what is going on under the hood. Those who do enable vsync typically do so, despite its negative side effects, to avoid the visual "tearing" that occurs when a buffer swap lands in the middle of a screen refresh and parts of two different frames are displayed at once.

We would like to try something a little different with this article. We'll include two polls, one here and one at the end of the article. This first poll is designed to report what our readers already do with respect to vsync and double versus triple buffering.

[Poll: what do you currently do with respect to vsync and double versus triple buffering?]

After reading the rest of this article, our readers are invited to answer a related poll which is designed to determine if arming gamers with the information this article provides will have any impact on what settings are used from here on out.

First up will be a conceptual review of what double buffering and vsync are; then we'll talk about what triple buffering brings to the table. For those who really want the nitty gritty (or who need more convincing), we will follow that up with a deeper dive into each approach, complete with some nifty diagrams.
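Before we get there, a quick toy model may help frame the discussion. The short Python sketch below is our own illustration rather than code from any game or driver: it assumes a 60Hz display and a fixed, hypothetical per-frame render cost, and it compares how stale the on-screen frame is under double buffering with vsync versus triple buffering.

```python
# 60Hz display: one refresh every ~16.7ms. `render_ms` is a made-up,
# fixed per-frame GPU cost; real workloads vary frame to frame.
REFRESH_MS = 1000.0 / 60.0

def double_buffer_vsync(render_ms, frames=6):
    """With one back buffer, the GPU must stall after finishing a frame
    until that buffer flips at a vsync. Returns the age (ms since render
    started) of each new frame at the moment it reaches the screen."""
    ages = []
    start = 0.0
    next_vsync = REFRESH_MS
    for _ in range(frames):
        done = start + render_ms
        while next_vsync < done:         # missed this refresh entirely
            next_vsync += REFRESH_MS
        ages.append(next_vsync - start)  # frame age when displayed
        start = next_vsync               # GPU idle until the flip
        next_vsync += REFRESH_MS
    return ages

def triple_buffer(render_ms, refreshes=6):
    """With two back buffers, the GPU renders back to back and each vsync
    flips the most recently *completed* frame (possibly repeating one).
    Returns the age of the frame shown at each refresh."""
    starts = [i * render_ms for i in range(200)]  # uninterrupted rendering
    ages = []
    vsync = REFRESH_MS * 3                        # skip warm-up refreshes
    for _ in range(refreshes):
        newest = max(s for s in starts if s + render_ms <= vsync)
        ages.append(vsync - newest)
        vsync += REFRESH_MS
    return ages

for label, ms in (("GPU faster than refresh (120fps)", 1000 / 120),
                  ("GPU slower than refresh (40fps)", 1000 / 40)):
    print(label)
    print("  double buffering + vsync:", [round(a, 1) for a in double_buffer_vsync(ms)])
    print("  triple buffering:        ", [round(a, 1) for a in triple_buffer(ms)])
```

With the GPU outrunning the refresh rate, both schemes deliver a new frame every refresh, but the triple buffered frame is roughly half as stale when it appears; below the refresh rate the gap narrows, which is exactly the case the deeper dive examines.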

What are Double Buffering, vsync and Triple Buffering?
Comments (184)

  • greylica - Friday, June 26, 2009 - link

    I always use triple buffering in OpenGL apps, and the performance is superb, until Vista/7 came along and crippled my hardware with vsync enabled by default. This sh*t of a Microsoft invention crippled my flawless GTX 285 to a mere 1/3 of its OpenGL performance in the two betas I have tested.

    Thanks to GNU/Linux, I have at least one chance to be free of the issue and use my 3D apps at full speed.
  • The0ne - Friday, June 26, 2009 - link

    Love your comment lol
  • JonP382 - Friday, June 26, 2009 - link

    I always avoided triple buffering because it introduced input lag for me. I guess the implementation that ATI and Nvidia have for OpenGL is not the same as this one. Too bad. :(

    I'm going to try triple buffering in L4D and TF2 later today, but I'm just curious: is their implementation the same as the one promoted in this article?
  • DerekWilson - Friday, June 26, 2009 - link

    I haven't spoken with Valve, but I suspect their implementation is good and should perform as expected.
  • JonP382 - Friday, June 26, 2009 - link

    Same old story - I get even more input lag on triple buffering than on double buffering. :(
  • JonP382 - Friday, June 26, 2009 - link

    I should say that triple buffering introduced additional lag. Vsync itself introduces an enormous amount of input lag and drives me insane. But I do hate tearing...
  • prophet001 - Friday, June 26, 2009 - link

    One of the best articles I've read on here in a long time. I knew what vsync did as far as degrading performance (only in that it waits for the frame to be complete before displaying), but I never knew how double and triple buffering actually worked. Triple buffering from here on out.

    4.9 out of 5.0 :-D
    (but only b/c nobody gets a 5.0 lol)
    thank you

    Preston
  • danielk - Friday, June 26, 2009 - link

    This was an excellent article!

    While I'm a gamer, I don't know much about the settings I "should" be running for optimal FPS vs. quality. I've run with vsync on, as that's been the only remedy I've found for tearing, but I had it set to "always on" in the graphics driver, as I didn't know better.

    Naturally, triple buffering will be on from here on.

    I would love to see more info about the different settings (anti-aliasing, etc.) and their impact on FPS and image quality in future articles.

    Actually, if anyone has a good guide to link, I would appreciate it!


    Regards,
    Daniel
  • DerekWilson - Friday, June 26, 2009 - link

    Keep in mind that you can't force triple buffering on in DirectX games from the control panel (yet - hopefully that changes). It works for OpenGL, though.

    For DX games, there are utilities out there that can force the option on for most games, but I haven't done in-depth testing with these utilities, so I'm not sure of the specifics of how they work, what they do, and whether the implementation is a good one.

    The very best option (as with all other situations) is to find an in-game setting for triple buffering, which many developers unfortunately do not include (but hopefully that trend is changing).
  • psychobriggsy - Friday, June 26, 2009 - link

    I can see the arguments for triple buffering when the rendered frame rate is above the display refresh rate. Of course, a lot of work is wasted with this method, especially in your 300fps example.

    However, I've been drawing out sub-refresh-rate examples on paper here to match yours, and triple buffering is really not better than vsynced double buffering apart from the odd frame here and there.

    What appears to be the best solution is for a game to time each frame's rendering (on an ongoing basis) and adjust when it starts rendering so that it finishes just before the vsync. I will call this "Adaptive Vsync Double Buffering": it uses the previous frame's rendering time to work out when to begin the next frame, so that what is displayed is up to date but wasted work is reduced (a rough sketch of the idea follows this thread).

    In the meantime, let's work on getting 120Hz monitors (in terms of the input signal). That would be the best way to reduce input lag, in my opinion.
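For the curious, here is a minimal Python sketch of the adaptive scheme described in the comment above. It is a hypothetical illustration only (not a shipping driver or game feature), with invented frame costs: the loop delays the start of each frame so that, judging by the previous frame's render time plus a small safety margin, it should finish just before the next vsync.

```python
import random

REFRESH_MS = 1000.0 / 60.0
MARGIN_MS = 1.0   # safety margin: aim to finish a bit before the deadline

def adaptive_start(now_ms, next_vsync_ms, est_render_ms):
    """Delay the start of rendering so the frame completes roughly
    MARGIN_MS before the next vsync, using the previous frame's render
    time as the estimate. Never waits past 'now' if already late."""
    return max(now_ms, next_vsync_ms - est_render_ms - MARGIN_MS)

random.seed(1)
now, next_vsync, estimate = 0.0, REFRESH_MS, 8.0
for frame in range(6):
    begin = adaptive_start(now, next_vsync, estimate)
    cost = random.uniform(6.0, 10.0)   # pretend render time in ms
    finish = begin + cost
    status = "on time" if finish <= next_vsync else "MISSED"
    print(f"frame {frame}: begin {begin:6.1f}  finish {finish:6.1f}  "
          f"vsync {next_vsync:6.1f}  {status}")
    # the frame flips at the first vsync at or after it finishes
    flip = next_vsync
    while flip < finish:
        flip += REFRESH_MS
    now, next_vsync, estimate = flip, flip + REFRESH_MS, cost
```

The trade-off is visible immediately: any frame that runs longer than the estimate plus the margin misses its vsync entirely, so a real implementation would want a conservative estimate, such as the maximum over several recent frames.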
