We don't often get involved in the discussion of what options people should always enable when they play games. Rather, we tend to focus on the settings we test with. Honestly, our recommended settings for playing the games we test would be very similar to the settings we use to benchmark, with one very important exception: we would enable triple buffering (which implies vsync) whenever possible. While it's not an available option in all games, it really needs to be, and we are here to make the case for why gamers should use triple buffering and why developers need to support it.

When it comes to anything regarding vsync, gamers most often swear by forcing it off in the driver or disabling it in the game. In fact, this is what we do when benchmarking, because it allows us to see more clearly what is going on under the hood. Those who do enable vsync typically do so to avoid the visual "tearing" that can occur in some cases, despite the negative side effects.

We would like to try something a little different with this article. We'll include two polls, one here and one at the end of the article. This first poll is designed to report what our readers already do with respect to vsync and double versus triple buffering.

{poll 134:300}

After reading the rest of this article, our readers are invited to answer a related poll which is designed to determine if arming gamers with the information this article provides will have any impact on what settings are used from here on out.

First up will be a conceptual review of what double buffering and vsync are, then we'll talk about what triple buffering brings to the table. For those who really want the nitty gritty (or who need more convincing), we will follow that up with a deeper dive into each approach, complete with some nifty diagrams.
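
For readers who like to see the idea in code before the diagrams, the short Python sketch below simulates which frame ends up on screen at each refresh for a hypothetical GPU that needs 20 ms per frame (a 50 fps pace) on a 60 Hz display. It is only an illustration of the flipping logic discussed on the following pages, not a model of any real driver, and every number in it is made up:

# Illustrative only: which frame is on screen at each vsync when the GPU
# needs 20 ms per frame (a 50 fps pace) on a 60 Hz display.
REFRESH = 1000.0 / 60   # ms between vsyncs
RENDER = 20.0           # ms the GPU needs to finish one frame

def simulate(buffers, vsyncs=10):
    """buffers=2: double buffering with vsync -- once the single back buffer
    holds a finished frame, the GPU must wait for the flip before drawing again.
    buffers=3: triple buffering -- the GPU keeps drawing, and at each vsync the
    most recently finished frame is flipped to the screen."""
    shown, finished, gpu_time, frame = [], [], 0.0, 0
    for v in range(1, vsyncs + 1):
        vsync_time = v * REFRESH
        # Let the GPU finish as many frames as it can before this vsync.
        while gpu_time + RENDER <= vsync_time:
            if buffers == 2 and finished:
                break                    # back buffer already full: GPU stalls
            gpu_time += RENDER
            frame += 1
            finished.append(frame)       # only the newest one matters below
        if finished:
            shown.append(finished[-1])   # flip to the newest finished frame
            finished = []                # older finished frames are discarded
            if buffers == 2:
                gpu_time = vsync_time    # the stalled GPU resumes at the flip
        else:
            shown.append(shown[-1] if shown else 0)  # nothing new: repeat frame
    return shown

print("double buffering + vsync:", simulate(2))  # new frame every other refresh (~30 fps)
print("triple buffering:        ", simulate(3))  # new frame on most refreshes (~50 fps)

The point the diagrams will make in more detail is visible in the output: with only two buffers and vsync, a GPU capable of 50 frames per second gets throttled to 30, while triple buffering lets it keep working and always hands the display its most recently finished frame.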

What are Double Buffering, vsync and Triple Buffering?
184 Comments

  • velanapontinha - Friday, June 26, 2009 - link

    The eyes and brain that watch a game or a movie are the same. If there were a "Pepsi challenge"-like contest between 30 fps, 60 fps, and 200 fps, the error rate would be astronomical, and a lot of overspending gamers would feel bad about spending so much money on hardware that is able to create frames they never see, nor even miss.
  • JS - Friday, June 26, 2009 - link

    The difference is not in the frame rate, but in the fact that a film is not a sequence of perfectly sharp static images (as games normally are). Motion blur is introduced automatically by the shutter time on the film camera. That is why 24 fps works for film but not so well for games.

    Most people would definitely see the difference. (A small sketch of the shutter-integration effect follows the comments.)
  • james jwb - Friday, June 26, 2009 - link

    Films also do not require the viewer to make decisions based on what they see. In a movie, fast-paced movement in a war scene doesn't require the viewer to see every detail with perfect accuracy and definition; what happens next isn't your choice, you are just watching. In a game, what you see decides what you'll do, and fast movement blurred to death will never suffice in some game genres. You need a compromise somewhere, and with games a higher-than-film frame rate will significantly help overcome this.
  • BJ Eagle - Saturday, June 27, 2009 - link

    Ahh - good point about the blurring in films...

    But here's another one then:
    In the nVidia control panel (Vista x64, driver version 186.18), it clearly states under triple buffering (though only OpenGL is affected, as discussed): "Turning on this setting improves performance when Vertical sync is also turned on"...
    This is not quite the impression I got from reading this article. Clearly there is still some confusion about when to enable which settings, and having an article like this contradict nVidia's recommendation doesn't really help... me at least :)
  • profoundWHALE - Monday, January 19, 2015 - link

    I'm just going to leave this here:

    http://www.testufo.com/
  • james jwb - Friday, June 26, 2009 - link

    I can see myself using triple buffering in most situations, but for games like CS:S, I don't think it would be wise. For a game like this, consistently high frame and refresh rates would be the preferred option. Actually, that would be the preferred option for all games, but in order to get it you'd have to delay playing new, graphics-intensive games for two years to allow the hardware to catch up.
  • DerekWilson - Friday, June 26, 2009 - link

    I'd still want triple buffering for CS:S ...

    For me, tearing is distracting, and I use the top of my display more than the bottom (even if new data were drawn lower on the screen it wouldn't be beneficial to me).
  • james jwb - Friday, June 26, 2009 - link

    Ah, see, here's a point to consider as to why I said what I said. I use a CRT at 100 Hz, so the tearing issue becomes almost insignificant. Sure, if I were on an LCD I would agree with you; tearing in CS:S is a disaster in that scenario.
  • JarredWalton - Friday, June 26, 2009 - link

    What I really want is LCDs with a native 120 Hz refresh rate and data rate. That last part is key; I want 1920x1200 at 120 Hz, not 1920x1200 with 60 images and some funky software interpolating to 120 Hz. It would require DisplayPort, dual-link DVI, or HDMI 1.4 (I think?), but with triple buffering that would be the best of all worlds. (A rough data-rate calculation follows the comments.)
  • james jwb - Friday, June 26, 2009 - link

    @ Jarred, I couldn't agree more, but you know that already :)

    If someone like HP can bring a 24" IPS 120 Hz panel to market with similar performance to their current model, I'd be in tech-drool heaven. In that scenario, I'd play CS:S with double buffering and no vsync, but for games that are graphics-intensive and cannot sustain high frame rates, I'd definitely love the option of triple buffering.
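
On JS's shutter-time point above, here is a tiny Python sketch of the idea: a bright dot moving across a row of pixels, captured once as a game-style instantaneous sample and once as a film-style exposure that averages the dot's position over the time the shutter is open. Every number is made up purely for illustration:

# Illustrative only: why a 24 fps film frame can look smoother than a 24 fps
# game frame. A film frame integrates light over the open-shutter interval,
# smearing a moving object along its path; a game frame is one sharp sample.
WIDTH = 24            # pixels in a toy one-dimensional "image"
SPEED = 120.0         # pixels per second the dot moves
FRAME_TIME = 1 / 24   # seconds per frame at 24 fps
SHUTTER = 0.5         # fraction of the frame the shutter stays open

def game_frame(t):
    """Instantaneous sample: the dot lights exactly one pixel."""
    row = [0.0] * WIDTH
    row[int(SPEED * t) % WIDTH] = 1.0
    return row

def film_frame(t, samples=64):
    """Approximate shutter integration by averaging many sub-samples taken
    while the shutter is open."""
    row = [0.0] * WIDTH
    for i in range(samples):
        sub_t = t + (i / samples) * FRAME_TIME * SHUTTER
        row[int(SPEED * sub_t) % WIDTH] += 1.0 / samples
    return row

def show(row):
    """Render brightness as ASCII, darkest to brightest."""
    return "".join(" .:-=+*#"[min(7, int(v * 7.999))] for v in row)

t = 0.1  # an arbitrary moment in time
print("game frame:", show(game_frame(t)))  # one sharp pixel
print("film frame:", show(film_frame(t)))  # brightness smeared along the path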
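
And on Jarred's data-rate point, a rough back-of-the-envelope check. The blanking overhead below is an assumption (loosely in line with reduced-blanking timings); exact figures depend on the mode used, but the conclusion doesn't change much:

# Illustrative only: why 1920x1200 at a true 120 Hz is beyond single-link DVI.
WIDTH, HEIGHT, REFRESH = 1920, 1200, 120
BLANKING_OVERHEAD = 0.11          # assumed ~11% extra for blanking intervals

pixel_clock_mhz = WIDTH * HEIGHT * REFRESH * (1 + BLANKING_OVERHEAD) / 1e6

SINGLE_LINK_DVI_MHZ = 165         # maximum TMDS pixel clock, single-link DVI
DUAL_LINK_DVI_MHZ = 330           # two links, double the pixel rate

print(f"required pixel clock : ~{pixel_clock_mhz:.0f} MHz")
print(f"single-link DVI limit: {SINGLE_LINK_DVI_MHZ} MHz ->",
      "enough" if pixel_clock_mhz <= SINGLE_LINK_DVI_MHZ else "not enough")
print(f"dual-link DVI limit  : {DUAL_LINK_DVI_MHZ} MHz ->",
      "enough" if pixel_clock_mhz <= DUAL_LINK_DVI_MHZ else "not enough")

Even in round numbers the required pixel clock lands around 300 MHz, roughly double what a single DVI link can carry, which is why dual-link DVI, DisplayPort, or a newer HDMI revision comes up for a true 1920x1200 at 120 Hz.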
