We rarely get very involved in the discussion of which options people should enable when they play games; rather, we tend to focus on what we test with. Honestly, our recommended settings for playing the games we test would be very similar to the settings we benchmark with, with one very important exception: we would enable triple buffering (which implies vsync) whenever possible. While it isn't an available option in every game, it really needs to be, and we are here to make the case for why gamers should use triple buffering and why developers need to support it.

When it comes to anything regarding vsync, gamers most often swear by forcing it off in the driver or disabling it in the game. In fact, this is what we do when benchmarking, because it allows us to see more clearly what is going on under the hood. Those who do enable vsync typically do so to avoid the visual "tearing" that can occur in some cases, despite the negative side effects.

We would like to try something a little different with this article. We'll include two polls, one here and one at the end of the article. This first poll is designed to report what our readers already do with respect to vsync and double versus triple buffering.

[Poll: What do you currently do with respect to vsync and double versus triple buffering?]

After reading the rest of this article, our readers are invited to answer a related poll designed to determine whether arming gamers with this information will have any impact on the settings they use from here on out.

First up will be a conceptual review of what double buffering and vsync are, then we'll talk about what triple buffering brings to the table. For those who really want the nitty gritty (or who need more convincing), we will follow that up with a deeper dive into each approach, complete with some nifty diagrams.
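
To make the comparison concrete before the conceptual review, here is a minimal, hypothetical sketch in Python (not from the article, and not how any driver actually implements these modes). It assumes a fixed 60Hz display and a constant render time per frame, models double buffering with vsync as stalling the GPU until each swap, and models triple buffering as flipping to the newest completed frame at each refresh; "lag" here is simply the time from when a frame starts rendering to the vblank at which it first appears.

    # A simplified model, not from the article: a frame's "lag" is the time from
    # when it starts rendering (when its input is sampled) to the vblank at which
    # it first appears on screen. Fixed 60 Hz refresh, constant render time.

    REFRESH = 1.0 / 60.0   # seconds between vertical refreshes

    def double_buffer_vsync(render_time, duration=1.0):
        """Frame N+1 cannot start until frame N is swapped to the front at a vblank."""
        swaps = []                                  # (swap time, frame start time)
        t = 0.0                                     # current frame starts rendering here
        while t < duration:
            done = t + render_time                  # rendering finishes
            vblank = REFRESH * -(-done // REFRESH)  # next refresh at or after 'done'
            swaps.append((vblank, t))               # frame is on screen from this vblank
            t = vblank                              # only now can the next frame start
        return swaps

    def triple_buffer(render_time, duration=1.0):
        """GPU renders back to back; each vblank flips to the newest completed frame."""
        swaps, shown, k = [], None, 1
        while k * REFRESH < duration:
            vblank = k * REFRESH
            newest = int(vblank // render_time) - 1  # last frame finished before this vblank
            if newest >= 0 and newest != shown:
                swaps.append((vblank, newest * render_time))
                shown = newest
            k += 1
        return swaps

    def report(name, swaps):
        lags = [1000.0 * (swap - start) for swap, start in swaps]
        print(f"{name}: {len(swaps)} new frames shown per second, "
              f"average lag {sum(lags) / len(lags):.1f} ms, worst {max(lags):.1f} ms")

    for rt in (1.0 / 100.0, 1.0 / 50.0):            # GPU faster and slower than 60 fps
        print(f"--- render time {rt * 1000:.1f} ms ---")
        report("double buffering + vsync", double_buffer_vsync(rt))
        report("triple buffering        ", triple_buffer(rt))

Under these simplified assumptions, a 10ms render time gives both approaches a new frame at every refresh but triple buffering shows somewhat newer data on average, while a 20ms render time drops double buffering with vsync to 30 new frames per second where triple buffering still delivers close to 50. That is the kind of difference the rest of the article explores.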

Comments

  • DerekWilson - Friday, June 26, 2009

    unfortunately, you really can't build a practical implementation that starts rendering a frame at the point where it will finish just before the next viable refresh. typically, with anything changing at all on screen, you aren't going to have previous frames be good predictors down to the accuracy level you would need.

    I didn't include sub 60 fps or sub 30 fps examples to keep it simple ... but in each case, the frame that starts being drawn at each refresh is equivalent between double buffering with no vsync and triple buffering.

    the "odd frame" here or there really add up when you look at an entire second by the way.
  • velanapontinha - Friday, June 26, 2009

    I always try to play with double buffering + V-Sync. I've known about Triple Buffering for quite some time, but I still prefer DB+Vsync. It's just that I never felt the theoretical input lag, while I can feel the benefits of having my CPU and GPU rest instead of always striving for those useless 100fps.
    60fps (heck, even 30fps), if constant, provide a flawless gaming experience, and if you can have a wonderful gaming experience without your hardware being pointlessly pushed to its limits, why make it render frames you will never miss?
    Less workload, less heat, less noise, less energy, and still an impeccable gaming experience.
  • DerekWilson - Friday, June 26, 2009

    there is still benefit at 30 FPS as well and not only when the framerate skyrockets.

    as frametime gets longer, input lag starts to become more and more of an issue. minimizing additional lag (as triple buffering can do) can help more at lower framerates when compared to double buffering and vsync.
  • KikassAssassin - Friday, June 26, 2009

    I just ran a test in WoW (I picked it since it has a triple buffering option built in), where I ran down a path and back again, running the same path three times: once with double buffering and vsync disabled, once with double buffering and vsync enabled, and once with triple buffering. I had RivaTuner open in the background monitoring my CPU and GPU usage.

    In all three tests, the CPU and GPU usage graphs look exactly the same. There's almost no difference between them whatsoever.
  • velanapontinha - Friday, June 26, 2009

    Well, if you can't see any difference, I guess (I'm just guessing) that you're running WoW close to your setup's limits, then.

    I'm a beta tester for a software company, and I can assure you that vsync can and will keep your CPU and GPU usage much lower.

    Try running a 3D application that goes easy on your hardware (and thus runs at over 100fps with double buffering and no v-sync).
    Then run the same software with v-sync enabled, and you'll see that your hardware has a lot less to struggle with.

    Try this one:
    http://www.theprodukkt.com/downloads/fr-041_debris...

    A very small app (177kb) that looks impressive. Run it at a low resolution (1024x768, for example), and then check it out. You have a v-sync option in the app.
  • velanapontinha - Friday, June 26, 2009

    At least I'm sure you'll notice that CPU usage will be lower. As for the GPU, it depends, since GPU load indicators usually are not reliable (they tend to sit at either 0% or 99%).
  • randomname - Friday, June 26, 2009

    I usually start by switching most of the options on in games. After I realize it isn't running fast enough, I start switching some of those options off. Even triple buffering is therefore a "nice to have" feature that I would select (or not) based on an experiment. Unfortunately, that little tryout probably isn't representative of the rest of the game: often, just when it gets really interesting (a lot of stuff and cool effects start happening), the performance plummets. Then you switch off everything that doesn't have an immediate visual impact (maybe triple buffering as well) and try again.

    Absolutely the best part of console gaming is that someone else has made the (artistic) choice of enabling something, and they are in effect saying that your experience is best with these options. The game has been reviewed with those options and the same hardware, and if it sucks, it's the developer's fault. The argument never turns into "you really need a fast machine to appreciate the graphics," which leaves questions about how fast is fast enough (to play through the heaviest scenes), whether there is any sense in making a several-hundred-dollar investment to play a fifty-buck game, and exactly what options and hardware the reviewer used. All of that tends to take a lot away from the enjoyment and immersion.

    One example is the motion blur in Crysis. It looks really nice and smooths out that FPS-style jerkiness of being able to move your head (optical axis) so fast. But it was also quite a heavy option, and although I really, really didn't want to switch it off, I had to.
  • SleepyGreg - Friday, June 26, 2009

    Having a poll of which buffering method you use under the heading "Triple buffering: Why we love it" is rather flawed. People often answer what they think is the right answer, not what they actually do.
  • DerekWilson - Friday, June 26, 2009

    You know, I agree with you ... I apologize for poisoning the sample. I don't think I'm that great at article titles anyway, but the poll was just something I thought would be a cool idea. I didn't think about how they would impact each other.

    I'll try to be more careful with stuff like this if I do it in the future.
  • Mills - Friday, June 26, 2009

    Seems like nobody here really agrees on when it is better.

    Some people say it's better only when your FPS is higher than the refresh rate; some say it's better only when FPS is below the refresh rate.

    The article seems to claim it's always better.

    I remain confused.
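
The exchange above about whether vsync lowers CPU and GPU usage is easy to test in miniature. The following is a toy, hypothetical sketch (not from any commenter; the workload size and the sleep-until-next-tick cap are made-up stand-ins for blocking on vsync): it runs the same fake "render" work uncapped and capped at 60Hz and compares the CPU time consumed. When a frame costs far less than one refresh interval, the capped loop spends most of its time idle, which is the effect velanapontinha describes; if the work already takes about a refresh interval or more, capping changes little, which would be consistent with the guess that WoW was running close to the hardware's limits.

    # A toy experiment, not from the comments: run the same fake "render" work
    # uncapped (start the next frame immediately) and capped (sleep until the
    # next 60 Hz tick, a rough stand-in for blocking on vsync). process_time()
    # counts only CPU time, so time spent sleeping shows up as a lower number.

    import time

    REFRESH = 1.0 / 60.0

    def fake_render(work=50_000):
        # stand-in for a frame that costs only a few milliseconds of CPU
        s = 0
        for i in range(work):
            s += i * i
        return s

    def run(capped, duration=2.0):
        start = time.perf_counter()
        cpu_start = time.process_time()
        frames = 0
        while time.perf_counter() - start < duration:
            fake_render()
            frames += 1
            if capped:
                elapsed = time.perf_counter() - start
                time.sleep(REFRESH - (elapsed % REFRESH))  # wait for the next "vblank"
        cpu = time.process_time() - cpu_start
        label = "capped at 60 Hz" if capped else "uncapped"
        print(f"{label}: {frames / duration:.0f} fps, "
              f"{cpu:.2f} s of CPU time over {duration:.0f} s wall clock")

    run(capped=False)
    run(capped=True)

This only measures CPU time; in a real game the GPU would be expected to idle in a similar way while it waits on the swap, which is what a GPU usage graph like RivaTuner's tries to capture.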
