Crysis, DX10 and Forcing VSYNC Off in the Driver

Why do we keep coming back to Crysis as a key focal point for our reviews? Honestly, because it's the only game out there that can actually make use of the ultra high end hardware that has recently been released.

That and we’ve discovered something very interesting this time around.

We noted that some of our earlier DX10 performance numbers on Skulltrail looked better than anything we could get more recently. In general, the higher number is more likely to be "right," and it has been a frustrating journey trying to hunt down the issues that led to our current situation.

Many reinstalls and configuration tweaks later, we've got an answer.

Every time I set up a system, the first thing I do is force VSYNC off in the driver, because I want to ensure maximum performance. I also generally run without having the graphics card scale output for my panel; centered timings let me see which resolution is currently running without having to check. But I was in a hurry on Sunday, and I must have forgotten to check the driver after I set up an 8800 Ultra SLI system for testing Crysis.

Lo and behold, when I looked at the numbers, I saw a huge performance increase. No, it couldn't be that VSYNC simply wasn't forced off in the driver, could it? After all, Crysis has its own VSYNC setting, and it was explicitly disabled; the driver setting shouldn't matter.

But it does.
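
For a bit of context, "the application controls VSYNC" in a DX10 title comes down to the sync interval the game passes to its swap chain every frame. Here is a minimal sketch of that mechanism (the wrapper function and variable names are our own invention; only the DXGI call itself is the real API):

    // Minimal illustration of application-controlled VSYNC under DX10.
    // The first argument to IDXGISwapChain::Present is the sync interval:
    // 0 presents immediately (VSYNC off), 1 waits for the next vertical blank.
    #include <dxgi.h>

    // The swap chain is assumed to have been created during device setup (not shown).
    void PresentFrame(IDXGISwapChain* swapChain, bool vsyncEnabled)
    {
        // With the driver at its default "application controlled" setting, this
        // call has the final say; a driver-level override supersedes whatever
        // the game requests here.
        swapChain->Present(vsyncEnabled ? 1 : 0, 0);
    }

What we can't show in code is what the driver actually does when VSYNC is forced off on top of a game that has already disabled it; whatever it is, it clearly isn't free.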

Forcing VSYNC off in the driver can decrease performance by 25% under the DX10 applications we tested, and we see a heavier impact in CPU limited situations. Interestingly enough, as we discussed last week, Crysis and World in Conflict were heavily CPU and system limited on our high end hardware. Take a look for yourself at the kind of performance we gained simply by removing the driver-level VSYNC override. These tests were run on Crysis using the GeForce 9800 GX2 in Quad SLI.


We would have tried overclocking the 790i system as well if we could have done so and maintained stability.

In looking at these numbers, we can see some of the major issues we had between NVIDIA platforms and Skulltrail diminish. There is still a difference, but 790i does have PCIe 2.0 bandwidth between its cards, and it uses DDR3 rather than FB-DIMMs. We won't be able to change those things; right now my options are to run half the slots with 800 MHz FB-DIMMs or all four slots with 667 MHz modules. We should be getting a handful of higher speed, lower latency FB-DIMMs in for testing soon, which we believe will help.

Now that we've gotten a better feel for the system, we also plan on trying some bus overclocking to help offset the PCIe 2.0 bandwidth advantage 790i has. It also seems possible to push our CPUs over 4GHz on air cooling, but we really need a larger PSU to keep the system stable; even without any graphics load, a full CPU load at 4GHz and 1.5V can pull about 700W at the wall, and that's before you start running a GPU on top of it.

Slower GPUs will benefit less from removing the driver override, but performance will still improve whenever the framerate is near a CPU limit (say, within 20% of it). NVIDIA seems more affected by this than AMD, but we aren't sure at this point whether that is simply because NVIDIA's faster cards expose more of a CPU limit.
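
To make that concrete with a purely hypothetical example (every figure below is invented for illustration): if a scene is CPU-bound at 40 fps and a slower card only manages 34 fps on its own, it sits within 20% of the CPU limit, so the driver override would still cost it frames.

    // Hypothetical numbers only: is a GPU-limited framerate close enough to
    // the CPU-bound ceiling for the driver-level VSYNC override to matter?
    #include <cstdio>

    int main()
    {
        const double cpuCeilingFps = 40.0; // assumed CPU/system-limited framerate
        const double gpuFps        = 34.0; // assumed framerate of a slower card

        const double gapPercent = (cpuCeilingFps - gpuFps) / cpuCeilingFps * 100.0;
        std::printf("%.0f%% below the CPU limit: %s\n", gapPercent,
                    gapPercent <= 20.0 ? "expect a hit from the driver override"
                                       : "mostly GPU-bound; smaller impact");
        return 0;
    }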

NVIDIA is aware of the VSYNC problem, as they were able to confirm our findings yesterday.

Since we review hardware, our testing habits may not reflect how most gamers play, and it is quite conceivable that this issue doesn't affect the majority of them. Many people like VSYNC on, and since most games offer the option themselves, it isn't usually necessary or beneficial to force VSYNC off in the driver. So we asked in our video forum how many people force VSYNC off in the driver, and whether they do so always or only some of the time.

More than half of our 89 respondents (at the time of this writing) never force VSYNC off, but 40% admitted to forcing VSYNC off at some point, and half of those always force it off in the driver (just as we do in our testing).
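
Working that arithmetic out (the 89-respondent total comes from the poll; reading the 40% figure as a share of all respondents is our interpretation, and the counts are rounded):

    // Rough breakdown of the 89-respondent poll, assuming the 40% figure
    // refers to all respondents rather than to a subset.
    #include <cstdio>

    int main()
    {
        const int total  = 89;
        const int ever   = static_cast<int>(total * 0.40 + 0.5); // ever force VSYNC off: ~36
        const int always = ever / 2;                             // always force it off: ~18
        const int never  = total - ever;                         // never force it off: ~53

        std::printf("never: %d, only sometimes: %d, always: %d\n",
                    never, ever - always, always);
        return 0;
    }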

This is a big deal, especially for those forum members with lower end CPUs who want to play Crysis. We didn't have time to rerun all of our numbers without VSYNC forced off in the driver, so keep in mind that the numbers in this article could improve significantly without the override.


Comments

  • 7Enigma - Tuesday, April 1, 2008

    NM, the images that include the 8800GT now show up. Thanks! So it seems the 9800GTX in most situations is <20% faster than the 8800GT at 1280X1024, correct? Since I game on a 19" LCD I might be better off with an 8800GT for a year or so and then upgrading to the next round of cards.....decisions....decisions....

    For anyone who cares, here's a direct comparison using the numbers from the table:

    9800GTX compared to 8800GT at 1280X1024 resolution

    Crysis.....19.5% faster

    CoD4.......17.5-18.5% faster (depending on no/4X AA)

    Oblivion...17.5-27% faster (depending on no/4X AA)

    QuakeW.....10.5% faster

    Stalker....13% faster

  • just4U - Tuesday, April 1, 2008

    So it's roughly 2% faster than the GTS/512? :(
  • 7Enigma - Tuesday, April 1, 2008

    Again, I'm disappointed that this review completely fails to include the 8800GTS. I asked about it in the previous 9800GX2 review, as did several others, and there was no response. It definitely appears that they are purposely omitting the most obvious competitor to the 9800GTX (and any future lower-end cards, GTS, GT, etc.).

    Looks like I'll be going to another site for a better comparison.
  • 7Enigma - Tuesday, April 1, 2008

    Hocp has a good comparison review (albeit with their odd way of benchmarking) of the 8800GTX/S against the 9800GTX. Pretty much shows what we thought, some slight improvements, but nothing to write home about. This quote from the conclusion sums up the release of the 9800GTX:

    "If you are a gamer and were hoping to upgrade, today is not the day if you already own pretty much any 8800 series card. Here’s hoping real next-gen technology will be seen in a “9900” series soon."

    This pretty much solidifies my purchase of an 8800GT. I just can't see the advantage of shelling out closer to $300 for a slightly better card than a $200 8800GT, with the hopes that within a year SOMEONE comes to the rescue of actually releasing a next gen card that is better than the current/previous generation.
  • AggressorPrime - Tuesday, April 1, 2008

    I'm pretty sure dual, tri, and quad Crossfire is not supposed to give the exact same results in Crysis. There must be something wrong with the chart.
  • AggressorPrime - Tuesday, April 1, 2008

    It looks like these tests were done with the 790i, yet the chart has no info on what RAM was used, or what motherboard for that matter.

    It is interesting that a 790i setup would beat Skulltrail in Crysis, but I guess fast RAM is more important.
  • Noya - Tuesday, April 1, 2008

    ...the best bang for the buck is a pair of 8800gt in SLI @ about $350.
  • KingViper - Tuesday, April 1, 2008

    Can we get a spell check in the house?
  • jtleon - Tuesday, April 1, 2008

    Ditto ditto ditto. I just hate it when web content is not edited!!!!

    Regards,
    jtleon
  • JarredWalton - Tuesday, April 1, 2008

    Fixed.
