Splinter Cell: Chaos Theory Performance

We make use of the Lighthouse demo for Splinter Cell: Chaos Theory. We have been using this benchmark for quite some time, and we automate it with the scripts published at Beyond 3D. This benchmark is fairly close to in-game performance on our system, though midrange users with slower processors may see somewhat lower real-world performance.

Our settings used the highest quality level possible, including the extra SM3.0 features. Because the advanced shaders and antialiasing are mutually exclusive under SC:CT, we left AA disabled and focused on the former. Anisotropic filtering was set to 8x for all cards.

For this third-person stealth game, ultra-high frame rates are not necessary; we consider 25 fps or higher a good playing experience. Framerate junkies may prefer something faster, but our recommendation is based on consistency and the ability to play the game without noticeable degradation.
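As a rough illustration of how we apply that playability bar, here is a minimal sketch that turns a per-frame time log into an average frame rate and checks it against 25 fps. The log format and file name are hypothetical; this is not the actual Beyond 3D benchmark automation, just the idea behind it.

```python
# playability_check.py -- minimal sketch, not the actual benchmark scripts.
# Assumes a plain-text log with one frame time per line, in milliseconds.

PLAYABLE_FPS = 25.0  # our rule of thumb for this third-person stealth game

def load_frame_times_ms(path):
    """Read per-frame render times (ms) from a simple one-value-per-line log."""
    with open(path) as log:
        return [float(line) for line in log if line.strip()]

def average_fps(frame_times_ms):
    """Average fps over the run: total frames divided by total elapsed time."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

if __name__ == "__main__":
    times = load_frame_times_ms("lighthouse_run.log")  # hypothetical file name
    fps = average_fps(times)
    verdict = "playable" if fps >= PLAYABLE_FPS else "below our playability bar"
    print(f"{fps:.1f} fps average -- {verdict}")
```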

Splinter Cell: Chaos Theory

NVIDIA's 7900 GTX SLI does almost as well as X1900 CrossFire, but the 14% advantage X1950 CF has over X1900 CF puts it way out in front. The 7950 GX2 once again splits the difference between the X1950 XTX and the 7900 GTX SLI.

While the X1950 XTX leads all the single-GPU single-card solutions, there really isn't that much difference in playability between the X1900 XTX, 7900 GTX, and X1900 XT. The extra 256MB of RAM the original X1900 XT has does give it a 7.5% advantage over its baby brother at this resolution.
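The percentages we quote are simple relative differences between average frame rates. A quick sketch, using made-up numbers rather than our measured results:

```python
def relative_advantage(card_a_fps, card_b_fps):
    """Percent advantage of card A over card B, based on average fps."""
    return (card_a_fps - card_b_fps) / card_b_fps * 100.0

# Illustrative values only -- not figures from our charts.
print(f"{relative_advantage(57.0, 50.0):.1f}% advantage")  # -> 14.0% advantage
```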

ATI leads again in Splinter Cell: Chaos Theory, both in dual-GPU and single-GPU configurations. Here the GX2 occupies a nice middle ground, and all of the tested cards manage to remain playable up through 2048x1536. Using the "Chuck Patch" it is also possible to enable AA+HDR on ATI hardware, though time constraints and the fact that there is no NVIDIA equivalent caused us to skip this test for now.

Comments

  • DerekWilson - Saturday, August 26, 2006 - link

yeah ... i didn't test power with crossfire -- which is a whole lot higher. also, i have a minimal set of components to make it work -- one hdd, one cdrom drive, and no add-in cards other than graphics.

    we'll do multi-gpu power when we look at quadsli
  • ElFenix - Thursday, August 24, 2006 - link

the review states that power consumption was measured at the wall with a kill-a-watt, during a 3Dmark run.

in addition to the water cooling, it could be he's running a more efficient PSU. a powerful system drawing 220 watts from the power supply would pull about 275 watts from the wall with an 80% efficient PSU (like a good seasonic) and about 314 watts with a 70% efficient PSU. that's a pretty decent difference right there (the math is sketched after the comments).

    ... still waiting for nvidia's HQ driver run...
  • poohbear - Thursday, August 24, 2006 - link

    thanks
  • Rock Hydra - Wednesday, August 23, 2006 - link

With those competitively priced parts, hopefully nVIDIA will respond with lower prices.
  • CreepieDeCrapper - Wednesday, August 23, 2006 - link

I'm not familiar with 1920x1440, did you mean 1920x1200? What resolution were these tests performed at? Thank you!

  • JarredWalton - Wednesday, August 23, 2006 - link

1920x1440 is a standard 4:3 resolution supported by many CRTs. It is often included because its performance is reasonably close to 1920x1200 performance.
  • CreepieDeCrapper - Wednesday, August 23, 2006 - link

    Thanks, I've been using my LCD for so long I forgot about the vintage CRT res's out there ;) Plus I never ran that particular res on my CRT when I had one, so I just wasn't familiar.
  • cgaspar - Wednesday, August 23, 2006 - link

    While average frame rates are interesting, I _really_ care about minimum frame rates - 300fps average is useless if at a critical moment in a twitch game the frame rate drops to 10fps for 3 seconds - this is especially true in Oblivion. Of course it's possible that the minimums would be the same for all cards (if the game is CPU bound in some portion), but they might not be.
  • JarredWalton - Wednesday, August 23, 2006 - link

A lot of games have instantaneous minimums that are very low due to HDD accesses and such. Oblivion is a good example. Benchmarking also tends to overemphasize minimum frame rates, since they occur less frequently in regular play. Basically, you run around an area for a longer period of time in actual gaming, as opposed to a 30-90 second benchmark. If there are a couple of seconds at the start of a level where frame rates are low because the engine is caching textures, that doesn't mean as much as continuous low frame rates (see the sketch after the comments for one way to filter those out).

    More information is useful, of course, but it's important to keep things in perspective. :)
  • kmmatney - Wednesday, August 23, 2006 - link

The charts show that the 7900GT gets a huge boost from being factory overclocked. It would be nice to see whether the X1900 XT 256MB can also be overclocked, and how much headroom it has.
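For reference, the PSU efficiency math ElFenix walks through above works out as follows. This is a generic sketch; the 220 W load and the 80%/70% efficiency figures are simply the example numbers from that comment.

```python
def wall_draw_watts(dc_load_watts, efficiency):
    """AC power drawn at the wall for a given DC load and PSU efficiency."""
    return dc_load_watts / efficiency

load = 220.0  # example DC load from the comment above, in watts
for eff in (0.80, 0.70):
    print(f"{eff:.0%} efficient PSU: {wall_draw_watts(load, eff):.0f} W at the wall")
# 80% -> 275 W, 70% -> ~314 W: roughly a 40 W difference at the wall
```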
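And on the minimum frame rate discussion above: one common way to report a "minimum" without letting a single disk-access hitch define it is to look at a low percentile of the per-frame rates rather than the absolute worst frame. A quick sketch, again assuming a simple per-frame time log, which is not something our current benchmarks output:

```python
def low_percentile_fps(frame_times_ms, percentile=1.0):
    """Frame rate at the given low percentile (e.g. 1% low), from per-frame times in ms.

    Using a percentile instead of the single slowest frame keeps one-off
    hitches (texture caching, HDD access) from defining the 'minimum'.
    """
    fps_per_frame = sorted(1000.0 / t for t in frame_times_ms)
    index = max(0, int(len(fps_per_frame) * percentile / 100.0) - 1)
    return fps_per_frame[index]

# Example with made-up frame times (ms): mostly ~16 ms with one 200 ms hitch.
times = [16.0] * 500 + [200.0]
print(f"absolute minimum: {1000.0 / max(times):.0f} fps")        # 5 fps, the hitch
print(f"1% low:           {low_percentile_fps(times):.0f} fps")  # ~62 fps
```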
