Splinter Cell: Chaos Theory Performance

We use the Lighthouse demo for Splinter Cell: Chaos Theory. We have relied on this benchmark for quite some time, automating it with the scripts published at Beyond3D. It tracks in-game performance fairly closely on our test system, though midrange users pairing these cards with a slower processor may see somewhat lower real-world performance.
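At its core, an automated run of this sort launches the timedemo several times and averages the reported frame rate. A minimal sketch of the averaging step (the log format and FPS line shown here are hypothetical illustrations, not the actual output of the Beyond3D scripts):

```python
import re
import statistics

def average_fps(log_text: str) -> float:
    """Average the per-run FPS figures found in a benchmark log.

    Assumes (hypothetically) that each completed run prints a line
    such as 'Average FPS: 43.2' -- the real log format may differ.
    """
    fps_values = [float(m) for m in re.findall(r"Average FPS:\s*([\d.]+)", log_text)]
    if not fps_values:
        raise ValueError("no FPS lines found in log")
    return statistics.mean(fps_values)

# Example: three Lighthouse demo runs at one resolution
log = """\
Run 1 complete. Average FPS: 42.8
Run 2 complete. Average FPS: 43.4
Run 3 complete. Average FPS: 43.1
"""
print(round(average_fps(log), 1))  # -> 43.1
```

Averaging several runs smooths out run-to-run variation, which is why scripted repetition is preferable to a single manual pass.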

Our settings used the highest quality level possible, including the extra SM3.0 features. Since the advanced shaders and antialiasing are mutually exclusive under SC:CT, we left AA disabled and focused on the former. Anisotropic filtering was set to 8x for all cards.

For this third-person stealth game, ultra-high frame rates are not necessary; we find the game plays well at 25 fps or higher. Framerate junkies may prefer something faster, but our recommendation is based on consistency of experience and the ability to play the game without degradation.

Splinter Cell: Chaos Theory

NVIDIA's 7900 GTX SLI does almost as well as X1900 CrossFire, but the 14% advantage X1950 CF has over X1900 CF puts it way out in front. The 7950 GX2 once again splits the difference between the X1950 XTX and the 7900 GTX SLI.

While the X1950 XTX leads all the single-GPU, single-card solutions, there really isn't much difference in playability among the X1900 XTX, 7900 GTX, and X1900 XT. The extra 256MB of RAM on the original X1900 XT gives it a 7.5% advantage over its baby brother at this resolution.
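The percentage gaps quoted throughout are simple relative differences between average frame rates. A quick sketch (the frame-rate numbers below are illustrative, not our measured results):

```python
def percent_advantage(faster_fps: float, slower_fps: float) -> float:
    """Relative advantage of one card over another, in percent."""
    return (faster_fps - slower_fps) / slower_fps * 100.0

# Illustrative only: a card averaging 57.0 fps vs one averaging 50.0 fps
print(round(percent_advantage(57.0, 50.0), 1))  # -> 14.0
```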

ATI leads again in Splinter Cell: Chaos Theory, both in dual-GPU and single-GPU configurations. Here the GX2 occupies a nice middle ground, and all of the tested cards manage to remain playable up through 2048x1536. Using the "Chuck Patch" it is also possible to enable AA+HDR on ATI hardware, though time constraints and the fact that there is no NVIDIA equivalent caused us to skip this test for now.


74 Comments


  • SixtyFo - Friday, September 15, 2006 - link

    So do they still use a dongle between the cards? If you had 2 xfire cards then it won't be connecting to a dvi port. Is there an adaptor? I guess what I'm asking is are you REALLY sure I can run 2 crossfire ed. x1950s together? I'm about to drop a grand on video cards so that piece of info may come in handy.
  • unclebud - Friday, September 1, 2006 - link

    "And 10Mhz beyond the X1600 XT is barely enough to warrant a different pair of letters following the model number, let alone a whole new series starting with the X1650 Pro."

    nvidia has been doing it for years with the 4mx/5200/6200/7300/whatever and nobody here said boo!
    hm.
  • SonicIce - Thursday, August 24, 2006 - link

    How can a whole X1900XTX system use only 267 watts? So a 300w power supply could handle the system?
  • DerekWilson - Saturday, August 26, 2006 - link

    generally you need something bigger than a 300w psu, because the main problem is that the current supply on both 12v rails must be fairly high.
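    The arithmetic behind that advice: if a 267 W system draw sat entirely on the 12 V rails, it would demand over 22 A of combined 12 V current, which many 300 W units of the era could not deliver. (A rough worst-case sketch; the 267 W figure is total system draw, so treating it all as 12 V load slightly overstates the requirement.)

    ```python
    def rails_current(total_watts: float, rail_volts: float = 12.0) -> float:
        """Worst-case current if the whole load sat on the 12 V rails."""
        return total_watts / rail_volts

    print(rails_current(267.0))  # -> 22.25 (amps, combined across the 12 V rails)
    ```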
  • Trisped - Thursday, August 24, 2006 - link

    The crossfire card is not the same as the normal one. The normal card also has the extra video out options. So there is a reason to buy the one to team up with the other, but only if you need to output to a composite, s-video, or component.
  • JarredWalton - Thursday, August 24, 2006 - link

    See discussion above under the topic "well..."
  • bob4432 - Thursday, August 24, 2006 - link

    why is the x1800xt left out of just about every comparison i have read? for the price you really can't beat it....
  • araczynski - Thursday, August 24, 2006 - link

    ...I haven't read the article, but i did want to just make a comment...

    having just scored a brand new 7900gtx for $330 shipped, it feels good to be able to see the headlines for articles like this, ignore them, and think "...whew, i won't have to read anymore of these until the second generation of DX10's comes out..."

    I'm guessing nvidia will be skipping the 8000's, and 9000's, and go straight for the 10,000's, to signal the DX10 and 'uber' (in hype) improvements.

    either way, its nice to get out of the rat race for a few years.
  • MrJim - Thursday, August 24, 2006 - link

    Why no Anisotropic filtering tests? Or am i blind?
  • DerekWilson - Saturday, August 26, 2006 - link

    yes, all tests are performed with at least 8xAF. Under games that don't allow selection of a specific degree of AF, we choose the highest quality texture filtering option (as in BF2 for instance).

    AF comes at fairly little cost these days, and it just doesn't make sense not to turn on at least 8x. I wouldn't personally want to go any higher without angle independent AF (like the high quality AF offered on ATI X1K cards).
