Splinter Cell: Chaos Theory Performance

Splinter Cell: Chaos Theory can bring almost any card to its knees when its HDR rendering modes are enabled. The latest patch adds SM2.0 support, giving ATI cards access to those HDR modes as well. We aren't looking at HDR rendering here, as it still isn't an apples-to-apples comparison, but ATI owners can at least enjoy the improved graphics now.

[Benchmark graphs: Splinter Cell: Chaos Theory frame rates at each of the three tested resolutions]
In our G70 review, we saw the 7800 GTX outperform the 6800 Ultra in SLI mode at all tested resolutions. Granted, it was only by a small margin in some cases, but it's impressive nonetheless. Even more surprising is that, at present, a single 7800 GTX outperforms even the 7800 GT SLI configuration at all three tested resolutions. Enabling HDR rendering would, of course, change the results quite a bit.

As the numbers show, the 7800 GT is no match for the 6800 Ultra SLI in this game. At 1600x1200, the 7800 GT gets 51.7 fps to the 6800 Ultra's 40.2 (a 28.6% increase). The 6800 Ultra SLI setup, however, gains another 24 frames over the GT, an 89% increase over the single card. At 2048x1536, the 6800 Ultra gets 22.2 fps, while the 7800 GT gets 36.2 (a 63% increase) and the 6800 Ultra SLI gets 41 fps (an 85% increase).
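For reference, every percentage here is a simple relative increase over the single 6800 Ultra's frame rate. A short Python snippet reproduces them from the numbers quoted above (the ~75.7 fps SLI figure at 1600x1200 is inferred from "another 24 frames over the GT"; the 88% vs. 89% difference comes from rounding in the published fps values):

    # Percent increase of new_fps over old_fps
    def pct_gain(old_fps, new_fps):
        return (new_fps - old_fps) / old_fps * 100

    # 1600x1200: 6800 Ultra 40.2 fps, 7800 GT 51.7 fps, 6800U SLI ~75.7 fps
    print(f"{pct_gain(40.2, 51.7):.1f}%")  # 28.6% - 7800 GT over 6800 Ultra
    print(f"{pct_gain(40.2, 75.7):.0f}%")  # ~88% - 6800U SLI over 6800 Ultra

    # 2048x1536: 6800 Ultra 22.2 fps, 7800 GT 36.2 fps, 6800U SLI 41 fps
    print(f"{pct_gain(22.2, 36.2):.0f}%")  # 63% - 7800 GT over 6800 Ultra
    print(f"{pct_gain(22.2, 41.0):.0f}%")  # 85% - 6800U SLI over 6800 Ultra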

Considering the much larger performance gains from the 6800 Ultra in SLI mode, it looks like a promising alternative to the single 7800 GT if you have the means. You may need to upgrade your power supply, and possibly your motherboard, to fit two 6800 Ultras, but this may not cost as much as you'd expect, especially since 6800 Ultra prices will likely be falling soon. Depending on what a given setup costs, though, the 7800 GTX may be the better option.

77 Comments

  • Quiksel - Thursday, August 11, 2005 - link

    Like I mentioned in one of the other articles:

    "(1) I understand that taking new tech and reviewing it on launch day, etc., is important. (2) Then comes the mass production of the tech by different manufacturers, so there's a need for the readers to be informed on the differences between the different products. (3) Then there's the difference between the interim releases after the initial launch of the new tech that also need reviewing and explanation. From those three different times of a piece of new tech, I would typically expect 3 articles or so for each piece of said new tech. From my initial post, I have just been surprised that what seems to be happening are lots of reviews centered around the second phase of your review cycle, and so that's why I was asking whether this is really what readers want to see on AT all the time (i.e., $500 graphics cards to ogle and wish a relative would die so that we could afford it)."

    "Can't tell you how weird I felt last night to read the new article about the $3000 desk. I guess it helps to have some off-the-wall review about such a nice piece of desk. But is that really what the readers want to see? More hardware that they can't afford? One poster above me here mentioned that you've lost touch with your readers, and sometimes, I wonder whether you're really just trying to fill a niche that no one else is really pursuing in an effort to either drive the industry in that direction or just cater to a crowd that may or may not even visit here. Who knows. I sure got confused with such an article. These 7800GTX articles have done the same for me."

    "I don't know what to tell ya to do, because I'm not in your position. But I certainly don't feel as at home on this site as I used to. Am I getting too old to appreciate all this nice shiny new expensive hardware?? :)"

    4 out of the last 5 articles on AT are all this high-end tech! Where's the sweet spot? The budget? ANYTHING ELSE BUT THE HIGH-END??

    flame away, thanks :)
  • coldpower27 - Thursday, August 11, 2005 - link

    What else is there to review? I mean, it's not like Nvidia has released the 7600 series yet, and the RV530 is nowhere to be found either. Typically the high-end hardware is what's new, and remember, AnandTech did review the Athlon 64 X2 3800+. Though I would like to see a review of the recently announced Sempron 3400+, and I'd also like to see how the new Celeron D 351 stacks up.

    I'm not sure it's all that interesting to review the same video card over and over again, either: a reference 6600 GT vs. a new one with a more advanced heatsink, then a new one with a better software bundle, etc...

  • JarredWalton - Friday, August 12, 2005 - link

    I have my doubts as to whether a 7600-type card will even *BE* launched in the next six months. Think about it: why piss off all the owners of 6800GT cards by releasing a new card that isn't SLI-compatible with theirs? From a customer support standpoint, it's better to keep the older SLI-capable cards in production and simply move them to the mid-range and value segments, which is exactly what NVIDIA did with the 6800 and 6800GT with this launch. Now if the 6800U would just drop to $350, everything would be about right.
  • jkostans - Thursday, August 11, 2005 - link

    The 7800GT is slightly slower than a 6800 Ultra SLI setup, and the GTX is on par or faster. The GT AND the GTX cost less than the additional 6800 Ultra needed to upgrade to SLI, so SLI is rather useless. Why opt for an extra power-hungry 6800 Ultra when you can just swap in a lower-power 7800 GT, or a better-performing and lower-power GTX, for less money? This will happen with the 7800 GTX SLI setup too. SLI should only be a consideration as an initial buy (for rich gamers who want the absolute best), not as an upgrade path for later. Gotta love nVIDIA "rendering" their own technology useless, lol!
  • JNo - Thursday, August 11, 2005 - link

    Hear, hear! Good point, well made, and I think intelligent people realised this from the off. Let me think - 2x 6800U dustbusters causing a racket, or 1 new 7800GT(X)...
  • Anemone - Thursday, August 11, 2005 - link

    Hi there

    I'd like to suggest using 1920x1200 for the high-res tests. The popularity of widescreen gaming (where possible) is growing, and it's a more commonly used "extreme resolution" than 2048x1536, and thus, imo, a bit more relevant.

    Just my $.02

    Thanks
  • JNo - Thursday, August 11, 2005 - link

    I second this motion for 1920x1200!! Why test at 2048x1536 when most people who could afford these monitors (albeit CRTs) would likely go for widescreen instead? Slightly fewer pixels but better visual impact... (nb I love watching other CS players not spotting an enemy on the periphery of my screen, presumably cos their monitors are not widescreen!)
  • adonn78 - Thursday, August 11, 2005 - link

    First off, no gamer plays video games at resolutions above 1600x1200! Most of us stick to 1024x768 so that we can get high framerates and enable all the features and play the game on the highest settings. In addition, you did not show how the GT and GTX stacked up against the previous generation, such as the 6800 Ultra, the 6800 GT, and the 5950 Ultra. And where is the AGP version? My computer is 2 years old and I am upgrading my graphics card soon. I guess I'll wait to see if ATI makes AGP cards for their next generation. And where the heck is the R520? ATI is really lagging this time around. Hopefully we will get some AGP love. AGP still has a good 2 years of life left in it.
  • DerekWilson - Thursday, August 11, 2005 - link

    I play games at 1920x1080 and 1920x1200 depending on what room I'm in ... and I like to have at least 8xAF on and 4xAA if I can.

    When I'm not playing at those resolutions, I'm playing at 1600x1200 with 4xAA/8xAF, period. Any lower than that and I feel like I'm back in 1996.

    But that may just be me :-)

    If I ran benchmarks at 1024x768, no matter the settings, all these cards would give me the same number (barring EverQuest 2 on extreme quality, which would probably still be slow).

    I also play with vsync on so I don't get tearing ... but we test with it off so we can remove the limits and see the cards' potential. (There's a quick sketch of the vsync trade-off after the comments below.)
  • neogodless - Thursday, August 11, 2005 - link

    Hey, that's good to know about the vsync... back when I played Doom 3, I noticed some of that, but didn't know much about it. I just felt "robbed" because my GeForce 6800GT was giving me tearing... thought maybe it couldn't keep up with the game. But everywhere I went I saw people saying "Vsync off! Two legs good!"
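
To illustrate the trade-off those last two comments describe: vsync locks buffer swaps to the monitor's refresh, which eliminates tearing but caps the frame rate at the refresh rate, which is exactly why reviewers benchmark with it off. Below is a minimal sketch of the toggle in Python using pygame 2.x; this is purely an illustration (neither the game nor our benchmarks use pygame):

    import pygame

    # vsync=1 synchronizes buffer swaps with the monitor's refresh:
    # no tearing, but the frame rate is capped at the refresh rate.
    # Benchmarks use vsync=0 so the card can render as fast as it can.
    VSYNC = 0  # 0 for benchmarking, 1 for tear-free play

    pygame.init()
    screen = pygame.display.set_mode((800, 600), pygame.SCALED, vsync=VSYNC)

    clock = pygame.time.Clock()
    for _ in range(1000):        # render 1000 frames, then report
        pygame.event.pump()      # keep the window responsive
        screen.fill((0, 0, 0))   # trivial "frame" to draw
        pygame.display.flip()    # swap buffers; blocks on refresh if vsync=1
        clock.tick()             # timestamp the frame for fps accounting
    print(f"fps over recent frames: {clock.get_fps():.1f}")
    pygame.quit()

With VSYNC = 1 the reported figure should sit at the display's refresh rate; with VSYNC = 0 it runs uncapped, which is the number a card review cares about.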
