Splinter Cell: Chaos Theory Performance

Splinter Cell: Chaos Theory has the capacity to bring almost any card to its knees when you enable the HDR rendering modes. SM2.0 support has been added via the latest patch, allowing ATI cards to run the HDR rendering modes as well. We aren't looking at HDR rendering here, as it still doesn't make for an apples-to-apples comparison, but ATI owners can at least enjoy improved graphics now.

[Performance graphs: Splinter Cell: Chaos Theory at the three tested resolutions]
In our G70 review, we saw that a single 7800 GTX was able to outperform the 6800 Ultra SLI setup at all tested resolutions. Granted, it was only by a small margin in some cases, but it's impressive nonetheless. Even more surprising is that at present, a single 7800 GTX outperforms even the 7800 GT SLI configuration at the three tested resolutions. Enabling HDR rendering would, of course, change the results quite a bit.

As the numbers show, the 7800 GT is no match for the 6800U SLI in this game. At 1600x1200, the 7800 GT gets 51.7 fps, as opposed to the single 6800U's 40.2 (a 28.6% increase). The 6800U SLI setup gains another 24 frames over the GT, however, for an 89% increase over the single 6800U. At 2048x1536, the 6800U gets 22.2 fps, while the 7800 GT gets 36.2 (a 63% increase), and the 6800U SLI gets 41 fps (an 85% increase).
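The scaling percentages quoted above can be verified with a quick calculation (fps values are the ones read from the charts):

```python
# Verify the percentage-increase figures quoted above.
def pct_gain(new, old):
    """Percentage increase of `new` over `old`."""
    return (new / old - 1) * 100

# 1600x1200: single 6800 Ultra = 40.2 fps, 7800 GT = 51.7 fps
print(round(pct_gain(51.7, 40.2), 1))  # -> 28.6 (% gain for the GT over the 6800U)

# 2048x1536: 6800U = 22.2 fps, 7800 GT = 36.2 fps, 6800U SLI = 41.0 fps
print(round(pct_gain(36.2, 22.2)))  # -> 63
print(round(pct_gain(41.0, 22.2)))  # -> 85
```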

Taking into consideration the much higher performance gains with the 6800 Ultra in SLI mode, it looks to be a promising alternative to the single 7800 GT if you have the means. While you may need to upgrade your power supply, and possibly your motherboard, to fit two 6800 Ultras, this may not cost as much as you'd expect, especially considering that prices on these cards will likely be falling soon. Depending on the total cost of a particular setup, though, the 7800 GTX may be a better option.

77 Comments

  • dwalton - Friday, August 12, 2005 - link

    "I would like G70 technology on 90nm ASAP, I have a feeling Nvidia didn't do a shift to 90nm for NV40 for a reason, as that core is still based on AGP technology, and Nvidia currently doesn't have a native PCI-E part for 6800 Line, they are all using HSI on the GPU substrate from the NV45 design."

    I believe Nvidia didn't want another 5800 fiasco. They probably determined a long time ago that 110nm was a safer bet and used the 6600 as a guinea pig. Having a successful launch of the 6600 gave them confidence that manufacturing a 110nm G70 would be a painless process.

    Furthermore, the 7600 will be a midrange card and will target a market segment that is more than likely dominated by AGP boards. So an NV40-based 7600 would make perfect sense, since the majority of 7600s sold wouldn't require an HSI chip.

    "Let's face it, for the time being we're not going to be getting fully functional high-end cores at the $199 US price point with a 256-bit memory interface. So far we have gotten things like the Radeon X800, GeForce 6800, 6800 LE, X800 SE, X800 GT, etc. It just doesn't seem profitable to do so."

    The X800 GT is a 256-bit memory interface card and targets the 6600 GT segment.
  • coldpower27 - Friday, August 12, 2005 - link

    "The X800 GT is a 256-bit memory interface card and targets the 6600 GT segment."

    I guess you missed reading the "fully functional" part, as the X800 GT does not meet that criterion.

    I guess I didn't get my meaning across: when I said G70 technology, I was talking about the mainstream cards going to 90nm, not the 7800 GTX/GT.

    For a midrange part, the risk of going to 90nm would be reduced, as the core is not quite as complex. Nvidia made a safe bet by going with 110nm for their high-end cards; I am asking for a G7x-technology-based performance ($199 US) card on 90nm, not on the high end.

    Targeting PCI-E now would be a good idea, as there have been boards on both sides with PCI-E support for a decent amount of time, and it's the more forward-thinking architecture. Not to mention the possibility of power consumption on the 7600 GT being reduced enough to put it solely on the PCI-E bus if the bridge chip didn't exist. There isn't much point in designing a native AGP chip now, unless you're talking about the value segment, where margins are extremely thin per card.

    For the AGP users, I believe they can continue to use the 110nm NV48, but I would like PCI-E users to benefit from a 7600 GT as a 90nm PCI-E-native card, with possible bridging to AGP if demand calls for it. There isn't much point in calling the mainstream card a 7600 GT if it's not based on G7x technology. We don't want Nvidia to follow ATI's lead on that kind of front. :)
  • neogodless - Thursday, August 11, 2005 - link

    I mainly agree with you, and who knows, such things could be in the works. But "simple process shrink"? I get the feeling that's a contradiction!

    Let us not forget the mistakes of the past... like the FX5800 and its "simple process shrink".
  • dwalton - Thursday, August 11, 2005 - link

    The FX5800 was Nvidia's attempt to introduce a new high-end architecture on a new process (130nm) it had never used before, just like ATI is doing now. The 6xxx line (130nm) is not new tech, so producing the mature NV40 architecture on 90nm or 110nm should go a lot smoother. Even at 130nm, the NV40 is smaller than the G70 at 110nm (287mm² vs. 334mm²). Moving the NV40 to 90nm would reduce the die size to ~200mm². Look at the 6600 GT: 110nm, 150mm², 8 pipes(?) vs. a 90nm NV40 at ~200mm² with 16 pipes.
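    For what it's worth, the ~200mm² figure above is conservative relative to an ideal optical shrink, where die area scales with the square of the process ratio (real shrinks rarely achieve the ideal number):

```python
# Ideal die-area scaling for a full optical shrink: area scales with
# (new_node / old_node)^2. Real-world shrinks fall short of this, which is
# why the ~200mm² estimate above is deliberately conservative.
def shrink_area(area_mm2, old_nm, new_nm):
    """Ideal-case die area after shrinking from old_nm to new_nm."""
    return area_mm2 * (new_nm / old_nm) ** 2

nv40_130nm = 287.0  # mm², NV40 die size on 130nm (from the comment above)
print(round(shrink_area(nv40_130nm, 130, 90)))  # ideal-case result: ~138 mm²
```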

  • JarredWalton - Friday, August 12, 2005 - link

    Unless they can make a 90nm "7600GT" part backwards compatible (via SLI) with the 6800GT, NVIDIA is in a position of "damned if you do, damned if you don't." As a 6800GT owner, I'd be rather sad to suddenly have the promise of upgrading to SLI yanked away.
  • dwalton - Thursday, August 11, 2005 - link

    Also, mature driver support at introduction.
  • Sunbird - Thursday, August 11, 2005 - link

    I don't know. At least they aren't doing those ugly case reviews anymore, but they sure are still making me feel alienated.

    That first page smacks of elitism. Why can't we average people with a 5900XT (or even a 5200) upgrade to, say, a 7600 that uses less power and thus is less noisy and easier to cool than a 6600 or 6800?

    I wonder which of the two authors wrote that paragraph?

    I guess this could be a symptom of Anand and his Apple usage, because Apple people are often very elitist. Or it could be that they want to be the upmarket tech website for people with lots of money and think Tom's Hardware is better suited to us unwashed (FX 5200-wielding) masses.

    Actually, this whole new colour scheme smacks of cold, suave elitism! Not the warm, yellowish, homey feel of old....

    ;(
  • Shinei - Thursday, August 11, 2005 - link

    As has been pointed out in this very comments section, a 7600 release would be redundant because there already is a 16-pipe, 350MHz part with 6 vertex pipelines: the 6800GT. There is no elitism; it's the raw fact that a 7600GT would be identical to a 6800GT in specifications and (most likely) performance, rendering it pointless to spend time fabricating one when the 6800GT serves just as well.

    As for the article, I noticed that the 7800GT was outperformed by the 6800U in some SLI comparisons (like UT2004). Is that related to memory bandwidth, or is it a driver issue with the 77.77 beta drivers you tested with?
  • Sunbird - Friday, August 12, 2005 - link

    As has been pointed out in my very own comment: "upgrade to, say, a 7600 that uses less power and thus is less noisy and easier to cool than a 6600 or 6800?"

    And anyway, it's about the price point: will the 6800GT cost as little as the 6600GT then?

    I want 6800GT (aka 7600) performance at the $200 to $250 price point, and I want it now!

    I'd settle for a 7200 that performs like a 6600 too.
  • DerekWilson - Friday, August 12, 2005 - link

    Yes, the 6800 GT will come down to $250, and likely even lower, over the next few months. You can already buy a 6800 GT for $270 (from our realtime price engine).

    The 6800 GT is not a noisy part. The HSF solution for the 7800 GT is strikingly similar. A lower performance G70 part may run cooler and draw less power, but, again, the 6800 GT is not a power hog.

    There really is not a reason for us to want a lower-performing G70 part -- prices on 6 Series cards are falling, and that is all we need. Even if NVIDIA came out with something like a "7200 that performs like a 6600", the 6600 would probably be cheaper because people would assume the 7 means more performance -- meaning the 6600 would be a better buy.
