Splinter Cell

We have used Splinter Cell briefly in the past, but the developer does not recommend enabling AA or AF because of the way the game was designed. Even so, the game is still very detailed and takes its toll on even the newest hardware.



We ran the cards through all benchmark scenarios, but to avoid repetition, we will only include scores from two of them. As we push up the resolution, the Mobility Radeon 9600 extends its 10% lead at 1024x768 by another 6%. At 1600x1200, this scenario drops below 20 fps on both graphics processors, and gameplay is, for the most part, unreasonable.

The largest difference between the two cards ended up being 28% at 1600x1200, but neither card dropped below 10 fps in any of the scenarios.
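
The leads quoted throughout this section are simple relative frame-rate differences. Below is a minimal sketch of that arithmetic, assuming the lead is taken against the slower part's score; the frame rates in it are made-up placeholders, not our measured results.

```c
/* Sketch of the percentage-lead arithmetic, assuming the lead is the
 * relative difference taken against the slower part's score.
 * The frame rates below are hypothetical placeholders. */
#include <stdio.h>

int main(void)
{
    double radeon_fps  = 24.0;   /* hypothetical Mobility Radeon 9600 score */
    double geforce_fps = 18.75;  /* hypothetical GeForce FX Go5650 score    */

    double lead = (radeon_fps - geforce_fps) / geforce_fps * 100.0;
    printf("lead: %.0f%%\n", lead);   /* prints "lead: 28%" */
    return 0;
}
```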



We should note that the GeForce FX Go5650 was able to narrow the gap with the Mobility Radeon 9600 in one of the benchmark scenarios. However, the difference between the two started at only 6% at 1024x768 and dropped to 3% at 1600x1200.

One glitch that we stumbled upon in this title was SpeedStep interfering with the internal benchmark counter. With SpeedStep enabled, the game derives the wrong processor frequency and therefore reports incorrect fps. The developer has since been informed of this.
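
As a hedged illustration of the likely mechanism (not the game's actual code): if the counter converts elapsed CPU cycles to seconds using a frequency sampled once at startup, any SpeedStep transition afterwards skews the derived time and, with it, the reported fps.

```c
/* Hypothetical sketch of why a cycle-counter-based fps counter breaks
 * under SpeedStep: the conversion from cycles to seconds assumes a
 * fixed core frequency, but dynamic clocking changes the tick rate.
 * Names and values here are illustrative, not taken from the game. */
#include <stdint.h>
#include <stdio.h>
#include <x86intrin.h>   /* __rdtsc() on x86 GCC/Clang */

#define ASSUMED_HZ 1600000000ull   /* frequency detected once at startup */

static double cycles_to_seconds(uint64_t cycles)
{
    /* Wrong once SpeedStep drops the core to, say, 600 MHz: the same
     * wall-clock second now yields far fewer cycles, so elapsed time is
     * underestimated and fps is over-reported. */
    return (double)cycles / (double)ASSUMED_HZ;
}

int main(void)
{
    uint64_t start = __rdtsc();
    /* ... render a batch of frames here ... */
    int frames_rendered = 100;          /* placeholder */
    uint64_t end = __rdtsc();

    double seconds = cycles_to_seconds(end - start);
    printf("reported fps: %.1f\n", frames_rendered / seconds);
    /* A frequency-invariant timer (e.g. QueryPerformanceCounter on Windows
     * or clock_gettime(CLOCK_MONOTONIC, ...)) avoids this problem. */
    return 0;
}
```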

Comments

  • Anonymous User - Monday, September 15, 2003 - link

    #25, those benchmarks were performed at 1280x1024 with 4x AA and 8x AF. They provide a better theoretical test than lower settings because the test will be more GPU-limited than CPU-limited: if you wanted to play the game at a good frame rate, you would not be using such high settings.

    However, I agree we need to see real-world numbers too: what settings are necessary to get reasonable frame rates out of this (say 40-50 FPS)?
  • Anonymous User - Monday, September 15, 2003 - link

    #18 what are you talking about? Isn't that the opposite of what I (#17) said? Or did a post get deleted to cause numbers to get out of sync?

    #19 Andrew, so you admit these numbers don't actually tell you how the game will perform using the "appropriate" code path for the 56x0? Even though that path reduces image quality and so shouldn't really be compared directly to the Radeon, it would still be nice to see real-world numbers for the sake of comparison.
  • Anonymous User - Monday, September 15, 2003 - link

    "The scores that we achieved in AquaMark 3 are similarly reminiscent of our scores in Half-Life 2 but without such large margins. In AquaMark 3, the GeForce FX Go5650 achieves sub 10 fps scores in all but one of the scenarios. Meanwhile, the Mobility Radeon 9600 on the average is situated in the mid teens. Minimally, though, the Mobility Radeon 9600 shows its clear lead over the GeForce FX Go5650 with a 58% lead. At its best, the Mobility Radeon 9600 doubles the margin between its counterpart, and this just reinforces the GeForce FX Go5650’s trouble in true DX9 benchmarks."
    I really think there is a misinterpretation of the AquaMark 3 numbers. What is the point of one GPU outperforming the other by up to 58% if neither of them can push numbers above 24 fps? The reviewer should have noted that neither of these solutions will do when it comes to DX9 games, even at low quality settings.
    The honest recommendation would be: wait for the next-gen DX9 mobile chips, because there is no such thing as a true DX9 mobile solution from either Nvidia or ATI.
  • dvinnen - Monday, September 15, 2003 - link

    <<<We are currently revising our graphics benchmark suite in the anticipation of future DX9 stuff. These two GPUs are full DX9 parts, and we are benchmarking them accordingly. UT2003 and our current line of benchmarking titles are DX8, and therefore aren't specifically appropriate for this context. Why are our choices of benchmark titles odd? The Mobility and Go mobile graphics parts are no more than mobile version of desktop processors (clocked down, better power management features and in the M10 case integrated memory package).>>>

    I understand that. But the whole suite doesn't have to be DX9 to get an idea of how it will play. I agree with HL2, Warcraft3, and Splinter Cell, because lots of people play them. (Or in the case of HL2, will play.) AquaMark3 is also a good choice. But not all games are going to be DX9. OpenGL is still a viable choice. Doom3 is going to use it, and many games will be using the engine in the coming years. I also brought up UT03 because lots of people play it. Quake3 is rather useless now, I agree. It became outdated long ago and now people are pushing 500 fps on it. But an OpenGL benchmark (like RtCW:ET, and yes I know it's still based on the Quake3 engine, but you don't get the insane FPS) would be appreciated.
  • Andrew Ku - Monday, September 15, 2003 - link

    #20: Please look at our test configuration page. The drivers we used are the newest available drivers for testing, as of the time of the head-to-head. Remember, mobile drivers need to be validated by the mobile system vendor, not the graphics part vendor.
  • Anonymous User - Monday, September 15, 2003 - link

    No drivers are going to make up a 400% differential (!). Nvidia had better get their act together for NV4x.
  • Anonymous User - Monday, September 15, 2003 - link

    Ugghh...
    I just purchased a Dell Inspiron 8600 with the 128MB GeForce Go 5650. I was looking forward to having a mobile platform to play some of the upcoming games. How disappointing to realize that the 5650 just won't be up to snuff. Nvidia should be ashamed of themselves...

    I would have liked to wait until an ATI Radeon 9600 came out for a Dell system, but I got a good deal on the laptop, and the 9600 just doesn't seem to have wide distribution yet except in some very expensive custom laptops. Maybe I'll be able to swap out my Nvidia card for an ATI card when it becomes available from Dell?
  • Anonymous User - Monday, September 15, 2003 - link

    I would like to know if this was done using the Rel 50 drivers that aren't "publicly released". I remember hearing comments that those drivers were made for DX9 games (I think) and that NVIDIA stopped DX9 work on the current drivers months ago. I really think that would skew the results, and these benchmarks do about as much good as the HL2 benchmarks posted a couple of days ago. (Sept 12th I think)

    Also, that comment on openGL makes me think.

    Don't get me wrong, NVIDIA isn't looking too good,
    but if the drivers aren't the most recent ones for their card, I'd like to know how it's a head-to-head test?
  • Andrew Ku - Monday, September 15, 2003 - link

    #17: The article was subtitled "Taking on DX9." Therefore we benchmarked in DX9 as we stated on the Half-Life 2 page.
  • Anonymous User - Monday, September 15, 2003 - link

    #17: Yes you are correct. We should all run our monitors at 30Hz too, any more is a waste.
