Tomb Raider: Angel of Darkness

Tomb Raider: Angel of Darkness is one of the first playable titles to use DX9's pixel shader 2.0. The title has a built-in benchmark, but it auto-detects the hardware and selects what it considers the optimal quality settings for game play. In this case, the GeForce FX Go5650 was auto-detected, and the game selected lower quality settings than it did on the Mobility Radeon 9600. We ran the benchmark at four different settings to give an idea of the different code paths and the respective ability of each graphics processor to run through each scenario. As the character (Lara Croft) ran through the pipes and waded through water, the image quality of each scenario reflected the settings we chose.
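
For context, the auto-detection the game performs comes down to querying Direct3D 9 device capabilities. Below is a minimal sketch of our own (not Core Design's actual code, and hypothetical in its details) of how a DX9-era title might pick a rendering code path from the reported pixel shader version; it assumes the DirectX 9 SDK headers and import library:

    // Sketch: choose a rendering code path from the GPU's reported
    // pixel shader version, the same capability this game keys off.
    #include <d3d9.h>
    #include <stdio.h>

    int main()
    {
        IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d)
            return 1;

        D3DCAPS9 caps;
        d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

        // Take the highest code path the hardware claims to support.
        if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
            printf("DX9 code path (pixel shader 2.0)\n");
        else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
            printf("DX8 code path (pixel shader 1.x)\n");
        else
            printf("Fixed-function fallback\n");

        d3d->Release();
        return 0;
    }

A title hangs its higher-quality water and lighting effects off the first branch, which is why the same benchmark can render a visibly different scene on different hardware.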



As we bump up to higher and higher code paths, the differential between the two mobile graphics processors increases as well. The Mobility Radeon 9600 takes a 51%, 88%, 96%, and 181% lead, respectively. When we hit the DX9 code path, the scores on both ends become extremely low, and actual game play becomes unreasonable. While Tomb Raider: Angel of Darkness is a DX9 title, it is still nothing like the use of DX9 in Half-Life 2. Read on to see those benchmarks.
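
For reference, the lead figures quoted in this head-to-head are relative frame-rate ratios rather than absolute differences. A quick sketch of the arithmetic, using hypothetical frame rates purely for illustration (the measured numbers are in the charts):

    #include <stdio.h>

    // Percentage lead of card A over card B: how much faster A is,
    // expressed relative to B's frame rate.
    static double lead_percent(double fpsA, double fpsB)
    {
        return (fpsA / fpsB - 1.0) * 100.0;
    }

    int main(void)
    {
        // Hypothetical numbers: a 181% lead means roughly 2.81x the frame rate.
        printf("%.0f%% lead\n", lead_percent(28.1, 10.0)); // prints "181% lead"
        return 0;
    }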

Comments (47)

  • Anonymous User - Monday, September 15, 2003 - link

    #25: those benchmarks were performed at 1280x1024 with 4x AA and 8x AF. They provide a better theoretical test than lower settings would, because the test is more GPU-limited than CPU-limited; if you wanted to play the game at a good frame rate, you would not be using such high settings.

    However, I agree we need to see real-world numbers too: what settings are necessary to get reasonable frame rates (say, 40-50 FPS) out of this?
  • Anonymous User - Monday, September 15, 2003 - link

    #18: what are you talking about? Isn't that the opposite of what I (#17) said? Or did a post get deleted, causing the numbers to get out of sync?

    #19 Andrew: so you admit these numbers don't actually tell you how the game will perform using the "appropriate" code path for the 56x0? Even though that path reduces image quality, and so shouldn't really be compared directly to the Radeon, it would still be nice to see real-world numbers for the sake of comparison.
  • Anonymous User - Monday, September 15, 2003 - link

    "The scores that we achieved in AquaMark 3 are similarly reminiscent of our scores in Half-Life 2 but without such large margins. In AquaMark 3, the GeForce FX Go5650 achieves sub 10 fps scores in all but one of the scenarios. Meanwhile, the Mobility Radeon 9600 on the average is situated in the mid teens. Minimally, though, the Mobility Radeon 9600 shows its clear lead over the GeForce FX Go5650 with a 58% lead. At its best, the Mobility Radeon 9600 doubles the margin between its counterpart, and this just reinforces the GeForce FX Go5650’s trouble in true DX9 benchmarks."
    I really think there is a misinterpretation of the AquaMark 3 numbers. What is the point of one GPU outperforming the other by up to 58% if neither of them can push numbers above 24 fps? The reviewer should have noted that neither of these solutions will do when it comes to real DX9 games, even at low quality settings.
    The honest recommendation would be: wait for the next generation of DX9 mobile chips, because there is no such thing as a true DX9 mobile solution from either NVIDIA or ATI.
  • dvinnen - Monday, September 15, 2003 - link

    <<<We are currently revising our graphics benchmark suite in anticipation of future DX9 titles. These two GPUs are full DX9 parts, and we are benchmarking them accordingly. UT2003 and our current line of benchmarking titles are DX8, and therefore aren't specifically appropriate for this context. Why are our choices of benchmark titles odd? The Mobility and Go mobile graphics parts are no more than mobile versions of desktop processors (clocked down, with better power management features, and, in the M10's case, an integrated memory package).>>>

    I understand that. But the whole suite doesn't have to be DX9 to get an idea of how it will play. I agree with HL2, Warcraft 3, and Splinter Cell, because lots of people play them (or, in the case of HL2, will play). AquaMark3 is also a good choice. But not all games are going to be DX9. OpenGL is still a viable choice: Doom 3 is going to use it, and many games will be built on that engine in the coming years. I also brought up UT2003 because lots of people play it. Quake 3 is rather useless now, I agree; it became outdated long ago, and now people are pushing 500 fps in it. But an OpenGL benchmark (like RtCW:ET; yes, I know it's still based on the Quake 3 engine, but you don't get the insane FPS) would be appreciated.
  • Andrew Ku - Monday, September 15, 2003 - link

    #20: Please look at our test configuration page. The drivers we used were the newest drivers available for testing as of the time of the head-to-head. Remember, mobile drivers need to be validated by the mobile system vendor, not the graphics part vendor.
  • Anonymous User - Monday, September 15, 2003 - link

    No drivers are going to make up a 400% differential (!). Nvidia had better get their act together for NV4x.
  • Anonymous User - Monday, September 15, 2003 - link

    Ugghh...
    I just purchased a Dell Inspiron 8600 with the 128mb Geforce Go 5650. Was looking forward to having a mobile platform to play some of the upcoming games. How disappointing to realize that the 5650 just won't be up to snuff. Nvidia should be ashamed of themselves...

    I would have liked to wait until an ATI Radeon 9600 came out for a Dell system, but I got a good deal on the laptop, and the 9600 just doesn't seem to have wide distribution yet except in some very expensive custom laptops. Maybe I'll be able to swap out my Nvidia card for an ATI card when it becomes available from Dell?
  • Anonymous User - Monday, September 15, 2003 - link

    I would like to know if this was done using the Rel 50 drivers that haven't been publicly released. I remember hearing that those drivers were made for DX9 games (I think) and that NVIDIA stopped work on DX9 in the current drivers months ago. I really think that would skew the results, and these benchmarks would do about as much good as the HL2 benchmarks posted a couple of days ago (Sept 12th, I think).

    Also, that comment on OpenGL makes me think.

    Don't get me wrong, NVIDIA isn't looking too good. But if the drivers aren't the most recent ones for their card, I'd like to know how this is a head-to-head test?
  • Andrew Ku - Monday, September 15, 2003 - link

    #17: The article was subtitled "Taking on DX9." Therefore we benchmarked in DX9 as we stated on the Half-Life 2 page.
  • Anonymous User - Monday, September 15, 2003 - link

    #17: Yes you are correct. We should all run our monitors at 30Hz too, any more is a waste.
