Test Settings

We tested Double Agent on a variety of cards from both ATI and NVIDIA, all of which support SM3.0. The ATI cards we tested were the X1300 XT, X1650 Pro, X1650 XT, X1950 Pro, X1900 XT 256MB, and finally the X1950 XTX. From NVIDIA, we tested the 7300 GS, 7300 GT, 7600 GS, 7600 GT, 7950 GT, and 7900 GTX. Because the game lacks SLI support, the 7950 GX2, which is effectively single-card SLI, was unable to perform as it should; the GX2 saw frame rates lower than our single 7950 GT, so it was excluded from our tests. NVIDIA's 8800 cards were also excluded due to rendering errors during gameplay. While the game did run on our 8800, the Sam Fisher model was warped and twisted into a very strange shape, making gameplay all but impossible, as the image below shows. Hopefully we will see a patch for the game soon that addresses this, along with the SLI/CrossFire issues and various other bugs.



Since Double Agent is still mainly a game of stealth, a very high frame rate is not necessary for a decent gameplay experience. We found that an average frame rate of no less than 15fps was smooth enough to provide good gameplay, but anything less and the game starts to get choppy. While 15fps was still playable, 25fps and higher is definitely preferred. Here is the test system we used for these benchmarks.

System Test Configuration
CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboard: EVGA nForce 680i SLI / Intel BadAxe
Chipset: NVIDIA nForce 680i SLI / Intel 975X
Chipset Drivers: NVIDIA nForce 9.35 / Intel 7.2.2.1007
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 6.10 / NVIDIA ForceWare 96.97 / NVIDIA ForceWare 91.47 (G70 SLI)
Desktop Resolution: 2560 x 1600 - 32-bit @ 60Hz
OS: Windows XP Professional SP2

Comments

  • frostyrox - Wednesday, December 13, 2006 - link

The PC gaming scene is slowly becoming a joke, and this is coming from an avid PC gamer. Nvidia and ATI release 10 different tiers of cards, completely ripping off all of us, because only 2 out of the 10 cards can actually play games well and last at least a year before they force another upgrade down our throats. I'm not buying it anymore. And Ubisoft is releasing games that don't have any support for Shader 2.0 cards (Rainbow Six Vegas and Double Agent) when many, many people are still using these cards because they're really not that old or slow. And THEN the games come out buggy as hell because they were designed for consoles and weren't properly optimized for PCs. Anyone else notice Rainbow Six Vegas PC has a PATCH out before the gamespot.com review is even up for the game? Hahaha. The PC gaming scene is a joke, and the joke's on all of us. The question is whether us gamers are gonna take it anymore. I'm not.
  • frostyrox - Wednesday, December 13, 2006 - link

    I'd also like to point out websites like Tomshardware and Anandtech fully know that the only reason Oblivion runs like a total turd on every videocard configuration available is because it was poorly ported over to PC. It has literally NOTHING to do with the game being "a true test for videocards" or "amazingly NASA advanced graphics LOL". But instead of being real about the whole thing, toms and anand try their hardest to not upset the bigwigs and bring attention to this fact. I suppose so they can keep getting their free test hardware and other support for their site. It's all good. Any monkey can clearly look at the game and see the truth. Microsoft doesn't care about gamers. About the only thing they do care about is "beating sony and nintendo" (which they wont, and will never ever do). This is exactly why Oblivion was an extremely rushed title full of bugs, glitches and overall turd performance. I'm finished ranting. Have a Nice Day.
  • lemonadesoda - Sunday, December 10, 2006 - link

    What on earth is the reviewer doing by testing different cards BUT ON a very very high end CPU? I really cannot imagine ANYONE with such a CPU using a low end card.

The tests are not helpful for the typical user. It would have been much better to do the tests with a typical CPU (e.g. P4 or D at 3.0GHz) with all these cards. That way the typical user gets an idea how the game will perform on their EXISTING system or with a GPU upgrade.

    Alternatively, take a typical GPU, say X800 or X1650 or X1950 and test with different CPUs, e.g. P4 3.0 and CD 2.0, and C2D 3.0 to get an idea how the game will perform on a typical PC or with a CPU upgrade.
  • Josh Venning - Sunday, December 10, 2006 - link

    Thanks for the comment. For this review, our focus was on how Double Agent performs across different graphics cards. A faster CPU gives us more flexibility when testing, because with a slower CPU we wouldn't be able to see the real difference in how high end graphics cards handle the game. For lower end graphics cards, a slower CPU won't have as much of an impact because the game will already be GPU limited rather than CPU limited. We may see slightly lower results, but really the only thing a slower CPU would do is obscure the difference between graphics cards. This is how we have approached all of our graphics hardware reviews over the past few years, and how we will continue to test graphics cards in the future. The idea is to eliminate as many other bottlenecks as possible so we can look at the capabilities of the hardware we are trying to study.

    Double Agent CPU performance is definitely something we could look at in a future article, but we will be waiting for Ubisoft to fix some of the problems that make this game difficult to test.

    Obviously, when making a buying decision, all aspects of a system must be taken into account. We can't review every possible system (the combinations are way too numerous), but we can review a huge number of individual components and know where the bottleneck would be before we build a system.
  • Xcom1Cheetah - Saturday, December 9, 2006 - link

    Can the power requirements of the GPUs be checked alongside the tests? Just wanted to know how much difference there is between the 7900 GS and X1950 with respect to power requirements...

    Btw very well covered article...
  • Rand - Friday, December 8, 2006 - link

    It would have been nice to see some GeForce 6 series graphics cards tested; they're still in a considerable number of systems and are SM 3.0 capable.

    I'm also rather disappointed only one processor was tested, I think it would be worthwhile to get a gauge of CPU dependency in the game especially as related to the individual graphics cards.
  • JarredWalton - Friday, December 8, 2006 - link

    Typically we either do a look at GPU performance with one CPU, or a look at CPU performance with one GPU (usually after determining the best GPU for a game). Benching a selection of GPUs and CPUs all at the same time is simply impractical. Running four resolutions, two levels, and two/three detail settings with 10 GPUs already means doing about 200 test configurations (give or take). Now if you wanted to test those with 5 CPUs....

    Anyway, maybe Josh can look at a separate CPU scaling article in the near future if there's enough interest in that. If SCDA becomes part of our standard benchmark suite, it will also be covered with CPU launches in the future. More likely is that we will use R6 Las Vegas instead (if we add something new from the Clancy game world).
  • poohbear - Friday, December 8, 2006 - link

    Why did AnandTech choose this game to benchmark? It doesn't exactly stand out as a graphically intensive game, especially since the first Unreal Engine 3 game is coming out in a few days (Rainbow Six: Las Vegas. I know RoboBlitz is the first game, but it hardly demonstrates what UE3 is capable of). I'd much rather see benchies for Rainbow Six: Las Vegas, which will show us firsthand what kind of hardware is needed for the next year. Just my 2 cents.
  • Josh Venning - Friday, December 8, 2006 - link

    Actually, we are planning to review Rainbow Six Las Vegas when we can get a hold of it, so good suggestion. :-) Double Agent may not be the most graphically intensive game ever released, but it's still a fairly high-profile release and we wanted to keep our readers informed about its performance.
  • imaheadcase - Friday, December 8, 2006 - link

    Clearly not from the screenshots, graphics don't look like anything.
