Test Setup

As with our last game article, we grouped the performance tests into three categories: low-end, mainstream/midrange, and high-end graphics performance. However, we only have one benchmark for Rainbow Six: Vegas, which means fewer numbers overall. In two of the three sections, low-end and mainstream/midrange, we have results for the game at both the highest and lowest quality settings. We chose the same types of cards from ATI and NVIDIA as we did for Double Agent, because they cover a broad spectrum of current-generation cards at different performance levels.

The NVIDIA cards we tested Vegas with are the 7300 GT, 7600 GS, 7600 GT, 7900 GS, 7950 GT, 7900 GTX, and the 8800 GTS and GTX. From ATI, we have the X1300 XT, X1650 Pro, X1650 XT, X1900 XT 256, and X1950 XTX. We are happy to report that, unlike with Double Agent, Rainbow Six: Vegas runs on the 8800 without any strange graphical artifacts. Because the game doesn't yet officially support SLI, the 7950 GX2 doesn't deliver the kind of performance it should, so it was omitted from our tests. We would very much have liked to see how quad SLI handled the game, but for now we will have to wait and hope that a patch or driver update enables it. The 7300 GS performed so poorly that it was also excluded, and needless to say we don't recommend trying to play Rainbow Six: Vegas on that card. In fact, any current card that costs under $125 is going to struggle unless you run at the lowest quality settings and a low resolution.

Here is the system we used for our performance tests.

System Test Configuration
CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboards: EVGA nForce 680i SLI; Intel BadAxe
Chipsets: NVIDIA nForce 680i SLI; Intel 975X
Chipset Drivers: Intel 7.2.2.1007; NVIDIA nForce 9.35
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Cards: Various
Video Drivers: ATI Catalyst 6.10; NVIDIA ForceWare 96.97; NVIDIA ForceWare 91.47 (G70 SLI)
Desktop Resolution: 2560 x 1600 - 32-bit @ 60Hz
OS: Windows XP Professional SP2

High-End Performance

As with other games like Oblivion and Double Agent, Rainbow Six: Vegas performs better on ATI cards than on NVIDIA cards at the same price point. This could change as drivers are updated and game patches are released. Keep in mind that in our Vegas benchmark, an average of a little over 20 FPS translates into relatively smooth frame rates through most of the game. However, when the action gets heated, you will experience choppiness on cards that score less than roughly 30 FPS in our particular benchmark scenario.
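
To put those cutoffs in concrete terms, here is a minimal, hypothetical C++ sketch (not our actual benchmark harness, and the frame times are made-up values) showing how an average frame rate is derived from logged frame times and then bucketed against the rough 20 and 30 FPS thresholds described above.

#include <iostream>
#include <numeric>
#include <vector>

// Average FPS = number of frames / total elapsed time in seconds.
double averageFps(const std::vector<double>& frameTimesMs)
{
    double totalMs = std::accumulate(frameTimesMs.begin(), frameTimesMs.end(), 0.0);
    return frameTimesMs.size() / (totalMs / 1000.0);
}

int main()
{
    // Illustrative frame times (milliseconds) from a hypothetical benchmark run.
    std::vector<double> run = {48.0, 52.5, 45.1, 50.2, 47.8};
    double fps = averageFps(run);

    // Rough cutoffs from the text: a bit over 20 FPS average is playable through
    // most of the game, while scores under ~30 FPS can stutter in heavy firefights.
    if (fps < 20.0)
        std::cout << fps << " FPS: expect choppiness throughout\n";
    else if (fps < 30.0)
        std::cout << fps << " FPS: mostly smooth, but may stutter when the action heats up\n";
    else
        std::cout << fps << " FPS: smooth\n";
    return 0;
}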

[Graph: High End benchmark results]

These results make it easy to see how far NVIDIA's 8800 series pulls ahead of the rest of the competition. ATI doesn't currently have a direct competitor to the 8800, so those wanting the fastest performance in this or basically any other game will have to go with NVIDIA. (Don't plan on running the Vista beta/release candidate on 8800 cards right now, however, as the Vista drivers are still not finalized.) As we said earlier, the 8800 GTS and GTX are the only two cards that can really run the game smoothly at the highest resolution with the highest quality settings enabled.

The X1950 XTX almost runs the game smoothly at the highest settings, and with some overclocking it stands a good chance of handling Vegas perfectly well at maximum detail and 1600x1200. The 7900 GTX is in a similar position: as powerful as it is, it can't quite manage acceptable performance at 1600x1200 at reference speeds, but one resolution down it looks and plays fine.

Comments

  • ariafrost - Monday, December 25, 2006 - link

    Well, forget about running it on my X850XT; apparently RSV *requires* a Pixel Shader 3.0 video card. If anyone could confirm/deny that information it'd be great, but for now it looks like a lot of ill-informed customers may end up buying a game their "128MB/256MB" video cards can't support.
  • justly - Monday, December 25, 2006 - link

    quote:

    It's very evident looking at all of these tests how Rainbow Six: Vegas tends to favor ATI hardware, but again, keep in mind that because of patches and updates this may not (and hopefully won't) be the case for long.


    Anandtech always seems to have a problem whenever it can't recommend NVIDIA as the best solution in every scenario. What is so wrong with the idea that ATI hardware performs better than NVIDIA hardware of the same generation? Maybe I'm mistaken, but I thought even Anandtech expected ATI might do better in newer games.
    Personally I'm not much of a gamer so it really doesn't matter to me, but for the sake of the people using your articles to choose hardware, why give them expectations that might not materialize?

    Maybe because I am not engrossed in the gaming experience I have a different perspective, but considering a lot of games are ported over from consoles (or at least designed with consoles in mind), wouldn't it be reasonable to expect any game designed around a console using ATI graphics to favor ATI graphics on the PC? It wouldn't surprise me in the least to see games favoring (or at least being more competitive on) hardware built around ATI for the next year or two.
  • Jodiuh - Monday, December 25, 2006 - link

    Because it's happened before. Remember Oblivion?
  • munky - Monday, December 25, 2006 - link

    Nothing happened. The 7-series still has much worse performance in Oblivion in outdoor scenes with foliage than equivalent Ati cards.
    http://www.firingsquad.com/hardware/nvidia_geforce...
  • Frumious1 - Monday, December 25, 2006 - link

    Try not to be so easily offended, Justly. I think the point Anandtech was trying to make is that they hope the performance gap can be reduced somewhat with driver/game updates. There are other games where NVIDIA outperforms ATI, but overall the 7900 GTX offers similar performance to the X1900 XT and not too much worse than the X1950 XT/XTX cards (I think). Another way of looking at this is that perhaps they just hope SM3 support doesn't turn into a GeForce FX fiasco again.

    So far, it looks to me like ATI has better shader hardware. Ever read any of the stuff on the Folding@home forums by their programmers? Basically, they have stated that G70 really has poor performance on their SM3 code even with optimizations... and it doesn't even look like G80 will be all that great. All that said, I still don't like ATI's drivers. CCC(P) is so sluggish it's pathetic, and that's after performance improvements since it first came out.
  • jediknight - Monday, December 25, 2006 - link

    I was hoping to see some of the last gen cards (err.. now with the 8800, I guess two gens old..) - as that's what I'm running with (with no hope of upgrading - as I'm with AGP right now.. )

    Specifically, if future reviews would consider the performance of the X800XL running at 1280x1024, I'll be happy :->
  • Spoelie - Monday, December 25, 2006 - link

    You need an SM3 card to play this game; as such, it won't even start on your card. (For the curious, a sketch of the kind of capability check involved appears after the comments.)

    Not that I agree with that policy; they should have provided an SM2 path, since not everybody has a card that's only a year or a year and a half old.
  • jkostans - Monday, December 25, 2006 - link

    I think it's pretty clear you'll need to run at 800x600 with medium graphics, or 1024x768 with low graphics settings, in order to get around 20 fps.
  • Tanclearas - Monday, December 25, 2006 - link

    quote:

    The X1950 XTX almost runs the game smoothly at the highest settings, and with some overclocking, Vegas has a good chance of running perfectly fine at maximum details and 1600x1200 with this card. The 7900 GTX, as powerful as it is, just can't manage acceptable performance in the game at 1600x1200, but at one resolution down it looks and plays fine.


    At 1600 x 1200, the 7900GTX runs at 19.8 and the X1950XTX runs at 20.4 FPS. Given those numbers, the above quote doesn't really make much sense. Did I miss something?

    And just so people don't think I'm whining, or a fanboy, or whatever, I have an X1900XT (512MB). I am just honestly confused by the conclusion that the X1950XTX could handle 1600 x 1200 and the 7900GTX could not.
  • Josh Venning - Monday, December 25, 2006 - link

    Thanks for the comment. The paragraph has been tweaked so that it's a little clearer. The fact is that both the X1950 XTX and the 7900 GTX at reference speeds experience a little choppiness in the game at the highest resolution and quality settings. With some overclocking, either of these cards could run the game smoothly at those settings. Sorry for the confusion.
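
This is the capability-check sketch referenced in the comments above: an illustrative C++/Direct3D 9 snippet of the kind of startup check an SM3-only title might perform. It is not taken from Rainbow Six: Vegas, and its structure and messages are assumptions for illustration, but the Direct3D 9 caps query it uses is the standard way for a game to detect Shader Model support, and SM2-class cards like the X850 XT fail it.

// Illustrative only: a Shader Model 3.0 capability check of the sort an
// SM3-only game might run at startup. Build on Windows against d3d9.lib.
#include <d3d9.h>
#include <iostream>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
    {
        std::cout << "Direct3D 9 is not available on this system.\n";
        return 1;
    }

    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        std::cout << "Could not query device capabilities.\n";
        d3d->Release();
        return 1;
    }

    // SM2-class cards (X800/X850 series, GeForce FX) fail this test, which is
    // why an SM3-only title never gets past its launcher on them.
    bool hasSm3 = caps.PixelShaderVersion >= D3DPS_VERSION(3, 0) &&
                  caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);

    std::cout << (hasSm3 ? "Shader Model 3.0 supported.\n"
                         : "Shader Model 3.0 NOT supported.\n");

    d3d->Release();
    return 0;
}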
