Setting Expectations

As we mentioned, Rainbow Six: Vegas is one of the first games to incorporate Unreal Engine 3, which makes it somewhat special. Another Unreal Engine 3 title is Gears of War for the Xbox 360, which we hope to see released for the PC sometime soon. We can hope that Rainbow Six: Vegas performance will reflect that of future games based on Unreal Engine 3, because unlike The Elder Scrolls IV: Oblivion when it was first released, the biggest and best GPU available isn't really required to enjoy this game right now. To be fair, it's worth noting that the performance requirements are about as strenuous as Oblivion's, if not slightly higher; the difference is that the new GeForce 8800 series is now available.


The Unreal Engine 2 was built around DX8, though DX9 elements (things like HDR and shadow effects) were added in more recent games like Splinter Cell: Double Agent. The Unreal Engine 3 is built around the programmable graphics pipeline of DX9, and future UE3 titles should also support DX10/WGF2. This gives developers more flexibility when building a game on the engine. Many developers licensed UE2, and we expect the same with UE3. At any rate, the next big game expected to use the Unreal Engine 3 is in fact Unreal Tournament 2007, which will hopefully be available soon.
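To make the "programmable graphics" point concrete, a DX9-era game typically queries the device capabilities at startup to decide which shader model path it can use; this is also why a title can refuse to run on pre-SM3 cards, a point that comes up in the comments below. The following is a minimal sketch using the real Direct3D 9 API (the caps structure and version macros are genuine D3D9 constructs), but the surrounding program is our own illustration, not code from Unreal Engine 3.

```cpp
// Minimal sketch: querying shader model support under Direct3D 9.
// Illustrative only -- this is not Unreal Engine 3 code.
// Build on Windows and link against d3d9.lib.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        d3d->Release();
        return 1;
    }

    // PixelShaderVersion packs the major/minor pixel shader model the
    // GPU exposes; D3DPS_VERSION builds a comparable value.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        printf("Shader Model 3.0 path available\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        printf("Falling back to a Shader Model 2.0 path\n");
    else
        printf("Fixed-function / DX8-class hardware\n");

    d3d->Release();
    return 0;
}
```

An engine that checks caps this way can ship multiple render paths and pick the best one at launch; requiring SM3 outright, as Vegas does, simply skips the fallback branches.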


We mentioned in the introduction that the game has a few issues we wish had been addressed before Rainbow Six: Vegas was released. One of them is that, as of right now, there doesn't seem to be any support for SLI or CrossFire setups. This is bad news, because Vegas is exactly the type of game that could use the extra performance a multi-GPU system provides. Another drawback is the lack of antialiasing: there is no option for enabling AA, which isn't terribly surprising given that the game uses the very new Unreal Engine 3. Hopefully we will see support for this in the future, as the fastest GPUs should be able to handle HDR as well as FSAA.
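One plausible technical reason for the missing AA option, beyond the engine's youth: GPUs of this era differ in whether they can multisample the floating-point render targets used for HDR (GeForce 7-class hardware cannot, while the Radeon X1000 series can). A game can detect this at startup, as in the hedged sketch below; CheckDeviceMultiSampleType is part of the real Direct3D 9 API, but the program around it is our illustration, not Vegas's actual code.

```cpp
// Minimal sketch: asking D3D9 whether the GPU can multisample an FP16
// (HDR) render target. Illustrative only -- not Unreal Engine 3 code.
// Build on Windows and link against d3d9.lib.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    DWORD qualityLevels = 0;
    // D3DFMT_A16B16G16R16F is the 64-bit floating-point surface format
    // commonly used for HDR rendering in this era.
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F, TRUE /* windowed */,
        D3DMULTISAMPLE_4_SAMPLES, &qualityLevels);

    if (SUCCEEDED(hr))
        printf("4x MSAA on an FP16 target is supported (%lu quality levels)\n",
               (unsigned long)qualityLevels);
    else
        printf("This GPU cannot multisample an FP16 render target\n");

    d3d->Release();
    return 0;
}
```

When the check fails, an engine's options are to disable AA or to render HDR into a format the hardware can multisample, which is the kind of trade-off behind the missing AA toggle here.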


32 Comments


  • ariafrost - Monday, December 25, 2006 - link

    Well, forget about running it on my X850XT; apparently RSV *requires* a Pixel Shader 3.0 video card. If anyone could confirm/deny that information it'd be great, but for now it looks like a lot of ill-informed customers may end up buying a game their "128MB/256MB" video cards can't support.
  • justly - Monday, December 25, 2006 - link

    quote:

    It's very evident looking at all of these tests how Rainbow Six: Vegas tends to favor ATI hardware, but again, keep in mind that because of patches and updates this may not (and hopefully won't) be the case for long.


    Anandtech always seems to have a problem whenever it can't recommend NVIDIA as the best solution in every scenario. What is so wrong with the idea that ATI hardware performs better than NVIDIA hardware of the same generation? Maybe I'm mistaken, but I thought even Anandtech expected ATI might do better in newer games.
    Personally I'm not much of a gamer so it really doesn't matter to me, but for the sake of the people using your articles to choose hardware, why give them expectations that might not materialize?

    Maybe because I am not engrossed in the gaming experience I have a different perspective, but considering a lot of games are ported over from consoles (or at least designed with consoles in mind), wouldn't it be reasonable to expect any game designed around a console using ATI graphics to favor ATI graphics on the PC? It wouldn't surprise me in the least to see games favoring (or at least being more competitive on) hardware built around ATI for the next year or two.
  • Jodiuh - Monday, December 25, 2006 - link

    Because it's happened before. Remember Oblivion?
  • munky - Monday, December 25, 2006 - link

    Nothing happened. The 7-series still has much worse performance in Oblivion in outdoor scenes with foliage than equivalent ATI cards.
    http://www.firingsquad.com/hardware/nvidia_geforce...
  • Frumious1 - Monday, December 25, 2006 - link

    Try not to be so easily offended, Justly. I think the point Anandtech was trying to make is that they hope the performance gap can be reduced somewhat with driver/game updates. There are other games where NVIDIA outperforms ATI, but overall the 7900 GTX offers similar performance to the X1900 XT and not too much worse than the X1950 XT/XTX cards (I think). Another way of looking at this is that perhaps they just hope SM3 support doesn't turn into a GeForce FX fiasco again.

    So far, it looks to me like ATI has better shader hardware. Ever read any of the stuff on the Folding@Home forums by their programmers? Basically, they have stated that G70 really has poor performance on their SM3 code even with optimizations... and it doesn't even look like G80 will be all that great. All that said, I still don't like ATI's drivers. CCC(P) is so sluggish it's pathetic, and that's after performance improvements since it first came out.
  • jediknight - Monday, December 25, 2006 - link

    I was hoping to see some of the last-gen cards (err.. now with the 8800, I guess two gens old..) as that's what I'm running with, and with no hope of upgrading since I'm on AGP right now.

    Specifically, if future reviews would consider the performance of the X800XL running at 1280x1024, I'd be happy :->
  • Spoelie - Monday, December 25, 2006 - link

    You need to have an SM3 card to play this game; as such, it won't even start on your card.

    Not that I agree with that policy. They should have provided an SM2 path; not everybody has a card less than ~1.5 years old.
  • jkostans - Monday, December 25, 2006 - link

    I think it's pretty clear you'll need to run at 800x600 with medium graphics, or 1024x768 with low graphics settings, in order to get around 20 fps.
  • Tanclearas - Monday, December 25, 2006 - link

    quote:

    The X1950 XTX almost runs the game smoothly at the highest settings, and with some overclocking, Vegas has a good chance of running perfectly fine at maximum details and 1600x1200 with this card. The 7900 GTX, as powerful as it is, just can't manage acceptable performance in the game at 1600x1200, but at one resolution down it looks and plays fine.


    At 1600x1200, the 7900 GTX runs at 19.8 FPS and the X1950 XTX at 20.4 FPS. Given those numbers, the above quote doesn't really make much sense. Did I miss something?

    And just so people don't think I'm whining, or a fanboy, or whatever, I have an X1900XT (512MB). I am just honestly confused by the conclusion that the X1950XTX could handle 1600 x 1200 and the 7900GTX could not.
  • Josh Venning - Monday, December 25, 2006 - link

    Thanks for the comment. The paragraph has been tweaked so that it's a little clearer. The fact is that both the X1950 XTX and 7900 GTX at reference speeds experience a little choppiness in the game at the highest resolution and quality settings; with some overclocking, either of these cards could run the game smoothly at those settings. Sorry for the confusion.
