Benchmarking Software: An Analysis of Far Cry 2 Settings under AMD and NVIDIA

Before we get started, let's take a look at our test setup:

Platform: ASUS Rampage II Extreme, Core i7-965, 6GB DDR3-1333, Intel SSD
AMD Driver: Final 8.10 hotfix
NVIDIA Driver: 180.44

Our first goal in getting our testing rolling was to find out what to test and to better understand the settings in the game. We spent time playing the game at different quality levels with different hardware, as we generally do. But because we wanted to take advantage of the benchmark tool, we decided to collect a bit of data on different settings with one card from AMD and one card from NVIDIA. We looked at three different quality levels under two different DX APIs with two different AA settings across five different resolutions. For those keeping count, that's 60 tests per card, or 120 tests total for this section.
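That test-matrix arithmetic is easy to sanity-check with a quick sketch. Note that the specific resolution list below is a placeholder (the article doesn't enumerate the five resolutions at this point); only the dimensions of the sweep are taken from the text:

```python
from itertools import product

# Dimensions of the settings sweep described above
qualities = ["High", "Very High", "Ultra High"]    # three quality levels
apis = ["DX9", "DX10"]                             # two DirectX APIs
aa_modes = ["No AA", "4xAA"]                       # two antialiasing settings
# Placeholder resolutions -- the exact five are not listed here in the text
resolutions = ["1280x1024", "1680x1050", "1920x1200", "2048x1536", "2560x1600"]

# One benchmark run per combination, per card
tests_per_card = len(list(product(qualities, apis, aa_modes, resolutions)))
print(tests_per_card)       # 60 tests per card
print(tests_per_card * 2)   # 120 tests across both cards
```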

The result isn't so much like our usual hardware-focused tests; instead, it provides more of an analysis of the game itself. We get a better perspective on how the game responds in different situations with different hardware on different platforms, without the need to test every piece of hardware out there. Our hope was that this page could help people running a particular setup see generally how performance might change if they tweaked one of the variables. Of course, you can't predict specific performance with this, as there isn't enough data for interpolation purposes, but knowing the general trend and which changes make the largest differences can still be useful.

This test is run with our custom timedemo rather than any of the built in benchmarks.

The cards we chose are the highest end NVIDIA and AMD single GPU solutions (the GeForce GTX 280 and the Radeon HD 4870 1GB). While not everyone will have these cards, we were able to test the broadest range of playable data with them. We'll start our analysis with the NVIDIA hardware in DX9 and DX10.

Now take a deep breath, because these graphs can be a little tricky. Each graph contains only six resolution-scaling lines, but you'll want to approach them as two groups of three: blue diamonds, red squares, and green triangles are no antialiasing, while purple Xs, blue asterisks, and orange circles are 4xAA.

Under DX9 on NVIDIA hardware, High quality performs significantly better than Very High quality, both with and without AA. Moving from Very High quality to High quality gives at best a 47% increase in performance, while the worst case is 27% with 4xAA and 37% without. Performance increases in this case generally trend downward as resolution increases. We also see that High quality with 4xAA outperforms Very High quality with no AA. While there is a crossover point, Very High quality with 4xAA also performs very similarly to Ultra High quality with no AA.
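The percentage figures above are simple relative changes in average frame rate. As a sketch of the calculation, using made-up placeholder frame rates rather than our measured numbers:

```python
def percent_gain(fps_before: float, fps_after: float) -> float:
    """Relative performance change when moving between two settings."""
    return (fps_after - fps_before) / fps_before * 100.0

# Hypothetical averages: Very High vs. High quality at one resolution
very_high_fps = 50.0
high_fps = 73.5
print(round(percent_gain(very_high_fps, high_fps)))  # 47 -- a best-case gain
```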

Moving to DX10 under NVIDIA hardware, High quality performance takes a dive while the rest of the numbers stay relatively stable. The basic indication here is that DX9 won't gain you much performance (and will sometimes drop your performance a bit) unless you are looking at High quality mode, in which case it could be well worth it to run DX9. As a further consequence, the small performance benefit of dropping down to High quality in DX10 mode makes that setting essentially useless. High quality with 4xAA loses its advantage over Very High quality with no AA. Very High quality or better is the way to go under DX10, and DX9 should only be paired with High quality mode or lower.

The analysis of the AMD data is very similar to what we see with NVIDIA. We see the same big performance advantage of High quality under DX9, with DX10 actually increasing performance at the higher quality levels (the exception is at 2560x1600, where performance drops off more sharply than on the GTX 280). The major difference here is that moving from Ultra High quality to Very High quality gives you a much larger performance increase under AMD than NVIDIA. This means that Very High with 4xAA has a larger advantage over Ultra High with no AA (except at 2560x1600), and that it is more worthwhile to drop back to a lower quality setting to gain performance on AMD hardware. We still recommend Ultra High quality, though, unless 4xAA is something you just can't live without (in that case, Very High quality plus 4xAA is probably the way to go).

The comparison we haven't made yet is NVIDIA versus AMD. These tests show that under DX10 the AMD Radeon HD 4870 1GB either outperforms or performs on par with the NVIDIA GeForce GTX 280 (except at ultra high resolutions with 4xAA). This is very impressive given the $100 price advantage (the GeForce GTX 280, at $400, comes in 33% more expensive than the Radeon HD 4870 1GB). If you've got a 2560x1600 monitor and want to run Ultra High quality with 4xAA, that's the only case where the GeForce GTX 280 is worth it, though you'll be pushing the playability limit there, and SLI with two cheaper cards might be a better way to go.

Going forward, we will be looking at DX10 with Ultra High quality settings and will generally favor testing without AA as we feel that Ultra High quality is a better use of resources than 4xAA. For multi-GPU and high end testing, we will still try to include 4xAA numbers though. This custom timedemo will also be the test we stick with rather than the built in RanchSmall demo.


  • DerekWilson - Monday, November 24, 2008 - link

The issue is overlapping development cycles for drivers. Once features for a WHQL driver have been frozen, non-critical changes can't be made. This means that once one month's Catalyst ships, it is not likely that any bugs found at the time of that release will make it into the next month's driver, so it'll be at least two months before a fix is seen.

    i don't see how this disagrees with either what i said or what you said.
  • MichaelD - Sunday, November 23, 2008 - link

    I'm curious about two things. Why wasn't the 4870X2 included in the test? With GTX280 SLI tested, an X2 would've been a good inclusion due to both price and performance. Also, why is it stated that "Crossfire doesn't work" when it works just fine on my X2.

Before AMD released its latest "FarCry2 Patch" (newer driver) there were XFire issues with the X2 and some games like FC2 and Stalker CS, but those have been fixed.
  • smokenjoe - Sunday, November 23, 2008 - link

I never had any problems with my X2 unless it was in windowed mode or had wait for V-sync on; then there were massive slowdowns. In regular play my card had very consistent frame rates over around 2 hrs of play. I did not use the benchmark. FC2 is not the only game with this problem. I know a lot of reviewers like to have the game in windowed mode to multi-task while benchmarking, and V-sync on for pics, but it doesn't reflect real-world gameplay with any kind of Crossfire setup.

Unfortunately I don't know the easy way of getting out of windowed mode; I did not see it in the options, so I had to edit the config file. To make it more annoying, the game resets to windowed mode after driver updates.

The game looks good to me maxed out, but I haven't had time to play it with other games first on the list.

  • rocky1234 - Sunday, November 23, 2008 - link

No, I had to comment on that as well. No, taking longer for driver releases is not the answer, because when I had an NVIDIA card, if there was a bug or something did not work, you were pretty much SOL until NVIDIA released their 8-month-old driver to you & you had to hope that it would fix your problem or their bug; if it didn't, you were hooped until the next release or had to rely on leaked beta drivers. I was glad to get rid of that headache & have not looked back since I got my AMD/ATI card. Yes, there have been issues, but most if not all have been worked out, & it has been done in 3 months, not 8. This is a new game & yes there will be problems with it, & I personally have not had any issues with my 4870X2 2GB so far. It runs fast & looks good, & yes, there are pauses for maybe half a second once in a while, but it only happens when the HDD reads & while driving, so to those who complain about this: get a faster drive or please stop whining. Enough said. No need to comment on this to me as I don't care & have better things to do, like go & play Far Cry 2 or GRID.
  • rocky1234 - Sunday, November 23, 2008 - link

Well, personally I have had no issues with this game, & it is maxed out with no stuttering to be found. There may be a very brief pause once in a while, but it happens whenever the HDD is reading & I am driving something, so this clearly is not the fault of the graphics card.

I have found no problems with this game & the Radeon drivers so far, even before I switched to the hotfix drivers. I did find that the game runs a lot better from Windows Vista than it did with my Windows XP install, with DX10 of course. I am glad that AMD took the time to make hotfixes for this game; it shows that they are trying to make their cards run properly with it. With NVIDIA, if it is not time for a driver release, you unlucky souls would have to wait 6 to 8 months for a driver fix or have to depend on a leaked beta driver to fix the problem, so to AnandTech: take it easy on AMD, at least they did something about it & released a hotfix.

I run most every game at 1080p on my HDTV & I found that Crysis is a far more unstable game & just poorly optimized for any platform, no matter how much power you give it. This is not AMD's or NVIDIA's fault; this is the fault of the dev. Yes, Crysis does look a little better here & there, but Far Cry 2 is very close & it just runs fast & smooth.
  • PrinceGaz - Saturday, November 22, 2008 - link

It's funny how soon older cards are forgotten here now. I've got one of those ancient relics, a G80-based 640MB 8800GTS, but like all cards of its generation, it was omitted from the review. I suppose I can guesstimate that it will be around or just under the performance level of the 9600GT, as they have a very similar architecture, but GeForce 6/7 users are totally left out by this review.

    In the current economic climate, it is unlikely everyone will be replacing their graphics card every year (or even every two years), so testing with some older generation cards, at lower detail settings of course, would be a good idea.
  • strikeback03 - Monday, November 24, 2008 - link

    This review already contains data for 16 configurations, and they probably tested other configurations as well due to the Crossfire issues. If you are going to start throwing in every other reasonable configuration from the past few years and both companies, you would easily top 25-30 configurations, and the time required would be insane.
  • daniyarm - Monday, November 24, 2008 - link

    I completely agree with you. I bought 8800gt when it came out and bought another one 3 months ago for SLI. 8800GT SLI is by no means a low end graphics solution, it's on par with current single card GPUs. This is the problem with most review sites, they show benchies for new hardware to get you to upgrade because it pleases sponsors. They forget that unlike them, we pay for our hardware and can't afford to buy a new high end GPU every 6 months.
  • Hawkmoon - Saturday, November 22, 2008 - link

    Can anyone tell me what CPU was used with these videocards for these tests?

    Thanks
  • Hawkmoon - Saturday, November 22, 2008 - link

    Hmmm, maybe I missed it... but can anyone tell me also what drivers they used for the Nvidia videocards?

    Thanks
