Benchmarking Software: An Analysis of Far Cry 2 Settings under AMD and NVIDIA

Before we get started, let's take a look at our test setup:

Platform: ASUS Rampage II Extreme, Core i7-965, 6GB DDR3-1333, Intel SSD
AMD Driver: Final 8.10 hotfix
NVIDIA Driver: 180.44

Our first goal in getting our testing rolling was to figure out what to test and to better understand the game's settings. As we generally do, we spent time playing the game at different quality levels with different hardware. But because we wanted to take advantage of the benchmark tool, we decided to collect a bit of data on different settings with one card from AMD and one from NVIDIA. We looked at three quality levels under two DirectX APIs with two AA settings across five resolutions. For those keeping count, that's 60 tests per card, or 120 tests total for this section.
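For those who want to double-check that count, here's a minimal Python sketch of the test matrix. The quality levels, APIs, and AA modes match the settings we tested; the five resolutions appear in the graphs rather than in this text, so the list below uses placeholder names.

    # A sketch of the test matrix described above. The quality, API, and AA
    # values come from the article; the resolution names are placeholders.
    from itertools import product

    quality_levels = ["High", "Very High", "Ultra High"]
    apis = ["DX9", "DX10"]
    aa_modes = ["No AA", "4xAA"]
    resolutions = ["res1", "res2", "res3", "res4", "res5"]  # placeholders

    configs = list(product(quality_levels, apis, aa_modes, resolutions))
    print(len(configs))      # 3 * 2 * 2 * 5 = 60 tests per card
    print(len(configs) * 2)  # 120 tests total for this section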

The result is less like our usual hardware-focused tests and more of an analysis of the game itself. We get a better perspective on how the game responds in different situations with different hardware on different platforms, without needing to test every piece of hardware out there. Our hope is that this page will help people running a particular setup see, in general terms, how performance might change if they tweaked one of the variables. Of course, you can't predict specific performance from this data, as there isn't enough of it for interpolation, but knowing the general trend and which changes make the largest differences can still be useful.

This test is run with our custom timedemo rather than any of the built-in benchmarks.

The cards we chose are the highest end single-GPU solutions from NVIDIA and AMD (the GeForce GTX 280 and the Radeon HD 4870 1GB). While not everyone will have these cards, they allowed us to test the broadest range of playable settings. We'll start our analysis with the NVIDIA hardware in DX9 and DX10.

Now take a deep breath, because these graphs can be a little tricky. Each graph contains only six resolution scaling lines, but you'll want to approach them as two groups of three: blue diamonds, red squares, and green triangles are no antialiasing, while purple Xs, blue asterisks, and orange circles are 4xAA.
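If it helps to visualize the layout, here's a rough matplotlib mock-up of how each graph is organized. Every frame rate below is invented purely for illustration; only the marker and color grouping matches our charts.

    # A mock-up of the graph layout: six resolution scaling lines in two
    # groups of three (no AA vs 4xAA). All fps numbers here are invented.
    import matplotlib.pyplot as plt

    x = ["r1", "r2", "r3", "r4", "r5"]  # placeholder resolution labels
    lines = [  # (label, fps series, marker, color)
        ("High",            [95, 85, 74, 62, 48], "D", "blue"),
        ("Very High",       [70, 62, 54, 45, 35], "s", "red"),
        ("Ultra High",      [60, 53, 46, 38, 29], "^", "green"),
        ("High 4xAA",       [78, 68, 58, 47, 35], "x", "purple"),
        ("Very High 4xAA",  [58, 50, 43, 35, 26], "*", "blue"),
        ("Ultra High 4xAA", [48, 42, 36, 29, 21], "o", "orange"),
    ]
    for label, fps, marker, color in lines:
        plt.plot(x, fps, marker=marker, color=color, label=label)
    plt.xlabel("Resolution")
    plt.ylabel("Average FPS (invented)")
    plt.legend()
    plt.show()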

Under DX9 on NVIDIA hardware, High quality performs significantly better than Very High quality, both with and without AA. Moving from Very High quality to High quality gives at best a 47% increase in performance, while the worst case is 27% with 4xAA and 37% without. These gains generally trend downward as resolution increases. We also see that High quality with 4xAA outperforms Very High quality with no AA. While there is a crossover point, Very High quality with 4xAA also performs very similarly to Ultra High quality with no AA.
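To be clear about how we compute those numbers, the percentages are simple relative frame rate gains. A tiny sketch, with invented frame rates:

    # Relative performance gain between two settings; fps values invented.
    def percent_gain(new_fps, old_fps):
        # Percentage increase when moving from old_fps to new_fps.
        return (new_fps / old_fps - 1.0) * 100.0

    # Hypothetical: High quality at 58.8 fps vs Very High at 40.0 fps
    print(f"{percent_gain(58.8, 40.0):.0f}%")  # 47% -- the best case above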

Moving to DX10 under NVIDIA hardware, High quality performance takes a dive while the rest of the numbers stay relatively stable. The basic indication here is that DX9 won't gain you much performance (and will sometimes drop your performance a bit) unless you are looking at High quality mode, in which case it could be well worth it to run DX9. As a further consequence, dropping down to High quality in DX10 mode yields essentially no performance benefit. High quality with 4xAA loses its advantage over Very High quality with no AA. Very High quality or better is the way to go under DX10, and DX9 should only be paired with High quality mode or lower.

The analysis of the AMD data is very similar to what we see with NVIDIA. We see the same big performance advantage for High quality under DX9, with DX10 actually increasing performance at the higher quality levels (the exception is 2560x1600, where performance drops off more sharply than on the GTX 280). The major difference is that moving from Ultra High quality to Very High quality gives a much larger performance increase on AMD hardware than on NVIDIA. This means that Very High with 4xAA has a larger advantage over Ultra High with no AA (except at 2560x1600), and that it is more worthwhile to drop back to a lower quality setting to gain performance on AMD hardware. We still recommend Ultra High quality, though, unless 4xAA is something you just can't live without (in that case, Very High quality plus 4xAA is probably the way to go).

The comparison we haven't made yet is NVIDIA versus AMD. These tests show that under DX10 the AMD Radeon HD 4870 1GB either outperforms or performs on par with the NVIDIA GeForce GTX 280 (except at the highest resolutions with 4xAA). This is very impressive given the Radeon's $100 price advantage: at $400, the GeForce GTX 280 comes in 33% more expensive than the Radeon HD 4870 1GB. If you've got a 2560x1600 monitor and want to run Ultra High quality with 4xAA, that's the only case where the GeForce GTX 280 is worth it, though you'll be pushing the playability limit there, and SLI with two cheaper cards might be a better way to go.
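The price math behind that comparison is simple enough to check:

    # Back-of-the-envelope check on the pricing claim above.
    gtx280_price = 400                 # GTX 280 price stated in the text
    hd4870_price = gtx280_price - 100  # $100 advantage -> $300
    premium = (gtx280_price / hd4870_price - 1.0) * 100
    print(f"GTX 280 premium: {premium:.0f}%")  # ~33%, as stated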

Going forward, we will be looking at DX10 with Ultra High quality settings and will generally favor testing without AA, as we feel Ultra High quality is a better use of resources than 4xAA. For multi-GPU and high end testing, we will still try to include 4xAA numbers. This custom timedemo will also be the test we stick with, rather than the built-in RanchSmall demo.

Comments

  • toyota - Friday, November 21, 2008

    I have a GTX 260 with the 180.48 drivers and it stutters in the benchmark and in the game. There's a little hitch even while walking around, like in STALKER but not as severe. My 4670 stuttered much less in the benchmark and basically not at all in the game, so this is NOT an ATI-only issue.
  • Goty - Friday, November 21, 2008

    You really can't blame AMD for having issues with the 8.10 drivers; they probably weren't given access to the game until very shortly before it was released (if at all) as a result of it being part of the TWIMTBP program. If you consider that work on the 8.11s probably began a month or so before that, too, there's reason for issues there as well. Watch the 8.12s come out and AMD jump ahead significantly in performance (not that anyone will care by then, though).
  • Genx87 - Monday, November 24, 2008

    The beta testers for the game manufacturer have access to the cards and drivers. ATI knew about this well before the release of the game.
  • ashegam - Friday, November 21, 2008

    Why is there so little difference between the 192 260 and the 216 260?
    I swear I've seen reviews that put that card a good 10-20% above its older counterpart.
  • PrinceGaz - Saturday, November 22, 2008

    A good 10-20%? I very much doubt that, given that the stock original GTX 260 and the later Core 216 versions differ only in having 9 instead of 8 shader banks, plus the equivalent increase in texture units.

    Under ideal conditions, that would result in a 12.5% performance increase, but in practice it is likely to be little more than 5% or so, as many other factors affect performance. Anything above a 12.5% improvement with a Core 216 would only be possible with a driver tweak which favoured it, or if the Core 216 was overclocked. An improvement of 5% or so over the original GTX 260 is what you should expect.
  • CEO Ballmer - Friday, November 21, 2008

    It does not work on Macs!


    http://fakesteveballmer.blogspot.com
  • CrystalBay - Sunday, November 23, 2008

    Yeah, I hate it. Ubisoft should be banished to making chess games for Macs.

    Anyhow, Firing Squad backs up Derek's benches, pretty much...
  • chizow - Friday, November 21, 2008

    Seems to be missing the platform used, drivers used, etc. I'm guessing the 180.48s weren't used, as those results seem to be off for NV parts. If they weren't, that distinction should probably be made.
  • phatmhatg - Friday, November 21, 2008

    Nice article, very well supported.

    I'm still going with the 260 192 though.

    It's just about as good as the 4870 1GB. What, I'm losing fewer than 10fps at 19x12?

    It's about $60-75 cheaper. I got my 260 for $214 after rebate, free shipping.

    And here's the funny part: it came with Far Cry 2. So I save about $50 going with the 260 over the 4870 1GB AND I save another $50 by getting the game with it. That's $100 in savings. Again, for about 10fps less at most?

    Lastly, driver issues. I don't JUST play Far Cry 2; I play other games. It just seems (and maybe I'm wrong and maybe things will change) that NVIDIA either avoids problems with games and/or fixes them better and more quickly than AMD does. I don't want to have to wait or mess with things to get my game working. I want it working when I install it.

    So there are four good reasons to go with the 260: it's cheaper, you get the game with the card, it's not much slower at all, and the drivers are better in other games.
