Battlefield 2 Performance

The most requested game missing from our initial coverage is Battlefield 2. This highly popular title is an important point of comparison, as it does an excellent job of setting the standard for first-person shooter quality. Our numbers come from running our custom BF2 demo at the highest quality settings, which means anisotropic filtering was enabled both with and without AA (just as Doom 3's High Quality mode also enables AF).

Our "no AA" performance numbers show the X1800 XT performing on par with the 7800 GTX until we move beyond 1600x1200, and the 7800 GT holds an advantage over the X1800 XL as well. Most notably, this is the only test we have run in which the X1600 XT performs on the level of the GeForce 6800 GT. While it is good to see the new mid-range part performing in its price class, one title is not enough to justify the $250 price. The "budget" X1300 doesn't quite match the 6600 GT, which looks set to sell at about the same price.

After enabling 4xAA on Battlefield 2, the X1800 XT really stretches its legs. Likewise, the X1800 XL jumps ahead of the 7800 GT. When we move to the X1600 XT, the numbers show it falling further behind the 6800 GT.

The X1800 XT barely breaks a sweat when AA is enabled, dropping at most 18.3 percent. In fact, at every resolution, the X1800 XT loses about half the percentage that the 7800 GTX does, which explains the change in leadership between our two tests. Dropping more than the 6800 GT and less than the 6600 GT (percentage-wise), the X1600 XT shows different characteristics than its heavier-hitting siblings.
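The percent decrease quoted here is just the relative drop between the no-AA and 4xAA frame rates; a minimal sketch of the calculation (the frame rates in the example are illustrative placeholders, not numbers from our benchmarks):

```python
def aa_hit(fps_no_aa, fps_with_aa):
    """Percent performance lost when antialiasing is enabled."""
    return round((fps_no_aa - fps_with_aa) / fps_no_aa * 100, 1)

# Hypothetical scores: a card that falls from 100 fps to 81.7 fps
# with 4xAA takes an 18.3 percent hit.
print(aa_hit(100.0, 81.7))
```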

Next up is Day of Defeat: Source. We already had a peek at this game's performance earlier this week; now, let's see if our extended data supports what we saw then.



Comments

  • DonPMitchell - Tuesday, October 11, 2005 - link

    I wish they had benchmarked Half-Life 2, which is sort of a standard thing to look at. I don't disagree with their final conclusions, but I can't help but wonder if HL2 spanked nVidia, and that made them choose another game. It would be more unbiased for them to stick to a more or less standard set of tests, and it would let us compare to prior tests as well.
  • Larso - Monday, October 10, 2005 - link

    First I would like to thank the AT crew for another excellent article! The graphs are very intuitive and easy to read.

    I think one aspect is missing from just about every X1000 article I have found on the net: how does this new family of cards compare with the previous generation(s)? For example, what mid-range card should I pick up as an upgrade from a 9600 XT, and how much improvement will I get?

    Has anybody stumbled across a review that compares these new cards with the previous generations?


    Another question: I selected the 9600 XT back then because it can easily be passively cooled. Now I wonder if any of the new cards can be silenced without turning the case into a baking oven?
  • MrJim - Sunday, October 09, 2005 - link

    I don't understand why AnandTech keeps not testing games other than shooters with these high-end cards, like demanding flight sims (Lock On is very graphics-intensive) and/or Pacific Fighters (not as GPU-dependent as Lock On). Also, why no racing sims? Those of us who play these do use filtering and FSAA a lot, or at least the 350 I know do. Cheers!
  • dimitrio - Saturday, October 08, 2005 - link

    Someone said they liked the way the graphs were done. I must say that I found it a little difficult to make comparisons, since you have to look back and forth to see which card each symbol represents. After you give up figuring that out, you try to look at the data below, but again, since it's not ordered from better to worse, it takes some time to figure out that data. Things got even more complicated with the "Lower is Better" graphs.

    I acknowledge that it makes everything much cleaner, and with the number of benchmarks in this article, you would otherwise end up with dozens of graphs across several pages. You can clearly sense the writer's desire to improve the presentation with this new format, but sometimes things are better kept simple, and I would still like to see the many bar graphs, as they are much more intuitive and informative, to me at least.
  • photoguy99 - Saturday, October 08, 2005 - link

    This article is a prime example of why open comments work - editors listen, participate, improve.

    Tom's doesn't even link their articles directly to discussions! Why? Can't handle the feedback?

    Glad to be here.
  • Spoelie - Saturday, October 08, 2005 - link

    Isn't anyone else confuzzled by the X1600 XT's lackluster showing? I was really hoping to make it my next upgrade, but its current performance is only worthy of a value card. Just looking at the specs (12 SM3.0 pixel pipes @ 600MHz), it should be creaming the 6600 GT (8 SM3.0 pixel pipes @ 500MHz), but it's barely even competing with it - and that's without considering other architectural advances and faster memory! My guess is that the ultimate fillrate is determined by the TMUs, and having only 4 makes this card a worthy 9600 XT follow-up - which had 4 TMUs @ 500MHz - but nowhere near the mainstream card of this day and age. Extremely bad decision on ATI's part if this is it. I can't think of any other reason for this card to perform so pathetically. It would be nice to have it clarified if I'm completely missing the issue, though.
  • taylore2003 - Saturday, October 08, 2005 - link

    What AnandTech really needs to do is benchmark the X1300 Pro on a non-FX-55 system. People who buy that GPU will not have a top-of-the-line PC, so do it on a 3200+ AMD, not an FX-55. Then people like me (damn all 16 y/o's) can see what kind of frame rates we would be getting. The X1300 Pro should go up against the 6600 non-GT! We can see the X1800 is a great top-of-the-line part, but ATI's mid- to low-range GPUs are not so hot, so let's see an X1300 Pro vs. a 6600 non-GT with a reasonable test setup.
  • coldpower27 - Saturday, October 08, 2005 - link

    From what I can see, the 4 TMUs are a very crippling limitation when compared to the 6600 GT, which has 8 TMUs but 4 ROPs. It beats the 6600 GT, yes, but not by as much as we would expect.

    Compared to the 6600 GT:

    Pure Pixel Fillrate: X1600 XT 7.08 GP/s vs 6600 GT 4.0 GP/s - 77% more (ATI)
    Output Pixel Fillrate: X1600 XT (4 TMUs) 2.36 GP/s vs 6600 GT (4 ROPs) 2.0 GP/s - 18% more (ATI)
    Vertex Shader Throughput: X1600 XT 737.5 MT/s vs 6600 GT 375 MT/s - 96.7% more (ATI)
    Memory Bandwidth: X1600 XT 1380 MHz, 22.08 GB/s vs 6600 GT 1000 MHz, 16 GB/s - 38% more (ATI)

    Add to that the 256MB vs 128MB comparison, and "more efficient Shader Model 3.0 implementation".
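    These figures follow the usual units-times-clock arithmetic. A quick sanity check of the math (the unit counts and clocks below - X1600 XT at 590 MHz core with 12 pixel shaders, 4 TMUs, 5 vertex units, and a 1380 MHz effective 128-bit memory bus; 6600 GT at 500 MHz with 8 pipes, 4 ROPs, 3 vertex units, and 1000 MHz memory - are assumed specifications, not taken from this comment):

```python
def throughput_gp(units, core_mhz):
    """Fillrate in Gigapixels (or Gigatexels) per second: units * clock."""
    return units * core_mhz / 1000.0

def bandwidth_gbs(effective_mhz, bus_bits=128):
    """Memory bandwidth in GB/s: effective clock * bus width / 8 bits per byte."""
    return effective_mhz * bus_bits / 8 / 1000.0

def pct_more(ati, nv):
    """How much larger the ATI figure is, as a percentage."""
    return round((ati / nv - 1) * 100, 1)

# Pure pixel fillrate: 12 shader pipes @ 590 MHz vs 8 pipes @ 500 MHz
pixel = pct_more(throughput_gp(12, 590), throughput_gp(8, 500))   # ~77%
# Output fillrate: 4 TMUs @ 590 MHz vs 4 ROPs @ 500 MHz
output = pct_more(throughput_gp(4, 590), throughput_gp(4, 500))   # ~18%
# Memory bandwidth: 1380 MHz effective vs 1000 MHz, both 128-bit
memory = pct_more(bandwidth_gbs(1380), bandwidth_gbs(1000))       # ~38%
# Vertex throughput, assuming one vertex per 4 clocks per unit
vertex = pct_more(5 * 590 / 4, 3 * 500 / 4)                       # ~97%
print(pixel, output, memory, vertex)
```

    The per-clock vertex assumption is only one way to reproduce the 737.5 MT/s figure; the point is that every paper spec favors ATI by a wide margin except the output fillrate, which tracks the real-world gaps above.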

    Battlefield 2 with AA
    X1600 XT is ~ 50-60% Faster

    Day of Defeat Source with AA
    X1600 XT is ~ 33% Faster (Starting 12x9)

    Doom 3 with AA (OpenGL)
    X1600 XT is ~ 6600 GT

    FarCry with AA
    X1600 XT is ~ 37%-38% Faster (10x7 to 12x9)

    Chronicles of Riddick No AA (OpenGL)
    X1600 XT is ~ 6600 GT

    Splinter Cell Chaos Theory with AA&AF
    X1600 XT is ~ 12-18% Faster.

    EverQuest No AA
    X1600 XT is 12-18% Faster.

    In some cases it's only faster by about the difference in their output fillrates. Battlefield 2, FarCry, and DOD: Source are its biggest wins, and the two OpenGL games are its poorest showings.

    Not much of this is a surprise however.
  • AtaStrumf - Saturday, October 08, 2005 - link

    I was wondering something along those lines too, especially: why is the X1800 XT so much faster than the X1800 XL, and why is the X1300 Pro not that much slower than the X1600 XT?

    I don't get this new ring bus memory controller. Maybe it has something to do with that, as well as the TMUs and co. In the past we had 256-bit on the high end, 128-bit in the middle, and 64-bit on the low end; now it seems as though all the cards have _the same_ memory controller, which seems a bit odd to me. What is also peculiar is the fact that all but the highest-end X1800 XT have 256 MB of memory, while the X1800 XT has 512 MB. Does more memory now somehow equal more "bits" -- bandwidth?
  • haelduksf - Friday, October 07, 2005 - link

    I'm guessing you're actually having a "peek" at DOD:S performance ;)
