Introduction

With our recent coverage of the architecture and feature set of the ATI Radeon X1000 series at launch, we were a little light on performance numbers. Rather than simply fill out a few tests and update the article, we decided to take a little more time and do it right. We have heard the call for more game tests, and today, we bring them on in spades. Our expanded tests include the games mentioned in our earlier coverage as well as the much requested Battlefield 2, and we illustrate how well the new X1000 series handles enabling 4xAA in the games that we tested.

While scaling with AA on the new X1000 series is very good, will it be good enough to make up for the price difference against competing NVIDIA hardware? Certainly, the feature set has value, with ATI offering the added benefit of MSAA on MRTs and floating point surfaces, high quality AF, SM3.0, and Avivo. But performance is absolutely critical in current and near-term games. Currently, many HDR methods avoid floating point output and MRTs in order to maintain compatibility with AA on current hardware. Until game developers shift to full floating point framebuffers or make heavy use of multiple render targets, ATI's added AA support won't make much difference to gamers. High quality anisotropic filtering is definitely something that we have begged of NVIDIA and ATI for a long time, and we are glad to see it, but the benefits just aren't that visible in first-person shooters and the like. Shader Model 3.0 and Avivo are good things to have around as well; better API support, image quality, and video output are things that everyone wants.

However, the bottom line is that performance sells video cards. The first thing that people question when they are looking for a new card is just how well it runs in their favorite game. Hopefully, we will be able to shed some light on the issue here.

We will look at resolutions from 800x600 up to 2048x1536, and antialiasing tests will be included where possible. In games where we tested both with and without antialiasing, we will also show a graph of how the performance drop due to AA scales with resolution. This data will be a lower-is-better graph (less drop in frame rate is a good thing) and will be shown scaled over resolution (as the performance drop tends to increase in percentage terms with resolution). The test system that we employed is the one used for our initial tests of the hardware.

Battlefield 2 Performance

93 Comments


  • bob661 - Friday, October 07, 2005 - link

    Are they fully DX10 or only partially? If partially, will that be enough to be Vista compliant?
  • Clauzii - Friday, October 07, 2005 - link

    I'm pretty amazed that ATI, despite having only 2/3 the pipeline capacity, can accomplish almost the same as a 7800GTX thanks to the higher clock rate.

    I'll be looking forward to the R580 even more.
  • MemberSince97 - Friday, October 07, 2005 - link

    Ahh, thank you Derek, this is much more AT style.
  • Madellga - Friday, October 07, 2005 - link

    Derek, nice update. Thanks for including 1920x1200 in the benchmarks; it's a good move and I hope that other sites follow AT on that.

    It is interesting to see how the performance of some higher end cards falls off above 1600x1200. Anyone buying WS monitors should pay attention to this.

    I was not convinced that the X1800XT was a better performer than the 7800GTX, but looking at the WS high resolutions with AA+AF pretty much settles the discussion.

    Don't let the critics bug you. Use it as feedback and source of ideas for future reviews.

    In the next article, please do not forget to check the famous "shimmering" effect.
    Does the R520 family handle this issue better than the G70?

    Take care
  • JNo - Monday, October 10, 2005 - link

    Well put! This is extremely helpful for 1920x1200 LCD owners.
  • erinlegault - Friday, October 07, 2005 - link

    I think an important point that is missing from all reviews is the importance of a Vista compatible graphics card. The X1xxx cards are the first graphics cards compatible with the new spec.

    So the price premium may be worthwhile if you are interested in upgrading to Vista, whenever it is finally released.
  • bob661 - Friday, October 07, 2005 - link

    All you need is DX9 to be Vista compatible.
  • bob661 - Friday, October 07, 2005 - link

    Oops, DX8.
  • tfranzese - Friday, October 07, 2005 - link

    From the article:
    quote:

    But performance is absolutely critical in current and near-term games.


    Yet you guys tested none of those. I think benchmarking available versions of FEAR, Call of Duty 2, Serious Sam 2, Black and White 2, etc. would be much more interesting than some of the choices made here. All the cards tested handle today's games well, but I would expect that most who buy these cards are buying them for games that are soon to be released or coming in the next one or two quarters.
  • karlreading - Friday, October 07, 2005 - link

    I must admit, it seems to me that everyone's just giving AnandTech a hard time. The review seemed pretty reasonable; they responded to the massive backlash they got from their first review, and I think that's where they deserve the credit. Sheesh guys, give 'em a break!
    karlos
