Test Setup and Power Performance

Our testing methodology aimed to cover a lot of ground with top-to-bottom hardware. Covering everything from the X1300 through the X1800 line required quite a few different cards and test runs. To make the data easier to digest, rather than putting everything for each game in one place as we normally do, we have broken our data into three separate groups: Budget, Midrange, and High End.

We used the latest drivers available to us, both of which are beta drivers. From NVIDIA, we tested the 81.82 drivers rather than the current release, as we expect the Release 80 drivers to be in end users' hands before the X1000 series is easy to purchase.

All of our tests were done on this system:

  • ATI Radeon Xpress 200 based motherboard
  • AMD Athlon 64 FX-55
  • 1GB DDR400 at 2-2-2-8 timings
  • 120GB Seagate 7200.7 hard drive
  • OCZ PowerStream 600W PSU

The resolutions we tested range from 800x600 on the low end to 2048x1536 on the high end. The games we tested include:
  • Day of Defeat: Source
  • Doom 3
  • EverQuest 2
  • Far Cry
  • Splinter Cell: Chaos Theory
  • The Chronicles of Riddick
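
To make the scope concrete, the full test matrix is simply every card in each price tier run through each game at each resolution. Below is a minimal Python sketch of how that matrix breaks down; the single card listed per tier is an illustrative assumption (each tier in the review contains several cards), and only the two stated resolution endpoints are shown.

    from itertools import product

    # Illustrative card-to-tier assignments; each tier in the review actually
    # contains several cards from both vendors.
    tiers = {
        "Budget":   ["Radeon X1300 Pro"],
        "Midrange": ["Radeon X1600 XT"],
        "High End": ["Radeon X1800 XT"],
    }

    games = [
        "Day of Defeat: Source", "Doom 3", "EverQuest 2",
        "Far Cry", "Splinter Cell: Chaos Theory",
        "The Chronicles of Riddick",
    ]

    # Only the stated endpoints; intermediate resolutions were also tested.
    resolutions = ["800x600", "2048x1536"]

    # One benchmark run per (tier, card, game, resolution) combination.
    for tier, cards in tiers.items():
        for card, game, res in product(cards, games, resolutions):
            print(f"{tier:>8} | {card} | {game} @ {res}")
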
We were interested in testing the FEAR demo, but after learning that the final version would change the performance characteristics of the game significantly, we decided it would be best to wait until we had a shipping game. From what we hear, the FEAR demo favors ATI's X1000 series considerably over NVIDIA hardware, but performance will be much closer in the final version.

Before we get to the performance numbers, here's a look at the power draw of the various cards.

Load Power

[Chart: load power draw, in watts, for the cards tested]
As we can see, at the high end and in the midrange this generation draws about as much power under load as previous generation products. The X1300 Pro seems to draw a little more power than we would like to see in a budget part. The card also sports a fan that is just as loud as the X1600 XT's. Considering that some of the cards we tested against the X1300 Pro were passively cooled, this is something to note.
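
For context on reading these numbers: load power is typically reported as total system draw measured at the wall, so the card-to-card comparison is effectively the difference between whole-system readings taken on the same testbed under the same 3D load. A tiny sketch of that reasoning is below; the wattages in the usage example are placeholders for illustration, not our measured results.

    def card_power_delta(system_watts_a: float, system_watts_b: float) -> float:
        """Approximate the difference in card power draw from two whole-system
        wall readings, assuming everything else in the testbed is unchanged."""
        return system_watts_a - system_watts_b

    # Placeholder readings for illustration only (not measured results):
    delta = card_power_delta(260.0, 245.0)
    print(f"Card A's system draws {delta:.0f} W more under load than card B's")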

Comments

  • mlittl3 - Wednesday, October 5, 2005 - link

    I'll tell you how it is a win. Take an architecture with 8 fewer pipelines, put it on a brand new 90nm die shrink, clock the hell out of the thing, consume just a little more power, add all the new features like SM3.0, and you equal the competition's fastest card. This is a win. So when ATI releases 1, 2, 3, etc. more quad pipes, they will be even faster.

    I don't see anything, bob. Anandtech's review was a very bad one. ALL the other sites said this is a good architecture and is on par with, and a little faster than, NVIDIA. None of those conclusions can be drawn from the confusing graphs here.

    Read the comments here and you will see others agree. Good job, ATI and NVIDIA, for bringing us competition and evenly matched cards. Now bob, go to some other sites, get a good feel for which card suits your needs, and then go buy one. :)
  • bob661 - Wednesday, October 5, 2005 - link

    I read the other sites as well as AT. Quite frankly, I trust AT before any of the other sites because their methodology and consistency are top notch. HardOCP didn't even test an X1800 XT, and if I were an avid reader of their site I'd be wondering where that review was. I guess I don't see it your way because I only look for bang for the buck, not which card could be better if it had this or had that. BTW, I just got some free money (no, I didn't steal it!) today, so I'm going to pick up a 7800GT. :)
  • Houdani - Wednesday, October 5, 2005 - link

    One of the reasons for the card selection is the price of the cards -- and that was stated as such. Just because ATI is calling the card "low-end" doesn't mean it should only be compared with other low-end cards. If ATI prices their "low-end" card in the same range as a mid-range card, then it should rightfully be compared to the other cards which are at or near that price.

    But your point is well taken. I'd like to see a few more cards tossed in there.
  • Madellga - Wednesday, October 5, 2005 - link

    Derek, I don't know if you have the time for this, but a review at another website showed a huge difference in performance in the FEAR demo. ATI was in the lead with a substantial advantage in maximum framerates, but much closer at the minimum.

    http://techreport.com/reviews/2005q4/radeon-x1000/...

    As FEAR points toward the new generation of engines, it might be worth running some numbers on it.

    Also useful would be to report minimum framerates at the higher resolutions, as this relates to the gameplay experience when all the goodies are cranked up.
  • Houdani - Wednesday, October 5, 2005 - link

    Well, the review does state that the FEAR demo greatly favors ATI, but that the actual shipping game is expected not to show such a bias. Derek purposefully omitted the FEAR demo in order to use the shipping game instead.
  • allnighter - Wednesday, October 5, 2005 - link

    Is it safe to assume that you guys might not have had enough time with these cards to do your usual in-depth review? I'm sure you'll update it so we can get the full picture. I also must say that I'm missing the overclocking part of the review. I wanted to see how true it is that these chips can go sky high. Given the fact that they went through three re-spins, it may well be true.
  • TinyTeeth - Wednesday, October 5, 2005 - link

    ...an Anandtech review.

    But it's a bit thin, I must say. I'm still missing overclocking results and Half-Life 2 and Battlefield 2 results. How come no hardware site has tested the cards in Battlefield 2 yet?

    From my point of view, Doom III, Splinter Cell, Everquest II and Far Cry are the least interesting games out there.

    Overall it's a good review, as you can expect from the absolute best hardware site there is, but I hope and expect there will be another, much larger review.
  • Houdani - Wednesday, October 5, 2005 - link

    The best reason to continue benchmarking games which have been out for a while is that those are the games in which the older GPUs were previously benched. When review sites stop using the old benchmarks, they effectively lose the history for all of the older GPUs, and therefore we lose those GPUs in the comparison.

    Granted, the reviewer is welcome to re-benchmark the old GPUs using the new games ... but that would be a significant undertaking and frankly I don't see many (if any) review sites doing that.

    But I will throw you this bone: While I think it's quite appropriate to use benchmarks for two years (maybe even three years), it would also be a good thing to very slowly introduce new games at a pace of one per year, and likewise drop one game per year.
  • mongoosesRawesome - Wednesday, October 5, 2005 - link

    They have to retest whenever they use a different driver/CPU/motherboard, which is quite often. I bet they have to retest every other article or so. It's a pain in the butt, but that's why we visit and don't do the tests ourselves.
  • Madellga - Wednesday, October 5, 2005 - link

    Techreport has Battlefield 2 benchmarks, as well as FEAR, Guild Wars, and others. I liked the article and recommend that you read it as well.
