Introduction

With our recent architecture and feature set coverage of the ATI Radeon X1000 series launch, we were a little light on performance numbers. Rather than fill out some of the tests and update the article, we decided to take a little more time and do it up right. We have heard the call for more game tests, and today, we bring them on in spades. Our expanded tests include the games mentioned in our earlier coverage, the much-requested Battlefield 2, and an illustration of how well the new X1000 series handles enabling 4xAA in the games we tested.

While scaling with AA on the new X1000 series is very good, will it be good enough to make up for the price difference with competing NVIDIA hardware? Certainly, the feature set is of value, with ATI offering the added benefits of MSAA on MRTs and floating point surfaces, high quality AF, SM3.0, and Avivo. But performance is absolutely critical in current and near-term games. At the moment, many HDR methods avoid floating point output and MRTs in order to maintain compatibility with AA on current hardware. Until game developers shift to full floating point framebuffers or make heavy use of multiple render targets, ATI's added AA support won't make much difference to gamers. High quality anisotropic filtering is definitely something that we have begged of NVIDIA and ATI for a long time, and we are glad to see it, but the benefits just aren't that visible in first-person shooters and the like. Shader Model 3.0 and Avivo are good things to have around as well; better API support, image quality, and video output are things that everyone wants.
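
To make the compatibility problem concrete: a Direct3D 9 game typically asks the runtime whether the hardware can multisample a floating point surface before enabling that HDR path. The sketch below is our own hypothetical illustration (not code from any shipping game) of the kind of capability check that fails on hardware without MSAA-on-float support, which is why developers fall back to integer-format HDR when they want to keep AA.

```cpp
// Hypothetical capability check: can this adapter multisample an FP16 surface?
// On hardware without MSAA support for floating point render targets, the call
// fails and the game must choose between floating point HDR and antialiasing.
#include <d3d9.h>

bool SupportsFp16Msaa(IDirect3D9* d3d)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,    // 64-bit floating point render target format
        TRUE,                    // windowed mode
        D3DMULTISAMPLE_4_SAMPLES,
        &qualityLevels);
    return SUCCEEDED(hr);
}
```

A game that sees this check fail has to render HDR to an integer-format surface (or give up AA entirely), which is exactly the situation ATI's new MSAA-on-float support is meant to address.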

However, the bottom line is that performance sells video cards. The first thing that people question when they are looking for a new card is just how well it runs their favorite game. Hopefully, we will be able to shed some light on the issue here.

We will look at resolutions from 800x600 up to 2048x1536, and antialiasing tests will be included where possible. In games where we tested both with and without antialiasing, we will also show a graph of how the performance drop due to AA scales with resolution. This data is presented as a lower-is-better graph (a smaller drop in frame rate is a good thing) and is plotted over resolution (as the percentage drop tends to increase with resolution). The test system that we employed is the same one used in our initial tests of the hardware.
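
For reference, the metric behind those scaling graphs is simple arithmetic: the percentage of frame rate lost when AA is enabled at a given resolution. Here is a minimal sketch of the calculation; the frame rates in it are placeholders for illustration, not benchmark results.

```cpp
// Percentage performance drop from enabling AA at a given resolution:
//     drop = (fps_no_aa - fps_aa) / fps_no_aa * 100
// Lower is better: a smaller drop means enabling AA costs less on that card.
#include <cstdio>

double AaDropPercent(double fpsNoAa, double fpsAa)
{
    return (fpsNoAa - fpsAa) / fpsNoAa * 100.0;
}

int main()
{
    // Placeholder numbers, not measured results.
    const double fpsNoAa = 90.0;  // frame rate with AA disabled
    const double fpsAa   = 72.0;  // frame rate with 4xAA enabled
    std::printf("performance drop: %.1f%%\n", AaDropPercent(fpsNoAa, fpsAa));
    return 0;  // prints "performance drop: 20.0%"
}
```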

Battlefield 2 Performance
Comments

  • flexy - Saturday, October 8, 2005 - link

There is an interesting article (in German, sorry) where they compare the old card's (X850) performance with the new adaptive antialiasing turned on.

You can see that some games do pretty well with only minor performance loss, but e.g. FarCry takes a HUGE hit from enabling adaptive antialiasing. I also did some tests of my own (X850XT) and the hit is as big as 50% in the FarCry benchmark.

My question would be how the new cards handle this and how big the performance hit would be, e.g. with an 1800XL/XT, in certain engines.

Also, I think the 6x antialiasing modes are a bit under-represented. I, for my part, am used to playing HL2 at 1280x1024 with 6xAA and 16xAF, and I am not that interested in 4xAA/8xAF, since I ASSUME that a high-end card like the 1800XT should be predestined to run the higher AA/AF modes PLUS adaptive antialiasing. Please also note that a big number (?) of people might not even be able to run monster resolutions like 2048x, but MIGHT certainly be interested in resolutions up to 1600x with max AA/AF/adaptive modes on.

  • flexy - Saturday, October 8, 2005 - link

Here is the link, sorry, forgot it above:

    http://www.3dcenter.org/artikel/2005/10-04_b.php


  • cryptonomicon - Friday, October 7, 2005 - link

I want to see ATI release a product that takes back the performance crown... only then can they sit on high price premiums for their cards again, because they own the highest performance. Until then, they can get busy slashing prices...
  • ElFenix - Friday, October 7, 2005 - link

    you guys fail to realize that, at retail prices for nvidia cards, the ati cards slot quite nicely. best buy and compusa still sell 6600GTs for nearly $200, and 6800GTs for nearly $300. so, comparing those prices to the ATi prices reveals that ATi is quite price competitive. of course, no one who reads this site buys at retail (unless it's a hot deal), but there isn't any reason to think that ATi cards can't come down in price as quickly as the nvidia cards.
  • bob661 - Saturday, October 8, 2005 - link

Ummm yeah. We, the geeks, don't shop at CompUSA or Best Buy. Therefore, ATI's new hardware is NOT price competitive. Also, if Nvidia 6600GTs are $200 and 6800GTs are $300 at said stores, how would the ATI cards magically not get price gouged too?
  • ElFenix - Tuesday, October 11, 2005 - link

did you even bother to read my post before shooting off your idiotic reply? i said that no one here shops at best buy. and you don't know if ati's hardware is price competitive or not because, at the moment, you can't buy it. once it gets out into the channel, maybe newegg and zzf and monarch will stock them at prices as competitive as the nvidia parts. maybe not. but you don't know that yet, so making blanket statements like 'ati is not price competitive' is stupid!

i'm not really sure what this 'price gouging' is you're referring to, but because you've already demonstrated your inability to comprehend the english language, i'm going to assume it's because you think best buy and compusa are selling for more than msrp. they're not. they are selling at msrp. and at best buy and compusa, ati cards will sell at msrp. and ati cards at msrp are quite price competitive with nvidia cards at msrp.
  • shabby - Friday, October 7, 2005 - link

Let's see some HDR+AA benchmarks.
  • DerekWilson - Saturday, October 8, 2005 - link

There are no games in which we can test this feature yet.
  • TinyTeeth - Friday, October 7, 2005 - link

    You make up for the flaws of the last review and show that you still are the best hardware site out there. Keep it up!
  • jojo4u - Friday, October 7, 2005 - link

    The graphs give a nice overview, good work.

Please consider including in the graphs the information about what AF level was used. This is something all recent reviews here have been lacking.

About the image quality: the shimmering was greatly reduced with the fixed driver (78.03), so it's down to NV40 level now. But 3DCenter.de[1] and Computerbase.de conclude that only enabling "high quality" in the Forceware driver brings image quality comparable to "A.I. low". Perhaps you will find the time to explore this issue in the image quality tests.

[1] http://www.3dcenter.de/artikel/g70_flimmern/index_...
This article is about the unfixed quality. But to judge the G70 today, have a look at the 6800U videos.
http://www.hexus.net/content/item.php?item=1549&am...
This article shows the performance hit of enabling "high quality".
