Hardware Features and Test Setup

We're talking about features and tests today because we are trying something a bit different this time around. In addition to our standard noAA/4xAA tests (both of which always have 8xAF enabled), we are including a performance test at maximum image quality on each architecture. This won't give us directly comparable performance numbers, but it will give us an idea of playability at maximum quality.

These days, we are running out of ways to push our performance tests. Plenty of games out there are CPU limited, and for what purpose is a card as powerful as an X1900XTX or 7800 GTX 512 purchased except to be pushed to its limit and beyond? Certainly, an interesting route would be for us to purchase a few Apple Cinema Displays and possibly an old IBM T221 and go insane with resolution, and maybe we will at some point. But for now, most people don't have 30" displays (though the increasing power of today's graphics cards is certainly a compelling argument for such an investment). For now, people can push their high end cards by enabling insane features and getting the absolute maximum eye candy possible out of all their games. Flight and space sim nuts now have angle independent anisotropic filtering on ATI hardware, adaptive antialiasing helps in games with lots of fences, wires, and tiny detail work, and 6xAA combined with 16xAF means you'll almost never have to look at a blurry texture with jagged edges again. It all comes at a price, of course, but is it worth it?

In our max quality tests, we will compare ATI parts with 16xAF, 6xAA, adaptive AA, high quality AF, and as little Catalyst AI as possible enabled, to NVIDIA parts with 16xAF, 4x or 8xS AA (depending on reasonable support in the application), transparency AA, and no optimizations (high quality) enabled. In all cases, ATI will have the image quality advantage with angle independent AF and 6x MSAA. Some games with in-game AA settings didn't offer an option for 8xAA and didn't play well when we forced it in the driver, so we opted to go with the highest in-game AA setting most of the time (which, again most of the time, reflects the highest MSAA level supported in hardware). We tend to like NVIDIA's transparency SSAA a little better than ATI's adaptive AA, but that may just come down to opinion, and it still doesn't make up for the quality advantages the X1900 holds over the 7800 GTX lineup.

Our standard tests should look pretty familiar, and here is all the test hardware we used. Multiple systems were required in order to test both CrossFire and SLI, but all single card tests were performed on the ATI reference RD480 board.

ATI Radeon Xpress 200 based system
NVIDIA nForce 4 based system
AMD Athlon 64 FX-57
2x 1GB DDR400 2:3:2:8
120 GB Seagate 7200.7 HD
600 W OCZ PowerStream PSU

First up is our apples-to-apples testing, with NVIDIA and ATI set up to produce comparable image quality with 8xAF and either no AA or 4xAA. The resolutions we will look at range from 1280x960 (or 1280x1024) through 2048x1536.

Comments (120)

  • tuteja1986 - Tuesday, January 24, 2006 - link

    Wait for the FiringSquad review then :) if you want 8xAA
  • beggerking - Tuesday, January 24, 2006 - link

    Did anyone notice it? The breakdown graphs don't quite reflect the actual data...

    The breakdown shows the 1900XTX being much faster than the 7800 512, but in the actual performance graphs the 1900XTX is sometimes outpaced by the 7800 512...
  • SpaceRanger - Tuesday, January 24, 2006 - link

    All the second-to-last section describes is the Image Quality. There was no explanation of power consumption at all. Was this an accidental omission or something else??
  • Per Hansson - Tuesday, January 24, 2006 - link

    Yes, please show us the power consumption ;-)

    A few things I would like to see done: put a low-end PCI GFX card in the comp, boot it and record power consumption, then leave that card in and do your normal tests with a single X1900 and then dual, so we get a real data point on how much power they consume...

    Also, please clarify exactly what PSU was used and how the consumption was measured, so we can figure out more accurately how much power the card really draws (when counting in the (in)efficiency of the PSU, that is)...
  • peldor - Tuesday, January 24, 2006 - link

    That's a good idea for isolating the power draw of the video card.

    From the other reviews I've read, the X1900 cards are seriously power hungry. In the neighborhood of 40-50W more than the X1800XT cards. The GTX 512 (and GTX of course) are lower than the X1800XT, let alone the X1900 cards.
  • vaystrem - Tuesday, January 24, 2006 - link

    Anyone else find this interesting??

    Battlefield 2 @ 2048x1536 Max Detail
    7800 GTX 512: 33 FPS
    ATI 1900XTX: 32.9 FPS
    ATI 1900XTX CrossFire: 29 FPS
    -------------------------------------
    Day of Defeat
    7800 GTX 512: 18.93 FPS
    ATI 1900XTX: 35.5 FPS
    ATI 1900XTX CrossFire: 35 FPS
    -------------------------------------
    F.E.A.R.
    7800 GTX 512: 20 FPS
    ATI 1900XTX: 36 FPS
    ATI 1900XTX CrossFire: 49 FPS
    -------------------------------------
    Quake 4
    7800 GTX 512: 43.3 FPS
    ATI 1900XTX: 42 FPS
    ATI 1900XTX CrossFire: 73.3 FPS


  • DerekWilson - Tuesday, January 24, 2006 - link

    Be careful here... these max detail settings enabled SuperAA modes, which really killed performance... especially with all the options flipped on for quality.

    We're working on getting some screens up to show the IQ difference, but suffice it to say that the max detail settings are very apples to oranges.

    We would have seen performance improvements if we had simply kept using 6xAA...
  • DerekWilson - Tuesday, January 24, 2006 - link

    To further clarify: F.E.A.R. didn't play well when we set AA outside the game, so its max quality run ended up using the in-game 4xAA setting; thus we see a performance improvement.

    For Day of Defeat, forcing AA/AF through the control panel works well, so we were able to crank up the quality.

    I'll try to go back and clarify this in the article.
  • vaystrem - Wednesday, January 25, 2006 - link

    I'm not sure how that justifies what happens. Your argument is that these are the VERY highest settings, so it's OK for the 'dual' 1900XTX to have lower performance than a single-card alternative? That doesn't seem to make sense, and it speaks poorly for the ATI implementation.
  • Lonyo - Tuesday, January 24, 2006 - link

    The XTX especially in Crossfire does seem to give a fair boost in a number of tests over the XT and XT in Crossfire.
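Per Hansson's suggested measurement method above amounts to simple subtraction: record wall power with a cheap baseline card, record it again with the card under test, and scale the difference by PSU efficiency. A minimal sketch of that arithmetic; the 80% efficiency figure and the wall readings are hypothetical illustration values, not measurements from this review:

```python
# Card-isolation method: measure AC wall power with a low-end baseline card
# installed, then with the card under test, and subtract. Multiplying the
# delta by PSU efficiency approximates the DC power actually delivered to
# the card. Efficiency and readings below are assumed, not measured.

def card_draw_watts(wall_with_card, wall_with_baseline, psu_efficiency=0.80):
    """Estimate the extra DC power the test card draws over the baseline."""
    return (wall_with_card - wall_with_baseline) * psu_efficiency

# Hypothetical meter readings (AC watts at the wall):
extra = card_draw_watts(wall_with_card=310, wall_with_baseline=180)
print(f"Estimated extra draw of the test card: {extra:.0f} W")  # 104 W
```

This only bounds the card's draw, since PSU efficiency itself varies with load, which is why the comment also asks for the exact PSU and measurement method.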
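The CrossFire scaling implied by vaystrem's FPS figures is easy to check directly. A quick sketch, with the single-card and CrossFire X1900XTX values copied from that comment:

```python
# CrossFire scaling from the FPS figures quoted in the thread above:
# ratio = CrossFire FPS / single-card FPS. Values copied from the comment.
results = {
    "Battlefield 2": (32.9, 29.0),
    "Day of Defeat": (35.5, 35.0),
    "F.E.A.R.":      (36.0, 49.0),
    "Quake 4":       (42.0, 73.3),
}
for game, (single, crossfire) in results.items():
    ratio = crossfire / single
    note = "negative scaling" if ratio < 1.0 else "positive scaling"
    print(f"{game}: {ratio:.2f}x ({note})")
```

Battlefield 2 and Day of Defeat come out below 1.0x, which is the anomaly the thread is arguing about; as DerekWilson notes, the max quality settings (SuperAA modes) differed per game, so these ratios are not a clean CrossFire scaling benchmark.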
