Hardware Features and Test Setup

We're talking about features and tests today because we are going to be trying something a bit different this time around. In addition to our standard noAA/4xAA tests (both of which always have 8xAF enabled), we are including a performance test at maximum image quality on each architecture. This won't give us directly comparable performance numbers, but it will give us an idea of playability at maximum quality.

These days, we are running out of ways to push our performance tests. Plenty of games out there are CPU limited, and for what purpose is a card as powerful as an X1900 XTX or 7800 GTX 512 purchased except to be pushed to its limit and beyond? Certainly, a very interesting route would be for us to purchase a few Apple Cinema Displays and possibly an old IBM T221 and go insane with resolution. And maybe we will at some point. But for now, most people don't have 30" displays (though the increasing power of today's graphics cards is certainly a compelling argument for such an investment). Instead, people can push their high end cards by enabling insane features and getting the absolute maximum eye candy possible out of all their games. Flight and space sim nuts now have angle independent anisotropic filtering on ATI hardware, adaptive antialiasing for textured surfaces helps in games with lots of fences, wires, and tiny detail work, and 6xAA combined with 16xAF means you'll almost never have to look at a blurry texture with jagged edges again. It all comes at a price, of course, but is it worth it?

In our max quality tests, we compare ATI parts running 16xAF, 6xAA, adaptive AA, high quality AF, and as little Catalyst AI as possible against NVIDIA parts running 16xAF, 4x or 8xS AA (depending on reasonable support in the application), transparency AA, and no optimizations (high quality). In all cases, ATI will have the image quality advantage with angle independent AF and 6x MSAA. Some games with in-game AA settings didn't offer an option for 8xAA and didn't play well when we forced it in the driver, so we usually opted for the highest in-game AA setting (which, again, usually reflects the highest MSAA level supported in hardware). We tend to like NVIDIA's transparency SSAA a little better than ATI's adaptive AA, but that may just come down to opinion, and it still doesn't make up for the quality advantages the X1900 holds over the 7800 GTX lineup.

Our standard tests should look pretty familiar, and all of the test hardware we used is listed below. Multiple systems were required in order to test both CrossFire and SLI, but all single card tests were performed on the ATI reference RD480 board.

ATI Radeon Xpress 200 based system
NVIDIA nForce 4 based system
AMD Athlon 64 FX-57
2x 1GB DDR400 (2-3-2-8 timings)
120 GB Seagate 7200.7 hard drive
600W OCZ PowerStream PSU

First up is our apples to apples testing, with NVIDIA and ATI set up to produce comparable image quality using 8xAF and either no AA or 4xAA. The resolutions we will look at run from 1280x960 (or 1280x1024) through 2048x1536.
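To give a rough sense of the raster load this resolution range represents, the pixel-count ratio between the two endpoints is simple arithmetic. A quick sketch (only the two endpoint resolutions come from the article):

```python
# Pixel counts for the lowest and highest tested resolutions.
low_res = 1280 * 960    # 1,228,800 pixels
high_res = 2048 * 1536  # 3,145,728 pixels

print(high_res / low_res)  # → 2.56: the top resolution pushes 2.56x the pixels per frame
```

That factor of 2.56 in per-frame work is why high resolutions remain one of the few reliable ways to separate cards in otherwise CPU-limited games.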


120 Comments


  • poohbear - Tuesday, January 24, 2006 - link

    $500 too much? There are cars for $300,000+, but you don't see the majority of people complaining, because they're NOT aimed at you and me, and Ferrari and Lamborghini couldn't care less what we think, since we're not their target audience. Get over yourself. There ARE cards for you in the $100-$300 range, so what are you worried about?
  • timmiser - Tuesday, January 24, 2006 - link

    While I agree with what you are saying, we are already on our 3rd generation of $500 high end graphic cards. If memory serves, it was the Nvidia 6800 that broke the $500 barrier for a single card solution.

    I'm just happy it seems to have leveled off at $500.
  • Zebo - Tuesday, January 24, 2006 - link

    Actually, GPUs in general scale very well in price/performance, and this is no exception. Twice as fast as an X850 XT, which you can get for $275, it should cost twice as much, or $550, which it does. If you want to complain about prices, look at CPUs, high end memory, and Raptors/SCSI, where the higher line items offer small benefits for huge price premiums.
  • fishbits - Tuesday, January 24, 2006 - link

    Geez, talk about missing the point. News flash: Bleeding edge computer gear costs a lot. $500 is an excellent price for the best card out. Would I rather have it for $12? Yes. Can I afford/justify a $500 gfx card? No, but more power to those who can, and give revenue to ATI/Nvidia so that they can continue to make better cards that relatively quickly fall within my reach. I can't afford a $400 9800 pro either... whoops! They don't cost that much now, do they?

    quote:

    Even if you played a game that needs it, you should be pissed at the game company that puts out a bloated mess that needs a $500 card.

    Short-sighted again. Look at the launch of Unreal games for instance. Their code is always awesome on the performance side, but can take advantage of more power than most have available at release time. You can tell them their code is shoddy, good luck with that. In reality it's great code that works now, and your gaming enjoyment is extended as you upgrade over time and can access better graphics without having to buy a new game. Open up your mind, quit hating and realize that these companies are giving us value. You can't afford it now, neither can I, but quit your crying and applaud Nv/ATI for giving us constantly more powerful cards.
  • aschwabe - Tuesday, January 24, 2006 - link

    Agreed. I'm not sure how anyone considers $500 for ONE component a good price. I'll pay no more than $300-350 for a vid card.
  • bamacre - Tuesday, January 24, 2006 - link

    Hear, hear!! A voice of reason!
  • rqle - Tuesday, January 24, 2006 - link

    I like the new line graph colors and interface, but I like bar graphs so much more. Never been a big fan of SLI or Crossfire on the graphs; it makes them distracting, especially since it only represents a small group. Wonder if CrossFire and SLI could have their own graphs, or maybe their own color. =)
  • DerekWilson - Tuesday, January 24, 2006 - link

    It would be possible for us to look at multi-GPU solutions separately, but it is quite relevant to compare single card performance to multi-GPU performance -- especially when trying to analyze performance.
  • Live - Tuesday, January 24, 2006 - link

    Good reading! Good to see ATI getting back in the game. Now lets see some price competition for a change.

    I don't understand what CrossFire XTX means. I thought there was no XTX CrossFire card? Since the CrossFire and XT have the same clocks, it shouldn't matter if the other card is an XTX. By looking at the graphs, it would seem I was wrong, but how can this be? This would indicate that the XTX has more going for it than just the clocks, but that is not so, right?

    Bah, I'm confused :)
  • DigitalFreak - Tuesday, January 24, 2006 - link

    My understanding is that Crossfire is async, so both cards run at their maximum speed. The XTX card runs at 650/1.55, while the Crossfire Edition card runs at 625/1.45. You're right, there is no Crossfire Edition XTX card.
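The clock figures quoted in that last comment translate into a small memory bandwidth gap between the two cards. A quick sketch of the arithmetic (the 256-bit bus width is an assumption on our part, not stated in the comment):

```python
# Memory bandwidth implied by the quoted effective memory clocks,
# assuming a 256-bit memory bus (32 bytes per transfer).
bus_bytes = 256 // 8

xtx_bandwidth = 1.55e9 * bus_bytes / 1e9        # XTX at 1.55 GHz -> GB/s
crossfire_bandwidth = 1.45e9 * bus_bytes / 1e9  # CrossFire Edition at 1.45 GHz -> GB/s

print(f"XTX: {xtx_bandwidth:.1f} GB/s, CrossFire Edition: {crossfire_bandwidth:.1f} GB/s")
# → XTX: 49.6 GB/s, CrossFire Edition: 46.4 GB/s
```

Because the cards run asynchronously, the XTX in such a pairing keeps that small bandwidth (and core clock) edge rather than being throttled down to the CrossFire Edition's speeds.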
