Hardware Features and Test Setup

We're talking about features and tests up front because we are trying something a bit different this time around. In addition to our standard noAA/4xAA tests (both of which always have 8xAF enabled), we are including a performance test at maximum image quality on each architecture. This won't give us directly comparable performance numbers, but it will give us an idea of playability at each card's maximum quality.

These days, we are running out of ways to push our performance tests. Plenty of games out there are CPU limited, and for what purpose is a card as powerful as an X1900 XTX or 7800 GTX 512 purchased except to be pushed to its limit and beyond? Certainly, an interesting route would be for us to purchase a few Apple Cinema Displays and possibly an old IBM T221 and go insane with resolution, and maybe we will at some point. But for now, most people don't have 30" displays (though the increasing power of today's graphics cards is certainly a compelling argument for such an investment). Instead, people can push their high end cards by enabling extreme features and getting the absolute maximum eye candy possible out of all their games. Flight and space sim fans now have angle independent anisotropic filtering on ATI hardware, adaptive antialiasing for transparent textures helps in games with lots of fences, wires, and other fine detail work, and 6xAA combined with 16xAF means you'll almost never have to look at a blurry texture or a jagged edge again. It all comes at a price, of course, but is it worth it?

In our max quality tests, we compare ATI parts running 16xAF, 6xAA, adaptive AA, high quality AF, and as little Catalyst AI as possible against NVIDIA parts running 16xAF, 4x or 8xS AA (depending on what the application reasonably supports), transparency AA, and no optimizations (high quality mode). In all cases, ATI will have the image quality advantage thanks to angle independent AF and 6x MSAA. Some games with in-game AA settings didn't offer an 8xAA option and didn't play well when we forced it in the driver, so we usually opted for the highest in-game AA setting, which usually corresponds to the highest MSAA level the hardware supports. We tend to like NVIDIA's transparency SSAA a little better than ATI's adaptive AA, but that may just come down to opinion, and it still doesn't make up for the quality advantages the X1900 holds over the 7800 GTX lineup.
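
To make the two configurations easier to compare at a glance, here is a schematic summary of our max quality settings as a small Python sketch. The dictionary keys and labels are shorthand invented for this article; they do not correspond to actual Catalyst or ForceWare control panel identifiers or driver API calls.

    # Schematic summary of the "max quality" settings described above.
    # Keys and labels are article shorthand, not real driver identifiers.
    MAX_QUALITY = {
        "ATI (X1900 series)": {
            "anisotropic filtering": "16x, high quality (angle independent)",
            "antialiasing": "6x MSAA",
            "transparency handling": "Adaptive AA",
            "optimizations": "Catalyst AI set as low as possible",
        },
        "NVIDIA (7800 GTX series)": {
            "anisotropic filtering": "16x, high quality (no optimizations)",
            "antialiasing": "4x or 8xS AA, depending on application support",
            "transparency handling": "Transparency SSAA",
            "optimizations": "disabled (high quality mode)",
        },
    }

    # Print the settings side by side for a quick sanity check.
    for card, settings in MAX_QUALITY.items():
        print(card)
        for name, value in settings.items():
            print(f"  {name}: {value}")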

Our standard tests should look pretty familiar, and all of the test hardware we used is listed below. Multiple systems were required in order to test both CrossFire and SLI, but all single card tests were performed on the ATI reference RD480 board.

ATI Radeon Xpress 200 based system
NVIDIA nForce 4 based system
AMD Athlon 64 FX-57
2x 1GB DDR400 2:3:2:8
120 GB Seagate 7200.7 HD
600 W OCZ PowerStream PSU

First up is our apples-to-apples testing, with NVIDIA and ATI set up to produce comparable image quality using 8xAF and either no AA or 4xAA. The resolutions we will look at run from 1280x960 (or 1280x1024) through 2048x1536.
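
For reference, the full run matrix for these tests looks something like the sketch below. Only the endpoint resolutions are named above; the intermediate steps listed here are assumptions based on a typical review resolution ladder.

    from itertools import product

    # Sketch of the standard benchmark matrix. The endpoint resolutions
    # come from the text above; the intermediate steps are assumed values.
    resolutions = ["1280x960", "1600x1200", "1920x1440", "2048x1536"]
    aa_modes = ["noAA", "4xAA"]  # 8xAF is enabled in both cases

    for res, aa in product(resolutions, aa_modes):
        print(f"{res} with {aa} + 8xAF")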

Comments

  • photoguy99 - Tuesday, January 24, 2006 - link

    Why do the editors keep implying the power of cards is "getting ahead" of games when it's actually not even close?

    - 1600x1200 monitors are pretty affordable
    - 8xAA does look better than 4xAA
    - It's nice to play games with a minimum frame rate of 50-60

    Yes, these are high end desires, but the X1900XT can't even meet these needs despite its great power.

    Let's face it - the power of cards could double tomorrow and still be put to good use.
  • mi1stormilst - Tuesday, January 24, 2006 - link

    Well said, well said, my friend...

    We need to stop being so impressed by so very little. When games look like REAL LIFE does, with lots of colors, shading, and no jagged edges (unless it's from the knife I just plunged into your eye) lol you get the picture.
  • poohbear - Tuesday, January 24, 2006 - link

    Technology moves forward at a slower pace than that, mates. You expect every vid card to be a 9700 Pro?! Right. There has to be a pace the developers can follow.
  • photoguy99 - Wednesday, January 25, 2006 - link

    I think we are agreeing with you -

    The article authors keep implying they have to struggle to push these cards to their limit because they are getting so powerful so fast.

    To your point, I do agree it's moving forward slowly - relative to what people can make use of.

    For example, 90% of Office users cannot make use of a faster CPU.

    However, 90% of gamers could make use of a faster GPU.

    So even though GPU performance is doubling faster than CPU performance, they should keep it up, because we can and will use every ounce of it.
  • Powermoloch - Tuesday, January 24, 2006 - link

    It is great to see that ATi is doing their part right ;)
  • photoguy99 - Tuesday, January 24, 2006 - link

    When DX10 is released with Vista, it seems like having this card would be like having an SM2.0 card today - you're behind the curve again.

    Yea, I know there is always something better around the corner - and I don't recommend waiting if you want a great card now.

    But I'm sure some people would like to know.
  • Spoelie - Thursday, January 26, 2006 - link

    Not at all. I do not see DX10 arriving before Vista near the end of this year, and if it does arrive earlier, it will not make any splash whatsoever in game development before then. Even so, you cannot be 'behind' if your only competitor is still at SM3.0 as well. As far as I can tell, there will be no HARD architectural changes in G71/7900 - they might improve tidbits here and there, like support for AA while doing HDR rendering, but that will be about the full extent of the changes.
  • DigitalFreak - Tuesday, January 24, 2006 - link

    True, but I'm betting it will be quite a while before we see any DX10 games. I would suspect that the R620/G80 will be DX10 parts.
  • timmiser - Tuesday, January 24, 2006 - link

    I expect that Microsoft's Flight Simulator X will be the first DX10 game.
  • hwhacker - Tuesday, January 24, 2006 - link

    Question to Derek (or whomever):

    Perhaps I interpreted something wrong, but is it correct that you're saying the X1900 is more of a 12x4 design (because of Fetch4) than the 16x3 we always thought? If so, that would make it A LOT more like Xenos, and perhaps R600, which makes sense if I recall their ALU setup correctly (Xenos is 16x4 with one for stall, so effectively 16x3). R520 was 16x1, so... I gotta ask... does this mean a 16x4 part is imminent, or am I just reading the information incorrectly?

    If that's true, ATi really did mess with the definition of a pipeline.

    I can hear the rumours now... R590 with 16 QUADS, 16 ROPs, 16 TMUs, and 64 pixel processors... Oh yeah, and GDDR4 (on an 80nm process). You heard it here first. ;)
