Introduction

Take all the clichés used to describe a long overdue event or the unexpected fulfillment of a promise (hot places freezing over, heavy animals soaring through the air, etc.) and you still couldn't say enough to fully proclaim the news that ATI has finally, properly, hard launched a product. That's right: a look around the internet this morning provided us with the joyous realization that the Radeon X1900 XT, XTX, and CrossFire parts are available for purchase. We've been keeping an eye on the situation, and it has been quite easy to see that ATI would be able to pull it off this time. Some sites started taking preorders earlier in the week saying their X1900 parts would ship in one to two days, putting the timeframe right on the mark. There were no missing dongles, no problems with customs, and ATI told us last week that thousands of parts had already been delivered to manufacturers.

And if that isn't enough to dance about, ATI has delivered a hugely powerful part with this launch. The Radeon X1900 series is no joke, and every card bearing the name is a behemoth. With triple the pixel shader units of the X1800 XT and a general increase in supporting hardware throughout the pixel processing engine, ATI's highly clocked, 384 million transistor GPU is capable of crunching enormous volumes of data very quickly. Fill rate isn't increased very much, as the X1900 series still only allows 16 pixels to be drawn to the screen per clock cycle, but power is delivered where it is needed most. With longer and more complex shader programs, pixels need to stay in the shader engine longer, which shifts the performance burden further away from theoretical maximum fill rate and onto shader throughput.
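To put rough numbers behind that shift, here is a minimal back-of-the-envelope sketch in Python. The unit counts and clock speeds it uses are illustrative assumptions, not figures taken from this article:

    # First-order estimate of throughput: units * clock.
    # Unit counts and clocks below are illustrative assumptions only.
    def throughput_mops(units, clock_mhz):
        return units * clock_mhz  # millions of operations per second

    x1800xt  = {"rops": 16, "pixel_shaders": 16, "clock_mhz": 625}   # assumed
    x1900xtx = {"rops": 16, "pixel_shaders": 48, "clock_mhz": 650}   # assumed

    for name, gpu in (("X1800 XT", x1800xt), ("X1900 XTX", x1900xtx)):
        fill = throughput_mops(gpu["rops"], gpu["clock_mhz"])            # Mpixels/s
        shade = throughput_mops(gpu["pixel_shaders"], gpu["clock_mhz"])  # M shader ops/s
        print(f"{name}: ~{fill} Mpixels/s fill rate, ~{shade} M shader ops/s")

With the ROP count unchanged, theoretical fill rate only moves with the modest clock difference, while per-pixel shading resources roughly triple.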

NVIDIA would like us to compare the X1900's increase in ALU (arithmetic logic unit) power to what they did with the FX 5900 after NV30 tanked. Certainly, increasing the math power (and increasing memory bandwidth) helped NVIDIA, but fortunately for ATI the X1900 is not derived from a fundamentally flawed GPU design. The X1800 series are certainly not bad parts, even if they are being completely replaced by the X1900 in ATI's lineup.

I'll spoil the results and make it clear that the X1900XT and XTX are hands down the best cards out there right now. But all positives aside, ATI needed this card to hard launch with good availability, perform better than anything else, and look good doing it. There have been too many speed bumps in ATI's way for there to be any room for a slip up on this launch, and it looks like they've pulled it off. The launch of the X1900 series not only puts ATI back on top, but (much more importantly) it puts them back in the game. Let's hope that both ATI and NVIDIA can keep up the good fight.

But let's not forget why we're here. The first thing we are going to do is talk about what makes the R580 GPU that powers the X1900 series so incredibly good at what it does.

120 Comments

  • tuteja1986 - Tuesday, January 24, 2006 - link

    Wait for the FiringSquad review then :) if you want 8xAA.
  • beggerking - Tuesday, January 24, 2006 - link

    Did anyone notice it? The breakdown graphs don't quite reflect the actual data...

    The breakdown shows the 1900 XTX being much faster than the 7800 GTX 512, but in the actual performance graphs the 1900 XTX is sometimes outpaced by the 7800 GTX 512...
  • SpaceRanger - Tuesday, January 24, 2006 - link

    All the second-to-last section describes is the image quality. There was no explanation of power consumption at all. Was this an accidental omission or something else?
  • Per Hansson - Tuesday, January 24, 2006 - link

    Yes, please show us the power consumption ;-)

    A few things I would like to see done: put a low-end PCI graphics card in the computer, boot it and record the power consumption, then leave that card in and do your normal tests with a single X1900 and then dual, so we get a real figure for how much power they consume...

    Also, please clarify exactly what PSU was used and how the consumption was measured, so we can figure out more accurately how much power the card really draws (when counting in the (in)efficiency of the PSU, that is)...
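    A minimal sketch of the PSU-efficiency correction described above, in Python; the wattages and efficiency figure are hypothetical placeholders, not measured values:

        # Estimate the card's own draw from wall readings, allowing for PSU (in)efficiency.
        # All numbers here are hypothetical placeholders, not measured values.
        def estimated_card_power(wall_with_card_w, wall_baseline_w, psu_efficiency):
            # Wall draw overstates DC draw, so scale the delta by efficiency.
            return (wall_with_card_w - wall_baseline_w) * psu_efficiency

        # e.g. 320 W at the wall under load vs. a 200 W baseline with a low-end
        # PCI card installed, through a ~80% efficient PSU:
        print(f"~{estimated_card_power(320, 200, 0.80):.0f} W drawn by the card")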
  • peldor - Tuesday, January 24, 2006 - link

    That's a good idea on isolating the power of the video card.

    From the other reviews I've read, the X1900 cards are seriously power hungry, in the neighborhood of 40-50W more than the X1800 XT cards. The GTX 512 (and the GTX, of course) draws less than the X1800 XT, let alone the X1900 cards.
  • vaystrem - Tuesday, January 24, 2006 - link

    Anyone else find this interesting??

    Battlefield 2 @ 2048x1536 Max Detail
    7800 GTX 512            33 FPS
    ATI 1900 XTX            32.9 FPS
    ATI 1900 XTX CrossFire  29 FPS
    -------------------------------------
    Day of Defeat
    7800 GTX 512            18.93 FPS
    ATI 1900 XTX            35.5 FPS
    ATI 1900 XTX CrossFire  35 FPS
    -------------------------------------
    FEAR
    7800 GTX 512            20 FPS
    ATI 1900 XTX            36 FPS
    ATI 1900 XTX CrossFire  49 FPS
    -------------------------------------
    Quake 4
    7800 GTX 512            43.3 FPS
    ATI 1900 XTX            42 FPS
    ATI 1900 XTX CrossFire  73.3 FPS


  • DerekWilson - Tuesday, January 24, 2006 - link

    Be careful here... these max detail settings enabled SuperAA modes, which really killed performance, especially with all the options flipped to quality.

    We're working on getting some screens up to show the IQ difference, but suffice it to say that the max detail settings are very much apples to oranges.

    We would have seen performance improvements if we had simply kept using 6xAA...
  • DerekWilson - Tuesday, January 24, 2006 - link

    To further clarify: FEAR didn't play well when we set AA outside the game, so its max quality setting ended up using the in-game 4xAA. Thus we see a performance improvement.

    For Day of Defeat, forcing AA/AF through the control panel works well, so we were able to crank up the quality.

    I'll try to go back and clarify this in the article.
  • vaystrem - Wednesday, January 25, 2006 - link

    I'm not sure how that justifies what happens. Your argument is that these are the VERY highest settings, so it's OK for the 'dual' 1900 XTX to have lower performance than a single-card alternative? That doesn't seem to make sense, and it speaks poorly of the ATI implementation.
  • Lonyo - Tuesday, January 24, 2006 - link

    The XTX, especially in CrossFire, does seem to give a fair boost over the XT and XT CrossFire in a number of tests.
