Details of the Cards

There are four products being launched today, three of which we were able to get our hands on for this article. We have already spotted all three of the cards we tested for sale around the internet, so availability is immediate, and we couldn't be happier. As for pricing, ATI's MSRPs are as follows:

Radeon X1900 XTX -- $650
Radeon X1900 CrossFire Edition -- $600
Radeon X1900 XT -- $550

The CrossFire Edition of the X1900 is clocked the same as the X1900 XT, differing only in its I/O connectors and compositing engine. The X1900 XT weighs in with some very high clock speeds, especially for the number of pixel pipelines it supports. If you are worried about the CrossFire card holding back the XTX, don't be: the XTX only sees about a 4% increase in core clock speed and a 7% increase in memory clock speed over the stock X1900 XT.
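For those who want to check the math, here is the paper calculation using the published clocks from the table below (a quick sketch of the spec sheet numbers only, not measured performance):

```python
# Paper-spec check: how much faster is the XTX than the XT on clocks alone?
xt_core_mhz, xtx_core_mhz = 625, 650     # core clocks from the table below
xt_mem_ghz, xtx_mem_ghz = 1.45, 1.55     # effective memory data rates

core_gain = (xtx_core_mhz / xt_core_mhz - 1) * 100
mem_gain = (xtx_mem_ghz / xt_mem_ghz - 1) * 100

print(f"core clock advantage:   {core_gain:.1f}%")   # 4.0%
print(f"memory clock advantage: {mem_gain:.1f}%")    # ~6.9%
```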

ATI X1000 Series Features

                    Radeon X1900 XT(X)   Radeon X1600   Radeon X1800 XL   Radeon X1800 XT
Vertex Pipelines    8                    5              8                 8
Pixel Pipelines     48                   12             16                16
Core Clock (MHz)    625 (650)            590            500               625
Memory Size         512MB                256MB          256MB             512MB
Memory Data Rate    1.45GHz (1.55GHz)    1.38GHz        1GHz              1.5GHz
Texture Units       16                   4              16                16
Render Backends     16                   4              16                16
Z Compare Units     16                   8              16                16
Maximum Threads     512                  128            512               512


So, while the price gap between the XTX, XT, and CrossFire versions of the card would seem to indicate sizeable performance differences, we can definitively say that this is not the case. The XTX is only marginally faster even on paper, and, as we will see, that on-paper edge rarely translates into meaningful real-world performance. Our advice is to save your money and go with the cheaper XT: the XTX asks 18% more money for, at best, 7% more performance.
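Putting numbers on that advice, with the same caveat as before (list prices and paper specs, not benchmarks):

```python
# Price premium of the XTX over the XT versus its best-case paper gain.
xt_price, xtx_price = 550, 650                     # launch MSRPs in USD
price_premium = (xtx_price / xt_price - 1) * 100   # ~18.2%

best_case_gain = 6.9   # XTX memory clock advantage from earlier, in percent

print(f"XTX price premium:    {price_premium:.1f}%")
print(f"best-case paper gain: {best_case_gain:.1f}%")
```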

Comments

  • blahoink01 - Wednesday, January 25, 2006 - link

    Considering the average framerate on a 6800 Ultra at 1600x1200 is a little above 50 fps without AA, I'd say this is a perfectly relevant app to benchmark. I want to know what will run this game at 4x or 6x AA with 8x AF at 1600x1200 at 60+ fps. If you think WOW shouldn't be benchmarked, why use Far Cry, Quake 4 or Day of Defeat?

    At the very least WOW has a much wider impact as far as customers go. I doubt the total sales for all three games listed above can equal the current number of WOW subscribers.

    And your $3000 monitor comment is completely ridiculous. It isn't hard to get a 24-inch widescreen for 800 to 900 bucks. Also, finding a good CRT that can display greater than 1600x1200 isn't hard, and that will run you $400 or so.
  • DerekWilson - Tuesday, January 24, 2006 - link

    we have looked at world of warcraft in the past, and it is possible we may explore it again in the future.
  • Phiro - Tuesday, January 24, 2006 - link

    "The launch of the X1900 series no only puts ATI back on top, "

    Should say:

    "The launch of the X1900 series not only puts ATI back on top, "
  • GTMan - Tuesday, January 24, 2006 - link

    That's how Scotty would say it. Beam me up...
  • DerekWilson - Tuesday, January 24, 2006 - link

    thanks, fixed
  • DrDisconnect - Tuesday, January 24, 2006 - link

    It's amusing how the years have changed everyone's perception as to what is a reasonable price for a component. Hard drives, memory, monitors and even CPUs have become so cheap that many have lost perspective on what being on the leading edge costs. I paid $750 for a 100 MB drive for my Amiga, $500 for a 4x CD-ROM, and remember spending $500 on a 720x400 Epson colour inkjet. (Yeah, I'm in my 50's.) As long as games continue to challenge the capabilities of video cards and the drive to increase performance continues, the top end will be expensive. Unlike other hardware (printers, memory, hard drives), there are still performance improvements to be made that the user will perceive. If someday a card can render so fast that all games play like reality, then video cards will become like hard drives are now.
  • finbarqs - Tuesday, January 24, 2006 - link

    Everyone gets this wrong! It uses 16 PIXEL-PIPELINES with 48 PIXEL SHADER PROCESSORS in it! the pipelines are STILL THE SAME as the X1800XT! 16!!!!!!!!!! oh yeah, if you're wondering, in 3DMark 2005, it reached 11,100 on just a Single X1900XTX...
  • DerekWilson - Tuesday, January 24, 2006 - link

    semantics -- we are saying the same things with different words.

    fill rate as the main focus of graphics performance is long dead. doing as much as possible at a time to as many pixels as possible at a time is the most important thing moving forward. Sure, both the 1900xt and 1800xt will run glquake at the same speed, but the idea of the pixel (fragment) pipeline is tied more closely to lighting, texturing and coloring than to rasterization.

    actually this would all be less ambiguous if opengl were more popular and we had always called pixel shaders fragment shaders ... but that's a whole other issue.

    [For a back-of-the-envelope look at the fill rate versus shader throughput split discussed here, see the sketch after the comments.]

  • DragonReborn - Tuesday, January 24, 2006 - link

    I'd love to see how the noise output compares to the 7800 series...
  • slatr - Tuesday, January 24, 2006 - link

    How about some Lock On Modern Air Combat tests?

    I know not everyone plays it, but it would be nice to have you guys run your tests with it. Especially when we are shopping for $500 dollar plus video cards.
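As a footnote to the pixel pipeline discussion above, here is a rough sketch of why the X1900 and X1800 XT look nearly identical on texturing fill rate but very different on shader throughput. It assumes a deliberately simplistic peak model of one operation per unit per clock, using the figures from the spec table; real performance depends on far more than these peaks:

```python
# Simplistic peak model: one texture fetch per texture unit per clock,
# one shader op per pixel shader processor per clock.
cards = {
    # name: (texture units, pixel shader processors, core clock MHz)
    "X1800 XT":  (16, 16, 625),
    "X1900 XTX": (16, 48, 650),
}

for name, (tex, shaders, mhz) in cards.items():
    fill_gtexels = tex * mhz / 1000      # peak texel fill rate, Gtexels/s
    shader_gops = shaders * mhz / 1000   # peak pixel shader rate, Gops/s
    print(f"{name}: {fill_gtexels:.1f} Gtexels/s, {shader_gops:.1f} Gshader-ops/s")

# X1800 XT:  10.0 Gtexels/s, 10.0 Gshader-ops/s
# X1900 XTX: 10.4 Gtexels/s, 31.2 Gshader-ops/s  (~3x the per-pixel math)
```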
