Take every cliché used to describe a long overdue event or the unexpected fulfillment of a promise (hot places freezing over, heavy animals taking flight, and so on) and you still couldn't fully proclaim the news: ATI has finally, properly hard launched a product. That's right, a look around the internet this morning brings the joyous realization that the Radeon X1900 XT, XTX, and CrossFire parts are available for purchase. We've kept an eye on the situation, and it has been easy to see that ATI would be able to pull it off this time. Some sites started taking preorders earlier in the week, saying their X1900 parts would ship in one to two days, which put the timeframe right on the mark. There were no missing dongles and no problems with customs, and ATI told us last week that thousands of parts had already been delivered to manufacturers.

And if that isn't enough to dance about, ATI has delivered a hugely powerful part with this launch. The Radeon X1900 series is no joke, and every card bearing the name is a behemoth. With triple the pixel shader units of the X1800 XT, and a general increase in supporting hardware throughout the pixel processing engine, ATI's highly clocked 384 million transistor GPU is capable of crunching enormous volumes of data very quickly. Fill rate isn't increased very much, because the X1900 series still only allows 16 pixels to be drawn to the screen per clock cycle, but power is delivered where it is needed most. With longer and more complex shader programs, pixels stay in the shader engine longer, which shifts the performance burden further away from theoretical maximum fill rate.
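To put some rough numbers on that argument, here is a quick back-of-the-envelope sketch. It uses the published unit counts and core clocks for the two parts; the "shader throughput" figure is simply units times clock, an illustrative proxy for ALU capacity rather than a real measure of work per cycle.

```python
# Theoretical fill rate vs. pixel shader throughput for the X1800 XT
# and X1900 XTX. Both chips draw at most 16 pixels per clock, but the
# X1900 triples the pixel shader unit count.

GPUS = {
    "X1800 XT":  {"clock_mhz": 625, "rops": 16, "pixel_shaders": 16},
    "X1900 XTX": {"clock_mhz": 650, "rops": 16, "pixel_shaders": 48},
}

def fill_rate(gpu):
    """Theoretical fill rate in megapixels/s: pixels per clock x clock."""
    return gpu["rops"] * gpu["clock_mhz"]

def shader_throughput(gpu):
    """Relative shader capacity: pixel shader units x clock."""
    return gpu["pixel_shaders"] * gpu["clock_mhz"]

old, new = GPUS["X1800 XT"], GPUS["X1900 XTX"]
print(f"fill rate:   {fill_rate(new) / fill_rate(old):.2f}x")
print(f"shader rate: {shader_throughput(new) / shader_throughput(old):.2f}x")
# Fill rate rises only ~4% (from the clock bump), while shader
# throughput roughly triples -- exactly the imbalance long shader
# programs are designed to exploit.
```

The longer a pixel sits in the shader engine, the more those 48 units matter relative to the nearly unchanged fill rate.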

NVIDIA would like us to compare the X1900's increase in ALU (arithmetic logic unit) power to what they did with the FX 5900 after NV30 tanked. Certainly, increasing the math power (and increasing memory bandwidth) helped NVIDIA, but fortunately for ATI the X1900 is not derived from a fundamentally flawed GPU design. The X1800 series are certainly not bad parts, even if they are being completely replaced by the X1900 in ATI's lineup.

I'll spoil the results and make it clear that the X1900XT and XTX are hands down the best cards out there right now. But all positives aside, ATI needed this card to hard launch with good availability, perform better than anything else, and look good doing it. There have been too many speed bumps in ATI's way for there to be any room for a slip up on this launch, and it looks like they've pulled it off. The launch of the X1900 series not only puts ATI back on top, but (much more importantly) it puts them back in the game. Let's hope that both ATI and NVIDIA can keep up the good fight.

But let's not forget why we're here. The first thing we are going to do is talk about what makes the R580 GPU that powers the X1900 series so incredibly good at what it does.

R580 Architecture


Comments

  • blahoink01 - Wednesday, January 25, 2006 - link

    Considering the average framerate on a 6800 ultra at 1600x1200 is a little above 50 fps without AA, I'd say this is a perfectly relevant app to benchmark. I want to know what will run this game at 4 or 6 AA with 8 AF at 1600x1200 at 60+ fps. If you think WOW shouldn't be benchmarked, why use Far Cry, Quake 4 or Day of Defeat?

    At the very least WOW has a much wider impact as far as customers go. I doubt the total sales for all three games listed above can equal the current number of WOW subscribers.

    And your $3000 monitor comment is completely ridiculous. It isn't hard to get a 24 inch wide screen for 800 to 900 bucks. Also, finding a good CRT that can display greater than 1600x1200 isn't hard and that will run you $400 or so.
  • DerekWilson - Tuesday, January 24, 2006 - link

we have looked at world of warcraft in the past, and it is possible we may explore it again in the future.
  • Phiro - Tuesday, January 24, 2006 - link

    "The launch of the X1900 series no only puts ATI back on top, "

    Should say:

    "The launch of the X1900 series not only puts ATI back on top, "
  • GTMan - Tuesday, January 24, 2006 - link

That's how Scotty would say it. Beam me up...
  • DerekWilson - Tuesday, January 24, 2006 - link

thanks, fixed
  • DrDisconnect - Tuesday, January 24, 2006 - link

It's amusing how the years have changed everyone's perception of what a reasonable price for a component is. Hard drives, memory, monitors, and even CPUs have become so cheap that many have lost perspective on what being on the leading edge costs. I paid $750 for a 100 MB drive for my Amiga, $500 for a 4x CD-ROM, and I remember spending $500 on a 720 x 400 Epson colour inkjet. (Yeah, I'm in my 50's.) As long as games continue to challenge the capabilities of video cards and the drive to increase performance continues, the top end will be expensive. Unlike other hardware (printers, memory, hard drives), there are still performance improvements to be made that the user will perceive. If someday a card can render so fast that all games play like reality, then video cards will become like hard drives are now.
  • finbarqs - Tuesday, January 24, 2006 - link

Everyone gets this wrong! It uses 16 PIXEL-PIPELINES with 48 PIXEL SHADER PROCESSORS in it! The pipelines are STILL THE SAME as the X1800 XT: 16!!!!!!!!!! Oh yeah, if you're wondering, in 3DMark 2005 it reached 11,100 on just a single X1900 XTX...
  • DerekWilson - Tuesday, January 24, 2006 - link

    semantics -- we are saying the same things with different words.

    fill rate as the main focus of graphics performance is long dead. doing as much as possible at a time to as many pixels as possible at a time is the most important thing moving forward. Sure, both the 1900xt and 1800xt will run glquake at the same speed, but the idea of the pixel (fragment) pipeline is tied more closely to lighting, texturing and coloring than to rasterization.

actually this would all be less ambiguous if opengl were more popular and we had always called pixel shaders fragment shaders ... but that's a whole other issue.

  • DragonReborn - Tuesday, January 24, 2006 - link

I'd love to see how the noise output compares to the 7800 series...
  • slatr - Tuesday, January 24, 2006 - link

    How about some Lock On Modern Air Combat tests?

I know not everyone plays it, but it would be nice to have you guys run your tests with it, especially when we are shopping for $500-plus video cards.

