ATI quietly released its Silver Bullets material to AIBs this week, and the picture of R580 is slowly coming together. R580, which will ship as the Radeon X1900, is expected to "launch" in January according to ATI documentation. Unfortunately, ATI's track record with its last few product launches has not been spectacular: ATI's Crossfire chipset "launched" twice, once in June and again in October.

ATI released its Silver Bullets for the Radeon X1800 approximately two months before the launch of the card, but it took nearly six weeks after the launch for master card variants of the X1800 to appear on store shelves (and nearly as long for Crossfire motherboards).

The Silver Bullets presentation was a little light on details, but it did confirm that the R580 GPU has 48 pixel shader processors and higher clocks than R520 (a.k.a. Radeon X1800). Radeon X1900 is built on the same 90nm process as Radeon X1800. The internal briefing also confirmed that there will be two separate versions of the card: the RX1900CF "Master Card" and the RX1900 "Crossfire Ready" card. As with the Radeon X1800 series, you will need at least one master card to enable Crossfire support. Avivo and Shader Model 3.0 will also appear on Radeon X1900.



Comments

  • DigitalFreak - Friday, December 23, 2005 - link

    So you work for ATI and know this for a fact?
  • shabby - Thursday, December 22, 2005 - link

    It can have 100 pixel shaders; it still only has 16 ROPs.
  • Clauzii - Thursday, December 22, 2005 - link

    Do you know that for a fact?

    And even if it only has 16 output pipelines, if the 48 shaders can finish the rest I don't think it's going to be a big problem, 'cause the shaders do the dirty work...

    But then again, we could hope for R600 to be 32 pipelines with 64 shaders? :)
  • quasarsky - Thursday, December 22, 2005 - link

    i feel like a newb :-D lol.

    will ati ever move beyond 16 pixel pipelines?

  • xTYBALTx - Thursday, December 22, 2005 - link

    Your terminology is crossed.
  • SLIguy - Thursday, December 22, 2005 - link

    SLI/CrossFire are more than PR gimmicks. They give one the ability to run more games at higher resolutions and details than any single card solution, at least 90% of the time.

    F.E.A.R. and CoD2 are great examples of games that, when running at maximum settings even at just 1600x1200, just don't run smoothly, even with a single GTX 512, in my experience. I don't know about the X1800XT since I don't have one, but from the numbers I've seen, I doubt a single X1800XT can do any better.

    Go SLI with GTX 512s, though, and now we're talking. F.E.A.R. and CoD2 run like butter @ 1600x1200 at maximum detail.

    Cost effective? No. Fun? Hell yeah!

    So while the dual card thing is probably never going to be something you pick up at Best Buy, it will be the choice of PC enthusiasts willing to spend the money.

    I too lament this master card setup. Plus, I want to see a motherboard with both SLI and CrossFire on it.

  • FordFreak - Friday, December 23, 2005 - link

    Not that many people run games at that resolution, so it's still more of a PR gimmick than anything else. If you have the money to buy a monitor that does 1600x1200, you have the money to buy two overpriced video cards.
  • alienz - Monday, December 26, 2005 - link

    LOL, are you kidding? My six-year-old Dell 19" CRT does 1600x1200, and it does it quite well.
  • bob661 - Monday, December 26, 2005 - link

    I was about to say my Samsung 19" CRT does 1600x1200 too, but I run it at 1280x960. I also play my games at 1280x960. A single 7800GT handles it well, even in CoD2, but SLI 7800GTs would be REALLY nice.
  • Braznor - Thursday, December 22, 2005 - link

    LOL, this time the immovable object (ATI) will collide with the unstoppable force (Nvidia). Nvidia's upcoming G71 launch should keep the Canadians nervous.
