One Last Thing: There's an All-In-Wonder Version Too

The All-In-Wonder version of this card isn't lagging far behind this time. Previous AIW launches have seen at least a small gap between the launch of the card they were based on and the AIW announcement, but this time ATI is being proactive and bringing out an AIW version of the X1900 immediately.

The card is a single-slot solution, clocked a little lower, and aside from being the cheapest X1900 around, it also features all the bells and whistles AIW users have come to know and love. The price tag doesn't exactly scream bargain, but considering all the features packed into this part, it's clearly not going to be a slouch. While its specs are a considerable cut below the faster cards in the series, the combination of positives on this card is compelling. Here's a breakdown:

All-In-Wonder X1900:
Core clock speed: 500 MHz
Memory clock speed: 960 MHz
Price (MSRP): $500
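
For readers who want to put that memory clock in context, here is a minimal sketch of the theoretical memory bandwidth calculation. It assumes the AIW X1900 keeps the 256-bit memory bus of the rest of the X1900 line and that the 960 MHz figure is the effective (double data rate) memory speed; neither assumption is confirmed in the spec list above.

```python
# Theoretical memory bandwidth for the All-In-Wonder X1900 (sketch).
# Assumptions (not stated in the article): 256-bit memory bus, and the
# 960 MHz memory clock is the effective DDR data rate (480 MHz GDDR3).
effective_clock_hz = 960e6
bus_width_bits = 256

bytes_per_second = effective_clock_hz * bus_width_bits / 8
print(f"{bytes_per_second / 1e9:.2f} GB/s")  # -> 30.72 GB/s
```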

Though the All-In-Wonder series always goes on sale first in North America (all AIW parts sold here are built by ATI), we haven't seen much in the way of availability for this part today. The card is listed at ATI's own store as out of stock, shipping when available. While the focal point of the launch is the three main products we tested today, we would have preferred that ATI hold off on announcing this part until volume was available. That said, having seen ATI deliver so well on this hard launch, we are more inclined to believe its promise that the AIW will be available in the next couple of weeks, and we'll try to test one as soon as possible to see how the reduced clocks affect real-world performance.

120 Comments

  • bob4432 - Thursday, January 26, 2006 - link

    Good for ATI; after some issues in the not-so-distant past, it looks like the pendulum has swung back in their direction.

    I really like this. It should drop 7800 GT prices down to maybe ~$200-$220 (hoping, since NVIDIA will want to keep its hold on the market...), which would actually give me a reason to switch to some flavor of PCI-E based motherboard. But being display-limited at 1280x1024 with an LCD, my X800 XT PE is still chugging along nicely :)
  • Spoelie - Thursday, January 26, 2006 - link

    It won't; they're in a different price range altogether. Prices on those cards will not drop until ATI brings out a capable competitor.
  • neweggster - Thursday, January 26, 2006 - link

    How hard would it be for this new series of cards from ATI to be optimized for all benchmarking software? Well, ask yourself that. I just got done talking to a buddy of mine who's working out at MSI. I swear I freaked out when he said that ATI is using an advantage they found by optimizing the new R580s to work better with the newest benchmarking programs like 3DMark06 and such. I argued with him that that's impossible, or is it? Please let me know: did ATI possibly use optimizations built into the new R580 cards to gain this advantage?
  • Spoelie - Thursday, January 26, 2006 - link

    How would dedicating die space on a GPU to cheats make any sense? If there is any cheat, it's in the drivers. And no, the only real difference is that 3DMark06 needs 24-bit DSTs for its shadowing; the X1800 XT didn't support them (it used some hack instead), and they are supported now. Is that cheating? The X1600 and X1300 support this as well, by the way, and they came out at the same time as the X1800.

    If architecturally optimizing for one kind of rendering counts as cheating, that would make NVIDIA a really bad company for what they did with the 6x00 series and the Doom 3 engine. But no one is complaining about the higher framerates in those situations now, are they?
  • Regs - Thursday, January 26, 2006 - link

    ...Where in this article do you see a 3DMark score?
  • mi1stormilst - Thursday, January 26, 2006 - link

    It is not impossible, but unless your friend works in some high-level capacity, I would say his comments are questionable at best. I don't think working in shipping qualifies him as an expert on the subject.
  • coldpower27 - Wednesday, January 25, 2006 - link

    http://www.anandtech.com/video/showdoc.aspx?i=2679...

    "Notoriously demanding on GPUs, F.E.A.R. has the ability to put a very high strain on graphics hardware, and is therefore another great benchmark for these ultra high-end cards. The graphical quality of this game is high, and it's highly enjoyable to watch these cards tackle the F.E.A.R demo."

    Wasn't using this considered a bad idea, since NVIDIA cards take a huge performance penalty in it and the final build was supposed to be much better???
  • photoguy99 - Wednesday, January 25, 2006 - link

    I noticed 1920x1440 is commonly benchmarked.

    Wouldn't the majority of people with displays in this range have 1920x1200, since that's what all the new LCDs are using? It's also the HD standard.

    Aren't LCDs getting to be pretty capable game displays? My 24" Acer has a 6 ms (claimed) gray-to-gray response time, and can at least hold its own.

    The resolution for this monitor, and almost all others this large, is 1920x1200 - not 1920x1440.
  • Per Hansson - Wednesday, January 25, 2006 - link

    Doing the math:

    Crossfire = 459 W; single 1900 XTX = 341 W; difference = 118 W at the wall. Efficiency of the PSU used at ~400 W load = 78%, so 118 x 0.78 = 92.04 W.
  • Per Hansson - Friday, January 27, 2006 - link

    No replies, huh? I ask because I've read on other sites that the card draws up to 175 W... That seems like quite a stretch, which is why I did the math to start with...
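
To make the arithmetic in the comment above easy to check, here is a minimal sketch of the same calculation. The 459 W and 341 W numbers are the total system draw figures quoted in the comment, and the 78% PSU efficiency is likewise taken from the comment rather than measured independently.

```python
# Estimate the DC-side power draw of the second X1900 XTX in CrossFire
# from wall-socket measurements, following the comment above.
crossfire_wall_w = 459.0  # total system draw with two cards (at the wall)
single_wall_w = 341.0     # total system draw with one card (at the wall)
psu_efficiency = 0.78     # assumed PSU efficiency near a 400 W load

delta_wall_w = crossfire_wall_w - single_wall_w  # 118 W extra at the wall
card_dc_w = delta_wall_w * psu_efficiency        # power actually delivered
print(f"{card_dc_w:.2f} W")  # -> 92.04 W, well short of the quoted 175 W
```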
