The part that everyone wants to hear about is, of course, the Radeon X1800, based on ATI's long-awaited R520 GPU. Due for introduction later this quarter, the 90nm R520 is a 16-pipe, 16-shader-processor design, with a number of different SKUs built on the GPU. Internal ATI documentation specifically claims that R520-series boards will be available at launch, a hard launch in the same vein as NVIDIA's 7800 GTX and 7800 GT.

ATI R520 Roadmap and Pricing
Card        Pipes   Core Clock   Memory Clock   Memory        MSRP
X1800 XT    16      600MHz       700MHz         512MB GDDR3   $599
X1800 XL    16      550MHz       625MHz         512MB GDDR3   $499
X1800 Pro   16      500MHz       500MHz         256MB GDDR3   $449
X1800 LE    12      450MHz       450MHz         256MB GDDR3   $349

Features common to all R520-based boards include the new 90nm lead-free manufacturing process, a Xilleon-based TV encoder, Shader Model 3.0 support, H.264 decode acceleration and CrossFire support. Also expect to see HDTV options for all 90nm ATI cards in the near future, although for R520 they may be limited to the All-In-Wonder series.

At the top end is the Radeon X1800 XT; this 16-pipe R520 will feature a 600MHz core clock, with a 256-bit memory bus connected to 512MB of GDDR3 memory clocked at 700MHz. The 600MHz core clock gives it a lower fill rate than the GeForce 7800 GTX (24 pipes at 430MHz), while the 700MHz memory clock gives it more memory bandwidth than the stock GTX (600MHz). Much like the GTX, the X1800 XT will be priced at $599, and it will feature two DVI outputs with HDCP support. The lower fill rate seems alarming at first, but consider several factors. First, ATI's traditional core design can do "more" per clock cycle (at least on the R420 design) than NVIDIA's. Second, R520 includes a number of smaller tweaks, such as hardware-assisted H.264 decoding. Just last week, we also received details about ATI's revamped memory controller, which operates on an internal 512-bit ring bus. There is plenty of room for speculation about performance, but even with a fill rate similar to NVIDIA's, there is a strong possibility that the rest of R520's design will differentiate the card in real-world performance.
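To put the fill rate and bandwidth comparison in concrete numbers, here is a quick back-of-the-envelope sketch. These are theoretical peak figures only (pipes times core clock for fill rate; a 256-bit bus with double-data-rate GDDR3 for bandwidth), and real-world throughput will depend on everything else in the pipeline:

```python
def fill_rate_mpix(pipes, core_mhz):
    """Theoretical peak pixel fill rate in Mpixels/s: pipes x core clock."""
    return pipes * core_mhz

def mem_bandwidth_gbs(mem_mhz, bus_bits=256):
    """Theoretical memory bandwidth in GB/s.

    GDDR3 is double data rate (two transfers per clock), and a
    256-bit bus moves 32 bytes per transfer.
    """
    return mem_mhz * 2 * (bus_bits // 8) / 1000.0

for name, pipes, core, mem in [("X1800 XT", 16, 600, 700),
                               ("7800 GTX", 24, 430, 600)]:
    print(f"{name}: {fill_rate_mpix(pipes, core)} Mpix/s, "
          f"{mem_bandwidth_gbs(mem):.1f} GB/s")
# X1800 XT: 9600 Mpix/s, 44.8 GB/s
# 7800 GTX: 10320 Mpix/s, 38.4 GB/s
```

The numbers line up with the argument above: the GTX keeps a raw fill rate edge, while the X1800 XT leads on memory bandwidth.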

Next up is the Radeon X1800 XL, which is positioned between the GeForce 7800 GTX and the 7800 GT. The XL drops the core clock to 550MHz and the memory clock to 625MHz. Other than the lower clock speeds, the XL is identical to the XT, meaning it still has 512MB of GDDR3 memory connected to a 256-bit memory bus. The X1800 XL will be priced at $499. Both the X1800 XT and X1800 XL appear to be dual-slot designs, judging by previous roadmaps and existing box art. The roadmap also details that HDCP support on the X1800 XL and X1800 XT comes via Texas Instruments' TFP513PAP DVI transmitter.

Priced at $449, we have the X1800 Pro, once more a 16-pipe R520 design, but this time the core runs at 500MHz. The Radeon X1800 Pro has only 256MB of memory, also running at 500MHz, but it retains the same 256-bit memory bus. What is interesting about the Radeon X1800 Pro is that its fill rate and memory bandwidth appear to be identical to those of NVIDIA's GeForce 7800 GT; coincidentally, so does its price. The reference design for the X1800 Pro features a single VGA and a single DVI connector, with no HDCP support.

The last member of the R520 family is the Radeon X1800 LE, which disables four of the R520's pipelines, taking it down to a 12-pipe design. The LE runs at 450MHz with 256MB of 450MHz GDDR3 memory. Once again, we're dealing with a 256-bit memory bus, this time with a $349 price tag. The outputs are identical to the X1800 Pro's. Both the Pro and the LE are single-slot cooling designs, thanks to their lower clock speeds.

According to our roadmaps, it looks like ATI will abandon the "vanilla" nomenclature for future products. For example, instead of a plain X1800, we will get an X1800 LE; likewise, parts that appeared on our previous roadmaps without an XT, XL, Pro or LE suffix will become "LE" parts. This is certainly a good move on ATI's behalf, as "vanilla" X800 cards are hard enough to explain to readers.

The roadmap also refers to R580, which is up and running in-house at the moment. R580 is essentially a clock and pipeline ramp of R520, but neither of those details has been disclosed yet (even to AIBs). Unfortunately, R580 will not ship at the same time as R520.


  • Griswold - Wednesday, September 14, 2005 - link

    quote:

    by looking at the speculation right now - the ati cards may only perform marginally better than the nvidia counterparts, not quite the revolutionary "kick ass" chip everyone's been expecting


    Yea but what if they got a 24 or 32 pipe version up their sleeve? I planned to buy a 7800GT for my new box this year, but after finding out how bad the visual quality (texture flickering) is compared to older GF and the current ATI card, I'm not so sure what I should do right now. I'm even tempted to get a very cheap card from either the current ATI line or the 6xxx series from NV and replace it later with a 7800GT (if NV can fix their visual problem with drivers) or the next ATI line - if it's superior.

    At any rate, it will be a step-up from my trusty 9700pro. :)
  • patrick0 - Wednesday, September 14, 2005 - link

    Texture flickering has been fixed with the new driver release.
  • Griswold - Thursday, September 15, 2005 - link

    quote:

    Texture flickering has been fixed with the new driver release.


    Really? Would be good news.
  • nserra - Wednesday, September 14, 2005 - link

    What more nvidia optimizations? Never heard that before....
  • Griswold - Wednesday, September 14, 2005 - link

    Check out this article:

    http://tinyurl.com/9kwzn">http://tinyurl.com/9kwzn
  • nserra - Wednesday, September 14, 2005 - link

    I didn't know that....

    But why don't Tom's, Anand, Xbit, and the others
    say anything about it?

    But what is nvidia trying to achieve? SiS Xabre image quality levels?

    I don't understand it; the AF performance hit is much lower than AA's. Why strip quality out of AF if it's more important (and costs less performance than AA) for achieving higher image quality?

    Why did Nvidia disable the "old" AF of the Geforce3/4 (and FX)? Is it impossible to support both at the hardware and driver level?
  • Griswold - Thursday, September 15, 2005 - link

    It seems that FPS is what sells hardware these days.. ATI is no exception, though their image quality was not as low as their nvidia counterparts'. This episode taught me a lesson though. I value image quality very highly, especially when I put down several hundred bucks for a single piece of hardware. I will check and double-check any vid card from either company before I even consider upgrading in the future.

    Somebody mentioned that the newest drivers fixed the texture flickering, gonna have to check that out somehow before I order the 7800GT I planned to buy.
  • Slappi - Tuesday, September 13, 2005 - link

    So for $50 more you get 1250MHz memory vs. 1000MHz memory, a 550MHz core vs. a 500MHz core, AND 256MB more memory?!?

    I smell BS.
  • KristopherKubicki - Tuesday, September 13, 2005 - link

    MSRPs have a difficult time translating into Retail.

    Kristopher
  • knitecrow - Tuesday, September 13, 2005 - link

    When does the NDA expire?
