The part that everyone wants to hear about is, of course, the Radeon X1800 based on ATI's long awaited R520 GPU. Due for introduction later this quarter, the 90nm R520 will be a 16-pipe, 16-shader processor design with a number of different SKUs based on the GPU. Internal ATI documentation specifically claims that the R520 series will ship at launch, just as NVIDIA's 7800GTX and 7800GT series shipped and launched on the same dates.

ATI R520 Roadmap and Pricing
Card       Pipes  Core Clock  Memory Clock  Memory       MSRP
X1800 XT   16     600MHz      700MHz        512MB GDDR3  $599
X1800 XL   16     550MHz      625MHz        512MB GDDR3  $499
X1800 Pro  16     500MHz      500MHz        256MB GDDR3  $449
X1800 LE   12     450MHz      450MHz        256MB GDDR3  $349

Features common to all R520-based boards include the new 90nm lead-free manufacturing process, a Xilleon-based TV encoder, SM3.0, H.264 decode acceleration and CrossFire support. Also expect to see HDTV options for all 90nm ATI cards in the near future, although they may be limited to the All In Wonder series for R520.

At the top end is the Radeon X1800 XT; this 16-pipe R520 will feature a 600MHz core clock, with a 256-bit memory bus connected to 512MB of GDDR3 memory clocked at 700MHz. The 600MHz core clock gives it a lower fill rate than the GeForce 7800 GTX (24 pipes at 430MHz), while the 700MHz memory clock gives it more memory bandwidth than the stock GTX (600MHz). Much like the GTX, the X1800 XT will be priced at $599, and it will feature two DVI outputs with HDCP support. The lower fill rate seems alarming at first, but consider several factors. First, ATI's traditional core design can do "more" per clock cycle (at least on the R420 design) than NVIDIA's. Second, R520 includes a number of smaller tweaks, such as hardware-assisted H.264 decoding. Just last week, we also received details about ATI's revamped memory controller, which operates on an internal 512-bit ring bus. There is plenty of room to speculate about performance, but even with fill rates similar to NVIDIA's, there is a strong possibility that the other workings of R520 will differentiate the card in real-world performance.
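To see where those fill rate and bandwidth claims come from, here is a quick back-of-the-envelope sketch. The formulas are the usual theoretical peaks (fill rate as pipelines times core clock, bandwidth as bus width times the DDR-doubled memory clock); the function names are our own for illustration.

```python
# Rough theoretical peak numbers for the figures quoted above.

def fill_rate_gpix(pipes, core_mhz):
    """Theoretical pixel fill rate in Gpixels/s: pipelines x core clock."""
    return pipes * core_mhz / 1000

def bandwidth_gbs(bus_bits, mem_mhz):
    """Peak memory bandwidth in GB/s; GDDR3 transfers data twice per clock."""
    return (bus_bits / 8) * (mem_mhz * 2) / 1000

# Radeon X1800 XT: 16 pipes @ 600MHz, 256-bit bus @ 700MHz GDDR3
xt_fill = fill_rate_gpix(16, 600)   # 9.6 Gpix/s
xt_bw = bandwidth_gbs(256, 700)     # 44.8 GB/s

# GeForce 7800 GTX: 24 pipes @ 430MHz, 256-bit bus @ 600MHz GDDR3
gtx_fill = fill_rate_gpix(24, 430)  # 10.32 Gpix/s
gtx_bw = bandwidth_gbs(256, 600)    # 38.4 GB/s

print(xt_fill, gtx_fill)  # XT trails the GTX in raw fill rate...
print(xt_bw, gtx_bw)      # ...but leads it in memory bandwidth
```

Numbers like these are theoretical peaks only; as the text notes, per-clock efficiency and the new ring-bus memory controller could easily shift the real-world picture.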

Next up is the Radeon X1800 XL, which is positioned between the GeForce 7800 GTX and the 7800 GT. The XL drops the core clock down to 550MHz and the memory clock down to 625MHz. Other than the lower clock speeds, the XL is identical to the XT, meaning it still has 512MB of GDDR3 memory connected to a 256-bit memory bus. The X1800 XL will be priced at $499. Judging from previous roadmaps and existing box art, both the X1800 XT and X1800 XL appear to be dual-slot designs. The roadmap also details that both the X1800 XL and X1800 XT will get HDCP support via Texas Instruments' TFP513PAP DVI transmitter.

Priced at $449, we have the X1800 Pro, once more a 16-pipe R520 design, but this time the core runs at 500MHz. The Radeon X1800 Pro only has 256MB of memory, also running at 500MHz, but still retains the same 256-bit memory bus. What is interesting about the Radeon X1800 Pro is that its fill rate and memory bandwidth appear to be identical to those of NVIDIA's GeForce 7800GT; coincidentally, so does its price. The reference design for the X1800 Pro features a single VGA and a single DVI connector, with no HDCP support.

The last member of the R520 family is the Radeon X1800 LE, which disables four of the R520's pipelines, taking it down to a 12-pipe design. The LE runs at 450MHz with 256MB of 450MHz GDDR3 memory. Once again we're dealing with a 256-bit memory bus, and this time a $349 price tag. The outputs are identical to the X1800 Pro's. Both the Pro and the LE use single-slot cooling designs, thanks to their lower clock speeds.

According to our roadmaps, it looks like ATI will abandon the "vanilla" nomenclature for future products. For example, instead of a plain X1800, we will get an X1800 LE. Likewise, parts that carried the non-XT, non-XL, non-Pro, non-LE name on our previous roadmaps will now become "LE" parts. That is certainly a good move on ATI's behalf, as "vanilla" X800 cards are hard enough to explain to readers.

The roadmap also refers to R580, which is already up and running in-house. R580 is essentially a clock and pipeline ramp of R520, but neither of those details has been disclosed yet (even to AIBs). Unfortunately, the R580 will not ship at the same time as R520.


  • imaheadcase - Wednesday, September 14, 2005 - link

    But you don't need the fastest CPU out to get the latest and greatest, unlike graphics card.
  • IKeelU - Wednesday, September 14, 2005 - link

    Would you be more satisfied if they released $300 vid cards every 18 months? It would definitely cost less to own the latest-and-greatest.

    I wouldn't, because having super-high end video cards is a good thing, no matter how much they cost. This is what makes PCs such a great platform: if you want to pay more to get better graphics, you have the ability to do so. Games devs will always target the most popular platform, so you will never *need* a top-of-the line card to get a good experience (HL2 kicked ass on my 3.5 year-old PC).

    You no longer need a high-end CPU to enjoy games because, generally, user demand for game improvements in CPU-intensive functions has gone down (only with the popularity of realistic physics has demand gone up, but with dedicated physics cards on the way, it will go down once again, just like when GPUs took over transform and lighting).
  • yacoub - Thursday, September 15, 2005 - link

    "Would you be more satisfied if they released $300 vid cards every 18 months? "

    As opposed to $600 cards every 6 months, thus costing twice the price and being outdated three times as quickly? Yes, I'd rather have a $300 purchase every 18 months, which is about as frequently as a person currently needs to upgrade a videocard anyway. The Radeon 9800 Pro 128mb card has been out around 24 months now, IIRC, and it is just now needing to be replaced by a 7800-series card to run very high resolutions smoothly in the latest games. So yeah, $300 every 18 months is about right.
  • xsilver - Wednesday, September 14, 2005 - link

    whats wrong is people that are WILLING to pay $500-600 for a video card

    its all about supply and demand

    also for as long as I can remember now high end pcs cost a little over 2k US, over time, many things have reduced in price, so expensive video cards just make it possible to keep the total system cost to around the same mark
  • tonyou - Tuesday, September 13, 2005 - link

    The price for the 512MB X1800 XT looks like a steal if it can debut at the same price NVIDIA had for the 256MB 7800GTX. Damn, and I just bought a 7800GTX, hopefully I won't regret it!
  • Cybercat - Tuesday, September 13, 2005 - link

    "First of all, ATI's traditional core design can do "more" per clock cycle (at least on the R420 design) than NVIDIA."

    referring to what? obviously not shader ops...
  • jonny13 - Tuesday, September 13, 2005 - link

    Being that the GTX has been out for so much longer and the prices have dropped since release, the GTX retails for about the same price as the X1800 PRO. That might be a tough sell for ATI unless these cards are monsters at all price points. Either way, it should be interesting to see how the new cards perform as they look competitive on paper.
  • Pete84 - Tuesday, September 13, 2005 - link

    Heck, ATi has a card for EVERY price point!!! When are the mid and low range GeForce 7's going to come out?
  • coldpower27 - Tuesday, September 13, 2005 - link

    When Nvidia is ready to debut products on their 90nm process. The 6800 GT and 6800 will do in the meantime. They are basically feature complete; speed-wise there may be a problem.
  • xsilver - Wednesday, September 14, 2005 - link

    i dont see how the 6800gt can be a "problem" -- it kicks! the only reason why the x800xl is competitive right now is price, when these ati cards come out, nvidia will sure revise their whole pricing scheme

    by looking at the speculation right now - the ati cards may only perform marginally better than the nvidia counterparts, not quite the revolutionary "kick ass" chip everyone's been expecting
