AMD will tell us that R600 is not late and hasn't been delayed, but that's simply because they never set a public launch date to slip from. We all know that AMD would rather have seen their hardware hit the streets at or around the time Vista launched, or better yet, alongside G80. But the fact is that AMD had quite a few problems in getting R600 out the door.

While we couldn't really get the whole story from anyone, we heard bits and pieces here and there during our three day briefing event in Tunis, Tunisia. These conversations were short and scattered, and not the kind of topic where direct questions yield straight answers. Keeping that in mind, we do have some information and speculation about a few of the road bumps AMD faced with R600.

Apparently, the first spin of R600 silicon could only communicate over the debugging interface. While the upside is that the chip wasn't totally dead, this is not a good problem to have. We also overheard that a later revision of the hardware suffered from fragments getting stuck in pixel shaders. We even overheard one conversation where someone jokingly remarked that AMD should design hardware but leave the execution to NVIDIA.

In a wild bout of pure speculation on our part, we'll hazard a guess at one other problem that popped up during R600's creation. It seems to us that AMD was unable to get its MSAA resolve hardware working properly and was forced to handle MSAA resolve in the shader hardware rather than go back for yet another silicon revision. Please know that this is not a confirmed fact, but just an educated guess.
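To illustrate what a shader-based resolve actually means, here is a minimal sketch of a box-filter MSAA resolve performed in program code rather than by fixed-function hardware. This is purely illustrative: the data layout, function names, and 4x sample count are our own assumptions, not anything AMD has disclosed about R600's implementation.

```python
# Sketch of a shader-style box-filter MSAA resolve: instead of dedicated
# resolve hardware averaging the sub-samples, a shader pass reads each
# pixel's samples and averages them itself.
# Hypothetical layout: a pixel is a list of 4 (r, g, b) sample tuples.

def resolve_pixel(samples):
    """Average all sub-samples of one pixel (box filter)."""
    n = len(samples)
    # zip(*samples) groups the samples channel by channel (r, g, b)
    return tuple(sum(channel) / n for channel in zip(*samples))

def resolve_framebuffer(msaa_buffer):
    """Resolve a 2D multisampled buffer into a single-sampled image."""
    return [[resolve_pixel(pixel) for pixel in row] for row in msaa_buffer]

# Example: a 1x2 framebuffer with 4 samples per pixel; the second pixel
# sits on an edge, with half its samples covered by white geometry.
fb = [[[(1.0, 0.0, 0.0)] * 4,
       [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0),
        (0.0, 0.0, 0.0), (1.0, 1.0, 1.0)]]]
print(resolve_framebuffer(fb))  # → [[(1.0, 0.0, 0.0), (0.5, 0.5, 0.5)]]
```

The trade-off this illustrates is straightforward: doing the averaging in the shader core burns general-purpose ALU cycles and bandwidth that dedicated resolve hardware would not, which is why falling back to a shader resolve is a workaround rather than a design win.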

In another unique move, there is no high end part in AMD's R600 lineup. The Radeon HD 2900 XT is the highest end graphics card in the lineup, and it's priced at $399. While we appreciate AMD's intent to keep prices in check, it's the justification we take issue with. According to AMD, it loses money on high end parts, which is why we won't see anything more expensive than the 2900 XT this time around. The real story is that AMD would lose money on a high end part if it weren't competitive, which is why we feel there's nothing above the 2900 XT. It's not a huge deal, because the number of people buying graphics cards over $399 is limited, but even before we've started the review AMD is already ceding ground to NVIDIA, which isn't a good sign.

More than anything, we'd guess that the lack of a high end part has a lot to do with the delays and struggles AMD saw this time around in bringing R600 to market. We expect to see the return of a very high end part by the time R700 comes around, assuming that there aren't similarly debilitating delays.

The delays and lack of a high end part would be easy to forgive if the Radeon HD 2900 XT could do to NVIDIA what the G80 launch did to ATI; unfortunately, the picture just isn't that rosy. ATI's latest and greatest doesn't exactly deliver the best performance per watt: even though it can't compete performance-wise with the GeForce 8800 GTX, it requires more power. An ultra high end power requirement in a sub-$400 graphics card isn't exactly ideal.

Despite all of this, there's a great deal of cool technology in the R600, and as ATI is now a part of a CPU company, we received more detail on the GPU than we've gotten during any other GPU launch. AMD takes graphics very seriously, and it recently reaffirmed its commitment to continue to deliver high end discrete graphics cards, so amidst countless delays and rumors of strange problems, the R600 architecture is quite possibly more important to AMD than the graphics cards themselves. A derivative of this architecture will be used in AMD's Fusion processors, eventually making its way into a heterogeneous multi-core AMD microprocessor.

With AMD's disappointing Q1, it can't rest too much on the hope of Fusion changing the market, so we'll have to start by looking at where R600 is today and how it stacks up to NVIDIA's latest and almost greatest.

86 Comments

  • dragonsqrrl - Thursday, August 25, 2011 - link

    You forgot c).

    -if you're an ATI fanboy
  • vijay333 - Monday, May 14, 2007 - link

    http://www.randomhouse.com/wotd/index.pperl?date=1...

    "the expression to call a spade a spade is thousands of years old and etymologically has nothing whatsoever to do with any racial sentiment."
  • strikeback03 - Wednesday, May 16, 2007 - link

    What about in Euchre, where a spade can be a club (and vice versa)?
  • johnsonx - Monday, May 14, 2007 - link

    Just wait until AT refers to AMD's marketing budget as 'niggardly'...
  • bldckstark - Monday, May 14, 2007 - link

    What do shovels have to do with race?
  • Stan11003 - Monday, May 14, 2007 - link

    My big hope out of all of this is that the ATI part forces the Nvidia parts lower so I can use my upgrade option from EVGA to get a nice 8800 GTX instead of my 8800 GTS ACS3 320. However, with a quad core and a decent 2GB I have no gaming issues at all. I play at 1600x1200 (when did that become a low res?) and everything is butter smooth. Without newer titles all this hardware is a waste anyway.
  • Gul Westfale - Monday, May 14, 2007 - link

    the article says that the part is not a failure, but i disagree. i switched from a radeon 1950pro to an nvidia geforce 8800GTS 320MB about a month ago, and i paid only $350US for it. now i see that it still outperforms the new 2900...

    one of my friends wanted to wait to buy a new card, he said he hoped that the ATI part was going to be faster. now he says he will just buy the 8800GTS 320, since ATI have failed.

    if they can bring out a part that competes well with the 8800GTS and price it similarly or lower then it would be worth buying, but until then i will stick with nvidia. better performance, better price, and better drivers... why would anyone buy the ATI card now?
  • ncage - Monday, May 14, 2007 - link

    My conclusion is to wait. All of the recent GPUs do great with dx9... the question is how will they do with dx10? I think it's best to wait for dx10 titles to come out. I think crysis would be a PERFECT test.
  • wingless - Monday, May 14, 2007 - link

    I agree with you. Crysis is going to be the benchmark for these DX10 cards. It's hard to tell both Nvidia and AMD's DX10 performance with these current, first generation DX10 titles (most of which have a DX9 version) because they don't fully take advantage of all the power of either the G80 or R600 yet. It's true that Crysis will have a DX9 version as well, but the developer stated there are some big differences in code. I'm an Nvidia fanboy but I'm disappointed with the Pure Video and HDMI support on the 8800 series cards. ATI got this worked out with their great AVIVO and their nice HDMI implementation, but for now Nvidia is still the performance champ with "simpler" hardware. The G80 and R600 continue the traditions of their manufacturers. Nvidia has always been about raw power and all out speed with few bells and whistles. ATI is all about refinement, bells and whistles, innovations, and unproven new methods which may make or break them.

    All I really want to wait for is to see how developers embrace CUDA or ATI's setup for PHYSICS PROCESSING! Both companies seem to have well thought out methods to do physics and I can't wait to see that showdown. AGEIA and HAVOK need to hop on board and get some software support for all this good hardware potential they have to play with. Physics is the next big gimmick, and you know how much we all love gimmicks (just like good 'ole 3D acceleration 10 years ago).
  • poohbear - Monday, May 14, 2007 - link

    they don't make a profit from high end parts, that's why they're not bothering w/ it? that's AMD's story? so why bother having an FX line w/ their cpus?
