The Beginning: The Shot Heard Around the World

It all started back in 2001, when ATI, still independent at the time, was working on the R300 GPU (the Radeon 9700 Pro). If you were following the industry at all back then, you'd never forget the R300. NVIDIA was steadily gaining steam, and nothing ATI could do was enough to dethrone the king. The original Radeon was a nice attempt, but poor drivers and no real performance advantage kept NVIDIA's customers loyal. The Radeon 8500 fared no better; there was just no beating NVIDIA's GeForce4: the Ti 4200 did well in the mainstream market and the Ti 4600 was king of the high end.

While ATI was taking punches with the original Radeon and Radeon 8500, internally the company decided that in order to win the market, it had to win the halo. If ATI could produce the fastest GPU, it would get the brand recognition and loyalty necessary to sell not only those high end GPUs but also lower end models at cheaper price points. The GPU would hit the high end first, but within the next 6 - 12 months we'd see derivatives for lower market segments. One important takeaway is that at this point, the high end of the market was $399 - keep that in mind.

With everyone at ATI thinking that they had to make the fastest GPU in the world in order to beat NVIDIA, the successor to the Radeon 8500 was going to be a big GPU. The Radeon 8500 was built on a 0.15-micron manufacturing process and had around 60M transistors; R300 was going to be built on the same process, but with 110M transistors - nearly twice that of the 8500 without a die shrink.

Its competition, the GeForce4, was still only a 63M-transistor chip, and even NVIDIA didn't dare build something so big on the 150nm node; the GeForce4's successor would wait for 130nm.

We all know how the story unfolded from here. The R300 was eventually branded the ATI Radeon 9700 Pro and mopped the floor with the GeForce4. What Intel did to AMD with Conroe, ATI did to NVIDIA with R300 - back in 2002.

The success with R300 solidified ATI’s strategy: in order to beat NVIDIA, it had to keep pushing the envelope for chip size. Each subsequent GPU would have to be bigger and faster at the high end. Begun these GPU wars had.

116 Comments

  • MrSpadge - Saturday, December 6, 2008 - link

    Exactly what I was thinking! That's why I got an 8500LE back then, when the GeForce4 was not in (public) sight yet.
  • FireSnake - Wednesday, December 3, 2008 - link

    ... which one is Anand (in the picture at the beginning of the article)?

    I always wondered what he looks like ... I guess he's the one on the right.
  • 3DoubleD - Wednesday, December 3, 2008 - link

    I've had Anandtech as my home page for 5 years and I've read almost every article since (and even some of the older ones). This is by far one of your greatest works!

    Thanks
  • hellstrider - Wednesday, December 3, 2008 - link

    Kudos to Anand for such a great article, extremely insightful. I may even go out and purchase AMD stock now :)

    I love AMD even when it’s on the bottom, I own 780G + X2 + hd4850, in hopes that Deneb (or AM3 processors for that matter) will come in time to repeat the success of rv770 launch, at which point I will upgrade my obsolete X2 and have a sweet midrange machine.

    My only concern is that NVIDIA is looking at all this, smirking, and planning an onslaught with the 55nm refresh. There is a very "disturbing" article at Xbitlabs claiming that NVIDIA is stockpiling 55nm GT200 parts; that seems like something they would do - start selling those soon and undercut the 4800 series badly.
    I'm just a concerned hd4850 owner and I don't want to see my card obsolete within a couple of months. I don't really see AMD's answer to the 55nm GT200 in such a short period of time?!?!

    Any thoughts?
  • Goty - Wednesday, December 3, 2008 - link

    I don't think you'll have to worry too badly about the 55nm G200s. NVIDIA won't drop prices much, if at all; they're already smarting from the price drops enacted after the RV770 launch. There's also the fact that the 4850 isn't in the same market space as any of the G200 cards, so they're not really competitive anyhow.
  • ltcommanderdata - Wednesday, December 3, 2008 - link

    I always imagined designing GPUs would be very stressful given you're trying to guess things years in advance, but this inside look at how things are done was very informative.

    On GDDR5, it's interesting to read that ATI was pushing so hard for this technology and felt it was their only hope for the RV770. What about GDDR4? I thought ATI was a big supporter of it too and was the first to implement it. I'm pretty sure Samsung announced GDDR4 that could run at 3.2GBit/s in 2006, which isn't far from the 3.6GBit/s GDDR5 used in the 4870, and 4GBit/s GDDR4 was available in 2007. I guess there are still power savings to be had from GDDR5, but performance-wise I don't think it would have been a huge loss if GDDR5 had been delayed and ATI had had to stick with GDDR4.

    Another interesting point in your article was definitely the fate of the 4850. You report that ATI felt the 4870 was perfectly specced and wasn't changed. I guess that means they were always targeting the 750MHz core frequency that it launched with. Yet ATI was originally targeting the 4850 at a 500MHz core clock. With the 4870 clocked 50% faster, I think it should be obvious to anyone just looking at the clock speeds that there would have been a huge performance gap between the 4850 and 4870. I believe the X1800XL and X1800XT had a similarly large performance gap. Thankfully Dave Baumann convinced them to clock the 4850 up to a more reasonable 625MHz core.

    One thing that I feel was missing from the article was how the AMD acquisition affected the design of the RV770. Perhaps there wasn't much change, or the design was already set so AMD couldn't have changed things even if they wanted to, but they must have had an opinion. AMD was probably nervous that they bought ATI at its height, when the R580 was out and on top, but once acquired, the R600 came out and underperformed. It would be interesting to know what AMD's initial opinion of ATI's small-die, non-top-tier strategy was, although it now seems more consistent with AMD's CPU strategy, since they aren't targeting the high end there anymore either.
  • hooflung - Wednesday, December 3, 2008 - link

    The final frontier, market-share-wise, is to steal a major vendor like eVGA. If they can get an eVGA, BFG, or XFX to sell boards with their warranties, AMD would be really dominant.
  • JonnyDough - Wednesday, December 3, 2008 - link

    The best thing I've ever read on a tech site. This is why you're better than THG.

    Only one typo! It was a "to" when it should have been a "too."

    Chalk one up for the red team. This makes my appreciation for AMD rise even more. Anyone willing to disclose internal perspectives about the market like this is a team with less secrecy that I will support with my hard earned cash. So many companies could stand up and take a lesson here from this (i.e. Apple, MS).

    Keep articles like this coming, and I'll keep coming back for more.

    Sincerely,

    ~Ryan
  • epyon96 - Wednesday, December 3, 2008 - link

    I have been an avid reader of this site for close to 8 years. I used to read almost every CPU, GPU, and novelty gadget article page to page. But over the years, my patience has gotten much lower, and I realize I get just as much enjoyment and information from reading just the first and last pages and skimming a few benchmarks.

    However, this is the first article in a while that I spent reading all of it and I thoroughly enjoyed it. These little back stories with a human element in one of the most interesting recent launches provides a refreshing change from boring benchmark-oriented articles.

    I hope to see an article of a similar nature on Nehalem and other Intel launches.

  • GFC - Wednesday, December 3, 2008 - link

    Wow, all I can say is that I loved this review. It was really enjoyable to read, and I must give my thanks to AnandTech and Carrell!
