The Bet, Would NVIDIA Take It?

In the spring of 2005, ATI had R480 on the market (Radeon X850 series), a 130nm chip that was a mild improvement over R420, another 130nm chip (Radeon X800 series). The R420 to R480 transition is an important one, because it's these sorts of trends that NVIDIA would look at to predict ATI's future actions.

ATI was still trying to work through execution on the R520, which became the Radeon X1800, but as you may remember, that part was delayed. ATI was having a problem with the chip at the time, specifically with a particular piece of IP. The R520 delay ended up causing a ripple that affected everything in the pipeline, including the R600, which was itself delayed for other reasons as well.

When ATI looked at the R520 in particular, it saw a big chip that didn't deliver good bang for the buck, so in going from the R520 to the R580 ATI made an unexpected architectural change: it broke the 1:1:1:1 ratio.

The R520 had a 1:1:1:1 ratio of ALUs to texture units to color units to z units, but in the R580 ATI changed this relationship to 3:1:1:1, increasing arithmetic power without increasing texture/memory capabilities. ATI had noticed that the shading complexity of applications was going up while bandwidth requirements weren't, which justified the architectural shift.
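To make that trade-off concrete, here's a minimal back-of-the-envelope sketch; the unit counts and clock speed are illustrative assumptions for a 1:1 versus 3:1 design, not the actual R520/R580 specifications:

```python
# Hypothetical peak-rate comparison: what breaking the 1:1 ALU:texture
# ratio buys. All numbers below are made up for illustration.

def peak_rates(alu_units: int, tex_units: int, clock_mhz: float):
    """Return (peak ALU ops/s, peak texture fetches/s), one op per unit per clock."""
    clock_hz = clock_mhz * 1e6
    return alu_units * clock_hz, tex_units * clock_hz

# 1:1 design: equal ALU and texture unit counts.
alu, tex = peak_rates(alu_units=16, tex_units=16, clock_mhz=600)
print(f"1:1 design -> {alu / 1e9:4.1f} G ALU ops/s, {tex / 1e9:4.1f} G fetches/s")

# 3:1 design: triple the ALUs, same texture units.
alu, tex = peak_rates(alu_units=48, tex_units=16, clock_mhz=600)
print(f"3:1 design -> {alu / 1e9:4.1f} G ALU ops/s, {tex / 1e9:4.1f} G fetches/s")
```

Tripling the ALUs triples peak arithmetic throughput (9.6 to 28.8 G ops/s in this sketch) while the texture fetch rate, and thus the memory bandwidth the chip needs fed, stays flat.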

This made the R520 to R580 transition a much larger one than anyone would’ve expected, including NVIDIA. While the Radeon X1800 wasn’t really competitive (partially due to its delay, but also due to how good G70 was), the Radeon X1900 put ATI on top for a while. It was an unexpected move that undoubtedly ruffled feathers at NVIDIA. Used to being on top, NVIDIA doesn’t exactly like it when ATI takes its place.

Inside ATI, Carrell made a bet. He bet that NVIDIA would underestimate R580, that it would look at what ATI did with R480 and expect R580 to follow in a similar vein. He bet that NVIDIA would be surprised by R580, and that the chip to follow G70 would be huge: NVIDIA wouldn't want to lose again, so G80 would be a monster.

ATI had hoped to ship the R520 in early summer 2005; it ended up shipping in October, almost six months later, and as I already mentioned, it delayed the whole stack. The negative ripple effect made it all the way into the R600 family. ATI speculated that NVIDIA would design its next part (G71, the 7900 GTX) to be around 20% faster than R520, not expecting much out of R580.


A comparison of die sizes for ATI and NVIDIA GPUs over the years; these boxes are to scale. Red is ATI, green is NVIDIA.

ATI was planning the R600 at the time and knew it was going to be big; it started at 18mm x 18mm, then grew to 19, then 20. Engineers kept asking Carrell, "Do you think their chip is going to be bigger than this?" His answer: "Definitely! They aren't going to lose; after the 580 they aren't going to lose." Whether or not G80's size and power was a direct result of ATI getting too good with R580 is up for debate. I'm sure NVIDIA will argue that it was by design and had nothing to do with ATI, and obviously we know where ATI stands, but the fact of the matter is that Carrell's prediction was correct: the next generation after G70 was going to be a huge chip.

If ATI was responsible, even in part, for NVIDIA's G80 (GeForce 8800 GTX) being as good as it was, then ATI ensured its own demise. Not only was G80 good, but R600 was late, very late. Still impacted by the R520 delay, R600 also had a serious problem with its AA resolve hardware that took a while to work through, and it ended up being a part that wasn't very competitive. With G80 as strong as it was and without working AA resolve hardware, the R600 had an even tougher time competing. ATI had lost the halo; its biggest chip ever couldn't compete with NVIDIA's big chip, and for the next year ATI's revenues and market share would suffer. While this was going on, Carrell was still trying to convince everyone working on the RV770 that they were doing the right thing, that winning the halo didn't matter...just as ATI was suffering from not winning the halo. He must've sounded like a lunatic at the time.

When Carrell and crew were speccing the RV770, the prediction was that not only would it be good against similarly sized chips, but that it would be competitive overall, because NVIDIA would still be in overshoot mode after G80. Carrell believed that whatever followed G80 would be huge, and that RV770 would have an advantage because NVIDIA would have to charge a lot for that chip.

Carrell and the rest of ATI were in for the surprise of their lives...

Comments (116)

  • nezuko - Thursday, December 4, 2008 - link

    I think that phrase describes the graphics field: one year you win, another year you lose. But in those situations, only the hardworking and the tough can turn it all upside down, and the ATI team made it happen. Now I'm relieved I made the decision to buy a 4670; it may not lead in performance, but it's still a big bang for the buck. And with Catalyst 8.12, I'm even more grateful that I bought this video card. I've downloaded it and am testing it now.

    Would Anand write another article about the GP-GPU programming languages that make data-parallel computing possible?

    Well, I'm considering building my Leo platform in H2 2009 when the AM3 Deneb, SATA 3, and RD890 are out.
  • JimiP - Thursday, December 4, 2008 - link

    Like many before me have said, this has to be one of the best articles I've ever read here at AT. It really puts things into perspective. We (the consumers) are always criticizing or praising everything that comes out and don't take into account the amount of hard work and time put into the release. I'm a 4850 owner, and I couldn't be happier with the performance I've received. I would like to personally thank ATI/AMD and the entire team that put RV770 into play. Absolutely brilliant.

    I would also like to thank Anand for sharing this awesome experience with us.
  • zshift - Thursday, December 4, 2008 - link

    I have to say this was a great article. Great idea to write about the story behind these guys and the RV770. Must've been a helluva relief when they realized how great the GPUs were doing in the market, especially after taking such huge risks. For these guys to pull through the way they did, with the whole GDDR5 issue and the die-shrink/physical limitations, is amazing. I thought I was stressed in college; I can't imagine what it's like to design something like this for 3 years without even being sure it'll work in the end. That's one hell of a resolve, and it makes me like ATI a bit more than I already do.

    Keep writing great articles here, this is my favorite site to read reviews on, and this is another reason why.

    go anand! :p
  • strikeback03 - Thursday, December 4, 2008 - link

    I agree with everyone else that the article is very well written. I am not sure if these would even be the right guys to ask, but did you bring up with them any of the driver issues your other recent articles have mentioned? As you have mentioned before, it is probably not the best business plan to assume NVIDIA will screw up again, and they should probably get their CrossFire support in order for the good feelings about this strategy to continue.
  • Dyno1979 - Thursday, December 4, 2008 - link

    Definitely one of the best articles I've read lately. And I didn't even notice that typo, probably because I was reading the article instead of looking at it.

    5 stars
  • CarrellK - Thursday, December 4, 2008 - link

    The "sweet spot" strategy would have amounted to *nothing* without the efforts of many very talented engineers (and a little luck as Anand has noted). They made the 770 happen and deserve the lion's share of the credit.

    I didn't think Anand would use this for anything other than background here-and-there in future articles. I fully expected him to politely cut me off at some point and say "about those future architectures..." which would have led to Eric, Mike, and Mark telling a different interesting story. Thanks to Anand et al for telling this part of the 770 story. Responding to a comment or two in the posts:

    * Sorry to quench the speculation - the AMD purchase had no effect on the 770's execution. Dirk Meyer and the other AMD executives supported Rick, a guy that they really didn't know, during some pretty tough times at AMD. They did their jobs so that we could do ours.

    * The price range for 770-based cards was determined back in 2005 - it was an essential factor limiting the GPU cost, one of the big gambles. We had no clue what nV's 2008 pricing would be, but we did know what the gamers wanted. At launch we were tempted oh so briefly to launch at a higher price given the competitor's product offerings. It took some willpower for the starving man (us) to pass up a banquet (profits). We had a sneaking suspicion there was a lot of unhappiness about the direction prices had gone, and didn't want to be a party to that for the sake of a few weeks' better revenue. Greed never pays. Remembering your customers does.

    P.S. We don't keep any dart-board pictures of Anand around the office. However I *do* recall seeing his picture somewhere and thinking at the time that it *would* make a good dart target. Just a thought... :-)

  • lyeoh - Sunday, December 7, 2008 - link

    You guys got the sweet spot right as far as I'm concerned (I'm not sure if it's true for others - does it show up in the units sold?)

    Before the ATI 3800 (RV670) and Nvidia 8800GT, it seemed like after shelling out a few hundred US dollars you'd only get low/medium quality in current games. And cheaper cards ranged from pathetic to unusable for new games.

    So I stuck to playing old games on my old video card (Ti4200) - which was decent in its time.

    After the beginning of the new "sweet spot" era, this year I bought a 9800GT (and a new PC). While the 9800GT is not as good as AMD/ATI's offerings in hardware performance terms, I was concerned about ATI's drivers/software. A colleague tried an ATI card on his office PC, but in the end he had to switch to Nvidia to get his multi-screen setup on Linux working the way he wanted, and I had seen a fair number of complaints from others. So far Nvidia's drivers have been OK for me, whether in Windows or Linux.

    On the other hand I've seen too many Nvidia cards failing in hardware terms (bad caps, bad whatever). So pick your poison ;).

    But if the cards aren't totally crap, it often takes less time to just replace a faulty card, than to keep tinkering with drivers and software configs (sometimes to no avail).

    Anyway, many thanks for helping to make stuff affordable, even though I picked Nvidia again ;).

    In the end I'm still back to mostly playing old games though...
  • MrSpadge - Saturday, December 6, 2008 - link

    Thanks Anandtech, ATI & AMD for this amazing article!

    And I'd like to add a point which has not been raised yet, at least in this discussion: the "small and fast enough" strategy only works because GPUs have hit the realm where they're power-limited!

    The point is, whenever you go multi-GPU you lose performance due to inefficiencies and communication delays, and some transistors are also lost to redundant logic. If you had the choice between one 100-million-transistor chip or two 50-million-transistor ones, the 100-million one would certainly be faster, assuming both could run at the same clock speed; previously, clock speed was determined by chip design (basically identical in this example) and process (identical).

    But GT200 is too big; it cannot fully spread its clock speed wings because it's power-limited. Imagine GT200 at a 1.5 - 1.8 GHz shader clock - it would be much more in line with performance expectations. RV770, on the other hand, can be pushed quite a bit, and on the 4870 it chews up lots of power for such a small chip - but that's OK because this power envelope has been accepted and the performance is there to justify it. And the 2-GPU versions are successful because the power envelope on such "freak" cards is larger.

    And another frequently overlooked aspect: not all of GT200's transistors contribute to game performance. The 30 64-bit-capable shader units must be large and don't help games at all (and probably won't for quite some time). This is a very forward-looking feature for games, but one of immediate benefit for GP-GPU.

    MrS
  • Frallan - Friday, December 5, 2008 - link

    Thank you m8!

    Not only for delivering good products but also for delivering good information and entertainment.

    Please convey to the other "Fellows" the heartfelt thanks of this community.
  • JimmiG - Thursday, December 4, 2008 - link

    Congratulations to Anandtech for one of the most interesting articles this year. Congratulations to ATI/AMD for putting out their best and most exciting product since R300/9700 Pro.

    The industry really needed something like RV770. When the 9700 Pro came out in 2002, it was at the cutting edge of technology and performance, far ahead of the previous champion, the Ti4600, yet it launched at only $399. Nvidia launched the 8800 Ultra and GTX280 at $800 and $600 respectively, even though neither GPU introduced any significant new features, only moderately higher framerates.

    I currently have a 4850 512MB which I bought in July and I love it... It runs all my favorite games at great framerates and with fantastic image quality at 1680x1050. Still, I wouldn't consider myself an "ATI fan". When it's time for me to upgrade again, I will buy the best card in the $200 range and won't care whether the sticker on the GPU fan is green or red.
