40nm Supply Redux

If you have seen our Radeon 5800 supply article, then you know that AMD is currently trying to come to terms with a significant shortage of Cypress dice. Since the 5800 series launched in September, TSMC’s yields have taken a hit as the company ramps up 40nm production. And while this is resulting in more usable chips per week than when AMD started, output is still lower than it was supposed to be. Compounding matters is high demand for these cards thanks to their performance, features, and a lack of significant competition from NVIDIA at this time.

So when we were briefed about the 5970, we asked AMD point-blank whether it was a good idea to be launching another Cypress based card so soon, and at a time when they already don’t have enough chips to go around. Their answer was equally straightforward: why not?

The design is done and AMD is already capable of building the 5970. For AMD, there is no benefit in waiting; no matter what they do, anything with a Cypress chip in it today is going to sell out. Holding back may be slightly more egalitarian, but as the 5970 is a luxury part, it’s not a high-volume part anyhow, so its introduction isn’t going to significantly disrupt 5800 shipments even if it does use 2 GPUs per card. Ultimately I don’t think we would even be having this discussion unless the profit margin on the 5970 is higher than the 5870, so at some point this comes down to AMD doing what is most profitable for them.

At $600, AMD isn’t going to sell a ton of 5970s, and the launch numbers reflect this. While the 5800 series launched with tens of thousands of cards, the 5970 is launching with merely thousands. Even as a low-volume part, we’re expecting the 5970 to sell out just as fast as any 5800 card did. But depending on what AMD does with future chip shipments and what TSMC’s yields do, this may be the first product line where demand finally gets met in the near future.

We also had a chance to talk to AMD about the overall 40nm supply situation. AMD of course isn’t very pleased with the situation, but this is something they’ve apparently planned for, after their first 40nm test chips came back as being less impressive than their 55nm and 65nm test chips were. Besides TSMC’s subpar yields, AMD is unable to get as many wafer starts as they’d like, which is compounding the issue.

Finally, we’re told that the TSMC situation is continuing to improve, and that AMD currently expects the Cypress chip supply to pick up in December. What level of production “pick up” translates into we’re not sure, but it’s likely still less than demand. In talking to AMD, they didn’t seem confident in being able to keep any Cypress-based products in stock through Christmas. Supplies will improve through the end of the year, but it sounds like it’s going to be 2010 before supply and demand finally balance out.


114 Comments


  • SJD - Wednesday, November 18, 2009 - link

    Thanks Anand,

    That kind of explains it, but I'm still confused about the whole thing. If your third monitor supported mini-DP then you wouldn't need an active adapter, right? Why is this, when mini-DP and regular DP are the 'same' apart from the actual plug size? I thought the whole timing issue was only relevant when wanting a third 'DVI' (/HDMI) output from the card.

    Simon
    Reply
  • CrystalBay - Wednesday, November 18, 2009 - link

    WTH is really up at TSMC? Reply
  • Jacerie - Wednesday, November 18, 2009 - link

    All the single game tests are great and all, but for once I would love to see AT run a series of video card tests where multiple instances of games like EVE Online are running. While single instance tests are great for the FPS crowd, all us crazy high-end MMO players need some love too. Reply
  • Makaveli - Wednesday, November 18, 2009 - link

    Jacerie, the problem with benching MMOs, and why you don't see more of them, is all the other factors that come into play. You now have to deal with server latency, and you have no control over how many players are on the server at any given time when running benchmarks. There are just too many variables that would make the benchmarks unrepeatable and invalid for comparison purposes! Reply
  • mesiah - Thursday, November 19, 2009 - link

    I think more what he is interested in is how well the card can render multiple instances of the game running at once. This could easily be done with a private server or even a demo written with the game engine. It would not be real world data, but it would give an idea of performance scaling when multiple instances of a game are running. Myself being an occasional "Dual boxer" I wouldn't mind seeing the data myself. Reply
  • Jacerie - Thursday, November 19, 2009 - link

    That's exactly what I was trying to get at. It's not uncommon for me to be running at least two instances of EVE with an entire assortment of other apps in the background. My current 3870X2 does the job just fine, but with Windows 7 out and DX11 around the corner I'd like to know how much money I'm going to need to stash away to keep the same level of usability I have now with the newer cards. Reply
  • Zool - Wednesday, November 18, 2009 - link

    The "so fast" is only because 95% of the games are DX9 Xbox ports. Crysis has still been the most demanding game out there for quite a while (it should be added that it has a very lazy engine). In Age of Conan the difference between DX9 and DX10 is more than half the fps (and with plenty of those effects on screen, even 1/3). Those advanced shader effects that they are showing in demos are actually much more demanding on the GPU than the DX9 shaders. It's just that they don't mention it. It will be the same with DX11. A full DX11 game with all those fancy shaders will be on the level of Crysis. Reply
  • crazzyeddie - Wednesday, November 18, 2009 - link

    ... after their first 40nm test chips came back as being less impressive than **there** 55nm and 65nm test chips were. Reply
  • silverblue - Wednesday, November 18, 2009 - link

    Hehe, I saw that one too. Reply
  • frozentundra123456 - Wednesday, November 18, 2009 - link

    Unfortunately, since playing MW2, my question is: are there enough games that are sufficiently superior on the PC to justify the initial expense and power usage of this card? Maybe that's where Eyefinity for AMD and PhysX for nVidia come in: they at least differentiate the PC experience from the console.
    I hate to say it, but to me there just don't seem to be enough games optimized for the PC to justify the price and power usage of this card, unless one has money to burn.
    Reply
