There’s a lot to talk about with regards to NVIDIA and no time for a long intro, so let’s get right to it.

At the end of our Radeon HD 5850 Review we included this update:

Update: We went window shopping again this afternoon to see if there were any GTX 285 price changes. There weren't. In fact GTX 285 supply seems pretty low; MWave, ZipZoomFly, and Newegg only have a few models in stock. We asked NVIDIA about this, but all they had to say was "demand remains strong". Given the timing, we're still suspicious that something may be afoot.

Less than a week later, stories about NVIDIA’s GT200b shortages were everywhere. Fudo said that NVIDIA was unwilling to drop prices low enough to make the cards competitive. Charlie said that NVIDIA was going to abandon the high-end and upper mid-range graphics card markets completely.

Let’s look at what we do know. GT200b has around 1.4 billion transistors and is made at TSMC on a 55nm process. Wikipedia lists the die at 470mm², roughly 80% the size of the original 65nm GT200 die (576mm²). In either case it’s a lot bigger, and still more expensive, than Cypress’ 334mm² 40nm die.

Cypress vs. GT200b die sizes to scale

NVIDIA could get into a price war with AMD, but given that both companies make their chips at the same place and NVIDIA’s costs per chip are higher, it’s not a war that makes sense to fight.
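To see why die size dominates cost here, consider a back-of-the-envelope sketch. The formula below is a standard first-order dies-per-wafer approximation; actual TSMC wafer pricing and yields are not public, so this only illustrates the geometric disadvantage, not real costs.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """First-order estimate: wafer area divided by die area, minus an
    edge-loss correction. Ignores defect yield and scribe lines."""
    radius = wafer_diameter_mm / 2
    return math.floor(
        math.pi * radius**2 / die_area_mm2
        - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    )

gt200b = dies_per_wafer(470)   # 55nm GT200b at 470mm²
cypress = dies_per_wafer(334)  # 40nm Cypress at 334mm²

print(gt200b, cypress)  # ~119 vs ~175 candidate dies per 300mm wafer
```

Even if a 55nm wafer and a 40nm wafer cost exactly the same, NVIDIA gets roughly a third fewer candidate dies per wafer than AMD, before yield differences are even considered.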

NVIDIA told me two things. One, that they have informed some OEMs that they will no longer be making GT200b based products; that’s everything from the GTX 260 all the way up to the GTX 285. The EOL (end of life) notices went out recently, and NVIDIA has asked the OEMs to submit their allocation requests ASAP; otherwise they risk not getting any cards.

The second was that despite the EOL notices, end users should be able to purchase GeForce GTX 260, 275 and 285 cards all the way up through February of next year.

If you look carefully, neither of these statements directly supports or refutes the two articles above. NVIDIA is very clever.

NVIDIA’s explanation to me was that current GPU supplies were decided on months ago, and in light of the economy, the number of chips NVIDIA ordered from TSMC was low. Demand ended up being stronger than expected and thus you can expect supplies to be tight in the remaining months of the year and into 2010.

Board vendors have been telling us that they can’t get allocations from NVIDIA. Some are even wondering whether it makes sense to build more GTX cards for the end of this year.

If you want my opinion, it goes something like this. While RV770 caught NVIDIA off guard, Cypress did not. AMD used the extra area (and then some) allowed by the move to 40nm to double RV770, not an unpredictable move. NVIDIA knew they were going to be late with Fermi, knew how competitive Cypress would be, and made a conscious decision to cut back supply months ago rather than enter a price war with AMD.

While NVIDIA won’t publicly admit defeat, AMD clearly won this round. Obviously it makes sense to ramp down the old product in expectation of Fermi, but I don’t see Fermi with any real availability this year. We may see a launch with performance data in 2009, but I’d expect availability in 2010.

While NVIDIA just launched its first 40nm DX10.1 parts, AMD just launched $120 DX11 cards

Regardless of how you want to phrase it, there will be lower than normal supplies of GT200 cards in the market this quarter. With higher costs than AMD per card and better performance from AMD’s DX11 parts, would you expect things to be any different?

Things Get Better Next Year

NVIDIA launched GT200 on too old a process (65nm) and was thus late to move to 55nm. Bumpgate happened. Then we had the issues with 40nm at TSMC and Fermi’s delays. In short, it hasn’t been the best 12 months for NVIDIA. Next year, though, there’s reason to be optimistic.

When Fermi does launch, everything from that point on should theoretically be smooth sailing. There aren’t any process transitions in 2010; it’s all about execution at that point, and how quickly NVIDIA can get Fermi derivatives out the door. AMD will have virtually its entire product stack out by the time NVIDIA ships Fermi in quantity, but NVIDIA should have competitive products out in 2010. AMD wins the first half of the DX11 race; the second half will be a bit more challenging.

If anything, NVIDIA has proved to be a resilient company. Other than Intel, I don’t know of any company that could’ve recovered from NV30. The real question is how strong Fermi 2 will be. Stumble twice and you’re shaken; do it a third time and you’re likely to fall.

Chipsets: One Day You're In and the Next, You're Out


Comments

  • iwodo - Wednesday, October 14, 2009 - link

    Why has no one thought of an Nvidia chipset using PCI-Express 8x?

    Couldn't you theoretically make an mGPU with the IO functions (the only things left are SATA, USB and Ethernet) and another PCI-Express 8x link, so the mGPU communicates with another Nvidia GPU via its own lanes without going back to the CPU?
  • chizow - Wednesday, October 14, 2009 - link

    [quote]Let’s look at what we do know. GT200b has around 1.4 billion transistors and is made at TSMC on a 55nm process. Wikipedia lists the die at 470mm^2, that’s roughly 80% the size of the original 65nm GT200 die. In either case it’s a lot bigger and still more expensive than Cypress’ 334mm^2 40nm die.[/quote]

    Anand, why perpetuate this myth comparing die sizes and price on different process nodes? Surely someone with intimate knowledge of the semiconductor industry like yourself isn't claiming a single TSMC 300mm wafer on 40nm costs the same as one on 55nm or 65nm?

    A wafer is just sand with some copper interconnects; the raw material price means nothing for the end price tag. Cost is determined by capitalization of the assets used to manufacture goods; the actual raw material involved means very little. Obviously the uncapitalized investment in the new 40nm process exceeds that of 55nm or 65nm, so obviously prices would need to be higher to compensate.

    I can't think of anything "old" that costs more than the "new" despite the "old" being larger. If you think so, I have about 3-4 100 lb CRT TVs I want to sell you for current LCD prices. ;)

    In any case, I think the concerns about selling GT200b parts are a bit unfounded and mostly to justify the channel supply deficiency. We already know the lower bounds of GT200b pricing, the GTX 260 has been selling for $150 or less with rebates for quite some time already. If anything, the somewhat artificial supply deficiency has kept demand for Nvidia parts high.

    I think it was more of a calculated risk by Nvidia to limit their exposure to excess inventory in the channel, which was reportedly a big issue during the 65nm G92/GT200 to 55nm G92b/GT200b transition. There were also some rumors about Nvidia moving to more of a JIT delivery system to avoid some of the purchasing discounts some of the major partners were exploiting: they basically waited until the last day of the quarter for Nvidia to discount and unload inventory in an effort to beef up quarterly results.
  • MadMan007 - Wednesday, October 14, 2009 - link

    Possibly the most important question for desktop PC discrete graphics, from gamers who aren't worried about business analysis, is: what will the rollout of the Fermi architecture to non-highend, non-high-cost graphics cards look like?

    Is NV going to basically shaft that market by going with cut-down GT200-series DX10.1 chips like GT220? (OK, that's a little *too* lowend, but I mean the architecture.) As much as we harped on G92 renaming, at least it was competitive versus the HD4000 series in certain segments, and the large GT200s, the GTX 260 in particular, were OK after price cuts. The same is very likely not going to be true for DX10.1 GT200 cards, especially when you consider less tangible things like DX11, which people will feel better buying anyway.

    Answer that question and you'll know the shape of desktop discrete graphics for this generation.
  • vlado08 - Wednesday, October 14, 2009 - link

    I think that Nvidia is preparing Fermi to meet Larrabee. They are probably confident that AMD isn't a big threat; they know them very well and know what they are capable of and what to expect, but they don't know their new opponent. Everybody knows that Intel has very strong financial power, and when they want something they do it; sometimes it takes more time, but eventually they see everything through. They are a force not to be underestimated. If Nvidia has any time advantage, they should use it.
  • piesquared - Wednesday, October 14, 2009 - link

    [quote]Other than Intel, I don’t know of any company that could’ve recovered from NV30.[/quote]

    How about recovering from both Barcelona AND R600?
  • Scali - Thursday, October 15, 2009 - link

    AMD hasn't recovered yet. They've been making losses quarter after quarter. I expect more losses in the future, now that Nehalem has gone mainstream.
    Some financial experts have put AMD on their list of companies most likely to go bankrupt in 2010.
  • bhougha10 - Wednesday, October 14, 2009 - link

    The one thing that is not considered in these articles is the real world. In the real world, people were not waiting for the ATI 5800s to come out, but they are waiting for the GTX 300s to come out. The perception in the marketplace is that NVIDIA is name-brand and ATI is generic.
    It is not a big deal at all that the GTX 300s are late. Nvidia had the top cards for 3/4ths of the year; it is only healthy that ATI have the lead for 1/4th of the year. The only part that is bad for NVIDIA is that they don't have this stuff out for Christmas, and I am not sure that is even a big deal. Even so, these high-end cards are not gifts, they are budget items (i.e. like I plan to wait till the beginning of next year to buy this or that).
    Go do some online gaming if you think this is all made up. You will have your AMD fanboys, but the percentage is low. (sorry, didn't want to say fanboy)
    These new 210/220 GTs sound like crap, but the people buying them won't know that and will pay the money, none the wiser that they could have gotten a better value. They only thought of the NVIDIA name brand.
    Anyway, I say this in regards to the predictions of NVIDIA's eventual demise. Same reason these financial analysts can't predict the market well.

    Another great article; a lot of smart guys working for these tech sites.
  • Zool - Wednesday, October 14, 2009 - link

    So you think that the company that has the top card and is a name brand is the winner. Even with the top cards for 3/4ths of the year, they lost quite a lot of money against projected margins with the 4000-series Radeons on the market. They couldn't even take GT200 down to midrange cards; the price for them was too high. And you think that with that monster GT300 it will be better?
    They will surely sell it, but at what cost to them?
  • bhougha10 - Wednesday, October 14, 2009 - link

    AMD has lost money for the last 3 years (as far back as I looked), and not just a little money, a lot of money. NVIDIA on the other hand lost a little money last year. So, not sure of the point of that last reply.
    The point of the original post was that there is an intrinsic value or novelty that comes with a company. AMD has very little novelty. This is just the facts. I have AMD and NVIDIA stock; I want them both to win. It's good for everyone if they do.
