Chipsets: One Day You're In and the Next, You're Out

Presently, NVIDIA’s chipset business is far from dead. Its chipsets are in nearly every Apple computer on the market, not to mention countless designs from other OEMs. I’m not sure how much money NVIDIA is making from these chipsets, but they are selling.


NVIDIA won Apple's chipset business, Intel was not happy

Long term, I don’t see much of a future for NVIDIA’s chipset business. NVIDIA has said that it has no interest in pursuing an LGA-1156 chipset given Intel’s legal threats. And even if NVIDIA had a license to produce DMI chipsets, I’m not sure it would make sense.


NVIDIA's Advantage: A single-chip GeForce 9400M instead of a dated Intel solution

Once the ‘dales (Clarkdale and Arrandale) hit, every mainstream CPU from Intel is going to come with graphics on-package. Go out one more generation and Sandy Bridge brings the graphics on-die. AMD is doing the same thing starting in 2012.

It’s taken longer than expected, but there’s honestly no need for a third-party chipset maker anymore. Most of the performance differentiation in chipsets has moved onto the CPU die anyway; all that’s left are SATA, USB, and a bunch of validation work that no one likes doing. NVIDIA is much better off building a low-cost discrete GPU with GeForce 9400M-class performance and selling that. There’s much less headache involved in selling discrete GPUs than in selling chipsets. Plus, graphics is NVIDIA’s only value add when it comes to chipsets - everyone knows how to integrate a USB controller by now. I’d say the same about SATA, but AMD still has some AHCI silliness that it needs to sort out.

NVIDIA committed to supporting existing products in the channel and continues to poke fun at AMD with lines like this:

“On AMD platforms, we continue to sell a higher quantity of chipsets than AMD itself. MCP61-based platforms continue to be extremely well positioned in the entry CPU segments where AMD CPUs are most competitive vs. Intel”

As successful as NVIDIA’s AMD chipsets are today, AMD tells us that nearly all OEM designs going forward will use AMD chipsets. Again, NVIDIA’s chipset business is quite healthy today, but I don’t see much of a future in it - not that that’s a bad thing.

The only reason NVIDIA’s chipset business has lasted this long is that AMD and Intel couldn’t get their houses in order quickly enough. AMD is finally there and Intel is getting there, although it remains to be seen how well the next generation of Atom platforms will work in practice.


A pair of Ion motherboards we reviewed

The main reason Ion got traction in the press was that it could play Blu-ray content. If Intel had done the right thing from the start and paired Atom with a decent chipset, NVIDIA would never have had the niche for Ion to fit into.

Comments

  • dan101rayzor - Wednesday, October 14, 2009

    Anyone know when Fermi is coming out?
  • Zingam - Wednesday, October 14, 2009

    Sometime in the future. It appears to me that it will be a great general-purpose processor, but whether it will be a great graphics processor is not quite clear.
    If you want a great GPU for graphics, you won't go wrong with ATI in the near future. If you want to do scientific computing, Fermi will probably be the king next year.
    And then comes Larrabee, and things will get quite interesting.

    I wonder who will release a great next-generation mobile GPU first!
  • dragunover - Wednesday, October 14, 2009

    I believe the 5770 could easily be transplanted into laptops with a different cooling solution and lower clocks, and be called the 5870M.
  • dan101rayzor - Wednesday, October 14, 2009

    So for gaming an ATI 5800 is the best bet? Will Fermi be very expensive? Is Fermi not meant for gaming?
  • Scali - Wednesday, October 14, 2009

    We can't say yet.
    We don't know the prices of Fermi parts, their gaming performance, or when they'll be released.
    Which also means we don't know how 'good' the 5800 series is. All we know is that it's the best part on the market today.
  • swaaye - Monday, October 19, 2009

    It's always been rather hard to pronounce a part as "best". There's always an unknown future part in the works, and even good parts sometimes have downsides. For example, I can see a reason or two for going NV30 back in the day instead of R300 (namely OpenGL apps/games).

    I think it's safe to say that you can't go wrong with the 58x0 right now. AFAIK it doesn't have any major issues, the price is decent, and the competition is nowhere in sight.
  • MojaMonkey - Wednesday, October 14, 2009

    I have a simple brand preference for nVidia, and I use Badaboom, so I'd buy their products even if price/performance doesn't quite stack up. However, there are limits to the performance difference I'm willing to accept.

    I really hope the GT300 is a great, competitive product, as it sounds interesting.

    One thought: given that PC gaming is now on the back burner compared to consoles, maybe nVidia is being smart by branching out into mobiles and GPU computing? Both could be real growth areas as PC gaming starts to fade.
  • neomatrix724 - Wednesday, October 14, 2009

    People have been saying that PC gaming is dying for years. Most GPU developments and technologies have been created for the PC first; consoles get the downstream improvements.

    I don't foresee the PC disappearing as a gaming medium for a while.
  • MonkeyPaw - Wednesday, October 14, 2009

    I don't really see games disappearing from PCs either. However, consoles are very popular and a much more guaranteed market, since developers know console sales numbers and know that essentially every console owner is a gamer. Consequently, developers will target the lucrative console market (at $40-60 per game). This means their games have to run well enough on the consoles (in speed and graphics quality) to be appealing. That doesn't necessarily kill PC gaming, but it does slow the progression (and the need for expensive hardware). This, along with the fact that most gamers likely don't play on 30" LCDs, means high-end graphics card sales are aimed at a very limited audience. When I gamed on my PC, I had a 4670, since it would play most games just fine on a 19" LCD. I would never need a 4870, much less a 5870.

    So no, PC gaming isn't dead, but it's no longer the primary target market.
  • atlmann10 - Thursday, October 15, 2009

    The point made earlier is true: people have been saying PC gaming is dying for decades, and they've been wrong. Personally, I think we'll see more of this, but NVIDIA's room to gain is largely in things other than the gigantic PC graphics card.

    The online gaming business is even better than the console market. Why, you may ask? Because of the revenue stream and the delivery model. With an online PC game you can update it weekly, changing anything you want.

    Yes, the console market is concentrated to a point, because publishers know everyone will go out and buy a $45 game. But the subscription model is the killer. The gamer who plays an online game subscribes. They buy the $45 game, give you another $15 at the end of the month, and when you have an upgrade you have a direct path to them as a distributor.

    So that customer who bought your $45 game and pays you $15 a month for access logs into their game and sees a new update for $35 - do you want it? They click yes and download it. What have you done as a company? You've completely eliminated advertising for existing customers, along with packaging and distribution.

    So what did you have to do to produce that upgrade? Nothing but development. You've cut production and operating costs roughly in half. Plus, they bought your new game or upgrade at the same price, even though it cost you 60 percent less to make, and they pay you an automatic $15 a month as long as they play it.
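
To put rough numbers on that last comparison: a minimal Python sketch of per-customer revenue under the two models, using the example figures from the comment above. The 12-month subscription length is an assumption of the sketch; the comment doesn't say how long a player stays subscribed.

    # Per-customer revenue sketch: boxed retail sale vs. online subscription.
    # All dollar figures come from the comment above; the 12-month
    # subscription length is an added assumption.

    def retail_revenue(game_price=45.0):
        # One-off boxed sale: revenue ends at the register.
        return game_price

    def subscription_revenue(game_price=45.0, monthly_fee=15.0,
                             months=12, expansion_price=35.0):
        # Boxed sale, a year of subscription fees, and one digital expansion.
        return game_price + monthly_fee * months + expansion_price

    retail = retail_revenue()
    online = subscription_revenue()
    print(f"Retail only:  ${retail:.2f}")           # $45.00
    print(f"Subscription: ${online:.2f}")           # $260.00
    print(f"Multiple:     {online / retail:.1f}x")  # 5.8x

The exact figures matter less than the shape: recurring fees and digital distribution, not the one-time boxed sale, dominate per-customer revenue, which is the commenter's point.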

