Chipsets

With ATI, Intel, NVIDIA, ULi, SiS and VIA all competing for market share, the chipset business is particularly interesting right now. 

The AMD Chipset Battle: NVIDIA vs. VIA

The battle for AMD platform market share continues to be between NVIDIA and VIA.  VIA was largely responsible for the success of the very first AMD Athlon, as they were the only mainstream chipset provider for quite some time.  Since then, however, NVIDIA has stepped up to become a very serious competitor.  Every manufacturer we have talked to says that over the past year, NVIDIA has grown extremely quickly, taking control of virtually all of the high end K8 chipset business. 

Despite NVIDIA's incredible growth, VIA is still found on quite a few AMD motherboards for three reasons in particular: 1) Socket-A, 2) Socket-754 and 3) K8 Integrated Graphics solutions. 

The K7 market continues to be dominated by VIA, but as a dying market, it isn't one that we normally focus on.  The Socket-754 and K8 Integrated Graphics markets, however, are also dominated by VIA.  The Socket-754 market is very price sensitive right now, which is where VIA wins over NVIDIA.  Ironically enough, NVIDIA, the graphics manufacturer, does not have a K8 chipset shipping with integrated graphics, and thus gives up a large portion of K8 market share to VIA.

NVIDIA has been working on integrated graphics solutions for both the AMD and Intel markets: the C51 and C60, respectively.  Motherboard manufacturers have received these new chipsets with a relatively mixed response.  Both the C51 and C60 implement a much larger graphics core than the integrated S3 graphics that VIA offers in their chipsets. 

The problem is that NVIDIA's cheapest integrated solution is still more expensive than VIA's offerings, which are currently priced in the $13 - $14 range.  OEMs will gladly pay the added premium to be able to use the NVIDIA name in their marketing, but the rest of the market is simply looking for the cheapest overall solution, and NVIDIA's approach won't provide that.  So, it appears that although NVIDIA will be eating a bit of VIA's lunch, they will still leave a big hunk of it for VIA. 

If you're wondering why NVIDIA doesn't simply stick a small DX7 graphics core in their chipsets to compete with VIA, it comes down to profit margins.  NVIDIA needs to keep their margins high, and the ultra low end integrated graphics market simply can't sustain margins high enough to justify the time and resources required to produce chipsets cheap enough to compete with VIA.  Doing so would surely bring hard times to VIA, but with NVIDIA dominating the high end market, there's simply no economic reason to go after VIA's share of the AMD business. 

Motherboard manufacturers that we've talked to all expect the high end AMD market to be dominated by NVIDIA and ATI based solutions, while the integrated graphics offerings will be dominated by VIA.  Ironic, isn't it?

Comments

  • spinportal - Tuesday, June 14, 2005 - link

    It's not ironic or a surprise to see ATI or Nvidia pushing chipsets without an integrated graphics solution, since it would cannibalize their wonderful Turbo PCIe cards! When was the last time Intel's i/g tech or Via's S3 tech on an add-on board could compare or compete with any ATI or Nvidia offerings? It's basic hubris - you want 3D? you buy our cards at additional cost. No free lunch for you!
  • redhatlinux - Tuesday, June 14, 2005 - link

    Oooops that's FAB
  • redhatlinux - Tuesday, June 14, 2005 - link

    Great article, couldn't expect anything less from the boss. Back in the day AMD produced their own chipsets, but as so well put, $ talks. AMD MUST focus their R&D $ on the best possible Return on Investment, it's that simple. BTW I have a buddy, BRIAN, who worked at the Austin FAM plant over 4 years ago. These so-called 'new cores' were in R&D back then, SOI and 69nm gates as well. Brian still uses a Tyan mobo with 2 MP's. Still a pretty smokin' rig.

    Eric
  • Nayr - Tuesday, June 14, 2005 - link

    Thanks #33 for pointing that out.

    +1 Nerd power.

    =P
  • Viditor - Tuesday, June 14, 2005 - link

    "This is, of course, why DDR2 is becomming popular for mobile computing where thermal dissipation is more important than performance"

    True...both heat and power are lower with DDR2, which will make it an excellent choice for mobile.
    Both AMD and Intel will be going DDR2 at the start of 2006...
  • 2cpuminimum - Tuesday, June 14, 2005 - link

    What seemed odd was "Being able to run at much higher frequencies than DDR1 is the major advantage that DDR2 offers." when the greatest advantage supposedly held by DDR2 is lower heat production due to a slower core speed. Higher frequency isn't really much of an advantage when that frequency isn't high enough to compensate for higher latency. This is, of course, why DDR2 is becoming popular for mobile computing where thermal dissipation is more important than performance.
  • Viditor - Tuesday, June 14, 2005 - link

    Well let's see...Porkster is trying to use a stress test that wasn't benchmarked for multiple apps as a rationale for a supposed Intel superiority in multitasking...sigh.

    1. Has anyone done any tests that were designed for this? Well gee whiz I guess they have...
    http://tinyurl.com/chck7
    http://tinyurl.com/akueq
    http://tinyurl.com/7agle

    The results were that the X2 was vastly superior in 2 threads with heavy workloads, and that with 4 threads of heavy workload the P4EE 840 pulled equal (not better) because HT gives it superior load balancing. Of course in single threads the X2 was again vastly superior (in fact the 840EE proved slower than some other P4 chips...)

    2. What about the actual purpose of Tom's test...which platform handles stress better?

    Well, on the face of it, the X2 was the hands down winner, without contest!
    The Intel system kept crashing (5 times IIRC); they then restarted after switching from the nForce-based Intel platform to the pure Intel system. After that, the Intel platform had to be rebooted 3 times...
    The AMD platform just kept running the whole time!

    That said, Tom's test doesn't show anything worthwhile...

    1. The test methods are extremely flawed. To show stability of a platform, using 1 or 2 systems isn't scientific...it's just sensationalist.
    2. Many mistakes were made, both in the execution and the design of the test.

    As to porkster's (dubbed by many forums as the greatest Troll who ever lived!) assertion of AMD being driven by the "teenager" market, I must say that I'm glad to see so many major corporations hiring teenagers to head up their IT departments! :-)
  • 4lpha0ne - Tuesday, June 14, 2005 - link

    @porkster:
    I'm sure you'd also call Pentium Ds lemons, because they are also only able to run 2 threads at once. Everything else is a matter of priority settings (like low DivX encoding priority) and hyperthreading, which doesn't distinguish between low and high priority tasks.

  • 4lpha0ne - Tuesday, June 14, 2005 - link

    BTW, AMD already has a graphics core (in Geode). And I read that part (50 people or so) of the National Semiconductor team, which they took over, was already working on a 3D core.

    So this would make sense.
  • porkster - Tuesday, June 14, 2005 - link

    If you see the poor multitasking performance of the AMD X2, then you can expect the market share to drop big time, but are AMD users smart enough to avoid bad chips like the X2?

    AMD is riding the teenager market with a theme of join the club or feel out of it. It's peer group pressure into buying into poor hardware/old tech.

    Just check out the THG review of the AMD X2 and you won't want one of those lemons.

    http://www.tomshardware.com/stresstest/load.html

