The Multi-GPU Battle: ATI vs. NVIDIA

ATI's recent entry into the multi-GPU market with CrossFire has created competition in both price and performance at the high end of AMD and Intel chipsets.

ATI continues to have problems with their South Bridges, and thus, they are turning to ULi to supply the South Bridges for motherboard designs based on their new multi-GPU chipset.  ATI's closest partners are currently beta-testing the new ATI South Bridge, but none of them are confident in ATI's ability to bring it to market in time.  While they are all ready to use ULi-based South Bridges if necessary, they are continuing to work with ATI's South Bridge in their designs in order to keep ATI happy.

Given the lack of interest in any of ATI's previous chipsets, ATI knows that in order to get CrossFire off the ground with any sort of success, they will need some pretty powerful partners in the Taiwanese market.

Thus, ATI is talking to VIA and SiS about licensing out their multi-GPU technology, so that you will be able to purchase a motherboard based on an ATI, VIA or SiS chipset and run ATI graphics cards in multi-GPU modes.  VIA is particularly interested in this partnership, as they aren't the biggest fans of NVIDIA at this point.

First availability of ATI's CrossFire chipsets won't be until the July or August time frame, from what we're hearing.

NVIDIA is very curious about ATI's CrossFire, as it will mark the end of NVIDIA's exclusivity on multi-GPU platforms.  In order to help expand the SLI market, NVIDIA appears to be ready to drop the price of their nForce4 SLI chipset.  Currently priced at around $80, the chipset will drop to close to $40 later this year.  The goal is to enable SLI motherboards to be priced at $100 or less.  We have even heard that some very aggressive motherboard manufacturers are looking to offer sub-$80 nForce4 SLI motherboards by the end of this year.

At $80, it would be senseless not to buy an SLI motherboard, which is exactly what NVIDIA wants.

The AMD Chipset Battle: NVIDIA vs. VIA, ULi & SiS

45 Comments


  • spinportal - Tuesday, June 14, 2005 - link

    It's not ironic or a surprise to see ATI or Nvidia pushing chipsets without integrated graphics solutions, since it would cannibalize their wonderful Turbo PCIe cards! When was the last time Intel's i/g tech or Via's S3 tech on an add-on board could compare or compete with any ATI or Nvidia offerings? It's basic hubris - you want 3D? you buy our cards at additional cost. No free lunch for you!
  • redhatlinux - Tuesday, June 14, 2005 - link

    Oooops that's FAB
  • redhatlinux - Tuesday, June 14, 2005 - link

    Great article, couldn't expect anything less from the boss. Back in the day AMD produced their own chipsets, but as so well put, $ talks. AMD MUST focus their R&D $ on the best possible Return on Investment, it's that simple. BTW I have a buddy, BRIAN, who worked at the Austin FAM plant over 4 years ago. These so-called 'new cores' were in R&D back then. SOI and 69nm gates as well. Brian still uses a Tyan mobo with 2 MP's. Still pretty smokin rig.

    Eric
  • Nayr - Tuesday, June 14, 2005 - link

    Thanks #33 for pointing that out.

    +1 Nerd power.

    =P
  • Viditor - Tuesday, June 14, 2005 - link

    "This is, of course, why DDR2 is becoming popular for mobile computing where thermal dissipation is more important than performance"

    True...both heat and power are lower with DDR2, which will make it an excellent choice for mobile.
    Both AMD and Intel will be going DDR2 at the start of 2006...
  • 2cpuminimum - Tuesday, June 14, 2005 - link

    What seemed odd was "Being able to run at much higher frequencies than DDR1 is the major advantage that DDR2 offers." when the greatest advantage supposedly held by DDR2 is lower heat production due to a slower core speed. Higher frequency isn't really much of an advantage when that frequency isn't high enough to compensate for higher latency. This is, of course, why DDR2 is becoming popular for mobile computing where thermal dissipation is more important than performance.
  • Viditor - Tuesday, June 14, 2005 - link

    Well let's see...Porkster is trying to use a stress test that wasn't benchmarked for multiple apps as a rationale for a supposed Intel superiority in multitasking...sigh.

    1. Has anyone done any tests that were designed for this? Well gee whiz I guess they have...
    http://tinyurl.com/chck7
    http://tinyurl.com/akueq
    http://tinyurl.com/7agle

    The results were that the X2 was vastly superior in 2 threads with heavy workloads, and that with 4 threads of heavy workload the P4EE 840 pulled equal (not better) because HT gives it superior load balancing. Of course in single threads the X2 was again vastly superior (in fact the 840EE proved slower than some other P4 chips...)

    2. What about the actual purpose of Tom's test...which platform handles stress better?

    Well, on the face of it the X2 was the hands-down winner, without contestation!
    The Intel system kept crashing (5 times IIRC), then they restarted after changing from the Intel Nforce platform to the pure Intel system. After that the Intel platform had to be rebooted 3 times...
    The AMD platform just kept running the whole time!

    That said, Tom's test doesn't show anything worthwhile...

    1. The test methods are extremely flawed. To show stability of a platform, using 1 or 2 systems isn't scientific...it's just sensationalist.
    2. Many mistakes were made both in the performance and the design of the test..

    As to porkster's (dubbed by many forums as the greatest Troll who ever lived!) assertion of AMD being driven by the "teenager" market, I must say that I'm glad to see so many major corporations hiring teenagers to head up their IT departments! :-)
  • 4lpha0ne - Tuesday, June 14, 2005 - link

    @porkster:
    I'm sure you'd also call Pentium Ds lemons, because they are also only able to run 2 threads at once. Everything else is a matter of priority settings (like low DivX encoding priority) and Hyper-Threading, which doesn't distinguish between low- and high-priority tasks.

  • 4lpha0ne - Tuesday, June 14, 2005 - link

    BTW, AMD already has a graphics core (in Geode). And I read that part (50 people or so) of the National Semiconductor team which they took over was already working on a 3D core.

    So this would make sense.
  • porkster - Tuesday, June 14, 2005 - link

    If you see the poor multitasking performance of the AMD X2, then you can expect the market share to drop big time. But are AMD users smart enough to avoid bad chips like the X2?

    AMD is riding the teenager market with a theme of join the club or feel left out. It's peer-group pressure into buying poor hardware/old tech.

    Just check out THG's review of the AMD X2 and you won't want one of those lemons.

    http://www.tomshardware.com/stresstest/load.html

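The DDR1-vs-DDR2 latency trade-off debated in the comments above can be sketched with a quick back-of-the-envelope calculation. Note that the timings below are typical 2005-era figures chosen purely for illustration, not numbers from the article: a DDR2 part can run at a higher I/O clock yet still have a higher absolute CAS latency in nanoseconds.

```python
# Illustrative only: comparing absolute CAS latency of representative
# 2005-era DDR and DDR2 parts (timings are assumptions, not from the article).

def cas_latency_ns(cas_cycles: float, io_clock_mhz: float) -> float:
    """Convert a CAS latency given in clock cycles to nanoseconds."""
    return cas_cycles / io_clock_mhz * 1000

# DDR-400 at CL2.5: 200 MHz I/O clock
ddr400 = cas_latency_ns(2.5, 200)
# DDR2-533 at CL4: 266 MHz I/O clock
ddr2_533 = cas_latency_ns(4, 266)

print(f"DDR-400 CL2.5: {ddr400:.1f} ns")   # 12.5 ns
print(f"DDR2-533 CL4:  {ddr2_533:.1f} ns") # ~15.0 ns
```

Despite the 33% higher clock, the DDR2-533 part in this sketch takes longer in absolute time to begin returning data, which is the point the commenter is making: higher frequency alone doesn't compensate for looser timings.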
