SLI Performance Throwdown: GTX 280 SLI vs. 9800 GX2 Quad SLI

We had two GeForce GTX 280s on hand and a plethora of SLI bridges, so of course we had to run them in SLI. Remember that a single GTX 280 uses more power than a GeForce 9800 GX2, so two of them are going to use a lot of power. So much power, in fact, that our OCZ EliteXStream 1000W power supply wasn't enough: while the SLI system would boot and get into Windows, we couldn't actually complete any benchmarks. All of the power supplies on the GTX 280 SLI certified list are at least 1200W units. We didn't have any on hand, so we rigged up a second system with a separate power supply and used that second PSU to power the extra GTX 280. A 3-way SLI setup using GTX 280s may end up drawing more power from the wall than most household circuits can safely provide.
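For a rough sense of where those wattages land relative to a wall outlet, here is a back-of-the-envelope sketch. The per-card board power, rest-of-system draw, and PSU efficiency are assumed round figures for illustration, not measurements from our testbed:

```python
# Back-of-the-envelope wall draw for a hypothetical 3-way GTX 280 SLI rig.
# All figures below are assumptions for illustration, not measured values.

GTX280_TDP_W = 236          # assumed rated board power per GTX 280
REST_OF_SYSTEM_W = 250      # assumed CPU, motherboard, drives, fans
PSU_EFFICIENCY = 0.82       # assumed PSU efficiency under load

dc_load_w = 3 * GTX280_TDP_W + REST_OF_SYSTEM_W   # what the PSU must deliver
wall_draw_w = dc_load_w / PSU_EFFICIENCY          # what the outlet actually sees

CIRCUIT_W = 120 * 15                   # a 15A household circuit at 120V
CONTINUOUS_LIMIT_W = 0.8 * CIRCUIT_W   # common 80% rule for continuous loads

print(f"DC load: {dc_load_w} W, wall draw: {wall_draw_w:.0f} W")
print(f"Circuit: {CIRCUIT_W} W peak, {CONTINUOUS_LIMIT_W:.0f} W continuous")
```

Under those assumptions the rig alone pulls roughly 1170W from the wall, uncomfortably close to the 1440W continuous ceiling once a monitor or anything else shares the circuit.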

Although a single GeForce GTX 280 loses to a GeForce 9800 GX2 in most cases, scaling from two to four GPUs is never as good as scaling from one to two. That forces the question: is a pair of GTX 280s in SLI faster than a 9800 GX2 Quad SLI setup?

Let's look at the performance improvements from one to two cards across our games:

Game               GTX 280 SLI (1 to 2 cards)   9800 GX2 SLI (1 to 2 cards)
Crysis             +50.1%                       +30.3%
Call of Duty 4     +62.8%                       +64.0%
Assassin's Creed   +38.9%                       +12.7%
The Witcher        +54.9%                       +36.2%
Bioshock           +68.4%                       +63.7%
Oblivion           +72.3%                       -35.7%
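
The percentages above are simply the relative frame-rate change from adding the second card. A minimal sketch of that calculation (the frame rates in it are hypothetical placeholders, not our benchmark numbers):

```python
def sli_scaling(fps_single: float, fps_sli: float) -> float:
    """Percent change in frame rate when going from one card to two."""
    return (fps_sli / fps_single - 1.0) * 100.0

# Hypothetical frame rates for illustration only, not our benchmark data.
print(f"{sli_scaling(30.0, 45.0):+.1f}%")  # +50.0%: healthy SLI scaling
print(f"{sli_scaling(42.0, 27.0):+.1f}%")  # -35.7%: a Quad SLI regression
```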

Crysis, Assassin's Creed, The Witcher and Oblivion are all cases where Quad SLI either scales poorly or actually loses performance going from one GX2 to two, giving NVIDIA a reason to offer two GTX 280s over a clumsy Quad SLI setup.

Crysis

Thanks to poor Quad SLI scaling, the GX2 SLI and the GTX 280 SLI perform the same, despite the GTX 280 being noticeably slower than the 9800 GX2 in single-card mode.

Call of Duty 4

When it does scale well, however, the GX2 SLI outperforms the GTX 280 SLI setup, just as you'd expect.

Assassin's Creed

The Witcher

Bioshock

Oblivion

Triple and quad SLI configurations sometimes run into serious issues where performance is actually reduced; Oblivion at 2560 x 1600 is one of those situations, and as a result the GTX 280 SLI delivers the better overall experience.

While we'd have trouble recommending a single GTX 280 over a single 9800 GX2, a pair of GTX 280s will probably give you a more hassle-free and consistent experience than a pair of 9800 GX2s.

Comments

  • tkrushing - Wednesday, June 18, 2008 - link

    Say what you want about this guy, but this is partially true, which is part of why AMD/ATI is in the position it's in. They are slowly climbing out of that hole, though. It would have been nice to see the 4870 X2 hit the market first. As we know, competition = lower prices for everyone!
  • 7Enigma - Wednesday, June 18, 2008 - link

    Anand, I'm all for free speech and such, but this guy is going a bit far. I read these articles at work frequently and once the dreaded C-word is used I'm paranoid I'm being watched.
  • Mr Roboto - Thursday, June 19, 2008 - link

    I thought those comments would be deleted already. I'm sure no one cares if they are. I don't know what that person is so mad about.
  • Mr Roboto - Wednesday, June 18, 2008 - link

    Ouch... Looks like you hit a nerve with AMD/ATI's marketing team!
  • bobsmith1492 - Monday, June 16, 2008 - link

    The main benefit of the 280 is the reduced power draw at idle! If I read the graph right, the 9800 draws ~150W more than the 280 at idle. Since idle is where computers spend the majority of their time, depending on how much you game, that difference can add up to a significant cost.
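
To put a rough dollar figure on that point, here is a minimal sketch; the electricity rate and idle hours are assumptions for illustration, not numbers from the comment or the review:

```python
# Rough annual cost of a 150W idle-power difference.
# Rate and idle hours are assumed figures for illustration only.

IDLE_DELTA_W = 150        # extra idle draw cited in the comment above
RATE_PER_KWH = 0.11       # assumed electricity price in $/kWh
IDLE_HOURS_PER_DAY = 20   # assumed: the machine idles most of the day

kwh_per_year = IDLE_DELTA_W / 1000 * IDLE_HOURS_PER_DAY * 365
print(f"{kwh_per_year:.0f} kWh/year -> ${kwh_per_year * RATE_PER_KWH:.2f}/year")
```

At those assumptions the difference works out to roughly $120 a year, which supports the commenter's point.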
  • kilkennycat - Monday, June 16, 2008 - link

    Maybe you should look at the GT200 series from the point of view of nVidia's GPGPU customers - the academic researchers, technology companies requiring fast number-crunching available on the desktop, and the professionals in graphics effects and computer animation - not necessarily real-time, but as quick as possible... The CUDA-using crew. The Tesla initiative. This is an explosively expanding and highly profitable business for nVidia - far more profitable per unit than any home desktop graphics application. An in-depth analysis by AnandTech of what the GT200 architecture brings to these markets over and above the current G8xx/G9xx architecture would be highly appreciated. I have a very strong suspicion that sales of the GT2xx series to the (ultra-rich) home user who has to have the latest and greatest graphics card is just another way of paying the development bills, and not the true focus for this particular architecture or product line.

    nVidia is strongly rumored to be working on the true 2nd-gen DX10.x product family, to be introduced early next year. Considering the size of the GTX 280 silicon, I would expect them to transition the 65nm GTX 280 GPU to either TSMC's 45nm or 55nm process before the end of 2008 to prove out the process with this size of device, then in 2009 introduce their true 2nd-gen GPU/GPGPU family on that process. A variant on Intel's "tick-tock" process strategy.
  • strikeback03 - Tuesday, June 17, 2008 - link

    But look at the primary audience of this site. Whatever nvidia's intentions are for the GT280, I'm guessing more people here are interested in gaming than in subsidizing research.
  • Wirmish - Tuesday, June 17, 2008 - link

    "...requiring fast number-cruching available on the desktop..."

    GTX 260 = 715 GFLOPS
    GTX 280 = 933 GFLOPS
    HD 4850 = 1000 GFLOPS
    HD 4870 = 1200 GFLOPS
    4870 X2 = 2400 GFLOPS

    Take a look here: http://tinyurl.com/5jwym5
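
For reference, those peak figures fall out of shader count times shader clock times FLOPs per clock. A minimal sketch, assuming the commonly cited specs for these parts and each vendor's own FLOPs-per-clock counting (NVIDIA counts a dual-issued MAD + MUL as 3, AMD counts a MAD as 2):

```python
# Theoretical peak GFLOPS = shaders * shader clock (GHz) * FLOPs per clock.
# Specs below are the commonly cited figures for these parts (assumed here).

def peak_gflops(shaders: int, clock_ghz: float, flops_per_clock: int) -> float:
    return shaders * clock_ghz * flops_per_clock

print(f"GTX 280: {peak_gflops(240, 1.296, 3):.0f} GFLOPS")  # ~933
print(f"HD 4870: {peak_gflops(800, 0.750, 2):.0f} GFLOPS")  # 1200
```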
