Challenging NVIDIA's Strategy: Are Two RV770s Faster than One GT200?

NVIDIA insists on building these massive GPUs while AMD is heading in the direction of multiple, smaller GPUs in order to keep development time and costs manageable. Does NVIDIA's strategy make sense? In order to find out we paired two Radeon HD 4850s in CrossFireX and ran through our benchmark suite, this time focusing on a comparison to the recently announced GeForce GTX 280 as well as the 9800 GX2. The results were surprising:

Average FPS                    AMD Radeon HD 4850 CF   NVIDIA GeForce GTX 280   NVIDIA GeForce 9800 GX2
Crysis                                          36.4                     34.3                      39.9
Call of Duty 4                                  88.2                     67.4                      73.2
Enemy Territory: Quake Wars                     53.7                     70.2                      62.2
Assassin's Creed                                51.9                     45.0                      52.6
Oblivion                                        39.5                     36.8                      35.6
The Witcher                                     20.9                     37.7                      37.6
Bioshock                                        68.6                     63.9                      75.4
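To make the comparison above easier to eyeball, the following short script (illustrative only, using the average-fps figures from the table) computes the per-game speedup of the Radeon HD 4850 CF pair relative to the single GeForce GTX 280; a value above 1.00x means the CrossFire setup wins that title.

```python
# Per-game speedup of Radeon HD 4850 CrossFire vs. GeForce GTX 280,
# using the average-fps numbers from the table above.
results = {
    "Crysis":                      (36.4, 34.3),
    "Call of Duty 4":              (88.2, 67.4),
    "Enemy Territory: Quake Wars": (53.7, 70.2),
    "Assassin's Creed":            (51.9, 45.0),
    "Oblivion":                    (39.5, 36.8),
    "The Witcher":                 (20.9, 37.7),
    "Bioshock":                    (68.6, 63.9),
}

for game, (cf_fps, gtx280_fps) in results.items():
    speedup = cf_fps / gtx280_fps
    print(f"{game:30s} {speedup:5.2f}x")
```

The spread is telling: the CF pair wins by as much as 31% (Call of Duty 4) but loses by a similar margin where multi-GPU scaling breaks down (ET:QW, The Witcher).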

So does AMD's approach invalidate NVIDIA's big-monolithic-GPU strategy? Not exactly. While it is true that two RV770s can outperform a single GT200 in many cases, you could also make the argument that two GT200s could outperform anything that AMD could possibly concoct (3 and 4-way CF scaling isn't nearly as good as 2-way). AMD's strategy makes sense, for AMD, but it's fundamentally no different than what NVIDIA is doing - AMD is simply targeting a different initial market and scaling up/down from there.

The scaling, or lack thereof, in games like Enemy Territory: Quake Wars highlights an important caveat with AMD's strategy: there are still software issues with SLI and CrossFireX. What is necessary is a truly seamless multi-GPU implementation, with shared frame buffer and where both GPUs operate as an extension of each other with direct GPU-to-GPU communication over a high speed (not PCIe) bus, similar to how AMD's Opteron or Intel's Nehalem work in multi-socketed systems.
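The economics of the two strategies hinge on that scaling efficiency. As a rough illustration (not AnandTech's methodology, and the 70% figure is an assumption for the sake of the example), a simple throughput model shows why 3- and 4-way configurations are less attractive than 2-way, even before driver issues enter the picture:

```python
# Crude model of multi-GPU (alternate-frame-rendering style) throughput.
# With n GPUs and a per-additional-GPU scaling efficiency s
# (0.0 = the extra GPU does nothing, 1.0 = perfect scaling):
#     fps_multi = fps_single * (1 + s * (n - 1))
# In practice s itself tends to fall as n grows, which only makes
# the diminishing returns below worse.
def multi_gpu_fps(fps_single: float, n_gpus: int, efficiency: float) -> float:
    return fps_single * (1 + efficiency * (n_gpus - 1))

# At an assumed 70% efficiency, a hypothetical 30 fps baseline:
for n in (1, 2, 3, 4):
    print(f"{n} GPU(s): {multi_gpu_fps(30.0, n, 0.7):5.1f} fps")
```

Each added card buys a smaller fraction of its theoretical performance, which is why a competitive single big GPU remains a viable counter to two small ones.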

Comments

  • Sunrise089 - Friday, June 20, 2008 - link

    Derek should really clarify the source of the problem then, Jarred. We all know on forums everyone says you need a 600 watt PSU to even run integrated graphics, but one reason I love AT's real power draw numbers is that they show how little power most sane systems really need. But casually mentioning a 1kW unit isn't enough for even 4850 CF and not explaining further is about as close to pure FUD as I've seen here.
  • DerekWilson - Friday, June 20, 2008 - link

    all these tests have been done at Anand's place and at-the-wall power should not be a problem for any of these recent articles.

    we did have problems with our 1kW thermaltake and our 1kW ocz PSUs with the GTX 280 in SLI. we couldn't get through a crysis run.

    in testing 4850 crossfire, the 1kW ocz power supply (elite xtreme) failed during call of duty.

    we had no problems with the 1200 W pcp&c turbo cool PSU we now have installed.

    our peak power numbers were measured using one of 3DMark's GPU-only feature tests. this is in order to isolate GPU power as much as possible for comparison purposes between different graphics cards.

    power draw at the wall will be MUCH larger when playing an actual game. this is because the CPU will be under load and system memory will likely be hit harder as well. the hard disk will also be active.

    i do apologize for not explaining it further. knowing what app we used to test power would probably have done enough to explain why the PSU crashed under game tests but not under our power test with a 1kW PSU ...

    4850 crossfire and up and gt200 sli and up will require absolutely massive amounts of power to run. we would be the first to say that a 1kW PSU was enough if it were -- but it is not.
  • semo - Saturday, June 21, 2008 - link

    so how much are you drawing at the wall? just saying "MUCH larger" doesn't mean anything.

    this also doesn't make much difference, as power ratings refer to how much can be delivered to the system - not how much can be pulled from the socket.

    in other words, there seems to be some confusion. could we get some clarification the next time you do a review for GPUs (e.g. at the 4870's launch)?
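[Editor's note: the wall-versus-delivered distinction raised here is worth making concrete. The numbers below are hypothetical, not AnandTech's measurements; a PSU's rating is DC watts delivered to the components, while a wall meter reads AC watts drawn from the socket, and the two differ by the unit's conversion efficiency.]

```python
# A PSU rated at 1000 W can deliver 1000 W of DC power; the AC draw
# measured at the wall is higher by the inverse of its efficiency:
#     wall_watts = dc_watts / efficiency
def wall_draw(dc_load_w: float, efficiency: float) -> float:
    return dc_load_w / efficiency

# e.g. a hypothetical 800 W DC load on an 80%-efficient unit pulls
# 1000 W at the wall, yet is still within spec for a 1 kW-rated PSU.
print(wall_draw(800.0, 0.80))  # 1000.0
```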
  • flagpole - Saturday, June 21, 2008 - link

    I have a 650W Silverstone Zeus ST650ZF powering my system right now, and it's handling a pair of 4850s in CrossFire fine.

    Not to mention the 4 hard drives, 5x 120mm fans, Swiftech water pump, an AMD 64 X2 4400+ @ 2.7 GHz, plus various other things like LEDs and cathode tubes sucking back power as well.
  • HOOfan 1 - Friday, June 20, 2008 - link

    How about the fact that NVIDIA has 2 CWT-built 1000W units certified on SLIZone for dual GTX 280?

    It really perplexes me that you guys think a 1kW PSU wouldn't be enough for GTX 280 SLI or for 4850 CrossFire. An 800+ watt PSU should be enough for either. NVIDIA even certified the Zalman 850 watt for dual 9800GX2. Jonnyguru stated that there was a problem specific to the GTX 280 that was not the fault of the PSUs.

    I think you guys really ought to have a talk with NVIDIA and ATI about this before you just claim that a 1kW PSU isn't enough for dual GPU with these two cards...because quite honestly that claim sounds rather preposterous to me.
  • strikeback03 - Friday, June 20, 2008 - link

    I was wondering the same - the review says they had power supply problems with 2 4850s in CF, even though the table directly above says that configuration drew 335.7W total system power.
  • Sunrise089 - Thursday, June 19, 2008 - link

    Why the heck are you guys having power supply failures with this card? I know it draws a decent amount of power, but when your load numbers are less than HALF the rating of the power supply, something seems fishy.
  • BPB - Thursday, June 19, 2008 - link

    I thought these cards were supposed to be better than current ATI cards for HD movies. Did you get a chance to play any movies? And if so, how was the audio?
  • jay401 - Thursday, June 19, 2008 - link

    75C idle, 90C load is insane, i don't care how well the components can tolerate it. It's like an oven inside your case, and -something- will give eventually on it because those temps are nuts. Why does AMD/ATI have such trouble putting out reasonably-temped cards even after yet another die shrink? :(
  • Clauzii - Saturday, June 21, 2008 - link

    They used the die-shrink to ramp up performance, which they needed AND achieved :)

    I hope some Arctic cooling solution will show up even though two slots might be used.
