Who Scales: How Much?

To calculate this scaling data, we simply looked at the percent performance improvement of two cards over one. Perfect scaling would show up as 100%, no improvement as 0%, and a negative number means the multi-GPU solution actually produced worse numbers than a single card. There's a lot of data here, so we'll break it down a bit before we present it all.
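For the curious, the metric described above can be sketched in a few lines of Python. This is just an illustration of the arithmetic, not the tooling we actually used:

```python
def scaling_percent(single_gpu_fps: float, dual_gpu_fps: float) -> float:
    """Percent performance improvement of two GPUs over one.

    100.0 means perfect scaling (double the frame rate), 0.0 means no
    improvement, and a negative value means the multi-GPU setup was slower.
    """
    return (dual_gpu_fps - single_gpu_fps) / single_gpu_fps * 100.0

# Example: 45 fps on one card, 81 fps on two cards -> 80% scaling.
print(scaling_percent(45.0, 81.0))
```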

It is possible to see more than 100% scaling in some tests for a few different reasons. Fluctuations in benchmark performance can push results just over 100%, and sometimes optimizations that enable better multi-GPU performance cut some work out, enabling higher performance than would otherwise have been possible. In one of the cases we test today, single GPU performance is limited to a certain framerate while multiple GPUs aren't hindered by the same limit. This artificially inflates the scaling percentage.

When looking at games that scale overall, we end up seeing both Radeon HD 4870 configurations (512MB and 1GB) performing worse than we expected. Granted, the 4870 1GB looks better if we only take 2560x1600 into account, but even then the Radeon HD 4850, GeForce GTX 260, and GTX 280 beat out the 4870 1GB in terms of average performance improvement (when performance improves). When we add in CPU limited cases, the 4870 cards look even worse. Consistently, across most of the ways we attempted to analyze the magnitude of performance improvement (averages, geometric means, per game, across games where all cards scaled, etc.), the Radeon HD 4850 and GeForce GTX 260 (and sometimes the GTX 280) did pretty well, while the Radeon HD 4870 cards came in pretty low on the list, with the 1GB often looking worse because it hit harder CPU limits at lower resolutions.

Hitting CPU or system limits speaks more to value than to desirability from a performance standpoint, but it's still important to look at all the cases. Configurations with lower baseline single GPU performance have more headroom to scale, but they might not always scale enough to be playable even if they scale well. So it's important to take both value and absolute performance data into account when looking at scaling.
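To make that point concrete, here's a hypothetical sketch of why scaling percentage alone isn't enough. The 30 fps playability cutoff here is an assumed example value, not a figure from our testing:

```python
# Assumed example threshold for "playable" -- not a number from the article.
PLAYABLE_FPS = 30.0

def evaluate_config(single_fps: float, dual_fps: float):
    """Return (scaling percent, whether the dual-GPU result is playable)."""
    scaling = (dual_fps - single_fps) / single_fps * 100.0
    return scaling, dual_fps >= PLAYABLE_FPS

# Excellent scaling (~83%), but the result is still unplayable:
print(evaluate_config(12.0, 22.0))
# Weaker scaling (75%), but a very playable result:
print(evaluate_config(40.0, 70.0))
```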

We've put all this data on our benchmark pages alongside the performance data to make it easier to see in context. There just isn't one good way to aggregate the data, or we would talk about it here. Depending on the type of analysis we try to do, we could present it in ways that favor either AMD or NVIDIA, and since there really isn't a "correct" way to do it, we've decided to just present the data per game and leave it at that.


  • nubie - Sunday, March 1, 2009 - link

    Have you ever used a tool or edited the game profile yourself?

    I had an 8800GTS 320MB that I used with AA extensively (also with 3D stereoscopic), and I was told on a forum to use nHancer to modify the profile into a specific mode of anti-aliasing; I am pretty sure it worked. It was the beta 162.50 Quadro drivers, I believe; you can just put your card's ID into the inf and they install and work great.

    It is possible the drivers work great and the control panel/GUI is piss-poor (a theory that may hold water).

    I wish that nVidia would open up the drivers a little so that control freaks like myself could really tweak the settings to where I want them.
  • Razorbladehaze - Monday, February 23, 2009 - link

    Yeah, in my main rig right now I have an i7 920 with two 1GB 4850's. I recently bought a third 4850 and installed it. There was some funky flickering, which I think was driver related, in BF2 and HoI2 in 3-way mode, but most games seemed okay. Funny thing is... the same thing happened when I tried a 3870X2 & 3870 in 3-way on my older X38 Core 2. I am really hoping these next articles will come with some additional commentary on image quality.

    To the person who stated that the 9800gtx+ was comparable to the 4850x2. What R U thinking???

    I have never really had a problem with any CrossFire setups before except with 3-way, and I wonder if it is the odd GPU count and if 4 would eliminate some issues. Looking forward to the upcoming articles; this is mostly a teaser with information many already knew.

    I agree that the new format for graphs looks good; line graphs are crap visually, but I think the default should be the 1920x1080/1200 that most people are interested in based on your survey data : )

    See I pay attention.

  • SiliconDoc - Wednesday, March 18, 2009 - link

    THANK YOU !
    " Yeah In my main rig right now i have a i7 920 with two 1gb 4850's i recently bought a third 4850 and installed it. There was some funky flickering, that i think was driver related in BF2 and HoI2 in 3-way mode, but most games seemed okay. Funny thing is... same thing happened when i tried a 3870x2 & 3870 in 3 way on my older x38 core2. I am really hoping these next articles will come with some additional commentary on image quality. "
    ________

    Another PERFECT REASON to not mention "image quality" - the red fan boy wins again - assist +7 by Derek !
    Amazing.
    Thank you.
  • MagicPants - Monday, February 23, 2009 - link

    Have you tried forcing on transparency supersampling? If you don't, edges defined by transparency in the texture won't AA. By default NVIDIA (ATI?) only AAs edges defined by depth differences.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    I've seen one review on that, with the blown up edge images, and the ati cards don't smooth and blur as well - they have more jaggies - so they HAVE to leave that out here - cause you know Derek loves that red 4850 and all the red cards -
  • Elfear - Monday, February 23, 2009 - link

    Derrick (or anyone else for that matter) can you comment on why the 4870 512MB Crossfire solution generally performed better than the 4870X2?
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Or WHY the GTX260 isn't praised to the stars for running 20 of 21 tests successfully - taking THE WIN !
    I guess it doesn't matter when a gamer spends hundreds and hundreds on their dual gpu setup then it epic fails at games... gosh that wouldn't be irritating, would it ?
    Amazing red bias...chizo pointed out the sapphire 4850 / other 4850 driver issues thankfully, while Derek has a special place in his heart for the bluebacked red card, and says so in the article - then translates that to ALL 4850's.
    DREAM ON if you think that would happen with ANY GREEN card Derek has ever tested!
  • MagicPants - Monday, February 23, 2009 - link

    I'd like to see an article that rates overall systems on price to performance. Try to get the highest fps for the least amount of money spent.

    As one reader mentioned, frame rates below 15 fps don't count because they're unplayable, so just pick a number between 10 and 15 and subtract it from the fps. Maybe vary it by game. Frame rates over 60 fps shouldn't count either because most monitors can't even show that.

    This would be interesting because even small tweaks would make a difference e.g. adding a $60 sound card might get you 4 or 5 fps in a few games and might pay for itself.
  • marsbound2024 - Monday, February 23, 2009 - link

    It doesn't look like the GTX 260 Core 216 provides much, if any, tangible benefit over the GTX 260 according to these tests. Sure, it had some wins, but they weren't very big ones, and it also had some losses, albeit not very big ones either. One would be tempted to just get a GTX 260 or 4850 and wait to upgrade until the next generation of cards comes out this summer. The time is getting close, anyway.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Good call.
    Even the 4830 or the 9800GT twice either, or the 9800gtx gts250 or 9600gt or 9600gso twice each - or the ati the ati - uhh... uh... do the reds have their "midrange" filled up ? Uh.. the 4670 ?
    LOL
    Yeah, nvidia needs more midrange - right ?
    LOL
    THE RED LIARS ARE SOMETHING ELSE!
