Integrated Graphics Performance & GeForce Boost

As expected, AMD's 780G consistently outperforms NVIDIA's integrated graphics across the board:

| Benchmark (1024x768) | AMD 780G (fps) | NVIDIA 780a / GeForce 8200 (fps) | % Performance Advantage (AMD) |
|---|---|---|---|
| Half Life 2: Episode Two | 43.1 | 30.2 | 42.7% |
| Microsoft Flight Simulator X | 24.6 | 21.4 | 15.0% |
| Company of Heroes | 29.4 | 19.4 | 51.5% |
| Unreal Tournament 3 | 22.9 | 16.8 | 36.3% |
| Crysis (Low Quality) | 20.3 | 16.9 | 20.1% |

Once again, although this comparison matters more for the nForce 730a and GeForce 8200 motherboards, NVIDIA's mGPU simply can't keep up with AMD's.

The performance advantage ranges from 15% to just over 50%, and the only surprising part is that AMD doesn't do even better given the theoretical advantage it holds over NVIDIA. As we mentioned before, it's doubtful that many will buy an nForce 780a board and use its integrated graphics to play games, but AMD's performance advantage holds true for the 750a and GeForce 8200 platforms as well. For a company that has been criticizing Intel's integrated graphics performance of late, NVIDIA should be delivering nothing short of the best scores here.
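The advantage column above is simply the ratio of the two frame rates. As a quick sanity check, here is a minimal Python sketch (frame rates copied from the table; the `pct_advantage` helper is just our own illustrative name) that reproduces the published percentages:

```python
# Reproduce the "% Performance Advantage (AMD)" column from the table above.
# All frame rates (fps) are copied straight from the table.

def pct_advantage(amd_fps: float, nvidia_fps: float) -> float:
    """Relative advantage of the AMD 780G over the NVIDIA mGPU, in percent."""
    return (amd_fps / nvidia_fps - 1.0) * 100.0

results = {
    "Half Life 2: Episode Two":     (43.1, 30.2),
    "Microsoft Flight Simulator X": (24.6, 21.4),
    "Company of Heroes":            (29.4, 19.4),
    "Unreal Tournament 3":          (22.9, 16.8),
    "Crysis (Low Quality)":         (20.3, 16.9),
}

for game, (amd, nvidia) in results.items():
    print(f"{game}: {pct_advantage(amd, nvidia):.1f}%")

# Output:
# Half Life 2: Episode Two: 42.7%
# Microsoft Flight Simulator X: 15.0%
# Company of Heroes: 51.5%
# Unreal Tournament 3: 36.3%
# Crysis (Low Quality): 20.1%
```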

If the mGPU performance of the nForce 780a (or any of the other new NVIDIA chipsets) isn't enough, you can simply toss in a low-end discrete GPU (dGPU) and NVIDIA's latest drivers will enable GeForce Boost. GeForce Boost is nothing more than SLI between the mGPU and a dGPU. Given how slow the mGPU is, GeForce Boost will only actually improve performance with a low-end dGPU, so NVIDIA supports it only with either a GeForce 8400 GS or a GeForce 8500 GT.

With GeForce Boost enabled, the display driver also comes up with a new name for the mGPU + dGPU combo. Combine an nForce 780a with a GeForce 8400 GS and you get a "GeForce 8500"; pair the 780a with an 8500 GT and the driver will report the mix as a "GeForce 8600".

| NVIDIA 780a + GeForce 8400 GS (fps) | Half Life 2: Episode Two | MS Flight Simulator X | Company of Heroes | Crysis | Unreal Tournament 3 |
|---|---|---|---|---|---|
| mGPU alone | 30.2 | 21.4 | 19.4 | 16.9 | 16.8 |
| dGPU alone | 41.2 | 39.6 | 38.7 | 20.3 | 21.4 |
| mGPU + dGPU (GeForce Boost) | 50.3 | 39.6 | 45.5 | 30.1 | 22.2 |
| % Increase due to GeForce Boost | 22.0% | 0.0% | 17.6% | 48.3% | 3.7% |

With a GeForce 8400 GS we actually see decent scaling from the dGPU alone to GeForce Boost mode. The added performance is large percentage-wise, but in raw frame rates it's nothing huge; you're basically getting a smoother gaming experience with GeForce Boost enabled, at least in those games where this bridgeless form of SLI is supported.
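Note that the "% Increase" rows in these GeForce Boost tables (and in the Hybrid CrossFire table further down) compare the mGPU + dGPU result to the discrete card running alone, not to the mGPU. A short Python sketch using the 8400 GS numbers from the table above (the `boost_scaling` name is ours) shows the calculation; any tiny deviations from the table come from rounding of the published frame rates:

```python
# "% Increase due to GF Boost" measures the mGPU + dGPU result against the dGPU alone.
# Frame rates (fps) are taken from the GeForce 8400 GS table above.

def boost_scaling(boost_fps: float, dgpu_fps: float) -> float:
    """Percent gain of GeForce Boost (mGPU + dGPU) over the discrete card alone."""
    return (boost_fps / dgpu_fps - 1.0) * 100.0

# (dGPU alone, mGPU + dGPU with GeForce Boost)
games_8400gs = {
    "Half Life 2: Episode Two":     (41.2, 50.3),
    "Microsoft Flight Simulator X": (39.6, 39.6),
    "Company of Heroes":            (38.7, 45.5),
    "Crysis":                       (20.3, 30.1),
    "Unreal Tournament 3":          (21.4, 22.2),
}

for game, (dgpu, boost) in games_8400gs.items():
    print(f"{game}: {boost_scaling(boost, dgpu):+.1f}%")

# Output:
# Half Life 2: Episode Two: +22.1%
# Microsoft Flight Simulator X: +0.0%
# Company of Heroes: +17.6%
# Crysis: +48.3%
# Unreal Tournament 3: +3.7%
```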

| NVIDIA 780a + GeForce 8500 GT (fps) | Half Life 2: Episode Two | MS Flight Simulator X | Company of Heroes | Crysis | Unreal Tournament 3 |
|---|---|---|---|---|---|
| mGPU alone | 30.2 | 21.4 | 19.4 | 16.9 | 16.8 |
| dGPU alone | 47.5 | 35.3 | 48.6 | 24.8 | 33.1 |
| mGPU + dGPU (GeForce Boost) | 47.7 | 37.8 | 49.3 | 26.3 | 27.3 |
| % Increase due to GeForce Boost | 0.0% | 7.0% | 1.4% | 6.0% | -17.5% |

GeForce Boost does next to nothing with an 8500 GT, and in the case of Unreal Tournament 3 performance actually decreases. Of course, it's a safe bet that future driver updates will improve GeForce Boost's scaling and performance.

AMD supports a similar technology, Hybrid CrossFire, with its 780G:

| AMD 780G + Radeon 3450 (fps) | Half Life 2: Episode Two | MS Flight Simulator X | Company of Heroes | Crysis | Unreal Tournament 3 |
|---|---|---|---|---|---|
| mGPU alone | 43.1 | 24.6 | 29.4 | 20.3 | 22.9 |
| dGPU alone | 51.6 | 30.2 | 36.8 | 23.3 | 28.5 |
| mGPU + dGPU (Hybrid CrossFire) | 61.4 | 37.3 | 54.3 | 31.4 | 32.7 |
| % Increase due to Hybrid CrossFire | 19.0% | 23.5% | 47.6% | 34.8% | 14.7% |

Comments

  • SiliconDoc - Wednesday, May 7, 2008 - link

    Maybe I'm the only one, but I'm so sick of every new PC component having a global warming psychotic power consumption "feature set", as if any of us end users actually give a d-a- you know what.
    Heck, maybe I'm a lone gunman here, but it really makes me sick, over and over again, as if I'd buy their piece of crap because they have some wattage bean counter going gaga about their lower power requirements.
    Hey, here's an idea. STOP GAMING, lower yer rezz, use a tiny 13 inch monitor, and put monitor sleep mode to kick on in one minute.
    Better yet, shut your system off, you're wasting the earth, and get outside for heat from the sun or put on a wool sweater, or dunk ter head in the creek if you're too warm.
    Who are they fooling ? They're pushing 1,000 watt PS's, then we have to hear this beanny watt counter crud. Yeah, right after the Q6600, 4 HD's, 2 DVD's, memory card readers, dual monitor outputs, ohhh.. and make sure you got a 700 watt plus supergigajiggawatt or she might not run.....
    I for one would just like to say, to noone and nobody in particular, go take a green time out.
    PS- this article is no more or less green than any other, so it isn't a target. I guess it's clear this is a power surge and perhaps an overload. Well, good!
  • Donkey2008 - Wednesday, May 7, 2008 - link

    You are absolutely right, especially the application of this technology to notebooks, which is pure insanity. Why would I care if my laptop could switch from discrete to integrated GPU to save battery power and provide me another hour or so of use? I am trying to destroy the earth so I want as little battery life as possible so I can plug it in and use more resources.

    As for desktops, those crazy tree-huggers want you to use less power so that your systems run more efficiently and PUT OUT LESS HEAT. This would be a complete waste for those who dropped several hundred dollars for water-cooling and giant, ridiculous, circus clown heatsinks. This isn't even mentioning the enviro-psychos who like to use their computer as a floor heater in winter.

    How about you take your finger out of your nose because it is clearly in too far and blocking your brain from releasing any common sense.
  • SiliconDoc - Wednesday, May 7, 2008 - link

    Why stop at that, you need the wind up power notebook, like the ones selling for the 3rd world. No plugging in and no charging any battery except by turning the crank handle.
    If you're gaming on a battery, it's not just your finger up your nose, but likely your neighbors as well, to hold it up so high. Where are you that you cannot plug in ... up in that airplane ... saving all that jet fuel ?... or did you drive your Yugo to some way out park to hack, away from civilization, also an energy saver, no doubt. Have fun disposing of the polluting battery, too.

    Desktops: If your system is putting out so much heat that you need to run a refrigerator to "cool just the system down", you certainly are not "saving" any power either.. DUH.
    Gigantic heatsinks (and their gargantuan fans) are power-hungry users trying to crank out the last bit of mhz, often with voltage increases, huh ... DUH. Maybe the jet engine they replaced was a cheap sleeve bearing, but they didn't "save power".

    Not sure exactly what the donkey you were trying to say, since you didn't make any sense, but then, that's what it's all about, huh. Preening your green self while wildly flailing about and praising the gigantic power (savings ? lol ) drain you are, anyway - while firing up the 250 watt triple 3d sli maxxed super whomper game.

    I think if you had any common sense, you'd "get it".


  • The Jedi - Wednesday, May 7, 2008 - link

    Jigga-WHAAAT?!
  • zander55 - Wednesday, May 7, 2008 - link

    Why is ESA only available on the highest-end model? Nvidia wants the industry to adopt it and implement it in their hardware, but won't even put it into their own stuff?
  • crimsonson - Tuesday, May 6, 2008 - link

    I don't understand why so many pages and charts are devoted to pure performance for motherboards. Unless there is a physical flaw or bad drivers, the performance difference between these motherboards is normally next to nil!
    I understand stability, overclocking, and power consumption. But looking at these charts, a lot of them show minuscule differences that can often be explained by settings, other components, or bad drivers. I am not saying bench testing isn't useful. But I don't think it is necessary to view dozens of charts with little or no difference. In fact, it would make more sense to go into detail where there is a significant difference. I think your attention to detail gets the best of you :D

    My .02

    In general I do think you guys do awesome work.

  • wjl - Tuesday, May 6, 2008 - link

    Right. The benchmarks are not that interesting, and which IGP runs which game at how many fps more or less is also pretty uninteresting - as if the world consisted only of gamers.

    As much as I like the image quality provided by Nvidia products, they're still a no-go if you want open source drivers - and here is much room for improvement. I won't buy (nor sell) any of them unless they follow the good examples of Intel and ATI/AMD.

    So my next mb - which will definitely have an IGP again - will be of the other mentioned makers, depending on whether I need an AMD or an Intel CPU next time.
  • strikeback03 - Thursday, May 8, 2008 - link

    I have to use the restricted drivers on both my desktop (discrete NVIDIA) and laptop (discrete ATi) in Ubuntu.

    And I've never understood the point of windows that wobble.
  • sprockkets - Tuesday, May 6, 2008 - link

    Tru, I love not having to install any drivers for compiz-fusion on my Intel G31 system. It actually runs it better than my 6150 AMD system.

    But, under load with movies and compiz and other stuff graphics wise running, the 6150 doesn't crap out as much.

    Good chipset, waiting for Intel's version. I have been an AMD person for a long time, but, for $70 a 2ghz Pentium Allendale works great for me.

    WTB a gen 2 Shuttle XPC in silver with either the G45 or Intel's. 3ghz Wolfdale will do nicely.
  • wjl - Wednesday, May 7, 2008 - link

    BTW: I tried movies (MythTV) together with Compiz, and that really didn't look nice, even on my 6150/430 Nvidia. Only after switching off most or all desktop effects did the picture become more stable...
