Integrated Graphics

Beginning now, all new NVIDIA chipsets will ship with integrated graphics (which NVIDIA now calls the mGPU), regardless of the market segment they target. It's a particularly bold move by NVIDIA, but a much appreciated one given that the mGPU in all of its chipsets will receive PureVideo HD and can thus fully accelerate H.264/MPEG-2/VC-1 decode streams.

While it's unlikely that many would purchase a high-end motherboard based on the NVIDIA nForce 780a SLI chipset and simply use its integrated graphics, the mGPU in the 780a is the same GPU used in the 750a, 730a, 720a and GeForce 8200 based motherboards, so the discussion here is far-reaching.

AMD 780G vs. NVIDIA 780a Graphics Architecture

AMD has built a superior integrated graphics part this time around, both from a technical standpoint and in terms of realized performance. It isn't that AMD really went much further than NVIDIA in engineering something great: they simply selected a higher-performance core to integrate into their chipset than NVIDIA did.

Neither AMD nor NVIDIA told us exactly how they built their interface to the system bus and system memory, but the lack of a local framebuffer means that communication with system memory must be as fast and as low latency as possible. In both cases, the discrete GPU from which the integrated part is derived uses a 64-bit connection to local memory, while system memory offers a 128-bit wide bus; these parts make use of the wider bus to help compensate for the increased latency of system memory. Increasing local (on-die) cache would also help here, but since IGP solutions are built as cheaply as possible, it doesn't seem likely that we've got loads more cache to play with.
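As a rough sanity check on the bus-width discussion above, peak theoretical bandwidth can be worked out directly. The DDR2-800 data rate below is an assumed example for illustration, not a figure from our testing:

```python
# Peak theoretical bandwidth = bus width (in bytes) x transfers per second.
# DDR2-800 (800 MT/s) is an assumed data rate for illustration only.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_mt_s: int) -> float:
    return (bus_width_bits / 8) * (data_rate_mt_s * 1_000_000) / 1e9

local_64bit = peak_bandwidth_gb_s(64, 800)     # discrete card's local memory bus
system_128bit = peak_bandwidth_gb_s(128, 800)  # dual-channel system memory

print(f"64-bit bus:  {local_64bit:.1f} GB/s")   # 6.4 GB/s
print(f"128-bit bus: {system_128bit:.1f} GB/s") # 12.8 GB/s
```

At the same data rate, the 128-bit system memory bus offers twice the peak bandwidth of the 64-bit local bus on the discrete cards, which is why the wider bus can partially offset the latency penalty of going through the system memory controller.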

We used 3DMark's single texture test to try to get an idea of memory bandwidth. The test largely removes computation overhead and ends up just pulling in as much data as possible, as fast as possible, and throwing it up on the screen. The result in MTexels/s shows that NVIDIA has a bit of an advantage here, but the gap isn't huge. This means that performance differences will likely come down to compute power rather than bandwidth.

3DMark06 Single Texture Fillrate
  AMD 780G: 910.6 MTexels/s
  NVIDIA nForce 780a: 983.4 MTexels/s
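To put the fillrate gap in perspective, a quick calculation using only the two figures in the table:

```python
amd = 910.6     # AMD 780G, MTexels/s
nvidia = 983.4  # NVIDIA nForce 780a, MTexels/s

advantage = (nvidia / amd - 1) * 100
print(f"NVIDIA fillrate advantage: {advantage:.1f}%")  # roughly 8%
```

An advantage of roughly 8% is measurable but small, which supports the point that compute power, not bandwidth, will separate these two parts.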


Past here, NVIDIA and AMD integrated hardware diverge. AMD's solution is based on the RV610 graphics core. In fact, it is an RV610 core shrunk to 55nm and integrated into the Northbridge. This means we get 8 5-wide blocks of shader processors (SPs -- 40 total). In the very worst case, we get 8 shader ops per clock (which isn't likely to happen in any real situation). Compare this to NVIDIA's G86-based 8 SP offering with a maximum of 8 shader ops per clock and we see quite a difference emerge. AMD's IGP can handle 8 vector instructions per clock and then some, while the same code could run at only 2 vector instructions per clock on NVIDIA hardware, since a 4-component vector operation occupies 4 of NVIDIA's scalar SPs for a clock.

Of course, this difference isn't as devastating to NVIDIA as one might think at first blush. We must remember that NVIDIA cranks its shader clock up to ridiculous speeds while AMD's shaders all run at core clock speed. With AMD and NVIDIA core clocks both coming in at 500MHz, NVIDIA's shader core runs at 1200MHz. In spite of the fact that AMD's part can do more operations per clock (probably averaging out to somewhere between 3x and 4x, depending heavily on the application), NVIDIA is able to do 2.4x as many clocks per second, which closes the gap a bit.
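Putting the per-clock and clock-speed figures together gives a rough peak-throughput comparison. This is a back-of-the-envelope sketch from the numbers above (best case for AMD, with all 40 SPs kept busy), not a measured result:

```python
# Peak scalar shader throughput = SP count x shader clock.
amd_ops_per_sec = 40 * 500e6   # 8 blocks of 5-wide SPs at the 500MHz core clock
nv_ops_per_sec = 8 * 1200e6    # 8 scalar SPs at the 1200MHz shader clock

clock_ratio = 1200e6 / 500e6   # NVIDIA's shader clock advantage: 2.4x
peak_ratio = amd_ops_per_sec / nv_ops_per_sec

print(f"AMD peak:    {amd_ops_per_sec / 1e9:.1f} billion ops/s")  # 20.0
print(f"NVIDIA peak: {nv_ops_per_sec / 1e9:.1f} billion ops/s")   # 9.6
print(f"AMD peak advantage: {peak_ratio:.2f}x")                   # ~2.08x
```

So AMD's 5x per-clock width advantage shrinks to roughly a 2x peak advantage once NVIDIA's 2.4x shader clock is factored in; real-world results depend on how well each architecture keeps its units fed.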

The only discrete part with 8 SPs is the GeForce 8300, which is OEM only. As of this writing, NVIDIA has not confirmed any details with us other than core and shader speeds and the number of SPs in the part. They have stated that their integrated hardware is similar to the 8400/8500 in order to maximize the benefit of Hybrid SLI, so it's possible the number of texture and ROP units is 8 each. Of course, if half the number of SPs is "similar" to the 8400 and 8500 parts, we can't really be sure until NVIDIA confirms the details. We do know that AMD's hardware has 4 texture units and 4 render outputs since it is RV610. With so few SPs, and the competition sticking with a 4/4 texture/render configuration, we suspect that this is what NVIDIA has done as well.

What is clear is that either way, AMD's hardware is more robust than NVIDIA's offering. Our performance tests reflect this, as we will soon show.

Comments (38)

  • SiliconDoc - Wednesday, May 7, 2008 - link

    Maybe I'm the only one, but I'm so sick of every new PC component having a global warming psychotic power consumption "feature set", as if any of we end users actually give a d-a- you know what.
    Heck, maybe I'm lone gunman here, but it really makes me sick, over and over again, as if I'd buy their piece of crap because they have some wattage bean counter going gaga about their lower power requirements.
    Hey, here's an idea. STOP GAMING, lower yer rezz, use a tiny 13 inch monitor, and put monitor sleep mode to kick on in one minute.
    Better yet, shut your system off, you're wasting the earth, and get outside for heat from the sun or put on a wool sweater, or dunk ter head in the creek if you're too warm.
    Who are they fooling ? They're pushing 1,000 watt PS's, then we have to hear this beanny watt counter crud. Yeah, right after the Q6600, 4 HD's, 2 DVD's, memory card readers, dual monitor outputs, ohhh.. and make sure you got a 700 watt plus supergigajiggawatt or she might not run.....
    I for one would just like to say, to noone and nobody in particular, go take a green time out.
    PS- this article is no more or less green than any other, so it isn't a target. I guess it's clear this is a power surge and perhaps an overload. Well, good!
  • Donkey2008 - Wednesday, May 7, 2008 - link

    You are absolutely right, especially the application of this technology to notebooks, which is pure insanity. Why would I care if my laptop could switch from discrete to integrated GPU to save battery power and provide me another hour or so of use? I am trying to destroy the earth so I want as little battery life as possible so I can plug it in and use more resources.

    As for desktops, those crazy tree-huggers want you to use less power so that your systems run more efficient and PUT OUT LESS HEAT. This would be a complete waste for those who dropped several hundred dollars for water-cooling and giant, ridiculous, circus clown heatsinks. This isn't even mentioning the enviro-psychos who like to use their computer as a floor heater in winter.

    How about you take your finger out of your nose because it is clearly in too far and blocking your brain from releasing any common sense.
  • SiliconDoc - Wednesday, May 7, 2008 - link

    Why stop at that, you need the wind up power notebook, like the ones selling for the 3rd world. No plugging in and no charging any battery except by turning the crank handle.
    If you're gaming on a battery, it's not just your finger up your nose, but likely your neighbors as well, to hold it up so high. Where are you that you cannot plug in ... up in that airplane ... saving all that jet fuel ?... or did you drive your Yugo to some way out park to hack, away from civilization, also an energy saver, no doubt. Have fun disposing of the polluting battery, too.

    Desktops: If your system is putting out so much heat that you need to run a refrigerator to "cool just the system down", you certainly are not "saving" any power either.. DUH.
    Gigantic heatsinks (and their gargantuan fans) are power-hungry users trying to crank out the last bit of mhz, often with voltage increases, huh ... DUH. Maybe the jet engine they replaced was a cheap sleeve bearing, but they didn't "save power".

    Not sure exactly what the donkey you were trying to say, since you didn't make any sense, but then, that's what it's all about, huh. Preening your green self while wildly flailing about and praising the gigantic power (savings ? lol ) drain you are, anyway - while firing up the 250 watt triple 3d sli maxxed super whomper game.

    I think if you had any common sense, you'd "get it".


  • The Jedi - Wednesday, May 7, 2008 - link

    Jigga-WHAAAT?!
  • zander55 - Wednesday, May 7, 2008 - link

    Why is ESA only available on the highest end model? Nvidia wants the industry to adopt and implement it into their hardware but won't even put it into their own stuff?
  • crimsonson - Tuesday, May 6, 2008 - link

    I don't understand why so many pages and charts are devoted to pure performance for motherboards. Unless there is a physical flaw or bad drivers, the performance difference between these motherboards is normally next to nil!
    I understand stability, overclocking, and power consumption. But looking at these charts, a lot of them show minuscule differences that can often be explained by settings, other components, or bad drivers. I am not saying bench testing isn't useful. But I don't think it is necessary to view dozens of charts with little or no difference. In fact, it would make more sense to go into detail where there is a significant difference. I think your attention to detail gets the best of you :D

    My .02

    In general I do think you guys do awesome work.

  • wjl - Tuesday, May 6, 2008 - link

    Right. The benchmarks are not that interesting, and also which IGP runs which game at how many fps more or less is pretty uninteresting - as if the world had only gamers.

    As much as I like the image quality provided by Nvidia products, they're still a no-go if you want open source drivers - and here is much room for improvement. I won't buy (nor sell) any of them unless they follow the good examples of Intel and ATI/AMD.

    So my next mb - which will definitely have an IGP again - will be of the other mentioned makers, depending on whether I need an AMD or an Intel CPU next time.
  • strikeback03 - Thursday, May 8, 2008 - link

    I have to use the restricted drivers on both my desktop (discrete NVIDIA) and laptop (discrete ATi) in Ubuntu.

    And I've never understood the point of windows that wobble.
  • sprockkets - Tuesday, May 6, 2008 - link

    Tru, I love not having to install any drivers for compiz-fusion on my Intel G31 system. It actually runs it better than my 6150 AMD system.

    But, under load with movies and compiz and other stuff graphics wise running, the 6150 doesn't crap out as much.

    Good chipset, waiting for Intel's version. I have been an AMD person for a long time, but, for $70 a 2ghz Pentium Allendale works great for me.

    WTB a gen 2 Shuttle XPC in silver with either the G45 or Intel's. 3ghz Wolfdale will do nicely.
  • wjl - Wednesday, May 7, 2008 - link

    BTW: tried movies (MythTV) together with Compiz, and that really didn't look nice, even on my 6150/430 Nvidia. Only after switching off most or all desktop effects, the picture became more stable...
