Competitive Integrated Graphics?

Since the G45 GMCH is built on a 65nm process, Intel had a larger transistor budget to work with than in G35's GMCH - and thus Intel increased the number of unified shader processors from 8 to 10.

Game (1024 x 768)              Intel G45   Intel G35
Enemy Territory: Quake Wars    6.8 fps     5.6 fps
Company of Heroes              24.5 fps    16.5 fps
Race Driver GRID               3.7 fps     2.8 fps
Age of Conan                   7.9 fps     6.1 fps
Crysis                         9.3 fps     7.9 fps
Spore                          10.8 fps    9.7 fps
Half Life 2 Episode Two        40.7 fps    27.9 fps
Oblivion (800 x 600)           22.7 fps    14.7 fps
Intel's G45 is definitely faster than G35, and its performance in Half Life 2 is promising, but for the most part the graphics core is a letdown.

These shader processors are roughly comparable to NVIDIA's, meaning that in terms of raw processing power the G45 GMCH has 1/24th the execution resources of NVIDIA's GeForce GTX 280 and its 240 SPs. The comparison isn't flattering even against a more mainstream solution: the recently announced GeForce 9500 GT has only 32 SPs, which still puts the G45 at around 1/3 the clock-for-clock power of a 9500 GT.

Then there's the clock speed issue. While the GeForce 9500 GT runs its array of SPs at 1.4GHz, Intel runs its shader processors at 800MHz. Both Intel's and NVIDIA's architectures have a peak throughput of one shader instruction per SP per clock, so while the 9500 GT has 3.2x the execution resources of the G45 GMCH, it also enjoys a 75% clock speed advantage - giving it a 5.6x raw throughput advantage over the G45 GMCH.

But how about comparing Intel's graphics core to NVIDIA's IGP equivalent? The GeForce 8200 is NVIDIA's latest integrated graphics core; it has 8 SPs and runs them at a clock speed of 1.2GHz - giving NVIDIA a 20% advantage in raw throughput on paper.
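The arithmetic behind these comparisons is simple enough to sketch. This is illustrative Python, not anything from Intel or NVIDIA, and it assumes exactly one shader instruction per SP per clock as stated above - real-world performance depends on far more than raw peak throughput:

```python
# Back-of-the-envelope peak shader throughput from the figures above.
# Assumes one shader instruction per SP per clock on every architecture.
def peak_ginstr(sps: int, clock_ghz: float) -> float:
    """Peak shader instructions per second, in billions."""
    return sps * clock_ghz

g45    = peak_ginstr(10, 0.8)   # Intel G45 GMCH   ->  8.0
gf9500 = peak_ginstr(32, 1.4)   # GeForce 9500 GT  -> 44.8
gf8200 = peak_ginstr(8, 1.2)    # GeForce 8200 IGP ->  9.6

print(f"9500 GT vs G45: {gf9500 / g45:.1f}x")    # 5.6x raw throughput
print(f"8200 vs G45: {(gf8200 / g45 - 1):.0%}")  # 20% advantage on paper
```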

There are many unknowns here, however. NVIDIA has special execution units for transcendentals and it's unclear whether or not Intel has the same. There are also times when Intel must relegate vertex processing to the CPU, which can cause strange performance characteristics. But the point is that Intel's latest graphics core, at least on paper, is competitive with what NVIDIA is offering.

Neither the Intel nor the NVIDIA solution can hold a candle to AMD's 780G, which has a peak throughput of 40 shader operations per clock compared to 10 for Intel and 8 for NVIDIA. The reason AMD can do so much more is that each of its 8 processing clusters is 5-wide, just like on its desktop GPUs. If there's enough parallel data to work on, each one of these clusters can output five shader instructions per clock. Real-world utilization is somewhere between one and five per cluster, depending on how efficient AMD's real-time compiler is and on the code being run, but this generally translates into AMD dominating the IGP performance charts even at a lower clock speed than both the Intel and NVIDIA parts.
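The 780G's spread between best and worst case falls directly out of that cluster arrangement. A quick illustrative sketch (again, not vendor code - just the figures from the text):

```python
# AMD 780G peak shader ops per clock: 8 clusters, each with 5-wide ALUs,
# vs 10 scalar SPs on Intel's G45 and 8 on NVIDIA's GeForce 8200.
AMD_CLUSTERS = 8
AMD_ALU_WIDTH = 5   # each cluster can issue up to five shader instructions

def amd_ops_per_clock(per_cluster_utilization: int) -> int:
    """780G shader ops per clock at a given per-cluster utilization (1-5)."""
    return AMD_CLUSTERS * per_cluster_utilization

print(amd_ops_per_clock(AMD_ALU_WIDTH))  # best case: 40 (vs 10 Intel, 8 NVIDIA)
print(amd_ops_per_clock(1))              # worst case: 8, no better than NVIDIA
```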

Does this all really matter?

This next point is one that I've quietly argued for the past few years. ATI and NVIDIA have always acted holier than thou because of their IGP performance superiority over Intel, but I argue that they are no better than the boys in blue.

Despite both ATI and NVIDIA being much faster than Intel, the overall gameplay experience delivered by their integrated graphics solutions is still piss poor - even in older games. Try running Oblivion, a 2.5-year-old title, on even AMD's 780G and you'll find that you have to run it at the lowest visual quality settings and the lowest resolutions (800 x 600, max) to get playable frame rates. At those settings, the game looks absolutely horrible.

In games that aren't visually demanding, performance doesn't actually matter and all three vendors end up doing just fine. Fundamentally, both ATI and NVIDIA want to sell more discrete cards, so they aren't going to enable overly fast integrated solutions. The IGP performance advantages in games amount to little more than a marketing advantage, since anyone who actually cares about gaming will be frustrated even by their higher-performing integrated solutions.

The area where ATI/NVIDIA deliver and Intel historically hasn't is the sheer ability to actually run games. In the past, driver issues and basic compatibility with most games were simply broken on Intel hardware. Intel tried to address much of that with G45.

There is one aspect of IGP performance that really matters these days, however: video decode acceleration.


53 Comments

  • kilkennycat - Wednesday, September 24, 2008 - link

    Anand and Gary,

    Seems as if an important candidate is MIA in your 3-part review of integrated chipsets/uATX motherboards: the LONG-promised nVidia MCP7A chipset for Intel processors with integrated 9400/9300 graphics - the only potential integrated-graphics competitor to the G45 in the Intel-processor world. When is the MCP7A due to be released? Most recent speculation (in DigiTimes, iirc) was the end of this month (September). In time to add a review of one or more uATX motherboards based on this chipset as Part 4 to this group of three reviews?
  • Anand Lal Shimpi - Wednesday, September 24, 2008 - link

    By the time this series is over the MCP7A won't be out, but we'll have a standalone review of that product to coincide with availability :)

    -A
  • yehuda - Thursday, September 25, 2008 - link

    Ok, when will this series be over? I ask because the last news on MCP7A said that it should be out before the end of this month and your statement makes me wonder if there's another delay ahead.
  • kilkennycat - Wednesday, September 24, 2008 - link

    Anand, thanks for the reply.

    NDA gag on projected-availability information from nVidia ??
  • Anand Lal Shimpi - Wednesday, September 24, 2008 - link

    Yep, it won't be too much longer and NV is quite excited about it but specifics I can't give out unfortunately.

    *If* I were in NVIDIA's shoes I'd want to capitalize as best as possible on Intel's handling of G45. I'd make sure that the first products worked *perfectly* and availability was immediate and at competitive prices.

    If NVIDIA blows this opportunity I'll be quite disappointed, especially given how much crap it has given Intel about Larrabee.

    We'll know soon enough, but after the IGP Chronicles are over :)

    -A
  • Pederv - Wednesday, September 24, 2008 - link

    From what I read, the G45 can be summed as, "It has a few good points but over all it sucks."
  • kevinkreiser - Wednesday, September 24, 2008 - link

    I was hoping there would be mention of the graphics problems when putting nvidia cards in these boards. I was very vocal about this over at the AVS forums. Did any one notice any mention of this in the article? Also will there be coverage of the new nvidia boards MCP7A?
  • Gary Key - Wednesday, September 24, 2008 - link

    Our final article will feature the discrete cards like the 9600GT and HD 4670. In testing so far, I have not had the problems that have been reported. I do have some additional NV cards coming to test. We will have coverage on the GeForce 9400 series when it launches. ;)
  • kevinkreiser - Wednesday, September 24, 2008 - link

    Thanks for the quick reply. I appreciate you guys keeping an eye out for the aforementioned problem. And also thanks for the heads up on the forthcoming MCP7A article.
  • sprockkets - Wednesday, September 24, 2008 - link

    http://www.newegg.com/Product/Product.aspx?Item=N8...

    Shame it costs $100 though. Shame too that Intel boards work so poorly, since I like that mini-itx board. The one from Jetway for the 8200 chipset probably doesn't work any better.
