Competitive Integrated Graphics?

Since the G45 GMCH is built on a smaller 65nm process, it can pack in more transistors than G35's GMCH - and thus Intel increased the number of unified shader processors from 8 to 10.

1024 x 768                    Intel G45    Intel G35
Enemy Territory: Quake Wars    6.8 fps      5.6 fps
Company of Heroes             24.5 fps     16.5 fps
Race Driver GRID               3.7 fps      2.8 fps
Age of Conan                   7.9 fps      6.1 fps
Crysis                         9.3 fps      7.9 fps
Spore                         10.8 fps      9.7 fps
Half Life 2 Episode Two       40.7 fps     27.9 fps
Oblivion (800 x 600)          22.7 fps     14.7 fps
Intel's G45 is definitely faster than G35, and its performance in Half Life 2 is promising, but for the most part the graphics core is a letdown.


These shader processors are nearly directly comparable to NVIDIA's, meaning that in terms of raw processing power the G45 GMCH has 1/24th the execution resources of NVIDIA's GeForce GTX 280. It gets even worse if you compare it to a more mainstream solution: the recently announced GeForce 9500 GT has only 32 SPs, putting the G45 at around 1/3 the clock-for-clock power of a 9500 GT.

Then there's the clock speed issue. While the GeForce 9500 GT runs its array of SPs at 1.4GHz, Intel runs its shader processors at 800MHz. Both Intel and NVIDIA's architectures have a peak throughput of one shader instruction per clock, so while the 9500 GT has 3.2x the execution resources of the G45 GMCH, it also has a 75% clock speed advantage, giving it a 5.6x raw throughput advantage over the G45 GMCH.

But how does Intel's graphics core compare to NVIDIA's IGP equivalent? The GeForce 8200 is NVIDIA's latest integrated graphics core; it has 8 SPs and runs them at 1.2GHz, giving NVIDIA a 20% advantage in raw throughput on paper.

There are many unknowns here however. NVIDIA has special execution units for transcendentals and it's unclear whether or not Intel has the same. There are also times when Intel must relegate vertex processing to the CPU, which can cause strange performance characteristics. But the point is that Intel's latest graphics core, at least on paper, is competitive with what NVIDIA is offering.

Neither the Intel nor the NVIDIA solution can hold a candle to AMD's 780G, which has a peak throughput of 40 shader operations per clock compared to 10 for Intel and 8 for NVIDIA. AMD can do so much more because each of its 8 processing clusters is 5-wide, just like on its desktop GPUs. If there's enough parallel data to work on, each of these clusters can output five shader instructions per clock. Real-world utilization is somewhere between one and five depending on how efficient AMD's real-time compiler is and on the code being run, but this generally translates into AMD dominating the IGP performance charts even at a lower clock speed than both the Intel and NVIDIA parts.
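The raw-throughput comparisons above all reduce to SP count times shader clock. Here's a rough sketch in Python of that back-of-the-envelope math; note that the 780G's 500MHz clock is my assumption (the article only says it's lower than Intel's and NVIDIA's), and the peak figure for AMD ignores the one-to-five variable utilization of its 5-wide clusters:

```python
# Peak shader throughput = shader processors x shader clock, assuming one
# shader instruction per SP per clock (true for Intel and NVIDIA here;
# AMD's 5-wide clusters sustain anywhere from 1 to 5 per clock).
parts = {
    "Intel G45":       {"sps": 10, "ghz": 0.8},
    "GeForce 8200":    {"sps": 8,  "ghz": 1.2},
    "GeForce 9500 GT": {"sps": 32, "ghz": 1.4},
    "AMD 780G (peak)": {"sps": 40, "ghz": 0.5},  # 0.5GHz is an assumed clock
}

def ginstr_per_sec(p):
    """Peak shader instructions per second, in billions."""
    return p["sps"] * p["ghz"]

baseline = ginstr_per_sec(parts["Intel G45"])  # 8.0
for name, p in parts.items():
    ratio = ginstr_per_sec(p) / baseline
    print(f"{name}: {ginstr_per_sec(p):.1f} Ginstr/s ({ratio:.2f}x G45)")
```

This is peak math only; it captures the 9500 GT's 5.6x and the 8200's 1.2x edges over G45, but says nothing about driver efficiency or the CPU-fallback vertex path.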

Does this all really matter?

This next point is one that I've quietly argued for the past few years. ATI and NVIDIA have always acted holier than thou because of their IGP performance superiority over Intel, but I argue that they are no better than the boys in blue.

Despite both ATI and NVIDIA being much faster than Intel, the overall gameplay experience delivered by their integrated graphics solutions is still piss poor, even in older games. Try running Oblivion, a 2.5-year-old title, on even AMD's 780G and you'll find that you have to run it at the lowest visual quality settings and the lowest resolutions (800 x 600, max) to get playable frame rates. At those settings, the game looks absolutely horrible.

In those games that aren't visually demanding, performance doesn't actually matter and all three vendors end up doing just fine. Fundamentally both ATI and NVIDIA want to sell more discrete cards, so they aren't going to enable overly high performance integrated solutions. The IGP performance advantages in games amount to little more than a marketing advantage, since anyone who actually cares about gaming is going to be frustrated even by their higher performing integrated solution.

The area where ATI/NVIDIA deliver and Intel historically hasn't is the sheer ability to actually run games. In the past, driver issues meant that basic compatibility with many games was simply broken on Intel hardware. Intel tried to address much of that with G45.

There is one aspect of IGP performance that really matters these days however: video decode acceleration.


53 Comments


  • sprockkets - Wednesday, September 24, 2008 - link

    Except the fact that you needed a firmware update on the home theater receiver is just bulls****.

    Thanks DRM!

    I can't wait till VLC gets native Blu-ray support! At least we have SlySoft!
  • DoucheVader - Friday, September 26, 2008 - link

    Hey if it wasn't for a vast majority of people copying stuff, we wouldn't have DRM. I am sick of the complaints. We as consumers created this problem.

    Most things that have DRM are to protect someone's bread and butter. How would you like it if every time you got paid there was some money missing?


  • - Saturday, September 27, 2008 - link

    Your point might be valid if DRM worked, but can you point out a single mainstream home theater medium on which the DRM means anything to the pirates?

    DRMed CDs? Ha. Those just pissed off consumers when they inevitably didn't play in some players and/or contained bad software. Often defeated with the frickin shift key.

    DVD? People have tattoos of the DeCSS source code, it's that damn short. Amusingly the longest lasting DRM scheme, with 2.5 years between the first DVD movie release and the release of DeCSS.

    HD-DVD? 253 days: not even a full year after the format first shipped, its AACS protection system was cracked. Under three weeks later the first copies started showing up on private trackers.

    Blu-Ray (AACS)? The same AACS crack applied to it, and about two weeks after the first HD-DVD copies showed up Blu-Ray was right behind it. Launch to first pirated movie: 225 days.

    Blu-Ray (BD+)? Slightly harder than AACS apparently, but titles did not ship with it until October 2007 so the cracking community got off to a late start. AnyDVD HD supported decrypting all BD+ titles roughly 5 months after the first titles shipped and copies again showed up soon after.

    I'm less familiar with DVD-Audio and SACD, but my understanding is that there hasn't been a direct "crack" of their respective encryption but instead PC-based players and/or sound drivers are modified to just write the decoded bitstream to the hard drive. This works quite well for audio, as in most cases the compression (if any) applied on the disc is not wanted and the uncompressed PCM stream is exactly what the user desires. For obvious reasons that is not feasible with video.

    Once these protections are broken, they do nothing to reduce piracy and only remain to prevent fair-use backups by technologically illiterate users and/or to annoy consumers with crap like these HDCP issues.

    It doesn't even matter to the pirate crowd whether the cracks are public or private, as long as someone can do it that means the files will get out, and once they're out they're out.
