The Last "Discrete" Intel Integrated Graphics Chipset?

Intel has always made its chipsets on an n-1 manufacturing node: if the majority of its CPUs were being built on 90nm, Intel would make its chipsets on 130nm; when the CPUs moved to 65nm, the chipsets would move to 90nm, and so on. This only applied to the GMCH/North Bridge; the South Bridges were on an n-2 process. If the CPUs were built on 65nm, the GMCH would be built on 90nm and the ICH would be a 130nm part.
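
As a rough illustration of that cadence, here's a quick Python sketch; the node list is just the familiar 130nm/90nm/65nm progression, not an official Intel roadmap:

    # Illustrative sketch of the n-1 / n-2 cadence described above; the node
    # list is just the familiar progression, not an official Intel roadmap.
    nodes_nm = [180, 130, 90, 65, 45]  # oldest to newest

    def chipset_nodes(cpu_node_nm):
        """Return (GMCH node, ICH node) for a CPU node under the n-1 / n-2 rule."""
        i = nodes_nm.index(cpu_node_nm)
        if i < 2:
            raise ValueError("need at least two older nodes in the list")
        return nodes_nm[i - 1], nodes_nm[i - 2]  # GMCH one node behind, ICH two

    print(chipset_nodes(65))  # -> (90, 130): 65nm CPUs, 90nm GMCH, 130nm ICH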

Building chipsets on n-1/n-2 manufacturing processes meant that Intel could get more use out of its older fabs before converting them to the latest technology. Given the multi-billion dollar investment each fab represents, Intel's approach to manufacturing made financial sense.

Unfortunately, from a performance standpoint Intel's approach left much to be desired. Graphics performance is, to a large extent, tied to the die size of your GPU. The reason NVIDIA's GT200 is twice as fast as its previous-generation G80 core is simply that there are more transistors, on a larger die, to crunch away at pixels (and more memory bandwidth to feed them). By limiting its chipset manufacturing to older process technologies, Intel artificially limits the performance of its IGP solutions, a problem compounded by the fact that Intel also builds these parts on graphics architectures with fundamentally less capability and performance than competing solutions.

A year ago Intel committed to changing all of this; remember this slide?

With G45, the gap between the process the chipsets are made on and the process the CPUs are made on narrows: G45 is Intel's first 65nm IGP. It's also Intel's last IGP; after G45 there will be no more integrated graphics chipsets. Intel's graphics cores will simply be integrated onto the CPU package and, eventually, onto the CPU die itself.

A Lower Power Chipset

The move to 65nm does bring some serious power benefits. We looked at the total system power consumption of G45 vs. G35 using a Core 2 Quad Q9300 running a variety of tests:

Total System Power                          Intel G45 (DDR3)   Intel G45 (DDR2)   Intel G35 (DDR2)   G45 Power Savings (DDR2 vs. G35)
System at Idle                              66.8W              68.2W              79.7W              11.5W
Company of Heroes                           92.4W              95.6W              103.9W             8.3W
Video Encoding (PCMark Vantage TV/Movies)   114.8W             115.8W             124.6W             8.8W
Blu-ray Playback                            83.3W              84.9W              107.3W             22.4W

Using the same memory, G45 manages to shave a good 8 - 11W off total system power consumption. There's a tremendous advantage in Blu-ray playback as well, but that's due to more than just more power-efficient transistors; we'll address it shortly.
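
For reference, those savings fall straight out of the table above; here's a quick Python sanity check using nothing but the numbers already listed (DDR2 configurations on both chipsets):

    # Sanity check of the G45 (DDR2) vs. G35 (DDR2) numbers from the table above.
    tests = {
        "System at Idle":     (68.2, 79.7),
        "Company of Heroes":  (95.6, 103.9),
        "Video Encoding":     (115.8, 124.6),
        "Blu-ray Playback":   (84.9, 107.3),
    }  # test: (G45 DDR2 watts, G35 DDR2 watts)

    for name, (g45, g35) in tests.items():
        saved = g35 - g45
        print(f"{name}: {saved:.1f}W saved ({saved / g35 * 100:.0f}% of total system power)")
    # System at Idle: 11.5W (14%), Company of Heroes: 8.3W (8%),
    # Video Encoding: 8.8W (7%), Blu-ray Playback: 22.4W (21%)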

We'll also see some G45 boards use DDR3 memory, which, thanks to its lower operating voltage, will shave another couple of watts off your total system power budget.
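
The DDR3 column in the table bears this out, and the direction of the saving is what you'd expect from the voltage drop alone. A rough Python sketch, assuming the standard 1.8V DDR2 and 1.5V DDR3 supply voltages; the quadratic scaling below only covers the memory's dynamic power, not the whole platform:

    # DDR2 vs. DDR3 deltas from the table above, plus a rough voltage-scaling check.
    ddr3 = {"System at Idle": 66.8, "Company of Heroes": 92.4,
            "Video Encoding": 114.8, "Blu-ray Playback": 83.3}
    ddr2 = {"System at Idle": 68.2, "Company of Heroes": 95.6,
            "Video Encoding": 115.8, "Blu-ray Playback": 84.9}

    for name in ddr3:
        print(f"{name}: DDR3 saves {ddr2[name] - ddr3[name]:.1f}W")

    # Dynamic power scales roughly with the square of supply voltage, so dropping the
    # memory rail from DDR2's 1.8V to DDR3's 1.5V leaves about this much of it:
    print(f"{(1.5 / 1.8) ** 2:.0%}")  # ~69%, i.e. a couple of watts on a DIMM-scale budget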

Comments

  • sprockkets - Wednesday, September 24, 2008 - link

    Except the fact that you needed a firmware update on the home theater receiver is just bulls****.

    Thanks DRM!

    I can't wait till VLC gets native Blu-ray support! At least we have SlySoft!
  • DoucheVader - Friday, September 26, 2008 - link

    Hey if it wasn't for a vast majority of people copying stuff, we wouldn't have DRM. I am sick of the complaints. We as consumers created this problem.

    Most things that have DRM are to protect someone's bread and butter. How would you like it if every time you got paid there was some money missing?


  • - Saturday, September 27, 2008 - link

    Your point might be valid if DRM worked, but can you point out a single mainstream home theater medium on which the DRM means anything to the pirates?

    DRMed CDs? Ha. Those just pissed off consumers when they inevitably didn't play in some players and/or contained bad software. Often defeated with the frickin shift key.

    DVD? People have tattoos of the DeCSS source code, it's that damn short. Amusingly, the longest lasting DRM scheme, with 2.5 years between the first DVD movie release and the release of DeCSS.

    HD-DVD? 253 days; not even a full year after the format first shipped, its AACS protection system was cracked. Less than three weeks later the first copies started showing up on private trackers.

    Blu-Ray (AACS)? The same AACS crack applied to it, and about two weeks after the first HD-DVD copies showed up Blu-Ray was right behind it. Launch to first pirated movie: 225 days.

    Blu-Ray (BD+)? Slightly harder than AACS apparently, but titles did not ship with it until October 2007 so the cracking community got off to a late start. AnyDVD HD supported decrypting all BD+ titles roughly 5 months after the first titles shipped and copies again showed up soon after.

    I'm less familiar with DVD-Audio and SACD, but my understanding is that there hasn't been a direct "crack" of their respective encryption but instead PC-based players and/or sound drivers are modified to just write the decoded bitstream to the hard drive. This works quite well for audio, as in most cases the compression (if any) applied on the disc is not wanted and the uncompressed PCM stream is exactly what the user desires. For obvious reasons that is not feasible with video.

    Once these protections are broken, they do nothing to reduce piracy and only remain to prevent fair-use backups by technologically illiterate users and/or to annoy consumers with crap like these HDCP issues.

    It doesn't even matter to the pirate crowd whether the cracks are public or private, as long as someone can do it that means the files will get out, and once they're out they're out.
