The Last "Discrete" Intel Integrated Graphics Chipset?

Intel has always made its chipsets on an n-1 manufacturing node. If the majority of its CPUs were being built on 90nm, Intel would make its chipsets on 130nm; when the CPUs moved to 65nm, chipsets would move to 90nm, and so on. This only applied to the GMCH/North Bridge; the South Bridges were on an n-2 process. If the CPUs were built on 65nm, the GMCH would be built on 90nm and the ICH would be a 130nm part.
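
As a quick sketch of that cadence, here's the n-1/n-2 rule applied to Intel's 130nm → 90nm → 65nm → 45nm node progression; the helper function below is purely our illustration of the rule described above, not anything Intel publishes:

```python
# Illustrative sketch of the n-1 / n-2 chipset cadence described above.
# A 65nm CPU generation pairs with a 90nm GMCH (n-1) and a 130nm ICH (n-2).
NODES_NM = [130, 90, 65, 45]

def chipset_nodes(cpu_node_nm):
    i = NODES_NM.index(cpu_node_nm)
    gmch = NODES_NM[i - 1] if i >= 1 else None  # North Bridge / GMCH: one node behind
    ich = NODES_NM[i - 2] if i >= 2 else None   # South Bridge / ICH: two nodes behind
    return gmch, ich

print(chipset_nodes(65))  # (90, 130)
```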

Building chipsets on n-1/n-2 manufacturing processes meant that Intel could get more use out of its older fabs before converting them to the latest technology. Given how large an investment these multi-billion dollar fabs are, Intel's approach to manufacturing made financial sense.

Unfortunately, from a performance standpoint Intel's approach left much to be desired. Graphics performance, to a certain extent, is closely related to the die size of your GPU. NVIDIA's GT200 is twice as fast as its previous-generation G80 core simply because there are more transistors, on a larger die, to crunch away at pixels (and more memory bandwidth to feed them). By limiting its chipset manufacturing to older process technologies, Intel artificially limits the performance of its IGP solutions. This is compounded by the fact that Intel is also building this hardware on architectures with fundamentally less capability and performance than competing solutions.

A year ago Intel committed to changing all of this; remember this slide?

With G45, the gap between the process the chipsets are made on and the process the CPUs are made on narrows: G45 is Intel's first 65nm IGP. It's also Intel's last IGP. After G45, there will be no more integrated graphics chipsets; Intel's graphics cores will simply be integrated onto the CPU package and, eventually, the CPU die itself.

A Lower Power Chipset

The move to 65nm does have some serious power benefits. We looked at the total system power consumption of G45 vs. G35 using a Core 2 Quad Q9300 running a variety of tests:

| Total System Power | Intel G45 (DDR3) | Intel G45 (DDR2) | Intel G35 (DDR2) | Intel G45 (DDR2) Power Savings vs. G35 |
|---|---|---|---|---|
| System at Idle | 66.8W | 68.2W | 79.7W | 11.5W |
| Company of Heroes | 92.4W | 95.6W | 103.9W | 8.3W |
| Video Encoding (PCMark Vantage TV/Movies) | 114.8W | 115.8W | 124.6W | 8.8W |
| Blu-ray Playback | 83.3W | 84.9W | 107.3W | 22.4W |

Using the same DDR2 memory, G45 manages to shave a good 8 - 11W off total system power consumption. There's a tremendous advantage in Blu-ray playback, but that is due to more than just more power-efficient transistors, which we'll address shortly.
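
For reference, here's the arithmetic behind the savings column, comparing the two DDR2 configurations from the table above; the wattages are the measured figures, while the percentages are derived here purely for illustration:

```python
# Power savings of G45 (DDR2) vs. G35 (DDR2), using the wattage figures
# from the table above; percentages are computed here for illustration.
measurements = {
    "System at Idle":     (68.2, 79.7),
    "Company of Heroes":  (95.6, 103.9),
    "Video Encoding":     (115.8, 124.6),
    "Blu-ray Playback":   (84.9, 107.3),
}

for test, (g45_w, g35_w) in measurements.items():
    saved_w = g35_w - g45_w
    saved_pct = 100.0 * saved_w / g35_w
    print(f"{test:18s} {saved_w:4.1f}W saved ({saved_pct:.1f}% of total system power)")
```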

We'll also see some G45 boards use DDR3 memory, which, thanks to its lower operating voltage, will shave another couple of watts off your total system power budget.

Comments

  • Imperor - Sunday, September 28, 2008 - link

    Impressive how many people just rant on about the review being inadequate when they obviously didn't even read the start of it! If they did that they'd know that reviews of AMD and nVidia boards are coming up and that all will be compared eventually!
    I get the feeling that the people talking about "Intel fanbois" tend to have the same kind of appreciation of another brand...
    Stating the obvious isn't being partial. It just so happens that AMD don't even come close to competing with Intel in the CPU department! Sure AMD might be cheaper, but there are cheap Intels out there as well. The whole platform tends to get a bit more expensive when you go with Intel but you get what you pay for. I'm perfectly happy with my G35+E2140. Does everything a computer is supposed to do but gaming. I'm not a gamer, so that is a non-issue for me.

    Very tempted to go mini-ITX with a 1.5TB HDD. Tiny box and lots of disk space!

    Found a nice case for it as well, Morex Venus 668. Not that I know anything about it really but it'll hold up to 3 HDDs and a full size ODD and probably house decent cooling for the CPU while still being tiny (~8"x9"x13").
  • robg1701 - Saturday, September 27, 2008 - link

    Do any of the boards support Dual-Link DVI?

    I'm getting a bit sick of having to include a video card in otherwise low power boxes in order to drive my 30" monitor :)
  • deruberhanyok - Friday, September 26, 2008 - link

    [quote]We struggled with G45 for much of the early weeks of its release, but the platform wasn't problem-free enough for a launch-day review.[/quote]

    You weren't serious here, were you? That basically says "The chipset had problems so we didn't want to write a review talking about them."
  • piesquared - Friday, September 26, 2008 - link

    Does this site have an ounce of integrity left? I seriously doubt it. Nothing but Intel pandering left here. You "reviewers" have the gall to do a review of this attempt at an IGP, yet fail to show any review of an AMD IGP if it proves how inferior G45 is. Are you seriously implying that people are so stupid that they aren't capable of seeing through this BS? I remember something about an SB750 promise somewhere around 2 months ago that never materialized, then a 790GX promise that never materialized, then another 790GX roundup that not only never materialized, but the DFI preview article seems to have actually vanished, then the AMD IGP part II looks to be delayed or something, probably vanished due to Intel's poor performance.

    I am really really starting to wonder if AT was purchased by Intel. All evidence points to it. If not, then call a spade a spade and don't make promises you can't keep. I'm sure you think none of this matters because you're so popular that people will read no matter what you write here. I wouldn't be so confident if I were AT.
  • TA152H - Thursday, September 25, 2008 - link

    I can tell you guys are really working on gaining that female readership. As everyone knows, women really go for that low-class, vulgar language.

    Also, who would want to get rid of PS/2 ports? Whoever on your staff wants this had better have a reason beyond hating anything legacy. Where's the logic in adding two extra USB ports so you can remove the PS/2 ports? It's not like it's more flexible, really, because you pretty much always need the keyboard and mouse. When's the last time you were in the situation where you said, "Oh, I won't be needing my mouse and keyboard today, and I'm so strapped for USB ports, it's a good thing I can use the ones I normally use for the keyboard and mouse for something else"? Doubtful you've ever said it, and if you have, you have issues deeper than I am capable of dealing with.

    It's not like the keyboard or mouse work better in the USB port, or that it's somehow superior in this configuration. In fact, the PS/2 ports were made specifically for this, and are perfectly adequate for it. Didn't you guys know that USB has more overhead than the PS/2 ports? I guess not. So, you worry about fractions of a percent going from motherboard to motherboard with the same chipset, but you prefer to use a USB mouse and keyboard? I just do not understand that. USB was a nice invention of Intel to suck up CPU power so you'd need a faster processor. It's a pity this has been forgotten.

    Sure, let's replace the efficient with the inefficient, so we can say we're done with the legacy ports and we can all feel like we've moved forward. Yes, that's real progress we want. Good grief.
  • CSMR - Friday, September 26, 2008 - link

    Yeah I had to get a quad core so I can dedicate one core to the USB mouse and one to the USB keyboard. Now I can type ultra fast and the mouse really zips around the screen.
  • MrFoo1 - Thursday, September 25, 2008 - link

    Non-integrated graphics cards are discrete, not discreet.

    discreet = modest/prudent/unnoticeable

    discrete = constituting a separate entity

  • dev0lution - Thursday, September 25, 2008 - link

    I really dislike the trend of recent reviews that go off on tangents about the state of the market, or particular vendor performance gripes, and then the rest of the review doesn't even touch on relevant benchmarks or features to back up these rants. If you're going to complain about IGP performance from AMD or NVIDIA, you might want to back that up with at least ONE board being included in the comparison charts. Who cares if Intel G45 gets bad frame rates against itself (across the board to boot). Why not show how 3 IGP chipsets from the major vendors stack up against each other in something mainstream like Spore? If it's a G45-only review, how about you save the side comments for a true IGP roundup? Sorry, but if you have the time to post a "(p)review" that brings up competitive aspects with no benchmarks to balance out those comments, it's basically single-vendor propaganda - nothing in the conclusions deals with whether an IGP in the same price range from another vendor would fill the void that G45 clearly does not fill.

    Since when do issues at the release date mean you can't post the review? "We struggled with G45 for much of the early weeks of its release, but the platform wasn't problem-free enough for a launch-day review." - Ummm, might want to include that as disclosure in all your other post-launch day reviews!?! Or do other vendors get brownie points for being problem-free when you can actually buy the product?

    Unfortunately, the inconsistency across multiple reviews makes it somewhat difficult to compare competing products from multiple vendors, because the methodology varies between single-chipset and competitive benchmarks, even when you can separate the irrelevant introductory comments and bias of the particular author from the rest of the review.

    More authors obviously doesn't equal consistency or more relevant reviews.
  • yyrkoon - Thursday, September 25, 2008 - link

    Looking forward to your review of this board (if I understood you correctly), as I have been keeping an eye on this board for a while now. Perfect for an all-around general use board (minus gaming of course), but it would have been really REALLY nice if that 1x PCIe slot were a 16x PCIe with at least 8x bandwidth. Hell, I think I would settle for 4x PCIe speeds, just to have the ability to use an AMD/ATI 3650/3670 in this system. I think Jetway has a similar board with a 16x PCIe slot, slightly fewer features, at the cost of like $350 USD . . .

    Now if someone reputable (meaning someone who can actually make a solid board from the START *cough*ABIT*cough*) built a board using the Core 2 mobile CPU, SO-DIMMs, etc., AT A REASONABLE PRICE . . . I think I might be in power consumption heaven. Running my desktop 'beast' tends to drain the battery banks dry ; )
  • iwodo - Wednesday, September 24, 2008 - link

    I wonder if Anand could answer a few questions we have in mind.

    Why, with a generational die shrink, do we only get 2 extra shaders instead of something like 4 - 6? Where did all the extra available die space go?

    With the new Radeon HD 4x series, people consistently report that they can get single-digit CPU usage when viewing 1080p H.264 with an E7xxx series CPU, or slightly more than 15% when using an old Celeron. This is 2 - 3 times better than G45!!!! Even 780G is a lot better than G45 as well. So why such a HUGE difference in the performance of so-called hardware-accelerated decoding?
