The GMCH/ICH Showdown: What's New in the 4-Series

The role of the chipset in a modern PC has changed considerably over the years, mostly due to AMD's integration of the main memory controller onto its CPU die. Intel won't do the same until Nehalem, so the role of its chipsets remains relatively unchanged, despite their taking on additional functionality over the years.

The role of a chipset is to connect everything in your system to everything else; it's the controller logic that connects your CPU to your graphics card, Ethernet, hard drives, USB peripherals, etc., and connects all of them to main memory. For most of modern desktop chipset history, chipsets have been two-chip solutions, normally known as a North Bridge and a South Bridge. The North Bridge generally housed the memory controller and the AGP or PCI Express interface, while the South Bridge took care of less bandwidth-intensive duties like PATA/SATA ports, LAN, USB, sound, etc.

Intel came up with its own terms for the North and South Bridges back in the late 1990s with its move to a "hub architecture". The North Bridge became the Graphics and Memory Controller Hub (GMCH) while the South Bridge became the I/O Controller Hub (ICH). Technically it's only a GMCH when the chipset has integrated graphics; otherwise it's simply an MCH.

The 4-series GMCH, which is used in the G45 chipset as well as the P45 chipset (where it's just an MCH), is honestly not much different from the 3-series (G)MCH used in the G35/P35 chipsets:

                                   4-series GMCH                  3-series GMCH
Manufacturing Process              65nm                           90nm
FSB                                800 / 1066 / 1333MHz           800 / 1066 / 1333MHz
IOQ Depth                          12                             12
Memory Controller                  2 x 64-bit DDR2/DDR3 channels  2 x 64-bit DDR2/DDR3 channels
Memory Speeds Supported            DDR2-800/667                   DDR2-800/667
PCI Express                        16 PCIe 2.0 lanes              16 PCIe 1.1 lanes
Graphics                           GMA X4500                      GMA X3500
Core Clock                         800MHz                         667MHz
Shader Processors                  10                             8
Full H.264/VC-1/MPEG-2 HW Decode   Yes                            No
Pin-out                            1254-ball                      1226-ball


The pin-out is different, thus requiring new motherboard designs, but the performance characteristics of the two GMCHs are basically identical. The 4-series chipsets added PCIe 2.0, but the biggest performance impact comes from the improved graphics core in the 4-series GMCH. If you've got a 3-series motherboard today, the 4-series equivalent shouldn't be any faster in non-gaming/video decoding applications (although it will use less power thanks to the 65nm manufacturing process).
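The jump from 16 PCIe 1.1 lanes to 16 PCIe 2.0 lanes doubles graphics bandwidth, which is easy to verify with back-of-the-envelope math. A minimal sketch, assuming the published signaling rates (2.5 GT/s for 1.1, 5 GT/s for 2.0) and 8b/10b encoding on both generations:

```python
def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Usable bandwidth per direction, in GB/s."""
    transfer_rate = {1: 2.5, 2: 5.0}[gen]  # GT/s per lane
    # 8b/10b encoding: 10 bits on the wire carry 8 data bits,
    # and 8 bits make a byte
    return transfer_rate * lanes * 8 / 10 / 8

x16_gen1 = pcie_bandwidth_gbs(1, 16)  # 4.0 GB/s per direction (3-series)
x16_gen2 = pcie_bandwidth_gbs(2, 16)  # 8.0 GB/s per direction (4-series)
print(x16_gen1, x16_gen2)
```

As the numbers show, the extra headroom mostly matters for discrete graphics cards; the integrated GMA core never touches the external PCIe lanes.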

The ICH comparison is even tamer; there's honestly almost no change between ICH10 and its predecessor, ICH9:

                      ICH10                ICH9                   ICH8
PCI Express           6 x1 PCIe 1.1        6 x1 PCIe 1.1          6 x1 PCIe 1.1
USB                   12 ports             12 ports               10 ports
SATA (300MB/s)        6 ports              4 ports (ICH9 base)    4 ports (ICH8 base)
                                           6 ports (ICH9R)        6 ports (ICH8R)
RAID*                 RAID 0/1/5/10        RAID 0/1/5/10          RAID 0/1/5/10
HD Audio Interface    Yes                  Yes                    Yes
Ethernet              Intel Gigabit LAN    Intel Gigabit LAN      Intel Gigabit LAN
G/MCH Interface       DMI (10Gb/s each direction, full duplex) on all three
Voltage               1.1V                 1.05V                  1.05V
Release Date          2008                 2007                   2006
*RAID is only supported on -R derivatives


Even going back to ICH8, there's hardly a difference here; the changes are minor: ICH9/10 give you two more USB ports, and the base ICH10 features 6 SATA ports while the base ICH8/9 only featured 4. The takeaway is that, feature-wise, there's not much new.
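The DMI figure in the table is worth a quick sanity check. The rough numbers below assume (my reading, not stated in the table) that the quoted 10Gb/s is the raw signaling rate of four PCIe-1.1-style lanes with 8b/10b encoding, leaving about 1GB/s of usable bandwidth per direction:

```python
# Usable DMI bandwidth, assuming the 10Gb/s figure is the raw
# signaling rate and 8b/10b encoding applies (as on PCIe 1.1 lanes)
DMI_RAW_GBPS = 10.0
dmi_usable_gbs = DMI_RAW_GBPS * 8 / 10 / 8  # = 1.0 GB/s per direction

# Aggregate SATA bandwidth hanging off the ICH, from the table above
sata_ports = 6
sata_gbs_each = 0.3  # 300MB/s per port
aggregate_sata_gbs = sata_ports * sata_gbs_each  # ~1.8 GB/s

print(dmi_usable_gbs, aggregate_sata_gbs)
```

In other words, six 300MB/s SATA ports can theoretically demand almost twice what the DMI link can carry, so the uplink, not the port count, is the ceiling if every drive streams at once; in practice mechanical drives come nowhere near saturating their ports.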



Comments

  • Imperor - Sunday, September 28, 2008 - link

    Impressive how many people just rant on about the review being inadequate when they obviously didn't even read the start of it! If they did that they'd know that reviews of AMD and nVidia boards are coming up and that all will be compared eventually!
    I get the feeling that the people talking about "Intel fanbois" tend to have the same kind of appreciation of another brand...
    Stating the obvious isn't being partial. It just so happens that AMD don't even come close to competing with Intel in the CPU department! Sure AMD might be cheaper, but there are cheap Intels out there as well. The whole platform tends to get a bit more expensive when you go with Intel but you get what you pay for. I'm perfectly happy with my G35+E2140. Does everything a computer is supposed to do but gaming. I'm not a gamer, so that is a non-issue for me.

    Very tempted to go mini-ITX with a 1.5TB HDD. Tiny box and lots of disk space!

    Found a nice case for it as well, Morex Venus 668. Not that I know anything about it really but it'll hold up to 3 HDDs and a full size ODD and probably house decent cooling for the CPU while still being tiny (~8"x9"x13").
  • robg1701 - Saturday, September 27, 2008 - link

    Do any of the boards support Dual-Link DVI?

    I'm getting a bit sick of having to include a video card in otherwise low power boxes in order to drive my 30" monitor :)
  • deruberhanyok - Friday, September 26, 2008 - link

    [quote]We struggled with G45 for much of the early weeks of its release, but the platform wasn't problem-free enough for a launch-day review.[/quote]

    You weren't serious here, were you? That basically says "The chipset had problems so we didn't want to write a review talking about them."
  • piesquared - Friday, September 26, 2008 - link

    Does this site have an ounce of integrity left? I seriously doubt it. Nothing but Intel pandering left here. You "reviewers" have the gall to do a review of this attempt at an IGP, yet fail to show any review of an AMD IGP even if it proves how inferior G45 is. Are you seriously implying that people are so stupid that they aren't capable of seeing through this BS? I remember something about a SB750 promise somewhere around 2 months ago that never materialized, then a 790GX promise that never materialized, then another 790GX roundup that not only never materialized, but the DFI preview article seems to have actually vanished, then the AMD IGP part II looks to be delayed or something, probably vanished due to Intel's poor performance.

    I am really really starting to wonder if AT was purchased by Intel. All evidence points to it. If not, then call a spade a spade and don't make promises you can't keep. I'm sure you think none of this matters because you're so popular that people will read no matter what you write here. I wouldn't be so confident if I were AT.
  • TA152H - Thursday, September 25, 2008 - link

    I can tell you guys are really working on gaining that female readership. As everyone knows, women really go for that low-class, vulgar language.

    Also, who would want to get rid of PS/2 ports? Whoever on your staff wants this had better have a reason beyond hating anything legacy. Where's the logic in adding two extra USB ports so you can remove the PS/2 ports? It's not like it's more flexible, really, because you pretty much always need the keyboard and mouse. When's the last time you were in the situation where you said "Oh, I won't be needing my mouse and keyboard today, and I'm so strapped for USB ports, it's a good thing I can use the ones I normally use for the keyboard and mouse for something else"? Doubtful you've ever said it, and if you have, you have issues deeper than I am capable of dealing with.

    It's not like the keyboard or mouse work better in the USB port, or that it's somehow superior in this configuration. In fact, the PS/2 ports were made specifically for this, and are perfectly adequate for it. Didn't you guys know that USB has more overhead than the PS/2 ports? I guess not. So, you worry about fractions of a percent going from motherboard to motherboard with the same chipset, but you prefer to use a USB mouse and keyboard? I just do not understand that. USB was a nice invention of Intel to suck up CPU power so you'd need a faster processor. It's a pity this has been forgotten.

    Sure, let's replace the efficient with the inefficient, so we can say we're done with the legacy ports and we can all feel like we've moved forward. Yes, that's real progress we want. Good grief.
  • CSMR - Friday, September 26, 2008 - link

    Yeah I had to get a quad core so I can dedicate one core to the USB mouse and one to the USB keyboard. Now I can type ultra fast and the mouse really zips around the screen.
  • MrFoo1 - Thursday, September 25, 2008 - link

    Non-integrated graphics cards are discrete, not discreet.

    discreet = modest/prudent/unnoticeable

    discrete = constituting a separate entity

  • dev0lution - Thursday, September 25, 2008 - link

    I really dislike the trend of recent reviews that go off on tangents about the state of the market, or particular vendor performance gripes and then the rest of the review doesn't even touch on relevant benchmarks or features to back up these rants. If you're going to complain about IGP performance from AMD or NVIDIA, you might want to back that up with at least ONE board being included in the comparison charts. Who cares if Intel G45 gets bad frame rates against itself (across the board to boot). Why not show how 3 IGP chipsets from the major vendors stack up against each other in something mainstream like Spore? If it's a G45 only review, how about you save the side comments for a true IGP roundup? Sorry, but if you have the time to post a "(p)review" that brings up competitive aspects with no benchmarks to balance out those comments, it's basically single-vendor propaganda - nothing in the conclusions deal with whether a IGP in the same price range from another vendor would fill the void that G45 clearly does not fill.

    Since when do issues at the release date mean you can't post the review? "We struggled with G45 for much of the early weeks of its release, but the platform wasn't problem-free enough for a launch-day review." - Ummm, might want to include that as disclosure in all your other post-launch day reviews!?! Or do other vendors get brownie points for being problem-free when you can actually buy the product?

    Unfortunately, the inconsistency across multiple reviews makes it somewhat difficult to compare competing products from multiple vendors, because the methodology varies between single-chipset and competitive benchmarks, even when you can separate the irrelevant introductory comments and bias of the particular author from the rest of the review.

    More authors obviously does not equal consistency or more relevant reviews..
  • yyrkoon - Thursday, September 25, 2008 - link

    Looking forward to your review of this board (if I understood you correctly), as I have been keeping an eye on this board for a while now. Perfect for an all-around general use board (minus gaming of course), but it would have been really REALLY nice if that 1x PCIe slot were a 16x PCIe with at least 8x bandwidth. Hell, I think I would settle for 4x PCIe speeds, just to have the ability to use an AMD/ATI 3650/3670 in this system. I think Jetway has a similar board with a 16x PCIe slot, slightly fewer features, at the cost of like $350 USD . . .

    Now if someone reputable (meaning someone who can actually make a solid board from the START *cough*ABIT*cough*) using the Core 2 mobile CPU, SO-DIMMs, etc, AT A REASONABLE PRICE . . . I think I might be in power consumption heaven. Running my desktop 'beast' tends to drain the battery banks dry ; )
  • iwodo - Wednesday, September 24, 2008 - link

    I wonder if Anand could answer a few questions we have in our mind.

    Why, with a generational die shrink, do we only get 2 extra shaders instead of like 4 - 6? Where did all the extra available die space go?

    With the new Radeon HD 4x series, people consistently report single-digit CPU usage when viewing 1080P H.264 with an E7xxx series CPU, or slightly more than 15% when using an old Celeron. That is 2 - 3 times better than G45!!!! Even 780G is a lot better than G45 as well. So why such a HUGE difference in the performance of so-called hardware accelerated decoding?
