Discovery: Two Channels Aren't Worse Than Three

Intel told me something interesting when I was out in LA earlier this summer: it takes at least three cores to fully saturate Lynnfield's dual-channel DDR3-1333 memory bus. That's three cores all working on memory-bandwidth-intensive threads at the same time. That's a pretty stiff requirement. In the vast, vast majority of situations, Lynnfield's dual-channel DDR3 memory controller won't hold it back.
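For a rough sense of why one or two cores can't get there on their own, it helps to work out the theoretical peaks (my back-of-the-envelope math, not Intel's figures): each 64-bit DDR3 channel moves 8 bytes per transfer, so two channels of DDR3-1333 top out around 21.3 GB/s, while three channels of DDR3-1066 peak around 25.6 GB/s. No single core generates enough outstanding memory requests to come anywhere near either number.

    /* Back-of-the-envelope theoretical peak bandwidth: transfers/s x 8 bytes per
       64-bit channel x number of channels. Illustrative only; sustained bandwidth
       in tools like Everest and Sandra lands well below these peaks. */
    #include <stdio.h>

    static double peak_gb_per_s(double mega_transfers, int channels)
    {
        return mega_transfers * 1e6 * 8.0 * channels / 1e9; /* bytes/s -> GB/s */
    }

    int main(void)
    {
        printf("Lynnfield,  2 x DDR3-1333: %.1f GB/s peak\n", peak_gb_per_s(1333.0, 2)); /* ~21.3 */
        printf("Bloomfield, 3 x DDR3-1066: %.1f GB/s peak\n", peak_gb_per_s(1066.0, 3)); /* ~25.6 */
        return 0;
    }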

Move up to 6 or 8 core designs and a third memory channel is necessary, and that's why we'll see those processors debut exclusively on LGA-1366 platforms. In fact, X58 motherboards will only need a BIOS update to work with the 6-core 32nm Gulftown processor next year. P55 looks like it'll be limited to four cores and below.

Because of this, Lynnfield's memory bandwidth and latency scores are actually quite similar to Bloomfield's. I used Everest to look at memory bandwidth and latency between a Core i7 975 (Bloomfield) and a Core i7 870 (Lynnfield):

Lynnfield's memory controller is good, easily as good as what's in Bloomfield if not slightly better.


Both processors turbo'd up to 3.46GHz, indicating that Everest's memory test uses no more than two threads. The 975 ran DDR3-1066 memory (the highest it officially supports), while the 870 used DDR3-1333. The faster memory gave the 870 the advantage. Since we're not taxing all four cores, Lynnfield is at no disadvantage from a bandwidth perspective. Surprisingly enough, even SiSoft Sandra (which does use four cores for its memory bandwidth test) shows Lynnfield's dual-channel DDR3-1333 memory controller as equal to Bloomfield's triple-channel DDR3-1066 interface.

SiSoft Sandra 2009.SP4          Intel Core i7 975    Intel Core i7 870
Aggregate Memory Bandwidth      17.8 GB/s            17.3 GB/s


Long story short? Lynnfield won't be memory bandwidth limited with DDR3-1333 in the overwhelming majority of use cases.
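If you want to verify the "how many threads does it take" claim on your own hardware, a minimal STREAM-style copy test is enough to show the effect; the thread count and buffer size below are placeholder choices of mine, not what Everest or Sandra actually use internally. Run it with one, two, three and four threads and watch where the aggregate number stops scaling.

    /* Minimal multi-threaded memory copy test (a rough STREAM-style sketch).
       Build with: gcc -O2 -pthread bw.c -o bw
       THREADS and BYTES are illustrative values, not Everest's or Sandra's. */
    #include <pthread.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <time.h>

    #define THREADS 3                      /* try 1 through 4 on a quad-core Lynnfield */
    #define BYTES   (256UL * 1024 * 1024)  /* 256MB per thread, far larger than the 8MB L3 */
    #define PASSES  8

    static void *copy_worker(void *arg)
    {
        char *src = malloc(BYTES), *dst = malloc(BYTES);
        memset(src, 1, BYTES);             /* touch the pages so they're actually allocated */
        for (int i = 0; i < PASSES; i++)
            memcpy(dst, src, BYTES);       /* streaming read + write traffic to DRAM */
        free(src);
        free(dst);
        return arg;
    }

    int main(void)
    {
        pthread_t threads[THREADS];
        struct timespec start, end;

        clock_gettime(CLOCK_MONOTONIC, &start);
        for (int i = 0; i < THREADS; i++)
            pthread_create(&threads[i], NULL, copy_worker, NULL);
        for (int i = 0; i < THREADS; i++)
            pthread_join(threads[i], NULL);
        clock_gettime(CLOCK_MONOTONIC, &end);

        double sec = (end.tv_sec - start.tv_sec) + (end.tv_nsec - start.tv_nsec) / 1e9;
        /* Each memcpy pass reads BYTES and writes BYTES, hence the factor of 2. */
        double gb  = 2.0 * (double)BYTES * PASSES * THREADS / 1e9;
        printf("%d threads: %.1f GB/s aggregate\n", THREADS, gb / sec);
        return 0;
    }

On a chip like the 870 I'd expect the aggregate figure to climb noticeably going from one to three threads and then flatten out, which is exactly the behavior Intel described.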

Comments

  • strikeback03 - Tuesday, September 8, 2009 - link

    How would you have graphics then? You would be limited to the 4xPCIe off the P55 on motherboards which support it, as there are no integrated graphics (yet)
  • MX5RX7 - Tuesday, September 8, 2009 - link

    I'm not sure that CPU/GPU integration is a good thing, from a consumer standpoint. At least in the short term.

    For example, in the article you mention how the majority of modern games are GPU, not CPU limited. The current model allows us to purchase a very capable processor and pair it with a very capable GPU. Then, when the ultra competitive GPU market has provided us with a choice of parts that easily eclipse the performance of the previous generation, we either swap graphics cards for the newer model, or purchase a second now cheaper identical card and (hopefully) double our game performance with SLI or Crossfire. All without having to upgrade the rest of the platform.

    With the current model, a new graphics API requires a new graphics card. With Larrabee, it might very well require a whole new platform.

  • Ben90 - Tuesday, September 8, 2009 - link

    Yea, im really excited for Larrabee, who knows if it will be good or not... but with intel kicking ass in everything else, it will at least be interesting

    With overclocking performance seemingly being limited by the PCI-E controller, it seems like an unlocked 1156 would be pretty sweet

    All in all i gotta admit i was kinda bitter with this whole 1156 thing because i jumped on the 1366 bandwagon and it seemed that Intel was mostly just jacking off with the new socket... but this processor seems to bring a lot more innovation than i expected (just not in raw performance, still great performance though)
  • chizow - Tuesday, September 8, 2009 - link

    Was worried no one was going to properly address one of the main differences between P55 and X58, thanks for giving it a dedicated comparison. Although I would've liked to have seen more games tested, it clearly indicates PCIe bandwidth becoming an issue with current generation GPUs. This will only get worse with the impending launch of RV8x0 and GT300.
  • Anand Lal Shimpi - Tuesday, September 8, 2009 - link

    PCIe bandwidth on Lynnfield is only an issue with two GPUs, with one you get the same 16 lanes as you would on X58 or AMD 790FX.

    If I had more time I would've done more games, I just wanted to focus on those that I knew scaled the best to see what the worst case scenario would be for Lynnfield.

    In the end 2 GPUs are passable (although not always ideal on Lynnfield), but 4 GPUs are out of the question.

    Take care,
    Anand
  • JumpingJack - Thursday, September 10, 2009 - link

    Anand, a few other sites have attempted SLI/Xfire work ... one in particular shows 4 GPUs having no impact at all on gaming performance in general -- well, 3 or 4 FPS, but nothing more than a few percent over the norm.

    Could beta or just plain bad first-release drivers be an issue with your configuration?

    Jack
  • JonnyDough - Tuesday, September 8, 2009 - link

    Would it be possible to incorporate two GPU controllers onto a die instead of one or is that what they'll be doing with future procs? I would think that two controllers with a communication hub might supply the needed bandwidth of x16 + x16.
  • Comdrpopnfresh - Tuesday, September 8, 2009 - link

    with two gpus being passable - do you foresee that applying both to two independent gpus, as well as to the single-card dual-gpu parts?
  • Ryan Smith - Tuesday, September 8, 2009 - link

    Yes. The only difference between the two is where the PCIe bridge chip is. In the former it's on the mobo, in the latter it's on the card itself.
  • Eeqmcsq - Tuesday, September 8, 2009 - link

    Talk about bringing a bazooka to a knife fight. AMD better be throwing all their innovation ideas and the kitchen sink into Bulldozer, because Intel is thoroughly out-innovating AMD right now.
