FSB Bottlenecks: Is 1333MHz Necessary?

Although all desktop Core 2 processors currently feature a 1066MHz FSB, Intel's first Woodcrest processors (the server version of Conroe) offer 1333MHz FSB support. Intel doesn't currently have a desktop chipset that supports the 1333MHz FSB, but the question we wanted answered was whether the faster FSB makes a difference.

We took our unlocked Core 2 Extreme X6800 and ran it at 2.66GHz using two different settings: 266MHz x 10 and 333MHz x 8. The former corresponds to a 1066MHz FSB and is the same setting the E6700 runs at, while the latter uses a 1333MHz FSB. The 1333MHz setting used a slightly faster memory bus (DDR2-811 vs. DDR2-800), but given that the processor is not memory bandwidth limited even at DDR2-667, the difference between memory speeds is negligible.
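
To put some numbers to those settings, here is a minimal sketch of the arithmetic involved (Python used purely for illustration): the FSB is quad-pumped, moving four transfers per clock across a 64-bit bus, and Intel's 1066/1333 figures come from the exact 266.67/333.33MHz base clocks.

    # Illustrative sketch: clock and bandwidth math behind the
    # 266MHz x 10 and 333MHz x 8 settings. The Core 2 FSB is
    # quad-pumped: four transfers per clock across a 64-bit (8-byte) bus.
    def fsb_stats(base_mhz, multiplier):
        core_ghz = base_mhz * multiplier / 1000   # CPU core clock in GHz
        transfers_mts = base_mhz * 4              # effective FSB rate in MT/s
        bandwidth_gbs = transfers_mts * 8 / 1000  # peak FSB bandwidth in GB/s
        return core_ghz, transfers_mts, bandwidth_gbs

    for base, mult in ((266, 10), (333, 8)):
        core, mts, bw = fsb_stats(base, mult)
        print(f"{base}MHz x {mult}: {core:.2f}GHz core, "
              f"{mts}MT/s FSB, {bw:.1f}GB/s peak")

    # Output:
    # 266MHz x 10: 2.66GHz core, 1064MT/s FSB, 8.5GB/s peak
    # 333MHz x 8: 2.66GHz core, 1332MT/s FSB, 10.7GB/s peak

Same core clock either way; the only variable is how quickly the CPU can pull data over the bus.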

With Intel pulling in the embargo date for all Core 2 benchmarks, we had to cut our investigation a bit short, so we're not able to bring you the full suite of benchmarks here to investigate the impact of FSB frequency. That said, we chose the benchmarks that would be most representative of the rest.

Why does this 1333MHz vs. 1066MHz debate even matter? For starters, Core 2 Extreme owners will have the option of choosing, since they can always drop their multiplier and run at a higher FSB without overclocking their CPUs (if they so desire). There's also a rumor that Apple's first Core 2 based desktops may end up using Woodcrest rather than Conroe, which would mean that the 1333MHz FSB would see the light of day on some desktops sooner rather than later.

The final reason this comparison matters is that Intel's Core architecture is more data hungry than any previous Intel desktop architecture and thus should, in theory, depend on a nice, fast FSB. At the same time, thanks to a well engineered shared L2 cache, FSB traffic has been reduced on Core 2 processors. So which wins the battle: the data hungry 4-issue core or the efficient shared L2 cache? Let's find out.

On average at 2.66GHz, the 1333MHz FSB increases performance by 2.4%, but some applications see an even larger boost. Under DivX, the performance gain was almost as large as going from a 2MB L2 cache to a 4MB L2 cache. Also keep in mind that as clock speed goes up, dependence on a faster FSB goes up with it.

Thanks to the shared L2 cache, core-to-core traffic no longer has to travel over the FSB, so the improvements we're seeing here are simply due to how data hungry the new architecture is. With its wider front end and more aggressive prefetchers, it's no surprise that the Core 2 processors benefit from the 1333MHz FSB. The benefit will only increase as the first quad core desktop CPUs are introduced. The only question that remains is how long before we see CPUs and motherboards with official 1333MHz FSB support?

If Apple does indeed use a 1333MHz Woodcrest for its new line of Intel based Macs, it may be the first time that an Apple system running Windows is faster out of the box than an equivalently configured, non-overclocked PC. Now there's an interesting marketing angle.

Comments

  • MrKaz - Friday, July 14, 2006 - link

    So how do you calculate performance/watt?

    Based on Doom3? Quake4? LAME? PowerDVD? DivX encoding?

    My point is, this is "impossible" to do, unless you do it for all progs and games.

    Picking up just one of them is being biased...
  • JarredWalton - Friday, July 14, 2006 - link

    Including performance/watt on *ANY* game is a bit odd, given that the GPU will consume more power than the CPU. That's why when we talk about performance per watt on GPUs, we use the same platform for all tested systems.

    If we're going to talk about performance per watt and we're worried about the CPU and platform, then we should look at benchmarks that stress that portion of the system more than anything else. In fact, you could argue that we should drop down to the lowest power GPU possible, or even go with an integrated graphics solution. Anyway, here are a few of the results using WME9:

    0.358 FPS/W X6800
    0.319 FPS/W E6600
    0.279 FPS/W 4600+ EE
    0.276 FPS/W 3800+ EE
    0.273 FPS/W 5000+
    0.244 FPS/W FX-62
    0.244 FPS/W E6300
    0.228 FPS/W PD XE 965

    Part of the reason the lower performance Core 2 Duo chips score so poorly is that we are measuring watts for the entire system. It's reasonable to say that the motherboard, hard drives, graphics card, etc. probably use up on average 100 W of power, give or take. The AMD motherboard and peripherals might also use a bit less power than the Intel board, or vice versa, so the 12 W difference in power draw at idle shouldn't be considered really significant.

    What is significant is that other than the two energy efficient AMD chips (which you can't yet purchase on the retail market), Core 2 offers better performance per watt at similar price points. We could go and measure performance per watt in a bunch of other applications (even games, though the differences are going to be greatly diminished given the GPU's requirements), but the results really aren't likely to change much. Core 2 is faster than AMD, and at worst it matches AMD's power requirements; ergo Core 2 offers better performance per watt. (A quick sketch of this FPS-per-watt arithmetic follows the comments below.)
  • epsilonparadox - Friday, July 14, 2006 - link

    Intel didn't start the focus on performance per watt. AMD started it and ruled the charts based on that measure. Every single X2 vs P4D review has a chart for that measurement. Intel with the C2D just turned the tables back on them by harping on the same issue. If this measurement hadn't become a big deal, you'd likely be running dual 1000W PSUs to run dual core/multi GPU setups.
  • Furen - Friday, July 14, 2006 - link

    It's hard to do a performance/watt chart because processors perform differently under different applications. I'm sure you'll agree that the E6600 is much faster than an X2 3800+ yet draws only slightly more power.
  • bupkus - Friday, July 14, 2006 - link

    : (

    Where's the pics?
    My browser doesn't show them on the first page.
  • Gary Key - Friday, July 14, 2006 - link

    What browser?
  • bupkus - Friday, July 14, 2006 - link

    Firefox
  • Gary Key - Friday, July 14, 2006 - link

    I have tried three different versions of Firefox on varying machines without an issue so far. Still looking into it.
  • JarredWalton - Friday, July 14, 2006 - link

    Options ->
    Web features ->
    Load Images ->
    UNCHECK "for the originating web site only"
  • ianwhthse - Friday, July 14, 2006 - link

    Mine is already unchecked, however I cannot see the pictures either. [Firefox]

    Kicking and screaming, which is somewhat disruptive @4am, I opened Internet Explorer and I cannot see the images there, either.
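
As referenced in JarredWalton's comment above, the FPS-per-watt figures come from dividing encoding throughput by measured whole-system power draw. Here is a minimal sketch of that arithmetic; the absolute FPS and wattage values below are hypothetical placeholders chosen only so the ratios land near the quoted figures, not AnandTech's actual measurements.

    # Hypothetical illustration of the FPS-per-watt metric discussed above.
    # The fps and watts values are placeholders, not real measurements.
    systems = {
        "X6800": {"wme9_fps": 64.4, "system_watts": 180.0},
        "E6600": {"wme9_fps": 54.2, "system_watts": 170.0},
    }

    for name, s in systems.items():
        # Power is measured at the wall, so the motherboard, drives, and GPU
        # (roughly 100 W of fixed overhead) get charged to every CPU equally,
        # which is what penalizes the slower, cheaper chips in this metric.
        fps_per_watt = s["wme9_fps"] / s["system_watts"]
        print(f"{name}: {fps_per_watt:.3f} FPS/W")

    # Output:
    # X6800: 0.358 FPS/W
    # E6600: 0.319 FPS/W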
