Z68

In developing its 6-series chipsets Intel wanted to minimize risk, so much of the underlying chipset architecture is borrowed from Lynnfield’s 5-series platform. That conservative approach left a hole in the lineup. The P67 chipset lets you overclock the CPU and memory, but it lacks the Flexible Display Interface (FDI) necessary to support SNB’s HD Graphics. The H67 chipset has an FDI, so you can use the on-die GPU, but it doesn’t support CPU or memory overclocking. What about users who don’t need a discrete GPU but still want to overclock their CPUs? With the chipsets Intel is launching today, you’re effectively forced to buy a discrete GPU if you want to overclock your CPU. That’s great for AMD and NVIDIA, but not so great for consumers who don’t need a discrete GPU, and not the most sensible decision on Intel’s part.

There is a third member of the 6-series family that will begin shipping in Q2: Z68. Take P67, add processor graphics support and you’ve got Z68. It’s as simple as that. Z68 is also slated to support something called SSD Caching, enabled by version 10.5 of Intel’s Rapid Storage Technology drivers, although Intel hasn’t shared any details with us yet. This sounds like the holy grail of SSD/HDD setups: a single drive letter, with the driver managing what goes on your SSD vs. your HDD. Whether SSD Caching is indeed a DIY hybrid hard drive technology remains to be seen. It’s also unclear whether P67/H67 will get SSD Caching once 10.5 ships.
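Since Intel hasn’t detailed how Z68’s SSD Caching actually works, here is only a hypothetical sketch of the general idea behind any block-level SSD cache: hot blocks get promoted to a small, fast store while the bulk of the data stays on the hard drive. All names, and the LRU eviction policy, are assumptions for illustration, not Intel’s design:

```python
from collections import OrderedDict

class BlockCache:
    """Toy model of an SSD-backed read cache in front of an HDD.

    Frequently read blocks are promoted to a small, fast store (the
    "SSD"); the least recently used block is evicted when it fills up.
    """
    def __init__(self, hdd, ssd_capacity_blocks):
        self.hdd = hdd                      # backing store: block number -> data
        self.ssd = OrderedDict()            # LRU-ordered cached blocks
        self.capacity = ssd_capacity_blocks
        self.hits = self.misses = 0

    def read(self, block):
        if block in self.ssd:               # cache hit: serve from the SSD
            self.ssd.move_to_end(block)     # mark as most recently used
            self.hits += 1
            return self.ssd[block]
        self.misses += 1                    # cache miss: fall back to the HDD
        data = self.hdd[block]
        self.ssd[block] = data              # promote the block to the SSD
        if len(self.ssd) > self.capacity:   # evict the coldest block
            self.ssd.popitem(last=False)
        return data

hdd = {n: f"data{n}" for n in range(10)}
cache = BlockCache(hdd, ssd_capacity_blocks=4)
for n in [0, 1, 2, 0, 1, 3, 0, 4, 5]:      # repeated reads of a few hot blocks
    cache.read(n)
print(cache.hits, cache.misses)             # -> 3 6
```

The point of the sketch is that the application only ever sees one device (one “drive letter”); whether a read is fast or slow depends on whether the cache layer has seen that block recently.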

LGA-2011 Coming in Q4

One side effect of Intel’s tick-tock cadence is a staggered update schedule across market segments. For example, Nehalem’s release in Q4 2008 took care of the high-end desktop market, but that segment didn’t see an update until the beginning of 2010 with Gulftown. Similarly, while Lynnfield debuted in Q3 2009, it was left out of the 32nm refresh in early 2010. Sandy Bridge is essentially that 32nm update to Lynnfield.

So where does that leave Nehalem and Gulftown owners? For the most part, the X58 platform is a dead end. While there are some niche benefits (more PCIe lanes, more memory bandwidth, 6-core support), the majority of users would be better served by Sandy Bridge on LGA-1155.

For users who do need those benefits, however, there is a version of Sandy Bridge for you. It’s codenamed Sandy Bridge-E and it’ll debut in Q4 2011. The chips will be available in both 4- and 6-core versions with a large L3 cache (Intel isn’t being more specific at this point).

SNB-E will get the ring bus, on-die PCIe and all of the other features of the LGA-1155 Sandy Bridge processors, but it won’t have an integrated GPU. While current SNB parts top out at 95W TDP, SNB-E will run all the way up to 130W—similar to existing LGA-1366 parts.

The new high-end platform will require a new socket and motherboard (LGA-2011). Expect CPU prices to start at around the $294 level of the new Core i7-2600 and run all the way up to $999.


  • dacipher - Monday, January 03, 2011 - link

    The Core i5-2500K was just what I was looking for. Performance/price is where it needs to be, and overclocking should be a breeze.
  • vol7ron - Monday, January 03, 2011 - link

    I agree.

    "As an added bonus, both K-series SKUs get Intel’s HD Graphics 3000, while the non-K series SKUs are left with the lower HD Graphics 2000 GPU."

    Doesn't it seem like Intel has this backwards? For me, I'd think to put the 3000 on the lesser performing CPUs. Users will probably have their own graphics to use with the unlocked procs, whereas the limit-locked ones will more likely be used in HTPC-like machines.
  • DanNeely - Monday, January 03, 2011 - link

    This seems odd to me unless they're having yield problems with the GPU portion of their desktop chips. That doesn't seem too likely though because you'd expect the mobile version to have the same problem but they're all 12 EU parts. Perhaps they're binning more aggressively on TDP, and only had enough chips that met target with all 12 EUs to offer them at the top of the chart.
  • dananski - Monday, January 03, 2011 - link

    I agree with both of you. This should be the ultimate upgrade for my E8400, but I can't help thinking they could've made it even better if they'd used the die space for more CPU and less graphics and video decode. The Quick Sync feature would be awesome if it could work while you're using a discrete card, but for most people who have discrete graphics, this and the HD Graphics 3000 are a complete waste of transistors. I suppose they're power-gated off, so the thermal headroom could maybe be used for overclocking.
  • JE_Delta - Monday, January 03, 2011 - link

    WOW........

    Great review guys!
  • vol7ron - Monday, January 03, 2011 - link

    Great review, but does anyone know how often just one core is active? I know this is subjective, but if you're running an anti-virus and have a bunch of standard services running in the background, are you likely to use only one core when idling?

    What should I advise people, as consumers, to really pay attention to? I know when playing games such as Counter-Strike or Battlefield: Bad Company 2, my C2D maxes out at 100%, I assume both cores are being used to achieve the 100% utilization. I'd imagine that in this age, hardly ever will there be a time to use just one core; probably 2 cores at idle.

    I would think that the 3-core figures are where the real noticeable impact is, especially in turbo, when gaming/browsing. Does anyone have any more perceived input on this?
  • dualsmp - Monday, January 03, 2011 - link

    What resolution is tested under Gaming Performance on pg. 20?
  • johnlewis - Monday, January 03, 2011 - link

    According to Bench, it looks like he used 1680×1050 for L4D, Fallout 3, Far Cry 2, Crysis Warhead, Dragon Age Origins, and Dawn of War 2, and 1024×768 for StarCraft 2. I couldn't find the tested resolution for World of Warcraft or Civilization V. I don't know why he didn't list the resolutions anywhere in the article or the graphs themselves, however.
  • karlostomy - Thursday, January 06, 2011 - link

    what the hell is the point of posting gaming scores at resolutions that no one will be playing at?

    If I am not mistaken, the graphics cards in the test are:
    eVGA GeForce GTX 280 (Vista 64)
    ATI Radeon HD 5870 (Windows 7)
    MSI GeForce GTX 580 (Windows 7)

    So then, with a Sandy Bridge processor, these resolutions are irrelevant.
    1080p or above should be the standard resolution for modern setup reviews.

    Why, Anand, have you posted irrelevant resolutions for the hardware tested?
  • dananski - Thursday, January 06, 2011 - link

    Games are usually limited in fps by the level of graphics, so processor speed doesn't make much of a difference unless you turn the graphics detail right down and use an overkill graphics card. As the point of this page was to review the CPU power, it's more representative to use low resolutions so that the CPU is the limiting factor.

    If you did this set of charts for gaming at 2560x1600 with full AA & max quality, all the processors would be stuck at about the same rate because the graphics card is the limiting factor.

    I expect Civ 5 would be an exception to this because it has really counter-intuitive performance.
