The Roadmap & Pricing

I’ve defined the launch parts earlier in this article, but now I’m going to put them in perspective. When Intel provides its partners with roadmaps, it also gives them an idea of where future CPUs slot into various segments/price points. For example, Intel’s LGA-1366 roadmap tells us that in the “Extreme” market segment Intel only has a single product offering: the Core i7 980X. And in Q1 2011 the 980X gets replaced by the 990X.

Based on this information you can usually get a general idea of how much future products will cost - or at least what they will be comparable to. In this example, the 990X will most likely be priced at whatever the 980X sells for today. Products may change, but the price people are willing to pay in a given market segment usually doesn’t.

What we have below is the Intel roadmap, with Sandy Bridge included, for Q3 2010 through Q3 2011. The further out you go in a roadmap the lower your accuracy becomes, so I wouldn’t worry too much about us not seeing LGA-2011 on there yet.


[Intel desktop roadmap, Q3 2010 through Q3 2011]

It’s based on this roadmap that I mentioned some pricing earlier. If all stays the same, the Core i7 2600K will take the place of the Core i7 950, currently priced at $562. The 2600 will fit in somewhere around the i5 680 and i7 875K ($342), and the 2500K will replace the i5 760/655K ($205 - $216).

The cheapest Sandy Bridge at launch will be the Core i3 2100, which will replace the i3 560 at around $138.

Now pricing is always a huge variable, but I have to say, based on the performance you’re about to see - these parts would be priced right.

Comments

  • seapeople - Sunday, August 29, 2010 - link

    So you're saying that integrated graphics should either be able to handle high-resolution gaming at medium or better settings on the upper echelon of current games, or not be included at all? That's fairly narrow-minded. The bottom line is that most people will never need a better graphics card than SB provides, and the people who do are probably going to buy a $200+ graphics card anyway and replace it every summer, so are they really going to care if the integrated graphics drive the price of their $200 processor up by $10-20? Alternatively, this chip is begging for some sort of Optimus-like option, which would let hardcore gamers buy the graphics card they want AND not have to chew up 100W of graphics power while browsing the web or watching a movie.

    Regardless, for people who aren't hardcore gamers, the IGP on SB replaces the need to buy something like a Radeon HD 5450, ultimately saving them money. This seems like a positive step to me.
  • chizow - Sunday, August 29, 2010 - link

    No, I'm saying that if this is being advertised as a suitable discrete GPU replacement, it should be compared to discrete GPUs at resolutions and settings you would expect a discrete GPU to handle, and not to IGPs that we already know are too slow to matter. 1024x768 with all settings at their lowest doesn't fit that criterion. Flash and web-based games don't either, since they don't even require a 3D accelerator to run (Intel's Broadcom workaround chip would be fine).

    Again, this card wouldn't even hold a candle to a mid-range $200 GPU from 3 years ago; the 8800GT would still do cartwheels all over it. You can buy such cards for much less than $100; the GT 240 or 4850, for example, have been selling for less than $50 after MIR and would be much more capable gaming cards.

    Also, you're badly mistaken if you think this GPU is free by any means, as the cost to integrate a GPU onto SB's die comes at the expense of what could've been more actual CPU... so instead of better CPU performance this generation, you lose that for mediocre graphics performance. There is a price to pay for that relatively massive IGP whether you think so or not; you are paying for it.
  • wut - Sunday, August 29, 2010 - link

    You don't know what you're talking about. You pretend that you do, but you don't.

    The telling sign is your comment about L2/L3 cache.
  • chizow - Sunday, August 29, 2010 - link

    Actually it sounds like you don't know what you're talking about or you didn't read the article:

    "Only the Core i7 2600 has an 8MB L3 cache, the 2400, 2500 and 2600 have a 6MB L3 and the 2100 has a 3MB L3. The L3 size should matter more with Sandy Bridge due to the fact that it’s shared by the GPU in those cases where the integrated graphics is active. I am a bit puzzled why Intel strayed from the steadfast 2MB L3 per core Nehalem’s lead architect wanted to commit to. I guess I’ll find out more from him at IDF :)"

    You might've also missed, clearly stated in the tables, that only the 2600 keeps the 2MB of L3 per core that previous parts like Bloomfield/Lynnfield/Westmere/Clarkdale delivered. The rest have 6MB or 3MB, which is less than the 8MB or 4MB L3 used on the previous generation chips.

    This may change with the high-end/enthusiast platform, but again, the amount of L3 cache is actually going to be a downgrade on many of these Sandy Bridge SKUs for anyone who already owns a Nehalem/Westmere based CPU.
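
For reference, the per-core arithmetic behind this exchange is easy to check. Below is a minimal sketch in Python; the Sandy Bridge cache sizes come from the article's tables, while the previous-generation figures and core counts are the published specs:

```python
# L3-per-core comparison for the SKUs discussed above.
# Sandy Bridge cache sizes are from the article's spec tables; the
# previous-generation entries are published specs, for comparison.
skus = [
    # (name,                      L3 in MB, cores)
    ("Core i7 2600",              8, 4),
    ("Core i5 2500",              6, 4),
    ("Core i5 2400",              6, 4),
    ("Core i3 2100",              3, 2),  # dual-core part
    ("Core i7 950 (Bloomfield)",  8, 4),
    ("Core i3 560 (Clarkdale)",   4, 2),
]

for name, l3_mb, cores in skus:
    print(f"{name:26} {l3_mb} MB L3 / {cores} cores = {l3_mb / cores:.1f} MB per core")
```

Only the 2600 keeps the 2MB-per-core ratio of the outgoing parts; the rest drop to 1.5MB per core, which is the downgrade being argued about.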
  • wut - Friday, September 10, 2010 - link

    You're parroting Anand and his purely number-based guess. Stop pretending.
  • mac2j - Saturday, August 28, 2010 - link

    The 990x is a Gulftown part on LGA-1366 that's 130MHz faster than the 980x... it will cost $1000 and come out around the same time as the 2600 (which will cost ~1/2 as much and deliver 90% of the performance), and at most a couple of months before the i7-2800K, which will cost less and trounce it performance-wise.

    You'd have to REALLY want those extra cores to buy a 990x on a lame-duck socket at that point!
  • wut - Sunday, August 29, 2010 - link

    Someone has to get those chips to populate the uppermost echelons of the 3DMark scoreboards. It's an expensive hobby.
  • hybrid2d4x4 - Saturday, August 28, 2010 - link

    Anand, can you provide some more info on what the system configuration was when running the power tests? The test setup lists 2 vid cards and it's not clear which was used when deriving the power graphs. Also, what PSU was used?
    Just wondering, since if it was a 1200W behemoth the 63W idle might really be ~30W on a more reasonable PSU (assuming no vid cards)...
    As always, thanks for the article!
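
The power question above comes down to PSU efficiency: the DC load a system draws is fixed, while the wall reading scales with how efficient the supply is at that load. Here is a rough sketch in Python; the efficiency figures are assumptions for the sake of the arithmetic, not measured values for the review's hardware:

```python
# Illustrative only: converting a wall-power (AC) reading to an estimated
# DC load, then back to what a more efficient PSU would show at the wall.
ac_measured = 63.0      # idle wall draw reported in the review (watts)
eff_big_psu = 0.65      # assumed efficiency of a 1200W unit at ~5% load
eff_right_sized = 0.85  # assumed efficiency of a smaller unit at ~20% load

dc_load = ac_measured * eff_big_psu        # actual power consumed by the system
ac_better_psu = dc_load / eff_right_sized  # wall reading with the better PSU

print(f"Estimated DC load:                 {dc_load:.0f} W")        # ~41 W
print(f"Estimated wall draw on better PSU: {ac_better_psu:.0f} W")  # ~48 W
```

Even under these generous assumptions the PSU swap alone wouldn't get the idle number down to 30W; pulling the discrete card as well, as the commenter suggests, would have to make up the difference.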
  • smilingcrow - Saturday, August 28, 2010 - link

    Was HT enabled for the power tests and what application was used to load the cores?
  • semo - Saturday, August 28, 2010 - link

    No USB 3.0 support and a half-baked SATA3 implementation. I may be a bit too harsh about the latter (I can't say whether SATA3 on a 6-series chipset will perform poorly or not), but why are they going with only two 6Gb/s ports? I understand that most people are likely to buy only one SSD or so in the near future, but what about in a few years when these things become mainstream? At least AMD took SATA3 seriously, even if they couldn't quite make it work initially (we need a follow-up on the 8-series chipsets' SATA performance!)

    Not only is Intel overlooking advances in technologies other than CPUs (which are important to most consumers, whether they are aware of it or not), but it is also shutting out other companies that might have more focus in those areas. I wonder if Nvidia or anyone else will bother to release a chipset for Intel's latest and greatest.
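
For context on the two-port complaint, peak SATA payload bandwidth is simple to work out: SATA uses 8b/10b encoding, so only 80% of the raw line rate carries data. A quick sketch in Python:

```python
# Peak SATA payload bandwidth: 8b/10b encoding carries 8 data bits
# per 10 line bits, so 80% of the raw line rate is usable.
def sata_max_mb_s(line_rate_gbps: float) -> float:
    """Peak payload bandwidth in MB/s for a given SATA line rate."""
    return line_rate_gbps * 1e9 * 0.8 / 8 / 1e6

for gen, rate in [("SATA 3Gb/s", 3.0), ("SATA 6Gb/s", 6.0)]:
    print(f"{gen}: ~{sata_max_mb_s(rate):.0f} MB/s")
```

With the fastest consumer SSDs already brushing up against the ~300MB/s ceiling of a 3Gb/s link, any drive beyond the first two on a 6-series board would be stuck on the slower ports.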
