Note: This preview was not sanctioned or supported by Intel in any way.

I still remember hearing about Intel's tick-tock cadence and not having much faith that the company could pull it off. Granted, Intel hasn't given us a new chip every 12 months on the dot, but more or less there's something new every year: either a new microarchitecture on an established process node (tock), or a derivative architecture on a new process node (tick). The table below summarizes what we've seen since Intel adopted the strategy:

Intel's Tick-Tock Cadence

| Microarchitecture | Process Node | Tick or Tock | Release Year |
|-------------------|--------------|--------------|--------------|
| Conroe/Merom      | 65nm         | Tock         | 2006         |
| Penryn            | 45nm         | Tick         | 2007         |
| Nehalem           | 45nm         | Tock         | 2008         |
| Westmere          | 32nm         | Tick         | 2010         |
| Sandy Bridge      | 32nm         | Tock         | 2011         |
| Ivy Bridge        | 22nm         | Tick         | 2012         |
| Haswell           | 22nm         | Tock         | 2013         |

Last year was a big one. Sandy Bridge brought a Conroe-like increase in performance across the board thanks to a massive re-plumbing of Intel's out-of-order execution engine and other significant changes to the microarchitecture. If you remember Conroe (the first Core 2 architecture), what followed it was a relatively mild upgrade called Penryn that gave you a little bit in the way of performance and dropped power consumption at the same time.

Ivy Bridge, the tick that follows Sandy Bridge, would typically be just that: a mild upgrade that inched performance ahead while dropping power consumption. Intel's microprocessor ticks are usually very conservative on the architecture side, which limits the performance improvement. Being less risky on the architecture allows Intel to focus more on working out the kinks in its next process node, in turn delivering some amount of tangible power reduction.

Where Ivy Bridge shakes things up is on the graphics side. For years Intel has been able to ship substandard graphics in its chipsets based on the principle that only gamers needed real GPUs and Windows ran just fine on integrated graphics. Over the past decade that philosophy required adjustment. First it was HD video decode acceleration, then GPU accelerated user interfaces and, more recently, GPU computing applications. Intel eventually committed to taking GPU performance (and driver quality) seriously, setting out on a path to significantly improve its GPUs.

As Ivy is a tick in Intel's cadence, we shouldn't see much of a performance improvement. On the CPU side that's mostly true: you can expect a 5 - 15% increase in performance for the same price as a Sandy Bridge CPU today. A continued desire to be aggressive on the GPU front, however, puts Intel in a tough spot. Moving to a new manufacturing process, especially one as dramatically different as Intel's 22nm 3D tri-gate node, isn't easy. Any additional complexity outside of the new process simply puts the schedule at risk. That being said, Intel's GPUs continue to lag significantly behind AMD's and, more importantly, they still aren't fast enough by customer standards.

Apple has been pushing Intel for faster graphics for years; it has had no issue including discrete GPUs across its lineup, or even prioritizing GPU upgrades over CPU upgrades. Intel's exclusivity agreement with Apple expired around Nehalem, meaning every design win can easily be lost if the fit isn't right.

With Haswell, Intel will finally deliver what Apple and other customers have been asking for on the GPU front. Until then, Intel had to do something to move graphics performance forward. A simple tick wouldn't cut it.

Intel calls Ivy Bridge a tick+. While CPU performance steps forward modestly, GPU performance sees a more significant improvement - in the 20 - 50% range. The magnitude of improvement on the GPU side is more consistent with what you'd expect from a tock. The combination of a CPU tick and a GPU tock is how Intel arrives at the tick+ naming. I'm personally curious to see how this unfolds going forward. Will GPUs and CPUs go through alternating tocks, or will Intel try to synchronize them? Will we see innovation on one side slow down as the other side accelerates? Does tick-tock remain on a two-year cadence now that there are two fairly different architectures that need updating? I don't know that we'll see answers to these questions until after Haswell. For now, let's focus on Ivy Bridge.

Comments

  • silverblue - Wednesday, March 7, 2012

    At the very least, AMD needs a less power-hungry successor to Bulldozer. The Xeon review mentions that they should be in a position to do this, and could at least clock the thing a lot higher while still using less power than Bulldozer. Regardless, that IPC deficit is a killer - the following page is so telling of the architecture's current limitations:

    http://www.anandtech.com/show/4955/the-bulldozer-r...
  • abianand2 - Wednesday, March 7, 2012

    1. General curiosity: you stated that you did not get sanction or support from Intel for this preview. I believed that sort of thing wasn't allowed before the release date. How do exceptions like this work?

    2. Specific: I noticed that most of the discrete GPU tests were run at 1680x1050. Any reason for this? I'm guessing it's because this is just a preview. Am I right? Or is there another reason?

    Thanks
  • Kjella - Wednesday, March 7, 2012

    1. If you want to officially review the chip, you sign an NDA and Intel provides you with it. Here he got access to it from a partner, who probably broke their agreements, but Anand never signed any agreement, so he can publish whatever he wants.

    2. I would think so, and in GPU-bound scenarios I wouldn't expect much change at all.
  • InsaneScientist - Wednesday, March 7, 2012

    1) Generally, what happens with previews and first looks is that the company producing a product (Intel) will send out press samples to reviewers who sign a non-disclosure agreement (NDA). When the NDA expires (generally at the same time for everyone), the reviewers can post their findings to the public.

    This is done (I assume) to give reviewers enough time to thoroughly review a product without (theoretically) having to worry about information leaking before the company wants it to get out.

    If, on the other hand, a reviewer acquires a product via other means, there's no NDA they had to sign in order to get it... well, they're not under NDA, so they're free to disclose whatever they want.
  • sld - Wednesday, March 7, 2012

    What a troll.

    Start crying when Intel is able to jack up prices by 2x - 3x once AMD is gone.

    You don't even realise that his fear of a tock- means that, since Ivy Bridge has more features, presumably pulled forward from Haswell, Haswell itself will bring fewer features to the table.
  • Hrel - Wednesday, March 7, 2012

    WHY!!!!? Does Intel HAVE to disable Hyper-Threading on the sub-$300 CPUs? It's not like having it ENABLED costs them anything more at all. It would just be providing their customers with a better product. This shit is infuriating. It's there on the chip no matter what; HT should just be on every single Intel chip, no matter what. That shit pisses me off SOOOO much.
  • Exodite - Wednesday, March 7, 2012

    I would imagine there are going to be a fair few sub-$300 dual-cores with HT down the line, though I suppose you meant specifically the quads?

    The reason seems obvious enough to me: if you need the extra performance in applications that stand to gain from HT, you'll have to pay for it.

    Frankly I don't see the added cost as anything major, considering the gains.

    It's just differentiation really.

    Sure, we'd all want more stuff cheaper (or for free!), but lacking HT doesn't in any way cripple a chip.
  • sicofante - Wednesday, March 7, 2012

    It's called market segmentation.
  • Hector2 - Wednesday, March 7, 2012

    Chill out. You'll pop a blood vessel. With all that's happening in the world, THAT's what's pissing you off? LMAO
  • sld - Wednesday, March 7, 2012

    Intel's products get cheaper with smaller dies and with competition. Without competition, their dies cost the same to make, but they rob and loot your pockets and make obscene profits off you because your hated AMD no longer exists as an alternative supplier of good chips.
