Note: This preview was not sanctioned or supported by Intel in any way.

I still remember hearing about Intel's tick-tock cadence and not having much faith that the company could pull it off. Granted, Intel hasn't given us a new chip every 12 months on the dot, but there's something new more or less every year: either a new architecture on an established process node (tock), or a derivative architecture on a new process node (tick). The table below summarizes what we've seen since Intel adopted the strategy:

Intel's Tick-Tock Cadence

Microarchitecture | Process Node | Tick or Tock | Release Year
Conroe/Merom      | 65nm         | Tock         | 2006
Penryn            | 45nm         | Tick         | 2007
Nehalem           | 45nm         | Tock         | 2008
Westmere          | 32nm         | Tick         | 2010
Sandy Bridge      | 32nm         | Tock         | 2011
Ivy Bridge        | 22nm         | Tick         | 2012
Haswell           | 22nm         | Tock         | 2013

Last year was a big one. Sandy Bridge brought a Conroe-like increase in performance across the board thanks to a massive re-plumbing of Intel's out-of-order execution engine and other significant changes to the microarchitecture. If you remember Conroe (the first Core 2 architecture), what followed it was a relatively mild upgrade called Penryn that gave you a little bit in the way of performance and dropped power consumption at the same time.

Ivy Bridge, the tick that follows Sandy Bridge, would typically be just that: a mild upgrade that inched performance ahead while dropping power consumption. Intel's microprocessor ticks are usually very conservative on the architecture side, which limits the performance improvement. Being less risky on the architecture allows Intel to focus more on working out the kinks in its next process node, in turn delivering some amount of tangible power reduction.

Where Ivy Bridge shakes things up is on the graphics side. For years Intel has been able to ship substandard graphics in its chipsets based on the principle that only gamers needed real GPUs and Windows ran just fine on integrated graphics. Over the past decade that philosophy required adjustment. First it was HD video decode acceleration, then GPU accelerated user interfaces and, more recently, GPU computing applications. Intel eventually committed to taking GPU performance (and driver quality) seriously, setting out on a path to significantly improve its GPUs.

As Ivy is a tick in Intel's cadence, we shouldn't see much of a performance improvement. On the CPU side that's mostly true: you can expect a 5-15% increase in performance for the same price as a Sandy Bridge CPU today. A continued desire to be aggressive on the GPU front, however, puts Intel in a tough spot. Moving to a new manufacturing process, especially one as dramatically different as Intel's 22nm 3D tri-gate node, isn't easy. Any additional complexity outside of the new process simply puts the schedule at risk. That being said, Intel's GPUs continue to lag significantly behind AMD's and, more importantly, they still aren't fast enough by customer standards.

Apple has been pushing Intel for faster graphics for years, having no issues with including discrete GPUs across its lineup or even prioritizing GPU over CPU upgrades. Intel's exclusivity agreement with Apple expired around Nehalem, meaning every design win can easily be lost if the fit isn't right.

With Haswell, Intel will finally deliver what Apple and other customers have been asking for on the GPU front. Until then Intel had to do something to move performance forward. A simple tick wouldn't cut it.

Intel calls Ivy Bridge a tick+. While CPU performance steps forward, GPU performance sees a more significant improvement, in the 20-50% range. The magnitude of improvement on the GPU side is more consistent with what you'd expect from a tock. The combination of a CPU tick and a GPU tock is how Intel arrives at the tick+ naming. I'm personally curious to see how this unfolds going forward. Will GPUs and CPUs go through alternating tocks, or will Intel try to synchronize them? Do we see innovation on one side slow down as the other increases? Does tick-tock remain on a two year cadence now that there are two fairly different architectures that need updating? These are questions I don't know that we'll see answers to until after Haswell. For now, let's focus on Ivy Bridge.

Ivy Bridge Architecture Recap
195 Comments

  • arno - Friday, March 09, 2012 - link

    No, overclocking is just out of the question for me. I want to buy a professional laptop (Lenovo W520), so there's no way to tweak it.
    Fact is, the memory will be 1600 MHz and the processor a bit stronger, with maybe a better memory controller.
    One month from the release, it's worth waiting for.
    I just want to make sure that in my particular case it's really worth it, because I'm tired of my heavy old laptop. I'm buying this machine just for working, after all. At home, my E8400 is still up to date for what I do with it.
  • DDR4 - Friday, March 09, 2012 - link

    I want to see some increase in performance and actual processing power. For now, I can leave the graphics to the GPU.
  • Nexing - Friday, March 09, 2012 - link

    @Arno
    I'd consider a few aspects:
    -Do you need to use precision external gear (like we audio people do with sound cards) and hence need ExpressCard or Thunderbolt connectors? Then I'd expect the May-June launches to bring those professional laptops and Ultrabooks.
    -If portability is important: actual Sandy Bridge battery life is near 4 hours, while Ivy Bridge may extend real usage to around eight hours for similar performance.
    -Furthermore, USB 3.0 will be native, which matters since most Renesas controllers have been far from perfect, and only their recent (Feb/March 2012) releases seem to have finally nailed efficiency. Problems with USB 3.0-equipped Sandy Bridge laptops abound in forums, and that is in professional brands.
    -If you were asking about Sandy Bridge vs. Ivy Bridge desktops, you could still buy the former now and upgrade to the latter CPU later; but on the mobile platform, Intel has stated that upgrading from H67M (their current chipset platform, also named Cougar Point) is not going to be feasible, despite it being easily possible technically.
    Therefore, there are many reasons pointing to waiting. Since sales are very low, any are choosing this route.
  • arno - Saturday, March 10, 2012 - link

    Thanks Nexing for your answer. Actually, I totally agree with you on:
    portability => IB is a shrink and should be more power efficient for an equivalent workload. The tests seem to prove it. Moreover, I will work a lot on trains or outdoors (visiting customers), so it is definitely a plus.
    USB 3 => your feedback is very interesting. I myself think that native USB 3.0 must be better than an add-on controller, and that was also a reason for me to wait last December, when I was already thinking of buying something new. Now I'm quite sure that was the right thing to do.

    For the rest, more than external gear, I need a processor that's good at floating-point calculation. I do intensive electrical simulations, so I definitely need it.

    I've made my decision and I will wait. This laptop will replace both a desktop and a laptop for work (and work only, because for internet or usual office tasks I definitely think a Core 2 Duo can manage); so better to catch the best. I will manage my present emergency, praying for Lenovo (or Samsung?) to offer a new Ivy Bridge laptop as soon as possible. Let's make a bet: Lenovo has one ready to release and is just waiting for the official launch date...

    Thanks for sharing ;)
  • Nexing - Friday, March 09, 2012 - link

    Should say:
    "many are taking this waiting route"
  • arno - Saturday, March 10, 2012 - link

    "FP/integer divider delivers 2x throughput compared to Sandy Bridge"

    I should read more carefully. That is an answer to my question. Maybe not a spectacular improvement, but still one.
  • DrWattsOn - Tuesday, March 13, 2012 - link

    @arno I'm GLAD you didn't read more carefully, because you posted the question, and Nexing's answer focused me on something I still wasn't considering as a major factor in my decision: USB3. Between your question and the response, I also got a better picture of how specific use is affected by the tech. So, I'm a waiter (tho I don't serve food 8^D ).
  • stephenbrooks - Saturday, March 10, 2012 - link

    Intel released in 2006, 2007, 2008, 2010, 2011, and 2012.

    In base 9 they're on schedule.
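    [Editor's note: the joke actually holds. Base 9 has no digit 9, so the numeral after 2008 is 2010, and the six release years read as base-9 numbers are exactly consecutive. A quick sketch to check:]

```python
# Interpret each release year as a base-9 numeral. In base 9 the digit 9
# doesn't exist, so 2008 + 1 = 2010, and the "missing" 2009 is no gap at all.
years = ["2006", "2007", "2008", "2010", "2011", "2012"]
values = [int(y, 9) for y in years]
gaps = [b - a for a, b in zip(values, values[1:])]
print(gaps)  # every gap is exactly 1
```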
  • bhima - Saturday, March 10, 2012 - link

    Basically no 2D-based graphic designer/web designer needs a discrete GPU for their work. The IGPs handle that workload fine (mainly because most of the processing needed for Photoshop, InDesign, Illustrator, or Dreamweaver is CPU-based). A discrete GPU gives you better performance with the very limited 3D features Photoshop offers, which are situational at best for the vast majority of graphic designers.

    3D artists, and those that pump a ton of effects into video editing, would benefit from a discrete GPU.
  • shadow king - Monday, March 12, 2012 - link

    ^ =)
