The times, they are changing. In fact, the times have already changed; we're just waiting for the results. I remember the first time Intel brought me into a hotel room to show me its answer to AMD's Athlon 64 FX: the Pentium 4 Extreme Edition. Back then the desktop race was hotly contested, and pushing the absolute limits of what could be done without a concern for power consumption was the name of the game. In the mid-2000s, the notebook started to take over. Just as Apple famously announced that it was no longer a manufacturer of personal computers but a manufacturer of mobile devices, Intel came to a similar realization years earlier, when these slides were first shown at IDF in 2005:

IDF 2005

Intel is preparing for another major transition, similar to the one it brought to light seven years ago. The move will once again be motivated by mobility, this time away from the giant CPUs that currently power high-end desktops and notebooks and toward lower-power, more integrated SoCs that find their way into tablets and smartphones. Intel won't leave the high-end market behind, but the trend towards mobility didn't stop with notebooks.

The fact of the matter is that everything Charlie has said about the big H is correct. Haswell will be a significant step forward in graphics performance over Ivy Bridge, and will likely mark Intel's biggest generational leap in GPU technology to date. Internally, Haswell is viewed as the solution to the ARM problem: build a chip that delivers such low idle power that you can't tell the difference between an ARM tablet in standby and one with a Haswell inside, while still offering the performance we've come to expect from Intel. Haswell is the future, and Ivy Bridge is the bridge that takes us there.

In our Ivy Bridge preview I applauded Intel for executing so well over the past few years. By limiting major architectural shifts to known process technologies, and keeping designs simple when transitioning to a new manufacturing process, Intel took what was once a five-year design cycle for microprocessor architectures and condensed it into two. Sure, the changes delivered every two years are simpler than what we used to see every five, but, like most things in life, smaller and more frequent steps often work better than putting big changes off for a long time.

It's Intel's tick-tock philosophy that kept it from having a Bulldozer, and the lack of such structure that left AMD in the situation it is in today (on the CPU side at least). Ironically, what we saw happen between AMD and Intel over the past ten years is the same mistake being made by both companies, just at different times. Intel's complacency and lack of an aggressive execution model allowed AMD to outshine it in the late K7/K8 days; AMD's similar complacency and lack of an execution model later allowed the tides to turn once more.

Ivy Bridge is a tick+, as we've already established. Intel took a design risk and went for greater performance, all while transitioning to 22nm and tri-gate transistors, the most significant manufacturing change it has ever made. The end result is a reasonable increase in CPU performance (for a tick), a big step in GPU performance, and a decrease in power consumption.

Today is the day that Ivy Bridge gets official. Its name truly embodies its purpose. While Sandy Bridge was a bridge to a new architecture, Ivy connects a different set of things. It's a bridge to 22nm, warming the seat before Haswell arrives. It's a bridge to a new world of notebooks that are significantly thinner and more power efficient than what we have today. It's a means to the next chapter in the evolution of the PC.

Let's get to it.

Additional Reading

Intel's Ivy Bridge Architecture Exposed
Mobile Ivy Bridge Review
Undervolting & Overclocking on Ivy Bridge

Intel's Ivy Bridge: An HTPC Perspective

Comments

  • wingless - Monday, April 23, 2012

    I'll keep my 2600K

    .....just kidding
  • formulav8 - Monday, April 23, 2012

    I hope you give AMD even more praise when Trinity is released, Anand. IMO you way overblew how great Intel's IGP stuff is. It's their 4th gen that can't even beat AMD's first gen.

    Just my opinion :p
  • Zstream - Monday, April 23, 2012

    I agree..
  • dananski - Monday, April 23, 2012

    As much as I like the idea of decent Skyrim framerates on every laptop, and even though I find the HD4000 graphics an interesting read, I couldn't care less about it in my desktop. Gamers will not put up with integrated graphics - even this good - unless they're on a tight budget, in which case they'll just get Llano anyway, or wait for Trinity. As for IVB, why can't we have a Pentium III sized option without IGP, or get 6 cores and no IGP?
  • Kjella - Tuesday, April 24, 2012

    Strategy: they're using their lead in CPUs to bundle a GPU with every chip, whether you want it or not. When you take your gamer card out of your gamer machine, it'll still have an Intel IGP for all your other uses (or for your family, or the second-hand market, or whatever); that's one sale they "stole" from AMD/NVIDIA's low end. Having a separate graphics card is becoming a niche market for gamers. That's better for Intel than lowering the expectation that a "premium" CPU costs $300; if you bring the price down, it's always much harder to raise it again...
  • Samus - Tuesday, April 24, 2012

    As amazing as this CPU is, and as much as I'd love it (considering I play BF3 and need a GTX 560 or better anyway), I have to agree the GPU improvement is pretty disappointing...

    After all that work, Intel still can't even come close to AMD's integrated graphics. It's 75% of AMD's performance at best.
  • Cogman - Thursday, May 3, 2012

    There is actually a good reason for both AMD and Intel to keep a GPU on their CPUs no matter what. That reason is OpenCV. This move makes the assumption that OpenCV or programming languages like it will eventually become mainstream. With a GPU coupled to every CPU, it saves developers from writing two sets of code to deal with different platforms.
  • froggr - Saturday, May 12, 2012

    OpenCV is Open Computer Vision and runs either way. I think you're talking about OpenCL (Open Computing Language), and even that runs fine without a GPU. OpenCL can use all cores, CPU + GPU, and does not require separate code bases.

    OpenCL just runs faster on a GPU because GPUs are better suited to its highly parallel workloads.
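
    A minimal host-side sketch (an illustrative assumption, not code from the article or the thread) makes the single-code-base point concrete: ask the OpenCL runtime for a GPU and fall back to the CPU if none is present, then build the same kernel source for whichever device was found. The file name and build line below are assumed.

        /* ocl_fallback.c - hypothetical example, not from the article or comments.
         * Build (assuming a Linux-style OpenCL SDK): gcc ocl_fallback.c -lOpenCL */
        #include <stdio.h>
        #include <CL/cl.h>

        int main(void) {
            cl_platform_id platform;
            cl_device_id device;
            cl_int err;

            if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) {
                fprintf(stderr, "No OpenCL platform found\n");
                return 1;
            }

            /* Prefer a GPU, but fall back to the CPU if no GPU device exists. */
            err = clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);
            if (err != CL_SUCCESS)
                err = clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &device, NULL);
            if (err != CL_SUCCESS) {
                fprintf(stderr, "No OpenCL device found\n");
                return 1;
            }

            char name[256];
            clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);
            printf("Kernels would be built and run on: %s\n", name);

            /* From here on, clCreateContext, clBuildProgram and
             * clEnqueueNDRangeKernel take the same kernel source regardless of
             * which device type was selected above. */
            return 0;
        }

    Only the device query differs between the GPU and CPU paths; the kernel code itself stays identical, which is the single-code-base point being made above.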
  • frozentundra123456 - Monday, April 23, 2012

    Maybe we could actually see some hard numbers before heaping so much praise on Trinity??

    I will be convinced about the claimed 50% IGP improvement when I see it, and they also need to make a lot of improvements to Bulldozer, especially in power consumption, before it is a competitive CPU. I hope it turns out to be everything the AMD fans are claiming, but we will see.
  • SpyCrab - Tuesday, April 24, 2012

    Sure, Llano gives good gaming performance. But it's pretty much at Athlon II X4 CPU performance.
