The times, they are changing. In fact, the times have already changed; we're just waiting for the results. I remember the first time Intel brought me into a hotel room to show me its answer to AMD's Athlon 64 FX: the Pentium 4 Extreme Edition. Back then the desktop race was hotly contested, and pushing the absolute limits of what could be done, without a concern for power consumption, was the name of the game. In the mid-2000s, the notebook started to take over. Just as Apple famously announced that it was no longer a manufacturer of personal computers but a manufacturer of mobile devices, Intel came to a similar realization years earlier, when these slides were first shown at IDF in 2005:


[Slides: IDF 2005]

Intel is preparing for another major transition, similar to the one it brought to light seven years ago. The move will once again be motivated by mobility, and the transition will be away from the giant CPUs that currently power high-end desktops and notebooks to lower power, more integrated SoCs that find their way into tablets and smartphones. Intel won't leave the high-end market behind, but the trend towards mobility didn't stop with notebooks.

The fact of the matter is that everything Charlie has said on the big H is correct. Haswell will be a significant step forward in graphics performance over Ivy Bridge, and will likely mark Intel's biggest generational leap in GPU technology to date. Internally, Haswell is viewed as the solution to the ARM problem: build a chip that delivers extremely low idle power, to the point where you can't tell the difference between an ARM tablet in standby and one with a Haswell inside, while delivering the performance we've come to expect from Intel. Haswell is the future, and Ivy Bridge is the bridge that takes us there.

In our Ivy Bridge preview I applauded Intel for executing so well over the past few years. By limiting major architectural shifts to known process technologies, and by keeping designs simple when transitioning to a new manufacturing process, Intel took what was once a five-year design cycle for microprocessor architectures and condensed it into two. Sure, the changes every two years were simpler in nature than what we used to see every five, but as with most things in life, smaller but more frequent progress often works better than putting big changes off for a long time.

It's Intel's tick-tock philosophy that kept it from having a Bulldozer, and it's the lack of such structure that left AMD in the situation it is in today (on the CPU side at least). Ironically, what we saw happen between AMD and Intel over the past ten years is really the same mistake being made by both companies, just at different times. Intel's complacency and lack of an aggressive execution model gave AMD the opportunity to outshine it in the late K7/K8 days. AMD's similar lack of an execution model, and its own executive complacency, allowed the tides to turn once more.

Ivy Bridge is a tick+, as we've already established. Intel took a design risk and went for greater performance, all while making the most significant process transition in its history: 22nm, its first with tri-gate transistors. The end result is a reasonable increase in CPU performance (for a tick), a big step forward in GPU performance, and a decrease in power consumption.

Today is the day that Ivy Bridge gets official. Its name truly embodies its purpose. While Sandy Bridge was a bridge to a new architecture, Ivy connects a different set of things. It's a bridge to 22nm, warming the seat before Haswell arrives. It's a bridge to a new world of notebooks that are significantly thinner and more power efficient than what we have today. It's a means to the next chapter in the evolution of the PC.

Let's get to it.

Additional Reading

Intel's Ivy Bridge Architecture Exposed
Mobile Ivy Bridge Review
Undervolting & Overclocking on Ivy Bridge

Intel's Ivy Bridge: An HTPC Perspective

Comments

  • JarredWalton - Tuesday, April 24, 2012 - link

    I don't think it's a mystery. It's straight fact: "One problem Intel does currently struggle with is game developers specifically targeting Intel graphics and treating the GPU as a lower class citizen."

    It IS a problem, and it's one INTEL has to deal with. They need more advocates with game developers, they need to make better drivers, and they need to make faster hardware. We know exactly why this has happened: Intel IGPs failed to run so many games for so long that a lot of developers gave up and just blacklisted Intel. Now Intel is actually capable of running most games, and as long as it isn't explicitly blacklisted, things should be okay.

    In truth, the only recent title I can think of where Intel graphics could theoretically work but were blacklisted by the developer is Fallout 3. Even today, if you want to run FO3 on an Intel IGP (HD 2000/3000/4000), you need to download a hacked DLL that identifies your Intel GPU as an NVIDIA GT 9800 or something.

    And really, there's no need for game developers to blacklist hardware, because you can't predict the future. FO3 is the perfect example: it runs okay on HD 3000 and plenty fast on HD 4000, but the shortsighted developers locked out Intel for all time. It's better to pop up a warning like some games do: "Warning: we don't recognize your driver and the game may not run properly." Blacklisting is almost more of a political statement IMO.
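
    For what it's worth, these blacklists typically key off the PCI vendor ID the driver reports (0x8086 for Intel, 0x10DE for NVIDIA), which is exactly why a DLL that lies about the ID gets the game running again. A minimal sketch of that kind of gate; the function and the blacklist contents are hypothetical, not Fallout 3's actual code:

    ```python
    # Hypothetical sketch of a vendor-ID gate like the one described above.
    # Games typically get this ID from the driver (e.g. via DXGI on Windows);
    # the constants below are the real PCI-SIG vendor ID assignments.
    INTEL, NVIDIA, AMD = 0x8086, 0x10DE, 0x1002

    BLACKLISTED_VENDORS = {INTEL}  # bans every Intel GPU, present and future

    def gpu_allowed(reported_vendor_id: int) -> bool:
        """Return False when the reported GPU vendor is blacklisted."""
        return reported_vendor_id not in BLACKLISTED_VENDORS

    # A spoofing DLL just changes the reported ID, so an HD 4000 claiming
    # to be an NVIDIA part sails straight through the check:
    print(gpu_allowed(INTEL))   # False: the real ID is rejected
    print(gpu_allowed(NVIDIA))  # True: the spoofed ID passes
    ```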
  • craziplaya21 - Monday, April 23, 2012 - link

    I might be blind or something, but did you guys not do an image quality comparison between the original Blu-ray and the 1080p QuickSync encode?
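
    For anyone wanting to run that comparison themselves, ffmpeg's SSIM filter will score an encode against its source objectively; a sketch (assumes a reasonably recent ffmpeg on the PATH, and both file names are placeholders):

    ```python
    # Sketch: score a QuickSync encode against the Blu-ray source with
    # ffmpeg's SSIM filter. Both file names are placeholders.
    import subprocess

    subprocess.run([
        "ffmpeg",
        "-i", "quicksync_encode.mp4",  # distorted stream: the transcode
        "-i", "bluray_source.mkv",     # reference stream: the original
        "-lavfi", "ssim",              # per-frame and overall SSIM, logged to stderr
        "-f", "null", "-",             # decode and compare only; write no output
    ], check=True)
    # An overall SSIM of 1.0 means identical; values near it suggest the
    # encode is visually very close to the source.
    ```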
  • toyotabedzrock - Monday, April 23, 2012 - link

    Why is Intel disabling this on the K parts? And why disable vPro?
  • jwcalla - Monday, April 23, 2012 - link

    First, a diversion: "I was able to transcode a complete 130 minute 1080p video to an iPad friendly format..." Just kill me. Somebody please. Why do consumers put up with this crap? Even my ancient Galaxy S has better media playback support.

    It's the same story with my HP TouchPad: MP4 container or GTFO. Who can stand to re-encode their media libraries or has the patience to deal with DLNA slingers when the hardware is perfectly capable of curb-stomping any container / codec you could even conceive? Just get an Android tablet if this is the crap they force on you. Or, in the TouchPad case, wipe it and install ICS.

    As for the article... did I totally misunderstand the page about power consumption? I got the impression that idle power is relatively unchanged. I must be misreading that. Or maybe the lower-end chips will show a stark improvement. Otherwise I totally miss the point of IVB.

    I'm beginning to lose confidence in Intel, at least in terms of innovation. These tick-tock improvements are basically minor pushes in the same boring direction. From an enthusiast's perspective, the stuff going into ARM SoCs is so much more interesting. Intel makes great high-end CPUs, but these seem to be becoming less important when looking at the consumer market as a whole.
  • Anand Lal Shimpi - Monday, April 23, 2012 - link

    Idle power didn't really go down because at idle nearly everything is power gated to begin with. Any improvements in leakage current don't help if the transistors aren't leaking to begin with :)
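
    To put it in toy-model terms (every number below is invented for illustration):

    ```python
    # Toy model of why halving leakage barely moves idle power when nearly
    # the whole chip is already power gated. All figures are made up.
    def idle_leakage_w(chip_leakage_w: float, gated_fraction: float) -> float:
        """Gated blocks leak ~nothing; only the ungated slice contributes."""
        return chip_leakage_w * (1.0 - gated_fraction)

    # Say 22nm cut leakage in half vs. 32nm, with 95% of the die gated at idle:
    before = idle_leakage_w(10.0, 0.95)  # 0.50 W
    after  = idle_leakage_w(5.0, 0.95)   # 0.25 W
    print(before, after)  # a quarter-watt delta: invisible at the wall
    ```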

    Your ARM sentiments are spot on for a huge portion of the market however. Let's see what Haswell brings...

    Take care,
    Anand
  • thomas-hrb - Monday, April 23, 2012 - link

    I disagree with the testing methodology for the World of Warcraft test. Firstly, no gamer buys hardware so they can go to the most isolated areas in a game. Also, the percentage of people who can pay for one of these CPUs and who would be playing at 1680x1050 is pretty small.

    I've been playing WoW for a number of years, and I don't care about 60+ fps because my monitor won't display it anyway. I care about minimum fps and average fps. NVIDIA's new adaptive vsync is a great innovation, but I'm sure there are other tests that, while not as controlled and repeatable, are much more indicative of real-world performance (the actual reason behind purchasing decisions).

    One possible testing methodology you could look into is taking a character into one of the top-end 25-man raids. There are 10 classes in WoW, and my experience is that a 25-man raid will show off every possible spell, ability, and effect the game has to offer in fairly repeatable patterns.

    I agree that it's not the most scientific approach, but I put more stock in a friend saying "go buy this CPU/GPU, you can do all the raids and video capture with no lag" than in you telling me that this CPU will give me 100+ fps in the middle of nowhere. There is a fine line between efficient and effective. I'm just hoping you can dial down the efficiency and come up with a testing methodology that produces a metric I can actually use in my purchasing decisions. After all, that's one of the core reasons most people read reviews at all.
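
    To be concrete about the metric I mean, something as simple as this, computed from raw per-frame render times, would do (the numbers below are dummy data, not a real capture):

    ```python
    # Sketch: average, minimum, and ~1st-percentile FPS from per-frame
    # render times in milliseconds. Dummy data, not a real capture.
    def fps_stats(frame_times_ms):
        fps = sorted(1000.0 / t for t in frame_times_ms)  # ascending: worst first
        n = len(fps)
        return {
            "avg_fps": sum(fps) / n,
            "min_fps": fps[0],
            "p1_fps": fps[int(0.01 * n)],  # just above the worst 1% of frames
        }

    # A run that looks smooth on average but hitches like a 25-man raid:
    times = [16.7] * 97 + [50.0, 80.0, 120.0]
    print(fps_stats(times))  # avg ~58 fps looks great; min/p1 expose the stutter
    ```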
  • redisnidma - Monday, April 23, 2012 - link

    Expect Anand's Trinity review to be heavily biased with lots of AMD bashing.
    This site is so predictable...
  • Nfarce - Monday, April 23, 2012 - link

    Oh boy. Another delusional red label fangirl. Maybe when AMD gets their s**t together Anandtech will have something positive to review alongside the current Intel offerings. Bulldozer bulldozed right off a cliff. And don't get me wrong: I WANT AMD to whip out some butt-kicking CPUs to keep the competition strong. But right now Intel isn't getting complacent; it keeps stepping its game up even though the competition isn't on the same playing court. That's just for now, though. If AMD continues to falter, Intel may not be as motivated to stay ahead and spend so much on R&D in the future. After all, why put the latest F1 car on the track when the competition can only bring a NASCAR car?
  • Reikon - Monday, April 23, 2012 - link

    Temperature is in the overclocking article.

    http://www.anandtech.com/show/5763/undervolting-an...
  • rickthestik - Monday, April 23, 2012 - link

    An upgrade makes sense for me, as my current CPU is an Intel Core 2 Quad and the new i7-3770K will be a pretty significant upgrade... 2.34GHz to 3.5GHz, plus heaps of additional tech to go with it.
    I could see a fair number of Sandy Bridge owners holding off for Haswell, but for me this jump is pretty big, and I'm looking forward to seeing what the i7-3770K can do with a Z77 motherboard and a shiny new PCIe 3.0 GPU.
