Final Words

Reviewing a tick in Intel's cadence is always difficult. After Conroe, if we didn't see a 40% jump in a generation we were disappointed, and honestly, after Sandy Bridge I expected much the same here. Luckily for Intel, Ivy Bridge is quite possibly the strongest tick it has ever put forth.

Ivy Bridge is unique in that it gives us the mild CPU bump but combines it with a very significant increase in GPU performance. The latter may not matter to many desktop users, but in the mobile space it's quite significant. Ultimately that's what gives Ivy Bridge its appeal. If you're already on Intel's latest and greatest, you won't appreciate Ivy as an upgrade, but you may appreciate it for the role it plays in the industry: as the first 22nm CPU from Intel and as a bridge to Haswell. If you missed last year's upgrade, it'll be Ivy's performance and lower TDP that win you over instead.

Intel has done its best to make this tick more interesting than most. Ivy Bridge is being used as the introduction vehicle for Intel's 22nm process. In return you get a cooler running CPU than Sandy Bridge (on the order of 20-30W less under load), but you do give up a couple hundred MHz on the overclocking side. While I had no issues getting my 3770K up to 4.6GHz on the stock cooler, Sandy Bridge will likely be the better overclocker for most.

With Ivy Bridge and its 7-series chipset we finally get USB 3.0 support. In another month or so we'll also get Thunderbolt support (although you'll have to hold off on buying a 7-series motherboard until then if you want it). This platform is turning out to be everything Sandy Bridge should have been.

Ivy's GPU performance is, once again, a step in the right direction. While Sandy Bridge could play modern games at the absolute lowest quality settings, at low resolutions, Ivy lets us play at medium quality settings in most games. You're still going to be limited to 1366 x 768 in many situations, but things will look significantly better.

The sub-$80 GPU market continues to be in danger as we're finally able to get not-horrible graphics with nearly every Intel CPU sold. Intel still has a long way to go, however. The GPUs we're comparing to are lackluster at best. While it's admirable that Intel has pulled itself out of the graphics rut it was stuck in for the past decade, more progress is needed. Ivy's die size alone tells us that Intel could have given us more this generation, and I'm disappointed that we didn't get it. At some point Intel is going to have to be more aggressive in spending silicon real estate if it really wants to be taken seriously as a GPU company.

Similarly disappointing for everyone who isn't Intel: it's been more than a year since Sandy Bridge's launch, and none of the GPU vendors has put forth a better solution than Quick Sync. If you're constantly transcoding movies to get them onto your smartphone or tablet, you need Ivy Bridge. In less than 7 minutes, and with no impact on CPU usage, I was able to transcode a complete 130-minute 1080p video to an iPad-friendly format: over 15x real time.
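As a quick sanity check on that speedup claim, here's the arithmetic, assuming the full 7-minute encode time quoted above (the actual run finished in slightly less):

```python
# Quick Sync speedup sanity check: a 130-minute 1080p source
# transcoded in roughly 7 minutes. Figures are taken from the
# review text; the encode time is rounded up to 7 minutes.
source_minutes = 130
encode_minutes = 7

speedup = source_minutes / encode_minutes
print(f"~{speedup:.1f}x real time")  # ~18.6x, comfortably over 15x
```

Even with the encode time rounded up, the result lands well above the 15x figure, so the claim is conservative.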

While it's not enough to tempt existing Sandy Bridge owners, if you missed the upgrade last year then Ivy Bridge is solid ground to walk on. It's still the best performing client x86 architecture on the planet and a little to a lot better than its predecessor depending on how much you use the on-die GPU.

Additional Reading

Intel's Ivy Bridge Architecture Exposed
Mobile Ivy Bridge Review
Undervolting & Overclocking on Ivy Bridge
Intel's Ivy Bridge: An HTPC Perspective
Quick Sync Image Quality & Performance
Comments
  • cjb110 - Tuesday, April 24, 2012 - link

    Considering that on most games both the Xbox and PS3 tend to be sub-720p, the iGPU in Ivy Bridge is impressive. Has anybody compared the three?
  • tipoo - Tuesday, April 24, 2012 - link

    You have a unified shader GPU with similar performance to the x1900 series but more flexible, and something like a GeForce 7800 with some parts of the 7600, like lower ROPs and memory bandwidth. Seven years later, if even an IGP didn't beat those, it would be pretty sad. Those were ~200 GFLOP cards; today's top end is over 3000, and a lower-mid range chip like this I would expect to be in the upper hundreds.
  • gammaray - Tuesday, April 24, 2012 - link

    Why did Intel and AMD even start building IGPs in the first place?

    Why can't they just put a video card in every desktop and laptop?

    And if they continue making IGPs, what's their goal?

    Do they eventually want to get rid of video card makers?
  • versesuvius - Tuesday, April 24, 2012 - link

    Better yet, why not just put the graphics on a chip like the CPU? That way the "upgrade path" is a lot clearer, not to mention "possible". It will also offer the possibility of having those chips in different flavors, for example good video transcoder or good gamer. There is room for that on the motherboard now that the north bridge is gone. Or they can review the IBM boards from 286 days and learn from their clean, and very efficient design.

    Unfortunately the financial model of the IT industry from a collective viewpoint entails throwing a lot of good hardware away just for a small advantage. Just the way many will throw away their HD3000 IGPs without having ever used it. The comparison is cruel but that should not be what distinguishes them from the toilet paper industry.

    As of late, Anand has taken to reminding us that technology has taken leaps ahead of our wishes and that we need time to absorb it. That is not the case. No wish is ever materialized. We only have to take whatever is offered and marvel at the only parameter that can be measured: speed. Less energy consumption is fine, but I suppose that comes with the territory (i.e. can Intel or AMD produce 22 nm chips that consume the same watts as 65 nm chips with the same number of transistors?).
  • tipoo - Wednesday, April 25, 2012 - link

    Cost, size, power draw. All reduced by putting everything on one chip. I'm not sure if AMD wants to get rid of discrete graphics cards considering that's their one profitable division, but Intel sure does :)
  • klmccaughey - Tuesday, April 24, 2012 - link

    I've tried every setup possible and have never gotten Quick Sync to work at all. It won't even work with discrete graphics enabled and my monitor hooked up to the Intel GPU on my Z68 board.

    I have tried MediaCoder (error 14), Media Converter 7, and MediaEspresso. I have downloaded the Media SDK, and I have tried the new FFmthingy from the Intel engineer. Nothing, nada. It has never ever worked. AMD's media converter will convert a few limited formats that went out of fashion 5 years ago (of all my 1.5TB of video, the only thing it would touch was old episodes of Becker).

    All in all I have gotten nothing whatsoever from any video-accelerated encoding, and I have always had to go back to my tried and trusted HandBrake.

    I don't think it actually works. I've never heard of anyone getting it working, and the forums on MediaCoder are full of people who have given up.
  • klmccaughey - Tuesday, April 24, 2012 - link

    *disabled (for discrete graphics)
  • JarredWalton - Tuesday, April 24, 2012 - link

    What input format are you using? I've only tried it on laptops, and I've done MOV input from my Nikon D3100 camera with no issues whatsoever. I've also done a WMV input file (the sample Wildlife.WMV file from Windows 7) and it didn't have any trouble. If you're trying to do a larger video, that might be an issue, or it might just be a problem with the codec used on the original video.
  • dealcorn - Tuesday, April 24, 2012 - link

    Intel has big eyes for the workstation graphics market and has had some success at the bottom of this market. Will IB's IGD advances enhance Intel's access to this market?
  • chizow - Tuesday, April 24, 2012 - link

    Hope you're not counting yourself, you've proven long ago your opinion isn't worth paying attention to.
