Final Words

Based on these early numbers, Ivy Bridge is pretty much right where we expected it on the CPU side. You're looking at a 5 - 15% increase in CPU performance over Sandy Bridge at a similar price point. I have to say that I'm pretty impressed by the gains we've seen here today. It's quite difficult to get tangible IPC improvements out of a modern architecture, particularly on such a strict, nearly annual schedule. For a tick in Intel's cadence, Ivy Bridge is quite good. It feels a lot like Penryn did after Conroe, but better.

The improvement on the GPU side is significant, although not nearly the jump we saw going to Sandy Bridge last year. Ivy's GPU finally puts Intel's processor graphics into the realm of reasonable for a system with low-end GPU needs. Based on what we've seen, discrete GPUs below the $50 - $60 mark don't make sense if you've got Intel's HD 4000 inside your system. The discrete market above $100 remains fairly safe, however.

With Ivy Bridge you aren't limited to playing older titles, although you are still limited to relatively low quality settings on newer games. If you're willing to trade off display resolution you can reach a much better balance. We are finally able to deliver acceptable performance at or above 1366 x 768. With the exception of Metro 2033, the games we tested delivered greater than 30 fps even at 1680 x 1050. The fact that we were able to run Crysis: Warhead at 1680 x 1050 at over 50 fps on free graphics from Intel is sort of insane when you think about where Intel was just a few years ago.

Whether or not this is enough for mainstream gaming really depends on your definition of that segment of the market. Being able to play brand new titles at reasonable frame rates at realistic resolutions is a bar that Intel has safely met. I hate to sound like a broken record, but Ivy Bridge continues Intel's march in the right direction when it comes to GPU performance. Personally, I want more, and I suspect that Haswell will deliver much of that. It is worth pointing out that Intel is progressing at a faster rate than the discrete GPU industry at this point. Admittedly the gap is downright huge, but from what I've heard even the significant gains we're seeing here with Ivy will pale in comparison to what Haswell provides.

What Ivy Bridge does not appear to do is catch up to AMD's A8-series Llano APU. It narrows the gap, but for systems whose primary purpose is gaming AMD will still likely hold a significant advantage with Trinity. The fact that Ivy Bridge hasn't progressed enough to challenge AMD on the GPU side is good news. The last thing we need is for a single company to dominate on both fronts. At least today we still have some degree of competition in the market. To Intel's credit, however, it's just as unlikely that AMD will surpass Intel in CPU performance this next round with Trinity. Both sides are getting more competitive, but it still boils down to what matters more to you: GPU or CPU performance. Similarly, there's also the question of which one (CPU or GPU) approaches "good enough" first. I suspect the answer to this is going to continue to vary wildly depending on the end user.

The power savings from 22nm are pretty good on the desktop. Under heavy CPU load we measured a ~30W decrease in total system power consumption compared to a similar Sandy Bridge part. If this is an indication of what we can expect from notebooks based on Ivy Bridge I'd say you shouldn't expect significant gains in battery life under light workloads, but you may see improvement in worst case scenario battery life. For example, in our Mac battery life suite we pegged the Sandy Bridge MacBook Pro at around 2.5 hours of battery life in our heavy multitasking scenario. That's the number I'd expect to see improve with Ivy Bridge. We only had a short amount of time with the system and couldn't validate Intel's claims of significant gains in GPU power efficiency but we'll hopefully be able to do that closer to launch.

There's still more to learn about Ivy Bridge, including how it performs as a notebook chip. If the results today are any indication, it should be a good showing all around. Lower power consumption and better performance at the same price as last year's parts - it's the Moore's Law way. There's not enough of an improvement to make existing SNB owners want to upgrade, but if you're still clinging to an old Core 2 (or earlier) system, Ivy will be a great step forward.

Comments

  • krumme - Wednesday, March 7, 2012 - link

    Well the dilemma for Anand is apparent. If he stops writing those previews that are nice to Intel, someone else will get the opportunity and all the info. He can write two bad previews and the info and early chips will just stop coming. Intel and Anand both have a business to run, and there is a reason Intel gives Anand the chips (indirectly).

    He has a "deal" with Intel, the same way we have a deal with Anand when we read the review. We get the info - bent/biased - and then we can think for ourselves. I think it's a fair deal :) - we get a lot of good info from this preview. The uninformed get fleeced, but it's always like that. Someone has to pay for the show.
  • chemist1 - Wednesday, March 7, 2012 - link

    The MacBook Pro, for instance, has a discrete GPU, yet can switch to the chip-based GPU to save power when on battery. So having a better chip-based GPU makes sense in this context.
  • Sabresiberian - Wednesday, March 7, 2012 - link

    I'd like to see the discrete graphics card industry make the kind of progress, relatively speaking, Intel has made in the last 2 years.

    Ivy Bridge is a ways from competing with a high-end discrete solution, but if the relative rates of progress don't change, Intel will catch up soon.
  • sixtyfivedays - Wednesday, March 7, 2012 - link

    I use the iGPU on my build for my second monitor and it is quite nice.

    I can watch HD videos on it and it doesn't take away from my dedicated GPU at all.
  • mlkmade - Thursday, March 8, 2012 - link

    Is that even possible? Special hack or software?

    When you install a discrete graphics card, the integrated GPU gets disabled.

    Would love to know how you accomplished this. Is it a desktop or laptop?
  • mathew7 - Thursday, March 8, 2012 - link

    "When you install a discrete graphics card, the integrated gpu gets disabled."

    It was exclusive in northbridge-IGP units (Core 2 Duo/Quad and older). With Core i-series, it's disabled by default but can be enabled through the BIOS (unless you have a P5x/6x chipset, of course).
  • AnnonymousCoward - Wednesday, March 7, 2012 - link

    1. How much faster is Ivy Bridge at single thread versus my Conroe@3GHz?
    2. How much faster is my GTX560Ti than HD4000?
  • dr/owned - Thursday, March 8, 2012 - link

    1) Your 65 nm CPU would get the shit blown out of it by IB at the same clock speed in single threaded applications. Assuming 15% improvements in each of the tick-tocks since Conroe, a 1.8 GHz IB would probably be about the same as a 3 GHz Conroe.
    2) Discrete graphics vs. integrated graphics. Intel isn't trying to compete here so it's a stupid comparison.
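The compounding assumption in that estimate is easy to sanity-check. This is a back-of-the-envelope sketch, not a measurement: the 15%-per-generation figure and the generation count are the commenter's assumptions.

```python
# Sanity check of the "1.8 GHz IB ~= 3 GHz Conroe" claim, assuming
# ~15% per-generation IPC gain (the commenter's figure, not a measured one).
per_gen_gain = 1.15
generations = 4  # rough count of architecture steps from Conroe to Ivy Bridge

ipc_ratio = per_gen_gain ** generations  # cumulative IPC advantage of IB
equivalent_clock = 3.0 / ipc_ratio       # IB clock matching a 3 GHz Conroe

print(f"Cumulative IPC ratio: {ipc_ratio:.2f}x")                      # 1.75x
print(f"IB clock equivalent to 3 GHz Conroe: {equivalent_clock:.2f} GHz")  # 1.72 GHz
```

Compounded, that lands at roughly 1.7 GHz, close to the comment's ~1.8 GHz figure.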
  • AnnonymousCoward - Friday, March 9, 2012 - link

    1. Your "get the shit blown out" is worthless. All I'm looking for is a number, and your effective answer is +67%.

    2. It's not a stupid comparison, because:
    a) I'm interested.
    b) HD4000 is designed for games.
    c) They benchmarked with modern games.
    d) Games are designed around people's performance.
  • AnnonymousCoward - Friday, March 9, 2012 - link

    1. Another website shows the i7 3770K scored 2643 on the Fritz Chess Benchmark with 1 processor. My machine does 2093. That's only 26% different.

    2. I very roughly estimate the GTX560Ti might be 5-6x faster than the HD4000.

    It'd be useful to see a real comparison of these though.
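The 26% figure in point 1 depends on which direction you compute the ratio. A quick sketch using the two benchmark scores quoted above:

```python
# Fritz Chess single-thread scores quoted in the comment above.
ivy_bridge = 2643  # i7 3770K, 1 thread (figure cited from another site)
conroe = 2093      # the commenter's Conroe @ 3 GHz

speedup = ivy_bridge / conroe - 1  # IB's advantage over Conroe
deficit = 1 - conroe / ivy_bridge  # Conroe's shortfall versus IB

print(f"Ivy Bridge is {speedup:.0%} faster")  # 26% faster
print(f"Conroe is {deficit:.0%} slower")      # 21% slower
```

Either way, the single-thread gap is far smaller than the earlier +67% guess suggested.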
