Final Words

Based on these early numbers, Ivy Bridge is pretty much right where we expected it on the CPU side: a 5 - 15% increase in CPU performance over Sandy Bridge at a similar price point. I have to say I'm pretty impressed by the gains we've seen here today. It's quite difficult to extract tangible IPC improvements from a modern architecture, particularly on such a strict, nearly annual schedule. For a tick in Intel's cadence, Ivy Bridge is quite good. It feels a lot like Penryn did after Conroe, but better.
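To put the tick-tock math in perspective, comparing score-per-clock rather than raw scores is how you isolate an architectural (IPC) gain from a frequency bump. A minimal sketch, using made-up benchmark scores and clocks rather than figures from this review:

```python
def ipc(score, clock_ghz):
    """Benchmark score per GHz -- a crude stand-in for IPC."""
    return score / clock_ghz

# Hypothetical scores at each part's stock clock; dividing out the
# frequency leaves (roughly) the architectural improvement.
snb_ipc = ipc(100.0, 3.4)  # e.g. a Sandy Bridge part at 3.4GHz
ivb_ipc = ipc(108.0, 3.5)  # e.g. its Ivy Bridge successor at 3.5GHz
gain = (ivb_ipc / snb_ipc - 1.0) * 100.0
print(f"IPC gain: {gain:.1f}%")  # here, ~4.9% despite an 8% score bump
```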

The improvement on the GPU side is significant, although not nearly the jump we saw going to Sandy Bridge last year. Ivy Bridge's GPU finally puts Intel's processor graphics into the realm of reasonable for a system with low-end GPU needs. Based on what we've seen, discrete GPUs below the $50 - $60 mark don't make sense if you've got Intel's HD 4000 inside your system. The discrete market above $100 remains fairly safe, however.

With Ivy Bridge you aren't limited to playing older titles, although you are still limited to relatively low quality settings in newer games. If you're willing to trade off display resolution you can reach a much better balance: we are finally able to deliver acceptable performance at or above 1366 x 768. With the exception of Metro 2033, the games we tested exceeded 30 fps even at 1680 x 1050. The fact that we were able to run Crysis: Warhead at 1680 x 1050 at over 50 fps on free graphics from Intel is sort of insane when you think about where Intel was just a few years ago.
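As a rough way to reason about that resolution trade-off: in a GPU-limited title, frame rate scales roughly inversely with pixel count, so dropping from 1680 x 1050 to 1366 x 768 buys back a large chunk of performance. A back-of-the-envelope sketch (the 50 fps input is illustrative, not a measured result from our testing):

```python
def estimated_fps(measured_fps, from_res, to_res):
    """Estimate fps at a new resolution, assuming performance scales
    inversely with pixel count (i.e. a purely GPU-limited workload --
    real games won't follow this model exactly)."""
    from_pixels = from_res[0] * from_res[1]
    to_pixels = to_res[0] * to_res[1]
    return measured_fps * from_pixels / to_pixels

# A title running at 50 fps at 1680x1050 would land near 84 fps at
# 1366x768 under this (optimistic) model:
print(round(estimated_fps(50, (1680, 1050), (1366, 768)), 1))
```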

Whether or not this is enough for mainstream gaming really depends on your definition of that segment of the market. Being able to play brand new titles at reasonable frame rates at realistic resolutions is a bar that Intel has safely met. I hate to sound like a broken record, but Ivy Bridge continues Intel's march in the right direction when it comes to GPU performance. Personally, I want more and I suspect that Haswell will deliver much of that. It is worth pointing out that Intel is progressing at a faster rate than the discrete GPU industry at this point. Admittedly the gap is downright huge, but from what I've heard even the significant gains we're seeing here with Ivy will pale in comparison to what Haswell provides.

What Ivy Bridge does not appear to do is catch up to AMD's A8-series Llano APU. It narrows the gap, but for systems whose primary purpose is gaming, AMD will still likely hold a significant advantage with Trinity. The fact that Ivy Bridge hasn't progressed enough to challenge AMD on the GPU side is good news. The last thing we need is for a single company to dominate on both fronts. At least today we still have some degree of competition in the market. To Intel's credit, however, it's just as unlikely that AMD will surpass Intel in CPU performance this next round with Trinity. Both sides are getting more competitive, but it still boils down to what matters more to you: GPU or CPU performance. Similarly, there's also the question of which one (CPU or GPU) approaches "good enough" first. I suspect the answer is going to continue to vary wildly depending on the end user.

The power savings from 22nm are pretty good on the desktop. Under heavy CPU load we measured a ~30W decrease in total system power consumption compared to a similar Sandy Bridge part. If this is an indication of what we can expect from notebooks based on Ivy Bridge, I'd say you shouldn't expect significant gains in battery life under light workloads, but you may see improvement in worst-case battery life. For example, in our Mac battery life suite we pegged the Sandy Bridge MacBook Pro at around 2.5 hours of battery life in our heavy multitasking scenario. That's the number I'd expect to see improve with Ivy Bridge. We only had a short amount of time with the system and couldn't validate Intel's claims of significant gains in GPU power efficiency, but we'll hopefully be able to do that closer to launch.
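Battery life is simply pack capacity divided by average draw, which is why a fixed savings in CPU power is a bigger percentage win under heavy load (where the CPU dominates the draw) than at idle. A minimal sketch with hypothetical notebook numbers, not measurements from this review:

```python
def battery_life_hours(capacity_wh, avg_draw_w):
    """Runtime in hours for a battery of capacity_wh drained at avg_draw_w."""
    return capacity_wh / avg_draw_w

# Hypothetical numbers: a 77.5Wh pack drained at ~31W lasts 2.5 hours.
# Shaving just 4W off the average heavy-load draw is a ~15% gain:
before = battery_life_hours(77.5, 31.0)  # 2.5 h
after = battery_life_hours(77.5, 27.0)   # ~2.87 h
print(f"{before:.2f} h -> {after:.2f} h")
```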

There's still more to learn about Ivy Bridge, including how it performs as a notebook chip. If the results today are any indication, it should be a good showing all around. Lower power consumption and better performance at the same price as last year's parts - it's the Moore's Law way. There's not enough of an improvement to make existing SNB owners want to upgrade, but if you're still clinging to an old Core 2 (or earlier) system, Ivy will be a great step forward.

195 Comments

  • mrSmigs - Wednesday, March 7, 2012 - link

    The Ivy Bridge 3770K is a direct replacement for the Sandy Bridge 2700K, which is only a small upgrade from the 2600K, yet it's still missing from the benchmarks that would allow a direct architectural comparison.

    Intel badly needs PowerVR in its graphics core. Will they finally use a multicore Rogue Series 6 core in the next generation (Haswell?) for some decent performance in their IGP? Imagination developed easily the fastest graphics core in the ARM SoC tablets/phones inside the iPad 2/iPhone 4S; now it's time to save Intel (one of ImgTec's biggest shareholders, along with Apple). Intel needs to ditch this old, weak IGP architecture and get with the times.

    AMD's Llano, even with its terribly weak CPU core, still clearly outpaces this new, improved Intel HD 4000 in these non-GPU-limited tests. If AMD had a faster CPU they would be even further ahead in graphics capability, which appears CPU-limited in many cases too (see the discrete GPU tables to get an idea of Intel's CPU advantage).

    Where are the in-game checks on Intel's notoriously poor image quality, much like when Radeons are compared to GeForces, to ensure these parts are producing an acceptable image for the performance they give and not cutting corners?

    Happy with the lower power consumption and CPU performance gains of Ivy Bridge. Disappointed in the weak old graphics once again, which fail to match Llano even with a far stronger CPU dragging them along...
  • hasseb64 - Wednesday, March 7, 2012 - link

    How about OpenGL support?
  • numberoneoppa - Wednesday, March 7, 2012 - link

    Perhaps because not everybody who needs a lot of CPU power also needs to game or do other GPU-heavy activities.

    Come on, mate. Think.
  • Conficio - Wednesday, March 7, 2012 - link

    "You guys asked for it and finally I have something I feel is a good software build test."


    I just wanted to say thank you for this. Maybe we can add a Maven-based Java test as well, which should give some idea of javac performance (or a large Eclipse build-all).
  • Conficio - Wednesday, March 7, 2012 - link

    Uhh, this comment renders funny on Chrome.
  • piesquared - Wednesday, March 7, 2012 - link

    Is this some kind of joke? It may be comical, but it sure ain't funny. Intel themselves had slides circulating showing at least a 2x performance increase over the last generation. Now they show up with not even half that and Anand falls to his knees in praise. Seems a little fishy to me; where have I seen this before? Right, the primary elections in the US! Same thing: the elite give the mainstream media their marching orders, and the mainstream media sets out to brainwash the population with that message. And you continue to lead the charge on downplaying image quality and functionality, ever since you became Intel's mouthpiece. Where are the days of proper image quality comparisons and feature benefits to consumers? That's all dropped off the radar because Intel has abysmal and atrocious graphics capability and know-how. They're the WORST in the industry, and yet here we have good ol' Anand patting his buddy on the bum, ensuring that Intel will never have a need to actually compete. They can just hand off money and have the perception manipulated.
  • Hector2 - Wednesday, March 7, 2012 - link

    Sounds like you have some issues. Maybe you should see a therapist.
  • awg0681 - Wednesday, March 7, 2012 - link

    Maybe I misread the article or read a different one, but it came across to me that Anand was mainly comparing the HD 4000 to the HD 3000, in which case there is generally a notable increase in performance. It's not 2x the HD 3000, but a quick search for the slides you mention came up with nothing; the only one I found was on Tom's, a leaked slide comparing the HD 2000 to the HD 4000. If you could link some of those that would be great. Also, in just about every case where the HD 4000 was (almost inevitably) beaten by AMD in graphics performance, the article pointed it out.
  • geddarkstorm - Wednesday, March 7, 2012 - link

    I wonder how much of the improvement in the performance-to-power ratio is due to the tri-gate technology. In some ways I was expecting a bigger jump, around 20%, but since they also dropped power by ~30W, that says a lot. Looking at this from the perf/power perspective makes it a bigger deal than a 5-15% CPU gain sounds.

    Still, for some reason I feel a little disappointed. I thought tri-gate would change things even more in conjunction with the 22nm process.

    So I can't wait to see what Haswell will do.
  • Exodite - Wednesday, March 7, 2012 - link

    Does it matter though?

    After all that argument cuts both ways.

    Any iGPU today is good enough for 2D use, browsing, and mainstream gaming - which means stuff like The Sims 3 rather than Crysis.

    The same is true for CPU power.

    Heck, most users would be perfectly happy with using their smartphones as a desktop.
