Final Words

I’m a fan of Haswell, even on the desktop. The performance gains over Ivy Bridge depend on workload, but in general you’re looking at low single digits to just under 20%. We saw great behavior in many of our FP-heavy benchmarks as well as our Visual Studio compile test. If you’re upgrading from Sandy Bridge you can expect an average improvement just under 20%, while coming from an even older platform like Nehalem will yield closer to a 40% increase in performance at the same clocks. As always, annual upgrades are tough to justify, although Haswell may be able to make that case in mobile.

Even on the desktop, idle power reductions are apparent both at the CPU level and at the platform level. Intel focused on reducing CPU power, and it seems its motherboard partners did the same. Under load Haswell can draw more power than Ivy Bridge, but it typically makes up for it with better performance.

Overclockers may be disappointed that Haswell is really no better an overclocker (on air) than Ivy Bridge. Given the design’s more mobile-focused nature, and an increased focus on eliminating wasted power, I don’t know that we’ll ever see a return to the heyday of overclocking.

If the inability to easily extract tons of additional frequency headroom at a marginal increase in CPU voltage is the only real downside to the platform, then I’d consider Haswell a success on the desktop. You get more performance and a better platform at roughly the same prices as Ivy Bridge a year ago. It’s not enough to convince folks who bought a PC within the past year or two to upgrade again, but if you’re upgrading from even a three-year-old machine the performance gains will be significant.

Comments

  • smoohta - Saturday, June 1, 2013 - link

    Blah, seems like a rather shallow review:
    1. What about benchmarks to take advantage of the new AVX2 instructions? (FMA specifically would be interesting)
    2. Same for TSX?
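
For readers wondering what such benchmarks would exercise, here is a minimal FMA3 kernel sketch in C (a hypothetical illustration; the function name and build line are our assumptions, not code from the review or from any benchmark suite). Haswell's headline FP addition is the fused multiply-add, which computes a*x + y in one instruction with a single rounding step:

```c
/* Minimal FMA3 sketch. Build: gcc -O2 -mavx2 -mfma fma_demo.c */
#include <immintrin.h>
#include <stdio.h>

/* y[i] = a[i] * x[i] + y[i], 8 floats per iteration, each group
   handled by one fused multiply-add instruction. */
static void fma_kernel(const float *a, const float *x, float *y, int n)
{
    for (int i = 0; i + 8 <= n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vx = _mm256_loadu_ps(x + i);
        __m256 vy = _mm256_loadu_ps(y + i);
        vy = _mm256_fmadd_ps(va, vx, vy);   /* vy = va*vx + vy */
        _mm256_storeu_ps(y + i, vy);
    }
}

int main(void)
{
    float a[8], x[8], y[8];
    for (int i = 0; i < 8; i++) { a[i] = 1.5f; x[i] = (float)i; y[i] = 2.0f; }
    fma_kernel(a, x, y, 8);
    printf("y[3] = %.1f\n", y[3]);   /* expect 1.5*3 + 2.0 = 6.5 */
    return 0;
}
```

The RTM side of TSX amounts to wrapping a critical section in _xbegin()/_xend(), with a conventional lock as the fallback path. Again just a sketch, assuming a TSX-enabled part and GCC's -mrtm flag:

```c
/* Minimal RTM sketch. Build: gcc -O2 -mrtm tsx_demo.c -lpthread */
#include <immintrin.h>
#include <pthread.h>

static pthread_mutex_t fallback = PTHREAD_MUTEX_INITIALIZER;
static long counter;

void increment(void)
{
    unsigned status = _xbegin();
    if (status == _XBEGIN_STARTED) {
        counter++;                      /* runs transactionally */
        _xend();                        /* commit the transaction */
    } else {
        /* Transaction aborted: fall back to the real lock. A production
           version would also check CPUID for RTM support and read the
           lock's state inside the transaction to avoid racing a holder. */
        pthread_mutex_lock(&fallback);
        counter++;
        pthread_mutex_unlock(&fallback);
    }
}
```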
  • Klimax - Sunday, June 2, 2013 - link

    I only know of x264 having it in its latest versions. Not sure who else has it.
  • Gigaplex - Saturday, June 1, 2013 - link

    "Here I’m showing an 11.8% increase in power consumption, and in this particular test the Core i7-4770K is 13% faster than the i7-3770K. Power consumption goes up, but so does performance per watt."

    So... performance per watt increased by ~1%. For a completely new architecture that's supposedly all about power optimisation, that's extremely underwhelming to say the least.
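
Checking the arithmetic with the two figures quoted above (13% more performance, 11.8% more power):

\[
\frac{\text{perf/W}_{4770K}}{\text{perf/W}_{3770K}} = \frac{1.13}{1.118} \approx 1.011
\]

That is indeed roughly a 1% perf-per-watt gain in this particular test.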
  • Homeles - Saturday, June 1, 2013 - link

    Haswell is not focused on the desktop. I'm not sure how you managed to believe that it is.
  • krumme - Saturday, June 1, 2013 - link

    Because Anand is a fan of it, even at desktop?
  • MatthiasP - Saturday, June 1, 2013 - link

    So we get +10% performance increase for +10% increase in energy consumption? That's rather disappointing for a new generation.
  • jeffkibuule - Saturday, June 1, 2013 - link

    Haswell is moving voltage regulators that were already on the motherboard onto the die, so overall power consumption hasn't changed; it's just that the CPU cooling system has to deal with that extra heat now. Remember that those power ratings are NOT about how much power the chip uses, but how much cooling is needed.
  • Homeles - Saturday, June 1, 2013 - link

    System power consumption with Haswell is, in fact, higher. Take a look at page 2.

    Still, when you're running at these kinds of frequencies, 10% more performance for 10% more power is a big deal. If you were to hold performance gains to 0%, power savings would be greater than 25%.

    The only reason Piledriver was able to avoid this was because it was improving on something that was already so broken. AMD's not immune to the laws of physics -- when they catch up to Intel, they will hit the same wall.
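
A back-of-the-envelope way to see where a figure like that could come from, assuming the textbook model that dynamic power scales as P ∝ f·V² and that voltage must rise roughly in step with frequency near the top of the curve, giving P ∝ f³ (the model is an illustrative assumption, not a number from the review):

\[
\frac{P_{\text{flat}}}{P_{\text{boosted}}} \approx \left(\frac{1}{1.1}\right)^{3} \approx 0.75
\]

Under that model, giving back a ~10% frequency gain cuts power by roughly 25%, the same ballpark as the figure above.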
  • Klimax - Sunday, June 2, 2013 - link

    Most likely sooner, because they can't fine-tune the process.
  • dgz - Saturday, June 1, 2013 - link

    I agree but Intel has been doing that for many years. I just don't get what they're gaining by artificially restricting IOMMU support.
