GPU Performance

With a modest increase in EU hardware (20 EUs, up from 16), the Intel HD 4400 GPU in the Core i7-4500U I’m testing today isn’t tremendously faster than the HD 4000 in the i7-3517U. On average I measured a 15% increase in the subset of game tests I was able to run in Taipei, and a 13% increase in performance across our 3DMark tests. The peak theoretical increase in performance we should see here (taking into account EU and frequency differences) is 19%, so it doesn’t look like Haswell is memory bandwidth limited just yet.
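The 19% theoretical figure can be sanity-checked from EU count times peak clock. The EU counts are from the article; the maximum GPU clocks (1150 MHz for the HD 4000 in the i7-3517U, 1100 MHz for the HD 4400 in the i7-4500U) are assumptions based on Intel's published specs, so treat this as a sketch rather than the author's exact arithmetic:

```python
# Peak theoretical GPU throughput scales roughly with EUs x max clock.
hd4000_eus, hd4000_mhz = 16, 1150  # HD 4000 in Core i7-3517U (assumed clock)
hd4400_eus, hd4400_mhz = 20, 1100  # HD 4400 in Core i7-4500U (assumed clock)

uplift = (hd4400_eus * hd4400_mhz) / (hd4000_eus * hd4000_mhz) - 1
print(f"Peak theoretical uplift: {uplift:.1%}")  # ~19.6%, matching the ~19% cited
```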

[Benchmark charts: Bioshock Infinite (Value, Mainstream); Metro: Last Light (Value, Mainstream); Tomb Raider (Value, Mainstream); Futuremark 3DMark (2013), three charts; Futuremark 3DMark06]

Throwing AMD's 35W Trinity into the mix, the HD 4400 narrows the gap, but it still falls well short of Trinity's performance:

GPU Performance Comparison
                  Metro: LL   Metro: LL      BioShock Inf.  BioShock Inf.   Tomb Raider  Tomb Raider
                  (Value)     (Mainstream)   (Value)        (Mainstream)    (Value)      (Mainstream)
Core i7-3517U     15.4 fps    6.0 fps        16.4 fps       7.0 fps         20.1 fps     10.2 fps
Core i7-4500U     14.5 fps    6.5 fps        17.4 fps       9.9 fps         24.6 fps     12.2 fps
A10-4600M         16.8 fps    8.0 fps        25.8 fps       10.0 fps        30.1 fps     12.7 fps
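As a quick check on the ~15% average gain quoted earlier, a small sketch that averages the per-test gains from the table above (the per-test grouping is taken directly from the table; the averaging method is my assumption):

```python
# Frame rates from the comparison table above (fps), in table order.
hd4000 = [15.4, 6.0, 16.4, 7.0, 20.1, 10.2]  # Core i7-3517U
hd4400 = [14.5, 6.5, 17.4, 9.9, 24.6, 12.2]  # Core i7-4500U

# Per-test percentage gain of HD 4400 over HD 4000, then the simple mean.
gains = [(new / old - 1) * 100 for old, new in zip(hd4000, hd4400)]
avg_gain = sum(gains) / len(gains)
print(f"Average HD 4400 gain over HD 4000: {avg_gain:.1f}%")  # ~15.3%
```

The mean lands right around the 15% figure measured in the game tests, even though one test (Metro Value) actually regresses slightly.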

For light gaming, Intel’s HD 4000 was borderline reasonable. Intel’s HD 4400 takes half a step forward, but it doesn't dramatically change the playability of games that HD 4000 couldn't run well. Personally I’m very interested to see how the 28W Iris 5100 based Haswell ULT part fares later this year.

87 Comments
  • FwFred - Monday, June 10, 2013 - link

    Single package instead of two, integrated VRs instead of discrete. Perhaps this allows a smaller mainboard and therefore a bigger battery?
  • piroroadkill - Monday, June 10, 2013 - link

    Not impressed.
    Yeah, the idle time battery life is better, but that GPU is super-lousy. In my opinion, Intel have done themselves a massive disservice by making crappy GPUs available with Haswell. The choice should be only 5100 and 5200. The others are a total waste of time, and barely interesting over HD 4000.
  • nunomoreira10 - Monday, June 10, 2013 - link

    There is not a single 5100 17W SKU, and the reason is power.
    Intel is going the NVIDIA and AMD road: choices. This is the budget i7; want more, pay more.
  • mikebelle - Monday, June 10, 2013 - link

    I still think he has a point, though. While some consumers may prefer the battery life and/or cost savings, Intel seems to have made it very difficult to get access to any of their 5000 series graphics. I wouldn't be surprised to see Iris and Iris Pro come to a few Core i5 parts during Haswell's "refresh".
  • samkathungu - Monday, June 10, 2013 - link

    Is it just me, or are the releases coming from Intel about all the flavours of Haswell getting a little confusing? A better communications strategy next time would benefit consumers.
    The confusion over which graphics ships with the desktop or mobile parts is not pleasant.
  • vipw - Monday, June 10, 2013 - link

    Maybe I'm bad at counting, but it still looks like there are two chips on the package.
  • sheh - Monday, June 10, 2013 - link

    When are the i5 43xxM and 42xxM going to be available?
  • darthrevan13 - Monday, June 10, 2013 - link

    So no more PCIe 2.0? Will Thunderbolt be available for ULT/ULX processors? You could in theory connect a dGPU through that, right?
  • Sugardaddy - Monday, June 10, 2013 - link

    On page 2, you state that "any hopes for pairing a meaningfully high performance discrete GPU with Haswell ULT are dead."

    But there are a lot of Ultrabooks coming out, like the Aspire S3-392 with a discrete GT 735M, which is probably 50%-100% faster than the 620M in last year's Asus UX32VD.

    How does that fit together? Is the 735M not "meaningfully faster" than HD4400/HD5000?
    Thanks!
  • extide - Monday, June 10, 2013 - link

    I was hoping Charlie would be wrong. Sadly, he was right: Intel took away PCIe 3.0 and all CPU-based PCIe lanes from this CPU. This is how they kill off AMD/NVIDIA competition: make it literally not an option. Scary as hell; I hope they don't start doing this to higher-TDP SKUs.
