GPU Performance

With a modest increase in EU hardware (20 EUs, up from 16), the Intel HD 4400 GPU in the Core i7-4500U I'm testing today isn't tremendously faster than the HD 4000 in the i7-3517U. On average I measured a 15% increase in the subset of game tests I was able to run in Taipei, and a 13% increase in performance across our 3DMark tests. The peak theoretical increase in performance we should see here (taking into account EU and frequency differences) is 19%, so it doesn't look like Haswell is memory bandwidth limited just yet.
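The ~19% peak theoretical figure falls out of a simple EUs-times-clock calculation. A minimal sketch follows; the EU counts are from the text above, while the max GPU clocks (1150 MHz for the HD 4000 in the i7-3517U, 1100 MHz for the HD 4400 in the i7-4500U) are assumed Intel spec values, not something measured in this review.

```python
# Peak theoretical GPU throughput scales roughly with EU count x max clock.
# EU counts come from the article; the clock values are assumed spec maxima.

def peak_scaling(eus_old, mhz_old, eus_new, mhz_new):
    """Ratio of peak theoretical throughput (EUs x clock)."""
    return (eus_new * mhz_new) / (eus_old * mhz_old)

# HD 4000 (i7-3517U): 16 EUs @ ~1150 MHz -> HD 4400 (i7-4500U): 20 EUs @ ~1100 MHz
gain = peak_scaling(eus_old=16, mhz_old=1150, eus_new=20, mhz_new=1100)
print(f"{(gain - 1) * 100:.1f}%")  # ~19.6%, in line with the ~19% cited
```

The extra EUs (+25%) are partly offset by the slightly lower assumed peak clock (-4.3%), which is why the theoretical ceiling lands near 19% rather than 25%.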

[Benchmark graphs: Bioshock Infinite (Value, Mainstream), Metro: Last Light (Value, Mainstream), Tomb Raider (Value, Mainstream), Futuremark 3DMark (2013), Futuremark 3DMark06]

If we throw the 35W Trinity (A10-4600M) into the mix, the HD 4400 narrows the gap but still falls well short of Trinity's performance:

GPU Performance Comparison

                 Metro: LL   Metro: LL     BioShock Inf.  BioShock Inf.  Tomb Raider  Tomb Raider
                 (Value)     (Mainstream)  (Value)        (Mainstream)   (Value)      (Mainstream)
Core i7-3517U    15.4 fps    6.0 fps       16.4 fps       7.0 fps        20.1 fps     10.2 fps
Core i7-4500U    14.5 fps    6.5 fps       17.4 fps       9.9 fps        24.6 fps     12.2 fps
A10-4600M        16.8 fps    8.0 fps       25.8 fps       10.0 fps       30.1 fps     12.7 fps

For light gaming, Intel’s HD 4000 was borderline reasonable. Intel’s HD 4400 takes half a step forward, but it doesn't dramatically change the playability of games that HD 4000 couldn't run well. Personally I’m very interested to see how the 28W Iris 5100 based Haswell ULT part fares later this year.



Comments

  • warezme - Monday, June 10, 2013 - link

    "the processor graphics story by finally delivering discrete GPU class gaming performance". I hate this summation being thrown around, as I'm sure it will get re-quoted somewhere as gospel. It is definitely NOT discrete GPU class gaming performance in any shape or form. There should be a threshold for what counts as discrete GPU performance: say 30-60 FPS at, at minimum, 1600x900, with settings set to Medium across the board for all games. That is not crazy or unreasonable for a true discrete GPU you would actually go out and buy. It shouldn't be unreasonable, then, to expect that from a built-in GPU that is sold as "discrete GPU" quality.
  • gnx - Monday, June 10, 2013 - link

    Kudos! You have to love AnandTech for providing such detailed analysis, so soon after Haswell was made public!

    But it does seem that Haswell for Ultrabooks isn't as revolutionary as Intel seemed to imply. Not that we have much of a choice, since ARM isn't an option and AMD doesn't provide much of an alternative, but I was personally hoping for more from Haswell.

    Maybe it'll change the equation for Windows tablets? Looking forward to more from AnandTech!
  • Kiijibari - Monday, June 10, 2013 - link

    Can you please add Wh numbers in the Battery Life Test graph (or at least normalize them, like in the previous tables)? It seems to me that you compare 2 different batteries there. Haswell is great, sure, but not THAT great ;-)

    Yes it is explained in the text below, but a picture not matching the numbers in the text is useless and misleading. A picture should be worth more than 1000 words and not demand reading 1000 words of explanation ;-)
  • broccauley - Tuesday, June 11, 2013 - link

    Does anyone know what the status of "activity alignment" for power optimisation is on the Linux kernel and how it compares with Windows 8? I assume such techniques were added when the changes from the Android branch were merged?
  • Henry 3 Dogg - Tuesday, June 11, 2013 - link

    "And today, we had to track down a pre-production Haswell Ultrabook in Taiwan to even be able to bring you this review of Haswell ULT."

    And today, a day later, you can pick up a production Haswell ULT based MacBook Air in your local Apple store.
  • lhurt - Tuesday, June 11, 2013 - link

    So are Platform Activity Manager (Windows) and Timer Coalescing (OS X) two different OS implementations of the same idea, to take advantage of Intel's Power Optimizer, and are Haswell CPUs required to get the benefit?
  • fteoath64 - Saturday, June 15, 2013 - link

    "Any hopes for pairing a meaningfully high performance discrete GPU with Haswell ULT are dead."
    This is Intel's method of closing off other discrete GPU solutions on their CPUs going forward. This is a predatory move, and premeditated! Just stop buying their chips, as this forces users down a proprietary path using Intel's inferior GPU technology. It is a selfish and disgusting move. Now ARM is going to cream them on the desktop side soon as well, and on the server side in time.
