Haswell isn't expected to launch until the beginning of June in desktops and quad-core notebooks, but Intel is beginning to talk performance. Intel used a mobile customer reference board in a desktop chassis featuring Haswell GT3 with embedded DRAM (the fastest Haswell GPU configuration that Intel will ship) and compared it to an ASUS UX15 with on-board NVIDIA GeForce GT 650M. 

Despite the chassis difference, Intel claims it will be able to deliver the same performance from the demo today in an identical UX15 chassis by the time Haswell ships.

The video below shows Dirt 3 running at 1080p on both systems, with identical detail settings (High Quality presets, no AA, vsync off). Intel wouldn't let us report performance numbers, but subjectively the two systems looked to deliver very similar performance. Note that I confirmed all settings myself and ran the game on both systems myself, independently of the demo. You can be the judge using the video below:

Intel wouldn't let us confirm clock speeds on Haswell vs. the Core i7 (Ivy Bridge) system, but it claimed that the Haswell part was the immediate successor to its Ivy Bridge comparison point. 

As proof of Haswell's ability to fit in a notebook chassis, Intel did have another demo using older Haswell silicon to run Call of Duty: Black Ops 2 in an actual notebook.

Haswell GT3e's performance looked great for processor graphics. I would assume that overall platform power would be reduced since you wouldn't have a discrete GPU inside; however, there's also the question of the cost of the solution. I do expect that NVIDIA will continue to drive discrete GPU performance up, but as a solution for some of the thinner, space-constrained form factors (think 13-inch MacBook Pro with Retina Display, maybe 11-inch Ultrabook/MacBook Air?), Haswell could be a revolutionary step forward.

Comments

  • AmdInside - Wednesday, January 9, 2013 - link

    Without data, the comparison is meaningless. The only thing this tells me is that Haswell can run a game that came out in 2007.
  • AmdInside - Wednesday, January 9, 2013 - link

    I mean 2011
  • CeriseCogburn - Thursday, January 10, 2013 - link

LMAO - if Intel made a 4-year mistake we'd never hear the end of it.
  • CeriseCogburn - Sunday, January 13, 2013 - link

    Wait a minute there, buddy.

    What it means to me is I can hardly wait to have one.

    ( I'll actually wind up with a lesser sleeper model, and so will others I push it on, who will be grateful no doubt. )

    It means Intel is a power to be reckoned with that AMD is going to have a hard time going up against. Yeah it means that a LOT.
  • tipoo - Wednesday, January 9, 2013 - link

    I wonder how much the GT3 without eDRAM will improve over the HD4000? How big a factor is it in the performance jump? And the lower end SKUs?

    I've often wondered why graphics cards didn't have a bit of eDRAM to help with certain operations, as it seems to allow consoles to get away with much slower main memory (see the Wii U).
  • Spunjji - Wednesday, January 9, 2013 - link

    With PCs it has traditionally not been die-area-efficient to do that when the screen resolution target is unknown. If you don't have enough eDRAM it will be of little benefit at high resolutions; too much of it burns die space and costs more (a rough framebuffer-size sketch after the comments illustrates the trade-off). Better to have a larger quantity of general-purpose RAM with a high-bandwidth bus, high clocks, and whatever size caches work best with your estimated texture workload.

    In this instance, however, Intel finally have die area to burn and a use case (low-end, bandwidth-constrained graphics) where it may actually make some sense.
  • wsw1982 - Thursday, January 10, 2013 - link

    I think the eDRAM is on-package, not on-die.
  • Spunjji - Friday, January 11, 2013 - link

    Ahhh, I may well have been mistaken about that bit. I know it's on-package with the Wii and 360, but I assumed Intel were going on-die with this.

    Comments about the expense of manufacturing still apply, but definitely less so if that's the case.
  • CeriseCogburn - Friday, January 11, 2013 - link

    I think the hater just spews whatever his gourd dreams up on the spot - as that last attempt at bashing he spewed proves again.
  • Spunjji - Monday, January 14, 2013 - link

    I explained why it makes technical sense for Intel to do this now and not before. Define what part of that was "hating" and/or incorrect.
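
As a rough illustration of the buffer-sizing trade-off Spunjji describes above, here is a back-of-envelope sketch of how much memory the main render targets alone occupy at a few resolutions. The assumptions are mine and purely illustrative: one 32-bit color target plus one 32-bit depth/stencil target per pixel, with an optional MSAA multiplier. They are not a description of how Dirt 3 or Haswell's GPU actually manage memory, and Intel had not disclosed the capacity of GT3e's embedded DRAM at the time of writing, so no specific figure is assumed.

```python
# Back-of-envelope framebuffer sizing. Illustrative assumptions only:
# one 32-bit color target and one 32-bit depth/stencil target per pixel,
# optionally multiplied by an MSAA sample count.

def framebuffer_mb(width, height, bytes_per_pixel=4, targets=2, msaa=1):
    """Approximate memory (in MB) consumed by the main render targets."""
    return width * height * bytes_per_pixel * targets * msaa / (1024 ** 2)

resolutions = {
    "720p":                (1280, 720),
    "1080p (Dirt 3 demo)": (1920, 1080),
    "1440p":               (2560, 1440),
    "13-inch Retina":      (2560, 1600),
}

for name, (w, h) in resolutions.items():
    no_aa = framebuffer_mb(w, h)
    msaa4 = framebuffer_mb(w, h, msaa=4)
    print(f"{name:22s} ~{no_aa:5.1f} MB, ~{msaa4:6.1f} MB with 4x MSAA")
```

The takeaway is simply that a buffer sized comfortably for 1080p with no AA gets tight quickly once resolution or anti-aliasing goes up, while sizing for the worst case spends die area (or package cost) that most buyers never use; that is why a fixed slab of embedded DRAM is easier to justify for a bandwidth-constrained integrated GPU with a known target than it was for discrete cards aimed at arbitrary resolutions.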
