Haswell isn't expected to launch until the beginning of June in desktops and quad-core notebooks, but Intel is beginning to talk performance. Intel used a mobile customer reference board in a desktop chassis featuring Haswell GT3 with embedded DRAM (the fastest Haswell GPU configuration that Intel will ship) and compared it to an ASUS UX15 with on-board NVIDIA GeForce GT 650M. 

Despite the chassis difference, Intel claims it will be able to deliver the same performance from the demo today in an identical UX15 chassis by the time Haswell ships.

The video below shows Dirt 3 running at 1080p on both systems, with identical detail settings (High Quality presets, no AA, vsync off). Intel wouldn't let us report performance numbers, but subjectively the two looked to deliver very similar performance. Note that I confirmed all settings and ran both games myself, independently of the demo. You can be the judge using the video below:
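For readers who want to go beyond eyeballing the video and quantify this kind of head-to-head on their own hardware, here's a minimal sketch (my illustration, not part of Intel's demo). It assumes a hypothetical frame-time log per system, one frame time in milliseconds per line, of the sort a capture tool like FRAPS can dump; the file names below are placeholders:

    # Minimal sketch: summarizing two hypothetical frame-time logs for comparison.
    # Assumes each log holds one frame time in milliseconds per line; file names are made up.
    import statistics

    def summarize(path):
        with open(path) as f:
            frame_times_ms = [float(line) for line in f if line.strip()]
        avg_fps = 1000.0 / statistics.mean(frame_times_ms)  # average frame rate
        p99_ms = sorted(frame_times_ms)[int(0.99 * (len(frame_times_ms) - 1))]  # 99th-percentile frame time
        return avg_fps, p99_ms

    for label, log in [("Haswell GT3e", "gt3e_dirt3.txt"),
                       ("GeForce GT 650M", "gt650m_dirt3.txt")]:
        avg_fps, p99_ms = summarize(log)
        print(f"{label}: {avg_fps:.1f} avg FPS, {p99_ms:.1f} ms 99th-percentile frame time")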

Intel wouldn't let us confirm clock speeds on Haswell vs. the Core i7 (Ivy Bridge) system, but it claimed that the Haswell part was the immediate successor to its Ivy Bridge comparison point. 

As proof of Haswell's ability to fit in a notebook chassis, Intel did have another demo using older Haswell silicon running Call of Duty: Black Ops 2 in an actual notebook.

Haswell GT3e's performance looked great for processor graphics. I would assume that overall platform power would be reduced since you wouldn't have a discrete GPU inside; however, there's also the question of the cost of the solution. I do expect that NVIDIA will continue to drive discrete GPU performance up, but as a solution for some of the thinner/space-constrained form factors (think 13-inch MacBook Pro with Retina Display, maybe 11-inch Ultrabook/MacBook Air?), Haswell could be a revolutionary step forward.

Comments

  • Hector2 - Thursday, January 10, 2013 - link

    This is what we've been doing for the last 40 years: integration, integration, integration. SSI chips on the motherboard were combined into a "chipset". The external L2 cache was integrated into the CPU, then the chipset and memory controller were integrated into the CPU, and now the graphics are being combined into the "CPU". The silicon black hole sucking up all the transistors around it into a single piece of silicon continues.

    After 22nm, then 14nm, then 10nm, then 7nm.
  • torp - Friday, January 11, 2013 - link

    I have a feeling that those of us running Linux will still need to get an NVidia card, because they have the only full-featured drivers that run with no problems...
  • Medallish - Friday, January 11, 2013 - link

    Last I checked, AMD's binary drivers on Linux weren't that bad; they aren't perfect, but the main reason AMD is criticized on Linux is its open driver support, which, as bad as it is, is currently even worse with nVidia (Linus Torvalds' reason for his stunt a while back). Hoping this might change; Intel actually seems to have the best support when it comes to open-source drivers.
  • torp - Saturday, January 12, 2013 - link

    Not interested in open source/closed source, just in drivers that run my 3D games in Wine. If you take a look at the application DB on winehq.org, most people complaining of artifacts, missing details, etc. run on AMD video cards, no matter the drivers. Nvidia has far fewer problems, while Intel usually doesn't even get a mention :)
  • CeriseCogburn - Monday, January 14, 2013 - link

    It's an amd fanboy, so real world facts do not matter.
    Save the world open source everything for free forever matters, so long as evil profit companies lose in that process.

    Dummy doesn't realize Torvalds did his birdie stunt because nVidia won't hand him the goods, not because nVidia drivers "suck".

    amd fanboys are stupid, biased, liars, incorrect, etc.

    Whatever they dream up - who cares
  • Hrel - Friday, January 11, 2013 - link

    I'm not going to be able to recommend a dedicated GPU to very many people starting in August. If Intel starts focusing on, and delivering, reliable drivers for the iGPU, then that number drops even lower. Honestly, one really good, reliable, stable driver release per year would do it.
  • CeriseCogburn - Friday, January 11, 2013 - link

    There ya go - there's a dose of coming reality. It's already occurring, and amd is left in the dust because their cpu sucks and their drivers suck.

    This spells the doom of amd, as the alternate recommendations will be Intel cpu and OPTIMUS / nVidia.

    That's why the amd fanboys are going nutso.

    Good luck AMD, and you need more than just that: a reversal of the brain drain and a huge bailout from some oil sheiks. Better name a few more upcoming products Abu Dhabi or the like....
  • Medallish - Friday, January 11, 2013 - link

    Please explain in detail what's wrong with AMD's CPUs and gfx drivers. My AMD systems work just fine; they're stable and they perform very well. Windows 8 has been a bigger cause of instability on my laptop, and even that hasn't been that often.

    I have an HTPC/fileserver using a Llano APU; it runs 24/7 and so far it hasn't crashed at all.

    What's a mystery to me is how you're not banned? This could have been a nice discussion about the implications of Intel improving their GPU's this much, instead it's a fanboy feces-fest, and you're the only one throwing.
  • nicolbolas - Sunday, January 13, 2013 - link

    but soon became engulfed by his desire to overwhelm anyone who liked AMD or made non-super-positive comments about Intel.

    Anyways, I doubt this will mean much for anyone unless Intel puts GT3 in more parts than I think it will (I think only i7s), and/or pairs the eDRAM with all GT3 parts, maybe even GT2 (once more, I think only SOME i7s).

    I think that means only enthusiasts or people buying from boutiques are likely to get a GT3 (+ eDRAM).

    I think this narrows it down to people who just spend a lot of money on the computer the OEM recommends.
  • dj christian - Monday, January 14, 2013 - link

    Don't listen to him! Judging by the way he writes, he's just an immature teenager.
