Yesterday we reported that Intel ran a video of a DX11 title, instead of running the actual game on a live Ivy Bridge notebook, during Mooly Eden's press conference. After the press conference, Intel let me know that the demo was a late addition to the presentation and that they didn't have time to set it up live, but insisted that it worked. I believed Intel (I spent a lot of time with Ivy Bridge at the end of last year and saw no reason to believe that DX11 functionality was broken), but I needed definitive proof that there wasn't some issue Intel was deliberately trying to hide. Intel promptly invited me to check out the demo live on an Ivy Bridge notebook, which I just finished doing.

The notebook isn't the exact same unit used yesterday (Mooly apparently has that one in meetings today), but I confirmed that it was running a single GPU that reported itself as Intel's HD Graphics 4000 (Ivy Bridge graphics):
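
If you want to run the same sanity check on your own hardware, the adapter a system reports can be pulled programmatically as well as from a tool like GPU-Z. Below is a minimal sketch (my own illustration, assuming a Windows/DirectX SDK toolchain, not whatever tool was used here) that enumerates the display adapters through DXGI and prints their description strings; on a single-GPU notebook like this one, the only entry should read something like "Intel(R) HD Graphics 4000":

```cpp
// Hedged sketch: list every display adapter the OS reports via DXGI.
// A single-GPU Ivy Bridge notebook should show exactly one entry here.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory *factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void **)&factory)))
        return 1;

    IDXGIAdapter *adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        // The description string is where the marketing name shows up.
        wprintf(L"Adapter %u: %ls\n", i, desc.Description);
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```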

The system was powered by an IVB engineering sample running at 2.5GHz:

And below is a video of F1 2011 running live on the system in DirectX 11 mode:

Case closed.

Comments

  • Sabresiberian - Friday, January 13, 2012 - link

    DX11 is more efficient than DX10; if the on-die GPU has the code to run DX11, it should provide a frame rate increase in the same scenario (with DX11 capable content). I've experienced this with World of Warcraft myself, after the Cataclysm upgrade.

    The caveat to what I just said is that most DX11-capable games available today don't take full advantage of all of its capabilities. A game that does may not see a frame rate increase. Generally speaking, though, any ultrabook that could run DX10 content will be able to run that content better, all else being equal, with a DX11-capable GPU inside it.

    I have no trouble believing that Intel (or any other serious CPU/GPU manufacturer) could make their chip DX11 capable. There is no reason for Intel to fake anything. They already have proven DX10-capable chips with Sandy Bridge; the leap to DX11 just isn't that big a deal, comparatively.

    ;)
  • drmo - Tuesday, January 10, 2012 - link

    The link says the graphics on the ULV parts for ultrabooks run at almost half the base clock (350MHz vs. 650MHz), though they turbo to nearly the same speed. So they might not have wanted to show a slideshow in the live demo. Or, like they said, maybe they didn't have time to install a game, yet they had plenty of time to install the game, take a video of it, and put said video on another computer.

    Regardless, "Will it work in other DX11 titles?" is the real question.
  • Alexvrb - Friday, January 13, 2012 - link

    Always show your best hand. Most people will see the "decent" performance of the highest-clocked, highest-thermal-ceiling parts, and assume that anything with HD 4000 is identical in all cases.

    But hey, for a gaming laptop you can always get a dedicated Nvidia or AMD GPU. Ultrabooks? Not so much... kind of defeats the purpose.
  • Vesku - Tuesday, January 10, 2012 - link

    Why the racing wheel on stage, though? Why not just say "this is a capture of IVB graphics running a DX11 game"? It seems to me they wanted to give the impression that the ultrabook on stage was running it live.
  • Vesku - Tuesday, January 10, 2012 - link

    He's walking away from his own presentation and using a jovial tone to say it's happening backstage in "realtime". Not sure how anyone can take that comment of his seriously without additional proof. IMO, until proven otherwise, he made a joking statement to break up the tension/frustration. I definitely don't see it as a serious explanation; he wouldn't be walking away if it was.

    Main thing I learned from this fib is that Intel is extremely determined to sell its ULV chips.
  • Donnie Darko - Tuesday, January 10, 2012 - link

    The following points are omitted from Anand's horrible cover-up attempt.
    1. Different hardware that is apparently 25% overclocked (a 2.0GHz CPU running at 2.5GHz).
    2. No report on GPU frequencies or shader units/cores/whatever Intel calls them. These are probably seeing the same 25% overclock too.
    3. No report on power consumption (i.e. at a 25% overclock, being generous, this 'laptop' is probably pulling around 45W for the CPU/GPU vs. the 17W available for the Ultrabook chip).
    4. While the game is in 'DX11 mode' there's no indication that it's not running mostly a DX9 code path (see the feature-level sketch after this comment).
    Intel chose a 'DX11' title with a very old base engine that still has DX9 render paths. Since Intel's previous efforts never had their DX10 drivers working with the full instruction set, I'm highly skeptical that DX11 (or even DX10) is really being implemented here. Maybe a couple of easy features are enabled, but the game is almost certainly running most of the code on the DX9 render pipeline.
    5. No indication of game resolution or graphics settings.

    If they had a good case for DX11 gaming they would have done the demo live with a real DX10/DX11 engine (think BF3's Frostbite 2 engine with no DX9 render path). They didn't because they couldn't. There's no time issue when you underpromise and over-deliver. People wait their whole lives for that moment, and they would have spent the entire presentation showing an ultrabook crush the competition (AMD and NVIDIA) in gaming if they could have. We just need AMD to show a 'Trinity' demo that was using an Intel CPU to feed 'representative' discrete AMD graphics, and all three will have had their 'woodscrew' moment and we can get on with our lives.

    So yes, case closed indeed, sir. Too bad the case is that Intel doesn't have competitive graphics. Also, while I appreciate that you are an ex-Intel employee, and still have very strong ties to the company, your treatment of their graphics has gone from humorous and informative to sad and pathetic. It's OK to say that they aren't functional for gaming. You don't even have to say they suck monkey nuts (which they do). They actually have lots of interesting and positive things going for them (which you point out ad nauseam). Just stick to the good, and ignore the bad if you must. Don't try to rationalize the bad into good; it's depressing to watch.
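
    For what it's worth, point 4 above is at least partially testable. The sketch below is a hedged illustration (not F1 2011's actual code): a Direct3D 11 application asks the runtime for the highest feature level the driver exposes, and a driver that only offers feature level 10_x or 9_x would be caught by a check like this. It doesn't reveal which render path the game chose internally, but it does separate "the driver exposes full feature level 11_0" from "DX11 in name only".

    ```cpp
    // Hedged sketch: query the highest Direct3D 11 feature level the driver
    // exposes. Illustrative only; it shows what the driver claims to support,
    // not which render path a particular game actually uses.
    #include <d3d11.h>
    #include <cstdio>
    #pragma comment(lib, "d3d11.lib")

    int main()
    {
        const D3D_FEATURE_LEVEL requested[] = {
            D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0,
            D3D_FEATURE_LEVEL_9_3,  D3D_FEATURE_LEVEL_9_1
        };
        D3D_FEATURE_LEVEL achieved = D3D_FEATURE_LEVEL_9_1;
        ID3D11Device *device = nullptr;
        ID3D11DeviceContext *context = nullptr;

        // Ask for the best level first; the runtime hands back the highest
        // one the hardware/driver combination actually supports.
        HRESULT hr = D3D11CreateDevice(
            nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
            requested, (UINT)(sizeof(requested) / sizeof(requested[0])),
            D3D11_SDK_VERSION, &device, &achieved, &context);

        if (SUCCEEDED(hr))
        {
            // D3D_FEATURE_LEVEL_11_0 is 0xb000; anything lower means the
            // driver is not exposing the full DX11 feature set.
            printf("Highest feature level exposed: 0x%04x\n", (unsigned)achieved);
            context->Release();
            device->Release();
        }
        return 0;
    }
    ```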
  • Death666Angel - Tuesday, January 10, 2012 - link

    I don't think they attempted to prove anything (clock frequencies, awesome fps, high-res textures), except to say "DX11 works on IVB". The demo doesn't go on to claim how powerful it will be or anything else that you mention.
  • Donnie Darko - Tuesday, January 10, 2012 - link

    Very specifically, they said "DX11 Games will be playable on an Ultrabook using IVB."
    They didn't say, "we can get a DX11 game to load."

    There's a huge difference between saying something has feature X but it can't really be used, and saying something is a viable platform to do X with.

    So yes, they specifically promised that Ultrabook IVB is powerful enough to play DX11 games. This is what they are now attempting to cover with FUD.

    If they had just said, "look at us demo a DX11 game on IVB. We recorded it as a video to save time," then they would have had no problem. I would have believed them and left it at that. I want Intel to be able to do these things. I don't want to have to explain to my friends and family that the thing they just bought cannot do what it is advertised to do. I really don't want to try to explain to them that I cannot fix it either with my magic computer powers, no matter how 'good at computers' I appear to be to them. People buying an IVB ultraportable are going to be horribly fucked over because of Intel's lying, and people like Anand going to bat for them.
  • nategator - Tuesday, January 10, 2012 - link

    Technically you are spreading FUD.

    Intel is just trying, at worst, to over-embellish. If you're unable to separate the meat from the puffery, I recommend delegating your purchase decision-making to a parent/wife/family member.

    All we do know is that IVB can run one old DX11 game at what looks like 720p. That's it.

    All the rest means jack until Anandtech and others get a hold of some IVB hardware and test the shit out of them.

    To be continued...
  • MrSpadge - Tuesday, January 10, 2012 - link

    Re 1: could it be Turbo?
    Re 2 & 3: There's no law requiring either Intel or AMD to disclose the full specs of pre-production hardware.

    And you know SNB has no problem hitting 3.5GHz in 35W TDP packages? So, sure, the more advanced part must draw 45W at 2.5GHz...

    +1 @ Death666Angel
