Intel's Graphics Performance Disadvantage

It is no secret that Intel's integrated graphics are very slow and nearly useless for most modern 3D applications. So when Intel says we should see its integrated graphics parts increase in performance 10x by 2010, we should get excited, right? That is much faster than Moore's Law would suggest in terms of performance over time: we should expect only a little more than 2x performance over a two-year period, not an order of magnitude.
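To put that gap in perspective, here is a minimal sketch (illustrative arithmetic only, using the growth rates quoted above rather than any measured data) comparing the per-year scaling factor implied by Intel's "10x by 2010" claim against a Moore's-Law-style doubling every two years:

```cpp
// Illustrative arithmetic only: compares the per-year growth factor implied
// by "10x over two years" against a Moore's-Law-style 2x over two years.
#include <cmath>
#include <cstdio>

int main() {
    const double years = 2.0;
    const double intel_per_year = std::pow(10.0, 1.0 / years); // ~3.16x per year
    const double moore_per_year = std::pow(2.0, 1.0 / years);  // ~1.41x per year
    std::printf("Annual factor implied by 10x/2yr claim: %.2fx\n", intel_per_year);
    std::printf("Annual factor from 2x/2yr (Moore's Law): %.2fx\n", moore_per_year);
    return 0;
}
```

In other words, Intel is promising to improve at more than twice the compounded annual rate the rest of the industry typically manages.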

The problem with this situation, as noted by NVIDIA, is that today's Intel integrated graphics parts are already more than 10x slower than today's affordable discrete graphics parts.

Looking at that chart, we can see that Intel's integrated graphics will only be competitive with today's sub-$100 hardware in 3DMark06 by the year 2010. In four-year-old games, NVIDIA's current hardware will still blow away Intel's 2010 integrated solution, and the margin climbs even higher in a modern game. NVIDIA claims that these tests were run at low quality settings as well, but we can't speak to that as we weren't the ones running the tests. If that's the case, the differences could be even larger.

The bottom line is that even if Intel could deliver that 10x performance improvement today, it still wouldn't be within reach of current graphics hardware. In two years, after NVIDIA, AMD, and even S3 have again at least doubled performance over what we currently have, there's no way Intel can hope to keep up.

And NVIDIA wasn't pulling any punches. Jen-sun went so far as to say: "I wouldn't mind it if sometimes they just say thank you – that it's possible to enjoy a game on an Intel microprocessor [because of NVIDIA graphics hardware]." This is certainly an audacious statement of the facts, even if it happens to be the truth. AMD can take some of the credit there as well, of course, but the point is that Intel couldn't make it as a gaming platform today without the help of its competitors. Believe me when I say that we are trying to put a neutral spin on all this, but Jen-sun was out for blood today, and he absolutely hit his mark.

To prove the point further, NVIDIA turned to the data Valve has been collecting through Steam. This data must be taken with the proper dose of salt – it doesn't reflect the entire GPU market by any means. The Steam data reflects a specific subset of GPU users: gamers. Even more to the point, it reflects a specific subset of gamers: those who play games through Steam and chose to report their system information (anonymously, of course). Yes, the sample size is large, but this is by no means a random sample, and it thus loses some points in the statistical accuracy department.
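As a toy illustration of why a huge sample can't rescue a biased one, here is a minimal sketch (every share below is invented purely for illustration; none are real market figures) in which surveying a million gamers still says nothing about the overall market's GPU mix:

```cpp
// Toy selection-bias demo (hypothetical numbers): a huge sample drawn only
// from gamers converges on the *gamer* GPU mix, not the overall market mix.
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    const double intel_share_gamers = 0.02; // assumed share among gamers
    const double intel_share_market = 0.40; // assumed share in the whole market
    std::bernoulli_distribution has_intel(intel_share_gamers);

    const int sample_size = 1000000; // large, like the Steam survey
    int intel_count = 0;
    for (int i = 0; i < sample_size; ++i)
        if (has_intel(rng)) ++intel_count;

    // The estimate is very precise (~2%) but describes only gamers.
    std::printf("Estimated Intel share from gamer sample: %.3f\n",
                intel_count / static_cast<double>(sample_size));
    std::printf("Assumed Intel share of overall market:   %.2f\n",
                intel_share_market);
    return 0;
}
```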

Steam data indicates that NVIDIA GPUs sit in 60% of Steam gamers' boxes, while Intel GPUs appear in only 2% of those surveyed. As Jen-sun pointed out, this isn't about juggling a few percentage points: gamers who use Steam clearly do not use Intel GPUs to play their games. The picture gets even grimmer for Intel when you look at DX10-specific data: NVIDIA powers about 87% of the GPUs in DX10 boxes while Intel powers only 0.11% (of which Jen-sun said, "I think that's just an error," and he may well have been right). He was on target when he said that in this case "approximately zero seems statistically significant."

The implication is clear: while few gamers use Intel GPUs in the first place, gamers aren't using Intel GPUs for DX10 gaming at all. We certainly buy that, as Intel doesn't even have a DX10 part on the market right now, but again, this data is not a statistically accurate representation of the entire gaming population. Taken in proper context, as part of a broader look at the industry, Steam data is certainly not a bad indicator.

Further refuting the idea that Intel can displace NVIDIA, Jen-sun addressed Ron Fosner's assertion that multicore processors can handle graphics better than a dedicated graphics card ever could. This is where NVIDIA gets into a little counter-FUD action of its own: Jen-sun showed that adding CPU cores does nothing for gaming or graphics benchmark performance today. To be fair, Intel was certainly referring to the ability of CPUs to handle graphics code written specifically for multicore CPUs, which current games are not. But for the time being, when you compare adding cores to your system against adding a more powerful GPU, NVIDIA claims up to 27x more bang for your buck.

Currently Intel has some fairly decent lab demos, and there have been murmurs of a software renderer renaissance (I'd love to see John Carmack and Tim Sweeney duke it out one more time in software; maybe that's just me), but there just isn't anything in production that even tries to show what a CPU can or can't do in direct competition with a GPU on graphics quality and performance. And there's a reason for that: it still isn't practical to develop such software. Maybe when everyone is running 8-core, 16-thread CPUs we'll see something interesting. But right now, and even over the next few years, rasterization is going to be the way to go, and pure FLOPS with massive parallelism will win every time over Intel's programmability and relatively light parallelism.

Which brings us to a point NVIDIA made later in the day: GPUs are already multicore to the point where NVIDIA likes to refer to them as manycore (as we saw Intel do with its 100+ core concepts a few years back when it was first starting to push parallelism). It's a stretch for me to think of the 128 SPs in a G80 or G92 GPU as "cores" because they aren't really fully independent, but for the type of data-parallel work GPUs normally tackle, the effect is very similar. Certainly, no matter how you slice it, GPUs are much wider hardware than any current multicore CPU.
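For a concrete sense of what "wide but not fully independent" means, here is a minimal CUDA sketch (a hypothetical kernel of our own devising, not anything NVIDIA showed): every thread executes the same program on a different data element, which is exactly the data-parallel pattern those 128 SPs are built for.

```cpp
// Minimal CUDA sketch: thousands of threads run the *same* instructions over
// different data elements. The SPs aren't independent cores chasing separate
// programs; they march through one program in lockstep groups across the data.
#include <cuda_runtime.h>

__global__ void scale(float* data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x; // each thread owns one element
    if (i < n)
        data[i] *= factor; // same instruction stream, different data
}

int main() {
    const int n = 1 << 20;
    float* d = nullptr;
    cudaMalloc(&d, n * sizeof(float));
    cudaMemset(d, 0, n * sizeof(float));
    // 128 threads per block loosely echoes the 128 SPs in a G80/G92; the GPU
    // keeps its width busy by scheduling many such blocks concurrently.
    scale<<<(n + 127) / 128, 128>>>(d, 2.0f, n);
    cudaDeviceSynchronize();
    cudaFree(d);
    return 0;
}
```

A multicore CPU runs a handful of fully independent instruction streams; the GPU instead amortizes one instruction stream over enormous data width, which is why the "core count" comparison is slippery in both directions.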

The point NVIDIA needs to make is that the argument is far from over, as the battle hasn't even really begun. At one point Jen-sun said "[Intel] can't go around saying the GPU is going to die when they don't know anything about it." This is a fair statement, but NVIDIA can't write Intel off either. Intel certainly will know about GPUs if it truly intends to go down the path it seems destined to travel: either pushing the CPU (in all its multicore glory) as a bastion of graphics power (for which some free multicore CPU graphics development tools might be nice, hint hint), or essentially entering the graphics market outright with whatever Larrabee ends up actually becoming.

Comments

  • panfist - Friday, April 11, 2008 - link

    There is a special place in my heart and in gaming history for John Carmack, but I don't think he's necessarily the one to trust when it comes to forecasting the industry anymore.

    Doom3 the single player game was disappointing, and the engine never really had a big hit game, either.

    Now maybe if Valve or Epic weighed in with similar comments...
  • StormEffect - Friday, April 11, 2008 - link

    It was called Prey and it was fairly successful.
  • Sunrise089 - Friday, April 11, 2008 - link

    In addition, while there wasn't one crazy breakthrough hit (and on the PC, what really is these days?), I would guess that the total installed base of Doom 3, Quake 4, Prey, and Quake Wars is pretty competitive with some of the other contemporary engines.
  • Conroe - Friday, April 11, 2008 - link

    If Intel could integrate a GPU that actually could run games, what do you think would happen to NVIDIA? He sounds a little frightened to me.
  • jtleon - Friday, April 11, 2008 - link

    Why is it in Jen-sun's best interest to draw attention to Intel's failed IGP?

    Consider the end user experience - I tried using Intel's IGP - and became so horribly frustrated that I abandoned the IGP altogether in disgust! As a competitor, Jen-sun cannot buy such a powerful motivator to drive customers to nVidia (or ATI), right?

    Jen-sun should be praising Intel for their IGP, and encourage them to continue the "good" work for nVidia! Don't ridicule Intel - Don't dare them to beat you.

    Jen-sun mis-managed this Financial Meeting and cannot retract his indignation - He has challenged Intel to a Dual, and he cannot win!

    Regards,
    jtleon
  • Griswold - Friday, April 11, 2008 - link

    "He has challenged Intel to a Dual, and he cannot win!"

    A dual what? Dual-core maybe?

    Its spelled d-u-e-l.
  • jtleon - Friday, April 11, 2008 - link

    Thanks Griswold...saw the mistake as I hit the Post button - unfortunately this site does not offer an "edit" after the fact!
  • poohbear - Friday, April 11, 2008 - link

    thanks for pointing out the obvious to all of us w/ a grade 3 and above education Griswold. Now, do us all a favor and go "fuk" yourself, and dont tell me how to spell fuk on the internet. Thank you very much.
  • jtleon - Friday, April 11, 2008 - link

    No doubt Jen-sun is very afraid. Intel could buy his entire engineering team - should they so choose.

    However, such fear is a vital ingredient (always has been) to generate true innovation. We should be worried if Jen-sun is not afraid.

    Regards,
    jtleon
  • Lonyo - Friday, April 11, 2008 - link

    Intel are arguably a long-term company.
    It may be that no one can see anything happening in the near future, but give it time and we will see things shifting, I am sure.
    They are in it for the long haul, but they also want to show they are making short-term steps to get there.

    The Atom is by no means a finished platform, nor does it operate in the space Intel are aiming for, but it's a start on the road.
