Intel's Graphics Performance Disadvantage

It is no secret that Intel's integrated graphics are very, very slow and nearly useless for most modern 3D graphics applications. So when Intel says we should see its integrated graphics parts increase in performance 10x by 2010, we should get excited, right? That is much faster than Moore's Law would suggest in terms of performance over time: we should only expect a little more than 2x performance over a two year period, not an order of magnitude.
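
As a quick sanity check on that math, here is a rough sketch assuming the usual 18 to 24 month doubling period; the exact cadence is our assumption, not a figure from Intel or NVIDIA.

    # Rough sanity check: projected speedup from a Moore's-law-style doubling
    # (assumed doubling period of 18-24 months) versus Intel's claimed 10x by 2010.
    def projected_speedup(years, doubling_period_years):
        return 2 ** (years / doubling_period_years)

    for period in (1.5, 2.0):  # assumed doubling periods, in years
        print(f"{period} yr doubling over 2 years: {projected_speedup(2, period):.1f}x")
    # ~2.5x (18-month doubling) or 2.0x (24-month doubling): well short of 10x.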

The problem with this situation, as NVIDIA notes, is that today's Intel integrated graphics parts are already more than 10x slower than today's affordable discrete graphics parts.

Looking at that chart, we can see that Intel's integrated graphics will only be competitive with today's sub-$100 hardware in 3DMark06 by the year 2010. In four year old games, NVIDIA's current hardware will still blow away Intel's 2010 integrated solution, and the margin only climbs if you look at a modern game. NVIDIA claims these tests were also run at low quality settings, but we can't speak to that as we weren't the ones running them. If that's the case, the differences could be even larger.

The bottom line is that Intel doesn't have a solution within reach of current graphics hardware even if it could deliver that 10x performance today. In two years, after NVIDIA, AMD, and even S3 have again at least doubled performance over what we have now, there's no way Intel can hope to keep up.
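
To put rough numbers on that argument, here is a back-of-the-envelope sketch; the starting gap and the two doublings are assumptions taken from the framing above, not measured data.

    # Back-of-the-envelope projection using the figures above:
    # Intel integrated graphics start >10x behind today's sub-$100 discrete parts,
    # improve 10x by 2010, while discrete GPUs roughly double twice in two years.
    intel_now, discrete_now = 1.0, 10.0      # relative performance today (assumed)
    intel_2010 = intel_now * 10              # Intel's claimed 10x improvement
    discrete_2010 = discrete_now * 2 * 2     # two doublings over two years
    print(discrete_2010 / intel_2010)        # => 4.0: still roughly a 4x gap in 2010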

And NVIDIA wasn't pulling any punches. Jen-Hsun went so far as to say: "I wouldn't mind it if sometimes they just say thank you – that it's possible to enjoy a game on an Intel microprocessor [because of NVIDIA graphics hardware]." This is certainly an audacious statement, even if it happens to be the truth. AMD can take some of the credit there as well, of course, but the point is that Intel couldn't make it as a gaming platform today without the help of its competitors. Believe me when I say that we are trying to put a neutral spin on all this, but Jen-Hsun was out for blood today, and he absolutely hit his mark.

To prove the point further, NVIDIA turned to the data Valve has been collecting through Steam. This data must be taken with the proper grain of salt; it doesn't reflect the entire GPU market by any means. The Steam data covers a specific subset of GPU users: gamers. Even more to the point, it covers a specific subset of gamers: those who play games through Steam and chose to report their system information (anonymously, of course). Yes, the sample size is large, but this is by no means a random sample, and it thus loses some points in the statistical accuracy department.

Steam data indicates that NVIDIA GPUs are in 60% of Steam gamers' boxes, while Intel GPUs show up in only 2% of those surveyed. As Jen-Hsun pointed out, this isn't about juggling a few percentage points: gamers who use Steam clearly do not use Intel GPUs to play their games. The picture gets even grimmer for Intel when you look at DX10-specific data: NVIDIA powers about 87% of the GPUs in DX10 boxes while Intel powers only 0.11% (of which Jen-Hsun said, "I think that's just an error," and he may well have been right). He was on target when he said that in this case "approximately zero seems statistically significant."
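
For the statistically inclined, here is an illustrative margin-of-error calculation; the number of DX10 respondents below is a made-up placeholder, since the actual Steam sample size isn't given here, but the point is that a 0.11% share stays approximately zero under any reasonable sample.

    import math

    # Illustrative only: margin of error for a small observed share, with an
    # assumed (hypothetical) count of DX10-capable respondents.
    n = 100_000          # assumed respondents with DX10 hardware (placeholder)
    p = 0.0011           # Intel's reported 0.11% share of DX10 boxes
    moe = 1.96 * math.sqrt(p * (1 - p) / n)   # ~95% normal-approximation interval
    print(f"0.11% +/- {moe * 100:.3f} percentage points")
    # Even generously, the upper bound stays a tiny fraction of a percent.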

The implication is clear: few gamers use Intel GPUs in the first place, and essentially none of them use Intel GPUs for DX10 gaming. We certainly buy that, as Intel doesn't even have a DX10 part on the market right now, but again, this data is not a statistically rigorous representation of the entire gaming population. Taken in the proper context, and as part of a broader look at the industry, the Steam data is certainly not a bad indicator.

Further refuting the idea that Intel can displace NVIDIA, Jen-Hsun addressed Ron Fosner's assertion that multicore processors can handle graphics better than a dedicated graphics card ever could. This is where NVIDIA engages in a little counter-FUD of its own, with Jen-Hsun showing that adding CPU cores does nothing for gaming or graphics benchmark performance. Intel was certainly referring to the ability of CPUs to handle graphics code written specifically for multicore CPUs, but for the time being, when you compare adding cores to your system against adding a more powerful GPU, NVIDIA claims up to 27x more bang for your buck.
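
A quick Amdahl's law sketch shows why piling on CPU cores does so little for existing games; the parallel fraction below is purely illustrative, not a figure measured by NVIDIA or Intel.

    # Amdahl's law sketch: if only a fraction of a game's CPU work is threaded
    # (the fraction below is assumed for illustration, not measured), extra cores
    # buy very little overall speedup.
    def amdahl_speedup(parallel_fraction, cores):
        return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

    for cores in (1, 2, 4, 8):
        print(cores, round(amdahl_speedup(0.3, cores), 2))
    # 1 -> 1.0, 2 -> 1.18, 4 -> 1.29, 8 -> 1.36: nowhere near what a faster GPU brings.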

Currently Intel has some fairly decent lab demos, and there have been murmurs of a software renderer renaissance (I'd love to see John Carmack and Tim Sweeney duke it out one more time in software; maybe that's just me), but there just isn't anything in production that even tries to show what a CPU can or can't do in direct competition with a GPU on graphics quality and performance. There's a reason for that: it still isn't practical to develop the software. Maybe when everyone is running 8-core, 16-thread CPUs we'll see something interesting. But right now, and even over the next few years, rasterization is going to be the way to go, and pure FLOPS with massive parallelism will win every time over Intel's programmability and relatively light parallelism.

Which brings us to a point NVIDIA made later in the day: GPUs are already multicore to the point where NVIDIA prefers to call them "manycore" (much as we saw Intel do with its 100+ core concepts a few years back, when it was first starting to push parallelism). It's a stretch for me to think of the 128 SPs in a G80 or G92 GPU as "cores," because they aren't really fully independent, but with the type of data GPUs normally tackle the effect is very similar. Certainly, no matter how you slice it, GPUs are much wider hardware than any current multicore CPU.
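
A toy example of why those lockstep SPs behave like cores in practice: graphics work is overwhelmingly data parallel, with the same small function applied independently to every pixel. This is an illustration only, not how a GPU is actually programmed.

    # Toy illustration: the same shading function runs independently on every
    # pixel, so the work splits cleanly across however many lanes the hardware
    # has, whether or not those lanes are fully independent "cores."
    def shade(pixel):
        x, y = pixel
        return (x * 31 + y * 17) % 256   # stand-in for a real pixel shader

    pixels = [(x, y) for y in range(1080) for x in range(1920)]
    frame = [shade(p) for p in pixels]   # every element is independent
    print(len(frame))                    # 2,073,600 pixels, no cross-pixel dependencies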

The point NVIDIA needs to make is that the argument is far from over, because the battle hasn't even really begun. At one point Jen-Hsun said "[Intel] can't go around saying the GPU is going to die when they don't know anything about it." That's a fair statement, but NVIDIA can't write Intel off either. Intel certainly will know about GPUs if it truly intends to go down the path it seems destined to travel: either pushing the CPU (in all its multicore glory) as a bastion of graphics power (for which some free multicore CPU graphics development tools might be nice, hint hint), or essentially entering the graphics market outright with whatever Larrabee ends up actually becoming.

Comments

  • zsdersw - Friday, April 11, 2008 - link

    Prognosticators, no matter how well qualified or respected, are very often wrong.
  • UNHchabo - Friday, April 11, 2008 - link

    You only think this is true because the ones who are wrong are often the only ones you remember.

    Example:
    "Spam will be a thing of the past in two years' time." -Bill Gates, 2004
  • zsdersw - Saturday, April 12, 2008 - link

    Umm, no. Predicting the future is rarely entirely accurate or precise, no matter how much of an expert you may be. Prognosticators who are experts are usually wrong as often as they're right. Experts are just as fallible as anyone else, if not more.
