Intel's Graphics Performance Disadvantage

It is no secret that Intel's integrated graphics are very, very slow and nearly useless for most modern 3D graphics applications. So when Intel says we should see its integrated graphics parts increase in performance 10x by 2010, we should get excited, right? That is much faster than Moore's law in terms of performance over time (we should only expect a little more than 2x performance over a two-year period, not an order of magnitude).
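A bit of back-of-the-envelope math puts that in perspective (this is our own arithmetic, assuming a doubling period of 18 to 24 months as a reading of Moore's law; the doubling period is not a figure quoted by Intel or NVIDIA). If performance doubles every T months, the gain over 24 months is:

\[ \text{gain} = 2^{24/T}, \qquad T = 18 \Rightarrow 2^{24/18} \approx 2.5\times, \qquad T = 24 \Rightarrow 2\times \]

A 10x gain over the same 24 months would instead require:

\[ 2^{24/T} = 10 \;\Rightarrow\; T = \frac{24 \ln 2}{\ln 10} \approx 7.2 \text{ months} \]

In other words, Intel's claim amounts to doubling performance roughly every seven months.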

The problem with this situation, as NVIDIA notes, is that today's Intel integrated graphics parts are already more than 10x slower than today's affordable discrete graphics parts.

Looking at that chart, we can see that Intel integrated graphics will only be competitive with today's sub-$100 hardware in 3DMark06 by the year 2010. With four-year-old games, NVIDIA's current hardware will still blow away Intel's 2010 integrated solution, and the margin climbs higher still if you look at a modern game. NVIDIA claims that these tests were run at low quality settings as well, but we can't speak to that as we weren't the ones running the tests. If that's the case, the differences could be even larger.

The bottom line is that Intel wouldn't be within reach of current graphics hardware even if it could deliver that 10x performance increase today. In two years, after NVIDIA, AMD, and even S3 have again at least doubled performance over what we currently have, there's no way Intel can hope to keep up.
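Taking NVIDIA's own figures at face value (a current gap of more than 10x and at least a 2x improvement in discrete parts over the next two years; these are NVIDIA's numbers, not something we've independently verified), the arithmetic works out as:

\[ \text{gap in 2010} \geq \frac{10 \times 2}{10} = 2\times \]

Even granting Intel the full 10x improvement, its integrated graphics would still trail 2010's discrete hardware by at least a factor of two, and by far more in the game benchmarks above.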

And NVIDIA wasn't pulling any punches. Jen-Hsun went so far as to say: "I wouldn't mind it if sometimes they just say thank you – that it's possible to enjoy a game on an Intel microprocessor [because of NVIDIA graphics hardware]." This is certainly an audacious statement of the facts, even if it happens to be the truth. AMD can take some of the credit there as well, of course, but the point is that Intel couldn't make it on its own as a gaming platform today without the help of its competitors. Believe me when I say that we are trying to put a neutral spin on all this, but Jen-Hsun was out for blood today, and he absolutely hit his mark.

To prove the point further, NVIDIA turned to the data Valve has been collecting through Steam. This data must be taken with the proper dose of salt – it doesn't reflect the entire GPU market by any means. The Steam data reflects a specific subset of GPU users: gamers. Even more to the point, it reflects a specific subset of gamers: those who play games through Steam and chose to report their system information (anonymously, of course). Yes, the sample size is large, but this is by no means a random sample of anything and thus loses some points in the statistical accuracy department.

Steam data indicates that NVIDIA GPUs are used in 60% of Steam gamers' boxes, while Intel GPUs are used in only 2% of those surveyed. As Jen-Hsun pointed out, this isn't about juggling a few percentage points: gamers who use Steam clearly do not use Intel GPUs to play their games. The picture gets even grimmer for Intel when you look at DX10-specific data: NVIDIA has about 87% of the GPUs running in DX10 boxes, while Intel powers only 0.11% (of which Jen-Hsun said, "I think that's just an error," and he may well have been right). He was on target when he said that in this case "approximately zero seems statistically significant."

The implication is clear: while few gamers use Intel GPUs in the first place, essentially none are using Intel GPUs for DX10 gaming. We certainly buy that, as Intel does not even have a DX10 part on the market right now, but again, this data is not a statistically accurate representation of the entire gaming population. Steam data is certainly not a bad indicator when taken in the proper context and as part of a broader look at the industry.

Further refuting the idea that Intel can displace NVIDIA, Jen-Hsun addressed Ron Fosner's assertion that multicore processors can handle graphics better than a dedicated graphics card ever could. This is where NVIDIA gets into a little counter-FUD action, with Jen-Hsun showing that adding CPU cores does nothing for gaming or graphics benchmark performance. In this case, Intel was certainly referring to the ability of CPUs to handle graphics code written specifically for multicore CPUs. But for the time being, when you compare adding cores to your system with adding a more powerful GPU, NVIDIA offers up to 27x more bang for your buck.

Currently Intel has some fairly decent lab demos, and there have been murmurs of a software renderer renaissance (I'd love to see John Carmack and Tim Sweeney duke it out one more time in software; maybe that's just me), but there just isn't anything in production that even tries to show what a CPU can or can't do in direct competition with a GPU for graphics quality and performance. And there are reasons for that: it still isn't practical to develop the software. Maybe when everyone is running 8-core, 16-thread CPUs we'll see something interesting. But right now, and even over the next few years, rasterization is going to be the way to go, and pure FLOPS with massive parallelism will win every time over Intel's programmability and comparatively light parallelism.

Which brings us to a point NVIDIA made later in the day: GPUs are already multicore, to the point where NVIDIA likes to refer to them as "many-core" (as we saw Intel do with its 100+ core concepts a few years back when it first started pushing parallelism). It's a stretch for me to think of the 128 SPs in a G80 or G92 GPU as "cores," because they aren't really fully independent, but with the type of data GPUs normally tackle the effect is very similar. Certainly, no matter how you slice it, GPUs are much wider hardware than any current multicore CPU.

The point NVIDIA needs to make is that the argument is far from over, as the battle hasn't even really begun. At one point Jen-Hsun said, "[Intel] can't go around saying the GPU is going to die when they don't know anything about it." This is a fair statement, but NVIDIA can't write Intel off either. Intel certainly will know about GPUs if it truly intends to go down the path it seems destined to travel: either it will be pushing the CPU (and all its multicore glory) as a bastion of graphics power (for which some free multicore CPU graphics development tools might be nice, hint hint), or it will essentially be entering the graphics market outright with whatever Larrabee ends up actually becoming.


Comments


  • segerstein - Saturday, April 12, 2008

    I read the article, but I wasn't wholly convinced by the arguments made by the CEO. As we have seen with the Eee PC and other low-cost computers, the current technology was about serving the first billion people. But most people still don't have computers, because they are too expensive for them.

    Nvidia, not fully addressing even the first billion because of its expensive discrete solutions, will see its market share shrink. Besides, there are many consumer electronics devices that would benefit from a low-powered "system-on-a-chip".

    Intel has Atom plus a chipset; AMD bought ATI precisely because it wants to offer a low-powered "system-on-a-chip" (but also multicore, high-performing parts).

    It would only make sense for Nvidia to buy VIA; VIA's Isaiah processor seems promising. This way they could cater to a smaller high-end market with discrete solutions and to a growing market for low-cost integrated solutions.
  • BZDTemp - Saturday, April 12, 2008

    Seems Nvidia does not like to be on the receiving end.

    I do remember Nvidia spreading lies about PowerVR's Kyro 3D cards some time back, when it looked like they might have a chance to be the third factor in 3D gaming hardware.

    With ATI/AMD in crisis I think it's great that Nvidia and Intel are starting to compete, even though I sincerely hope ATI/AMD comes back strong and kicks both their asses. After all, I can't recall the red/green guys using unfair tactics, and I'd like to see integrity rewarded.


    Finally, I would like AnandTech to be more critical when reporting from such venues. Try googling Kyro, Nvidia, and PDF to find the old story, or just check out the PDF directly: ftp://ftp.tomshardware.com/pub/nvidia_on_kyro.pdf
  • duron266 - Saturday, April 12, 2008

    "Jensen is known as a very passionate, brilliant and arrogant guy but going against Intel on a frontal full scale might be the worst thing that they ever decided. Nvidia went from close to $40 to current $19.88 which means that the company has to do something to fix this but this is simply too much."
  • duron266 - Friday, April 11, 2008

    NVIDIA...too high-profile,

    if they were going to vanish,

    Jen-Hsun would be the number one to blame...
  • anandtech02148 - Friday, April 11, 2008

    There's a huge difference between audio being processed on a many-core CPU like Intel's and on a standalone PCI card.
    Put the PCI card in and you can feel the CPU being less bogged down and the motherboard chipset generating less heat.

    An integrated GPU, audio, and many cores don't solve the problem; there will be bandwidth issues too.
    Nvidia should hit Intel hard with a low-powered, high-performance GPU to prove a point.

  • epsilonparadox - Friday, April 11, 2008

    NVidia will never be able to compete with Intel in the low-power arena. Intel just has a better process and the fabs for that process, while NVidia has other companies building its chips. Plus, graphics chips don't move to a new process the way CPUs do.
  • poohbear - Friday, April 11, 2008

    Very nice article, but how many of us are gonna understand the analogy:

    "someone's kid topping off a decanted bottle of '63 Chateau Latour with an '07 Robert Mondavi."

    WTF is that?!? I'm guessing he's talking about wine, but whatever.
  • kb3edk - Friday, April 11, 2008

    Well of course it's a wine reference; consider the audience: institutional investors. These are people who are much more likely to spend $500 of disposable income on a bottle of Chateau Something-Or-Other than on a GeForce 9800 GTX.

    Also note the Mondavi reference, because they're in Napa, just on the other side of San Fran from nVidia HQ in Silicon Valley.

    And it's still a bit odd seeing such strong words from nVidia against Intel considering that nVidia/Intel is the main enthusiast platform out there these days (as opposed to an all-AMD solution).

  • Khato - Friday, April 11, 2008

    Really quite enjoyed this, makes me all the more confident in the products Intel is currently developing.

    I mean really, how similar does NVIDIA's ranting sound compared to AMD's back when they were on top? No question that they're more competent than AMD, but they've done just as good a job at awakening a previously complacent beast at Intel. Heh, and they've never had to compete with someone that has a marked manufacturing advantage before...
  • tfranzese - Sunday, April 13, 2008

    Intel is no beast in these parts. Their track record in the discrete segment, and in drivers to this day, is complete failure. Until they execute on both the hardware and the software, both monumental tasks, they'll continue to be right where they are in the discrete market (i.e., nowhere).
