Wrap Up

The industry shipped 34.4 million discrete graphics cards in the first three quarters of 2016, an increase of 5.35% over the 32.65 million shipped in the same period a year earlier. If everything goes as planned for AMD and NVIDIA, year-over-year unit sales of graphics adapters will increase in 2016 for the first time in nearly a decade. Obviously, shipments of desktop GPUs this year will not come close to sales even in 2014, but it is significant that shipments of graphics adapters may have bottomed out.

With 9.34 million desktop AIBs sold so far in 2016, AMD has already surpassed its desktop GPU shipments for the whole of last year and is on a recovery path. Nonetheless, the company still cannot win back market share from its arch-rival: its share dropped to 29.1% in Q3 from 29.9% in the previous quarter.

NVIDIA’s desktop GPU sales so far (Q1 through Q3 2016) are slightly behind its AIB shipments in the first three quarters of 2015 because the company decided to aggressively clear out inventory in Q2. Nonetheless, with 25 million desktop GPUs sold this year (through September 30, 2016, to be precise), the company is still on track to ship roughly as many desktop GPUs as it did last year.

As reported, iGPUs are slowly eating the lunch of entry-level and mainstream graphics cards priced below $99. In Q3 2016, the industry shipped around five million such AIBs, whereas shipments of gaming-grade desktop GPUs were around seven million. Meanwhile, the popularity of enthusiast-class graphics hardware decreased in Q3 2016 compared to Q3 2015 mostly due to classification rather than to changes in general consumer behavior: on a price-based classification scale, what used to be enthusiast-class performance can now be had at mainstream prices. Gamers are going to continue to buy gaming-grade hardware, and developed nations are going to increase purchases of more advanced GPUs because of factors like 4K/5K resolutions, VR and others.

“I think, one, the number of gamers in the world is growing,” said Jen-Hsun Huang. “Everybody that is effectively born in the last 10-15 years [is] likely to be a gamer.”

Important Notices

  • Jon Peddie Research does not officially disclose actual unit sales of AMD, NVIDIA and Intel in its press releases. All unit sales published here are derived from market shares of appropriate vendors.
  • Since in many cases JPR does not disclose quarterly TAM numbers, those numbers are derived from historical figures published by the company.
  • Some historical numbers were re-stated by IHVs and JPR reflected such updates in subsequent reports and releases. As such, some numbers in our graphs may differ from publicly available press releases.
  • JPR did not release any data concerning sales of desktop discrete graphics cards in Q1 – Q3 2010, but only disclosed shipments for Q4 and total shipments for the year, which is why the numbers in the charts for Q1, Q2 and Q3 are the same.
  • Given that unit sales and TAM figures are approximate, we recommend buying full reports from Jon Peddie Research if you need the data for decision-making.
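The derivation described in the first notice above is simple arithmetic: a vendor's unit sales are estimated by multiplying the reported total addressable market (TAM) by that vendor's market share. A minimal sketch of this calculation, using made-up illustrative figures rather than JPR's actual data:

```python
def units_from_share(tam_millions: float, share_pct: float) -> float:
    """Estimate a vendor's unit sales (in millions) from the
    total addressable market and its reported market share."""
    return tam_millions * share_pct / 100.0

# Hypothetical example: an 11.7M-unit quarter at a 29.1% share
# yields roughly 3.4 million AIBs for that vendor.
amd_units = units_from_share(11.7, 29.1)
print(round(amd_units, 2))
```

Note that because shares are typically reported to only one decimal place, unit estimates derived this way carry a corresponding rounding error of up to a few tens of thousands of units.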

Related Reading:

Market Share: AMD Is Increasing Units, Not Share
53 Comments

  • DwayneAK - Wednesday, November 30, 2016 - link

    Also MSI, Gigabyte, and XFX are pretty good. And as far as AMD's 'lousy' partners go, I think Powercolor and Sapphire are pretty underrated.
  • Michael Bay - Thursday, December 1, 2016 - link

    After using their 980 for a year, I don't think EVGA is especially good. My next purchase, if I'll even bother, will be ASUS as usual.
  • just4U - Friday, December 2, 2016 - link

    I never had any more issues with Ati/Amd drivers than I had with Nvidia drivers... not ever. I always believed it was just a rumor put out to try and keep Nvidia sales up and ati/amd down.
  • vladx - Wednesday, December 7, 2016 - link

    And now you know you were wrong and they were in fact very real. Heck, I had to sell my old laptop and buy a new one with Nvidia card and thus lose hundreds of euros because of how bad AMD drivers were.
  • zmeul - Monday, November 28, 2016 - link

    quick question: why are you using the term "GPU" interchangeably with video card?!
    in one paragraph you talk about video adapters and discrete graphics, and in the next you say "various manufacturers sold xyz GPUs"

    the GPU is the chip inside the video card and has been the de facto definition since 1999:
    "a single chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second"
  • TheinsanegamerN - Monday, November 28, 2016 - link

    Aside from you needing to be pedantic about grammar, the term "GPU" has been used to describe a video card for years. It's nothing new.
  • heffeque - Monday, November 28, 2016 - link

    Don't mind him. He's been living under a rock and can't catch up with normal tech language.
  • zmeul - Monday, November 28, 2016 - link

    the term GPU is already defined (since 1999) and it's not used to describe a video card
    the people who use it to describe a video card, do it wrongly
  • BrokenCrayons - Monday, November 28, 2016 - link

    Is it worth mentioning that you didn't even attempt to use correct punctuation or capitalization while nitpicking about the usage of technical jargon? :)

    Anyway, the fact that you understood what the author meant when using the term "GPU" to refer to a video card means that the intended message reached the recipient, was decoded correctly, and information was shared. The goal of effective communication was achieved.

    Besides that, English hasn't quite caught up with computer industry jargon. Credible dictionary publishers don't really include GPU in their work, and there aren't defined, formal rules regarding its usage. In fact, you could argue that the term "GPU" was just something Nvidia made popular during the introduction of the first GeForce graphics cards. It became a commonly used term in the industry, but it was originally just marketing jargon that helped the company differentiate its video processors, which included hardware transform and lighting, from competing products. Getting wrapped up in the terminology just seems sort of silly given its origin. There's also the idea of linguistic drift, which is something else you're ignoring because it doesn't support your barely relevant criticism.
  • Meteor2 - Wednesday, November 30, 2016 - link

    This was confusing me too. In an article discussing shipments of AIBs and GPUs, it's best to be precise, because they *are* different things.

    It would be like calling a complete computer a CPU.
