Market Share: AMD Is Increasing Units, Not Share

Last year, AMD addressed the high end of the market with unique products such as the Radeon R9 Fury series with HBM memory, as well as the Radeon R9 Nano aimed at small form factor systems. This year the company shifted its focus to mainstream video cards with its Radeon RX series (formerly codenamed Polaris). So far, this tactic has been paying off: over the past 12 months, AMD regained more than 10 percentage points of desktop discrete GPU market share and increased its quarterly shipments by over 1.5 million units.

AMD shipped approximately 3.8 million standalone graphics chips for desktop computers in the third quarter of 2016, a two-year high, according to Jon Peddie Research. The company's desktop discrete GPU shipments were up nearly one million units from the previous quarter (an increase of 34%) and grew by over 1.5 million units from the same period last year (an increase of 68.8%). Meanwhile, AMD's market share declined 0.8 percentage points from the previous quarter (Q2 2016) due to strong NVIDIA performance, but surged 10 points from Q3 2015.

NVIDIA also managed to significantly increase its discrete desktop GPU shipments in the third quarter. The company sold 9.25 million GPUs, up from 6.61 million in Q2 2016 (an increase of about 40%) and up from 8.97 million in Q3 2015 (an increase of 3.1%). NVIDIA typically clears out its inventory in the second quarter, hence its sequential growth in the third quarter is not particularly surprising. Meanwhile, the company has managed to bring its sales back to recent historical levels, which is not bad in a market that has been declining for years.
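
For readers who want to sanity-check the growth figures quoted in the two paragraphs above, here is a minimal Python sketch that reproduces them from the cited shipment numbers. The AMD totals for Q2 2016 and Q3 2015 are back-calculated from the stated growth rates and are assumptions for illustration, not figures published by JPR:

```python
# Quarter-over-quarter and year-over-year growth, using only the shipment
# figures quoted above (millions of desktop discrete GPUs shipped).
def growth(current, previous):
    """Percentage change from `previous` to `current`."""
    return (current - previous) / previous * 100

# NVIDIA figures as cited: 9.25M in Q3 2016 vs. 6.61M in Q2 2016 and 8.97M in Q3 2015.
print(f"NVIDIA QoQ: {growth(9.25, 6.61):.1f}%")   # ~39.9%, i.e. "about 40%"
print(f"NVIDIA YoY: {growth(9.25, 8.97):.1f}%")   # ~3.1%

# AMD: ~3.8M in Q3 2016; the earlier quarters below are back-calculated from
# the stated 34% QoQ and 68.8% YoY increases (assumptions, not cited figures).
amd_q3_2016 = 3.8
amd_q2_2016 = amd_q3_2016 / 1.34     # ~2.84M
amd_q3_2015 = amd_q3_2016 / 1.688    # ~2.25M
print(f"AMD QoQ gain: {amd_q3_2016 - amd_q2_2016:.2f}M units")   # ~0.96M, "nearly one million"
print(f"AMD YoY gain: {amd_q3_2016 - amd_q3_2015:.2f}M units")   # ~1.55M, "over 1.5 million"
```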

Comments

  • DwayneAK - Wednesday, November 30, 2016 - link

    Also MSI, Gigabyte, and XFX are pretty good. And as far as AMD's 'lousy' partners go, I think Powercolor and Sapphire are pretty underrated.
  • Michael Bay - Thursday, December 1, 2016 - link

    After using their 980 for a year, I don't think EVGA is especially good. My next purchase, if I'll even bother, will be ASUS as usual.
  • just4U - Friday, December 2, 2016 - link

    I never had any more issues with Ati/Amd drivers than I had with Nvidia drivers... not ever... I always believed it was just a rumor put out to try and keep Nvidia sales up and ati/amd down.
  • vladx - Wednesday, December 7, 2016 - link

    And now you know you were wrong and they were in fact very real. Heck, I had to sell my old laptop and buy a new one with an Nvidia card, and thus lost hundreds of euros, because of how bad AMD drivers were.
  • zmeul - Monday, November 28, 2016 - link

    quick question: why are you using the term "GPU" interchangeably with video card?!
    in one paragraph you talk about video adapter, discrete graphics and in the next you say "various manufactures sold xyz GPUs"

    the GPU is the chip inside the video card and has been the de facto definition since 1999:
    "a single chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second"
  • TheinsanegamerN - Monday, November 28, 2016 - link

    Aside from you needing to be pedantic about grammar, the term "GPU" has been used to describe a video card for years. It's nothing new.
  • heffeque - Monday, November 28, 2016 - link

    Don't mind him. He's been living under a rock and can't catch up with normal tech language.
  • zmeul - Monday, November 28, 2016 - link

    the term GPU is already defined (since 1999) and it's not used to describe a video card
    the people who use it to describe a video card, do it wrongly
  • BrokenCrayons - Monday, November 28, 2016 - link

    Is it worth mentioning that you didn't even attempt to use correct punctuation or capitalization while nitpicking about the usage of technical jargon? :)

    Besides that, English hasn't quite caught up with computer industry jargon. Credible dictionary publishers don't really include GPU in their work and there aren't defined, formal rules regarding its usage. In fact, you could argue that the term "GPU" was just something Nvidia made popular during the introduction of the first GeForce graphics cards. It became a commonly used term in the industry, but it was originally just marketing jargon that helped the company differentiate their video processors that included hardware transform and lighting from other competing products. Getting wrapped up in the terminology just seems sort of silly given its origin. There's also the idea of linguistic drift either which is something else you're ignoring because it doesn't support your barely relevant criticism.
  • Meteor2 - Wednesday, November 30, 2016 - link

    This was confusing me too. In an article discussing shipments of AIBs and GPUs, it's best to be precise, because they *are* different things.

    It would be like calling a complete computer a CPU.
