Market Share: AMD Is Increasing Units, Not Share

Last year AMD addressed the high end of the market with unique products such as the Radeon R9 Fury series with HBM memory, as well as the Radeon R9 Nano aimed at small form factor systems. This year the company shifted its focus to mainstream video cards with its Radeon RX series (based on the Polaris architecture). So far, the strategy has been paying off: over the past 12 months, AMD regained over 10 percentage points of market share and increased quarterly shipments of desktop discrete GPUs by over 1.5 million units.

AMD shipped approximately 3.8 million standalone desktop graphics chips in the third quarter of 2016, a two-year high according to Jon Peddie Research. The company’s desktop discrete GPU sales were up nearly one million units from the previous quarter (an increase of 34%) and up over 1.5 million units from the same period last year (an increase of 68.8%). Meanwhile, AMD’s market share declined by 0.8 percentage points from Q2 2016 due to NVIDIA’s strong quarter, but was up roughly 10 percentage points from Q3 2015.

NVIDIA also significantly increased its discrete desktop GPU shipments in the third quarter. The company sold 9.25 million GPUs, up from 6.61 million in Q2 2016 (an increase of about 40%) and up from 8.97 million in Q3 2015 (an increase of 3.1%). NVIDIA typically clears out its inventory in the second quarter, so its sequential growth in the third quarter is not particularly surprising. More notably, the company has brought its sales back to recent historical levels, which is no small feat in a market that has been declining for years.
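As a quick sanity check, the quoted growth percentages can be reproduced from the unit counts. This is a minimal sketch: NVIDIA's unit counts are stated in the text, while AMD's Q2 2016 and Q3 2015 counts (about 2.84 and 2.25 million) are back-derived from the quoted percentages rather than reported directly.

```python
# Sanity-check the quarter-over-quarter (QoQ) and year-over-year (YoY)
# growth figures quoted above. Shipments are in millions of units.

def growth(current, previous):
    """Percentage change from `previous` to `current`."""
    return (current - previous) / previous * 100

# AMD desktop discrete GPU shipments (Q2 2016 and Q3 2015 values are
# back-derived from the quoted 34% and 68.8% increases, not reported).
amd_q3_2016, amd_q2_2016, amd_q3_2015 = 3.8, 2.84, 2.25
# NVIDIA desktop discrete GPU shipments (all three stated in the text).
nv_q3_2016, nv_q2_2016, nv_q3_2015 = 9.25, 6.61, 8.97

print(f"AMD QoQ:    {growth(amd_q3_2016, amd_q2_2016):.1f}%")  # matches the quoted ~34%
print(f"AMD YoY:    {growth(amd_q3_2016, amd_q3_2015):.1f}%")  # matches the quoted 68.8%
print(f"NVIDIA QoQ: {growth(nv_q3_2016, nv_q2_2016):.1f}%")    # matches the quoted ~40%
print(f"NVIDIA YoY: {growth(nv_q3_2016, nv_q3_2015):.1f}%")    # matches the quoted 3.1%
```

The figures are internally consistent: each percentage agrees with the unit deltas given in the article to within rounding.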

Comments

  • zmeul - Monday, November 28, 2016 - link

    quick question: why are you using the term "GPU" interchangeably with video card?!
    in one paragraph you talk about video adapter, discrete graphics and in the next you say "various manufactures sold xyz GPUs"

    the GPU is the chip inside the video card and has been the de facto definition since 1999:
    "a single chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second"
  • TheinsanegamerN - Monday, November 28, 2016 - link

    Aside from you needing to be pedantic about grammar, the term "GPU" has been used to describe a video card for years. It's nothing new.
  • heffeque - Monday, November 28, 2016 - link

    Don't mind him. He's been living under a rock and can't catch up with normal tech language.
  • zmeul - Monday, November 28, 2016 - link

    the term GPU is already defined (since 1999) and it's not used to describe a video card
    the people who use it to describe a video card, do it wrongly
  • BrokenCrayons - Monday, November 28, 2016 - link

    Is it worth mentioning that you didn't even attempt to use correct punctuation or capitalization while nitpicking about the usage of technical jargon? :)

    Anyway, the fact that you understood what the author meant when using the term "GPU" to refer to a video card means that the intended message reached the recipient, was decoded correctly, and information was shared. The goal of effective communication was achieved.

    Besides that, English hasn't quite caught up with computer industry jargon. Credible dictionary publishers don't really include GPU in their work and there aren't defined, formal rules regarding its usage. In fact, you could argue that the term "GPU" was just something Nvidia made popular during the introduction of the first GeForce graphics cards. It became a commonly used term in the industry, but it was originally just marketing jargon that helped the company differentiate their video processors that included hardware transform and lighting from other competing products. Getting wrapped up in the terminology just seems sort of silly given its origin. There's also the idea of linguistic drift, which is something else you're ignoring because it doesn't support your barely relevant criticism.
  • Meteor2 - Wednesday, November 30, 2016 - link

    This was confusing me too. In an article discussing shipments of AIBs and GPUs, it's best to be precise, because they *are* different things.

    It would be like calling a complete computer a CPU.
  • timbotim - Monday, November 28, 2016 - link

    “Everybody that is effectively born in the last 10-15 years [is] likely to be a gamer.”

    Gotta be up there with "640k" and "there is a world market for maybe 5 computers".
  • TristanSDX - Monday, November 28, 2016 - link

    Great article
  • beginner99 - Tuesday, November 29, 2016 - link

    And another couple of graphs clearly showing naive gamers getting ripped off by NV selling mid-range cards at flagship prices.
  • just4U - Friday, December 02, 2016 - link

    I recall paying $400 for a Creative GeForce2 and (cough..) $870 for an Asus GeForce3 so... Prices have remained steady through the years. Every once in a while AMD/ATI throws a monkey wrench into Nvidia's pricing by releasing really great cards at the high mid-range price though.. and that temporarily changes things. Nvidia did it once with their 460s as well.
