49 Comments

  • Shadowmaster625 - Monday, November 28, 2016 - link

    It's actually rather sad that discrete GPU sales are only up 10% vs a year ago when they had been sandbagging on 2011 process tech for 4 years. I would have expected 14nm/16nm to provide a larger boost to sales. Reply
  • lefty2 - Monday, November 28, 2016 - link

    This is because no one is buying desktops anymore. All the top tech companies give their employees laptops, not desktops: http://www.techworm.net/2016/11/computerlaptop-big... Reply
  • TheinsanegamerN - Monday, November 28, 2016 - link

    And the majority of those desktops were using integrated solutions, not dedicated ones. Business moving to laptops hasn't had that big of an impact. Reply
  • Sarah Terra - Tuesday, December 13, 2016 - link

    The slower sales are because of this reason: Nvidia has been milking processes and architectures for abnormal periods in order to maximize profits. As such, a 2-3 generation old GPU is usually still "good enough" for most people, much like how users clung to Sandy Bridge. Back in the day, if you had a 3 year old GPU you were left in the dust; upgrades were far more frequent and necessary to keep pace. Nvidia is essentially monopolizing themselves into a smaller market share, and more competition will boost sales. Reply
  • OldManMcNasty - Thursday, December 08, 2016 - link

    Our engineers get whatever they want. I have a Surface Pro 4, but I also have a virtual workstation with an Nvidia vGPU. If you're not an engineer, you're given a virtual desktop or you can select an iPad. Reply
  • nathanddrews - Monday, November 28, 2016 - link

    Larger than 10%? Why would you expect that? Reply
  • DanNeely - Wednesday, November 30, 2016 - link

    Steadily bigger dies and more efficient 28nm designs kept the performance gains coming, so despite the process lag there wasn't a correspondingly huge buildup of demand. Reply
  • BurntMyBacon - Thursday, December 01, 2016 - link

    @Shadowmaster625: "It's actually rather sad that discrete GPU sales are only up 10% vs a year ago when they had been sandbagging on 2011 process tech for 4 years."

    You are making a perhaps faulty assumption that most people buying GPUs know or care what process nodes do for the video card they are buying. In my estimation, more people are concerned with how much it costs and whether it does what they want it to do. I don't suspect many of them track how new cards compare to last-gen products, much less against products from several generations back, which is what they would need to do to be aware of the process node stall.
    Reply
  • tipoo - Monday, November 28, 2016 - link

    This is an important counterpoint to the "PC shipments are falling" doom and gloom. PC shipments are falling because a 5 year old 2500K can still run modern games if the GPU allows. But PC /gaming components/ are on the upswing, so it's still a great time to be a PC gamer. Complete system sales are tertiary. Reply
  • BrokenCrayons - Monday, November 28, 2016 - link

    This sums up the state of the declining computer sales nicely. While CPU performance and platform features (minor nod to USB) haven't pushed the performance envelope enough to matter AND software isn't demanding more, the GPU industry is driven by pent-up demand for a die shrink and widespread increases in screen resolution. Reply
  • Threska - Sunday, December 04, 2016 - link

    Well the killer uses for GPUs are going to be VR and machine learning. Reply
  • 0ldman79 - Monday, November 28, 2016 - link

    That's not necessarily a problem.

    We need tech to settle a bit. That might actually increase overall ownership and total market penetration. Having to upgrade the machine every two years just to keep up has kept a lot of people out of the market for anything but the cheapo computer.
    Reply
  • Strunf - Thursday, December 08, 2016 - link

    People didn't upgrade their machines every two years to keep up. CPU-wise, we reached a good-enough CPU for the average user about 10 years ago. People who upgrade every 2 years are enthusiasts and they will keep doing so; the ones who stopped upgrading are the ones that already have a good enough PC... which is the vast majority of PC users and companies.

    There isn't really a problem. PC shipments will keep going down because a PC nowadays has a very long useful life, and because of other technologies; tablets have already replaced the PC in many households... my guess is that in a few years gamers will represent the vast majority of PC users, and until then desktop PC sales will keep going down. And even then, there are new technologies that allow playing PC games over the network without even having a PC.
    Reply
  • stephenbrooks - Tuesday, November 29, 2016 - link

    Another thing to bear in mind with these graphs is the comparative improvement of integrated graphics over the time frame. It must be eating the low-end of discrete GPUs by this point. Reply
  • Samus - Tuesday, November 29, 2016 - link

    The problem isn't just that a 5 year old 2500K is adequate for most common games, but that many other tasks, especially simple content consumption, are clearly delegated to other devices, most prominently smartphones and tablets.

    Unless you desire a PC for gaming, or you are a business owner sticking to a legacy operations schema (and not modernizing your IT infrastructure), desktop PCs don't offer any clear compelling advantage over laptops and mobile devices.
    Reply
  • Meteor2 - Wednesday, November 30, 2016 - link

    Well, AIOs look nice, and are nice to use. Whether they're actually 'desktops', as they use laptop components, is another question. Reply
  • RussianSensation - Friday, December 02, 2016 - link

    The i5 2500K/2600K turn 7 years old next month, since they were released in January 2011! Reply
  • catavalon21 - Sunday, December 11, 2016 - link

    How about splitting the difference, and call it 6 years old? Reply
  • Araa - Monday, November 28, 2016 - link

    Never been an AMD fan myself but I hope they get out of the slumber they are currently in. I want the good ol' 2010 days back where they were a strong second. Reply
  • Keao - Monday, November 28, 2016 - link

    Yeah and back then GPU mid-tier was a bargain :-)

    Aaaah I miss the days of the RADEON HD4870...
    Reply
  • Stuka87 - Monday, November 28, 2016 - link

    In 2010 AMD was in first place. That's back when the 5000 series was king of the hill and the GTX 400 series was delayed 6 months and shipped as a power-guzzling pig. It was later on that the updated 500 series brought nVidia back in front, as the AMD 6000 series was just a basic re-badge. Reply
  • silverblue - Tuesday, November 29, 2016 - link

    Not quite; the 6900 series moved from VLIW5 to VLIW4 due to AMD realising that the average slot utilisation was 3.4, hence there were efficiency gains to be made in reducing the size of each shader block. The 6870 and below were based on the 5000 series, however. Reply
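The slot-utilisation point above can be sketched as back-of-envelope arithmetic, assuming the quoted figure of 3.4 slots used on average per cycle (the exact workload mix is not given here, so treat this purely as an illustration):

```python
# Illustrative sketch: issue-slot efficiency of AMD's VLIW5 vs VLIW4
# shader blocks, given the average utilisation of 3.4 slots quoted above.
AVG_SLOTS_USED = 3.4

def slot_efficiency(slots_per_block: int, avg_used: float = AVG_SLOTS_USED) -> float:
    """Fraction of issue slots doing useful work per cycle, on average."""
    return avg_used / slots_per_block

vliw5 = slot_efficiency(5)  # VLIW5: 5 issue slots per block
vliw4 = slot_efficiency(4)  # VLIW4: 4 issue slots per block (HD 6900 series)

print(f"VLIW5 utilisation: {vliw5:.0%}")  # 68%
print(f"VLIW4 utilisation: {vliw4:.0%}")  # 85%
```

If only 3.4 of 5 slots do useful work on average, dropping the fifth slot trades little throughput for noticeably better area and power efficiency per block, which is the gain the comment refers to.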
  • just4U - Friday, December 02, 2016 - link

    While the 470/80 were guzzlers for sure .. the 460 was the little darling... priced well and gave Nvidia fans something to cheer about. The 500 series certainly brought more to the table, however AMD was matching them in performance while beating them on pricing...

    I always remember wanting a 560Ti or a 570 but could never justify it on the price, so I opted for the AMD equivalent... sad too, because even as it was about to be discontinued the damn cards never did come down in price. That's when I started to get a little miffed with Nvidia. After the TNT2 they really began to sock it to you price-wise with the GeForce line... That's something ATi never really did... even when they were the king of graphics in the early 90s.
    Reply
  • Strunf - Thursday, December 08, 2016 - link

    ATI/AMD and nVIDIA will charge as much as they can... there are no good guys; they are quoted on the stock market and hence profit driven.

    In the early 90s the graphics card wasn't so important and there were a few players ATI, S3, Matrox, Intel... then the 3D accelerators kicked in and the market went crazy.
    Reply
  • Samus - Tuesday, November 29, 2016 - link

    I haven't had an AMD GPU that I liked since AMD bought ATI. They were cheaper for the same performance as nVidia, but they failed early, the drivers were awful, and their partners were and still are lousy. AMD doesn't have an equivalent OEM builder to nVidia's, such as PNY or EVGA.

    And before you laugh about PNY, remember that PNY is actually the largest distributor of nVidia cards. By quite a lot. I remember reading a few years ago that PNY shipped more cards than every other nVidia partner COMBINED. All those Quadros have helped their bottom line as well.
    Reply
  • peterfares - Wednesday, November 30, 2016 - link

    ASUS makes AMD cards. They're reputable. Reply
  • DwayneAK - Wednesday, November 30, 2016 - link

    Also MSI, Gigabyte, and XFX are pretty good. And as far as AMD's 'lousy' partners go, I think Powercolor and Sapphire are pretty underrated. Reply
  • Michael Bay - Thursday, December 01, 2016 - link

    After using their 980 for a year, I don't think EVGA is especially good. My next purchase, if I'll even bother, will be ASUS as usual. Reply
  • just4U - Friday, December 02, 2016 - link

    I never had any more issues with ATi/AMD drivers than I had with Nvidia drivers... not ever. I always believed it was just a rumor put out to try and keep Nvidia sales up and ATi/AMD down. Reply
  • vladx - Wednesday, December 07, 2016 - link

    And now you know you were wrong and they were in fact very real. Heck, I had to sell my old laptop and buy a new one with Nvidia card and thus lose hundreds of euros because of how bad AMD drivers were. Reply
  • zmeul - Monday, November 28, 2016 - link

    quick question: why are you using the term "GPU" interchangeably with video card?!
    in one paragraph you talk about video adapter, discrete graphics and in the next you say "various manufacturers sold xyz GPUs"

    the GPU is the chip inside the video card and has been the de facto definition since 1999:
    "a single chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second"
    Reply
  • TheinsanegamerN - Monday, November 28, 2016 - link

    Aside from you needing to be pedantic about grammar, the term "GPU" has been used to describe a video card for years. It's nothing new. Reply
  • heffeque - Monday, November 28, 2016 - link

    Don't mind him. He's been living under a rock and can't catch up with normal tech language. Reply
  • zmeul - Monday, November 28, 2016 - link

    the term GPU is already defined (since 1999) and it's not used to describe a video card
    the people who use it to describe a video card, do it wrongly
    Reply
  • BrokenCrayons - Monday, November 28, 2016 - link

    Is it worth mentioning that you didn't even attempt to use correct punctuation or capitalization while nitpicking about the usage of technical jargon? :)

    Anyway, the fact that you understood what the author meant when using the term "GPU" to refer to a video card means that the intended message reached the recipient, was decoded correctly, and information was shared. The goal of effective communication was achieved.

    Besides that, English hasn't quite caught up with computer industry jargon. Credible dictionary publishers don't really include GPU in their work and there aren't defined, formal rules regarding its usage. In fact, you could argue that the term "GPU" was just something Nvidia made popular during the introduction of the first GeForce graphics cards. It became a commonly used term in the industry, but it was originally just marketing jargon that helped the company differentiate their video processors that included hardware transform and lighting from other competing products. Getting wrapped up in the terminology just seems sort of silly given its origin. There's also the concept of linguistic drift, which is something else you're ignoring because it doesn't support your barely relevant criticism.
    Reply
  • Meteor2 - Wednesday, November 30, 2016 - link

    This was confusing me too. In an article discussing shipments of AIBs and GPUs, it's best to be precise, because they *are* different things.

    It would be like calling a complete computer a CPU.
    Reply
  • timbotim - Monday, November 28, 2016 - link

    “Everybody that is effectively born in the last 10-15 years [is] likely to be a gamer.”

    Gotta be up there with "640k" and "there is a world market for maybe 5 computers".
    Reply
  • TristanSDX - Monday, November 28, 2016 - link

    Great article Reply
  • beginner99 - Tuesday, November 29, 2016 - link

    And another couple of graphs clearly showing naive gamers getting ripped off by NV selling mid-range at flagship prices. Reply
  • just4U - Friday, December 02, 2016 - link

    I recall paying 400 for a Creative GeForce2 and (cough..) 870 for an Asus GeForce3, so... prices have remained steady through the years. Every once in a while AMD/ATI throws a monkey wrench into Nvidia's pricing by releasing really great cards at the high mid-range price, though, and that temporarily changes things. Nvidia did it once with their 460s as well. Reply
  • T1beriu - Tuesday, November 29, 2016 - link

    I guess you missed the AMD memo a month ago for lowering the prices of 460 and 470. The MSRP for the 460 2GB is $99 and for 470 4GB is $179. Reply
  • vladx - Wednesday, December 07, 2016 - link

    Too bad real actual prices don't reflect MSRP ones. Reply
  • mikelanding - Tuesday, November 29, 2016 - link

    This article and study fail to mention that AMD sales being up might be due to RX series cards being the most efficient for mining cryptocurrencies like Ethereum, Zcash and Monero. Miners are buying RX series cards in large quantities. I myself have many rigs (1 rig = 6 RX series cards) doing just mining. Reply
  • BrokenCrayons - Tuesday, November 29, 2016 - link

    I'm not into the crypto currency thing at all, but I've heard from multiple sources that CPU and GPU mining is too inefficient. Much of the mining workload has shifted to custom ASICs that offer better performance for lower prices and less power consumption. Reply
  • TheinsanegamerN - Tuesday, November 29, 2016 - link

    For bitcoin, yes. For many other alt-coins, GPU is still king of the hill. Reply
  • colonelclaw - Thursday, December 01, 2016 - link

    Well done to Nvidia and AMD etc. etc.
    Now, is there any chance you drop your bloody prices?
    Reply
  • oranos - Sunday, December 04, 2016 - link

    The GTX 1080 is to thank for this Reply
  • Apollo999 - Wednesday, December 07, 2016 - link

    Well, the numbers you have shown tell us that only one person in 1000 :))) changes his or her video card in 6 months... 500 years :))) must pass before the whole population of 7 billion humans living on this planet (not counting the dragons and other intelligent species) changes their computers and/or graphics cards.
    and please don't tell me about the integrated graphics by Intel and how really good they are...
    Reply
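The replacement-rate arithmetic in the comment above can be checked directly. This is purely illustrative; the 1-in-1000-per-6-months rate is the commenter's own estimate, not a figure from the article:

```python
# Sanity check: if 1 in 1000 owners replaces a video card per 6-month
# period, the average time for the whole installed base to turn over is
# period_length / replacement_rate.
replacement_rate = 1 / 1000  # fraction of owners upgrading per period
period_years = 0.5           # one period = 6 months

years_for_full_turnover = period_years / replacement_rate
print(years_for_full_turnover)  # 500.0
```

So the "500 years" figure does follow from the commenter's own assumed rate, which is the point of the joke.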
  • jaden24 - Wednesday, December 07, 2016 - link

    But, but, PC gaming is dying... Reply
