Here's something I'm concerned about. AMD's Carrell Killebrew, part of yesterday's announced layoffs, was a Director of Product Planning for AMD's GPU division. His job, at least as he explained it to me so many times in the past, was to figure out what the next 3 to 5 years of AMD GPUs were going to look like. He's still technically with AMD today, although that will change in the not-too-distant future.

Carrell is a dedicated guy; he works hard and generally seems to know the right move for AMD in the GPU space. His recent track record is a good one. The jury is still out on AMD's 28nm GPUs, but the last three years of AMD GPU releases have been excellent: competitive and well executed.

When reducing your workforce to cut costs, you don't go after your product planners - unless their vision and your vision don't line up. We all know what Carrell wanted for the future of AMD GPUs (as I wrote before, he wanted to deliver a first generation "Holodeck" by 2016), but what does AMD's new CEO want that conflicts with this goal?

Carrell's vision saw the continued growth of the high-end GPU. On November 9th we're supposed to hear more about Rory Read's new strategy for AMD. I am concerned that it may conflict with Carrell's vision. Maybe I'm reading too much into all of this. What do you all think?


48 Comments


  • djc208 - Saturday, November 5, 2011 - link

    It seems like AMD is betting much harder on CPU/GPU integration than on the discrete market. So a product planner who wants to push high-horsepower GPU systems with larger and more complex chips probably doesn't fit with what seems to be AMD's vision of a chip that's less CPU/GPU and more of a big integrated package. Intel is pushing in the same direction as more and more of the discrete systems become integrated into the CPU.

    In all honesty, I can't say they are wrong for the mass of the market. How many work PCs (mine included) couldn't be handled more efficiently by such a system? Even most home users are the same. If a tablet can do 90% or more of what you want to do on a computer, will most people really need a monster desktop with huge power? If you design smart CPU/GPU combos that handle home-use tasks well (video playback, transcoding, and power management), then most people will be happy.

    I may like my gaming rig and its capabilities, but even I'll admit that the most-used PC in my house is my home server, which does DVR duties as well as storage and delivery - and it's headless.
  • TristanSDX - Saturday, November 5, 2011 - link

    They don't make a profit on Radeons. The Radeons are a point of pride, but the lack of profit is a shame. Designing GPUs is a pure passion at AMD. It doesn't matter that they were first with GDDR5, or with DX11, or with a new silicon process. What matters is lots of customers and cash - and NV leads in those areas.
    Additionally, NV is a more complete graphics company. They make lots of software (CUDA, PhysX, OptiX, and many other tools) and have strong relationships with game developers, industry, movie studios, academia, etc.
    AMD has only good but unprofitable GPUs with some drivers, and nothing more.
    Maybe Rory Read wants a new NV inside AMD rather than the old ATI.
  • Targon - Monday, November 7, 2011 - link

    Where do you think the GPU side of the APU came from, thin air? It comes from the Radeon side of things, and advancing the state of the GPU means long-term advances in the AMD APUs.
  • george1976 - Monday, November 7, 2011 - link

    Numbers, or I will call you a fanboy :)
  • Pantsu - Saturday, November 5, 2011 - link

    I wonder how Killebrew's departure and the ongoing restructuring will affect the development of next-gen GPUs. While Southern Islands is unlikely to be affected by this, the generation after that might be completely different if they ditch the current GPU strategy explained in the slide. Maybe they'll go for an Nvidia-style big chip again?

    It would be really sad (and bad) if AMD completely abandoned high-performance GPUs and CPUs and concentrated on Brazos-class products.
  • Tanclearas - Saturday, November 5, 2011 - link

    1. AMD's APU Strategy is just a sum of its parts (CPU+GPU)

    From Wikipedia:

    "Fusion was announced in 2006 and has been in development since then. The final design is the product of the merger between AMD and ATI, combining general processor execution as well as 3D geometry processing and other functions of modern GPUs (like GPGPU computation) into a single die."

    The reality is, AMD has done very little in the mainstream to actually leverage GPGPU. Intel has done a better job here with QuickSync. The GPU folks have done a horrible job of letting people use their GPUs for anything other than graphics. AMD's video converter offers a limited number of default targets, and there are extremely few configurable options. To top it off, I'm willing to bet most people with AMD GPUs don't even know how to access the video converter, or even know it's there.

    2. GPU design methods forced upon CPU

    Nvidia and ATI use(d) more automated approaches to chip design. When they are releasing new architectures every 18 months, this is pretty much a requirement. This design approach won out between the CPU and GPU designers following the ATI acquisition, and was used for Bulldozer. The result is a chip that requires many more transistors to essentially just match the performance of the previous generation.

    I'm guessing the debate between the CPU and GPU folks about the design strategy was an unpleasant one, and given the result, the GPU folks are taking much of the blame.

    3. Failure in high-end GL products

    Nvidia commands a huge lead in the workstation class GPU segment. AMD is pretty much failing to compete in this high profit area as well. This isn't completely disconnected from the first point. AMD needs to significantly improve the software and drivers associated with their graphics chips.
  • TSS - Saturday, November 5, 2011 - link

    It's sad, but not a bad thing per se. What lots of people seem to forget is what graphics cards were developed to do in the first place: display graphics! I remember playing Carmageddon 2 at <25 FPS and telling my friend he needed a new graphics card because his Voodoo 1 wasn't cutting it any more. Back then, software always outpaced the hardware.

    In the current console age, though, the most money isn't made off gamers; it's made off everybody else. Most games are ported from console to PC, and while the PC gets performance upgrades, the consoles don't, which limits what the games can actually do.

    And we've seen news reports that consoles are starting to lose revenue and die out in favor of social media games, which require even less graphical power.

    I've only built a monster PC once in my life. It had a Radeon X1900 XT graphics card bought a day after it was released, and the entire PC cost around $3500. Back then, the rule was: "For every $1100 you spend, you get an extra year of use. At >$3300, you get one year of running games on high settings, one year on medium, and one year on low - then you need a new PC." That was about 5 years ago, and 3 years later I needed a new PC, as predicted. But the PC I bought then I still have now, with a GTX 275. It's still the most expensive component, but more than $200 cheaper than what I paid for the X1900 XT. Five years ago, I would have had to drop everything from max settings within 6 months. Two years on, BF3 is the most taxing game released in a long time, and it's the first game I can't run on "ultra" anymore - but it runs at >60 FPS on medium, and some settings I can even set to high. In the current age, if I buy a $500 graphics card now, it will still run games in a decade. Good for me, yes, but bad for AMD, who won't sell me another graphics card for an entire decade - yet still has to develop new high-end parts that entire decade at the rate of two new generations per year!

    The reason to have high end parts simply isn't there anymore.

    But since everybody seems in agreement that the future of computing is mobile, be it smartphone, tablet, or notebook, that would be the smart segment to focus on. It will still be about performance per watt, only at much lower wattages than we're used to. There's much more room to grow, as mobile graphics still suck (while PCs hardly have any room left for improvement, really).

    This doesn't mean high-end parts will disappear! Once they have a good laptop design, there's nothing to prevent them from just upping the watts. Remember, Intel's Core CPU lineup came from a CPU designed for mobile, the Pentium M. But it will mean that the focus is no longer on getting the newest, fastest GPU... it will be on getting the one that does more with less.
  • nofumble62 - Saturday, November 5, 2011 - link

    Nobody is playing serious video games on a mobile device. Perhaps AMD wants to change that.

    OK, but before we get to that point, AMD must have an SoC design. Even Intel is struggling in this space, but they have bought many small companies, each providing a piece of the puzzle. In 1-2 years, Intel will have an SoC with lots of features integrated into a single chip. AMD hasn't even started.

    Who wants to partner with AMD to get into this space? I say, who else? Qualcomm - but they don't have to buy AMD. Just send a headhunter to the AMD campus.
  • jabber - Saturday, November 5, 2011 - link

    As you'll find, more and more exec boards will be taken over by the accountants and finance men. The creatives will be pushed out. It will happen to Nvidia/Intel/Apple etc. in time.

    It's all about cutting costs, the bottom line, etc. Exciting or revolutionary products don't come into their way of thinking. It's the money that counts to them, not the 'vision'.

    Eventually we will all end up being drip fed a line of similar dull mediocre products from all the big corps.
  • softdrinkviking - Sunday, November 6, 2011 - link

    Seriously, this always happens.
    It's like the Walmart syndrome:
    goodbye to quality and care, hello to the bottom line.
