In what’s turning out to be an oddly GPU-centric week for Apple, this morning the company revealed that they will finally be giving the long-neglected Mac Pro a major update in the 2018 or later timeframe. Apple’s pro users have grown increasingly unhappy with the lack of updates to the company’s flagship desktop computer, and once released, this update will be the machine’s first in over four years.

Getting to the heart of matters, Apple invited a small contingent of press – including John Gruber and TechCrunch’s Matthew Panzarino – out to one of their labs to discuss the future of the Mac Pro and pro users in general. The message out of Apple is an odd one: they acknowledge that they erred in both the design and the handling of the Mac Pro (as much as Apple can make such an acknowledgement, at least), and that they will do better for the next Mac Pro. However, that Mac Pro won’t be ready until 2018 or later, and in the meantime Apple still needs to assuage their pro users – to prove to them that they are still committed to the Mac desktop and still committed to professional use cases.

Both of these articles are very well written, and rather than regurgitate them, I’d encourage you to read them. It’s extremely rare to see Apple talk about their future plans – even if it’s a bit vague at times – so this underscores the seriousness of Apple’s situation. As John Gruber puts it, Apple has opted to “bite the bullet and tell the world what your plans are, even though it’s your decades-long tradition — a fundamental part of the company’s culture — to let actual shipping products, not promises of future products, tell your story.”

However, neither story spends much time on what I feel is the core technical issue – Apple’s GPU options – so I’d like to spill a bit of ink on the subject, if only to provide some context for Apple’s decisions.

Analysis: GPUs Find Their Sweet Spot at 250 Watts

From a GPU perspective, the Mac Pro has been an oddball device from day one. When Apple launched it, they turned to long-time partner AMD to provide the GPUs for the machine. What AMD provided them with was their Graphics Core Next (GCN) 1.0 family of GPUs: Pitcairn and Tahiti. These chips were the basis of AMD’s Radeon HD 7800 and HD 7900 series cards launched in early 2012. And by the time the Mac Pro launched in late 2013, they were already somewhat outdated, with AMD’s newer Hawaii GPU (based on the revised GCN 1.1 architecture) having taken the lead a few months earlier.

Ultimately Apple got pinched by timing: they would need to have chips well in advance for R&D and production stockpiling, and that’s a problem for high-end GPU launches. These products just have slow ramp-ups.

Complicating matters further, the Mac Pro is an intricate, custom device. Apple favored space efficiency and low noise over standard form factors, so instead of using PC-standard PCIe video cards for the Mac Pro, they needed to design their own cards. And while the Mac Pro is modular to a degree, this ultimately meant that Apple would need to design a new custom card for each generation of GPUs. This isn’t a daunting task, but it limits their flexibility in a way they weren’t limited with the previous tower-style Mac Pros.


Mac Pro Assembled w/GPU Cards (Image Courtesy iFixit)

The previous two items have been known issues since the launch of the Mac Pro, and have commonly been cited as potential reasons holding back a significant GPU update all these years. However, as it turns out, this is only half of the story. The rest of the story – the consequences of Apple’s decision to go with dual GPUs and a shared heatsink via the thermal core – has only finally come together with Apple’s latest revelation.

At a high level, Apple opted to go with a pair of GPUs in order to chase a rather specific use case: using one GPU to drive the display, and using the second GPU as a co-processor. All things considered this wasn’t (and still isn’t) a bad strategy, but the number of applications that can use such a setup is limited. Graphical tasks are hit and miss in their ability to make good use of a second GPU, and GPU-compute tasks still aren’t as prevalent as Apple would like.

The drawback to this strategy is that if you can’t use the second GPU, two GPUs aren’t as good as one more powerful GPU. So why didn’t Apple just offer a configuration with a single, higher power GPU? The answer turns out to be heat. Via TechCrunch:

I think we designed ourselves into a bit of a thermal corner, if you will. We designed a system that we thought with the kind of GPUs that at the time we thought we needed, and that we thought we could well serve with a two GPU architecture… that that was the thermal limit we needed, or the thermal capacity we needed. But workloads didn’t materialize to fit that as broadly as we hoped.

Being able to put larger single GPUs required a different system architecture and more thermal capacity than that system was designed to accommodate. And so it became fairly difficult to adjust.

The thermal core at the heart of the Mac Pro is designed to cool a pair of moderately powerful GPUs – and let’s be clear here, at around 200 Watts each under full load, a pair of Tahitis adds up to a lot of heat – but it apparently wasn’t built to handle a single, more powerful GPU.

The GPUs that have come to define the high-end market, like AMD’s Hawaii and Fiji GPUs or NVIDIA’s GM200 and GP102 GPUs, all push 250W+ in their highest performance configurations. This, apparently, is more than Apple’s thermal core can handle. In terms of total wattage, just one of these GPUs would draw less than a pair of Tahitis, but it would be 250W+ concentrated over a relatively small surface area, as opposed to roughly 400W spread over nearly twice the area.

Video Card Average Power Consumption (Full Load, Approximate)

GPU (Video Card)               Power Consumption
AMD Tahiti (HD 7970)           200W
AMD Hawaii (R9 290X)           275W
AMD Fiji (R9 Fury X)           275W
NVIDIA GM200 (GTX Titan X)     250W
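To put rough numbers on the density argument, here is a back-of-the-envelope sketch. The wattages come from the table above; the heatsink area is a made-up placeholder unit of my own, since only the ratio between the two layouts matters:

```python
# Back-of-the-envelope heat flux comparison for the Mac Pro's thermal core.
# Wattages are the approximate full-load figures cited in the article;
# the per-card area is an arbitrary unit (only the ratio is meaningful).

CARD_AREA = 1.0  # one GPU card's share of the thermal core, arbitrary units

def power_density(watts, area):
    """Heat flux the thermal core must dissipate, in W per unit area."""
    return watts / area

# Dual-GPU layout: two ~200 W Tahitis, each spreading heat over its own card.
dual_tahiti = power_density(2 * 200, 2 * CARD_AREA)  # -> 200.0 W per unit area

# Hypothetical single big GPU (Hawaii/GM200 class): 250 W on one card's area.
single_big = power_density(250, CARD_AREA)           # -> 250.0 W per unit area

# Despite drawing ~150 W less in total, the single big GPU is the harder
# cooling problem: its heat is concentrated in half the area.
print(dual_tahiti, single_big)  # prints: 200.0 250.0
```

In other words, total wattage isn't the constraint; watts per unit of heatsink area is, which is why a lower-total-power single GPU can still exceed what the thermal core was designed for.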

It’s a strange day when Apple has backed themselves into a corner on GPU performance. The company has been one of the biggest advocates for more powerful GPUs, pushing the envelope on their SoCs, while pressuring partners like Intel to release Iris Pro-equipped (eDRAM-backed) CPUs. However what Apple didn’t see coming, it would seem, is that the GPU market would settle on 250W or so as the sweet spot for high-end GPUs.


Mac Pro Disassembled w/GPU Cards (Image Courtesy iFixit)

And to be clear here, a given GPU’s power consumption is not a fixed quantity. AMD’s Fiji GPU was the heart of the 275W R9 Fury X video card, but it was also the heart of the 175W R9 Nano. There is clearly room to scale down to power levels more in line with what Apple’s thermal core can handle, but performance is lost in the process. Without the ability to cool a 250W video card, Apple can’t offer GPU performance that rivals the powerful PC workstations it is still very much in competition with.
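The Fury X/Nano gap illustrates how dynamic power scales. As a toy model – the cubic power law and the scaling factors below are simplifying assumptions on my part, not AMD specifications – dynamic power goes roughly as C·V²·f, and since the minimum stable voltage tends to track frequency, power falls roughly with the cube of clock speed while throughput falls only linearly:

```python
# Toy model of why down-clocking saves disproportionate power.
# Dynamic power ~ C * V^2 * f; assuming stable voltage scales roughly with
# frequency, power scales ~cubically with clock while throughput scales
# ~linearly. All numbers here are illustrative assumptions, not measurements.

def scaled_power(base_power_w, clock_scale):
    """Approximate power after scaling clock (and voltage along with it)."""
    return base_power_w * clock_scale ** 3

def scaled_perf(base_perf, clock_scale):
    """Throughput scales roughly linearly with clock."""
    return base_perf * clock_scale

base_power = 275.0  # Fury X-class card, watts (from the table above)
for scale in (1.0, 0.9, 0.85):
    print(f"clock x{scale:.2f}: ~{scaled_power(base_power, scale):.0f} W, "
          f"~{scaled_perf(100, scale):.0f}% of full performance")
```

Under this crude model, a ~15% clock reduction brings a 275W-class card down to roughly the Nano’s 175W while giving up only ~15% of its throughput – which is broadly the trade AMD made, and the kind of trade Apple would have to make to stay inside its thermal core.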

Ultimately I think it’s fair to say that this was a painful lesson for Apple, but hopefully one they take to heart. The lack of explicit modularity and user-upgradable parts in the Mac Pro has always been a point of concern for some customers, and it has ultimately made the current design the first and last of its kind. Apple is indicating that the next Mac Pro will be much more modular, which would get them back on the right track.

Source: Daring Fireball


  • zepi - Tuesday, April 4, 2017 - link

    I think it is the exact opposite. "We figured out that we are losing sales and there needs to be a better and more sustainable way of milking money out of them."
  • bill.rookard - Tuesday, April 4, 2017 - link

    I think that's part of it. Take a look at the hardware you can get right now. I can put together a Ryzen 7 1700, 32GB of RAM, an SSD or two, and a GTX 1070 for easily under $1000 USD and the performance would wind up pretty comparable to one of the moderately spec'd M-Pros at 1/4 the cost.

    No, it won't look like a little circular pencil holder on my desk, but I can do quite a lot with the extra cash.
  • cocochanel - Tuesday, April 4, 2017 - link

    +1!!!
  • osxandwindows - Tuesday, April 4, 2017 - link

    no thunderbolt?
  • WinterCharm - Tuesday, April 4, 2017 - link

    > but I can do quite a lot with the extra cash.

    like a nice pair of 4K monitors...
  • Strunf - Thursday, April 6, 2017 - link

    The thing is that the Mac Pro is sold to artists and "trendy" people ... your PC could be 10x better it would still look like crap for them.

    For sure no one is buying the Mac Pro for its value or performance.
  • drothgery - Tuesday, April 4, 2017 - link

    You can at least nominally be in the pro workstation market if your machine is reasonably upgradeable (at least for the GPU) or if you at least update it every year, but if you do neither? Not so much.
  • BurntMyBacon - Thursday, April 6, 2017 - link

    Agreed. I think they underestimated the work it would take to churn out new custom video cards periodically. I'm not convinced by the "we were thermally constrained" argument. As stated in the article, Fiji (175W) was capable of running within their thermal envelope (200W). A Fury Nano level of graphics card or two would have been a pretty nice upgrade over the 850MHz not-quite-7970 equivalent with a little more memory as well. Want even more graphics memory? Polaris 10 gives you more memory, still gives a performance improvement over what they had, and has an even lower TDP than the Fury Nano (~150W). They've got at least a year before the next Mac Pro is supposed to come out. If they were really serious about the Mac Pro, they could still release an RX480 equivalent.
  • fred666 - Tuesday, April 4, 2017 - link

    The trash can Mac Pro was a stupid idea to begin with, serving no purpose.
    They should build a real Pro device, with plenty of ports and room for expansion (hard drives, add-in cards).
    Also, not every "Pro" needs a discrete GPU and it should be optional.
  • Eidigean - Tuesday, April 4, 2017 - link

    When "Pro" means 6-24 core Xeon, then yes you do need a discrete GPU; unless you want it headless. Only the mainstream 4 core consumer chips have an integrated GPU. The trash can had a purpose: moderate compute ability in a very quiet package. You're not going to reach those low decibel levels with a blower on a GPU.

    Don't get me wrong, I want dual GTX 1080 Ti's in a Mac that blow the exhaust out the back, but my purpose is not compute inside a quiet sound studio. Perhaps this will bring Nvidia back to the table to make Pascal drivers for Mac.
