Plotting the Mac Pro’s GPU Performance Over Time

The Mac Pro’s CPU options have ballooned at times during its 7-year history. What started with four CPU options grew to six for the early 2009 - mid 2010 models. It was also during that period that total core counts expanded from 4 up to the current mix of 4, 6, 8 and 12-core configurations.

What’s particularly unique about this year’s Mac Pro is that all configurations are accomplished with a single socket. Moore’s Law and the process cadence it characterizes leave us in a place where Intel can effectively ship a single die with 12 big x86 cores. It wasn’t that long ago that you’d need multiple sockets to achieve the same core count.

While the CPU moved to a single socket configuration this year, the Mac Pro’s GPU went the opposite direction. For the first time in Mac Pro history, the new system ships with two GPUs in all configurations. I turned to Ryan Smith, our Senior GPU Editor, for his help in roughly characterizing Mac Pro GPU options over the years.

Mac Pro - GPU Upgrade Path
| | Mid 2006 | Early 2008 | Early 2009 | Mid 2010 | Mid 2012 | Late 2013 |
|---|---|---|---|---|---|---|
| Slowest GPU Option | NVIDIA GeForce 7300 GT | ATI Radeon HD 2600 XT | NVIDIA GeForce GT 120 | ATI Radeon HD 5770 | ATI Radeon HD 5770 | Dual AMD FirePro D300 |
| Fastest GPU Option | NVIDIA Quadro FX 4500 | NVIDIA Quadro FX 5600 | ATI Radeon HD 4870 | ATI Radeon HD 5870 | ATI Radeon HD 5870 | Dual AMD FirePro D700 |

Since the Mac Pro’s GPU offerings were limited to 2 - 3 cards per generation, it was pretty easy to put together comparisons. We eliminated the mid-range configuration for this comparison and only looked at scaling with the cheapest and most expensive GPU options each generation.

Now we’re talking. At the low end, Mac Pro GPU performance improved by 20x over the past 7 years. Even if you always bought the fastest GPU possible you'd be looking at a 6x increase in performance, and that's not taking into account the move to multiple GPUs this last round (if you assume 50% multi-GPU scaling then even the high end path would net you 9x better GPU performance over 7 years).
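
To make that multi-GPU math explicit, here's a quick back-of-the-envelope sketch. The 6x figure and the 50% multi-GPU scaling assumption are the ones quoted above; nothing else in it is measured data:

```python
# Back-of-the-envelope math for the high-end GPU upgrade path, using the
# figures quoted above: a 6x single-GPU improvement since 2006 and an
# assumed 50% benefit from the second GPU in the Late 2013 Mac Pro.
single_gpu_gain = 6.0      # fastest single-GPU option, 2006 -> 2013
multi_gpu_scaling = 0.5    # assumed scaling contribution of the second GPU

effective_gain = single_gpu_gain * (1 + multi_gpu_scaling)
print(f"Effective high-end GPU gain over 7 years: ~{effective_gain:.0f}x")  # ~9x
```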

Ryan recommended presenting the data with a log scale as well to more accurately depict the gains over time:

Here you see convergence, at a high level, between the slowest and fastest GPU options in the Mac Pro. Another way of putting it is that Apple values GPU performance more today than it did back in 2006, so even the cheapest GPU option is a much higher performing part than it once would have been.
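
For readers who want to reproduce this kind of chart themselves, the sketch below shows how relative performance data like this could be plotted on both linear and log scales with matplotlib. The numbers in it are placeholders for illustration only, not the measured results behind the charts above:

```python
# Minimal sketch: plotting relative GPU performance per generation on a
# linear axis and a log axis. The values below are placeholders, NOT the
# article's measured data.
import matplotlib.pyplot as plt

generations = ["Mid 2006", "Early 2008", "Early 2009", "Mid 2010", "Mid 2012", "Late 2013"]
slowest = [1, 2, 3, 6, 6, 20]    # placeholder relative performance, slowest GPU option
fastest = [3, 5, 8, 12, 12, 27]  # placeholder relative performance, fastest GPU option

fig, axes = plt.subplots(1, 2, figsize=(10, 4), sharex=True)
for ax, scale in zip(axes, ("linear", "log")):
    ax.plot(generations, slowest, marker="o", label="Slowest GPU option")
    ax.plot(generations, fastest, marker="o", label="Fastest GPU option")
    ax.set_yscale(scale)  # the log axis makes proportional gains comparable across generations
    ax.set_title(f"{scale} scale")
    ax.tick_params(axis="x", labelrotation=45)
axes[0].set_ylabel("Relative GPU performance")
axes[0].legend()
fig.tight_layout()
plt.show()
```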

If you’re a GPU company (or a Senior GPU Editor), this next chart should make you very happy. Here I’m comparing relative increases in performance for both CPU and GPU on the same graph:

This is exactly why Apple (and AMD) is so fond of ramping up GPU performance: it’s the only way to get serious performance gains each generation. Ultimately we’ll see GPU performance gains level off as well, but if you want to scale compute in a serious way you need to heavily leverage faster GPUs.

This is the crux of the Mac Pro story. It’s not just about a faster CPU, but rather a true shift towards GPU compute. In a little over a year, Apple increased the GPU horsepower of the cheapest Mac Pro by as great a margin as it did from 2006 to 2012. The fastest GPU option didn’t improve by quite as much, but it’s close.

Looking at the same data on a log scale you’ll see that the percentage increase in GPU performance is slowing down over time, much like what we saw with CPUs, just to a much lesser extent. Note that this graph doesn't take into account that the Late 2013 Mac Pro has a second GPU. If we take that into account, GPU performance scaling obviously looks even better. Scaling silicon performance is tough regardless of what space you’re playing in these days. If you’re looking for big performance gains though, you’ll need to exploit the GPU.

The similarities between what I’m saying here about GPU performance and AMD’s mantra over the past few years aren’t lost on me. The key difference between Apple’s approach and those of every other GPU company is that Apple spends handsomely to ensure it has close to the best single-threaded CPU performance as well as the best GPU performance. This is an important distinction, and ultimately the right approach.

Comments

  • Dandu - Friday, January 10, 2014 - link

    Hi,

    It's possible to use a 2560 x 1440 HiDPI resolution with an NVIDIA card, a 4K display, and the (next) version of SwitchResX.

    I have tested that: http://www.journaldulapin.com/2014/01/10/ultra-144...
  • Haravikk - Sunday, January 12, 2014 - link

    The news about the USB3 ports is a bit strange; doesn't that mean a maximum throughput of 4Gbps? I know most USB3 storage devices will struggle to push past 500MB/sec, but that seems pretty badly constrained. Granted, Thunderbolt is the interface that any storage *should* be using, but the choices are still pretty poor for the prices you're paying, and no one offers Thunderbolt to USB3 cables (only insanely priced hubs with external power).

    Otherwise the review is great, though it'd be nice to see more on the actual capabilities of Apple's FirePro cards. Specifically, how many of the FirePro-specific features do they have, such as 30-bit colour output, EDC, ECC cache memory, order-independent transparency (under OpenGL) and so on? I'm assuming they do given that they're using the FirePro name, but we really need someone to cover it in depth to finally put to rest claims that consumer cards would be better ;)
  • eodeot - Monday, February 24, 2014 - link

    I'd love a realistic comparison with an i7 4770K and, say, a 780 Ti.

    You also compare the 12-core version to older 12-core versions that hide behind (fairly) anonymous Xeon labeling that obscures their chip age (Sandy Bridge/Ivy Bridge/Haswell...). I'd like to see in how many real-world applications a 12-core chip actually performs faster. Excluding 3D work and select video rendering, I doubt there is much need for extra cores. You note how it's nice to have a buffer of free cores for everyday use during heavy rendering, but I never noticed a single hiccup or slowdown with 3D rendering on my i7 4770K with all 8 logical cores taxed to their max. How much better than the "butter smooth" performance already provided by a much cheaper CPU can you get?

    Also, you compare non-Apple computers with the same ridiculous CPU/GPU combinations. Who in their right mind would choose a 4-core Xeon chip over a Haswell i7? The same goes for a silly "workstation" GPU over, say, a Titan. Excluding dated OpenGL 3D apps, no true modern workstation benefits from a "workstation" GPU, unless we count select CUDA-based 3D renderers like iray and V-Ray RT that can benefit from 12GB of RAM. The GPUs included with the Apple Mac Pro have 2GB... Not a single valid reason a sane person would buy such a card. Not one.

    Also, you point out how gaming makes the most sense on Windows, but make no such recommendation for 3D work. Like games, 3D programs perform significantly better under DirectX, and that leaves Windows as the sole option for any serious 3D work...

    I found this review interesting for the design Apple took, but everything else reads like one-sided praise...
  • pls.edu.yourself - Wednesday, February 26, 2014 - link

    QUOTE: "The shared heatsink makes a lot of sense once you consider how Apple handles dividing compute/display workloads among all three processors (more on this later)."

    Can anyone help point me to this? I think one of my GPUs is not being used.
  • PiMatrix - Saturday, March 8, 2014 - link

    Apple fixed the HiDPI issue on the Sharp K321 in OS X 10.9.3. Works great. Supported resolutions are the native 3840x2160, and in HiDPI: 3200x1800, 2560x1440, 1920x1080, and 1280x720. You can also define more resolutions with QuickResX, but the above seem to be enough. Using 3200x1800 looks fantastic on this 4K display. Great job Apple!
  • le_jean - Monday, March 10, 2014 - link

    Any information on updated 60Hz compatibility concerning Dell's UP2414Q in 10.9.3?
    I would be very interested to get some feedback on:
    nMP & Dell UP2414Q
    rMBP & Dell UP2414Q

    I remember in the AnandTech review of the Late 2013 nMP there were issues concerning that specific display, while the Sharp and ASUS performed just fine.
  • philipus - Monday, April 14, 2014 - link

    As a happy photo amateur, I have to say the previous Mac Pro is good enough for me. I have the early 2008 version, which I like because of its expandability. Over the years I have added drives, RAM and most recently a Sonnet Tempo Pro with two Intel 520s in order to get a faster system. As cool and powerful as the new Mac Pro is, it would cost me quite a lot to add Thunderbolt boxes for the drives I currently use, so it is not worth it for me.

    I do agree that it is about time a manufacturer of desktop computers pushed the platform envelope. It's been tediously samey for a very long time. I'm not surprised it was Apple that made the move - it's in Apple's DNA to be unexpected design-wise. But as much as it is nice to see a radical re-design of the concept of the desktop computer, I think a future version of the Mac Pro needs to be a bit more flexible and allow more user-based changes to the hardware. Even if I could afford the new Mac Pro - and I would also place it on my desktop because it's really pretty - I wouldn't want to have several Thunderbolt boxes milling around with cables variously criss-crossing and dangling from my desk.
  • walter555999 - Saturday, June 7, 2014 - link

    Dear Anand, could you post how to connect a UP2414Q to a MacBook Pro Retina (2013)? I have tried a Mini DisplayPort-to-HDMI cable, but there is no image on the Dell monitor. Thank you very much. Walter
  • Fasarinen - Saturday, August 9, 2014 - link

    Thanks for an excellent review. (And hello, everybody; this is my first post on this site.)

    I noticed, in the "GPU choices" section, what seems to be a very useful utility for monitoring the GPU. The title at the top of the screen is "OpenCL Driver Monitor"; the individual windows (which display graphs of GPU utilisation) seem to be titled "AMDRadeonXL4000OpenCLDriver".

    I'm probably just being dim, but a bit of googling doesn't shed much light. If anybody could point me to where this utility can be obtained, I'd be most grateful.

    Thanks ....
  • pen-helm - Friday, September 12, 2014 - link

    I showed this page to a Mac user. They replied:

    I'm pretty sure that this simple fix takes care of the issue with monitors where OS X doesn't offer a HiDPI mode:

    http://cocoamanifest.net/articles/2013/01/turn-on-...
