GPU Performance

We’ve already established that NVIDIA’s Kepler architecture is fast, and while the GeForce GT 650M used in the rMBP is hardly the best NVIDIA has to offer, the result is still a significant improvement in performance over the Radeon HD 6750M used in the previous-generation model.

15-inch MacBook Pro Model   Mid 2010          Upgraded Early 2011   Upgraded Late 2011   Retina
GPU                         GeForce GT 330M   Radeon HD 6750M       Radeon HD 6770M      GeForce GT 650M
Cores                       48                480                   480                  384
Core Clock                  500MHz            600MHz                675MHz               900MHz
Memory Bus                  128-bit GDDR3     128-bit GDDR5         128-bit GDDR5        128-bit GDDR5
Memory Data Rate            1580MHz           3200MHz               3200MHz              5016MHz
Memory Size                 512MB             1GB                   1GB                  1GB

The GT 650M offers fewer “cores” than the 6750M and 6770M used in previous MacBook Pros, but likely makes better use of the available hardware. NVIDIA also clocks the cores much higher in the 650M; the result is a ~20% increase in theoretical raw compute power.
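
As a rough sanity check on that ~20% figure, here is a minimal sketch of the peak-throughput math, assuming two FLOPs (one fused multiply-add) per core per clock and the specs from the table above:

```python
# Theoretical peak single-precision throughput: cores * 2 FLOPs (FMA) * core clock
gpus = {
    "Radeon HD 6750M": (480, 600e6),   # (cores, core clock in Hz)
    "Radeon HD 6770M": (480, 675e6),
    "GeForce GT 650M": (384, 900e6),
}

for name, (cores, clock) in gpus.items():
    print(f"{name}: {cores * 2 * clock / 1e9:.0f} GFLOPS")

# GT 650M (~691 GFLOPS) vs. 6750M (~576 GFLOPS) -> roughly a 20% increase
```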

The memory bandwidth story is also better on Kepler. While both the GT 650M and the 67xxM feature a 128-bit GDDR5 interface, Apple clocked AMD’s memory interface at 800MHz compared to 1254MHz on Kepler. The resulting difference is 51.2GB/s of memory bandwidth on the Radeon vs. 80.3GB/s on the GT 650M.
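
Those bandwidth figures fall straight out of the memory clock and bus width; a minimal sketch, assuming GDDR5’s four data transfers per memory clock:

```python
# Bandwidth = memory clock * 4 transfers/clock (GDDR5) * bus width in bytes
def gddr5_bandwidth_gb_s(mem_clock_mhz, bus_width_bits=128):
    return mem_clock_mhz * 1e6 * 4 * (bus_width_bits / 8) / 1e9

print(gddr5_bandwidth_gb_s(800))   # Radeon HD 6750M: 51.2 GB/s
print(gddr5_bandwidth_gb_s(1254))  # GeForce GT 650M: ~80.3 GB/s
```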

The real-world impact is most noticeable at higher resolutions, thanks to the tremendous amount of memory bandwidth now available. The other benefit of the new GPU is that it runs a lot cooler, which, as I’ve already shown, considerably reduces thermal throttling under load.

[Chart: Portal 2 Performance]

[Chart: Half Life 2 Episode Two Performance]

At 1440 x 900 we actually see a regression compared to the 2011 models, but differences between the AMD and NVIDIA GPU drivers alone could account for the gap at this hardly GPU-bound setting. Look at what happens once we crank up the resolution:

[Chart: Half Life 2 Episode Two Performance]

At 1680 x 1050 with 4X AA enabled we see a modest 11% increase in performance over last year's MacBook Pro. As I established earlier, however, the rMBP will be able to deliver this performance more consistently over an extended period of time.

What's even more impressive is the 42.4 fps the GT 650M is able to deliver at the rMBP's native 2880 x 1800 resolution. Even though I ran the test with AA enabled, I'm pretty sure AA was automatically disabled. At 2880 x 1800 the rMBP is able to outperform the two-year-old MacBook Pro running at 1680 x 1050. How's that for progress?
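
For a sense of scale, a quick sketch of the pixel counts involved at each test resolution:

```python
# Pixels rendered per frame at each benchmark resolution
base = 1680 * 1050
for w, h in [(1440, 900), (1680, 1050), (2880, 1800)]:
    print(f"{w} x {h}: {w * h / 1e6:.2f} MP ({w * h / base:.2f}x the 1680 x 1050 workload)")

# Native 2880 x 1800 pushes ~2.9x the pixels of the 1680 x 1050 test
```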

While the gains we've shown thus far have been modest at best, Starcraft 2 is a completely different story. Here, for whatever reason, the IVB + Kepler combination can be up to 2x the speed of last year’s models. I reran the tests on both the older hardware and the rMBP to confirm, and the results were repeatable. The best explanation I have is that Starcraft 2 is very stressful on both the CPU and GPU, so we could be seeing some thermal throttling on the older SNB + Turks hardware here.

[Charts: Starcraft 2 GPU Bench]

[Charts: Starcraft 2 CPU Bench]

Once again we see playable, although not entirely smooth, frame rates at 2880 x 1800. I've also included a screenshot of SC2 at 2880 x 1800 below:


[Screenshot: Starcraft 2 at 2880 x 1800, it's playable]

Although gaming options continue to be limited under OS X, Diablo 3 is available and finally performs well on the platform thanks to the latest patches. Diablo 3 performance is appreciably better on the GT 650M than on last year’s 6750M. There’s no FRAPS equivalent under OS X (free advertising to the first eager dev to correct that), so I have to rely on a general discussion of performance here; a rough sketch of the frame-timing math appears below the screenshot. The GT 650M is fast enough to drive the rMBP’s 2880 x 1800 panel at native resolution at playable frame rates, around 18 fps on average. Connected to an external 2560 x 1440 display, however, the GT 650M is fast enough to deliver around 30 fps in Diablo 3. For what it’s worth, performance under Diablo 3 is far more consistent on the rMBP than on last year’s MacBook Pro. I suspect once again we’re seeing the effects of thermal throttling under heavy CPU/GPU load on the older hardware, well mitigated here by the move to more power-efficient silicon.


[Screenshot: Diablo 3 at 2880 x 1800]
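
As noted above, there's no FRAPS equivalent under OS X; for what it's worth, average and worst-case frame rates can be estimated from per-frame timestamps. A minimal sketch of that math, with hypothetical sample data standing in for whatever per-frame log a game or overlay might expose:

```python
# Estimate average and worst-case fps from per-frame timestamps (in seconds).
# frame_times is hypothetical sample data, not measurements from this review.
frame_times = [0.000, 0.055, 0.112, 0.166, 0.223, 0.278]

deltas = [b - a for a, b in zip(frame_times, frame_times[1:])]
avg_fps = len(deltas) / (frame_times[-1] - frame_times[0])
worst_fps = 1 / max(deltas)  # slowest single frame
print(f"avg: {avg_fps:.1f} fps, worst frame: {worst_fps:.1f} fps")
```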

Comments (471)

  • OCedHrt - Sunday, June 24, 2012 - link

    He missed another important point. All of that was in 3 lbs. Now, the current generation starting from last summer has external discrete graphics and an optical drive connected via a Thunderbolt-based connector (because Apple had exclusivity), with the laptop being only 2.5 lbs.

    This isn't going to compare to the retina MacBook Pro though - at 15 inches, 4.5 lbs is pretty impressive, although I think if Sony wanted to do it they could do 4 lbs or less.
  • deathdemon89 - Saturday, June 23, 2012 - link

    I agree completely, I'm a proud owner of the old Z, and even today it doesn't feel the least bit dated. And the 1080p screen is holding up admirably well, with no signs of pixellation at normal viewing distances. This device was truly innovative for its time. I still don't understand why it received such mixed reviews by the press.
  • Spunjji - Monday, June 25, 2012 - link

    Mainly the price. Only Apple are allowed to charge that much for a laptop. Also, only Apple can have hot systems. Repeat ad infinitum.
  • mlambert890 - Wednesday, November 28, 2012 - link

    Really ridiculous comment. I can see you are bitter, as is the other mega Z fan, but come on already. I worked for Sony for 5 years and love the company. I have owned probably a dozen Vaios, including the top-of-the-line last-gen Z (with the SSD RAID).

    Instead of ranting and raving you need to ask yourself *why it is* that "only Apple can charge so much" and why "Anand only gives a free pass to Apple"

    You feel what exactly? That there is some grand conspiracy in play? Do you realize how ridiculous that sounds?

    WHY has Sony *lost the ability to charge a premium*? In other words WHY have they *burned all customer loyalty and good will*? I left the company back in 1999 because I saw the writing on the wall.

    You (and the other Z guy) are no different than any other apologist. Companies don't bleed market share and fail to sell cancer-curing products (you guys are presenting the Z as "truly revolutionary", right?) for no reason. Sorry to tell you, there is no "big conspiracy".

    Sony sells super high priced products into a super commoditized market and then they layer on a CRAP TON of bloatware dragging the machine to a stop, make idiot decisions like the HDMI one, and push proprietary crap *worse* than Apple ever has. All of that into the Wintel space which, sorry to tell you, was *always* driven by the cheapest possible parts at the best possible price.

    The PC industry grew *because it was cheap*. Apple *always* occupied a premium niche. I vividly remember the initial release of the Apple I, the Lisa, the Mac 128. These were always premium products and the competition at the time (be it Commodore, Atari, TI, or the Wintel ecosystem) *always* captured share by being cheap.

    That may annoy you for some reason, but Apple has pretty much *always* held a core premium audience. The only exception was the period of insanity when Jobs had been pushed out and Sculley destroyed the company. Even then, the core fans stayed.

    You two make it sound like poor Sony is a victim because the world doesn't all run out and buy the Vaio Z.

    Even *without Apple* Sony would be going under, hate to tell you. Sony's problems are Sony's and the whole is *not* the sum of its parts with a laptop.
  • solipsism - Saturday, June 23, 2012 - link

    None of that makes sense and is, in fact, rubbish.

    Sony added 1080p because it was popular, not because it made sense. You have a 168 PPI display on a 13" machine, which makes text too small to be a good experience for most users.

    They also didn't use a good quality display or add anything to the SW to make the experience good (unlike what Anand talked about in this review); they just added the single metric that was trending because of HDTVs.

    Blu-ray in a notebook has always been a silly option for most users. There is a reason BRD adoption failed on PCs and it's not because everyone is stupid... except you. ODDs are long overdue for removal since they take up 25% of the chassis, require placement at an edge (eating up over 5" of port real estate and restricting design), require a lot of power, are noisy, are more prone to break due to the many moving parts, are slow, are just too expensive to be feasible, and add nothing visually that most users trying to watch a movie can discern.

    Quad-SSDs? Really? That's a sensible solution for a consumer notebook?
  • EnzoFX - Saturday, June 23, 2012 - link

    And that really is what people don't get. It isn't just about raw specs. The package needs to be complete, polished, what have you. In the case of high-DPI screens, that means good scaling support, and Apple did it. Support on the software side is something they never get credit for from the Apple haters. All they can see is numbers and think "I've seen numbers like that before".
  • mabellon - Saturday, June 23, 2012 - link

    No, Apple didn't do it. Just like on the iPad, they increased resolution by doubling width and height. Their software simply doesn't scale well to arbitrary higher resolutions. If it were done right, then Chrome would work out of the box - instead the OS 2x-scales everything without increasing resolution/quality.

    To the consumer, the choice means a good experience without breaking apps. But claiming that Apple was successful simply because of software? HA!
  • Ohhmaagawd - Saturday, June 23, 2012 - link

    Did you actually read the retina part of the review?

    Chrome doesn't work right because they do their own text rendering. Read the review. If an app uses the native text rendering, the app will look good (at least the text portion). They will have to update the graphical assets of course.

    BTW, Chrome Dev builds have this issue fixed.

    Windows' DPI setting isn't on by default, so few people use or even know about the setting, and devs haven't made sure their apps work properly at high DPI settings.

    Apple has made a move that will be short-term painful in having apps that aren't updated look a bit fuzzy. But since they made it default, this will force devs to update.
  • OCedHrt - Sunday, June 24, 2012 - link

    What do you mean Windows' DPI setting isn't default? You can change it in a few clicks, but the same thing applies - if your app does not read the DPI values, then Windows can't help you. This is because the Windows UI is not vector-based (I don't know about now, but older apps definitely weren't) and many applications back then were coded with hard-coded pixel counts.

    When the DPI is changed, Windows scales the text, but the UI dimensions are controlled by the application implementation.
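
    A minimal, hypothetical sketch of what "reading the DPI values" looks like in practice - Python via ctypes against the Win32 API; without the opt-in call, Windows reports the default 96 DPI and bitmap-scales the app instead:

    ```python
    # Windows-only sketch: opt into DPI awareness, then read the effective DPI.
    import ctypes

    ctypes.windll.user32.SetProcessDPIAware()  # opt out of automatic bitmap scaling

    LOGPIXELSX = 88  # GetDeviceCaps index for horizontal DPI
    hdc = ctypes.windll.user32.GetDC(0)  # device context for the whole screen
    dpi = ctypes.windll.gdi32.GetDeviceCaps(hdc, LOGPIXELSX)
    ctypes.windll.user32.ReleaseDC(0, hdc)

    print(f"{dpi} DPI -> UI scale factor {dpi / 96:.2f}")  # 96 DPI = 100% scaling
    ```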
  • KitsuneKnight - Saturday, July 7, 2012 - link

    On Windows, changing the DPI will generally mean a huge number of applications become simply unusable.

    On this Retina MBP, the worst case appears to be slightly blurry text (which was quickly updated).

    Apple's solution is a good one, because it does things in a way that should keep existing apps working fine, while allowing future developers to leverage new APIs to take advantage of the increased resolution.
