The Big Picture

For years Intel has been telling me that the world is becoming more mobile. Yet I could never bring myself to replace my desktop with a notebook, despite the convenience. I was always a two-machine man: I had my desktop at home and my notebook that I carried around with me to tradeshows and meetings. I eventually added more mobile devices to my collection: an ultraportable for when I need to write but don't need to edit/publish, a smartphone and a tablet. Admittedly the tablet gets the least use of all my computing devices; I mostly just have it because I sort of have to. Although my collection of computing devices has become more mobile, none of them has supplanted the need for a desktop in my life.

Last year when the Arrandale-based MacBook Pros came out I decided to give the notebook-as-a-desktop thing a try. The benefits were obvious. I would always have everything with me whenever I carried around my notebook. I wouldn't have to worry about keeping documents in sync between two machines. And I'd see a significant reduction in power consumption and heat output. I set up an external storage array for my photos, music and movies, and then moved my main drive image over to the 2010 15-inch MacBook Pro. I even made sure I had the fastest option at my disposal: the 2.66GHz Core i7. Sure it wasn't an 8-core Nehalem setup, but maybe it wouldn't be that noticeable?

I lasted less than a day.

It wasn't so much that I needed an 8-core Xeon setup. I spend less than 10% of my time running applications that require all 8 cores/16 threads. No, the issue was that Arrandale's two cores just weren't enough.

Most of my workload isn't heavily threaded, but the issue with only having two cores is that if you're running one processor-intensive task, you're limited in what else you can do with your system. Run a heavily threaded application and you've got no CPU time left for anything else. Run a lightly threaded application that's CPU-intensive and you still only have one remaining core to deal with everything else. I don't need 8 cores all of the time, but I need more than two.
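
To put rough numbers on that intuition, here's a minimal sketch in Python (the workloads are hypothetical stand-ins, not anything from my actual day). It saturates every core with busy loops to simulate a heavily threaded app, then times one extra CPU-bound task. With n cores and n+1 runnable CPU-bound processes, each gets roughly n/(n+1) of a core, so the extra task slows by about 50% on a dual-core but only about 25% on a quad-core:

```python
# Minimal sketch: one more CPU-bound task on an already-saturated machine.
# The busy loops stand in for a heavily threaded app (e.g. a video encode);
# compute_task stands in for whatever else you're trying to do.
import multiprocessing as mp
import time

def busy_loop(seconds):
    # Pins one core with pointless work for a fixed duration
    end = time.time() + seconds
    while time.time() < end:
        pass

def compute_task(n=20_000_000):
    # A single-threaded, CPU-bound foreground task
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    cores = mp.cpu_count()
    print(f"{cores} logical CPUs")

    t0 = time.time()
    compute_task()
    print(f"idle machine: {time.time() - t0:.2f}s")

    # Saturate every core, then run the same task again. Expect roughly a
    # (cores + 1) / cores slowdown: ~1.5x on a dual-core, ~1.25x on a quad.
    workers = [mp.Process(target=busy_loop, args=(60,)) for _ in range(cores)]
    for w in workers:
        w.start()
    t0 = time.time()
    compute_task()
    print(f"saturated machine: {time.time() - t0:.2f}s")
    for w in workers:
        w.terminate()
```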

I suspect I'm not the only user around who may not constantly run heavily threaded apps, but can definitely feel the difference between two and four cores. I'll also go out on a limb and say the number of users who can tell the difference between 2 and 4 cores is larger than the number of users who can tell the difference between 4 and 8. What I'm getting at is this: Apple outfitting the new 15-inch MacBook Pro with a quad-core processor is a deliberate attempt by Cupertino to bring mobility to more of its desktop users.

Apple doesn't offer a good desktop Mac. You can get the Mac Pro, but it's quite expensive and oftentimes overkill if you don't have a heavy content creation workload. Then there's the iMac, which can hit the sweet spot of the performance curve, but there's no way to get it without a massive integrated display. Truth be told, Apple's 27-inch iMac is actually a bargain (for a Mac) considering you get a quad-core Lynnfield and a pretty good $999 display, all for $1999. However, not everyone is sold on the all-in-one form factor.

The new 15-inch MacBook Pro, when paired with an SSD, gives desktop users another alternative. Bring your external display to the party but drive it off of a notebook. You'll sacrifice GPU performance of course, but if you aren't a heavy gamer then you're not giving up all that much. In fact, for normal workloads you'd be hard pressed to tell the difference between one of these new MBPs and an iMac or Mac Pro.

Ultimately I believe this is why Apple chose to make the move to quad-core alongside Thunderbolt enablement. The main reason to stick four cores in a 15-inch chassis is for desktop replacement workloads. The last remaining limitation for desktop users adopting a notebook? Expansion.

The Mac Pro has four 3.5" drive bays. The 15-inch MacBook Pro, on a good day, has two 2.5" drive bays, and that's only if you ditch the optical drive and buy an OptiBay. Then there's the fact that you can't add anything that isn't a USB or FireWire device. Where are the PCIe slots? What about GPU upgrades? Currently you can't do any of that on a 15-inch MacBook Pro.

Thunderbolt could enable external expansion boxes, not just for storage but for other PCIe add-in cards as well. The bandwidth offered by a single Thunderbolt channel isn't really enough for high-end GPUs, but a faster link could change the way switchable graphics works in the future.
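
To see why a single channel falls short for high-end GPUs, a quick back-of-envelope comparison helps (using the published figures: 10Gbps per first-generation Thunderbolt channel, 500MB/s per PCIe 2.0 lane in each direction):

```python
# Back-of-envelope link bandwidth comparison, per direction
tb_channel_gbps = 10              # one first-gen Thunderbolt channel
pcie2_lane_gbps = 0.5 * 8         # 500 MB/s per PCIe 2.0 lane -> 4 Gb/s

x16_gbps = 16 * pcie2_lane_gbps   # what a desktop GPU slot provides: 64 Gb/s

print(f"Thunderbolt channel: {tb_channel_gbps:.0f} Gb/s")
print(f"PCIe 2.0 x16:        {x16_gbps:.0f} Gb/s")
print(f"A TB channel is ~{tb_channel_gbps / x16_gbps:.0%} of a x16 slot")
# -> roughly 16%: plenty for storage or a x1/x2-class add-in card,
#    but a real constraint for a high-end external GPU.
```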

My Concerns

Ever since the new MacBook Pros arrived I've powered down the Mac Pro and have been using the 15-inch 2.3GHz quad-core as my desktop replacement. When at my desk it's connected directly to my monitor, keyboard, mouse, speakers and other peripherals. It's my desktop. But when I leave my desk, I unplug the five cables I've got going to the MBP (power, DisplayPort, 2x USB, 1/8" audio and Ethernet) and carry my "desktop" with me.

The convenience is nice, I will admit. Before the mobile-as-a-desktop switch I always had to prepare my notebook with the data I needed for whatever trip I was taking. That usually included the latest copy of my Bench databases, snippets of articles I was writing and other pertinent documents. I don't rely on any cloud syncing for my most sensitive information; I did it all manually. With my laptop being my desktop (and vice versa), I lose the need to manually sync content across those two devices. All of my windows are in the same place all the time and life is good.

The performance difference in my day to day work isn't noticeable. Everything seems just as fast. If Quick Sync were enabled I'm pretty sure I'd be happy with the overall level of performance from this machine vs. a beefier Mac Pro setup. The number of times I need more than 4 cores for something other than video transcoding is pretty limited. I'm not saying that's the case for everyone, it's just the case for me personally.

There are downsides however.

Security. In the past, if I lost my notebook I only lost a minimal amount of data. I typically put only whatever I needed for my trip on my notebook; everything else was at home on my desktop. Now if I lose my notebook, tons of data goes with it—including lots of NDA data. FileVault (OS X's built-in home folder encryption) is an obvious solution, but it doesn't come without issues. With FileVault enabled, Time Machine backups can only happen when you're logged out, and they seem to take forever.

I believe OS X 10.7 will be better equipped to handle security for a mobile desktop usage model. You get full disk encryption (today's FileVault only covers your home folder) and perhaps even a Find My Mac feature.

Noise. In a desktop, when you've got a high workload on one or more cores, your fans may spin a little faster but it's hardly noticeable. The heatsink cooling your CPU has a lot of surface area, and the fan attached to it is large and spins slowly. With a notebook you don't have the luxury of quickly dissipating heat. As a result, when I have too many browser windows open with Flash running, or if the dGPU is doing anything in 3D, the CPU/GPU fans in the 15-inch MBP spin up and are loud. Under these circumstances the setup is louder than my desktop, which is annoying.

Cables. Ideally I'd want no cables connecting my notebook to all of the peripherals I need to connect it to. I want to set it down and have everything just work wirelessly. I'd also want wireless power and a bunch of other things that aren't realistic today. So I'm willing to deal with some cabling inconvenience. My preference would be two cables: one for power and one for peripherals/display. Today, it's five.

I believe this is another potential use for Thunderbolt down the road. Apple could build a Cinema Display with Ethernet, more USB ports, FireWire and audio out integrated into the display itself. A single Thunderbolt cable would carry all of those interfaces, reducing my current cable clutter to just two cables.

All of these are solvable problems, but they are definite issues today. Personally I don't believe they are enough to make me switch back to a desktop for work, although the security thing still bothers me. I may end up segmenting my data into stuff I keep on locally attached storage vs. on my notebook's internal drive in order to minimize what I carry around with me when I'm traveling. As for FileVault, I may look into alternative encryption options as Apple's solution right now just isn't practical if you use Time Machine.
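
One alternative already built into OS X is an encrypted disk image. Here's a minimal sketch using hdiutil (the size, volume name and filename are just examples): create an AES-encrypted sparse bundle for the sensitive stuff, and let Time Machine back up the rest of the drive normally:

```
# Create a growable, AES-256 encrypted image for sensitive data
hdiutil create -size 50g -type SPARSEBUNDLE -fs HFS+J \
    -encryption AES-256 -volname "NDA" nda.sparsebundle

# Mount it (you'll be prompted for the passphrase), work, then eject
hdiutil attach nda.sparsebundle
hdiutil detach /Volumes/NDA
```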

Comments

  • IntelUser2000 - Friday, March 11, 2011 - link

    You don't know that; testing multiple systems over the years should have shown that performance differences between manufacturers with identical hardware are minimal (<5%). Meaning it's not Apple's fault. GPU bound doesn't mean the rest of the system would have zero effect.

    It's not like the 2820QM is 50% faster; it's 20-30% faster. The total could have been derived from:

    1. Quad core vs. Dual core
    2. HD3000 in the 2820QM has max clock of 1.3GHz, vs. 1.2GHz in the 2410M
    3. Clock speed of the 2820QM is quite a bit higher in gaming scenarios
    4. LLC is shared between CPU and Graphics. 2410M has less than half the LLC of 2820QM
    5. Even at 20 fps, the CPU has some impact; we're not talking 3-5 fps here

    It's quite reasonable to assume that 3DMark03 and 05, which are explicitly single threaded, benefit from everything except #1, and frame rates should be high enough for the CPU to affect them. In the games with bigger gaps, quad core would contribute to the difference, even if only by as little as 5%.
  • JarredWalton - Friday, March 11, 2011 - link

    I should have another dual-core SNB setup shortly, with HD 3000, so we'll be able to see how that does.

    Anyway, we're not really focusing on 3DMarks, because they're not games. Looking just at the games, there's a larger than expected gap in performance. Remember: we've been largely GPU limited with something like the GeForce G 310M, using Core i3-330UM ULV vs. Core i3-370. That's a doubling of clock speed on the CPU, and the result was: http://www.anandtech.com/bench/Product/236?vs=244 That's a 2 to 14% difference, with the exception of the heavily CPU-dependent StarCraft II (which is 155% faster with the U35Jc).

    Or if you want a significantly faster GPU comparison (i.e. so the onus is on the CPU), look at the Alienware M11x R2 vs. the ASUS N82JV: http://www.anandtech.com/bench/Product/246?vs=257 Again, much faster GPU than the HD 3000 and we're only seeing 10 to 25% difference in performance for low detail gaming. At medium detail, the difference between the two platforms drops to just 0 to 15% (but it grows to 28% in BFBC2 for some reason).

    Compare that spread to the 15 to 33% difference between the i5-2415M and the i7-2820QM at low detail; perhaps even more telling is that the difference remains large at medium settings (16.7 to 44% for the i7-2820QM, except SC2 turns the tables and leads by 37%). The theoretical clock speed difference on the IGP is only 8.3%, and we're seeing two to four times that much -- the average is around 22% faster, give or take. StarCraft II is a prime example of the funkiness we're talking about: the 2820QM is 31% faster at low, but the 2415M is 37% faster at medium? That's not right....

    Whatever is going on, I can say this much: it's not just about the CPU performance potential. I'll wager that when I test the dual-core SNB Windows notebook (an ASUS model), the gaming scores will be a lot closer than what the MBP13 managed. We'll see....
  • IntelUser2000 - Saturday, March 19, 2011 - link

    I forgot one more thing. The quad core Sandy Bridge mobile chips support DDR3-1600 and dual core ones only up to DDR3-1333.
  • mczak - Thursday, March 10, 2011 - link

    The memory bus width of the HD6490M and HD6750M is listed as 128-bit/256-bit. That's quite wrong; it should be 64-bit/128-bit.

    btw I'm wondering what's the impact on battery life of the HD6490M? It isn't THAT much faster than the HD3000, so I'm wondering if at least the power consumption isn't that much higher either...
  • Anand Lal Shimpi - Thursday, March 10, 2011 - link

    Thanks for the correction :)

    Take care,
    Anand
  • gstrickler - Thursday, March 10, 2011 - link

    Anand, I would like to see heat and maximum power consumption of the 15" with the dGPU disabled using gfxCardStatus. For those of us who aren't gamers and don't need OpenCL, the dGPU is basically just a waste of power (and therefore, battery life) and a waste of money. Those should be fairly quick tests.
  • Nickel020 - Thursday, March 10, 2011 - link

    The 2010 MacBooks with the Nvidia GPUs and Optimus switch back to the iGPU even if you don't close the application, right? Is this a general ATI issue that's also like this on Windows notebooks, or is it only like this on OS X? This seems like quite an unnecessary hassle, actually having to manage it yourself. Not as bad as having to log off like on my late 2008 MacBook Pro, but still inconvenient.
  • tipoo - Thursday, March 10, 2011 - link

    Huh? You don't have to manage it yourself.
  • Nickel020 - Friday, March 11, 2011 - link

    Well, if you don't want to use the dGPU when it's not necessary, you kind of have to manage it yourself. If I don't want to have the dGPU power up while web browsing and make the MacBook hotter, I have to manually switch to the iGPU with gfxCardStatus. I mean, I can leave it set to iGPU, but then I still have to manually switch to the dGPU when I need it. Either way, I'm managing it manually.

    I would really have liked to see more of a comparison with how the GPU switching works in the 2010 MacBook Pros. I mean, I can look it up, but I can find most of the info in the review somewhere else too; the point of the review is kind of to have all the info in one place, without having to look stuff up.
  • tajmahal42 - Friday, March 11, 2011 - link

    I think the switching behavior should be exactly the same for the 2010 and 2011 MacBook Pros, as the switching is done by OS X, not by the hardware.

    Apparently, Chrome doesn't properly close down Flash when it doesn't need it anymore or something, so the OS thinks it should still be using the dGPU.
