Thunderbolt

We wrote about Thunderbolt when the new MBPs launched, and about how it differs from Light Peak, the Intel codename under which we first got to know the technology and saw it demonstrated at IDF. Thunderbolt differs both technically and in practice in a number of ways. The short version of the story is that Thunderbolt is Light Peak sans light in this initial form (it's electrical right now), uses the mini DisplayPort connector on the MBP, and is capable of two channels of full duplex 10 Gbps traffic, for a theoretical 20 Gbps in each direction. Thunderbolt requires a controller on both the host and the peripheral, uses 4 PCIe lanes, and connects internally to the DisplayPort output of the MBP's discrete GPU. One of the interesting things is where those 4 PCIe lanes come from on the 2011 MacBook Pro.
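
To put those numbers in perspective, here's a quick back-of-the-envelope sketch of the bandwidth math. The PCIe generation of the x4 back-haul is our assumption (we've assumed PCIe 2.0 at 5 GT/s per lane with 8b/10b encoding), not a confirmed detail of the implementation:

```python
# Back-of-the-envelope bandwidth math: Thunderbolt's two full duplex channels
# vs. the PCIe x4 back-haul feeding the controller. PCIe 2.0 (5 GT/s per lane,
# 8b/10b encoding) is assumed here, not a confirmed detail of the 2011 MBP.

THUNDERBOLT_CHANNELS = 2            # two channels per port
THUNDERBOLT_GBPS_PER_CHANNEL = 10   # full duplex: 10 Gbps each direction

PCIE_LANES = 4
PCIE2_GT_PER_S_PER_LANE = 5.0       # raw signaling rate per lane (assumed Gen2)
PCIE2_ENCODING_EFFICIENCY = 8 / 10  # 8b/10b encoding overhead

tb_gbps = THUNDERBOLT_CHANNELS * THUNDERBOLT_GBPS_PER_CHANNEL
pcie_gbps = PCIE_LANES * PCIE2_GT_PER_S_PER_LANE * PCIE2_ENCODING_EFFICIENCY

print(f"Thunderbolt: {tb_gbps} Gbps per direction across both channels")
print(f"PCIe 2.0 x4: {pcie_gbps:.0f} Gbps per direction of usable bandwidth")
```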

Thunderbolt can supply 10 watts of power and supports up to 7 daisy-chained devices, up to two of which can be DisplayPort 1.1a devices. Only PCIe and DisplayPort are tunneled over Thunderbolt links. However, you can also connect a standard DisplayPort monitor to the port on the MBP and drive it natively.
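
As a rough illustration of those limits, here's a minimal sketch that checks a proposed daisy chain against them; the function and device labels are made up purely for illustration:

```python
# Minimal sketch of the limits quoted above: up to 7 devices per chain,
# at most 2 of them DisplayPort 1.1a sinks. Names are illustrative only.

MAX_DEVICES = 7
MAX_DISPLAYPORT_DEVICES = 2

def chain_is_valid(devices):
    """devices: list of strings such as 'storage' or 'displayport'."""
    if len(devices) > MAX_DEVICES:
        return False
    displays = sum(1 for d in devices if d == "displayport")
    return displays <= MAX_DISPLAYPORT_DEVICES

print(chain_is_valid(["storage", "storage", "displayport", "displayport"]))  # True
print(chain_is_valid(["displayport"] * 3))                                    # False
```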

Sandy Bridge brings 16 lanes of PCIe intended primarily for driving a GPU. Interestingly enough, the discrete GPU in the 2011 MBPs uses just 8 of those PCIe lanes:

So where do the remaining 8 lanes go? They're split into two x4 ports, one of which is used for Thunderbolt. It's surprising, but this configuration is fully supported. Originally I speculated that the other x4 port was being used for another PCIe device in the MBP (the SDXC card reader and BCM7765 are both x1 PCIe devices), but it appears those lanes are unused.
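
For clarity, the lane budget tallies up like this (a trivial sketch restating the split described above):

```python
# A trivial tally of how the CPU's 16 PCIe lanes are carved up on the
# 2011 MBP, per the configuration described above.
lane_allocation = {
    "discrete GPU": 8,
    "Thunderbolt controller": 4,
    "second x4 port (apparently unused)": 4,
}
assert sum(lane_allocation.values()) == 16
for use, lanes in lane_allocation.items():
    print(f"{use}: x{lanes}")
```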


Intel's Thunderbolt controller

Thunderbolt launches with Apple, but isn't Apple exclusive. Intel reports that we likely won't see adoption in the PC space until 2012. In addition, there's no per-port licensing fee or royalty for peripheral manufacturers wanting to use the port or controller, both of which are entirely Intel's. The controller is actually of appreciable size on the 2011 MBP:

Initially, Thunderbolt is electrical only, though an optical version is coming later this year. Optical cabling will be compatible with this electrical version through the use of electro-optical transceivers on the cable ends.

Bottom: 2011 MBP with Thunderbolt port, Top: 2010 MBP

We can't test whether Thunderbolt works or does anything right now, because there aren't any supporting devices on the market yet. That said, Western Digital, LaCie, Promise, and other external storage manufacturers have stated that drives will arrive shortly, and we will surely take a look at them. There are also rumors of various high end DSLRs shipping with Thunderbolt in the near future, though that's anyone's guess.

There's a field for Thunderbolt in system profiler, but even with a DisplayPort monitor attached, it shows nothing connected:
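
If you'd rather poke at that from the command line, something like the following should dump whatever System Profiler knows. Note that SPThunderboltDataType is our guess at the data type identifier; `system_profiler -listDataTypes` will show what's actually available on a given build:

```python
# Dump System Profiler's Thunderbolt section from the command line.
# "SPThunderboltDataType" is an assumed identifier; run
# `system_profiler -listDataTypes` to see what a given OS X build exposes.
import subprocess

output = subprocess.check_output(
    ["system_profiler", "SPThunderboltDataType"],
    universal_newlines=True,
)
print(output.strip() or "No Thunderbolt information reported.")
```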

Interestingly enough, in Windows there's no trace of Thunderbolt at all. There aren't any unknown devices in the device manager, and no device ID shows up either. Hopefully Boot Camp drivers for Thunderbolt in Windows come along before devices start rolling out.
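
For the curious, a rough way to double check on the Windows side is to enumerate PnP devices and look for anything without a healthy driver, which is where an unrecognized controller would normally surface. The filtering logic below is purely illustrative:

```python
# Enumerate PnP devices under Windows via the stock wmic tool and flag any
# whose Status isn't OK -- an unrecognized Thunderbolt controller would
# normally show up here as an unknown device. Filtering is illustrative only.
import subprocess

raw = subprocess.check_output(
    ["wmic", "path", "Win32_PnPEntity", "get", "Name,Status", "/format:csv"],
    universal_newlines=True,
)
for line in raw.splitlines():
    fields = line.strip().split(",")
    if len(fields) >= 3 and fields[-1] not in ("OK", "Status"):
        print(fields[1])  # device name
```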

Comments

  • IntelUser2000 - Friday, March 11, 2011 - link

    You don't know that; testing multiple systems over the years should have shown that performance differences between manufacturers with identical hardware are minimal (<5%). Meaning it's not Apple's fault. GPU bound doesn't mean the rest of the system would have zero effect.

    It's not like the 2820QM is 50% faster; it's 20-30% faster. The total could have come from:

    1. Quad core vs. Dual core
    2. HD3000 in the 2820QM has max clock of 1.3GHz, vs. 1.2GHz in the 2410M
    3. Clock speed of the 2820QM is quite a bit higher in gaming scenarios
    4. LLC is shared between CPU and Graphics. 2410M has less than half the LLC of 2820QM
    5. Even at 20 fps, the CPU has some impact; we're not talking 3-5 fps here

    It's quite reasonable to assume that 3DMark03 and 05, which are explicitly single threaded, benefit from everything except #1, and frame rates should be high enough for the CPU to affect them. For the games with bigger gaps, quad core would explain the rest of the difference, even if it contributes as little as 5%.
  • JarredWalton - Friday, March 11, 2011 - link

    I should have another dual-core SNB setup shortly, with HD 3000, so we'll be able to see how that does.

    Anyway, we're not really focusing on 3DMarks, because they're not games. Looking just at the games, there's a larger than expected gap in performance. Remember: we've been largely GPU limited with something like the GeForce G 310M, comparing the Core i3-330UM ULV vs. Core i3-370. That's a doubling of clock speed on the CPU, and the result was: http://www.anandtech.com/bench/Product/236?vs=244 That's a 2 to 14% difference, with the exception of the heavily CPU dependent StarCraft II (which is 155% faster with the U35Jc).

    Or if you want a significantly faster GPU comparison (i.e. so the onus is on the CPU), look at the Alienware M11x R2 vs. the ASUS N82JV: http://www.anandtech.com/bench/Product/246?vs=257 Again, much faster GPU than the HD 3000 and we're only seeing 10 to 25% difference in performance for low detail gaming. At medium detail, the difference between the two platforms drops to just 0 to 15% (but it grows to 28% in BFBC2 for some reason).

    Compare that spread to the 15 to 33% difference between the i5-2415M and the i7-2820QM at low detail, and perhaps even more telling is that the difference remains large at medium settings (16.7 to 44% for the i7-2820QM, except SC2 turns the tables and leads by 37%). The theoretical clock speed difference on the IGP is only 8.3%, and we're seeing two to four times that much -- the average is around 22% faster, give or take. StarCraft II is a prime example of the funkiness we're talking about: the 2820QM is 31% faster at low, but the 2415M is 37% faster at medium? That's not right....

    Whatever is going on, I can say this much: it's not just about the CPU performance potential. I'll wager that when I test the dual-core SNB Windows notebook (an ASUS model), the gaming scores will be a lot closer than what the MBP13 managed. We'll see....
  • IntelUser2000 - Saturday, March 19, 2011 - link

    I forgot one more thing. The quad core Sandy Bridge mobile chips support DDR3-1600 and dual core ones only up to DDR3-1333.
  • mczak - Thursday, March 10, 2011 - link

    memory bus width of HD6490M and HD6750M is listed as 128bit/256bit. That's quite wrong; it should be 64bit/128bit.

    btw I'm wondering what the impact on battery life is for the HD6490M? It isn't THAT much faster than the HD3000, so I'm wondering if at least the power consumption isn't that much higher either...
  • Anand Lal Shimpi - Thursday, March 10, 2011 - link

    Thanks for the correction :)

    Take care,
    Anand
  • gstrickler - Thursday, March 10, 2011 - link

    Anand, I would like to see heat and maximum power consumption of the 15" with the dGPU disabled using gfxCardStatus. For those of us who aren't gamers and don't need OpenCL, the dGPU is basically just a waste of power (and therefore, battery life) and a waste of money. Those should be fairly quick tests.
  • Nickel020 - Thursday, March 10, 2011 - link

    The 2010 Macbooks with the Nvidia GPUs and Optimus switch to the iGPU again even if you don't close the application, right? Is this a general ATI issue that's also like this on Windows notebooks or is it only like this on OS X? This seems like quite an unnecessary hassle, actually having to manage it yourself. Not as bad as having to log off like on my late 2008 Macbook Pro, but still inconvenient.
  • tipoo - Thursday, March 10, 2011 - link

    Huh? You don't have to manage it yourself.
  • Nickel020 - Friday, March 11, 2011 - link

    Well if you don't want to use the dGPU when it's not necessary you kind of have to manage it yourself. If I don't want to have the dGPU power up while web browsing and make the Macbook hotter I have to manually switch to the iGPU with gfxCardStatus. I mean I can leave it set to iGPU, but then I will still manually have to switch to the dGPU when I need the dGPU. So I will have to manage it manually.

    I would really have liked to see more of a comparison with how the GPU switching works in the 2010 Macbook Pros. I mean I can look it up, but I can find most of the info in the review somewhere else too; the point of the review is kind of to have all the info in one place, and not have to look stuff up.
  • tajmahal42 - Friday, March 11, 2011 - link

    I think switching behaviour should be exactly the same for the 2010 and 2011 MacBook Pros, as the switching is done by Mac OS, not by the hardware.

    Apparently, Chrome doesn't properly close down Flash when it doesn't need it anymore or something, so the OS thinks it should still be using the dGPU.
