The Thunderbolt Display

The first test was hooking up Apple's Thunderbolt Display, the only Thunderbolt display device available on the market today. Although I shouldn't have been, I was a bit surprised when the display just worked. Intel's HD 4000 drove the 2560 x 1440 panel just fine and there weren't any funny issues displaying the lower res UEFI setup mode.

Despite Ivy Bridge being able to drive three independent displays, I was only able to output to two displays simultaneously on the GD80. All combinations of two worked, however (TB + HDMI, TB + VGA, VGA + HDMI).

Once in Windows, the Thunderbolt Display's integrated GigE, FireWire and other controllers started popping up. Unfortunately Apple doesn't offer a direct download package for Thunderbolt Display drivers. You can either hunt down the controllers/drivers on their own, or you can build a Windows Support (driver) package using a Mac and the Boot Camp Assistant. I'd much rather Apple just offer an easy route for non-Mac Windows users to take advantage of the Thunderbolt Display, as it's the only TB display on the market, but I can understand the lack of motivation there.

With the Boot Camp drivers installed, I got working GigE and FireWire 800. The Thunderbolt Display's integrated USB hub gave me issues, however. Anything I plugged into it would either partially work (e.g. my mouse was detected but moving the cursor was far from smooth) or not work at all (e.g. my attached USB keyboard never worked). The other issue with the Thunderbolt Display is that you get no brightness control, which can be a problem given how bright the panel gets. I've seen reports of people getting brightness control working via software tools, but the solutions don't seem permanent.

Apple's Thunderbolt Display definitely works, but Windows users will likely want to wait for a Thunderbolt display that is built specifically with Windows in mind.

Virtu and Thunderbolt: It Works

From a software perspective, Thunderbolt is treated just like another display output driven by Intel's processor graphics. I installed a GeForce GTX 680 along with Lucid's Virtu GPU virtualization software to see if I could use the 680 for gaming but drive the display using Intel's processor graphics and the Thunderbolt port. The setup worked flawlessly.

Virtu recognized the configuration immediately once I had NVIDIA's drivers installed, and I was able to run the 680 headless - using only the Thunderbolt port to drive the external display. Intel's HD 4000 powered things in Windows, while the 680 kicked in for games.

Comments

  • zanon - Friday, May 11, 2012

    Back during your CES 2012 coverage in January you reported on one of the initial planned external expansion boxes, the MSI GUS II. Things like that (I know there are a few others now) seem to be where the story gets truly interesting for mobile users, IMO. Even though the bandwidth available is only equivalent to a few lanes, testing like HardOCP's 480 x16 vs. x4 article indicates that at single-screen resolutions (i.e., no more than 2560x1600) graphics cards can perform shockingly well even with severely restricted bandwidth. So more than merely having a hub, there's the potential of being able to plug an ultrabook into a hub and have a mid-range full GPU ready to go, along with a screen and other ports. If it all goes smoothly it could really expand the desirability of ultrabooks even further, making them more and more no-compromise, although it'll probably take the next-gen 40/100 Gbps TB standard to push it farther or handle GPGPU applications. Still exciting stuff.
  • Guspaz - Friday, May 11, 2012

    Sadly, the MSI GUS II is not that useful due to only supporting bus power (75W, no external power connectors supported). The fastest (nVidia) card it can support is the GT 640, which isn't that much faster than the integrated graphics it would be intended to replace. Anything faster can't be bus-powered.

    It would give you a small graphics upgrade over the Ivy Bridge integrated graphics, but not a big enough one to warrant all the cost and effort... and a notebook with discrete graphics could easily outperform it.

    If they add support for externally powered GPUs (anything that draws over 75W), then it could be something special.
  • zanon - Friday, May 11, 2012

    "If they add support for externally powered GPUs (anything that draws over 75W), then it could be something special."

    That's extremely trivial though; I was only using it as an AnandTech-featured example of one initial expansion, not as the ideal solution itself. It's not exactly a big deal to stick a tiny external power source in there; in fact it's odd MSI didn't do it in the first place (or who knows, maybe it'll get revised before release). But as the chips themselves become cheaper and get plenty of supply, it seems likely other solutions will appear. It feels like one of the real possible killer apps for the interface, after all: something that can't be easily replicated through other means.
  • DerPuppy - Friday, May 11, 2012

    They are (Villagetronic + others in the eGPU community) working on a Thunderbolt system. Supposedly development support is limited to large partners, though, because the Thunderbolt team at Intel is overworked? Someday it'll come out: external GPU solutions on >x1 PCIe 2.0 connections, that is. The power issue is somewhat of a non-issue if you don't mind bringing around some sort of 12V PSU, be it normal ATX, SFX, or whatever, and just hooking it up.
  • Roland00Address - Friday, May 11, 2012

    The product tag says the device can support up to 150W cards. See the product tag picture here:

    http://www.eteknix.com/wp-content/uploads/2012/01/...

    I have also heard it reported that there is no 6-pin power connector, thus the device can only do 75W.

    If the device can do 150W, you could put a 7850 in there, or possibly a GTX 660 (the GTX 660 is a personal guess, since the GTX 670's TDP was 170W, and to do GPU Boost on the GTX 670 the TDP has to be 141W).
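Roland00Address's reasoning above is a power-budget fit check. As a quick sketch (the TDP figures are approximate published numbers from the period, and the card list is my own illustration, not from the article):

```python
# Which cards fit a given power budget? TDP values are approximate
# vendor figures from 2012 and should be treated as assumptions.
CARD_TDP_W = {
    "GT 640": 65,     # the fastest bus-powered option Guspaz mentions
    "HD 7850": 130,
    "GTX 660": 140,   # the commenter's "personal guess" territory
    "GTX 670": 170,
}

def fits(budget_w, cards=CARD_TDP_W):
    """Return the cards whose TDP fits within the enclosure's power budget."""
    return [name for name, tdp in cards.items() if tdp <= budget_w]

print(fits(75))    # bus power only -> ['GT 640']
print(fits(150))   # per the product tag -> ['GT 640', 'HD 7850', 'GTX 660']
```

The comparison makes the thread's point concrete: at 75W of bus power only the GT 640 class fits, while a 150W budget opens up genuinely mid-range cards.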
  • yyrkoon - Saturday, May 12, 2012

    Thing is though, even more important than power is the technology's bandwidth.

    10 Gbps (one direction) really is not that much in the way of graphics card bandwidth. As I recall, a 7600GT from a few years back could chew through about 20 GB/s under intensive situations. So 10 Gbit/s is hardly going to put a dent in that. With that in mind, we're basically stuck with integrated graphics performance again.

    That is, at least until the next iteration of the technology. Maybe.

    Still, from a modular system approach, I like the idea. However, I doubt it would be practical any time soon (like you pretty much already said).
  • DerPuppy - Saturday, May 12, 2012

    No idea where you're getting these metrics from. PCIe 3.0 x16 is 16 GB/s, so there's no way a 7600GT could have used that much bandwidth. Maybe you're referring to the internal memory bandwidth? It has been proven that an x4 2.0 connection is sufficient for about ~80-90% of the performance of an x16 link in many gaming situations.
  • yyrkoon - Sunday, May 13, 2012

    Of course it was the memory bandwidth.

    " It has been proven that an x4 2.0 connection is sufficient for about ~80-90% of the performance of an x16 link in many gaming situations"

    You would be very lucky to see half that. It is very likely, that you would see 25% or less of that.

    People have been working on this problem for years now, through other means. Partially they have succeeded, using the MXM laptop graphics connection, at a cost that makes you wonder why they did not just buy a $2000 laptop to begin with.

    You can buy an external graphics enclosure, for laptops, right now. If you're willing to spend ~$800 for it. Then, only if you have the right laptop.

    So in closing I will say this: you're dreaming. You're dreaming a dream I have had myself. At some point, however, you're going to have to come back to reality.

    Oh, one last thing. I should point out that gigabytes and gigabits are not interchangeable. 10 gigabits == 1.25 gigabytes. That is, 25% more than a PCIe x1 1.0 connection can handle, under ideal circumstances.

    How do you like them metrics ?
  • repoman27 - Sunday, May 13, 2012

    Well, connecting a GPU via Thunderbolt doesn't affect the memory bandwidth, so that's not really relevant.

    Plus your math is way off. A single Thunderbolt controller can provide 10 Gbps of PCIe bandwidth. A PCIe 1.0 x1 connection provides 250 MB/s (i.e. 2 Gbps) which is 1/5 of what you get from Thunderbolt. Or put another way, Thunderbolt can currently provide the equivalent of a PCIe 2.0 x2.5 connection.

    DerPuppy is correct. If driver support with PCIe compression for Thunderbolt-connected GPUs were available, we could achieve better than 80% of the real-world performance of a PCIe 3.0 x16 connected GPU with an external solution.

    Note the x2 (2 GB/s) performance in these charts: http://www.anandtech.com/show/5458/the-radeon-hd-7...
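To make the unit math in this sub-thread concrete, here is a small sketch of the conversions repoman27 walks through (the per-lane PCIe figures are the conventional usable rates with line-coding overhead already factored in; other protocol overhead is ignored):

```python
# Bandwidth arithmetic from the thread above (simplified).

def gbps_to_gbytes(gbps):
    """Convert gigabits/s to gigabytes/s (8 bits per byte)."""
    return gbps / 8

# Conventional usable per-lane bandwidth in GB/s. Gen 1/2 figures already
# account for 8b/10b encoding; gen 3 uses 128b/130b.
PCIE_LANE_GBPS = {"1.0": 0.25, "2.0": 0.5, "3.0": 0.985}

thunderbolt_gbps = 10                          # one channel, one direction
tb_gbytes = gbps_to_gbytes(thunderbolt_gbps)   # 1.25 GB/s

# Thunderbolt vs a PCIe 1.0 x1 link (250 MB/s): 5x, i.e. x1 is 1/5 of TB.
ratio_vs_1_0_x1 = tb_gbytes / PCIE_LANE_GBPS["1.0"]

# Expressed as an equivalent number of PCIe 2.0 lanes: "x2.5".
equiv_2_0_lanes = tb_gbytes / PCIE_LANE_GBPS["2.0"]

print(f"Thunderbolt: {tb_gbytes} GB/s")                   # 1.25 GB/s
print(f"vs PCIe 1.0 x1: {ratio_vs_1_0_x1:.1f}x")          # 5.0x
print(f"equivalent PCIe 2.0 lanes: {equiv_2_0_lanes:.1f}") # 2.5
```

This is exactly why the "25% more than PCIe 1.0 x1" claim earlier in the thread is off: 1.25 GB/s is five times a 250 MB/s x1 link, not 1.25 times.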
  • Jaybus - Monday, May 14, 2012

    Yes, more bandwidth is needed to support external high-end GPUs, not to mention the simultaneous use of multiple external PCIe devices. This is why the initial Intel project was named Light Peak. Intel Labs' silicon photonics researchers never intended the interface to use an electronic PHY. I believe the electronic PHY version (Thunderbolt) was due to Apple's collaboration, along with Intel hitting snags in the development of an on-chip optical PHY. Ultimately, there will be an optical PHY, since the ability to scale the electronic PHY is limited.
