
  • klagermkii - Monday, January 09, 2012 - link

    What kind of pathetic GPU can be powered by the 10W available over the Thunderbolt bus? This would be useful if it allowed a decent discrete card, but when a GTX 580 is 250W+, what is this going to get us? Barely the TDP of a mid-range laptop graphics card.
  • cmanderson - Monday, January 09, 2012 - link

    PCIe can supply at least 75W, so if there is external power running into the chassis, it wouldn't need to draw exclusively from the Thunderbolt bus.
  • winie - Monday, January 09, 2012 - link

    If there is going to be an AC/DC converter, why not add a 6-pin connector?
  • Ryan Smith - Monday, January 09, 2012 - link

    Heat. 150W in means 150W out.
  • winie - Monday, January 09, 2012 - link

    That can be solved by more holes. The card's fan can surely move away its TDP without an enclosure.
  • piroroadkill - Tuesday, January 10, 2012 - link

    Yup. Just put a nice big hole right over where the fan intake is, and the card will exhaust the heat out the backplate. Cooling a card in an enclosure like this would be a breeze.
  • repoman27 - Monday, January 09, 2012 - link

    And form factor. Consider the size of a typical 300W power supply (if you wanted to use a high end GPU) plus a full-length dual-slot PCIe x16 card, plus cooling. This thing looks to be not much larger than the card itself, but I guess it's probably using an external power brick...
  • SlyNine - Tuesday, January 10, 2012 - link

    Power supplies take advantage of the size offered by the ATX spec to lower costs; they don't have to be that big, or else a 1200-watt PSU would be four times as big as a 300-watt one.

    I'm not saying economies of scale work exactly like that, but the point is valid.
  • appliance5000 - Tuesday, March 13, 2012 - link

    Well, some of that 150 watts is used by the card. What's not used is dissipated in the form of heat.
  • danielchatfield - Sunday, June 16, 2013 - link

    How do you suppose the graphics card 'uses' it? You can't destroy energy; all electricity used by the GPU will end up as heat.
  • phatboye - Monday, January 09, 2012 - link

    From the above picture it does in fact look like there is a power cord going into the GUS II external enclosure. So what I am guessing is that the article meant you could only use whatever power the cord supplies through the PCIe bus, and that the enclosure does not supply aux power connectors.

    Still, that puts a large limit on which GFX cards this thing can work with. More than likely you will only be able to use lower-performance GFX cards with it, which makes it kind of pointless, as in that case you might as well just stick to the integrated GPU that comes with most CPUs these days.
  • SlyNine - Tuesday, January 10, 2012 - link

    Even low-end GPUs can beat the crap out of IGPs. Plus, this can be upgraded, and you will never upgrade your IGP.
  • ickibar1234 - Sunday, June 16, 2013 - link

    Not Intel's HD 5200 Iris integrated graphics.
  • Bull Dog - Monday, January 09, 2012 - link

    Admittedly, it was somewhat confusing the way it was written. The enclosure receives power from an external adapter. The cord is visible in both pictures.

    Inside the enclosure, however, the only means to deliver power to the GPU is the physical PCIe connector. So only GPUs without extra 6/8-pin power connectors will work in it.

    Does that make sense?
  • Sttm - Monday, January 09, 2012 - link

    There might not be a point in using a faster GPU that would require more than 75W, as the bandwidth you get over Thunderbolt is going to be a limiting factor.
  • zanon - Monday, January 09, 2012 - link

    Honestly, it really shouldn't be much of a problem, surprisingly enough. It seems to have become very widespread "common sense" that GPUs need a lot of bandwidth, and it's certainly what I assumed. But in fact, short of a multi-GPU, multiple-monitor setup, it makes little to *zero* difference even for x16 vs. x4. Take a look at results like the tests HardOCP did:

    Definitely not what I would have predicted, but there it is. The full size TB chips found in something like the MBP can put out 20 Gbps each way, the half size ones 10 Gbps, although this thing may only use 10 anyway. But that's something like x3 to x5 PCIe 2.0 equivalent, which based on testing should be plenty for any single screen setup even with a pretty high end GPU.

    Obviously, it's going to become more of an issue if people want to start combining or chaining a lot of stuff. Running an external GPU and trying to chain on a highspeed RAID for example, or wanting to have a multi-PCIe box, certainly looks like it could overload a single 1.0 TB port. It may be possible to bridge multiple ports, but longer term we'd have to hope it scales as Intel claims (at 100 Gbps per channel it shouldn't be a limiting factor under even near future use cases). Nevertheless, it is a pretty compelling use for the present. Being able to have full, standard PCIe slots for an ultrabook or equivalent would further knock down any remaining use cases for desktops. Docks that have enhanced active cooling that let the system's chips rev up higher would also help.
  • repoman27 - Monday, January 09, 2012 - link

    All of the current Thunderbolt controllers only provide a single PCIe to Thunderbolt protocol adapter, as far as I can tell. Thus total PCIe bandwidth is limited to the equivalent of PCIe 2.0 x2.5. This would begin to bottleneck higher end cards pretty severely, I'd think. Still better than integrated graphics, but a pretty crazy expensive solution for gains that could be achieved vs. just buying a system with a better dGPU.
  • zanon - Monday, January 09, 2012 - link

    >All of the current Thunderbolt controllers only provide a single PCIe to Thunderbolt protocol adapter, as far as I can tell.
    I don't know enough myself to say whether this is a current hard limit or whether it could be worked around somehow in situations where no DP is needed. Maybe Anand could look into that, but at any rate, even if it is a hard limit, it's not necessarily the end of the world (see below), and furthermore it's something that manufacturers should have been on earlier :).

    >This would begin to bottleneck higher end cards pretty severely, I'd think.
    Yeah, "I'd think," but again see my link above. I honestly would have thought that dropping a GTX 480 at 2560 resolution from x16 to x4 would have made a difference, but in many of the games it literally had ZERO effect, nada. I would want to see some real tests about exactly what kind of real-world effect a bandwidth drop would have. Don't forget, 1920 is plenty common, and even mid-range dGPU boards would completely, utterly slaughter any integrated or even higher-end mobile graphics solutions.
  • JNo - Tuesday, January 10, 2012 - link

    this.
  • Penti - Tuesday, January 10, 2012 - link

    Actually they provide 4 lanes (x4) of PCIe (but obviously can't communicate at full x4 PCIe 2.0 speed), except on the MacBook Air, where they have half the physical speed. If it were not physically connected to x4 PCIe/PCIe 2.0 on the motherboard, and instead to PCIe 2.0 x1, you wouldn't get up to ~800MiB/s on products like the Pegasus R6, which is physically an x2 PCIe external SATA RAID card in a box connected over Thunderbolt, which tunnels the PCIe protocol. The extra bandwidth is just there to provide the needed room for the 8b/10b encoding. Add to that that the Thunderbolt chip is a PCI Express switch, which means you can physically connect and establish a connection to more than one full-speed x2 PCIe 2.0 device, two x1 PCIe 2.0 devices, or up to four full-speed PCIe 1.x devices, which means they can be bandwidth-starved when daisy-chained. Effective bandwidth for the GPU is the 8Gbit/s that PCIe 2.0 x2 provides. The system might see it as an x4 device, but it won't perform like an x2.5 device. It should still be fine for GPU use, though not so much in the consumer space. It would be fast enough for some gaming compared to what you could otherwise achieve in an 11.6-13.3" case. For a 15" notebook, just buy a faster discrete GPU, as said.
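    The ~800MiB/s observation above can be sanity-checked with a quick back-of-envelope calculation (a sketch only, assuming PCIe 2.0's standard 5 GT/s per-lane raw rate and 8b/10b encoding): an x1 link caps out below the throughput reported for the Pegasus R6, so the tunnel must carry at least x2 worth of PCIe bandwidth.

```python
# Back-of-envelope check of the x1 vs. x2 argument above.
# Assumptions: PCIe 2.0 raw rate of 5 GT/s per lane, 8b/10b line encoding.
RAW_GTPS = 5.0                            # PCIe 2.0 raw transfer rate per lane
EFFECTIVE_GBPS = RAW_GTPS * 8 / 10        # 8b/10b: 10 bits on the wire per 8 data bits
PER_LANE_MBPS = EFFECTIVE_GBPS * 1000 / 8 # effective payload MB/s per lane

x1_ceiling = PER_LANE_MBPS       # 500 MB/s
x2_ceiling = PER_LANE_MBPS * 2   # 1000 MB/s

observed = 800  # ~800 MiB/s reported for the Pegasus R6 over Thunderbolt

# The observed figure exceeds what an x1 link could ever deliver,
# so the device must be seeing at least x2 worth of bandwidth.
assert x1_ceiling < observed < x2_ceiling
```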
  • repoman27 - Tuesday, January 10, 2012 - link

    All current Thunderbolt controllers have connections for 4 lanes of PCIe 2.0, including the MacBook Air. All Thunderbolt ports provide two 10Gbps, bi-directional channels over a single cable. They all operate at the same speed. The full 10Gbps is available to the underlying protocol (PCIe or DP) with no 8b/10b overhead.

    The bottleneck is that all current Thunderbolt controllers only appear to have a single PCIe to Thunderbolt protocol adapter. Even though this protocol adapter has a PCIe 2.0 x4 back end, it only has a single 10Gbps frontside connection to the on-chip Thunderbolt switch. PCIe 2.0 sans 8b/10b runs at 4Gbps. 10/4 = 2.5. The 1250MB/s PCIe throughput a Thunderbolt controller is capable of is equivalent to 2.5 lanes of PCIe 2.0.

    If there aren't other devices in the Thunderbolt chain sharing the PCIe bandwidth, an external GPU solution could probably yield some fairly impressive performance gains nonetheless.
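    The lane-equivalence arithmetic above can be written out explicitly (a sketch of repoman27's stated numbers: a 10Gbps Thunderbolt channel with no 8b/10b overhead, and PCIe 2.0's 4Gbps effective per-lane rate after encoding):

```python
# The "PCIe 2.0 x2.5" equivalence from the comment above, written out.
TB_CHANNEL_GBPS = 10.0    # one Thunderbolt channel, no 8b/10b overhead
PCIE2_LANE_GBPS = 4.0     # PCIe 2.0 effective rate per lane (5 GT/s minus 8b/10b)

equivalent_lanes = TB_CHANNEL_GBPS / PCIE2_LANE_GBPS  # 10 / 4 = 2.5 lanes
throughput_mbps = TB_CHANNEL_GBPS * 1000 / 8          # 1250 MB/s

assert equivalent_lanes == 2.5
assert throughput_mbps == 1250.0
```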
  • Watwatwat - Monday, January 09, 2012 - link

    It is a given that it will have external power. Even if PCIe can supply 75 watts, the laptop power supply cannot.
  • Sttm - Monday, January 09, 2012 - link

    The black power cable coming out of the rear of it seems pretty given.
  • Zigor - Tuesday, January 10, 2012 - link

    Just zoom in and read the product brief:

    "Supports up to 150W discrete graphics card".
  • ther00kie16 - Monday, January 09, 2012 - link

    The lack of additional power plugs is seriously disappointing. Might as well DIY (Google it).
  • Pandamonium - Monday, January 09, 2012 - link

    VillageTronic's ViDock 2 (I believe that is the model number) used ExpressCard to power external GPUs. MSI had a predecessor to the GUS 2, IIRC, too. The problem with each of these is that they are priced out of the market. They're just too expensive for what they seem to do. And in the case of the ViDock, I am not certain whether the USB ports detract from the bandwidth available to the GPU, or whether the USB ports pull from ExpressCard's USB access simultaneously while the GPU uses the x1. I think I had a lengthy query about it on the AT forums some months ago.
  • zanon - Monday, January 09, 2012 - link

    Was going to post this in the other comment, but thought it deserved its own. Really, given the general trend towards portables, I'm honestly surprised not to have seen Nvidia in particular push this REALLY hard. The overall trend of the market is heavily towards notebook/ultrabook-class systems. AIOs continue to sell in significant numbers too, and all of them represent a vast swath of users where the best that can be hoped for is often to make it onto the mobile chipset, and even then Intel is furiously pushing their ever-more-integrated graphics (and of course AMD has potential with Fusion).

    This would seem like a big, golden, and possibly singular opportunity for Nvidia to open up the ability to sell real cards to this entire market. There could be bare PCIe expanders for the more technical groups, but fully integrated solutions that are literally plug-and-play are very feasible and even better for a lot of groups than traditional solutions (no need to open anything up, handle anything board-y, or even think about "molex" or case cooling; just a shiny, attractive external case unit with a cable to the computer and one for power). Seems like a big chance to mix things up a bit; I'm kind of sorry not to see Nvidia pushing TB a LOT harder.
  • repoman27 - Monday, January 09, 2012 - link

    Well, the only PCs to ship with Thunderbolt thus far are Macs, and Apple went all AMD/Intel this year for graphics, so there aren't any Mac drivers for NVIDIA GPUs released in the past 12 months. So I'm guessing that may have something to do with NVIDIA not pushing TB too hard yet.

    I totally agree with the complete solution concept, although I kinda figure they might come from the partners rather than AMD/NVIDIA directly. If you think about it, the performance of a Thunderbolt GPU setup is clearly going to plateau at the point when the available PCIe bandwidth becomes a bottleneck. If you select the lowest power / lowest cost GPU capable of saturating PCIe 2.0 x2.5 on a regular basis, and build a custom card around it specifically optimized for the low bandwidth situation, add the TB controller right on board and a power supply and cooling solution custom tailored to a compact external enclosure... Now you've taken all the guesswork out of it. You have the maximum performance you're going to get out of this type of setup, in the smallest form-factor, and maybe even at a reasonable price.
  • zanon - Monday, January 09, 2012 - link

    >so there aren't any Mac drivers for NVIDIA GPUs released in the past 12 months. So I'm guessing that may have something to do with NVIDIA not pushing TB too hard yet.
    I don't find that convincing. This stuff doesn't just happen by magic, as the Nvidia vs AMD driver situation has shown. Nvidia often has done dramatically better with drivers and game performance because they expend dramatically more resources connecting with developers. By the same token, they could have done a lot more to talk up TB, to push PC makers to include it, and for that matter to negotiate with Apple. Apple themselves have every reason to want TB to succeed and of course to push their own portable machines, which is a major profit center for them. Anything that boosts that could be of mutual benefit, maybe even a way to more widely get back into that particular segment.

    I think Nvidia just plain hadn't really thought about it, which is unfortunate.

    >If you think about it, the performance of a Thunderbolt GPU setup is clearly going to plateau at the point when the available PCIe bandwidth becomes a bottleneck.
    See my other response above, but this could be a much higher plateau than you seem to be assuming, at least for non-compute work. Also, it's not like there is exactly a very high plateau to surpass when the competition is the HD 3000 or whatever.

    But yeah, with a bit of nice engineering it certainly seems there could be a market ripe for the picking, even on the Mac side.
  • repoman27 - Tuesday, January 10, 2012 - link

    I've checked out a fair number of PCIe scaling tests, and I understand what you're saying. At x4 things look pretty darn good, the impact ranges from negligible to roughly 18% in some scenarios. When you look at x1, however, things become a bit grimmer. Of course nobody has posted test results for x2.5 because that isn't a normal scenario, so it's hard to say where the choke point really is.

    I reckon the performance plateau for a TB connected GPU is still considerably higher than HD3000, but what about vs a 6770m? Would it still be worth the money to buy an external upgrade?
  • zanon - Tuesday, January 10, 2012 - link

    >I reckon the performance plateau for a TB connected GPU is still considerably higher than HD3000, but what about vs a 6770m? Would it still be worth the money to buy an external upgrade?

    Couldn't know without testing of course. However, it's easy enough to get a feel for the sort of delta there is between desktop and mobile just by going and taking a look at some of the Anand reviews for various notebooks, including the Macbook Pros, the various Alienware gaming machines, and so forth, and then comparing them to even something like a 6950. It's not even in the same area, and those are bigger, heftier machines. On the really light ones you will be seeing integrated, and then what have we got? HD3000, Starcraft II, 1366x768, medium settings: 16.5 FPS. Yeah. A big fat gaming focused machine with something like a 555M pushes that up a lot, same settings, to over 68. But that's like the framerate a ~$250 desktop card would get you at 1920x1200 with ultra settings and AA.

    Ultimately I don't think there's any real way to get around the fact that a single desktop card can easily draw 3x-5x the maximum power draw of the entire machine for many notebooks. It's just plain physics; there's only so much you can do with a fifth or less of the power and thermal budget. Even with bandwidth constraining things, a raw difference of 500% or more is a lot to make up, and there's room for optimization of the approach too (i.e., CPU ramp-up when docked to an external GPU to take advantage of the additional thermal budget available).
  • sullrosh - Monday, January 09, 2012 - link

    Since it is a PCIe slot, does it work with any PCIe card?
  • tpurves - Tuesday, January 10, 2012 - link

    Would be awesome to take over graphics for a MacBook Air when docked. Even being stuck at a 75W power budget would leave a lot of room for improvement over Intel integrated graphics. For comparison, the AC adapter for the MBA is only rated for a max of 50W to power and charge the entire computer.
  • Khenglish - Tuesday, January 10, 2012 - link

    You guys are vastly underestimating what can be run on Thunderbolt with the limited PCIe bandwidth. In short, Thunderbolt with optimized drivers can run absolutely anything with no performance loss. Check to see what you can do with a simple PCIe 1.1 x1 link. Even with 1/32 the bandwidth of a desktop, in some cases 90%+ of the PCIe 2.0 x16 performance is obtained. On a 2.0 x1 link, you get practically no performance loss due to PCIe. E.g., a 15587 GPU score in Vantage with an overclocked 560 Ti, and a 4096 GPU score in 3DMark 11. Even a 580 scales well on a 1.1 x1 link. Check the forum I posted to see the scaling.

    The reason the performance loss is so small is that Nvidia provides PCIe compression with current drivers. Unfortunately, this currently only works with x1 links. The compression will not engage on an x2 link, causing x1 to usually outperform x2.

    There are some results from systems that do not run any compression. Looking at the performance on a 1.1 x2 link, it's not hard to imagine that a 2.0 x2 link or better, which Thunderbolt provides, would offer excellent performance even on the new HD 7970.
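    The "1/32 the bandwidth" figure above checks out arithmetically (a sketch under the usual assumptions: PCIe 1.1 at 2.5 GT/s and PCIe 2.0 at 5 GT/s raw per lane, both 8b/10b encoded):

```python
# Verifying the "1/32 the bandwidth of a desktop" comparison:
# PCIe 1.1 x1 vs. PCIe 2.0 x16, both with 8b/10b encoding.
def effective_mbps(raw_gtps, lanes):
    """Effective payload bandwidth in MB/s for an 8b/10b-encoded PCIe link."""
    return raw_gtps * (8 / 10) * lanes * 1000 / 8

pcie11_x1 = effective_mbps(2.5, 1)    # 250 MB/s
pcie20_x16 = effective_mbps(5.0, 16)  # 8000 MB/s

assert pcie20_x16 / pcie11_x1 == 32.0  # the desktop link is 32x wider
```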
  • DanNeely - Tuesday, January 10, 2012 - link

    That page is an utter mess; the one useful bit of scaling data I found was the 5870 tests by TechPowerUp, a direct link is below. The average performance hit they found from an x1 2.0 link was about a 25% reduction; there were however very large variations, with some games only losing a few percent and others having framerate drops as large as 75% of their x16 performance. (A few games even scored higher at lower PCIe bandwidths: inconsistent benchmarks?) A 2.0 x4 link was fast enough that none of the games tested suffered any catastrophic failures; in most cases the delta was small enough that it probably wouldn't affect the [H]OCP minimum relevant performance threshold of being able to increase IQ settings while maintaining a playable framerate.

    Unfortunately they didn't test a 2.0 x2 link (or equivalent). Tom's Hardware did similar tests on a 1.1 x4 link several GPU generations ago and IIRC didn't find any major problems. But I think that was with an 8800-family GPU, so I'm not sure how relevant it would be with modern ones being a half-dozen-ish times faster.
  • Khenglish - Tuesday, January 10, 2012 - link

    Did you bother to scroll down to the performance tables? They're pretty straightforward, and if you click the name on a person's entry you often get quite a bit of detail.

    There is so much info on that page about PCIe performance, but no one bothers to check it... I get over a 4k total score in 3DMark 11 with an overclocked 460 on a 1.1 x2 link...
  • mpschan - Tuesday, January 10, 2012 - link

    I, like many others, am very excited to see something like this come along.

    What I really want is a laptop that can drive 1080p for games with mid quality settings. Obviously power and heat from the GPU are the biggest obstacles to that, but this could address that and move the concern over to the CPU, which might be up to the task (if not now then hopefully soon).

    If they can get this to work and work well enough to pull 40-60 fps from a laptop at 1080, you will see people lining up to buy this.
  • Haydon - Tuesday, January 10, 2012 - link

    For the money it would cost to buy the enclosure and the card, why not just get a laptop with a decent GPU in the first place, with Optimus technology and the like, which switches to the integrated GPU when you're not playing games? Until some iteration of Light Peak can do PCIe 3.0 at x4 or PCIe 2.0 at x8, this just isn't going to sell well.

    I can already get 40-60 fps at medium/high settings on my laptop that cost only $1500 from AVA Direct.

    This seems more for the Mac crowd, who are as yet so unaware that they're being ripped off hard in the graphics department that they don't know laptops exist that can play games well.

    On the PC side of life there are laptops that don't need an external enclosure to run everything at 1080p in 3D (like the 3d that pops out of the screen) even with maxed out settings.
  • aliasfox - Tuesday, January 10, 2012 - link

    I don't want a bigger laptop. I want to carry a 12-13" laptop around - just big enough for a full size keyboard. You can't fit a fast GPU and a fast processor in something that size and expect to get a good amount of battery life.

    But sometimes I'm at home and I want to do something that poor old integrated graphics can't handle. Plug this box into the laptop, plug the TV into the box, and I now have 10x the graphics performance at 1080p, while still being able to carry around a 3 lb laptop when I need to take it out.
  • know of fence - Thursday, January 12, 2012 - link

    2012 to be the year of the Thousand-bucks - erm - Ultrabooks! Thunder bolt-on crutches may be available a year down the line, when people realize those things fall flat - erm - when it comes to graphics.
  • ionis - Monday, January 16, 2012 - link

    These external GPUs are pointless without HDMI in on the laptops. Who's going to bring their laptop, external GPU, AND monitor with them? It would be so great to be able to take a laptop and a gaming capable GPU with me to a friend's house or on a business trip.
  • mercutiouk - Sunday, April 15, 2012 - link

    I really think the first company that comes up with a box shaped to fit a dual-slot, (almost) full-length PCIe card, with relevant power and cooling, a dual USB 3 to Thunderbolt adapter, a single 7200RPM hard drive, and enough lanes tied to it to achieve close to desktop performance, all in a box you can hang off the back of a monitor, is going to clean up.

    Have the laptop set to boot from thunderbolt first.

    Coming home with a laptop, pushing one plug into it, and having it capable of near-desktop performance for about £120 plus whatever you want to spend on a graphics card is going to be quite compelling to a fair chunk of PC users.

    Even I'd probably make the switch for that (assuming I could throw a 2nd plug in to talk to the RAID-5).

    Modular computing where you have a basic, "does general duties" core that's portable and once docked has available performance boosted significantly is almost certainly the future.

    I'd like to see the Virtu tech being used with this, to allow a brick to be plugged into a laptop and give a performance boost to the built-in screen, as per the current desktop setup. I run a pretty good-performing laptop (5870, i5, 6GB) that's been shoe-horned into a 15.6" chassis, but something lighter and more modular would be awesome.
  • theunfrailhale - Sunday, June 10, 2012 - link

    I would like to see a new LCD display that includes a GPU port. You're running power to the monitor; you could incorporate the GPU into its power scheme and get all the wattage you need for a good card. Slap some USB ports on the side and a TB cable to your ultrabook of choice, and voilà! Plug in one cable and now your laptop is a rough equivalent of a desktop.

    ... OK... you probably want to charge your laptop too, so that's two cables to plug in. =/
  • ErroneousUsername - Monday, May 27, 2013 - link

    Maybe I'm being naïve, but couldn't you dedicate a 500W desktop power supply to the graphics card via the 6/8-pin adapters and run dual Thunderbolt to alleviate the bottleneck of the controller a little? The laptop should only have to power the cable itself, I would think. Then you could run a fully powered HD 6950 or so on 4 lanes of PCIe 2.0. I don't know if that would significantly outperform the GTX 680M or not, but I would imagine so. On second thought, though, I believe they now have laptops with SLI GTX 680Ms or Crossfire HD 7970Ms. I'm not sure where they benchmark compared to a PCIe 2.0 x4 desktop card, but I think they'd be able to handle quite a bit. So maybe Thunderbolt will be primarily for storage and peripherals, and everyone will just have to buy a laptop with better dedicated graphics. I had high hopes for Thunderbolt-driven external graphics solutions, but it appears there may be too many roadblocks.
  • Jblagden - Saturday, August 17, 2013 - link

    I recently emailed MSI about it, and the sales representative said that MSI will not put the GUS II into production because it "did not meet our production criteria to launch to market as a viable product."

    It looks like MSI may have underestimated the number of people who want to buy the GUS II.

    It looks like the only way to convince MSI to produce it is through this petition:
  • Jblagden - Wednesday, September 04, 2013 - link

    I just found out that the other reason why they haven't released it is that Apple hasn't tested it yet. That's a problem because the MSI GUS II mainly appeals to Mac users; it doesn't really appeal to many PC users. This is partially because there are a lot of people who have a 13-inch MacBook Pro or a MacBook Air and would rather pay around $200 for an external graphics card than pay $2500 for a Mac Pro. Also, with IBM clones there are gaming laptops, while there aren't any Mac gaming laptops. So we'll have to find out when Apple will be able to test the GUS II.
  • Thatguy97 - Wednesday, July 01, 2015 - link

    Still can't do this...
