
  • Hrel - Tuesday, April 09, 2013 - link

    Until it comes standard on all motherboards and SoCs for all things, who cares?
  • name99 - Tuesday, April 09, 2013 - link

    Uhh, what?
    This is a blog devoted to telling us the FUTURE of tech. What do you expect?
    Do you make the same comment when Anand discusses Haswell (which I can't buy yet) or 802.11ac (which has only just become buyable) or next generation LTE?

    There are plenty of complaints you can make about TB, but yours has to be the most foolish.
  • SomeoneSimple - Tuesday, April 09, 2013 - link

    In Hrel's defense, if anyone outside of fruity companies would actually make an effort supporting Thunderbolt by implementing it in both devices and peripherals, information about its successor would be a lot more exciting.
  • repoman27 - Tuesday, April 09, 2013 - link

    The same could be said about laptop displays with resolutions higher than 1366x768 that come calibrated from the factory.

    Hrel's comment isn't really defensible.
  • Tams80 - Sunday, April 14, 2013 - link

    Thunderbolt is essentially the Apple implementation, which Intel are now using to describe the technology as a whole. Light Peak is the name for the underlying technology (though at least in terms of publicity this seems to have been dropped) and had other implementations, such as on the 2011 Sony Vaio Z, which used a USB 3.0 port plus an additional pin connector.

    If I recall correctly, Intel aren't exactly in favour of this, as Alexvrb pointed out, because they pretty much lost control of USB.
  • epobirs - Tuesday, April 16, 2013 - link

    Intel never had control of USB. It didn't start at Intel and has always been a collaboration of multiple companies, although one of the founding members (Compaq) acquired another (DEC). The USB TA has always had final say over any additions to the USB standard and controls the logo certification process. Intel wields a lot of power within the TA, but it always wanted the group to be independent for the purpose of encouraging wide adoption.

    Intel simply doesn't want multiple versions of the connectors creating confusion. Sony has done this before. I have an old VAIO laptop that has a four pin 1394 port and a 2 pin power port next to it. Using a cable that combines the two you can connect a standard six pin FireWire cable. It's a bit of a hack because Sony got impatient waiting for the FireWire connector spec to be finalized.

    Compare the history of USB to FireWire and you'll see why one took off and the other remained a niche technology. It better served the purposes of the companies involved not to keep close control and try to monetize the technology itself, but rather to take the long view and benefit from the overall improvement to their products.
  • Alexvrb - Tuesday, April 09, 2013 - link

    Intel worked with Apple to make TB an Apple exclusive for a while. Between the exclusive head-start, their high-cost proprietary machines, and Apple's constant need to sell new devices with interfaces that won't work with older hardware, it's pretty obvious why Apple is such a strong Thunderbolt supporter.

    Not to mention that Intel was never interested in an actual standard that anyone can implement. They'd rather use their muscle to push TB on the world and sell TB controller chips, and prevent other parties from making compatible controllers.
  • inovice - Wednesday, April 10, 2013 - link

    Actually, the whole concept behind Intel's Light Peak was to have one cable/connector to connect to multiple devices. I know someone who was working on Hybrid Silicon Laser technology and research (back in 2007). It was brought up by Steve Jobs himself while having a chat with Intel folks. Apparently Steve complained about laptop ports (needing USB, display, Ethernet, etc.) and was seeking alternatives. I think Apple was still working on the MacBook Air prototype around that time (2006/2007). Apple had their hands on TB before anyone else because the idea of simplifying laptop peripherals came from them. TB was developed and tested at Silicon Photonics (Google it). Light Peak wouldn't have seen the light of day without input from Apple. Just wanted to share :)
  • ssj4Gogeta - Thursday, April 11, 2013 - link

    AFAIK, Apple only designed the connector.
  • russdarens - Monday, April 15, 2013 - link

    "High cost proprietary machines"?
    Apple laptops don't cost any more than comparable HP or Dell business-class laptops. And there is nothing proprietary about a Mac. It's Intel hardware. You can load it up with Windows and it will run just fine as a Windows computer.
    You may as well be complaining about Subarus that have that dang 5 bolt lug pattern so they can't fit Honda wheels.
  • repoman27 - Tuesday, April 09, 2013 - link

    You are right, comrade. We should only concern ourselves with technologies that are provided to all at no additional cost and that are capable of operating within the most meager of power budgets.

    Then again, even Volkswagen produced the Phaeton.

    Anywho, a bit of a bummer that Falcon Ridge appears to be Redwood Ridge with channel bonding, but I guess it makes sense. I'm also a little concerned, after watching the video of Intel's Falcon Ridge demo, that the PCIe back end is still 2.0 x4, seeing as two SSDs capable of 1 GB/s each only hit 1259.87 MB/s combined.
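repoman27's concern can be sanity-checked with quick arithmetic. This is a sketch with assumed figures: PCIe 2.0 signaling at 5 GT/s per lane with 8b/10b encoding, plus a rough 20% packet/protocol overhead that is an estimate, not a measurement.

```python
# Back-of-the-envelope ceiling for a PCIe 2.0 x4 link.
LANES = 4
GT_PER_LANE = 5e9        # 5 GT/s per PCIe 2.0 lane
ENCODING = 8 / 10        # 8b/10b line coding eats 20% of the raw rate

raw_bytes = LANES * GT_PER_LANE * ENCODING / 8   # payload bytes/s after encoding
print(raw_bytes / 1e6)   # 2000.0 MB/s theoretical ceiling for 2.0 x4

PROTOCOL_OVERHEAD = 0.20  # assumed TLP/DLLP overhead, a rough figure
effective = raw_bytes * (1 - PROTOCOL_OVERHEAD)
print(effective / 1e6)    # 1600.0 MB/s practical ballpark
```

The demo's 1259.87 MB/s sits under even the ~1.6 GB/s practical ballpark of a 2.0 x4 link, so the result doesn't prove the back end is saturated, but it is consistent with the concern that it's still 2.0 x4.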
  • p_giguere1 - Tuesday, April 09, 2013 - link

    Why would it come standard on your average cheap laptop when it only targets a niche market? Consumer products like external hard drives can't even saturate USB 3.0. Thunderbolt displays, arrays and external SSDs are too expensive to target most consumers.
    Should something not exist only because not 100% of people use it?
    I don't understand your logic; I basically just read that as "I won't use it, so I don't care".
  • GNUminex - Wednesday, April 10, 2013 - link

    Not even high-end mobile computers can be had with Thunderbolt, save one Vaio and Apple's laptops. With hardware like the Apple monitor or Sony's apparently short-lived TB dock, you only have to plug in one cable to connect all of your peripherals, and you get expandability options not traditionally available in laptops. In the not-too-distant future ultrabooks are going to be fast enough, but we will still be stuck with two USB ports and a mini-HDMI port, and plugging in peripherals when using the computer at a desk is going to be a pain.
  • Tams80 - Sunday, April 14, 2013 - link


    It's useful for using several devices through one port, so that port can be saturated. It's also better for eGPUs, as ExpressCard does get saturated, is being used less and less by manufacturers, and is somewhat clumsy when used for eGPUs.

    My worry is that manufacturers will go "Oh, you only need one or two ports now. You can just dock using the Thunderbolt port" and in the process forget that, while mobile, that is less practical than having multiple ports. That's provided Thunderbolt doesn't go the way of FireWire.
  • CSMR - Tuesday, April 09, 2013 - link

    Who cares? DisplayPort is the modern standard for computer video, and USB 3.0 and FireWire for data. This is no better than those standards and just adds extra connectors, extra processing, and incompatibility. DisplayPort can already do 4K at 60 Hz and 30 bpp, not that anyone uses it.
  • A5 - Tuesday, April 09, 2013 - link

    1) FireWire isn't standard for anything. Even Apple doesn't have FW ports anymore.

    2) You have a basic misunderstanding of what TB is for if you think it's competing with USB 3.0.
  • p_giguere1 - Tuesday, April 09, 2013 - link

    Extra connectors and incompatibility?
    Are you aware that Thunderbolt uses a mini DisplayPort connector and is backwards compatible with it? It's basically just an mDP port that happens to transfer data really fast (if you want it to). You can also still use it strictly for video outside without any downside if you wish.
  • p_giguere1 - Tuesday, April 09, 2013 - link

    *video output
  • CSMR - Friday, April 12, 2013 - link

    I did not realize it uses a DisplayPort connector. That will confuse users, who will connect Thunderbolt devices to DisplayPort outputs. So using it as DisplayPort brings no disadvantages? But you have an extra chip in the computer, adding cost and space, increasing power consumption, and introducing an extra source of unreliability.

    A separate external PCI express connector and interface would have been a better idea than combining two very different connections with different protocols into one.
  • erple2 - Friday, April 12, 2013 - link

    It doesn't matter if you plug a monitor into a TB port on a Mac. The hardware figures it out and does the right thing. It's a pretty nice solution from Apple. I think that is the right way to implement it: the connector figures out what you've plugged in, Mini DisplayPort or Thunderbolt peripheral, and acts accordingly. It probably adds a little bit of cost, but it is Apple you're talking about, so...
  • epobirs - Tuesday, April 16, 2013 - link

    Thunderbolt IS an external PCI-e connection. It is PCI-e over a cable and uses the same protocol. This is why it is so easily applied to things like external GPUs. Splitting off the DP signals from the Thunderbolt signals is already handled by existing adapters in the market.

    Having both on a single port simplifies docking, especially when used with a monitor that has a USB 3.0 hub integrated.

    The extra chip is a temporary situation. It wasn't that long ago that having USB 3.0 in a system meant an extra chip on the motherboard. Now it's part of the core chipset, which in turn is increasingly integrated into the CPU.

    Millions of machines have eSATA ports but only a fraction of them are ever used. Just pointing one out can be a good way to get a blank stare from a lot of people. But it doesn't add much cost and has great value for those who need it. How many PCs even need more than two SATA ports for the hard drive and optical drive? Many business desktops are so limited. But would you buy a motherboard today for a full-size machine that didn't have at least six ports available? Even though you might never use more than half of them?

    When you're measuring in nickels and dimes across millions of units, you have to make educated guesses over what is simply a waste and what is a valued feature, even if it goes largely unused by a large portion of the consumers.

    Once the premium falls within a certain cost envelope you'll see nearly every new machine equipped for Thunderbolt. By our presence in this forum, we're the sort who tend to be a bit ahead of the curve and likely to discuss a technology before it is ready to be mainstream.
  • AggressorPrime - Tuesday, April 09, 2013 - link

    No progress since Thunderbolt first launched. Just an optimization that should have been there since the beginning. That said, if I am running a DP 1.2 signal on the up-link channel, can I split the 20Gbps down-link channel into a 10Gbps bidirectional channel and use that to manage a device that wants back and forth communication?

    Also, can I have 2 up-link channels running at 20Gbps, like to run 2 4K monitors from a single cable?
  • repoman27 - Wednesday, April 10, 2013 - link

    The only thing that has apparently not progressed one iota since the launch of Thunderbolt is the understanding of the technology by most of those posting comments about it online.

    Why would Intel have included DisplayPort 1.2 support in Thunderbolt controllers before they included it in the processor platforms they were designed to accompany? How could Intel have improved the power characteristics of their first gen controllers without performing these generational iterations and moving to progressively more efficient fabrication processes? You may as well say that Haswell should have been there since Sandy Bridge.

    That said, I believe the way it will work is that Falcon Ridge devices can connect to each other as one 20 Gbit/s, full-duplex channel, or connect to a previous generation controller via 2 separate 10 Gbit/s, full-duplex channels. Each direction in a channel can carry DisplayPort and/or PCIe packets. Each controller has a crossbar switch, so you wouldn't actually be splitting the channel, you'd just be using the switch to share the channel's bandwidth between devices.

    It is unlikely that Falcon Ridge will offer 2x 20 Gbit/s channels, and even more unlikely that Intel would ever offer a simplex or half-duplex Thunderbolt arrangement. Since there aren't any 4K Thunderbolt displays yet, running 2 would not be possible until such time as they become available (and one could afford them) anyway. You could run two 4K displays via a single DP 1.2 link, just not at 24 BPP, 60 Hz. Or you could simply use both ports of a host with two ports.
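repoman27's DP 1.2 math can be checked with a minimal sketch. Assumptions: HBR2 at 5.4 Gbit/s per lane over 4 lanes with 8b/10b coding; blanking intervals are ignored, which only makes the real requirement higher.

```python
# Usable DP 1.2 (HBR2) payload bandwidth vs. 4K stream requirements.
link_gbps = 4 * 5.4 * (8 / 10)               # ~17.28 Gbit/s after 8b/10b
per_display = 3840 * 2160 * 60 * 24 / 1e9    # ~11.94 Gbit/s of active pixels
                                             # (3840x2160, 60 Hz, 24 bpp)

print(link_gbps >= per_display)      # True: one 4K/60/24bpp stream fits
print(link_gbps >= 2 * per_display)  # False: two such streams exceed the link
```

This matches the comment: a single DP 1.2 link carries one 4K display at 24 bpp/60 Hz with room to spare, but not two.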
  • ShieTar - Wednesday, April 10, 2013 - link

    So, how many more generations until we can start putting the GPU into the monitor?
  • kwrzesien - Wednesday, April 10, 2013 - link

    Now we're talking!
  • DanNeely - Wednesday, April 10, 2013 - link

    Ugh. If you did that you'd need to throw out your monitor every time you upgraded your GPU. Even a cheap, crappy monitor costs at least $100; a good one, considerably more. At a stroke you've doubled the cost of a GPU upgrade.
  • ShieTar - Thursday, April 11, 2013 - link

    No, you would not. You would remove the old GPU from the monitor and place the new one in. I see no technical reason not to make this easily achievable.

    The point is that with increasing resolutions and refresh frequencies, we are likely to reach a point where we need to transfer much more data from the GPU to the screen than from the CPU to the GPU. At the same time, we will build CPUs that are good enough for GPU-restricted tasks into more and more of our devices. So by moving the GPU into the screen, you can connect your laptop or your mobile or your TV receiver to your screen, and it is always capable of 8K resolution at 200 Hz (one might dream).
  • epobirs - Tuesday, April 16, 2013 - link

    High-end video cards are not small items. I wouldn't want to bulk up my monitor with such a thing. A small box inline between the user device and the display should serve just fine. It would be another role for a good docking station that takes your highly portable device and instantly turns it into a powerhouse desktop system.

    On the one hand, that means no need to sync data between a desktop and a portable. On the other, it means losing the use of both modes if you lose the portable portion.
  • Alien959 - Wednesday, April 10, 2013 - link

    What about upgrading notebook graphics? I have an older Clevo laptop with a Core 2 Duo T7250 and an integrated S3 Graphics GPU; if something like Thunderbolt had been around then, an external GPU could probably have brought new life to that notebook.
  • nickeditor - Thursday, April 11, 2013 - link

    It's not clear whether current computers with Thunderbolt can bump up to 20 Gbit/s.
    Does a new chip imply new computers?
    Are they talking about Thunderbolt 2.0?
  • repoman27 - Thursday, April 11, 2013 - link

    The 1st-3rd gen Thunderbolt host controllers provide 2x 10 Gbit/s, full-duplex channels per port. It would appear from this post that in certain configurations, Falcon Ridge (4th gen) controllers will be able to bond those two channels to provide a single 20 Gbit/s, full-duplex link. It is highly unlikely that earlier controllers will ever be able to provide a 20 Gbit/s link in this fashion.

    According to Anand's post, Falcon Ridge won't be available until 2014, so we won't see PCs or devices with these controllers until then.

    Intel has been iterating Thunderbolt continuously, apparently targeting a refresh to accompany each new processor platform release. Calling Falcon Ridge "Thunderbolt 2.0" would be odd—sort of like calling Broadwell "Sandy Bridge 2.0".
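The negotiation repoman27 describes can be sketched as a toy model. This is one reading of the post, not anything from Intel documentation; the generation numbers are just labels (1-3 = pre-Falcon Ridge, 4 = Falcon Ridge).

```python
# Toy model of Thunderbolt link negotiation between two controllers.
def negotiate(gen_a: int, gen_b: int):
    """Return (channels, gbit_per_channel) for a link between two
    controller generations (1-3 = pre-Falcon Ridge, 4 = Falcon Ridge)."""
    if gen_a >= 4 and gen_b >= 4:
        return (1, 20)   # both ends bond the port's two channels into one
    return (2, 10)       # any older end: two separate full-duplex channels

print(negotiate(4, 4))   # (1, 20) -- Falcon Ridge to Falcon Ridge
print(negotiate(4, 2))   # (2, 10) -- Falcon Ridge to an older controller
```

Either way the aggregate is 20 Gbit/s per port; bonding matters because a single device (say, one fast SSD or a 4K display stream) can then use more than 10 Gbit/s.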
