Thunderbolt is for Apple

Thunderbolt is the interconnect standard that promised speed and scale, and truly delivered on both. Apple and Intel collaborated to launch Thunderbolt as a combined PCI Express and DisplayPort interface in early 2011. The initial rollout was on copper cables (still in use today), which afforded the standard 10Gbps of simultaneous, symmetric throughput per channel. This has been followed by optical cables for high-end applications and the imminent launch of Thunderbolt 2.0. Thunderbolt 2.0 combines the two independent 10Gbps channels to permit a maximum bidirectional throughput of 20Gbps, enough to drive 4K displays daisy-chained with other hardware. The competing standard at the time of Thunderbolt’s launch, USB 3.0, capped out at 5Gbps of signaling (roughly 4Gbps of effective throughput), and it’s only this year that the next iteration, USB 3.1, was announced, with transfer rates inching close to Thunderbolt 1.0.
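As a rough sanity check on that 4K claim, consider the raw pixel bandwidth of a 4K panel. This is a minimal back-of-the-envelope sketch (assuming 24-bit color and a 60Hz refresh rate, and ignoring blanking intervals and protocol overhead, so real requirements run a bit higher):

```python
# Back-of-the-envelope: raw pixel bandwidth of a 4K display.
# Assumes 24-bit color and 60 Hz refresh; ignores blanking
# intervals and protocol overhead, so real needs run higher.

width, height = 3840, 2160   # 4K UHD resolution
bits_per_pixel = 24          # 8 bits per channel, RGB
refresh_hz = 60

gbps = width * height * bits_per_pixel * refresh_hz / 1e9
print(f"Raw 4K@60Hz bandwidth: {gbps:.1f} Gbps")
# -> Raw 4K@60Hz bandwidth: 11.9 Gbps
```

At roughly 11.9Gbps, a single 10Gbps Thunderbolt channel falls short, but Thunderbolt 2.0’s aggregated 20Gbps accommodates it with room to spare.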

With respect to transfer speeds and performance, no external interface standard comes close to Thunderbolt. However, stories about its failure to achieve widespread adoption are frequent. Ars attributed this to Intel’s firm grip on the standard and its lack of interest in working with third-party manufacturers:

The one final factor—one that has likely had the most impact on Thunderbolt rolling out to market—is Intel’s licensing and certification process. Several vendors we have spoken to over the past year have claimed that Intel was holding up the process, cherry picking which vendors it worked with.

OS support, the cost of the controllers, and a lack of resources on Intel’s part are additional explanations. All of these are eminently addressable. The cost of controllers will drop with scale: while the initial run might bring in losses, yields will improve and costs will fall as production ramps up. Intel, of all companies, should be familiar with this; every new processor architecture undergoes a similar scaling curve. OS support will inevitably come along if there is enough hardware in the market; it’s a matter of a software patch. Finally, Intel employs about 106,000 people (!!!) and spent $12 billion on CapEx in 2012. If Intel wants something to happen, it can make it happen[1], including making Thunderbolt a widely adopted standard. But that hasn’t happened. This oddity invites speculation as to why.

Thunderbolt is for Apple

Beginning with the iPhone 4 in 2010, Apple has been relentlessly upping the ante with Retina displays across the board. The force behind the push was felt when the iPad 3 had to go slightly heavier and run a bit hotter than usual to accommodate the battery-hungry Retina display. Apple subsequently launched the Retina MacBook Pros, and rumors of a 12" Retina Air abound. New Mac Pros are supposed to be unveiled this coming Tuesday, and with the insane amount of graphics power they carry, it seems logical to assume that a 4K display from Apple is imminent, if not in 2013 then in early 2014. Additionally, the not-so-extensible Mac Pro will rely very heavily on connected peripherals for storage and media access. USB 3.0 or 3.1 barely scales to support external scratch disks for video editing and other data-driven workflows. Thunderbolt is what makes the sleek form factor of the new Mac Pro possible. Thunderbolt is what will make a new generation of Retina Cinema Displays possible. Thunderbolt is what makes external storage practical for SSD-phile, anorexic MacBooks.
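To put numbers behind the scratch-disk claim, here is a similar rough sketch (assuming uncompressed 10-bit 4:2:2 footage at 30fps; the figures are illustrative, not a benchmark of any particular codec or drive):

```python
# Rough scratch-disk math: data rate of one uncompressed 4K
# video stream versus available bus throughput.
# Assumes 10-bit 4:2:2 sampling at 30 fps; illustrative only.

width, height = 3840, 2160
bits_per_pixel = 20          # 10-bit 4:2:2 averages 20 bits/pixel
fps = 30

stream_gbps = width * height * bits_per_pixel * fps / 1e9
print(f"One uncompressed 4K stream: {stream_gbps:.1f} Gbps")
# -> One uncompressed 4K stream: 5.0 Gbps
```

USB 3.0 signals at 5Gbps but nets roughly 4Gbps after 8b/10b encoding, so even a single such stream saturates it; Thunderbolt 2.0’s 20Gbps leaves headroom for multi-stream editing.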

Thunderbolt is a crucial cog in the very finely balanced movement that drives Apple’s technological marvel like clockwork. It was made for Apple.

Thunderbolt is not for everyone

Thunderbolt is more than a standard; it is an enabler. It makes sleek form factors practical and extensible, and others besides Apple stand to benefit from it. It can give Windows machines access to some of the fastest external storage and, given sufficient graphics power, make them instantly compatible with Thunderbolt displays that can be daisy-chained into an elegant setup. However, the race-to-the-bottom strategy in the WinTel space erects a barrier on this path. OEMs can justify implementing Thunderbolt, a relatively pricey component, only if it’s so popular that it’s the only standard they need to support for hi-def output (not true today), or if it’s backwards compatible (which it is not), or if it’s so cheap that it’s worth that check-mark in a feature-comparison list (not today or anytime soon). Simple cost/benefit analysis. Today, USB 3.0 is good enough for them, and USB 3.1 will, again, be good enough against Thunderbolt 2.0. Bargain-basement machines and most Windows devices are just that: good enough.

Limited Thunderbolt adoption keeps prices high. This gives Apple another one of those advantages that only Apple can afford and that delight users. Where other OEMs have to resort to new but backwards-compatible standards like HDMI 1.4, Apple can seamlessly support external Retina displays, insanely fast external storage, and daisy-chained setups with minimal cable clutter, and can create unbelievably beautiful devices like the new Mac Pro.

As for third-party devices for the Mac, there are plenty of them, with new ones coming out every day. They carry a premium in price and performance and cater to customers who are willing to pay for quality, simplicity, and reliability. They are beautifully designed, and some are even explicitly advertised as Mac-only. This makes them relatively inaccessible to WinTel machines and augments all-Apple setups, where devices and software form an experience that is greater than the sum of its parts.

Apple is in no hurry to popularize Thunderbolt. FireWire set a precedent for this strategy and Thunderbolt will advance it. If tomorrow Intel decides to take Thunderbolt out for a ride in the wild, Apple will have already reaped the benefits of being the only ecosystem to have supported such a seamless and high-performance standard for years.

Isn’t this beautiful.


  1. Except for breaking into the mobile-chip industry, so tightly held by ARM, with its x86 architecture.  ↩