The promise and confusion of USB Type-C

Bob O'Donnell


I’ll admit it: this is not exactly the sexiest topic in the world. But when it comes to the practical, day-to-day existence with all of our modern devices, connectivity is an important story. And when you survey the landscape of connectivity topics, it’s hard to ignore the impact that various types of USB have had. Sure, the multiple new wireless standards tend to get a lot more attention. However, for most people, wired connections between devices are still an extremely common means of making things work, and no wired connection is more ubiquitous than USB.

The latest iteration of the USB connector is called Type-C, and while it was officially introduced in 2014, it’s really just starting to appear on the devices we can buy and use. Apple’s 2015 MacBook was among the first to support the new connector, but it’s now showing up on all kinds of Windows PCs, smartphones, monitors, docking stations, storage peripherals and more. Like Apple’s Lightning connector, the USB Type-C connector is reversible, meaning you can plug it in in any orientation and it will work.

USB Type-C is also associated with, though officially different from, USB version 3.1, which is currently the highest-speed iteration of the standard. It supports transfer rates of 10 Gb/sec, a more than 800x improvement over the 1996-era USB 1.0 spec, which topped out at 12 Mb/sec.
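The scale of that jump is easy to sanity-check with a few lines of arithmetic; here's a throwaway Python sketch (the figures are the nominal link rates from the specs, not real-world throughput):

```python
# Compare the nominal USB 1.0 and USB 3.1 link rates cited above.
usb1_mbps = 12        # USB 1.0 full speed: 12 Mb/sec
usb31_mbps = 10_000   # USB 3.1 Gen 2: 10 Gb/sec = 10,000 Mb/sec

speedup = usb31_mbps / usb1_mbps
print(f"USB 3.1 is ~{speedup:.0f}x faster than USB 1.0")  # ~833x
```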

Equally important, USB Type-C supports several alternate modes, most notably the ability to carry up to 100W of power over the line, as well as the ability to drive up to two 4K displays at a 60Hz refresh rate. Best of all, it can do this simultaneously with data transfer, theoretically allowing a single cable to deliver power, data and video. Truly, this should be the one cable to rule them all.
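To get a feel for why two 4K/60 streams are such a heavy load, a back-of-the-envelope calculation helps. This sketch counts only raw pixel data and ignores blanking intervals and link encoding overhead, so real links need extra headroom:

```python
# Uncompressed pixel bandwidth for a 4K display at 60 Hz, 24-bit color.
width, height, hz, bits_per_pixel = 3840, 2160, 60, 24

gbps_per_display = width * height * hz * bits_per_pixel / 1e9
print(f"~{gbps_per_display:.1f} Gb/s per display, "
      f"~{2 * gbps_per_display:.1f} Gb/s for two")  # ~11.9 and ~23.9 Gb/s
```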

As we all know, however, there’s often a big difference between theory and practice. The crux of the problem is that not all USB Type-C connectors support all of these different capabilities, and with one important exception, it’s almost impossible for an average person to figure out what a given USB Type-C equipped device supports without doing a good deal of research.

The key exception is for Thunderbolt 3.0, a technology originally developed by Intel. It’s a different interface standard than USB 3.1, but uses the same USB Type-C connectors. Thunderbolt 3.0 connectors (which, by the way, are different than previous versions of Thunderbolt—versions 1 and 2 used the same connectors as the mini-DisplayPort video standard) are marked by a lightning bolt next to the connector, making them easy for almost anyone to identify. To be clear, however, they aren’t the same as the somewhat similarly shaped Lightning connectors used by Apple (which, ironically, don’t have a lightning bolt next to them). Confused? You’re not alone…

Arguably, Thunderbolt 3.0 is essentially a superset of USB 3.1, as it can carry full USB 3.1 signals at 10 Gb/sec, as well as PCIe 3.0, HDMI 2.0 or DisplayPort 1.2 video signals, 100W of power, and Thunderbolt data connections at up to 40 Gb/sec, all over a single USB Type-C connection. The only downside to Thunderbolt 3 is that it requires a dedicated Thunderbolt controller chip in any device that supports it, which adds cost. Also, full-bandwidth Thunderbolt 3 cables can be expensive, because they require active electronics inside them.
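The practical gap between these link rates is easier to feel as transfer times. A quick sketch, using nominal rates (real-world throughput is lower due to protocol and encoding overhead), for moving a hypothetical 50 GB video library:

```python
# Rough best-case time to move 50 GB over each link type,
# using nominal link rates rather than measured throughput.
file_gb = 50
rates_gbps = {
    "USB 2.0": 0.48,
    "USB 3.0": 5,
    "USB 3.1": 10,
    "Thunderbolt 3": 40,
}

for name, gbps in rates_gbps.items():
    seconds = file_gb * 8 / gbps  # gigabytes -> gigabits, then divide by rate
    print(f"{name:14s} ~{seconds:,.0f} seconds")
```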

Standard USB Type-C, on the other hand, can be implemented by device makers a bit less expensively, and full bandwidth cables, while also active, tend to be cheaper than Thunderbolt versions. However, along with this cost decrease comes the opportunity for confusion. Just because a device has USB Type-C connectors does not mean that it supports power or any other alternate mode, such as support for video standards DisplayPort or MHL (used on some smartphones to drive larger displays). In fact, technically, it’s even possible to have USB Type-C ports that don’t support USB 3.1, although in reality, that’s highly unlikely to ever occur.

The real problem is that there are no simple means of demarcation or labelling for different varieties of USB Type-C. One of the goals of the standard was to produce a much smaller connector that would fit on smaller devices—leaving little room for any type of icon.

The other issue is that with the launch of USB Type-C, we’re seeing one of the first iterations of what I would call “virtualization” of the port. Until recently, each kind of port had its own connector and carried its own type of signal. USB carried data to peripherals, Ethernet handled networking, video connectors such as HDMI and DisplayPort carried video, etc. Now the rise of multipurpose ports such as USB Type-C has broken that 1:1 correlation between ports and functions. While this consolidation is clearly an important technical step forward, it also points out the opportunity for confusion if user education and basic labelling techniques are overlooked.

On the bright side, this “virtualization” of ports will lead to a wide variety of the most useful docking stations and port replicators we’ve ever seen, particularly for notebook PCs, tablets, and even smartphones. Now, you’ll be able to plug a single cable into your device and get access to every single port you can imagine, while also providing power back to the device. We’ll also start to see new types of peripherals, such as single-cable monitors that can also act as hubs for other devices, receiving power and video from the host device while enabling the connection of speakers, USB storage, and even a second daisy-chained monitor.

Eventually, most of these connections will likely become wireless, but given the need for power and the expected challenges around delivering wireless power to many devices, it’s clear that variations on USB Type-C, particularly Thunderbolt 3.0 and later iterations, will be around for some time to come.

The proliferation of USB Type-C clearly marks the dawn of a great new era of connectivity for our devices, but it may require a bit of homework on your part to fully enjoy it.

Bob O’Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting and market research firm. You can follow him on Twitter. This article was originally published on Tech.pinions. Header image credit: Marques Brownlee.


 
On top of all that, the cable sellers are making a fortune on this one too! You really have to wonder if we are getting truly new technology, or if the cable maker is just the brother-in-law to these other guys and needs the work...
 
What a mess. USB was supposed to make connections Universally compatible. If the manufacturers expect users to do their homework first, they clearly underestimate the amount of damage the average user can do.
 
Or you get companies like Oppo/OnePlus that say they have a "USB-C" cable/connector on their device, "first one" in the world, but then you come to find out their USB-C is just the CONNECTOR, not the port. The port is still USB 2, causing confusion when their cheap cable didn't work right, and you went to buy a REAL USB-C cable and it did or didn't work right, or you tried using the OnePlus cable on another USB-C port and it wouldn't work.
 
Most of these problems sound like non-issues. For example, if and when USB Type C becomes popular, I really don't think the average person is going to get confused if the cable or device doesn't have a USB Type C label on it. How many people look at the USB 2 logo before they plug one of those cables in? If their phone comes with a USB Type C cord, they'll plug it in. If they plug that phone into a computer whose monitor is being powered by USB Type C, then that monitor came with its own cable that can supply enough power. How many average people are going to be connecting devices that require 50+ watts of power? Computer-literate people will know. For non-literate people, the device they bought will most likely include the required cable. Oh no, but it's the same shape.

I mean, if the average person can survive plugging in their USB cables incorrectly 50% of the time, then this will be a non-issue.

Not all current USB cables behave the same; companies sell lower-quality ones that don't work the intended way. For example, my phone won't tether if I plug it in with a lower-quality cable. What am I going to do? Denounce USB because that has made it ultra complicated?

Imagine if we introduced computers to natives of a remote island. Would they want one specific cable for video, one for Ethernet, etc.? No, they would find that more complicated than what's being discussed here.
 
The thing that annoyed me about USB-C is that I got it wrong. I read somewhere that USB-C was going to be this easy-to-plug-in cable. My phone charger is USB-C: no more is it fiddly, and no more do I almost break that crappy connector that all other phones have. Finally, something done right.

However, I also thought that the USB ports on PC motherboards and cases were getting a new connection, which would be so damn welcome. I thought it was going to be easier on both sides of the cable, and it is so much nicer having a connector on my phone that works how it should have always worked. So why are the PC connectors still "oh, it's the wrong way around"? Cause it kinda ain't fkn UNIVERSAL, is it.

Change it, change it now. Make more business by making everyone buy new motherboards and cases. DO IT!!!!
 

I'm not sure you fully grasp the issue here. USB-C connectors are now able to provide data and power for all devices. This data can come in the form of standard USB data, video, Ethernet, etc. How do you know, by picking up a random USB-C cable, whether that cable is capable of providing video?

You wouldn't

This issue actually happens with USB 2.0 as well. Manufacturers of battery packs often provide charging cables with said packs that only include the ground and power lines. A standard USB 2.0 cable has four lines: ground, data+, data-, and power. Modern phones use data+ and data- to communicate with the charger, and if this communication couldn't happen, the phone wouldn't charge. Now the average user is left high and dry wondering why their charger doesn't work, when in fact it's the cable causing the issue.
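The four-wire point above can be illustrated with a toy model. The wire names (VBUS, GND, D+, D-) are the real USB 2.0 lines, but the check itself is just an illustration of the idea, not actual charger firmware:

```python
# Toy model: a "charge-only" cable omits the D+ and D- data lines,
# so the handshake modern phones expect can never take place.
FULL_CABLE = {"VBUS", "GND", "D+", "D-"}
CHARGE_ONLY_CABLE = {"VBUS", "GND"}

def can_negotiate_charging(wires):
    # The charger handshake needs both data lines, not just power.
    return {"D+", "D-"} <= wires

print(can_negotiate_charging(FULL_CABLE))         # True
print(can_negotiate_charging(CHARGE_ONLY_CABLE))  # False
```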

Now imagine you compound that problem with a cable that can supposedly provide completely universal connectivity. Then you have devices that use this cable and connector, but the cable is actually only USB 2.0 capable, and you're trying to use it with your monitor. Wouldn't you be confused when this setup fell on its face?
 

I guess we just have a different view of how such a thing will impact the average person. My view is that a monitor will come with its own cable. And once the average user sees that it works with that cable, and not all cables, they will learn as the technology becomes the norm. My view is such that I don’t see the average user messing around too much with other cables, and just always have their monitor connected with the cable that it came with.

I used a similar analogy that you used with USB 2. I connect my phone with a lower quality cable that came with another device, my tether doesn’t work with that cable. An average person will quickly learn not to use said cable, and understand that it’s the cable. And it won’t be a confusing issue.

I would even think that once USB Type C becomes mainstream, the average cable sold will take full advantage of everything it has to offer. I’m sure there aren’t going to be many manufacturers making USB Type C to USB Type C USB 2 cables.

I’m checking Amazon right now and there are plenty of USB Type C to USB 2 cables, followed by a (480Mbps) label. And if a customer is expecting USB C speeds and is confused that they’re not getting them, then that’s just unfortunate for their lack of knowledge on the matter. Then of course, if a customer is buying such a cable, it’s to connect it to a regular USB device.

So basically I just have a different opinion on how big of a problem such issues will be to the average person. And don’t see it as such a big negative.
 
I've purchased a USB 3.1 Type-C cable from Amazon.com (Type-C to Mini-B) to use with my Seagate Backup+ external HDD. Of course, the USB HDD is USB 3.0 only, but at least this way I'm using the Type-C port on my motherboard that would otherwise remain unused for life.

If we are lucky enough to have a USB 3.1 Type-A connector on our motherboard, then the existing HDD & cable will work without any problems. However, in both cases we are using a USB 3.1 port (10 Gbps) at USB 3.0 speed (5 Gbps).

Once this connector becomes a regular feature on all boards, its usage will naturally increase.
 
TL;DR

A summary article with a table showing the differences between each connector type and its specifications would've been leaps and bounds better than writing a long-*** article. Oh, and images would've helped.
 