The Wonderful World Of USB Type-C

Despite becoming common over the last few years, USB-C remains a bit of a mystery. Try asking someone with a new blade-thin laptop what ports it has and the response will often include an awkward pause followed by “USB-C?”. That is, unless you hear “USB 3”, or maybe “USB 3.1”. Perhaps even “a charging port”. So what is that new oval hole in the side of your laptop called? And what can it really do? [jason] at Reclaimer Labs put together a must-read series of blog posts in 2016 and 2017 plumbing the depths of the USB 3.1 rabbit hole, with a focus on Power Delivery. Oh, and he made a slick Easy Bake Oven with it too.

A single USB Type-C connector

When talking about USB-C, it’s important to start at the beginning. What do the words “USB-C” entail? Unsurprisingly, the answer is complicated. “USB Type-C” refers only to the physical connector and the details of how it is used, including some of the 24 pins it contains. Then there are the other terms. “USB 3.1” is the overall standard that encompasses the Type-C connector and the new high-speed data buses (“USB SuperSpeed” and “SuperSpeedPlus”). In addition, there is “USB Power Delivery”, which describes power modes and even more pin assignments. We’re summarizing here, so go read the first post for more detail.

The second post devotes a formidable 1,200 words to an overview of the electrical specifications, configuration communication, and connector types for USB 3.1.

A GIF of a flipping USB Type-C connector
Marketing at its finest

The third post is devoted to USB Power Delivery. Power Delivery encompasses not only the new higher power modes supported (up to 100W!), but also the ways to use the extra 10 or 13 pins available on the Type-C connector. This is both the boon and bane of USB-C, allowing apparently identical ports to carry common signals like HDMI or DisplayPort, act as analog audio outputs, and provide more exotic interfaces like PCIe 3.0 (in the form of Thunderbolt 3, which is yet another thing this connector can be used for).
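
To make the “power modes” concrete: a Power Delivery source advertises each voltage/current combination it offers as a 32-bit Power Data Object (PDO). Here is a minimal sketch of decoding a fixed-supply PDO, using the bit layout from the PD specification; the example value is ours, not something from [jason]’s posts:

    # Sketch: decode a USB PD fixed-supply Power Data Object (PDO).
    # Per the PD spec, bits 31:30 = 00 for a fixed supply, bits 19:10 hold
    # the voltage in 50 mV units, and bits 9:0 the max current in 10 mA units.
    def decode_fixed_pdo(pdo: int) -> str:
        if pdo >> 30 != 0b00:
            return "not a fixed-supply PDO"
        volts = ((pdo >> 10) & 0x3FF) * 0.05
        amps = (pdo & 0x3FF) * 0.01
        return f"{volts:g} V @ {amps:g} A ({volts * amps:g} W)"

    # A hypothetical 20 V / 5 A offer: voltage field 400 (20 V), current field 500 (5 A)
    print(decode_fixed_pdo((400 << 10) | 500))  # -> 20 V @ 5 A (100 W)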

It should be clear at this point that the topics touched by “USB Type-C” are exceptionally complex. Save yourself the trouble of a 90MB specification zipfile and take a pass through [jason]’s posts to understand what’s happening. For even more detail about Power Delivery, he walks through sample transactions in a separate post.

58 thoughts on “The Wonderful World Of USB Type-C”

  1. What blows me away is the shortsighted vision of the engineers when they made USB-C. It’s great that you can plug the one end in any direction. But dammit! Why didn’t they do that to the other end as well? Sheesh, what were they thinking, or lack thereof.

    1. That’s a USB A to C adapter cable. If you have USB-C on all of your devices and chargers, then all you need is a bunch of cables that are USB-C on both ends. You wouldn’t even have to worry about which end of the cable is connected to which device.

      I do agree that the USB Implementers Forum did a bad job of making a graceful transition. They suffer from Not Invented Here Syndrome, and did a purposefully bad job of making USB-C devices work with legacy device features that they didn’t originally define, especially charging.

      1. I bought a couple of USB 3 hubs with a VGA port from an auction. Turned out the upstream end is USB-C. So I bought a USB 3 A-to-C adapter and… they aren’t recognized at all when connected to a USB 3 port. But plug them into a USB 2 port and the hub works in USB 2 mode. VGA doesn’t work at all. Didn’t care about that, was just wanting a cheap USB 3 hub.

        1. USB 3 standard connectors do not carry the sideband signals that allow reconfiguration of the USB-C ports. To have full functionality, you may need to actually plug the USB-C connector into a true USB-C port.
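
          (For the curious: the key sideband here is the “CC” pair. A source advertises its current capability with a pull-up resistor on CC, a sink answers with a 5.1 kΩ pull-down, and whichever CC line ends up with a valid divider voltage also reveals plug orientation. A rough sketch of sink-side detection logic, with thresholds approximating the spec’s vRd bands, follows; exact values vary by implementation:)

            # Sketch of sink-side CC detection; thresholds approximate the
            # Type-C spec's vRd detection bands (not exact for every design).
            def classify_cc(cc1_volts: float, cc2_volts: float):
                active = max(cc1_volts, cc2_volts)   # divider forms on one CC line only
                orientation = "normal" if cc1_volts >= cc2_volts else "flipped"
                if active < 0.2:
                    return "unattached", orientation
                if active < 0.66:
                    return "default USB power (500/900 mA)", orientation
                if active < 1.23:
                    return "1.5 A @ 5 V", orientation
                return "3.0 A @ 5 V", orientation

            print(classify_cc(1.7, 0.0))  # -> ('3.0 A @ 5 V', 'normal')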

          1. Reminds me of that massive screwup at work.

            We’re currently building a new office at one of our locations, good for about 1000 people, more if you start cramming. Anyway, some parts of said building are what are called shared desk areas, which means the people there have no desks of their own that they always sit at; pretty much anyone can choose a different desk every time they come into the office. This is supposed to avoid having empty desks when people are working from home or whatever.

            What these desks all have is a pair of screens and HIDs attached via USB3 dockers. They work pretty much like the good old docking stations you might know from IBM/Lenovo notebooks, except that these attach via USB3. They provide a few USB3 ports, HDMI/DVI and some more, all via a single cable attached to the laptop. We know from testing and experience that these work when attached via a USB3 A-to-B cable.

            Welp… when planning said new office, it was decided to integrate these dockers into the desks, with the only thing accessible to the person occupying said desk being the USB3-A end of a cable. When ordering 250 of said cables, someone screwed up – tbh, I would’ve, too – and ordered ones that only work partially.

            Partially as in: Everything works, except for the darn HDMI/DVI ports.

          2. Phrewfuf, those are the perils of being an early adopter. I’m a geek, and I wouldn’t have gone with that “solution”. But still, how nice of the USB-IF to have incompatible, but only partly incompatible, standards where one’s called “3” and one’s called “C”.

            Maybe they were pissed off that, with USB 1 and 2, they’d invented something that actually does work nearly all the time. A true success story in getting rid of the rainbow of different ports previously on the back of PCs. Now 4 or so ports is absolutely fine, and adding, say, a Blu-Ray drive to a laptop is as simple as plugging it in.

            With USB C, or possibly 3, they went a step further in trying to eliminate the power adaptors needed to run your printer or Blu-Ray drive. PCs would have approached being… convenient!… if only by another step. So obviously they needed to fuck it up and increase the confusion quotient, or else all the tech support jobs they rely on in an emergency would dry up.

            That, or the Power Supply Brick Manufacturers’ Consortium sneaked spies in there and nobbled them.

          3. lol, if it was just about power, or even PCI-E, or even HDMI, how hard would it have been to use the same ol’ connector with *expansion* connectors to the sides? Coulda put it all in one connector shell, and it would be easy, then, to discern *which* add-ons are supported by the device, etc., and adaptor-dongles wouldn’t be necessary… though they *could* be used to e.g. send HDMI from a regular source like a vidgame or separate graphics card to a device with a USB-C connector, or supply high power via a wallwart when the other periphs are connected to a device that can’t supply the power.
            Nah, this whole mess reeks of the sorts of proprietary hack-jobs they use on PDAs and cameras, etc. to add additional dongle-features, requiring specific cables for each device and peripheral.
            Some “standard…” People keep saying “when” USB-C becomes commonplace, but it’s surely had plenty of time to do so already. So a better question is *why* it hasn’t.

          4. BTW all these varying-voltage power supply shenanigans, to send more power without needing thick wires, rely on cheap and simple switch-mode power supplies. The technology for those has been around for decades. How come it’s only recently that they’ve crept into everything? I’m sure early mobile phone chargers (say mid-1990s) were often linear PSUs with a big transformer. Some computers in the 1980s, but not most, went with switchers. Why did they take so long to become widespread? Now you can even replace a 7805 with a switcher that’s cheaper and more efficient.

          5. Greenaum: why more switch-mode power supplies? Because of efficiency requirements and improvements in the technology behind switching supplies. Higher switching frequencies allow smaller components, along with smaller heat-sinks. The transistors involved mostly consume energy (and produce heat) in the act of switching: spend less time switching (faster transistors) and you produce less heat. The inductors involved can then be smaller, because they don’t hit their saturation limit – high current leads to high magnetic fields, and the inductors become nonlinear and less… inductory… at high magnetic fields. In general, faster is better, if everything else can support it.

            To slightly modify what you said: the tech behind it has not been around for decades; the concept has. It is only more recently that the technology has improved enough for switching power supplies to be that magic combination of more efficient and reasonably priced.
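
            To put numbers on the efficiency argument: take a 12 V input feeding a 5 V, 1 A load (illustrative values only, with an assumed 90% buck efficiency):

              # Back-of-the-envelope: linear regulator vs. buck converter
              v_in, v_out, i_out = 12.0, 5.0, 1.0

              p_load = v_out * i_out                     # 5 W delivered to the load
              p_linear = (v_in - v_out) * i_out          # 7 W burned in the pass element
              eff_linear = p_load / (p_load + p_linear)  # ~42% efficient

              eff_buck = 0.90                            # assumed, typical mid-range figure
              p_buck = p_load / eff_buck - p_load        # ~0.56 W of heat

              print(f"linear: {eff_linear:.0%}, {p_linear:.2f} W wasted")
              print(f"buck:   {eff_buck:.0%}, {p_buck:.2f} W wasted")

            Less heat means smaller (or no) heat-sinks, which is a big part of why switchers now fit in a wall-wart.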

      2. I’ve worked on USB-C devices & cables, pored over the spec, even been to a couple of USB-IF interop sessions. In my experience there is no “Not Invented Here Syndrome”; the problem is that those who try to include things outside the spec most often do so *only* for their own benefit, blocking other things that are already in the spec. You can strive for universal backward compatibility, but it will most likely come at a high cost, so high that few will implement, and when few implement, the spec dies; so everything must be balanced. When you strike out on your own, you must accept that not everyone will follow you.

        At these interop events, I was surprised by the lack of ego- particularly from the fruit-company.

    2. I feel like I am missing something. Not sure if you are trolling or if I have a basic lack of understanding of the USB-C connector spec. Or maybe the joke flew so far over my head I didn’t even hear it go by.

    1. No, but allow me: the Nintendo Switch PSU does not follow the USB PD Source Rules, and consequently is not USB-IF certified. Much (but not all) of what we experience in the form of poor interoperability with the Switch ecosystem in particular is due in part to:
      1) multiple ASIC and USB PD firmware vendors being utilized to implement each of the USB-C/PD ports, validated only within this specific ecosystem

      2) expressly rejecting fundamental specifications which would have otherwise provided for mass-interoperability

      e.g.: the USB PD Source Rules (provide increasing monotonic power at designated fixed voltages, based upon the max voltage of the PSU – the PSU would have then included appropriate support for each of 5V, 9V, and 15V at corresponding current limits).

      Also, the DisplayPort on USB Type-C Alternate Mode specification was not followed and instead implemented in proprietary fashion.

      1. This statement was supposed to read: “e.g.: the USB PD Source Rules (provide increasing monotonic power at designated fixed voltages, based upon the max –power– of the PSU “
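
        For the curious, the Source Rules boil down to: a source rated for PDP watts must offer every lower fixed voltage at full current, so available power increases monotonically. A simplified sketch (the exact tables are in the USB PD 3.0 spec; treat these numbers as an approximation):

          # Simplified USB PD 3.0 Source Power Rules: which fixed-voltage PDOs a
          # source with a given PDP (PD Power, in watts) must advertise.
          def required_pdos(pdp: float) -> dict:
              min_pdp = {5: 0, 9: 15, 15: 27, 20: 45}       # voltage required above this PDP
              max_amps = {5: 3.0, 9: 3.0, 15: 3.0, 20: 5.0}  # current cap per fixed voltage
              return {v: min(max_amps[v], round(pdp / v, 2))
                      for v in (5, 9, 15, 20) if pdp > min_pdp[v]}

          # A ~39 W supply (like the Switch's) should offer 5 V and 9 V at 3 A, plus 15 V at 2.6 A:
          print(required_pdos(39))  # -> {5: 3.0, 9: 3.0, 15: 2.6}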

      2. Let me see if I can translate this from engineer-ese:
        The Switch peripherals were built to a price with lowest-bidder parts that just barely managed to do what Nintendo wanted (charge the Switch), so the fact that it doesn’t have the capacity to identify and charge other devices is no surprise.
        Is what I think that means.
        A more optimistic interpretation of events would be that Nintendo, seeing that they had no control over what devices are attached to this standardized power connector, had visions of houses in flames and people blaming their Switch dock rather than the cheap Chinese phone plugged into it, and so, to calm their quaking nerves, they had the USB-PD Device Policy of the dock configured to only charge the Switch.

        1. Nintendo… ANYTHING they don’t control… creates reflex-jerks so strong in their management that they often require spinal surgery to repair the damage. Nintendo have always hated standardised, open anything.

          Partly it’s to do with fighting piracy, although realistically using non-standard thingies is only one small hurdle for the geniuses the pirates employ, and of course every geek loves a challenge, especially when Nintendo have a history of being arsey in all sorts of other ways.

          I suppose there’s the possible bad press of someone using a shitty Chinese replacement charger. But that also applies to every mobile phone, particularly in a year or two when USB-C hits the mainstream. I would hope the idiots currently doing the jobs of journalists would be able to distinguish between something that’s Nintendo’s fault, vs something they have no control over.

    1. https://xkcd.com/927/

      USB was supposed to be UNIVERSAL. Just two types of connectors: A for the host end, B for the device end. Then someone wanted a smaller device connector, and thus was born Mini-B. Camera companies wanted to continue being proprietary, and sometimes to incorporate AV functions, so there are 2 or 3 digicam USB port variants. The first USB MP3 player (RIO PMP 500) used one of the digicam-type connectors but disconnected the 5V line. Don’t just plug in any cord that fits; the 5V could fry it.

      The USB 3 Micro-B and full-size B connectors achieved backwards compatibility by simply grafting the additional contacts for USB 3 onto the top (full size) or beside (micro) the existing connector, so USB 2.0 B plugs will go right in. Where’s the USB 3 Mini-B? I’ve mostly seen the full-size USB 3 B on POS and industrial equipment.

      Then comes 2007, and phone companies want to go thinner, so here come Micro-B connectors. Never mind that almost every cellphone from 2007 to this day is plenty thick enough to accommodate a Mini-B connector. Micro is *thinner*!

      Now we have USB C which is *supposed to be* compatible with previous USB standards with adapters, but some devices just won’t work connected to a C port with a 2.x or 3.x adapter.

      1. USB Mini connectors were deprecated because they were so awful mechanically, weren’t they? There isn’t a Mini USB 3 connector because they had already been retired. (They were unreliable and had poor retention.)

          1. Not a joke. The mini connector was deprecated a while back. I’ve had a number of early USB 2.0 external hard drives with mini connectors suffer from “loose socket” issues, where you have to be careful with plug insertion because the socket is not mechanically good at keeping the cable securely connected. The socket is the bit you really don’t want to fail (as it’s much more difficult to replace than a plug on a cable). One failed early enough that it was still within its 12-month warranty; others I just retired (shucking the drives where possible).

            There was a sneaky re-design of USB Mini connectors after the standard was ratified but it didn’t help much.

            Micro USB has been much more reliable for me – I’ve not had a single failure with USB 2.0 or USB 3.0 Micro-B plugs or sockets. The Micro-USB design includes a plug-integrated latch that ensures a decent connection, and most failures I’ve had have been to do with the plug, not the socket (where you really don’t want a failure), and I’ve just binned the cable then.

          2. I’ve had the opposite experience to Steve, and in line with duh. I’ve had multiple micro connectors (sockets and plugs) fail, but only one mini (a plug, no mini sockets).
            Bit like Apple’s move to remove the 3.5mm audio socket “because it was too big”. Ever heard of 2.5mm sockets?

          1. USB Mini is only rated to something like 1000 insertions; micro is rated to 10000. Micro was also designed so that the moving parts and latch are in the cable, so the cable is more likely to fail than the connector, leaving the device functional but you needing a new cable, which was generally cheaper at the time.

        1. All my hobbyist stuff uses mini, and I’ve never had one fail. I’ve replaced tons of micros, and thrown out a few devices because the failed micro was located too close to other stuff and I destroyed it trying to replace it.

          Mini was/is far, far, far more robust than micro.

        2. I got a Note 8, and the USB-C port on it is busted and it’s not even a year old. I have ZERO faith in such small connectors. How the heck do you push 100W of power plus data over such small contacts, especially when (irrespective of what the spec says about how mechanically robust it is) the connectors break so easily?

          I do software development work; requiring a USB tether to deploy and debug by default needs to die a horrible, violent, bloody death, especially when you can put ADB in TCP mode.

          1. You push 100W of power mostly by increasing the voltage – you only get 100W at 20V. That is still 5A, so the cable should also be electronically marked to show that the conductors are sufficient and not a fire hazard.
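
            The arithmetic makes the point (just P = V × I, so I = P / V; nothing spec-specific here):

              # Current needed to deliver 100 W at each PD fixed voltage
              for volts in (5, 9, 15, 20):
                  print(f"{volts:>2} V -> {100 / volts:.1f} A")
              # 5 V would need 20 A; only at 20 V does 100 W fit within a 5 A cable rating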

        1. Sounds a bit like any other “designed for” claims, from Amp-Hour ratings between different battery manufacturers, to LED luminosity. Experience shows otherwise. Different designers/manufacturers use different metrics, different test methods, have different expectations of real-world [ab]use.

          So, basically a rewording of your statement: Mini wasn’t designed with a specific MTBF in mind, whereas micro was.
          Marketers can use that to make micro sound better, and now even standards organizations fall for the hype. Experience shows the opposite, from micro connectors ripping from boards, to the shell splitting at the seam, to the flimsy plastic insulator/key cracking… and that’s just on the PCB side. The cable-side connectors fail at remarkable rates, and even high-cost, allegedly high-quality cables seem no better than cheapos.
          “Designed for failure” is, I think, the key. C doesn’t look any better.

  2. It was quite a surprise to me how hard it is to buy a simple USB-C to USB-C cable. Mostly you find USB-A to USB-C, which has its uses but is not what I want. To get USB-C on both ends, you generally have to head over to the “something to do with Apple” section and be prepared for a pretty stiff price jump.

    OTOH, USB-A and USB-B (micro) to USB-C adapter doohickeys are plentiful and cheap. They’re just inelegant and possibly sacrifice … something.

    1. Depends on what you expect your USB-C to USB-C cable to do. The shitty ones for the shitty phones are pretty much just regular USB 3 cables with different connectors; then there are the better ones with all pins of the connectors properly wired; then there are the ones with beefier wires for high-wattage PD and marker chips; and then there are the specialty ones with transceivers, e.g. for Thunderbolt-3 support.

      But yeah, if you want something that just works you better head for the Apple section; at least there’s no risk you’re going to pick up a shitty half-assed Android phone charging cable or one that is not suitable for high wattage PD…

        1. This cable will at least enable DisplayPort video out of a Thunderbolt port. For true Thunderbolt support through a cable, you have to buy a Thunderbolt-certified cable which will have the required active circuitry inside the cable.

    2. We ended up designing our own cables for a customer, and had to purchase the USB-C subassembly from a source in China (minimum order was 2500). It was a steep learning curve, but we also found the USB-C form factor supports USB 2.0 data.

      Also learned a 2.0-rated cable had 4 plain wires, with no twisted pairs for data and no shielding. I was just shocked. No one in China would ever label something USB and not comply with the standard…

  3. I only have one C device, my LG phone, and it infuriates me with a condescending message about using the original charger and cable, which I am using. The USB-C port has only ever worked with the cable one way round, so it is even worse than a micro USB, because I cannot tell by looking at it which way to plug it in; I have to suck it and see, and get that infuriating message if I guess wrong.

    1. On the other end, I bought a pack of micro USB cables a year ago and was kinda surprised when I opened the box and saw some strange-looking connectors. Turned out to be reversible micro USB connectors (still plugs into a standard micro female connector) that I never knew were possible, and they work extremely well, just with a little weaker connector retention when mating (in terms of strength it feels more like pulling out a mini USB than a micro).
      Anyway, they actually turn out to be quite solid, but could probably break if you manage to bend them strongly enough while inserted.

      1. And the USB-A side is probably reversible too. I’ve seen such cables, but I don’t like cables that completely ignore specifications to be more versatile. In USB-A, reversibility is achieved by using a thin PCB with contacts on both sides, so contact tension is sacrificed compared to normal USB-A. It’s probably the same with the USB-micro side.

    2. I’ve always marked my USB connectors with a stripe on the top side using a felt-tipped red pen. I never have a problem, other than the stripe needing to be renewed occasionally.

    3. I see that felt-tipped pens have already been mentioned.
      Cheap, dollar-store fingernail polish is handy for marking & color-coding things also.
      Just be sure to clean the intended surfaces beforehand to get good adherence.
      And keep in mind there is a volatile, flammable solvent in the polish and the remover.

  4. The article (and blog and comments) seems to ignore that there’s another wrinkle with USB-C connectors… The animated GIF (or is it “gif”?) illustrates this.

    Thunderbolt-3 uses the USB-C connector but has additional capabilities and requires a different cable (which I am guessing has a chip in it to identify it as a Thunderbolt-3 cable). Thunderbolt-3 ports can, of course, talk USB 3.1 and below, but the reverse is NOT true: you can’t send Thunderbolt networking or video through a plain USB 3.1 port. The cable in the animated GIF has the Thunderbolt icon on it, making it a Thunderbolt-3 cable.
