This form of "authentication" smells like DRM, especially as they're using it to limit power and they're using Digicert to sign devices. And if you're speaking X.509 or something on both sides I'm intensely curious to see how vulnerable the attack surface is.
The "authentication" that would actually help is to default to only allowing charging through ports (and allow charging without authentication - who cares where I get 65 W of power from? are there malicious electrons?), and show a little "new device" authentication screen with a Bluetooth-style pairing process when I connect something new. Then either burn an asymmetric key onto the device or generate a random symmetric key (and leave the CAs out of it), and on the host, bind the key to the current function of the device, as displayed on the pairing screen. If I clicked "Trust this keyboard," don't let it turn into a CD drive. If I clicked "Trust these headphones," don't let it turn into a keyboard. And so forth.
I do want to be able to plug into public USB PD outlets, other people's chargers, etc. confidently, and there is a legitimate security problem there. Even though it's wired and physically adjacent it isn't necessarily trusted. But that can be done without DRM.
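To sketch what I mean by binding the key to the function (illustrative pseudo-Python; the store, file name, and callbacks are all made up, not any real OS API):

    # Hypothetical host-side pairing store: a trusted key is only valid
    # for the function class the user approved it as.
    import json

    PAIRINGS_FILE = "usb_pairings.json"  # illustrative persistent store

    def load_pairings():
        try:
            with open(PAIRINGS_FILE) as f:
                return json.load(f)  # {key_fingerprint: approved_function}
        except FileNotFoundError:
            return {}

    def on_device_connect(key_fingerprint, claimed_function, prompt_user):
        """Allow a device only in the role the user originally approved."""
        pairings = load_pairings()
        approved = pairings.get(key_fingerprint)
        if approved is None:
            # New device: show the Bluetooth-style pairing screen.
            if prompt_user("Trust this %s?" % claimed_function):
                pairings[key_fingerprint] = claimed_function
                with open(PAIRINGS_FILE, "w") as f:
                    json.dump(pairings, f)
                return True
            return False
        # Known key but a new role: the "keyboard turned CD drive" case.
        return approved == claimed_function

Charging would bypass this table entirely, since power needs no trust.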
Even with some kind of authentication, that doesn't change the fact that public USB outlets can still be rigged to fry your machine. The likelihood is probably incredibly small, but I personally still wouldn't use public USB outlets.
I can trust a public outlet not to fry my phone. That is simple: if my phone stays functional I'm good, and if it gets fried I take some real-world measures (go to small claims court, file a complaint with some government agency) to get compensation.
The real problem with public outlets is data attacks, not power. If I can set my phone to ignore all data at the lowest hardware level, and that level is simple enough that no attack is viable (the second condition being a large "if"), then there is no longer a serious problem.
> That is simple: if my phone stays functional I'm good, and if it gets fried I take some real-world measures (go to small claims court, file a complaint with some government agency) to get compensation.
This is much easier said than done (in the US, at least).
First, you'd be looking at a year or more of going back and forth in small claims court, and by the end of it you'd still have no guarantee of winning. Even if you did win, there's no guarantee the judge would award you the full amount to pay for your fried device.
It almost certainly would not pay for your time at the courthouse, though; you can ask the judge for punitive damages, but you cannot include legal expenses in small claims court demands.
In the end you would still be out hundreds or thousands of dollars and numerous hours of your time. That's why to me the tiny benefit of using a public USB outlet is just not remotely worth the risk of such a nightmarish hassle.
This is the same thing as just plugging into a regular power outlet. You don't know if the thing you're expecting to provide 110 V is actually giving you 10,000 V.
Then again if things were really bad or malicious you could end up losing your life like Sheryl Aldeguer.
> The real problem with public outlets is data attacks, not power. If I can set my phone to ignore all data at the lowest hardware level, and that level is simple enough that no attack is viable (the second condition being a large "if"), then there is no longer a serious problem.
My old no-name Android phone (not USB C) has such an option, and searching around it seems to be a reasonably common feature, although not present on all devices. It looks similar to this (mine has the same typo, "USB fuctions").
In charge-only mode it doesn't even enumerate as a USB device when plugged into a computer with an active USB controller, so I suspect no attacks (besides physical ones like overvolting as others here have mentioned) are possible in this mode --- the USB controller on the device is completely disabled.
>> If I can set my phone to ignore all data at the lowest hardware level, and that level is simple enough that no attack is viable (the second condition being a large "if"), then there is no longer a serious problem.
> My old no-name Android phone (not USB C) has such an option, and searching around it seems to be a reasonably common feature, although not present on all devices.
I use one of these. The nice thing about the newer versions is that you can verify the impossibility of a data connection through easy physical inspection.
The problem with that is you also block the pins needed to negotiate power delivery over USB C, so you're limited to 10 W (5 V * 2 A). Laptops that use USB C for charging use C and not micro-B precisely because they need more power than that.
Isn't that the wrong direction? My concern is not so much that my phone will show up to an attacker as a storage device, which can generally be restricted or be set to require pairing - my concern is that the attacker will show up to my phone (or laptop) as an accessory, in particular as a keyboard and mouse and perhaps an external display, at which point they can click through any permission or pairing prompts the OS might try to show.
If my Moto G5+ is locked, then it seems that plugging in a keyboard behaves pretty much like plugging a keyboard into a desktop or laptop computer, except that it doesn't respond to Ctrl-Alt-Del.
I can unlock from the external keyboard by pressing the Windows key and then typing my PIN, but the multimedia keys and Print Screen on the keyboard work even with the screen locked.
I would rather that it didn't do any of that until I give permission on the mobile itself.
No such devices exist with USB Power Delivery support, I believe. (I believe you cannot make a passive adapter as you could for 5V, 2A-or-less plain USB A/B power connections; you have to speak the USB protocol and register as a device before you can switch to more than 5V or draw more than 2A. And plenty of people want more than 10W power.)
I also don't believe any OSes offer a mode where they negotiate power delivery and not data, though I'd love to be wrong - that would be 90% of my proposal. It wouldn't let you authenticate individual devices, but it would let you go back to having a single port that's just a charging port.
> you have to speak the USB protocol and register as a device before you can switch to more than 5V or draw more than 2A.
It depends on which "USB protocol" you're talking about. With USB-C, there are five separate data buses on the plug, each running its own protocol. There is the traditional USB 2.0 bus (half-duplex differential pair), a pair of differential dual simplex buses (normally USB 3.0/3.1/3.2, can also be used for alternate protocols like DisplayPort), a pair of sideband wires (used only by alternate protocols), and the Configuration Channel. All configuration of the other data buses and of the main power bus is done through the Configuration Channel, in simple cases through resistor values, in more complex cases through the USB-PD protocol.
So yeah, you have to speak the "USB protocol" to switch to more than 5V or more than 3A, but the "USB protocol" you have to speak is not the traditional USB 2.0 or USB 3.x protocol, but instead the completely separate USB-PD protocol, which runs on its own dedicated set of wires. I doubt you can show up as a pen drive full of malware through the USB-PD protocol, but I don't doubt that cutting the other buses (keeping only the Configuration Channel/VCONN and the main power bus) will still work to deliver power, even at higher voltages and/or currents.
The reason this isn't standardized (as far as I know, a "charging-only" plug is only allowed as a captive cable on a non-charger) is for compatibility: if your device doesn't understand USB-PD, it might not charge at a full speed unless it sees a short on the USB 2.0 bus (Battery Charging specification), and doing that short on an adapter would allow a device to draw too much power from a non-charger.
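To illustrate the resistor-value part (sink side): the advertised current budget can be read off the voltage the source's pull-up produces on the CC wire. The threshold windows below are my approximation of the Type-C spec's detection ranges, so double-check them before using them for anything real:

    def advertised_current_from_cc(cc_volts):
        # A sink with Rd pulled down sees a CC voltage set by the source's Rp.
        # The ranges are approximate detection windows, not exact spec values.
        if 0.25 <= cc_volts <= 0.61:
            return "default USB power (per the data spec in use)"
        if 0.70 <= cc_volts <= 1.16:
            return "1.5 A at 5 V"
        if 1.31 <= cc_volts <= 2.04:
            return "3.0 A at 5 V"
        return "no source detected / out of range"

Anything beyond these fixed advertisements (higher voltages, higher currents) goes through USB-PD messaging on the same wire.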
DC voltages below 50 V are considered "safe" in normal indoor conditions. It's often referred to as "extra low voltage", as opposed to "low voltage" (50 V - 1000 V) and "high voltage" ( > 1000 V).
If you want to verify the safety of 18 V: ever touch both poles of a 9V battery with your fingers? You can't feel anything. Go ahead and try two in series, you still won't feel it.
That's roughly the voltage that laptop AC adapters output, and they do so continuously without needing any special negotiation/signaling.
(Even those from manufacturers that try to lock this down using an EEPROM, e.g. Dell, still output their voltage like the others, but the laptop might not want to charge from an "unofficial" power source.)
Well, no. If 5A flows through you from hand-to-hand, sustained for longer than an instant, you're probably dead. But I've touched live power supply terminals for something rated for 20A without a second thought.
And if there's a 10kV differential between your left and your right hand, sustained for longer than an instant... You're probably dead. But I've touched a 10kV power supply (not on purpose, this one), and...still here.
So! What gives?
Well. Place a 0.1 ohm resistor across that power supply. 18V across 0.1 ohms will produce a current of 180A. This will dissipate more than 3kW in the resistor, and it will very rapidly disassemble itself. But that doesn't happen!
Okay, now put a 1 megaohm resistor across that power supply. It's 5A, right? So 5A across 1 megaohm will produce a voltage of 5000kV. This will dissipate 25 megawatts, and destroy the resistor even faster. This also doesn't happen!
In fact, what will happen in the first case is that 5A, not 180A, flows (assuming the PSU doesn't detect the apparent short and shut down entirely). But Ohm's law says that if 18V is placed across a 0.1 ohm resistor, 180A must flow!
Ah, but 18V isn't placed across the resistor. Instead, as current draw approaches the 5A limit, the power supply starts dropping the voltage. No more than 5A will flow, even if that means the voltage must be decreased.
And 5A won't flow through the 1 megaohm resistor, either. Instead, no more than 18V will be placed across the load, even if that means the current must decrease.
So, the power supply won't kill you UNLESS your skin resistance is low enough that 18V causes a lethal (very roughly 1A) current to flow through you. Luckily, humans have a fairly high skin resistance, at least when we're dry, and 18V is fine.
It's not the volts that kill you, it's the amps, to which the volts are directly related. 10kV across your body, sustained for a bit, will kill you dead, modulo some sort of miracle. 10kV at 1mA across your body ... is impossible. If you touch a 10kV, 1mA supply, the voltage will drop so that only 1mA flows.
It's not the voltage limit on the power supply that kills you. It's not the current limit on the power supply that kills you. Both need to be high enough to do you in.
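The whole argument above is just the constant-voltage/constant-current operating point; here's a minimal sketch of it:

    def operating_point(v_set, i_limit, r_load):
        # Ideal CV/CC supply into a resistive load: hold v_set until the
        # load would draw more than i_limit, then hold i_limit and let
        # the voltage sag. Ohm's law is satisfied either way.
        if v_set / r_load <= i_limit:
            return v_set, v_set / r_load      # CV mode
        return i_limit * r_load, i_limit      # CC mode

    # The 18 V / 5 A supply from the example:
    print(operating_point(18, 5, 0.1))        # (0.5, 5): 5 A, not 180 A
    print(operating_point(18, 5, 1_000_000))  # (18, 1.8e-05): 18 V and microamps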
I would like to borrow your electronics skills for another, quite unrelated question that has been hovering over my head for quite a while.
I know that normal USB (1.x/2.0) devices charge using the 5V rail provided over the USB connection. And I've read somewhere else that, coming from a PC motherboard or another regulated (standardized) USB outlet, the port will provide about 500mA.
What would happen if I took an ATX PSU and rigged USB ports directly (at the correct power pins) to the PSU's 5V rails?
Would I fry any devices plugged in? Or..
Would any plugged-in devices limit their intake, leaving me with a 'super quick charger' that can charge any device at the maximum amount of power the device can receive? Those 5V rails usually output 30A+.
Thank you again, and I hope you can shed some light on this long-standing doubt of mine (I always 'dreamed' of creating some ATX-Frankenstein PSU which I could use for electronics experiments but also fit some USB ports into).
1. Assuming you hooked the 5V to the USB 5V and the GND to the USB GND pins, nothing bad would happen.
2. Nothing would get fried
3. Plugged-in devices would respond according to how they're designed. Most cell phones will limit current and only draw 500mA if they don't have any other signalling telling them otherwise. Some will slowly ramp current up to their own internal limits if the voltage drop isn't too bad. Some 'dumb' devices will pull as much current as they 'want' or need. You won't be able to push 30A into any device because there are almost no USB devices that 'want' that much current. Even really poorly behaving devices (outside of outright broken ones) will probably only draw 3-5A max. Technically this is non-compliant, since there are defined USB specifications for how USB ports should be connected.
If you wanted to make a 5V 'mega USB' charger, here's all you'd need to do:
Take your 5V, many-amp power supply and connect as many USB-A ports to it as you want, with the constraint that the number of USB ports should be at most the total amperage divided by 2.5. E.g. if you had a 10A supply, only use 4 USB ports.
Short the D+ and D- pins together on the connector. 5V should be connected to your large 5V rail, GND should be connected to ground.
That's it. You've just made a 'DCP' (Dedicated charging port) device. By USB standards any device connected can pull up to 1.5A; in practice they'll usually pull 2.1A or more if they can.
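As a quick sanity check on the port budget (using the 2.5 A-per-port rule of thumb above, which is this comment's heuristic, not a USB-IF number):

    def max_ports(supply_amps, amps_per_port=2.5):
        # How many USB-A charge ports one 5 V rail can feed,
        # per the rule of thumb above.
        return int(supply_amps // amps_per_port)

    print(max_ports(10))  # 4, matching the example above
    print(max_ports(30))  # 12 from a beefy ATX 5 V rail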
In particular, people seem to use untrusted AC power outlets all the time without their devices being fried. The motivations of an attacker who wants to fry devices (mere lulz) and the motivation of an attacker who wants to break into working devices silently are very different; it makes sense to expect more of the latter, and to define the latter and not the former as within your threat model.
A fuse protects against overcurrent. It does nothing against overvoltage, or worse, reverse polarity (which was what fried Benson Leung's device). USB-C has a maximum of 20V, and that's only when negotiated by both devices; many devices will only expect 5V. If an untrusted low-quality power supply fails, it can put the full line voltage (alternating from -110V to 110V or more) into the USB cable. Protecting against that would add a lot of bulk and cost to a device. AC powered devices, on the other hand, already have to deal with the full line voltage (which includes negative voltages) all the time, so there's no extra cost for them.
I mean, they also can't solve a malicious attacker connecting a powerful taser above a USB port and trying to maim whoever uses the port. What's your threat model again?
Obviously. I mean that you can't protect the device from malicious power and USB outlets. And for non-malicious things, the device should already have input protection; otherwise it's unsafe to use when plugged in and unsafe to leave plugged in unattended (you could die from electrocution, or it could cause a fire).
A real problem for some is that standardizing power adapters interferes with the sale of original manufacturers' adapters. The authentication mechanism looks to be designed to fix this revenue leak.
Why would anyone do that? If you just want to vandalize things and have the physical access necessary to install a USB outlet, there are much easier ways to do that. Data attacks are much more realistic because the attacker can gain something beyond just causing destruction.
I am also suspicious of public USB charging ports. That's why I prefer to use my AC power adapter. Not only do I make sure to get full USB-C power capacity, I know nobody can hack my phone because the AC port has no data channel.
I don't know squat about USB C (I don't have any equipment that uses it yet), but with A & B, my solution to this problem is simple -- I use charging-only cables that don't pass the data lines through.
USB C has a power delivery spec that permits increasing voltage above 5V; in order to do that you need to negotiate what each side (and the cable, too, I think) supports before increasing voltage. That's how laptops that use much more than 10 W can use USB C as their charging port.
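During that negotiation the charger advertises its options as 32-bit Power Data Objects. Here's a sketch of decoding the common fixed-supply kind, going by my reading of the PD spec (verify the bit layout before relying on it):

    def decode_fixed_pdo(pdo):
        # Fixed-supply PDO: bits 31:30 == 00, voltage in bits 19:10
        # (50 mV units), max current in bits 9:0 (10 mA units).
        if (pdo >> 30) & 0b11 != 0b00:
            raise ValueError("not a fixed-supply PDO")
        volts = ((pdo >> 10) & 0x3FF) * 0.05
        amps = (pdo & 0x3FF) * 0.01
        return volts, amps

    # A 20 V / 5 A offer, i.e. the 100 W ceiling mentioned elsewhere here:
    print(decode_fixed_pdo((0b00 << 30) | (400 << 10) | 500))  # (20.0, 5.0)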
But that's only for high voltage charging, right? You can still get 5V without negotiation, if I understand correctly. If so, that's a tradeoff I would be perfectly fine with.
USB-C is being used to power things where 5V is not remotely sufficient. So while for some devices (cell phones, largely) 5V is a plausible tradeoff, I don't really see a world where many MacBook / Chromebook / etc users opt for 5V max charging power.
Fair enough, but I was really just talking about safety when using public charging points, which implies smaller devices. Also, I was really talking about what I find acceptable. Other people's mileage may vary.
Counterfeit and/or unsafe USB power delivery cables are a fire hazard today, whether or not a data channel is negotiated. How does disabling data delivery improve the safety of a PD-only USB connection over a physically unsafe cable?
This crypto-signature approach is already used by the Apple MFi program, which has made it difficult-to-impossible to sell counterfeit and/or unsafe Lightning devices.
I think we have completely unrelated threat models. I'm worried about malicious USB ports / devices trying to steal my data or at least make me have a bad day. If someone is trying to set a device on fire, authentication won't help. They just need to fail authentication and use that as the bomb's trigger.
You're worried, I think, about poor-quality devices that aren't malicious, just incompetent - or poor-quality cables. (I'm already willing to carry my own cable - I already carry a USB-A data blocker around, but USB-C data blockers don't seem to exist.) I think that can be solved without cryptographic authentication, with the standard approach of having a certification organization like UL or the USB-IF themselves investigate cable/charger production and test them. It seems the major problem here is with Amazon listing cables without showing whether they have certifications (or checking whether certifications are forged), and that's a general Amazon problem with poor-quality hardware and counterfeits.
This article is talking about autorun, Stuxnet, etc. so I think they're focusing on the latter. If the USB-IF wanted to use digital signatures to authenticate cables I'd be more on board, but if they do that, they should just be authenticating cables, not devices. (A cable, I believe, can be engineered to just cut the circuits if a device tries to pass too much voltage, or is wired backwards, or something.)
They do -- I carry one that one of my employees made in our lab. Search for "usb-c condom" or "usb-c data blocker" and you will find several offered for sale.
The ones I've seen work by severing all of the data pins, limiting you to the old 5 V * 2 A limit. Have you seen any data blockers that support the power delivery spec and are suitable for devices that need over 10 W (larger laptops, the Switch, etc.)?
I have one that negotiates power delivery on both ends. But I don’t believe it was manufactured commercially — it’s just a bare board with no housing. I’ll ask the guy who made it if he’d publish the design files.
I assume USB-IF wants all uses of the word USB with respect to any hardware - including cables - to be backed by a DigiCert signature, and all other uses to be rejected by non-developer-mode devices so their enforcement success rate improves versus generic knockoffs that can’t easily be sued.
Apparently not according to the Chinese, who made the first clone (based on an 8051 microcontroller IIRC) within months of the official release. They're still in an ongoing cat-and-mouse game with Apple.
Difficult-or-impossible to bring to market for sale to all iPhone users worldwide, not difficult to reverse engineer on a bench.
By controlling issuance of permission to use the connector, Apple is able to prohibit non-compliant products from becoming widespread worldwide. (And even China just last month declared their intent to make IP enforcement viable, which further strengthens Apple’s position with respect to MFi.)
The USB folks appear to believe that if they introduce a similar licensing arrangement for USB, they’ll be able to refuse licenses to unsafe implementations to deny them access to the worldwide USB market. MFi still works today, so I certainly don’t blame them for trying.
That is not a meaning I would have derived from my reply. You’re welcome to construct a scenario explaining what you mean, but the question alone offers no guidance on what you’re thinking to permit me to construct a reply.
You've argued that legal protections, including China's stronger IP stance, mean that USB-IF can be confident about cracking down on counterfeiting, and that it doesn't matter that the connector is easy to reverse-engineer. If so, how is a cryptographic authentication scheme as described in the article (including authentication of devices, not just connectors and cables) relevant, let alone useful? How does it help anti-counterfeiting, and why are the legal mechanisms by themselves insufficient?
I'm not planning to follow up further on this thread, sorry. If you are able to define an anti-counterfeiting mechanism that does not depend on cryptographic signatures, you could probably corner a billion dollar market. I encourage doing so, but have no further assistance to offer you.
Hopefully the computer it's getting plugged into. Considering how viruses apparently spread through USB sticks, some protection is clearly a good idea.
No idea if that's what this system does, though. Ultimately it has to be the computer itself that knows to protect itself. If insecure USB-C also exists, having the possibility of security may not be good enough.
Given the number of crappy USB-PD implementations out there, it would be nice if some sort of counterfeit-resistant validation program existed. That said, I would still like the option to trust/use an "insecure" device.
UPDATE: actually, no, the display is upside down on one of my devices! Duh! A cheap gyroscope would help, or another OLED screen on the back facing the opposite direction (as USB-C cables are reversible).
If you spend a bit more, you can find ones that will trigger USB-PD without you having to plug a device in. They will also test the various protocols like Qualcomm's Quick Charge, Huawei's FCP, Samsung's AFC, etc.
The ports are only on the right-hand side. The voltmeter screen always displays in the same orientation. So if I flip it over, it will display correctly instead of upside down, but only the person sitting in front of me will be able to read it, as the display will be facing the back of the laptop's screen.
It is a simple USB-C design oversight from Plugable that a simple gyroscope or extra screen would have fixed easily.
>And if you're speaking X.509 or something on both sides I'm intensely curious to see how vulnerable the attack surface is.
Very. ASN.1 vulnerabilities have been exploited multiple times in the wild, in mainstream libraries (like OpenSSL). Now imagine every hardware vendor writes their own shitty implementation and ships it as a blob to phone manufacturers. And that's all without going into the full blown X.509 PKI part of the spec, which adds yet another layer of complexity - all under control of chargers you plug into. Now imagine this is implemented by the kind of companies who wish to perform DRM on what charger you are allowed to use.
This spec is a recipe for disaster and implementing it will seriously reduce the security of devices.
1) There's no practical alternative to X.509 for PKI because the standards, tooling, and libraries simply don't exist and arguably never will given the momentum of X.509. A PKI infrastructure won't even make it out the gate if it can't interoperate with the existing X.509-based CA ecosystem.
2) Historically you never wrote ad hoc parsers. ASN.1 describes a grammar to be consumed by a compiler generator. The X.509 specification and extensions are almost entirely ASN.1 grammars with interspersed commentary. You can feed those grammars directly into the generator, and IIRC I've seen builds where the grammars were extracted directly from the RFCs. OpenSSL and other open source projects suffered from parsing exploits because historically all ASN.1 compiler generators were closed source, commercial products.
I don't disagree that it will likely be ugly. Open source options for properly working with ASN.1 are still poor, and these days even commercial vendors principally rely on open source toolchains. I've used the amazing asn1c which can be fed the official X.509 grammar and spit out a clean, strongly-typed BER, DER, or PER parser. But I've never seen it used by other open source projects. IIRC the mailing-list is mostly chatter from telecommunications contractors.
Bad ASN.1 support doesn't need to be the case. Hand rolling DER parsers is a choice. If we can write good generators for gRPC, we can write them for ASN.1. And from the perspective of C and C++ asn1c largely suffices.[1]
[1] My biggest gripe with asn1c is that, IIRC, it still requires dynamic memory for grammars like X.509 with variably sized arrays and strings. It would be nice to see a compiler that supports generating statically sized fields and which simply errors out if an object doesn't fit within the field. Then you could use asn1c for really low-level, embedded firmware, and more importantly could greatly simplify application code traversing the data structures. ASN.1 supports specifying maximum sizes which could be used to size fields (and maybe are in asn1c) but that's a separate thing, and in any event X.509 doesn't make use of this (though I suppose it would be easy to amend the specification, formally or informally.)
EDIT: s/compiler generator/parser generator/. Though, ASN.1 parser generators are also typically compiler generators, I guess, in the sense of generating code to "compile" data structures to the encoded form, which is probably what was going through my head when I muddled the terminology. The benefit of fixed-size, pointer-free structure fields in C is more pronounced when initializing a structure to be serialized.
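To make the "hand-rolling is a choice" point concrete: even DER's outer TLV layer has enough edge cases (long-form lengths, truncation) to get wrong, which is where many of those parsing bugs lived. An illustrative reader, not hardened code:

    def read_tlv(buf, offset=0):
        # Read one DER TLV at buf[offset:]; returns (tag, value, next_offset).
        # Single-byte tags only; long-form lengths per X.690.
        tag = buf[offset]
        length = buf[offset + 1]
        offset += 2
        if length & 0x80:                # long form: next n bytes hold the length
            n = length & 0x7F
            if n == 0 or n > 4:
                raise ValueError("unsupported length-of-length")
            length = int.from_bytes(buf[offset:offset + n], "big")
            offset += n
        if offset + length > len(buf):   # the classic overread bug lives here
            raise ValueError("truncated TLV")
        return tag, buf[offset:offset + length], offset + length

    # SEQUENCE (0x30) wrapping INTEGER (0x02) with value 5:
    tag, value, _ = read_tlv(bytes([0x30, 0x03, 0x02, 0x01, 0x05]))
    print(hex(tag), value.hex())  # 0x30 020105

And this sketch doesn't even enforce DER's minimal-length rule, which is exactly the kind of check ad hoc parsers forget and generated ones don't.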
Can't this be solved in the host OS? USB devices have a vendor and product id (VID, PID) and a device class (HID class [1]). I'm not sure why the operating system couldn't keep track of which ones are authorized. It could also allow whitelisting of certain HID classes (eg "allow all mice"). If the HID changes, it would prompt for authorization again, which would mitigate an entire class of attacks ("BAD USB").
It would still allow someone to bypass the authentication if they know the VID and PID of something already authorized with the desired HID class, but that seems reasonable and is significantly better than what we have today.
Are there any systems that work like this today, and if not, is there a reason?
That would work somewhat—VID/PID can't be trusted but device class can be. But the problem is that on e.g. my MacBook with a single USB C port, sometimes I want to use a keyboard (either an actual keyboard or a YubiKey) and sometimes I don't (if I'm charging). If I plug in a hub I want to trust my hub but not someone else's device claiming to be a hub. And so forth.
Having a charging-only-by-default port would be a 90% solution, but would be very annoying for people who need to reauthorize a device every time they plug it in, which is why I think pairing is better. And I'm not sure what devices you can reasonably trust other than pure power. HID is definitely out for fear of Rubber Ducky-style attacks. External video adapters can read your screen. Headphones can listen in on calls. Thumbdrives / PTP devices might be safe if handled carefully but are kind of the classic worry, so I'm not sure if people will want those. Maybe pure U2F devices, webcams, and audio input devices? Not sure how useful or user-friendly such a restriction would be.
I guess what I'm picturing is the first time you plug in your keyboard/Yubikey/whatever, your system prompts you "A new device 'AcmeCo USB Keyboard' (VID=xxxx/PID=xxxx) wants to provide service(s): Keyboard Input. Allow? Once/Always/No/Never." After that, your system saves that combination, and assuming you authorized it permanently, whenever you plug in that VID/PID, it has access to be a keyboard.
If it tries to present another HID later, that would result in a new prompt, the same as a brand new device. If it changes HID while plugged in, that could even be a more severe message along the lines of "Hey, this keyboard just tried to become a video adapter and thumb drive, that's really suspicious and you should probably destroy it with fire".
Likewise, VID/PID changing without being unplugged/replugged in, changing too quickly, or even just after too many new VID/PIDs in a short time period should result in a temporary lockdown -- in the same way fail2ban blocks repeated login attempts from a given IP.
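Roughly this policy, sketched out (the VID/PID/class triple is a simplification of real USB descriptors, and the prompt text is illustrative):

    # Illustrative OS-side policy: remember approved (vid, pid, class)
    # triples, prompt on anything new, hard-deny mid-session changes.
    authorized = set()
    denied = set()

    def on_enumerate(vid, pid, usb_class, prompt_user):
        key = (vid, pid, usb_class)
        if key in authorized:
            return True
        if key in denied:
            return False
        ok = prompt_user("Device %04x:%04x wants to provide class 0x%02x. Allow?"
                         % (vid, pid, usb_class))
        (authorized if ok else denied).add(key)
        return ok

    def on_class_change_while_plugged(vid, pid, new_class):
        # A "keyboard" sprouting a storage interface mid-session is the
        # Rubber Ducky pattern: lock it out rather than re-prompting.
        denied.add((vid, pid, new_class))
        return False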
Exactly why cryptographic authentication is required. There are fuzzing devices today that present themselves as known VID/PID/HID combinations that have generic drivers in modern OS's and can exploit that.
One of the primary drivers for this is power delivery. Imagine a scenario where you have an authentic power brick and laptop and buy a counterfeit USB-PD cable; the power brick sends 100W over it and the cable melts, since it was really a $5 knock-off, and you end up with a fried USB-C port in your laptop and possibly a fire. I'm sure most people would wish that the power brick/laptop checked that the cable is genuine and built to handle the power.
That seems like a very weak failsafe likely to be easily cracked and those bad actors only slowed down a bit - if history is any guide.
Why not implement something like detecting voltage drop between charger and device? Short circuit detection, etc. You can do a lot if you can have both sides communicate with each other, and at that point it matters a whole lot less if your cable lies to you plus you catch a lot of other failure conditions.
There are more use cases where being able to cryptographically prove the identity of the device is very useful from a security point of view; PD is simply one of them.
> One of the primary drivers for this is the power delivery - imagine a scenario...
Not a USB engineer, but a quick peek at the BC 1.2 spec suggests the ball would have to be dropped on at least two additional fronts for the suggested failure mode to occur:
1.) The Charging Port device vendor for failing to implement any of the allowed shutdown measures suggested in BC1.2 §4.1.4; and
2.) the Downstream Port device vendor for failing to constrain current draw based on sensed Vaca_opr min = 4.1V (Table V) per ibid. §§6.2.2 and 6.3.1.
I can see how the knock-off cable would be toast, and depending on insulation rating, may potentially catch fire, but struggle to see how either the CP or DP USB-C ports would fry (considering they're both designed for a target power).
In other words, basically treat the driver initialized to talk to the client-device the way mobile iOS treats apps: start them off with no capabilities, and then make them ask the OS to prompt in order to get persistent capability tokens granted.
The USB standards body is worried about 'fake' cables which claim to be able to support 1000 amps of current at 1000 volts and then proceed to catch fire...
This authentication is supposed to prove that the cable's claims have been tested.
It will undoubtedly help safety and compatibility, at the cost of an open ecosystem and pushing prices up.
Well, as it pertains to power delivery, it seems to me that UL and friends could have their own keys, which they would delegate to manufacturers of certified devices.
I figure the authentication will happen largely in the host software as well, so if you're running a real operating system you'll be able to install your own keys.
Most of the people I love and care about, as well as most of the people who hold my personal data in trust, are not using "real" operating systems in your definition.
(Besides, how do you see this working? Are you manufacturing your own USB devices?)
I've done a USB device, so yeah I have some stake in this. I suspect there will be CAs and certificates just as with TLS, the question is how costly and time consuming it'll all be.
"and allow charging without authentication - who cares where I get 65 W of power from"
That 65W of power on USB-C is coming in at 20V most likely. Many devices simply will not tolerate that high of a voltage. This is why authentication and negotiation are absolutely essential.
Negotiation, yes. But why authentication? Why do I care from whom the 20 V come as long as it is actually 20 V?
You can fry a regular AC-powered device by giving it 240 V power instead of 120 V, or 50 Hz instead of 60 Hz, or whatever. But we don't need authentication of power outlets. What makes USB different?
I'm torn with USB-C. We're finally getting some cool additions to the most-used standard (though I don't know if I like this one), but at the cost of fragmenting it to the point that it's not so much a standard anymore.
USB-C inconsistency on the hardware side of things has become a meme by itself. With this addition, sometimes when you plug a device in it might not work despite being exactly the right device/port combination (which working would be a miracle on its own even in the current environment).
I don't know. Maybe in 5 years when we've glued on all the stuff we want from this connector things will be okay. In the meantime though it's chaos. Committee approved chaos.
I've come to believe that USB-C is a mistake that we will all regret.
USB-C enthusiasts are (from my experience) people who haven't really used it much and think that they will be able to get around with a single cable and a single connector standard.
People who actually tried to use it for a number of things quickly realize that USB-C is just the name of the physical connector, which has nothing to do with what the device supports, which in turn has nothing to do with what the (unlabeled!) cable supports. There are no standards for labeling devices or cables, so you quickly end up in a world where you have a bunch of cables and devices all using a single plug, but you have no idea which device will work with what. I daresay this is a worse situation than having multiple types of plugs and cables, because at least then you could set reasonable expectations.
Add to this the fact that there are no reliable hubs for USB-C connectors, so you're basically stuck with what your laptop/computer offers, unless you want to live in a world of crappy unreliable hardware (I don't).
>USB-C enthusiasts are (from my experience) people who haven't really used it much and think that they will be able to get around with a single cable and a single connector standard. People who actually tried to use it for a number of things quickly realize that USB-C is just the name of the physical connector, which has nothing to do with what the device supports, which in turn has nothing to do with what the (unlabeled!) cable supports.
Well, I'm using it for my monitors, soundcard, power, iPhone/iPad charging, Sony headphones, and external hard disks.
What exactly am I missing from the whole issue, since I don't seem to be having any problems?
Unfortunately that isn't enough yet, as there is no "universal" cable, and may never be. You can have an excellent cable that nonetheless fails for what you plugged it into, or (potentially worse) falls back to something that kind of works (e.g. a slow data transfer through a port that supports PCI-e).
The biggest botch the committee made was not mandating connector labeling. I'd have gone with colors, but not everyone can see them. If the cable had resistor-style stripes to indicate what it supported, and if the ports on the host devices did likewise we'd be in much better shape. Even some licensed logos would have helped though the connectors are so tiny.
Then again the names the USB committee has come up with in the past (high speed, SuperSpeed, 3.0, 3.1 etc) have uniformly been confusing rather than elucidating.
In my case I agree: I have had nothing but success with Type C, and it has simplified my life (except for the lack of high power chargers). But I had to learn more than most people would or should have to in order to get there.
The cable is not a dumb set of wires; there's a lot of silicon in the connectors. Though they fall back by default to the old resistor sensing used in earlier USB specs, the higher functions (which include power delivery) are negotiated.
As far as I know a universal cable is not possible. My reading of the spec says that it should be but I am not a USB implementor and some of my friends who are tell me it's not possible. In any case nobody ships such a thing.
The charging cables provided with devices are likely to be power-only. With the 87W charger, Apple ships ones that can carry about 100W (the maximum per USB spec -- 20V@5A) which requires thicker conductors; my understanding is that those cables don't carry data (i.e. they can safely be plugged into any third party brick without you worrying that the remote device will attack your computer). You want a power-only cable for this purpose. I am not sure if they have the same cable with the smaller adaptors.
The very high speed cables only work over short distances unless you go to optical (which I'm not sure anybody has deployed).
The most standard cables carry "USB 3.1" which is just another name for USB 3.0 carried over Type C connectors. The higher frequency and higher power cables are more expensive to design and build so will cost the customer more. That's one reason you have to specifically look for cables for DP, PCI-e/Thunderbolt etc.
> The most standard cables carry "USB 3.1" which is just another name for USB 3.0 carried over Type C connectors.
No, it's not; USB 3.1 adds the new SuperSpeed+ mode over all supported connector types (and USB 3.2 adds two additional SuperSpeed+ modes for Type-C connectors.)
USB Type-C connector spec is a separate spec issued after USB 3.1, USB 3.1 isn't USB 3.0 over Type-C connectors.
Apple isn't the only vendor using Thunderbolt 3. According to Wikipedia [0], the following vendors have all shipped devices with at least one port: Acer, Asus, Clevo, HP, Dell, Dell Alienware, Lenovo, MSI, Razer, and Sony.
No, this has entirely to do with thunderbolt and USB spec.
Dating back to the early days of Apple, they have been hyper conformant with the spec (e.g. no drivers needed to connect to conforming devices). They still are.
I answered your question about which cables above.
Thunderbolt 3 is 40 Gb/s of PCIe + DP — it basically exposes your bus through a Type C connector. USB theoretically tops out, as of last year, at half that, though in practice you won't see anything faster than USB 3.0 (if you're lucky), which tops out at 625 MB/s before all the USB protocol overhead.
> though in practice you won’t see anything faster than USB 3.0 (if you’re lucky) which tops out at 625 MB/s
625 MB/s is 5 Gb/s; the switch from bits to bytes seems to be a way to exaggerate the difference from Thunderbolt speeds quoted in Gb/s. At any rate, USB 3.1 Gen 2 (equivalent to USB 3.2 Gen 2×1) 10 Gb/s cables and device support aren't hard to find; I don't know that anyone has USB 3.2 hardware (controllers, or cables supporting the new 2×2, 20 Gb/s mode) yet.
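Keeping the units straight (nominal line rates, before encoding and protocol overhead), a quick sketch:

    def gbps_to_mbytes_per_s(gbps):
        # Nominal Gb/s line rate to MB/s; ignores encoding/protocol overhead.
        return gbps * 1000 / 8

    for name, rate in [("USB 3.0 / 3.1 Gen 1", 5),
                       ("USB 3.1 Gen 2", 10),
                       ("Thunderbolt 3", 40)]:
        print(name, gbps_to_mbytes_per_s(rate), "MB/s")
    # 625.0, 1250.0, and 5000.0 MB/s respectively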
The main reason you can't make a truly universal USB cable is the same reason you can't make a universal ethernet cable. You can make a cat5e cable, you can make a cat6 cable, you can make a cat8 cable. But you can't make a cable that's guaranteed to support anything.
Currently there are four common speeds/specs for the high-speed lanes: none, USB 3.0 at 5 Gbps, USB 3.1 at 10 Gbps, Thunderbolt 3 at 20 Gbps per lane.
Cables that can support the higher speeds without signal loss issues tend to also be increasingly short. That's another issue getting in the way of an ideal cable; it's hard to carry such high frequencies to the same distances.
There's no tradeoff between speed and 100W support, but most cables do lack 100W support for whatever reason.
A lot of cables (even from Apple and Google) pick the "none" option for high speed lanes. Such a cable can only run at USB 2.0 speed and is often called a "charging-only" cable. (True charging-only cables are significantly more rare. They also probably violate the spec.)
> What exactly am I missing from the whole issue, since I don't seem to be having any problems?
Dude, you're using a pair of headphones that cost more than most people's cellphones and laptops. Normals use years-old Android devices, out-of-date Windows 7 laptops and the absolute cheapest peripherals from the drug store.
Of course that stuff is going to be buggy.
Even the Nintendo Switch, due to its hardware, uses the highly unusual "myDP" USB-C display standard instead of conventional alternate mode.
Probably what the OP meant was that people who don't have that much money to splash around won't be bothered to pay crazy money for the "right" connector cables. For example, I just got a new Mac Mini (which I genuinely like so far) and I just checked to see how much a USB to USB-C connector costs on Apple's website. When I saw that it costs $20 I said "f.ck it, I'm going to buy the 4.50 euro no-name connector that I just found online". I suppose I'm not the only user thinking like this.
The Amazon Basics USB to USB-C cable that pops up when you search for usb-c cable on Amazon is $6. That should be reputable enough that you won't damage your $800 computer without having to shell out $20 to Apple for their branding on a cable.
I think you are mostly lucky in that you only need a limited subset of the full functionality. You were lucky with video, you probably don't need Thunderbolt, your charging needs are modest, and you don't care about peculiarities like USB 3.1 Gen 2 speeds. In that scenario things will usually work fine, assuming you don't save on cables.
But it's enough to just read the cable reviews to learn that things are not always fine.
Also, one thing which cannot be blamed on "crappy cables" is that the presence of a USB-C port on a computer does not mean much at all, because in general one has no idea what can be connected to it.
The problem that *can* occur is drawing overcurrent.
USB-C is supposed to support higher power demands, and if your USB port can't pass the current because it's made of components too fine/resistive for the current being drawn, then {P@FF!}
PD hosts are required to monitor and limit current draw. Any magic smoke released is the fault of the host device, not the peripheral that tried to draw too much current.
That's what I mean: when they don't limit current draw (because, despite being required to, the design is cheaper and faster rather than better), then that's it.
You get heat at the point of highest resistance, so the port gets very hot, and undervoltage beyond that resistance, so the device gets wonky unless it's regulated to prop the voltage up.
It's worse than that. We now have a situation where different cables of same connection type can damage hardware (as we've seen with powered USB-C hubs and the Nintendo Switch with some 3rd party chargers and docks, etc).
It used to be the case that anything of the same connection type would - at worst - not work if it wasn't designed for that piece of equipment. But now you have a situation where things "kinda" work together so consumers are encouraged to mix and match yet some of the time it could literally break your hardware.
So now we have a situation where some people are too scared to mix and match, which defeats the entire point of a standard connection type - and on the flip side you have other people who are left with bricked hardware because they were unlucky enough to pair the wrong charger with the wrong device (for example).
I have a few modular power supplies around from different vendors, and I made the mistake of using one cable in a different vendor's supply.
The supply side had the same connections, so I thought that I could just mix and match.
Turns out that even though the connections are standardized, I ended up putting ground on a 5v pin, 12v on the 5v, 3.3v on the 3.3v (magically), and 5v on the 12v pin of a standard SATA connector.
As a result, I fried several hard drives, because the output of the supplies wasn't standardized for the connector type. I can't even imagine an end-user using a volt-meter to check which are compatible with which vendor.
This was a problem before USB-C.
TL;DR: Check your modular supply cables, and don't use other vendors' cables.
Completely agree. I was using USB-C earlier than most, with a Google Pixel laptop, and I've added a couple more USB-C devices since then. The situation with different chargers and cables supporting different voltages and features is still insane. The fact that it took three years before I could find chargers with more than one USB-PD port is also insane. Too many vendors (e.g. Sony for their latest ANC headphones) have already had to ship power-only USB-A to USB-C cables just to have something that works.
That kind of brings me to my next point: proliferation of cables and dongles. Whenever I travel now, I need to have the same USB-A to micro-B cables I was already using, plus some pure USB-C, plus USB-A to USB-C. Not uncommonly, USB-C to micro-B gets thrown into the mix. And the variance across all those is even greater than the pure USB-C stuff. Good luck keeping track of voltage, directionality, data vs. power-only, and all the other variables. For now at least, the addition of USB-C has made things far worse.
I for one would be happier if USB-C were only for laptops and tablets, and all the people making the zillions of smaller devices would FFS stick with micro-B like they were already starting to. Then I'd only need one power adapter, two kinds of cables, and no dongles, instead of this mess.
You know, I never even considered the cable itself in the whole USB-C equation. And labeling them is a hard problem to solve. Not technically, but aesthetically. We COULD just make different colors mean different things (but then you have to remember the colors...), but everyone wants a specific color (me included).
Text labeling could work, but could get out of control, especially at the rate that the standard is changing. Would it need to list the specific features the cable supports? Or can I assume that a cable labeled with 'Thunderbolt' will be compatible with all future iterations of Thunderbolt?
Maybe we can do something like resistor color codes, with colored bands making a rainbow Venn diagram of cable features. Surely that's a good idea which will only decrease complexity :P
> People who actually tried to use it for a number of things quickly realize that USB-C is just the name of the physical connector, which has nothing to do with what the device supports, which in turn has nothing to do with what the (unlabeled!) cable supports.
This is also true with USB A, B mini-A and mini-B, mini-AB, micro-A and micro-B, and micro-AB; also A, B, and micro-B actually have two versions each of the connectors, with compatibility in one direction, so you don't even always know that physical connectors are compatible from the connector name without also the USB version.
All I wanted from USB-C was a single charging cable for everything - phone, tablet, laptop, portable battery, headphones etc. I've got exactly that, and I'm using it plenty.
Oddly... I have several tablets, Bluetooth headphones, bike lights, batteries, and old phones. All use the same charging cable. My newer phone uses USB-C.
If it was designed to have a single connector for charging cables, they were designing to solve a solved problem.
Like I said, all I wanted is a single connector for me to standardize on. It exists now, and I can (and did) so standardize - and it did make my life easier.
My assertion is that you probably haven't standardized yet. That, or you just have phones and a single new Mac laptop, making you a ridiculous edge case here.
Seriously look at my list of things I have. I suspect my phones and hopefully the spare batteries I buy in the coming years will be USB C. However, my bike gear and other hobby equipment won't any time soon. Even my laptop and tablets are going to be years before I update them to one that supports USB C.
So again, outside of laptops, the world had standardized on a connector. That my laptop was different is as relevant to me as knowing that my dryer won't be USB-C any time soon. That I'm now going to have to deal with a ridiculously slow transition from the de facto standard everyone else was following to USB-C is just obnoxious.
I have a phone, an Android tablet, a Windows laptop, and several power banks and external USB drives. My spouse has a phone and a tablet. Everything is USB-C.
Edge case? Probably, but only because most people don't consciously factor this into their purchase decisions. It's clear that USB-C is the way forward for phones, and once that happens on the low-mid market segment, everything else will follow.
Did you read everything else I have? All of that predates USB-C. If you are able to get all new devices that work with it, you are just able to spend way more disposable income on gadgets.
This isn't an insult. Just pointing out that most of the electronics world had standardised. To pretend otherwise is just odd.
I already have this problem with current USB: I don't really know which of my cables do data transfer, and which just do power transfer. The few times I try to transfer data by cable, I have to keep trying a bunch of different cables.
It sounds like USB-C is that, but worse.
(If there's any way to tell what cable is a data cable, I'd love for someone to enlighten me.)
Oh, the current USB situation is much, much better. You can a) throw the "power-only" cables into the trash immediately and forget about them, b) look at the USB-A plug and the number of pins and see if it's USB 3.0 or 2.0. After that, assuming your cables aren't junk, you're fine: if you can plug it in, you can expect things to work.
How are there cable enthusiasts? Either they have a good experience or a bad experience overall.
I've had generally good experiences with Type-C, and IMHO it's an improvement: no more buying video-out dongles for every device, and I can use a single cable for charging.
I've heard a lot about fragmentation, but just anecdotally it doesn't seem to be an issue in practice. My family has USB-C headphones, phones, laptops, remotes, and then USB-C charging cables around the house. Everything can be charged anywhere and everything is swappable, with the only inconvenience of the laptops needing the larger charging bricks to charge fast (otherwise they charge slow).
I connect my desktop to different phones via USB-C, my phone to my headphones, laptop to monitor, my headphones using my car charger, etc. and it all works. The charging bricks and cables are cheap on sale -- $30 for bricks that can charge laptops and about $4 a cable. We're a 100% USB-C household and will only buy electronics with USB-C or USB. I'm waiting for everything else to go USB-C: come on projectors, razors, speakers, and musical instruments.
In a USB-backed world things do seem to go pretty smoothly. I'm looking forward to that all USB-C world too :) Imagine never having to think about which way a plug should go!
I suppose my biggest gripe is that things that aren't USB are able to use the USB-C form factor, for instance Thunderbolt. My dock at work connects with Thunderbolt over USB-C, and everything works fine minus the DisplayPort pass-through. It's kinda cool that it works at all, to be honest. Oddly enough, the Ethernet jack and dock sound card are run over USB, so those work fine.
Mine was when the laptop tried to charge the screen and not vice versa, using an Amazon USB-C to DisplayPort cable. In the end, the laptop got a new motherboard.
To be fair, OS X warned me something was drawing too much power, and I still tried it like 5 times after that, but the screen did flicker on a few times and work for a sec before it all went kaput.
If you have a Nintendo Switch in your household, ensure everyone understands to use the Nintendo-branded USB-C charger, or else your Switch will be toast and the warranty voided.
Source: USB-C household with a 9 year old and a new switch.
This is misinformation, and this particular strand of it seems to never die. There is an issue with unofficial 3rd party docks for Nintendo Switch causing issues. USB-C PD chargers work fine.
I had to buy a Nintendo USB C cable to use for my switch. None of the ones I got with my Pixel or other USB C devices worked. Granted, I only had 5 other cables to test at the time.
The only unofficial thing I used was the cable itself. Once I bought their cable, the switch charges on everything I connect it to now.
Not quite the same issue as burning up my switch (as of now), but issues nonetheless.
What? I've charged my Switch with a dozen different cables and 5+ different chargers and it was not damaged, although it does not charge with 1 charger.
Was it a cheap USB A -> USB C cable without a resistor? Those are well known to be dangerous [1]
I have this rule too. While there are a bunch of people with anecdotes saying their Switch is fine with non-Nintendo power supplies, the fact that the dock and console are both ridiculously noncompliant[0] is enough to dissuade me from ever trying it. I'd rather eat the markup cost on a Nintendo-branded power supply than risk bricking an expensive piece of tech.
It varies strongly. Handheld charging is typically fine, it's using unofficial docks which cause problems (since Nintendo didn't follow USB-C spec when it charges via that)
And I use an Aukey charger on my Switch. I never even used the stock one; it's still in the box. OP is misleading everyone by mixing up the dock problem (where counterfeits can brick the Switch) with something he imagined.
Pretty sure I didn't -imagine- my son plugging his switch into my Google Pixel USB C charger. I did not -imagine- the device being bricked. I didn't -imagine- the hours wasted dealing with tech support and RMA'ing the device only to be told that we voided the warranty.
I'd send you the correspondence between Nintendo and myself but I feel as if I've wasted enough time on you already.
Is that really the case?! Wow. I've used a USB-C car charger from Anker with the Switch and not given it a second thought. I recall reading that the Switch had odd or picky behavior but I didn't think that it could be damaged by non-Nintendo equipment.
This is my biggest concern as well. The problem is, by the time it's consistent and has all the bells and whistles, it'll probably have to be renamed because of all the cables/devices in the past that implemented it incorrectly, at which point they'll think it's a good time to add new features while at it, leading us back to square one.
I'm not torn about USB C at all. All in all, it doesn't seem like a good thing to me, which is why I'm avoiding it so far. I know that eventually I'll have to give in, but that day isn't here yet.
That had nothing to do with USB-C. Switching Vbus and GND would destroy many devices, regardless of what kind of cable it was.
You're probably right that because USB-C is a newish standard, manufacturers might make idiotic newbie mistakes like the ones Surjtech made with that cable. That would argue in favor of waiting... though I'd estimate that you're more likely to be struck by lightning than encounter a retail USB-C cable with power and ground reversed.
The headphone jack on my Windows 10 powered Dell at work randomly stops "detecting" my headphones. I need to reinstall the drivers. Every week or so. And of course I don't, because that's insane.
Now we're to trust the same bottom scraping hardware/software makers with an already overly complex serial port.
Will most likely lead to Nintendo Switch style situations where the physical port is USB-C but in actuality it uses a proprietary charger. Now they can lock out third party docks/chargers.
The Nintendo Switch actually uses USB-C power delivery, and from everything I can tell, the link between the power supply and the official Nintendo dock is compliant. Using third party docks, or trying to power the switch from the bottom connector does seem to be a little dicier.
The issues people have had with it seem to be that the Switch dock requests exactly 15V from the power supply when plugged into a TV, and many USB-C power supplies don't support this voltage. The MacBook Pro charger supports 9 and 20V (I think), and most others use 9, 12, or 20.
The problem is that there's no indication of what failed unless you're sniffing the power delivery handshake.
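A toy model of that negotiation failure, using the voltages mentioned above as assumed capability sets (this is not real PD traffic or actual Switch firmware behavior):

    # Toy model of PD source/sink voltage matching; capability sets are
    # assumptions taken from this thread, not measured devices.
    CHARGERS = {
        "MacBook Pro": {5, 9, 20},
        "typical third-party": {5, 9, 12, 20},
        "Nintendo official": {5, 15},
    }

    def dock_negotiates(source_volts):
        # The docked Switch reportedly wants exactly 15 V; if the source
        # never offers it, the handshake fails with no visible error.
        return 15 in source_volts

    for name, volts in CHARGERS.items():
        print(name, "->", "charges" if dock_negotiates(volts) else "silently fails")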
My first thought too. Every standard seems to eventually add DRM-ish features if it's been around long enough. This won't stop maliciousness/determined attackers, but instead it will just become another hurdle of proprietariness that restricts users unnecessarily while protecting the profits of the big companies.
(Of course, it's always done "for your security"...)
Was this applied in a later software update? I distinctly recall using my Thinkpad's charger with my Switch, prior to giving it to a friend.
edit: Keep in mind that a charger meant for a phone is unlikely to provide anywhere near enough power for the Switch. You likely need a charger that implements at least USB-C PD.
The issue specifically arises when using third-party chargers and/or docks with the system docked. In docked mode, the Switch draws more power than when undocked, and it requests an amount of power that matches neither what it actually expects nor what the first-party charger and dock deliver. This can result in bricking your system.
In short, the Switch does not properly implement the USB-PD standard.
I read that the situation is more nuanced than that. The Switch requests more power than expected because USB devices plugged into the dock might use a lot of power. So it needs to make sure enough power is available just in case.
I've heard that third-party chargers work. I could be wrong though.
The problem is that the Switch isn't to spec in how it negotiates power. It generally isn't a problem, but some third-party docks would toast your Switch because of it.
It'll be incredibly frustrating to have to think about which USB-C device is compatible with which. You want to make a proprietary charger? Use a custom connector and stop confusing end users.
My HP Spectre won't charge from non-HP USB-C chargers in Windows; or, at least, it complains and says it won't charge. I haven't tested whether it actually draws current or not, since I only booted it once in Windows. However, in Linux it charges fine at full speed from a non-HP charger with appropriate wattage. It will not charge at all from several chargers that don't have the necessary 90W capability.
"hey this is a Samsung USB-C cable, you can't use it. Go buy an LG's one. They are exactly the same except of authentication but we can't trust if otherwise, it's for your safety!"
Than I'll wait for a crying herd of fanboys that proudly try to justify it well pushed by marketing sheepdogs in disguise...
Same kind of justifications that get made for Apple stuff: "Oh, you can't replace the screen because we are protecting your security, even though the replacement screen came from another iPhone."
This is DRM and planned obsolescence all day long... just like with HDMI/HDCP, it will make users' lives a pain, not to mention more expensive. I'm sure all the vendors are happy to advance new authentication schemes every year (like HDCP 1.0, 2.0, 2.1) and resell you the same devices.
> Malware delivered via USB is suspected to be the root cause of infection behind the Stuxnet virus...
Not exactly top-notch reporting here. It is the OS's responsibility to lock down access to I/O resources. Stuxnet was caused by the insecurity of Windows AutoRun, not USB.
The article seems to claim that this will protect against malware on USB sticks. This is, obviously, utter BS.
Also, unless literally every “trusted” USB-C device has its key and its core functionality on a secure chip, the bad guys will just compromise a signed device and make it malicious. As far as I know, many, many USB devices can have their firmware replaced with no authentication whatsoever.
It's not perfect by any stretch, but I do like the idea. One problem we have in environments with strong security requirements is that peripherals, typically USB sticks, need to be authorized and controlled. Since there's typically very little information to actually uniquely identify devices (far too many USB devices share the same serial number), we end up having to just ban them altogether.
I welcome the world where machines trust my CA and someone who wants a USB device just comes to me, signs the paperwork, and leaves with a signed device that suddenly just works.
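Sketching that workflow with the Python `cryptography` package (the identity blob and names are made up, and the real USB-C scheme uses X.509 certificate chains rather than raw signatures):

    # Hypothetical org-internal signing flow; illustrative only.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec

    org_ca_key = ec.generate_private_key(ec.SECP256R1())  # kept offline in reality
    org_ca_pub = org_ca_key.public_key()

    def sign_device(serial: bytes) -> bytes:
        # Paperwork signed, device handed over, identity blob gets signed.
        return org_ca_key.sign(serial, ec.ECDSA(hashes.SHA256()))

    def host_trusts(serial: bytes, sig: bytes) -> bool:
        # Fleet machines trust only our CA key; no external CA involved.
        try:
            org_ca_pub.verify(sig, serial, ec.ECDSA(hashes.SHA256()))
            return True
        except InvalidSignature:
            return False

    sig = sign_device(b"SN-000123")
    assert host_trusts(b"SN-000123", sig)
    assert not host_trusts(b"SN-999999", sig)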
I mean pushing the security boundary to "someone with the knowledge to flash USB firmware without corrupting the certificate" is pretty darn good.
> I welcome the world where machines trust my CA and someone who wants a USB device just comes to me, signs the paperwork, and leaves with a signed device that suddenly just works.
Digicert surely welcomes this world! I personally don’t expect Digicert to have any real controls at all.
Also, right now, to put malware on a USB stick, you can just write malware to it. To create a USB stick that has malicious firmware to attack the host drivers, you need a USB stick that doesn’t protect its firmware or you need to build your own. In the new world, this changes to, drumroll please, a USB stick that neither protects its firmware nor wipes its key when new firmware is uploaded. Or you can build your own and pay Digicert, or you can attack a machine that doesn’t bother validating the signature because it wants to retain compatibility with the billions of existing unauthenticated USB sticks.
Also, if devices need per-device keys, then I suspect that firmware upload won’t wipe the key, since otherwise firmware upload mostly breaks the security model unless some rather complicated checks on the firmware upload process are done.
USB-C devices are cheap, and they’re made by tons of vendors, many of whom know about voltage conversion but not security. Heck, most of these vendors, even big names like Nintendo, can’t even be bothered to speak the protocol correctly.
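To make the backward-compatibility hole above concrete, here's the policy problem in a few lines of Python (names and the device dicts are invented; verify_chain stands in for real X.509 validation):

    # Illustrative host policy only.
    def verify_chain(cert):
        return cert == "signed-by-usb-if"  # placeholder for real validation

    def host_accepts(device):
        if device.get("cert") is None:
            # Billions of existing unauthenticated sticks force hosts that
            # want compatibility to take this branch...
            return True
        return verify_chain(device["cert"])

    # ...so the attacker simply ships a stick with no cert at all:
    evil_stick = {"cert": None, "firmware": "badusb"}
    print(host_accepts(evil_stick))  # True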
Maybe now to make a USB-C device you'll be required to undergo HDCP-style certification. Microsoft and Apple can now decide what specific devices are permitted to be connected based on being signed and certified. You'll need approval per host manufacturer to make and sell USB devices.
Many USB devices can have their entire internal hardware replaced and snapped back into their chassis/plastic sarcophagus with no authentication. If you have an environment dependent on USB media/devices and on security at the same time, there has to be some handshake that IDs hardware as friendly, or USB devices need a security redo.
This smells like a profiteering move for DigiCert. Now you need yet another company's blessing to make hardware. This is becoming a non-universal serial bus even quicker.
I'm starting to think that merging the power and data connections was a REALLY bad idea... Sometimes all I want to do is wire up something to a battery and some resistors to at least get me online long enough to send something important. Unless I'm missing something, that's getting harder and harder every revision.
EDIT: Yeah, I know I'll probably cause some amount of damage in the process.
I have to agree. The fact that people have had devices destroyed by badly designed cables and chargers is proof enough. I keep my laptops for years after the manufacturers have stopped selling replacement parts. With older Dells, at least, this means buying replacement batteries and AC adapters on eBay, and they're usually China-made generics, not Dell originals. I have never had a problem with this mess of different products together, but pumping five amps into one of my data ports, knowing people have fried motherboards doing that, scares me.
The ideal solution from the vendors' perspective is to have a single hardware standard with this standardized method of making compatibility vendor specific so that we have the situation as with lightning cables but with cheaper hardware costs. A standardized incompatibility standard.
Speaking as an automotive mechanic by trade who just got into GPG and letsencrypt this week, what application does this have?
I guess another question is, why did we need "secured" X.509 USB? Who's pushing this standard?
For example, the Porsche 911 comes with single-lug wheels... they're exotic enough to take their own special wrench, at four hundred dollars, for no real reason other than 'racecar.' You'll never race it at speeds where this F1-class hardware matters, but hey, racecar.
You might be able to use it to prevent BadUSB-like attacks. Something looks like a USB stick with data on it, so you plug it in. The device tells the computer that it's a keyboard and sends keypresses to install some virus or something. [1]
I don't know about the particulars of this standard though, maybe it's just some DRM thing and soon you'll have to buy expensive dell mice for dell laptops or something like it's already the case with gamepads and consoles.
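If the standard were actually used against BadUSB, a host-side policy might look roughly like this (entirely hypothetical; no OS exposes exactly this hook, and the fingerprints and classes are made up):

    # Sketch: bind a device's identity to the function it was first
    # trusted as, so a "USB stick" that re-enumerates as a keyboard is
    # refused.
    trusted = {}  # key fingerprint -> interface class approved at pairing

    def on_connect(fingerprint, claimed_class):
        if fingerprint not in trusted:
            # A pairing dialog would go here; user approves claimed_class.
            trusted[fingerprint] = claimed_class
            return True
        # Same device identity now claiming a different function? Refuse.
        return trusted[fingerprint] == claimed_class

    assert on_connect("ab:cd", "mass-storage")   # first seen, approved
    assert not on_connect("ab:cd", "keyboard")   # BadUSB-style swap blocked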
In addition to any software (especially the semiconductor vendor's) being additional attack surface, by the time you've connected your device to something, it could already have caused damage (the Nintendo Switch situation, USB killers, etc.), not even considering "trusted" devices being faulty. The only thing this provides is DRM.
What's $400 for someone that can afford a Porsche 911? It seems like when you're spending that much money on a car you should know what you're getting yourself into. You wanted a racecar, and that's what you got.
I was curious so I looked it up, here's their very expensive torque wrench [0]. Here's some pictures of guys holding it [1].
It seems weird to me to complain about requiring specialty tools for a specialty car. It's not like there aren't a ton of car options to choose from.
A random thief may not have a few hundred dollars to invest in wrenches.
This will mostly be a deterrent for opportunistic thieves, the same reason you lock the door of your home. Even locked, it is still easy to open with the appropriate tools.
I imagine it's still possible to remove the wheels without the specialist wrench. It will just damage the lug nuts. Obviously to an owner wanting to legitimately work on their own vehicle this is a huge issue but to a thief who wants to leave the car on bricks and sell the wheels on, not so much.
This article says that the USB-IF "is going a step further" with this cryptographic authentication, which isn't quite right.
The 'I trust this device' system allows confirming that you trust the device. This crypto confirms that the USB-IF trusts the device (and presumably all they are asserting is that the device conforms to the standard).
It's an interesting idea, and not a bad one. That's assuming the certificate system is sound, which I remain sceptical of until I know more, given the usual issues with managing a single irrevocable private certificate.
>The authentication program relies on cryptography to validate and digitally sign USB Type-C devices with 128-bit security.
Seems... low, especially for an offline device where you can get your hands on both sides. How many chargers are going to use as cheap a micro as possible for this?
Also, adding an encryption layer and further fragmenting the ecosystem does have one benefit: Apple may make iPhones with Type-C connectors... so, hooray?
Brace yourselves for the next UEFI Secure Boot "Improvement" bonanza, and, of course the accompanying Stuxnet backdoor.
Just like with Spectre/Meltdown: half of all people will just shrug, completely unsurprised, when the inevitable zero-day drops. And those who find the sky falling will split between the clueless and the plants. Whatever.
The "authentication" that would actually help is to default to only allowing charging through ports (and allow charging without authentication - who cares where I get 65 W of power from? are there malicious electrons?), and show a little "new device" authentication screen with a Bluetooth-style pairing process when I connect something new. Then either burn an asymmetric key onto the device or generate a random symmetric key (and leave the CAs out of it), and on the host, bind the key to the current function of the device, as displayed on the pairing screen. If I clicked "Trust this keyboard," don't let it turn into a CD drive. If I clicked "Trust these headphones," don't let it turn into a keyboard. And so forth.