Power over USB: when charging a computer means connecting to untrusted data sources

Just get a USB condom…

6 Likes

Until they make a yank-proof USB connector, I’ll stick with my MagSafe power connection.

2 Likes

I hope people don’t use USB for household lighting. Low-voltage DC rectified from AC (or via a switching power supply) is not a great way to power things efficiently. You must power batteries and computers this way; you don’t have a choice. But things like lighting can run just fine off of AC.

2 Likes

For power-only charging on laptops, maybe a toggle switch to turn the data capabilities of one of the USB ports on and off would give the best of both worlds.

4 Likes

That USB Condom in the Independent article is rather primitive. Not what I’d call a product.

It’s not hard to make one yourself by peeling a USB charging cable open and severing the white and green data lines. But it’s a good idea to add some resistors on the device end to tell Apple devices that it’s OK to charge. See Adafruit’s MintyBoost for details.
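
For the curious, here’s roughly what those resistors do: a pair of voltage dividers parks D+ and D− at fixed levels that Apple devices read as a charger rating. A minimal sketch of the arithmetic, assuming the 75k/49.9k values I remember from the MintyBoost v3 design (double-check Adafruit’s schematic before soldering anything):

```python
# Apple charge-detect divider math (illustrative; values believed to
# match the MintyBoost v3 schematic -- verify before building).
def divider(v_in, r_top, r_bottom):
    """Midpoint voltage of a two-resistor divider."""
    return v_in * r_bottom / (r_top + r_bottom)

# ~2.0 V on both D+ and D- is read as "500 mA charger".
v = divider(5.0, 75_000, 49_900)
print(f"D+ = D- = {v:.2f} V")  # -> 2.00 V
```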

3 Likes

A good switch-mode power supply can be 95% efficient or more, and LED lighting, which requires DC, is far more efficient than incandescents and gaining on fluorescents. Since LEDs draw less current for the same light, the I²R cable loss is mostly mitigated. I wouldn’t try running a clothes dryer on 12 volts DC, but LED lighting? More and more of my house is illuminated with LEDs powered from 12 or 24 volts DC.
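
To put a number on that cable loss, a back-of-envelope I²R calculation (the wattage, wire gauge, and run length here are illustrative guesses, not measurements from my wiring):

```python
# Cable loss for a low-voltage DC lighting run: P_loss = I^2 * R.
# Assumes 24 W of LEDs at the end of a 10 m one-way run of 18 AWG
# copper (~0.021 ohm per metre), so 20 m of conductor round trip.
def cable_loss_w(load_w, volts, ohm_per_m, run_m):
    current = load_w / volts             # amps drawn by the load
    resistance = ohm_per_m * 2 * run_m   # out and back
    return current ** 2 * resistance

for volts in (12, 24):
    loss = cable_loss_w(24, volts, 0.021, 10)
    print(f"{volts} V feed: {loss:.2f} W lost in the cable "
          f"({100 * loss / 24:.1f}% of the load)")
# Doubling the voltage halves the current and quarters the loss.
```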

4 Likes

The right answer, as always, is a hardware switch on the data line before it reaches anything you care about protecting. Then all you have to worry about is social engineering.

4 Likes

AC or DC doesn’t matter. If it is AC, the LED will just be reverse-biased for half the cycle. That is easy enough to adjust for (you can add a second, oppositely wired LED to catch the other half-cycle, or just increase the intensity). What matters is the voltage your LEDs are rated for. But that is easy enough to fix with integrated transformers (which are extremely efficient, far more so than switching power supplies).

1 Like

One of the issues with not connecting the data lines is that you may not be able to negotiate for more power. USB only provides a minimal amount of power until you talk to the root hub and negotiate for more. If you exceed what you’ve negotiated for, you may find that the root hub turns your power off.

1 Like

The trick is likely to be (for these higher-power modes) that some kind of active negotiation will be required: 100 watts at 5 volts is closer to a bus bar than a ‘cable’, by consumer standards, and anything that doesn’t at least start at 5 volts, less than an amp, is going to blow the magic smoke out of badly made cheapo devices (and a shameful number of expensive but still somehow non-compliant ones) from the period when ‘eh, 5 V, maybe an amp at the outside, polyfuse will go if I do anything really dumb’ was a valid assumption.
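
For concreteness, the current you’d need at each voltage for 100 W, and why, as I understand the proposals, the high-power profiles get there by raising the voltage rather than the current (20 V at 5 A):

```python
# Current needed to move 100 W at various bus voltages: I = P / V.
# 20 A at 5 V is bus-bar territory for consumer-grade conductors,
# which is why the high-power proposals raise the voltage instead.
for volts in (5, 12, 20):
    amps = 100 / volts
    print(f"100 W at {volts:>2} V -> {amps:5.1f} A")
```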

We aren’t talking about serious computation here; the charge management controllers that keep Li-Ion packs from indulging their tendencies toward martyrdom could probably handle the function for an extra nickel (You do trust your vendor’s shitty firmware, don’t you, consumer?), but some of the very low-tech hacks that work just fine for making USB power cables probably won’t cut it.

(On a more serious note, higher power over USB is largely untried territory, outside of limited cases like IBM et al.’s [“Powered USB”][1], which is a nasty, patent-encumbered (for what brilliant innovations is somewhat unclear), and frankly rather clunky pseudo-standard, but is at least safe from consumer confusion because it’s confined almost exclusively to point-of-sale systems and their peripherals, without an expectation of universal compatibility.)

100 watts is, by the standards of little gadgets, a lot of power. Most laptops these days have bricks rated for less than that (a few DTRs excepted), and that is supposed to be enough headroom to run and charge at the same time. Desktop PSUs tend to be a bit chunkier, but still only a few hundred watts, with varying levels of headroom. Smaller devices (routers, set-top boxes, all the assorted consumer-electronics-with-USB) might be powered from a little wall wart only good for a low-power USB port or two, under load.

So, there’s the trouble: to have a marketplace of devices that actually exploit the new high-power capabilities, you need potential buyers with high-power ports. But adding a high-power port to your design could easily involve doubling the bulk of the power brick (and that’s for only a single port) and seriously buffing the traces. If you want the user to be able to plug in anywhere, not just the special-magic-high-power-port, things get more complicated and/or even bulkier.

Situations where a device is partially capable of what it claims to be capable of (and this isn’t theoretical: USB device descriptors are rife with lies, like bus-powered hubs that claim to be self-powered, and all kinds of… creative… behavior), or where a peripheral’s load changes based on activity (say, a monitor that needs more power when brighter, or a laser printer that needs to heat its fuser from time to time), could easily lead to situations where the hapless user is faced with periodic brownouts, or with manually tallying up power draw to stay under a device’s maximum across all ports.
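
If you want to see those lies for yourself, here’s a quick sketch using pyusb (a real library, though backend and permission setup varies by OS, and some devices will refuse the configuration query) that dumps what each attached device’s descriptor promises about power:

```python
# Dump each USB device's claimed power behavior using pyusb
# (pip install pyusb; needs a libusb backend). Compare the claims
# against reality at your own amusement.
import usb.core

for dev in usb.core.find(find_all=True):
    try:
        cfg = dev.get_active_configuration()
    except usb.core.USBError:
        continue  # no permission, or device not configured
    self_powered = bool(cfg.bmAttributes & 0x40)  # bit 6 of bmAttributes
    max_ma = cfg.bMaxPower * 2                    # descriptor unit is 2 mA
    print(f"{dev.idVendor:04x}:{dev.idProduct:04x}  "
          f"self-powered={self_powered}  claims {max_ma} mA max")
```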

I can see the appeal for a few specific niche applications, mostly where the device and the peripheral are purpose-built to work with one another and wiring is supposed to be kept to a minimum (though just gluing a 24 or 48 V polarized connector to a USB cable and calling it a day sounds pretty attractive under those circumstances); but for ubiquitous use, ‘don’t promise what you can’t deliver’ is always a good thing to keep in mind, and a 100-watt target is almost certain to be widely and confusingly undeliverable. It’ll be a mess.
[1]: http://en.wikipedia.org/wiki/PoweredUSB

1 Like

If that actually happens, you’ve found a classy USB host device. It is what is supposed to happen, per spec: 100 mA to start, negotiation for the remaining power, all very nice. In practice? Well, a lot of common devices draw more than 500 mA (2.5-inch HDDs, annoyingly, are right on the edge: some can run but not spin up, some need those ghastly ‘Y’ cables), so we don’t really want people to be unhappy, plus a polyfuse is cheaper than precision current control. Yeah, just throw a 1 A polyfuse on +5 and call it a day…

2 Likes

“Switching” power supplies exist because low-frequency transformers are huge, bulky, and not particularly efficient at all. It seems perverse, but it turns out that adding an additional ‘switching’ stage to boost the frequency substantially (from 50/60 Hz up to somewhere in the tens of kHz to low MHz) lets you use much smaller magnetics. Plus, switch-mode power supplies allow output voltage regulation, with minimal variation under varying load, far more efficiently than linear regulators do, and don’t kill hardware when lightly loaded the way unregulated transformers do.
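
The underlying reason, for anyone who wants it: the transformer EMF equation says E = 4.44·f·N·B·A, so for a fixed voltage and peak flux density, the turns-area product (roughly, the amount of copper and iron you need) scales as 1/f. A toy calculation with an assumed flux density:

```python
# Transformer EMF equation: E_rms = 4.44 * f * N * B_peak * A_core.
# For fixed voltage and flux density, the turns-area product N*A
# shrinks in proportion to 1/f. B_peak = 0.2 T is an assumed,
# illustrative figure, not a spec for any particular core material.
def turns_area(v_rms, freq_hz, b_peak=0.2):
    return v_rms / (4.44 * freq_hz * b_peak)

for f in (60, 50_000, 1_000_000):
    print(f"{f:>9} Hz -> N*A = {turns_area(120, f):.6f} turn*m^2")
```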

2 Likes

Oh I definitely agree with that. It’s “what’s supposed to happen”. Specs and reality are always two different things. :smile:

Powering a high-power/high-brightness LED from stepped-down AC line voltage would make horribly inefficient use of the light it could be producing. Considering LEDs are current-driven devices, some type of SMPS, or at least an active power supply, would provide the best overall utilization of the LED itself.
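
A concrete version of the “current-driven” point: the naive alternative to an active driver is a series resistor ballast, and the numbers (illustrative: a hypothetical 3 V, 350 mA LED on a 12 V rail) show where the power goes:

```python
# Series-resistor ballast for an LED: everything above the LED's
# forward voltage is burned in the resistor. Illustrative numbers.
v_rail, v_led, i_led = 12.0, 3.0, 0.350
r_ballast = (v_rail - v_led) / i_led     # ~25.7 ohms to set 350 mA
p_led = v_led * i_led                    # ~1.05 W making light
p_wasted = (v_rail - v_led) * i_led      # ~3.15 W making heat
print(f"R = {r_ballast:.1f} ohm, efficiency = "
      f"{100 * p_led / (p_led + p_wasted):.0f}%")   # ~25%
```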

And I’m not sure how an integrated transformer is going to be more efficient than a switching power supply.

1 Like

Ummm… I’ve got plenty of books and math in my bookcase that tend to agree that’s exactly how it happens.

Besides, modern computer motherboards take 12 V DC in from the switching power supply and run it through parallel switching regulators to turn it into low voltage and high current (100 A or more). If there were a more efficient way of doing it, this is certainly an area where you’d see it.
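
The first-order math for that kind of converter is simple enough to show the appeal: an ideal buck stage has duty cycle D = Vout/Vin, and splitting the output across N phases divides the per-phase current. Illustrative figures for a 12 V to 1.2 V, 100 A rail:

```python
# Ideal (lossless) multiphase buck arithmetic for a motherboard VRM.
# Real converters add switching and conduction losses; these numbers
# are the first-order ideal case only.
v_in, v_out, i_out, n_phases = 12.0, 1.2, 100.0, 6
duty = v_out / v_in             # D = Vout / Vin = 10% on-time
i_phase = i_out / n_phases      # ~16.7 A handled per phase
print(f"duty cycle = {duty:.0%}, {i_phase:.1f} A per phase")
```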

Plus, that’s what happens today, when you’re talking about exporting 2 W. When you start talking about exporting 100 W, the device deciding whether it’s going to allow that kind of power draw might become much less willing to just export the power without a proper negotiation. Especially if CE or TÜV-type certification is required.

1 Like

I think you misunderstood me. That’s exactly how the books say it happens, or should happen. But fuzzyfungus is right: there are a lot of devices out there that violate the USB standards. Root or powered hubs that will supply more than 100 mA of current prior to negotiation, and devices that draw more than 100 mA before requesting more power. USB is far from a trivial spec to be compliant with.

1 Like

So I can spend a few hours figuring out how resistors work, where to buy them, maybe drop $20 on a soldering iron while I’m at it, then mutilate my charging cable with fire and blade and, if I’m lucky enough not to destroy it, bundle the whole thing up in an ugly wad of electrical tape… Or I can spend $10 and get one shipped to my door hassle-free.

It seems like the real problem isn’t the USB hardware; it’s the idiot software designers who allow their devices to open data connections and execute code silently. I imagine that’ll change pretty quickly as things take off. Expect to see confirmation dialogs any time an unknown device wants to exchange data, much like you get now whenever you download an executable.