Just get a USB condom…
Until they make a yank-proof USB connector, I'll stick with my MagSafe power connection.
I hope people don't use USB for household lighting. Low-voltage DC rectified from AC (or via a switching power supply) is not a great way to power things efficiently. You must power batteries and computers this way; you don't have a choice. But things like lighting can run just fine off of AC.
For power-only charging from laptops, maybe a toggle switch to turn the data capabilities of one of the USB ports on and off would give the best of both worlds.
That USB Condom in the Independent article is rather primitive. Not what I'd call a product.
It's not hard to make one yourself, by peeling a USB charging cable open and severing the white and green data lines. But it's a good idea to add some resistors on the device end to tell Apple devices that it's OK to charge. See Adafruit's MintyBoost for details.
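For the curious, the resistor trick is just a pair of voltage dividers holding D+ and D- at levels Apple gear reads as "dumb charger, this much current available". A quick sanity-check sketch; the ~2.0 V / ~2.7 V levels are the commonly reported reverse-engineered values, the 75k/49.9k pair is the one the MintyBoost docs mention, and the 43k pair is just my own pick to hit the second level:

```c
/* Divider sanity check for the D+/D- charge-signaling trick.
 * (The exact voltages Apple samples are reverse-engineered folklore:
 * roughly 2.0 V / 2.0 V for 500 mA, with ~2.7 V on one line for more.)
 * Each data line gets its own divider hung off the 5 V rail. */
#include <stdio.h>

/* Classic divider: Vout = Vin * Rbottom / (Rtop + Rbottom) */
static double divider(double vin, double r_top, double r_bottom)
{
    return vin * r_bottom / (r_top + r_bottom);
}

int main(void)
{
    /* Resistances in ohms; second pair is illustrative, not canonical. */
    printf("~2.0 V level: %.2f V\n", divider(5.0, 75e3, 49.9e3));
    printf("~2.7 V level: %.2f V\n", divider(5.0, 43e3, 49.9e3));
    return 0;
}
```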
A good switch-mode power supply can be 95% efficient or more, and LED lighting, which requires DC, is far more efficient than incandescents and gaining on fluorescents. Since LEDs draw less current, the I²R cable loss is mostly mitigated. I wouldn't try running a clothes dryer on 12 volts DC, but LED lighting? More and more of my house is illuminated with LEDs powered from 12 or 24 volts DC.
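To put rough numbers on that cable-loss point, here's a back-of-envelope sketch. The 20 W load and 0.2 Ω of round-trip cable resistance are made-up but plausible figures for a few meters of modest-gauge wire:

```c
/* Back-of-envelope I^2*R cable loss for a low-voltage DC LED run. */
#include <stdio.h>

static void report(double bus_volts, double load_watts, double cable_ohms)
{
    double amps = load_watts / bus_volts;      /* load current */
    double loss = amps * amps * cable_ohms;    /* I^2 * R dissipated in the wire */
    printf("%4.0f V bus: %.2f A, %.2f W lost in cable (%.1f%%)\n",
           bus_volts, amps, loss, 100.0 * loss / load_watts);
}

int main(void)
{
    report(12.0, 20.0, 0.2);
    report(24.0, 20.0, 0.2);   /* doubling the voltage quarters the loss */
    return 0;
}
```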
The right answer, as always, is a hardware switch on the data line before it reaches anything you care about protecting. Then all you have to worry about is social engineering.
AC or DC doesn't matter. If it is AC, the LED will just be reverse-biased for half the cycle. That is easy enough to adjust for (you can double up with an oppositely wired LED to cover the other half-cycle, or just increase the intensity). What matters is the voltage your LEDs are rated for. But that is easy enough to fix with integrated transformers (which are extremely efficient, far more than switching power supplies).
One of the issues with not connecting the data lines is that you may not be able to negotiate for more power. USB only provides a minimal amount of power until you talk to the root hub and negotiate for more. If you exceed the power you've negotiated for, you'll find that the root hub might turn your power off.
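For anyone who hasn't poked at this: the "negotiation" lives in the device's configuration descriptor, and the host is allowed to refuse a configuration that asks for more than the port can source. A rough sketch of the 9-byte descriptor layout from the USB 2.0 spec (section 9.6.3); the example values are mine, and real firmware would pack this struct byte-exact:

```c
#include <stdint.h>

struct usb_config_descriptor {
    uint8_t  bLength;             /* 9 */
    uint8_t  bDescriptorType;     /* 2 = CONFIGURATION */
    uint16_t wTotalLength;
    uint8_t  bNumInterfaces;
    uint8_t  bConfigurationValue;
    uint8_t  iConfiguration;
    uint8_t  bmAttributes;        /* bit 6: self-powered; bit 7: reserved, set */
    uint8_t  bMaxPower;           /* max bus current, in 2 mA units */
};

/* A bus-powered device honestly asking for the full 500 mA: */
static const struct usb_config_descriptor cfg = {
    .bLength = 9, .bDescriptorType = 2, .wTotalLength = 9,
    .bNumInterfaces = 1, .bConfigurationValue = 1, .iConfiguration = 0,
    .bmAttributes = 0x80,   /* bus-powered */
    .bMaxPower = 250,       /* 250 * 2 mA = 500 mA */
};
```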
The trick is likely to be (for these higher-power modes) that some kind of active negotiation will be required: 100 watts at 5 volts is closer to a bus bar than a "cable" by consumer standards, and anything that doesn't at least start at 5 volts and less than an amp is going to blow the magic smoke out of badly made cheapo devices (and a shameful number of expensive, but still somehow non-compliant, devices) from the period when "eh, 5 V, maybe an amp at the outside, the polyfuse will go if I do anything really dumb" was a valid assumption.
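That active negotiation is roughly what the USB Power Delivery spec describes: the source advertises 32-bit Power Data Objects and the sink picks one. A decode sketch based on my reading of the published fixed-supply PDO layout (voltage in 50 mV units, current in 10 mA units), so treat the field positions as an assumption rather than gospel:

```c
#include <stdint.h>
#include <stdio.h>

static void decode_fixed_pdo(uint32_t pdo)
{
    if ((pdo >> 30) != 0) {                    /* 00b marks a fixed-supply PDO */
        puts("not a fixed-supply PDO");
        return;
    }
    unsigned mv = ((pdo >> 10) & 0x3FF) * 50;  /* voltage, 50 mV units */
    unsigned ma = (pdo & 0x3FF) * 10;          /* max current, 10 mA units */
    printf("fixed: %u mV @ %u mA (%u mW)\n", mv, ma, mv / 1000 * ma);
}

int main(void)
{
    /* 20 V @ 5 A -- the "100 watt" ceiling: (20000/50)<<10 | (5000/10) */
    decode_fixed_pdo((400u << 10) | 500u);
    return 0;
}
```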
We aren't talking about serious computation here; the charge-management controllers that keep Li-Ion packs from indulging their tendencies toward martyrdom could probably handle the function for an extra nickel (you do trust your vendor's shitty firmware, don't you, consumer, don't you?), but some of the very low-tech hacks that work just fine for making USB power cables probably won't cut it.
(On a more serious note: higher power over USB has little precedent outside of limited cases like IBM et al's ["Powered USB"][1], a nasty, patent-encumbered (for what brilliant innovations is somewhat unclear), and frankly rather clunky pseudo-standard, though one that's at least safe from consumer confusion because it's confined almost exclusively to point-of-sale systems and their peripherals, without an expectation of universal compatibility.)
100 watts is, by the standards of little gadgets, a lot of power. Most laptops these days have bricks rated for less than that (a few desktop-replacement rigs excepted), and that is supposed to be enough headroom to run and charge at the same time. Desktop PSUs tend to be a bit chunkier, but still only a few hundred watts, with varying levels of headroom. Smaller devices, routers, set-top boxes, all the assorted consumer-electronics-with-USB, might be powered from a little wall wart only good for a low-power USB port or two under load.
So, there's the trouble: to have a marketplace of devices that actually exploit the new high-power capabilities, you need potential buyers with high-power ports. But adding a high-power port to your design could easily involve doubling the bulk of the power brick (and that's for only a single port) and seriously beefing up the traces. If you want the user to be able to plug in anywhere, not just the special-magic-high-power-port, things get more complicated and/or even bulkier.
Situations where a device is only partially capable of what it claims (and this isn't theoretical: USB device descriptors are rife with lies, like bus-powered hubs that claim to be self-powered, and all kinds of… creative… behavior), or where a peripheral's load changes with activity (say a monitor that needs more power when set brighter, or a laser printer that needs to heat its fuser from time to time), could easily leave the hapless user facing periodic brownouts, or manually tallying up power draw to stay under the device's maximum across all ports.
I can see the appeal for a few specific niche applications, mostly where the device and the peripheral are purpose-built to work with one another and wiring is supposed to be kept to a minimum (though just gluing a 24 or 48 V polarized connector to a USB cable and calling it a day sounds pretty attractive under those circumstances); but for ubiquitous use, "don't promise what you can't deliver" is always a good thing to keep in mind, and a 100-watt target is almost certain to be widely and confusingly undeliverable. It'll be a mess.
[1]: http://en.wikipedia.org/wiki/PoweredUSB
If that actually happens, you've found a classy USB host device. It is what is supposed to happen, per spec: 100 mA to start, negotiation for the remaining power, all very nice. In practice? Well, a lot of common devices draw more than 500 mA (2.5-inch HDDs, annoyingly, are right on the edge: some can manage, some can run but not spin up, some need those ghastly "Y" cables), so we don't really want people to be unhappy, plus a polyfuse is cheaper than precision current control. Yeah, just throw a 1 A polyfuse on +5 and call it a day…
"Switching" power supplies exist because low-frequency transformers are huge, bulky, and not particularly efficient at all. It seems perverse, but it turns out that adding an additional "switching" stage to boost the frequency substantially (from 50/60 Hz up to somewhere in the tens of kHz to low MHz) lets you use much smaller magnetics. Plus, switch-mode power supplies allow output voltage regulation, with minimal variation under varying load, far more efficiently than linear regulators can manage, and they don't kill hardware when lightly loaded the way unregulated transformers do.
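The size win falls out of the transformer EMF equation, Vrms = 4.44 · f · N · A · Bpeak: for a given voltage and flux density, the turns-area product N·A scales as 1/f. A sketch with illustrative material numbers (real designs juggle core loss and ferrite's lower Bpeak, so the win is big but not the full naive ratio):

```c
#include <stdio.h>

/* Solve the EMF equation for the turns-area product N*A (turns * m^2). */
static double turns_area(double v_rms, double f_hz, double b_peak)
{
    return v_rms / (4.44 * f_hz * b_peak);
}

int main(void)
{
    double mains    = turns_area(120.0, 60.0,  1.2);  /* silicon steel at 60 Hz */
    double switcher = turns_area(120.0, 100e3, 0.2);  /* ferrite at 100 kHz */
    printf("N*A at 60 Hz:   %g\n", mains);
    printf("N*A at 100 kHz: %g\n", switcher);
    printf("ratio: %.0fx smaller\n", mains / switcher);
    return 0;
}
```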
Oh, I definitely agree with that. It's "what's supposed to happen". Specs and reality are always two different things.
Powering a high-power/high-brightness LED from stepped-down AC line voltage would waste much of the light it could be producing. Considering LEDs are current-driven devices, some type of SMPS, or at least an active power supply, would provide the best overall utilization of the LED itself.
And I'm not sure how an integrated transformer is going to be more efficient than a switching power supply.
Umm… I've got plenty of books and math in my bookcase that tend to agree that's exactly how it happens.
Besides, modern computer motherboards take 12 V DC in from the switching power supply and run it through parallel switching regulators to turn it into low voltage at high current (100 A or more). If there were a more efficient way of doing it, this is certainly an area you'd see it in.
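Those parallel regulators are interleaved buck converters, and the core relationship is just D = Vout/Vin for the (ideal) duty cycle, with the load split across phases. A sketch with illustrative numbers (1.2 V core at 100 A from 12 V across 6 phases; the phase count varies by board):

```c
#include <stdio.h>

int main(void)
{
    double vin = 12.0, vout = 1.2, i_load = 100.0;
    int phases = 6;

    double duty    = vout / vin;         /* fraction of time the high-side FET conducts */
    double i_phase = i_load / phases;    /* current each interleaved phase must carry */

    printf("duty cycle: %.0f%%\n", duty * 100.0);
    printf("per-phase current: %.1f A across %d phases\n", i_phase, phases);
    return 0;
}
```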
Plus, that's what happens today, when you're talking about exporting 2 W. When you start talking about exporting 100 W, the device deciding whether to allow that kind of power draw might get much more stringent about not exporting power without a proper negotiation, especially if CE- or TÜV-type certifications are required.
I think you misunderstood me. That's exactly how the books say it happens, or should happen. But fuzzyfungus is right: there are a lot of devices out there that violate the USB standards. Root or powered hubs that will allow more than 100 mA of current prior to negotiation, and devices that draw more than 100 mA before requesting more power. USB is far from a trivial spec to be compliant with.
So I can spend a few hours figuring out how resistors work, where to buy them, maybe drop $20 on a soldering iron while I'm at it, then mutilate my charging cable with fire and blade and, if I'm lucky enough not to destroy it, bundle the whole thing up in an ugly wad of electrical tape… Or I can spend $10 and get one shipped to my door hassle-free.
It seems like the real problem isn't the USB hardware, it's the idiot software designers who allow their devices to open data connections and execute code silently. I imagine that'll change pretty quickly as things take off. Expect to see confirmation dialogs any time an unknown device wants to exchange data, much like you get now whenever you download an executable.