Thanks, fuzzyfungus. I, too, balked at the 100W@5V. Twenty amps?! Yikes! It’s such a bad idea, especially when you consider the physical specs of a USB connector.
Aside from the thickness that would be required of the cables, I can guarantee that the USB connectors will be burnt to a crisp after a while. I already have to replace enough of them whose flattened pins cause poor, unreliable data transfer. And I have enough experience with cable and PCB connectors that, while not at all designed to carry high amperage, were pressed into carrying power anyway, with fire as the result.
I’m not against USB outlets. I think it would be a neat thing to have in a house.
Anyway, I think I was focusing on a trivial issue. The efficiency difference between AC and DC LEDs isn't huge (though modern AC LEDs beat DC LEDs). More importantly, lighting is not the major electrical load in a house, especially once you've switched to LEDs, which may cut lighting's power consumption by three quarters. Appliances and electric heating/cooling still eat up the most power. Things like solar water heaters, high-efficiency refrigerators, air-drying clothes, and better insulation to reduce heating and cooling will have a far more significant impact than worrying about your LED power supply.
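To put rough numbers on that (these figures are assumed for illustration, not measurements of anyone's house), here's a quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope illustration (all numbers are assumed, not measured):
# if lighting is only a modest slice of a home's electricity use, even a
# 75% cut from switching to LEDs moves the total less than tackling the
# big loads does.

total_kwh_per_month = 900          # assumed typical household usage
lighting_share     = 0.10          # assume lighting is ~10% of the total
led_savings        = 0.75          # LEDs cut lighting consumption by ~3/4

lighting_kwh = total_kwh_per_month * lighting_share
saved_kwh    = lighting_kwh * led_savings

print(f"Lighting load:        {lighting_kwh:.0f} kWh/month")
print(f"Saved by LEDs:        {saved_kwh:.0f} kWh/month")
print(f"Share of total saved: {saved_kwh / total_kwh_per_month:.1%}")
# -> roughly 7.5% of the whole bill, which is why insulation, water heating,
#    and appliances matter more than the topology of the LED power supply.
```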
Quick reminder, since some folks seem to have missed this point:
The USB power source is supposed to be constant-voltage, limited-current, with negotiation to make sure folks don’t try to draw more current than they should and drag the voltage down. The proposal to let it supply more current does NOT mean that devices risk being overpowered unless they’re already malfunctioning – they will still get the same voltage provided. The only difference on that front is that if they DO malfunction in a way that amounts to a short circuit (relatively unlikely), they would have the opportunity to do so more spectacularly… but the odds of your thumb drive exploding remain insignificant. Especially if the handshaking is properly implemented and high-power devices have to declare themselves before being allowed to draw that much.
Personally, I agree that if a device needs that much power it should probably have its own power supply, just because I don’t see much point in overdesigning the PC for that uncommon case. But apparently some folks really detest the idea of running one more cable or having to carry a cube tap.
I’ve seen USB 2 power-only cables that short out the data wires, and I wonder if Mike’s problem couldn’t be solved by just having a power-only USB port on the back of your laptop for charging – but I also wonder whether people would buy such a laptop, or whether they’d demand the convenience of being able to use any port for charging or data.
This is all speculative, but if we did have laptops with USB for charging, wouldn’t the charging port be a B-type connector? If so, it would be impossible to plug peripherals into that anyhow. (Well, not impossible. You’d be plugging them in backwards though.)
If this is the case, either the charging connector wouldn’t need to have its data lines hooked up, or it could be hooked up to a one-trick microcontroller that is air-gapped from the rest of the computer.
That’s why I assumed that any ‘powered USB’ spec would require (rather than just plaintively suggest, as with the current spec) device negotiation to receive full power. 20 amps is absurd in this context: depending on your exact tolerance for risk and your de-rating estimates over the lifetime of the part, that’s something like AWG #8 for the cable and a quarter-inch trace on 2 oz copper-clad, and even then you’d be dissipating more power in the wiring than some of the ICs in the system. Since that’s the case, a higher voltage is clearly necessary. But you can’t have a USB port that starts at a higher voltage, because that would kill any legacy device plugged into it.
Therefore, it seems pretty clear that negotiation (or some really ugly new plugs and sockets, like those used by PoweredUSB) is the only path: the device gets plugged in, gets enough juice at 5 V to send the magic handshake that pushes the port up to 24 or 48 V, and now 100 watts is only 2-4 amps, at a voltage still low enough not to trivially electrocute users.
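Just to put numbers on that trade-off (the 50 milliohm round-trip cable resistance here is an assumed, illustrative figure, not anything from a spec):

```python
# Current (and I^2*R loss in the cable) needed to deliver 100 W at
# different bus voltages. Cable resistance is an assumed figure.

cable_resistance_ohms = 0.05   # assumed round-trip resistance of a thin cable

for volts in (5, 12, 24, 48):
    amps = 100 / volts                        # I = P / V
    loss = amps ** 2 * cable_resistance_ohms  # P_loss = I^2 * R
    print(f"{volts:>2} V: {amps:5.2f} A, cable loss ~{loss:5.2f} W")

# ->  5 V: 20.00 A, cable loss ~20.00 W   (a fifth of the budget gone as heat)
# -> 12 V:  8.33 A, cable loss ~ 3.47 W
# -> 24 V:  4.17 A, cable loss ~ 0.87 W
# -> 48 V:  2.08 A, cable loss ~ 0.22 W
```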
Or those of us who already have soldering irons and solder can salvage some resistors from junk and grumble about why anyone would spend ten bucks when we got the job done for zero.
Or those who don’t care about Apple products or USB specs can just leave the two data lines disconnected or blocked. Note: I vaguely remember you’re supposed to identify the device, but don’t quote me on that; re-read the USB specs to be sure.
Allow me to provide some facts about the USB-PD specs.
There are 5 power profiles. They are:
5 V @ 2 A (same as USB 2.0); 12 V @ 1.5 A, 3 A, and 5 A; and 20 V @ 3 A and 5 A.
The power levels above the existing standard will only be delivered over a new USB-PD cable / connector which is automatically detected. If a new cable is not used, power delivery will be limited to the existing standard of 5V @ 2 Amps.
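For reference, here's the arithmetic on those voltage/current pairings (taken from the list above as posted, not transcribed from the spec document itself), showing how 100 W is reached without ever going near 20 A:

```python
# Wattage for each voltage/current pairing listed in this thread.

profiles = [
    (5, 2.0),    # same as USB 2.0
    (12, 1.5), (12, 3.0), (12, 5.0),
    (20, 3.0), (20, 5.0),
]

for volts, amps in profiles:
    print(f"{volts:>2} V @ {amps:3.1f} A = {volts * amps:5.1f} W")
# Tops out at 20 V * 5 A = 100 W -- no 20 A anywhere in sight.
```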
As has already been noted, cables that lack the D+/D- lines used for data transfer already exist. Negotiation for power does not use the D+/D- lines, but rather an FSK scheme over the power lines that terminates at the USB-PD power controller (it does not interface to the system CPU, which is critical in the case of needing to charge a dead battery).
USB-PD AC adapters don’t need the D+/D- data lines either.
So there are options for securely charging via USB-PD.
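To make the “declare yourself before you get the higher power level” idea concrete, here's a purely conceptual sketch. The class and method names are made up; the real exchange is FSK signaling on the power lines handled by a dedicated PD controller chip, not application code:

```python
# Conceptual sketch of the negotiation idea only -- not the real USB-PD
# message format or signaling. Names here are invented for illustration.

class PowerSource:
    def __init__(self, profiles):
        # (volts, amps) pairs the source is able to supply
        self.profiles = profiles

    def advertise(self):
        return list(self.profiles)

    def grant(self, requested):
        # Only switch the rail if the sink asked for something we offered;
        # otherwise stay at the safe 5 V default.
        return requested if requested in self.profiles else (5, 2.0)


charger = PowerSource([(5, 2.0), (12, 3.0), (20, 5.0)])

# A legacy device never negotiates, so it only ever sees the 5 V default.
# A PD-aware laptop picks the biggest advertised profile it can use:
wanted = max(charger.advertise(), key=lambda p: p[0] * p[1])
volts, amps = charger.grant(wanted)
print(f"Negotiated {volts} V @ {amps} A ({volts * amps:.0f} W)")
```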
100 watts at 5 volts (20 amps) is not in the USB-PD (Power Delivery) spec; none of the five power profiles listed above exceeds 5 amps.
Great idea, but are you allowed to do that? I’d understood that Apple’s MagSafe patent covers any mag-assisted electronic plug, and they intend to hoard it 'til Doomsday. Or, well, until it expires. Don’t recall when that is exactly, but I wouldn’t be surprised if magnetic plugs become standard on everything shortly afterwards.
From the manufacturer’s position, surely they should just add a freaking switch. You know… like the ones we currently use to interact with lights… I’m pretty sure people are used to that UI already.
Dude… I’m good with computers but TERRIBLE with electrical diagrams and concepts. Can you explain why DC is good for computers? Seriously not trolling… just never read a decent explanation as to why and my lay person’s understanding of electricity is doing me no favours.
The most basic answer is that simple transistors only switch DC, not AC. Much longer answers exist, but that’s at least a term of building up from simple transistors to computer system architecture.
His strength was as the strength of 100, for his 'cause was “just”…
We could dive all the way down to the physics of dopants and regions and carriers in a semiconductor… but unless you want to go to that level, “because that’s the way the underlying technology works and everything else is built as levels of abstraction successively wrapped around those basics” is as informative as it’s gonna get.
Perhaps this will help you more: If you’re dealing with binary, all you need is on and off. For that, the simplest representation is current (or voltage, either way) versus no current (or voltage). Trying to do that with varying voltage would not only hugely complicate the circuitry, it would slow it down because you’d have to wait for a while to see whether a momentary no-current was really 0 or just the cross-over point in your AC curve… and that’s the least of the nuisances.
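Here's a toy illustration of that sampling problem (completely made-up numbers and threshold, just to show why a steady level is easier to read than a varying one):

```python
# Toy illustration of the "is this really a 0, or just a zero-crossing?"
# problem. Values are arbitrary, purely for illustration.
import math

threshold = 0.5   # treat anything above this as a logical 1

def read_bit(voltage):
    return 1 if voltage > threshold else 0

# DC: a steady 1 V line reads as 1 every single time you sample it.
dc_samples = [read_bit(1.0) for _ in range(8)]

# AC: a 1 V-amplitude sine wave dips through zero twice per cycle, so the
# same "on" signal reads as 0 at some instants and 1 at others.
ac_samples = [read_bit(math.sin(2 * math.pi * t / 8)) for t in range(8)]

print("DC samples:", dc_samples)   # [1, 1, 1, 1, 1, 1, 1, 1]
print("AC samples:", ac_samples)   # mixture of 0s and 1s for the same signal
```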
It isn’t an arbitrary decision. It works this way because that continues to be the best solution available. There are specific reasons for using alternating current, most notably that it can be stepped up or down with a transformer. If those reasons don’t apply, direct current is almost always the better answer.