Why every new Apple Macbook that comes out needs a different goddamn charger



Since the MacBook's USB-C charger is apparently vulnerable to BadUSB, you might not want to borrow anyone else’s charger anyway.


USB has always sucked for a number of reasons; I just avoid it. I’d take a SparcBook over this thing any day.


I’ve felt like this has been Apple’s policy not just with the MacBook but with other things.

“Want to transfer individual songs from iTunes on your computer to your iPhone like you used to? Well now that you’ve upgraded you can’t. You have to sync everything. Why? Because fuck you.”


I like how my old 160GB iPod Classic will only charge with the Firewire cable, and will only transfer data with the USB cable. The USB wall-warts won’t charge it, so I’m forced to hang on to a dedicated Firewire cable and wall-wart, even though I don’t own a computer with Firewire jacks anymore. And aftermarket charging cables all seem to work pretty well, except the USB-based ones. Thanks, Steve!


My personal favorite is the ‘strain relief dance’ that Apple does.

Cables have that little ridged/rubbery bit at the end to protect them from being damaged by bending. However, these are ugly, bulky, and really ruin the clean lines, so Apple loathes them.

Then they have to recall a few tens or hundreds of thousands of power adapters because of cable fraying and alarming heat-induced discoloration.

The next revision of the adapter has a moderately larger strain relief. But it is ugly and bulky, so the cycle begins again.



People get strung out over the silliest things.


An old story with Apple. When I switched from a G4 to a Mac mini, my old Apple monitor couldn’t be used with it because I couldn’t find an adapter for it anywhere.


No, I think people just get annoyed by bullshit nickel-and-diming tactics that are both an inconvenience and a waste of money.


They obviously took this article as inspiration, but not in the intended way.

And doesn’t it seem that all tech companies gradually turn evil once they hit a certain size?


I don’t think they ever recalled them; they just had a replacement program if you brought your frayed charger in. Sometimes you would have to “remind” the Apple rep or Genius that the replacement program existed. And if you have a problem with a replacement fraying, you’re screwed unless you have the receipt, since they base warranty service on the computer’s serial number, not on the charger’s.

I also don’t think they enlarged their strain reliefs that much; instead they introduced new designs meant to reduce the strain itself (the move to the L-shaped MagSafe, then the super-weak magnets of MagSafe 2).

At least with the USB-C connector you have a widely licensed standard design that can be easily replaced at much less cost than paying $80 for an Apple replacement, especially since Apple appears to be using a two-piece charger with a cable that detaches from the charging brick.


I understood that they were going to have only one port type, but do they actually have only one port? Does that mean that you can’t use a thumb drive (or wired Ethernet, or a mouse, or an external monitor) and charge at the same time?

Or does the new USB spec perhaps support charging the host from a powered hub? Even if so, just setting up this Mac like the one I’m using now would require a hub and at least four dongles. Yeuch


There will be dongles, and you aren’t the target market.


Isn’t it all companies? Companies gradually turn evil when they hit a certain size because: i) they hire a hard-driving CEO, i.e., a sociopath; ii) they go public and are under the hammer each time they release a quarterly earnings report, attracting investors who are sociopaths and more interested in returns than customer service or product integrity, or the nice founders start to believe their success means they are special and start acting like douchebags (although some obviously start that way); and iii) they hire an army of lawyers who, for reasons good and bad, tie the company up in legal crap.

For what it’s worth (based on my own experience starting a business): once you get above 10 staff, things change. The family-like atmosphere evaporates as you start hiring specialists, implement “systems,” and struggle to communicate everything to everyone. By the time you hit 50 employees, it’s a whole other ballgame. I never got to 100 employees. Partner woes led to it morphing into something I despised. Outvoted, I sold out. It’s now an evil company (though not in tech).


I’m guessing you could hook it up to an external monitor, draw power from the monitor and use the monitor’s USB ports, all through the single cable. But this probably isn’t the target market, either. It’s an iPad-like device, meant to be used on the go, with plug-in connectivity being only rarely needed, and recharged overnight.


That’s not what they said when they courted me with systems which had built-in SCSI, serial, and RJ45 ethernet - plus slots for expansion cards. And yet somehow I suspect that they would complain that I install their OS onto other hardware so I can still use my applications!


What kind of monitor did you have? I’m guessing the Apple DVI to ADC Display Adapter would have connected it – using either Apple’s version, or many 3rd party versions. They were $99 new, or now $30-$60 on ebay, and a little pricier than most adapters because it includes a power supply for your monitor. Depending on your mini, you might have to add a cheap monoprice adapter, too.

USB 3 Type-C is much less vulnerable to things like BadUSB, and it makes isolation cables a lot easier to build, without needing any logic to make them “generic.”

The problem with turning traditional cables into charge-only cables is that the device can no longer learn how much current the source says it can supply. A USB 2.0 port with an active host controller on the D+/- pins can only provide 100mA according to spec (and is supposed to cut off if there is no negotiation), or 500mA after negotiation. Dedicated chargers can provide 100, 500, 1000, 1500, or 2000mA, and the amount is signalled by a resistor voltage divider on D+/- that tells the device “okay, you can take up to this much.”

So, if you cut the D+/- pins in a cable, the device has no way of determining how much it can safely draw from the source, and most devices will take 100mA (if any at all). Some smarter PMICs (power management integrated circuits) can try to drive the current higher and back off when they see a voltage drop, but that is never done to overcome cut D+/- lines, because it can damage (and potentially ignite some of the horribly constructed) power adapters; those features are usually used to compensate for bad cables (more on that in a sec).

Some people make a custom cable where the device end is hardwired for one of the current settings (each one requires different voltages at D+/-, or shorting them in the case of 1500mA), but this has its own drawbacks. It can lead to charger damage from overdrawing; using it on a computer can lead to the port being disabled (and potentially damaged if the inrush current is too high, though that’s only a problem with really poorly constructed, old host ports); and it can lead the really bad adapters to cause fires or electric shock. Active electronics that detect, negotiate, and set a voltage level on D+/- based on the host port’s settings are the solution here, but that adds cost.
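To make the legacy scheme concrete, here’s a toy Python sketch of the decision a device’s charge-detection circuit effectively makes from the D+/- lines. The thresholds and divider voltages are illustrative placeholders, not exact spec values (the real rules come from BC1.2 and vendors’ proprietary divider schemes):

```python
def classify_legacy_port(dp_mv, dm_mv, shorted, enumerated=False):
    """Toy model of legacy USB charge-source detection on D+/D-.

    dp_mv / dm_mv: voltages seen on D+ and D- in millivolts.
    shorted: True if D+ is tied to D- (a BC1.2 dedicated charging port).
    Returns the maximum current (mA) the device should draw.
    Thresholds are illustrative, not quoted from any spec.
    """
    if shorted:
        # D+ shorted to D-: dedicated charger, ~1500 mA allowed.
        return 1500
    if dp_mv > 100 or dm_mv > 100:
        # Proprietary resistor divider: fixed voltages on D+/D-
        # advertise a current level (values below are made up).
        if dp_mv >= 2500 and dm_mv >= 2500:
            return 2000
        return 1000
    # Plain data port: 100 mA until enumeration, 500 mA after.
    return 500 if enumerated else 100
```

A cable with cut D+/- lines leaves the device in the last branch with no enumeration, which is why it sits at 100mA forever.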

The super-duper exciting thing about USB 3 Type-C is that power negotiation is no longer done on the D+/- lines, but on its own separate channel, CC. Two new pins, CC1 and CC2, are added to the Type-C connector specifically for this; they are used just to set up the port. Nothing you would think of as “USB data” goes over these lines, just messages or voltage levels for determining port settings. The channel either carries active communication (for device identification; for configuring the USB 3.0/3.1 differential pairs for nothing, USB, or alternate modes; and for power settings) or simply presents a voltage level that determines the power settings the port will need (similar to the old resistor voltage dividers). This is one of the most awesome things about the Type-C connector. Now you can have a charge-only cable by connecting only GND, Vbus, CC1, and CC2: enough data isolation to protect yourself from USB protocol attacks, while still being able to communicate for power negotiation.
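As a concrete sketch of the resistor-only (non-PD) side of this: a Type-C sink reads a single voltage on its CC pin, set by the source’s pull-up against the sink’s pull-down, and maps it to a current advertisement. The thresholds below are the commonly cited Type-C sink detection levels, but treat this as an illustration rather than a spec quote:

```python
def cc_current_advertisement(vcc):
    """Interpret the voltage a USB-C sink sees on its CC pin (with its
    5.1 kOhm Rd pull-down) as the source's current advertisement.
    Threshold values are the commonly cited sink detection levels;
    treat them as a sketch, not a spec quote.
    """
    if vcc < 0.2:
        return "no source attached"
    if vcc < 0.66:
        return "default USB power"   # 500/900 mA, per USB 2.0/3.x rules
    if vcc < 1.23:
        return "1.5 A"
    return "3.0 A"
```

No packets, no firmware, no device class descriptors: just a voltage level, which is why a GND/Vbus/CC-only cable can still negotiate charging.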

Now, some may shout that there is still active communication, and that it can be exploited. There is a possibility of bugs in either the spec or the silicon of specific Type-C implementations causing some sort of exploit, but this is much harder. The OS just gets some very specific information from CC that can be very easily sanitized. Communication on this channel is very rigid, contains no code, and has no different frames based on device type; it’s strictly a setup and configuration channel to get the USB port into the right mode. Since there is only a finite number of settings for the port, it can be very well defined. That gives a good measure of security against attacks.

Oh right, I was also going to mention shitty cables and PMICs holding back current. I’m pretty sure everyone has a cable that, for no obvious reason, seems to take forever to charge anything. Answer: you got yourself a cable with undersized conductors (usually 30AWG or smaller). Some cost-cutting no-name manufacturers shave a few cents off the price of each cable by using smaller and smaller conductors, and that becomes a problem with power-hungry devices.

A 30AWG wire 1m in length has about 0.3-0.4 Ohms of resistance. Double that for both ground and power and you have 0.6-0.8 Ohms. At 1 Amp, that’s a voltage drop of 0.6 to 0.8V, which is enough for the PMIC in your phone, tablet, or whatever to say “Heeeey, there’s something bad going on here, I’m going to dial back the current until I’m much closer to 5V” (as it reduces the current, the voltage drop shrinks).

One may not think it’s a big deal to drop a little voltage in that wire, until you realize what it does. For one, the device won’t charge properly at lower voltages; it’s designed to work at a specific voltage, and anything lower works less efficiently (and again, with crap hardware, it can cause damage). The other thing is that a voltage drop is a dissipation of energy, and energy has to go somewhere: it becomes heat in the cable, specifically in those tiny wires. Keep in mind that as the diameter of the wire goes down, the resistance goes up, and its ability to handle heat goes down. As resistance goes up, the energy dissipated goes up, which in turn means the heat goes up. You can see how this compounds.
If you start going to 32 or 34 AWG wires (it happens), you can measurably heat up the cable if you try to pull a full 1A, and go any smaller and you run the risk of setting those conductors on fire. So, yeah, don’t buy the absolute cheapest cables you can get; take a look at them and check their quality first :wink:
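The arithmetic above is easy to check. A small sketch, using approximate tabulated copper resistances per meter:

```python
# Approximate copper resistance per meter for a few AWG sizes (Ohms/m at 20 C).
OHMS_PER_M = {24: 0.0842, 28: 0.213, 30: 0.339, 32: 0.538, 34: 0.856}

def cable_loss(awg, length_m=1.0, amps=1.0):
    """Round-trip voltage drop (V) and heat dissipated (W) in a charging cable.

    Both the Vbus and GND conductors carry the full current, so the
    resistance is doubled. Heat is P = I^2 * R, dumped into the cable.
    """
    r = 2 * OHMS_PER_M[awg] * length_m
    v_drop = r * amps
    watts = v_drop * amps
    return round(v_drop, 2), round(watts, 2)
```

A 1m 30AWG cable at 1A drops about 0.68V and dissipates about 0.68W in the wire, right in the 0.6-0.8V range quoted above; at 34AWG the drop is over 1.7V, which is why those cables get warm and your phone throttles its charging current.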


Macs Not Vulnerable to BadUSB Attack

Gizmodo seems to believe the 12-inch MacBook is vulnerable to this direct attack, even going so far as to suggest that the NSA will distribute hacked USB-C power adapters designed to take over your notebook. But unlike Thunderstrike on vulnerable Macs (see “Thunderstrike Proof-of-Concept Attack Serious, but Limited,” 9 January 2015), the USB port uses Intel’s xHCI (eXtensible Host Controller Interface), which can’t be placed into a DFU (device firmware upgrade) mode to overwrite the MacBook’s firmware. Thus the MacBook itself can’t be infected with BadUSB, so plugging in an unknown power adapter can’t give someone control of your MacBook.


This is a completely standard connector that will be ubiquitous within a year (new Pixel is using it as well, and I expect a lot of other laptops will follow) just like USB1/2 ports/cables are now.

People are just never happy I guess.