Apple engineers quietly discuss refusing to create the FBI's backdoor

[Read the post]

“If all the employees capable of making the FBI’s backdoor exit the company,”

That assumes that the set of all such employees is static, or even is at Apple. Unfortunately, the FBI etc. could subpoena source code, private keys, and have contract engineers do the deed. A great many are or can become “capable”, including the NSA’s, as well as foreigners.

Out of all of that, what I took away was: ‘Is that guy’s name seriously Window?’ That is maybe the worst name I’ve ever seen.


I continue to struggle with the notion that none of the FBI/DOJ have ever seen The Bridge on the River Kwai.


Continuing the discussion from Apple, basically: “If it pleases the court, tell FBI to go fuck themselves”: Well, this is what I was saying here.

If the FBI is asking for code that bypasses certain security features on one specific phone, how can they properly QA it? The closest they can do is to create a version that bypasses them for a small list of phones’ keys. What would happen if a “goto fail;”-style bug were introduced into that impossible-to-QA code?

Yeaaahhh engineers. Keep your backbone and your integrity.


More power to the Apple employees. I would love to see the FBI go down in flames on this one. (Though have we all forgotten Nixon and his 1973 Saturday Night Massacre? It would only take one engineer to make the change, and while it may take a few firings to find him or her, the outcome does seem inevitable.)

But I must be missing something. As I understand it, the FBI’s problem is that it can’t try a brute-force attack to discover the correct passcode for the phone in question, because iOS has a counter (so we’re told) that will wipe the phone clean after ten failed attempts.
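The lockout behavior described above (assumed behavior as reported, not Apple’s actual code) amounts to something like this:

```python
# Minimal sketch of wipe-after-N-failures passcode logic, as
# described in news reports. All names and numbers here are
# illustrative, not from any real iOS implementation.
MAX_ATTEMPTS = 10

class Phone:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False

    def try_passcode(self, guess: str) -> bool:
        if self.wiped:
            return False  # data is gone; nothing left to unlock
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.wiped = True  # tenth failure: erase the data
        return False
```

The counter is exactly what makes brute force unattractive: ten wrong guesses and the prize destroys itself.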

The first bit of hacking I ever did was altering Apple ][ games. This was waaaay back in 1980 or 1981, and I’d spend an hour or so paging manually through the 6502 machine code, looking for a decrement-test-branch sequence of instructions. By changing just one byte, that decrement becomes a NOP. And with this trivial change I could start the game with three lives, die fourteen times, and still have three lives. (Yes, that gets boring really quickly. Stage 2 of the hack is to leave the decrement intact, but give more initial lives.)
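For the curious, that style of patch looks roughly like this in modern terms (a Python sketch over a made-up byte image; the opcodes 0xCE for DEC absolute and 0xEA for NOP are real 6502 encodings, but the addresses and image are invented for illustration, and the whole three-byte instruction is NOPed rather than just one byte to keep the sketch safe):

```python
# 6502 opcode constants (real encodings)
DEC_ABS = 0xCE  # DEC absolute: 3 bytes (opcode, addr-lo, addr-hi)
NOP = 0xEA      # NOP: 1 byte

def nop_out_decrements(image: bytes, target_addr: int) -> bytes:
    """Replace every 'DEC target_addr' instruction with NOP NOP NOP,
    so the counter at target_addr (e.g. the lives counter) is never
    decremented."""
    lo, hi = target_addr & 0xFF, target_addr >> 8
    patched = bytearray(image)
    i = 0
    while i < len(patched) - 2:
        if patched[i] == DEC_ABS and patched[i + 1] == lo and patched[i + 2] == hi:
            patched[i:i + 3] = bytes([NOP, NOP, NOP])
            i += 3
        else:
            i += 1
    return bytes(patched)

# Hypothetical snippet: LDA #$03 / DEC $0320 / BNE back
image = bytes([0xA9, 0x03, 0xCE, 0x20, 0x03, 0xD0, 0xF0])
patched = nop_out_decrements(image, 0x0320)
```

The patched image leaves the test-and-branch intact; the counter simply never goes down.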

Isn’t the FBI asking for essentially the same thing here? By neutering a single decrement instruction, the FBI can have an infinite number of tries to guess the passcode, with no risk of the phone wiping itself. Seems the FBI ought to be able to make such a trivial change themselves. For that matter, any fanboi with rootkit skills should be able to do it.

For that matter, why doesn’t the FBI physically open up the damn phone and read out the contents of the flash memory? Then it could throw its collective might at decrypting the phone’s data without fear of the data going up in smoke.
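If the key were derived purely from the passcode (a big if, and likely not how a real iPhone works), the offline attack on a dumped flash image would be a trivial loop. A hedged sketch, with PBKDF2 standing in for whatever derivation the phone actually uses:

```python
import hashlib

def derive_key(passcode, salt):
    # PBKDF2 is a stand-in for the phone's real key derivation;
    # the only property the sketch needs is determinism.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 1000)

def brute_force(salt, target_key):
    # Ten thousand four-digit passcodes: trivial to exhaust offline,
    # with no wipe counter in the way.
    for n in range(10000):
        guess = f"{n:04d}"
        if derive_key(guess, salt) == target_key:
            return guess
    return None

# Hypothetical values recovered from a flash dump:
salt = b"per-device-salt"
stolen = derive_key("4351", salt)
```

With the data copied out of the phone, every failed guess costs nothing, which is exactly why the self-destruct counter only matters while the data stays on the device.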

The FBI surely knows all of this, even if we assume historic levels of bureaucratic incompetence. But instead of just quietly cracking the phone’s contents, the FBI persists in asking Apple to do something it could do itself.

This has been long-winded, but here (finally!) is my point: The triviality of the iOS change shows that the court case is a charade. On the surface it’s about forcing Apple to cooperate. But it’s really about cementing the idea, in the court of public opinion, that our tech companies need to choose sides and fight with the “good guys”.

So unless I’m really overlooking some technical challenge: unless the iOS changes require significant engineering effort, or that phone will self-destruct when opened, or decrypting the phone’s raw memory would be orders of magnitude harder than it appears, this ain’t about some angry Muslim’s phone. The fight for our souls isn’t off in some indefinite-but-soon-to-come future; it’s right here and now.

What is to stop the government from finding the engineers themselves in contempt and threatening jail time? I don’t see how quitting your job would make much of a difference.
If an employee shreds documents during a court case, quitting wouldn’t save them from punishment.

The court order would involve Apple itself, not specific, named, employees, for one. The government also can’t compel them to not quit, or not take vacation time they’re owed (and once they don’t work for Apple, the government can’t compel them to take Apple property and come work for them, either). And if Apple is left without the employees that would allow them to fulfill the court order, I can’t imagine the government could then compel them to hire people to do so - you could compel a business to do the things they’re set up to do, but not something else (you can force a bakery to sell cakes, but not become a locksmith). Though I wouldn’t think that would be a strategy that Apple could maintain long enough to be very useful, since those are, ultimately, needed workers.

The 13th Amendment.


Window is a woman and a very respected security engineer.


If all the employees capable of making the FBI’s backdoor exit the company, then a court order to produce the decryption device could die, too.

This sentence contains the insight to democratically fixing so much corporate and govt-on-behalf-of-corporate overreach.

If all the employees capable of making INSERT_TASK exit the company, then a court order to produce the INSERT_COERCIVE_ACT could die, too.

Evidently her first name is Mwende (Kenyan).

The answers are:

  1. your 6502 experience doesn’t apply on an iPhone. It will just refuse to even boot firmware that’s not signed using Apple’s key. That is, if you can somehow get it into the phone in the first place. If you make a change to the firmware, even a single byte, the checksum will fail and the signature is invalid. All you can do is brick the phone, and, if they’ve done iOS in a particular way, it might just wipe the thing clean anyway.
  2. Removing the flash and decrypting it is possible, but again, who knows what changes they’ve made to the flash firmware; it might not be possible.
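The single-byte point in item 1 is easy to model (a toy sketch: real devices verify an asymmetric signature against Apple’s burned-in public key, and HMAC with a made-up shared secret is used here only to keep the example self-contained):

```python
import hashlib
import hmac

# Stand-in for Apple's signing key; purely illustrative.
BOOT_KEY = b"stand-in-for-apples-signing-key"

def sign(firmware: bytes) -> bytes:
    return hmac.new(BOOT_KEY, firmware, hashlib.sha256).digest()

def boot(firmware: bytes, signature: bytes) -> str:
    # Any change to the image, even a single byte, changes the
    # digest, so verification fails and the device refuses to boot.
    if hmac.compare_digest(sign(firmware), signature):
        return "booting"
    return "refusing to boot"

fw = b"genuine firmware image"
sig = sign(fw)
tampered = b"genuine firmware imagf"  # one byte changed
```

The NOP-the-decrement trick dies right here: without the signing key, any patched image simply never runs.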

Today’s processors are very different from the simple open-memory architectures we all cut our teeth on. You can’t get away with all the stuff we used to do back in the day.

On the other hand, when you say

I agree completely; there probably isn’t anything particularly useful on the phone - maybe a bit of corroboration, but nothing more. It is indeed about questions of backdoors, privacy and security.


I think this plays strongly into Apple’s corporate defense, too. If the FBI compels the company to do something so egregious that all of its most-capable software engineers quit rather than comply, that’s a serious harm to Apple as a company in the short term. Apple loses at least some of its ability to stay competitive in the market, and everyone who uses Apple’s operating systems is put at greater risk by the loss of security-minded developers. It’s also a long-term harm to Apple in terms of being able to attract and retain qualified talent. Given the animus against the Three Letter Agencies in the tech sector, especially since the Snowden leaks, Apple is going to have a hard time finding new employees willing to work for what would basically be an FBI proxy outfit if they lose. Those are things that I think have to be factored into the “undue burden” consideration.


Apple should do that - send their lawn guy to work on it, and he can play Tetris all day.


I strongly suspect that (while having Apple just do it for them would be vastly easier) the real issue is whether or not they can get the signing key, or use of it.

Much to the annoyance of the software industry, there is a fair supply of warez kiddies who will grovel through deliberately hostile binaries and defang whatever DRM mechanisms are in place for some mixture of fun and social acclaim.

That’s irrelevant in the case of a device that will refuse to boot or execute anything without a cryptographic blessing from Cupertino; but if you can get the keys, I’d be strongly inclined to suspect that you could find someone capable of stubbing out the stuff that keeps track of the number of incorrect passcode attempts with nothing but the iOS binaries of appropriate version, some patience, and some test phones of the same model. It wouldn’t be the easiest way; but the main muscle behind iOS integrity verification and ‘app’ DRM enforcement has always been the fact that the hardware won’t boot anything but a blessed OS and the blessed OS won’t execute anything that hasn’t been suitably signed.

I’m sure that some app developers, and possibly Apple to some extent, have done some additional obfuscation; but since the supply of iDevices that will run unsigned applications is somewhere between ‘small’ and ‘commercially irrelevant’, odds are excellent that much less time and money go into futzing with tamper resistance than is the case with software that runs on default-allow platforms, where if you wish to enforce licensing restrictions you have to harden your application yourself.


I think it extremely likely that the FBI already have effective toolkits for hacking iPhones, and already possess the contents of this particular one. Consider; if they win in court, to force Apple to help them, the results will naturally be classified for “national security”. What FBI wants is to use this opportunity to present themselves as good guys vs encryption and to push the nation closer to formally embracing the “inevitable” future where total surveillance is supposedly all that keeps us safe.

I love the argon out of that sign!