Apple's letter explaining why it won't give the FBI a backdoor to the iPhone


I assume that the limitation is how quickly a phone can accept passcodes, not how quickly a powerful computer cluster can generate and enter them.
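That's right, and the arithmetic is worth spelling out. A rough sketch, assuming the ~80 ms per attempt that Apple's security whitepaper says the key derivation is calibrated to (the delay is enforced by the hardware key-derivation step itself, so a fast external computer doesn't speed it up):

```python
# Worst-case on-device brute-force time, assuming ~80 ms per passcode
# attempt (figure from Apple's iOS security whitepaper; the derivation
# delay is enforced in hardware, ignoring the escalating lockout delays).
ATTEMPT_SECONDS = 0.080

def worst_case_hours(digits: int) -> float:
    """Hours to try every numeric passcode of the given length."""
    return (10 ** digits) * ATTEMPT_SECONDS / 3600

print(f"4-digit: {worst_case_hours(4):.2f} hours")  # ~0.22 hours
print(f"6-digit: {worst_case_hours(6):.1f} hours")  # ~22 hours
```

And that's before the escalating delays and the optional erase-after-10-failures setting, which is exactly what the FBI wants Apple to disable.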


I feel like I must be missing something here, but: can someone explain to me why the FBI can’t just gain access to whatever e-mail account is associated with the phone’s Apple ID, trigger a password reset, grab the password reset e-mail from the account, and go from there?

Surely compelling a company to provide administrative access to the e-mail account of a deceased mass murderer in the context of an ongoing investigation is easier - legally, ethically, and technically - than trying to compel a company to create new software that will backdoor all phones everywhere.


It’s because the FBI wants easy access to pwn any iPhone. They don’t need help in this instance; they’re only going after this now because they thought it’d be politically favorable, since this case has to do with terrorism.


Because after they got the phone, the FBI reset the password and locked themselves out.


This smacks of theater. I’ll bet anyone here a dollar that an organization with resources like the FBI’s could easily write out the contents to a big FPGA and bang on it till it cracks in a few minutes. The OS and bootloader are already knowns, for pete’s sake.
I feel like this has to be a pissing contest in order to make it admissible as evidence and get their fingers in every pie-hole imaginable.

We said no to total information awareness years ago, and they did it anyway.


There’s an option when you are setting the passcode to switch to the full keyboard (for an alphanumeric passcode). Apple’s support has a page on that.


There is no password recovery process for the screen-lock passcode. If you forget yours, your only option is to restore the phone to factory settings (erasing all data).
The link M-dub is providing deals with a separate issue. Apple could theoretically have hacked this phone only to the extent of forcing it to back up to iCloud, and then given the information from iCloud to the FBI. But the phone can’t communicate with iCloud any more, since the iCloud password was changed.


Apple’s Letter:

Is it technically possible to do what the government has ordered?
Yes, it is certainly possible to create an entirely new
operating system to undermine our security features as the government
wants. But it’s something we believe is too dangerous to do. The only
way to guarantee that such a powerful tool isn’t abused and doesn’t fall
into the wrong hands is to never create it.

Apple’s security whitepaper:

If one step of this boot process is unable to load or verify the next process, startup is stopped and the device displays the “Connect to iTunes” screen. This is called recovery mode. If the Boot ROM is not able to load or verify LLB, it enters DFU (Device Firmware Upgrade) mode. In both cases, the device must be connected to iTunes via USB and restored to factory default settings.

DFU mode deletes the key. There is no debug access in consumer hardware. Even if Apple complied with the order and signed a modified boot chain, how would they install it on the device? I was initially of the impression that this was a publicity stunt, but now I’m genuinely starting to doubt whether Apple’s previous claims about the security of iOS were accurate.
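For anyone unfamiliar with how the chain in that whitepaper excerpt works, here's a toy model. It uses an HMAC tag as a stand-in for the real RSA signatures rooted in the Boot ROM, and all the names and keys are made up, but the control flow (verify each stage before handing control to it, bail out to recovery on failure) is the same:

```python
import hashlib
import hmac

# Hypothetical root key; on a real device the root of trust is Apple's
# public key burned into the Boot ROM, and stages are RSA/ECDSA-signed.
APPLE_KEY = b"hypothetical-root-key"

def tag(blob: bytes) -> bytes:
    """Stand-in for a code signature over a boot stage."""
    return hmac.new(APPLE_KEY, blob, hashlib.sha256).digest()

def boot(chain):
    """Verify each stage before running it; stop on the first failure."""
    for name, blob, sig in chain:
        if not hmac.compare_digest(tag(blob), sig):
            return f"recovery mode (failed at {name})"
    return "booted"

llb, iboot, kernel = b"LLB", b"iBoot", b"kernel"
good = [("LLB", llb, tag(llb)), ("iBoot", iboot, tag(iboot)),
        ("kernel", kernel, tag(kernel))]
bad = [("LLB", llb, tag(llb)), ("iBoot", b"patched", tag(iboot))]

print(boot(good))  # booted
print(boot(bad))   # recovery mode (failed at iBoot)
```

The point of the chain is that tampering with any stage (the `b"patched"` iBoot above) stops the boot, which is why the FBI can't just load its own passcode-guessing firmware without Apple's signing key.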


So it’s their incompetence around a basic computer protocol that means they are attempting to compel a major precedent with worldwide privacy implications.

The terrorists must be quaking in their boots at levels undetectable by LIGO…


I suppose. But in the end, couldn’t someone always just open up the phone, hook up a couple of probes, and dump all the encrypted data onto something that could then be subjected to a computer cluster? It’s all just flash memory in the end, isn’t it?


That’s easier than brute forcing a pin code how?


So this entire situation is basically a shit burrito made from technical incompetence and bureaucratic ass-covering wrapped in a national security tortilla. Got it.

Sounds to me like someone at the FBI fucked up so completely that it was preferable to create the massive noise of an unenforceable court order rather than admit it.

I bet it was Larry. That guy has no idea what he’s doing, ever.


Alternately they locked the phone according to plan and were hoping Apple would be more helpful.


Now that’s just zany.


I’d assume it was incompetence were it not the FBI we were talking about here. They don’t have a stellar track record on privacy rights or forthrightness, but sure seem really comfortable when underhandedness serves them. The data they could recover from the work phone (rather than the personal phone the attackers destroyed) doesn’t seem likely to be helpful, esp. given that they already have the cellular records, the list of contacts they’re claiming they want from the phone, and a previous backup of the work phone. The whole ordeal smells suspicious, and this kind of thing doesn’t help:


The data is actually encrypted with a combination of the passcode and a unique device key. The device key can’t be read, only used to encrypt/decrypt (the processor contains a hardware AES implementation).

Theoretically, you could use an electron microscope to discover the device key, and then proceed as you suggest. However, it is likely to be embedded within a whole bunch of noise in the mask, so as to make it extremely time consuming to track down those bits amongst all the camouflage. And if the attempt fails, you’ve ruined the phone and foreclosed any other possibilities.
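To make the "combination of passcode and device key" concrete, here's an illustrative sketch. The real derivation runs inside the hardware AES engine and the UID never leaves the chip; the `device_uid` below is a made-up stand-in, and PBKDF2 is just a convenient off-the-shelf KDF, not Apple's actual construction. The point it shows: without the device secret, even the correct passcode yields the wrong key, so the dumped flash can't be attacked off-device.

```python
import hashlib
import os

# Hypothetical device-unique secret; on a real phone this is burned into
# the silicon during fabrication and is not readable by software.
device_uid = os.urandom(32)

def derive_key(passcode: str, uid: bytes = device_uid) -> bytes:
    """Entangle the passcode with the device-unique secret."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, 10_000)

k_right = derive_key("1234")
k_other_device = derive_key("1234", uid=os.urandom(32))

# Same passcode, different device -> completely different key.
assert k_right != k_other_device
```

That's why copying the encrypted flash onto a cluster doesn't help: the cluster would be guessing a 256-bit combined key, not a 4-digit passcode.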

More here:

edited: to fix reply to incorrect post


The piggy says “Oooooh, encryption makes my job sooooooo haaaard.”

I say “Fucking get used to it. Making your job easy isn’t worth making myself less secure.”


[quote=“enso, post:31, topic:74087”]That’s easier than brute forcing a pin code how?[/quote]Because you bypass all the assorted built-in protections against manually retrying many PINs in rapid succession.

[quote=“kyle_c, post:36, topic:74087”]The data is actually encrypted with a combination of the passcode and a unique device key. The device key can’t be read, only used to encrypt/decrypt (the processor contains a hardware AES implementation).[/quote]But even then, regardless of whether it’s encrypted with one key or a combination of keys, it’s all just encrypted data, right? Even if it’s exponentially harder without the device key, that just means you have to throw an exponential amount of computing power at it. But I’m sure it’s not that simple.


Yes, with sufficient computing resources and time they could crack it. It’s just that it’s AES-256, so with current computing resources and approaches that would take billions of years on a supercomputer. In the future things might be different, but for now it’s not practical.


About like how I am with Google. I know with Google services I am the thing being sold, and even if the data is filtered and anonymized, there is a non-zero chance it can be put back together to make a profile of me.

Still. I assume the risk out of convenience.