I assume that the limitation is how quickly a phone can accept passcodes, not how quickly a powerful computer cluster can generate and enter them.
I feel like I must be missing something here, but: can someone explain to me why the FBI can't just gain access to whatever e-mail account is associated with the phone's Apple ID, trigger a password reset, grab the password reset e-mail from the account, and go from there?
Surely compelling a company to provide administrative access to the e-mail account of a deceased mass murderer in the context of an ongoing investigation is easier - legally, ethically, and technically - than trying to compel a company to create new software that will backdoor all phones everywhere.
It's because the FBI wants easy access to pwn any iPhone. They don't need help in this instance; they're only going for this now because they thought it'd be politically favorable, since this case has to do with terrorism.
This smacks of theater. I'll bet anyone here a dollar that an organization with resources like the FBI's could easily write out the contents to a big FPGA and bang on it till it cracks in a few minutes. The OS and bootloader are already known, for Pete's sake.
I feel like this has to be a pissing contest in order to make it admissible as evidence and get their fingers in every pie-hole imaginable.
We said no to total information awareness years ago, and they did it anyway.
There's an option when you are setting the passcode to switch to the full keyboard. Apple's support site has a page on that.
There is no password recovery process for the screen lock passcode. If you forget yours, your only option is to restore the phone to factory settings (erasing all data).
The link M-dub is providing deals with a separate issue. Apple could theoretically have hacked this phone only to the extent of forcing it to back up to iCloud and then given the information from iCloud to the FBI. But they can't communicate with iCloud any more, since the iCloud password was changed.
Apple's letter:
Is it technically possible to do what the government has ordered?
Yes, it is certainly possible to create an entirely new operating system to undermine our security features as the government wants. But it's something we believe is too dangerous to do. The only way to guarantee that such a powerful tool isn't abused and doesn't fall into the wrong hands is to never create it.
Apple's security whitepaper:
If one step of this boot process is unable to load or verify the next process, startup is stopped and the device displays the "Connect to iTunes" screen. This is called recovery mode. If the Boot ROM is not able to load or verify LLB, it enters DFU (Device Firmware Upgrade) mode. In both cases, the device must be connected to iTunes via USB and restored to factory default settings.
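The chain-of-trust logic the whitepaper describes can be sketched abstractly. The toy model below uses invented names and an HMAC as a stand-in for Apple's actual RSA signature checks; it only illustrates the rule that each stage must verify the next or boot halts:

```python
import hashlib
import hmac

# Stand-in for Apple's public key baked into the Boot ROM (hypothetical value).
APPLE_KEY = b"apple-root-key"

def sign(image: bytes) -> bytes:
    # HMAC stands in for a real asymmetric signature over the firmware image.
    return hmac.new(APPLE_KEY, image, hashlib.sha256).digest()

def boot(stages):
    """stages: list of (name, image, signature) tuples, in boot order.
    Each stage is verified before it would be 'run'; any failure halts boot."""
    for name, image, sig in stages:
        if not hmac.compare_digest(sign(image), sig):
            return f"halt: {name} failed verification (recovery mode)"
    return "booted"

chain = [(name, img, sign(img)) for name, img in
         [("LLB", b"low-level bootloader"), ("iBoot", b"iboot"), ("kernel", b"xnu")]]
print(boot(chain))  # booted

# Tamper with the kernel image while keeping the old signature:
chain[2] = ("kernel", b"tampered", chain[2][2])
print(boot(chain))  # halt: kernel failed verification (recovery mode)
```

This is why the court order matters: the lock-screen protections can only be replaced by firmware that passes this check, i.e. firmware signed by Apple.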
DFU mode deletes the key. There is no debug access in consumer hardware. Even if Apple complied with the order and signed a modified boot chain, how would they install it on the device? I was initially of the impression that this was a publicity stunt, but now I genuinely wonder whether Apple's previous claims about the security of iOS have been wrong.
So it's their incompetence around a basic computer protocol that means they are attempting to force a major precedent with worldwide privacy implications.
The terrorists must be quaking in their boots at levels undetectable by LIGO…
I suppose. But in the end, couldn't someone always just open up the phone, hook up a couple of probes, and dump all the encrypted data onto something that could then be subjected to a computer cluster? It's all just flash memory in the end, isn't it?
That's easier than brute-forcing a PIN code how?
So this entire situation is basically a shit burrito made from technical incompetence and bureaucratic ass-covering wrapped in a national security tortilla. Got it.
Sounds to me like someone at the FBI fucked up so completely that it was preferable to create the massive noise of an unenforceable court order rather than admit it.
I bet it was Larry. That guy has no idea what he's doing, ever.
Alternatively, they locked the phone according to plan and were hoping Apple would be more helpful.
Now that's just zany.
I'd assume it was incompetence were it not the FBI we were talking about here. They don't have a stellar track record on privacy rights or forthrightness, but they sure seem comfortable when underhandedness serves them. The data they could recover from the work phone (rather than the personal phone the attackers destroyed) doesn't seem likely to be helpful, especially given that they already have the cellular records, the list of contacts they're claiming they want from the phone, and a previous backup of the work phone. The whole ordeal smells suspicious, and this kind of thing doesn't help.
The data is actually encrypted with a combination of the passcode and a unique device key. The device key can't be read, only used to encrypt/decrypt (the processor contains a hardware AES implementation).
Theoretically, you could use an electron microscope to discover the device key, and then proceed as you suggest. However, it is likely to be embedded within a whole bunch of noise in the mask, so as to make it extremely time consuming to track down those bits amongst all the camouflage. And if the attempt fails, you've ruined the phone and foreclosed any other possibilities.
More here: Apple can comply with the FBI court order | Trail of Bits Blog
edited: to fix reply to incorrect post
The piggy says "Oooooh, encryption makes my job sooooooo haaaard."
I say "Fucking get used to it. Making your job easy isn't worth making myself less secure."
[quote="enso, post:31, topic:74087"]That's easier than brute-forcing a PIN code how?[/quote]Because you bypass all the assorted built-in protections against manually retrying many PINs in rapid succession.
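If an attacker ever did get the encrypted data plus the device key off the phone (the big "if" in this thread), the passcode space itself is tiny. A sketch, reusing a hypothetical PBKDF2 derivation as a stand-in for the real one:

```python
import hashlib
from itertools import product

def key_for(pin: str, device_key: bytes) -> bytes:
    # Hypothetical stand-in for the real passcode/device-key entanglement.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), device_key, 1_000)

def crack(target_key: bytes, device_key: bytes):
    # Only 10,000 candidates for a 4-digit PIN -- trivial once the on-device
    # protections (escalating delays, ten-tries-then-wipe) are out of the way.
    for digits in product("0123456789", repeat=4):
        pin = "".join(digits)
        if key_for(pin, device_key) == target_key:
            return pin
    return None

dev = b"\x01" * 16  # pretend we extracted the device key
assert crack(key_for("4821", dev), dev) == "4821"
```

The on-device limits (delays between attempts, optional erase after ten failures) are the only thing standing between a 4-digit PIN and a loop like this, which is exactly what the FBI's requested firmware would remove.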
[quote="kyle_c, post:36, topic:74087"]The data is actually encrypted with a combination of the passcode and a unique device key. The device key can't be read, only used to encrypt/decrypt (the processor contains a hardware AES implementation).[/quote]But even then, regardless of whether it's encrypted with one key or a combination of keys, it's all just encrypted data, right? Even if it's exponentially harder without the device key, that just means you have to throw an exponential amount of computing power at it. But I'm sure it's not that simple.
Yes, with sufficient computing resources and time they could crack it. It's just that it's AES-256, so with current computing resources and approaches that would take billions of years on a supercomputer. In the future things might be different, but for now it's not practical.
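The back-of-the-envelope arithmetic, with a deliberately generous assumed guessing rate (10^18 keys per second is far beyond any known hardware), shows "billions of years" is if anything an understatement:

```python
KEYSPACE = 2 ** 256                 # possible AES-256 keys
RATE = 10 ** 18                     # assumed keys tested per second (very generous)
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

# On average you find the key after searching half the keyspace.
years = KEYSPACE / 2 / RATE / SECONDS_PER_YEAR
print(f"{years:.2e} years")         # ~1.8e51 years
```

For comparison, the universe is on the order of 1.4e10 years old, which is why every practical attack targets the short passcode instead of the AES key.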
About like how I am with Google. I know that with Google services I am the thing being sold; even if the data is filtered and anonymized, there is a non-zero chance it can be put back together to make a profile of me.
Still. I assume the risk out of convenience.