FBI demands iPhone backdoor access; Tim Cook tells them to get lost

You’re completely right - it shouldn’t. The police/FBI should be forced to figure out how to get into the phone all on their own - exactly like the lockbox or car, the manufacturers of which aren’t obligated to create special keys to hand over to law enforcement when they make the product. Oh, is that really hard to do in the case of the iPhone? Well, too fucking bad - them’s the breaks.

Well that’s begging the question.

15 Likes

If it stopped at that. But it can’t be stopped at that. We’re dealing with devices that are designed to function while networked. There is no such thing as a purely physical backdoor to encrypted digital information. Please conceive of a way to do this that doesn’t weaken security and share it with the rest of us, because that’s your burden as someone willing to accept this.

Good luck with that when the government forces Apple to compromise your random number generator, or collect keys or otherwise compromise critical OS-based encryption infrastructure. Your encryption system is more than the algorithm.

9 Likes

I’ve never cared much for Jobs or his products, but I’d buy Tim Cook a beer any time.

4 Likes

:blink blink:

Once somebody gains access, they have gained access.

Or are you trying to parse the difference between a burglar who enters through a door vs. one who enters through a window vs. one who enters through the garage?

In any case – they’re in, they have your stuff. You’re fucked.

What are “regular” people?

Also known as “a backdoor.”

If history is a guide, such a thing doesn’t exist. It is unlikely there is a law that has been on the books for more than a day or so that has not been abused.

13 Likes

Ok, how about a different analogy: telephone switches. The police were not required to figure out how to get access all on their own; legislation (CALEA) was created that required telecom operators to install only equipment capable of supporting wiretaps. That’s what makes me think that if the FBI isn’t able to get this phone decrypted, legislation will be created to prevent wireless operators from activating phones that don’t allow police access.

I think this is a decent analogy because you are still allowed to install your own telephone encryption equipment that stymies wiretapping efforts. Likewise, if Apple were to concede, users would still be able to install their own encryption software.

No, I’m trying to point out that there’s a difference between Apple holding a key that can unlock your phone vs. Apple sabotaging their random number generator.

For the first case, only Apple can unlock your phone and it would likely require Apple to physically possess your phone to decrypt anything.

For the second case, anybody can discover the random number weakness, which puts all phones and all communication from those phones at risk. Physical access isn’t necessarily required.

So yeah, both are backdoors, but the risks are different.
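
To make that second risk concrete, here’s a minimal sketch (Python, with made-up numbers) of why a sabotaged random number generator endangers every phone at once: if the generator’s output really depends on only a small amount of hidden state, anyone who learns about the flaw can enumerate every key it could ever produce, with no physical access and no help from Apple.

```python
import hashlib

# Hypothetical sketch: a sabotaged generator whose keys depend on only
# 20 bits of hidden state instead of 128+ bits of real entropy.
def weak_key(hidden_state: int) -> bytes:
    return hashlib.sha256(hidden_state.to_bytes(4, "big")).digest()

# An attacker who knows about the weakness can enumerate every key the
# device could ever have produced -- about a million candidates -- and
# test each one against intercepted traffic. No physical access required.
candidate_keys = [weak_key(state) for state in range(2 ** 20)]

# A properly seeded 128-bit key would force the same search through
# 2**128 candidates, which is computationally out of reach.
```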

2 Likes

Which brings up another question: are phone wiretaps a morally decent way of obtaining information about a crime? They get abused. I suspect that Apple’s resistance has a lot to do with it being a ‘hip’ company rather than a ‘good old boy’ company like AT&T, which had everything to do with adding wiretapping equipment to the phone system in the good old monopoly days.

I’m glad that Apple is calling the FBI’s bluff, as this subject needs to be litigated by a deep-pockets company.

8 Likes

The use of “criminal” here is tricky. For one thing, it’s implying that only the privacy of “bad guys” is at stake here, and that the rest of us have no personal interest in the matter.

For another, it assumes that you and I are outside of the “criminal” designation. But it’s probably safe to say that both of us have violated the law before and will again. So we can expect any surveillance measures designed for use against criminals to be directed against us as soon as it’s convenient, because we literally are criminals.

5 Likes

Absolutely. If we determine that allowing any access to encrypted devices by police is bad, then logic says we would be better off if police could no longer wiretap any phones. I agree with you completely – it might be a good time to look at the value and costs of surveillance.

2 Likes

I’m not sure what your point is, but I do agree with you. At any moment, the police could come and kick down our doors, search our homes, seize what they want, and shoot our dogs (as is the custom).

1 Like

The value: that the cops can sneak and snoop with even more impunity than they do now. That the scales can be tipped even more in favour of the prosecution than they are already.

The cost: that once the backdoor’s secret handshake gets leaked and posted all over the internet, all you have to do is steal someone’s phone, and then you own their entire life – you can clean out their accounts, impersonate them and borrow on their credit rating, pretend to be them on social media and trash their digital reputation, lure their friends and family into a trap, whatever you want.

All this talk about cops being able to have access to physically locked objects is really missing the point. Smartphones are not like safes. All you’ve got in the safe or locked desk are some papers, and unless you’re keeping cash or bearer bonds in there, the papers aren’t actually worth anything. But thanks to the way that your email and your SMS number have become the universal key to resetting your passwords on every online service ever, having a bad actor gain access to your phone can be far, far more devastating than having your house ransacked, both financially and personally. And that applies regardless of whether the bad actor is a criminal or an official of the state with a badge.

5 Likes

The court has never ordered that all safes must have a special key that can open them upon request. It seems like you are actually the one asking for phones to be treated differently.

Just because it’s harder for law enforcement to break in doesn’t make it everyone else’s problem.

12 Likes

I think that’s a perfect example, given that we’ve seen that this approach has led to rampant civil rights abuse and the monitoring of, well, everybody.

I don’t see how you can imagine that being able to force phone manufacturers to crack their own encryption would lead somewhere different.

8 Likes

What’s the point of this discussion if you aren’t even going to try to consider alternate viewpoints? It’s not that unreasonable to think that the electronics owned by terrorists might contain information of value to law enforcement.

Same to you.

Oh look, it’s the terrorist red herring. Terrorists are scary, therefore we must surrender our civil rights so that Big Brother can protect us from terrorists.

Clearly you haven’t been paying attention to how “terrorist” has been legally defined as “anyone the government doesn’t like.”

14 Likes

Why shouldn’t we manufacture and install all our own door locks?

1 Like

From what I’ve seen, the FBI asked once they found out that iOS has a built-in failsafe that permanently deletes all of the data on the device if 10 incorrect password attempts are made in a row. I’m thinking they would’ve been happy to have a team of monkeys or robots or monkey-bots try infinite passwords otherwise.
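
Some rough arithmetic (assumptions, not Apple’s published numbers: a 4-digit numeric passcode and something like 80 ms of key-derivation work per guess) shows why the 10-attempt wipe, rather than raw guessing speed, is the real obstacle:

```python
# Back-of-the-envelope sketch; the per-guess cost is an assumption.
GUESS_TIME_S = 0.080        # assumed hardware key-derivation time per attempt
combinations = 10 ** 4      # 4-digit numeric passcode

total_minutes = combinations * GUESS_TIME_S / 60
print(f"Worst case without the wipe limit: ~{total_minutes:.0f} minutes")  # ~13
```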

6 Likes

Well, what you really need access to is the key stored in the Secure Enclave.

I’m not certain, but I believe the Secure Enclave is written to and then fuses are blown to prevent any further access to the data stored within. If a fused link can be physically restored (and I have no idea whether that’s possible), then access to the Secure Enclave could be restored. Would that qualify as a physical backdoor?

Edit: When I think about this more, I think the FBI is asking Apple to create a new version of iOS with weakened security, flash it to a similar phone, then desolder the firmware chips in each phone and solder in the chip from the compromised phone. That way the FBI’s phone could be brute-forced. If the phone has a good password, it could still take years to decrypt.
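
To put a number on “years,” here’s the same back-of-the-envelope math (still assuming roughly 80 ms per attempt once the wipe and delay limits are out of the way), comparing a 6-digit PIN with a modest alphanumeric password:

```python
GUESS_TIME_S = 0.080   # assumed per-attempt key-derivation cost

def worst_case_years(keyspace: int) -> float:
    return keyspace * GUESS_TIME_S / (3600 * 24 * 365)

print(worst_case_years(10 ** 6))   # 6-digit PIN: ~0.0025 years (about a day)
print(worst_case_years(36 ** 8))   # 8 chars, lowercase + digits: ~7,000 years
```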

1 Like

A key difference here is that, unlike telephone switches, iPhones are the property of the people who bought them. So what we’re really talking about is forcing every citizen in the U.S. to allow a backdoor to be created into their own property, or to only purchase devices that have such a backdoor.

7 Likes

Yes, the Feds are asking Apple to not only create a custom iOS but also install this patched iOS on the phone. According to the internet rumblings, the Feds want to disable wiping the phone after 10 failed authentication attempts (though the letter doesn’t go into specifics). I am very glad Apple is not complying.

5 Likes