FBI demands iPhone backdoor access; Tim Cook tells them to get lost

[Read the post]

6 Likes

This will get ugly. But I’m happy Apple is pushing back.

24 Likes

The day we always knew would come is finally here: Apple has been ordered to install a backdoor in the iPhone.

3 Likes

I think that we are quickly approaching a watershed moment in technological history that makes Scalia’s death extremely significant.

20 Likes

It’s really not worth taking seriously. This is the same government that cries about access circumvention being counter to the DMCA. They will simply need to make up their minds; it is basically all or nothing. The applicability of the axioms of law is independent of who the party in question may be. We get to have privacy, or complete transparency, but the FBI and the federal US are not getting a double standard with special rules just for them.

2 Likes

Bravo to Apple for doing the right thing firmly and publicly (while secretly complying (hopefully joking))

9 Likes

All Writs Act of 1789

Who says the FBI doesn’t have a sense of humor!

5 Likes

Minor point. The FBI asked, but the order comes from a magistrate judge. So Apple’s first move is to quash the order. Much legal drama before this gets settled.

Also, the order doesn’t require Apple to do anything impossible, so there may be technological barriers too.

Even if a court orders it, you can’t make a tail a leg. (A. Lincoln: Q: How many legs does a dog have if you call the tail a leg? A: Four; just because you call it a leg does not make it one.)

3 Likes

I’m glad Apple is pushing back, but I think that the FBI will eventually get what it wants and I’m not terribly upset about that.

An encryption backdoor would be a terrible thing, but being able to decrypt a phone that the police have physical possession of doesn’t seem that unreasonable. At least no less reasonable than the police being able to break into a safe in your home.

If a user encrypts their data themselves, then a decrypted phone will just yield more encrypted data. The lesson is: don’t rely on Apple (or Google or Microsoft or Blackberry or …) to keep your secrets.
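A minimal sketch of what that second layer looks like, using Python’s third-party cryptography package (the note-file scenario and names are invented for illustration): even if the device-level encryption is defeated, anything the user encrypted with their own key is still ciphertext.

```python
# Illustration only: user/app-level encryption layered on top of
# whatever the device vendor provides. Requires the third-party
# "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# A key generated and held by the user, not by the device vendor.
key = Fernet.generate_key()
f = Fernet(key)

secret_note = b"meet at the usual place, 6pm"
token = f.encrypt(secret_note)

# Even with the phone's storage unlocked, this file is still
# ciphertext to anyone without the user's key.
with open("note.enc", "wb") as fh:
    fh.write(token)

# Only the key holder can reverse it.
assert Fernet(key).decrypt(token) == secret_note
```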

2 Likes

Good on you, Tim Cook.

5 Likes

One of many problems here is that if the police have the ability to decrypt a phone that they have physical possession of, it opens the door for access by many others: corrupt individuals within the police departments, the NSA, foreign governments, and more.

Ordinary people will probably not bother with the second layer of encryption that you describe, but the bad actors this measure is meant to catch will. All it will serve to do is make regular users more vulnerable to all kinds of attacks.

12 Likes

How is that different from any other ability the police have? Why should a phone be treated differently than a lock box or the trunk of my car? I think that requiring physical possession of a device before the data can be accessed is a pretty good layer of security.

This is all about trade-offs, though, and I can understand why you would have a different position from mine. I think there is some value in letting the police decrypt a criminal’s phone. As you point out, there is a cost to that too. Do the costs outweigh the benefits? With a generic encryption backdoor, certainly. With a backdoor that requires physical possession? The costs to the public are probably less. The question is one of finding balance.

Enabling said capability would be enabling an encryption backdoor.

“requiring the police to have access to all your keys would be horrible, but requiring the manufacturers of locks to make keys available to the police on demand isn’t so bad”

A backdoor by any other name is still a backdoor.

Apple isn’t keeping any secrets. They make devices that feature the ability to automatically wipe their contents if you repeatedly try to break into them. They are refusing to create a way to bypass that security feature.
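As a toy model of that feature (the thresholds, delays, and function names below are invented; the real mechanism lives in the OS and hardware, not in app code), the logic amounts to a failed-attempt counter that escalates delays and finally destroys the key material:

```python
# Toy sketch of an "erase after too many failed passcode attempts"
# policy -- illustration only, not Apple's implementation.
import time

MAX_ATTEMPTS = 10
failed_attempts = 0

def wipe_encryption_keys() -> None:
    # Hypothetical: destroying the keys renders the (still encrypted)
    # data permanently unreadable, without touching the data itself.
    print("keys destroyed; device contents are now unrecoverable")

def check_passcode(entered: str, actual: str) -> bool:
    global failed_attempts
    if entered == actual:
        failed_attempts = 0
        return True
    failed_attempts += 1
    if failed_attempts >= MAX_ATTEMPTS:
        wipe_encryption_keys()
    else:
        time.sleep(2 ** failed_attempts)  # escalating delay between guesses
    return False
```

What the FBI’s order effectively asks for is a signed build with the delay and the wipe removed, so passcodes can be guessed at full speed.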

That’s all fine and good for secret agents, uber-nerds, and l33t hackers who will take the time and trouble to learn how to separately encrypt their data on a per-app basis.

And your position is that ordinary mortals should just learn to live without the security of knowing their private info is safe from being rifled through by anyone who steals their phone, because the cops find it inconvenient not to be able to rifle through the data of anybody whose phone they confiscate. Apple’s position is that an easy, simple, built-in whole-device encryption system should exist so that everyone can enjoy the same privacy as those uber-nerds. I’ll side with Apple on this one.

15 Likes

Has anybody told them about the threat posed by locks? Why, people could be doing anything in the so-called privacy of their homes!

17 Likes

I guess, but it’s troubling that the right to personal privacy gets decided by Apple in the first place.

They did the right thing (very loudly and publicly) in this case, but they haven’t every time and surely won’t always. Why should we leave it up to them at all? It’s our privacy, not theirs.

4 Likes

Two things surprise me (pleasantly, I guess):

  1. The FBI asked, and
  2. They were unwilling and/or unable to just go ahead and handle it themselves.

5 Likes

I’m surprised this didn’t come through a National Security Letter, which would have included a gag order preventing Cook from saying what they’d asked for. Thank you EFF for fighting the good fight!

17 Likes

I can’t even.

5 Likes

[quote=“Glaurung, post:13, topic:73844”]
A backdoor by any other name is still a backdoor.
[/quote]

Yeah, but not all backdoors are equal.

If the encryption itself were weakened, then all traffic involving the device would be at risk all the time. If decryption requires physical possession, then the danger to regular people is considerably less.

You have to trust them on that (and I do). As I understand it, the unique ID (UID) they burn into the Secure Enclave is a master key of sorts that Apple claims they don’t record.
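To make the entanglement idea concrete, here’s a rough sketch (an assumption-laden model of the general technique, not Apple’s actual scheme): the key that unlocks storage is derived by mixing the user’s passcode with a per-device secret that never leaves the chip, so each guess must run on that one physical phone.

```python
# Sketch of passcode/device-key "entanglement" -- illustration only.
# The device UID is modeled as a random constant; on real hardware it
# is fused into the Secure Enclave and cannot be read out.
import hashlib
import os

DEVICE_UID = os.urandom(32)  # stand-in for the per-device secret

def derive_storage_key(passcode: str) -> bytes:
    # PBKDF2 with the device secret as salt: a high iteration count
    # makes each guess slow, and the UID ties the result to this
    # particular device.
    return hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), DEVICE_UID, 1_000_000
    )

key = derive_storage_key("1234")  # usable only on this device
```

Because the derivation depends on a secret locked inside the hardware, the key can’t be computed off-device; any brute forcing has to happen on the phone itself.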

I presume that any phone decryption could be done only by Apple. If I trust them to make a secure device, then I trust them not to abuse my privacy by unlocking phones they aren’t required to. A random thief will hopefully not be able to convince Apple to unlock random devices.

Like I said in an earlier comment, it comes down to the value of a backdoor vs the cost of it. A backdoor that puts everybody at significant risk is a non-starter. One that is likely to be only used in a highly targeted manner is worth thinking about. If Apple holds their ground here, I’m afraid that the FBI will be able to use this incident to get legislation passed that would ban the wireless network operators from activating non-approved devices (and we know how well the wireless companies like to cooperate with the government).

If Apple’s software were 100% secure, they would not be able to comply with the request, because any backdoor designed after this particular phone was encrypted would not work on it. Since we know that in the real world nobody can design 100% foolproof encryption, the question becomes how much effort it would take to decrypt this device. My guess is that Apple doesn’t know, but that’s just based on the fact that when I was trying to design secure communications, co-designing a means of cracking them wasn’t on the agenda; far from it, the goal was just to make the thing as difficult to crack as possible.
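For a sense of scale, here’s a back-of-the-envelope estimate (the ~80 ms per guess figure is Apple’s published calibration for on-device passcode derivation; everything else is plain arithmetic):

```python
# Rough brute-force times with no retry delays and no auto-wipe,
# assuming ~80 ms per on-device guess (treat as a ballpark).
PER_GUESS_S = 0.080

for digits in (4, 6, 8):
    guesses = 10 ** digits           # all possible numeric passcodes
    hours = guesses * PER_GUESS_S / 3600
    print(f"{digits}-digit passcode: up to {hours:,.1f} hours")

# 4 digits: ~0.2 hours; 6 digits: ~22 hours; 8 digits: ~2,222 hours.
```

With the escalating delays and the ten-try wipe in place, even these modest numbers are unreachable, which is precisely why the order targets those features.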

But Apple cannot introduce a backdoor, because Snowden. Snowden’s real threat to the authorities wasn’t any imaginary agents in the field being put at risk; it was the revelation that they could not guard their own secrets reliably. A load of diplomatic stuff didn’t perturb the Russians too much; but if there had been an iPhone backdoor, and if Snowden had been in possession of it, it would have been: “Here’s your choice: give us the iPhone backdoor and be secret FSB colonel Snowden, or don’t, and spend the rest of your life shovelling snow on the Kurile Islands while you think about it.”

7 Likes