FBI demands iPhone backdoor access; Tim Cook tells them to get lost

I guess the FBI will ask Microsoft to write a special version of Windows each time they need access to a laptop then?

1 Like


The FBI’s been trying to get this for a while for drug cases - see
https://www.justsecurity.org/27214/quick-update-apple-privacy-writs-act-1789/

And police have been getting the ability to confiscate a phone from anybody they arrest (and lots of other people), copy all its data, and do whatever they feel like with it - a tool they exploited a lot during #BlackLivesMatter protests - though they've been starting to get push-back from judges about whether that's valid under the 4th Amendment. So if the FBI bullies Apple into giving them a backdoor into all iPhones, the police will be able to ask for the same help with any random arrestee's phone.

7 Likes

Why would I be okay with Apple retaining the keys to my phone any more than I would be okay with Chevrolet retaining the keys to my car? Apple doesn't even have the keys in the first place: they're generated by the Secure Enclave based on a UID baked into the chip at the factory, and are not stored anywhere else [page 7], which is why the FBI isn't asking for that information.

They're asking the court to force Apple to give the FBI the tools to decrypt the phone by building an intentionally-vulnerable version of iOS for them which bypasses the PIN lockout. That puts those tools in the FBI's hands permanently, which means they can then crack any phone they get their hands on.

And I can guarantee you that the FBI, CIA, and NSA won't be content to just sit on those tools. Just look at the revelation that the NSA doesn't disclose the 0-day vulnerabilities it discovers; it's their job to weaponize software vulnerabilities.

Consider: the OS patch that installs the backdoor would have to be signed by Apple in order for the phone to accept it, and because the OS update process requires a dynamically-signed patch containing the device's unique ID [page 6], Apple would have to give the FBI their signing key (!!!). The FBI/CIA/NSA could then easily stand up a MITM attack on Apple's Software Update server to push this surveillance-friendly patch to anyone - or everyone - and the phones would happily treat it as an Apple-authorized patch.
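To make the "Apple doesn't have the keys" point concrete: the unlock key is derived on the device by tangling the user's passcode with that hardware UID. Here's a simplified Python sketch - the KDF choice, iteration count, and function name are my own illustrative assumptions; the real derivation runs inside the Secure Enclave hardware:

```python
import hashlib

def derive_unlock_key(passcode: str, hardware_uid: bytes) -> bytes:
    # The UID is fused into the silicon at manufacture, isn't readable by
    # software, and isn't recorded by Apple, so this derivation can only
    # happen on the device itself.
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),   # the user's PIN or passphrase
        hardware_uid,        # per-device secret locked inside the chip
        iterations=100_000,  # illustrative; calibrated so each guess costs ~80 ms on-device
    )
```

Because the UID never leaves the chip, every passcode guess has to be run on the phone itself - which is exactly what the lockout and escalating-delay features are there to throttle.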

And what if the government, now flush with the knowledge that this backdoor is possible, forces Apple to include it on every phone? What happens when the mechanism for bypassing the phone’s PIN lockout security gets discovered by black hats? It can’t be patched out because it’s an intentional software vulnerability, meaning it can then be exploited by anyone. Forever. Now if your phone gets lost or stolen, you have no way of securing your data from criminals.

This is no different than TSA-friendly luggage locks, which were always sort-of-pointless, but are now extra-pointless because the shapes of the master keys got leaked by the agency in a 60 Minutes piece. Now anyone with a 3D printer or a metal grinder can replicate a TSA key and unlock your luggage. The secret is out, it can’t be taken back. Why would I want that for my phone, which has almost infinitely more personal and financial information in it than my suitcase?

9 Likes

Of course. It is not unreasonable to think that.

It is also not unreasonable to think that drone strikes that only kill terrorists are a good thing.

Sadly, they are also a fictional thing.

11 Likes

[quote=“Chesterfield, post:39, topic:73844”]
I’m not certain, but I believe the secure enclave is written to, then fuses are blown to prevent any further access to the data stored within. If a fused link is able to be physically restored (and I have no idea if it’s possible) then access to the secure enclave could be restored. Would that qualify as a physical backdoor?[/quote]

No idea, just wanted to point out the phone in question is a 5c, which does not have the “Secure Enclave” [pdf] feature.

It’s even simpler than that. The court is directing Apple to provide a one-off IPSW file (a firmware update) that removes the “10-failed-unlock-attempts-wipes-the-phone” and “each-failed-unlock-slows-down-the-next-unlock-attempt” features. The FBI will then deploy monkeys to throw zillions of codes at the unlock screen until they get the right one. Once unlocked, the phone’s contents are pwned.
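For a sense of how fast that goes once those two features are gone: the only throttle left is the roughly 80 ms per guess that the on-device key derivation costs (figure from Apple's iOS Security Guide). A back-of-the-envelope sketch in Python - the numbers are illustrative, not a claim about the FBI's actual tooling:

```python
# Worst-case brute-force time once the wipe and escalating-delay
# features are removed, assuming ~80 ms per guess for the on-device
# key derivation (illustrative figure).
SECONDS_PER_GUESS = 0.08

def worst_case(digits: int) -> str:
    guesses = 10 ** digits
    hours = guesses * SECONDS_PER_GUESS / 3600
    return f"{digits}-digit PIN: {guesses:,} guesses, ~{hours:.1f} hours worst case"

for digits in (4, 6):
    print(worst_case(digits))
# 4-digit PIN: 10,000 guesses, ~0.2 hours worst case
# 6-digit PIN: 1,000,000 guesses, ~22.2 hours worst case
```

No monkeys required - a 4-digit PIN falls in minutes.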

Can Apple even do this? Good question…

3 Likes

No, your first analogy was better. But the thing about phone taps is that the police did figure it out on their own to begin with - the technology was trivially easy to tap into. Only after this had become something that the police - and the whole criminal justice system - expected did the technology change in ways that required laws to be passed forcing telecom operators to provide access. We're now seeing creeping expectations and law-enforcement greed at the thought of all the information and surveillance capability smart phones provide that wasn't available before.

Which would just move the FBI’s demands to the providers of the encryption software.

1 Like

Technically, it's a very simple thing for Apple to code.
Just completely remove all the encryption. Anything else would be a huge waste of time and research to get the same effect.

2 Likes


This metaphor started out strained, and the way you're using it now is completely unhelpful. Perhaps you're trying to say that, like manufacturing your own door lock, building your own phone security is prohibitively complex and impractical. But you don't have to build your own phone security: there are peer-reviewed open-source tools ready to pick up and use.
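For instance, a minimal sketch using the open-source `cryptography` package for Python - just one example of such a tool, and the message here is made up:

```python
# pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # the key stays with you, not with a phone vendor
box = Fernet(key)

token = box.encrypt(b"meet at the usual place")
print(box.decrypt(token))     # b'meet at the usual place'
```

Nothing about that requires Apple - or anyone else - to hold a copy of the key.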

But even accepting that in the most general sense in both cases it’s harder to have security than not: the stakes are completely different. If the fate of privacy and balance of power in our society rested on whether our door locks could be overridden, then I might seriously advocate for manufacturing and installing our own door locks. But it doesn’t. Our door locks aren’t that important, which is why nobody really cares that the locks on our front doors can be easily defeated. Data encryption is so much more important and has so many implications for society at large that the metaphor stops making sense.

1 Like

I think you overestimate the technical capabilities that the FBI has. They pay shit and most of their people aren’t actually that good in the computer space.

2 Likes

The FBI are trying to leverage the notoriety of the San Bernardino case to get something that they’ve been trying to get for a while.

14 Likes

But this also suggests they didn’t just hand it over to the NSA (based on the assumption that they are not similarly afflicted).

1 Like

but they still did

legislation was created that required telecom operators to only install equipment that was capable of supporting wiretaps

so that it could be admitted as evidence in court, which it already was, but constitutionally, which it previously wasn’t.

This is a new arena compared to that. This is more about the right not to self-incriminate, the right to privacy. The right not to self-surveil.

ETA: Maybe even the right to control the output of this nifty little movable type machine, too.

1 Like

The circumstances of the order center on the investigation into last year’s San Bernardino terror shootings in California: “Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.”

What I find a bit troubling is that the wording seems to hint at forward secrecy not being part of the design.

ETA: Oh I see. This has to do with unlocking and automatic deletion, not session keys.

1 Like

And people wonder where the anti-government "militia" nuts come from.

Both the Left and the Right now fear the government, as it can no longer be trusted to be on the side of the People.

One solution to this security problem is to overwhelm those who would read your private data: Load your phone with heavily encrypted files that contain nonsense.

Let them spend hours and days decrypting garbage. It's much harder to know you've decrypted something when the result still looks like encrypted data.
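A toy sketch of that chaff idea in Python (the directory name, file count, and sizes are arbitrary): files of random bytes are statistically indistinguishable from well-encrypted ciphertext, so an analyst can't tell a decoy from the real thing without decrypting it.

```python
import os

def write_decoys(directory: str, count: int = 50, size: int = 1 << 20) -> None:
    """Fill a directory with files of random bytes that look like ciphertext."""
    os.makedirs(directory, exist_ok=True)
    for i in range(count):
        path = os.path.join(directory, f"backup_{i:03}.enc")
        with open(path, "wb") as f:
            f.write(os.urandom(size))  # 1 MiB of noise per "encrypted" file

write_decoys("decoys")
```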

2 Likes

I didn’t take his comment to mean “manufacture” of security. I thought he meant insisting on legislation that protects our privacy rather than electing people who keep getting lobbied to pass legislation that erodes our privacy. IOW, we need better leaders.

3 Likes

Agreed, although that really was already the case. This is rather more minor to my mind.

1 Like

All official typewriters (ones that used a particular typeface and point size) in the Soviet Union had their output recorded in sufficient detail that the KGB could work out what typewriter had been used to produce a given document.

I offer this modest little suggestion to the FBI and NSA, free of charge: all phones in the US must have a code sequence baked in, sent out with every message, uniquely identifying its origin.
After all, it worked - it's not as if the Soviet Union collapsed or anything.

7 Likes