"Phooey": a pre-eminent cryptographer responds to Ray Ozzie's key escrow system

Originally published at: https://boingboing.net/2018/04/26/party-like-its-1999-2.html

5 Likes

Hey, hey, that card is from Steve Jackson’s Illuminati. Anyone else play that? Man what a fun game.

2 Likes

I am selling extremely pure Unobtanium at really low, Low, LOW prices. Get it before it’s Unobtanium-able!

Unobtanium Now!

2 Likes

Yep. Ultimately the trust lies in the key escrow system and there’s no such thing as a perfect chain of trust.

When it comes to getting access to Apple devices, law enforcement (and the organizations they work with) need to simply follow Apple’s instructions and stop shooting themselves in the foot.

Take San Bernardino: the FBI could have gotten access to the phone easily. It was an enterprise-managed phone; you use the mobile device management tools to remove or reset the PIN. Instead, some genius in the health department tried to give the FBI access by resetting the iCloud account password… which disconnects the phone from iCloud until you unlock the phone and enter the new password. Now you’re screwed unless you either know the existing PIN or wipe the phone.

4 Likes

Alex Jones: “That’s no game! They^ just try to make people think it was a game!”

^ spin the dial for the They of the moment.

6 Likes

Aside from the grievous technical issues; this seems like it would be at considerable risk of ruining everything while still failing to satisfy the ‘zOMG going dark!’ crowd.

The intended architectural safeguard against dragnet deployment of escrowed keys obtained from a cooperative and/or coerced vendor is the magic forensics-mode coprocessor that uses methods, unspecified but no doubt quite practical to implement, to shut down everything (in a read-only sort of way) if the escrowed key is used against the device (but in no other cases; because those other cases would imply both amusing DoS capabilities and a means of forcing a device into forensics mode without the key; and the key is the only way to unlock the device, because otherwise this would be insecure).

This means that any sort of clandestine search, surveillance, or ongoing monitoring of a target’s communications would be off the table with as much absoluteness as the almost definitely flawed technology could enforce.

That isn’t going to satisfy the feds of the world for long. The ability to unlock seized items will be seen as better than nothing; but given the long use and general legal acceptance of things like ‘wiretaps’ they will soon be back with an urgent request for some sort of ‘nondestructive remote forensic capability’; which will either have to be denied or answered by creating a second layer of escrow, with all the problems of the first (plus a network-accessible attack surface by design rather than just by accident).

You’ll have the escrowed keys that can only be used to destructively unlock devices; which are kept impeccably secure by the vendor and only released to duly authorized law enforcement acting with appropriate judicial approval; because otherwise building this backdoor would look like a terrible plan.

Then you’ll have the escrowed keys that can be used for ‘Delayed Notice Warrant’ purposes, without triggering the omniscient and infallible killchip. These will presumably be kept even more impeccably secure than the impeccably secure destructive-readout keys; because they are vastly more dangerous as tools of dragnet surveillance, targeted harassment, and good old-fashioned economic crime (SMS ‘two factor’, lol).

For extra credit: such covert inspection is typically only granted (in the world where law enforcement methods are humane and proportionate, judicial safeguards are in place, and state objectives are not nefarious; luckily that’s the same world where impregnable HSMs and omniscient killchips live, so we are set) for a defined period of time (either an exact timespan after which reauthorization is necessary, or ‘until the ongoing investigation goes to trial’ or the like).

Unless we just want to trust all the cops to just delete any covert inspection keys that are no longer authorized, scout’s honor, we’ll actually need another level of indirection and some additional infrastructure: acceptance (without triggering destruction) of the covert access key by the omniscient killchip will have to be based on cryptographically verifiable timestamps attached to escrow keys when they are released by vendors, and on the omniscient killchip having guaranteed ongoing access to the Lawful Good Trusted Timeserver (since, if this access isn’t assured, malefactors could skew their system clocks to cause valid access attempts to be rejected as either outdated or not yet valid; and tampering with a device’s time would also make possible the reuse (better tack on something for replay resistance, but not in a way that would allow the surveillance target to infer that they are being targeted by seeing how quickly the replay-detection cache is filling up) of formerly valid but now expired remote authorizations, including against an entirely different target if a piece of hardware changes hands).
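To make the problem concrete, here is a minimal toy sketch of the validity-window-plus-replay-cache check the killchip would have to perform. All names are hypothetical, and an HMAC stands in for what would have to be a real asymmetric signature (with HMAC, the device holds the same key the vendor signs with, which is itself exactly the kind of shortcut that sinks these schemes):

```python
import hashlib
import hmac
import os

# Stand-in for the vendor's signing key; a real design would need an
# asymmetric signature so the verifying device cannot forge authorizations.
VENDOR_KEY = os.urandom(32)

def issue_authorization(device_id: bytes, not_before: int, not_after: int):
    """Vendor releases an escrow authorization bound to one 16-byte device
    ID and a validity window, with a nonce for replay detection."""
    nonce = os.urandom(16)
    msg = (device_id
           + not_before.to_bytes(8, "big")
           + not_after.to_bytes(8, "big")
           + nonce)
    tag = hmac.new(VENDOR_KEY, msg, hashlib.sha256).digest()
    return msg, tag

# The replay-detection cache the comment worries about: its growth rate
# leaks how often the device is being targeted.
seen_nonces = set()

def device_accepts(msg: bytes, tag: bytes, device_id: bytes, now: int) -> bool:
    """The 'killchip' checks the tag, the device binding, the validity
    window (hence its need for trustworthy time), and replay."""
    if len(msg) != 48:
        return False
    expected = hmac.new(VENDOR_KEY, msg, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        return False
    did = msg[:16]
    not_before = int.from_bytes(msg[16:24], "big")
    not_after = int.from_bytes(msg[24:32], "big")
    nonce = msg[32:48]
    if did != device_id or not (not_before <= now <= not_after):
        return False
    if nonce in seen_nonces:  # same authorization presented twice
        return False
    seen_nonces.add(nonce)
    return True
```

Note that every rejection branch depends on `now` being honest, which is exactly why the comment’s Lawful Good Trusted Timeserver (or the seed-based scheme below) gets dragged into the design.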

In order to prevent people from using device access to the Lawful Good Trusted Timeserver as a signal that they are compromised, we will presumably either make routine check-ins with this server (also not an unimaginably high-value target) mandatory for every device; or modify all current and future network stacks and interfaces to add the Law Enforcement Delayed Notice Trusted Not A Side Channel: a (super-authenticated, by means so trivial and evident as to not be worth discussing here) communications channel that Shall be provided by all networking mechanisms and Must Not allow an entity without a LEDNTNASC escrow key (implementation left as an exercise to the reader) to inspect either the contents or the transmission activity or inactivity of the channel; which could be used for checking in with the LGTT to verify the validity of DNT-non-destructive-decrypt escrow keys before honoring or ignoring them.

It might, though I’m not enough of a fancy number theorist to say, be possible to escape the need for the timeserver by using an initial seed value and time (in the manner of the RSA fobs): the vendor retains the initialization seed and time of each device (in a duly impregnable HSM, which, as history has proven, is not difficult), so that it could issue escrowed non-destructive-access keys tied exclusively to a specific set of iterations on the initial seed value; allowing the omniscient killchip to verify the temporal validity of a request without any need for remote communication. A small, but clearly worthwhile, drawback of this elegant mechanism would be that any device whose trusted RTC ever loses power or is otherwise put into an unverifiable state permanently loses the capacity for temporal validation; and so Must be designed either to activate the read-only-killswitch state or to accept all temporally conditional access requests without validation, lest trivial hardware attacks on the RTC be used to place a device beyond the reach of justice.
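The fob-style scheme above can be sketched in a few lines, purely for illustration; all names are made up, and the iteration values are derived TOTP-style with an HMAC over a step counter (RFC 6238 works the same way, just with a different truncation):

```python
import hashlib
import hmac
import os

STEP = 60  # seconds per iteration, as with an RSA fob or TOTP token

def iteration_value(seed: bytes, n: int) -> bytes:
    """Value of the n-th iteration on the per-device seed."""
    return hmac.new(seed, n.to_bytes(8, "big"), hashlib.sha256).digest()

def vendor_issue(seed: bytes, init_time: int, valid_from: int, valid_to: int):
    """The vendor, holding the retained seed and init time in its duly
    impregnable HSM, issues a key tied to a range of iterations."""
    lo = (valid_from - init_time) // STEP
    hi = (valid_to - init_time) // STEP
    return lo, hi, [iteration_value(seed, n) for n in range(lo, hi + 1)]

def device_check(seed: bytes, init_time: int, rtc_now: int,
                 lo: int, hi: int, values: list) -> bool:
    """The killchip, trusting only its own seed and RTC, verifies that the
    presented key covers the current iteration -- no network needed."""
    n = (rtc_now - init_time) // STEP
    if not (lo <= n <= hi):
        return False
    return hmac.compare_digest(values[n - lo], iteration_value(seed, n))
```

The comment’s drawback is visible here too: `device_check` is only as good as `rtc_now`, so a device whose RTC has ever been in an unverifiable state has nothing left to check against.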

13/10: seems legit

3 Likes

Phooey.

There’s much to be said for single, killer word replies. (The FULL story of Gen. McAuliffe’s “Nuts” is worth the read and enlightening.)

1 Like

Back to the San Bernardino case: didn’t the FBI propose something that was actually doable by Apple, if they wanted to? That is: issue a software update to just that phone and have it change the security settings to allow unlimited passcode tries? With a bonus of allowing passcode tries to be entered electronically?

This all hinged on Apple being able to sign any update with their secret key. Isn’t that secret key the weakest link? Any agent who knew this secret key could then gain access to any phone of that kind by pushing their own payload as an OS update.

The fear was that someone could use that same update and send it to other similar phones to gain access to them.

With individual keys, an update could be signed with both Apple’s master key and the individual device key, so the update only works on that one phone. Bonus points for having the OS notice that it’s signed just for that phone and put itself into tattletale mode.
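A toy sketch of that dual binding, with HMACs standing in for real signatures (Apple’s actual update signing is asymmetric and rather more involved; every name here is hypothetical): the master signature is computed over the payload plus a per-device tag, so a signature minted for one phone verifies on no other.

```python
import hashlib
import hmac
import os

# Stand-in for the vendor's master update-signing key.
MASTER_KEY = os.urandom(32)

def sign_update(payload: bytes, device_key: bytes) -> bytes:
    """Sign an update under the master key, with the target device's
    individual key baked into what is signed."""
    inner = hmac.new(device_key, payload, hashlib.sha256).digest()
    return hmac.new(MASTER_KEY, payload + inner, hashlib.sha256).digest()

def device_install(payload: bytes, sig: bytes, device_key: bytes) -> bool:
    """A device recomputes its own per-device tag, so it only accepts an
    update whose master signature was minted for it specifically."""
    inner = hmac.new(device_key, payload, hashlib.sha256).digest()
    expected = hmac.new(MASTER_KEY, payload + inner, hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)
```

The design choice to note: rejecting a foreign update requires no blocklist or phone-home, the binding falls out of the signature itself, which is the whole appeal of the idea in the comment.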

How much less secure is this than what’s currently going on?

And both Steve Jackson Games and Alex Jones are headquartered in Austin. Keep Austin Weird.

1 Like

Point #1: Ozzie has made clear that the physical vault security requirement is essentially identical to that already used to protect operating system updates.

I didn’t bother reading any further. I’ve better things to do.

I’ve never been particularly impressed with Ozzie as a software engineer – he’s always struck me as more the kind of person who talks a good talk, has some good ideas, and surrounds himself with smart people who make him look good.

2 Likes

… and reading TFA, my thought is, “that’s it?”

It’s basically a form of public key cryptography where the issuing authority has the ability to get the private key on demand.

I’m no expert in cryptography but even I know that once a third party has the ability to reverse your encryption, it’s not secure.

2 Likes

This topic was automatically closed after 5 days. New replies are no longer allowed.