doctorow — 2013-10-15T11:38:57-04:00 — #1
lemoutan — 2013-10-15T12:31:03-04:00 — #2
Had Lavabit had in place measures to prevent disclosure of its master
key, it would have been unable to comply with the ultimate court order ...
Although the closure of the service may still be the final result, this time compelled by force majeure rather than chosen, it would be a better outcome, since a voluntary shutdown could only be interpreted as spite.
dragonfrog — 2013-10-15T12:41:58-04:00 — #3
Sort of. But also not really.
For instance, a company can implement technologies that make it resistant to bribery but not to a court order. Consider:
A company receives a court order. In order to extract the data, two employees must both authenticate themselves to the system (ideally by biometric + password), enter the court document number, the user ID or IDs to which it specifically applies, and the public key + certificate under which the data will be encrypted. One of these users must be a manager or higher in the technical side of the business, and the other must be a company lawyer.
Annually, the system produces, and the company provides to [its board of directors / its customers / its shareholders / the public at large], a list of:
- court orders complied with (court document number included)
- names, or, if a gag order applies, at least the number of affected accounts in each case
- certificates under which the data was encrypted
- staff members who attested to the correctness of the demand, in each case
If a staff member receives a bribe or extortion threat, they are going to have to cook up a fake court order, find a co-conspirator in the organization, and know that when the report comes out, the encryption certificate section will show that they extracted information on behalf of "firstname.lastname@example.org"
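The dual-control release described above can be sketched in code. This is a minimal illustration of the idea, not a real system; all names (`DisclosureSystem`, `Approval`, the role strings) are made up for the example. The key properties are that extraction requires two distinct approvers in the right roles, and that every release leaves an entry in the log that becomes the annual report:

```python
import hashlib
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Approval:
    staff_id: str
    role: str  # "tech_manager" or "lawyer" in this sketch

@dataclass
class DisclosureSystem:
    audit_log: list = field(default_factory=list)

    def release(self, court_doc_no, user_ids, recipient_cert_pem, approvals):
        # Two distinct staff members must approve...
        if len(approvals) != 2 or len({a.staff_id for a in approvals}) != 2:
            raise PermissionError("two distinct approvers required")
        # ...one from the technical side, one from legal.
        if {a.role for a in approvals} != {"tech_manager", "lawyer"}:
            raise PermissionError("one technical manager and one lawyer required")
        # Record everything the annual report needs; the certificate is
        # logged by fingerprint so the recipient is identifiable later.
        cert_fp = hashlib.sha256(recipient_cert_pem.encode()).hexdigest()
        self.audit_log.append({
            "court_doc_no": court_doc_no,
            "affected_accounts": len(user_ids),
            "cert_fingerprint": cert_fp,
            "approvers": sorted(a.staff_id for a in approvals),
        })
        # (actual data extraction, encrypted under recipient_cert_pem,
        # would happen here)
        return cert_fp

    def annual_report(self):
        # The published report is simply the accumulated log.
        return list(self.audit_log)
```

A briber now has to defeat the process, not just one person: they need a plausible court document number, a co-conspirator in the other role, and a way to explain the log entry when the report is published.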
tuseroni — 2013-10-15T16:21:49-04:00 — #4
Except that the court order, and the fact that a court order was given at all, must be kept secret.
richard_kirk — 2013-10-15T17:41:54-04:00 — #5
This is a simple and neat argument. It is almost a proof. We can argue that surrendering private data to a trusted body is somehow different to surrendering the same data in response to a threat. Descartes reasoned that everything he sensed could be influenced by some daemon simulating everything he sensed, and so all his inferences could be misled. Worse than anything he imagined, even his reasoning processes could easily be corrupted. Descartes argued that a loving god would not allow this to happen. Security patriots say that a loving government would not allow this to happen.
Any finite system will be unable to determine whether a request from outside that system is legitimate. Even a request from within the system cannot be trusted, as the NSA now should know. The safest thing would be to delete all the requested data and hope the user has a non-volatile copy. Anything less than this is not safe, though you could imagine a long-standing agreement to encrypt everything with a key supplied by the client when the account was set up, and throw away all local copies of that key. The onus is then on the customer not to release that key, but at least they are protected against malicious requests that would otherwise destroy all their data...
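That last arrangement can be sketched as follows. This is a toy illustration under stated assumptions: `ZeroKnowledgeStore` is a made-up name, and the HMAC-counter keystream stands in for a real cipher (it is the general shape of CTR-mode encryption with HMAC as the PRF, but nobody should ship it). The point is only that the server holds ciphertext and never the key, so a compelled or malicious disclosure yields nothing readable:

```python
import hashlib
import hmac
import os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data against an HMAC-SHA256 counter keystream.

    Symmetric: applying it twice with the same key recovers the input.
    Illustration only -- not production cryptography.
    """
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

class ZeroKnowledgeStore:
    """Server-side store that only ever sees ciphertext."""

    def __init__(self):
        self._blobs = {}

    def put(self, account: str, blob: bytes) -> None:
        # blob arrives already encrypted on the client side
        self._blobs[account] = blob

    def get(self, account: str) -> bytes:
        # even a "dump everything" request can only return ciphertext
        return self._blobs[account]

# Client side: the key is generated at account setup and never sent.
key = os.urandom(32)
server = ZeroKnowledgeStore()
server.put("alice", keystream_xor(key, b"my private mail"))
```

Whoever compels the server gets `server.get("alice")`, which is opaque without the client's key; the client alone can run `keystream_xor(key, ...)` over it to recover the plaintext.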
dragonfrog — 2013-10-15T18:17:13-04:00 — #6
Which is a whole problem in itself - then the full report can only be published to the board of directors, which is a very inferior control compared to publishing it more broadly.
It could still provide fairly good protection - a single rogue insider would still have a hard time getting away with a leak.
doctorow — 2013-10-20T11:38:56-04:00 — #7
This topic was automatically closed after 5 days. New replies are no longer allowed.