Ah, I just assumed you did not, since you kept referring to it as dumb. Didn’t sound very diplomatic, but I certainly wasn’t asking for advice.
It’s worth remembering that hunting down political dissidents was what the FBI was originally created for, and it has always been their key function.
(yes, these are from the pre-FBI days, when it was just the Bureau of Investigation. But it was the same people; Hoover organised the Palmer Raids)
What always gets me about this particular “NERD HARDER” is the meanness and pettiness of spirit that it demonstrates.
I mean, what kind of person, believing we can get around the laws of nature if only the nerds can be motivated to nerd harder, would make snooping on people’s email his dearest desire?
Not getting free energy so we can all be rich.
Not finding a way to reverse old age.
Just using that grand power over nature to try to set yourself up as a second-rate Stasi.
Well, it’s not 0. Otherwise you’d just stop with the previous digit.
See? I just did 10% of the job for you and I don’t know shit about math. Now get to work, nerds.
The point is that in cases like that, investigators should go after the criminals and their crimes, not attack the fundamental infrastructure of society. They get frustrated by a case that they just aren’t quite good enough to solve and decide that the ends justify any means, no matter the collateral damage it will cause. Like a superhero destroying all of Metropolis, killing countless innocents, the victims among them, just to get one bad guy. That’s not a superhero, that’s a supervillain.
Sure, you could solve those crime problems by poisoning the groundwater or nuking everything to slag, but it’s not the right solution, and that should be obvious, at least to someone who’s competent enough to get to FBI Director level.
It sounds like the cases that he’s frustrated by are not anything with an obvious real-world manifestation, things where they can’t do a conventional investigation. Maybe things like DDoS attacks, ransomware, cryptocurrency theft, or other forms of hacking? Doesn’t really make sense otherwise, unless he’s just saying that the FBI is incompetent.
That would be expensive! Even talking to a few isn’t cheap.
I think you’re on to something here. Security based on a combination of procedure and/or physical objects and/or code:
- the president has to shoot the officer carrying the nuclear codes in order to be able to use them
- tamper-evident packaging
- single-use credit card numbers
- telling my neighbor I’m going away for the weekend and asking him to feed the cat (so burglars see lights going on and off), giving him a dedicated code for my front door smart lock, then reviewing the spycam vids later
- voting machines that give you a printed receipt you can view but not keep, letting you verify your electronic vote (VVPAT)
Yes, breakable crypto is an oxymoron propagated by actual morons. But situation-specific combinations of security controls are what we already do.
Of course, a backdoor accessible to law enforcement on a particular platform means motivated bad people will use a different platform. But if we can get this guy and others to think about taking a granular, procedure-dependent approach rather than a voice-of-deity omnipotent one, it would occupy their little minds for quite a while.
No doubt. I was merely criticizing Mr. Gates’ words in that, firstly, that “obvious breakthrough” is not something that anyone should expect to happen in their lifetime; and, secondly, that Mr. Gates’ statement (as reported by @hastur) is wrong: it is trivially easy to factor large primes, as they arrive already factored. This second point is, admittedly, pedantic.
(Edited to fix a verb.)
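To make the pedantic point concrete (a toy sketch, nothing to do with anything Gates actually meant): a prime has no nontrivial factors to find, so “factoring” it takes no work at all. The hard problem cryptography rests on is factoring the product of two large primes.

```python
# Toy illustration: factoring a prime is trivial (its only divisors are
# 1 and itself); factoring a *semiprime* is the hard problem, and even
# this naive trial division only works because the numbers are tiny.

def trial_factor(n: int):
    """Return the smallest nontrivial factor of n, or None if n is prime."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return None

print(trial_factor(101))        # 101 is prime: nothing to find
print(trial_factor(101 * 103))  # a tiny semiprime: its smaller factor
```

Scale those two factors up to hundreds of digits each and the second call, done properly, is what RSA’s security depends on.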
innovation on the level of Newton or Einstein, or maybe that one really nerdy guy who got killed in the movie with Robert Redford, that was a good movie, even though Redford basically played himself - a godless commie.
I think he knows he is blowing smoke but it gets good press and keeps him looking like he is “doing something”.
A door you can go through is a door others can go through and he wants a door in encryption.
Or he might just be that stupid; I am no longer surprised by people in high places, or wielding huge sums of money, who think concrete facts are just someone’s opinion or “merely a theory”.
I suspect he envies the long gone East German Vopo who required you to provide samples of your typewriter’s output they could keep on file in case you decided the State wasn’t the Workers’ Paradise you were told it was and wrote something treasonable to the People.
No doubt. The FBI will adapt too, though, and I don’t think people will necessarily like where that leads. There are certain parallels between the situation now and the difficulties the NSA was facing twenty years ago, and we all know how that turned out.
This is true, but convenience plays a significant role in the decision of what to use. There will always be people who take the time to learn about the systems and deploy completely impenetrable defenses, but the vast majority will just use whatever solution is ready and waiting out of the box.
To be clear, I’m not talking about HTTPS, Tor, or end-to-end encryption, and nor is Wray. The main issue here is device encryption: devices that are already sitting in evidence lockers, with warrants already issued, but that remain inaccessible because the owner is unwilling or unable to unlock them.
I deliberately say owner rather than suspect, because there are plenty of times when the device belongs to the victim. It used to be that manufacturers were eager to assist under those circumstances, but not any more. For what it’s worth, I think they were justified in removing that as an option. I just hope it was a net positive.
That’s how Cory frames it, but he’s being disingenuous.
They aren’t talking about all encryption, they don’t want to crack it, and they don’t want a back door. They’ve even largely abandoned resistance to end-to-end encryption. Encrypted storage is where there’s potential for compromise: something like an additional private key for the front door, held by a trustworthy third party, like the EFF, who’ll only hand it over with a warrant.
That particular setup isn’t viable for the reasons already discussed here by others, but it’s clearly not impossible. Finding a compromise which is acceptable to everyone may be impossible, but framing it as a mathematical impossibility is not accurate.
When Wray talks about it requiring innovation, he makes it clear he’s not just talking about crypto. An example from the procedure side would be prioritizing the device during an arrest, waiting until it’s unlocked to make a move. Then there are little doodads which can be attached to prevent it from going idle and locking, and evidence bags lined with foil that prevent ‘self-destruct’ routines from being triggered remotely.
In the longer term the sort of innovations you can look forward to are a big uptick in ‘lawful hacking.’
I agree with this quote.
We can know how long it takes to break a code using a given technique, but we cannot be certain that this technique is the best one. There have already been breakthroughs in mathematics, and steady development in computer hardware; together they have brought down the processing time to break a given key by many orders of magnitude. All codes get weaker with time, or, at best, stay the same. They won’t, ever, get better.
So far, this progress in cracking has been easily countered by making the key a bit longer. However, we do not have the God edition of the Book of Mathematics with the answers in the back: our confidence largely stems from the fact that a lot of smart people have tried, yet the gains have been slow and steady rather than dramatic, and from the fact that quantum computing does not look capable of jobs this complex any time soon.
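The “just make the key a bit longer” counter can be sketched numerically. The attacker speed below is a made-up round figure, purely for illustration; the point is that each extra key bit doubles the search space, so linear key growth outruns exponential attacker speedups.

```python
# Why adding key bits counters faster cracking: every extra bit
# doubles the brute-force search space.

def expected_brute_force_years(key_bits: int, keys_per_second: float) -> float:
    # On average, an attacker searches half the keyspace before hitting
    # the right key.
    seconds = 2 ** (key_bits - 1) / keys_per_second
    return seconds / (365.25 * 24 * 3600)

# Hypothetical attacker testing 10^12 keys per second:
for bits in (56, 80, 128):
    print(bits, "bits:", f"{expected_brute_force_years(bits, 1e12):.2e} years")
```

At that (generous) speed, 56-bit keys fall in hours, while 128-bit keys take on the order of 10^18 years; this is the asymmetry defenders currently enjoy, absent a mathematical breakthrough.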
The bitcoin value is pegged to this confidence. People are giving up gold and fiat currencies and putting their trust in human ignorance and failure. These are truly Interesting Times. But if it gets bankers interested in funding fundamental quantum physics, then I’m happy.
Clearly you don’t understand that HTTPS, end-to-end encryption, and at-rest storage encryption rely on the same underlying cryptographic algorithms (and HTTPS is end-to-end encryption, by the way). If you break one, you break them all. There’s no way to selectively break AES for device storage without also breaking it for online shopping or banking security. There’s no way to selectively make the Secure Enclave in an iPhone vulnerable to FBI intrusion without also making it vulnerable to everyone else. We already went through this idiotic notion of “unbreakable encryption for me but not for thee” in the 90s when cryptographic algorithms shipped in American products were federally prohibited from using keys that exceeded a certain bit length in products sold outside the US so that they’d be easier to surveil. That never came back to bite anyone in the ass. Nope.
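The shared-primitive point can be made concrete with a small sketch. Python’s standard library has no AES, so HMAC-SHA256 stands in here as the shared keyed primitive (everything below is illustrative, not any vendor’s actual design); real systems use AES or ChaCha20 the same way for both jobs.

```python
import hashlib
import hmac

# One and the same keyed primitive protects data at rest and data in
# transit; there is no separate "device-only" branch of the math that
# could be weakened in isolation.

key = bytes(32)  # placeholder key, illustration only

def protect(data: bytes) -> bytes:
    return hmac.new(key, data, hashlib.sha256).digest()

tag_at_rest = protect(b"photo stored on the phone")
tag_in_transit = protect(b"TLS record sent to the bank")
# Mandate a weakness in the primitive behind the first call and you
# have mandated it for the second call too.
```

That is why “break device storage but leave online banking alone” isn’t a policy knob anyone can turn: both calls go through the same math.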
People’s mobile devices are filled with a staggering amount of sensitive personal information, which is so highly sought after by criminal enterprises that Apple and other manufacturers have started designing their devices in such a way that even their rightful owners can’t retrieve data from them if they forget their passcode. This has on numerous occasions made people very angry when Apple can’t unlock their phone, or perhaps the phone of a deceased relative, but Apple considers the security of people’s personal information to be so important that they’re willing to inconvenience their own customers for the sake of protecting them from thieves. It also just so happens that there does not physically exist a process by which a phone can be told “this is the FBI” in a way that is unspoofable by people who aren’t the FBI.
They very clearly and plainly do want exactly that. Any ability to force one’s way into an encrypted system without the permission or knowledge of the parties involved is, by definition, a back door. They just hate it when security experts call it that, because people understandably dislike the notion of the FBI and NSA having the keys to every digital lock in the country. There is no way to permanently guarantee the security of this kind of crypto-breaking until the end of time. The knowledge that such a key exists will overwhelmingly attract efforts to break it or exploit weaknesses in the code supporting it by actual criminal enterprises. It will also hamper the ability of American products to compete in the rest of the world, where people are going to be even less excited about the notion of the NSA/FBI being able to break into their personal lives. The US government doesn’t allow contractors to buy many products from firms in China because there is a fear that the Chinese government has had those companies build exactly these kinds of back doors into their own devices, and it issues advisories to security-sensitive industries to avoid them as well. What you and Wray are proposing would put anything manufactured by a US company in the exact same boat for the exact same reasons.
Criminals also won’t have to roll their own devices or invent their own cryptography. There are already countless open-source applications and frameworks that can render content unbreakably encrypted, and plenty of devices onto which this software can be installed with trivial amounts of effort. Even if iOS 12 had some sort of magical government-mandated back door in it that could function in the way you and Wray are proposing, that would do nothing to stop applications like Signal (which are already in the App Store and thus extremely easy to adopt) from storing their own data using encryption that is not vulnerable to that back door. It would be like unlocking a safe to find a pile of documents written in Linear A.
It’s sort of mind-boggling that you think pitching this as an inherently insecure front door somehow sounds better. Who in their right mind would ever buy a front door that was designed to be unlocked without their permission? Would you buy a lock if you knew that the police department got a copy of your key?
If you think this sort of thing is safe to develop because it’ll only ever be used by government agencies who promise only to go after terrorists and human traffickers, you’re impossibly naive. Putting aside the mountain of evidence that the FBI and other three-letter agencies use every ounce of the power already in their possession to pursue and surveil law-abiding citizens and organizations that they happen to dislike (from Martin Luther King Jr. to mosques and Muslim community groups to the Black Lives Matter movement), do you remember when the TSA’s universal luggage key was shown on national television in sufficient detail for anyone to make their own copy? Even the NSA has had its hacking tools stolen and turned to malicious ends in the past. Keys that only work for the “good guys” do not exist and cannot exist, because mathematics works the same way for everyone, and mathematics is the foundation of digital cryptography. It is fundamentally impossible to design any lock that can easily be picked by the “right” unauthorized people without also making it vulnerable to attack from the wrong ones.
The government does not have the right to gain access to anything they please, nor should they. They cannot coerce testimony from defendants. They cannot force a defendant to make encoded or encrypted physical documents readable by providing the decryption cypher. They can’t even compel a defendant to provide the password to an unencrypted system. Why is this settled understanding of personal rights thrown out the window the moment a cell phone enters the picture? Law enforcement is supposed to be hard, because making it easy makes it easy for it to be abused. If the only evidence of a physical crime like human trafficking exists on an encrypted cell phone, frankly law enforcement has fallen down on the job. (And in all of these evidence-only-exists-on-a-cell-phone cases, how on earth did they get a warrant for the device and/or the person in the first place?)
Thanks for posting all that so I didn’t have to!
They are talking about all encryption. I’m assuming, for the moment, that you aren’t being disingenuous and are just ignorant and gullible, if you believe that. Cory is not being disingenuous; he just knows more about the subject than you. Any data in storage on a device that can connect to the internet is just as vulnerable as data in transit, and requires the same protections.
What you are describing is known as a “key escrow” system. These have been proposed for over 20 years, and, to date, even if you grant that the escrow has not been copied, there aren’t any that meet the basic requirements of guaranteed security and auditability. Either you have some specific, published algorithm/protocol in mind, one that has been public for at least a couple of years and doesn’t have any holes in it yet, or your assertion that this is possible shows that you are too ignorant to have an opinion on the subject. I don’t believe that this has been proven to be impossible, and it might well be possible, but it’s not the kind of thing where you can just say “nerd harder,” throw more money at it, and have any expectation of results at all, because we don’t know that it is possible.
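A toy model of why a key escrow concentrates risk (this is an illustration only, not any real escrow protocol; every name in it is made up, and the "encryption" is a deliberately fragile stand-in):

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# One master secret held by the escrow authority.
escrow_key = secrets.token_bytes(32)

def escrow_record(device_key: bytes) -> bytes:
    # Stand-in for "encrypt the device key to the escrow authority".
    # (Reusing one pad like this is itself insecure, which is part of
    # the point: the whole scheme is only as strong as its weakest rule.)
    return xor_bytes(device_key, escrow_key)

# A thousand devices dutifully escrow their keys...
vault = [escrow_record(secrets.token_bytes(32)) for _ in range(1000)]

# ...and whoever steals escrow_key recovers every one of them at once:
stolen = escrow_key
recovered = [xor_bytes(record, stolen) for record in vault]
```

The structural problem survives any upgrade to real cryptography: by construction, one secret (or one compromised authority) unlocks everything ever escrowed, which is exactly what makes the repository the highest-value target in the world.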
That leaves aside the fact that the key repository is the highest value target in the world. Everyone, including nation states, will be going after it. The government certainly can’t keep it safe. They recently had the Chinese hack the database with all of the detailed forms and interviews from everyone who has a security clearance (and a lot of people who don’t, including me, according to the letter they sent me). A trustworthy organization wouldn’t take that on because they would know they couldn’t keep it safe. Anyone in any industry (like mine) that has regular episodes of state-sponsored industrial espionage would be negligent if they didn’t use another layer of (working) encryption on top of that. You’d have to assume either that it had been compromised or that it would be (rendering all communications made with it retroactively compromised). Even more so for financial institutions. Individuals would make the same decision.
At this point, if you want to actually keep people from using their own encryption, you have to outlaw collections of random numbers. Engineers who need them for work would have to get special random-number permits. You’re SOL if you’re a hobbyist or working on an open source project. You’d have to be careful not to leave the record button on lest it catch some line noise.
If it’s illegal to send a message in a code, then you live in a totalitarian state.
I’m not sure why you’re bringing this up here, because nobody has a problem with this. It falls under “better police work,” not “mandating insecure backdoors that will be used for mass surveillance.”
Hopefully it would need to be “this is the FBI with a crypto-signed warrant”. Wait, hang on while the phone checks the GPS. Oops, “needs a warrant signed by a Canadian court”.
I suppose requiring physical possession to unlock could eliminate some of the international problems. (Like a phone in an embassy.)
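The “crypto-signed warrant” idea can at least be sketched with real one-time signatures. This toy uses a Lamport scheme built purely from hashes; the warrant text, device ID, and “court” are all hypothetical, and a real deployment would face every key-management and jurisdiction problem discussed above.

```python
import hashlib
import secrets

# Toy Lamport one-time signature: the phone ships with only the
# court's public key, so it can verify a warrant but never forge one.

def keygen():
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    pk = [[hashlib.sha256(half).digest() for half in pair] for pair in sk]
    return sk, pk

def message_bits(msg: bytes):
    digest = hashlib.sha256(msg).digest()
    return [(byte >> shift) & 1 for byte in digest for shift in range(8)]

def sign(sk, msg: bytes):
    # Reveal one secret half per bit of the message digest.
    return [sk[i][bit] for i, bit in enumerate(message_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(hashlib.sha256(reveal).digest() == pk[i][bit]
               for i, (bit, reveal) in enumerate(zip(message_bits(msg), sig)))

court_sk, court_pk = keygen()   # the phone ships with court_pk only
warrant = b"unlock device 1234, expires 2018-02-01"
sig = sign(court_sk, warrant)

assert verify(court_pk, warrant, sig)                    # genuine warrant
assert not verify(court_pk, warrant + b" forever", sig)  # tampered warrant
```

Note that the math is the easy part here; who holds the signing keys, which courts count, and what happens when a signing key is stolen are precisely the unsolved parts.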
You’re absolutely right, but you’re addressing a whole lot of problems that don’t apply to the approach I’m advocating for. I’m clearly atop Mount Stupid here and shouldn’t have said anything, because I don’t possess the ability to properly convey what I mean while maintaining the abstraction necessary to not be fired, let alone avoiding the impression that I’m suggesting everyone simply email Chris Wray their passcode.
I’ll give it one more try. There is a way to handle a very specific subset of these cases which should be acceptable to everyone. It’s counter-intuitive, but it wouldn’t introduce vulnerabilities, wouldn’t be usable for surveillance, and wouldn’t be susceptible to malfeasance from criminals or law enforcement. It may turn out that there’s some other reason it’s not viable, I don’t know, but if it pans out, it’ll be published and I’ll flag it here.
No, but I would expect the police to break the door down if I’m being assaulted on the other side.
By identifying them in photos recovered during other investigations or that were uploaded by the perpetrator. To be clear, I’m talking about child exploitation here, and it’s not that they can only be prosecuted with evidence on their phone, it’s that it leads to the identification of more victims. Those details then get handed over to law enforcement, they find them and come back with even more information. Twenty years ago it would have been physical photos, address books, and diaries. Now those are all digital, and increasingly inaccessible.
I largely agree with you overall. I like privacy. I think warrantless surveillance is bad. I fucking love Marcy Wheeler. I donate to the EFF, and I have a repeatedly transplanted “come back with a warrant” sticker, front and center on my laptop, but I intend to continue trying to preserve one of the most effective methods of helping victims, because the scale of the problem is unfathomably large and continuing to grow.
The person there who I’ve spent the longest talking to about it told me that wasn’t the case. It’s possible I misunderstood what they meant, or that they were speaking for themselves rather than more generally, or maybe they’re just full of shit. Chalk it up to ignorance, naivete, or gullibility as you see fit, but please also consider the possibility that they aren’t all goose-stepping liars and what they say might actually be true.
It was intended to demonstrate that they aren’t simply sitting there twirling their authoritarian mustaches. Yes, some of the history is messed up, and yes, some people there today don’t act that differently now, but the vast majority are good people who take privacy far more seriously than anyone I’ve worked for.
Now if you’ll excuse me, I have a mountain to fall down.