Wishful thinking versus terrorism: why crypto backdoors are a dumb idea

[Read the post]

2 Likes

I sort of wonder why the gov’t is always proposing a half-measure (probably because they are disingenuous/stupid).

If we want “front-door” exceptional access, and want it to be compatible with our legal system, then it’s simple: give judges cryptography. Tech companies can build in encrypted, front-door access and encrypt the keys such that only the relevant judges can decrypt them. A judge would approve a surveillance order by decrypting the relevant keys and providing them to law enforcement. Keys would be per-user and rotated regularly (say, every 24 hours) so that any key in the hands of law enforcement has a strongly defined scope of application: access granted for one purpose can’t be reused for anything else.
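A toy sketch of that scheme, with everything hypothetical: HMAC stands in for a real key-derivation function, a hash-based XOR stream stands in for real authenticated encryption, and the “judge” is just whoever holds the master secret. None of this is secure as written; it only illustrates how a leaked or subpoenaed key stays scoped to one user and one day.

```python
import hashlib
import hmac
from datetime import date

def daily_key(master_secret: bytes, user_id: str, day: date) -> bytes:
    """Derive an independent per-user, per-day key; handing over one
    day's key reveals nothing about any other day's key."""
    label = f"{user_id}:{day.isoformat()}".encode()
    return hmac.new(master_secret, label, hashlib.sha256).digest()

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR with a hash-derived keystream -- a stand-in for real AEAD."""
    stream = b""
    counter = 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(p ^ s for p, s in zip(plaintext, stream))

toy_decrypt = toy_encrypt  # XOR is its own inverse

# The "judge" re-derives only the one key named in the order.
master = b"judge-held secret"
k_day1 = daily_key(master, "alice", date(2015, 7, 1))
msg = toy_encrypt(k_day1, b"meet at noon")

# Law enforcement receives only k_day1: it opens that day's traffic...
assert toy_decrypt(k_day1, msg) == b"meet at noon"
# ...while a different day's key is simply a different key.
k_day2 = daily_key(master, "alice", date(2015, 7, 2))
assert k_day1 != k_day2
```

The design point is that each key is derived independently from the scope it covers, so there is nothing like a master decryption key sitting in the devices themselves.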

hard part: getting judges to understand encryption

1 Like

There’s also the fact that judges in general kind of suck, and there have been enough cases of judicial misconduct that I wouldn’t want a judge to be able to spy on that person he’s been stalking.

1 Like

Wait.

Wouldn’t that mean that every encrypted file would have to be re-encrypted daily? That’s a lot of wasted CPU power.

And what’s to stop the real criminals from using not-compromised encryption?

And how do we stop the government from setting up something like the FISA court and having those judges just give out the keys to decrypt everything, like they’re currently giving permission to surveil everything?

1 Like

Hard part: Getting me to use your broken encryption system.
You can have my general-purpose computer when you pry it out of my cold, dead hands.

5 Likes

Hopefully, the hard part is making laws that would restrict my freedom to execute a mathematical function on my own computer. That’s pretty fucking far into thoughtcrime territory any way you slice it.

3 Likes

Ugh. Politics can’t be programmed. Just ask anyone who has ever written “simple datetime handling” code that needs to handle arbitrary timezones (and unexpected changes to them), daylight saving time (and unexpected changes to that), and the various other political weirdness in our calendars and clocks. And that’s a simple example.

For a real exercise, try creating an ecommerce system that calculates and applies every bizarro tax rule from the entire world properly (and will continue to do so even though many of them have changed in the time it took to write this sentence).

Computers just aren’t designed to handle things that are inherently irrational, illogical and erratic.

3 Likes

The judge wouldn’t be able to unless they had the encrypted data. A judge isn’t going to subpoena Apple for iMessages personally, a prosecutor would do that. In this scenario a prosecutor would subpoena whatever encrypted data they wanted, and the judge would provide the key capable of decrypting data within the scope of the subpoena. A judge is unlikely to issue a bogus subpoena because it would be legally odd and invite scrutiny.

And yes, some [any class of human]s are terrible and shouldn’t be trusted. Babies/bathwater etc.

1 Like

That has nothing to do with this. Nothing about this says what you can or cannot execute on your own machine.

1 Like

I don’t think so. You’d just have a table indicating which key was in use during which time period; the date on a file tells you which key you need. So if a company were subpoenaed for a specific user’s encrypted data in a specific time frame, it would provide the prosecutor with that encrypted data, and a judge would supply the keys that decrypt only that data. In this way, access to the data is controlled specifically by the judge. Data outside the scope of the subpoena would be useless, and compromising any one key would be fairly useless too, since it would decrypt only a few things, and you may have no idea what those even are.

There would be no “master key” that could decrypt any iMessage.
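A minimal sketch of that lookup table (all names hypothetical): nothing is ever re-encrypted. Each file simply carries the date it was written, and the schedule maps dates to the key identifier that was in force, so a subpoena scoped to one day yields exactly one key.

```python
from datetime import date

# Hypothetical key schedule: key id -> validity window (start, end inclusive).
key_schedule = {
    "k-2015-07-01": (date(2015, 7, 1), date(2015, 7, 1)),
    "k-2015-07-02": (date(2015, 7, 2), date(2015, 7, 2)),
}

def key_id_for(stamp: date) -> str:
    """Find which key was in force when the file was written."""
    for key_id, (start, end) in key_schedule.items():
        if start <= stamp <= end:
            return key_id
    raise KeyError(f"no key covers {stamp}")

# A subpoena scoped to July 1 names exactly one key; files stamped
# July 2 stay opaque because their key was never handed over.
assert key_id_for(date(2015, 7, 1)) == "k-2015-07-01"
assert key_id_for(date(2015, 7, 2)) == "k-2015-07-02"
```

No CPU is spent re-encrypting old data when the key rotates; rotation only changes which key new data is sealed under.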

And what’s to stop the real criminals from using not-compromised encryption?

Nothing.

And how do we stop the government from setting up something like the FISA court and having those judges just give out the keys to decrypt everything, like they’re currently giving permission to surveil everything?

Probably the same way we keep the gov’t from doing all kinds of bad things, by voting and democracy and stuff.

Also of course, under my dumb-little-scheme, there is no key which would decrypt everything. Any one key would only decrypt a small amount of a specific kind of data from a single person.

I doubt this would ever happen, though; it’s too complex a system.

However it is possible, just very hard.

And while I’m very “fuck you” about helping law enforcement do their jobs, I do think there are some frightening implications of this new world of ubiquitous encryption.

I think the idea that privacy should be total, that everything should be encrypted and anonymous, is frightening. Currently we catch criminals when they make “mistakes”. They leave a receipt behind in their home, keep unencrypted digital files on their computer, or don’t delete SMS messages. The current trend seems to be moving towards all that going away, though. Your instant messages are already mostly encrypted, invisibly and by default; paper receipts can’t be long for this world; bitcoin or something like it seems inevitable; encrypted-by-default email has to come soon; and Tor will continue on, getting faster and easier to use every day.

I am similarly frightened by synthetic biology, though. It gets easier and easier every day for someone to make a truly horrific and utterly devastating new organism. All it takes to get an organism to do something is the right DNA sequence and a way to make that molecule. Soon enough we will have desktop machines that can synthesize any DNA sequence you want, and once you have that, you’re in very dangerous territory.

1 Like

If I can execute whatever algorithm I want on my own machine, then there’s no way to make me use the government’s encryption, with its judge-controlled back door, instead of my own secure encryption or an open-source implementation somebody else made.

You tell me: how can you make people use the government’s broken encryption instead of encryption that actually works, without making laws about what algorithms you’re allowed to run on your own machine?

2 Likes

Why, then, would the key that a judge happens to hold be relevant? I will, of course, take care to execute code on my own machine that encrypts my data with a key that has not been given to a judge somewhere.

Well, first, there are exactly two alternatives:

  1. Allow people to use encryption.
  2. Outlaw general-purpose computers.

And if you decide on option 2, remember that decrypting private communication nowadays gives police, secret services, etc. access to far more information about private lives than they ever had before. Even more than the Stasi of East Germany ever had. Which is too much.

But don’t panic. People used to be afraid of the idea that citizens could freely voice political and religious opinions, even if they were critical of the king or contrary to official doctrine! It would lead to chaos and a breakdown of public order and of morals! The idea that every citizen should vote and be part of important decisions was considered even more dangerous.

I see your point. Federal wiretapping laws required telephone operators to make their systems amenable to wiretapping, and I think the assumption is that software companies would need to do the same thing. So while you can use your own software if you want, commercial software would have to be amenable to this kind of system.

How could we make people do it? The same way we make people do everything: a law.

It is, for instance, already illegal to send encrypted data over ham radio. A similar law could be made for any other kind of wireless data transmission. I can’t MAKE you use it, but I can certainly encourage you to.

Also, I’ll just say, I am not totally on board with this idea that “I should be able to run whatever software I want.” Software now is still relatively harmless. In the future, script kiddies will write software that creates new viruses, nano-tech weapons, and invasive species. Should it be legal for someone to run “an algorithm” that designs a genome for a strain of herbicide-resistant honeysuckle that makes nerve gas? What about a strain of yeast that makes nerve gas? It would take only a few hundred liters to kill everything on earth with a nervous system.

Given the crumbling walls between the digital and physical worlds, wanting everyone to have a computer which runs any program is like wanting everyone to have a replicator which can make any object.

The stakes on this kind of problem are getting higher by the day.

They aren’t thinking of making it illegal for YOU to use encryption; they are going to make it illegal for Apple to use encryption.

And eventually we will have to outlaw general-purpose computers; they will become too powerful. We worry today about software that can steal our personal data, but 3D-printed guns are nothing compared to 3D-printed drug-resistant smallpox. What about herbicide-resistant poison ivy that makes a nerve agent instead of that itchy urushiol? What about a virus that wipes out this year’s corn crop?

As computers get more and more powerful, we’re going to have to think hard about how much power one person should have. Failure to do so displays a lack of vision.

In turn, you can make countermeasures - vaccines that target the specific DNA/RNA strand in the pathogen, for example.

Once you have fast sequencing and analysis together with fast “DNA printing”, there are no large-scale risks. Either the pathogen in question is fast enough that it limits its own proliferation by killing its hosts before it can spread much (Ebola-class), or it is slow enough that countermeasures can be deployed before it does much damage.

Germs are limited to a minimum attack speed measured in days, or at least tens of hours. Once the response can be measured in hours, and the technology is in every provincial hospital, the risks become rather limited in scope.

Then there’s the issue of deploying it all at once over a large area while staying undetected. Chemical and biological weapons, for all their scariness, are rather difficult to weaponize and successfully deploy. A great deal of taxpayer money has been spent on this set of problems, with quite lousy results and a low return on investment.

I WANT ONE!!!
And I am running a set of projects with that as a long-term goal.

Conventional antivirals, perhaps. Tailored inhibitory RNA that stops the virus itself? Same technology, other side of the coin. We must not allow only a select few to have this - we need it everywhere for rapid response to threats both man-made and natural.

No. Chance. In. Hell. To. Enforce. That.

I am not the only one who would not tolerate being denied something of such power. Far from the only one.

The potential of general-purpose machines, whether for computing or for manufacturing/synthesis, is too high to just deny to the people. And once you have even one self-copying machine, everybody can have one. And given their potential, people will want them, and there’s no force in the universe that can stop people from getting what they want if enough of them want it. (See e.g. porn.)

ALL of it. For everybody.

2 Likes

Straight-up totalitarianism, right there. No other word for it. That’s the kind of shit I would expect to hear from the Chinese government. All of those possibilities you mentioned are (a) far in the future, and (b) not inherent in general-purpose computers.

What you’re saying is like saying that because someone could theoretically use a general-purpose computer to program an autonomous gun to kill people, we should outlaw the computer part of that system rather than the gun part.

Mandatory backdoors are a fundamental violation of free speech, and are essentially the same as suggesting that it should be illegal for two people to have a conversation in a location where it isn’t recorded by the government.

There is also the implicit assumption that the individuals who make up the government, who are largely self-selected for wanting power over other people, are somehow more trustworthy and less likely to abuse power than the people subjected to their governance, which only a fool would believe.

I find it incredibly offensive that you feel you’re in a position to decide who can read my email, what kind of math I should be allowed to study, and what kind of algorithms I can run on my computer. The instant my algorithm runs on the computer of someone who doesn’t want it running there, it legitimately becomes a crime; but if I and my friends want to do some applied math on our own computers, saying we can’t is a fundamental violation of free speech and free expression. It scares the shit out of me that, even in America, there are so many people who feel the way you do, because if people like you ever get in control, we’ll all wish we were living in the Soviet Union of bygone days.

4 Likes

And that’s why we must keep developing our communication (and manufacturing, and all other) technologies, and keep them open source - because such people inevitably seek positions of power and then want to grab our stuff, make us register every fart, and demand a license for every shit we take. We must have enough tech to make and keep them irrelevant, without ourselves incurring undue risks.

1 Like

Governments with that much power are enormously more terrifying to any sensible person than a few random screwball terrorists. The worst that terrorists can really do is occasionally get it together to become a government with enormous power over the people they rule.

1 Like

That.

But how many voters are sensible? Look at how castrated chemistry sets have become over the years; we cannot afford to let the same fate extend to other technologies, present or future.

Or that they catalyze the government into becoming de facto terrorists.
With the voters, bullshitted that it’s for their “safety”, applauding.

1 Like

I’m not deciding anything. But computers are becoming exponentially more powerful.

The whole point of society and governance is to manage how much power people are allowed to have. It’s nonsensical to regulate things like weapons and chemicals and not computers.

I don’t know how to do it, but at some point the capabilities of a general-purpose computer will be more dangerous than you realize. The idea that we should regulate some dangerous things but not others is silly.

You just have to ask yourself if you want to live in a world where a 3D printer can make ANY FORM OF MATTER.

Before you know it, a white supremacist will be cooking up a virus that only kills black people. I see absolutely no way to stop them.

Also, of course, it’s already illegal to do certain things with a computer. You can do a lot of damage with a computer as it is, but you have to be smart. It gets easier every day to do shitty things, though, and at some point those shitty things will start getting a lot of people killed.

Should anyone be allowed to run a super-intelligent AI on their smartphone?