Deepfakes that hurt people are already illegal, so let's stop trying to rush out ill-considered legislation

Originally published at: https://boingboing.net/2018/02/13/there-ive-done-something.html

6 Likes

Deepfakes

Good ’60s psychedelic cover band name.

5 Likes

The art of politics is to find a parade, then get in front of it, said somebody very clever. Therefore there will always be new laws about the latest outrage, and they will always be ill-conceived rush jobs, keeping lawyers fully employed.

4 Likes

“Leadership involves finding a parade and getting in front of it.” ~ John Naisbitt

“Like the ones they have in France.” ~ Cadet Bone Spurs

10 Likes

Are you not familiar with the way post-Reagan US lawmaking bodies work, Cory?

There are carefully drafted laws that corporations want, that screw everyone else, that get passed with little fanfare and much campaign contribution.

And there are ill-considered laws passed with much sound and fury in rapid response to memes and false urgencies created by popular media, that either do nothing that wasn’t already adequately dealt with under existing law, or do something unintended and awful.

7 Likes

Yea, that guy.

1 Like

Don’t expect grandstanding and panic-mongering “series of tubes” types in Congress to take your advice, Cory.

2 Likes

I guess a possible positive in this is that it would be very hard, if not impossible, to enforce a law banning the use of machine learning to face-swap video. We’ve always had this ability through many different methods; it’s only this one method that’s made it easy.

We also already had laws against distracted driving before there were cell phones, but no one enforced them, and now we have laws specifically against texting while driving.

And some of those same laws you mention have been of little use to victims of doxxing, SWATting, and the like.

And the anti-murder laws have not been a great help to those killed by police.

I’m not disagreeing that it’s rushed or ill-considered, but sometimes re-emphasizing in law that new ways of doing bad things are still illegal can be useful and necessary.

8 Likes

I’d disagree that there is a good legal way to combat deepfakes, in the same way there isn’t a good legal way to fight revenge porn. (Vice did a couple of great stories about it: one video, and another article about a lawyer who was fighting for the legislation because of her own inability to get justice.)

Just to review the options presented by the EFF:

Criminal:

  • Extortion/Harassment: This only applies if the person who created the deepfake is extorting or harassing someone. Creating one and then releasing it on the web without that intent (doing it just because the subject was “hot”) would make this no longer apply. Also, harassment law has done a poor job of covering revenge porn to date.

Civil:

  • False Light + Defamation: Would only apply if the creator tries to pass it off as real. From my admittedly limited understanding, the “Deepfake” community is open about them being fakes. The video being redistributed as real by others would still absolve the original creator on that criterion. Please correct me if I am wrong there.
  • Intentional Infliction of Emotional Distress (IIED): The key word is “Intentional”. Intent can be hard to prove (see “did it just because they were hot” above). Emotional distress may have been a consequence, but if the intent isn’t there, the creator could escape this one too.
  • Copyright/Publicity rights: Can remove the content and cause some monetary impact, but chasing down every distributed copy individually is burdensome, and it would have to be done by the copyright holder of the covered content, not the victim. In the case of publicity rights, going after the creator can only provide limited relief based on the worth of a person’s publicity.

In society we generally hold crimes of a sexual nature to be more heinous than their non-sexual counterparts: we differentiate assault from rape, physical abuse from sexual abuse.

Likewise, these remedies prove insufficient given the gravity of the violation deepfakes represent; they primarily offer financial compensation. They do nothing to address the sense of violation a victim would feel seeing themselves in the context of a sexual video. I can’t imagine the mental disconnect involved in watching yourself, with an unfamiliar body, having sex with someone else. Winning restitution on publicity-rights grounds does nothing to soothe that sense of violation. You wouldn’t feel vindicated, or feel that justice had been served, because a copyright had been upheld.

I would argue that, just as defamation, harassment, IIED and copyright laws do not address the core problems of revenge porn, the laws in place will fail to address deepfakes. Therefore criminal legislation is needed to provide a proper sentence for those who would violate others sexually in this manner.

While this is primarily a problem for celebrities right now, as you move down the actor chain (working actors, people with acting side gigs like a former roommate of mine) you reach a much more vulnerable population. It is only a matter of time until this technology spreads to private individuals, given how ubiquitous the ability to record high-quality video of a person has become.

3 Likes

Maybe this sudden “Deepfakes” coverage is part of a plan to raise public awareness of face-swapped porn videos so that there is plausible deniability when the Trump piss tapes are released.

4 Likes


Not long now until we can’t trust anything online anymore.

1 Like

I’m gonna grab Already Illegal if that’s cool.

2 Likes

I’m not sure how deepfakes differ from penned caricatures. Believability? I have limited belief in any outrageous online video.

But I’m probably not the person to ask, since I’m pretty much face-blind. Replacing the face of someone I don’t recognise with another face I don’t recognise doesn’t do much for me. Side by side I can tell the difference; it is a fine technical achievement, but lacking in emotional impact.

This topic was automatically closed after 5 days. New replies are no longer allowed.