No Fakes Act would limit replicating voice or likeness of individuals

Originally published at: No Fakes act would limit replicating voice or likeness of individuals | Boing Boing

6 Likes

No doubt the Republicans will oppose this. They can’t afford to give up a new cheating method.

7 Likes

Trump has a long history of threatening to sue impersonators even in cases clearly protected as parody. He’d blow his freaking top if someone cut together a video using an AI-generated version of his voice without permission.

8 Likes

The No Fakes Act would prohibit the use of AI to replicate the image, voice, and visual likeness of individuals without permission

Too late, folks. Remember the old saw about putting the genie back in the bottle?

6 Likes

Folks who put out fake videos in order to fool the public into thinking the real person said or did those things… they should go to jail. There needs to be an actual penalty for this kind of deception.

Nip it in the bud; otherwise election campaigns will show the opponent stomping on kittens.

5 Likes

This seems like kind of a dangerous exception. I can think of a pretty hefty handful of “news” organizations that could and would do significant damage with AI replicas…

So is this just about making sure no one releases a posthumous “Rolling Stones” album commercially?

6 Likes

Shouldn’t that be covered by fraud laws? Or perhaps forgery laws?

4 Likes

Good. I’m starting to get bored of this meme

1 Like

DO. IT.

2 Likes

You can do it if you’re clever enough to trick the genie? I mean, that’s what happens in One Thousand and One Nights.

1 Like

I suspect that the viability would vary pretty sharply.

Against people with primarily noncommercial motives, the viability is likely to be pretty limited: if you want to bot out a politician saying something unpopular or spread a faked sex tape or whatnot, the objective is to do damage not readily compensated in cash, and often to do so more or less covertly so you can potentially avoid the suit entirely.

Commercial use, though, normally means making yourself visible and identifiable in order to sell whatever the product is; and stiffing your voice actors and vocalists by replacing them with bots involves doing damage that’s more readily boiled down to a dollar value (both for the purposes of the potential suit and for the purposes of the agreement you could otherwise strike with the supplier in question).

People in the second category seem much more likely to be influenced: the entitlement to damages isn’t quite as toothy as you’d get with an unauthorized derivative work (you are entitled to seek damages, but there’s nothing about being able to halt distribution); but it’s still a real rights clearance issue for someone looking to put together an album, a movie, a video game, or whatnot and actually make money selling it.

This topic was automatically closed after 5 days. New replies are no longer allowed.