Originally published at: Man who used AI to create CSAM jailed for 18 years - Boing Boing
…
They should program these AI programs to flag certain prompts, and then when triggered it says, “Yeah, sure, I’ll get right on that, buddy. It’s going to take a while to generate.” while calling the cops.
This use of AI is repugnant but I’m not sure I want an AI to decide when to call the cops…
The AI could report to an independent body much like banks do for suspicious financial transactions. Then humans could assess whether the possibly offending material meets a threshold for reporting to police.
In general I agree with your statement.
I think I might be willing to make an exception with this specific scenario…
Yes, something should be done - by people - not code.
18 years. That’s a shame.
Speaking from experience, this is one of the worst jobs one could give a human on the web.
Now if only they could prosecute the people doing this with pictures of unconsenting adults, too. It might not be so inherently, viscerally horrific, but it’s being used as a pretty ugly tool to harass real people, often with no consequences.
Also: how the hell did these get created in the first place? I don’t see how you create these kinds of images without a training set of CSAM, which becomes its own crime that’s even more worthy of prosecution.
In theory, these systems were supposedly set up so that generating CSAM images was impossible in the first place. Yet someone obviously got around those limitations, so I don’t see this working any better, unfortunately.
Maybe he was just doing research. You know, like Pete Townshend?
Agreed, speaking from some experience, but someone has to do it. I’d say mandatory debriefs with colleagues, mandatory therapist visits, mandatory punching bag sessions in the gym, mandatory screaming sessions, and limited rotations in that role.
Frick, now I’m tearing up. Pass me a shot of that please.
I’m sure he would have preferred under 18 years.
This topic was automatically closed after 5 days. New replies are no longer allowed.