Google turns in child porn owner who used its cloud services

[Permalink]

So if they don’t examine the pictures, they have to prosecute because of “a picture of a child at bathtime.” Actually, the algorithm probably produces false positives where the subject is actually an adult - or even a pumpkin.

If they do examine the pictures, they’re “inspect[ing] our naked children for the authorities’ consideration.” Bad either way. I guess they better terminate the algorithm. Wait, no, that would be facilitating child pornography.

Is the only win to simply never host anybody’s photos?

8 Likes

Well, I guess that helps elucidate their privacy policy, now doesn’t it?

8 Likes

Girl at party: “So, what do you do?”
Guy at party: “I work at Google.”
Girl: “Wow, that’s so cool! What do you do at Google?”
Guy: “I’m a Kiddie Porn Inspector. I pretty much just look at pictures of naked children all day.”
Girl: “Oh, I uh, think I see my friend over there! Nice meeting you!”

9 Likes

the fallacy here is the assumption that Google’s kiddie porn inspectors are socially adept enough to either go to parties or talk to women.

4 Likes

I read an interesting article about those guys once, but unfortunately I can’t find it right now. It’s a pretty nasty job, and Google came across rather badly. Apparently they’re usually not prepared for what awaits them and burn out very fast.

6 Likes

From what I read, it looks like they use a hashing algorithm to compare pictures against known kiddie porn pictures. When a positive comes up (which more or less guarantees that a picture identical to a known kiddie porn picture is present), they go in and actually take a look at the account.

I could be wrong, but that sounds to me like what happened here.
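In rough Python terms, that kind of exact-match check might look like the sketch below - a minimal illustration assuming a plain file hash and a distributed list of known-bad hashes. The hash list, the placeholder values, and the use of SHA-256 are my assumptions; the real system is reportedly more sophisticated than a straight file hash.

```python
import hashlib

# Hypothetical list of fingerprints of already-known illegal images,
# distributed by law enforcement. The values here are placeholders.
KNOWN_BAD_HASHES = {
    "<sha256 of known image #1>",
    "<sha256 of known image #2>",
}

def file_hash(path):
    """Plain SHA-256 of the file's bytes: an exact-match fingerprint."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def should_flag_for_review(path):
    # A hit only means "byte-identical to a known image";
    # a human still looks at the account before anything is reported.
    return file_hash(path) in KNOWN_BAD_HASHES
```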

6 Likes

So they have people who scan for inappropriate images, and yet they thought it was fine to leave up the photo of the murdered kid by the tracks? Oh, that’s right, they put up that photo. Offensive things are only originated by others.

3 Likes

I’m pretty sure it works similarly to Google’s “reverse-image search,” so it doesn’t only find exact matches, it finds similar pictures as well. And depending on what exactly the algorithm picks out, that could mean any nudity.
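If it really is similarity-based rather than exact-match, it would be something in the spirit of a perceptual hash. Here’s a toy “difference hash” in Python using Pillow, purely as an illustration of the idea - Google’s actual algorithm isn’t public, and the threshold below is an arbitrary example.

```python
from PIL import Image  # pip install Pillow

def dhash(path, size=8):
    """Toy 'difference hash': shrink, grayscale, compare neighboring pixels.
    Similar-looking images produce hashes that differ in only a few bits."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of bits in which two hashes differ."""
    return bin(a ^ b).count("1")

def looks_similar(path_a, path_b, threshold=10):
    # Small Hamming distance => probably the same picture,
    # just resized, recompressed, or lightly edited.
    return hamming(dhash(path_a), dhash(path_b)) <= threshold
```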

1 Like

A better question is why he copied it to his phone in the first place.
Is it so hard for him to stay away from his porn stash that he needs it on the road?!

1 Like

If he’s the kind of guy who’s collected three thousand images… then yeah, it’s not hard to imagine he wants to have it with him at all times.

I am also certain that is correct. The system is automated and would flag images matching the digital fingerprint (hash) of those that are already known to be illegal.

2 Likes

I had a friend whose summer job was as a customs agent at the Vancouver airport (in Canada lots of them are students for some reason). Because he was the new guy, one of his jobs was to watch CDs that people were bringing in (this was circa 2000).

Nothing could possibly make a person less interested in sex after work than sitting and watching porn on fast forward for 8 hours. He had to make a determination whether it was obscenity (according to Canadian laws) and/or child porn. Sometimes he found some, which was never a good thing.

1 Like

Could have been an Android phone which (at least for some setups) syncs semi-automatically.

The FBI says the investigation began in March when Google’s hashing technology found two child porn pictures in his Picasa library.

Certainly sounds like it. When I worked for a hosting service about 15 years ago, we received a hash list (shortened) and sample images (harmless ones) from a branch of the German police.

I don’t remember what algorithms were being used - it could have been home-grown; we knew it was a project started by one investigator on his own before it gained traction.

Anyway, the idea was the same: Propagate a hash list of (supposedly) known child porn and have providers check against those.

The hash (back then) was easy to fool - just toggling a few bytes was enough - but we were told that most child porn peddlers either didn’t bother with or didn’t know about such obfuscation.

I assume that 15 years later, the growth in computing power allows for fuzzier hashing.
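For anyone curious why byte-toggling worked: an exact cryptographic hash changes completely when a single bit changes, so a hash list only catches byte-identical copies. Illustrative Python below (not whatever the police project actually used; the byte string is a stand-in for real file contents).

```python
import hashlib

original = b"...image bytes..."  # stand-in for the real file contents
# Flip a single bit in the last byte -- visually the image is unchanged.
tampered = original[:-1] + bytes([original[-1] ^ 0x01])

print(hashlib.md5(original).hexdigest())
print(hashlib.md5(tampered).hexdigest())
# The two digests share nothing, so an exact hash list no longer matches.
# Perceptual ("fuzzy") hashes are designed to survive exactly this kind of edit.
```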

3 Likes

Facebook, Yahoo, all those sites have people who review flagged content and they see the worst of the worst digital humanity has to offer. As I recall, the Facebook employees are offshore and get paid very little.

I couldn’t do it. I would rather work at a fast food place than see CP, animal abuse and suicides all day long.

7 Likes

Because really three thousand images isn’t much if you are talking about a fairly mundane and vanilla porn collection. Or so I’ve heard…

Anymore, pretty much. If you leave it alone, depending on the jurisdiction, you can be found complicit. And if you just leave it to an algorithm… I know, I know, Google has some top people from AI and other fields working on such things, but it’s not perfect. It’s still hard to detect a face, let alone whether there’s a child being molested in a photo. Hence, someone has the terrible job of checking out those terrible photos.

EDIT: Besides, once we get an AI that can correctly determine whether an image is of an underage individual, whether the child is being displayed in a pornographic manner or just taking a bath, and whether the child is the victim of abuse, that’ll be when we have to have court cases on whether our digital assistants are enslaved and whether they should enjoy basic human rights.

I don’t remember the name of the op-ed, but I remember reading one written by a Facebook mod who gets to see every single post where someone clicks “Report”. It sounds absolutely soul-crushing: there are so many reports of people posting photos and videos of child porn, rape, beatdowns, executions, murders, suicides, and so on. And then there are the people who hit Report because an atheist friend posted something they didn’t like, or because they want to report the horrible misogynist who posted a picture of a woman in a bikini - and the mods have to deal with the rage that crap induces on top of looking at horrible criminal activity all day.

1 Like

It’s only a matter of time before one of these inspectors is caught hoarding thousands of CP images they came across on the job.

1 Like

Google automatically scanning my photos for images known to be objectionable to the authorities so that they can turn me in really does not make me want to store my photos with them. What about when it’s something else besides child porn that the FBI doesn’t like?

5 Likes