YouTube pushes children's videos to pedophiles through content recommendation engine

I think we had this issue crop up last year on BB. The problem is, the YouTube algorithm that says, “Hey, Mister.44 likes this Toy Galaxy channel, maybe he would like this Comic Tropes channel,” can be used to shuffle otherwise innocent videos to people with less than noble intent. It finds a type you “like” and gives you more. Now, for comics and toys and stuff about science, no one cares. Keep it coming. If it’s just little girls running around doing stuff that creepers are looking at, then it is uber icky and wrong. And they have a network and a system where they share links (which means we can never stop this, but could reduce it), and the comments have code words and even time stamps so you can freeze-frame where something innocent looks less so.

You know how one can get focused on a topic and head down a rabbit hole? Well, last year a video was making the rounds showing how quickly and easily one can head down a rabbit hole like this before all the suggested videos are of the same type.
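The mechanics are depressingly simple. Here’s a minimal sketch of the “viewers who watched X also watched Y” idea, just to make the rabbit-hole effect concrete - this illustrates co-viewing recommendation in general, not YouTube’s actual (far more complicated) system, and the video IDs are made up:

```python
# Minimal sketch of "viewers who watched X also watched Y".
# An illustration of the co-viewing idea, NOT YouTube's real algorithm.
from collections import Counter, defaultdict
from itertools import permutations

# Hypothetical watch histories: user -> list of video IDs.
histories = {
    "user_a": ["toy_galaxy_ep1", "comic_tropes_ep3"],
    "user_b": ["toy_galaxy_ep1", "comic_tropes_ep3", "comic_tropes_ep4"],
    "user_c": ["toy_galaxy_ep1", "gymnastics_clip_7"],
}

# Count how often two videos appear in the same watch history.
co_views = defaultdict(Counter)
for watched in histories.values():
    for a, b in permutations(set(watched), 2):
        co_views[a][b] += 1

def recommend(video_id, n=3):
    """Return the videos most often co-viewed with video_id."""
    return [v for v, _ in co_views[video_id].most_common(n)]

print(recommend("toy_galaxy_ep1"))
# ['comic_tropes_ep3', ...]
```

Notice that nothing in there looks at what the videos actually contain; whatever a cluster of viewers has in common, innocent or not, gets surfaced to the next similar viewer.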

One can absolutely reduce this on youtube, but it takes time, moderation, and actually being proactive. Things that YT has not been great at in the past. But I’d think just ending suggested feeds for certain types of stuff would be a start.

I wonder if this will affect the more “legit” YouTube kids channels. I know some of the biggest channels feature kids doing unboxings or toy reviews. I have no idea if they are being exploited, but if YT kills the automatic suggestion of videos of kids playing, wouldn’t it nuke those channels too? Maybe it should - let your kids be kids instead of working as YouTubers? Eh, that probably isn’t fair; I bet opening toys is fun. I dunno. Like I said, YT has to give a rat’s ass and work on it constantly.

I guess this is probably one reason why my ex-wife doesn’t have video of the kiddo on there except for some stuff viewable by direct link only. I agree with that precaution. She just turned 13 two days ago, so I probably have a year or two before she will be navigating social media. Is there a dread emoji?

Seems like we can never find a happy medium.

5 Likes

We (in the widest sense) allow this to happen. There are all kinds of laws and regulations about what broadcasters and publishers are allowed to do and what responsibilities they must take on. But when Google, Facebook or whoever mumble something about algorithms, computer models and vast numbers of users, they’re allowed to shun all responsibility. Legislation and politics have not kept up, and we’re allowing this to happen in our own societies while tech giants get rich but say they can’t afford to police themselves.

YT could stop the problems described immediately. Right now. But it chooses not to, because it might hurt revenue, either directly from attracting pedophiles to the site or indirectly because the countermeasures might not be quite specific enough. This is a choice made by YT executives: how expensive is our pedophile problem vs. the cost of fixing it? We need more legislation, and better application of existing laws. New media have shown themselves completely incapable or unwilling to police themselves. (The excitement and promise of the Internet of the ’90s seems naive and sullied - how sad.)

3 Likes

So does this mean YouTube has a list of all the noncriminal pedophiles?

That data would be utterly fascinating.

As for blaming people who let their kids post nonsexual videos that then get viewed for a sexual purpose: it seems to me those people already did their job by not having their kids naked and suggestive. So why the focus on their agency here?

Honestly though, YouTube is weird. The search suggestions for everybody’s accounts in the house did weird things when my nephew once searched for other kids performing WWE moves; it was bizarre.

Don’t know. Not opting my kid into some government’s device, ever. Will find a new school or homeschool.

They probably have some sort of bucket that lets people advertise to them on AdSense.

My son is required by the school board to have a Google account through the school, on which he uses Google Docs, does searches, etc., for school. This is not the same account that he uses at home to, among other things, watch YouTube – and he wouldn’t want it to be, because the school account is more restrictive than I am.

So, yes, Google is tracking him, albeit in a slightly fragmentary way.

2 Likes

Jesus Christ, you’re right. Feels like Thursday.

1 Like

I agree. Responsibility lies on both ends.
Parents know any video on YT is public. Leave the pedos aside; a simple question to ask is: do you really want any family picture/video out in the open, to be seen by anybody? If not, keep it off YouTube and share it via other means.
On the other hand, being a public sharing platform of ENORMOUS proportions, YT has a responsibility to maintain decorum and sanctity. A lot could be forgiven if they were trying, and acting on it continuously. In these cases, it’s clear they are not.
I am trying my best to keep my kid off YT, using paid content instead (no finger songs, please) and sharing any video/pic via private means with family and friends. I really don’t know what will happen when she has the agency to make these choices herself.

While this is true, it requires the algorithm to be aware of the content of what it’s recommending, instead of only the metadata of who viewed what before - a much more difficult (and hence more expensive) thing to do. The same goes for the child-porn recommendations, though for that subject they probably already have a lot of infrastructure in place.
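To make that distinction concrete, here’s a hypothetical sketch of gating a metadata-generated suggestion list on a content score. The `MINOR_PROMINENCE` table stands in for an expensive video-understanding model, which doesn’t come for free the way co-viewing counts do; none of this is YouTube’s actual pipeline:

```python
# Hypothetical: filtering metadata-based recommendations through a
# content-aware score. The scores here are hand-written stand-ins for
# a trained video classifier - the expensive part referred to above.
MINOR_PROMINENCE = {
    "toy_galaxy_ep1": 0.05,     # toy review, no kids on camera
    "gymnastics_clip_7": 0.95,  # video prominently featuring a child
}

def safe_recommend(candidates, threshold=0.5):
    """Drop candidates the classifier thinks feature minors before
    the suggestion list ever reaches the viewer."""
    return [v for v in candidates
            if MINOR_PROMINENCE.get(v, 0.0) < threshold]

# The co-viewing signal alone would happily suggest both of these.
print(safe_recommend(["toy_galaxy_ep1", "gymnastics_clip_7"]))
# -> ['toy_galaxy_ep1']
```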

People wanting this kind of filtering is what got us the new European upload filters.

This whole recommendations business/filter bubble is fraught with problems, and the world would be a better place if it just didn’t exist, but I fear we’ll never get that cat back in the bag.

You can’t really forbid generating recommendations from metadata. Not without Chinese-style censorship, anyway.

The best thing YouTube could do is probably just ignore all videos posted by children in its search engine and monetization schemes. And all videos containing children? But that would also exclude a lot of commercials. And people would probably create lots of accounts with false under-16 ages to keep ads off their videos.

All in all it seems a pretty complicated problem to me.

On a side note: sometimes I long for the simpler times of my youth, when 1984 and Brave New World were considered fiction.

1 Like

Isn’t it possible to post videos that are excluded from search (i.e., unlisted)? Though even in that case the link could still be shared.

This is probably the safest course of action.

1 Like

You can classify a video as private, but chances are that at some point you’ll miss clicking that button.
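For what it’s worth, visibility is just one field set at upload time, which is exactly why it’s easy to miss. A sketch using the YouTube Data API v3 with google-api-python-client - auth setup is omitted, and you should double-check the field names against the current docs before relying on this:

```python
# Sketch: uploading with an explicit privacy status via the YouTube
# Data API v3 (google-api-python-client). `youtube` is assumed to be
# an already-authorized API client; auth is omitted for brevity.
from googleapiclient.http import MediaFileUpload

def upload_video(youtube, path, title, privacy="unlisted"):
    """privacy is one of 'public', 'private', 'unlisted'. 'unlisted'
    keeps a video out of search and suggestions while leaving it
    reachable by anyone who has the link."""
    request = youtube.videos().insert(
        part="snippet,status",
        body={
            "snippet": {"title": title},
            "status": {"privacyStatus": privacy},
        },
        media_body=MediaFileUpload(path),
    )
    return request.execute()
```

Defaulting your own tooling to “unlisted” or “private” is one way to make the safe choice the one you get when you forget to click.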

2 Likes

You could post a video of your baby being cute in the bath, and all it would do is make a bunch of people smile; or there could be footage of your kid winning a spelling bee on the local news, and someone out there spends hours staring at it in a way that would make your skin crawl. You don’t know what the public does with this stuff, and it wouldn’t be good for you to try.

I am extremely reluctant to have videos of me shared publicly; if I had kids, it wouldn’t even cross my mind to publish images of them. It has nothing to do with imagining hypothetical creeps to feel outraged about. I just hate the idea of anyone else controlling the way I am presented. Which, yeah, is a borderline neurosis in my case, but I do think other people could benefit from seeing it that way. Everyone should jealously guard their own image.

It follows that YouTube probably shouldn’t allow videos of kids, period. For that matter, I’m not sure that a decent society would even permit child actors.

1 Like

I don’t disagree - but even if a private option were made perfectly clear, I would have to wonder whether a lot of parents would care. Because, you know, that clip of your children doing something silly could be the next viral $en$ation. There’s still some way to go before the culture grows out of this “look ma, I’m on the T.V.!” attitude.

And very much this:

1 Like

No such thing. (IMHO YMMV) :wink:

Thank god the Sears Catalog doesn’t exist any more. (Actually, JC Penney’s was racier.) Is sexualization of children a problem? Yes - and remind every mother who puts her toddler in a two-piece that she’s doing that. (There’s nothing to cover up.) Are your kids in danger from some random YouTube user? No. Abuse is almost always by a family member or acquaintance. So why all the fuss? Because the Patriarchy and its Fascist propaganda mongers want to control all (female) sexuality and thereby all females. Take a look at Islam. That’s what they want our society to be like.

How do you figure, on either count? The insult here is that YouTube is specifically finding videos that normal people would find inoffensive and delivering them to exactly the wrong audience. How does what some pervy person does mean society should entirely ban children from photography? Does it follow that they should be banned from public?

Actually can we do that, please?

The insult here isn’t in the videos or in the existence of pedos; it’s in the discovery that this corporation that runs an addiction-finding machine seriously does not care about anything other than the views.

That wasn’t a surprise to me, but apparently…

1 Like

A few questions that occurred to me.

  1. What is the statistical chance that more children will be physically molested as a result of YouTube’s actions or inactions with these videos? Given that upwards of 90 percent (some studies 95 percent) of molestations are by family members or someone else already well acquainted with the victim, I would guess that the chance is extremely low.

  2. Is there a compelling social interest in preventing pedophiles from seeing legal, non-pornographic images of children - an interest so compelling that it justifies prohibiting legal activity by the vast majority of citizens who are not pedophiles? Unless a clear causal link can be established with actual molestations, I would politely suggest that the answer is “no”.

  3. If I post or allow to be posted an image of my child to a site where it can be viewed by anyone in the world with a net connection, am I entitled to any expectation whatever of privacy in who can see it? No statistical analysis needed here. The answer is no, nix, nyet, nada, no way.

  4. Is YouTube now a publisher rather than a common carrier, and subject to civil actions in the courts for what it allows on its site? Given the degree of editorial control it is now exerting, I think it is. And that can’t come soon enough for me.

We turned into seventeenth century New England so silently, I barely even noticed.

I’d say that’s a wild guess. First of all, 90% and 95% are not 100%. More than 10% of people are sexually abused as children; in America, that would be in excess of 30 million people, and the roughly 10% of cases not committed by family or acquaintances would be 3 million people. YouTube causing a 0.01% increase in child abuse by strangers would mean 300 children abused. That may be statistically insignificant, but the idea that the math just doesn’t add up to any extra children being abused is unsupportable.
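Spelling out that back-of-the-envelope arithmetic (the inputs are the thread’s round figures, not precise statistics):

```python
# Back-of-the-envelope version of the numbers above. The inputs are
# the thread's round figures, not precise statistics.
us_population = 300_000_000
abused_as_children = 0.10 * us_population        # "more than 10%" -> 30,000,000
abused_by_strangers = 0.10 * abused_as_children  # non-acquaintance share -> 3,000,000
hypothetical_increase = 0.0001                   # a 0.01% bump

print(abused_by_strangers * hypothetical_increase)  # 300.0 additional victims
```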

Plus, your whole point is based on a false assumption. The idea that posting an image of your child on YouTube will lead to your child being attacked by a stranger who traveled across the world is a little far-fetched. The idea is that by normalizing being attracted to children, it makes it more likely that viewers will abuse children - most likely children they know. If that link exists, then it applies just as much to the 90-95% of abuse cases as it does to the 5-10%.

As far as I know (and I can’t find anything different right now), we simply don’t have evidence to prove whether people viewing images like these increases abuse. It’s nearly impossible to study. Personally, I’m unsure. What I am considerably more likely to believe is that an online community that promotes sharing these videos is normalizing something people should know is wrong (even if they are inclined to do it).

In seventeenth-century New England you were probably allowed to beat your child within an inch of their life without legal repercussions.

4 Likes