AT&T becomes latest YouTube advertiser to pull ads over pedophile problem

Originally published at:

“Until Google can protect our brand from offensive content of any kind, we are removing all advertising from YouTube,” an AT&T spokesperson told CNBC.


After watching a recent rant about how this works: while nothing in the video is actual CP, what is actually happening is creepy and disturbing.


I think I’m going to be sick.

Assuming Google’s bloodsucking ghouls in charge of YouTube give a crap, how can this be fixed? Banning children from YouTube videos seems extreme and unfair to them, but is there any way YouTube could human-moderate the comments on the volume of videos it hosts, even if they were willing?


Either don’t allow comments, only allow commenting by verified users (such that a ban actually means something), or actual human moderators. If they can’t manage one of those three solutions, they don’t deserve to make money off of the content they are serving.


That still leaves the problem of all the videos being linked together and more like them recommended.

The more I think about it, one thing YouTube could do to fight this is use flags and reports to train their proprietary algorithm through negative reinforcement. If they weighted the training data from the flags strongly enough, especially based on the reputation of the reporting users, it should stop automatically linking and recommending these videos to the pedophiles. And since no decent human being who likes keeping their lunch down is going to want to go through and create enough training data, hire people to do it, pay them well, and make sure they have mental health care.

Granted I’m speculating on how their algorithm works, but there’s fundamentally only so many ways to do what it does.
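Since the post above is explicitly speculating, here is a minimal sketch of what that down-weighting idea could look like, assuming a simple score-based recommender. Everything here is hypothetical: the function names, the reputation scale (0 to 1), and the thresholds are all illustrative, not anything YouTube actually does.

```python
# Hypothetical sketch: down-weight flagged videos in a score-based
# recommender, scaling each flag by the reputation of the reporter.

def flag_penalty(flags, base_penalty=0.5):
    """Total penalty from user flags, each weighted by reporter reputation (0..1)."""
    return sum(base_penalty * reputation for reputation in flags)

def recommendation_score(raw_score, flags, block_threshold=2.0):
    """Return the adjusted score, or None to block the video entirely."""
    penalty = flag_penalty(flags)
    if penalty >= block_threshold:
        return None  # enough trusted flags: never recommend or auto-link
    return raw_score - penalty

# A video flagged by several high-reputation users gets blocked outright:
print(recommendation_score(10.0, flags=[1.0, 0.9, 0.8, 0.9, 0.8]))  # None
```

The point of weighting by reputation is that a brigade of throwaway accounts (reputation near 0) can't censor a video, while a handful of trusted reporters can get it pulled from recommendations.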

Also, as you say: ban comments on videos featuring kids, ban accounts that repost full videos of kids from other accounts, and ban accounts that are doing the linking (apparently the comments the guy in the video above reported were deleted, but the accounts weren’t banned).


My understanding of this is that YouTube is not allowing any actual pedophilic videos, but that the comments sections of some perfectly normal videos are being used to point out that the videos contain images of children that some might find prurient.

YouTube comment sections have been a problem for years - perhaps their time has passed.

As for banning children from YouTube, I do not see an issue with restricting children under 13 from standard YouTube. Children under 13 can be diverted to YouTube Kids.


Google shamed me after I tried to figure out what happened to Hugh Abbot, a character in The Thick of It. I’m having a great deal of difficulty watching Nicola Murray, primarily because “Social Affairs and Citizenship” is such a nasty remit.


There seems no way to fix it that doesn’t penalize someone who doesn’t deserve it. Having a human moderate comments is not feasible due to the volume. You could ban all videos featuring children, or you could ban comments on all videos featuring children, both of which would harm innocent users. Having users report creep comments and perma-banning those who make them seems the best route, but that isn’t perfect either.


Dunder Mifflin Paper Company, Inc. ran into the same problem.

I’m not sure about the scope of the problem and frankly don’t want to know too much more. My forbearance is at an all-time low this year, but I will say that I don’t see that much would be lost by removing comments from Youtube altogether. I suppose there would still be an issue with bad actors building their own forums and linking, but that’s an already existing problem that Youtube itself doesn’t really make better or worse.


I have some shockingly bad news. If we protect the children from horrific comments, who will protect the women, and the men, and the LGBTQ, and the black and brown people, and the Jews and the Muslims, and omg the poor cats and dogs.

If people need to be protected from outrageous, trolling comments, then we all need it. And that might not be a bad idea.


I have a modest proposal.

Right now, moderation is nobody’s responsibility. By default, a generic youtube channel accepts comments on its videos and there’s no human being in charge of reading those comments and vetting them for objectionable content, which is why everyone always says to avoid youtube comments: they are a cesspool of the worst the internet has to offer, as is the case with any online forum where there is no moderation.

Why not have youtube change their system so that comments are off by default? If a channel owner chooses to turn them on, then all comments go into a moderation queue, and the channel owner or someone they designate has to go through and approve the comments they want to have on their videos. There’s no option to have unfiltered comments.

In other words, instead of comments you have “write a note to the uploader” and it’s up to the uploader to make a note public if they want to do so. Youtube can provide robust tools for managing large numbers of submitted notes for videos that go viral. Hell, for a fee, youtube could offer a moderation service - you give up some percentage of your ad revenue in return for someone pre-screening the comment approval queue for you so it’s not an impossible amount of work. ETA: and if the number of submitted notes gets to be too great, the uploader can just hit “delete all” once they’ve screened as many as they care to deal with.

Most channels would have no comments because moderation is work. I don’t see how that’s a bad thing. Those uploaders who care about having a commentariat can have one. Comments become the responsibility of the uploader - if they fail to moderate toxic comments, then they are in violation of the TOS. But since no comments are published without going through a moderation queue, that’s not a problem for anyone other than toxic people.
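The proposal above boils down to a simple data structure: a per-video queue of unpublished notes, a publish list that only the uploader can move things into, and a bulk-delete escape hatch. A minimal sketch (all class and method names are hypothetical, purely to illustrate the flow):

```python
# Hypothetical sketch of the comments-off-by-default proposal: every
# submitted comment sits in a per-video queue until the uploader
# explicitly approves it; nothing is ever published unmoderated.

from collections import defaultdict

class ModerationQueue:
    def __init__(self):
        self.pending = defaultdict(list)    # video_id -> submitted notes
        self.published = defaultdict(list)  # video_id -> approved comments

    def submit(self, video_id, comment):
        """Viewers can only 'write a note to the uploader'."""
        self.pending[video_id].append(comment)

    def approve(self, video_id, index):
        """Uploader (or their designated moderator) publishes one note."""
        self.published[video_id].append(self.pending[video_id].pop(index))

    def delete_all(self, video_id):
        """The 'delete all' escape hatch for videos that go viral."""
        self.pending[video_id].clear()

q = ModerationQueue()
q.submit("v1", "great video!")
q.submit("v1", "toxic bilge")
q.approve("v1", 0)          # uploader publishes only what they vet
q.delete_all("v1")          # the rest of the queue can be dumped wholesale
print(q.published["v1"])    # ['great video!']
```

Note the invariant: there is no code path from `pending` to `published` except an explicit `approve`, which is exactly the "no unfiltered comments" property the proposal calls for.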

Youtube will never do this voluntarily of course because their analytics tell them that comments help drive “engagement” and thus ad revenue. But if enough bad press like this comes along, maybe they can be pushed into doing something like it.


Sounds like a good idea to me.


Ironic that this crisis is occurring just as they’re shutting down Google+ once and for all.


we could trace the ips of those making the comments and round them up


Seriously, I have no idea why so many news publications made the huge mistake of enabling randos to make comments on articles for over a decade, turning the bottom of every news article into a toxic wasteland, and then overcorrected and decided they were going to disable comments entirely.

Letters to the editor worked so well in the days of paper publications because some poor cub reporter was tasked with filtering out all the weirdos, psychos, and sociopaths, so that the letters page only contained “comments” that made a positive contribution. So why the hell not do the same thing with comments on the internet? Make it the default that randos cannot just hit “post” and have their bilge pushed out to thousands or millions of readers, while still making it possible for people to provide feedback and discussion of the feature article or video or whatever.

Sure, if you want to put in the time and money you can hire human moderators to keep things civil on a live message board like Boing Boing does, it’s better, it creates more of a community… but if that’s too much work (and it is a lot of work, especially for sites that get thousands of comments on a slow day), disabling live comments and instead publishing a selection of “notes to the editor” provides most of the benefit while ensuring that your readers are never forced to wade through troll droppings.

ETA: and since most trolls are only posting their comments for the attention, putting a screen on all comments will cause the number of toxic comments to fall precipitously - it’s no fun spilling your bilge onto the screen of just one person who will glance at what you wrote and hit “delete.”


This is the part that always seemed weird to me…videos with women in them get the same treatment. Why are people only outraged when children are involved?

Being over 18 doesn’t magically make a person free game for harassment or prurient attention.


Glad to see that AT&T’s #1 priority is protecting the brand and not, you know, caring about these exploited and abused children.


Yeah, my ex posted this earlier today. Unlike a lot of her hysteria-driven posts, this one seems legit.

For those who don’t want to sit through it: the guy shows how you can start with a search for a fairly innocuous topic, and within 2-3 clicks you go down a rabbit hole of videos of mostly preteen girls doing preteen-girl stuff, and the comments are full of pervs making weird comments and timestamping where to freeze-frame for an image that could be construed as sexual.

Now clearly part of this is the job of the algorithm: if you like one sort of video, it shows you other videos liked by the people who watched it. Only instead of watching a video about Star Wars toys and seeing other videos watched by other nerds, it seems a network of pedos are watching and sharing all these vids, and once you start down that rabbit hole, youtube keeps serving up more.

Now supposedly they do have a way for certain comments to trigger shutting the comments off. But the people making those comments are still on youtube. (And if you banned them, a new account would be a minute away.)

So, yeah, not sure what the solution is. I can tell you I don’t want videos of my kid on there for that reason.


One problem I see with this is it becomes impossible for users to call out scam promotional videos and such in comments, but I guess there’s pluses and minuses to any solution. Maybe if a certain tier of registered, verified accounts didn’t have to go through the moderation queue, a balance could be struck. I’m sure there are a lot of nefarious uploaders that would love to be able to delete any negative comments calling them out.
