YouTube demonetizing videos where LGBTQ keywords are said

Last year, Google reported revenues of 110.8 billion dollars. I find the concept that they “can’t afford” anything ludicrous at best.

18 Likes

Again, you really have a skewed sense of just how much content is uploaded to YouTube. There are over 500 hours of content uploaded to the site every minute. That’s 82 years of video a day. How do you address that with human moderation? You’re being incredibly unreasonable to expect a site that, again, already bleeds money to be able to do this and not shut down in a few weeks. It would bankrupt the entirety of Google to do that, and they’re one of the richest tech companies in the world.
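To put numbers on that (just multiplying out YouTube’s own 500-hours-per-minute figure):

```python
# Back-of-the-envelope check on the upload rate quoted above
hours_per_day = 500 * 60 * 24            # 500 hrs/min -> 720,000 hrs/day
years_per_day = hours_per_day / (24 * 365)
print(f"{years_per_day:.1f} years of video uploaded per day")  # ~82.2
```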

None of you really seem to understand what it’s like to run a product at this scale. The tone I get from this article and community is that YouTube engineers (many of whom are LGBTQ+ people themselves) are bigots who are unfairly targeting queer people. That’s a conclusion that might make you feel better, because it’s easy to cast YouTube and Google as the big bad evil, but the problem is not as simple as you’re all making it out to be.

It also ignores the real problem here, which is not algorithms or YouTube moderation or even the Internet: it’s the normalization of bigoted viewpoints and a culture of hate that runs rampant online. Look at all of the big sites: Facebook, Twitter, Reddit, etc. They are ALL ineffective at stopping this, because there’s too damn much of it and these groups keep coming up with new words and phrases and memes to bypass the software.

1 Like

And none of that revenue is made from YouTube. In fact, YouTube cuts into that number.

All-or-nothing arguments just sound like, “If we’re going to let a twenty-year-old talk about wanting to go to a gay prom, well, I guess we’ll just have to put up with random explicit snuff videos.”

C’mon.

14 Likes

YouTube makes revenue. That revenue is not split out as a separate line item from other Alphabet revenue.

It’s probably structured so that YouTube operates at a “loss” with ad revenue directed into other parts of Alphabet.

8 Likes

They should nerd harder.

2 Likes

I never made that argument. The unfortunate answer to this problem, and the one YouTube seems to be taking, is to deprecate monetization entirely and only run ads on hand-picked channels that have been manually vetted by YouTube. Not a great outcome for small content creators, but again, it’s the only way YouTube can really keep the trust of advertisers, because human moderation of 82 years of video every day is impossible and algorithmic moderation will always fail. Expect YouTube to look more like cable TV in the years to come.

Hm. If they were being lazy, a training set of hate videos that used LGBTQ keywords would tend to teach the algorithm to recognize those keywords as hate flags.

Did they forget to include non-hate videos that also used LGBTQ keywords?
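Here’s a toy sketch of what that kind of laziness would look like (the data, labels, and model here are all invented for illustration; nobody outside YouTube knows what their actual classifier or features are):

```python
# Toy illustration of a biased training set (all data invented)
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# If every training example containing an identity term is a hate example,
# the model learns the identity term itself as the hate signal.
texts = [
    "gay marriage is destroying society",  # labeled hate
    "the gay agenda must be stopped",      # labeled hate
    "trans people are mentally ill",       # labeled hate
    "nice recipe for banana bread",        # labeled benign
    "the cat knocked over another plant",  # labeled benign
    "review of the new phone camera",      # labeled benign
]
labels = [1, 1, 1, 0, 0, 0]

vec = CountVectorizer()
clf = LogisticRegression().fit(vec.fit_transform(texts), labels)

# Perfectly benign uses of the same words now get flagged:
test = ["our gay prom was wonderful", "trans rights rally this weekend"]
print(clf.predict(vec.transform(test)))    # -> [1 1], both flagged as hate
```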

7 Likes

That post wasn’t responding to you.

9 Likes

Unfortunately, if I had to guess, there is a LOT more hate content being uploaded (and posted on the Internet in general) that uses LGBTQ keywords than non-hate content. How you delineate the two is another problem that I don’t think ML can solve. There are plenty of crypto-fascists who say incredibly hateful things with a nice voice: no cursing, no charged language, just coded language. It’s basically a genre on YouTube: people who call themselves “skeptics” but are really bad-faith actors trying to spread white supremacist viewpoints.
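It’s easy to see why that defeats keyword-driven filtering: strip out the slurs and explicit threats and there’s nothing left for the filter to catch. A crude sketch (the word list and sentences are invented, and whatever YouTube actually runs is far more sophisticated, but it apparently fails the same way):

```python
# A naive keyword filter, standing in for whatever the real system reduces to
BLOCKLIST = {"kill", "exterminate", "vermin", "subhuman"}

def flags_hate(text: str) -> bool:
    words = {w.strip(".,!?").lower() for w in text.split()}
    return bool(words & BLOCKLIST)

# Polite, coded bad-faith rhetoric sails straight through:
print(flags_hate("Just asking questions about certain demographics "
                 "and crime statistics."))                        # False

# While someone recounting harassment gets flagged:
print(flags_hate("He said he wanted to kill me for being gay."))  # True
```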

It’s immaterial if they never intended to discriminate - they did discriminate. Now their choice is whether they wish to continue to discriminate knowingly.

Intent won’t matter to your users as an excuse - and it won’t pass a disparate impact test.

14 Likes

I wasn’t talking about content uploaded in the bit you quoted; I was talking about user accounts – a much smaller problem set to be addressed.

With approx. 25 years in the industry, including advising some major content sites in their earlier stages, I do understand the challenges of scalability. I also know that there are ways to address these problems more effectively – if the company is willing to assign the appropriate resources in what is an on-going battle with bad actors (I’ve spent all too much time trying to get corporate boards and major shareholders to understand that it’s worth it from a business POV).

It’s not a trivial problem, to be sure. And addressing it won’t be anywhere near cheap. But it’s not impossible if the corporate will is there to make the platform better and, ultimately, more beneficial to both users and advertisers in the long term (as opposed to making the quarterly numbers).

Then you’re misreading the tone, because what’s being discussed here is the laziness, tunnel-vision, corporate corner-cutting, Californian Ideology free-speech absolutism, and techno-utopian Libertarianism that’s plagued the tech industry since at least the early 1990s – all of it mixed with a horribly flawed engagement-based advertising business model.

The Internet, and especially the monopolistic social media walled gardens that a disturbingly large number of people experience it through, is in large part responsible for the normalisation of those viewpoints over the past 20 years. These companies want all the benefits of an old-fashioned lowest-common-denominator mass media business without any of the (costly) responsibilities.

That’s why I think Alphabet’s first order of business if it’s broken up is to jettison YouTube (which, due to its comments section, has always been a cesspool). But right now, financial losses taken into account, it serves the company’s purposes in other ways.

They should spend some bloody money on the problem instead of just hoping that machine-learning systems can instantly replace the human nerds.

15 Likes

No, another iteration of a privately owned social media platform that’s cheaping out while trying to give the appearance of not tolerating intolerance, and failing spectacularly in the process.

They run into the same basic problem of what and/or who is doing the classification. It’s going to cost them money to do that, and they’re not willing to spend it if they can pretend an algorithm can do it.

9 Likes

This would be a great point if anyone you’re responding to were actually calling for pre-screening every video that gets uploaded. Instead, people are calling for things like human moderation of videos that have been flagged by users.
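Something along these lines, which never pre-screens anything and only puts human eyes on what users have already escalated (the threshold and structure here are invented for illustration):

```python
# Minimal sketch of flag-driven human review (names and threshold invented)
from collections import Counter

FLAG_THRESHOLD = 5                    # hypothetical escalation point
flag_counts = Counter()
human_review_queue = []

def report_video(video_id: str) -> None:
    """Record a user flag; escalate to a human once the threshold is hit."""
    flag_counts[video_id] += 1
    if flag_counts[video_id] == FLAG_THRESHOLD:
        human_review_queue.append(video_id)  # a person decides, not the model

for _ in range(5):
    report_video("dQw4w9WgXcQ")
print(human_review_queue)             # ['dQw4w9WgXcQ'] -> off to a moderator
```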

12 Likes

YouTube is the bad guy here in that they allow advertisers to select “I don’t want to be associated with LGBTQ+ content” as an option.

Doesn’t matter if their automated system is working or not… making that an option is discriminatory.

8 Likes

From my post, once more unto the breach…

Tech companies sometimes like to hide behind the suggestion that algorithms—computers making automated decisions—can’t be bigoted. This is an example that makes clear how empty that argument is, and how an automatic process can baldly reflect human bigotry.

Whether they feel responsible for their code’s decisions is immaterial. They will be held responsible for it, one way or another.

17 Likes

It’s not so much that people underestimate the cost of scaling human moderation. They’re demanding an end to the bigoted behavior, automated or otherwise, irrespective of what that costs YouTube. If the cost ruins it, too bad.

My guess is that YouTube will develop a more fine-grained ad system that deemphasizes monetization status. YouTube might not want to do this because its current advertising products are stable and clearly differentiated. So perhaps YouTube will ultimately just allow the discrimination in a more formal way and take its chances with the backlash.

12 Likes

If their business model depends upon discriminating - they have an illegal business model. Sorry - close up shop or fix it.

13 Likes

You say this like it’s a bad thing. I’m not sure that’s a unanimous view.

Couldn’t they simply enforce a set of guidelines and fine any channel owners who post material that doesn’t adhere? Sort of like how the FCC fines TV channels that broadcast rule-breaking content?*

*I’m not saying I agree with all the FCC guidelines, but I do agree that when someone provides a global platform for people to amplify their views, and gets revenue from it, there should be some accountability for the content.

4 Likes

Sure. China solved this problem a while ago, there’s no reason why Google with all their money can’t do the same thing.

1 Like