Why YouTube's algorithms push extreme content on every possible subject

Anthropomorphism is the issue here.

You nailed it!

And if you gaze long into YouTube, the YouTube also gazes into you.
– Netchee

While most of it can be explained by YouTube’s algorithms reacting to our choices, trying to figure out what we’re interested in and showing similar content, I’d be shocked if there weren’t attempts to game the system by various groups, like the alt-right and otheЯs.

The easiest level would be to have a lot of peons and bots subscribe to channels, watch and like videos to try to boost their visibility.

Slightly higher would be to have bots do directed walks through the videos to try to lay down strange ruts in the viewing patterns. (e.g. 23% of the people who watched this gardening video followed with this “blood and soil” one.)

Higher up would be to attack how YouTube associates videos and channels, so that if you watch one video, you might be interested in this similar one. That would involve deconstructing how Google analyses videos. Certainly they go by the description and keywords, and they machine-generate transcripts that aren’t too shabby. Facial recognition of people in the videos? I wouldn’t count that out.

That would be the realm of the institutional players like Cambridge Analytica and KochBroCo.

All hypothetical, of course, but anyone with the resources and desire to manipulate YouTube’s results would look into it.
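
Just to make the directed-walk idea concrete: here’s a minimal sketch of that attack, assuming (and this is purely my guess) a naive recommender built on “viewers also watched” co-watch counts. All video names and traffic numbers are invented.

```python
# Hypothetical sketch of the "directed walk" attack: bots replay a chosen
# video sequence to skew the co-watch statistics a naive recommender
# might use. All video names and traffic numbers are invented.
from collections import Counter

co_watch = Counter()  # (video_a, video_b) -> times b was watched right after a

def record_session(history):
    """Count each consecutive pair of videos in a viewing session."""
    for a, b in zip(history, history[1:]):
        co_watch[(a, b)] += 1

# Organic traffic: gardening viewers mostly stay on gardening.
for _ in range(1000):
    record_session(["gardening_tips", "composting_101"])

# Bot traffic: a few hundred scripted walks from the innocuous video
# straight into the fringe one.
for _ in range(300):
    record_session(["gardening_tips", "fringe_video"])

total = sum(n for (a, _), n in co_watch.items() if a == "gardening_tips")
for (a, b), n in sorted(co_watch.items()):
    if a == "gardening_tips":
        print(f"{n / total:.0%} of {a} viewers went on to watch {b}")
```

Run it and the bot traffic manufactures that “23% of the people who watched this gardening video followed with this one” statistic out of thin air.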

1 Like

There was a time when watching whatever interesting loon got posted here would screw up my recommended videos for months.

4 Likes

What happens if you start watching videos that are extremely neutral and un-extreme, like Norwegian train videos and that video of Ron Swanson drinking whiskey by the fire for 45 minutes?

1 Like

The same is true of news.google.com. These algorithms (YouTube, Google News, Facebook) can be and are used to agitate and manipulate public opinion on political issues.

1 Like

“Video: How I Made a Car Entirely Out of Wood”

6 Likes

The next video is of Mr Bean drinking a double whisky for an hour and a half.

3 Likes

I’m not sure there’s a meaningful difference between choosing and filtering out, but I also think it’s the opposite of how you present it:
Rather than filtering out the interests you don’t share with the people you’re being compared to, it filters out the interests that people outside the group do share.

It’s the “you may also like socks” problem: Everyone who likes X also likes socks, so the recommendations are always for socks. Like politics? You probably also like socks. Cooking? Socks. Woodworking? Socks, socks, socks.

So, instead, you have to go for what people who like X like, that no one else likes. A lot of people have a casual interest in running. But the videos that appeal only to people who are interested in running are about super-marathons.
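
To put toy numbers on that: here’s a minimal sketch contrasting raw within-group popularity with lift (the group’s like-rate divided by everyone’s like-rate). The rates and video names are invented.

```python
# Toy version of the socks problem, with invented like-rates.
# Ranking by raw popularity within the group always picks the thing
# everyone likes; ranking by lift (group rate / base rate) picks the
# thing that is distinctive to the group.

base_rate   = {"socks": 0.60, "casual_jog_vlog": 0.10, "supermarathon_doc": 0.002}
runner_rate = {"socks": 0.65, "casual_jog_vlog": 0.30, "supermarathon_doc": 0.040}

by_popularity = max(runner_rate, key=runner_rate.get)
by_lift = max(runner_rate, key=lambda v: runner_rate[v] / base_rate[v])

print("popularity picks:", by_popularity)  # socks (0.65)
print("lift picks:", by_lift)              # supermarathon_doc (0.04 / 0.002 = 20x)
```

Lift lands you on the supermarathon documentary precisely because almost nobody outside the group watches it; the distinctiveness weighting is itself the ratchet toward the extreme.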

Right. “A direction of travel.” That’s not a bad metaphor.

Exactly - it will always tend to take you to more extreme content, because, as you say,

So someone who has no particular interest in hardcore veganism (but maybe also no particular aversion to it) winds up seeing a bunch of videos that argue that yeast is slavery, and this seems like the normal discussion happening online, when really they were just open to not having meat in every single meal.

I disagree with you here. I think the actual problem is that a not-intelligent “purely mechanical thing” is setting us on paths (by selecting the content we see) that normalize very fringe content.

Right. I can never ‘like’ a song with a female vocalist, because that seems to override all other considerations, and I get only female vocalists, across a variety of genres, from there on out.

Right. I would say the woodworking videos are already pretty extreme, in that, unlike cooking, running, and politics, you probably don’t have a huge number of people who are casually into woodworking. What would be interesting is if other kinds of craft/maker videos led people into the woodworking genre.

7 Likes

And so you joined up just to tell us all how you really feel!

How lucky are we?!?

Velcome to Boing Boing, new comrade…

8 Likes

After a few years of thumbs up/thumbs down, my Beatles channel plays the Cure, and my Neil Young channel plays the Beatles (and CSNY, but no Y by himself).

Algorithms are fun.

6 Likes

I guess that the “down vote” could be personal to the user’s account, so that the user could help fine-tune what was shown on their specific account. Google could call it something like “Personal Echo Chamber,” and it would allow the user to tailor their experience. I find the same thing on Steam, where it says, “We think you might like x because you looked at y,” where I want to be able to say, “I looked at y and didn’t like it; please show me fewer games like y, and that includes x.”
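
That control would be a small feature to build, too. Here’s a rough sketch, using a made-up tag-overlap notion of “games like y” (this is not how Steam actually works, as far as I know):

```python
# Sketch of a per-account "show me fewer like this" control. The
# tag-overlap notion of similarity and the catalogue are invented.

disliked_tags: set[str] = set()

def downvote(item_tags: set[str]) -> None:
    """Remember the tags of something the user rejected."""
    disliked_tags.update(item_tags)

def recommend(candidates):
    """Drop candidates that share two or more tags with rejected items."""
    return [name for name, tags in candidates
            if len(tags & disliked_tags) < 2]

downvote({"match3", "free_to_play", "loot_boxes"})  # "I looked at y and didn't like it"
catalogue = [("game_x", {"match3", "loot_boxes"}),  # too much like y -> hidden
             ("game_z", {"roguelike", "pixel_art"})]
print(recommend(catalogue))                         # ['game_z']
```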

1 Like

The path to conspiracy theories on YouTube runs both ways…

With that out of the way: I haven’t had this experience on YouTube, but Google Assistant/Now definitely did this to me.

It wasn’t always this way. I used to like that Google Assistant didn’t seem to understand politics and would give me articles from both sides of any issue I had recently read about. At some point a few months ago, though, I got a huge uptick in really radical recommendations. I swiped them away and it stopped.

I actually would like a switch somewhere to temporarily turn off the bias. Sometimes it seems as if I can’t find a result from the opposing side even if I try. Of course, maybe everyone agrees with me.

1 Like

Maybe the search engine could pick up a story from each extreme for each story the reader looks at?

Well, yeah. I was trying to stick with the conceit that there is a “correct” set of recommendations at any given point, and talk about how YouTube does or doesn’t get it right; but you’re right, it’s also a feedback loop with a human component. And I would guess that the fringier stuff has more lurid headlines, so those choices tend to be emphasized even if normal people don’t stick around for the whole video.

Still, I don’t think it helps to think of a moustache-twirling robot that’s trying to radicalize everyone; no matter how benevolent you make the robot, an unguided stroll through humanity’s stream of consciousness is always going to lead you toward the most feverish thoughts.
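
That headline effect is easy to simulate, by the way. A toy feedback loop with invented click rates, assuming the ranker optimizes clicks and ignores watch time:

```python
# Toy feedback loop with invented click rates: the lurid headline gets
# clicked three times as often, and the ranker optimizes clicks alone,
# so it ends up on top even if nobody finishes the video.
import random

TRUE_CTR = {"sober_explainer": 0.05, "lurid_conspiracy": 0.15}
stats = {v: {"shown": 0, "clicks": 0} for v in TRUE_CTR}

def observed_ctr(v):
    return stats[v]["clicks"] / max(stats[v]["shown"], 1)

random.seed(0)
for _ in range(20_000):
    # 10% random exploration, otherwise exploit the current click leader.
    if random.random() < 0.1:
        shown = random.choice(list(TRUE_CTR))
    else:
        shown = max(TRUE_CTR, key=observed_ctr)
    stats[shown]["shown"] += 1
    if random.random() < TRUE_CTR[shown]:
        stats[shown]["clicks"] += 1

print(max(TRUE_CTR, key=observed_ctr))  # lurid_conspiracy
```

The lurid video wins the ranking even though, in this setup, nobody sticks around for the whole thing.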

3 Likes

I don’t think it’s necessary to anthropomorphize it to see it as highly problematic, nor do I think that’s happening here. The algorithm exists in such a form that it necessarily has these (highly negative) outcomes. It’s designed such that it doesn’t show you ‘more’ of what you looked for, but seemingly pushes more highly-charged, related content. It’s probably much like some Facebook algorithms, presumably also because of issues relating to “engagement”. This may be entirely a side effect of something else, but that doesn’t make it any better, especially since it’s probably ultimately the side effect of systems built with the intention of delivering more eyeballs to advertisers.

We don’t need to anthropomorphize anthrax to recognize the danger of being infected with the bacterium: it will continue to eat away at you - utterly unconsciously - until you’re dead. And if someone unleashes that bacterium on you, even accidentally, we don’t absolve them of responsibility.

4 Likes

Right, I agree. And I think a ‘perfectly neutral’ robot that just happens to radicalize everyone is even scarier.

5 Likes

SEO spammer here. I am sure Google knows how dangerous it would be to let users push search result rankings up and down directly, rather than using expensive backlinks and spam domains.

4 Likes

slurry of zomg. Such a great band.

1 Like

Well, it does offer slight protection against downvote bombing. Alas, the same protection does not help in the other direction.

Algorithms are terrible curators.

2 Likes

I’ve noticed this with science videos - YouTube’s more than happy to steer you over into perpetual motion machines/flat earth/lizard people territory. It’s hard to stay on track.

7 Likes