Why YouTube's algorithms push extreme content on every possible subject

It had a wooden chassis, wooden engine, and wooden wheels. But it wooden go.

8 Likes

I’m confused. Where are you now getting your flat earth news?

When the rest of us learn that the earth is actually beyond flat and a bit concave, you’ll be totally in the dark.

5 Likes

YouTube isn’t trying to promote extremism; it’s trying to promote engagement (and ad revenue).

It’s a simple question Google is trying to solve: given that a user has watched videos X and Y, which suggested video Z are they most likely to watch next? Remember that some users won’t watch any Z you suggest, so you’re optimizing not just for the video each user is most likely to click, but for the video most likely to be clicked by the users who are most likely to click anything.

The problem is that if X and Y are left- or right-wing political videos (or any controversial videos), then the users most likely to click the next video are the most passionate, and generally the most extreme. And the videos those most extreme users are likely to click are the most extreme videos.
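
To make that concrete, here’s a toy sketch in Python (with invented co-watch numbers and video names, not anything Google actually runs) of what “pick the Z most likely to get clicked after X and Y” looks like when the only signal is historical click-through:

```python
# Toy recommender: rank candidate videos purely by how often users who
# watched X and Y went on to click each candidate. All numbers are invented.

# candidate -> (times shown after watching X and Y, times clicked)
candidates = {
    "measured_policy_explainer": (10_000, 300),    # 3% click-through
    "heated_debate_clip":        (10_000, 900),    # 9% click-through
    "outrage_compilation":       (10_000, 1_500),  # 15% click-through
}

def click_rate(stats):
    shown, clicked = stats
    return clicked / shown

# The "engagement-only" objective: suggest whichever Z has the highest
# observed click-through, with no notion of what that selects for.
best = max(candidates, key=lambda name: click_rate(candidates[name]))
print(best)  # -> "outrage_compilation"
```

The most passionate users dominate the click counts, so the most provocative candidate wins by construction; nothing in that objective knows or cares that it’s provocative.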

I don’t think it’s nefarious on Google’s part; maximizing engagement isn’t much different than trying to show you the content you’re most likely to find useful. It’s hard to give you the stuff you want while not also promoting the stuff that leads down the rabbit hole.

It’s a solution for a made-up problem. Nobody needs “autoplay” or “sticky engagement.”

If websites like YouTube were really thought of as tools, those features wouldn’t have been implemented. But once the site became dependent on ad revenue, a hole opened up: thinking of users as a product to be sold, rather than as participants in a service, makes things like this possible.

Clearly, that was a mistake, since the result is that YouTube promotes extremism. “They aren’t trying to do that” is pretty irrelevant at this point.

No doubt it is. But since it was unnecessary to begin with, the solution seems fairly straightforward.

7 Likes

Maybe you’re searching for a specific Welsh lullaby, but you don’t know it by title. If it doesn’t come up in a keyword search for “Welsh lullaby,” the suggestions provide you with other Welsh music and/or lullabies that don’t contain those keywords. That’s a genuinely useful search enhancement that also keeps people engaged.

That engagement ends once you’ve exhausted a search and the same stuff keeps coming up. Sites like YouTube and Pandora don’t have a button for “Let’s mix it up a bit” or “Show me something really different.” They only consider engagement in terms of relevance, not discovery. They’re not trying to introduce you to new and surprising subjects, but they could.
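
A “mix it up” option wouldn’t even be hard to sketch. Purely as a hypothetical (made-up scores and video names, not any real API), you could blend relevance with novelty and hand the user the dial:

```python
# Hypothetical "mix it up a bit" re-ranker: blend relevance to what you've
# been watching with novelty (distance from it). All scores are invented.

videos = {
    # name: (relevance_to_recent_watches, novelty)
    "another_welsh_lullaby": (0.95, 0.05),
    "breton_folk_song":      (0.60, 0.40),
    "tuvan_throat_singing":  (0.20, 0.85),
    "history_of_lullabies":  (0.50, 0.55),
}

def recommend(mix_it_up: float):
    """mix_it_up = 0.0 means pure relevance, 1.0 means pure discovery."""
    def score(stats):
        relevance, novelty = stats
        return (1 - mix_it_up) * relevance + mix_it_up * novelty
    return max(videos, key=lambda name: score(videos[name]))

print(recommend(0.0))  # -> "another_welsh_lullaby" (more of the same)
print(recommend(0.8))  # -> "tuvan_throat_singing" (something really different)
```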

Also, here’s a Welsh lullaby:

You’d be surprised how extreme ASMR videos get.

3 Likes

I’ve noticed that the more you watch tech-related videos, the more likely you are to have Linus recommended to you.

Perhaps I should refine my statement: if you watch anything on a subject that can be taken to extremes, YouTube (or any other recommendation algorithm) will probably, eventually, show you something related that’s more radical than what you’re already watching. Is that actually a function of the algorithm, or just plain statistics?
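
You can get a feel for the “just statistics” answer with a dumb little simulation (entirely invented, no real data): if related videos vary in intensity and the slightly more intense one is slightly more likely to get the click, a chain of autoplay hops drifts toward the extreme end on its own, no malice required.

```python
import math
import random

random.seed(1)

def next_video(current: float) -> float:
    """Pick the next video from a few 'related' ones near the current
    intensity, where more intense candidates are a bit more likely to
    get the click. Everything here is invented for illustration."""
    related = [min(1.0, max(0.0, current + random.uniform(-0.1, 0.1)))
               for _ in range(5)]
    weights = [math.exp(3 * r) for r in related]  # clickiness rises with intensity
    return random.choices(related, weights=weights, k=1)[0]

def run_chain(start: float = 0.2, hops: int = 50) -> float:
    intensity = start
    for _ in range(hops):
        intensity = next_video(intensity)
    return intensity

runs = [run_chain() for _ in range(1000)]
print(f"mean intensity after 50 autoplay hops: {sum(runs) / len(runs):.2f}")
# Starts at 0.2 and ends up much higher, even though each individual
# hop only nudges things slightly toward the more intense option.
```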

1 Like

That’s quite common, in both directions. The line between woodworker and maker can get fuzzy, and seems (to me, at least) to mostly boil down to whether someone works with just wood or is also interested in other materials and skills in addition to (or instead of) it (e.g. leather, plastic, metal, electronics, 3D printing, etc.). There are certainly many makers who frequently work with wood, and so the recommendations tend to cross fluidly between wood and craft/maker/etc. skills.

1 Like

I just dislike the way it only shows you one thing. When I watched two All The Stations videos in a row, all my recommendations were All The Stations videos; then I watched some VinWiki stuff about cars, and guess what, all it recommends now is car videos.

The big problem is that it’s hard to find new things: you can only find more of the same, never new stuff…

1 Like

I’ve been telling that bastard thing not to show me any more football since I started using it. It still thinks football is Important News.

2 Likes

After reading this article and discussion, I sort of wonder why YouTube doesn’t show me extreme content.

1 Like

I watch a lot of the same stuff.

It seems to me that YouTube’s recommendations lean hard toward the professional, by which I mean channels that have frequent updates, which also tend to be channels with high production values, actual production teams, sponsors, paid promotions, etc.

Those channels have a lot of virtues, but they are (I’m guessing) still a minority of the sum total of content YouTube hosts. There are also some video-makers that I neglected to bookmark, favorite, or subscribe to (because they were easy to find before) who are now lost to me. I can’t find them in the search results, and they don’t show up in recommendations anymore.

Anyway, my point is, there is a perspective from which my experience (and possibly your own) matches up pretty well with the “radicalization” idea; if I watched a few dumpy amateur-in-the-garage videos about how to use a router, YouTube is likely to respond by offering me videos on how to really use a router, with good lighting in a fancy workshop, expensive add-ons, etc. Those videos may be very useful, but they are also extremist in a sense.

But I think to make that point stick we’d need a more refined idea of what axis YouTube pushes us on. The article is sort of saying that “radical” ends up being the proxy for “engaging” that is used by the algorithm. If we make “radical” mean something non-standard then it meaningfully changes the point of the article.

I think the article makes an interesting point despite the counter-example, but I do think that “radical” or “extreme” probably aren’t the right terms. If there is a thing that, when applied to politics, ends up meaning more extreme views and, when applied to arts and crafts, ends up meaning more professional production quality, what is that thing?

I think what you’re missing is that the author is detailing the results of letting YouTube auto-play, rather than watching a video and intentionally selecting something from the subsequent recommendations. With auto-play enabled, there’s no conscious selection involved in what you see next, either by the viewer or by someone at YouTube. The recommendation algorithm just picks whatever related video it “thinks” will best keep you on the site, which seems to inherently bias results toward more and more extreme material on its own.

It makes a certain amount of sense. Auto-play is a mechanism for trying to ensure you stick around by automatically throwing more content directly into your face, rather than relying on you to make a conscious decision to watch something else by selecting a video from the sidebar or the post-video recommendation thumbnails. More extreme content tends to be more attention-getting, and attention-maintaining, than pap. The more videos you sit through, the more ads YouTube can push, and the more ads they can push, the more money they can make.

The Algorithm isn’t intentionally trying to radicalize viewers, sure. But The Algorithm’s ultimate goal is to generate revenue, not simply to show you things it “thinks” are related. Thanks to YouTube’s increasing focus on watch time over view count (which ticks up after about 30 seconds, rather than at the very end of the video), it seems to have figured out* that pushing increasingly extreme content is a great way to keep people’s attention, and thus fulfill its primary objective of making Google more money.

*I’m aware that The Algorithm can’t think or reason intelligently. But if it’s possible to “teach” an algorithm what a stop sign looks like by asking millions of people to select the parts of an image that contain road signs, you can “teach” an algorithm that it leads when it bleeds.
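
As a rough illustration of why the choice of objective matters (invented numbers, obviously not YouTube’s actual model), the same three candidates rank very differently depending on whether you optimize for clicks or for expected watch time:

```python
# Toy illustration: the same candidates, ranked by two different objectives.
# All figures are made up for the sake of the example.

candidates = {
    # name: (probability of a click, expected minutes watched if clicked)
    "two_minute_news_recap":   (0.20, 2.0),
    "long_measured_interview": (0.05, 25.0),
    "outrage_marathon":        (0.12, 40.0),
}

def by_clicks(stats):
    p_click, _ = stats
    return p_click

def by_watch_time(stats):
    p_click, minutes = stats
    return p_click * minutes  # expected minutes of watch time

print(max(candidates, key=lambda n: by_clicks(candidates[n])))      # -> "two_minute_news_recap"
print(max(candidates, key=lambda n: by_watch_time(candidates[n])))  # -> "outrage_marathon"
```

Once the objective is expected watch time, whatever keeps eyeballs glued wins, and per the article that tends to be the more extreme stuff.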

1 Like

I just want to add that my main source of new stuff is, well, this blog and others like it. Human curators; algorithms are nigh useless.

3 Likes

But did you choose them yourself, or did autoplay choose them?

I have used the “recommended for you” list on occasion, but I always do the selection myself, and I typically lean on recognition of the channel/user who uploaded the video rather than on a more random result.

(I find autoplay to be obnoxious in other ways, too. If I’m just checking from a different computer or browser, I have to explicitly turn off autoplay—this stuff is enabled by default by YouTube.)

Even on autoplay, there is input about the user in the algorithm. When browsing with my computer from some friends’ internet connection, I get advertisements that are obviously tailored to them. I get advertisements for students when browsing at the university with the same computer. I suppose Google uses the IP address to determine what to do in that case. They would do the same when running YouTube on autoplay on a new machine.

I browse by way of the recommendations. Autoplay is a useful option if you’re listening to a playlist or something, but otherwise it’s annoying AF.

2 Likes

But the whole point is that someone at YouTube prioritized keeping users “engaged” and on the site over solving their problem, and then designed algorithms to serve that goal rather than what they should have been doing in the first place. The resulting descent of results into extremist content is probably incidental.

Did they intend to normalize extremism? Gee, I’d hope not (but with stories of white supremacist recruitment in tech, I do occasionally wonder). But it’s interesting how indifferent some folks are to the issue. If this isn’t a big deal, why are we only now reading about it from a third party? Why, exactly, isn’t this a big deal? A tendency to anthropomorphize algorithms isn’t exactly relevant.

1 Like

I think their algorithms serve a useful purpose by yielding more relevant results than a keyword-only search, but have led to unintended consequences that are bad for society. Just as we had to learn how badly fossil fuels were harming the physical environment, we’re just now realizing that algorithms are polluting the mental environment. The concept that algorithms should be socially responsible is still sinking in.

I remember searching the 'net through Lycos, how websites gamed the results with lengthy keyword footers, and how I’d have to repeat my search for “groundhog” using the word “woodchuck” because the search engine didn’t understand that there was a relationship between the two. I’m not sure many people would want to go back to more rudimentary tools.

But it might become trendy for websites to implement algorithms that embrace debate and exploration, counterpoint and discovery. They have the data to identify filter bubbles—they could develop ways to open lines of communication between them and keep their users engaged at the same time, but, to my knowledge, nobody has made it a priority to do so.
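
In crude terms (a hypothetical sketch, not a claim about what any site actually does), “identify filter bubbles and open lines between them” could start as simply as grouping people by what they watch and surfacing the videos whose audience spans more than one group:

```python
from collections import defaultdict

# Hypothetical data: which videos each user has watched.
history = {
    "ana":    {"lathe_basics", "dovetail_joints", "router_tips"},
    "bea":    {"lathe_basics", "router_tips", "bandsaw_review"},
    "carl":   {"arduino_intro", "3d_printer_mods", "soldering_101"},
    "dmitri": {"arduino_intro", "soldering_101", "cnc_build"},
    "erin":   {"cnc_build", "router_tips", "3d_printer_mods"},
}

# Extremely crude "bubble" assignment: label each user by the community
# their watch history leans toward.
woodworking = {"lathe_basics", "dovetail_joints", "router_tips", "bandsaw_review"}
electronics = {"arduino_intro", "3d_printer_mods", "soldering_101", "cnc_build"}

def bubble(videos):
    return "wood" if len(videos & woodworking) >= len(videos & electronics) else "tech"

# Tally each video's audience per bubble, then surface "bridge" videos:
# ones watched by people from more than one bubble.
audience = defaultdict(set)
for user, videos in history.items():
    user_bubble = bubble(videos)
    for v in videos:
        audience[v].add(user_bubble)

bridges = [v for v, bubbles in audience.items() if len(bubbles) > 1]
print(bridges)  # -> ['router_tips']  (the one video with an audience in both bubbles)
```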