Why Youtube's algorithms push extreme content on every possible subject

Originally published at: https://boingboing.net/2018/03/11/why-youtubes-algorithms-push.html


The really annoying part is that, as with Google’s search engine, the programmers’ overweening confidence that they already know how to deliver great results means there is no button for downvoting a recommended video, no way to tell the robot “this is not relevant, you guessed wrong.”

You’d think that such a button would be supremely useful for helping Google weed out the spam and the bogus hits from their results, but no. They no longer care about delivering the best search results, they only care about delivering higher ad revenue. And it shows.


I don’t know about that. Another symptom of silicon valley arrogance (or, more often, simple privilege) is a naive over-dependence on the wisdom of the crowd. As seen on Steam, Amazon, Reddit, and elsewhere, a ‘downvote’ button just becomes a ‘review bomb’ button when it’s attached to content that riles the easily-amassed armies of shitty internet men.


That’s not strictly speaking true. YouTube occasionally asks me to rate the recommendations for videos it has shown me. Presumably, that’s how it gets the feedback.


One of the things I liked about Pandora (back in the day when I could still get it legally) was the chance to tell it that I didn’t think the algorithm’s choice for the “next best song” worked for me. If I wanted to get it to home in on a very specific type of music, I could get it to do that remarkably well; if I fancied keeping things a bit more loose, I could do that too.
I realise that YouTube is not remotely the same, but there may at least be some options that don’t merely result in mass down-votes (as girard observes.)


With YouTube, you can remove videos and channels from your recommended results. (Click on the three vertical dots beside the video, “not interested”, “Tell us why”, and then block the video or whole channel.)

I spent an hour a while ago weeding out all the flat earth channels. I also whack GamerGate types as they pop up.

I assume that YouTube is tracking that somewhere, and that it counts for more than a weak downvote.


Only useful if I ~~pledge my internal organs to Google~~ sign up for an account. Ah well.

I have several accounts, one of which I use for my culled YouTube experience.

I figure that with or without an account, they’re tracking anyway. You’re already in that bathtub of ice, might as well get the benefits.


I wonder what I will get if I search for BoingBoing?


Best not go there. You get BoingBoingBoing. (shudder)


I watch a lot of woodworking and maker videos on Youtube. As I watch more, they show me… more woodworking and maker videos. The “radical” factor doesn’t seem to ratchet up in any meaningful way. Sure, YT will sometimes recommend a really complicated woodworking project, but then the next one will be something garden variety. In other words, I’m not really seeing this pattern.


Thanks for the tip. I enjoy watching academic history lectures uploaded by museums and universities. Youtube then interprets that to mean I must be super interested in the wacko history theories of random nutters. Usually I can tell from the clickbaity titles what’s up but sometimes they fool me and no doubt the AI sees that as confirmation I want more of that nonsense.


I find it difficult to imagine radical woodworking.


Some place in Vancouver has the first page of my search results for that.


Google Android’s page -1 news and weather app also tends towards giving me “more of the same please”. It also looks for a commonality in stories that I read, and adds it to my topics of interest.

I should tell it that I’m not interested in the X-Men that much. The feature would be really creepy if it wasn’t so helpful.

ETA: And yes, I did have to block Breitbart, Infowars, etc, from my results.

I feel like there’s a great deal of ignorance in the article as much as in the comments on this subject. The YouTube recommendations are based on where people with similar interests went after seeing a video, and most people tend to escalate their interests in that way. The algorithm doesn’t choose what you get; it filters out the material and interests that you don’t share with the viewers you are being compared to.
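To make that concrete, here’s a toy sketch (my own illustration, with made-up video names and watch histories, not anything from YouTube) of the “viewers who watched this went on to watch that” filtering described above. The most-travelled next step wins, whether or not it happens to be fringe content:

```python
from collections import Counter

# Hypothetical watch histories: each list is one viewer's videos,
# in the order they watched them.
histories = [
    ["history_lecture", "ww2_doc", "conspiracy_clip"],
    ["history_lecture", "ww2_doc", "ancient_rome"],
    ["history_lecture", "conspiracy_clip", "deeper_conspiracy"],
    ["cooking_basics", "veg_recipe", "vegan_recipe"],
]

def recommend_next(video, histories, k=2):
    """Rank what other viewers watched immediately after `video`."""
    followers = Counter()
    for h in histories:
        for i, v in enumerate(h[:-1]):
            if v == video:
                followers[h[i + 1]] += 1
    return [v for v, _ in followers.most_common(k)]

print(recommend_next("history_lecture", histories))
# The conspiracy clip ranks right behind the documentary, because
# enough similar viewers went there next.
```

Nothing in this sketch evaluates whether a video is true or extreme; it only counts where other people went, which is exactly why escalation falls out of it.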

It’s too bad the project manager wasn’t clear enough that he wanted XTREME content surfaced - not “extreme” content


This doesn’t seem surprising to me, and I think the perceived problem arises from anthropomorphising The Algorithm as some kind of conscious agent – like a human narrator – rather than the purely mechanical thing that it is.

You start with “neutral” content, basically inoffensive morning-show pap I guess, and are shown a randomish selection of next steps. You pick a cooking video that doesn’t involve meat, and now you’ve established some potential directions of travel (“more cooking”, “more vegetarian”, “more Chef Steve” etc). Next, you look at a recipe that’s explicitly labeled as vegetarian, so now your direction of travel is straight along the line from “neutral” to “hardcore ultravegan”. As you click further, you’re getting into quite specific individual interests, but those sideways connections are too niche for the algorithm to detect, whereas the forward direction is well-travelled, so its best guess will continue to be that you want to see ever-more vegan content.
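The drift described above can be sketched in a few lines. This is a deliberately crude model of my own (one made-up “mildness → intensity” axis, invented video names), not YouTube’s actual system, but it shows how “recommend the nearest thing, then update your estimate of the viewer” walks steadily toward one end of the axis:

```python
# Toy model: videos sit on a single 0-to-1 intensity axis.
videos = {
    "morning_show": 0.0,
    "cooking": 0.2,
    "vegetarian_recipe": 0.4,
    "vegan_recipe": 0.6,
    "hardcore_ultravegan": 0.9,
}

def next_pick(position, watched):
    """Recommend the unwatched video nearest the viewer's estimated position."""
    candidates = {k: v for k, v in videos.items() if k not in watched}
    return min(candidates, key=lambda k: abs(candidates[k] - position))

position, watched = 0.0, set()
for _ in range(4):
    pick = next_pick(position, watched)
    watched.add(pick)
    # Each click pulls the estimate toward the video just chosen,
    # establishing a "direction of travel".
    position = 0.5 * position + 0.5 * videos[pick]
    print(pick, round(position, 2))
```

Run it and the picks march from morning-show pap toward the vegan end, even though every individual step is just “closest match”. No step is an editorial choice; the escalation is an emergent property of the update rule.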

This isn’t an intentional choice by “the algorithm” or by YouTube employees. It’s just the closest you can get to creating a meaningful narrative out of a random walk, in the absence of a conscious author. Clicking through YouTube videos will never work like reading a book, because in a book, the author chooses what direction to take you in, and can take surprising twists and turns because they have a plan for where you’ll end up. I’m sure YouTube’s recommendations could be better, but ultimately it can only ever take you where you point the steering wheel.

If you think about it, do you routinely just sit down and start browsing YouTube from the front page, or do your YouTube browsing sessions usually start from a video in an interesting blog post? I think for most people it’s the latter, and the reason is that when a separate, conscious human gives you a video link, it kicks you into a space of videos that you couldn’t have found by just letting YouTube echo your own clicks back to you.


Then why does it keep trying to get me to listen to The Clash on my Nick Drake channel?
Pandora also turned my Cramps channel into all surf, all the time, and won’t play the Meat Puppets on my Meat Puppets channel. I find that though it does give a few options to design particular channels, it doesn’t really listen to any of them, but instead is only interested in what it has decided an individual likes.