5 Minute Crafts, debunked

Originally published at: 5 Minute Crafts, debunked | Boing Boing


I’m still actively watching the video, so I can’t comment on the ‘who is the origin of these things?’ question, but I feel like there’s this toxic value in misinformation precisely because it spreads. If you make a video showing someone doing something and getting an outcome, either you follow the video and go “wow, that worked! it’s amazing!”, or you follow it, it doesn’t work, and now you go “wow, this doesn’t work!”

Which one of those are you more likely to share? I’m going to assume the one that didn’t work, because it looks like it should work and did not. Now you have some sort of “aha, I found a trick!” mentality. I imagine it’s like those Taboola/Outbrain things at the bottom of blogs that say “The Rock jailed for six months!” next to a picture of Vin Diesel from a movie where he was in jail.

My second assumption, based on something I saw on Twitter years ago, is that they’re deliberate misinformation from malicious state actors. That sounds like an instance of what I said above, though…

Edit: ahh, it’s ‘for the algorithm’. The engagement algorithm. I think the most nefarious thing about the engagement algorithm is that it doesn’t really require anyone to actively push bad things on people. Humans react most strongly to certain things, seemingly as part of human nature, and if you simply optimize for that, you end up with negative, inflammatory, misinformative, quick-fix content.

Edit 2: ahh, and there’s the pro-Russian history video stuff I was remembering!

I’d be curious how many of the views were powered by bot farms. While I’m sure Google tries to weed bots out, they exist on Twitter and Facebook, so why not YouTube? (And as long as the bots are convincing advertisers, it’s not really in the ad platform’s best interest to get rid of them.)

Also, if bots seem more “lifelike” by engaging with multiple services across different ad-seller platforms, then maybe the videos help disinformation in an indirect way as well: by providing known content for the bots to consume. Then the bots can go post propaganda on Twitter et al. as seemingly normal users.

I was a little disappointed because they said the Russian content farms were pushing fake history, but we never actually got to see that. Instead, they described two “maybe someday in the future X might happen” videos.

Watch Ann Reardon’s OTHER debunking videos on a TON more fake craft channels on YouTube; some of them are basically clones of Five Minute Crafts in the form of cooking shows.

Did we cover the “fake turtle rescue” channels yet? (Just search for that on YouTube.)
