Originally published at: https://boingboing.net/2019/05/07/production-of-ignorance.html
danah boyd explains the connection between the epistemological crisis and the rise of far-right conspiratorial thinking
Google has some ‘splainin to do…
these assholes love beating their drums to some imaginary beats. the tin foil hats (red hats) always hear the beat and are ready to march and salute.
I think the problem today is that people treat computers as truth tellers rather than as machines that hand you seemingly random statements of perceived fact, which they are admittedly great at organizing. But once you get into the realm of validating the truth of those statements/facts it gets hazy, and downright impossible in some cases. That's why it's better to see computers as worthless in this respect at best, and dangerous at worst, since they wind up reinforcing existing narratives (seems as if the post-modernists and their criticisms of narratives were valid after all, eh?). So if you go onto YouTube to find out about anything, whether it's some boring programming language or the latest fashion, take it as a given that it will carry significant bias from its content producers. And all YouTube's search algorithm is going to do is relate each video to the others on the basis of the search terms (categories) the content producer assigned to it. The rest comes down to gaming the search algorithm to bring a video to the top of the first page of results. So never take anything you see online at face value is all I'm saying.
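To make the mechanism in that comment concrete: here's a minimal, hypothetical sketch (this is not YouTube's actual algorithm, and the video names and tags are invented) of what ranking "related" videos purely by overlap of creator-assigned tags would look like. The point is that under such a scheme, whoever writes the tags controls what gets surfaced next.

```python
# Hypothetical sketch: "relatedness" computed only from creator-assigned
# tags, so content producers fully control what links to what.

def jaccard(a, b):
    """Overlap between two tag sets: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Invented example catalog: tags are whatever the uploader chose.
videos = {
    "cdc_vaccine_facts": {"vaccines", "cdc", "health"},
    "antivax_rant":      {"vaccines", "truth", "health"},
    "cat_compilation":   {"cats", "funny"},
}

def related(video_id, catalog):
    """Rank all other videos by tag overlap with the given one."""
    tags = catalog[video_id]
    others = [(v, jaccard(tags, t)) for v, t in catalog.items() if v != video_id]
    return sorted(others, key=lambda pair: pair[1], reverse=True)

print(related("cdc_vaccine_facts", videos))
```

Note how the anti-vax video ends up as the top "related" result for the CDC video simply because its uploader chose overlapping tags; gaming the ranking is just a matter of picking the right tags.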
I don’t think you actually need to search for anything: if I am not logged into my account, I end up with far-right shit being recommended to me. I hate having Google easily collect their records on me, but it feels like the lesser of two evils right now.
This article makes the whole thing seem very intentional, like there are clever malicious people manipulating the media. I’m sure there are people doing this; I doubt there are terribly clever people doing this and successfully manipulating people on a large scale.
Like maybe the most insidious sounding (to me) thing in the article is the idea that people are intentionally watching the CDC video on vaccines and then watching anti-vaxx videos and commenting on both in order to link the two together in youtube’s “mind”. Again, I imagine some people are doing this with bots, but I bet google is better at catching bots than those people are at making them. What I’m more concerned about is that if that’s how you link things together on youtube, it is inevitably going to happen whether anyone tries to do it on purpose or not. I don’t think I’m going out on a limb to guess that hardcore anti-vaxxers are considerably more likely to watch CDC videos about vaccines than a person who got their kids vaccinated when their doctor told them and never gave it a second thought.
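The co-engagement effect this commenter describes can be sketched in a few lines. The following is a hypothetical toy model (the user names and event log are invented, and real recommenders are far more complex): if a system counts how often the same user interacts with two videos, anti-vaxxers who watch both the CDC video and anti-vax videos will link them together whether or not anyone is doing it deliberately.

```python
# Hypothetical sketch of co-engagement linking: count, for each pair of
# videos, how many distinct users interacted with both.
from collections import Counter
from itertools import combinations

# (user, video) interaction log -- invented data for illustration.
events = [
    ("alice", "cdc_vaccines"), ("alice", "antivax_doc"),
    ("bob",   "cdc_vaccines"), ("bob",   "antivax_doc"),
    ("carol", "cdc_vaccines"),  # vaccinated her kids, never gave it a second thought
]

def co_engagement(log):
    """Count how many distinct users engaged with each pair of videos."""
    by_user = {}
    for user, video in log:
        by_user.setdefault(user, set()).add(video)
    pairs = Counter()
    for vids in by_user.values():
        for a, b in combinations(sorted(vids), 2):
            pairs[(a, b)] += 1
    return pairs

print(co_engagement(events))
```

Here the CDC video and the anti-vax video get linked by two users while the casually-vaccinating parent contributes no link at all, which is exactly the "inevitable whether anyone tries or not" dynamic described above; no bots required.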
These algorithms are trying to maximize engagement. I’m asking myself in what ways “maximizing engagement” itself is the problem. You maximize one thing, you necessarily set another aside.
A related, anecdotal story I just read:
Make no mistake, the alt-right and their sugar daddies (Putin, Thiel, a whole host of others) are doing this deliberately, bamboozling and grooming vulnerable people as recruits to their cults.
To be fair, a lot of 13 year old boys go through an oversimplified power-fantasy phase where they want the world to be black and white, and want to be lone warriors against a backdrop of bossy, withholding girls and even bossier moms. Most, of course, grow out of it with the growth of their frontal cortex. Strike while the iron is hot, I suppose…
Not enough of them. As bad as it is when the alt-right jumps on this opportunity to groom a kid, the more long-lasting and destructive effect in the U.S. has been the Libertarian and Objectivist siren song of “mommy can’t tell me what to do!” that’s stuck with some powerful and determined men long past adolescence.
Papasan explains the connection between the epistemological crisis and the rise of far-right conspiratorial thinking
1] Stupidity & Ignorance are a perfect breeding ground for the Right Winger Hucksters
2] See #1
One tactic is to exploit “data voids.” These are areas within a search ecosystem where there’s no relevant data; those who want to manipulate media purposefully exploit these. Breaking news is one example of this.
Recent example: Bill Barr and his Mueller Report “summaries”.
That’s a terrifying article. It’s so reasonable to try and keep things in context and think that the internet / social media is just the latest technology that the older generation is freaking out about (telephones are ruining teenagers! too many books make kids lethargic!) and that there’s nothing new under the sun…BUT…I just don’t know if that’s the case when it comes to the speed and all-consuming nature of the internet now for kids.
It’s so easy for a kid (hell, adults, too) to turn on a never-ending firehose of information that confirms and amplifies toxic views like this shit, and then get caught in a kind of feedback loop that is different in scope and kind from what was there before.
Get off my lawn, maybe?
Yesterday at work the new hippy kid was listening to some live Dead on YT. Between videos an “ad” popped up talking about left-wing bias in media and pointing out how many times conservative media is called out for racism. I skipped without looking at the source or hearing more, but I immediately thought about what might have been in his search history to algorithmically mark him for recruitment. I’m not assuming he has any right-wing ties or sentiments, he seems quite the opposite in fact, but he is a young man from the rural South and I imagine listening to the Dead signals some sort of “searching for identity” trigger. In other words, bait.
My oldest kids are thirteen now. We have been extremely limiting with media access. They have no devices other than their Nintendo DS (which they only recently got and lose the second there’s any turmoil), were never allowed to watch unsupervised and then only from an extremely limited and curated list, and will never, ever have a smartphone until they move out. We have simultaneously been working with them on media and tech literacy so they’re not “behind” (as if). I don’t regret any of my decisions one bit.
I don’t know if that’s how it works, but if it were, it should at least work in reverse. That is, someone watching anti-vax videos to support their assumptions might be exposed to a video of the opposing view, and the benefits of vaccination should be the stronger argument.
But I don’t really believe that people go into the internet with a malleable viewpoint. They start with an assumption and look for those who will affirm it.
If you’re a young earth creationist, you’re not going to be swayed by the occasional exposure to scientific accounts of the Pleistocene era. You’re just going to hang out with people with the same views to reinforce what you already believe and ignore all evidence to the contrary as the stuff of fantasy.
ahem - russia and the 2016 election?
That is incredibly fucking depressing.
But people have studied how anti-vaxxers react to information. I’m sure correct information can be useful to put out into the public, but apparently anti-vaxxers tend to become convinced they are right when given actual facts.
I know they did this, but I’m far from convinced they were competent. I’m pretty sure spies are idiots. (I’m not claiming they didn’t influence the election; with how close it was, we could probably pin the result on light rain in a few counties. Trump needed everything to go his way, including the Russians.)
So, the world is tough, and people, especially young people, are stupid. But they make tons for media companies, so their stupidity is fostered.
There has to be a bottom somewhere down here. We’ll hit it eventually, right? Before the world literally ends? Right?
That sounds awfully nihilistic. I think you get it right that computers and the internet are misrepresented as non-partisan arbiters of truth (giving rise to absurd ideas like, e.g., that machine learning is the answer to all the problems of society and justice), and that they should be considered more as tools. But then you say that computers are worthless and basically we’d be better off without them, when really what you then describe is the capitalist implementation of media on the internet, where ad revenue and minimum liability are the only things optimized for.
You aren’t going to beat the right wing’s attack on epistemology by telling people not to trust anything they see on the internet. No, you can only beat the right wing by using every tool at your disposal to organize the left and to fight the right.