Facebook blames those pesky algorithms for approving pro-genocide ads in Kenya

Originally published at: Facebook blames those pesky algorithms for approving pro-genocide ads in Kenya | Boing Boing

4 Likes

To understand the harm that Facebook’s platform is creating around the world, one need only look at how much hate speech and misinformation spreads through the English-language portions of the website, then extrapolate how much worse things are in places where genocide is an ongoing threat and Facebook has literally no moderators who speak the local language.

22 Likes

I have no doubt that the world would be a better place without this toxic slag pile of a company.

15 Likes

Add to that the fact that Facebook has worked hard to make its “services” indispensable in the region. Its market penetration is far higher in Africa than elsewhere, and it’s an even bigger part of daily life on the internet there than it ever was in most other regions.

How Facebook took over the internet in Africa – and changed everything | Facebook | The Guardian

13 Likes

I thought this sounded depressingly familiar:

Each time Global Witness has submitted ads with blatant hate speech to see if Facebook’s systems would catch it, the company failed to do so. In Myanmar, one of the ads used a slur to refer to people of east Indian or Muslim origin and call for their killing. In Ethiopia, the ads used dehumanizing hate speech to call for the murder of people belonging to each of Ethiopia’s three main ethnic groups — the Amhara, the Oromo and the Tigrayans.

10 Likes

Why would they moderate content when that content is the prime mover for “selling” the platform?

(I know, it is hard to blame the algorithm when that same algorithm is their bread and butter, but ultimately Facebook needs to acknowledge and own the problem, since a human approved that algorithm for production use.)

3 Likes

both machines and people make mistakes

In Facebook’s case, over and over and over again. Though it’s not a “mistake” when the machine is doing exactly what it was programmed to do…

9 Likes

Ironically, there was a terrible ad on this post when I read it. Not genocide-level bad, but… well, maybe. Depends on your perspective, I guess. Anyway, I’ve actually never seen an ad for this product online before and didn’t realize it was legal in the US, but a quick Google search tells me it is, surprisingly. I thought I took a screenshot, but apparently I did something wrong. Anywho, it was an ad for Winston menthol cigarettes. I was kinda shocked.

5 Likes

And because it is a neural net, the ‘algorithm’ is a black box so you can’t determine its reasoning process. What a great get out for the company!

2 Likes

I’m sure Facebook doesn’t mean to amplify this kind of thing; it’s just that they don’t care, and the ROI magic of “platform” companies is that they use zero marginal labor per user, and so can expand as fast as servers can be plugged in. It would annihilate their share price if they admitted that the accountable, non-lethal version of Facebook would mean building out a hierarchy of human staff in every city on Earth.

And because the commodity they’re algorithmically moving is attention, the machine will automatically seek out and amplify propaganda. Affluent users can be funded from Apple or BMW’s ad budgets, but if users don’t have money to spend, the only “advertisers” who will pay for their eyeballs are political interests seeking to buy power. That is baked into the foundation of Facebook’s business.

The justification for “disruption” is that you have to try stuff to see if it works. But the converse is that if it doesn’t work, you have to stop doing it. At this point, there is no excuse for being invested in Facebook, and I resent having to endlessly debate this closed question just so their shareholders can feel OK about their dirty money.

2 Likes

If your excuse for not running your business ethically is “It’s too expensive” then you are the problem.

9 Likes

…look no further than my dad. He was always a conservative-leaning sorta fellow, distrustful of government, but a nice rational man who went along and did his civic duties. Since he found Facebook (and now spends all his free time there, replacing all other media) he has become a gun nut, antivaxxer, conspiracy theorist whom I can now barely have a conversation with.

Fuck you, Zuck. You ruined my dad and there’s a special circle in hell for you. I’ll make sure of it when I get there.

8 Likes

Yep…

Sometimes I feel like I’m stuck in a loop with older family:

  1. No, I didn’t see that on Facebook. Not everyone sees the same posts.

  2. That’s not true… Yes, you can lie on Facebook.

  3. It’s more complicated than that. That is ignoring a lot of important details.

  4. Yes, relative/friend X is a conspiracy theorist/jerk.

Repeat…

There seem to be a lot of people in the Boomer and older generations who intrinsically trust everything on Facebook, just as they trusted conventional news previously. Except now they don’t trust the news, because a friend reposted something and that is intrinsically trustworthy.

The only reason I am on Facebook is a couple of special-interest groups (hobbies), and even that is getting algorithmed into a useless shell of what it was. Ugh.

4 Likes

Saw an article weeks ago at The Guardian UK about this.

faceschnook simply doesn’t give a rat’s ass about anything but making money off the people who use it, and getting all data possible on everyone who doesn’t.

Delete it. Get it out of your life. Call people you love with your telephone. E-mail still exists, and it won’t try to ruin your life the way faceschnook does. Start a blog on tumblr. It may be a hellsite, but it’s nothing compared to faceschnook.

2 Likes

I’ve had that exact conversation so many times. I know exactly what you mean.

Yah, it’s an actual biological fact that we become more trusting and naïve as we get older (scams target the elderly for a reason), and that combines into a perfect storm with echo-chamber algorithms and inexperience with new media in the Boomer generation. My dad now regularly says insane things, and when I ask him where he got the information he says “online”. But even from the wording he uses, you can tell it was a Facebook meme.

He finally learned to stop sending me said memes (and I left FB, so it’s harder now), but he genuinely believes all of them. I explained that they are literally just made up by Russian troll farms and idiots on Reddit, but he doesn’t understand the ecosystem. It doesn’t make sense to him that people would make up toxic shit just for the lulz, so he feels they must be mostly true. Because why would anyone bother?

5 Likes

My father-in-law has a similar condition. He’s deep into conspiracies, believes Rand Paul, and has recently taken to calling my spouse “Satan” (for arguing with him? For confusing him? For accusing him? It’s not clear). She is enjoying the new title and is leaning into it on Twitter.

I read her some of what you and @BradC said because it rang so true, and she suggested this tweet because “even Satan left Facebook for Twitter” :open_book::fox_face:

3 Likes

One aspect I have always found interesting about the increased trust we see in seniors is that there seems to be a lot of trust in individuals (even strangers, as many studies show), combined with mistrust of larger groups and organizations (unless they align with one’s existing worldview).

i.e., a person I barely know said this crazy thing, and/or my friend reposted it, so I trust it. Versus: the news/government said X, and it’s obviously not true because (insert something along the lines of conspiracy, or “politicians always lie”).

Facebook’s ability to tap into this “personal connection” trust mindset on behalf of a random Russian troll farm is a really impressively evil accomplishment.

Related tangent:
As someone who works in government-run healthcare, I sometimes feel that I should be flattered people think we are organized enough to pull off a conspiracy :joy:

3 Likes

This topic was automatically closed after 5 days. New replies are no longer allowed.