Once again, Facebook blames an algorithm instead of taking responsibility

I am 100% prepared to believe that automatically flagging this video was not a tractable problem for Facebook. I haven’t seen it, but the fact is that images of mass murder are a staple of mainstream discourse (though a nipple or, Christ Jesus forbid, a dick, is an outrage in any context).

The issue is: so fucking what? This video was up for about a day – long enough for newspapers to write stories about the fact – before a single one of Facebook’s human employees took the initiative to click the button to block it.

Reporters and politicians need to stop being so pathetically blinded by nineties science. In 2019, Facebook’s spokespeople shouldn’t be allowed to even finish a sentence like this without being gavelled for contempt of bullshit.

(As another software developer with a CS degree)

The problem is that this is a bed that Facebook, Google, et al. made. You’re a congressperson: Google tells you they have algorithms that can tell whether a dog is in a picture. Facebook suggests tagging you in a photo based on the image alone. Two years ago, they claimed that this algorithm would prevent exactly these kinds of videos. The only purpose of these censorship algorithms is to look like magic, so that they have something to blame when legislators ask what they’re doing about bad thing X, and so that they don’t have to spend real money hiring real moderators – and getting those moderators real psychiatric help when they are inevitably harmed by their jobs.


I wonder what it would take for the authorities to drag out and arrest Zuck, kicking and screaming Julian-Assange-style. https://gizmodo.com/julian-assange-dragged-out-of-ecuadorian-embassy-and-ar-1833964012


Close enough?


Aside from the difficulties of machine vision in general, the “looks like Call of Duty” comment from one of the congress members seems like a very good reason for Facebook’s people to know that the task they were selling their algorithms as capable of was even harder than it looked (which makes selling them as a solution sketchier).

FB doesn’t seem to have dethroned YouTube (for episodic production) or Twitch (for live streams), but they have designs on that area, as well as designs on being a news source. So they know that any anti-violence algorithm is going to be subjected to video game stimuli as well as TV-level news and true-crime-drama stimuli, which makes achieving an acceptable false positive/false negative rate trickier. “Real life is unrealistic” is probably also in play: footage of real people actually being shot, stabbed, or punched is generally less dramatic in terms of visible gore than game or theatrical footage, as well as less readily available for training purposes.
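
To see why the false positive/false negative rate matters so much at Facebook’s scale, here is some back-of-the-envelope base-rate arithmetic. All the numbers below are made up for illustration – they are not Facebook’s actual figures – but they show how even a very accurate classifier drowns human reviewers in false alarms when real violence is rare among uploads:

```python
def flag_counts(total_videos, violent_rate, sensitivity, specificity):
    """Return (true_positives, false_positives) for a screening classifier.

    sensitivity: fraction of genuinely violent videos that get flagged.
    specificity: fraction of benign videos that correctly pass unflagged.
    """
    violent = total_videos * violent_rate
    benign = total_videos - violent
    true_pos = violent * sensitivity          # real violence caught
    false_pos = benign * (1 - specificity)    # benign videos wrongly flagged
    return true_pos, false_pos

# Hypothetical: 10 million uploads, 1 in 100,000 depicts real violence,
# and a classifier with 99% sensitivity and 99.9% specificity.
tp, fp = flag_counts(10_000_000, 1 / 100_000, 0.99, 0.999)
precision = tp / (tp + fp)
print(tp, fp, precision)
# Roughly 99 true hits buried under roughly 10,000 false alarms:
# about 99% of flagged videos are benign, so humans still do the real work.
```

The point of the sketch is that “our algorithm catches 99% of it” and “almost everything the algorithm flags is harmless” can both be true at once, which is exactly why the human-moderator question doesn’t go away.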

If there were evidence that Facebook is sandbagging on machine vision tech, this isn’t it (and one doubts that they are); but the fact that Facebook wants to be all things to all people makes their job harder, and so any selling of their tech as adequate for the purpose is less honest or plausible.


‘Image: Wax figure of the famous Mark Zuckerberg from Madame Tussauds, Siam Discovery, Bangkok’

This is the only waxwork in history to look less creepy than the original.

This topic was automatically closed after 5 days. New replies are no longer allowed.