Originally published at: http://boingboing.net/2016/09/11/why-facebooks-its-too-ha.html
…
Nude picture of a child screaming in pain. I’m OK with Facebook flagging that by default.
Yes, Facebook should suppress pictures of people screaming in pain. I don’t want Abu Ghraib in my face, or all those sorry civilian casualties in those far-off countries in my news. I prefer images of raising the flag over Iwo Jima, and the “mission accomplished” stuff from the 2003 Iraq invasion, and other true depictions of the USA’s military might. And would it kill Facebook to airbrush happy faces onto all these funeral mourners and bereaved? All this other stuff is just gross and makes it hard for me to concentrate on my portfolio.
And yeah, as for nudity, it’s of course freakin’ disgusting all by itself. C’mon, Facebook, do your job!
Facebook should ask the mutaween for help. They certainly share the same moral values.
Maybe a Raistlin filter would help. All those pretty young millennials turn into corpses, all those food photos moulder and rot, and all those endless lovely sunsets turn grey and dull and dead:
“And my eyes! I see through hourglass pupils and therefore I see time as it affects all things. Even as I look at you now, Tanis,” the mage whispered, “I see you dying, slowly, by inches. And so I see every living thing.”
OT I guess but I would use the everloving fuck out of FB if you could have that installed.
Engineers solve engineering problems because that’s how their brains are wired, I guess. Which is fair enough. But when your solution is earning a lot of money, and other, equally talented engineers’ solutions aren’t, that kind of implies you’ve also brought someone in to solve the money-management problems at some point. And if you can get someone whose brain is wired up to solve money problems, why can’t you hire someone whose brain is wired up to solve social-management problems?
Many organizations have been able to do this before, not mistaking one form of nude image for another. You’d think something could be changed in the reporting process. If they tested it enough, they could probably offer more than one option for nudity reporting and tweak it until they could tell from report categories and percentages which images fall where.
E.g., I’d like to report:
A historical image that displays nudity
An artistic image that displays nudity
A pornographic image that displays nudity
That might not be the best wording or set of categories, but tweak it enough and I bet you’d get a reasonably accurate account of where an image falls on the spectrum, and what to do about it (see the sketch below).
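For what it’s worth, here’s a minimal sketch of how those report categories might be turned into a decision once enough reports come in. The category names, thresholds, and actions are all hypothetical, not anything Facebook actually uses:

```python
from collections import Counter

# Hypothetical report categories, following the list above.
CATEGORIES = ("historical", "artistic", "pornographic")

def classify_reports(reports, min_reports=20, porn_threshold=0.6):
    """Decide what to do with a flagged image based on the mix of
    report categories it has received. Purely illustrative."""
    counts = Counter(r for r in reports if r in CATEGORIES)
    total = sum(counts.values())
    if total < min_reports:
        # Too few reports for the percentages to mean much yet.
        return "queue_for_human_review"
    shares = {cat: counts[cat] / total for cat in CATEGORIES}
    if shares["pornographic"] >= porn_threshold:
        return "remove_and_escalate"
    if shares["historical"] + shares["artistic"] >= porn_threshold:
        return "keep_with_content_warning"
    return "queue_for_human_review"

# Example: 18 "historical" reports and 4 "pornographic" ones
# would not trigger automatic removal.
print(classify_reports(["historical"] * 18 + ["pornographic"] * 4))
```

The point is just that once reports are bucketed, the percentages do most of the work; the thresholds themselves are exactly the thing you’d have to test and tweak, as described above.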
Simple. All Facebook has to do is develop an AI that understands context. Maybe they can get some help from the MPAA.
I’m just glad to see a quote on BB pointing out that engineering problems are relatively easy in comparison to the much more complex anthropological ones.
That’s not really a fair characterization of the problem. I think the issue for Facebook is that the cost/benefit of solving this is way off. If they get it wrong, they’re distributing child porn. It’s not being prudish about nudity (in this case; I think we have an established history of Facebook being prudish), it’s that naked pictures of children are illegal.
This photo has incredible social value. But I can’t see how they ever turn that assessment into an algorithm that an army of underpaid reviewers can apply without inevitably ending up with something illegal on the site. So while I wish they would solve this I understand why they don’t.
It’s a business and is under no obligation to push or even countenance a social agenda you or I agree with, nor is it under a legal obligation to be fully transparent as to its reasoning.
How they make their money is no more or less shady than BB using link tagging/cookies to monetize your Amazon shopping basket when you curiosity-click something.
How they control content is no more nefarious than BB masking my violent anti-police and anti-authority commenting.
I’m sure most of you are, in life, adults who don’t act like it’s “unfair” for a private business to behave like a dispassionate money vacuum, and who don’t throw tantrums over it.
I still have to side with FB on this one. The problem isn’t this ONE photo, which could clearly be an exception to the rules. The problem is the THOUSANDS of other photos which might be grey areas, and the people griefing, trolling, or having legitimate concerns about being able to post them.
Moderation is HARD and IMPOSSIBLE to do consistently. Just look at the little slice of the internet that is the BB BBS. Moderation here a few years ago was much less consistent. Even today there are disagreements over what should or should not be discussed, which comments are out of line, and whether an argument is a genuine argument or just someone looking to be a dick and argue about something.
On top of this, FB is mainly a general social platform, to keep track of friends, loved ones, and cat videos. It isn’t like a legit news site where people are dependent on it to learn the truth of events (according to some of the shit that pops up on my feed, it is just the opposite.)
They could have cropped the photo, put a clickbaity title on it (“You won’t believe why this child was covered in napalm and lit on fire.”), and then linked to a page showing the full photo and the context behind it.
I can agree that the policy is a bit lazy, but at the same time moderation and weeding out content is time consuming, expensive, and there WILL be inconsistent moderation. Maybe if they started charging users a monthly fee it would be worth their while? But considering it is FREE, I think they owe the user nothing. If you disagree with the way they run their ship, use something else.
Same with the BBS. Not everyone agrees with mod decisions, but it is their ship and their servers, and sometimes it is easier to ban someone than let them continue to muddy the waters with their voice.
ETA - also, private groups can post anything they want.
They’re not dispassionate about vacuuming money, though. They’re very passionate about that; just try getting them to do something that will mean they earn a fraction less money, say by accepting some trifling regulation, or hiring editors to make value judgements about which social mores take precedence.
So, because you can’t demand engineering levels of consistency in human behaviour, no attempt should be made? Perhaps you’re waiting for consistency in the production of human beings to reach a sufficient standard to implement social solutions?
The question for me is whether or not they have passed into the category of “government” though. Some corporations have come close to this point in the past; I think this may be the first time it has actually happened. And since it’s a very new thing, we haven’t yet even really begun to consider what it might mean. But if they have, then they need to be viewed in a very different way.
I don’t see a point in making a distinction in what is a government and what isn’t a government at this point.
I have no more ability to actively resist Mark Zuckerberg if he wanted to destroy my life than I do if any government wanted to do the same. Fuck, if they wanted me in jail for some reason I have no doubt that they could fabricate enough evidence to sway a judge and a jury.
I probably have more influence over an elected government than I do over Facebook.
But let’s erect a pillory for Facebook because they don’t like controversial images and in the meantime let’s elect Hillary Clinton to the office of president and shit on people who question her integrity.
I think all of your points are accurate, but…there’s that “but”, you know? Other for-profit organizations manage to come up with ways to be more than just money-making machines. Other for-profit organizations recognize that sometimes the rules need to be bent for a good reason. Other for-profit organizations factor human behavior into their product design in order to do better work, not just better-paying work.
My thought is that yes, an algorithmic approach can be excused for sometimes making the wrong decision, but there always needs to be recourse for someone to flag a wrongly-removed photo. In this case, it would have been easy for a human, once notified, to quickly assess that this photo is monumentally significant, culturally and historically, and not just reinstate it but also add it to the set of photos the algorithm never removes again, and flag “similar” photos (same photographer, same posting account, etc.) as “humans should review future flaggings in this category until we feel the algorithm is heuristically steered back on course.” From what I can tell, in this case Facebook stonewalled even after being notified of the photo’s symbolism… and beyond that, I don’t understand why the photo wasn’t already in a big repo of “do not remove these photos because they’re incredibly well-known and valuable.”
(NB: I’m familiar with machine-learning/sorting/AI algorithms, have used and written several myself, and this type of thing seems eminently doable to me.)
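To make that concrete, here’s a minimal sketch of the recourse loop being described, assuming made-up names and a deliberately crude exact-hash allowlist; it isn’t Facebook’s actual pipeline:

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class ModerationPipeline:
    """Illustrative sketch: an automatic classifier, an allowlist of
    never-remove images, and an escalation path that routes uploads
    from previously-misjudged contexts to human review. All names and
    data structures here are hypothetical."""
    never_remove_hashes: set = field(default_factory=set)
    watch_list_accounts: set = field(default_factory=set)

    def reinstate(self, image_bytes: bytes, uploader: str) -> None:
        # A human has confirmed the image is fine: remember it, and
        # watch the surrounding context so future model mistakes
        # involving the same account get a human look first.
        self.never_remove_hashes.add(hashlib.sha256(image_bytes).hexdigest())
        self.watch_list_accounts.add(uploader)

    def handle_flag(self, image_bytes: bytes, uploader: str, model_score: float) -> str:
        digest = hashlib.sha256(image_bytes).hexdigest()
        if digest in self.never_remove_hashes:
            return "keep"          # known significant image, never auto-remove
        if uploader in self.watch_list_accounts:
            return "human_review"  # context the model has gotten wrong before
        return "remove" if model_score > 0.9 else "human_review"
```

In practice an exact hash wouldn’t survive re-encoding or cropping, so something like perceptual hashing would be the more realistic choice for the “never remove” list; the escalation structure, not the hashing, is the interesting part.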
I suppose one can urge FB to be “better.” But I guess I’m confused by the level of outrage over one pic. Certainly if you want to post about how horrible Imperialist America is - go right on ahead. FB is full of this and other memes.
I mean, it feels like FB is being condemned for limiting speech. Ironically while one could post that particular pic on BB and probably be fine, the range of free speech allowed HERE on this very BBS is WAAAAAAAAY more restricted than the speech allowed on FB.
So I guess I find it ironic, in a lot of ways, that anyone is getting worked up over this.
I guess I have to ask - what is it you want FB to be better at, or to expand into beyond just being a money-making machine? Allowing the exchange of opinions and ideas? Because they are pretty unrestricted in that area. It seems every fringe cause, good and bad, has a voice somewhere on FB. Are they failing because they aren’t making an exception for one pic?
Again, the sheer VOLUME of people on FB makes moderation way more challenging than it would be on a smaller platform, so such comparisons are harder to make.
Also, while there are notable exceptions like TOMS Shoes, most for-profits are more or less money-making machines. They can offset some of this in various ways. Even evil Walmart has partnered with orgs like the United Way and other charities. Many people feel Apple is a forward-thinking, make-the-world-a-better-place sort of company - only its products are built from raw materials sourced in less-than-ideal conditions and assembled in rather dismal facilities, and it keeps tax havens that would make Mr. Burns’ heart flutter with jealousy.
It’s not unheard of for private (for-profit) organizations that provide vital services or goods and hold a monopoly position to get broken up or placed under partial governmental control.