Look at that photo and tell me what kind of sadist would find it pornographic. Slippery slope not included.
I don’t disagree that it’s possible. I just consider it highly unlikely they’ll invest resources in being able to clear the relatively small percentage of naked children photos that aren’t illegal.
I usually trust big powerful entities about as far as I can throw them. I can’t throw Facebook very far, but by voting I can throw my government out of office.
So what? As far as I can tell, western companies are under no legal obligation to refrain from violating human rights in third world countries, from devastating the environment there, or from simply buying the appropriate laws from poor, corrupt governments.
We, the potential customers, can still make demands and threaten to use somebody else’s product if they don’t do what we want them to do.
The difference being that BB is a discussion forum that is moderated in order for us to be able to discuss things with each other, while FB is a platform that people use for publishing things. Facebook happens to be private, but it is being used like a public space.
BB is like a small room where someone hosts a discussion. Maybe a classroom where the teacher decides what speech is allowed and what isn’t, maybe a private person who invites a few friends over and will kick out anyone who does not behave.
FB is like a shopping mall. It’s a place that fulfills about the same function as the city center of an old fashioned (European-style?) town, where there are shops, restaurants, cafés and places to walk in-between. Only you can’t stage a political protest there, because the owner has the right to kick you out whenever he feels like it.
The vast majority of naked children photos aren’t illegal. Here in Austria, “being naked” is not enough to constitute porn. And I sincerely hope that the total amount of photos involving kids in bathtubs or on beaches in places where nude bathing is considered OK far exceeds the number of photos of kids being exploited sexually.
The law might be stricter in America, but that would only add to the “foreign prudery is being imposed on us” outrage in Europe.
It would be as if the majority of shopping malls in America were Chinese-owned and they started imposing rules about which political messages are OK on T-shirts worn by people shopping there.
True. Whenever you delegate censorship activities of any kind* to an intermediary, you get a multiplication effect. If a newspaper decides to publish the image, they’ve got a clear interest in publishing it, and they will be willing to take the responsibility that the image complies with applicable laws. Facebook, on the other hand, has a negligible interest in that particular photo, so they will always err on the side of censorship.
Which is why I’ve always thought that intermediaries like FB should be absolved of the duty to exercise censorship on the content they host.
The intersection between nude images and pornography is fairly small and explicit. Being nude is not in itself a sexual act of any kind. If people are having sex - nude or not - it could be considered pornographic. If people are nude in non-sexual contexts, it is not pornographic. I think that people make this much more difficult in practice than it needs to be.
Beyond whether or not there is sex depicted, most jurisdictions also draw a distinction between sex being depicted for artistic reasons and sex intended to titillate the viewer (“appealing to prurient interests”). The former would be classified as erotic art and the latter pornography. I think this categorization is too subjective for legal review. Rule 34 being what it is, worrying about whether a given photo might possibly turn somebody on (the answer is always “yes”) is a waste of time.
That’s not what’s going on in this discussion. This is a discussion on their obligation to host or not to host content and their further obligation to be transparent about their reasoning behind such choices.
If your deal is to always bring this up when Facebook is up for discussion I can get behind that. I personally enjoy reminding folks about how compromised Clinton is pretty much all the time, so I get tilting at windmills.
But one of my personal windmills is pointing out how people are inconsistent in how they treat like circumstances based on unrelated bias. You know, hypocrisy. It’s my thing.
The fact is, though, it isn’t really that hard an engineering problem, unless of course you insist on a 100% foolproof solution. You could definitely build a system that alerts you within hours, a day at the most, that a takedown is especially problematic (and face it, they knew within a day, but they kept trying to double down). Then you could have reinstatement guidelines along these lines: images that would otherwise be deemed objectionable are allowed if they have clear historical or newsworthy value, where “historical” means any image that has been published in history books not meant to purvey a viewpoint on the list of objectionable viewpoints (racist, sexist… and so forth), and “newsworthy” means a picture published by any major news organization that is not considered to function primarily as the propaganda arm of a government.
As I said: not foolproof, not 100%, but a pretty easy process that lets you avoid stepping in big piles of shit like this (with the understanding that rules such as the ones above are not meant to achieve justice or anything like that, but to provide organizational cover. I’m not saying that’s okay either, just that that is their purpose).
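The guidelines sketched above can be expressed as a simple rule check. This is purely an illustrative sketch of that “historical or newsworthy” test; every name, field, and threshold here is a made-up assumption, not anything Facebook actually exposes:

```python
# Hypothetical sketch of the reinstatement guidelines described above.
# The data model (dicts with "history_books" and "publishers" lists) is
# an assumption made up for illustration.

OBJECTIONABLE_VIEWPOINTS = {"racist", "sexist"}  # "...and so forth"

def is_historical(image):
    """True if the image appears in at least one published history book
    that is not meant to purvey an objectionable viewpoint."""
    return any(
        book.get("viewpoint") not in OBJECTIONABLE_VIEWPOINTS
        for book in image.get("history_books", [])
    )

def is_newsworthy(image):
    """True if a major news organization that is not primarily a
    government propaganda arm has published the image."""
    return any(
        org.get("major") and not org.get("state_propaganda")
        for org in image.get("publishers", [])
    )

def should_reinstate(image):
    """Reinstate an otherwise-objectionable image if it has clear
    historical or newsworthy value."""
    return is_historical(image) or is_newsworthy(image)
```

The point of writing it out is that the rule itself is trivial to evaluate; the hard (and human) part is populating the inputs and deciding what counts as “major” or “propaganda”, which is exactly where organizational cover rather than justice gets baked in.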
I think this comes in part from the bad publicity they got from rightwing nutjobs (who like to lie) over hiring human editors to keep completely false articles out of their trending news. Since then, they’ve been aiming for the appearance of algorithmic impartiality.
Hear, hear. I’m not familiar with the details of this sort of thing, but I know it can be done. FB is just being lazy and cheap.
And it’s too bad they’ve let bad be more important than good: i.e., sure we’ve got to work on the child pornography problem, it’s awful; but at the same time, we’ve got to make sure we never forget things that are this incredibly important. Some people, especially younger people, may never have seen it, and should, so that maybe horrific blunders of this magnitude are less likely to happen again. And what better way than on a social forum where it can be discussed with close family and friends, rather than dispassionately in a boring lesson in class. Maybe someone is linked with someone from that period and part of the world, and can get some more personal context.
Good outweighs bad here, IMO. /flame
Well, a sadist. Kind of goes without saying…
For me, the point is the photo is literally exceptional, and that censoring photos of naked children in pain is a pretty sound default for a social media platform.
Can we put a moratorium on using the word “outrage” to describe a bunch of people simply voicing a negative opinion on something? Or we can continue using it that way, contributing to the evolution of the language like “literally” meaning “really” meaning “very” meaning $INTENSIFIER, if you like, but, you know, you’ll find people reading “outrage” as a pretty weak word after a while, which will lead to it being far less effective than you want it to be.
It’s not a reaction to one pic, it’s a reaction to the background of general Facebook policies, not to mention Facebook’s reaction to the reaction. It’s dumb, irresponsible, and careless policy-making. “It’s too complicated for our steely-eyed engineers” is weak, as is “We only exist to make money, why are you hating us?” Because humans hate inhuman, remorseless monsters, especially ones bigger than we are. Do we have to get the pitchforks and torches to make the point?
Why DOESN’T FB allow people to flag ‘good things’ or ‘valid things’? Crowd-tagging the latest Jaden Smith-is-dead posts as ‘Already Snopesed’ would be useful, just as allowing ‘this is art’ tags would also be. They can put it in beta, see how it works, and abandon it when /b/ fucks it up.
Fair enough. I guess outrage is a bit much.
[quote=“Nelsie, post:30, topic:85103”]…it’s a reaction to the background of general Facebook policies, not to mention Facebook’s reaction to the reaction. … Because humans hate inhuman, remorseless monsters, especially ones bigger than we are.
[/quote]
Wait, what? I thought we just put a… mor…a…tor… oh wait, I get it, you want to make “inhuman” and “remorseless monsters” mean something much less. Because having a lazy policy you don’t agree with hardly makes them “inhuman, remorseless monsters”.
Whatever US corporate law may say, corporations aren’t human, composed as they might be of humans sewn together by contracts and mutual desire for money. Thus, “inhuman”. If the only valid reason for a corporation’s existence is to make money, as you claim to believe, then it must needs be remorseless to be successful. A corporation that allows anything to hold it back isn’t performing adequately, and must be replaced by another money-vacuuming machine with no conscience and no desire except to suck up all of the money. Remorseless and inhuman, isn’t that the very definition of “monstrous”?
For someone who suggested I tone down the word “outrage”, lest I mutate the language, you sure are painting your opinion of corporations with some very colorful hyperbole.
It almost makes me want to delete my account, lest I become one with the beast.
So this is about a strict adherence to a non-workable ideology (that moderation must be automatable and this problem must be solved by technology only)? Is that why they can’t just hire one or two human beings to do the highly subjective work of moderating posts the system flags?
I’m not so sure it can be. How can a machine tell the extremely subtle difference between a pornographic photo of a nude child and a picture of a nude aboriginal child? The difference is minutely subjective. Can or could algorithms differentiate between art and noise?
More manufactured outrage. How many acceptable-to-post photos of naked children exist? I’m surprised to be in a position to defend Facebook since I despise most of what they do but a 0% tolerance for photos of naked children is just fine with me.
If there was an Olympic medal for missing the point in favor of getting angry, many BoingBoing readers would win medals.
This is likely the issue; in some jurisdictions, if there’s the slightest chance that some pedo is getting off to the image, it’s illegal.
As it should be imho.
I suggest “pearl clutching”.
Actually, some of these ‘soft’ problems are fundamentally insoluble, and are just not worth wasting time and money on unless your fundamental objective is specifically to fight one side of that particular argument. Otherwise, no matter what you do, you will piss people off. If your actions piss off group A, do you take them back? When that pisses off group B do you reinstate them?
Facebook is pretty global, and has to deal with all kinds of different cultures and beliefs about what is right/wrong/acceptable/unacceptable. Finding a common ground that everyone agrees on is not going to happen, no matter how good their programmers are.
Maybe it’s not their place to deal with them. You solve them, and then once you’ve figured out what the requirements are, the techies can implement those solutions in software. You don’t expect a dentist to fix your car, or a mechanic to do your brain surgery. Why do you expect software engineers to solve your cultural/anthropological problems?
[Edit: In case my stance is unclear, I’m against censorship, but I also think that it’s unreasonable to blame and fault software engineers for failing to solve moral/ethical problems that the world’s greatest philosophers, social workers, and anthropologists can’t even begin to deal with.]
What a lovely straw argument you make. Tell me “SomeDude,” how many different pictures of naked children screaming in pain do you think should be on Facebook to meet your standards?[quote=“FFabian, post:4, topic:85103”]
Facebook should ask the mutaween for help. They certainly share the same moral values.
[/quote]
Really? Is Facebook telling people how to dress and physically assaulting them for not covering their head and face? Really?
Of course they aren’t. Facebook’s posting guidelines are not comparable to the physical coercion of the religious police in Iran. Both you and “SomeDude” are bringing up straw arguments. What is more shameful is the people who liked your post without bothering to figure out whether what you said was actually a valid analogy. It was not.