xeni — 2014-07-03T17:12:59-04:00 — #1
jerry_vandesic — 2014-07-03T17:29:40-04:00 — #2
Every company that does data-based marketing runs experiments on its customers. Catalog companies try out different covers, credit card companies experiment with different interest rates, cable companies investigate the impact of changes in envelope color and typeface. All of this is to achieve desired business outcomes. It has been going on for fifty years and isn't new. In the past decade, I have worked for companies that ran tens of thousands of market experiments a year. If you want to opt out, you are going to need to drop FB, throw away your catalogs, cut up your credit cards, and disconnect your cable (among other things).
regular — 2014-07-03T17:36:19-04:00 — #3
Logical fallacy: appeal to tradition.
The difference is that most users do not consider messages from friends and family to be 'marketing'. Credit card solicitations in the mail are far different from communication between you and your loved ones.
lloydcogliandro — 2014-07-03T17:52:49-04:00 — #4
Not to give Facebook a pass here, but how is this any different from the Fox News tactics convincing their sheep that the sky is falling as long as God's Own Party isn't running things? What differentiates a website from a news channel?
funruly — 2014-07-03T18:11:19-04:00 — #5
The "website" purports to connect you to your friends and family.
Not your happy friends. Not your sad friends. ALL OF THE FRIENDS.
shuck — 2014-07-03T18:26:45-04:00 — #6
But if we just look at Facebook, they have been doing similar experiments from the get-go as well. The only real difference is that this time they were open about what was done and published the results. Not that I'm trying to justify anything (I consider all that usual experimentation to be morally suspect, at best), and I recognize that people can only really respond to what they know for a fact has happened (as opposed to what they can only assume is constantly happening), but it does feel a little perverse to be pushing Facebook in the direction of being less open, which, if anything, is what this outcry does.
regular — 2014-07-03T18:46:56-04:00 — #7
they have been doing similar experiments from the get-go as well.
That totally makes sense. That's why I explained to the last cop that pulled me over, "But officer, I ALWAYS drive 30 mph over the speed limit."
it does feel a little perverse to be pushing Facebook in the direction of being less open, which, if anything, is what this outcry does
Which is why we should not teach children the difference between right and wrong because that'd just teach them to hide their bad deeds better.
jerry_vandesic — 2014-07-03T18:48:21-04:00 — #8
@funruly: "ALL OF THE FRIENDS"
If only it were true. They probably have language in their TOS that lets them select what they think you should see. They have been doing this with companies for a while now. If you really want to reach more than a couple percent of your followers, you need to pay up.
funruly — 2014-07-03T19:00:58-04:00 — #9
You seem remarkably blasé about the market segmentation of friends.
shuck — 2014-07-03T20:08:51-04:00 — #10
This feels more like those "you can do whatever bad thing you want, so long as I never find out about it" types of moral instruction. The parent who knows their child is off doing horrible things but never asks or presses the issue, so they can pretend nothing's wrong and retain a certain sort of plausible deniability; yet when the child confesses, unbidden, to something, they unleash their wrath on the child. Seems like that just teaches the child to hide their transgressions, not to avoid committing them.
Again, this is not a defense of Facebook, it's just a thought that perhaps people should have been at least this upset - and asking questions (and demanding answers) - a whole hell of a lot earlier.
arghanurism — 2014-07-04T00:11:23-04:00 — #11
I think we are uncomfortable about fb in general because it isn't something that we can control, in that we have no idea what is and isn't being done with the information.
But the pickle here is that it has chosen to collaborate with researchers who publish their research, and in so doing conform to a standard of ethics. In that standard, it is generally accepted that you must seek to inform people before you do experiments on them.
It appears that these researchers are in violation of these long established ethics. Doing it at this scale makes it all the more egregious.
The question that I'm interested in is whether, through this controversy, facebook can legitimize this type of research--something in line with its previous history of pushing its 'privacy is immaterial in the modern age' agenda, something it profits from directly.
imaguid — 2014-07-04T01:47:33-04:00 — #12
if people (and FB) could stop pretending what was published in a scientific journal was just more of the same product/service testing that sites have been doing since time immemorial, that'd be great.
it just doesn't pass the laugh test. that supposed kind of testing a) isn't worth publishing in a scientific journal, and b) has no business trying to spread unhappiness (there's no corporate interest served there).
and while we're at it, if people could stop pretending wall posts are facebook's "content" (to be tweaked at will with no consequence) when they are clearly users' communication (which calls for quite a bit more care and consideration before intervening), that'd also be great.
humbabella — 2014-07-04T09:24:17-04:00 — #13
Facebook manipulated about 700,000 of its users' newsfeeds, to see if changes could alter the users' emotions.
Spoilers for Hannibal here, you have been warned.
I read this and think of the look of terror on Will Graham's face when he realizes that Hannibal Lecter committed horrible murders, interfered in investigations, psychologically tormented him, and did all manner of other terrible things just to see what would happen. Is manipulating people just to see what happens so inhuman that it disturbs us?
From "On the Pale Criminal" in "Thus Spoke Zarathustra":
Hearken, ye judges! There is another madness besides, and it is
before the deed. Ah! ye have not gone deep enough into this soul!
Thus speaketh the red judge: "Why did this criminal commit murder?
He meant to rob." I tell you, however, that his soul wanted blood, not
booty: he thirsted for the happiness of the knife!
But his weak reason understood not this madness, and it persuaded
him. "What matter about blood!" it said; "wishest thou not, at
least, to make booty thereby? Or take revenge?"
And he hearkened unto his weak reason: like lead lay its words
upon him—thereupon he robbed when he murdered. He did not mean to
be ashamed of his madness.
And now once more lieth the lead of his guilt upon him, and once
more is his weak reason so benumbed, so paralysed, and so dull.
What a bizarre world.
And that's a fine point for anyone who already thought of Facebook as an immoral company that persistently manipulated people to commodify them. Or perhaps this is a case where the experiment provides us with the backdrop to see how bad Facebook has been the whole time? I don't know, I guess I already thought they were pretty bad.
And this is the crux of the problem: Facebook is not a human being. We feel there is something inhuman about what it is doing because it is not human. People don't make human decisions on behalf of the companies they work for. If your child is a psychopath then yes, the only thing they will learn from moral lessons is to avoid being caught. Large corporations make decisions more like psychopaths than like normal people.
xeni — 2014-07-08T17:13:12-04:00 — #14
This topic was automatically closed after 5 days. New replies are no longer allowed.