Facebook's massive psychology experiment likely illegal


What you are saying would make a lot of sense to me if manipulating people into buying things had nothing to do with their emotions. While we are accusing each other of being callous towards mental illness, I'll note that anyone who has developed an eating disorder in part because of unrealistic stereotypes of beauty in the media will be happy to hear that their mental health was "one thing" and that making someone read a sad story on Facebook is "another." I think I'll stop short of hoping that people you know will die for your opinions, though.

I'm pretty sure that they did not determine from their experiments that they could control people's minds through their Facebook feeds, but merely that they could make them, by some metric, feel better or worse. As for the results of this study being "obvious," there are a lot of things that seem "obvious" until we study them. The author of the linked article arguing against this seemed to think it might serve as a wake-up call about the degree to which we are being manipulated by invisible customization; I don't feel I'm alone in thinking this result might matter.

If we use "common sense" we can see that filtering articles out of a news feed selectively for a limited time basically makes you the reincarnation of Mengele. I think I understand.


Genuine question: Facebook have been controlling what appears on our
news feeds for a long time, largely in order to make more money (as
discussed elsewhere). What makes this any different?

You could have read the linked article for the answer to this question.

But at the very least, what makes it different is that general business practices and research are one thing, and federally funded academic research projects are something else entirely; the latter come with a set of rules, regulations, and laws that govern them. If you want to participate in federally funded academic research, you have to follow the guidelines and laws.


Personally I find this unethical. I know, as a fb user, that my data is being used and monetized, but a line has been crossed here, in my mind, into unethical territory. Nevertheless I find Facebook's seemingly forthright and transparent announcement to be commendable, and I welcome it as a way to allow a dialogue about these issues to occur (the results of which Facebook will no doubt try to utilise to their own ends). If any harm has been done, I hope justice can be sought through the correct channels and that Facebook will act promptly and appropriately on the basis of any judgement given, without any gag-orders if a settlement is agreed out of court. Let's have these issues, which will become more and more a part of our lives, out in the open, to be discussed and refined. And let's encourage that, even if the original action is disagreeable.


Ooh, I like that thought. Surely somebody has already proposed (attempted?) building a peer to peer decentralized (encrypted) social network?


I wasn't referring to the legality (as per the article) so much as the morality.


I like how their response is "What are you guys getting upset about? We do this sort of thing all the time."


Me too. I think anyone who's surprised by the fact that this happened doesn't really understand Facebook or their business. The only thing surprising is that they publicly released the data and then got shat on by the internet for doing so.


I guess they watched The Truman Show a bit too much!


I hope there is a payout which is relative to time spent on the site. I know some people who would be making bank if using Facebook were a paying gig.


I was going to post something similar: manipulating emotions is one thing, but ANY kind of manipulation in a negative direction is a huge no-no. It's why there's so little research into the nocebo effect. Deliberately inducing negative emotions and mindsets is unethical by most psychological research standards, and most likely illegal in most cases. Even with explicit consent it's a pretty murky ethical area.


I sort of agree, though it will be nice for them to have it explained that the attitude they have had all along is antisocial. What did you expect from a network conceived of by frustrated nerds in a college dorm, and then operated by those same nerds with more resources, more yes men, and fewer boundaries?


It all depends whether the data is made available for scrutiny.


My apologies, I agree with you then; morally there is probably no difference, really. I guess there might be some difference in that, in this case, they are ignoring the law, whereas they are not ignoring the law when they just do as they wish as part of business as usual.


Even better is a non-lawyer opining about interpreting laws.


Some "folks like you" take it seriously, some "folks like you" don't seem to take it so seriously. I don't know which one you are since I don't know you from Adam. All I can say is given the amount of control psychologists in our society have over the lives and livelihoods of others, the number of times they fuck it up, and the severity of those fuck ups I get pissed off when "folks like you" act as though the rest of society doesn't have any stake in the field of psychology.

Instead of getting pissy with people who you think don't understand the situation, try educating them. Your flippant tone is what worries me; it suggests you're not taking ethics violations seriously. If you do take ethics violations seriously, then prove it by showing a modicum of respect for people who are, at the very least, interested in what's going on in research psych, even if they are somewhat uninformed.

If you can't stand that some uninformed people think they have a stake in psychological research then maybe you should be in another field.


I guess Facebook could take exactly the wrong message away from this. The right message is that if you want to perform psychological experiments, you have to obtain direct approval from the parties to the experiment. They could have done this rather easily, but they didn't.

And keeping this kind of experiment secret doesn't make it less unethical. It makes it more unethical. So the message you think Facebook will take from this is perverse.


What's even more egregious in this case is that there was no way to observe the subjects during the experiment. In a normal psych experiment the subjects are monitored, and if something goes wrong the experiment can be stopped. With this, applying negative stimuli via algorithm to hundreds of thousands of subjects who are totally unknown to the "investigators", who have no knowledge of the experiment, and who are never observed during it, is unethical and reckless in the extreme.


No, it's not. And the fact that you think it is, and that you are actually a psych researcher, is troubling. This is not A/B testing. It's not seeing whether one phrase or another, one image or another, sells a product better. It's exposing hundreds of thousands of people to unknown amounts of negative stimuli without their knowledge and seeing what happens. That you don't see the difference is scary.


The problem is treating the Facebook agreement as informed consent. Could you do any of your studies by treating an invitation to come sit in a room, with the consent form in 6-point font on a small sticker on the door, as consent, and saying that all people who enter the room have read and agreed to it?

(Yes, I know the general outrage is directed at the experiment, not the bypassing of consent. However, if they had done a proper job on the consent side, would people have been so outraged?)


The "experiment" is part of a monetization technique Facebook uses to maximize their ad revenue. It's their "secret sauce." They're going to continue to refine this however they like, they just won't tell you about it next time. If you think this is unethical, you should discontinue use of their service. You're a customer and if you don't like the product/service you are receiving, discontinue being a customer. If this is the sort of thing that bothers you, you should probably stay away from social media altogether.