Facebook's massive psychology experiment likely illegal

@doctorow any details of how the study was funded?

I read the Cornell press release yesterday, which said the report was partly funded by the “Army Research Office”. Interestingly, the article has since been ‘corrected’, and it now states the study received no external funding.

What I can’t get my head around is why anyone would think a company worth $170 billion would need funding in the first place.

http://www.news.cornell.edu/stories/2014/06/news-feed-emotional-contagion-sweeps-facebook

Those “negative stimuli” were going to be in their feeds anyway. It was exposing hundreds of thousands of people to less positive stimuli than normal, or less negative stimuli than normal. Since facebook is a place they go and this stuff is stuff they see anyway, I think the analogy to putting up a different sign and seeing if people like it better holds. I am just as abused by a mean-spirited advertisement on a billboard that I pass on my way to work as I am by a story I didn’t want to read in my facebook feed.

That sounds like diaspora* - covered here on BoingBoing at least once, that I can find:

I think Snake is also pretty similar in scope:

And there are many, many more.


While I understand your (and my) offense at the study, I also agree with @anon50609448. This study comes as little surprise. Before I even read any of the comments here, I did go check out the study. They’re looking for ways to tweak people - only ever so slightly - in an emotional direction. Their goal isn’t to send you into a spiraling depression - you’d never open your wallet. They just want proof that remote emotional tweaking works. (Don’t worry, I still think it’s horrible and manipulative.)

Here’s why they want to do it: Basically, online ads fail a lot of the time. This article discusses part of the problem with online ads - the erratic behavior of online buyers. Advertisers can’t herd them like they can people at brick & mortar stores or people watching a TV show.

Since they can’t guarantee regular buyers online, they needed to figure out how to trigger, while people are online, whatever normally drives them to shop in other places, and then put that into force. That’s why emotion is getting put into play - because Yahoo doesn’t want people buying the thing they saw on Yahoo later in a store. They want it bought from Yahoo right then.

Every single large company does this sort of thing. You have to do experiments.

If you’ve been in a supermarket or anything like it, you’ll at some point have been subject to some experiment about merchandise placement, advertising placement, and so on. Ditto with a thousand other businesses.

“there’s a federal law that prohibits universities from conducting this kind of experiment without explicit, separate consent”

Facebook isn’t a university, and I’m also not sure why you would need explicit consent for this sort of experiment even if it were conducted by a university.

There are a bunch of academics who argue that ethics committees and the like are far too zealous in their rulings and end up stopping non-harmful, pretty basic research in medicine, in development, and elsewhere.

Even Ben Goldacre thinks ethics committees harm patients in medicine by stopping and slowing down trials, and it’s going to be much the same elsewhere. They have a very important place, but we should not be quite so restrictive.

http://www.badscience.net/2011/03/when-ethics-committees-kill/


If it really was illegal, I imagine someone will sue them over it. Wouldn’t be surprised if nobody ever got around to it, though. It’s more fun to complain about how it might be illegal than to actually try to find out whether it really is. After all, the courts might not agree with you.

A thousand times yes! Open source! Fund it through Kickstarter. I wouldn’t even mind limited advertisements so it can fund itself and pay the programmers (make it a non-profit). Why hasn’t this already happened? And Google Groups doesn’t count because of the “Google” part.

The Facebook guy in charge of the experiment said they were reacting to reports that Facebook was depressing people (all those pictures of other people’s babies, people at parties you weren’t invited to, humblebrags, etc). He said they were trying to prove this isn’t true, but it seems unlikely that the experiment (as described) could do that.


The experiment lends weight to the claim that pictures of people’s babies and people at parties are not depressing on average, as it found that positive words made people happy and negative words made people unhappy. It’s not proof, but it’s not a far cry to think that positive and negative images would have the same effect.

If we are really concerned about people’s mental health then we aren’t talking about averages, we are talking about individuals who might be suffering. But I don’t think that’s what facebook cares about. I think they care about refuting the idea that facebook is depressing for a majority of users.

Obviously the study can’t prove that one way or another, but I do think the results are useful for their purposes.

I’m calling bullshit. This has nothing to do with selling and everything to do with manipulating people’s moods and emotional states. And even if it were true that they’re experimenting to see whether making people depressed or happy has a reliable effect on their purchasing activity, it’s still unethical. Just because it’s hard to prove when companies commit unethical acts isn’t an excuse for condoning them or shrugging them off when they do.

You’re trying to conflate the fact that negative stimuli exist on facebook with facebook purposely aggregating only negative stimuli for hundreds of thousands of users. It’s not the same thing at all.
