Facebook COO: "Facebook cannot control emotions of users"

1 Like

“Facebook cannot control emotions of users. Facebook will not control emotions of users.”

Uhhh … why promise that you will not, if you cannot?

I can think of all sorts of petty ways FB could control emotion. Imagine slowly being dropped from the feed of the most popular kid at school. That kind of thing.

1 Like

They don’t give people a dislike button.

Controlling input (news stream) and output (limiting the acceptable behaviors) does affect emotions.

1 Like

Wait. Are you trying to control my emotions right now?

2 Likes

Perhaps BB has been doing this for years, in their slow transformation from “Destroy Serious Culture” to “Daily Dose of Righteous Indignation.”

Can’t wait for the report to be published!

3 Likes

Well, obviously. If they already had a mature capability, why would they still be doing basic research in the area?

Facebook is well on its way to becoming irrelevant… thank God:
http://mankabros.com/blogs/onmedea/2012/10/04/facebook-is-irrelevant/

1 Like

“We communicated very badly on the emotions study,”

In other words:

Just because our entire business model is based on “communication” doesn’t mean you should expect us to be any good at it.

1 Like

Serious question.

Except for blogs and the crowd that gets outraged at everything, is there any real outrage out there about this anywhere?

I have not seen one person anywhere even mention this, let alone be outraged by it. I just keep reading blog posts about how outraged everyone is.

She said the experiment “was small, over one week,” and claims it was conducted in a way that “protected privacy.”

Good to know. Doesn’t really help much, though, given that the whole creepy mess has exactly jack to do with privacy.

User: I’m very angry that you’ve been manipulating my emotions!

Facebook: We would never do that. We couldn’t even if we wanted to.

User: Oh good. That makes me feel bet—GODDAMMIT.

4 Likes

There are a lot of blogs that aren’t just the rants of “Joe Random” and that show a lot of concern from academics, researchers, journalists/media, and lawyers… But there seem to be a few reasons this story isn’t engaging the wide audience it deserves:

  1. There are a few big issues, and many small ones: complexity limits the audience.
  2. Diffusion of responsibility: there’s no single villain at whom to point the finger.
  3. The press (hell, even FB and the researchers) don’t use a consistent terminology.
  4. Kneejerk anti-FB trolls address the wrong things, and valid objections suffer because of it.

Please note: my conclusions were arrived at via methods which were not approved by any ethics review boards… In fact, they probably wouldn’t even pass a logic review board.

Ms Sandberg, who is visiting India, emphasised that Facebook takes the privacy of users very seriously. “We communicated very badly on the emotions study…”

In certain circles, telling the truth is considered bad communication.

2 Likes

There’s one thing I don’t get about this whole affair. I’ve served on Institutional Review Boards, and as far as I can tell, Facebook has violated federal law regarding the protection of research subjects. You almost always need prior informed consent, even if the only risk is that the subject might feel sad, and in those few cases where deceit is an essential part of the experiment, the IRB closely monitors the situation and post-hoc informed consent (debriefing) is required. As far as I can tell from the news, this project was never reviewed.

So if there is an enterprising attorney reading, this sounds like ideal class-action territory.

1 Like

Hello, yes I’m outraged. I just visited Facebook, read some stuff off my feeds, and now for no apparent reason, I’m outraged. So I came on boingboing to make sure YOU know that I’m outraged, as apparently you need the reassurance of numbers.

2 Likes

This topic was automatically closed after 5 days. New replies are no longer allowed.