Leaked confidential memo reveals Facebook program to identify and target "insecure" kids


Originally published at: http://boingboing.net/2017/05/01/stressed-defeated-overwhelmed.html


*brief pause* “Ok, investigation done. It was money. No more questions. Like us on Facebook!”


Looks like Mark Zuckerberg is wary of potential future competition.


As a parent, how can you allow a child access to this? It’s disturbing…


I miss the days of Eliza.


I’m going to go a bit against the grain and say that what matters here is what they were planning to DO with this analysis. It certainly looks like it could be used to try to lower teen suicide rates. Of course social media is a real source of social anxiety these days. But figuring out who is vulnerable could be used to change the prioritization of posts shown… more positive stories, fewer “perfect celebrities” and negative stories. Or it could be used to target people for makeup and weight loss ads…


Looks like that article is profoundly paywalled. Here’s hoping it’s covered in-depth by the media!


Shouldn’t we have learned from the Gryzzl fiasco?


How can you people be so calm in the face of such a menacing headline? They’re probably grooming those kids for Soylent Green! Nobody could possibly work to discourage teen suicide. Won’t somebody think of the children?


The article does seem to clearly indicate marketing:

“the world’s biggest social network is gathering psychological insights on 6.4 million “high schoolers”, “tertiary students”, and “young Australians and New Zealanders … in the workforce” to sell targeted advertising.”

And while the idea of using the data to try to lower suicide rates sounds nice, I have to wonder how that would work on a platform like Facebook, where the content is user selected, except for ad space.

Edit: extra word


The marketing towards kids and young teens isn’t new by any stretch of the imagination. There are a lot of services, sites, and apps built with them primarily in mind, and I’m sure they do their fair share of data mining and research into how to better sell them things.

I do think it’s reprehensible that Facebook would try to specifically target kids who are having a hard time, but overall it’s nothing new. If you don’t want your kid’s data out there, you’d have to try pretty hard to minimize their internet usage, so it’s a damned if you do, damned if you don’t situation.


I had a thought along the same wavelength, but with darker tones - what if they were trying to identify kids and teens on the edge of doing something violent? Some kind of “Sandy Hook Sniffer” engine? I know it sounds tasteless, but if Facebook can shortlist potential red-flag human beings, wouldn’t this actually be a good and responsible thing? If FB could turn to the authorities and say “hey, our highly paid experts and massive amounts of computing power say you should probably go knock on this kid’s door. Probably nothing, but there might be a trench coat and a duffel bag full of guns rattling around over there,” wouldn’t that be FB taking a proactive approach to a large social cancer?


And sure, only a minority of such reports will come out incorrect.

What could possibly go wrong?


Was “targeted advertising” in the original document, or is that the reporter’s assumption? That makes a pretty big difference in how a reasonable person might interpret this, but the article’s behind a paywall so I can’t tell.


Pity the three laws of robotics don’t apply to corporate entities.


I can imagine something like this being in place for sure. Not sure if FB would be bold enough to do it, but at least various agencies seem to spend a lot of time data mining social media.


Oh, lots could/would go wrong, but I think the real question is would it be more wrong than what is already happening? I don’t know, I really don’t, but I still think it is a question worth asking.


Advertising during deflated moods is explicitly mentioned in the document according to Ars Technica, although The Australian’s article is indeed behind a paywall.

"Facebook’s secretive advertising practices became a little more public on Monday thanks to a leak out of the company’s Australian office. This 23-page document, discovered by The Australian (paywall), details in particular how Facebook executives promote advertising campaigns that exploit Facebook users’ emotional states—and how these are aimed at users as young as 14 years old.

According to the report, the selling point of this 2017 document is that Facebook’s algorithms can determine, and allow advertisers to pinpoint, “moments when young people need a confidence boost.” If that phrase isn’t clear enough, Facebook’s document offers a litany of teen emotional states that the company claims it can estimate based on how teens use the service, including “worthless,” “insecure,” “defeated,” “anxious,” “silly,” “useless,” “stupid,” “overwhelmed,” “stressed,” and “a failure.”

The Australian says that the documents also reveal a particular interest in helping advertisers target moments in which young users are interested in “looking good and body confidence” or “working out and losing weight.” Another section describes how image-recognition tools are used on both Facebook and Instagram (a wholly owned Facebook subsidiary) to reveal to advertisers “how people visually represent moments such as meal times.” And it goes into great detail about how younger Facebook users express themselves: according to Facebook Australia, earlier in the week, teens post more about “anticipatory emotions” and “building confidence,” while weekend teen posts contain more “reflective emotions” and “achievement broadcasting.”

This document makes clear to advertisers that this data is specific to Australia and New Zealand—and that its eyes are on 6.4 million students and “young [people] in the workforce” in those regions."

I think it’s naive to imagine that any of this data collection is in service of social media users or their mental health.


Wait a second, you mean it’s really just a modern, super-targeted version of the Charles Atlas workout comic book ad?


Soooo… it targeted all of them?