Suicide crisis hotline shares caller data with for-profit spinoff

They’d probably pull some shit like saying the hotline worker is technically not a medical professional so you are technically volunteering that info, and also volunteering that it can be used for market research purposes. Or some such bullshit.

8 Likes

That excuse may sound likely, but it won’t fly.

I sense a huge viable lawsuit coming…

17 Likes

Praying I Hope GIF by The Paley Center for Media

17 Likes

The hotline can’t do its job if the clients don’t trust the hotline. And it can’t do its job if it’s at all focused on generating profitable data rather than helping the client.

Why would anyone think that this was a good idea when the actual stakes are so damn high?

7 Likes

Yeah, no. A suicide crisis center depends on anonymity. Selling data is the antithesis of success for such a selfless endeavor as suicide prevention. I sincerely doubt that anyone who has reached that point of despair would want their final phone call treated as a marketing survey filling out some demographic data point, instead of simply a request for a helping hand and a way through the stigma of such despair.

5 Likes

solving for how to bring more empathetic conversations to the world.

“…Solving for how…” ?!?!?! A high school pal o’ mine would say, “They just said a sentence without any words in it.” Not empathetic, just pathetic.

[I prefer ‘empathic’ anyway.]

8 Likes

I assume that some sort of statement-to-speech expert system (one hesitates at the implied humanity of ‘spokesperson’) will stand there and claim, with a straight face, that it’s obviously the case that everyone who contacts a service that is explicitly positioned as a low-friction/easy-access suicide hotline has, in fact, read, understood, and consented to the terse and readable privacy policy, and that:

“We may use and share anonymous and anonymized data with third parties for any reason, including but not limited to improving our Services, generating support for CTL, or as required by law.”

is obviously enough information to count as the ‘informed’ in ‘informed consent’ to having your data harvested for a specific chatbot company that the hotline knows very well and could have called out specifically, rather than mentioned vaguely as ‘generating support for CTL’, were it feeling honest.

(It also seems worth noting that their cool, hip, customer-service-AI commercial partners are…notably absent… from the “Using data externally” section of their ‘Data Philosophy’ page; which instead lists a bunch of like-minded nonprofits and assorted academic research. Maybe Loris.ai is just in a fragile place right now and needs its privacy…)

6 Likes

No.

11 Likes

I am gonna guess because no HIPAA form was signed? No paperwork, no existence, you know.

8 Likes

While I realise the irony of posting helpline info into this particular topic, about a helpline which sells your data, I still feel obliged to say to everyone who feels overwhelmed:

YOU are not alone.

14 Likes

I have the vague impression that “loris” is a reference to some sort of Computer Science problem, but

TIL.

3 Likes

Crisis Text Line has ended its controversial relationship with Loris.ai, a for-profit sister company that promises to help companies “handle their hard customer conversations with empathy.”

The decision followed outrage over a Politico story that detailed how Loris leveraged Crisis Text Line insights and anonymized user data in order to sell companies customer service optimization software, a use that struck some as unseemly given Crisis Text Line’s mission to prevent suicide and help people in distress.

The article prompted Brendan Carr, commissioner of the Federal Communications Commission, to request that Crisis Text Line and Loris “cease their practice of sharing — and monetizing — the data they obtain from confidential text messages of people reaching out for mental health counseling and crisis interventions.”

Without mentioning the FCC’s request, Crisis Text Line acknowledged in a statement that it understood users didn’t want their information shared with Loris.

“We heard your feedback that it should be clear and easy for anyone in crisis to understand what they are consenting to when they reach out for help,” the nonprofit said.

6 Likes

This is our current stage of capitalism. Companies literally making money from our tears.

2 Likes

As it turns out, the same person is behind Crisis Text Line: Nancy Lublin.

What would be interesting to know is where the decision to ‘monetize’ the situation fell on the spectrum of ‘long con’ vs. ‘gradually comes to believe one’s own lies’ vs. ‘decides that a sacrifice must be made to fund the greater good’ and various similar explanations.

Given that she describes what she is currently up to in the below terms, it seems clear that, by one route or another, she has come to believe in a synthesis of self-interest and psychological betterment that is arguably profoundly unsupported by startup history to date:

Primiga’s focus is on consumer-facing products and services that make people feel hope and that empower or enable them to be the best version of themselves.

It would also be interesting to have been a fly on the wall during the attempt to sell the idea to the board. I would have expected at least some of them to push back (danah boyd, in particular, is normally more…skeptical…of the various fascinating plans to suck data out of the young and vulnerable for sweet, sweet profit; I’m less familiar with the others listed).

6 Likes

Martin Shkreli Reaction GIF

2 Likes

Seems like someone I would buy empathy-enhancing chatbots from: “For when you want to look like you care, but cheaper”™

Mysteriously, the incident receives a single rather anodyne sentence on her own site, just below the statement “She was committed to giving young employees leadership opportunities.” and a paragraph or two about TED talks. I suppose it’s not strictly false, of some of them.

1 Like

This topic was automatically closed after 5 days. New replies are no longer allowed.