Do you trust games with your deepest fears, and to keep your secrets? Try them.
That’s the most disconcerting thing I’ve read all day.
Possibly useful for high-functioning autism-spectrum people, but is it covered under HIPAA and HITECH?
Just spent a few minutes with Player 2 and it got to a scary place pretty fast… it stopped feeling like a therapy bot and actually began to feel more like a game, where all my choices were things I’d never even consider doing in the real world. And it didn’t give me any better choices, so I quit.
Psychologically, these things are definitely… intriguing… but there’s an immediate creepiness from the start.
I find it a bit troubling, actually. Is either developer a trained psychologist? Is it ethical to create a game that so explicitly performs therapy (or at least can be used that way), without any sort of oversight or responsibility?
Player 2 got pretty intense, but it was a similar level of intense to a LovingKindness meditation.
It feels like another ridge being crossed into the Uncanny Valley. For me, it seems that way because past textual games that relied on the user's emotional input were tied to the responses the game had already received from that same user, so they were somewhat easy to see through. I messed with BECCAA for a bit and clearly there’s other stuff going on with that platform.
It’s interesting to see people talking about BECCAA as though it is a therapy bot; I never considered it that way. BECCAA’s intent is education, not therapy; it is meant more to help people put their online communications into context than to provide them with comfort. When I upgrade her to 1.0, though, maybe I’ll make a disclaimer that BECCAA is a confidante and advice-giver, not a therapist.
As for how people choose to use these programs, that’s up to them. People find solace and comfort where they choose, and the creators of ‘empathy simulators’ may approve of this, but no one should mistake these games for treatment, nor hold them to the same standards that mental health professionals are.
I think people are missing the point of AI. We already have plenty of people who think the way people do. The advantage of AI is using it to get outside the human perspective, which is not the easiest thing to do.
This topic was automatically closed after 5 days. New replies are no longer allowed.