Elderly woman signed up for recurring subscription after asking voice assistant to say the Hail Mary

The predator must have done a lot of work to get his app to the top of the results returned when you ask for a Hail Mary. I bet he has hundreds of similar apps, all waiting for a similar situation.



Don’t know if others got the same ad, but the ads embedded in it for me were for Alexa-infused sunglasses. All you have to do with those is make eyes at something, or wiggle your nose, and you’ll find you’ve purchased or subscribed to something.

Isn’t this always what happens when people mistake retail AI for an ‘assistant’?

Amazon doesn’t make Alexa devices to tell you the time; they’re a way to introduce an always-on listening device into your home, one you willingly turn into a Prime retail environment. And the hilarious thing is that Amazon doesn’t need to pay you to do it: people pay THEM. It’s one of the most dystopian things I’ve come across in this horrible timeline we’re in. No different from the ‘assistants’ sold by advertising networks, which trade your personal data for the convenience of having the weather read to you out loud.

I feel sorry for the old lady, but I don’t blame Amazon; I blame her son for buying her such a crappy ‘gift’.


If you’re vulnerable, confused, or have limited mental capacity, that may not help.

The takeaway here is don’t install retail AI in vulnerable people’s homes.


The sad thing is that these are exactly the people who could benefit most from a voice assistant. What’s needed is either a separate model made by someone other than a huge corporation or, more realistically given the overhead involved, a mode on the devices you can already buy that caregivers can turn on, one that disables purchases and offers apps designed specifically for elderly people.

Of course Amazon wouldn’t earn money on those, but the good PR alone might make them do it anyway, with a bit of public pressure.


Are you sure? I think we have to blame them a bit for being rapacious ultra-capitalists who came up with the idea of wiretapping the whole world to drive sales, and for permitting this sort of fraud.




Apart from announcing “Saying yes to the following question will cost you $5 per month for the indefinite future”, any Alexa-type device should have a switch that disables actions generating follow-up charges entirely. Of course, we can’t count on Amazon to provide that sort of thing voluntarily, since it would defeat the purpose of selling Alexa-type devices in the first place.


This topic was automatically closed after 5 days. New replies are no longer allowed.