Amazon stores recordings of Alexa interactions and turns them over to internal staff and outside contractors for review

#1

Originally published at: https://boingboing.net/2019/04/11/1000-clips-per-shift.html

2 Likes
#2

I remember the outrage when it was divulged that Captcha was actually using people to decipher text that the OCR machines couldn’t.

8 Likes
#3

The annotation team’s goal is to improve Alexa voice recognition and command parsing.

And that’s all.

Trust us.

(I notice that Bloomberg reports this corporate announcement without comment.)

3 Likes
#4

Which I thought was a great idea.

The policy of no action when evidence of assault is found bothers me more than anything in this article.

12 Likes
#5

What?

Whew! I’m glad they said that. I was starting to worry.

Also:

I suppose that’s just the ethical dilemma that eavesdroppers face. Yet, it doesn’t stop them from doing this…

Ho ho ho.

9 Likes
#6

9 Likes
#7

The only way the voice recognition algorithm is going to improve its accuracy is with some sort of human cross-checking. Expect that sort of thing for any voice system. The part where Facebook really screwed up and lost any claim to valid engineering is when they preserved the user account tied to the recording. That took it from (mostly) anonymous to creepy stalkerware.

Usually we argue (rightfully so) about how good of a job researchers do anonymizing their data set. Here, FB did not even really try. Fuckwits.
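The cross-checking point above is concrete: a speech system's accuracy is usually measured as word error rate (WER), and WER can only be computed against a human-made reference transcript, which is exactly why humans end up in the loop. A minimal sketch (the example transcripts are made up for illustration):

```python
# Word error rate: edit distance between the human reference transcript
# and the machine's hypothesis, normalized by reference length.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits needed to turn the first i reference words
    # into the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

human_transcript = "play the latest weather report"   # what a reviewer heard
machine_guess = "play the latest whether report"      # what the model produced
print(wer(human_transcript, machine_guess))  # → 0.2 (1 error in 5 words)
```

Without the human transcript on the left-hand side, there is nothing to score the model against — which is the engineering reason every vendor ends up with a review team, whatever the privacy cost.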

5 Likes
#8

That claim just completely lacks creative problem solving. Once we stop thinking that way we’re going to actually start making good AI.

#9
1 Like
#10

I’m open to ideas.

Bear in mind that you are trying to simulate human comprehension. So somehow or other you need to involve humans in the solution testing path.

4 Likes
#11

Fuck human compression, we’re trying to make something better.

1 Like
#12

Well, they shouldn’t.

3 Likes
#13

We have bugged our homes. These companies do not have the ethics to stop themselves from doing bad things. If we want voice recognition that is safe, it needs to run on a computer that only the owner has access to.

4 Likes
#14

We have bugged our homes. These companies do not have the ethics to stop themselves from doing bad things.

Don’t forget the wireless, apple-or-android-branded surveillance device that just about all of us carry in our pockets.

2 Likes
#15

Bear in mind that you are trying to simulate human comprehension.

Fuck human compression

Your response has been nominated for Today’s Funniest Typo (4/11/2019). You will be alerted if your nominated entry actually wins a prize. Good luck!

8 Likes
#16

The only way the voice recognition algorithm is going to improve its accuracy is with some sort of human cross-checking. Expect that sort of thing for any voice system.

I’ve been explaining this to people I know for years. Mostly in the context of “hey, you used to do speech recognition research but you don’t have an Alexa-type thing in your house - why is that?”

4 Likes
#17

I say let the birds handle the “improvement” of Alexa.

3 Likes
#18

A former phone company employee told me a story one time about working late at night on the switches. They need to “busy out” a phone line sometimes to work on the system, but it’s a bad idea to blindly disconnect someone in the middle of a conversation. So they would occasionally have to monitor a call just to know when the line was free. (This was in the ’70s, iirc.)

Anyway, it’s the wee hours of the morning, our narrator is busy doing something completely unrelated, and a coworker has apparently found something juicy while waiting to busy out a line… said coworker has patched the feed into the intercom system for the entire building (it’s late at night, there are only two people there) and for 90 minutes or so, the intercom is blasting a phone sex session between a horny couple who haven’t seen each other in a long time.

Not that I had ever been tempted before, but after this story it really would never occur to me to have pretend sex on a telephone line.

4 Likes
#19

I just assumed that the creator of the modern day Mechanical Turk and all other always on voice assistance devices had people listening in. It’s part of the reason I ask the devices the questions that I do.

“Alexa, I accidentally awoke an Elder Thing¹. How do I defeat it?”

“Siri, I discovered a copy of the Necronomicon while camping. What should I do?”

Or my favorite “Kids, play with [name of voice controlled device].”

Note ¹: Links to full text of the book At The Mountains of Madness not the Necronomicon
Note ²: Wikipedia page on The Turk that Mechanical Turk takes its name from.

3 Likes
#20

Back in 2013-14 I had a job (not with Amazon or FB or Apple) where I was one of these human reviewers who listened to people’s voice commands and translated them into text to improve the system. It was a shitty job; you could work from home, but it was piecework and paid five cents for every 20 utterances transcribed (you do the math). It went something like this:

  • “Kebabs near me”
  • “Loughborough”
  • “directions to [address]”
  • “pubs near me”
  • “Naruto Shippuden”
  • gay porn
  • “LALALALALALALA POOPYPANTS! HAHAHAHAHAHA”

You could always tell when someone just gave their kid the phone to play with.
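Doing the math the comment invites: at five cents per batch of 20 utterances, the per-utterance rate is a quarter of a cent. A quick back-of-envelope check (the transcription speed is an assumption for illustration, not from the comment):

```python
# Piecework rate described above: $0.05 per batch of 20 utterances.
pay_per_batch = 0.05
utterances_per_batch = 20
pay_per_utterance = pay_per_batch / utterances_per_batch
print(round(pay_per_utterance, 4))  # → 0.0025 dollars, a quarter of a cent

# Assume an optimistic pace of 4 utterances transcribed per minute:
utterances_per_hour = 4 * 60
hourly = utterances_per_hour * pay_per_utterance
print(round(hourly, 2))  # → 0.6, i.e. about sixty cents an hour
```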

16 Likes