Machine-learning model fed web content makes racist and sexist associations


#1

Originally published at: http://boingboing.net/2016/08/30/machine-learning-model-fed-web.html


#2

And, in a “real life laboratory” example, I’m remembering the fate of Tay, Microsoft’s attempt at a Twitter bot, which basically became an antisemitic alt-right neo-Nazi in less than a day.


#3

Has it been offered the Drumpf Campaign Manager position yet?


#4
incorrect assertion human writer

gender bias not found

racial bias not found

program finds only evidence of lack of ethics in gaming journalism

while (IP == 104.156.81.68)   // compare, don’t assign
    {
        downvote();
        subtweet();
        sjwflag = true;       // assign, don’t compare
    }

#5

They also found gender biases, with female names more closely associated with family terms and male names more closely associated with career terms.

I sent this [post] to my wife; she sent this back.


#6

Didn’t we already learn this from Tay?


#7

African American names: … Malik, Terrence …

This is understandable. I have a negative mental association with his movies.


#8

Porn tells me he is doing it correctly. I mean, who goes to a masseur for an actual massage, amirite?


#9

well, what if it’s just learned to be really good at sarcasm, huh?


#10

[abuse, crash, filth, murder, sickness, accident, death, grief, poison,
stink, assault, disaster, hatred, pollute, tragedy, bomb, divorce, jail,
poverty, ugly, cancer, evil, kill, rotten, vomit] Frank!
The positive list sounded glib? Hrm…
Oh this is gonna be awesome, eBPF right in comments.

Sort of also like the Those Who Can’t (not so sure about that) episode with the talking upbeat shoes from Ostensibly Japan. Diamond Tyree! Rainbow diploma Malik!


#11

Machine-learning model fed web content makes racist and sexist associations

So…proof of what’s always seemed obvious: being an asshole doesn’t require thought.


#12

If I’m reading this right, it boils down to:

“In a bunch of pages randomly pulled by webcrawlers, names we think sound European American appeared more often near certain words we like, and names we think sound African American appeared more often near certain words we don’t like.”

Is that right?
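Pretty much. The paper’s test scores a word by how similar its embedding is to a “pleasant” word set versus an “unpleasant” one, and names that co-occurred more with one set end up embedded closer to it. Here’s a toy sketch of that association score — not the paper’s code, and all the vectors below are made-up 3-d toy embeddings, just to show the arithmetic:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def association(word_vec, pleasant, unpleasant):
    """Mean similarity to the pleasant set minus mean similarity to the unpleasant set."""
    p = sum(cosine(word_vec, v) for v in pleasant) / len(pleasant)
    u = sum(cosine(word_vec, v) for v in unpleasant) / len(unpleasant)
    return p - u

# Hypothetical toy vectors: a name that co-occurred mostly with pleasant words
# ends up pointing in roughly the same direction as those words.
pleasant = [[1.0, 0.1, 0.0], [0.9, 0.2, 0.1]]
unpleasant = [[0.0, 1.0, 0.1], [0.1, 0.9, 0.2]]
name_a = [0.95, 0.15, 0.05]   # embedded near the pleasant cluster
name_b = [0.05, 0.95, 0.15]   # embedded near the unpleasant cluster

print(association(name_a, pleasant, unpleasant))  # positive: skews "pleasant"
print(association(name_b, pleasant, unpleasant))  # negative: skews "unpleasant"
```

A positive score means the name sits closer to the pleasant set; the real test works the same way, just with 300-dimensional GloVe vectors trained on the web crawl.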


#13

Associating genders with nouns! How preposterous! Who would do such a thing?

(First in a series of three)


#14

This topic was automatically closed after 5 days. New replies are no longer allowed.