The coded gaze: biased and understudied facial recognition technology


#1

Originally published at: http://boingboing.net/2017/05/23/the-coded-gaze-biased-and-und.html


#2

Why am I suddenly reminded of the Better Off Ted episode where Veridian Dynamics’ automatic sensors did not recognize black people?


#3

Not to belittle these concerns, but wouldn’t you be delighted if you found out you were part of a group on whom facial recognition didn’t work as well? It’s not that I want to get away with committing crimes, it’s just creepy to be tracked across many “appearances”. So if my looks made this more difficult, yay for crappy code, I guess?


#4

Because that episode was based on this very real problem, which IIRC was in the news shortly before the episode came out.


#6

That was such a good underrated show!


#7

Racial bias in facial recognition systems has a long history across a variety of systems (for similar reasons). The show was directly based on real events, as mentioned, and there were other high-profile examples - e.g. Xbox “Kinect” games that didn’t register the existence of anyone who had sufficiently dark skin (which led to some really awkward public demos…), etc.


#8

Wait, so it’s biased in favor of putting fewer black people in prison?


#9

It was an SF show which pretended it was a workplace comedy.


#10

Like Castle, the SF show that pretended it was a police procedural?


#11

The presence of Nathan Fillion does not make a show automatically SF. :grinning:

If it did, then Two Guys, a Girl and a Pizza Place, Desperate Housewives, and Modern Family would count as SF too.


#12

No, but working laser pistols, time travel, parallel universes, and invisibility cloaks do.


#13

FTFY  


#14

Wow! The show went full Felicity in its later seasons!!

I stopped watching three seasons in. It wasn’t a terrible show; it just wasn’t “must see” watching. Especially once Better Call Saul started.


#15

This reminds me of how Google’s Deep Dream always seems to find dogs and birds in everything, because they made up a huge portion of the training images.
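The training-data skew behind both the Deep Dream quirk and the facial recognition failures can be shown with a toy, purely illustrative sketch (every name and number below is made up): a one-feature “detector” tuned for best overall accuracy on an imbalanced dataset ends up far less reliable for the underrepresented group, while the overall accuracy number still looks great.

```python
import random

random.seed(42)

def sample(mean, n):
    """Draw n values of a single toy 'image feature' around a group mean."""
    return [random.gauss(mean, 0.05) for _ in range(n)]

# Hypothetical training set: the majority group dominates, and the
# minority group's faces happen to sit closer to the non-face background.
faces_majority = sample(0.80, 900)   # 90% of the face examples
faces_minority = sample(0.28, 100)   # 10% of the face examples
non_faces = sample(0.20, 1000)

def training_accuracy(threshold):
    """Overall accuracy of 'feature >= threshold means face' on the training set."""
    hits = sum(x >= threshold for x in faces_majority + faces_minority)
    rejects = sum(x < threshold for x in non_faces)
    return (hits + rejects) / 2000

# "Train" by picking the threshold with the best overall accuracy --
# which quietly sacrifices the group with only 100 examples.
best = max((t / 100 for t in range(101)), key=training_accuracy)

recall_majority = sum(x >= best for x in faces_majority) / 900
recall_minority = sum(x >= best for x in faces_minority) / 100

print(f"threshold={best:.2f}  overall accuracy={training_accuracy(best):.1%}")
print(f"majority recall={recall_majority:.1%}  minority recall={recall_minority:.1%}")
```

Overall accuracy stays above 90% while the minority group is missed most of the time, which is roughly the dynamic the post describes: the metric everyone optimizes hides who the system fails for.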

Also reminds me of a recent Silicon Valley episode where Jian Yang creates an app that is supposed to tell if an image includes a hot dog or not but ends up being able to detect penises.


#16

That’s only true if said group also enjoys proper legal representation. So in practice…

…it’s probably more biased towards putting the wrong black people in prison.


#17

Unless it means that I’m likely to be mistaken for criminals in a database because the system isn’t good at telling people who look like me apart. I’ve heard plenty of stories of police grabbing the wrong black man because their brain-based facial recognition systems didn’t work well on black people. It doesn’t seem like a delight.

I get what you mean. If a system simply doesn’t register that you have a face at all that might be preferable in some circumstances. But if systems are relied on for security then people who aren’t recognized are probably going to be viewed with undue suspicion.


#18

I know! I loved it! I miss it!


#19

Previously on bOINGbOING:


#20

Years ago there was a gee-whiz technology show that was doing a story on a system for tracking eye movement. It was developed in Japan, and it calibrated itself by finding the hairline. Unfortunately, the host was blond and there wasn’t enough contrast between his hair and skin, so they had to give him a black wig to get it to work.