Originally published at: https://boingboing.net/2019/04/19/racist-algorithms.html
…
Oh, it’s the scanners that struggle with prejudice, and nothing to do with either the programmers or the screeners? Bad scanner! Bad! No biscuit!
The TSA probably doesn’t need an excuse to single out POC in the first place.
Bug? Feature?
I politely opt out. Just my little wrench in the cogs of creeping creepiness.
Clearly the product engineers need a more diverse test population before shipping these machines.
I have a long braid, I’ve had it checked by TSA, and I’m pretty haole. I can only imagine the grief for someone with dreads or more intricate hairstyles.
Quick question: how many guns/explosives/other weapons have they found in those hairdos so far?
How these scanners work is that they compare the scanned body to either a male or female model (chosen by the TSA agent) and look for anything out of place. This is a really crappy way to scan, but it means that no human ever looks at the scanned image, and in some cases no human ever could. This was a change made to allay the privacy issues with the original ‘pornoscanners’, which produced a grey-scale image that looked a lot like the naked person.
Anyone who is too far out of line with the model the TSA agent chooses for them gets flagged for inspection, and the device indicates where the issue is. I have long hair, and on most flights I get pulled aside for a shoulder or back pat-down, because apparently men aren’t allowed to have long hair.
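The comparison scheme described above can be sketched roughly like this. To be clear, this is a toy model: the region names, numbers, and threshold are all invented for illustration, since the actual scanner’s reference models and tolerances aren’t public.

```python
# Toy sketch of "compare the scan to a chosen reference model and flag
# any region that deviates too much." All values here are made up.

# Per-region expected signal for each reference model (arbitrary units).
REFERENCE_MODELS = {
    "male":   {"head": 1.0, "shoulders": 1.2, "torso": 1.5, "legs": 1.1},
    "female": {"head": 1.0, "shoulders": 1.0, "torso": 1.4, "legs": 1.0},
}

THRESHOLD = 0.3  # flag any region deviating more than this from the model

def flag_regions(scan, model_name):
    """Return the regions where the scan deviates too far from the chosen model."""
    model = REFERENCE_MODELS[model_name]
    return [region for region, expected in model.items()
            if abs(scan.get(region, 0.0) - expected) > THRESHOLD]

# Dense long hair pushes the head reading past the tolerance, so that
# region gets flagged for a pat-down even though nothing is concealed.
scan = {"head": 1.4, "shoulders": 1.2, "torso": 1.5, "legs": 1.1}
print(flag_regions(scan, "male"))  # -> ['head']
```

The point the sketch makes is the same one the comment makes: anything the reference model wasn’t built around (long hair, dense hair, a turban) reads as “out of place,” so who gets flagged depends entirely on whose bodies the model was fitted to.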
I could tolerate the scanners doing a crappy job and flagging people a little too often if the TSA agents’ response weren’t just as bad. When I trigger the scanner’s warning as a white male, I get a nice cursory pat-down, maybe 30 seconds to a minute. I’ve flown with my mom, and when she triggered the scanner’s warning they took probably 2-5 minutes to do the inspection. I can’t imagine what being Black would add to that time.
Yep, I’m sure that’s it - it’s those damn racist machines! (Just being around them makes you racist, too.)
It’s millions and millions, isn’t it?
Pretty sure I’ve gotten hit with this too. I’ve got really dense wavy-to-curly long hair and every time I go through the scanners they pat me down between the shoulders.
Coincidentally, I was actually thinking about Pam Grier’s Blaxploitation roles as I posted my comment; that’s probably the only time I’ve ever seen or heard of anyone keeping weapons in their afro, outside of cartoons like the one you posted… because that’s some impractical, unrealistic Hollywood BS.
Pretty sure people wearing turbans were more likely to be “randomly” selected for additional security screening even before these machines showed up.
I dunno - I’m pretty sure that Leprechaun movie, where Ice-T pulls out a knife and baseball bat from his 'fro was a documentary.
If the system as a whole is racist, the race of any given individual working within it is secondary.
One hack might be to opt out. The more invasive searches where they take apart the hairstyle are triggered by an alert, so theoretically if you opt out, they pat you down, and they find nothing, you’d be out more quickly with hair intact.
While I can understand the privacy reasons for not displaying the full images, I think they could greatly reduce the pat-downs by showing an image of the area that triggered the alert, along with the reference image it doesn’t match.
I’mma hide an AR-15 in my pubes all the time.