Originally published at: https://boingboing.net/2020/01/25/robo-racism.html
…
One sad part is that a well-designed system could have been a good way to counter racist bias by police: make a false arrest lower your TRAP score, since the cops were apparently too eager to suspect you. But I guess that wasn’t the system the cops wanted.
Almost as if Minority Report wasn’t meant to be aspirational.
It didn’t work?
This gif reminded me of when the bar owner basically said, “You judge us, but we never used a nuclear bomb on civilians.”
Even Ferengi have limits: the original Never Trump Republicans of the future.
Has anyone noticed how often Cory warns us all about stuff like this, and then it happens… just sayin’
Even experts in machine learning will tell you that, for many models, there is no reliable way to explain why a particular prediction was made.
This is such great news! The city implemented a crappy, biased program (probably against the advice of experts in the fields of social science and data processing, although I have no evidence of that). The most important thing is that they recognized it doesn’t work, and they shut it down.
I’m honestly surprised every time a bureaucracy cuts non-working programs; especially police. Well done, Chicago!
I wish I could share your optimism.
Unfortunately, since it is Chicago, where the police department is infamous for its horrific tactics, it’s more likely they shut it down so they can direct funds toward something like this:
After all those Clearview AI bills won’t pay themselves.
This kind of data-driven policing is frequently corrupted by the people tasked with putting it into place. The same thing is happening in New York with CompStat. Reply All did a great two-parter on how the system is misused or twisted to target certain people because of the prejudices of its operators.
Reminds me of the film Brazil. No one wants to admit there’s someone to blame as long as a machine is part of the process.
This topic was automatically closed after 5 days. New replies are no longer allowed.