Originally published at: https://boingboing.net/2018/09/10/black-box-justice.html
…
Honestly, despite what a mess “algorithmic assessments” are, I think this can’t possibly be worse than the previous status quo. Basically, anything that puts bail bondsmen out of business is a good thing. And the cynical silver lining is that because the state’s chronically overcrowded jails have already been ruled cruel and unusual punishment, they simply won’t be able to dramatically increase the number of poor defendants in pre-trial detention.
This is probably no great comfort to the people who will be unjustly detained over the next four years.
For an in-depth analysis of the problems with using data analytics to make social decisions, I highly recommend reading “Weapons of Math Destruction” by Cathy O’Neil. She studied several examples, but the one that sticks out for me is the use of standardized test scores under No Child Left Behind to identify “failing teachers” and “failing schools”. What happened is that the testing regime was immediately gamed by savvy school superintendents; the ones who dealt with real student problems and honestly tried to improve their schools (instead of drilling crappy tests into their students’ heads) found themselves on the street.
Famous last words.
I certainly could be wrong here. It’s hard to find something so shitty that we can’t make it worse. But the bail bonds industry is pretty terrible.
What this thing will do is wrap the previous status quo in a shroud of pseudoscience. The data it will use is really just a snapshot of the status quo, in which your race, social background, income level, etc. already mark you.
Next up: precrime.
I’ll take “‘black box’ algorithm probably ‘black cell’ algorithm” for $500, Alex…
All snark aside, this will probably end up helping some of the people who would otherwise have gotten an “as far as your economic status is concerned, it might as well be eleventy-billion dollars” bail; but black-boxing has the invidious effect of concealing the logic behind a given decision, which provides a tough resistive layer against anyone who would attempt to appeal a decision made by justice-bot.
Meat-judge models aren’t notably impartial; but people aren’t generally under the impression that the black box (it’s a grey and heavily vascularized black box, work with me here) between one’s ears is full of impartial algorithms; so “eh, I went with my gut” sounds dodgy, while formalized guidelines written for humans tend to be relatively (relatively) short and simple, so one can either challenge them or challenge an application of them. An expert system of nontrivial size is less tractable.
So what’s the fast Fourier transform of blackness look like?
If I remember correctly, it’s also black. The Fourier transform maps zero to zero.
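For anyone who wants to check the joke: the transform is linear, so an all-zero input gives an all-zero output. A quick sketch with NumPy (the signal length here is arbitrary):

```python
import numpy as np

# An all-"black" (all-zero) signal of arbitrary length.
signal = np.zeros(16)

# Its FFT is also all zeros: the Fourier transform is linear, so F(0) = 0.
print(np.allclose(np.fft.fft(signal), 0))  # True
```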
Out of the frying pan into… the frying pan?
It’s definitely terrible. I once got locked up with a homeless guy I already knew, who was in there basically for being homeless. Pretty fucking heartbreaking. I know that’s not a unique story, but personal experience always drives things home in a very real way. I can guarantee that an algorithm would put him in the same position, though. That’s how they work: look at what happened before, and do it again. It’s literally a recipe for disaster. Social atomization is the name of the game here, and more mediation means further removal from living beings.
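A minimal sketch of that “look at what happened before, and do it again” dynamic, assuming a model trained to imitate past detention decisions; every name and number below is made up for illustration (using NumPy and scikit-learn):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical "historical" data: past detention decisions driven largely
# by a socioeconomic proxy (stable address yes/no), not by actual risk.
n = 5000
stable_address = rng.integers(0, 2, n)  # 1 = defendant has a stable address
flight_risk = rng.integers(0, 2, n)     # the thing we'd actually want to predict

# Past judges detained 80% of no-address defendants and 20% of the rest,
# regardless of actual flight risk.
detained = rng.random(n) < np.where(stable_address == 0, 0.8, 0.2)

# Train a model to imitate the past decisions.
X = np.column_stack([stable_address, flight_risk])
model = LogisticRegression().fit(X, detained)

# The model faithfully reproduces the old pattern: no stable address in,
# detention out, whatever the actual risk was.
print(model.predict([[0, 0], [0, 1], [1, 0], [1, 1]]))
```

The model never sees anything except the old decisions, so the only thing it can learn is how to repeat them.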
In each individual decision, it probably won’t be worse. What it will likely do is make it much harder to improve that system in the future.