SLIM: An open, transparent, hand-computable sentencing algorithm

Originally published at: http://boingboing.net/2017/07/25/machine-learning-done-right.html

1 Like

If your algorithm is “too sophisticated to explain,” you’re probably doing it wrong. Even the most complex algorithms should be amenable to a straightforward “ELI5” explanation, even if you gloss over the deeper details.
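
To make “hand-computable” concrete, here’s a toy point-based scoring model in the spirit of SLIM. The features, point values, and threshold are invented for illustration; they are not taken from SLIM’s paper or any real sentencing tool:

```python
# Toy illustration of a hand-computable scoring model in the spirit of
# SLIM (Supersparse Linear Integer Models). Features, point values, and
# the threshold are invented for this example -- they are NOT weights
# from any real sentencing tool.

POINTS = {
    "two_or_more_prior_arrests": 2,
    "prior_violent_offense": 3,
    "age_under_25": 1,
    "no_prior_record": -2,
}
THRESHOLD = 3  # flag "high risk" when the total reaches this many points


def score(person: dict) -> int:
    """Add up the integer points for every feature that applies."""
    return sum(pts for feature, pts in POINTS.items() if person.get(feature))


def high_risk(person: dict) -> bool:
    return score(person) >= THRESHOLD


# Small enough to do on paper: 2 + 1 = 3 points, so this person is flagged.
print(high_risk({"two_or_more_prior_arrests": True, "age_under_25": True}))
```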

7 Likes

I’m probably being too optimistic, but lately there have been some hints that courts are more open to the argument that computer processes really can be reviewed by human beings – even those with JDs.

1 Like

Data can’t predict the future, but it can tell us about the past. I’d like to train these algorithms on all the data we have about people, including protected classes like race and gender. Then we could invent hypothetical individuals who are identical in every way except for their race and see what the algorithms recommend.

2 Likes

My concern with their data is that it is already skewed by past LE/JD behaviour, i.e. GIGO: garbage in, garbage out.

1 Like

I’m sure it’s terribly skewed. That’s why I’d like to see what happens if we give the algorithms everything. I don’t believe we can ever produce good sentencing outcomes by basing sentences on this data. I do believe we can use the data to get a very good idea of which direction, and by how much, we allow our judgment to be skewed by things we agree we shouldn’t be taking into consideration.

2 Likes

I think I misinterpreted your original statement, but now I’m just confused. :upside_down_face:

I now understand “everything” to include data not related to incarceration, but if the algorithm is supposed to provide sentencing based on past recidivism rates, what exactly would “everything” add or subtract?

Some thoughts: subtracting race is easy enough (and seems to be accounted for); biasing for or against certain types of violations is easy enough and is allowed in the algorithm (though that seems dangerous to me!); gender needs to stay in, as there are huge statistical differences there. I’m not sure what else needs to be controlled for, or how it would be handled.

(Not arguing any points per se, just looking for more details to better understand what you’re getting at.)

My understanding is that these algorithms aren’t given data on any protected categories, for fear they would discriminate based on them. I recall reading about a bail algorithm that was specifically not given race data (since that would be racist) and that recommended much higher bail amounts for black people anyway. I’d like to see what an algorithm that does consider race does when you give it two identical people who differ only in race.

I’d like to have a website where you can give it a bunch of data, then give it a crime and ask what your sentence would be, and it will tell you what your sentence would be if you were white, and what it would be if you were black. Sure, you could do that just by multiplying by some common factor observed between white and black people’s sentences, but it would be far more interesting to have a learning algorithm of the sort actually used for sentencing tell you.
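
A minimal sketch of that counterfactual probe, assuming you had access to the historical data and trained a stand-in model yourself (the file name `sentencing_history.csv`, the column names, and the gradient-boosting model are all hypothetical):

```python
# Minimal sketch of the counterfactual probe described above: train a
# model on historical data that INCLUDES race, then feed it two records
# that are identical except for race and compare the recommendations.
# The data file, column names, and model choice are all hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

df = pd.read_csv("sentencing_history.csv")  # hypothetical historical data
X = pd.get_dummies(df.drop(columns=["sentence_months"]))
y = df["sentence_months"]

model = GradientBoostingRegressor().fit(X, y)


def counterfactual_sentences(person: dict) -> dict:
    """Predicted sentence for the same person under each race label."""
    out = {}
    for race in ["white", "black"]:
        row = dict(person, race=race)
        row_df = pd.get_dummies(pd.DataFrame([row])).reindex(
            columns=X.columns, fill_value=0
        )
        out[race] = float(model.predict(row_df)[0])
    return out


# Identical defendant, race flipped; any gap is the model's racial skew.
print(counterfactual_sentences(
    {"age": 30, "priors": 1, "offense": "burglary", "zip": "12345"}
))
```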

2 Likes

Wait. Stop for a minute. I think I need to hit the rewind button.

Ummm… since when do courts and parole boards use fancy software to generate criminal sentences, and am I, a humble software developer, the only one who finds this FUCKING TERRIFYING?

3 Likes

10+ years or so

No

4 Likes

I see what you’re getting at now, thanks for expanding!

I wonder if there was economic or geographic (or even age-based) data in there as well? The problem is that everything is skewed against minorities; it’s well-nigh impossible to completely de-bias any given data set. :frowning:

2 Likes

Yeah, if you have racist data going into the system, then unless the system is terrible, it’s going to generate racist outcomes coming out; the only exception is if there is no demographic or geographic way to guess at someone’s race. But that could only be true in a completely integrated society with no racist outcomes. Racist In, Racist Out is the new GIGO.
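
One way to check how guessable race is from the remaining features: drop the race column and see how well a simple classifier can recover it from everything else. A sketch, with a hypothetical data file and column names:

```python
# Sketch of the proxy problem: even if race is dropped from the training
# data, check whether the REMAINING features (zip code, neighbourhood,
# etc.) can predict race. If they can, a "race-blind" model can still
# learn race-correlated outcomes. The data file and columns are
# hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("sentencing_history.csv")  # hypothetical historical data

# Try to recover race from everything EXCEPT race itself.
X = pd.get_dummies(df.drop(columns=["race", "sentence_months"]))
y = df["race"]

proxy_acc = cross_val_score(
    LogisticRegression(max_iter=1000), X, y, cv=5
).mean()
print(f"Race recoverable from 'race-blind' features: {proxy_acc:.0%} accuracy")
# Much better than chance => Racist In, Racist Out, with or without the column.
```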

2 Likes
