Could this be misused? Definitely.
Will it be misused? Possibly.
Could it be used properly? Definitely.
I don’t get the claim that it’s pseudoscience; the technical champion seems to be a legitimate academic and the approach is widely used. The pseudoscience accusation seems to be based on the fact that the data contains noise and biases, but very few datasets don’t, and that doesn’t stop them from being used for proper science.
Now, the fact that it’s a scientifically valid approach still doesn’t mean it’s a good idea. The article suggests these people will be targeted for minor police harassment, which would be a big problem. However, if the focus is intervention, I don’t see why this is any different from proper community policing.
If you think it could ever be used legitimately, you should be institutionalized as a possible danger.
As an employer it would be nice to know who is on this list so I can avoid hiring them. As an insurer it would be critical information in determining rates to charge them. As a lender I would like to know so as to refuse a loan… Of course we would never gain access to this important data now would we?
So the algorithmic system warns people who are at risk, due to their social networks, that they may be more likely to be caught if they commit a crime. Therefore the people judged to be at risk (by the algorithm) who were contacted would, in theory as rational entities, be less likely to commit a crime.
In other words, the CPD is using a data driven analysis to prevent homicides, assaults and other forcible felonies to make the streets of Chicago safer for those most likely to be victimized by violent criminals.
Sure beats “stop-and-frisk” in NYC any day.
Not bloody likely
Yeah, no, it’s pretty much the same thing as “stop and frisk” in that there’s a baseline presumption of guilt (or perhaps more accurately the “scientifically proven” high probability of future guilt), which is leading to increased surveillance, home visits by the police to intimidate (prior to any actual crime happening), etc…
Will there be unfortunate outcomes of people being on the “heat list”? Young (presumably black) men being shot while reaching for ID after being told to do so by police because the police “had scientific proof” that the individual in question was statistically more likely to be violent or carry a gun? Only time will tell.
Is it fundamentally against the presumption of innocence that our legal system is at least theoretically based on to single people out prior to any actual wrongdoing? You bet your bippy!
I’m willing to bet all that money could be better spent training officers to not beat unarmed people up.
Perhaps not officially, but unofficially, which the officials involved will spin as measures to improve the accuracy of their pre-crime predictive models.
When I was growing up, my hometown sheriff would cruise through the predominantly black ghetto in a patrol car, windows rolled down, calling over black youths from the sidewalk or the basketball court or the front steps and porches of their own homes in order to “give them a good talking to”, telling them they were no good rotten filth, that they and all their family and friends were a societal disease, that he looked forward to seeing them behind bars.
He’d laugh as they sat there and took his abuse, because those who didn’t invariably ended up with their homes raided under warrants issued thanks to “information from an anonymous informant” and they always conveniently “found” hard drugs in the process. I remember when he finally died, there was a massive celebratory block party throughout all the neighborhoods he would frequent. For a while I heard a particular phrase shouted out at school at every chance, “Ding dong, the prick is dead!”.
The police ostensibly exist to serve and protect. But anymore it seems all they do is intimidate and destroy.
You think a particular portion of the community is at risk to engage in criminal behavior? You even have fancy algorithms and predictive modeling programs and whatnot? Fine. Start an outreach program. Find ways to better the lives of those most at risk. Help them to lead happy, safe, productive lives. Develop trust and foster community.
Happy people don’t become hardened criminals. Desperate, miserable people do. If you want a world with less crime, then work toward a world in which fewer people have a reason to break the law. If instead you want a world with more crime, send your jackbooted thugs around to “squeeze” the most at risk segments of society.
I already started a pre-pre-crime database, and guess who was in it? That’s right: the entire CPD and Miles Wernick. Stats don’t lie! This was by-the-numbers.
The system indicates that kid X is at very high risk of becoming a criminal. A pair of officers visit the school and ask the teachers about the kid in private to see if the flagging was accurate; if so, they arrange a meeting with the kid and parents to discuss their concerns.
That seems like a legitimate and possibly very beneficial use of this system.
I often wonder how things like this float on by without a whole hell of a lot of challenges from the public.
Then I see positive comments here from a crowd I would generally assume to be pretty smart.
Guess I know why, now…
It seems like harassment.
Would love to see the algorithm FOIA’d.
(Also, nice graphics in the Verge article.)
Why are you more sure that it will be used effectively than you are that it will be misused?
Do you know how it works? If not, then how can you be so sure?
Please tell me.
Yup. Got a whole bunch of people who never heard about lies, damned lies and statistics. Cory is entirely correct: if we can’t see their work, if no one has the means to check and duplicate their research, this is no more scientific than alchemy.
And you know pretty well who will be seeing these “interventions” - won’t be the spoiled white kid on the right side of town who’s bidding fair to become the school’s blow dealer. (I’m thinking kids like the pair who grew up to hold high elected positions in a major Canadian city.) You also know that the kinds of “interventions” the kids will encounter will sharply limit their opportunities to escape their situations.
But “nice” folk will like it: it’s “preventing crime”, and that means they don’t have to worry about really changing the very real social and economic handicaps these “bad” kids (gotta be bad - the stats say so) face that cause the majority of the crime in the first place. Equality of opportunity is all well and good, provided the “nice” folk are more equal.
<sigh> My stomach hurts.
I’m very skeptical, but I’m also reacting to the knee-jerk opposition and the cheapening of the word ‘pseudoscience’.
Yes, the police have a lot of institutional problems and a lot of bad programs, but that doesn’t mean every program they run is automatically bad. If you automatically assume the cops are guilty of every possible infraction, then the people you hope to convince will simply tune you out.
We could ban all data collection of any sort tomorrow and we would still be hosed for the rest of our lives. They literally have our numbers.
The alcoholic parent just won’t show up. It won’t help.