Algorithmic decision-making: an arms-race between entropy, programmers and referees

Originally published at: http://boingboing.net/2017/06/01/adversarial-computing.html

1 Like

So there’s a career to be had in supervising AI? O brave new world, with such people in’t!

2 Likes

Furthermore, there’s a career to be had in writing algorithms to optimize the search for people able and willing to supervise AI.

Bigger algorithms have smaller algorithms upon their backs to bite 'em …

1 Like

Too bad I learned to program so long ago that C wasn’t even plus. In fact, if we wanted a plus we had to use a double negative. We called it C minus minus.

1 Like

I found this part not very well thought out. An example is the suggestion that we could use something like serving product recommendations to Amazon customers as a low-stakes testing ground.

First of all, low stakes for what? What higher-stakes arena could one safely extrapolate to?

Second, one of the examples of a big algorithmic blunder was putting product ads on hate-speech videos. So if I’m just browsing for some clothes and I get the recommendation “If you liked this, you might also like Jewish Supremacism by David Duke,” isn’t that basically exactly the same problem? The essential problem is that we can’t think of all the ways algorithmic decision making might get us into trouble, so we really can’t tell a low-stakes decision from a high-stakes one.
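
To make that concrete, here is a toy sketch (Python, with entirely made-up baskets; nothing here is Amazon’s actual method) of the naive “bought together” co-occurrence logic behind this kind of recommendation. Notice that nothing in the code models what the items *are*, so nothing in it could ever flag a pairing as offensive:

```python
from collections import Counter, defaultdict

# Hypothetical purchase histories (illustrative data only).
baskets = [
    {"jeans", "t-shirt", "book_a"},
    {"jeans", "book_a"},
    {"t-shirt", "book_b"},
]

# Count how often each pair of items appears in the same basket.
co_counts = defaultdict(Counter)
for basket in baskets:
    for item in basket:
        for other in basket:
            if other != item:
                co_counts[item][other] += 1

def recommend(item, k=2):
    """Return the k items most often bought alongside `item`.

    There is no notion of context or content here: any item can be
    paired with any other, however offensive the pairing would look
    to a human reviewer.
    """
    return [other for other, _ in co_counts[item].most_common(k)]

print(recommend("jeans"))  # ['book_a', 't-shirt']
```

The same blindness applies to ad placement: the matching logic optimizes a co-occurrence or relevance score, and there is no channel through which “this pairing is harmful” can enter the computation.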

1 Like

A lot of the public discussion about algorithmic decision making really, really needs to be informed by the existing knowledge on the subject.

Knowing what we already know about overfitting, and about the need for continual monitoring and calibration of models, should be part of the discussion; lately, though, the problems of algorithmic decision making are brought up and discussed as if none of this has ever been done before.
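
For what it’s worth, here is a minimal sketch of those two standard practices (Python with scikit-learn, on synthetic data; all numbers are illustrative): checking the train-versus-holdout gap for overfitting, and tracking a calibration metric that would be recomputed on fresh data to catch drift:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, brier_score_loss

# Synthetic classification data: label depends (noisily) on feature 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 20))
y = (X[:, 0] + 0.5 * rng.normal(size=2000) > 0).astype(int)

X_train, X_holdout, y_train, y_holdout = train_test_split(
    X, y, test_size=0.5, random_state=0
)

model = LogisticRegression().fit(X_train, y_train)

# 1. Overfitting check: a large gap between training and holdout
#    accuracy means the model memorized noise rather than learning signal.
train_acc = accuracy_score(y_train, model.predict(X_train))
holdout_acc = accuracy_score(y_holdout, model.predict(X_holdout))
print(f"train accuracy {train_acc:.3f} vs holdout accuracy {holdout_acc:.3f}")

# 2. Calibration monitoring: a rising Brier score over time signals that
#    predicted probabilities are drifting away from observed outcomes
#    and the model needs recalibration.
brier = brier_score_loss(y_holdout, model.predict_proba(X_holdout)[:, 1])
print(f"holdout Brier score: {brier:.3f}")
```

In production the calibration check would run on a schedule against newly observed outcomes rather than a fixed holdout set, which is exactly the “continual monitoring” the field already practices.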

2 Likes

Interesting & helpful, thanks!

This topic was automatically closed after 5 days. New replies are no longer allowed.