The Safe Face Pledge: an ethical code of conduct for facial recognition scientists and developers

#1

Originally published at: https://boingboing.net/2019/01/28/ibm-at-auschwitz.html

4 Likes
#2

https://boingboing.net/2019/01/28/ibm-at-auschwitz.html
That URL!

2 Likes
#3

It’s a nice thought, perhaps I will frame it on my wall next to the ethical code of conduct for ebola-coated hollow-point bullet manufacturers.

I mean, OK, there are defensible uses for face recognition, but still. If you create something with obvious harmful applications, you don’t get to absolve yourself by piously expecting everyone else to refrain from those applications. But, conversely, it is the people who use a technology to do harm who are to blame for that harm, not the people who created it.

Anyway, why make these ethical principles specific to face recognition, other than for SEO purposes? The problem is the industrialised processing of any personal information, whether it’s faces or email or credit card numbers. Focusing on one buzzwordy, niche aspect can be a smokescreen for the larger evil.

That’s why the fashionable conflation of “AI” with big data is frustrating. The present hype would have us believe that “making software smarter” and “owning monstrous silos of private data” are the same thing; but that’s kinda stupid when you think about it. A truly advanced face recognition system would be able to recognise faces without any training data, like babies can. In a sense, a system that can recognise faces by having a database of every face that exists is no smarter than a phone book.
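To make the phone-book point concrete: a lot of deployed face “recognition” boils down to nearest-neighbour lookup against a stored gallery of embeddings. Here’s a toy sketch of that idea (the names, vectors, and the `recognise` helper are all made up for illustration; real systems use learned embeddings, but the lookup step is structurally the same):

```python
import math

# Toy "face database": name -> embedding vector (made-up numbers).
# In this framing, "recognition" is just nearest-neighbour lookup --
# functionally a phone book keyed by embedding instead of by name.
database = {
    "alice": [0.9, 0.1, 0.3],
    "bob":   [0.2, 0.8, 0.5],
    "carol": [0.4, 0.4, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def recognise(query):
    """Return the database entry whose embedding is most similar to the query."""
    return max(database, key=lambda name: cosine(database[name], query))

print(recognise([0.85, 0.15, 0.25]))  # nearest to alice's vector
```

The “intelligence” is mostly in the embedding model; the identification step itself is a directory lookup, which is why the system only works if someone has already warehoused everyone’s face.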

I’m sure some of this hype is plain old-fashioned dumbness, but I bet it’s partly a deliberate push by folks like Google and Facebook and their investors. They own mountains of data, which are held to be worth billions, but it’s increasingly difficult to explain how that data is so valuable.+ So there’s a powerful financial incentive to persuade the world that mass surveillance is the only way to achieve the lucrative vaporware of the future.

+ One answer would be “we can sell it to tyrants and criminals”, but while that would be a big deal for the victims (i.e. everyone), it’d be a rounding error in terms of Googbook’s accounts.

1 Like
closed #4

This topic was automatically closed after 5 days. New replies are no longer allowed.