We need a new science of measurement to figure out when AI is working and why it fails

Originally published at: http://boingboing.net/2016/09/12/we-need-a-new-science-of-measu.html



Maybe we could start by taking the lesson of this comic to heart…


Or this one:

ETA: link to original:


I just saw this in another thread!

But I have to say, if something is intelligent, the adjective “artificial” becomes meaningless. We were all created by the efforts of women (with a small contribution from men, for genetic variety’s sake). Whether or not you believe your programmers used artifice to create your intelligence seems mostly a matter of hubris.

None of the things people today call AI seem at all intelligent to me.


I think that an even more important thing is to realize that this clusterfuck was not an AI failure.

An algorithm has at best flagged the initial photo, which is to be expected: it does indeed contain nudity, and a piece of code cannot reasonably be expected to know the historic context of this particular picture. Considering the volume of traffic and content Facebook handles, it is not reasonable to expect human staff to review every single picture by hand either.

However, the real epic fail started with the incredibly incompetent reaction of the (human) team that actually reviews the flagged content (either right away or on appeal): their petulant digging in of heels, banning a critic and censoring articles critical of them. I wonder whether those people actually know the importance of the image at all; if Facebook has offshored this work to India or elsewhere in Asia, then it is not surprising that they don’t …

Oh, and the idiotic scripted “sorry but policy!” reply of Facebook’s rep, trying to hide behind an arbitrary, made-up rule instead of engaging the brain.

This has been a completely human failure, AI has played only a very minor part in it. AI doesn’t ban users for criticism, petulant moderators do.
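The division of labor described above can be sketched in code. This is a purely hypothetical illustration, not Facebook’s actual system: the class names, thresholds, and review logic are all invented to show where the automated stage ends and human judgment (and therefore human failure) begins.

```python
# Hypothetical sketch of a flag-then-human-review moderation pipeline.
# All names and thresholds here are illustrative assumptions, not any
# real platform's implementation.

from dataclasses import dataclass
from typing import Callable


@dataclass
class Post:
    post_id: int
    nudity_score: float  # classifier confidence in [0, 1]


def triage(post: Post, flag_threshold: float = 0.8) -> str:
    """Automated stage: route likely violations to a human queue.

    The classifier can only score surface features; it cannot know
    whether the image is a historic war photograph.
    """
    if post.nudity_score >= flag_threshold:
        return "needs_human_review"
    return "published"


def human_review(post: Post, knows_context: Callable[[Post], bool]) -> str:
    """Human stage: this is where context lives, and where it failed."""
    if knows_context(post):
        return "published"  # e.g. a famous, newsworthy image
    return "removed"        # the 'sorry but policy!' outcome


# The same flagged photo ends up published or removed depending entirely
# on whether the human reviewer recognizes its context.
photo = Post(post_id=1, nudity_score=0.97)
assert triage(photo) == "needs_human_review"
assert human_review(photo, knows_context=lambda p: True) == "published"
assert human_review(photo, knows_context=lambda p: False) == "removed"
```

The point of the sketch is that the algorithm’s only job is triage; the publish-or-remove decision is a human one, which is why the outcome here was a human failure rather than an AI one.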

Of course, this doesn’t mean that a methodology for evaluating and tracking AI failures isn’t needed; it just isn’t needed because of this particular PR disaster.


May I ask why you blacked out parts of the photo? It seems that the impact of the picture, showing the brutal effects of bombing civilians with napalm, is somewhat diminished.
The way I see it, if you as the editors are offended by the picture, you should not show it at all, since it is meant to offend.

Here is some of the context you have missed…

And here is some more…


I think the whole thing is going exactly as Facebook would expect:

  1. someone posts nude child, gets flagged
  2. human reviews. Famous war photo. Should we un-flag?
  3. human reviewer’s human manager considers the question and determines that while it’s not obscene, it’s also not something he’d want appearing in his Nana’s feed, so to hell with it.
  4. Internet pitches a fit. Should we reconsider?
  5. Human PR manager considers the issue, weighing the number of times the internet pitching a fit has actually mattered in the past, as well as how quickly this teapot tempest is likely to settle down so everyone can get on with their lives. Again, to hell with it.
  6. Internet continues fit, calls decision bullshit.
  7. Human PR manager silently acknowledges bullshit, but publicly proceeds with operation “let’s get on with our lives,” citing policy she has no control over. Expects more fits tomorrow. She’s paid to do nothing but deal with fits, so to hell with it; job security.
  8. Internet fit continues.

    N-1. image of war atrocities replacing victims with posed kittens goes viral. Facebook could not care less.
    N. video of kitten moving in such a way that she appears to be doing the latest hip-hop dance craze goes viral. Atrocities forgotten.
    N+1. Everyone at Facebook goes back to normal, continues to pay high rent on apartments, leads a tedious life. Except Mark Zuckerberg. He still has an island somewhere.

Who are the hypothetical people who wouldn’t be sufficiently affected by an account of this incident without also seeing the lurid photo? It’s superfluous and excessive, and not the sort of gruesome clickbait that people need to be assailed with while casually browsing their Facebook feed. The story should be told, but there is no moral imperative dictating that this image must or should be shown, and Facebook just isn’t the appropriate venue for it. The rejection of this particular image is a dubious basis for a pedantic discussion on AI. The whole thing almost feels like a false flag intended to embarrass and discredit dogmatic free speech demagogues.

Let’s be honest though: this is about certain people feeling entitled to post whatever they want on platforms that they didn’t create and that they don’t pay to operate and maintain. “I want to post whatever I want on your platform” is not an engineering problem that Facebook needs to solve for you. There are still plenty of places online where such images can be shared, albeit without the reach that you get on Facebook—by agreeing to the terms that they’re well within their rights to set.

We all pay, by Facebook trawling our data and selling it to companies to exploit. Just because we don’t transfer any money ourselves doesn’t mean there isn’t a price.


I’m rather taken aback by the pearl-clutching over this photo. First off, people seem to be forgetting it was taken during the Vietnam War. It’s won awards, been in major magazines, is published in history books, is shown as a still in plenty of documentaries. I don’t think I’ve gone two years of my life without seeing it reproduced somewhere, and I’m nearly the same age as the main subject of the photo.

Yes, it’s a photo of a hurt, naked child screaming.


The point was to bring the real costs of that war, and all past and future wars, home.

All this talk of blurring out these bits and cutting out those bits and only showing it in certain arenas makes me feel ill.

You’re SUPPOSED to feel uncomfortable. You’re SUPPOSED to want the situation in that photo to never happen again. You’re SUPPOSED to feel your heart reach out to that girl and the children running with her.

You’re not supposed to sniff about appropriateness or (so help me) discuss whether or not it’s porny.

Congratulations, BBS, you just out-pruded the news magazine mores of the Nixon generation. The adults of the 1970s knew that was a war photo. It seems that’s got lost.

All the more reason to re-publish it.


At least for the moment, the AIs being played with are pretty good at specific tasks like this one.

This topic was automatically closed after 5 days. New replies are no longer allowed.