I can understand the problem. Long ago I wrote a program to detect near-duplicate images. Of course it required a human eyeball to verify, but it was very good at what it did, unless confronted with a beach scene. I never did figure out how to fix the algorithm.
There was a rumour back in the day that miscreants were distributing illegal porn as negative images, to avoid triggering filters based on skin colour…
Well, you can almost understand the Grace Kelly. Man, she was hot.
You DO know black, yellow, and more rarely “prismatic” sand exists, correct?
Sandy Vagina!! I’ve got all their albums!
Though it isn’t covered by this story, I think there’s some decent chance it doesn’t recognize black skin at all.
Not even on beige sand?
I mean just at all. If it can’t tell white skin from sand then it’s apparently looking at a colour palette and saying that things in a certain colour range are skin. There’s a decent chance that colour palette is just for white people.
I’m not saying it wouldn’t be able to tell sand from a black person. I’m saying I think there’s a pretty good chance that if you gave it an image of a black person at the beach it would think the sand was skin and the black person wasn’t.
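For anyone curious what "a colour palette saying that things in a certain colour range are skin" looks like in practice, here's a minimal sketch. The actual filter's rule is unknown; this uses the well-known Peer et al. RGB heuristic as a stand-in, and the example colour values are illustrative guesses. It makes exactly the failure mode described above plausible: beige sand falls inside the range, darker skin tones fall outside it.

```python
def is_skin(r: int, g: int, b: int) -> bool:
    """Classic naive RGB skin-colour heuristic (Peer et al. style rule).

    Classifies a pixel as "skin" purely by where it sits in RGB space,
    with no context about what the pixel actually belongs to.
    """
    return (
        r > 95 and g > 40 and b > 20          # bright, warm pixel
        and max(r, g, b) - min(r, g, b) > 15  # not greyscale
        and abs(r - g) > 15                   # red dominates green
        and r > g and r > b                   # red is the strongest channel
    )

# An approximate beach-sand colour passes the test...
print(is_skin(194, 178, 128))  # sand-coloured pixel → True ("skin")

# ...while an approximate dark skin tone fails the r > 95 check.
print(is_skin(66, 43, 33))     # dark skin tone → False ("not skin")
```

So under a rule like this, a photo of a Black person on a beach could plausibly come back with the sand flagged as skin and the person not flagged at all.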
I see. Thanks for the clarification.