Using Machine Learning to synthesize images that look NSFW but aren't

Originally published at: http://boingboing.net/2016/10/21/using-machine-learning-to-synt.html

4 Likes

Eeeeeewwwwwwwwww!!!

6 Likes

It’s per se NOT per say!
Add this to breaks on a car.

19 Likes

Volcanic-crack infection is particularly brutal.

13 Likes

Rule 34

6 Likes

Finally!

It’s about time Silicon Valley started doing its part to address the Internet’s catastrophic porn shortage.

30 Likes

Take your car in and get those breaks checked out!

4 Likes

We don’t cotton to no machine learnin’ 'round here.

8 Likes

In the future, people will exit porn for 15 minutes.
The Internet of Side-Eyeing Things will save any of it from being distracting.
People will flag content for counterpurpose inequality and valid flags will replace ads. (ObBoop)
The cloud of brainbleach will still be a match 3 social game, but people will build an axiomatic wisdom of how to dress up any hot mess whether it breathes, transpirates, heat tempers or quakes.

16 Likes

Looking at the original article, it’s clear that the author has discovered a herring gull beak effect. Those who remember the original work of Tinbergen will know that herring gull chicks were stimulated to open their beaks by the yellow beak with red spot of the mother. A yellow pencil with a larger red spot elicited the same behaviour to a greater degree. This stimulated [deliberate pun] work that led to an understanding of, for instance, the objectification of women.
The neural network is assigning high probabilities for pornography to images that are not literally pornographic at all but look a bit like randomly distorted parts of human reproductive anatomy. It’s going to block everything related to sex education and a whole lot of other biology.
Or, in other words, the neural network has been designed by people with a screwed sense of values and, surprise surprise, it echoes those values. Garbage in, garbage out.
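The "herring gull beak" point above is essentially activation maximization: if you ascend the gradient of a classifier's output, you get a supernormal stimulus that scores higher than any real example. A minimal toy sketch of that idea, using a one-parameter stand-in "classifier" instead of a real NSFW network (all names and values here are hypothetical, not the article's actual code):

```python
import math

def score(x, w=3.0, b=-2.0):
    # Toy stand-in for a trained classifier's probability output:
    # sigmoid of a linear function of the single "pixel" x.
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

def maximize_score(x=0.0, w=3.0, b=-2.0, lr=0.5, steps=200):
    # Gradient ascent on the input, holding the model fixed --
    # the same recipe that produces "deep dream"-style images.
    for _ in range(steps):
        s = score(x, w, b)
        # d(sigmoid)/dx = s * (1 - s) * w
        x += lr * s * (1.0 - s) * w
    return x

x_final = maximize_score()
print(score(x_final))  # far higher than the starting score of ~0.12
```

The optimized input need not resemble any natural image at all, which is exactly why the results look like "randomly distorted parts of human reproductive anatomy" rather than photographs.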

13 Likes

Plus, as of today, you have to prove you’re eighteen to get from Yahoo to BoingBoing.

3 Likes

I think that you may be asking too much of a single algorithm. One could always use this as a pre-filter and eliminate images that very likely aren’t porn. Alternatively, you could couple it with information about the surrounding text and other images. (The structure and text of medical sites look nothing like porn sites.)
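That pre-filter-plus-context pipeline can be sketched in a few lines (the thresholds and the simple averaging rule below are illustrative assumptions, not a tested design):

```python
def classify_page(image_score, text_score,
                  prefilter_threshold=0.1, combined_threshold=0.7):
    """Combine an image classifier score with a hypothetical
    page-text score, both in [0, 1] where 1 means 'looks like porn'."""
    # Step 1: pre-filter -- images the model is confident are
    # not porn pass straight through.
    if image_score < prefilter_threshold:
        return "allow"
    # Step 2: for everything else, fold in evidence from the
    # surrounding text (a medical site would score low here).
    # A plain average is an assumption; any combiner would do.
    combined = 0.5 * image_score + 0.5 * text_score
    return "block" if combined >= combined_threshold else "review"

print(classify_page(0.05, 0.9))  # clearly-safe image: "allow"
print(classify_page(0.95, 0.1))  # anatomy diagram on a medical site: "review"
print(classify_page(0.95, 0.9))  # both signals agree: "block"
```

The point is that the image model alone never makes the final call on ambiguous images; it only triages them.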

2 Likes

Rule 34, as @Stilgarr notes above.

3 Likes

Good thing I’m working from home today!

5 Likes

I think this machine learning has invented something new: Abstract Pornography.

16 Likes

things that aren’t porn, but look porny.

Porny?

7 Likes

Pssh. Everyone knows d(X) pix are NSFW. We needed computers to tell us that?

5 Likes

I know, right? Shouldn’t it be “pornish” or “pornesque” or “porn-adjacent” or something?

7 Likes

Seriously…now all I think about is 2 Live Crew, Me So Horny (NSFW)…

5 Likes