Using Machine Learning to synthesize images that look NSFW but aren't


#1

Originally published at: http://boingboing.net/2016/10/21/using-machine-learning-to-synt.html


#2

Eeeeeewwwwwwwwww!!!


#3

It’s per se NOT per say!
Add this to “breaks” on a car.


#4

Volcanic-crack infection is particularly brutal.


#5

Rule 34


#6

Finally!

It’s about time silicon valley started doing its part to address the Internet’s catastrophic porn shortage


#7

Take your car in and get those breaks checked out!


#8

We don’t cotton to no machine learnin’ 'round here.


#9

In the future, people will exit porn for 15 minutes.
The Internet of Side-Eyeing Things will save any of it from being distracting.
People will flag content for counterpurpose inequality and valid flags will replace ads. (ObBoop)
The cloud of brainbleach will still be a match 3 social game, but people will build an axiomatic wisdom of how to dress up any hot mess whether it breathes, transpirates, heat tempers or quakes.


#10

Looking at the original article, it’s clear that the author has discovered a herring-gull-beak effect (what ethologists call a supernormal stimulus). Those who remember the original work of Tinbergen will know that herring gull chicks were stimulated to open their beaks by the mother’s yellow beak with its red spot. A yellow pencil with a larger red spot elicited the same behaviour to an even greater degree. This stimulated [deliberate pun] work that led to an understanding of, for instance, the objectification of women.
The neural network is assigning high probabilities for pornography to images that are not literally pornographic at all but look a bit like randomly distorted parts of human reproductive anatomy. It’s going to block everything related to sex education and a whole lot of other biology.
Or, in other words, the neural network has been designed by people with a screwed sense of values and, surprise surprise, it echoes those values. Garbage in, garbage out.


#11

Plus, as of today, you have to prove you’re eighteen to get from Yahoo to BoingBoing.


#12

I think that you may be asking too much of a single algorithm. One could always use this as a pre-filter and eliminate images that very likely aren’t porn. Alternatively, you could couple it with information about the surrounding text and other images. (The structure and text of medical sites look nothing like porn sites.)
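To make the suggestion concrete: a minimal sketch of the two ideas above, using a hypothetical image-NSFW probability and a hypothetical score from the surrounding page text (both names and thresholds are assumptions, not anything from the article or a real API).

```python
# Sketch only: "image_score" and "text_score" are hypothetical model outputs
# in [0, 1]; the cutoffs are illustrative, not tuned values.

def moderate(image_score: float, text_score: float,
             safe_cutoff: float = 0.1, flag_cutoff: float = 0.9) -> str:
    """Use the image model as a pre-filter, then fall back to page
    context for the ambiguous middle band."""
    if image_score < safe_cutoff:
        return "allow"   # pre-filter: almost certainly not porn
    if image_score > flag_cutoff:
        return "block"   # high-confidence image signal alone
    # Ambiguous images: let the surrounding-text signal decide
    # (medical-site text should pull the combined score down).
    combined = 0.5 * image_score + 0.5 * text_score
    return "block" if combined > 0.5 else "allow"

print(moderate(0.05, 0.90))  # clearly safe image -> allow
print(moderate(0.50, 0.10))  # ambiguous image, medical-looking page -> allow
print(moderate(0.50, 0.95))  # ambiguous image, porny context -> block
```

The point is only that the per-image classifier never has to be the whole decision; it can gate the cheap, easy cases and delegate the ambiguous middle to other signals.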


#13

Rule 34, as @Stilgarr notes above.


#14

Good thing I’m working from home today!


#15

I think this machine learning has invented something new: Abstract Pornography.


#16

things that aren’t porn, but look porny.

Porny?


#17

Pssh. Everyone knows d(X) pix are NSFW. We needed computers to tell us that?


#18

I know, right? Shouldn’t it be “pornish” or “pornesque” or “porn-adjacent” or something?


#19

Seriously…now all I think about is 2 Live Crew, Me So Horny (NSFW)…


#20