Fhtagn! The inceptionized route from noise to latent doglizards, in 5:36


[Read the post]


you just know that somewhere out there some mutant asshole is re-training the neural network with body parts and instead of the jokesters getting back porn-shots where things have turned into dogs and birds and plants we are all going to be getting pink throbbing giger landscapes inserted into our eyesockets everytime we go online for the next few months


NSFW: http://www.jwz.org/blog/2015/07/deepdream-porn/


Doesn’t the MIT database have like 1000 different net pattern trainings? Most of the pics I have been seeing seem to concentrate on finding dogs and/or eyes. I wonder if some of the more popular software out there might not facilitate easily changing or creating new models.
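For what it's worth, in the DeepDream IPython notebook the model is just a pair of file paths handed to Caffe, so swapping in a different trained network (say, one of the MIT Places models) should mostly be a matter of editing two lines. A sketch, with illustrative paths and file names (the actual names depend on which model you download):

```python
# In the DeepDream notebook the network is selected by two files:
# a .prototxt describing the architecture and a .caffemodel with the
# trained weights. Point them at a different model to change what
# the net "sees". Paths and file names below are illustrative.
model_path = 'models/googlenet_places205/'       # e.g. an MIT Places model
net_fn   = model_path + 'deploy.prototxt'        # network architecture
param_fn = model_path + 'places205.caffemodel'   # trained weights
```

You'd likely also need to update the `end=` layer name passed to the dreaming function, since layer names differ between models.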


man i have been trying to get this going under mac os x… got all the way to the point where the notebook will finally run, and now BLAS is crashing python. might have to start over with homebrew instead of macports…?



That keeps clocking upward

(24-hour delay)


They need to get more cats up in that piece.




It’s turtles (and dogs) all the way down.


WARNING the JWZ link from the main page is NSFW! (Goatse, even after inceptionalization, is still Goatse).


Yep. That’s called the Müller-Fokker Effect.


I found the silence disconcerting. This video needs some nice music, like Trois Gymnopédies.


First comment from the source page:

“I need many like buttons for this. Like buttons with extra eyes and dog heads.”

EDIT: Oops, sorry, replied to OMike instead of CDoc.


Praise the Elder Gods! Let us pray that Cyriak never gets aholt of this technology, lest civilization fall into the Dark Abyss of Madness!


You are messing with the code, right? Presumably there’s a network of “neurons”, and training them means establishing the strength of the connection between each pair. It astonishes me that a lil’ ol’ list of numbers can encode such a richly detailed dog-turtle-nose-eyes hallucinoscape, and that every training set reduces to its own list of numbers. I am curious about how many numbers it actually takes, something less than (N^2)/2. Can you come up with an estimate of the number of numbers, and the size of each one? Just to calibrate my astonishment level? Thanks!
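To take a rough stab at calibrating that astonishment: the network DeepDream uses (GoogLeNet) reportedly has on the order of 5 million learned weights, each typically a 32-bit float, so roughly 20 MB of numbers, far fewer than an (N^2)/2 fully-connected guess, because convolutional layers share the same small set of weights across the whole image. A back-of-the-envelope sketch of how the per-layer count works out:

```python
def conv_params(in_ch, out_ch, k):
    """Number of learned values in a k x k convolutional layer:
    each of the out_ch filters has in_ch * k * k weights plus one bias."""
    return out_ch * (in_ch * k * k + 1)

# GoogLeNet's first layer: a 7x7 convolution taking 3 input channels (RGB)
# to 64 output channels.
first_layer = conv_params(3, 64, 7)
print(first_layer)  # 9472
```

Summing that formula over all the layers is what gets you into the millions; the weights are shared across every position in the image, which is why the count stays so far below N squared.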


The puppyslug is totally crushing it.


well i am not there yet. i had a heck of a time just getting the python to “compile” properly - lots of dependencies. and when i finally got them all, the python interpreter crashes, so nothing happens. this thing is distributed as a “notebook” for ipython, which is kind of cool but is something new to me.

life is computation after all, so you should not be surprised by any of this… hell the entire universe is just one big computation :smile:


I tried to talk to my wife about this.

Wife: No, I haven’t seen that.
Me: But it’s everywhere right now!
Wife: That’s because you get the weirdo internet.

Did you ever want to play questions?

I want this on IMAX with Pink Floyd.


This topic was automatically closed after 5 days. New replies are no longer allowed.