A neural net that generates weirdly evocative sentences

Originally published at: https://boingboing.net/2018/03/03/a-neural-net-that-generates-we.html


I like making anagrams of some things…

as with compulsion/cool pin sum



He spends his time messing around with neural nets.
He leaned back against Ann around a foot long.
Arthur moved a line-covered second floor.
Mrs. Heredith’s feet were halfway.
Finally Tommy advanced his three-man.
Being a prisoner advanced, he listened.
This is pretty mesmerizing, I gotta say.


There’s a neural net between our ears that does this kind of stuff, too. Just load it up with fresh tier 2 and 3 vocab from an unabridged dictionary, and off it goes. Helped sometimes by having the flu, or lack of sleep, etc., etc.

And the story in M-D of the “devious-cruising Rachel” brings me up short every time I’m reminded of it:


In the beginning God created the heaven and the earth.
In the beginning M. Franz was touched in his ear.
In the instant everyone looked through it at his papers.
You’ve answered everything out of your mind, my darling, your Majesty.
You’ve now lived even if they’re done, the color of their bullets.
But let them pass through, and let me know if the attack is at my place.
And God saw every thing that he had made, and, behold, it was very good.


The Star-Spangled one could use a little work. The imagery is good, but the meter is way off.

Heh heh, nice


From Hell. Mister Lusk.
His hands grew unseen.
His face seized Dan.
Hawks sighed pityingly.
Careful twenty tries.
Tawney turned white.
Signed Catch me when you can.

Oh yes, this is rather good. And, yes, I have just been reading a certain graphic novel.


I think one of Douglas Hofstadter’s pieces, most likely a Scientific American column collected in Metamagical Themas, suggests a Lisp program that would take the first and last words of novels and generate the content between. It is, of course, left as an exercise for the reader.

Aha! Found it:

-> (def readers-digest-condensed-version
     (lambda (biglonglist)
       (cons (car biglonglist) (cons (rac biglonglist) nil))))

Thus if we apply our new function readers-digest-condensed-version to the entire text of James Joyce’s Finnegans Wake (treating it as a big long list of words), we will obtain the shorter list (riverrun the). Unfortunately, reapplying the condensation operator to this new list will not simplify it any further.

It would be nice as well as useful if we could create an inverse operation to readers-digest-condensed-version called rejoyce that, given any two words, would create a novel beginning and ending with them, respectively, and such that James Joyce would have written it (had he thought of it). Thus execution of the Lisp statement (rejoyce 'Stately 'Yes) would result in the Lisp genie generating from scratch the entire novel Ulysses. Writing this function is left as an exercise for the reader. To test your program, see what it does with (rejoyce 'karma 'dharma).
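For anyone who doesn’t read Hofstadter’s Lisp dialect: `rac` is his mirror-image of `car`, returning the last element of a list, so the whole function just keeps a text’s first and last words. A rough Python rendering (the word list below is a stand-in, not actual Joyce):

```python
def readers_digest_condensed_version(biglonglist):
    """Keep only the first and last words of a text-as-word-list,
    mirroring (cons (car lst) (cons (rac lst) nil)) in Hofstadter's Lisp."""
    return [biglonglist[0], biglonglist[-1]]

# Treating a text as a big long list of words (stand-in example):
words = ["riverrun", "past", "the"]
print(readers_digest_condensed_version(words))  # ['riverrun', 'the']
```

As the excerpt notes, applying it a second time is a fixed point: a two-word list condenses to itself. The inverse, rejoyce, remains an exercise.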



This topic was automatically closed after 5 days. New replies are no longer allowed.