Some pretty impressive machine-learning generated poetry courtesy of GPT-2


Originally published at:


I expect “The Emperor Wu” would be quite happy with his poem. In fact, I think Kim Jong Un may wish to adapt it for his admirers. Many years ago I used to amuse myself with the musings of Racter, which I found hilarious in the accidental, unfortunate “newspaper adjacent headline” way; otherwise I have little interest in machine-generated art… it’s just servile. If the machines were to generate art intended to be appreciated by other machines, that might have some value :slight_smile:



Very interesting! This opens new dimensions for re-interpretation of world literature.
Btw, GPT-2 interpreted the poetry of Samuel Taylor Coleridge during my humble experiments. (Sorry for the shameless self-promotion, just couldn’t resist.)


As with a lot of machine-learning applications, I suspect that this one would be best served as the basis for a centaur, with the model churning out tons of poetry and a human acting as a discerner to throw away the garbage and pluck out the happy accidents.
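Part of the discerner's job could even be automated before a human ever reads a line. As a rough sketch (my own toy heuristic, not anything from the article), a script could discard samples that collapse into degenerate repetition, like the eleven-fold “Emperor Wu” refrain, by measuring the fraction of distinct words:

```python
def distinct_word_ratio(text: str) -> float:
    """Fraction of distinct words in the text; values near 0 indicate
    a sample that is mostly one phrase repeated over and over."""
    words = text.lower().split()
    if not words:
        return 0.0
    return len(set(words)) / len(words)

def prefilter(samples: list[str], threshold: float = 0.4) -> list[str]:
    """Keep only samples varied enough to be worth a human's attention."""
    return [s for s in samples if distinct_word_ratio(s) >= threshold]

samples = [
    "The Emperor Wu (the great Wu), majestical, " * 11,  # degenerate repetition
    "Frost gathers on the window while the kettle sings its small song",
]
survivors = prefilter(samples)  # only the varied sample survives
```

The threshold of 0.4 is an arbitrary illustration; the human still does the real discerning, this just thins the garbage pile.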

One day we will all be Mechanical Turks. It will be the last job.


For some people, even bad poetry is good poetry.


OMG, that Goethe shoop.


There’s a poem that’s just “The Emperor Wu (the great Wu), majestical,” repeated 11 times, then a bunch of repetitions of “The Emperor Wu (the great Wu), rapacious,” salted with the odd “majestical.”

There’s got to be someone out there who has already been paid a few thousand dollars for having done pretty much the same thing. Perhaps the only thing missing is to play a Casio demo song while reading it aloud.




I took a great lesson away a while back from an interview with Jon Anderson (front man of the prog-rock group Yes). Music critics at the time complained about the “over-intellectualized” lyrics in his songs, meaning they had to work to eventually decipher the songs’ meanings. In the interview, in response to the ‘challenged’ critics, Anderson admitted that, in coming up with lyrics, he merely chose words that sounded nice together, without intentionally attempting to poetically veil any profound significance or meaning.

Cut to GPT-2 poetry: I would love to see these submitted (under pseudonyms) to publications (periodicals would be best, e.g. the New Yorker) and see if they take the bait.


I’d kind of like to see a rivalry form between two camps: a reviewer who somehow convinced the BBC that science was being wrecked by papers with ridiculously large (irreproducible) sample sizes and results congealed by ML, versus others saying that guiding ML with human intuition about how something works is ultimately recidivism. Which is to say it’s not, and that somewhere in this half-decade there is room for people to get super pissed and make a faction fight out of it, better than the Hi Niles-ism of another trainwreck ep (aka Frasier S9).

You know, before the competitive asteroid farming starts in earnest.


I agree!

I submitted such a poem to our engineering school’s literary magazine, but alas it was rejected. Perhaps the fact that the first letter of each line spelled out F U C K Y O U T O O had something to do with it.


I believe that any decent publisher would be on the lookout for acrostic poems.


But when they don’t, the results are awesome.

(Yes, I know it isn’t poetry)


Case closed. :grin:


This topic was automatically closed after 5 days. New replies are no longer allowed.