So, I’ve been working on an automated “poetry” generator.
It can also be controlled from the command line, but it still “needs” more granularity.
And other things.
In the meantime, it’s posting a poem an hour to tumblr. More than 1000 so far.
It’s giving me a concrete reason to play around with various NLP/NLG npm packages*.
* Natural Language Processing | Natural Language Generation | Node Package Manager, the package manager for Node.js, the browserless JavaScript runtime.
I totally pulled a boner and missed putting in some Joyceania for Bloomsday (June 16th) last week. But I won’t make that mistake next year…
The first one looks like it came from the Jargon File.
The second is a combination of a book on factorials, plus American Hero Myths, I think.
The third might be pure Wizard of Oz.
The fourth must be from Apocalypse Now Redux which refers to Busch Gardens twice.
I’ve been tweaking some algorithms to remove bugs (which GREATLY improves things) and to limit the source material to only sentences that start with, end with, or contain a given string of words. Sometimes the effect is major, sometimes minor. I like both.
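For the curious, that kind of sentence filter is simple to sketch. This isn’t the bot’s actual code (the function and mode names here are my own invention), just a minimal illustration of limiting a corpus to sentences that start with, end with, or contain a phrase:

```javascript
// Hypothetical sketch of a corpus filter -- not the bot's real code.
// mode: "startsWith" | "endsWith" | anything else means "contains".
function filterSentences(sentences, phrase, mode) {
  const p = phrase.toLowerCase();
  return sentences.filter((s) => {
    const t = s.toLowerCase();
    if (mode === "startsWith") return t.startsWith(p);
    if (mode === "endsWith") return t.endsWith(p);
    return t.includes(p); // default: anywhere in the sentence
  });
}

// Example: keep only sentences opening with "the cat"
const corpus = ["The cat sat.", "A dog ran.", "The cat ran."];
const kept = filterSentences(corpus, "the cat", "startsWith");
```

Swapping the mode changes how aggressively the source material gets narrowed, which is presumably where the major-versus-minor effects come from.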
I haven’t made too many tweaks in the last couple of weeks, but I plan to keep poking at it.
Children’s poetry has yet to be added (as a source).
NOTE: this is all “old-school” Markov-chain and other algorithms, far removed from the current rage for recurrent neural networks.
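To give a sense of just how old-school that is, here’s a minimal word-level Markov chain in plain JavaScript. Again, this is a sketch of the general technique, not the generator’s actual implementation: record which word follows which, then walk the table picking random successors.

```javascript
// Build an order-1 Markov chain: map each word to the list of
// words observed immediately after it.
function buildChain(text) {
  const words = text.split(/\s+/).filter(Boolean);
  const chain = {};
  for (let i = 0; i < words.length - 1; i++) {
    (chain[words[i]] = chain[words[i]] || []).push(words[i + 1]);
  }
  return chain;
}

// Walk the chain from a start word, choosing a random successor
// each step, until we hit a dead end or the word limit.
function generate(chain, start, maxWords = 20) {
  const out = [start];
  let current = start;
  while (out.length < maxWords && chain[current]) {
    const options = chain[current];
    const next = options[Math.floor(Math.random() * options.length)];
    out.push(next);
    current = next;
  }
  return out.join(" ");
}
```

Feed it a big enough corpus and the output starts to sound eerily like the source material, which is the whole charm of these bots.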
I’m glad you find it interesting enough to return to!
To be honest, I had forgotten I had even created this thread!
Highly technical stuff goes over my head. I’m fascinated by it, but I don’t have many spare brain cycles for really getting into how it works. Mostly I follow twitter bots and stuff like this, because it makes me feel like I live in a William Gibson story (whose work I see you’ve incorporated).