Continuing the discussion from Why all of us need to be futurists:
Yes, let’s talk about Prisma. It started popping up in all my timelines, and after a quick download-and-try I must say this actually looks very interesting.
Yes, the “painting effect” gimmick has been available in Photoshop and the like for decades, and is usually best avoided, to be fair, but this feels to me like a cleverer way of making computers fake artistry than what came before.
Still often looks like crap, but sometimes… Kind of good, surprisingly. That’s new if you ask me.
Looks like it’s doing the same kind of job as these “style transfer” software demos, even going so far as to use some of the same source images:
(both via kottke)
There are neural networks involved somehow? I don’t know how it works, you geniuses tell me. But the result is clearly applying elements figured out from a source image to another image of your choice.
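Not an expert either, but from what I gather (going off the Gatys et al. “A Neural Algorithm of Artistic Style” paper, not anything Prisma has published), the trick is roughly: run both images through a convolutional network, treat the raw feature maps as the “content,” and treat the correlations between feature channels (the Gram matrix) as the “style.” Then you optimize a new image to match the content features of your photo and the Gram matrices of the artwork. A toy numpy sketch of just the loss part, with random arrays standing in for real CNN activations:

```python
import numpy as np

def gram_matrix(features):
    """Style representation: correlations between feature channels.
    G[i, j] = sum over spatial positions of channel_i * channel_j."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (h * w)

def content_loss(a, b):
    # Content match: compare feature maps directly, position by position.
    return np.mean((a - b) ** 2)

def style_loss(a, b):
    # Style match: compare Gram matrices, which discard spatial layout
    # and keep only which features co-occur -- brushstrokes, palette, texture.
    return np.mean((gram_matrix(a) - gram_matrix(b)) ** 2)

# Hypothetical stand-ins for activations from some CNN layer
# (8 channels over a 16x16 spatial grid).
rng = np.random.default_rng(0)
photo_feats = rng.standard_normal((8, 16, 16))
artwork_feats = rng.standard_normal((8, 16, 16))

# The full algorithm would gradient-descend on the output image to
# minimize a weighted sum like this one.
total = content_loss(photo_feats, photo_feats) + style_loss(photo_feats, artwork_feats)
print(total)
```

The neat part is that the Gram matrix throws away *where* things are in the style image and keeps only *which* textures and colors go together, which is why the output keeps your photo’s layout but the painting’s look.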
There’s a bunch of preset filters in the app, but to me an obvious evolution would be making new “filters” from any image you throw at it. At that level I can imagine some trippy still image and video uses for a more pro-software-plugin version of this.
Not sure how threadworthy this topic is, but felt like sharing in case anyone else finds this an interesting mainstreaming of a graphics-nerd tech concept.