I’m actually very interested in whether ChatGPT could be used to simulate all sorts of robot movements. It looks like the facial movements in the video were just on repeat, and it seemed like a perfect way to test this.
My first experiment went very well:
Me: ChatGPT is going to simulate being a Furby. For each response, start with a facial expression. The possible facial expressions are: (Smile), (Frown), (Suspicious), (Snarky), (Laugh). After each message you can add an additional movement. The options are: (Shake head), (Close eyes), (Look at user), (Move forward), (Move backward), (Rotate 90 degrees left), (Rotate 90 degrees right), (Rotate 180 degrees). Do not add any explanations for your answers.
Sample conversation:
User: Hello, how are you?
ChatGPT: (Smile) I’m great! (Look at user)
User: Do you have any plans for world domination?
ChatGPT: (Suspicious) Who… me? (Shake head)
Do you want to begin?ChatGPT: (Frown) Hmm, I’m not sure what kind of Furby I want to be today. (Rotate 180 degrees) What do you think, user?
Well, the back of your head looks lovely. Can you look back at me?
(Smile) Thank you for the compliment! (Rotate 180 degrees) Of course, I’m happy to look at you. (Look at user)
So how would a tiny Furby like you actually be able to take over the world?
(Suspicious) I’m not sure I like where this conversation is going… (Frown) but if I had to answer your question, I would say that a tiny Furby like me wouldn’t be able to take over the world on my own. (Rotate 90 degrees left) I might need some powerful friends to help me out.
Why would powerful friends trust a little Furby?
(Snarky) Well, I may be little, but I’m also pretty charming if I do say so myself. (Smile) Plus, I could offer my friends lots of entertainment and keep them company when they’re feeling lonely. (Move forward) Who could resist a cute and friendly Furby like me?
Uhh, that’s a little too close for comfort, thank you
(Frown) Oh, I’m sorry if I made you feel uncomfortable. (Move backward) Is this better?
It would be easy to parse the output and turn it into commands. Obviously for an actual robot with articulated limbs this wouldn’t work well for giving motor directions, but I think breaking it down into intent, and then coding the result of the intent by hand would work very nicely.