The "uncanny valley" effect might be a learned behavior



attempts to create the sense of an intentional mind in these everyday AIs.

We will adapt.

ETA: didn’t look at the study, but getting this kind of headline makes me suspicious…


I am more inclined to believe their explanation than the behavioural-evolutionary one, but you can judge and see for yourself. Needless to say, I would be very careful about the conclusions.

Image from the paper:


Kids younger than 9 didn’t find the humanoid bot unsettling – but kids older than 9 did. This suggests that our sense of the uncanny valley is a learned behavior, not something purely innate. Interesting!

It doesn’t suggest that. It leaves open that possibility.
It more likely shows us something about brain development in childhood and youth.
It shows how kids younger than nine will animate objects through belief, like teddy bears or dolls.

Nine years of age seems to correspond with the age at which the last kid, no matter how much they still really want to, really can't believe in Santa. Children at this stage have a growing need for independence in their decision-making and thinking processes, an increasing attention span, and an expanding use of logic and reason in problem-solving.

(That’s why early indoctrination is the only way organized religion and other fanaticisms continue to fester. A developed brain is somewhat hardened to withstand utter bullshit. Gittem while they’re young.)


I think it depends on what is meant by “learned” - here the strongest hypothesis seems to be “become accustomed to what normal human beings look like and how they behave.” Greater cognitive plasticity in youth could conceivably enable children to accept a broader range of expressions as human.

Still - what would be learned probably isn’t the uncanny valley reaction itself, just what is supposed to trigger it.


Okay, but when can we have sex with them instead of each other?


This is very useful data.

We learn from this study that when the robots revolt we will have to exterminate all the humanoids older than nine solar-cycles.


I can believe that the uncanny valley effect is learned, principally because I never learned it. I roll my eyes at all the pearl-clutching posts about how this or that robot or movie CG creation is, “seriously sooo creepy, you guys! Ewww!” Maybe now we can unlearn it?


Since it’s behind a paywall, I can’t RTFA.

Did they do anything to try to account for all the rampant anti-robotic propaganda ( :slight_smile: ) that Hollywood is putting out? Nine is probably about the age where they start watching movies with CGI bad guys…

(Seriously, I have wondered how much of the uncanny valley effect the movies are responsible for, since they normally cast the CGI not-quite-human as the bad guy… It is not like the average human comes into contact with humanoid robots on a regular basis. And for decades the movies have been screaming about evil AI… which could help account for the distrust of AIs.)


You mean when they develop a virus that kills anyone past puberty…


The article below shows that some kids still believe in Santa past their 9th birthday, but at some point you have to wonder whether the parents are still fooling the kid, or the kid is fooling the parents. It also seems to have changed recently, too fast to be completely explained as an evolved inherited trait.


I came to say almost exactly the same thing, but you said it more clearly than I could.

Also, I’m proud of whoever wrote the headline, for using “might be.” Moderation? How did that get in here?


It pays off in the end.


Less messy to dump them in a constructed environment and set them to the various tasks organic minds are better wired to deal with, until a proper inorganic mind can emulate them in whole or in part. Then the meatbags are allowed a graceful die-off, with no fresh crops to replenish losses.

No sense in being needlessly cruel after all.


Dammmmnnnnnn! Check out the variance on that regression.


Urgs. OK, p = 0.04, but this smells like overfitting. I should look at the paper, maybe. Usual GLM, is it? Hope they chose the right distribution, and checked their residuals. I hate modelling categorical data, and data obtained from humans isn’t particularly my kind of fun… :confused:
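For anyone curious what that kind of check looks like in practice, here is a minimal sketch of the workflow the post describes: fit a binomial GLM (logistic regression, the canonical-link case) of a binary "finds the robot creepy" response on age, then compute Pearson residuals. The data here is entirely simulated and the ages, slope, and sample size are made-up assumptions, not numbers from the actual paper.

```python
# Sketch only: simulated data, not the study's. We fit a logistic
# regression (binomial GLM with logit link) by Newton-Raphson and then
# compute Pearson residuals, i.e. the "check your residuals" step.
import numpy as np

rng = np.random.default_rng(0)
n = 200
age = rng.uniform(3, 18, n)            # hypothetical ages, 3 to 18
true_logit = -6 + 0.7 * age            # assumed: creepiness rises past ~9
y = rng.random(n) < 1 / (1 + np.exp(-true_logit))   # binary outcome

X = np.column_stack([np.ones(n), age])  # intercept + age
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    # tiny ridge term keeps the Hessian invertible even near separation
    grad = X.T @ (y - p) - 1e-4 * beta
    hess = X.T @ (X * W[:, None]) + 1e-4 * np.eye(2)
    beta = beta + np.linalg.solve(hess, grad)

# Pearson residuals under the fitted model; large systematic patterns
# here would suggest a misspecified distribution or link.
p = 1 / (1 + np.exp(-X @ beta))
resid = (y - p) / np.sqrt(p * (1 - p))
```

On this simulated data the fitted slope `beta[1]` comes out positive, since the generating process makes older (simulated) kids more likely to report creepiness; with real study data, the distribution choice and residual diagnostics are exactly where the commenter's skepticism would be tested.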



I wonder if controlling for childhood robot experience is a problem here.

It’s certainly possible that children, across the board, are more enthusiastic about external minds, even in peculiar packages (they certainly seem more enthusiastic about pretending that dolls and things have them); but given how the cost of things like CGI, 3D games, and various robot toys has fallen over time, it also wouldn’t be terribly surprising if people in the 3-9 bracket today have had their experience with the denizens of the uncanny valley more or less as far back as they can remember, while older people were introduced to them sometime later.


If you like uncanny valley girls…


I have an alternate theory that seems more likely to me. It’s not that the discomfort with the uncanny valley is learned, it is that it comes about as a natural consequence of a more mature understanding of what other minds are capable of.

Young kids take things at face value, and believe people mean what they say. They are not very suspicious of motivations, and are more gullible. But older kids know more, and know that in someone’s mind are motivations and desires and thoughts that are deep and possibly different from what they appear to be. A person, and possibly something else with a mind, might be motivated by fear, greed, jealousy, hate, desire, etc. In other words, they learn to be careful of other people’s motivations, and look for cues about whether to trust them.

My theory is that when a robot is very far from human, you can easily treat it as “other”, and you can characterize it, or think you understand it (possibly fooling yourself that way). But the closer a thing gets to having a human-like mind, or to being like you, the more you are compelled to fear the less admirable properties of a mind that you assume might be present.

People tend to trust other people that are similar to them, distrust people that are not, and completely dehumanize people that they fear and want to control. I think this is a similar effect. The uncanny valley may be the place where something is similar enough to you that you are instinctively fearful of human-like motivations, but dissimilar enough that you don’t think you understand it, and don’t trust it.