A web search for “Uncanny Valley” will retrieve plenty of information about that discomfort we feel when an artificial creation is eerily lifelike. The syndrome tells us a lot about both human psychology and the design challenges ahead. But what about the opposite case, when machines are clearly machines? Is everything clear then? It turns out the answer is “No”, as [Christine Sunu] explained at a Hackaday Los Angeles meetup. (Video embedded below.)
When we build a robot, we know what’s inside the enclosure. But people who don’t know tend to extrapolate too much from the simple behavior they can see. As [Christine] says, people “anthropomorphize at the drop of a hat,” projecting emotions onto machines and feeling emotions in return. This happens even when machines are deliberately designed to be utilitarian. iRobot was surprised by how many Roomba owners gave their robot vacuums names and treated them as family members. A similar eruption of human empathy followed Boston Dynamics’ video footage of their robot staying upright despite being pushed around.
In the case of a Roomba, this kind of emotional power is relatively harmless. In the case of robots doing dangerous work in place of human beings, such attachment may keep robots from doing the job they were designed for. Even more worrisome, the fact that this power exists means there is potential for abuse. To illustrate one such potential, [Christine] brought up the Amazon Echo. The cylindrical puck is clearly a machine and serves as a point-of-sale terminal, yet people have started treating Alexa as their trusted home advisor. If Amazon were to start monetizing this trust, would users realize what’s happening? Would they care?
Ideally that power should be put back in the hands of the user, to be used for our own goals instead of those of a retail giant. It’s easy to start experimenting for ourselves: [Christine] described how one of her projects used a simple servo to mimic a pair of eyebrows, whose frown is enough to trigger an emotional response. We are also cautioned to remain wary, to keep asking ourselves who else is wielding this power and what they’re doing with it. Because one thing is certain: more robots are coming into our lives, and some of them don’t look like machinery. The International Space Station just received CIMON, a flying ball with a face. Here on Earth, we have products like Kuri and Misty on the horizon. People will feel a connection to these robots; how will that connection evolve?
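For the curious, here is a minimal sketch of what such a servo-driven eyebrow could look like on an Arduino-class board. The pin numbers, angles, and push-button trigger are illustrative assumptions, not details from [Christine]’s actual build.

```cpp
// Hypothetical servo "eyebrow": furrows when a trigger input goes low.
// Pins, angles, and trigger source are assumptions for illustration.
#include <Servo.h>

const int SERVO_PIN   = 9;   // servo signal pin (assumed)
const int TRIGGER_PIN = 2;   // button or sensor input (assumed, active-low)
const int NEUTRAL_DEG = 90;  // relaxed eyebrow angle
const int FROWN_DEG   = 60;  // furrowed "frown" angle

Servo eyebrow;

void setup() {
  pinMode(TRIGGER_PIN, INPUT_PULLUP);
  eyebrow.attach(SERVO_PIN);
  eyebrow.write(NEUTRAL_DEG);
}

void loop() {
  // With INPUT_PULLUP, a pressed button reads LOW.
  if (digitalRead(TRIGGER_PIN) == LOW) {
    eyebrow.write(FROWN_DEG);   // furrow the brow
  } else {
    eyebrow.write(NEUTRAL_DEG); // relax
  }
  delay(20); // modest update rate; a hobby servo needs nothing faster
}
```

Even a gesture this small is enough to make people read emotion into a hobby servo and a scrap of cardboard, which is exactly the point.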
But there’s no need to stand on the sidelines and wonder, this is an active area of development and we can jump in to help shape its future. And as a side bonus, such projects would make great entries for the Hackaday Prize Human-Computer Interaction Challenge. If it would help you to hear [Christine Sunu] talk more about robots that elicit emotion, check out her talk from Supercon 2017.
Doesn’t that empathy mean when the robot revolution comes we’ll be ready to hold hands and sing kumbaya with our overlords?
Skynet won’t need to make Terminators look like Arnold Schwarzenegger. Want to infiltrate a human settlement? Stick a pair of googly eyes on a box and send it in.
The mood organ will need a special setting for this!
Don’t forget two yellow puffs with eyes aka Keepon
https://youtu.be/nPdP1jBfxzo
If you haven’t heard it yet, Radiolab did an episode on the uncanny valley: https://www.wnycstudios.org/story/more-or-less-human
What I found particularly interesting is the discussion of Pleo, a “lifelike” dinosaur robot by the designer of the Furby. When I heard the story, they played audio from a YouTube clip where a couple of guys “abused” Pleo to see how it would react. Needless to say, my intellectual reaction was curiosity, but I had a STRONG emotional reaction that affected me for hours (in fact, I doubt I could listen to it again).
The question is, why did I have this emotional reaction? Part of it may be that the vloggers seemed to take glee in their abuse of the robot. Would I have had the same reaction if they had been stoic, scientific, or even reluctant to perform their abuse test?
I am not an extremely sentimental type; I haven’t named my many Roombas, nor any of my cars or other machines. I really don’t feel like my reaction was empathy for the toy, more revulsion at the vloggers. (Given that I can’t stand a lot of the “yell and laugh a lot” vloggers out there, I suspect that’s it.)
I also thought it disturbing that the designer is designing his next robot (a baby) so that it cannot be abused. Is that the right reaction? I have thought of writing a short story/screenplay about a robotic baby that, when abused, calls the police, who arrest the person for violating a law against abusing robots.
Thoughts? Your reaction to the podcast?
Similar to watching a movie, there is a voluntary suspension of disbelief with emotive robotics. The uncanny valley arises from the robot breaking the 4th wall in some way.
Referring to your mention of the robot calling the police: “I have thought of writing a short story/screenplay about a robotic baby that, when abused, calls the police and they arrest the person for violation of a law against abusing robotics.”
The first thing my mind went to is companies trying to perform a torture test on their products, and the product calling the police on them and having them arrested… not the way to go if you want any of your products to keep working for a long time, in environments hotter than room temperature, after being dropped a few times, etc.
Personally, I am super empathetic towards other humans, and somewhat so towards various animal species too. However, and I don’t know if it’s just me, I don’t feel empathy towards machines/products/robots/etc.
Radiolab, along with TED and The Moth, is about as bad as it gets. Grammatically the programs are botched and unfit for my listening. Radio editing for the MTV generation; jumble on, what.
Naming inanimate objects with human names, and of course sex, that’s when you step off of Earth and into hell. No valley, just the end of the world, like when it was “flat”. There was a segment on NPR the other day about gender-neutral voices on voice interfaces being introduced in educational software platforms. This is needed for sure. Once sex is gone, human identity is removed. Kids need to grow up in our world, not its.
Your first two sentences are so perplexing, I’m still sitting here trying to figure out how a storytelling show and literal TED talk presentations are “as bad as it gets”.
Me too. What did The Moth ever do to you other than be very emotional and cause my friends to turn away :( C’mon guys, it’s a really good story…
This anthropomorphization is quite real. It took only days for my mother (aged 75) to give the new robot lawn mower a name. :-) Thinking about it, I am quite glad she did not give her smartphone or her PC a name. :-)
I think she missed the point about the “valley” part.
(Maybe I fast forwarded past it.)
One of my favorite things about taking my Star Wars droids and K-9 out in public is the reactions from people. Those who know the droid’s personality fully expect it to act like it does in the movies/shows.
People who have never seen an episode of Doctor Who react to K-9 as if he’s a “real” dog. They’ll pet him and call him a good dog. On two separate occasions when I’ve had K-9 out, I’ve been approached by the parents of autistic children who were interacting with K-9. They were both teary-eyed. One child kept asking to see the “space puppy”, and her mother had never heard her speak so much. The other did not interact outside his home environment but was talking up a storm to K-9. I’m not ashamed to say I got teary-eyed too.
Personally, I think the quickest way to get the public to embrace a robot would be to put it in entertainment media first and then release it as a product (R2-D2 and BB-8). Otherwise it would have to be a utility they grow attached to over time (Roomba and Alexa), or be shaped like something familiar and beloved (K-9).
A very old Scientific American study examined seagull chicks’ responses to obviously fake adult seagull heads (2D cardboard heads, IIRC) with different exaggerated features, such as a red spot or beak shape (?), or the lack thereof (e.g. no eyes). What they found interesting, IIRC, was that the models the chicks responded to looked nothing like actual adult seagulls. Perhaps this would explain why people anthropomorphize objects, meaning we should study, if we aren’t already, which features of humans trigger which responses. Women’s makeup, including long eyelashes, double eyelids, contacts that change the shape of the eye, etc., is a pretty fertile area of study, including differences in gender reactions.

As for sex: fetishes, kinks, etc. attach sexual gratification to an object, behavior, physical or mental trait, etc., supporting the idea that not everyone will be triggered to behave a certain way by the same physical or behavioral stimuli. Like many aspects of robots, this is one where we’ll have to take a psychological step back and better understand human behavior.