Modeling machines off of biological patterns is the dry definition of biomimicry. For most people, this means the structure of robots and how they move, but Christine Sunu makes the argument that we should be thinking a lot more about how biomimicry has the power to make us feel something. Her talk at the 2017 Hackaday Superconference looks at what makes robots more than cold metal automatons. There is great power in designing to complement natural emotional reactions in humans — to make machines that feel alive.
We live in a world that is increasingly filled with robots, and these machines are breaking out of the confines of industrial automation to take a place side by side with humans. The key to making this work is to make robots that are recognizable as machines, yet intuitively accepted as lifelike. It’s the buy-in that these robots are more than appliances, and Christine has boiled down the keys to unlocking these emotional reactions.
She suggests starting with the “inside” of the design. This is where the psychological triggers begin. Does the creature have needs? Does it have a purpose? Humans are used to recognizing other living things, and those living things all have internal forces that drive them. Including these drives in your designs is the foundation for lifelike behavior.
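The “internal forces” idea also maps neatly onto code. As a loose illustration (my sketch, not something from the talk), here is a minimal need-driven behavior loop in Python, where hypothetical drives like hunger and boredom build up over time and whichever is most pressing picks the robot’s next behavior:

```python
import time

# Hypothetical drives and how quickly each need builds per second (illustrative values).
DRIVE_RATES = {"hunger": 0.02, "boredom": 0.05, "fatigue": 0.01}

# Map each drive to the behavior that satisfies it (names are made up for the example).
BEHAVIORS = {
    "hunger": "seek_charger",
    "boredom": "play_with_human",
    "fatigue": "sleep",
}

def step(drives, dt):
    """Advance every drive by dt seconds and return the most pressing one."""
    for name, rate in DRIVE_RATES.items():
        drives[name] = min(1.0, drives[name] + rate * dt)
    return max(drives, key=drives.get)

drives = {name: 0.0 for name in DRIVE_RATES}
for _ in range(5):
    urgent = step(drives, dt=1.0)
    print(f"most pressing need: {urgent} -> behavior: {BEHAVIORS[urgent]}")
    drives[urgent] = 0.0  # acting on the need satisfies it
    time.sleep(0.1)
```

Even a toy loop like this gives the machine an apparent inner life: it wants things, and its behavior changes when those wants go unmet.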
The outside design must match this, and this is where Christine has advice for avoiding the Uncanny Valley — the unsettled reaction to machines that look almost real until some cue breaks the spell. She suggests using combinations of critters as the basis for the design so as not to be locked into strong associations with any one living thing used as the model. The motion of the robot should be carefully designed, with acceleration that makes sense for the biological aspects of the robot and the task it’s performing. Think about how jerky, unnatural motion is used in horror movies to elicit fright — something you don’t want to recreate in a robot companion.
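To make that concrete (a common technique, not a detail from the talk): jerkiness is usually avoided by driving joints through an eased motion profile instead of commanding them straight to the target angle. A minimal Python sketch:

```python
def ease_in_out(t):
    """Smoothstep easing: starts from rest, accelerates, then decelerates back to rest."""
    return t * t * (3.0 - 2.0 * t)

def motion_profile(start_deg, end_deg, steps):
    """Return a sequence of servo angles that eases between two positions."""
    return [start_deg + (end_deg - start_deg) * ease_in_out(i / steps)
            for i in range(steps + 1)]

# A head turn from 30 to 120 degrees: the angle changes slowly at the ends and
# fastest in the middle, reading as "muscle" rather than "stepper motor snap".
for angle in motion_profile(30, 120, 10):
    print(round(angle, 1))
```

Feeding those intermediate angles to a servo at a fixed update rate ramps the speed up and back down, which reads as organic motion rather than a mechanical snap.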
Her last parameter for successful biomimicry design is “dissonance”. This is perhaps the most interesting part. Humans have expectations for living things and expectations for machines, and trying to completely hide the machine side is a mistake. Christine uses the new Sony Aibo “pet” robot as an example. It behaves like a lovable dog without the unpleasant parts of pet ownership, like house training and having to be around to feed it. What Sony is likely missing is doing amazing “robot things” with the new robot pet. As Christine puts it, they kind of stopped being creative once they implemented the “low-tech meat dog” behaviors.
Don’t miss Christine Sunu’s full Supercon talk embedded below. She has also published her talk slides and you can learn more about what she’s working on by checking out her website.
[Main image credit: @davidthings]
“Modeling machines off of biological patterns is the dry definition of biomimicry.”
Couldn’t that be said about a lot of the machines we use (except maybe the wheel)?
I think this line from the article is the most important:
“Does the creature have needs, does it have a purpose? Humans are used to recognizing other living things and they all have internal forces that drive them.”
It reminded me of a recent Wired article about a Japanese roboticist who is on a quest to create an android that is the “perfect woman.” His lab spent years studying how people feel about robots, and this has convinced him he can make an android that will be a better partner/wife/lover than a person. But for all his studies and work, he has failed to realize the most important part of our relationships with things we love: they love us back. We need the things/people/pets we have a relationship with to have a conscious experience and reciprocate our feelings, or else what’s the point?
https://www.youtube.com/watch?v=IrrADTN-dvg
“Machines that feel alive” sounds like a lot of fake fur and plastic.
I thought Sony quit making Aibos. I’ve always wanted one. They had a quite clever protocol for motor control whose specifications I haven’t been able to find. They had quite nifty sensors too. I would love to see someone take a broken Aibo and upgrade the guts with more powerful processors to reflect more modern robotics stuff.
I thought $ony quit making Aibos too! So I DAGS:
https://spectrum.ieee.org/automaton/robotics/home-robots/sony-advanced-aibo-robot-dog-unleashed
I guess $ony (how I hate typing that name!) has figured a new generation of suckers has come of age since the last time. I wonder what tricks they are going to pull on the (unsuspecting) public this time.
From the same article:
The robot doesn’t come cheap at 198,000 yen (approx. $1,750). In addition, users must subscribe to an online plan to get the full range of aibo features and settings. These include access to photos taken by aibo and to an aibo store where owners can download apps, as well as a virtual version of aibo they can control with a smartphone. A three-year basic plan costs 90,000 yen or about $800. A support and care subscription that discounts repairs by 50 percent is also available for 54,000 yen ($475).
Fony. Annoy certain groups and they do take action; $ are lost, and $ony always sheds more than tears.
Funny you mention it. I picked up an old Aibo ERS-210 to do exactly that. If you think there’s enough interest, I wouldn’t mind posting photos of my teardown before modding/using parts for another droid.
Aibo teardown? Yes please. And then tips@hackaday.com, if you would be so kind.
I’ve been toying with the idea of using real biology instead of mimicking it– for example, instead of 3D printing limbs for your robot cat, take an actual cat skeleton* and form the robot around that frame. There are a lot of holes that can be poked in the idea but it’s an interesting thought experiment.
*From a cat that died of natural causes
I think the bone joints would be a problem; they would have to be replaced with bearings or something.
Like in the movie Virus, where an alien AI uses human spare parts and mechanical components from a research ship to build its robots. Before A.I. and fuzzy logic, robotics used analogue circuits and could replicate the behaviour of insects quite convincingly.
I’m sorry, but that creeps me out. Reminds me too much of the robot-things from the movie “9”. I wouldn’t be surprised if the majority of consumers didn’t like the idea. Similarly, I am not sure where you are going to get a lot of dead cats for a low price, or how to efficiently manage the problem of creating custom hardware for each individual skeleton.
That being said, a less nightmare-inducing solution would be to directly mimic the bones. Send a single (living) cat through some manner of medical scanning device, maybe even a “CAT” scan. Convert the resulting scan into a 3D model. From there, use 3D printers, injection molding, or any other manufacturing method to produce identical “bones” without the creepy factor.
Oh, this was obviously never going to be a product. Like I said, it’s a thought experiment.
I interpreted that to mean something more like the bots from WALL*E: they’re not even trying to hide the fact that they’re made of metal, glass, and plastic, but you can tell from the way they move that they’ve got something more than cold if-then loops driving them. A prime example of that is WALL*E himself: compare his behavior and motions throughout the movie to his behavior and motions immediately after he rebooted in “safe mode”.
Interesting talk, and yes, I think the marketing people do miss the point of a pet completely.
That said… is it just me or did the motion in the video get very robotic around 22:00? It was fine prior to that, still is if I rewind back, and it came good a minute or two after… but around that time, the video becomes very jerky on my machine.
yeah I thought my connection was weird, but it’s the video :( great video otherwise though :)
Vote Buzzfeed for President in 2020!
That might happen, but it would take a lot of Facebook ad work.
Whaaaat. Give her a chance at least, it was a pretty cool presentation!