Christine Sunu Proves the Effect of Being Alive on Hardware Design

Modeling machines off of biological patterns is the dry definition of biomimicry. For most people, this means the structure of robots and how they move, but Christine Sunu makes the argument that we should be thinking a lot more about how biomimicry has the power to make us feel something. Her talk at the 2017 Hackaday Superconference looks at what makes robots more than cold metal automatons. There is great power in designing to complement natural emotional reactions in humans — to make machines that feel alive.

We live in a world that is filling up with robots, and increasingly they are breaking out of the confines of industrial automation to take a place alongside humans. The key to making this work is to build robots that are recognizable as machines yet intuitively accepted as lifelike. That buy-in — the sense that these robots are more than appliances — is what Christine has boiled down into the keys to unlocking these emotional reactions.

Continue reading “Christine Sunu Proves the Effect of Being Alive on Hardware Design”

Simon Says Smile, Human!

The bad news is that when our robot overlords come to oppress us, they’ll be able to tell how well they’re doing just by reading our facial expressions. The good news? Silly computer-vision-enhanced party games!

[Ricardo] wrote up a quickie demonstration, mostly powered by OpenCV and Microsoft’s Emotion API, that scores your ability to mimic emoticon faces. So when you get shown a devil-with-devilish-grin image, you’re supposed to make the same face convincingly enough to fool a neural network classifier. And hilarity ensues!
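We haven't dug through [Ricardo]'s code, but the scoring loop is simple to picture: grab a webcam frame, ask a cloud classifier what emotion it sees, and compare that against the target face. Here's a minimal Python sketch of the idea; the EMOTION_URL endpoint, the API key, and the per-face `scores` JSON shape are placeholders standing in for whatever the Emotion API returned, not Microsoft's current interface.

```python
# Minimal sketch: grab a webcam frame with OpenCV, ask a cloud emotion
# classifier what it sees, and score how well the player matched the
# target emoticon. EMOTION_URL, API_KEY, and the JSON layout are assumptions.
import cv2
import requests

EMOTION_URL = "https://example.invalid/emotion/recognize"  # placeholder endpoint
API_KEY = "your-subscription-key"                          # placeholder key

def classify_frame(frame):
    """Send one JPEG-encoded frame to the classifier, return emotion scores."""
    ok, jpeg = cv2.imencode(".jpg", frame)
    if not ok:
        return {}
    resp = requests.post(
        EMOTION_URL,
        headers={"Ocp-Apim-Subscription-Key": API_KEY,
                 "Content-Type": "application/octet-stream"},
        data=jpeg.tobytes(),
        timeout=10,
    )
    faces = resp.json()
    # Assume one face and a {"scores": {"happiness": 0.93, ...}} shape.
    return faces[0]["scores"] if faces else {}

def play_round(target_emotion="happiness"):
    """Capture one frame and return how strongly the target emotion registered."""
    cap = cv2.VideoCapture(0)          # default webcam
    ok, frame = cap.read()
    cap.release()
    if not ok:
        return 0.0
    scores = classify_frame(frame)
    return scores.get(target_emotion, 0.0)  # 0..1, higher = better mimicry

if __name__ == "__main__":
    print("Devilish-grin score:", play_round("happiness"))
```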

Continue reading “Simon Says Smile, Human!”

Robot Einstein could save humans from killbot destruction


Earlier this year we saw the Einstein robot, which is being developed to give robots human-like facial expressions. [David Hanson], the man in charge of the project, has given a TED talk on his work that includes a show-and-tell of his most recent progress. We’ve embedded the video after the break for your enjoyment.

The Einstein robot (head only in this video) shows off its ability to recognize and mimic the facial expressions of the person in front of it. There is also video of a Blade Runner-esque robot looking around a room, recognizing and remembering the faces of the people it sees. [David] makes a very interesting proclamation: he’s trying to teach robots empathy. He feels that a mountain of R&D money is going into robots that can kill and not much into robots that can sense human emotions. His hope is that if we can teach them empathy, we might not be annihilated when robots become smarter than us.
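[Hanson]'s control code isn't public as far as we know, but the mimic half of a recognize-and-mimic loop boils down to mapping whatever emotion the vision system reports onto facial actuator positions. Here's a toy Python sketch of that mapping; the servo channels, angles, and driver stub are entirely invented, and a head like Einstein's blends dozens of actuators with far more subtlety.

```python
# Toy sketch of the "mimic" half of a recognize-and-mimic loop.
# Servo channels, angles, and the driver stub are hypothetical; a head like
# Einstein's blends dozens of actuators, not three.
BROW, MOUTH_CORNERS, JAW = 0, 1, 2   # pretend servo channels

EXPRESSION_POSES = {
    # emotion label -> target angle (degrees) per servo channel
    "happiness": {BROW: 100, MOUTH_CORNERS: 140, JAW: 95},
    "sadness":   {BROW: 60,  MOUTH_CORNERS: 50,  JAW: 90},
    "surprise":  {BROW: 130, MOUTH_CORNERS: 90,  JAW: 130},
    "neutral":   {BROW: 90,  MOUTH_CORNERS: 90,  JAW: 90},
}

def set_servo(channel: int, angle: float) -> None:
    """Stand-in for whatever servo driver the head actually uses."""
    print(f"servo {channel} -> {angle:.0f} deg")

def mimic(emotion: str) -> None:
    """Drive the face toward the pose for the detected emotion."""
    pose = EXPRESSION_POSES.get(emotion, EXPRESSION_POSES["neutral"])
    for channel, angle in pose.items():
        set_servo(channel, angle)

mimic("surprise")
```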

That’s not such a bad idea. Any way you look at it, this talk is interesting and we wish the five-minute offering were five times as long. But [Mr. Hanson’s] facial hair alone is worth clicking through to see.

Continue reading “Robot Einstein could save humans from killbot destruction”

Pulse, the emotional visualization organism


[Markus Kison] built a device called Pulse, which is part art installation and part data visualization tool. Pulse scans new posts on Blogger.com blogs for synonyms of keywords tied to 24 distinct emotions drawn from eight emotional groups. A red cone in the center expands when matching keywords are detected, in effect acting as a mood indicator for Blogger.com blogs.

The 24 distinct emotions come from [Robert Plutchik]’s psychoevolutionary theory of emotion, and the device itself is built from a glass case, various servo motors, and a custom controller for the servos. This is a compelling idea, but we wonder whether it scans for modifying words or just the keywords alone. It wouldn’t make a lot of sense for the sadness region to expand drastically because many people simultaneously posted the sentence “I’m not sad at all.” A rough sketch of the difference follows, and the video is embedded after the break.
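For illustration, here's a crude Python comparison of a bare keyword match versus one that peeks at the few words before each hit. The word lists and three-word negation window are our invention, not [Markus]'s implementation, but they show why "I'm not sad at all" trips one counter and not the other.

```python
# Crude sketch of keyword-based emotion scoring, with and without a check
# for negating words. The word lists and three-word negation window are
# invented for illustration; Pulse's actual matching rules aren't documented.
import re

SADNESS_WORDS = {"sad", "unhappy", "miserable", "gloomy"}
NEGATORS = {"not", "never", "no", "hardly"}

def bare_hits(text: str) -> int:
    """Count sadness keywords, ignoring context entirely."""
    words = re.findall(r"[a-z']+", text.lower())
    return sum(1 for w in words if w in SADNESS_WORDS)

def negation_aware_hits(text: str, window: int = 3) -> int:
    """Only count a keyword if no negator appears in the few words before it."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = 0
    for i, w in enumerate(words):
        if w in SADNESS_WORDS:
            preceding = words[max(0, i - window):i]
            if not NEGATORS & set(preceding):
                hits += 1
    return hits

post = "I'm not sad at all."
print(bare_hits(post), negation_aware_hits(post))   # prints: 1 0
```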

Continue reading “Pulse, the emotional visualization organism”