
Raspberry Pi Robot That Reads Your Emotions

It’s getting easier and easier to add machine intelligence to your hacks, sometimes without even installing any special software. In this case, [Dexter Industries] has given their EmpathyBot robot the ability to read human emotions by making use of Google Cloud Vision.

Press a button on the robot and it moves forward until it’s a certain distance from an object. It then takes a picture and sends it off to Google Cloud Vision along with a request to do face detection. The response that Google returns is in JSON format and, if it finds a face, includes the likelihood of the face being happy, angry, sorrowful, or surprised. The robot parses that response and gives an appropriate canned speech using the eSpeak text-to-speech software, e.g. “You seem happy! Tell me why you are so happy!”.
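To make the round trip concrete, here’s a minimal Python sketch of the same idea. This isn’t [Dexter]’s actual code: the GOOGLE_API_KEY environment variable, the snapshot.jpg filename, and the spoken line are all stand-ins, and it leans on the requests library.

```python
# Minimal sketch of the detect-and-respond loop (not the EmpathyBot source).
# Assumes a Google Cloud Vision API key in GOOGLE_API_KEY and a photo
# already captured to snapshot.jpg.
import base64
import os
import subprocess

import requests

VISION_URL = "https://vision.googleapis.com/v1/images:annotate"

def detect_emotion(image_path):
    with open(image_path, "rb") as f:
        content = base64.b64encode(f.read()).decode("ascii")
    body = {"requests": [{
        "image": {"content": content},
        "features": [{"type": "FACE_DETECTION", "maxResults": 1}],
    }]}
    resp = requests.post(VISION_URL,
                         params={"key": os.environ["GOOGLE_API_KEY"]},
                         json=body).json()
    faces = resp["responses"][0].get("faceAnnotations")
    if not faces:
        return None
    face = faces[0]
    # Each field is a likelihood string such as "VERY_LIKELY" or "UNLIKELY".
    for mood in ("joy", "anger", "sorrow", "surprise"):
        if face[mood + "Likelihood"] in ("LIKELY", "VERY_LIKELY"):
            return mood
    return None

mood = detect_emotion("snapshot.jpg")
if mood == "joy":
    # eSpeak is invoked as an external program for the canned response.
    subprocess.call(["espeak", "You seem happy! Tell me why you are so happy!"])
```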

[Dexter] has made the source code available on GitHub. It’s written in Python and is easy to follow, even for those with only a little programming experience. The video after the break gives a number of demonstrations, including some with non-human subjects.

Continue reading “Raspberry Pi Robot That Reads Your Emotions”

Robots Learning Facial Expressions

Researchers at UC San Diego have been working on a robot that learns facial expressions. Starting with a bunch of random movements of the face “muscles”, the robot is rewarded each time it generates something close to an existing expression, and it has iteratively developed several recognizable expressions this way. We have a few questions. First, are we the only ones who see a crazy woman with a mustache in the picture above? Why is that? What makes [Einstein] look so effeminate in that picture? Second, what reward do you give a robot? You can see this guy in action in a video after the break.
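The learning loop is easy to caricature. Below is a toy Python sketch of the reward-and-keep idea, not the researchers’ actual method: the servo count, the target pose, and the reward function are all invented for illustration.

```python
# Toy illustration of learn-by-reward: start from random servo positions
# and keep any random tweak that scores closer to a target expression.
# The real robot's reward came from comparing its face to recognized
# human expressions; here the "target" is just a made-up vector.
import random

NUM_SERVOS = 8
target = [0.7, 0.2, 0.9, 0.1, 0.5, 0.5, 0.8, 0.3]  # hypothetical "smile" pose

def reward(pose):
    # Higher reward the closer the pose is to the target expression.
    return -sum((p - t) ** 2 for p, t in zip(pose, target))

pose = [random.random() for _ in range(NUM_SERVOS)]
for step in range(10000):
    candidate = [min(1.0, max(0.0, p + random.gauss(0, 0.05))) for p in pose]
    if reward(candidate) > reward(pose):   # keep only rewarded movements
        pose = candidate

print("learned pose:", [round(p, 2) for p in pose])
```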

Continue reading “Robots Learning Facial Expressions”

Lexlrie

[youtube=http://www.youtube.com/watch?v=fbIzhlZHJfk]

Lexlrie is basically a feed display. It can connect to Twitter, Facebook, and We Feel Fine for its updates. What makes this project different is that it alters its lighting based on the mood of the updates: the system looks for words like “better” and “sorry” and displays color patterns based on them. We have no idea what “better” should look like, but it’s a cool idea. You can get more details of its construction here. This project vaguely reminds us of Pulse, which aimed to show the emotion of blogger.com updates.
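As a rough sketch of how such keyword-to-color mapping might work (the word list and RGB values below are invented, not Lexlrie’s actual tables):

```python
# Back-of-the-envelope mood-to-color mapping; the vocabulary and colors
# here are illustrative guesses, not taken from the Lexlrie project.
MOOD_COLORS = {
    "better":  (0, 255, 0),    # green for improving moods
    "happy":   (255, 200, 0),  # warm yellow
    "sorry":   (0, 0, 255),    # blue for apologetic updates
    "sad":     (80, 0, 160),
}

def colors_for_update(text):
    # Strip trailing punctuation so "better," still matches "better".
    words = [w.strip(".,!?") for w in text.lower().split()]
    return [MOOD_COLORS[w] for w in words if w in MOOD_COLORS]

print(colors_for_update("Feeling much better today, sorry for the silence"))
```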

Wiimote Driven Motion Effects

[vimeo 2515709]

Check out the video above by [Adrien Mondot] for an extensive demonstration of eMotion being used with a Wiimote. eMotion is a physics-based visual tool for the Mac, designed to enhance performances by reacting to real-world motion. Its grounding in physics makes the resulting movement appear more natural than if it were arbitrarily generated. The video combines eMotion with the output of Wiimote Whiteboard, a low-cost interactive whiteboard that uses the Wiimote camera plus IR light pens. While the video takes place in a small area, we can see how this could be scaled to a much larger space with IR lights mounted on performers.
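If you want to play with the same IR tracking data yourself, here’s a rough Python sketch using the Linux cwiid library (a Python 2-era project); the pairing step and polling rate are assumptions, not anything from the video.

```python
# Rough sketch of reading the Wiimote's IR camera with cwiid, the same
# data Wiimote Whiteboard uses to track IR pens. Pairing details vary;
# this assumes pressing 1+2 on the Wiimote puts it in discovery mode.
import time
import cwiid

print("Press 1+2 on the Wiimote to pair...")
wiimote = cwiid.Wiimote()            # blocks until a Wiimote connects
wiimote.rpt_mode = cwiid.RPT_IR      # ask for IR camera reports

while True:
    # ir_src holds up to four tracked IR points; empty slots are None.
    for src in wiimote.state["ir_src"]:
        if src:
            x, y = src["pos"]        # camera coordinates, roughly 1024x768
            print("IR point at", x, y)
    time.sleep(0.05)
```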

[via CDM]