Robots learning facial expressions

Researchers at UC San Diego have been working on a robot that learns facial expressions. Starting with a bunch of random movements of the face “muscles”, the robot is rewarded each time it generates something close to an existing expression, and it has iteratively developed several recognizable expressions. We have a few questions. First, are we the only ones who see a crazy woman with a mustache in the picture above? Why is that? What makes [Einstein] look so effeminate in that picture? Second, what reward do you give a robot? You can see this guy in action in a video after the break.
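To get a feel for the reward idea, here is a minimal, purely illustrative sketch: treat each face “muscle” as a servo value, score a pose by how close it lands to a target expression, and keep random tweaks that improve the score. The servo count, target vector, and reward function are all assumptions for demonstration; the actual research scores poses with facial-expression recognition rather than a fixed target.

```python
import random

# Hypothetical setup: a face with a handful of "muscle" servos, each 0.0-1.0.
NUM_SERVOS = 6

# A made-up target expression (e.g. a smile) expressed as servo positions.
TARGET_SMILE = [0.8, 0.8, 0.2, 0.2, 0.5, 0.5]

def reward(pose, target):
    """Higher is better: negative squared distance from the target expression."""
    return -sum((p - t) ** 2 for p, t in zip(pose, target))

def learn_expression(target, iterations=5000, step=0.05):
    """Random-perturbation hill climbing: keep any change that raises the reward."""
    pose = [random.random() for _ in range(NUM_SERVOS)]
    best = reward(pose, target)
    for _ in range(iterations):
        candidate = [min(1.0, max(0.0, p + random.uniform(-step, step)))
                     for p in pose]
        r = reward(candidate, target)
        if r > best:
            pose, best = candidate, r
    return pose

print(learn_expression(TARGET_SMILE))
```

After a few thousand iterations the pose converges on the target, which is the same trial-and-reward loop the robot runs, just with a far richer notion of “close to an existing expression”.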


Lexlrie

Lexlrie is basically a feed display. It can connect to Twitter, Facebook, and We Feel Fine for its updates. What makes this project different is that it is supposed to alter its lighting based on the mood of the updates. The system looks for words like “better” and “sorry” and displays color patterns based on them. We have no idea what “better” should look like, but it’s a cool idea. You can get more details of its construction here. This project vaguely reminds us of Pulse, which was intended to show the emotion of blogger.com updates.
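The keyword-to-color idea boils down to a small lookup. The sketch below is our guess at how that could work, not the project’s actual word list or palette; the keywords and RGB values are invented for illustration.

```python
# Hypothetical Lexlrie-style mood coloring: scan an update for mood keywords
# and pick an RGB color for the display.
MOOD_COLORS = {
    "better": (0, 200, 80),   # greenish for positive words
    "happy":  (255, 200, 0),
    "sorry":  (0, 80, 200),   # blue for apologetic words
    "sad":    (80, 0, 160),
}

def color_for_update(text, default=(128, 128, 128)):
    """Return the color of the first mood keyword found, or a neutral default."""
    words = text.lower().split()
    for word, color in MOOD_COLORS.items():
        if word in words:
            return color
    return default

print(color_for_update("Feeling much better today"))  # -> (0, 200, 80)
```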

Wiimote driven motion effects

Check out the video above by [Adrien Mondot] for an extensive demonstration of eMotion being used with a Wiimote. eMotion is a physics-based visual tool for the Mac. It’s designed to enhance performances by reacting to real-world motion. Its grounding in physics makes the resulting motion appear more natural than if it were arbitrarily generated. The video above combines eMotion with the output of Wiimote Whiteboard, a low-cost interactive whiteboard that uses the Wiimote camera plus IR light pens. While the video takes place in a small area, we can see how this could be scaled to a much larger space with IR lights mounted on performers.
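A rough way to see why the physics layer matters: instead of snapping a visual to the raw IR-pen coordinate reported by the Wiimote camera, treat it as a mass on a spring pulled toward that point. This toy sketch is our own illustration, not eMotion’s engine, and the “IR points” are fabricated stand-ins for real camera readings.

```python
# Spring-damper follower: the visual eases toward each new IR target instead
# of jumping to it, which reads as more natural motion.
def spring_follow(points, stiffness=0.15, damping=0.75):
    x, y = points[0]
    vx = vy = 0.0
    trail = []
    for tx, ty in points:
        # Accelerate toward the target, then damp the velocity.
        vx = (vx + (tx - x) * stiffness) * damping
        vy = (vy + (ty - y) * stiffness) * damping
        x, y = x + vx, y + vy
        trail.append((round(x, 1), round(y, 1)))
    return trail

# Simulated IR-pen jumps; the smoothed trail lags and curves toward each one.
print(spring_follow([(0, 0), (100, 0), (100, 0), (100, 50), (100, 50)]))
```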

[via CDM]
