How Do You Think This Quadcopter Feels?

You don’t speak the language of dogs and yet you can tell when one is angry, excited, or down in the dumps. How does that work, and can it be replicated by a robot? That’s the question which [Megha Sharma] set out to study as part of her graduate research at the University of Manitoba in Winnipeg.

The experiment starts by training the robot on a series of movement patterns meant to mimic emotion. How, you might ask? Apparently you hire an actor trained in Laban Movement Analysis, a system for describing and interpreting how the human body moves; it’s no surprise the technique is in the arsenal of some actors. The training phase uses stationary cameras (kind of like those acrobatic quadcopter projects) to record the device as the actor moves it.

Phase two of the experiment involves playing back the recorded motion with the quadcopter under its own power. A human test subject watches each performance and is asked to describe how the quadcopter feels. The surprising thing is that, when asked, subjects end up anthropomorphising the inanimate device even further, making up small stories about what the thing actually wants.
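The write-up doesn’t go into implementation detail, but the record-then-replay idea is straightforward to sketch. Below is a minimal, hypothetical Python example (not the researchers’ actual code): it assumes the capture session produced timestamped (t, x, y, z) samples in a CSV file, and send_position_setpoint stands in for whatever position-setpoint interface the quadcopter’s flight controller exposes.

```python
import csv
import time


def load_trajectory(path):
    """Read timestamped (t, x, y, z) samples captured during the actor session."""
    samples = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            samples.append((float(row["t"]), float(row["x"]),
                            float(row["y"]), float(row["z"])))
    return samples


def replay(samples, send_position_setpoint):
    """Replay the recorded motion, preserving the original timing between samples."""
    start = time.monotonic()
    for t, x, y, z in samples:
        # Wait until this sample's original time offset has elapsed, since the
        # tempo and pauses are part of what carries the perceived 'emotion'.
        while time.monotonic() - start < t:
            time.sleep(0.001)
        send_position_setpoint(x, y, z)
```

The point of keeping the original timestamps is that the tempo, pauses, and accelerations of the actor’s motion are what carry the perceived emotion, not just the path through space.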

It’s an interesting way to approach the problem of the uncanny valley in robotic projects.

14 thoughts on “How Do You Think This Quadcopter Feels?”

  1. People get paid for this.

    My pen fell off the desk; it started rolling slowly, then dropped quickly, then rolled very slowly, then stopped.

    I can attribute this to: unhappy and despising life, committing suicide, regret, then death…

    My doctorate is in the post, yeah?

    1. Believe it or not (as you wish), this kind of research is important for human-machine interaction, more so as computers/robots become more ubiquitous. Portraying feelings can be used in many scenarios.

      One example a while back was research into using a simple robot with the elderly that could play songs from the early years of the people around it. The study was to see how people reacted, how it affected their emotions, and how it could be used to increase their quality of life (from memory). Imagine a robot that could use emotion to convey information or to assist in improving the moods of the people it is assigned to (e.g. paraplegics, people with poor motor skills, people with mental health issues or learning problems, etc.). They used an example of this in the movie Moon.

      Also consider how this could be used for telepresence: being able to convey emotion remotely could be useful in making telepresence more interactive.

      I’m not sure where the research stands now, but from memory it’s been claimed that we project emotions onto animals. It was stated that some behaviour and expression of emotion in animals (and, from memory, infants) is learnt behaviour to reproduce a response (again, not sure where the current research is on this). As humans, I think it’s pretty clear we project emotions onto all sorts of objects; I’ve seen people assign emotions to their cars, etc. Just look at how Hollywood uses emotions to make us feel for an animated robot (like Wall-E, Number 5, Marvin, etc.).

      My point is that emotions are part of the human condition. Studying how to portray emotions is an avenue for developing human-machine interaction, regardless of whether the machines fly, walk, crawl, or are stationary.

    2. You need a few hundred more pages of padding.

      Love child of his mother’s affair.
      Neglected and abused by the man he called father.
      Witnessing the death of his mother during a violent outburst by the same man.
      Managing to break from his past and make it through college.
      Promising career in business cut short when a rival revealed some compromising photos of him and a pencil sharpener during an office party.
      Years spent trading doodles and fake tattoos to support his white-out habit.
      Eventually hitting a dry spell and suffering withdrawal that he wished had killed him.
      Sobering up enough to realize his life had hit rock-bottom and deciding to end it all.

      Looking over the ledge, deciding he’s past the point of no return, and plunging to his fate.

      Unfortunately he misjudged the height, and instead of the quick end he was hoping for, was granted a few minutes to reflect in the pain of his broken bones and punctured organs. Slowly, he manages to turn over, look to the sky, and wonder: if he had a chance to do it all over, would he have been able to make different choices, or was it all predestined to end this way?

      What field is this doctorate supposed to be in?

    3. “I am a second year Master’s in Computer Science student under Dr. James E. Young and Dr. Rasit Eskicioglu at the University of Manitoba.”

      Nah, she is not a doc, nor getting paid to do this. In fact, she, or her parents, are paying so she can have the privilege to do this and then be graded on it.

      More importantly, this could be a decent start toward creating different ways of communicating. A great deal of research would still be needed to get accurate/desirable information transfer to a decent portion of people. I’d rather have miscellaneous stuff look sad and tired when they are running low on batteries than flash a light at me. I’d be more likely to recharge them (maybe out of guilt or sympathy).

  2. Pro CGI animators seem to be able to express emotion well through the movement of almost any object. I wonder if they’re schooled in Laban (which sounds a little hokey on some levels), or if there’s something more mainstream?

    1. Looks to me like they’re collecting quantitative data during the ‘performance’ part of the experiment (you can see this in the video), then qualitative interview data at the end. It completely makes sense as an experimental design in a fairly novel area, assuming they do something sensible with the qualitative data (another issue entirely).

  3. Smee wrote: “I’d rather have miscellaneous stuff look sad and tired when they are running low on batteries than flash a light at me. I’d be more likely to recharge them (maybe out of guilt or sympathy).”

    That’s the problem! I don’t want machines MANIPULATING me! Imagine the brand loyalty that could be developed by products that “act” like they love you. NO.

    Does this remind anyone else of the quote “Today’s lecture on quantum mechanics will be conducted in the medium of modern dance”? (Carlin, was it?)
