
Share Your Feelings Like A Spy

While hackers can deftly navigate their way through circuit diagrams or technical documentation, for many of us, simple social interactions can be a challenge. [Simone Giertz] decided to help us all out by building a device for sharing our feelings.

Like an assignment in Mission: Impossible, this aluminum box can convey your confessions of love (or guilt) and shred them after your partner (or roommate) reads the message. The box houses a small shredder and timer relay under a piece of bamboo salvaged from a computer stand. When the lid is opened, a switch is depressed that starts a delay before the shredder destroys the message. The shredder, timer, and box seem almost made for each other. As [Giertz] says, “Few things are more satisfying than when two things that have nothing to do with each other, perfectly fit.”
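
For the curious, the timing logic here is about as simple as embedded logic gets. [Giertz] used an off-the-shelf timer relay, but if you wanted to roll your own on a Raspberry Pi, a sketch might look something like this (pin numbers and delays are our own placeholders, not from the build):

```python
# Minimal sketch of the open-lid-then-shred logic on a Raspberry Pi.
# Pin numbers and delays are placeholders, not from [Giertz]'s build.
import time
import RPi.GPIO as GPIO

LID_SWITCH = 17      # input: held closed while the lid is shut (assumed wiring)
SHREDDER_RELAY = 27  # output: drives the shredder motor relay
READ_DELAY_S = 30    # grace period to read the note before it is destroyed

GPIO.setmode(GPIO.BCM)
GPIO.setup(LID_SWITCH, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.setup(SHREDDER_RELAY, GPIO.OUT, initial=GPIO.LOW)

try:
    while True:
        # With the pull-up, the switch reads HIGH once the lid opens.
        if GPIO.input(LID_SWITCH):
            time.sleep(READ_DELAY_S)   # let the message be read
            GPIO.output(SHREDDER_RELAY, GPIO.HIGH)
            time.sleep(5)              # run long enough to feed the note through
            GPIO.output(SHREDDER_RELAY, GPIO.LOW)
            # Wait for the lid to close again before re-arming.
            while GPIO.input(LID_SWITCH):
                time.sleep(0.1)
        time.sleep(0.1)
finally:
    GPIO.cleanup()
```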

While seemingly simple, the attention to detail on this build really sets it apart. The light on the front indicating a message is waiting, and the hinged compartment for easily cleaning out shredded paper, really make this a next-level project. Our favorite detail might be the little space on the side to store properly-sized paper and a marker.

While the aluminum box is very industrial chic, we could see this being really cool in a vintage metal lunch box as well. If you’re looking for other ways to integrate feelings and technology, check out how [Jorvon Moss] brings his bots to life or how a bunch of LEDs can be used to express your mood.

Continue reading “Share Your Feelings Like A Spy”

Hyundai Mini 45 EV Is A Small Car With Grand Ambitions

One of Hyundai’s recent concept cars was an electric vehicle named “45” in honor of its inspiration, another concept car from 45 years ago. When footage of a child-sized “Mini 45” surfaced, it was easy to conclude the car was a motorized toy for children. But Jalopnik got more information from Hyundai about the project, and we learned that a toy was not nearly the whole picture.

The video (embedded below) explains that this little vehicle is a concept car in its own right, and most of the video is a scripted performance illustrating the concept: using technology to help calm young patients in a hospital, reducing their anxiety as they face treatment procedures. Mini 45 packs a lot more equipment than the toy cars available at our local store. The little driver’s heartbeat and breathing rate are monitored, and a camera analyzes facial expressions to gauge emotional stress. The onboard computer has an animated avatar that tries to connect with the patient, armed with tools like colorful animations, happy music, a candy scent dispenser, and a bubble-blowing machine.
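
Hyundai hasn’t published how the car fuses all that sensor data, but conceptually it’s a pipeline from vitals and expression analysis to a comfort response. A purely hypothetical sketch of the idea, with made-up baselines and weights:

```python
# Purely illustrative: Hyundai hasn't published how Mini 45 fuses its
# sensor data, so this just sketches one plausible weighting scheme.

def stress_score(heart_rate_bpm, breaths_per_min, face_stress):
    """Combine vitals and a camera-derived expression score (0..1)
    into a single 0..1 stress estimate. Baselines are made up."""
    hr = min(max((heart_rate_bpm - 90) / 60, 0), 1)   # assumed resting HR baseline
    br = min(max((breaths_per_min - 20) / 20, 0), 1)  # assumed resting breathing baseline
    return 0.3 * hr + 0.2 * br + 0.5 * face_stress

def comfort_action(score):
    """Pick an avatar response for the current stress level."""
    if score > 0.7:
        return "bubbles + candy scent"
    if score > 0.4:
        return "happy music"
    return "colorful animation"

print(comfort_action(stress_score(150, 32, 0.9)))  # -> "bubbles + candy scent"
```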

Continue reading “Hyundai Mini 45 EV Is A Small Car With Grand Ambitions”

Your Next Robot Needs Googly Eyes, And Other Lessons From Disney

There are so many important design decisions behind a robot: battery, means of locomotion, and position sensing, to name a few. But at a library in Helsinki, one of the most surprising design features for a librarian’s assistant robot was googly eyes. A company called Futurice built a robot for the Oodi library and found that googly eyes were a very important component.

The eyes are not there to help the robot see; of course they aren’t functional, at least not in that way. However, without the eyes, the designers found that people had trouble relating to the service robot. The robot also needed emotions that it could show using the eyes, along with sounds and motion. This was inspired, apparently, by Disney’s rules for animation. In particular, the eyes fit the rule of “exaggeration.” The robot could look bored when it had no task, excited when it was helping people, and unhappy when people were not being cooperative.
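
The mood logic itself can be dead simple; the exaggerated display does the heavy lifting. Here’s a toy sketch of that idea (the state names and eye/sound mappings are our own invention, not Futurice’s code):

```python
# A toy version of the mood logic as described in the writeup:
# idle -> bored, helping -> excited, blocked -> unhappy.
from enum import Enum, auto

class Mood(Enum):
    BORED = auto()
    EXCITED = auto()
    UNHAPPY = auto()

# Disney-style "exaggeration": each mood maps to a big, legible display.
EXPRESSIONS = {
    Mood.BORED:   {"eyes": "slow wander", "sound": "sigh",    "motion": "sway in place"},
    Mood.EXCITED: {"eyes": "wide open",   "sound": "chirp",   "motion": "quick turns"},
    Mood.UNHAPPY: {"eyes": "droop",       "sound": "low hum", "motion": "slow retreat"},
}

def update_mood(has_task: bool, person_cooperating: bool) -> Mood:
    if not has_task:
        return Mood.BORED
    return Mood.EXCITED if person_cooperating else Mood.UNHAPPY

mood = update_mood(has_task=True, person_cooperating=False)
print(mood, EXPRESSIONS[mood])  # Mood.UNHAPPY with drooping eyes
```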

Continue reading “Your Next Robot Needs Googly Eyes, And Other Lessons From Disney”

Christine Sunu Proves The Effect Of Being Alive On Hardware Design

Modeling machines off of biological patterns is the dry definition of biomimicry. For most people, this means the structure of robots and how they move, but Christine Sunu makes the argument that we should be thinking a lot more about how biomimicry has the power to make us feel something. Her talk at the 2017 Hackaday Superconference looks at what makes robots more than cold metal automatons. There is great power in designing to complement natural emotional reactions in humans — to make machines that feel alive.

We live in a world that is being filled with robots and increasingly these are breaking out of the confines of industrial automation to take a place side by side with humans. The key to making this work is to make robots that are recognizable as machines, yet intuitively accepted as being lifelike. It’s the buy-in that these robots are more than appliances, and Christine has boiled down the keys to unlocking these emotional reactions.

Continue reading “Christine Sunu Proves The Effect Of Being Alive On Hardware Design”

Simon Says Smile, Human!

The bad news is that when our robot overlords come to oppress us, they’ll be able to tell how well they’re doing just by reading our facial expressions. The good news? Silly computer-vision-enhanced party games!

[Ricardo] wrote up a quickie demonstration, mostly powered by OpenCV and Microsoft’s Emotion API, that scores your ability to mimic emoticon faces. So when you get shown a devil-with-devilish-grin image, you’re supposed to make the same face convincingly enough to fool a neural network classifier. And hilarity ensues!
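
For a sense of how little glue code a game like this needs: grab a frame with OpenCV, ship it off to the classifier, and read back a score for the target emotion. The endpoint and response shape below follow the old Emotion API v1.0 docs (the service has since been retired and folded into Azure Face), so treat them as assumptions rather than a working recipe:

```python
# Sketch of the game loop: webcam frame in, emotion score out.
# Endpoint and response shape assume the retired Emotion API v1.0.
import cv2
import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"
KEY = "your-subscription-key"  # placeholder

def score_face(target="happiness"):
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("no frame from webcam")
    ok, jpg = cv2.imencode(".jpg", frame)
    resp = requests.post(
        ENDPOINT,
        headers={"Ocp-Apim-Subscription-Key": KEY,
                 "Content-Type": "application/octet-stream"},
        data=jpg.tobytes(),
    )
    faces = resp.json()
    if not faces:
        return 0.0
    # v1.0 returned a per-face dict of emotion probabilities.
    return faces[0]["scores"][target]

print(f"happiness score: {score_face('happiness'):.2f}")
```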

Continue reading “Simon Says Smile, Human!”

Robot Einstein Could Save Humans From Killbot Destruction


Earlier this year we saw the Einstein robot, which is being developed to bring human facial expressions to robots. [David Hanson], the man in charge of this project, has given a TED talk on his work that includes a show-and-tell of his most recent progress. We’ve embedded the video after the break for your enjoyment.

The Einstein robot (head only in this video) shows off the ability to recognize and mimic the facial expressions of the person in front of it. There is also video of a Blade Runner-esque robot looking around a room, recognizing and remembering the faces of the people it sees. [David] makes a very interesting proclamation: he’s trying to teach robots empathy. He feels that there is a mountain of R&D money going into robots that can kill and not much for those that can sense human emotions. His hope is that if we can teach robots empathy, we might not be annihilated when they become smarter than us.

That’s not such a bad idea. Any way you look at it, this talk is interesting, and we wish the five-minute offering was five times as long. But [Mr. Hanson’s] facial hair alone is worth clicking through to see.

Continue reading “Robot Einstein Could Save Humans From Killbot Destruction”

Pulse, The Emotional Visualization Organism

[Markus Kison] built a device called Pulse, which is part art installation and part data visualization tool. The “emotional visualization organism,” as he calls it, scans new posts on Blogger.com blogs for synonyms of keywords related to 24 distinct emotions from eight emotional groups. A red cone in the center expands when keywords are detected, in effect acting as a mood indicator for Blogger.com blogs.

The 24 distinct emotions are based on [Robert Plutchik]’s psychoevolutionary theory of emotion, and the device itself is built from a glass case, various servo motors, and a custom controller for the servos. This is a compelling idea, but we wonder whether it scans for modifying words or just the keywords alone. It wouldn’t make a lot of sense to have the sadness region expand drastically if many people simultaneously post the sentence “I’m not sad at all.” Video embedded after the break.
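
To make that worry concrete: naive keyword counting scores “I’m not sad at all” as sadness, while even a crude negation window in front of each hit filters it out. A minimal sketch (the word lists are token examples, not [Kison]’s actual synonym sets):

```python
# Naive keyword matching vs. a small negation window. Word lists are
# token examples only; the real synonym sets follow Plutchik's model.
NEGATORS = {"not", "no", "never", "hardly", "n't"}
EMOTION_KEYWORDS = {
    "sadness": {"sad", "unhappy", "miserable"},
    "joy": {"happy", "glad", "delighted"},
}

def emotion_hits(text, window=3):
    tokens = text.lower().replace("n't", " n't").split()
    hits = {emotion: 0 for emotion in EMOTION_KEYWORDS}
    for i, tok in enumerate(tokens):
        word = tok.strip(".,!?")
        for emotion, words in EMOTION_KEYWORDS.items():
            if word in words:
                # Skip the hit if a negator appears just before it.
                preceding = tokens[max(0, i - window):i]
                if not NEGATORS & set(preceding):
                    hits[emotion] += 1
    return hits

print(emotion_hits("I'm not sad at all."))  # {'sadness': 0, 'joy': 0}
print(emotion_hits("So sad and unhappy."))  # {'sadness': 2, 'joy': 0}
```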

Continue reading “Pulse, The Emotional Visualization Organism”