Hackaday Prize Entry: Hands|On Glove Speaks Sign Language

The Hands|On glove looks like a Power Glove replacement, but it’s a lot more and a lot better. (Which is not to say that the Power Glove wasn’t cool. It was bad.) And it has to be: the task it’s tackling isn’t playing stripped-down video games, but reading the user’s sign-language gestures out loud so that people who don’t understand sign can understand those who do.

The glove needs a lot of sensor data to accurately interpret the user’s gestures, and the Hands|On doesn’t disappoint. Multiple flex sensors are attached to each finger, so the glove can tell which joints are bent. Some fingers have capacitive touch pads on them so the glove knows when two fingers are touching each other, which is important in the US sign alphabet. Finally, the glove has a nine-degree-of-freedom inertial measurement unit (IMU) so that it can keep track of pitch, yaw, and roll: the hand’s overall orientation.
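
Neither the firmware nor the wiring is shown in the excerpt above, but the acquisition side is easy to picture. Here is a minimal, hedged sketch for a Teensy 3.2 in Arduino-style C++; the pin assignments, the flex-sensor voltage dividers, and the touchRead() threshold are all assumptions for illustration, not details from the project.

```cpp
// Minimal sensor-acquisition sketch (Arduino/Teensyduino C++).
// Pin choices and divider values are assumptions, not the project's actual wiring.

#include <Arduino.h>

const int NUM_FLEX = 5;
const int flexPins[NUM_FLEX]   = {A0, A1, A2, A3, A4};  // one voltage divider per finger (assumed)
const int NUM_TOUCH = 2;
const int touchPins[NUM_TOUCH] = {0, 1};                // Teensy 3.2 touch-capable pins (assumed)
const int TOUCH_THRESHOLD = 2500;                       // tune per pad; raw units from touchRead()

void setup() {
  Serial.begin(115200);
}

void loop() {
  // Flex sensors: each sits in a voltage divider, so more bend shifts the ADC reading.
  for (int i = 0; i < NUM_FLEX; i++) {
    Serial.print(analogRead(flexPins[i]));
    Serial.print('\t');
  }

  // Capacitive pads: Teensy's touchRead() returns a larger value when a pad is touched.
  for (int i = 0; i < NUM_TOUCH; i++) {
    bool touching = touchRead(touchPins[i]) > TOUCH_THRESHOLD;
    Serial.print(touching ? 1 : 0);
    Serial.print('\t');
  }
  Serial.println();
  delay(20);  // roughly 50 Hz sample rate
}
```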

In short, the glove takes in a lot of data. This data is cleaned up and analyzed on a Teensy 3.2 board, then sent off over Bluetooth to its final destination. There’s a lot of work done (and some still to be done) on the software side as well. Have a read through the project’s report (PDF) if you’re interested in support vector machines for sign classification.
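
The write-up doesn’t say which Bluetooth module or wire format the glove uses, so the following is only a sketch of one plausible framing scheme, assuming a UART-connected Bluetooth adapter on the Teensy’s Serial1 and a simple start byte plus XOR checksum.

```cpp
// Hypothetical packet framing for the Bluetooth link (Arduino/Teensyduino C++).
// Module, frame layout, and baud rate are assumptions, not the project's spec.

#include <Arduino.h>

struct GloveFrame {
  uint16_t flex[5];   // raw ADC readings, one per finger
  uint8_t  touch;     // bit field of touch-pad contacts
  int16_t  ypr[3];    // yaw, pitch, roll in centidegrees from the IMU
};

void sendFrame(const GloveFrame &f) {
  const uint8_t *bytes = reinterpret_cast<const uint8_t *>(&f);
  uint8_t checksum = 0;
  Serial1.write(0xAA);                      // start-of-frame marker
  for (size_t i = 0; i < sizeof(f); i++) {
    Serial1.write(bytes[i]);
    checksum ^= bytes[i];
  }
  Serial1.write(checksum);                  // lets the receiver drop corrupted frames
}

void setup() { Serial1.begin(115200); }

void loop() {
  GloveFrame f = {};                        // fill from the sensor reads shown earlier
  sendFrame(f);
  delay(20);
}
```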

Sign language is most deaf folks’ native language, and it’s a shame that the hearing community can’t understand it directly. Breaking down that barrier is a great idea, and it makes a great entry in the Hackaday Prize!

Electronic Glove Detects Sign Language

A team of Cornell students recently built a prototype electronic glove that can detect sign language and speak the characters out loud. The glove is designed to work with a variety of hand sizes, but currently only fits on the right hand.

The glove uses several different sensors to detect hand motion and position. Perhaps the most obvious are the flex sensors that cover each finger. These sensors change resistance according to how far they are bent, which lets the glove measure how much each finger is curled. The glove also contains an MPU-6050, which combines a 3-axis accelerometer and a 3-axis gyroscope. This sensor can detect the hand’s orientation as well as rotational movement.
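
The post doesn’t include the students’ code, but reading the MPU-6050 over I2C is a well-trodden path. A minimal Arduino-style readout might look like the sketch below; the register addresses come from the part’s datasheet, while the 0x68 bus address assumes the AD0 pin is tied low.

```cpp
// Minimal MPU-6050 readout over I2C (Arduino C++).

#include <Arduino.h>
#include <Wire.h>

const uint8_t MPU_ADDR = 0x68;   // I2C address with AD0 tied low (an assumption)

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);              // PWR_MGMT_1 register
  Wire.write(0x00);              // clear the sleep bit so the sensor starts sampling
  Wire.endTransmission();
}

void loop() {
  // The accel, temperature, and gyro registers are contiguous starting at 0x3B,
  // so one burst read of 14 bytes grabs everything.
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);              // ACCEL_XOUT_H
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, (uint8_t)14);
  if (Wire.available() < 14) return;

  uint8_t raw[14];
  for (int i = 0; i < 14; i++) raw[i] = Wire.read();

  int16_t ax = (raw[0] << 8) | raw[1];
  int16_t ay = (raw[2] << 8) | raw[3];
  int16_t az = (raw[4] << 8) | raw[5];    // raw[6..7] are temperature, skipped here
  int16_t gx = (raw[8] << 8) | raw[9];
  int16_t gy = (raw[10] << 8) | raw[11];
  int16_t gz = (raw[12] << 8) | raw[13];

  Serial.print("accel "); Serial.print(ax); Serial.print(' ');
  Serial.print(ay); Serial.print(' '); Serial.print(az);
  Serial.print("  gyro "); Serial.print(gx); Serial.print(' ');
  Serial.print(gy); Serial.print(' '); Serial.println(gz);
  delay(50);
}
```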

While the more high-tech sensors are used to detect most characters, a few letters are similar enough to trick the system; specifically, the team had trouble with the letters R, U, and V. To get around this, the students strategically placed copper tape in several locations on the fingers. When two pieces of tape come together, they close a circuit and act as a momentary switch.
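
One common way to read contacts like these is to treat each tape pair as a switch to ground on a pin with the internal pull-up enabled. The sketch below is only an illustration; the pin numbers are made up and the students’ actual wiring may differ.

```cpp
// Reading copper-tape contacts as momentary switches (Arduino C++).
// One pad of each pair goes to ground, the other to an input pin with a pull-up.

#include <Arduino.h>

const int contactPins[3] = {2, 3, 4};   // R/U/V disambiguation pads (assumed pins)

void setup() {
  Serial.begin(9600);
  for (int pin : contactPins) pinMode(pin, INPUT_PULLUP);
}

void loop() {
  for (int pin : contactPins) {
    // The pull-up keeps the pin HIGH until the two pieces of tape touch and short it to ground.
    Serial.print(digitalRead(pin) == LOW ? 1 : 0);
    Serial.print(' ');
  }
  Serial.println();
  delay(50);
}
```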

The sensor data is collected by an ATmega1284P microcontroller and compiled into a packet, which is sent to a PC that does the heavy processing. The system uses a machine learning algorithm: the user trains it by gesturing each letter of the alphabet multiple times, and the system collects all of this data into a data set that can then be used for detection.
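
The write-up doesn’t spell out which learning algorithm runs on the PC, so the snippet below is not their method. It is just a toy nearest-centroid classifier in plain C++17 showing the general train-then-match idea: average the repeated training gestures into one template per letter, then pick the closest template for each new reading.

```cpp
// Toy train-then-detect illustration (plain C++17). NOT the students' algorithm;
// a nearest-centroid classifier over made-up flex/accelerometer feature vectors.

#include <array>
#include <iostream>
#include <map>
#include <vector>

using Features = std::array<float, 8>;   // e.g. 5 flex values + 3 accelerometer axes

class NearestCentroid {
  std::map<char, std::vector<Features>> samples_;
 public:
  void addSample(char letter, const Features &f) { samples_[letter].push_back(f); }

  char classify(const Features &f) const {
    char best = '?';
    float bestDist = 1e30f;
    for (const auto &[letter, set] : samples_) {
      Features centroid{};                       // average of that letter's training samples
      for (const auto &s : set)
        for (size_t i = 0; i < f.size(); i++) centroid[i] += s[i] / set.size();
      float d = 0;
      for (size_t i = 0; i < f.size(); i++) d += (f[i] - centroid[i]) * (f[i] - centroid[i]);
      if (d < bestDist) { bestDist = d; best = letter; }
    }
    return best;
  }
};

int main() {
  NearestCentroid model;
  // In the real system the user signs each letter several times; here we fake two letters.
  model.addSample('a', {200, 210, 205, 190, 195, 0.1f, 0.0f, 0.9f});
  model.addSample('a', {198, 215, 200, 192, 199, 0.1f, 0.1f, 0.9f});
  model.addSample('b', {800, 790, 810, 805, 795, 0.0f, 0.2f, 0.8f});

  Features unknown = {199, 212, 203, 191, 197, 0.1f, 0.0f, 0.9f};
  std::cout << "Detected letter: " << model.classify(unknown) << '\n';
  return 0;
}
```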

This is a great project to take on. If you need more inspiration, there’s a lot to be found, including another Cornell project that speaks the letters you sign, as well as this one, which straps all the needed parts to your forearm.
Continue reading “Electronic Glove Detects Sign Language”

Hackaday Links: Sunday, August 4th, 2013

[Craig Turner] shows that simplicity can be surprisingly interesting. He connected up different colors of blinking LEDs in a grid. There’s no controller, but the startup voltage differences between colors make for some neat patterns with zero effort.

Remember the 3D printed gun? How about a 3D printed rifle! [Thanks Anonymous via Reason]

While we’re on the topic of 3D printing, here’s a design to straighten out your filament.

It takes four really big propellers to get an ostrich off the ground. This quadcopter’s a bit too feathery for us, but we still couldn’t stop laughing.

This Kinect sign language translator looks pretty amazing. It puts the Kinect on a motorized gimbal so that it can better follow the signer. We just had a bit of trouble with the translation since the sound and text are both in Hebrew; otherwise this probably would have gotten a standalone feature.

Work smarter, not harder with this internal combustion wheelbarrow. [via Adafruit]

Sign And Speak Glove

This wire-covered glove is capable of turning your hand gestures into speech, and it does so wirelessly. The wide range of sensors includes nine flex sensors, four contact sensors, and an accelerometer. The flex sensors do most of the work, monitoring the bend of the wearer’s finger joints. The contact sensors augment the flex-sensor data, helping to differentiate between letters that have similar finger positions. The accelerometer is responsible for decoding the movements that go along with the hand positions. Together they can detect all of the letters in the American Sign Language alphabet.

An ATmega644 monitors all of the sensors and pushes the data out through a wireless transmitter. MATLAB collects the data coming in over the wireless link and saves it for later analysis by a Java program. Once the motions have been decoded into letters, they are assembled into sentences and fed into a text-to-speech program.

You’ve probably already guessed that there’s a demo video after the break.

Continue reading “Sign And Speak Glove”

From Sign Language To Spoken Language

As part of a senior design project for a biomedical engineering class, [Kendall Lowrey] worked on a team to develop a device that translates American Sign Language into spoken English. Wanting to eclipse the glove-based devices that came before them, the team set out to move beyond strictly spelling words and toward combining signs with common gestures. The project is based around an Arduino Mega and is limited to the alphabet and about ten words because of initial program-space constraints. When the five flex sensors and three accelerometer values register an at-rest state for two seconds, the device takes a reading and looks up the most likely word or letter in a table. It then outputs that to a VoiceBox shield, which turns the words or letters into phonetic sounds.
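
The article doesn’t include the firmware, but the at-rest trigger is straightforward to sketch. Below is a hedged Arduino-style version in which the pins, tolerance, and lookup step are placeholders rather than values from the project.

```cpp
// Hedged sketch of the two-second at-rest trigger (Arduino C++).

#include <Arduino.h>

const int NUM_SENSORS = 8;            // 5 flex sensors + 3 accelerometer axes (assumed analog)
const int sensorPins[NUM_SENSORS] = {A0, A1, A2, A3, A4, A5, A6, A7};
const int TOLERANCE = 15;             // ADC counts of allowed jitter while "at rest"
const unsigned long HOLD_MS = 2000;   // how long the pose must be held

int lastReading[NUM_SENSORS];
unsigned long stableSince = 0;

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < NUM_SENSORS; i++) lastReading[i] = analogRead(sensorPins[i]);
  stableSince = millis();
}

void loop() {
  bool stable = true;
  for (int i = 0; i < NUM_SENSORS; i++) {
    int r = analogRead(sensorPins[i]);
    if (abs(r - lastReading[i]) > TOLERANCE) stable = false;
    lastReading[i] = r;
  }

  if (!stable) {
    stableSince = millis();                 // movement resets the two-second timer
  } else if (millis() - stableSince >= HOLD_MS) {
    // Pose held: in the real device this reading would index into a table of
    // letters and words before being sent to the VoiceBox shield.
    Serial.println("pose captured -- look up nearest table entry here");
    stableSince = millis();                 // avoid re-triggering every pass through loop()
  }
  delay(20);
}
```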

More Glove-based Interfaces

You may remember seeing the golf glove air guitar hack last month. Here are two more uses for gloves with sensors on them.

On the left is a glove interface with flex sensors on each digit as well as an accelerometer. The VEX module reads the sensors to detect sign language as a command set, and a shake of the hand, picked up by the accelerometer, switches between different command sets. See it controlling a little robot after the break. This comes from [Amnon Demri], who was also involved in the EMG prosthesis.
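
The build runs on a VEX controller, so the following is only a loose, Arduino-style illustration of the shake idea: watch for a large, sudden deviation in the accelerometer readings and flip the active command set when it happens. The pins and threshold are placeholders.

```cpp
// Rough shake-detection illustration (Arduino C++), assuming a generic analog
// 3-axis accelerometer on A0-A2; not the actual VEX implementation.

#include <Arduino.h>

const int axisPins[3] = {A0, A1, A2};
const long SHAKE_THRESHOLD = 20000;   // squared ADC-count deviation; tune on real hardware
int commandSet = 0;                   // which gesture vocabulary is active
long baseline[3];

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 3; i++) baseline[i] = analogRead(axisPins[i]);  // resting reading
}

void loop() {
  long deviation = 0;
  for (int i = 0; i < 3; i++) {
    long d = analogRead(axisPins[i]) - baseline[i];
    deviation += d * d;               // big spikes on any axis count toward a shake
  }

  if (deviation > SHAKE_THRESHOLD) {
    commandSet = (commandSet + 1) % 2;   // flip between the two command sets
    Serial.print("shake detected, command set ");
    Serial.println(commandSet);
    delay(500);                          // crude debounce so one shake = one switch
  }
  delay(20);
}
```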

Straight out of Cornell we have the SudoGlove, seen on the right. [Jeremy Blum] and his fellow engineering students bring together a mess of different sensors with an Arduino and an XBee module to control a small RC car with added lights and a siren. There’s embedded video after the break. You may want to jump past the music video for the description, which starts at about 3:52.

Continue reading “More Glove-based Interfaces”