3D Printed Robotic Arms for Sign Language

A team of students in Antwerp, Belgium is responsible for Project Aslan, which is exploring the feasibility of using 3D printed robotic arms for assisting with and translating sign language. The idea came from the fact that sign language translators are few and far between, and it’s a task that robots may be able to help with. In addition to translation, robots may be able to assist with teaching sign language as well.

The project set out to use 3D printing and other technology to explore whether low-cost robotic signing could be of any use. So far, the team has an arm that can convert text into finger spelling and counting. It’s an interesting application for a robotic arm: signing demands a good range of motion, but there’s no need to carry or move any payload at all.
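For a rough idea of how text-to-finger-spelling can work mechanically, here’s a minimal Arduino-style sketch that drives one servo per finger from a letter-to-pose lookup table. Everything in it is an assumption for illustration: the pin numbers, the pose angles, and the three-letter table are placeholders, not Project Aslan’s actual firmware.

```cpp
// Hypothetical finger-spelling-by-lookup-table sketch.
// One hobby servo per finger; angles below are invented placeholders.
#include <Servo.h>
#include <ctype.h>

const int NUM_FINGERS = 5;
Servo fingers[NUM_FINGERS];
const int servoPins[NUM_FINGERS] = {3, 5, 6, 9, 10};

// Servo angles (degrees) per finger, thumb..pinky, for a few letters.
struct Pose { char letter; uint8_t angle[NUM_FINGERS]; };
const Pose poses[] = {
  {'a', { 40, 170, 170, 170, 170}},  // fist, thumb alongside
  {'b', {150,  10,  10,  10,  10}},  // fingers straight, thumb folded
  {'c', { 80,  80,  80,  80,  80}},  // curved "C" shape
};

void setPose(char c) {
  for (const Pose &p : poses) {
    if (p.letter == c) {
      for (int i = 0; i < NUM_FINGERS; i++) fingers[i].write(p.angle[i]);
      return;
    }
  }
}

void spell(const char *word) {
  for (const char *c = word; *c; c++) {
    setPose((char)tolower(*c));
    delay(800);  // hold each letter long enough to read
  }
}

void setup() {
  for (int i = 0; i < NUM_FINGERS; i++) fingers[i].attach(servoPins[i]);
  spell("abc");
}

void loop() {}
```

A real signing hand needs far more joints and calibrated poses for the whole alphabet, but the core loop is the same: look up a pose, move the servos, hold, repeat.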

Closeup of hand actuators and design.

A single articulated hand is a good proof of concept, and these early results show promise and potential, but there is still a long way to go. Sign language involves more than just the hands: it is performed with both hands, arms, and shoulders, and it incorporates motion and facial expressions. Most signing isn’t finger spelling, either; finger spelling is reserved primarily for proper names and specific nouns. Still, a robot hand that can finger spell is an important first step toward everything else.

Future directions for the project include adding a second arm, adding expressiveness, and exploring the use of cameras to teach new signs. The ability to teach different signs is important, because any project that aims to act as a translator or facilitator needs the ability to learn and update. There is a lot of diversity among the world’s sign languages. For people unfamiliar with signing, it may come as a surprise that American Sign Language (ASL), for example, is related to French Sign Language, yet both are entirely different from British Sign Language (BSL). A video of the project is embedded below.

Continue reading “3D Printed Robotic Arms for Sign Language”

Speech to Sign Language

According to the World Federation of the Deaf, around 70 million people worldwide have some form of sign language as their first language. Estimates of the number of ASL (American Sign Language) users in the US range from 500,000 to two million. If you go to Google Translate, though, there’s no option for sign language.

[Alex Foley] and friends decided to do something about that. They were attending McHack (a hackathon at McGill University) and decided to convert speech into sign language. They thought they were prepared, but it turns out they had to work a few things out on the fly. (Isn’t that always the case?) But in the end, they prevailed, as you can see in the video below.

Continue reading “Speech to Sign Language”

Hackaday Prize Entry: Hands|On Glove Speaks Sign Language

The Hands|On glove looks like a Power Glove replacement, but it’s a lot more and a lot better. (Which is not to say that the Power Glove wasn’t cool. It was bad.) And it has to be: the task it’s tackling isn’t playing stripped-down video games, but reading the user’s sign-language gestures out loud so that people who don’t understand sign can understand those who do.

The glove needs a lot of sensor data to interpret the user’s gestures accurately, and the Hands|On doesn’t disappoint. Multiple flex sensors are attached to each finger so that the glove can tell which joints are bent. Some fingers have capacitive touch pads on them so that the glove knows when two fingers are touching each other, which is important in the US sign alphabet. Finally, the glove has a nine-degree-of-freedom inertial measurement unit (IMU) so that it can track the hand’s orientation: pitch, yaw, and roll.
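To make that data flow concrete, here’s a hedged sketch of what the sensor-gathering loop might look like on the glove’s Teensy 3.2. The pin assignments, sensor counts, and sample rate are all invented for illustration; touchRead() is the Teensy’s built-in capacitive sensing call.

```cpp
// Illustrative sensor-gathering loop for a Teensy 3.2 glove.
const int NUM_FLEX = 10;   // multiple flex sensors per finger
const int flexPins[NUM_FLEX] = {A0, A1, A2, A3, A4, A5, A6, A7, A8, A9};
const int NUM_TOUCH = 4;
const int touchPins[NUM_TOUCH] = {0, 1, 15, 16};  // touch-capable pins

void setup() { Serial.begin(115200); }

void loop() {
  int features[NUM_FLEX + NUM_TOUCH];
  for (int i = 0; i < NUM_FLEX; i++)
    features[i] = analogRead(flexPins[i]);            // joint bend
  for (int i = 0; i < NUM_TOUCH; i++)
    features[NUM_FLEX + i] = touchRead(touchPins[i]); // finger contact
  // The IMU's orientation values would be appended here via its own
  // library before the feature vector heads out over Bluetooth.
  for (int i = 0; i < NUM_FLEX + NUM_TOUCH; i++) {
    Serial.print(features[i]);
    Serial.print(i < NUM_FLEX + NUM_TOUCH - 1 ? "," : "\n");
  }
  delay(20);  // roughly 50 Hz
}
```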

In short, the glove takes in a lot of data. This data is cleaned up and analyzed on a Teensy 3.2 board, then sent off over Bluetooth to its final destination. There’s a lot of work done (and some still to be done) on the software side as well. Have a read through the project’s report (PDF) if you’re interested in support vector machines for sign classification.
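The report has the details, but in case “support vector machine” sounds heavier than it is: at inference time, a linear one-vs-rest SVM is just a dot product and a comparison per class. The sketch below shows that scoring step; the feature count, class count, and weights are placeholders standing in for whatever the real classifier learned offline.

```cpp
// Linear one-vs-rest SVM scoring: one weight vector and bias per letter,
// all learned offline. Sizes and values here are placeholders.
const int NUM_FEATURES = 14;   // flex + touch + IMU channels, say
const int NUM_CLASSES  = 26;   // A-Z

float weights[NUM_CLASSES][NUM_FEATURES];  // filled in from training
float biases[NUM_CLASSES];

char classify(const float features[NUM_FEATURES]) {
  int best = 0;
  float bestScore = -1e30f;
  for (int c = 0; c < NUM_CLASSES; c++) {
    float score = biases[c];
    for (int i = 0; i < NUM_FEATURES; i++)
      score += weights[c][i] * features[i];          // dot product
    if (score > bestScore) { bestScore = score; best = c; }
  }
  return (char)('A' + best);   // class with the largest margin wins
}
```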

Sign language is most deaf folks’ native language, and it’s a shame that the hearing community can’t understand it directly. Breaking down that barrier is a great idea, and it makes a great entry in the Hackaday Prize!

Electronic Glove Detects Sign Language

A team of Cornell students recently built a prototype electronic glove that can detect sign language and speak the characters out loud. The glove is designed to work with a variety of hand sizes, but currently only fits on the right hand.

The glove uses several different sensors to detect hand motion and position. The most obvious are the flex sensors that cover each finger. Each sensor’s resistance changes according to the degree of the bend, letting the glove detect how far each finger is flexed. The glove also contains an MPU-6050, which combines a 3-axis accelerometer with a 3-axis gyroscope. This sensor detects the hand’s orientation as well as rotational movement.
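Flex sensors are usually read through a simple voltage divider, so recovering the actual resistance is one line of algebra. Here’s a minimal example; the 10k divider resistor, 5V supply, and pin choice are illustrative, not necessarily what the Cornell team used.

```cpp
// Reading one flex sensor through a voltage divider: flex sensor from
// Vcc to A0, fixed 10k resistor from A0 to ground. Values illustrative.
const float VCC     = 5.0;      // supply voltage
const float R_FIXED = 10000.0;  // fixed divider resistor, ohms

void setup() { Serial.begin(9600); }

void loop() {
  int adc = analogRead(A0);                 // 0..1023
  if (adc > 0) {
    float vOut  = adc * VCC / 1023.0;       // voltage at the divider tap
    // vOut = VCC * R_FIXED / (R_FIXED + rFlex), solved for rFlex:
    float rFlex = R_FIXED * (VCC - vOut) / vOut;
    Serial.println(rFlex);                  // rises as the finger bends
  }
  delay(100);
}
```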

While the more high-tech sensors detect most characters, a few letters are similar enough to trick the system; specifically, the team had trouble with R, U, and V. To get around this, the students strategically placed copper tape at several locations on the fingers. When two pieces of tape come together, they close a circuit and act as a momentary switch.
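The copper-tape trick needs almost no supporting hardware; each tape pair can hang off a single input pin with the internal pull-up enabled, along these lines (pin number invented for the example):

```cpp
// One copper-tape contact pair as a momentary switch: one pad grounded,
// the other on an input pin with the internal pull-up enabled.
const int CONTACT_PIN = 7;

void setup() {
  pinMode(CONTACT_PIN, INPUT_PULLUP);  // reads HIGH until the pads touch
  Serial.begin(9600);
}

void loop() {
  bool touching = (digitalRead(CONTACT_PIN) == LOW);  // LOW = circuit closed
  if (touching) Serial.println("contact");  // extra bit to split R from U and V
  delay(50);
}
```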

The sensor data is collected by an ATmega1284P microcontroller and compiled into a packet. This packet gets sent to a PC, which does the heavy processing. The PC runs a machine-learning algorithm: the user trains it by gesturing each letter of the alphabet multiple times, and the system stores all of this data in a set that can then be used for detection.
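The write-up doesn’t spell out the packet format, but a typical approach looks something like the struct below: a sync byte so the PC can find packet boundaries, the raw sensor values, and a simple checksum. Treat the field layout and the XOR checksum as assumptions, not the actual Cornell protocol.

```cpp
// One plausible framing for the sensor packet (layout is an assumption).
struct SensorPacket {
  uint8_t  start;       // constant sync marker, e.g. 0xAA
  uint16_t flex[5];     // raw ADC value per finger
  int16_t  accel[3];    // MPU-6050 accelerometer axes
  int16_t  gyro[3];     // MPU-6050 gyro axes
  uint8_t  contacts;    // copper-tape switches, one bit each
  uint8_t  checksum;    // XOR of all preceding bytes
} __attribute__((packed));

SensorPacket pkt;       // filled in by the sensor-reading code

void sendPacket(SensorPacket &p) {
  p.start = 0xAA;
  p.checksum = 0;
  uint8_t *bytes = (uint8_t *)&p;
  for (size_t i = 0; i + 1 < sizeof(p); i++) p.checksum ^= bytes[i];
  Serial.write(bytes, sizeof(p));   // off to the PC for classification
}

void setup() { Serial.begin(115200); }

void loop() {
  // ...populate pkt.flex, pkt.accel, pkt.gyro, pkt.contacts here...
  sendPacket(pkt);
  delay(20);
}
```

On the PC side, the reader would scan for the 0xAA marker and verify the checksum before handing the values to the classifier.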

This is a great project to take on. If you need more inspiration, there’s a lot to be found, including another Cornell project that speaks the letters you sign, as well as this one, which straps all of the needed parts to your forearm.
Continue reading “Electronic Glove Detects Sign Language”

Hackaday Links: Sunday, August 4th, 2013


[Craig Turner] shows that simplicity can be surprisingly interesting. He connected up different colors of blinking LEDs in a grid. There’s no controller, but the startup voltage differences between colors make for some neat patterns with zero effort.

Remember the 3D printed gun? How about a 3D printed rifle! [Thanks Anonymous via Reason]

While we’re on the topic of 3D printing, here’s a design to straighten out your filament.

It takes four really big propellers to get an ostrich off the ground. This quadcopter’s a bit too feathery for us, but we still couldn’t stop laughing.

This Kinect sign language translator looks pretty amazing. It puts the Kinect on a motorized gimbal so that it can better follow the signer. We had a bit of trouble with the translation, though, since both the sound and text are in Hebrew; otherwise this probably would have gotten a standalone feature.

Work smarter, not harder with this internal combustion wheelbarrow. [via Adafruit]

Sign and speak glove

This wire-covered glove is capable of turning your hand gestures into speech, and it does so wirelessly. Its wide range of sensors includes nine flex sensors, four contact sensors, and an accelerometer. The flex sensors do most of the work, monitoring the alignment of the wearer’s finger joints. The contact sensors augment the flex-sensor data, helping to differentiate between letters that have similar finger positions. The accelerometer is responsible for decoding the movements that go along with the hand positions. Together they detect all of the letters in the American Sign Language alphabet.

An ATmega644 monitors all of the sensors and pushes the data out through a wireless transmitter. On the PC side, MATLAB collects the data coming in over the wireless link and saves it for analysis by a Java program. Once the motions have been decoded into letters, they are assembled into sentences and fed into a text-to-speech program.

You’ve probably already guessed that there’s a demo video after the break.

Continue reading “Sign and speak glove”

From sign language to spoken language

As part of a senior design project for a biomedical engineering class, [Kendall Lowrey] worked on a team to develop a device that translates American Sign Language into spoken English. Wanting to eclipse the glove-based devices that came before, the team set out to move beyond strictly spelling words and toward combining signs with common gestures. The project is based around an Arduino Mega and is limited to the alphabet and about ten words because of initial program-space constraints. When the five flex sensors and three accelerometer values register an at-rest state for two seconds, the device takes a reading and looks up the most likely word or letter in a table. It then outputs the result to a VoiceBox shield, which translates the words or letters into phonetic sounds.
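As a sketch of that “hold still for two seconds, then look it up” logic, here’s one hedged way it could be written for an Arduino. The jitter threshold, the five analog pins, and the two-entry table are stand-ins; the most likely entry is found with a simple squared-distance comparison.

```cpp
// Illustrative only: thresholds, pins, and table contents are made up.
#include <limits.h>

const int NUM_FLEX = 5;
const unsigned long REST_MS = 2000;  // pose must be held this long
const int JITTER = 15;               // allowed ADC wobble while "at rest"

struct Entry { const char *word; int flex[NUM_FLEX]; };
const Entry table[] = {
  {"hello", {200, 180, 190, 185, 210}},
  {"yes",   {600, 610, 590, 605, 580}},
};

int lastReading[NUM_FLEX];
unsigned long stableSince = 0;

const Entry *nearestMatch(const int reading[NUM_FLEX]) {
  const Entry *best = nullptr;
  long bestDist = LONG_MAX;
  for (const Entry &e : table) {
    long d = 0;
    for (int i = 0; i < NUM_FLEX; i++) {
      long diff = (long)reading[i] - e.flex[i];
      d += diff * diff;                     // squared Euclidean distance
    }
    if (d < bestDist) { bestDist = d; best = &e; }
  }
  return best;  // most likely table entry for this pose
}

void setup() { Serial.begin(9600); }

void loop() {
  int reading[NUM_FLEX];
  bool stable = true;
  for (int i = 0; i < NUM_FLEX; i++) {
    reading[i] = analogRead(A0 + i);
    if (abs(reading[i] - lastReading[i]) > JITTER) stable = false;
    lastReading[i] = reading[i];
  }
  if (!stable) {
    stableSince = millis();                 // pose changed; restart the clock
  } else if (millis() - stableSince >= REST_MS) {
    const Entry *match = nearestMatch(reading);
    Serial.println(match->word);            // real device: send to the VoiceBox shield
    stableSince = millis();                 // don't repeat the word immediately
  }
  delay(50);
}
```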