3D Printed Robotic Arms For Sign Language

A team of students in Antwerp, Belgium is behind Project Aslan, which is exploring the feasibility of using 3D printed robotic arms to assist with and translate sign language. The idea stems from the fact that sign language translators are few and far between, and translation is a task robots may be able to help with. Beyond translation, robots may also be able to assist with teaching sign language.

The project set out to use 3D printing and other technology to explore whether low-cost robotic signing could be of any use. So far the team has an arm that can convert text into finger spelling and counting. It’s an interesting use for a robotic arm; signing is an application for which range of motion is important, but there is no real need to carry or move any payloads whatsoever.
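
Under the hood, converting text to finger spelling boils down to a lookup from characters to joint positions. The project's own firmware isn't shown here, but a minimal Arduino-style sketch of the idea could look something like this, where the servo pins, the HandPose structure, and the three example letter poses are purely illustrative:

#include <Servo.h>

// One servo per finger; pin numbers are placeholders.
const int FINGER_PINS[5] = {3, 5, 6, 9, 10};
Servo fingers[5];

// A pose is just a target angle for each finger servo.
struct HandPose { uint8_t angle[5]; };

// Tiny illustrative table: curl patterns for a few letters.
// A real fingerspelling alphabet needs all 26 letters plus timing.
const HandPose POSES[3] = {
  {{170, 170, 170, 170, 40}},  // 'a' - fist, thumb alongside
  {{10, 10, 10, 10, 160}},     // 'b' - fingers straight, thumb folded
  {{90, 90, 90, 90, 90}},      // 'c' - curved hand
};

void showLetter(char c) {
  if (c < 'a' || c > 'c') return;          // only a-c in this sketch
  const HandPose &p = POSES[c - 'a'];
  for (int i = 0; i < 5; i++) fingers[i].write(p.angle[i]);
  delay(600);                              // hold the letter briefly
}

void setup() {
  for (int i = 0; i < 5; i++) fingers[i].attach(FINGER_PINS[i]);
}

void loop() {
  const char *word = "abc";                // spell a test string
  for (const char *c = word; *c; c++) showLetter(*c);
}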

Closeup of hand actuators and design.

A single articulated hand is a good proof of concept, and these early results show promise, but there is still a long way to go. Sign language involves more than just the hands: it is performed with both hands, the arms, and the shoulders, and it incorporates motion and facial expressions. Most signing also isn't finger spelling, which is reserved primarily for proper names and specific nouns, but a robotic hand that can finger spell is an important first step toward everything else.

Future directions for the project include adding a second arm, adding expressiveness, and exploring the use of cameras for teaching new signs. The ability to teach different signs is important, because any project that aims to act as a translator or facilitator needs to be able to learn and update. There is a lot of diversity among the world's sign languages. For people unfamiliar with signing, it may come as a surprise that, for example, American Sign Language (ASL) is related to French Sign Language, yet both are entirely different from British Sign Language (BSL). A video of the project is embedded below.

Continue reading “3D Printed Robotic Arms For Sign Language”

Robot Hand Goes Wireless

We can’t decide if [MertArduino’s] robotic hand project is more art piece or demonstration. The construction of springs, fishing line, and servo motors isn’t going to give you a practical hand that can grip or manipulate anything significant. However, the project shows off a lot of interesting construction techniques and is a fun demonstration of using nRF24L01 wireless in a project. You can see a video of the contraption below.

A glove with homemade flex sensors sends wireless commands to the hand. Another Arduino drives an array of servo motors that make the fingers flex. You don’t get fine control or any real grip strength, but the hand will more or less duplicate your movements. We noticed one finger seemed poorly controlled, but we suspect that was one of the homemade flex sensors going rogue.
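
If you want to roll your own, the receiving end amounts to pulling a packet of flex readings off the radio and mapping them onto servo angles. Here's a rough sketch of that half using the common RF24 Arduino library; the pin choices and the five-byte packet format are our assumptions, not [MertArduino]'s actual code:

#include <SPI.h>
#include <RF24.h>
#include <Servo.h>

RF24 radio(9, 10);                       // CE, CSN pins (assumed wiring)
const byte PIPE_ADDR[6] = "hand1";       // must match the glove's address

Servo fingers[5];
const int SERVO_PINS[5] = {2, 3, 4, 5, 6};

void setup() {
  for (int i = 0; i < 5; i++) fingers[i].attach(SERVO_PINS[i]);
  radio.begin();
  radio.openReadingPipe(1, PIPE_ADDR);
  radio.startListening();
}

void loop() {
  if (radio.available()) {
    uint8_t flex[5];                     // one byte per finger from the glove
    radio.read(&flex, sizeof(flex));
    for (int i = 0; i < 5; i++) {
      // Map the raw flex reading (0-255) onto the servo's travel.
      fingers[i].write(map(flex[i], 0, 255, 0, 180));
    }
  }
}

The glove side would simply analogRead each flex sensor, scale the values into the same five-byte array, and radio.write() it out on the matching pipe.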

Continue reading “Robot Hand Goes Wireless”

Hackaday Prize Entry: Open-Source Myoelectric Hand Prosthesis

Hands can grab things, build things, and communicate, and we control them intuitively with nothing more than a thought. For those missing a hand, a prosthesis can be a life-changing tool for carrying out daily tasks. We are delighted to see that [Alvaro Villoslada] joined the Hackaday Prize with his contribution to advanced prosthesis technology: Dextra, the open-source myoelectric hand prosthesis.

Dextra is an advanced robotic hand, with 4 independently actuated fingers and a thumb with an additional degree of freedom. Because Dextra is designed as a self-contained unit, all actuators had to be embedded into the hand. [Alvaro] achieved the necessary level of miniaturization with five tiny winches, driven by micro gear motors. Each of them pulls a tendon that actuates the corresponding finger. Magnetic encoders on the motor shafts provide position feedback to a Teensy 3.1, which orchestrates all the fingers. The rotational axis of the thumb is actuated by a small RC servo.
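
Closing the loop on each winch is conceptually straightforward: read the encoder, compare it against a target count, and drive the motor to shrink the error. The snippet below sketches that for a single finger with the common Encoder library and a crude proportional loop; the pin assignments, gain, and quadrature-style counting are our assumptions rather than details lifted from Dextra's firmware:

#include <Encoder.h>

// One tendon winch: a DC micro gear motor on an H-bridge,
// with a quadrature-style encoder on the motor shaft.
Encoder winchEnc(2, 3);                  // encoder A/B pins (assumed)
const int MOTOR_PWM = 5;                 // H-bridge PWM pin (assumed)
const int MOTOR_DIR = 6;                 // H-bridge direction pin (assumed)

void setup() {
  pinMode(MOTOR_PWM, OUTPUT);
  pinMode(MOTOR_DIR, OUTPUT);
}

void driveFinger(long target) {
  long error = target - winchEnc.read();
  int effort = constrain(error / 4, -255, 255);   // crude proportional gain
  digitalWrite(MOTOR_DIR, effort >= 0 ? HIGH : LOW);
  analogWrite(MOTOR_PWM, abs(effort));
}

void loop() {
  // Alternate between open and curled every two seconds as a demo.
  long targetCount = (millis() / 2000) % 2 ? 4000 : 0;
  driveFinger(targetCount);
}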

In addition to the robotic hand, [Alvaro] is developing his own electromyographic (EMG) interface, the Mumai, which allows a user to control a robotic prosthesis through tiny muscle contractions in the residual limb. Just like Dextra, Mumai is open-source. It consists of a pair of skin electrodes and an acquisition board. The electrodes are attached to the muscle, and the acquisition board translates the electrical activity of the muscle into an analog voltage. This raw EMG signal is then sampled and analyzed by a microcontroller, such as the ESP8266. The microcontroller then determines the intent of the user based on pattern recognition. Eventually this control data is used to control a robotic prosthesis, such as the Dextra. The current progress of both projects is impressive. You can check out a video of Dextra below.
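
The pattern-recognition front end can start out far simpler than it sounds: sample the acquisition board's analog output, compute a short-window envelope, and flag activity when it crosses a threshold. The fragment below illustrates that on the ESP8266's single ADC pin; the window length, threshold, and mid-rail offset are placeholder choices of ours, not Mumai's actual processing chain:

// Bare-bones EMG envelope detector for an ESP8266 (Arduino core).
const int EMG_PIN = A0;                  // acquisition board output -> ADC
const int WINDOW = 64;                   // samples per envelope estimate
const int THRESHOLD = 300;               // tune for your electrodes and gain

void setup() {
  Serial.begin(115200);
}

void loop() {
  // Mean absolute value over a short window, a common EMG feature.
  long sum = 0;
  for (int i = 0; i < WINDOW; i++) {
    // Assumes the board biases the signal around mid-scale of the 10-bit ADC.
    int sample = analogRead(EMG_PIN) - 512;
    sum += abs(sample);
    delayMicroseconds(500);              // roughly 2 kHz sampling
  }
  int envelope = sum / WINDOW;

  // A single threshold gives a crude contract/rest intent signal;
  // real pattern recognition would classify features from several channels.
  Serial.println(envelope > THRESHOLD ? "contract" : "rest");
}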

Continue reading “Hackaday Prize Entry: Open-Source Myoelectric Hand Prosthesis”

Pipe In (Robot) Hand

How do you make a robot hand? If you are [Robimek], you start with some plastic spiral tubing, some servo motors, and some fishing line. Oh, and you also need an old glove.

The spiral tubing (or pipe, if you prefer) is cut in a hand-like shape and fused together with adhesive. The knuckle joints are cut out to allow the tubing to flex at that point. The fishing line connects the fingertips to the servo motors.

The project uses an Arduino to drive the servos, although you could do the job with any microcontroller. Winding up the fishing line contracts the associated finger, while reeling it out lets the springy plastic pipe pull back to its original position. The glove covers the pipes and adds a realistic look to the hand.
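
Firmware-wise there isn't much to it: each finger is just a servo angle, with one end of the travel winding the fishing line in and the other paying it out so the pipe can spring straight. A minimal sketch of that idea, with placeholder pin numbers and angles of our own choosing:

#include <Servo.h>

// One servo per finger; each winds a length of fishing line.
const int SERVO_PINS[5] = {3, 5, 6, 9, 10};
const int RELAXED = 0;                   // line paid out, pipe springs straight
const int CURLED  = 150;                 // line wound in, finger contracted
Servo fingers[5];

void setup() {
  for (int i = 0; i < 5; i++) {
    fingers[i].attach(SERVO_PINS[i]);
    fingers[i].write(RELAXED);
  }
}

void loop() {
  // Curl and release the fingers one after another as a simple demo.
  for (int i = 0; i < 5; i++) {
    fingers[i].write(CURLED);
    delay(400);
    fingers[i].write(RELAXED);
    delay(400);
  }
}
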
Continue reading “Pipe In (Robot) Hand”

What Could You Do With 7 Fingers?

7 finger robotic glove

A strange thought, yes, but MIT researchers think an extra two digits could really make a difference in many people’s lives. As it turns out, having an extra robotic grasp lets you do quite a few things single-handed.

The extra two fingers provide three degrees of freedom each and are mounted off the user’s wrist. Position-recording sensors attached to the glove provide feedback to the system so that the fingers can be controlled naturally, just by using your hand as you normally would.

They taught the algorithm that controls the fingers by trying to pick up different (large) items with the hand while manually positioning the extra fingers. What they discovered is that almost every grasp could be expressed as a combination of only 2-3 grip patterns.

Continue reading “What Could You Do With 7 Fingers?”
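
That result, that most grasps reduce to a blend of a few basis patterns, maps neatly onto a weighted combination of stored postures. As a purely illustrative sketch (the two basis grips, joint counts, and weights below are invented for the example, not the MIT team's data), it might look like this in Arduino-style C++:

#include <Servo.h>

const int JOINTS = 6;                    // 3 DOF per extra finger, 2 fingers
Servo joints[JOINTS];
const int JOINT_PINS[JOINTS] = {2, 3, 4, 5, 6, 7};

// Two basis grip patterns (joint angles); real ones would be learned
// from recorded demonstrations like those described above.
const float WRAP_GRIP[JOINTS]  = {120, 100, 90, 120, 100, 90};
const float PINCH_GRIP[JOINTS] = {60, 140, 30, 60, 140, 30};

// Blend the basis grips with weights in [0, 1] and drive the servos.
void applyGrasp(float wWrap, float wPinch) {
  for (int i = 0; i < JOINTS; i++) {
    float angle = wWrap * WRAP_GRIP[i] + wPinch * PINCH_GRIP[i];
    joints[i].write(constrain((int)angle, 0, 180));
  }
}

void setup() {
  for (int i = 0; i < JOINTS; i++) joints[i].attach(JOINT_PINS[i]);
}

void loop() {
  applyGrasp(1.0, 0.0);  delay(1000);    // pure wrap grip
  applyGrasp(0.0, 1.0);  delay(1000);    // pure pinch grip
  applyGrasp(0.5, 0.5);  delay(1000);    // blend of the two
}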