Sign language, like any language, can be difficult to learn if you’re not immersed in it, or at least learning from someone who is fluent. It’s not easy to know when you’re making minor mistakes or missing nuances. Learning it presents its own unique challenges, so if you want to learn and don’t have access to someone who knows the language, you might want to reach for the next best thing: a machine that can teach you.
This project comes from three of [Bruce Land]’s senior electrical and computer engineering students, [Alicia], [Raul], and [Kerry], as part of their final design class at Cornell University. Someone who wishes to learn the sign language alphabet slips on a glove outfitted with position sensors for each finger. A computer inside the device shows each letter’s proper sign on a screen, then checks the sensors from the glove to ensure that the hand is in the proper position. Two of the letters require a gesture as well, and the device tracks this using a gyroscope and compass to ensure that the letter has been properly signed. It appears to cover only the alphabet and not a wider vocabulary, but as a proof of concept it is very effective.
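The core of the letter-checking step could look something like the sketch below: compare each finger’s sensor reading against a stored reference pose for the target letter, within some tolerance. The letter poses, sensor scale, and threshold here are illustrative assumptions, not taken from the students’ actual firmware.

```python
# Hypothetical letter-checking logic for the glove. Finger readings run
# 0 (straight) to 100 (fully curled); poses and tolerance are made up.

# Reference finger positions per letter: [index, middle, ring, pinky, thumb]
REFERENCE_POSES = {
    "A": [90, 90, 90, 90, 10],   # four fingers curled, thumb alongside
    "B": [5, 5, 5, 5, 80],       # fingers straight, thumb tucked across palm
}

TOLERANCE = 15  # acceptable deviation per finger, in sensor units

def sign_matches(letter, sensor_readings):
    """Return True if every finger is within tolerance of the target pose."""
    target = REFERENCE_POSES[letter]
    return all(abs(read - ref) <= TOLERANCE
               for read, ref in zip(sensor_readings, target))
```

For the two letters that involve motion, the same idea extends to checking a sequence of poses plus the gyroscope and compass readings over time.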
The students show that it is entirely possible to learn the alphabet reliably using the machine as a teaching tool. This type of technology could be useful for other applications as well, such as gesture recognition for a human interface device. If you want to see more of these interesting and well-referenced senior design builds we’ve featured quite a few, from polygraph machines to a sonar system for a bicycle.
Continue reading “This Machine Teaches Sign Language”
One of the interesting benefits of the 3D printing revolution is the dramatic increase in availability of prosthetics for people with virtually any need. With a little bit of research, a 3D printer, and some trial and error, virtually anyone can build a prototype prosthetic to fit them specifically rather than spend thousands of dollars for one from a medical professional. [Dominick Scalise] is attempting to flesh out this idea with a prosthetic hand that he hopes will be a useful prosthetic in itself, but also a platform for others to build on or take ideas from.
His hand is explained in great detail in a series of videos on YouTube. The idea that sets this prosthetic apart from others, however, is its impressive configurability without relying on servos or other electronics to control the device. The wearer uses their other hand to set the dexterity hand up for whatever task they need to perform, and then performs that task. Its versatility comes from a unique style of locks and tensioners which allow the hand to be positioned in various ways, and then squeezed to operate it. It seems a skilled user can configure the hand rapidly, although the wearer must be able to squeeze the hand to operate it; for people who can’t, someone will need to develop an interface of another sort.
To that end, the files for making your own hand are available on Thingiverse. [Dominick] hopes that his project will spark some collaboration and development, using this hand as a basis for building other low-cost 3D printed prosthetics. There are many good ideas from this project that could translate well into other areas of prosthetics, and putting it all out there will hopefully spur more growth in this area. We’ve already seen similar-looking hands that have different methods of actuation, and both projects could benefit from sharing ideas with each other.
Thanks to [mmemetea] for the tip!
Continue reading “Dexterity Hand is a Configurable Prosthetic Hand”
Behold the wondrous complexity of the human hand. Twenty-seven bones working in concert with muscles, tendons, and ligaments extending up the forearm to produce a range of motions that gave us everything from stone tools to symphonies. Our hands are what we use to interface with the physical world on a fine level, and it’s understandable that we’d want mechanical versions of ourselves to include hands that were similarly dexterous.
That’s a tall order to fill, but this biomimetic mechatronic hand is a pretty impressive step in that direction. It’s [Will Cogley]’s third-year university design project, which he summarizes in the first video below. There are two parts to this project; the mechanical hand itself and the motion-capture glove to control it, both of which we find equally fascinating. The control glove is covered with 3D-printed sensors for each joint in the hand. He uses SMD potentiometers to measure joint angles, with some difficulty due to breakage of the solder joints; perhaps he could solve that with finer wires and better strain relief.
The hand that the glove controls is a marvel of design, like something on the end of a Hollywood android’s arm. Each finger joint is operated by a servo in the forearm pulling on cables; the joints are returned to the neutral position by springs. The hand is capable of multiple grip styles and responds fairly well to the control glove inputs, although there is some jitter in the sensors for some joints.
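The control pipeline from glove to hand might be sketched like this: read each joint’s potentiometer through an ADC, convert that to a joint angle, smooth it to tame the jitter mentioned above, and map the result to a servo pulse width. The ADC range, joint limits, pulse widths, and filter constant are assumptions for illustration, not [Will]’s actual values.

```python
# Illustrative glove-to-servo mapping for one finger joint.

ADC_MAX = 1023          # assumed 10-bit ADC full scale
JOINT_RANGE_DEG = 90.0  # assumed joint travel

def adc_to_joint_angle(adc_value):
    """Convert a raw potentiometer reading to a joint angle in degrees."""
    return (adc_value / ADC_MAX) * JOINT_RANGE_DEG

def smooth(prev_angle, new_angle, alpha=0.2):
    """Exponential smoothing to damp sensor jitter before driving the servo."""
    return prev_angle + alpha * (new_angle - prev_angle)

def angle_to_pulse_us(angle_deg):
    """Map a joint angle onto a standard 1000-2000 microsecond servo pulse."""
    return 1000.0 + (angle_deg / JOINT_RANGE_DEG) * 1000.0
```

The trade-off in the smoothing step is the usual one: a smaller `alpha` suppresses more jitter but makes the hand lag further behind the glove.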
The second video below gives a much more detailed overview of the project and shows how [Will]’s design has evolved and where it’s going. Anthropomorphic hands are far from rare projects hereabouts, but we’d say this one has a lot going for it.
Continue reading “Mechatronic Hand Mimics Human Anatomy to Achieve Dexterity”
For all those who have complained about Rubik’s Cube solving robots in the past, dismissing purpose-built rigs that hold the cube in a non-anthropomorphic manner: checkmate.
The video below shows not only that a robot can solve the classic puzzle with mechanical hands, but that it can do it with just one of them – and with only three fingers at that. The [Yamakawa] lab at the University of Tokyo built the high-speed manipulator to explore the kinds of fine motions that humans perform without even thinking about them. Their hand, guided by a 500-fps machine vision system, uses two opposing fingers to grip the lower part of the cube while using the other finger to flick the top face of the cube counterclockwise. The entire cube can also be rotated on the vertical axis, or flipped 90° at a time. Piecing these moves together lets the hand solve the cube with impressive speed; extra points for the little “How’s that, human?” flick at the end.
It might not be the fastest cube solver, or one that’s built right into the cube itself, but there’s something about the dexterity of this hand that we really appreciate.
Continue reading “Robot Solves Rubik’s Cube With One Hand Tied Behind Its Back”
Unless you are in the fields of robotics or prosthetics, you likely take for granted the fine motor skills our hands have. Picking up and using a pen is no small feat for a robot which doesn’t have a dedicated pen-grabbing apparatus. Holding a mobile phone with the same gripper is equally daunting, not to mention moving that phone around once it has been grasped. Part of the wonder of our hands is the shape and texture which allows pens and phones to slide around at one moment, and hold fast the next moment. Yale’s Grab Lab has built a gripper which starts to solve that problem by changing the friction of the manipulators.
A spring-loaded set of slats with a low-friction surface allows a held object to move freely, but when more pressure is exerted by the robot, the slats retract and a high-friction surface contacts the object. This is similar to our fingers with their round surfaces: when we brush our hands over something lightly, they graze the surface, but when we hold tight, our soft flesh meets the surface of the object and we can hold tightly. The Grab Lab is doing a great job demonstrating the solution and taking steps toward more capable robots. All hail Skynet.
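The mechanism above can be captured in a minimal model: below the spring preload the slippery slats carry the load, above it the slats retract and the high-friction pad takes over. The coefficients and preload force here are invented for illustration, not measured from the Grab Lab’s gripper.

```python
# Toy model of the variable-friction gripper: effective friction jumps
# once the squeeze force exceeds the slats' spring preload.

PRELOAD_N = 2.0   # assumed spring preload holding the slats proud of the pad
MU_LOW = 0.1      # assumed coefficient of the slippery slat surface
MU_HIGH = 0.9     # assumed coefficient of the grippy pad underneath

def holding_friction(normal_force_n):
    """Max tangential force the gripper can resist at a given squeeze force."""
    mu = MU_HIGH if normal_force_n > PRELOAD_N else MU_LOW
    return mu * normal_force_n
```

The useful property is the same one our fingers have: a light grip lets the object slide for repositioning, while squeezing harder multiplies the holding force well beyond the extra squeeze alone.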
We have no shortage of gripper designs to choose from, including pneumatic silicone and one that conforms to an object’s surface, similar to our hands.
Continue reading “Greasing Robot Hands: Variable Friction Makes Robo-Mitts More Like Our Own”
Sonar measures distance by emitting a sound and clocking how long it takes the sound to travel. This works in any medium capable of transmitting sound, such as water, air, or in the case of FingerPing, flesh and bone. FingerPing is a project at Georgia Tech headed by [Cheng Zhang] which measures hand position by sending sound waves through the thumb and measuring the time of arrival at four different receivers. These readings tell which bones the sound travels through and allow the device to figure out where the thumb is touching. Hand positions like this include the American Sign Language signs for one through ten.
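Two of the ideas above are easy to sketch: the classic sonar relation (distance from round-trip echo time), and matching a four-receiver timing profile against a set of known hand poses. The pose profiles below are made up for illustration; FingerPing’s actual classifier is more sophisticated.

```python
# Sonar distance: sound travels to the target and back, so halve the trip.
SPEED_OF_SOUND_AIR = 343.0  # m/s, at roughly 20 degrees C

def sonar_distance(round_trip_s):
    """Distance to a target from a round-trip echo time in seconds."""
    return SPEED_OF_SOUND_AIR * round_trip_s / 2

# Hypothetical arrival-time profiles (ms) at four receivers for two poses.
KNOWN_POSES = {
    "one": [1.2, 1.8, 2.4, 3.1],
    "two": [1.0, 2.2, 2.0, 3.5],
}

def classify_pose(measured):
    """Pick the stored pose whose four-receiver profile is closest (squared error)."""
    return min(KNOWN_POSES,
               key=lambda p: sum((m - k) ** 2
                                 for m, k in zip(measured, KNOWN_POSES[p])))
```

The point of routing sound through flesh and bone rather than air is that each thumb position changes which bones conduct the signal, giving each pose a distinctive timing signature at the receivers.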
From the perspective of discreetly entering numbers one through ten on a mobile device, this opens up a lot of possibilities for computer input while remaining pretty unobtrusive. We have seen prototypes more capable of reading gestures, but they also draw attention if you wear them on a bus. It is a classic trade-off between convenience and function, but this type of sensing is unique and could be combined with other biosignals for finer results.
Continue reading “Sonar in Your Hand”
A helping hand goes a long way to accomplishing a task. Sometimes that comes in the form of a friend, and sometimes it’s a pair of robotic hands attached to your arm.
Italian startup [Youbionic] has developed this pair of 3D printed hands which aim to extend the user’s multi-tasking capacity. Strapped to the forearm and extending past the user’s natural hand, they are individually operated by flexing either the index or ring finger. This motion is picked up by a pair of flex sensor strips — a sharp movement will close the fist, while a slower shift will close it halfway.
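The rate-based control described above is simple to sketch: the grip command depends not just on how far the flex sensor bends, but on how fast. The thresholds and units below are assumptions for illustration, not Youbionic’s actual values.

```python
# Toy rate-based grip control: a fast flex closes the extra hand fully,
# a slow flex closes it halfway, and relaxing the finger opens it.

RATE_THRESHOLD = 50.0  # assumed sensor units per second: "sharp" vs "slow"
FLEX_THRESHOLD = 30.0  # assumed minimum flex before we treat it as a command

def grip_command(prev_reading, new_reading, dt_s):
    """Return 'full', 'half', or 'open' from two successive flex readings."""
    if new_reading < FLEX_THRESHOLD:
        return "open"
    rate = (new_reading - prev_reading) / dt_s
    return "full" if rate >= RATE_THRESHOLD else "half"
```

In practice you would also want some hysteresis around the thresholds so the grip doesn’t chatter between states when a reading hovers near a boundary.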
At present, the hands are limited in their use — they are fixed to the mounting plate and so are restricted to gripping tasks, but with a bit of practice could end up being quite handy. Check out the video of them in action after the break!
Continue reading “Need A Hand? How About Two?”