Sign languages use the position and motion of the hands in place of sounds made by the vocal tract. If one could readily capture those hand positions and movements, one could theoretically digitize and translate that language. [ayooluwa98] built a set of sensor gloves to do just that.
The brains of the operation is an Arduino Nano. It’s hooked up to a series of flex sensors woven into the gloves, along with an accelerometer. The flex sensors detect the bending of the fingers as gestures are made, while the accelerometer captures the motion of the hand. The Arduino interprets these sensor signals to match the user’s movements against a pre-stored list of valid signs. Detected signs are then transmitted via a Bluetooth module to an Android phone, where text-to-speech software speaks them aloud.
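To get a feel for how this kind of matching might work, here's a minimal Arduino-style sketch: it reads five flex sensors, compares the readings against a small table of stored signs, and pushes any match out over a serial Bluetooth link. The pin assignments, calibration values, tolerance, and two-sign table are all assumptions for illustration, not [ayooluwa98]'s actual firmware, and the accelerometer data is omitted for brevity.

```cpp
// Hypothetical sketch of the sign-matching approach. Pins, thresholds,
// and the sign table below are illustrative assumptions.
#include <SoftwareSerial.h>

const uint8_t NUM_FINGERS = 5;
const uint8_t FLEX_PINS[NUM_FINGERS] = {A0, A1, A2, A3, A4};

// Bluetooth module (e.g. an HC-05) wired to digital pins 10 (RX) and 11 (TX).
SoftwareSerial bluetooth(10, 11);

// A stored sign: one expected flex reading per finger, plus a label.
struct Sign {
  int flex[NUM_FINGERS];
  const char *label;
};

// Made-up calibration values for two example signs.
const Sign SIGNS[] = {
  {{850, 200, 200, 200, 200}, "A"},
  {{200, 850, 850, 200, 200}, "V"},
};
const int TOLERANCE = 80;  // acceptable deviation per finger, in ADC counts

void setup() {
  bluetooth.begin(9600);
}

void loop() {
  // Sample every flex sensor.
  int reading[NUM_FINGERS];
  for (uint8_t i = 0; i < NUM_FINGERS; i++) {
    reading[i] = analogRead(FLEX_PINS[i]);
  }

  // Compare the live readings against each stored sign.
  for (const Sign &s : SIGNS) {
    bool match = true;
    for (uint8_t i = 0; i < NUM_FINGERS; i++) {
      if (abs(reading[i] - s.flex[i]) > TOLERANCE) {
        match = false;
        break;
      }
    }
    if (match) {
      bluetooth.println(s.label);  // phone-side app handles text-to-speech
      delay(500);                  // crude debounce between detections
      break;
    }
  }
}
```

In a real build, fixed thresholds like these would drift between users and wearing sessions, which is why a per-user calibration step (or the accelerometer fusion the project uses) matters in practice.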
The idea of capturing sign language via hand tracking is a compelling one; we’ve seen similar projects before. Meanwhile, if you’re working on your own accessibility projects, be sure to drop us a line!